Linear algebra
Linear algebra is the branch of mathematics concerning vector spaces, often finite or countably infinite dimensional, as well as linear mappings between such spaces. Such an investigation is initially motivated by a system of linear equations containing several unknowns. Such equations are naturally represented using the formalism of matrices and vectors. Linear algebra is central to both pure and applied mathematics. For instance, abstract algebra arises by relaxing the axioms of a vector space, leading to a number of generalizations. Functional analysis studies the infinite-dimensional version of the theory of vector spaces. Combined with calculus, linear algebra facilitates the solution of linear systems of differential equations. Techniques from linear algebra are also used in analytic geometry, engineering, physics, natural sciences, computer science, computer animation, and the social sciences (particularly in economics). Because linear algebra is such a well-developed theory, nonlinear mathematical models are sometimes approximated by linear ones.

(Figure caption: The three-dimensional Euclidean space R3 is a vector space, and lines and planes passing through the origin are vector subspaces of R3.)
History
The study of linear algebra first emerged from the study of determinants, which were used to solve systems of linear equations. Determinants were used by Leibniz in 1693, and subsequently, Gabriel Cramer devised Cramer's Rule for solving linear systems in 1750. Later, Gauss further developed the theory of solving linear systems by using Gaussian elimination, which was initially listed as an advancement in geodesy. The study of matrix algebra first emerged in England in the mid-1800s. In 1844 Hermann Grassmann published his Theory of Extension which included foundational new topics of what is today called linear algebra. In 1848, James Joseph Sylvester introduced the term matrix, which is Latin for "womb". While studying compositions of linear transformations, Arthur Cayley was led to define matrix multiplication and inverses. Crucially, Cayley used a single letter to denote a matrix, thus treating a matrix as an aggregate object. He also realized the connection between matrices and determinants, and wrote "There would be many things to say about this theory of matrices which should, it seems to me, precede the theory of determinants".[1] In 1882, Hüseyin Tevfik Pasha wrote the book titled "Linear Algebra".[2][3] The first modern and more precise definition of a vector space was introduced by Peano in 1888; by 1900, a theory of linear transformations of finite-dimensional vector spaces had emerged. Linear algebra first took its modern form in the first half of the twentieth century, when many ideas and methods of previous centuries were generalized as abstract algebra. The use of matrices in quantum mechanics, special relativity, and statistics helped spread the subject of linear algebra beyond pure mathematics. The development of computers led to increased research in efficient algorithms for Gaussian elimination and matrix decompositions, and linear algebra became an essential tool for modelling and simulations.
The origin of many of these ideas is discussed in the articles on determinants and Gaussian elimination.
Scope of study
Vector spaces
The main structures of linear algebra are vector spaces. A vector space over a field F is a set V together with two binary operations. Elements of V are called vectors and elements of F are called scalars. The first operation, vector addition, takes any two vectors v and w and outputs a third vector v + w. The second operation takes any scalar a and any vector v and outputs a new vector av. Because this operation rescales the vector v by the scalar a, it is called scalar multiplication of v by a. The operations of addition and multiplication in a vector space satisfy the following axioms. In the list below, let u, v and w be arbitrary vectors in V, and a and b scalars in F.
Associativity of addition: u + (v + w) = (u + v) + w
Commutativity of addition: u + v = v + u
Identity element of addition: there exists an element 0 ∈ V, called the zero vector, such that v + 0 = v for all v ∈ V.
Inverse elements of addition: for every v ∈ V, there exists an element −v ∈ V, called the additive inverse of v, such that v + (−v) = 0.
Distributivity of scalar multiplication with respect to vector addition: a(u + v) = au + av
Distributivity of scalar multiplication with respect to field addition: (a + b)v = av + bv
Compatibility of scalar multiplication with field multiplication: a(bv) = (ab)v
Identity element of scalar multiplication: 1v = v, where 1 denotes the multiplicative identity in F.
[4]
Elements of a general vector space V may be objects of any nature, for example, functions, polynomials, vectors, or matrices. Linear algebra is concerned with properties common to all vector spaces.
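As a small illustration (a NumPy sketch, not part of the original article), the axioms above can be spot-checked numerically for concrete vectors in R3; the sample vectors and scalars here are arbitrary choices:

```python
import numpy as np

# Sample vectors in R^3 and scalars (arbitrary illustrative values)
u, v, w = np.array([1., 2., 3.]), np.array([4., 5., 6.]), np.array([7., 8., 9.])
a, b = 2.0, -3.0

# Associativity and commutativity of addition
assert np.allclose(u + (v + w), (u + v) + w)
assert np.allclose(u + v, v + u)

# Both distributivity laws and compatibility of scalar multiplication
assert np.allclose(a * (u + v), a * u + a * v)
assert np.allclose((a + b) * v, a * v + b * v)
assert np.allclose(a * (b * v), (a * b) * v)

# Identity elements of addition and of scalar multiplication
assert np.allclose(v + np.zeros(3), v)
assert np.allclose(1.0 * v, v)
```

Such checks on samples do not prove the axioms, of course; they hold for R3 by the algebra of componentwise operations.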
Linear transformations
Similarly as in the theory of other algebraic structures, linear algebra studies mappings between vector spaces that preserve the vector-space structure. Given two vector spaces V and W over a field F, a linear transformation (also called linear map, linear mapping or linear operator) is a map T : V → W such that

T(u + v) = T(u) + T(v) and T(av) = aT(v)

for any vectors u, v ∈ V and any scalar a ∈ F. Equivalently, for any vectors u, v ∈ V and scalars a, b ∈ F:

T(au + bv) = aT(u) + bT(v).
When a bijective linear mapping exists between two vector spaces (that is, every vector from the second space is associated with exactly one in the first), we say that the two spaces are isomorphic. Because an isomorphism preserves linear structure, two isomorphic vector spaces are "essentially the same" from the linear algebra point of view. One essential question in linear algebra is whether a mapping is an isomorphism or not, and this question can be answered by checking if the determinant is nonzero. If a mapping is not an isomorphism, linear algebra is interested in finding its range (or image) and the set of elements that get mapped to zero, called the kernel of the mapping.
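To make the kernel and image concrete, here is a brief sketch (the matrix is a hypothetical example, not from the article) that verifies linearity for a map given by a matrix and extracts its kernel numerically via the singular value decomposition:

```python
import numpy as np

# A linear map T: R^3 -> R^2 represented by a matrix (illustrative values)
A = np.array([[1., 2., 3.],
              [4., 5., 6.]])

def T(x):
    return A @ x

u, v = np.array([1., 0., 2.]), np.array([0., 1., -1.])
a, b = 3.0, -2.0

# Linearity: T(au + bv) = a T(u) + b T(v)
assert np.allclose(T(a * u + b * v), a * T(u) + b * T(v))

# Kernel: the rows of Vt beyond the numerical rank span the null space of A
_, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))
kernel = Vt[rank:]
assert np.allclose(A @ kernel.T, 0.0)
```

Since A has rank 2 here, the kernel is one-dimensional; the image is all of R2.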
Linear transformations have geometric significance. For example, 2 × 2 real matrices denote standard planar mappings that preserve the origin.
There is an important distinction between the coordinate n-space Rn and a general finite-dimensional vector space V. While Rn has a standard basis {e1, e2, …, en}, a vector space V typically does not come equipped with such a basis and many different bases exist (although they all consist of the same number of elements, equal to the dimension of V). One major application of matrix theory is the calculation of determinants, a central concept in linear algebra. While determinants could be defined in a basis-free manner, they are usually introduced via a specific representation of the mapping; the value of the determinant does not depend on the specific basis. It turns out that a mapping is invertible if and only if the determinant is nonzero. If the determinant is zero, then the nullspace is nontrivial. Determinants have other applications, including a systematic way of seeing if a set of vectors is linearly independent (we write the vectors as the columns of a matrix, and if the determinant of that matrix is zero, the vectors are linearly dependent). Determinants could also be used to solve systems of linear equations (see Cramer's rule), but in real applications, Gaussian elimination is a faster method.
Eigenvalues and eigenvectors
An eigenvector of a transformation T from a vector space into itself is a nonzero vector v satisfying Tv = λv for some scalar λ, called an eigenvalue. This condition can be rewritten as (T − λ·Id)v = 0, where Id is the identity matrix. For there to be nontrivial solutions to that equation, det(T − λ·Id) = 0. The determinant is a polynomial in λ, and so the eigenvalues are not guaranteed to exist if the field is R. Thus, we often work with an algebraically closed field such as the complex numbers when dealing with eigenvectors and eigenvalues, so that an eigenvalue will always exist. It would be particularly nice if, given a transformation T taking a vector space V into itself, we could find a basis for V consisting of eigenvectors. If such a basis exists, we can easily compute the action of the transformation on any vector: if v1, v2, …, vn are linearly independent eigenvectors of a mapping of n-dimensional spaces T with (not necessarily distinct) eigenvalues λ1, λ2, …, λn, and if v = a1v1 + ... + anvn, then

T(v) = a1λ1v1 + ... + anλnvn.

Such a transformation is called diagonalizable, since in the eigenbasis the transformation is represented by a diagonal matrix. Because operations like matrix multiplication, matrix inversion, and determinant calculation are simple on diagonal matrices, computations involving matrices are much simpler if we can bring the matrix to a diagonal form. Not all matrices are diagonalizable (even over an algebraically closed field).
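A short numerical sketch of diagonalization (the matrix is an arbitrary symmetric example, chosen so its eigenvalues are real):

```python
import numpy as np

# An illustrative diagonalizable matrix
T = np.array([[2., 1.],
              [1., 2.]])

# Eigen-decomposition: columns of P are eigenvectors
eigenvalues, P = np.linalg.eig(T)

# In the eigenbasis, T is represented by a diagonal matrix: P^-1 T P = diag(lambda_i)
D = np.linalg.inv(P) @ T @ P
assert np.allclose(D, np.diag(eigenvalues))

# Applying T to v just scales each eigen-component: T v = sum_i a_i lambda_i v_i
v = np.array([1., 0.])
coeffs = np.linalg.solve(P, v)            # coordinates a_i of v in the eigenbasis
assert np.allclose(T @ v, P @ (eigenvalues * coeffs))
```

For this matrix the eigenvalues are 3 and 1; the assertion checks precisely the expansion formula T(v) = Σ aiλivi from the text.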
Inner-product spaces
Besides these basic concepts, linear algebra also studies vector spaces with additional structure, such as an inner product. The inner product is an example of a bilinear form, and it gives the vector space a geometric structure by allowing for the definition of length and angles. Formally, an inner product is a map

⟨·, ·⟩ : V × V → F

that satisfies the following three axioms for all vectors u, v, w in V and all scalars a in F:

Conjugate symmetry: ⟨u, v⟩ equals the complex conjugate of ⟨v, u⟩.
Linearity in the first argument: ⟨au, v⟩ = a⟨u, v⟩ and ⟨u + v, w⟩ = ⟨u, w⟩ + ⟨v, w⟩.
Positive-definiteness: ⟨v, v⟩ ≥ 0, with equality only for v = 0.

We can define the length of a vector v in V by

‖v‖ = √⟨v, v⟩,

and the Cauchy-Schwarz inequality |⟨u, v⟩| ≤ ‖u‖ ‖v‖ ensures that, in a real inner-product space, ⟨u, v⟩ / (‖u‖ ‖v‖) lies in [−1, 1], and so we can call this quantity the cosine of the angle between the two vectors. Two vectors are orthogonal if ⟨u, v⟩ = 0. An orthonormal basis is a basis where all basis vectors have length 1
and are orthogonal to each other. Given any finite-dimensional vector space, an orthonormal basis can be found by the Gram-Schmidt procedure. Orthonormal bases are particularly nice to deal with, since if v = a1v1 + ... + anvn, then ai = ⟨v, vi⟩. The inner product facilitates the construction of many useful concepts. For instance, given a transform T, we can define its Hermitian conjugate T* as the linear transform satisfying

⟨Tu, v⟩ = ⟨u, T*v⟩.
If T satisfies TT* = T*T, we call T normal. It turns out that normal matrices are precisely the matrices that have an orthonormal system of eigenvectors that span V.
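The Gram-Schmidt procedure mentioned above can be sketched in a few lines (classical, unstabilized variant; the input vectors are illustrative and assumed linearly independent):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize linearly independent vectors (classical Gram-Schmidt)."""
    basis = []
    for v in vectors:
        # Subtract the projections onto the already-built orthonormal vectors
        w = v - sum(np.dot(v, b) * b for b in basis)
        basis.append(w / np.linalg.norm(w))
    return np.array(basis)

Q = gram_schmidt([np.array([1., 1., 0.]),
                  np.array([1., 0., 1.]),
                  np.array([0., 1., 1.])])

# The rows of Q are orthonormal: Q Q^T = I
assert np.allclose(Q @ Q.T, np.eye(3))

# Expansion coefficients in an orthonormal basis are inner products: a_i = <v, q_i>
v = np.array([2., -1., 3.])
coeffs = Q @ v
assert np.allclose(sum(c * q for c, q in zip(coeffs, Q)), v)
```

The second assertion is exactly the statement ai = ⟨v, vi⟩ from the text. In floating-point practice, the modified Gram-Schmidt variant or a QR factorization is preferred for numerical stability.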
Applications
Because of the ubiquity of vector spaces, linear algebra is used in many fields of mathematics, natural sciences, computer science, and social science. Below are just some examples of applications of linear algebra.
The Gaussian-elimination algorithm is as follows: eliminate x from all equations below L1, and then eliminate y from all equations below L2. This will put the system into triangular form. Then, using back-substitution, each unknown can be solved for.
In the example, x is eliminated from L2 by adding (3/2)L1 to L2. x is then eliminated from L3 by adding L1 to L3. Formally:
This result is a system of linear equations in triangular form, and so the first part of the algorithm is complete. The last part, back-substitution, consists of solving for the unknowns in reverse order. It can thus be seen that
Then, z can be substituted into L2, which can then be solved to obtain Next, z and y can be substituted into L1, which can be solved to obtain The system is solved. We can, in general, write any system of linear equations as a matrix equation:
The solution of this system is characterized as follows: first, we find a particular solution x0 of this equation using Gaussian elimination. Then, we compute the solutions of Ax = 0; that is, we find the nullspace N of A. The solution set of this equation is given by x0 + N = {x0 + n : n ∈ N}. If the number of variables equals the number of equations, then we can characterize when the system has a unique solution: since N is trivial if and only if det A ≠ 0, the equation has a unique solution if and only if det A ≠ 0.
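The two phases described above (forward elimination to triangular form, then back-substitution) can be sketched directly; the 3 × 3 system below is a sample chosen for illustration, and the sketch omits pivoting, so it assumes the pivots encountered are nonzero:

```python
import numpy as np

def gaussian_solve(A, b):
    """Solve Ax = b: forward elimination to triangular form, then back-substitution.
    A plain sketch without pivoting; assumes nonzero pivots (det A != 0)."""
    A = A.astype(float).copy()
    b = b.astype(float).copy()
    n = len(b)
    # Forward elimination: zero out the entries below each pivot
    for i in range(n):
        for j in range(i + 1, n):
            factor = A[j, i] / A[i, i]
            A[j, i:] -= factor * A[i, i:]
            b[j] -= factor * b[i]
    # Back-substitution: solve for the unknowns in reverse order
    x = np.zeros(n)
    for i in reversed(range(n)):
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x

A = np.array([[2., 1., -1.],
              [-3., -1., 2.],
              [-2., 1., 2.]])
b = np.array([8., -11., -3.])
x = gaussian_solve(A, b)        # x is approximately [2., 3., -1.]
assert np.allclose(A @ x, b)
```

Production code would instead use `np.linalg.solve`, which applies partial pivoting for numerical stability.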
A Fourier series represents a periodic function as an infinite sum of sines and cosines. This series expansion is extremely useful in solving partial differential equations. In this article, we will not be concerned with convergence issues; it is worth noting that all Lipschitz-continuous functions have a converging Fourier series expansion, and nice enough discontinuous functions have a Fourier series that converges to the
function value at most points. The space of all functions that can be represented by a Fourier series forms a vector space (technically speaking, we call functions that have the same Fourier series expansion the "same" function, since two different discontinuous functions might have the same Fourier series). Moreover, this space is also an inner product space with the inner product (under one common choice of normalization)

⟨f, g⟩ = (1/π) ∫ from −π to π of f(x) g(x) dx.
The functions gn(x) = sin(nx) for n > 0 and hn(x) = cos(nx) for n ≥ 0 form an orthonormal basis for the space of Fourier-expandable functions. We can thus use the tools of linear algebra to find the expansion of any function in this space in terms of these basis functions. For instance, writing f = Σn anhn + Σn bngn, to find the coefficient ak we take the inner product with hk:

⟨f, hk⟩ = Σn an⟨hn, hk⟩ + Σn bn⟨gn, hk⟩,

and by orthonormality, ⟨f, hk⟩ = ak; that is,

ak = ⟨f, hk⟩.
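This coefficient formula can be checked numerically; the sketch below builds a toy function with known coefficients and recovers one of them by approximating the inner product ⟨f, hk⟩ with the trapezoidal rule (the normalization 1/π is the assumption stated above):

```python
import numpy as np

x = np.linspace(-np.pi, np.pi, 100001)
dx = x[1] - x[0]

def inner(f_vals, g_vals):
    """Approximate <f, g> = (1/pi) * integral over [-pi, pi] of f g dx."""
    prod = f_vals * g_vals
    # Trapezoidal rule on the uniform grid
    return (np.sum(prod) - 0.5 * (prod[0] + prod[-1])) * dx / np.pi

# Toy function with known coefficients: a_2 = 3, b_5 = 0.5
f = 3.0 * np.cos(2 * x) + 0.5 * np.sin(5 * x)

a2 = inner(f, np.cos(2 * x))   # should recover the coefficient of h_2(x) = cos(2x)
assert abs(a2 - 3.0) < 1e-3
```

The sin(5x) component contributes nothing, since it is orthogonal to cos(2x), which is exactly the mechanism the derivation above relies on.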
Quantum mechanics
Quantum mechanics is highly inspired by notions in linear algebra. In quantum mechanics, the physical state of a particle is represented by a vector, and observables (such as momentum, energy, and angular momentum) are represented by linear operators on the underlying vector space. More concretely, the wave function ψ of a particle describes its physical state and lies in the vector space L2 (the functions ψ : R3 → C such that ∫ |ψ|2 is finite), and it evolves according to the Schrödinger equation. Energy is represented as the operator H = −(ħ2/2m)∇2 + V, where V is the potential energy. H is also known as the Hamiltonian operator. The eigenvalues of H represent the possible energies that can be observed. Given a particle in some state ψ, we can expand ψ into a linear combination of eigenstates of H. The component of ψ in each eigenstate determines the probability of measuring the corresponding eigenvalue, and the measurement forces the particle to assume that eigenstate (wave function collapse).
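As an illustration of "energies are eigenvalues of H" (a numerical sketch, not from the article: a free particle in a box with ħ = m = 1, discretized by central finite differences, an assumption of this example), the lowest eigenvalues of the discretized Hamiltonian approach the known energies Ek = k²π²/(2L²):

```python
import numpy as np

# Discretize H = -(1/2) d^2/dx^2 on [0, L] with zero boundary conditions
L, n = 1.0, 500
dx = L / (n + 1)

# Second-derivative matrix via central differences (tridiagonal)
lap = (np.diag(np.full(n, -2.0))
       + np.diag(np.ones(n - 1), 1)
       + np.diag(np.ones(n - 1), -1)) / dx**2
H = -0.5 * lap

# The eigenvalues of H are the (approximate) allowed energies
energies = np.sort(np.linalg.eigvalsh(H))
exact = 0.5 * (np.pi * np.arange(1, 4) / L) ** 2   # E_k = k^2 pi^2 / (2 L^2)
assert np.allclose(energies[:3], exact, rtol=1e-3)
```

Refining the grid (larger n) drives the low eigenvalues closer to the exact spectrum, mirroring how the infinite-dimensional operator's spectrum is approximated by matrices.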
Functional analysis mixes the methods of linear algebra with those of mathematical analysis and studies various function spaces, such as Lp spaces. Representation theory studies the actions of algebraic objects on vector spaces by representing these objects as matrices. It is interested in all the ways that this is possible, and it does so by finding subspaces invariant under all transformations of the algebra. The concept of eigenvalues and eigenvectors is especially important.
Notes
[1] Vitulli, Marie
[2] http://www.journals.istanbul.edu.tr/tr/index.php/oba/article/download/9103/8452
[3] http://archive.org/details/linearalgebra00tevfgoog
[4] This axiom is not asserting the associativity of an operation, since there are two operations in question: scalar multiplication, bv; and field multiplication, ab.
[5] Axler (2004), pp. 28-29
[6] The existence of a basis is straightforward for countably generated vector spaces, and for well-ordered vector spaces, but in full generality it is logically equivalent to the axiom of choice.
[7] Axler (2004), p. 33
[8] Axler (2004), p. 55
Further reading
History
Fearnley-Sander, Desmond, "Hermann Grassmann and the Creation of Linear Algebra" (http://mathdl.maa.org/images/upload_library/22/Ford/DesmondFearnleySander.pdf), American Mathematical Monthly 86 (1979), pp. 809-817.
Grassmann, Hermann, Die lineale Ausdehnungslehre ein neuer Zweig der Mathematik: dargestellt und durch Anwendungen auf die übrigen Zweige der Mathematik, wie auch auf die Statik, Mechanik, die Lehre vom Magnetismus und die Krystallonomie erläutert, O. Wigand, Leipzig, 1844.

Introductory textbooks
Bretscher, Otto (June 28, 2004), Linear Algebra with Applications (3rd ed.), Prentice Hall, ISBN 978-0-13-145334-0
Farin, Gerald; Hansford, Dianne (December 15, 2004), Practical Linear Algebra: A Geometry Toolbox, AK Peters, ISBN 978-1-56881-234-2
Friedberg, Stephen H.; Insel, Arnold J.; Spence, Lawrence E. (November 11, 2002), Linear Algebra (4th ed.), Prentice Hall, ISBN 978-0-13-008451-4
Hefferon, Jim (2008), Linear Algebra (http://joshua.smcvt.edu/linearalgebra/)
Anton, Howard (2005), Elementary Linear Algebra (Applications Version) (9th ed.), Wiley International
Lay, David C. (August 22, 2005), Linear Algebra and Its Applications (3rd ed.), Addison Wesley, ISBN 978-0-321-28713-7
Kolman, Bernard; Hill, David R. (May 3, 2007), Elementary Linear Algebra with Applications (9th ed.), Prentice Hall, ISBN 978-0-13-229654-0
Leon, Steven J. (2006), Linear Algebra With Applications (7th ed.), Pearson Prentice Hall, ISBN 978-0-13-185785-8
Poole, David (2010), Linear Algebra: A Modern Introduction (3rd ed.), Cengage Brooks/Cole, ISBN 978-0-538-73545-2
Ricardo, Henry (2010), A Modern Introduction To Linear Algebra (1st ed.), CRC Press, ISBN 978-1-4398-0040-9
Sadun, Lorenzo (2008), Applied Linear Algebra: the decoupling principle (2nd ed.), AMS, ISBN 978-0-8218-4441-0
Strang, Gilbert (July 19, 2005), Linear Algebra and Its Applications (4th ed.), Brooks Cole, ISBN 978-0-03-010567-8

Advanced textbooks
Axler, Sheldon (February 26, 2004), Linear Algebra Done Right (2nd ed.), Springer, ISBN 978-0-387-98258-8
Bhatia, Rajendra (November 15, 1996), Matrix Analysis, Graduate Texts in Mathematics, Springer, ISBN 978-0-387-94846-1
Demmel, James W. (August 1, 1997), Applied Numerical Linear Algebra, SIAM, ISBN 978-0-89871-389-3
Dym, Harry (2007), Linear Algebra in Action, AMS, ISBN 978-0-8218-3813-6
Gantmacher, F. R. (2005; 1959 edition), Applications of the Theory of Matrices, Dover Publications, ISBN 978-0-486-44554-0
Gantmacher, Felix R. (1990), Matrix Theory Vol. 1 (2nd ed.), American Mathematical Society, ISBN 978-0-8218-1376-8
Gantmacher, Felix R. (2000), Matrix Theory Vol. 2 (2nd ed.), American Mathematical Society, ISBN 978-0-8218-2664-5
Gelfand, I. M. (1989), Lectures on Linear Algebra, Dover Publications, ISBN 978-0-486-66082-0
Glazman, I. M.; Ljubic, Ju. I. (2006), Finite-Dimensional Linear Analysis, Dover Publications, ISBN 978-0-486-45332-3
Golan, Johnathan S. (January 2007), The Linear Algebra a Beginning Graduate Student Ought to Know (2nd ed.), Springer, ISBN 978-1-4020-5494-5
Golan, Johnathan S. (August 1995), Foundations of Linear Algebra, Kluwer, ISBN 0-7923-3614-3
Golub, Gene H.; Van Loan, Charles F. (October 15, 1996), Matrix Computations, Johns Hopkins Studies in Mathematical Sciences (3rd ed.), The Johns Hopkins University Press, ISBN 978-0-8018-5414-9
Greub, Werner H. (October 16, 1981), Linear Algebra, Graduate Texts in Mathematics (4th ed.), Springer, ISBN 978-0-8018-5414-9
Hoffman, Kenneth; Kunze, Ray (April 25, 1971), Linear Algebra (2nd ed.), Prentice Hall, ISBN 978-0-13-536797-1
Halmos, Paul R. (August 20, 1993), Finite-Dimensional Vector Spaces, Undergraduate Texts in Mathematics, Springer, ISBN 978-0-387-90093-3
Horn, Roger A.; Johnson, Charles R. (February 23, 1990), Matrix Analysis, Cambridge University Press, ISBN 978-0-521-38632-6
Horn, Roger A.; Johnson, Charles R. (June 24, 1994), Topics in Matrix Analysis, Cambridge University Press, ISBN 978-0-521-46713-1
Lang, Serge (March 9, 2004), Linear Algebra, Undergraduate Texts in Mathematics (3rd ed.), Springer, ISBN 978-0-387-96412-6
Marcus, Marvin; Minc, Henryk (2010), A Survey of Matrix Theory and Matrix Inequalities, Dover Publications, ISBN 978-0-486-67102-4
Meyer, Carl D. (February 15, 2001), Matrix Analysis and Applied Linear Algebra (http://www.matrixanalysis.com/DownloadChapters.html), Society for Industrial and Applied Mathematics (SIAM), ISBN 978-0-89871-454-8
Mirsky, L. (1990), An Introduction to Linear Algebra, Dover Publications, ISBN 978-0-486-66434-7
Roman, Steven (March 22, 2005), Advanced Linear Algebra, Graduate Texts in Mathematics (2nd ed.), Springer, ISBN 978-0-387-24766-3
Shafarevich, I. R.; A. O. Remizov (2012), Linear Algebra and Geometry (http://www.springer.com/mathematics/algebra/book/978-3-642-30993-9), Springer, ISBN 978-3-642-30993-9
Shilov, Georgi E. (June 1, 1977), Linear algebra, Dover Publications, ISBN 978-0-486-63518-7
Shores, Thomas S. (December 6, 2006), Applied Linear Algebra and Matrix Analysis, Undergraduate Texts in Mathematics, Springer, ISBN 978-0-387-33194-2
Smith, Larry (May 28, 1998), Linear Algebra, Undergraduate Texts in Mathematics, Springer, ISBN 978-0-387-98455-1

Study guides and outlines
Leduc, Steven A. (May 1, 1996), Linear Algebra (Cliffs Quick Review), Cliffs Notes, ISBN 978-0-8220-5331-6
Lipschutz, Seymour; Lipson, Marc (December 6, 2000), Schaum's Outline of Linear Algebra (3rd ed.), McGraw-Hill, ISBN 978-0-07-136200-9
Lipschutz, Seymour (January 1, 1989), 3,000 Solved Problems in Linear Algebra, McGraw-Hill, ISBN 978-0-07-038023-3
McMahon, David (October 28, 2005), Linear Algebra Demystified, McGraw-Hill Professional, ISBN 978-0-07-146579-3
Zhang, Fuzhen (April 7, 2009), Linear Algebra: Challenging Problems for Students, The Johns Hopkins University Press, ISBN 978-0-8018-9125-0
External links
International Linear Algebra Society (http://www.math.technion.ac.il/iic/)
MIT Professor Gilbert Strang's Linear Algebra Course Homepage (http://web.mit.edu/18.06/www): MIT course website
MIT Linear Algebra Lectures (http://ocw.mit.edu/OcwWeb/Mathematics/18-06Spring-2005/VideoLectures/index.htm): free videos from MIT OpenCourseWare
Linear Algebra Toolkit (http://www.math.odu.edu/~bogacki/lat/)
Hazewinkel, Michiel, ed. (2001), "Linear algebra" (http://www.encyclopediaofmath.org/index.php?title=p/l059040), Encyclopedia of Mathematics, Springer, ISBN 978-1-55608-010-4
Linear Algebra (http://mathworld.wolfram.com/topics/LinearAlgebra.html) on MathWorld
Linear Algebra tutorial (http://people.revoledu.com/kardi/tutorial/LinearAlgebra/index.html) with online interactive programs
Matrix and Linear Algebra Terms (http://www.economics.soton.ac.uk/staff/aldrich/matrices.htm) on Earliest Known Uses of Some of the Words of Mathematics (http://jeff560.tripod.com/mathword.html)
Earliest Uses of Symbols for Matrices and Vectors (http://jeff560.tripod.com/matrices.html) on Earliest Uses of Various Mathematical Symbols (http://jeff560.tripod.com/mathsym.html)
Linear Algebra (http://www.egwald.ca/linearalgebra/index.php) by Elmer G. Wiens: interactive web pages for vectors, matrices, linear equations, etc.
Linear Algebra Solved Problems (http://www.mathlinks.ro/Forum/index.php?f=346): interactive forums for discussion of linear algebra problems, from the lowest up to the hardest level (Putnam)
Linear Algebra for Informatics (http://xmlearning.maths.ed.ac.uk/): José Figueroa-O'Farrill, University of Edinburgh
Online Notes / Linear Algebra (http://tutorial.math.lamar.edu/classes/linalg/linalg.aspx): Paul Dawkins, Lamar University
Elementary Linear Algebra textbook with solutions (http://www.numbertheory.org/book/)
Linear Algebra Wiki (http://www.linearalgebrawiki.org/)
Linear algebra (math 21b) homework and exercises (http://www.courses.fas.harvard.edu/~math21b/)
Textbook and solutions manual (http://www.saylor.org/courses/ma211/), Saylor Foundation
An Intuitive Guide to Linear Algebra (http://betterexplained.com/articles/linear-algebra-guide/) on BetterExplained
Online books
Beezer, Rob, A First Course in Linear Algebra (http://linear.ups.edu/index.html)
Connell, Edwin H., Elements of Abstract and Linear Algebra (http://www.math.miami.edu/~ec/book/)
Hefferon, Jim, Linear Algebra (http://joshua.smcvt.edu/linalg.html/)
Matthews, Keith, Elementary Linear Algebra (http://www.numbertheory.org/book/)
Sharipov, Ruslan, Course of linear algebra and multidimensional geometry (http://arxiv.org/abs/math.HO/0405323)
Treil, Sergei, Linear Algebra Done Wrong (http://www.math.brown.edu/~treil/papers/LADW/LADW.html)
License
Creative Commons Attribution-Share Alike 3.0 Unported //creativecommons.org/licenses/by-sa/3.0/