You are working with the text-only light edition of "H. Lohninger: Teach/Me Data Analysis, Springer-Verlag, Berlin-New York-Tokyo, 1999. ISBN 3-540-14743-8".
See also: Gauss-Jordan algorithm, equivalence operations
One of the big advantages of matrix algebra is that systems of linear equations can be represented as matrices. Consequently, most of the operations valid for matrices are also valid for the corresponding system of linear equations. This is quite important for multivariate statistics, because many of its methods are based on solving systems of linear equations.
Consider, for example, a system of three linear equations in the three unknowns x1, x2, and x3 (the original figures are missing from this text-only edition; a generic system is shown instead):

    a11·x1 + a12·x2 + a13·x3 = s1
    a21·x1 + a22·x2 + a23·x3 = s2
    a31·x1 + a32·x2 + a33·x3 = s3

These equations can be denoted in matrix form as follows:

    [a11 a12 a13]   [x1]   [s1]
    [a21 a22 a23] · [x2] = [s2]
    [a31 a32 a33]   [x3]   [s3]

You see that the left sides of the equations have been decomposed into a product of the matrix of the coefficients and the vector of the unknown variables x1, x2, and x3. This equation can be written in matrix notation as

    A·x = s
with A being the matrix of coefficients, x being the vector of unknowns, and s being the constant vector on the right-hand side of the equation system.
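As a sketch of how such a system A·x = s can be solved in practice, the following Python function implements the Gauss-Jordan elimination mentioned above. The function name, the example coefficients, and the pivoting threshold are illustrative choices, not part of the original text.

```python
def gauss_jordan_solve(A, s):
    """Solve the linear system A·x = s by Gauss-Jordan elimination
    with partial pivoting. A is a list of n rows, s a list of n values."""
    n = len(A)
    # Build the augmented matrix [A | s], working on a copy of the input.
    M = [row[:] + [si] for row, si in zip(A, s)]
    for col in range(n):
        # Partial pivoting: pick the row with the largest entry in this column.
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        if abs(M[pivot][col]) < 1e-12:
            raise ValueError("matrix is singular (no unique solution)")
        M[col], M[pivot] = M[pivot], M[col]
        # Normalize the pivot row so the diagonal element becomes 1.
        p = M[col][col]
        M[col] = [v / p for v in M[col]]
        # Eliminate this column from every other row (equivalence operations).
        for r in range(n):
            if r != col:
                f = M[r][col]
                M[r] = [v - f * w for v, w in zip(M[r], M[col])]
    # After reduction, the last column holds the solution vector x.
    return [row[n] for row in M]

# Example system (hypothetical coefficients):
#   2·x1 + x2 - x3 =  8
#  -3·x1 - x2 + 2·x3 = -11
#  -2·x1 + x2 + 2·x3 = -3
A = [[2.0, 1.0, -1.0], [-3.0, -1.0, 2.0], [-2.0, 1.0, 2.0]]
s = [8.0, -11.0, -3.0]
x = gauss_jordan_solve(A, s)
print(x)  # x1 = 2, x2 = 3, x3 = -1
```

Substituting the result back into the equations (e.g. 2·2 + 3 - (-1) = 8) confirms the solution.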
Last Update: 2006-Jan-17