You are working with the text-only light edition of "H.Lohninger: Teach/Me Data Analysis, Springer-Verlag, Berlin-New York-Tokyo, 1999. ISBN 3-540-14743-8".
See also: rank of a matrix
A given set of k vectors a_j is called linearly independent if the equation s_1a_1 + s_2a_2 + ... + s_ka_k = 0 (the zero vector) has no solution other than the trivial one (all scalars s_j equal to zero). If a solution exists in which at least one scalar s_j is different from zero, the set of vectors is called linearly dependent.
Linear independence is important for many aspects of data analysis. A general rule is that a set of n vectors of order m must be linearly dependent whenever n is greater than m, since at most m vectors of order m can be mutually independent.
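The rule above can be checked numerically. The following sketch (assuming NumPy is available; the vectors are made-up examples) stacks three vectors of order 2 into a matrix and uses its rank to test for dependence:

```python
import numpy as np

# Three vectors of order m = 2; since n = 3 > m = 2,
# they must be linearly dependent.
a1 = np.array([1.0, 2.0])
a2 = np.array([3.0, 1.0])
a3 = np.array([4.0, 4.0])

A = np.column_stack([a1, a2, a3])   # 2x3 matrix of column vectors
rank = np.linalg.matrix_rank(A)

print(rank)                 # 2
print(rank < A.shape[1])    # True: fewer independent directions than vectors
```

Because the rank (2) is smaller than the number of vectors (3), a nontrivial solution of s_1a_1 + s_2a_2 + s_3a_3 = 0 exists.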
Linear independence is closely related to the rank of a matrix. If we regard a matrix as a set of n row (or column) vectors, we immediately see that linear dependence among these row or column vectors reduces the rank of the matrix.
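To illustrate the connection to rank, here is a small sketch (again assuming NumPy; the matrix is an invented example) in which the third row is deliberately constructed as the sum of the first two:

```python
import numpy as np

# 3x3 matrix whose third row equals row1 + row2,
# so the rows are linearly dependent.
M = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 3.0],
              [1.0, 1.0, 5.0]])

print(np.linalg.matrix_rank(M))   # 2, not 3: dependence reduces the rank
```

A 3x3 matrix with independent rows would have rank 3; the built-in dependence lowers the rank to 2.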
Last Update: 2005-Jul-16