You are working with the text-only light edition of "H.Lohninger: Teach/Me Data Analysis, Springer-Verlag, Berlin-New York-Tokyo, 1999. ISBN 3-540-14743-8".

Mathematical Details

The mean and variance of a sum of independent random variables can be derived from the general properties of the expectation operator. The expectation (or mean) of a discrete random variable x is

    E[x] = Σi xi·P(xi)

where P(xi) is the probability mass function of x (for a continuous variable, the sum is replaced by an integral over the probability density function).
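As a sketch, the discrete expectation formula can be evaluated directly. The fair six-sided die below is a hypothetical example, not part of the original text; exact fractions are used so the result carries no rounding error.

```python
from fractions import Fraction

# Hypothetical example: expectation of a fair six-sided die,
# E[x] = sum over i of x_i * P(x_i).
values = [1, 2, 3, 4, 5, 6]
p = [Fraction(1, 6)] * 6  # probability mass function P(x_i)

mean = sum(x * px for x, px in zip(values, p))
print(float(mean))  # → 3.5
```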

The expectation is a linear operator, so it has the following properties:

    E[a·x + b] = a·E[x] + b
    E[x + y] = E[x] + E[y]
    Var(x) = E[(x − E[x])²]
    Var(a·x) = a²·Var(x)
So, when z is the sum of K independent random variables,

    z = x1 + x2 + … + xK,

the mean and the variance of z are:

    E[z] = E[x1] + E[x2] + … + E[xK]
    Var(z) = Var(x1) + Var(x2) + … + Var(xK)

(The variances add because the covariances between independent variables are zero.)
So the mean of a sum of independent random variables is the sum of their means, and its variance is the sum of their variances.
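These additivity rules can be checked empirically. The sketch below sums two independent Gaussian variables (distribution parameters are arbitrary, chosen for illustration) and compares the sample mean and variance of the sum against the sums of the individual means and variances.

```python
import random
import statistics

# Two independent Gaussian samples (parameters chosen arbitrarily):
# x has mean 2 and variance 1, y has mean -1 and variance 4.
random.seed(42)
N = 100_000
x = [random.gauss(2.0, 1.0) for _ in range(N)]
y = [random.gauss(-1.0, 2.0) for _ in range(N)]
z = [a + b for a, b in zip(x, y)]  # z = x + y

# Mean of the sum should come out near 2 + (-1) = 1,
# variance of the sum near 1 + 4 = 5.
print(statistics.fmean(z))
print(statistics.pvariance(z))
```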

In the presence of additive noise, a raw signal S can be described by S = S0 + E, where S0 is the noise-free signal and E the random error. When we average n such signals we obtain

    S̄ = (1/n)·(S1 + S2 + … + Sn) = S0 + (1/n)·(E1 + E2 + … + En)

and its variance will be

    Var(S̄) = (1/n²)·[Var(E1) + Var(E2) + … + Var(En)]
since the variance is determined only by the variance of the noise and not by the signal S0, whose variance is zero by definition.

Under the assumption that the variance of each individual error is the same, Var(Ei) = σ² for all i (quite reasonable, since the measurement process should not change between measurements), we obtain

    Var(S̄) = (1/n²)·n·σ² = σ²/n
i.e. the variance of the averaged signal decreases in proportion to the number n of averaged signals, and its standard deviation decreases by a factor of the square root of n.
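The square-root-of-n reduction in noise can be illustrated with a small simulation. The signal value, noise level, and sample counts below are hypothetical; each trial averages n noisy copies of the same signal, and the spread of those averages is then measured.

```python
import random
import statistics

# Average n noisy copies of the same signal S0 and estimate the
# standard deviation of the averaged signal over many trials.
random.seed(0)
S0, sigma = 10.0, 2.0   # noise-free signal and noise std. dev. (hypothetical)
n, trials = 16, 20_000  # signals per average, number of repetitions

averages = []
for _ in range(trials):
    signals = [S0 + random.gauss(0.0, sigma) for _ in range(n)]
    averages.append(statistics.fmean(signals))

sd = statistics.pstdev(averages)
print(sd)  # should come out near sigma / n**0.5 = 0.5
```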

Last Update: 2004-Jul-03