This text is from the text-only light edition of "H. Lohninger: Teach/Me Data Analysis, Springer-Verlag, Berlin-New York-Tokyo, 1999. ISBN 3-540-14743-8".
Regression: Straight Line
See also: general approach, assumptions, residuals, coefficient of variation, regression - confidence interval
For a particular value Xi of the independent variable, we can find the predicted value Ŷi by using the equation of a straight line:

    Ŷi = a + b·Xi

The difference ei = Yi − Ŷi is called the residual and represents the error made when predicting Yi as the response to the variable X. The best fit for all available points (Xi, Yi) can be found by minimizing the sum of all squared residuals ei.
In fact, the criterion for minimizing the error could be any other suitable function; however, the sum of squares has certain mathematical advantages.
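The least-squares fit described above can be sketched in a few lines of Python. This is a minimal illustration, not code from Teach/Me: the closed-form formulas for the slope b and intercept a are the standard ordinary-least-squares estimates, and all function and variable names are made up for this example.

```python
def fit_line(xs, ys):
    """Fit y = a + b*x by minimizing the sum of squared residuals."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Closed-form OLS slope: b = S_xy / S_xx
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sxx = sum((x - mean_x) ** 2 for x in xs)
    b = sxy / sxx
    # The fitted line passes through the point of means (mean_x, mean_y)
    a = mean_y - b * mean_x
    return a, b

def sum_squared_residuals(xs, ys, a, b):
    """Sum of e_i^2, where e_i = y_i - (a + b*x_i)."""
    return sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))

# Small illustrative data set (hypothetical values)
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 7.8]
a, b = fit_line(xs, ys)
sse = sum_squared_residuals(xs, ys, a, b)
```

Perturbing a or b away from the fitted values always increases the sum of squared residuals, which is exactly the minimization criterion stated above.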
A more detailed discussion of the mathematics behind regression can be found elsewhere.
Please note that linear regression is based on several assumptions, which must be fulfilled when applying regression methods to data.
Last Update: 2005-Jul-16