You are working with the text-only light edition of "H.Lohninger: Teach/Me Data Analysis, Springer-Verlag, Berlin-New York-Tokyo, 1999. ISBN 3-540-14743-8".
Derivation of a Univariate Regression Formula
See also: Regression, Curvilinear Regression, Regression - Confidence Interval
Let us conduct this procedure for a particular example, the model y = ax+bx^{2}.
This formula is to be estimated from a series of n data points [x_{i},y_{i}], where the x_{i} are the independent values and the y_{i} are the values to be estimated. Substituting the y_{i} values with their estimates ax_{i}+bx_{i}^{2}, we obtain the series of data points [x_{i}, ax_{i}+bx_{i}^{2}]. The actual values, however, are the y_{i}. Thus the sum of squared errors S for n data points is defined by
S = (ax_{1}+bx_{1}^{2}-y_{1})^{2} + (ax_{2}+bx_{2}^{2}-y_{2})^{2} + (ax_{3}+bx_{3}^{2}-y_{3})^{2} + ... + (ax_{n}+bx_{n}^{2}-y_{n})^{2}
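As a concrete check, S can be evaluated directly for any trial pair of coefficients. The data values and the function name below are made up for illustration and are not part of the text; a minimal sketch:

```python
# Sum of squared errors S for the model y = a*x + b*x^2,
# evaluated at trial coefficients (illustrative data).
def sum_squared_errors(a, b, xs, ys):
    return sum((a * x + b * x**2 - y) ** 2 for x, y in zip(xs, ys))

xs = [1.0, 2.0, 3.0, 4.0]
ys = [1.1, 4.2, 8.9, 16.1]   # roughly y = x^2

print(sum_squared_errors(0.0, 1.0, xs, ys))  # small: model close to the data
print(sum_squared_errors(1.0, 0.0, xs, ys))  # large: a purely linear fit is poor
```

The least-squares coefficients are, by definition, the pair (a, b) that makes this sum as small as possible.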
Now we have to calculate the partial derivatives with respect to the parameters a and b, and equate them to zero:
dS/da = 0 = 2(ax_{1}+bx_{1}^{2}-y_{1})x_{1} + 2(ax_{2}+bx_{2}^{2}-y_{2})x_{2} + 2(ax_{3}+bx_{3}^{2}-y_{3})x_{3} + ... + 2(ax_{n}+bx_{n}^{2}-y_{n})x_{n}
dS/db = 0 = 2(ax_{1}+bx_{1}^{2}-y_{1})x_{1}^{2} + 2(ax_{2}+bx_{2}^{2}-y_{2})x_{2}^{2} + 2(ax_{3}+bx_{3}^{2}-y_{3})x_{3}^{2} + ... + 2(ax_{n}+bx_{n}^{2}-y_{n})x_{n}^{2}
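The two partial derivatives can be written down term by term and checked numerically: when the data follow the model exactly, both derivatives vanish at the true coefficients. The data and function names below are illustrative, not from the text:

```python
# Partial derivatives dS/da and dS/db of S = sum (a*x_i + b*x_i^2 - y_i)^2,
# including the factor 2 from the chain rule (illustrative data).
def dS_da(a, b, xs, ys):
    return sum(2 * (a * x + b * x**2 - y) * x for x, y in zip(xs, ys))

def dS_db(a, b, xs, ys):
    return sum(2 * (a * x + b * x**2 - y) * x**2 for x, y in zip(xs, ys))

xs = [1.0, 2.0, 3.0]
ys = [2.0, 6.0, 12.0]            # exactly y = x + x^2, so a = 1, b = 1 is optimal
print(dS_da(1.0, 1.0, xs, ys))   # 0.0 at the minimum
print(dS_db(1.0, 1.0, xs, ys))   # 0.0 at the minimum
```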
These two equations can easily be reduced by dividing by 2 and introducing the sums of the individual terms (all sums running over i = 1, ..., n):

a·Σx_{i}^{2} + b·Σx_{i}^{3} = Σx_{i}y_{i}
a·Σx_{i}^{3} + b·Σx_{i}^{4} = Σx_{i}^{2}y_{i}
Now, solve this pair of linear equations for the coefficients a and b: solve the first equation for a, substitute the resulting expression into the second, and then substitute back, with the following final results:

a = (Σx_{i}y_{i}·Σx_{i}^{4} - Σx_{i}^{2}y_{i}·Σx_{i}^{3}) / (Σx_{i}^{2}·Σx_{i}^{4} - (Σx_{i}^{3})^{2})
b = (Σx_{i}^{2}y_{i}·Σx_{i}^{2} - Σx_{i}y_{i}·Σx_{i}^{3}) / (Σx_{i}^{2}·Σx_{i}^{4} - (Σx_{i}^{3})^{2})
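The whole derivation can be verified numerically: accumulate the sums, solve the resulting pair of linear equations, and confirm that the fit recovers the coefficients used to generate the data. The data values and the function name below are illustrative, not from the text:

```python
# Fit y = a*x + b*x^2 by solving the two reduced equations
#   a*Sum(x^2) + b*Sum(x^3) = Sum(x*y)
#   a*Sum(x^3) + b*Sum(x^4) = Sum(x^2*y)
# directly (Cramer's rule); a sketch with no external libraries.
def fit_quadratic_through_origin(xs, ys):
    s2 = sum(x**2 for x in xs)
    s3 = sum(x**3 for x in xs)
    s4 = sum(x**4 for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    sx2y = sum(x**2 * y for x, y in zip(xs, ys))
    det = s2 * s4 - s3**2            # must be nonzero for a unique solution
    a = (sxy * s4 - sx2y * s3) / det
    b = (sx2y * s2 - sxy * s3) / det
    return a, b

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2 * x + 3 * x**2 for x in xs]  # exact data for a = 2, b = 3
a, b = fit_quadratic_through_origin(xs, ys)
print(a, b)                          # recovers a = 2, b = 3
```

Note that this model has no intercept term; fitting y = c + ax + bx^{2} would lead to a 3x3 system derived in the same way.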
Last Update: 2006-Jan-17