Simple Linear Regression Model

y = b0 + b1x + e

y = dependent variable
x = independent variable
b0 = y-intercept
b1 = slope of the line (Rise/Run)
e = error variable

b0 and b1 are unknown population parameters and are therefore estimated from the data.

[Figure: a line through the scatterplot with intercept b0 and slope b1 = Rise/Run.]
Notation

Given the data points (x1, y1), (x2, y2), . . . , (xn, yn), draw the line ŷ = b0 + b1x through the scatterplot. The point on the line corresponding to xi is

ŷi = b0 + b1xi

ŷi is the value of y predicted by the line ŷ = b0 + b1x when x = xi; yi is the observed value of y when x = xi.
Observed y, Predicted y

[Figure: scatterplot of FUEL CONSUMPTION vs CAR WEIGHT with the fitted line ŷ = a + bx. At car weight x = 2.7 the line gives the predicted value ŷ = a + b(2.7), while the data point (2.7, 3.6) shows the observed value y = 3.6.]
Residuals: graphically

[Figure: scatterplot with the fitted line ŷ = b0 + b1x; points above the line have positive residuals, points below have negative residuals.]

Fitting the linear equation ŷ = b0 + b1x = f(x), the error (residual) at the i-th observation is

ei = yi − ŷi

where ŷi is the value predicted by the fitted line at xi and yi is the observed value.
Linear Least Squares

Choose b0 and b1 to minimise the sum of squared residuals (all sums run over i = 1, …, n):

Q = Σei² = Σ(yi − b0 − b1xi)²

Setting ∂Q/∂b0 = 0 and ∂Q/∂b1 = 0:

∂Q/∂b0 = −2Σ(yi − b0 − b1xi) = 0  ⟹  Σyi = nb0 + b1Σxi
∂Q/∂b1 = −2Σxi(yi − b0 − b1xi) = 0  ⟹  Σxiyi = b0Σxi + b1Σxi²

Solving these normal equations:

b1 = (nΣxiyi − ΣxiΣyi) / (nΣxi² − (Σxi)²),    b0 = ȳ − b1x̄

where x̄ = Σxi/n and ȳ = Σyi/n.
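The normal-equation solution can be sketched in a few lines of pure Python. This is an illustrative version, not code from the notes; the function name `fit_line` and the sample data are my own.

```python
def fit_line(x, y):
    """Least-squares fit of y = b0 + b1*x via the normal equations:
    b1 = (n*Sxy - Sx*Sy) / (n*Sxx - Sx^2), b0 = ybar - b1*xbar."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(xi * xi for xi in x)
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    b1 = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b0 = sy / n - b1 * sx / n
    return b0, b1

# Noise-free data on the line y = 1 + 2x is recovered exactly.
b0, b1 = fit_line([1, 2, 3, 4], [3, 5, 7, 9])
```

With noisy data the same formulas return the line that minimises Q over all choices of b0 and b1.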
Fitting Transcendental Equations

Example: the familiar equation for population growth is given by

N = N0 e^(kt)

where N0 and k are the parameters to be estimated. Using least squares,

Q = Σ(Ni − N0 e^(kti))²

To minimise Q, set

∂Q/∂N0 = 0 and ∂Q/∂k = 0

∂Q/∂N0 = −2Σe^(kti)(Ni − N0 e^(kti)) = 0  ⟹  N0 = Σ Ni e^(kti) / Σ e^(2kti)

∂Q/∂k = −2N0 Σ ti e^(kti)(Ni − N0 e^(kti)) = 0

These equations are nonlinear in k, so after eliminating N0 the remaining equation in k must be solved numerically.
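One way to solve the remaining equation in k numerically is bisection. The sketch below is my own illustration (the slides do not prescribe a root-finding method, and the bracket [k_lo, k_hi] is an assumption the caller must supply):

```python
import math

def fit_exponential(t, N, k_lo=0.0, k_hi=2.0, iters=60):
    """Least-squares fit of N = N0 * exp(k*t).

    dQ/dN0 = 0 gives N0(k) = sum(Ni*e^(k*ti)) / sum(e^(2*k*ti));
    substituting into dQ/dk = 0 leaves one equation g(k) = 0 in k,
    solved here by bisection (assumes g changes sign on [k_lo, k_hi]).
    """
    def N0_of(k):
        return (sum(Ni * math.exp(k * ti) for ti, Ni in zip(t, N))
                / sum(math.exp(2 * k * ti) for ti in t))

    def g(k):
        N0 = N0_of(k)
        return sum(ti * math.exp(k * ti) * (Ni - N0 * math.exp(k * ti))
                   for ti, Ni in zip(t, N))

    for _ in range(iters):  # halve the bracket until it is tiny
        mid = 0.5 * (k_lo + k_hi)
        if g(k_lo) * g(mid) <= 0:
            k_hi = mid
        else:
            k_lo = mid
    k = 0.5 * (k_lo + k_hi)
    return N0_of(k), k

# Noise-free data generated from N = 2*exp(0.5*t).
ts = [0.0, 1.0, 2.0, 3.0]
N0, k = fit_exponential(ts, [2.0 * math.exp(0.5 * ti) for ti in ts])
```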
Fitting Transcendental Equations

Example: alternatively, the same model can be linearised by taking logarithms:

N = N0 e^(kt)
ln N = ln N0 + kt

Let Y = ln N and A = ln N0; then Y = A + kt is a straight line, and minimising

Q = Σ(Yi − A − kti)²

gives the usual linear least-squares solution

k = (nΣtiYi − ΣtiΣYi) / (nΣti² − (Σti)²),    A = Ȳ − kt̄,    N0 = e^A

Note that this minimises the squared errors in ln N rather than in N itself.
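The linearised fit reduces to the straight-line formulas already derived. A minimal sketch (names `fit_exp_linearised` and the sample data are my own):

```python
import math

def fit_exp_linearised(t, N):
    """Fit N = N0*exp(k*t) by fitting the straight line Y = A + k*t
    to Y = ln N, then recovering N0 = e^A."""
    Y = [math.log(Ni) for Ni in N]
    n = len(t)
    st, sY = sum(t), sum(Y)
    stt = sum(ti * ti for ti in t)
    stY = sum(ti * Yi for ti, Yi in zip(t, Y))
    k = (n * stY - st * sY) / (n * stt - st * st)
    A = sY / n - k * st / n
    return math.exp(A), k

# Same noise-free data as before: N = 2*exp(0.5*t).
ts = [0.0, 1.0, 2.0, 3.0]
N0, k = fit_exp_linearised(ts, [2.0 * math.exp(0.5 * ti) for ti in ts])
```

On noise-free data both approaches agree; with noise, the linearised fit generally gives slightly different parameters because it weights errors in ln N, not in N.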
Fitting a Polynomial Function

When a given set of data does not appear to satisfy a linear equation, we can try a suitable polynomial as a regression curve to fit the data. The least-squares technique can again be used:

y = a1 + a2x + a3x² + … + a(m+1)x^m = f(x)

Using least squares,

Q = Σ(yi − f(xi))²
Consider the general term: write f(x) = Σk a(k) x^(k−1), with k running from 1 to m+1, so that

Q = Σi (yi − Σk a(k) xi^(k−1))²

Differentiating with respect to a(j) and setting the derivative to zero gives

∂Q/∂a(j) = −2 Σi (yi − Σk a(k) xi^(k−1)) xi^(j−1) = 0,    j = 1, 2, …, m+1
Rearranging, the j-th normal equation is

a1Σxi^(j−1) + a2Σxi^j + a3Σxi^(j+1) + … + a(m+1)Σxi^(j+m−1) = Σ yi xi^(j−1),    j = 1, 2, …, m+1

Written out, starting from j = 1:

na1 + a2Σxi + a3Σxi² + … + a(m+1)Σxi^m = Σyi
a1Σxi + a2Σxi² + a3Σxi³ + … + a(m+1)Σxi^(m+1) = Σxiyi
⋮
a1Σxi^m + a2Σxi^(m+1) + a3Σxi^(m+2) + … + a(m+1)Σxi^(2m) = Σxi^m yi
In matrix form: A a = B, where

A(j, k) = Σi xi^(j+k−2),    j, k = 1, 2, …, m+1
B(j) = Σi yi xi^(j−1),    j = 1, 2, …, m+1

and a = (a1, a2, …, a(m+1))ᵀ. The first row of A is (n, Σxi, …, Σxi^m), the last row is (Σxi^m, Σxi^(m+1), …, Σxi^(2m)), and B = (Σyi, Σxiyi, …, Σxi^m yi)ᵀ.
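The system A a = B can be formed and solved directly. This sketch uses Gaussian elimination with partial pivoting as the solver; that choice, the function name `fit_polynomial`, and the sample data are my own, not from the notes.

```python
def fit_polynomial(x, y, m):
    """Fit y = a1 + a2*x + ... + a(m+1)*x^m by forming the normal
    equations A a = B, where A[j][k] = sum(xi^(j+k)) and
    B[j] = sum(yi * xi^j) (0-based j, k), then solving with
    Gaussian elimination and partial pivoting."""
    sz = m + 1
    A = [[sum(xi ** (j + k) for xi in x) for k in range(sz)] for j in range(sz)]
    B = [sum(yi * xi ** j for xi, yi in zip(x, y)) for j in range(sz)]
    for col in range(sz):                      # forward elimination
        piv = max(range(col, sz), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        B[col], B[piv] = B[piv], B[col]
        for r in range(col + 1, sz):
            f = A[r][col] / A[col][col]
            for c in range(col, sz):
                A[r][c] -= f * A[col][c]
            B[r] -= f * B[col]
    a = [0.0] * sz                             # back substitution
    for j in range(sz - 1, -1, -1):
        a[j] = (B[j] - sum(A[j][c] * a[c] for c in range(j + 1, sz))) / A[j][j]
    return a

# Quadratic y = 1 + 2x + 3x^2 sampled at five points.
coeffs = fit_polynomial([0, 1, 2, 3, 4], [1, 6, 17, 34, 57], 2)
```

For large m this matrix becomes badly ill-conditioned (see the closing question below), which is why production libraries solve the fitting problem with orthogonal methods instead of the raw normal equations.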
Multiple Linear Regression

Fit the plane z = a1 + a2x + a3y to the data (xi, yi, zi), i = 1, …, n:

Q = Σ(zi − a1 − a2xi − a3yi)²

Differentiating w.r.t. a1, a2 and a3:

∂Q/∂a1 = −2Σ(zi − a1 − a2xi − a3yi)
∂Q/∂a2 = −2Σxi(zi − a1 − a2xi − a3yi)
∂Q/∂a3 = −2Σyi(zi − a1 − a2xi − a3yi)
Setting these partial derivatives equal to zero gives the normal equations

na1 + a2Σxi + a3Σyi = Σzi
a1Σxi + a2Σxi² + a3Σxiyi = Σxizi
a1Σyi + a2Σxiyi + a3Σyi² = Σyizi

or, in matrix form,

| n     Σxi     Σyi   | |a1|   | Σzi   |
| Σxi   Σxi²    Σxiyi | |a2| = | Σxizi |
| Σyi   Σxiyi   Σyi²  | |a3|   | Σyizi |
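Since the system is only 3×3, Cramer's rule is a simple way to solve it. This sketch is my own illustration (the notes do not specify a solver; `fit_plane` and the sample data are assumptions):

```python
def fit_plane(x, y, z):
    """Fit z = a1 + a2*x + a3*y by solving the 3x3 normal equations
    with Cramer's rule."""
    n = len(x)
    sx, sy, sz = sum(x), sum(y), sum(z)
    sxx = sum(xi * xi for xi in x)
    syy = sum(yi * yi for yi in y)
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    sxz = sum(xi * zi for xi, zi in zip(x, z))
    syz = sum(yi * zi for yi, zi in zip(y, z))
    A = [[n, sx, sy], [sx, sxx, sxy], [sy, sxy, syy]]
    B = [sz, sxz, syz]

    def det3(M):  # determinant of a 3x3 matrix, expanded along row 0
        return (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
                - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
                + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))

    D = det3(A)
    a = []
    for col in range(3):  # replace column `col` of A with B
        M = [row[:] for row in A]
        for r in range(3):
            M[r][col] = B[r]
        a.append(det3(M) / D)
    return a

# Noise-free data on the plane z = 1 + 2x + 3y.
xs, ys = [0, 1, 0, 1, 2], [0, 0, 1, 1, 1]
a1, a2, a3 = fit_plane(xs, ys, [1, 3, 4, 6, 8])
```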
What is ill-conditioning in least-squares methods?
Practice all the methods with examples and exercises.