
Curve fitting: Regression

Dr. Sheikh Md. Rabiul Islam


Dept. of ECE
KUET
What is regression analysis?
The regression equation is the formula for the trend (fit) line, which enables us to predict the dependent variable for any given value of the independent variable.
The regression equation has two parts: the intercept and the slope.
The intercept is the point where the regression line crosses the vertical axis. It generally does not provide useful information.
The slope is the change in the dependent variable for a one-unit change in the independent variable. The slope tells us the direction and magnitude of the change.
What is regression analysis?
Suppose the values of y for different values of x are given:
y = f(x)
where y is called the dependent variable and x the independent variable.
What is regression analysis?
The first order linear model:
y = b0 + b1·x + e
y = dependent variable
x = independent variable
b0 = y-intercept
b1 = slope of the line (Rise/Run)
e = error variable
b0 and b1 are unknown population parameters and are therefore estimated from the data.
Notation
Given the data points (x1, y1), (x2, y2), ..., (xn, yn), draw the line ŷ = b0 + b1·x through the scatterplot. The point on the line corresponding to xi is
ŷi = b0 + b1·xi
ŷi is the value of y predicted by the line ŷ = b0 + b1·x when x = xi;
yi is the observed value of y when x = xi.
Observed y, Predicted y
[Figure: scatterplot of FUEL CONSUMPTION vs CAR WEIGHT. At x = 2.7 the fitted line gives the predicted value ŷ = a + b·2.7, while the data point (2.7, 3.6) gives the observed value y = 3.6.]
Residuals: graphically
[Figure: graphical display of residuals. Points above the fitted line have positive residuals and points below have negative residuals; at each Xi the residual is ei = Yi − Ŷi.]
Fitting linear equation:

ŷ = b0 + b1·x = f(x)
ei = yi − ŷi
Three possible criteria for choosing the line:
1. Minimise the sum of the errors: Σ ei
2. Minimise the sum of the absolute values of the errors: Σ |ei|
3. Minimise the sum of the squares of the errors: Σ ei²
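The difference between the three criteria can be seen numerically. As a sketch (the data values below are assumed purely for illustration), criterion 1 can report essentially zero error for a line that misses every point, because positive and negative errors cancel:

```python
# Sketch comparing the three error criteria for a candidate line y = 2x.
# The data values below are assumed purely for illustration.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 7.8]

def errors(b0, b1):
    # residuals ei = yi - (b0 + b1*xi)
    return [y - (b0 + b1 * x) for x, y in zip(xs, ys)]

e = errors(0.0, 2.0)
sum_e   = sum(e)                  # criterion 1: positive and negative errors cancel
sum_abs = sum(abs(v) for v in e)  # criterion 2: no cancellation, but not differentiable
sum_sq  = sum(v * v for v in e)   # criterion 3: least squares, smooth in b0 and b1
```

Here sum_e is essentially zero even though the line misses every point, while the absolute and squared criteria both register the misfit; criterion 3 is preferred because it is also differentiable in b0 and b1.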
Linear Least Squares
It is the line of best fit for a group of points.
It seeks to minimise, over all data points, the sum of the squared differences between the function value and the data value.
It is the earliest form of linear regression.
Linear Least Squares
Consider an arbitrary straight line, ŷ = b0 + b1·x, to be fitted through the data points. The question is: which line is the most representative?
[Figure: scatterplot with the line ŷ = b0 + b1·x, intercept b0 and slope b1; at each xi the error (residual) is ei = yi − ŷi, the gap between the observed yi and the predicted ŷi.]
Linear Least Squares

Q = Σ ei² = Σ (yi − b0 − b1·xi)²
To minimise Q, set
∂Q/∂b0 = 0 and ∂Q/∂b1 = 0
∂Q/∂b0 = −2 Σ (yi − b0 − b1·xi) = 0  ⟹  Σ yi = n·b0 + b1 Σ xi
∂Q/∂b1 = −2 Σ xi (yi − b0 − b1·xi) = 0  ⟹  Σ xi·yi = b0 Σ xi + b1 Σ xi²
Solving these two normal equations:
b1 = (n Σ xi·yi − Σ xi Σ yi) / (n Σ xi² − (Σ xi)²)
b0 = ȳ − b1·x̄
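The closed-form solution above can be sketched directly in Python (the data points here are assumed for illustration):

```python
# Minimal sketch of the normal-equation solution for y = b0 + b1*x.
def linear_least_squares(xs, ys):
    n = len(xs)
    sx  = sum(xs)
    sy  = sum(ys)
    sxy = sum(x * y for x, y in zip(xs, ys))
    sxx = sum(x * x for x in xs)
    b1 = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope
    b0 = (sy - b1 * sx) / n                         # intercept: ybar - b1*xbar
    return b0, b1

# assumed data lying exactly on y = 2x; the fit should recover b0 = 0, b1 = 2
b0, b1 = linear_least_squares([1.0, 2.0, 3.0, 4.0], [2.0, 4.0, 6.0, 8.0])
```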
Fitting transcendental Equations
Example: the familiar equation for population growth is given by
p = p0·e^(kt)
Example: a nonlinear model is the gas law relating pressure and volume, given by
p·V^γ = C
Using least squares:
Q = Σ (yi − f(xi))²
Fitting transcendental Equations
To minimise Q:
∂Q/∂a = 0 and ∂Q/∂b = 0
∂Q/∂a = −2 Σ (yi − f(xi)) · ∂f(xi)/∂a = 0
∂Q/∂b = −2 Σ (yi − f(xi)) · ∂f(xi)/∂b = 0
In general these equations are nonlinear in the parameters a and b, so the model is first linearised, typically by taking logarithms.
Fitting transcendental Equations

Example: fit the exponential model
y = a·e^(bx)
Taking logarithms of both sides:
ln y = ln a + b·x
Let z = ln y and A = ln a, so that z = A + b·x is linear in A and b. Applying linear least squares to the points (xi, zi), with zi = ln yi:
b = (n Σ xi·zi − Σ xi Σ zi) / (n Σ xi² − (Σ xi)²)
A = z̄ − b·x̄,  a = e^A
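The log-linearisation above can be sketched as follows (the data here are assumed, generated from a known exponential so the fit should recover its parameters):

```python
import math

# Sketch: fitting y = a*exp(b*x) by log-linearisation, z = ln y = ln a + b*x.
def fit_exponential(xs, ys):
    zs = [math.log(y) for y in ys]   # transform to the linear problem in (x, z)
    n = len(xs)
    sx, sz = sum(xs), sum(zs)
    sxz = sum(x * z for x, z in zip(xs, zs))
    sxx = sum(x * x for x in xs)
    b = (n * sxz - sx * sz) / (n * sxx - sx * sx)
    ln_a = (sz - b * sx) / n         # A = ln a, from the intercept
    return math.exp(ln_a), b

# assumed data generated from y = 3*exp(0.5*x)
xs = [0.0, 1.0, 2.0, 3.0]
ys = [3.0 * math.exp(0.5 * x) for x in xs]
a, b = fit_exponential(xs, ys)
```

Note that this minimises the squared error in ln y rather than in y itself, which weights small y values more heavily; for noisy data the two fits differ slightly.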
Fitting a Polynomial Function
When a given set of data does not appear to satisfy a linear equation, we can try a suitable polynomial as the regression curve to fit the data. The least squares technique can again be used.
Consider a polynomial of degree m − 1:
y = a1 + a2·x + a3·x² + ... + am·x^(m−1) = f(x)
Using least squares:
Q = Σ (yi − f(xi))²
Fitting a Polynomial Function
Consider the general term. Differentiating Q with respect to each coefficient aj and setting the result to zero:
∂Q/∂aj = Σ 2 (yi − f(xi)) · (−xi^(j−1)) = 0,  j = 1, 2, ..., m
which gives
Σ (yi − (a1 + a2·xi + a3·xi² + ... + am·xi^(m−1))) · xi^(j−1) = 0
Fitting a Polynomial Function

Rearranging gives the normal equations:
a1 Σ xi^(j−1) + a2 Σ xi^j + a3 Σ xi^(j+1) + ... + am Σ xi^(j+m−2) = Σ yi·xi^(j−1),  j = 1, 2, ..., m
For j = 1:
a1·n + a2 Σ xi + a3 Σ xi² + ... + am Σ xi^(m−1) = Σ yi
For j = 2:
a1 Σ xi + a2 Σ xi² + a3 Σ xi³ + ... + am Σ xi^m = Σ xi·yi
...
For j = m:
a1 Σ xi^(m−1) + a2 Σ xi^m + a3 Σ xi^(m+1) + ... + am Σ xi^(2m−2) = Σ yi·xi^(m−1)
Fitting a Polynomial Function
Matrix form: C·A = B
where A = (a1, a2, ..., am)ᵀ and the elements of C and B are
C(j, k) = Σ xi^(j+k−2),  j = 1, 2, ..., m;  k = 1, 2, ..., m
B(j) = Σ yi·xi^(j−1),  j = 1, 2, ..., m
The first row of C is (n, Σ xi, Σ xi², ..., Σ xi^(m−1)) and the last row is (Σ xi^(m−1), Σ xi^m, ..., Σ xi^(2m−2)).
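Building and solving C·A = B can be sketched as below; numpy is assumed here only for the linear solve, and the data are assumed values lying exactly on a known quadratic:

```python
# Sketch of the normal equations C*A = B for a polynomial of degree m-1.
import numpy as np

def polyfit_normal_equations(xs, ys, m):
    # C[j][k] = sum_i xi^(j+k), B[j] = sum_i yi*xi^j  (0-based j, k)
    C = np.array([[sum(x ** (j + k) for x in xs) for k in range(m)]
                  for j in range(m)], dtype=float)
    B = np.array([sum(y * x ** j for x, y in zip(xs, ys)) for j in range(m)],
                 dtype=float)
    return np.linalg.solve(C, B)  # coefficients a1..am, constant term first

# assumed data lying exactly on y = 1 + x^2; expect coefficients (1, 0, 1)
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 2.0, 5.0, 10.0]
a = polyfit_normal_equations(xs, ys, 3)
```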
Multiple Linear Regression

Consider a two-variable linear function:
z = a1 + a2·x + a3·y
Q = Σ (zi − a1 − a2·xi − a3·yi)²
Differentiating with respect to a1, a2 and a3:
∂Q/∂a1 = −2 Σ (zi − a1 − a2·xi − a3·yi)
∂Q/∂a2 = −2 Σ xi (zi − a1 − a2·xi − a3·yi)
∂Q/∂a3 = −2 Σ yi (zi − a1 − a2·xi − a3·yi)
Multiple Linear Regression
Setting these partial derivatives equal to zero gives the normal equations:
n·a1 + a2 Σ xi + a3 Σ yi = Σ zi
a1 Σ xi + a2 Σ xi² + a3 Σ xi·yi = Σ xi·zi
a1 Σ yi + a2 Σ xi·yi + a3 Σ yi² = Σ yi·zi
In matrix form:
| n      Σ xi      Σ yi     |   | a1 |   | Σ zi    |
| Σ xi   Σ xi²     Σ xi·yi  | · | a2 | = | Σ xi·zi |
| Σ yi   Σ xi·yi   Σ yi²    |   | a3 |   | Σ yi·zi |
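The 3×3 system above can be assembled and solved as a short sketch; numpy is assumed only for the linear solve, and the data are assumed values generated from a known plane so the fit should recover its coefficients:

```python
# Sketch: solving the 3x3 normal equations for z = a1 + a2*x + a3*y.
import numpy as np

def fit_plane(xs, ys, zs):
    n = len(xs)
    sx, sy, sz = sum(xs), sum(ys), sum(zs)
    sxx = sum(x * x for x in xs)
    syy = sum(y * y for y in ys)
    sxy = sum(x * y for x, y in zip(xs, ys))
    sxz = sum(x * z for x, z in zip(xs, zs))
    syz = sum(y * z for y, z in zip(ys, zs))
    C = np.array([[n,  sx,  sy ],
                  [sx, sxx, sxy],
                  [sy, sxy, syy]], dtype=float)
    B = np.array([sz, sxz, syz], dtype=float)
    return np.linalg.solve(C, B)  # a1, a2, a3

# assumed data generated from z = 1 + 2x + 3y
xs = [0.0, 1.0, 0.0, 1.0, 2.0]
ys = [0.0, 0.0, 1.0, 1.0, 2.0]
zs = [1.0 + 2 * x + 3 * y for x, y in zip(xs, ys)]
a1, a2, a3 = fit_plane(xs, ys, zs)
```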
What is ill-conditioning in least squares methods?
Practice all the methods with examples and exercises.
