Chapter Goals
After completing this chapter, you should be able to:
- understand model building using multiple regression analysis
- apply multiple regression analysis to business decision-making situations
- analyze and interpret the computer output for a multiple regression model
- test the significance of the independent variables in a multiple regression model
Chapter Goals
(continued)
After completing this chapter, you should be able to:
- use variable transformations to model nonlinear relationships
- recognize potential problems in multiple regression analysis and take steps to correct them
- incorporate qualitative variables into the regression model by using dummy variables
Population multiple regression model:

y = β0 + β1x1 + β2x2 + ... + βkxk + ε
Estimated multiple regression model:
ŷ = estimated (or predicted) value of y; b0 = estimated intercept; b1, ..., bk = estimated slope coefficients
ŷ = b0 + b1x1 + b2x2 + ... + bkxk
Two independent variables: ŷ = b0 + b1x1 + b2x2
(Figure: with two independent variables the fitted equation is a plane over the x1 and x2 axes; b1 is the slope for variable x1 and b2 is the slope for variable x2.)
ŷ = b0 + b1x1 + b2x2
(Figure: a sample observation yi lies above or below the fitted plane at (x1i, x2i); the vertical gap is the residual e = (y - ŷ).)
The best-fit equation, ŷ, is found by minimizing the sum of squared errors, Σe^2.
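As a brief aside (not part of the original slides), the least squares fit can be computed directly; the tiny data set below is made up for illustration:

```python
import numpy as np

# Hypothetical sample: 5 observations of y with two predictors x1 and x2.
X = np.array([[1.0, 2.0, 1.0],
              [1.0, 3.0, 2.0],
              [1.0, 5.0, 2.0],
              [1.0, 7.0, 3.0],
              [1.0, 8.0, 4.0]])   # leading column of 1s carries the intercept b0
y = np.array([6.0, 9.0, 13.0, 18.0, 22.0])

# Least squares picks b to minimize the sum of squared errors sum(e_i^2)
b, *_ = np.linalg.lstsq(X, y, rcond=None)
e = y - X @ b                      # residuals e_i = y_i - yhat_i
print(b, (e ** 2).sum())
```

At the minimizing b, the residuals are orthogonal to every column of X (X'e = 0), which is the first-order condition for minimizing Σe^2.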
Model Specification
- Decide what you want to do and select the dependent variable
- Determine the potential independent variables for your model
- Gather sample data (observations) for all variables
Example
A distributor of frozen dessert pies wants to evaluate factors thought to influence demand.
- Dependent variable: Pie sales (units per week)
- Independent variables: Price (in $), Advertising ($100s)
Data is collected for 15 weeks
(Weekly data table omitted: Pie Sales, Price ($), Advertising ($100s).)
Correlation matrix:
             Pie Sales    Price      Advertising
Pie Sales      1
Price         -0.44327     1
Advertising    0.55632     0.03044    1
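A correlation matrix like the one above can be computed with NumPy. The five rows below are invented stand-ins, since the chapter's actual 15-week data set is not reproduced here:

```python
import numpy as np

# Hypothetical weekly observations: [pie sales, price ($), advertising ($100s)]
data = np.array([
    [350.0, 5.50, 3.3],
    [460.0, 7.50, 3.3],
    [350.0, 8.00, 3.0],
    [430.0, 8.00, 4.5],
    [350.0, 6.80, 3.0],
])

# np.corrcoef treats rows as variables, so transpose the (weeks x variables) array
corr = np.corrcoef(data.T)
print(np.round(corr, 3))
```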
Scatter Diagrams
(Scatter plots: Sales vs. Price and Sales vs. Advertising.)
PHStat:
PHStat / Regression / Multiple Regression
Sales = 306.526 - 24.975 (Price) + 74.131 (Advertising)
where:
- Sales is in number of pies per week
- Price is in $
- Advertising is in $100s
b1 = -24.975: sales will decrease, on average, by 24.975 pies per week for each $1 increase in selling price, net of the effects of changes due to advertising
b2 = 74.131: sales will increase, on average, by 74.131 pies per week for each $100 increase in advertising, net of the effects of changes due to price
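A small sketch of using the fitted equation for prediction; the price and advertising inputs below are hypothetical, not taken from the slides:

```python
# Fitted equation from the output above:
#   Sales = 306.526 - 24.975(Price) + 74.131(Advertising)
def predict_sales(price_dollars, advertising_hundreds):
    return 306.526 - 24.975 * price_dollars + 74.131 * advertising_hundreds

# Hypothetical week: price $5.50, advertising $350 (i.e., 3.5 hundreds)
print(round(predict_sales(5.50, 3.5), 2))   # → 428.62
```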
Predictions in PHStat
PHStat | regression | multiple regression
Predictions in PHStat
Input values
(continued)
Predicted y value
Confidence interval for the mean y value, given these xs
Prediction interval for an individual y value, given these xs
R^2 = SSR / SST = (sum of squares regression) / (total sum of squares)
(continued)
R^2 = SSR / SST = 29460.0 / 56493.3 = .52148
52.1% of the variation in pie sales is explained by the variation in price and advertising
Adjusted R2
- R2 never decreases when a new x variable is added to the model
- This can be a disadvantage when comparing models
- What is the net effect of adding a new variable? We lose a degree of freedom when a new x variable is added
- Did the new x variable add enough explanatory power to offset the loss of one degree of freedom?
Adjusted R2
(continued)
Shows the proportion of variation in y explained by all x variables adjusted for the number of x variables used
R2_A = 1 - (1 - R2) (n - 1) / (n - k - 1)
(where n = sample size, k = number of independent variables)
- Penalizes excessive use of unimportant independent variables
- Smaller than R2
- Useful in comparing models
(continued)
R2_A = .44172
44.2% of the variation in pie sales is explained by the variation in price and advertising, taking into account the sample size and number of independent variables
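Both statistics can be reproduced from the ANOVA sums of squares reported in the regression output:

```python
# Sums of squares from the ANOVA table, with n = 15 weeks and k = 2 predictors
SSR, SST = 29460.027, 56493.333
n, k = 15, 2

r2 = SSR / SST
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - k - 1)
print(round(r2, 5), round(adj_r2, 5))   # → 0.52148 0.44172
```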
ANOVA           df   SS          MS          F         Significance F
Regression       2   29460.027   14730.013   6.53861   0.01201
Residual        12   27033.306    2252.776
Total           14   56493.333

                Coefficients   Standard Error   t Stat     P-value   Lower 95%   Upper 95%
Intercept          306.52619        114.25389    2.68285   0.01993    57.58835   555.46404
Price              -24.97509         10.83213   -2.30565   0.03979   -48.57626    -1.37392
Advertising         74.13096         25.96732    2.85478   0.01449    17.55303   130.70888
Degrees of freedom for the F test: D1 = k (numerator), D2 = (n - k - 1) (denominator)
F = MSR / MSE = 14730.0 / 2252.8 = 6.5386
With 2 and 12 degrees of freedom
Test Statistic:
F = MSR / MSE = 6.5386
Critical value: F.05 = 3.885 with (2, 12) degrees of freedom. Since F = 6.5386 falls in the rejection region (F > 3.885), reject H0: the model is significant.
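A minimal check of the overall F test, using the MSR and MSE from the ANOVA table (the critical value is the one quoted on the slide):

```python
# Mean squares from the ANOVA table
MSR, MSE = 14730.013, 2252.776

F = MSR / MSE            # overall-significance test statistic
F_crit = 3.885           # F(.05) with 2 and 12 degrees of freedom (from the slide)
print(round(F, 4), F > F_crit)   # True means: reject H0
```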
(continued)
t = (bi - 0) / s_bi, with d.f. = (n - k - 1)
(continued)
t-value for Price is t = -2.306, with p-value .0398 t-value for Advertising is t = 2.855, with p-value .0145
α/2 = .025; critical values ±t(α/2) = ±2.1788 (d.f. = 12)
The test statistic for each variable falls in the rejection region (p-values < .05)
Decision: reject H0 for each variable
Conclusion: there is evidence that both Price and Advertising affect pie sales at α = .05
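The two t statistics can be reproduced from the coefficients and standard errors in the output:

```python
# (coefficient, standard error) pairs from the regression output
coeffs = {"Price": (-24.97509, 10.83213), "Advertising": (74.13096, 25.96732)}
t_crit = 2.1788          # two-tailed critical value with n - k - 1 = 12 d.f.

t_stats = {name: b / se for name, (b, se) in coeffs.items()}
for name, t in t_stats.items():
    print(name, round(t, 4), abs(t) > t_crit)   # True means: reject H0
```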
Confidence interval for a slope coefficient: bi ± t(α/2) s_bi
             Coefficients   Standard Error
Intercept      306.52619       114.25389
Price          -24.97509        10.83213
Advertising     74.13096        25.96732
Example: weekly sales are estimated to be reduced by between 1.37 and 48.58 pies for each $1 increase in the selling price.
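The quoted interval can be reproduced from the Price coefficient, its standard error, and t(.025) with 12 d.f.:

```python
# Price slope and its standard error from the regression output
b, se = -24.97509, 10.83213
t_crit = 2.1788                  # t(.025) with n - k - 1 = 12 d.f.

lower, upper = b - t_crit * se, b + t_crit * se
print(round(lower, 3), round(upper, 3))   # → -48.576 -1.374
```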
s_ε = √(SSE / (n - k - 1)) = √MSE
Is this value large or small? It must be compared to the average size of y to judge.
A rough prediction range for pie sales in a given week is ± 2(47.463) ≈ ± 94.9. Pie sales in the sample were in the 300 to 500 per week range, so this range is probably too large to be acceptable. The analyst may want to look for additional variables that can explain more of the variation in weekly sales.
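A sketch of the calculation behind that range, using the MSE from the ANOVA table:

```python
import math

# Standard error of the estimate: s_e = sqrt(SSE / (n - k - 1)) = sqrt(MSE)
MSE = 2252.776
s_e = math.sqrt(MSE)
rough_range = 2 * s_e            # rough +/- range for one week's prediction
print(round(s_e, 3), round(rough_range, 1))   # → 47.463 94.9
```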
Multicollinearity
Multicollinearity: high correlation exists between two independent variables. This means the two variables contribute redundant information to the multiple regression model.
Multicollinearity
(continued)
Including two highly correlated independent variables can adversely affect the regression results
- No new information provided
- Can lead to unstable coefficients (large standard errors and low t-values)
- Coefficient signs may not match prior expectations
VIFj = 1 / (1 - R2j)
where R2j is the coefficient of determination when the jth independent variable is regressed against the remaining k - 1 independent variables
Regression Analysis: Price and all other X
(Output omitted: Multiple R, R Square, Adjusted R Square, Standard Error, Observations, VIF.)
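The VIF formula itself is a one-liner; the auxiliary R-squared values below are hypothetical:

```python
# Variance inflation factor from the auxiliary regression's R-squared
def vif(r2_j):
    return 1.0 / (1.0 - r2_j)

# Hypothetical auxiliary R-squared values; VIF > 5 is a common warning threshold
for r2_j in (0.10, 0.80, 0.95):
    print(round(vif(r2_j), 2))   # → 1.11, then 5.0, then 20.0
```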
- Regression intercepts are different if the variable is significant
- Assumes equal slopes for the other variables
- The number of dummy variables needed is (number of levels - 1)
ŷ = b0 + b1x1 + b2x2
Example: Sales = 300 - 30 (Price) + 15 (Holiday)
(Figure: two parallel lines, with intercept b0 + b2 when Holiday = 1 and b0 when Holiday = 0.)
x2 = 1 if ranch, 0 if not
x3 = 1 if split level, 0 if not
ŷ = b0 + b1x1 + b2x2 + b3x3
b2 shows the impact on price if the house is a ranch style, compared to a condo b3 shows the impact on price if the house is a split level style, compared to a condo
ŷ = 20.43 + 0.045 x1 + 23.53 x2 + 18.84 x3
For a condo: x2 = x3 = 0
For a ranch: x2 = 1, x3 = 0
With the same square feet, a split-level will have an estimated average price of 18.84 thousand dollars more than a condo. With the same square feet, a ranch will have an estimated average price of 23.53 thousand dollars more than a condo.
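The dummy coding can be sketched in code; the intercept and square-footage slope below are as reconstructed from the fitted equation above, and the 2,000-square-foot input is a hypothetical example:

```python
# Dummy coding for three house styles, with condo as the baseline level:
#   x2 = 1 if ranch, x3 = 1 if split level
def predict_price(sq_ft, style):
    x2 = 1 if style == "ranch" else 0
    x3 = 1 if style == "split level" else 0
    return 20.43 + 0.045 * sq_ft + 23.53 * x2 + 18.84 * x3

# With equal square footage, only the dummy coefficients separate the styles
base = predict_price(2000, "condo")
print(round(predict_price(2000, "ranch") - base, 2))        # → 23.53
print(round(predict_price(2000, "split level") - base, 2))  # → 18.84
```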
Nonlinear Relationships
- The relationship between the dependent variable and an independent variable may not be linear
- Useful when a scatter diagram indicates a non-linear relationship
- Example: quadratic model
y = β0 + β1xj + β2xj^2 + ε
(The second independent variable is the square of the first.)
General polynomial model: y = β0 + β1xj + β2xj^2 + ... + βpxj^p + ε
where βj = population regression coefficient for variable xj (j = 1, 2, ..., p), p = order of the polynomial, and εi = model error
Quadratic form: y = β0 + β1xj + β2xj^2 + ε
(Residual plots: a linear fit does not give random residuals, while a nonlinear fit gives random residuals.)
(Figure: quadratic curve shapes for the sign combinations β1 < 0, β2 > 0; β1 > 0, β2 > 0; β1 < 0, β2 < 0; β1 > 0, β2 < 0.)
β1 = the coefficient of the linear term
β2 = the coefficient of the squared term
Hypotheses:
H0: β2 = 0 (no 2nd-order polynomial term)
HA: β2 ≠ 0 (2nd-order polynomial term is needed)
Cubic form: y = β0 + β1xj + β2xj^2 + β3xj^3 + ε
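A polynomial model is still fit by ordinary least squares once the squared (and cubed) terms are added as extra columns. The data below are synthetic, generated from a known quadratic, so the fit should recover its coefficients:

```python
import numpy as np

# Synthetic data from y = 1 + 2x + 3x^2 (no noise)
x = np.linspace(0.0, 4.0, 9)
y = 1 + 2 * x + 3 * x ** 2

# Design matrix with the squared term as a second regressor
X = np.column_stack([np.ones_like(x), x, x ** 2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(b, 6))   # ≈ [1. 2. 3.]
```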
Interaction Effects
Hypothesizes interaction between pairs of x variables Response to one x variable varies at different levels of another x variable Contains two-way cross product terms
y = β0 + β1x1 + β2x2 + β3x1x2 + β4x1^2 + β5x1^2x2
Basic terms: x1, x2, x1^2
Interactive terms: x1x2, x1^2x2
Effect of Interaction
Given:
y = β0 + β1x1 + β2x2 + β3x1x2
Without the interaction term, the effect of x1 on y is measured by β1. With the interaction term, the effect of x1 on y is measured by β1 + β3x2, so the effect changes as x2 increases.
Interaction Example
y = 1 + 2x1 + 3x2 + 4x1x2, where x2 = 0 or 1 (dummy variable)
If x2 = 1: y = 1 + 2x1 + 3(1) + 4x1(1) = 4 + 6x1
If x2 = 0: y = 1 + 2x1
(Figure: the two lines have different slopes, 6 versus 2.)
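Evaluating the example model confirms that the slope with respect to x1 depends on the level of x2:

```python
# Example interaction model from the slide
def y(x1, x2):
    return 1 + 2 * x1 + 3 * x2 + 4 * x1 * x2

# Slope with respect to x1 is 2 + 4*x2
slope_at_x2_0 = y(1, 0) - y(0, 0)   # x2 = 0 -> slope 2
slope_at_x2_1 = y(1, 1) - y(0, 1)   # x2 = 1 -> slope 2 + 4 = 6
print(slope_at_x2_0, slope_at_x2_1)   # → 2 6
```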
y = β0 + β1x1 + β2x2 + β3x1x2
Hypotheses:
H0: β3 = 0 (no interaction between x1 and x2)
HA: β3 ≠ 0 (x1 interacts with x2)
Model Building
Goal is to develop a model with the best set of independent variables
- Easier to interpret if unimportant variables are removed
- Lower probability of collinearity
Best-subset approach
Try all combinations and select the best using the highest adjusted R2 and lowest s
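A best-subsets search can be sketched with itertools; the data below are synthetic, with x3 deliberately unrelated to y:

```python
import itertools
import numpy as np

# Synthetic data: y depends on x1 and x2 only; x3 is pure noise
rng = np.random.default_rng(0)
n = 30
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 3 + 2 * x1 - 1.5 * x2 + 0.5 * rng.normal(size=n)
candidates = {"x1": x1, "x2": x2, "x3": rng.normal(size=n)}

def adj_r2(cols):
    """Adjusted R-squared of the model using the named columns."""
    X = np.column_stack([np.ones(n)] + [candidates[c] for c in cols])
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    sse = ((y - X @ b) ** 2).sum()
    sst = ((y - y.mean()) ** 2).sum()
    k = len(cols)
    return 1 - (sse / sst) * (n - 1) / (n - k - 1)

# Try every non-empty subset of candidate variables; keep the best adjusted R2
subsets = [s for r in range(1, len(candidates) + 1)
           for s in itertools.combinations(candidates, r)]
best = max(subsets, key=adj_r2)
print(best)
```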
Stepwise Regression
Idea: develop the least squares regression equation in steps, either through forward selection, backward elimination, or through standard stepwise regression The coefficient of partial determination is the measure of the marginal contribution of each independent variable, given that other independent variables are in the model
Stepwise regression and best subsets regression can be performed using PHStat, Minitab, or other statistical software packages
Residual: ei = (yi - ŷi)
Residual Analysis
(Residual plots: non-constant variance vs. constant variance; residuals that are not independent vs. independent residuals.)