
UNIVERSITY OF TORONTO

FACULTY OF ARTS & SCIENCE


JUNE FINAL EXAM, 2014
STA302H1 F / STA1001H F
DURATION: 3 Hours
AIDS ALLOWED: Scientific Calculator

LAST NAME:

FIRST NAME:

STUDENT NUMBER:

ENROLLED IN:  [ ] STA 302    [ ] STA 1001

Instructions:
1. There are 6 questions in 20 pages (including cover sheet) for this exam.
2. There is a formula sheet on page 17.
3. There are t- and F-distribution tables on pages 18 through 20.
4. Feel free to tear off the last four pages of the exam.
5. Unless otherwise instructed, you may assume that α = 5%, and give your final answers to 4 decimal places unless they are trailing zeroes.
6. SLR = Simple Linear Regression, and MLR = Multiple Linear Regression.
7. Total marks: 100

[Marking grid for the grader, with columns for Q1 i–v, Q1 vi–x, Q2, Q3, Q4 i–vi, Q4 vii–x, Q5, Q6 i–ii, Q6 iii–v, and Q6 vi–vii; cell layout garbled in extraction. TOTAL: 100]

Page 1 / 20

[20 marks, 2 each]

1. Multiple Choice Questions: circle the correct answer (no explanation required)

I. Consider explanatory variables X1, X2, and X3. The MLR model
X3 = β0 + β1X1 + β2X2 + ε has R² = .7. What is the variance inflation factor (VIF) for X3 in the model Y = β0 + β1X1 + β2X2 + β3X3 + ε?
A. .3
B. .7
C. 1/.3
D. 1/.7

II. For the MLR model Y = β0 + β1X1 + β2X2 + ε, with design matrix

X = [ 1  X1,1  X1,2
      1  X2,1  X2,2
      ⋮    ⋮     ⋮
      1  Xn,1  Xn,2 ],  you have  (X'X)⁻¹ = [  1    0   −.5
                                               0    2    .5
                                              −.5   .5    3  ].

What is the correlation between b0 and b2?
A. −0.1667
B. −0.2887
C. −0.5
D. −0.7071
III. Which of the following is a consequence of multicollinearity?
A. b's are highly correlated
B. b's have high variance
C. It is difficult to interpret parameters
D. All of the above

IV. For the multiple linear regression model Y = β0 + β1X1 + β2X2 + ε, the hat matrix is

H = [  .92  −.04  −.12   .24
      −.04   .98  −.06   .12
      −.12  −.06   .82   .36
       .24   .12   .36  h4,4 ].

What is the value of h4,4?
A. .28
B. .36
C. .44
D. .12
V. In the SLR model Yi = β0 + β1Xi + εi, when are b0 & b1 uncorrelated?
A. When Ȳ = 0
B. When X̄ = 0
C. When Ȳ = X̄ = 0
D. None of the above

Page 2 / 20

VI. What happens to the adjusted R² when you add one or more variables to an MLR model?
A. It always increases
B. It always decreases
C. It always stays the same
D. None of the above

VII. Consider the fitted SLR model Ŷ = b0 + b1X, which has an outlying observation (Yn, Xn). When is the influence of the outlier minimal?
A. If Yn = Ȳ
B. If Xn = X̄
C. If b1 = 0
D. If b0 = 0

VIII. Suppose our intention is to estimate the slope in SLR, and we know the regression curve is linear from previous studies. For best efficiency (tightest CIs), how many levels of the predictor variable X should we have?
A. 2
B. 3
C. 4
D. More than 4

IX. Which of the following models cannot be expressed as a (general) linear model?
A. Y = β0 + β1X1 + β2 log(X2) + β3X1² + ε
B. Y = ε · exp(β0 + β1X1 + β2X2)
C. Y = β0 exp(β1X1) + ε
D. Y = [1 + exp(β0 + β1X1 + ε)]⁻¹

X. A transformation on Y does NOT help in which of the following cases?
A. Serially correlated errors
B. Nonlinearity in the regression function
C. Non-constant error variances
D. Non-normal errors
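Question I turns on the standard identity VIF_k = 1/(1 − R_k²), where R_k² comes from regressing X_k on the remaining predictors. A minimal numeric sketch (illustrative, not part of the original paper):

```python
# VIF for X3 when regressing X3 on X1 and X2 gives R^2 = 0.7
r2 = 0.7
vif = 1.0 / (1.0 - r2)   # = 1/.3, about 3.33
```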

Page 3 / 20

[10 marks]

2. Beside each description, write the letter of the term from the list below that provides the best match. Some questions have more than one correct answer.

I. A statistic used to identify problems with multicollinearity.
II. A test comparing a full model to a model with only an intercept.
III. A data point with a large residual.
IV. A method for imputing missing data.
V. More than one response variable.
VI. A statistic used to compute the leverage of a data point.
VII. The estimate of the error variance.
VIII. The estimation method used in MLR.
IX. A measure of linear association between two variables.
X. The proportion of variation explained by the regression surface.

(A) Analysis of Variance
(B) Analysis of Variance F-test
(C) R-squared
(D) Adjusted R-squared
(E) t-test
(F) Residual
(G) Standardized residual
(H) Fitted value
(I) Interaction
(J) Indicator
(K) Explanatory variable
(L) Response variable
(M) Outlier
(N) Least Squares
(O) Correlation
(P) Degrees of Freedom
(Q) Cook's Distance
(R) Leverage
(S) Variance Inflation Factor
(T) Mean squared error
(U) Extra sum of squares
(V) Measurement error
(W) DFFITS
(X) K Nearest Neighbours
(Y) Regression through the origin
(Z) Multivariate regression

Page 4 / 20

[20 marks]
3. The decathlon is a 10-event track and field competition over two days where the
competitors perform 3 throwing events, 3 jumping events, and 4 runs. In each
event they get a score based on their performance, and their final score is the sum
of the event scores. In the 2012 Olympic Games, 26 men completed the
decathlon. The first three competitors are shown below:

Rank | Athlete              | Score | 100m  | LJ     | Shot    | HJ     | 400m  | 110mH | Disc    | PV     | Jav     | 1500m
1    | Ashton Eaton (USA)   | 8869  | 10.35 | 8.03 m | 14.66 m | 2.05 m | 46.90 | …     | 42.53 m | 5.20 m | 61.96 m | 4:33.59
2    | Trey Hardee (USA)    | 8671  | 10.42 | 7.53 m | 15.28 m | 1.99 m | …     | …     | 48.26 m | 4.80 m | 66.65 m | 4:40.94
3    | Leonel Suarez (CUB)  | 8523  | …     | 7.52 m | 14.50 m | 2.11 m | 49.04 | …     | 45.75 m | 4.70 m | 76.94 m | 4:30.08

(Some cells are illegible in the scan and are shown as "…".)

There is a functional relationship between the ten performances and the final score, but it's not linear. We will try to predict the final score from the event performances using a linear model, so we can expect some 'error'. We will try using 5 different models; the ANOVA output is shown for each.

Model 1: All 10 events as predictors

                        Analysis of Variance
Source            DF    Sum of Squares    Mean Square    F Value    Pr > F
Model             (A)   3529197           (D)            (E)        <.0001
Error             (B)   532.6             35.506
Corrected Total   (C)   3529730

Root MSE          (F)           R-Square    0.9998
Dependent Mean    8032.65385    Adj R-Sq    0.9997

Model 2: Only the throws (Shot, Disc, Jav) as predictors

                        Analysis of Variance
Source            DF    Sum of Squares    Mean Square    F Value    Pr > F
Model             3     1315883           438628         4.36       0.0149
Error             22    2213847           100629
Corrected Total   25    3529730

Root MSE          317.22137     R-Square    (G)
Dependent Mean    (I)           Adj R-Sq    (H)

Page 5 / 20

Model 3: Only the runs (100m, 400m, 110mH, 1500m) as predictors.

                        Analysis of Variance
Source            DF    Sum of Squares    Mean Square    F Value    Pr > F
Model             4     2503539           625885         12.81      <.0001
Error             21    1026191           48866
Corrected Total   25    3529730

Root MSE          221.05709     R-Square    0.7093
Dependent Mean    8032.65385    Adj R-Sq    0.6539

Model 4: Throws (Shot, Disc, Jav) and Jumps (HJ, LJ, PV) as predictors.

                        Analysis of Variance
Source            DF    Sum of Squares    Mean Square    F Value    Pr > F
Model             6     3021739           503623         18.84      <.0001
Error             19    507991            26736
Corrected Total   25    3529730

Root MSE          163.51254     R-Square    0.8561
Dependent Mean    8032.65385    Adj R-Sq    0.8106

Model 5: Only the 1st three events (100m, LJ, Shot) as predictors.

                        Analysis of Variance
Source            DF    Sum of Squares    Mean Square    F Value    Pr > F
Model             3     2565468           855156         19.51      <.0001
Error             22    964261            43830
Corrected Total   25    3529730

Root MSE          209.35631     R-Square    0.7268
Dependent Mean    8032.65385    Adj R-Sq    0.6896

I. [9 marks] Some output has been replaced with letters. Find the missing values (A to I). It is not necessary to show your work.

Page 6 / 20

II. [2 marks] Write out the full model (#1), using the variable names in the table.

III. [2 marks] Find the extra sum of squares for including the 3 jumps (given that
the 3 throws have already been included) and its degrees of freedom.

IV. [3 marks] Assuming the coefficients of all the running events are zero (so there are only 6 possible predictors), perform a hypothesis test for H0: the coefficients of the three jumps are all zero. Clearly state the hypothesis being tested, the test statistic, the critical value for rejection, and a conclusion.

V. [2 marks] A coach thinks it's not useful to know the throwing results if we
already know the jumps. Can we test this hypothesis with the output above? If
so, do it. If not, state why it cannot be tested.

VI. [2 marks] Which of models 2-5 do you prefer for making accurate
predictions? Why?
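The extra-sum-of-squares arithmetic needed in parts III and IV can be sketched from the printed ANOVA tables (a sketch of the computation, not the official solution; the critical value still has to come from the F table):

```python
# Extra sum of squares for adding the 3 jumps to a model that already
# contains the 3 throws, using the SSE values printed in Models 2 and 4.
sse_reduced = 2213847              # Model 2 (throws only), 22 error df
sse_full = 507991                  # Model 4 (throws + jumps), 19 error df
extra_ss = sse_reduced - sse_full  # df = 22 - 19 = 3
mse_full = 26736                   # Model 4 error mean square
f_stat = (extra_ss / 3) / mse_full # compare to F(.95; 3, 19)
```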

Page 7 / 20

[20 marks]
4. The Journal of Statistics Education provides some data from 804 sales of used
cars. The response variable is the selling Price of the car, and we will try to
predict the Price from the Mileage of the car. Some analysis from SAS is shown
below.
The REG Procedure
Dependent Variable: price

Number of Observations Used    804

                        Analysis of Variance
Source            DF    Sum of Squares    Mean Square    F Value    Pr > F
Model             1     1605590375        1605590375     16.75      <.0001
Error             802   76855792489       95830165
Corrected Total   803   78461382864

Root MSE          9789.28829    R-Square    0.0205
Dependent Mean    21343         Adj R-Sq    0.0192
Coeff Var         45.86620

                        Parameter Estimates
Variable     DF    Parameter Estimate    Standard Error    t Value    Pr > |t|
Intercept    1     24765                 904.36328         27.38      <.0001
mileage      1     -0.17252              0.04215           -4.09      <.0001

The MEANS Procedure

Analysis Variable: mileage
N      Mean        Std Dev    Minimum       Maximum     Corrected SS
804    19831.93    8196.32    266.0000000   50387.00    53945264360

I. [2 marks] Build a 95% confidence interval for β1, the slope of Mileage.

Page 8 / 20

II. [3 marks] Build a 95% CI for the E{Yh} when the mileage is 20,000.

III. [1 mark] Compute the Working-Hotelling coefficient. You can use 100 for
the denominator df for the F-distribution.

IV. [2 marks] Build a 95% confidence band for the E{Yh} when mileage is
20,000, using the Working-Hotelling procedure.

V. [2 marks] Explain why your confidence band is wider than your confidence
interval.
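A sketch of the part I computation, using the estimates printed in the output above and approximating t(.975; 802) by the large-sample value 1.96 (on the exam, the exact multiplier would come from the t table):

```python
b1, se_b1 = -0.17252, 0.04215     # slope and standard error from the SAS output
t_crit = 1.96                     # approximate t(.975, 802); an assumption here
ci = (b1 - t_crit * se_b1, b1 + t_crit * se_b1)   # roughly (-0.2551, -0.0899)
```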

Page 9 / 20

The residual plot and Normal QQ plot are shown below.


~ ....

._.

... ~...~~

.....

.,.~~

.... 1 ... ~.,, ..

+~
-

+ +

10000

VI. [3 marks] To the right of each plot, write what assumptions you are checking
and state whether they are violated.

Page 10 / 20

A subsequent analysis was performed after transforming Price to a log scale, and including
several categorical predictors (7 in total). The output from SAS proc GLM is shown below.
The GLM Procedure

              Class Level Information
Class       Levels    Values
make        6         Buick Cadillac Chevrole Pontiac SAAB Saturn
type        5         Converti Coupe Hatchbac Sedan Wagon
cylinder    3         4 6 8
cruise      2         0 1
sound       2         0 1
leather     2         0 1

Number of Observations Used    804

Dependent Variable: log_price

Source            DF     Sum of Squares    Mean Square    F Value    Pr > F
Model             15     125.4899707       8.3659980      689.02     <.0001
Error             788    9.5677636         0.0121418
Corrected Total   803    135.0577343

R-Square    Coeff Var    Root MSE    log_price Mean
0.929158    1.115390     0.110190    9.879050

Source      DF    Type I SS      Mean Square    F Value    Pr > F
mileage     1     2.95910240     2.95910240     243.71     <.0001
make        5     85.80159266    17.16031853    1413.32    <.0001
type        4     9.60391438     2.40097859     197.74     <.0001
cylinder    2     26.80990540    13.40495270    1104.03    <.0001
cruise      1     0.04729142     0.04729142     3.89       0.0488
sound       1     0.07255603     0.07255603     5.98       0.0147
leather     1     0.19560840     0.19560840     16.11      <.0001

Source      DF    Type III SS    Mean Square    F Value    Pr > F
mileage     1     3.59498606     3.59498606     296.08     <.0001
make        5     30.22448535    6.04489707     497.86     <.0001
type        4     5.50472843     1.37618211     113.34     <.0001
cylinder    2     22.35124396    11.17562198    920.42     <.0001
cruise      1     0.06555974     0.06555974     5.40       0.0204
sound       1     0.03820368     0.03820368     3.15       0.0765
leather     1     0.19560840     0.19560840     16.11      <.0001

Page 11 / 20

VII. [2 marks] Is Sound a significant predictor of Price, for a model that already
includes Mileage, Make, Type, Cylinder and Cruise? What is the F-statistic for
this test?

VIII. [1 mark] Is Cruise a significant predictor of Price, for a model that already
includes all of the other predictors? What is the F-statistic for this test?

IX. [1 mark] Suggest a reason why Sound is significant when Leather is not in the model, but not significant when Leather is.

X. [3 marks] Perform a partial F-test to assess whether Cruise, Sound and Leather are significant predictors of Price, given that the other four predictors are already in the model. Use a significance level of 1%.
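The part X computation can be sketched from the sequential (Type I) sums of squares above, which is valid here because Cruise, Sound, and Leather entered the model last (a sketch, not the official solution):

```python
# Partial F for H0: cruise, sound, leather coefficients all zero,
# given mileage, make, type, cylinder already in the model.
extra_ss = 0.04729142 + 0.07255603 + 0.19560840  # Type I SS of the last 3 terms
mse_full = 0.0121418                             # full-model error mean square
f_stat = (extra_ss / 3) / mse_full               # compare to F(.99; 3, 788)
```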

Page 12 / 20

[6 marks]
5. In a simple linear regression model Yi = β0 + β1Xi + εi, suppose you try to measure Xi but end up with Xi* = Xi + δi instead (due to measurement error).

I. [2 marks] Write out the modified true regression model.

II. [2 marks] Assuming E{δi} = E{εi} = E{δiεi} = 0, find the covariance between the measured value of X (Xi*) and the new error term from the previous part.

III. [1 mark] Give an example of a regressor, X, that is a known constant, but contains measurement error.

IV. [1 mark] Give an example of a response variable, Y, that is a random variable, but measured without any error.
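Parts I and II can be explored with a small seeded simulation (the parameter values below are made up for illustration, not given in the exam): substituting Xi = Xi* − δi into the model yields the error term εi − β1δi, whose covariance with Xi* should be −β1·Var{δ}.

```python
import random

random.seed(0)
beta1, var_delta = 2.0, 1.0
n = 200_000
xs, new_err = [], []
for _ in range(n):
    x = random.gauss(0.0, 1.0)                # true X_i
    d = random.gauss(0.0, var_delta ** 0.5)   # measurement error delta_i
    e = random.gauss(0.0, 1.0)                # model error eps_i
    xs.append(x + d)                          # measured X*_i
    new_err.append(e - beta1 * d)             # new error term
mx, me = sum(xs) / n, sum(new_err) / n
cov = sum((a - mx) * (b - me) for a, b in zip(xs, new_err)) / n
# cov should come out close to -beta1 * var_delta = -2
```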

Page 13 / 20

[24 marks]
6. Consider the multiple regression model in matrix form Y(n×1) = X(n×p) β(p×1) + ε(n×1), with

X(n×p) = [ 1  X1,1  …  X1,p−1
           1  X2,1  …  X2,p−1
           ⋮    ⋮          ⋮
           1  Xn,1  …  Xn,p−1 ],   β(p×1) = [β0, β1, …, βp−1]',

where the first column is X0 = 1(n×1) = [1, 1, …, 1]'.

I. [4 marks] Show that the least squares estimates of β are b = (X'X)⁻¹X'Y.

II. [3 marks] Derive the distribution of b = (X'X)⁻¹X'Y. You have three things to derive: expected value, variance, and distribution.

Page 14 / 20

III. [3 marks] Show that e = (I − H)ε.

IV. [3 marks] Show that I − H is idempotent. You may not assume that H is idempotent.

V. [2 marks] Show that SSE + SSR = SSTO using matrix expressions.
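A toy numeric check of the identities in parts I and V (made-up data and pure-Python matrix helpers; illustrative only, not a proof):

```python
# Check that b = (X'X)^{-1} X'Y gives the usual intercept/slope,
# and that SSE + SSR = SSTO, on a tiny data set.
X = [[1, 0.0], [1, 1.0], [1, 2.0]]   # n = 3, p = 2 (intercept + one predictor)
Y = [1.0, 2.0, 4.0]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

Xt = [list(r) for r in zip(*X)]
XtX = matmul(Xt, X)                                  # [[3, 3], [3, 5]]
det = XtX[0][0] * XtX[1][1] - XtX[0][1] * XtX[1][0]
XtX_inv = [[XtX[1][1] / det, -XtX[0][1] / det],
           [-XtX[1][0] / det, XtX[0][0] / det]]
XtY = matmul(Xt, [[y] for y in Y])                   # [[7], [10]]
b = matmul(XtX_inv, XtY)                             # intercept 5/6, slope 3/2

yhat = [b[0][0] + b[1][0] * x for _, x in X]
ybar = sum(Y) / len(Y)
sse = sum((y, ) for y in [])  # placeholder removed below
sse = sum((y - f) ** 2 for y, f in zip(Y, yhat))
ssr = sum((f - ybar) ** 2 for f in yhat)
ssto = sum((y - ybar) ** 2 for y in Y)               # sse + ssr equals ssto
```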

Page 15 / 20

VI. [3 marks] Show that the distribution of the residuals is e ~ N(0, σ²(I − H)). You have three things to show, and you may use any results from above.

VII. [4 marks] Using matrices, show that for a regression model with only one predictor (a categorical variable with two levels) and no intercept, the least squares estimates are simply the two group means.

VIII. [2 marks] Congratulations! You made it this far. Check the box for two
marks.

I loved this class!


THE END
Page 16 / 20

Some formulae:

b1 = Σi (Xi − X̄)(Yi − Ȳ) / Σi (Xi − X̄)²  =  (Σi XiYi − nX̄Ȳ) / (Σi Xi² − nX̄²)

Var{b0} = σ² [ 1/n + X̄² / Σi (Xi − X̄)² ]

SSTO = Σi (Yi − Ȳ)²,    SSE = Σi (Yi − Ŷi)²,    SSR = b1 Σi (Xi − X̄)(Yi − Ȳ)

r = Σi (Xi − X̄)(Yi − Ȳ) / √[ Σi (Xi − X̄)² · Σi (Yi − Ȳ)² ]

σ²{pred} = Var{Yh − Ŷh} = σ² [ 1 + 1/n + (Xh − X̄)² / Σi (Xi − X̄)² ]

Working-Hotelling coefficient:  W² = 2 F(1 − α; 2, n − 2)

Regression in matrix form:

Cov(AX) = A Cov(X) A'
Cov(X) = E[(X − EX)(X − EX)'] = E(XX') − (EX)(EX)'

b = (X'X)⁻¹X'Y
Cov(b) = σ²(X'X)⁻¹
Ŷ = Xb = HY,    H = X(X'X)⁻¹X'
e = Y − Ŷ = (I − H)Y
SSE = Y'(I − H)Y
SSR = Y'(H − (1/n)J)Y
SSTO = Y'(I − (1/n)J)Y
R² = SSR / SSTO
σ²{Ŷh} = σ² x'h(X'X)⁻¹xh
σ²{pred} = σ² [ 1 + x'h(X'X)⁻¹xh ]
Adj R² = 1 − (n − 1)MSE / SSTO

Page 17 / 20

t Distribution Table (97.5th percentiles, t(.975; df))

df     1        2       3       4       5       6       7       8       9       10
t    12.706   4.303   3.182   2.776   2.571   2.447   2.365   2.306   2.262   2.228

df     11       12      13      14      15      16      17      18      19      20
t     2.201   2.179   2.160   2.145   2.131   2.120   2.110   2.101   2.093   2.086

df     21       22      23      24      25      26      27      28      29      30
t     2.080   2.074   2.069   2.064   2.060   2.056   2.052   2.048   2.045   2.042

df     40       60      120     ∞
t     2.021   2.000   1.980   1.960

Page 18 / 20

F Distribution Table (Percentiles)

[0.95, 0.975, and 0.99 percentiles of the F distribution for numerator df 1 to 10 and denominator df 1 to 15. The column layout of this table did not survive extraction, so its values are omitted here.]

F Distribution Table (Percentiles)

Rows: denominator df and percentile. Columns: numerator df 1 to 15.

den df          1      2      3      4      5      6      7      8      9      10     11     12     13     14     15
 16   0.95    4.49   3.63   3.24   3.01   2.85   2.74   2.66   2.59   2.54   2.49   2.46   2.42   2.40   2.37   2.35
      0.975   6.12   4.69   4.08   3.73   3.50   3.34   3.22   3.12   3.05   2.99   2.93   2.89   2.85   2.82   2.79
      0.99    8.53   6.23   5.29   4.77   4.44   4.20   4.03   3.89   3.78   3.69   3.62   3.55   3.50   3.45   3.41
 17   0.95    4.45   3.59   3.20   2.96   2.81   2.70   2.61   2.55   2.49   2.45   2.41   2.38   2.35   2.33   2.31
      0.975   6.04   4.62   4.01   3.66   3.44   3.28   3.16   3.06   2.98   2.92   2.87   2.82   2.79   2.75   2.72
      0.99    8.40   6.11   5.18   4.67   4.34   4.10   3.93   3.79   3.68   3.59   3.52   3.46   3.40   3.35   3.31
 18   0.95    4.41   3.55   3.16   2.93   2.77   2.66   2.58   2.51   2.46   2.41   2.37   2.34   2.31   2.29   2.27
      0.975   5.98   4.56   3.95   3.61   3.38   3.22   3.10   3.01   2.93   2.87   2.81   2.77   2.73   2.70   2.67
      0.99    8.29   6.01   5.09   4.58   4.25   4.01   3.84   3.71   3.60   3.51   3.43   3.37   3.32   3.27   3.23
 19   0.95    4.38   3.52   3.13   2.90   2.74   2.63   2.54   2.48   2.42   2.38   2.34   2.31   2.28   2.26   2.23
      0.975   5.92   4.51   3.90   3.56   3.33   3.17   3.05   2.96   2.88   2.82   2.76   2.72   2.68   2.65   2.62
      0.99    8.18   5.93   5.01   4.50   4.17   3.94   3.77   3.63   3.52   3.43   3.36   3.30   3.24   3.19   3.15
 20   0.95    4.35   3.49   3.10   2.87   2.71   2.60   2.51   2.45   2.39   2.35   2.31   2.28   2.25   2.22   2.20
      0.975   5.87   4.46   3.86   3.51   3.29   3.13   3.01   2.91   2.84   2.77   2.72   2.68   2.64   2.60   2.57
      0.99    8.10   5.85   4.94   4.43   4.10   3.87   3.70   3.56   3.46   3.37   3.29   3.23   3.18   3.13   3.09
 22   0.95    4.30   3.44   3.05   2.82   2.66   2.55   2.46   2.40   2.34   2.30   2.26   2.23   2.20   2.17   2.15
      0.975   5.79   4.38   3.78   3.44   3.22   3.05   2.93   2.84   2.76   2.70   2.65   2.60   2.56   2.53   2.50
      0.99    7.95   5.72   4.82   4.31   3.99   3.76   3.59   3.45   3.35   3.26   3.18   3.12   3.07   3.02   2.98
 24   0.95    4.26   3.40   3.01   2.78   2.62   2.51   2.42   2.36   2.30   2.25   2.22   2.18   2.15   2.13   2.11
      0.975   5.72   4.32   3.72   3.38   3.15   2.99   2.87   2.78   2.70   2.64   2.59   2.54   2.50   2.47   2.44
      0.99    7.82   5.61   4.72   4.22   3.90   3.67   3.50   3.36   3.26   3.17   3.09   3.03   2.98   2.93   2.89
 26   0.95    4.23   3.37   2.98   2.74   2.59   2.47   2.39   2.32   2.27   2.22   2.18   2.15   2.12   2.09   2.07
      0.975   5.66   4.27   3.67   3.33   3.10   2.94   2.82   2.73   2.65   2.59   2.54   2.49   2.45   2.42   2.39
      0.99    7.72   5.53   4.64   4.14   3.82   3.59   3.42   3.29   3.18   3.09   3.02   2.96   2.90   2.86   2.81
 28   0.95    4.20   3.34   2.95   2.71   2.56   2.45   2.36   2.29   2.24   2.19   2.15   2.12   2.09   2.06   2.04
      0.975   5.61   4.22   3.63   3.29   3.06   2.90   2.78   2.69   2.61   2.55   2.49   2.45   2.41   2.37   2.34
      0.99    7.64   5.45   4.57   4.07   3.75   3.53   3.36   3.23   3.12   3.03   2.96   2.90   2.84   2.79   2.75
 30   0.95    4.17   3.32   2.92   2.69   2.53   2.42   2.33   2.27   2.21   2.16   2.13   2.09   2.06   2.04   2.01
      0.975   5.57   4.18   3.59   3.25   3.03   2.87   2.75   2.65   2.57   2.51   2.46   2.41   2.37   2.34   2.31
      0.99    7.56   5.39   4.51   4.02   3.70   3.47   3.30   3.17   3.07   2.98   2.91   2.84   2.79   2.74   2.70
 40   0.95    4.08   3.23   2.84   2.61   2.45   2.34   2.25   2.18   2.12   2.08   2.04   2.00   1.97   1.95   1.92
      0.975   5.42   4.05   3.46   3.13   2.90   2.74   2.62   2.53   2.45   2.39   2.33   2.29   2.25   2.21   2.18
      0.99    7.31   5.18   4.31   3.83   3.51   3.29   3.12   2.99   2.89   2.80   2.73   2.66   2.61   2.56   2.52
 50   0.95    4.03   3.18   2.79   2.56   2.40   2.29   2.20   2.13   2.07   2.03   1.99   1.95   1.92   1.89   1.87
      0.975   5.34   3.97   3.39   3.05   2.83   2.67   2.55   2.46   2.38   2.32   2.26   2.22   2.18   2.14   2.11
      0.99    7.17   5.06   4.20   3.72   3.41   3.19   3.02   2.89   2.78   2.70   2.63   2.56   2.51   2.46   2.42
 60   0.95    4.00   3.15   2.76   2.53   2.37   2.25   2.17   2.10   2.04   1.99   1.95   1.92   1.89   1.86   1.84
      0.975   5.29   3.93   3.34   3.01   2.79   2.63   2.51   2.41   2.33   2.27   2.22   2.17   2.13   2.09   2.06
      0.99    7.08   4.98   4.13   3.65   3.34   3.12   2.95   2.82   2.72   2.63   2.56   2.50   2.44   2.39   2.35
 80   0.95    3.96   3.11   2.72   2.49   2.33   2.21   2.13   2.06   2.00   1.95   1.91   1.88   1.84   1.82   1.79
      0.975   5.22   3.86   3.28   2.95   2.73   2.57   2.45   2.35   2.28   2.21   2.16   2.11   2.07   2.03   2.00
      0.99    6.96   4.88   4.04   3.56   3.26   3.04   2.87   2.74   2.64   2.55   2.48   2.42   2.36   2.31   2.27
100   0.95    3.94   3.09   2.70   2.46   2.31   2.19   2.10   2.03   1.97   1.93   1.89   1.85   1.82   1.79   1.77
      0.975   5.18   3.83   3.25   2.92   2.70   2.54   2.42   2.32   2.24   2.18   2.12   2.08   2.04   2.00   1.97
      0.99    6.90   4.82   3.98   3.51   3.21   2.99   2.82   2.69   2.59   2.50   2.43   2.37   2.31   2.27   2.22
