1. Introduction to Statistical Inference
1.1 Overview
The aim of statistical inference is to make decisions and draw conclusions about populations. The major areas of statistical inference are parameter estimation, confidence intervals, and hypothesis testing.
Let $\alpha \in (0,1)$. Suppose that $L$ and $U$ depend only on the sample variables $X_1, \ldots, X_n$. If

$$P_\vartheta(L \le \vartheta \le U) \ge 1 - \alpha \quad \text{for all } \vartheta \in \Theta \qquad (*)$$

then the interval $[L, U]$ is called a two-sided $100(1-\alpha)\%$ confidence interval for $\vartheta$. $L$ is called the lower confidence limit, $U$ the upper confidence limit, and $1-\alpha$ is the confidence coefficient.
If both sides in (*) are equal, the confidence interval is called exact.

Example:
- risk behavior of a financial investment → one-sided upper confidence interval
- tear strength of a rope → one-sided lower confidence interval
Suppose that $X_1, \ldots, X_n$ are i.i.d. $N(\mu, \sigma^2)$ with known $\sigma$. Consider the interval

$$\Big[\bar X - c\,\frac{\sigma}{\sqrt n},\; \bar X + c\,\frac{\sigma}{\sqrt n}\Big]$$

with $c > 0$. $c$ is chosen as a function of $\alpha$ such that (*) is valid. Note that

$$\mu \in \Big[\bar X - c\,\frac{\sigma}{\sqrt n},\; \bar X + c\,\frac{\sigma}{\sqrt n}\Big] \iff \sqrt n\,\frac{|\bar X - \mu|}{\sigma} \le c .$$

Since $\sqrt n\,(\bar X - \mu)/\sigma \sim N(0,1)$, the quantity $c$ is determined such that

$$P\Big(\sqrt n\,\frac{|\bar X - \mu|}{\sigma} \le c\Big) = 2\,\Phi(c) - 1 \overset{!}{=} 1 - \alpha .$$

Consequently $c = \Phi^{-1}(1 - \alpha/2) = z_{\alpha/2}$, where $z_{\alpha/2}$ is the upper $100\,\alpha/2$ percentage point of the standard normal distribution.

$100(1-\alpha)\%$ confidence interval for $\mu$ ($\sigma$ known):

$$\Big[\bar X - z_{\alpha/2}\,\frac{\sigma}{\sqrt n},\; \bar X + z_{\alpha/2}\,\frac{\sigma}{\sqrt n}\Big]$$
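As a small illustration (not part of the original slides), the interval can be computed directly in R; the names x, sigma and alpha are ours:

# z-interval for mu with known sigma (sketch; x, sigma, alpha are assumed names)
z_interval <- function(x, sigma, alpha = 0.05) {
  half <- qnorm(1 - alpha / 2) * sigma / sqrt(length(x));  # z_{alpha/2} * sigma / sqrt(n)
  c(lower = mean(x) - half, upper = mean(x) + half);
}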
$100(1-\alpha)\%$ confidence interval for $\mu$ ($\sigma$ unknown):

$$\Big[\bar X - t_{n-1;\alpha/2}\,\frac{S}{\sqrt n},\; \bar X + t_{n-1;\alpha/2}\,\frac{S}{\sqrt n}\Big]$$

with $t_{n-1;\alpha/2} = t_{n-1}^{-1}(1-\alpha/2)$, the upper $100\,\alpha/2$ percentage point of the $t$ distribution with $n-1$ degrees of freedom.
Example: mean annual rainfall (in millimeters) in Australia from 1983 to 2002:
1983 1984 1985 1986 1987 1988 1989 1990 1991 1992
499.2 555.2 398.8 391.9 453.4 459.8 483.7 417.6 469.2 452.4
1993 1994 1995 1996 1997 1998 1999 2000 2001 2002
499.3 340.6 522.8 469.9 527.2 565.5 584.1 727.3 558.6 338.6
It is $n = 20$, $\bar x = 485.755$, $s = 90.33872$, and $t_{19;0.025} = 2.093$. Thus the confidence interval is given by

$$485.755 \pm 2.093 \cdot \frac{90.33872}{\sqrt{20}} = [443.48,\; 528.03].$$
program:
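(The original listing did not survive extraction; a minimal sketch that reproduces the interval above with R's built-in t.test:)

x <- c(499.2, 555.2, 398.8, 391.9, 453.4, 459.8, 483.7, 417.6, 469.2, 452.4,
       499.3, 340.6, 522.8, 469.9, 527.2, 565.5, 584.1, 727.3, 558.6, 338.6);
t.test(x)$conf.int  # 95 percent confidence interval for mu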
output: [Figure: histogram of the rainfall data (density scale) and normal Q–Q plot of the sample (sample quantiles vs. theoretical quantiles).]
Example (large-sample confidence interval): Suppose that $X_1, X_2, \ldots$ are i.i.d. with $E(X_i) = \mu$ for all $i \ge 1$. By the central limit theorem, $\sqrt n\,(\bar X - \mu)/S$ is asymptotically $N(0,1)$, so

$$\Big[\bar X - z_{\alpha/2}\,\frac{S}{\sqrt n},\; \bar X + z_{\alpha/2}\,\frac{S}{\sqrt n}\Big]$$

is an asymptotic $100(1-\alpha)\%$ confidence interval for $\mu$.

rule of thumb: $n \ge 30$ for the normal approximation to be adequate; $n \ge 40$ when $\sigma$ is additionally replaced by $S$.
Example: mercury contamination in largemouth bass (in ppm); a sample of fish was selected from 53 Florida lakes.
It holds that $n = 53$, $\bar x = 0.5319583$, $s = 0.3567051$, and $z_{0.025} = 1.96$. Thus the asymptotic confidence interval is equal to $[0.4359, 0.6280]$.
program:
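(The listing was lost in extraction; a minimal sketch of the asymptotic interval, assuming the 53 measurements are stored in a vector named mercury:)

n <- length(mercury); xbar <- mean(mercury); s <- sd(mercury);  # `mercury` is an assumed name
xbar + c(-1, 1) * qnorm(0.975) * s / sqrt(n)  # asymptotic 95% interval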
output: [Figure: histogram of the mercury concentrations (values about 0.0–1.4 ppm) and normal Q–Q plot of the sample.]
Overview: confidence intervals ($X_1, \ldots, X_n$ i.i.d.):

- $N(\mu, \sigma^2)$, $\sigma^2$ known, parameter $\mu$:
  $\bar X - \frac{\sigma}{\sqrt n}\, z_{\alpha/2} \;\le\; \mu \;\le\; \bar X + \frac{\sigma}{\sqrt n}\, z_{\alpha/2}$
- $N(\mu, \sigma^2)$, $\sigma^2$ unknown, parameter $\mu$:
  $\bar X - \frac{S}{\sqrt n}\, t_{n-1;\alpha/2} \;\le\; \mu \;\le\; \bar X + \frac{S}{\sqrt n}\, t_{n-1;\alpha/2}$
- $N(\mu, \sigma^2)$, $\mu$ known, parameter $\sigma^2$:
  $\frac{n\, \tilde S^2}{\chi^2_{n;\alpha/2}} \;\le\; \sigma^2 \;\le\; \frac{n\, \tilde S^2}{\chi^2_{n;1-\alpha/2}}$ with $\tilde S^2 = \frac{1}{n} \sum_{i=1}^n (X_i - \mu)^2$
- $N(\mu, \sigma^2)$, $\mu$ unknown, parameter $\sigma^2$:
  $\frac{(n-1)\, S^2}{\chi^2_{n-1;\alpha/2}} \;\le\; \sigma^2 \;\le\; \frac{(n-1)\, S^2}{\chi^2_{n-1;1-\alpha/2}}$
- two-dimensional normal distribution, correlation coefficient $\rho$ (asymptotic):
  $\hat\rho - \frac{1-\hat\rho^2}{\sqrt n}\, z_{\alpha/2} \;\le\; \rho \;\le\; \hat\rho + \frac{1-\hat\rho^2}{\sqrt n}\, z_{\alpha/2}$
                        reality
decision                $\vartheta \in \Theta_0$    $\vartheta \in \Theta_1$
$H_0$ not rejected      no error                    type II error
$H_1$ accepted          type I error                no error
procedure:
An upper bound $\alpha$ for the type I error is fixed, e.g. $\alpha \in \{0.01, 0.05, 0.1\}$. The critical region $C$ for the test (reject $H_0$) is determined such that the type I error fulfills this condition.

Such a test is called a test of significance at level $\alpha$ for $H_0$ if the probability of a type I error is smaller than or equal to $\alpha$, i.e.

$$P_\vartheta(\text{reject } H_0) \le \alpha \quad \text{for all } \vartheta \in \Theta_0 .$$

Because only the type I error, and not the type II error, is controlled by a test of significance, the type II error may be large. For that reason it is only possible to accept $H_1$, i.e. to reject $H_0$; it is incorrect to "accept" the null hypothesis $H_0$.
Gauss test ($\sigma$ known)

$$\sqrt n\,\frac{|\bar X - \mu_0|}{\sigma} > z_{\alpha/2} \;\Rightarrow\; \text{accept } H_1 \text{ (reject } H_0\text{)}$$
$$\sqrt n\,\frac{|\bar X - \mu_0|}{\sigma} \le z_{\alpha/2} \;\Rightarrow\; \text{fail to reject } H_0$$

t test ($\sigma$ unknown)

$$\sqrt n\,\frac{|\bar X - \mu_0|}{S} > t_{n-1;\alpha/2} \;\Rightarrow\; \text{accept } H_1 \text{ (reject } H_0\text{)}$$
$$\sqrt n\,\frac{|\bar X - \mu_0|}{S} \le t_{n-1;\alpha/2} \;\Rightarrow\; \text{fail to reject } H_0$$
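A direct transcription of these decision rules into R (a sketch; the function names and defaults are ours):

# each function returns TRUE if H0: mu = mu0 is rejected at level alpha
gauss_test_reject <- function(x, mu0, sigma, alpha = 0.05) {
  sqrt(length(x)) * abs(mean(x) - mu0) / sigma > qnorm(1 - alpha / 2);
}
t_test_reject <- function(x, mu0, alpha = 0.05) {
  n <- length(x);
  sqrt(n) * abs(mean(x) - mu0) / sd(x) > qt(1 - alpha / 2, df = n - 1);
}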
$\chi^2$ test for the variance: $H_0\colon \sigma^2 = \sigma_0^2$ is rejected if

$$(n-1)\,\frac{S^2}{\sigma_0^2} < \chi^2_{n-1;1-\alpha/2} \qquad\text{or}\qquad (n-1)\,\frac{S^2}{\sigma_0^2} > \chi^2_{n-1;\alpha/2} .$$
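Base R has no built-in test for a single variance, so a sketch of this rejection rule (names are ours):

# two-sided chi-square test of H0: sigma^2 = sigma0sq; returns TRUE if H0 is rejected
var_test_reject <- function(x, sigma0sq, alpha = 0.05) {
  n <- length(x);
  q <- (n - 1) * var(x) / sigma0sq;
  # qchisq(alpha/2) is chi^2_{n-1;1-alpha/2} in the slides' upper-tail notation
  q < qchisq(alpha / 2, df = n - 1) || q > qchisq(1 - alpha / 2, df = n - 1);
}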
Example: power function

$$G(\mu) = P_\mu\Big(\sqrt n\,\frac{|\bar X - \mu_0|}{\sigma} > z_{\alpha/2}\Big)$$

of the two-sided Gauss test (i.e. the probability of accepting $H_1$ as a function of $\mu$) for $\alpha = 0.05$, $n = 5$, $\sigma = 1$, and $\mu_0 = 0$.
[Figure: graph of $G(\mu)$ for $\mu \in [-3, 3]$: the curve attains its minimum $G(\mu_0) = \alpha = 0.05$ at $\mu_0 = 0$ and increases towards 1 as $|\mu - \mu_0|$ grows; the regions $H_1$, $H_0$, $H_1$ are marked along the $\mu$-axis.]
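The curve can be reproduced in R from the closed form $G(\mu) = 1 - \Phi\big(z_{\alpha/2} - \sqrt n(\mu-\mu_0)/\sigma\big) + \Phi\big(-z_{\alpha/2} - \sqrt n(\mu-\mu_0)/\sigma\big)$ (our sketch, not the original listing):

alpha <- 0.05; n <- 5; sigma <- 1; mu0 <- 0;
z <- qnorm(1 - alpha / 2);
mu <- seq(-3, 3, by = 0.01);
d <- sqrt(n) * (mu - mu0) / sigma;        # standardized shift of the mean
G <- 1 - pnorm(z - d) + pnorm(-z - d);    # power of the two-sided Gauss test
plot(mu, G, type = "l", xlab = "mu", ylab = "G(mu)");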
Large-Sample Tests

Suppose that the variables $X_1, X_2, \ldots$ are i.i.d. with $E(X_i) = \mu$ and $Var(X_i) = \sigma^2$ for $i \ge 1$.

The null hypothesis $H_0\colon \mu = \mu_0$ is rejected if $\sqrt n\,|\bar X - \mu_0|/S > z_{\alpha/2}$.

rule of thumb: $n > 100$; for $30 \le n \le 100$ use $t_{n-1;\alpha/2}$ instead of $z_{\alpha/2}$.
Notation: $S^2 = \frac{1}{n-1} \sum_{i=1}^n (X_i - \bar X)^2$ and, for known expectation $\mu$, $\tilde S^2 = \frac{1}{n} \sum_{i=1}^n (X_i - \mu)^2$.
Example: performance of new golf clubs; the quantity of interest is the ratio of the outgoing velocity of a golf ball to the incoming velocity (coefficient of restitution).
program:
CoR <- c(0.8411, 0.8191, 0.8182, 0.8125, 0.8750, 0.8580, 0.8532, 0.8483, 0.8276,
0.7983, 0.8042, 0.8730, 0.8282, 0.8359, 0.8660);
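The test call itself did not survive extraction; in the original study the clubs are compared against the null value $\mu_0 = 0.82$ (an assumption here, not shown on the slide), which in R would read:

t.test(CoR, mu = 0.82);  # one-sample t test; mu0 = 0.82 is assumed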
output: [Figure: histogram of the coefficient-of-restitution values and normal Q–Q plot of the sample (values about 0.80–0.86).]
[Figure: density of the test statistic under $H_0$ with significance level $\alpha = 0.05$; the tail area beyond the observed value of the statistic is the p-value, here $p = 0.0179$. Since $p < \alpha$, the observed statistic falls into the rejection region $H_1$.]
If an upper bound $\alpha$ for the type I error is given, then $H_0$ is rejected if the p-value satisfies $p < \alpha$; otherwise $H_0$ is not rejected.
$H_0\colon \mu = \mu_0$ is not rejected at level $\alpha$ if and only if $\mu_0$ lies in the interval ci, with ci as above; ci is a confidence interval for $\mu$ with confidence level $1 - \alpha$. Thus the test directly provides a confidence interval.
Overview: Statistical Inference for a Single Sample
Suppose that $X_1$ and $X_2$ are independent characteristics. Let $X_{11}, \ldots, X_{1n_1}$ be a random sample of $X_1$ and let $X_{21}, \ldots, X_{2n_2}$ be a random sample of $X_2$.
Tests for $H_0\colon \mu_1 = \mu_2$:

- $\sigma_1, \sigma_2$ unknown, $\sigma_1 = \sigma_2$, $H_1\colon \mu_1 \ne \mu_2$ (two-sample t test):
  $$T = \frac{\bar X_1 - \bar X_2}{\sqrt{S^2 \big(\frac{1}{n_1} + \frac{1}{n_2}\big)}} \sim t_{n_1+n_2-2}, \qquad \text{reject } H_0 \text{ if } |T| > t_{n_1+n_2-2;\alpha/2},$$
  with the pooled variance $S^2 = \frac{n_1-1}{n_1+n_2-2}\, S_1^2 + \frac{n_2-1}{n_1+n_2-2}\, S_2^2$.
- $\sigma_1, \sigma_2$ unknown, $\sigma_1 \ne \sigma_2$ (Welch test):
  $$T = \frac{\bar X_1 - \bar X_2}{\sqrt{\frac{S_1^2}{n_1} + \frac{S_2^2}{n_2}}} \approx t_{df}, \qquad df = \frac{(1+R)^2}{\frac{R^2}{n_1-1} + \frac{1}{n_2-1}}, \quad R = \frac{n_2 S_1^2}{n_1 S_2^2};$$
  reject $H_0$ if $|T| > t_{df;\alpha/2}$; for $H_1\colon \mu_1 > \mu_2$ reject if $T > t_{df;\alpha}$, for $H_1\colon \mu_1 < \mu_2$ reject if $T < -t_{df;\alpha}$.

Here $S_k^2 = \frac{1}{n_k-1} \sum_{i=1}^{n_k} (X_{ki} - \bar X_k)^2$, $k = 1, 2$.
Comparison of two proportions (binomial distributions), $H_0\colon p_1 = p_2$ against $H_1\colon p_1 \ne p_2$:

$$T = \frac{\hat p_1 - \hat p_2}{\sqrt{\hat p\,(1-\hat p)\big(\frac{1}{n_1} + \frac{1}{n_2}\big)}}, \qquad \text{reject } H_0 \text{ if } |T| > z_{\alpha/2},$$

where $\hat p$ is the pooled sample proportion.
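In R this comparison is provided by prop.test, which reports the equivalent chi-square statistic ($T^2$ when the continuity correction is switched off); a sketch with illustrative, made-up counts:

# x = numbers of successes, n = sample sizes (illustrative values, not from the slides)
prop.test(x = c(40, 55), n = c(100, 100), correct = FALSE);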
Example: arsenic in drinking water - drinking water arsenic concentration in parts per billion
(ppb) for 10 metropolitan Phoenix communities and 10 communities in rural Arizona
program:
MetroPhoenix <- c(3, 7, 25, 10, 15, 6, 12, 25, 15, 7);
RuralArizona <- c(48, 44, 40, 38, 33, 21, 20, 12, 1, 18);
ArsenicConcentration <- cbind(MetroPhoenix, RuralArizona);
program:
boxplot(ArsenicConcentration);
output: [Figure: side-by-side boxplots of the arsenic concentrations (about 0–40 ppb) for MetroPhoenix and RuralArizona, and normal Q–Q plots for each of the two samples.]
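The output slide was lost in extraction; the comparison itself can be reproduced with R's two-sample t test (the unequal-variance Welch version is R's default):

t.test(MetroPhoenix, RuralArizona);  # Welch two-sample t test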
Example: running time for a marathon before (characteristic $X_1$) and after a training camp (characteristic $X_2$).

idea: Using a transformation, the data are turned into a univariate sample which can be handled by well-known methods. Consider $D = X_2 - X_1$.

test problem: $H_0\colon \mu_2 \ge \mu_1$ against $H_1\colon \mu_2 < \mu_1$

test statistic:
$$T = \sqrt n\; \frac{\bar D}{\sqrt{\frac{1}{n-1} \sum_{i=1}^n \big(D_i - \bar D\big)^2}}$$
subject         1    2    3    4    5    6    7    8    9   10
before        223  259  248  220  287  191  229  270  245  201
after         220  244  243  211  299  170  210  276  252  189
difference d_i -3  -15   -5   -9   12  -21  -19    6    7  -12

It is $\bar d = -5.9$ and $\frac{1}{n-1} \sum_{i=1}^n \big(d_i - \bar d\big)^2 = 129.65$. Thus

$$t = \sqrt{10}\; \frac{-5.9}{\sqrt{129.65}} = -1.639 .$$

Choosing $\alpha = 0.1$, it is $t_{9;0.9} = -t_{9;0.1} = -1.383$. Since $t < t_{9;0.9}$, the alternative hypothesis $H_1\colon \mu_2 < \mu_1$ is accepted.
program
X <- c(223, 259, 248, 220, 287, 191, 229, 270, 245, 201);
Y <- c(220, 244, 243, 211, 299, 170, 210, 276, 252, 189);
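The call producing the output below is not shown on the slide; presumably it was:

t.test(X, Y, paired = TRUE);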
output:
Paired t-test
data: X and Y
t = 1.6385, df = 9, p-value = 0.1357
alternative hypothesis: true difference in means is not equal to 0
95 percent confidence interval:
-2.245511 14.045511
sample estimates:
mean of the differences
5.9
Overview: Statistical Inference for Two Samples
Thus $d = 0.441$. For $\alpha = 0.1$ we get $c_{0.1} = 0.565$. Because $d \le c_{0.1}$, the null hypothesis $H_0\colon F = \Phi$ is not rejected.
program
x <- c(-1.0, 0.5, 0.5, 1.5);
ks.test(x, pnorm)
$S_i$ is equal to the number of observations equal to $t_i$; the expected number is equal to $n p_i$.

test statistic:
$$Q = \sum_{i=1}^{r} \frac{\big(S_i - n p_i\big)^2}{n p_i}$$

Remark: The asymptotic test can be applied if $n p_i \ge 5$ for all $i$. Contrary to the test of Kolmogorov, the test of Pearson can be applied to discrete as well as continuous distributions.
program
y <- c(24, 12, 15, 25, 16, 28);
p0 <- rep(1/6, 6);  # hypothesized cell probabilities (the original vector was lost; a fair die is assumed)
chisq.test(y, p = p0)
decision: If the points $\big(v_i, x_{(i)}\big)$ lie roughly on a straight line, then $H_0$ is not rejected.
[Figure: Q–Q plots of daily WIG returns (rWIG), time period 01.01.2002–31.12.2003 ($n$ = 499), with $\bar x = 0.000796$ and $s = 0.0120$. Left: normal quantiles, $v_i = \Phi^{-1}\big(\frac{i - 1/2}{n}\big)$; right: scaled $t_5$ quantiles, $v_i = t_5^{-1}\big(\frac{i - 1/2}{n}\big) \big/ \sqrt{5/(5-2)}$.]
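In R such plots are produced by qqnorm together with the reference line qqline; a minimal sketch for a generic sample x:

qqnorm(x);  # sample quantiles against standard normal quantiles
qqline(x);  # reference line through the first and third quartiles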
Example: We consider the mean annual rainfall data from Australia considered above.

program
x <- c(499.2,...); shapiro.test(x)

We get $W = 0.9536$ and a p-value of 0.4248. Thus the normality assumption is not rejected.
1.6 Appendix
discrete distributions ($\mu = E(X)$, $\sigma^2 = E[(X-\mu)^2]$):

- binomial $B(n,p)$: $f(m) = \binom{n}{m}\, p^m (1-p)^{n-m}$ for $m \in \{0, 1, \ldots, n\}$; parameters $0 < p < 1$, $n \in \{1, 2, \ldots\}$; $\mu = np$, $\sigma^2 = n\,p\,(1-p)$
- hypergeometric $H(N,M,n)$: $f(m) = \binom{M}{m}\binom{N-M}{n-m} \big/ \binom{N}{n}$; parameters $N \in \{1, 2, \ldots\}$, $M \in \{0, 1, \ldots, N\}$, $n \in \{1, 2, \ldots, N\}$; $\mu = n\,\frac{M}{N}$, $\sigma^2 = n\,\frac{M}{N}\big(1 - \frac{M}{N}\big)\frac{N-n}{N-1}$
- Poisson $P(\lambda)$: $f(m) = e^{-\lambda}\, \frac{\lambda^m}{m!}$ for $m \in \{0, 1, \ldots\}$; parameter $\lambda > 0$; $\mu = \lambda$, $\sigma^2 = \lambda$
- geometric $G(p)$: $f(m) = p\,(1-p)^{m-1}$ for $m \in \{1, 2, \ldots\}$; parameter $0 < p < 1$; $\mu = \frac{1}{p}$, $\sigma^2 = \frac{1-p}{p^2}$
continuous distributions ($\mu = E(X)$, $\sigma^2 = E[(X-\mu)^2]$):

- uniform distribution $U(a,b)$: $f(x) = \frac{1}{b-a}$ for $x \in [a,b]$; parameters $-\infty < a < b < \infty$; $\mu = \frac{a+b}{2}$, $\sigma^2 = \frac{(b-a)^2}{12}$
- normal distribution $N(\mu, \sigma^2)$: $f(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}$ for $x \in \mathbb{R}$; parameters $\mu \in \mathbb{R}$, $\sigma > 0$; expectation $\mu$, variance $\sigma^2$
- exponential distribution $E(\lambda)$: $f(x) = \lambda\, e^{-\lambda x}$ for $x \ge 0$; parameter $\lambda > 0$; $\mu = \frac{1}{\lambda}$, $\sigma^2 = \frac{1}{\lambda^2}$
- Gamma distribution $\Gamma(\lambda, r)$: $f(x) = \frac{\lambda^r}{\Gamma(r)}\, x^{r-1} e^{-\lambda x}$ for $x \ge 0$; parameters $\lambda > 0$, $r > 0$; $\mu = \frac{r}{\lambda}$, $\sigma^2 = \frac{r}{\lambda^2}$
- $F$-distribution $F_{m,n}$: $f(x) = \frac{(m/n)^{m/2}}{B(m/2,\, n/2)}\, x^{m/2 - 1} \big(1 + \frac{m}{n} x\big)^{-\frac{m+n}{2}}$ for $x \ge 0$; parameters $m, n \in \mathbb{N}$; $\mu = \frac{n}{n-2}$ ($n > 2$), $\sigma^2 = \frac{2\, n^2\, (m+n-2)}{m\, (n-2)^2\, (n-4)}$ ($n > 4$)

Note that $\Gamma(x) = \int_0^\infty e^{-t}\, t^{x-1}\, dt$ and $B(x,y) = \frac{\Gamma(x)\,\Gamma(y)}{\Gamma(x+y)}$.
Quantiles $z_\gamma = \Phi^{-1}(\gamma)$ of the standard normal distribution:

γ        z_γ       γ        z_γ       γ       z_γ      γ      z_γ
0.9999 3.7190 0.9975 2.8070 0.965 1.8119 0.83 0.9542
0.9998 3.5401 0.9970 2.7478 0.960 1.7507 0.82 0.9154
0.9997 3.4316 0.9965 2.6968 0.955 1.6954 0.81 0.8779
0.9996 3.3528 0.9960 2.6521 0.950 1.6449 0.80 0.8416
0.9995 3.2905 0.9955 2.6121 0.945 1.5982 0.79 0.8064
0.9994 3.2389 0.9950 2.5758 0.940 1.5548 0.78 0.7722
0.9993 3.1947 0.9945 2.5427 0.935 1.5141 0.76 0.7063
0.9992 3.1559 0.9940 2.5121 0.930 1.4758 0.74 0.6433
0.9991 3.1214 0.9935 2.4838 0.925 1.4395 0.72 0.5828
0.9990 3.0902 0.9930 2.4573 0.920 1.4051 0.70 0.5244
0.9989 3.0618 0.9925 2.4324 0.915 1.3722 0.68 0.4677
0.9988 3.0357 0.9920 2.4089 0.910 1.3408 0.66 0.4125
0.9987 3.0115 0.9915 2.3867 0.905 1.3106 0.64 0.3585
0.9986 2.9889 0.9910 2.3656 0.900 1.2816 0.62 0.3055
0.9985 2.9677 0.9905 2.3455 0.890 1.2265 0.60 0.2533
0.9984 2.9478 0.9900 2.3263 0.880 1.1750 0.58 0.2019
0.9983 2.9290 0.9850 2.1701 0.870 1.1264 0.56 0.1510
0.9982 2.9112 0.9800 2.0537 0.860 1.0803 0.54 0.1004
0.9981 2.8943 0.9750 1.9600 0.850 1.0364 0.52 0.0502
0.9980 2.8782 0.9700 1.8808 0.840 0.9945 0.50 0.0000
Percentage points of the $\chi^2$-distribution (cont.): entries are the $\gamma$-quantiles of the $\chi^2_{df}$-distribution.

df     0.01    0.025   0.05    0.1     0.9     0.95    0.975   0.99
46 26.66 29.16 31.44 34.22 58.64 62.83 66.62 71.20
47 27.42 29.96 32.27 35.08 59.77 64.00 67.82 72.44
48 28.18 30.75 33.10 35.95 60.91 65.17 69.02 73.68
49 28.94 31.55 33.93 36.82 62.04 66.34 70.22 74.92
50 29.71 32.36 34.76 37.69 63.17 67.50 71.42 76.15
55 33.57 36.40 38.96 42.06 68.80 73.31 77.38 82.29
60 37.48 40.48 43.19 46.46 74.40 79.08 83.30 88.38
65 41.44 44.60 47.45 50.88 79.97 84.82 89.18 94.42
70 45.44 48.76 51.74 55.33 85.53 90.53 95.02 100.4
75 49.48 52.94 56.05 59.79 91.06 96.22 100.8 106.4
80 53.54 57.15 60.39 64.28 96.58 101.9 106.6 112.3
85 57.63 61.39 64.75 68.78 102.1 107.5 112.4 118.2
90 61.75 65.65 69.13 73.29 107.6 113.1 118.1 124.1
95 65.90 69.92 73.52 77.82 113.0 118.8 123.9 130.0
100 70.06 74.22 77.93 82.36 118.5 124.3 129.6 135.8
110 78.46 82.87 86.79 91.47 129.4 135.5 140.9 147.4
120 86.92 91.57 95.70 100.6 140.2 146.6 152.2 159.0
130 95.45 100.3 104.7 109.8 151.0 157.6 163.5 170.4
140 104.0 109.1 113.7 119.0 161.8 168.6 174.6 181.8
150 112.7 118.0 122.7 128.3 172.6 179.6 185.8 193.2
160 121.3 126.9 131.8 137.5 183.3 190.5 196.9 204.5
170 130.1 135.8 140.8 146.8 194.0 201.4 208.0 215.8
180 138.8 144.7 150.0 156.2 204.7 212.3 219.0 227.1
190 147.6 153.7 159.1 165.5 215.4 223.2 230.1 238.3
200 156.4 162.7 168.3 174.8 226.0 234.0 241.1 249.4
220 174.2 180.8 186.7 193.6 247.3 255.6 263.0 271.7
240 192.0 199.0 205.1 212.4 268.5 277.1 284.8 293.9
260 209.9 217.2 223.7 231.2 289.6 298.6 306.6 316.0
280 227.9 235.5 242.2 250.1 310.7 320.0 328.2 338.0
300 246.0 253.9 260.9 269.1 331.8 341.4 349.9 359.9
320 264.1 272.3 279.6 288.0 352.8 362.7 371.4 381.8
340 282.3 290.8 298.3 307.0 373.8 384.0 393.0 403.6
360 300.5 309.3 317.0 326.1 394.8 405.2 414.5 425.3
380 318.8 327.9 335.8 345.1 415.7 426.5 435.9 447.1
400 337.2 346.5 354.6 364.2 436.6 447.6 457.3 468.7
450 383.2 393.1 401.8 412.0 488.8 500.5 510.7 522.7
500 429.4 439.9 449.1 459.9 540.9 553.1 563.9 576.5
550 475.8 486.9 496.6 507.9 592.9 605.7 616.9 630.1
600 522.4 534.0 544.2 556.1 644.8 658.1 669.8 683.5
650 569.1 581.2 591.9 604.2 696.6 710.4 722.5 736.8
700 615.9 628.6 639.6 652.5 748.4 762.7 775.2 790.0
750 662.9 676.0 687.5 700.8 800.0 814.8 827.8 843.0
800 709.9 723.5 735.4 749.2 851.7 866.9 880.3 896.0
850 757.0 771.1 783.3 797.6 903.2 918.9 932.7 948.8
900 804.3 818.8 831.4 846.1 954.8 970.9 985.0 1002
950 851.5 866.5 879.5 894.6 1006 1023 1037 1054
1000 898.9 914.3 927.6 943.1 1058 1075 1090 1107
Percentage points of the $F$-distribution: 0.9-quantiles of the $F_{m,n}$-distribution (rows: $m$, columns: $n$).

m\n  1      2      3      4      5      6      7      8      9      10     15     20     30     40     50     60     70     80     90     100
1 39.87 8.526 5.538 4.545 4.060 3.776 3.589 3.458 3.360 3.285 3.073 2.975 2.881 2.835 2.809 2.791 2.779 2.769 2.762 2.756
2 49.52 9.000 5.462 4.325 3.780 3.463 3.257 3.113 3.006 2.924 2.695 2.589 2.489 2.440 2.412 2.393 2.380 2.370 2.363 2.356
3 53.62 9.162 5.391 4.191 3.619 3.289 3.074 2.924 2.813 2.728 2.490 2.380 2.276 2.226 2.197 2.177 2.164 2.154 2.146 2.139
4 55.87 9.243 5.343 4.107 3.520 3.181 2.961 2.806 2.693 2.605 2.361 2.249 2.142 2.091 2.061 2.041 2.027 2.016 2.008 2.002
5 57.28 9.293 5.309 4.051 3.453 3.108 2.883 2.726 2.611 2.522 2.273 2.158 2.049 1.997 1.966 1.946 1.931 1.921 1.912 1.906
6 58.24 9.326 5.285 4.010 3.404 3.055 2.827 2.668 2.551 2.461 2.208 2.091 1.980 1.927 1.895 1.875 1.860 1.849 1.841 1.834
7 58.95 9.349 5.266 3.979 3.368 3.014 2.785 2.624 2.505 2.414 2.158 2.040 1.927 1.873 1.840 1.819 1.804 1.793 1.785 1.778
8 59.48 9.367 5.252 3.955 3.339 2.983 2.752 2.589 2.469 2.377 2.119 1.999 1.884 1.829 1.796 1.775 1.760 1.748 1.739 1.732
9 59.90 9.381 5.240 3.936 3.316 2.958 2.725 2.561 2.440 2.347 2.086 1.965 1.849 1.793 1.760 1.738 1.723 1.711 1.702 1.695
10 60.24 9.392 5.230 3.920 3.297 2.937 2.703 2.538 2.416 2.323 2.059 1.937 1.819 1.763 1.729 1.707 1.691 1.680 1.670 1.663
15 61.26 9.425 5.200 3.870 3.238 2.871 2.632 2.464 2.340 2.244 1.972 1.845 1.722 1.662 1.627 1.603 1.587 1.574 1.564 1.557
20 61.78 9.441 5.184 3.844 3.207 2.836 2.595 2.425 2.298 2.201 1.924 1.794 1.667 1.605 1.568 1.543 1.526 1.513 1.503 1.494
30 62.31 9.463 5.168 3.817 3.174 2.800 2.555 2.383 2.255 2.155 1.873 1.738 1.606 1.541 1.502 1.476 1.457 1.443 1.432 1.423
40 62.57 9.472 5.160 3.804 3.157 2.781 2.535 2.361 2.232 2.132 1.845 1.708 1.573 1.506 1.465 1.437 1.418 1.403 1.391 1.382
50 62.73 9.477 5.155 3.795 3.147 2.770 2.523 2.348 2.218 2.117 1.828 1.690 1.552 1.483 1.441 1.413 1.392 1.377 1.365 1.355
60 62.84 9.480 5.151 3.790 3.140 2.762 2.514 2.339 2.208 2.107 1.817 1.677 1.538 1.467 1.424 1.395 1.374 1.358 1.346 1.336
70 62.91 9.483 5.146 3.786 3.135 2.756 2.508 2.333 2.202 2.100 1.808 1.667 1.527 1.455 1.412 1.382 1.361 1.344 1.332 1.321
80 62.97 9.484 5.144 3.782 3.132 2.752 2.504 2.328 2.196 2.095 1.802 1.660 1.519 1.447 1.402 1.372 1.350 1.334 1.321 1.310
90 63.01 9.486 5.143 3.780 3.129 2.749 2.500 2.324 2.192 2.090 1.797 1.655 1.512 1.439 1.395 1.364 1.342 1.325 1.312 1.301
100 63.05 9.487 5.142 3.778 3.126 2.746 2.497 2.321 2.189 2.087 1.793 1.650 1.507 1.434 1.388 1.358 1.335 1.318 1.304 1.293