
SAMPLE MOMENTS

1. POPULATION MOMENTS
1.1. Moments about the origin (raw moments). The rth moment about the origin of a random variable X, denoted by $\mu'_r$, is the expected value of $X^r$; symbolically,

\mu'_r = E(X^r) = \sum_x x^r f(x)    (1)

for r = 0, 1, 2, . . . when X is discrete and

\mu'_r = E(X^r) = \int_{-\infty}^{\infty} x^r f(x)\, dx    (2)

when X is continuous. The rth moment about the origin is only defined if $E[X^r]$ exists. A moment about the origin is sometimes called a raw moment. Note that $\mu'_1 = E(X) = \mu_X$, the mean of the distribution of X, or simply the mean of X. The rth moment is sometimes written as a function of $\theta$, where $\theta$ is a vector of parameters that characterize the distribution of X.

If there is a sequence of random variables, $X_1, X_2, \ldots, X_n$, we will call the rth population moment of the ith random variable $\mu'_{i,r}$ and define it as

\mu'_{i,r} = E(X_i^r)    (3)
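As a concrete illustration of equation 1, raw moments of a discrete distribution can be computed directly from its probability mass function. The fair six-sided die below is an invented example, not from the text:

```python
# Raw moments mu'_r = E(X^r) = sum_x x^r f(x) for a discrete random
# variable; illustrated with a fair six-sided die (a made-up example).
support = [1, 2, 3, 4, 5, 6]
f = {x: 1.0 / 6.0 for x in support}  # probability mass function

def raw_moment(r):
    """rth moment about the origin of X."""
    return sum(x**r * f[x] for x in support)

mean = raw_moment(1)    # mu'_1 = E(X) = 3.5, the mean of X
second = raw_moment(2)  # mu'_2 = E(X^2) = 91/6
```

Note that $r = 0$ returns 1 for any distribution, since the probabilities sum to one.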
1.2. Central moments. The rth moment about the mean of a random variable X, denoted by $\mu_r$, is the expected value of $(X - \mu_X)^r$; symbolically,

\mu_r = E[(X - \mu_X)^r] = \sum_x (x - \mu_X)^r f(x)    (4)

for r = 0, 1, 2, . . . when X is discrete and

\mu_r = E[(X - \mu_X)^r] = \int_{-\infty}^{\infty} (x - \mu_X)^r f(x)\, dx    (5)

when X is continuous. The rth moment about the mean is only defined if $E[(X - \mu_X)^r]$ exists. The rth moment about the mean of a random variable X is sometimes called the rth central moment of X. The rth central moment of X about a is defined as $E[(X - a)^r]$. If $a = \mu_X$, we have the rth central moment of X about $\mu_X$.
Date: December 7, 2005.

Note that

\mu_1 = E[X - \mu_X] = \int_{-\infty}^{\infty} (x - \mu_X) f(x)\, dx = 0
\mu_2 = E[(X - \mu_X)^2] = \int_{-\infty}^{\infty} (x - \mu_X)^2 f(x)\, dx = Var(X) = \sigma^2    (6)
Also note that all odd moments of X around its mean are zero for symmetrical distributions, provided
such moments exist.
If there is a sequence of random variables, $X_1, X_2, \ldots, X_n$, we will call the rth central population moment of the ith random variable $\mu_{i,r}$ and define it as

\mu_{i,r} = E\left[ \left( X_i - \mu'_{i,1} \right)^r \right]    (7)
When the variables are identically distributed, we will drop the i subscript and write $\mu'_r$ and $\mu_r$.
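The central moments can be computed the same way as the raw moments once the mean is known. A minimal sketch with an invented symmetric discrete distribution (a fair die), showing that $\mu_1 = 0$ and that odd central moments vanish under symmetry:

```python
# Central moments mu_r = E[(X - mu_X)^r] of a discrete distribution.
# A fair die (invented example) is symmetric about its mean, so its
# odd central moments are zero; mu_2 is the variance, 35/12 here.
support = [1, 2, 3, 4, 5, 6]
f = {x: 1.0 / 6.0 for x in support}
mu_x = sum(x * f[x] for x in support)  # mu'_1 = 3.5

def central_moment(r):
    """rth moment about the mean of X."""
    return sum((x - mu_x)**r * f[x] for x in support)
```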
2. SAMPLE MOMENTS
2.1. Definitions. Assume there is a sequence of random variables, $X_1, X_2, \ldots, X_n$. The first sample moment, usually called the average, is defined by

\bar{X}_n = \frac{1}{n} \sum_{i=1}^{n} X_i    (8)

Corresponding to this statistic is its numerical value, $\bar{x}_n$, which is defined by

\bar{x}_n = \frac{1}{n} \sum_{i=1}^{n} x_i    (9)

where $x_i$ represents the observed value of $X_i$. The rth sample moment for any r is defined by

\bar{X}^r_n = \frac{1}{n} \sum_{i=1}^{n} X_i^r    (10)

This too has a numerical counterpart given by

\bar{x}^r_n = \frac{1}{n} \sum_{i=1}^{n} x_i^r    (11)
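The sample moments in equations 8–11 are straightforward to compute from observed data; a short sketch with made-up observations:

```python
# rth sample moment: (1/n) * sum_i x_i^r; r = 1 gives the average.
# The data are invented for illustration.
def sample_moment(xs, r):
    n = len(xs)
    return sum(x**r for x in xs) / n

xs = [2.0, 4.0, 4.0, 6.0]
average = sample_moment(xs, 1)   # (2 + 4 + 4 + 6)/4 = 4.0
second = sample_moment(xs, 2)    # (4 + 16 + 16 + 36)/4 = 18.0
```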
2.2. Properties of Sample Moments.
2.2.1. Expected value of $\bar{X}^r_n$. Taking the expected value of equation 10 we obtain

E\left[ \bar{X}^r_n \right] = \frac{1}{n} \sum_{i=1}^{n} E(X_i^r) = \frac{1}{n} \sum_{i=1}^{n} \mu'_{i,r}    (12)

If the Xs are identically distributed, then

E\left[ \bar{X}^r_n \right] = \frac{1}{n} \sum_{i=1}^{n} \mu'_r = \mu'_r    (13)
2.2.2. Variance of $\bar{X}^r_n$. First consider the case where we have a sample $X_1, X_2, \ldots, X_n$.

Var\left( \bar{X}^r_n \right) = Var\left( \frac{1}{n} \sum_{i=1}^{n} X_i^r \right) = \frac{1}{n^2}\, Var\left( \sum_{i=1}^{n} X_i^r \right)    (14)

If the Xs are independent, then

Var\left( \bar{X}^r_n \right) = \frac{1}{n^2} \sum_{i=1}^{n} Var(X_i^r)    (15)

If the Xs are independent and identically distributed, then

Var\left( \bar{X}^r_n \right) = \frac{1}{n}\, Var(X^r)    (16)

where X denotes any one of the random variables (because they are all identical). In the case where r = 1, we obtain

Var\left( \bar{X}_n \right) = \frac{1}{n}\, Var(X) = \frac{\sigma^2}{n}    (17)
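Equation 17 can be checked exactly for a small case by enumeration rather than simulation. The sketch below (an invented fair-die example) enumerates all 36 equally likely outcomes of n = 2 independent draws and confirms $Var(\bar{X}_2) = \sigma^2/2$:

```python
# Exact check of Var(Xbar_n) = sigma^2 / n for n = 2 iid draws from a
# fair die (invented example): enumerate all 36 equally likely pairs.
import itertools

support = [1, 2, 3, 4, 5, 6]
mu = sum(support) / 6.0                           # 3.5
sigma2 = sum((x - mu)**2 for x in support) / 6.0  # 35/12

averages = [(a + b) / 2.0 for a, b in itertools.product(support, repeat=2)]
m = sum(averages) / len(averages)
var_avg = sum((v - m)**2 for v in averages) / len(averages)
# var_avg equals sigma2 / 2 exactly
```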
3. SAMPLE CENTRAL MOMENTS
3.1. Definitions. Assume there is a sequence of random variables, $X_1, X_2, \ldots, X_n$. We define the sample central moments as

C^r_n = \frac{1}{n} \sum_{i=1}^{n} \left( X_i - \mu'_{i,1} \right)^r, \quad r = 1, 2, 3, \ldots
C^1_n = \frac{1}{n} \sum_{i=1}^{n} \left( X_i - \mu'_{i,1} \right)
C^2_n = \frac{1}{n} \sum_{i=1}^{n} \left( X_i - \mu'_{i,1} \right)^2    (18)

These are only defined if $\mu'_{i,1}$ is known.
3.2. Properties of Sample Central Moments.
3.2.1. Expected value of $C^r_n$. The expected value of $C^r_n$ is given by

E(C^r_n) = \frac{1}{n} \sum_{i=1}^{n} E\left[ \left( X_i - \mu'_{i,1} \right)^r \right] = \frac{1}{n} \sum_{i=1}^{n} \mu_{i,r}    (19)

The last equality follows from equation 7.
If the $X_i$ are identically distributed, then

E(C^r_n) = \mu_r
E(C^1_n) = 0    (20)
3.2.2. Variance of $C^r_n$. First consider the case where we have a sample $X_1, X_2, \ldots, X_n$.

Var(C^r_n) = Var\left( \frac{1}{n} \sum_{i=1}^{n} \left( X_i - \mu'_{i,1} \right)^r \right) = \frac{1}{n^2}\, Var\left( \sum_{i=1}^{n} \left( X_i - \mu'_{i,1} \right)^r \right)    (21)

If the Xs are independently distributed, then

Var(C^r_n) = \frac{1}{n^2} \sum_{i=1}^{n} Var\left[ \left( X_i - \mu'_{i,1} \right)^r \right]    (22)

If the Xs are independent and identically distributed, then

Var(C^r_n) = \frac{1}{n}\, Var\left[ \left( X - \mu'_1 \right)^r \right]    (23)

where X denotes any one of the random variables (because they are all identical). In the case where r = 1, we obtain

Var(C^1_n) = \frac{1}{n}\, Var\left[ X - \mu'_1 \right]
= \frac{1}{n} \left( Var[X] - 2\, Cov[X, \mu'_1] + Var[\mu'_1] \right)
= \frac{1}{n}\, Var[X]
= \frac{\sigma^2}{n}    (24)

where the covariance of X with the constant $\mu'_1$ and the variance of the constant are both zero.
4. SAMPLE MOMENTS ABOUT THE AVERAGE
4.1. Definitions. Assume there is a sequence of random variables, $X_1, X_2, \ldots, X_n$. Define the rth sample moment about the average as

M^r_n = \frac{1}{n} \sum_{i=1}^{n} \left( X_i - \bar{X}_n \right)^r, \quad r = 1, 2, 3, \ldots    (25)

This is clearly a statistic of which we can compute a numerical value. We denote the numerical value by $m^r_n$ and define it as

m^r_n = \frac{1}{n} \sum_{i=1}^{n} \left( x_i - \bar{x}_n \right)^r    (26)

In the special case where r = 1 we have

M^1_n = \frac{1}{n} \sum_{i=1}^{n} \left( X_i - \bar{X}_n \right) = \frac{1}{n} \sum_{i=1}^{n} X_i - \bar{X}_n = \bar{X}_n - \bar{X}_n = 0    (27)
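Equation 27 holds for any data set, not just in expectation; a quick sketch with arbitrary made-up numbers:

```python
# M^1_n = (1/n) sum_i (X_i - Xbar_n) is identically zero: the
# deviations from the average always cancel. Invented data.
xs = [1.5, 2.0, 8.0, 3.5]
n = len(xs)
xbar = sum(xs) / n
m1 = sum(x - xbar for x in xs) / n  # zero up to rounding error
```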
4.2. Properties of Sample Moments about the Average when r = 2.
4.2.1. Alternative ways to write $M^2_n$. We can write $M^2_n$ in an alternative useful way by expanding the squared term and then simplifying as follows

M^r_n = \frac{1}{n} \sum_{i=1}^{n} \left( X_i - \bar{X}_n \right)^r
M^2_n = \frac{1}{n} \sum_{i=1}^{n} \left( X_i - \bar{X}_n \right)^2
= \frac{1}{n} \sum_{i=1}^{n} \left( X_i^2 - 2 X_i \bar{X}_n + \bar{X}_n^2 \right)
= \frac{1}{n} \sum_{i=1}^{n} X_i^2 - \frac{2 \bar{X}_n}{n} \sum_{i=1}^{n} X_i + \frac{1}{n} \sum_{i=1}^{n} \bar{X}_n^2
= \frac{1}{n} \sum_{i=1}^{n} X_i^2 - 2 \bar{X}_n^2 + \bar{X}_n^2
= \frac{1}{n} \left( \sum_{i=1}^{n} X_i^2 \right) - \bar{X}_n^2    (28)
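The identity in equation 28 (the mean of the squares minus the square of the mean) is easy to verify numerically on arbitrary data, invented here:

```python
# M^2_n = (1/n) sum x_i^2 - xbar_n^2: check the expansion in
# equation 28 on a made-up sample.
xs = [2.0, 3.0, 5.0, 10.0]
n = len(xs)
xbar = sum(xs) / n
m2_direct = sum((x - xbar)**2 for x in xs) / n     # definition
m2_shortcut = sum(x**2 for x in xs) / n - xbar**2  # expanded form
```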
4.2.2. Expected value of $M^2_n$. The expected value of $M^2_n$ is then given by

E\left[ M^2_n \right] = \frac{1}{n}\, E\left[ \sum_{i=1}^{n} X_i^2 \right] - E\left[ \bar{X}_n^2 \right]
= \frac{1}{n} \sum_{i=1}^{n} E\left[ X_i^2 \right] - \left( E\left[ \bar{X}_n \right] \right)^2 - Var(\bar{X}_n)
= \frac{1}{n} \sum_{i=1}^{n} \mu'_{i,2} - \left( \frac{1}{n} \sum_{i=1}^{n} \mu'_{i,1} \right)^2 - Var(\bar{X}_n)    (29)

The second line follows from the alternative definition of variance

Var(X) = E\left[ X^2 \right] - \left[ E(X) \right]^2
E\left[ X^2 \right] = \left[ E(X) \right]^2 + Var(X)
E\left[ \bar{X}_n^2 \right] = \left( E\left[ \bar{X}_n \right] \right)^2 + Var(\bar{X}_n)    (30)

and the third line follows from equation 12. If the $X_i$ are independent and identically distributed, then
E\left[ M^2_n \right] = \frac{1}{n}\, E\left[ \sum_{i=1}^{n} X_i^2 \right] - E\left[ \bar{X}_n^2 \right]
= \frac{1}{n} \sum_{i=1}^{n} \mu'_{i,2} - \left( \frac{1}{n} \sum_{i=1}^{n} \mu'_{i,1} \right)^2 - Var(\bar{X}_n)
= \mu'_2 - \left( \mu'_1 \right)^2 - \frac{\sigma^2}{n}
= \sigma^2 - \frac{1}{n} \sigma^2
= \frac{n-1}{n} \sigma^2    (31)

where $\mu'_1$ and $\mu'_2$ are the first and second population moments, and $\mu_2$ is the second central population moment for the identically distributed variables. Note that this obviously implies

E\left[ \sum_{i=1}^{n} \left( X_i - \bar{X}_n \right)^2 \right] = n\, E\left[ M^2_n \right] = n \left( \frac{n-1}{n} \right) \sigma^2 = (n-1)\, \sigma^2    (32)
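The bias factor $(n-1)/n$ in equation 31 can be verified exactly for a small case by enumerating every possible sample; the fair-die population below is an invented example:

```python
# Exact check of E[M^2_n] = ((n-1)/n) sigma^2 for n = 2 iid draws from
# a fair die (invented example): average M^2_2 over all 36 outcomes.
import itertools

support = [1, 2, 3, 4, 5, 6]
mu = sum(support) / 6.0
sigma2 = sum((x - mu)**2 for x in support) / 6.0

def m2(sample):
    n = len(sample)
    xbar = sum(sample) / n
    return sum((x - xbar)**2 for x in sample) / n

samples = list(itertools.product(support, repeat=2))
e_m2 = sum(m2(s) for s in samples) / len(samples)
# e_m2 equals ((2-1)/2) * sigma2, not sigma2: M^2_n is biased downward
```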
4.2.3. Variance of $M^2_n$. By definition,

Var\left( M^2_n \right) = E\left[ \left( M^2_n \right)^2 \right] - \left( E\left[ M^2_n \right] \right)^2    (33)

The second term on the right of equation 33 is easily obtained by squaring the result in equation 31.

E\left[ M^2_n \right] = \frac{n-1}{n} \sigma^2 \;\Rightarrow\; \left( E\left[ M^2_n \right] \right)^2 = \frac{(n-1)^2}{n^2} \sigma^4    (34)
Now consider the first term on the right hand side of equation 33. Write it as

E\left[ \left( M^2_n \right)^2 \right] = E\left[ \left( \frac{1}{n} \sum_{i=1}^{n} \left( X_i - \bar{X}_n \right)^2 \right)^2 \right]    (35)

Now consider writing $\frac{1}{n} \sum_{i=1}^{n} \left( X_i - \bar{X}_n \right)^2$ as follows
\frac{1}{n} \sum_{i=1}^{n} \left( X_i - \bar{X} \right)^2 = \frac{1}{n} \sum_{i=1}^{n} \left[ (X_i - \mu) - (\bar{X} - \mu) \right]^2 = \frac{1}{n} \sum_{i=1}^{n} \left( Y_i - \bar{Y} \right)^2, \quad \text{where } Y_i = X_i - \mu, \; \bar{Y} = \bar{X} - \mu    (36)

where $\mu$ denotes the common mean of the $X_i$. Obviously,

\sum_{i=1}^{n} \left( X_i - \bar{X} \right)^2 = \sum_{i=1}^{n} \left( Y_i - \bar{Y} \right)^2, \quad \text{where } Y_i = X_i - \mu, \; \bar{Y} = \bar{X} - \mu    (37)
Now consider the properties of the random variable $Y_i$, which is a transformation of $X_i$. First the expected value:

Y_i = X_i - \mu \;\Rightarrow\; E(Y_i) = E(X_i) - E(\mu) = \mu - \mu = 0    (38)

The variance of $Y_i$ is

Var(Y_i) = Var(X_i - \mu) = Var(X_i) = \sigma^2 \quad \text{if the } X_i \text{ are independently and identically distributed}    (39)

Also consider $E(Y_i^4)$. We can write this as

E(Y_i^4) = \int_{-\infty}^{\infty} y^4 f_Y(y)\, dy = \int_{-\infty}^{\infty} (x - \mu)^4 f(x)\, dx = \mu_4    (40)
Now write equation 35 as follows

E\left[ \left( M^2_n \right)^2 \right] = E\left[ \left( \frac{1}{n} \sum_{i=1}^{n} \left( X_i - \bar{X}_n \right)^2 \right)^2 \right]    (41a)
= E\left[ \left( \frac{1}{n} \sum_{i=1}^{n} \left( X_i - \bar{X} \right)^2 \right)^2 \right]    (41b)
= E\left[ \left( \frac{1}{n} \sum_{i=1}^{n} \left( Y_i - \bar{Y} \right)^2 \right)^2 \right]    (41c)
= \frac{1}{n^2}\, E\left[ \left( \sum_{i=1}^{n} \left( Y_i - \bar{Y} \right)^2 \right)^2 \right]    (41d)

Ignoring the $\frac{1}{n^2}$ for now, expand equation 41 as follows
E\left[ \left( \sum_{i=1}^{n} \left( Y_i - \bar{Y} \right)^2 \right)^2 \right] = E\left[ \left( \sum_{i=1}^{n} \left( Y_i^2 - 2 Y_i \bar{Y} + \bar{Y}^2 \right) \right)^2 \right]    (42a)
= E\left[ \left( \sum_{i=1}^{n} Y_i^2 - 2 \bar{Y} \sum_{i=1}^{n} Y_i + \sum_{i=1}^{n} \bar{Y}^2 \right)^2 \right]    (42b)
= E\left[ \left( \sum_{i=1}^{n} Y_i^2 - 2 n \bar{Y}^2 + n \bar{Y}^2 \right)^2 \right]    (42c)
= E\left[ \left( \sum_{i=1}^{n} Y_i^2 - n \bar{Y}^2 \right)^2 \right]    (42d)
= E\left[ \left( \sum_{i=1}^{n} Y_i^2 \right)^2 - 2 n \bar{Y}^2 \sum_{i=1}^{n} Y_i^2 + n^2 \bar{Y}^4 \right]    (42e)
= E\left[ \left( \sum_{i=1}^{n} Y_i^2 \right)^2 \right] - 2 n\, E\left[ \bar{Y}^2 \sum_{i=1}^{n} Y_i^2 \right] + n^2\, E\left[ \bar{Y}^4 \right]    (42f)
Now consider the first term on the right of 42, which we can write as

E\left[ \left( \sum_{i=1}^{n} Y_i^2 \right)^2 \right] = E\left[ \sum_{i=1}^{n} Y_i^2 \sum_{j=1}^{n} Y_j^2 \right]    (43a)
= E\left[ \sum_{i=1}^{n} Y_i^4 + \sum_{i \neq j} Y_i^2 Y_j^2 \right]    (43b)
= \sum_{i=1}^{n} E\left[ Y_i^4 \right] + \sum_{i \neq j} E\left[ Y_i^2 \right] E\left[ Y_j^2 \right]    (43c)
= n \mu_4 + n(n-1)\, \mu_2^2    (43d)
= n \mu_4 + n(n-1)\, \sigma^4    (43e)
Now consider the second term on the right of 42 (ignoring 2n for now), which we can write as

E\left[ \bar{Y}^2 \sum_{i=1}^{n} Y_i^2 \right] = \frac{1}{n^2}\, E\left[ \sum_{j=1}^{n} Y_j \sum_{k=1}^{n} Y_k \sum_{i=1}^{n} Y_i^2 \right]    (44a)
= \frac{1}{n^2}\, E\left[ \sum_{i=1}^{n} Y_i^4 + \sum_{i \neq j} Y_i^2 Y_j^2 + \sum_{j \neq k} Y_j Y_k \sum_{i \neq j,\, i \neq k} Y_i^2 \right]    (44b)
= \frac{1}{n^2} \left[ \sum_{i=1}^{n} E\, Y_i^4 + \sum_{i \neq j} E\, Y_i^2 \; E\, Y_j^2 + \sum_{j \neq k} E\, Y_j \; E\, Y_k \sum_{i \neq j,\, i \neq k} E\, Y_i^2 \right]    (44c)
= \frac{1}{n^2} \left[ n \mu_4 + n(n-1)\, \mu_2^2 + 0 \right]    (44d)
= \frac{1}{n} \left[ \mu_4 + (n-1)\, \sigma^4 \right]    (44e)

The last term on the penultimate line is zero because $E(Y_j) = E(Y_k) = E(Y_i) = 0$.
Now consider the third term on the right side of 42 (ignoring $n^2$ for now), which we can write as

E\left[ \bar{Y}^4 \right] = \frac{1}{n^4}\, E\left[ \sum_{i=1}^{n} Y_i \sum_{j=1}^{n} Y_j \sum_{k=1}^{n} Y_k \sum_{\ell=1}^{n} Y_\ell \right]    (45a)
= \frac{1}{n^4}\, E\left[ \sum_{i=1}^{n} Y_i^4 + \sum_{i \neq k} Y_i^2 Y_k^2 + \sum_{i \neq j} Y_i^2 Y_j^2 + \sum_{i \neq j} Y_i^2 Y_j^2 + \cdots \right]    (45b)

where the three double sums correspond to the index pairings $i = j \neq k = \ell$, $i = k \neq j = \ell$, and $i = \ell \neq j = k$, and $\cdots$ indicates that all other terms include $Y_i$ in a non-squared form, the expected value of which will be zero. Given that the $Y_i$ are independently and identically distributed, the expected value of each of the double sums is the same, which gives

E\left[ \bar{Y}^4 \right] = \frac{1}{n^4}\, E\left[ \sum_{i=1}^{n} Y_i^4 + 3 \sum_{i \neq j} Y_i^2 Y_j^2 + \cdots \right]    (46a)
= \frac{1}{n^4} \left[ \sum_{i=1}^{n} E\, Y_i^4 + 3 \sum_{i \neq j} E\, Y_i^2 \; E\, Y_j^2 + \text{terms containing } E\, Y_i \right]    (46b)
= \frac{1}{n^4} \left[ \sum_{i=1}^{n} E\, Y_i^4 + 3 \sum_{i \neq j} E\, Y_i^2 \; E\, Y_j^2 \right]    (46c)
= \frac{1}{n^4} \left[ n \mu_4 + 3 n(n-1) \left( \mu_2 \right)^2 \right]    (46d)
= \frac{1}{n^4} \left[ n \mu_4 + 3 n(n-1)\, \sigma^4 \right]    (46e)
= \frac{1}{n^3} \left[ \mu_4 + 3 (n-1)\, \sigma^4 \right]    (46f)
Now combining the information in equations 43, 44, and 46 we obtain
E\left[ \left( \sum_{i=1}^{n} \left( Y_i - \bar{Y} \right)^2 \right)^2 \right] = E\left[ \left( \sum_{i=1}^{n} \left( Y_i^2 - 2 Y_i \bar{Y} + \bar{Y}^2 \right) \right)^2 \right]    (47a)
= E\left[ \left( \sum_{i=1}^{n} Y_i^2 \right)^2 \right] - 2 n\, E\left[ \bar{Y}^2 \sum_{i=1}^{n} Y_i^2 \right] + n^2\, E\left[ \bar{Y}^4 \right]    (47b)
= n \mu_4 + n(n-1)\, \mu_2^2 - 2 n \left( \frac{1}{n} \right) \left[ \mu_4 + (n-1)\, \mu_2^2 \right] + n^2 \left( \frac{1}{n^3} \right) \left[ \mu_4 + 3(n-1)\, \mu_2^2 \right]    (47c)
= n \mu_4 + n(n-1)\, \mu_2^2 - 2 \left[ \mu_4 + (n-1)\, \mu_2^2 \right] + \frac{1}{n} \left[ \mu_4 + 3(n-1)\, \mu_2^2 \right]    (47d)
= \frac{n^2}{n} \mu_4 - \frac{2n}{n} \mu_4 + \frac{1}{n} \mu_4 + \frac{n^2 (n-1)}{n} \mu_2^2 - \frac{2 n(n-1)}{n} \mu_2^2 + \frac{3(n-1)}{n} \mu_2^2    (47e)
= \frac{n^2 - 2n + 1}{n} \mu_4 + \frac{(n-1)(n^2 - 2n + 3)}{n} \mu_2^2    (47f)
= \frac{n^2 - 2n + 1}{n} \mu_4 + \frac{(n-1)(n^2 - 2n + 3)}{n} \sigma^4    (47g)
Now rewrite equation 41 including the $\frac{1}{n^2}$ as follows

E\left[ \left( M^2_n \right)^2 \right] = \frac{1}{n^2}\, E\left[ \left( \sum_{i=1}^{n} \left( Y_i - \bar{Y} \right)^2 \right)^2 \right]    (48a)
= \frac{1}{n^2} \left[ \frac{n^2 - 2n + 1}{n} \mu_4 + \frac{(n-1)(n^2 - 2n + 3)}{n} \sigma^4 \right]    (48b)
= \frac{n^2 - 2n + 1}{n^3} \mu_4 + \frac{(n-1)(n^2 - 2n + 3)}{n^3} \sigma^4    (48c)
= \frac{(n-1)^2}{n^3} \mu_4 + \frac{(n-1)(n^2 - 2n + 3)}{n^3} \sigma^4    (48d)
Now substitute equations 34 and 48 into equation 33 to obtain

Var\left( M^2_n \right) = E\left[ \left( M^2_n \right)^2 \right] - \left( E\left[ M^2_n \right] \right)^2
= \frac{(n-1)^2}{n^3} \mu_4 + \frac{(n-1)(n^2 - 2n + 3)}{n^3} \sigma^4 - \frac{(n-1)^2}{n^2} \sigma^4    (49)
We can simplify this as

Var\left( M^2_n \right) = E\left[ \left( M^2_n \right)^2 \right] - \left( E\left[ M^2_n \right] \right)^2    (50a)
= \frac{(n-1)^2}{n^3} \mu_4 + \frac{(n-1)(n^2 - 2n + 3)}{n^3} \sigma^4 - \frac{n(n-1)^2}{n^3} \sigma^4    (50b)
= \frac{(n-1)^2 \mu_4 + \left[ (n-1)\, \sigma^4 \right] \left[ n^2 - 2n + 3 - n(n-1) \right]}{n^3}    (50c)
= \frac{(n-1)^2 \mu_4 + \left[ (n-1)\, \sigma^4 \right] \left[ n^2 - 2n + 3 - n^2 + n \right]}{n^3}    (50d)
= \frac{(n-1)^2 \mu_4 + \left[ (n-1)\, \sigma^4 \right] (3 - n)}{n^3} = \frac{(n-1)^2 \mu_4 - \left[ (n-1)\, \sigma^4 \right] (n-3)}{n^3}    (50e)
= \frac{(n-1)^2 \mu_4}{n^3} - \frac{(n-1)(n-3)\, \sigma^4}{n^3}    (50f)
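Equation 50f can also be verified exactly for a small case. The sketch below enumerates all n = 2 samples from an invented fair-die population and compares the enumerated variance of $M^2_2$ with the formula:

```python
# Exact check of Var(M^2_n) = (n-1)^2 mu_4/n^3 - (n-1)(n-3) sigma^4/n^3
# for n = 2 iid fair-die draws (invented example), by full enumeration.
import itertools

support = [1, 2, 3, 4, 5, 6]
mu = sum(support) / 6.0
mu2 = sum((x - mu)**2 for x in support) / 6.0  # sigma^2
mu4 = sum((x - mu)**4 for x in support) / 6.0

def m2(sample):
    n = len(sample)
    xbar = sum(sample) / n
    return sum((x - xbar)**2 for x in sample) / n

vals = [m2(s) for s in itertools.product(support, repeat=2)]
mean_m2 = sum(vals) / len(vals)
var_m2 = sum((v - mean_m2)**2 for v in vals) / len(vals)

n = 2
formula = ((n - 1)**2 * mu4 - (n - 1) * (n - 3) * mu2**2) / n**3
```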
5. SAMPLE VARIANCE
5.1. Definition of sample variance. The sample variance is defined as

S^2_n = \frac{1}{n-1} \sum_{i=1}^{n} \left( X_i - \bar{X}_n \right)^2    (51)

We can write this in terms of the sample moment about the average as

S^2_n = \frac{1}{n-1} \sum_{i=1}^{n} \left( X_i - \bar{X}_n \right)^2 = \frac{n}{n-1}\, M^2_n, \quad \text{where } M^2_n = \frac{1}{n} \sum_{i=1}^{n} \left( X_i - \bar{X}_n \right)^2    (52)
5.2. Expected value of $S^2_n$. We can compute the expected value of $S^2_n$ by substituting in from equation 31 as follows

E\left[ S^2_n \right] = \frac{n}{n-1}\, E\left[ M^2_n \right] = \frac{n}{n-1} \cdot \frac{n-1}{n}\, \sigma^2 = \sigma^2    (53)
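Equation 53 says the (n − 1) divisor exactly cancels the bias of $M^2_n$. This can be confirmed without simulation by enumerating all samples from a small population (an invented fair die):

```python
# Exact check that E[S^2_n] = sigma^2 (unbiasedness) for n = 2 iid
# draws from a fair die (invented example).
import itertools

support = [1, 2, 3, 4, 5, 6]
mu = sum(support) / 6.0
sigma2 = sum((x - mu)**2 for x in support) / 6.0

def s2(sample):
    n = len(sample)
    xbar = sum(sample) / n
    return sum((x - xbar)**2 for x in sample) / (n - 1)

samples = list(itertools.product(support, repeat=2))
e_s2 = sum(s2(s) for s in samples) / len(samples)  # equals sigma2
```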
5.3. Variance of $S^2_n$. We can compute the variance of $S^2_n$ by substituting in from equation 50 as follows

Var\left( S^2_n \right) = \frac{n^2}{(n-1)^2}\, Var\left( M^2_n \right)
= \frac{n^2}{(n-1)^2} \left[ \frac{(n-1)^2 \mu_4}{n^3} - \frac{(n-1)(n-3)\, \sigma^4}{n^3} \right]
= \frac{\mu_4}{n} - \frac{(n-3)\, \sigma^4}{n(n-1)}    (54)
5.4. Definition of $\hat{\sigma}^2$. One possible estimate of the population variance is $\hat{\sigma}^2$, which is given by

\hat{\sigma}^2 = \frac{1}{n} \sum_{i=1}^{n} \left( X_i - \bar{X}_n \right)^2 = M^2_n    (55)
5.5. Expected value of $\hat{\sigma}^2$. We can compute the expected value of $\hat{\sigma}^2$ by substituting in from equation 31 as follows

E\left[ \hat{\sigma}^2 \right] = E\left[ M^2_n \right] = \frac{n-1}{n}\, \sigma^2    (56)
5.6. Variance of $\hat{\sigma}^2$. We can compute the variance of $\hat{\sigma}^2$ by substituting in from equation 50 as follows

Var\left( \hat{\sigma}^2 \right) = Var\left( M^2_n \right) = \frac{(n-1)^2 \mu_4}{n^3} - \frac{(n-1)(n-3)\, \mu_2^2}{n^3} = \frac{\mu_4 - \mu_2^2}{n} - \frac{2 \left( \mu_4 - 2 \mu_2^2 \right)}{n^2} + \frac{\mu_4 - 3 \mu_2^2}{n^3}    (57)

We can also write this in an alternative fashion

Var\left( \hat{\sigma}^2 \right) = Var\left( M^2_n \right) = \frac{(n-1)^2 \mu_4}{n^3} - \frac{(n-1)(n-3)\, \mu_2^2}{n^3}
= \frac{n^2 \mu_4 - 2 n \mu_4 + \mu_4}{n^3} - \frac{n^2 \mu_2^2 - 4 n \mu_2^2 + 3 \mu_2^2}{n^3}
= \frac{n^2 \left( \mu_4 - \mu_2^2 \right) - 2 n \left( \mu_4 - 2 \mu_2^2 \right) + \left( \mu_4 - 3 \mu_2^2 \right)}{n^3}
= \frac{\mu_4 - \mu_2^2}{n} - \frac{2 \left( \mu_4 - 2 \mu_2^2 \right)}{n^2} + \frac{\mu_4 - 3 \mu_2^2}{n^3}    (58)
6. NORMAL POPULATIONS
6.1. Central moments of the normal distribution. For a normal population we can obtain the central moments by differentiating the moment generating function. The moment generating function for the central moments is as follows

M_{X - \mu}(t) = e^{\frac{t^2 \sigma^2}{2}}.    (59)

The moments are then as follows. The first central moment is

E(X - \mu) = \frac{d}{dt} \left[ e^{\frac{t^2 \sigma^2}{2}} \right] \Big|_{t=0} = t \sigma^2 \left[ e^{\frac{t^2 \sigma^2}{2}} \right] \Big|_{t=0} = 0    (60)
The second central moment is

E(X - \mu)^2 = \frac{d^2}{dt^2} \left[ e^{\frac{t^2 \sigma^2}{2}} \right] \Big|_{t=0}
= \frac{d}{dt} \left[ t \sigma^2 \left( e^{\frac{t^2 \sigma^2}{2}} \right) \right] \Big|_{t=0}
= \left[ t^2 \sigma^4 \left( e^{\frac{t^2 \sigma^2}{2}} \right) + \sigma^2 \left( e^{\frac{t^2 \sigma^2}{2}} \right) \right] \Big|_{t=0}
= \sigma^2    (61)
The third central moment is

E(X - \mu)^3 = \frac{d^3}{dt^3} \left[ e^{\frac{t^2 \sigma^2}{2}} \right] \Big|_{t=0}
= \frac{d}{dt} \left[ t^2 \sigma^4 \left( e^{\frac{t^2 \sigma^2}{2}} \right) + \sigma^2 \left( e^{\frac{t^2 \sigma^2}{2}} \right) \right] \Big|_{t=0}
= \left[ t^3 \sigma^6 \left( e^{\frac{t^2 \sigma^2}{2}} \right) + 2 t \sigma^4 \left( e^{\frac{t^2 \sigma^2}{2}} \right) + t \sigma^4 \left( e^{\frac{t^2 \sigma^2}{2}} \right) \right] \Big|_{t=0}
= \left[ t^3 \sigma^6 \left( e^{\frac{t^2 \sigma^2}{2}} \right) + 3 t \sigma^4 \left( e^{\frac{t^2 \sigma^2}{2}} \right) \right] \Big|_{t=0}
= 0    (62)
The fourth central moment is

E(X - \mu)^4 = \frac{d^4}{dt^4} \left[ e^{\frac{t^2 \sigma^2}{2}} \right] \Big|_{t=0}
= \frac{d}{dt} \left[ t^3 \sigma^6 \left( e^{\frac{t^2 \sigma^2}{2}} \right) + 3 t \sigma^4 \left( e^{\frac{t^2 \sigma^2}{2}} \right) \right] \Big|_{t=0}
= \left[ t^4 \sigma^8 \left( e^{\frac{t^2 \sigma^2}{2}} \right) + 3 t^2 \sigma^6 \left( e^{\frac{t^2 \sigma^2}{2}} \right) + 3 t^2 \sigma^6 \left( e^{\frac{t^2 \sigma^2}{2}} \right) + 3 \sigma^4 \left( e^{\frac{t^2 \sigma^2}{2}} \right) \right] \Big|_{t=0}
= \left[ t^4 \sigma^8 \left( e^{\frac{t^2 \sigma^2}{2}} \right) + 6 t^2 \sigma^6 \left( e^{\frac{t^2 \sigma^2}{2}} \right) + 3 \sigma^4 \left( e^{\frac{t^2 \sigma^2}{2}} \right) \right] \Big|_{t=0}
= 3 \sigma^4    (63)
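The value $\mu_4 = 3\sigma^4$ in equation 63 can be checked numerically by integrating $(x - \mu)^4$ against the normal density; the $\sigma$ value and quadrature grid below are arbitrary choices for the sketch:

```python
# Numerical check that the fourth central moment of a normal
# distribution is 3 sigma^4, via trapezoidal integration of
# (x - mu)^4 times the N(mu, sigma^2) density. Parameters are
# arbitrary choices for illustration.
import math

mu, sigma = 0.0, 1.3

def density(x):
    return math.exp(-(x - mu)**2 / (2.0 * sigma**2)) / (sigma * math.sqrt(2.0 * math.pi))

# trapezoidal rule on [mu - 10 sigma, mu + 10 sigma]
N = 200000
a, b = mu - 10.0 * sigma, mu + 10.0 * sigma
h = (b - a) / N
total = 0.0
for i in range(N + 1):
    x = a + i * h
    w = 0.5 if i in (0, N) else 1.0
    total += w * (x - mu)**4 * density(x)
mu4 = total * h  # approximately 3 * sigma**4
```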
6.2. Variance of $S^2_n$. Let $X_1, X_2, \ldots, X_n$ be a random sample from a normal population with mean $\mu$ and variance $\sigma^2 < \infty$.
We know from equation 54 that

Var\left( S^2_n \right) = \frac{n^2}{(n-1)^2}\, Var\left( M^2_n \right)
= \frac{n^2}{(n-1)^2} \left[ \frac{(n-1)^2 \mu_4}{n^3} - \frac{(n-1)(n-3)\, \sigma^4}{n^3} \right]
= \frac{\mu_4}{n} - \frac{(n-3)\, \sigma^4}{n(n-1)}    (64)
If we substitute in for $\mu_4$ from equation 63 we obtain

Var\left( S^2_n \right) = \frac{\mu_4}{n} - \frac{(n-3)\, \sigma^4}{n(n-1)}
= \frac{3 \sigma^4}{n} - \frac{(n-3)\, \sigma^4}{n(n-1)}
= \frac{\left[ 3(n-1) - (n-3) \right] \sigma^4}{n(n-1)}
= \frac{(3n - 3 - n + 3)\, \sigma^4}{n(n-1)}
= \frac{2 n \sigma^4}{n(n-1)}
= \frac{2 \sigma^4}{n-1}    (65)
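A seeded Monte Carlo sketch of equation 65; the parameters, replication count, and tolerances are arbitrary choices:

```python
# Monte Carlo check of Var(S^2_n) = 2 sigma^4 / (n - 1) for normal
# data. Seeded for reproducibility; parameters are arbitrary.
import random
import statistics

random.seed(12345)
mu, sigma, n = 5.0, 2.0, 10
reps = 20000

s2_values = [
    statistics.variance([random.gauss(mu, sigma) for _ in range(n)])
    for _ in range(reps)
]
mean_s2 = sum(s2_values) / reps
var_s2 = sum((v - mean_s2)**2 for v in s2_values) / reps
theory = 2.0 * sigma**4 / (n - 1)  # 2 * 16 / 9
```

The mean of the simulated `s2_values` should be close to $\sigma^2$ (equation 53), and their variance close to `theory`, up to Monte Carlo noise.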
6.3. Variance of $\hat{\sigma}^2$. It is easy to show that

Var\left( \hat{\sigma}^2 \right) = \frac{2 (n-1)\, \sigma^4}{n^2}    (66)