
3.2. Sums of Gamma Random Variables

In this section let T_1, ..., T_n be independent gamma random variables with T_j ~ \Gamma(m_j, \lambda_j), or equivalently T_j ~ \Gamma(m_j, 1/\beta_j), and let S = T_1 + ... + T_n be their sum with density function

A(t) = A(t;m,\lambda) = f(t;m_1,\lambda_1) * f(t;m_2,\lambda_2) * \cdots * f(t;m_n,\lambda_n)

     = \frac{\lambda_1^{m_1} t^{m_1-1} e^{-\lambda_1 t}}{\Gamma(m_1)} * \frac{\lambda_2^{m_2} t^{m_2-1} e^{-\lambda_2 t}}{\Gamma(m_2)} * \cdots * \frac{\lambda_n^{m_n} t^{m_n-1} e^{-\lambda_n t}}{\Gamma(m_n)}

as discussed in the previous section. Also, following the previous section, we sometimes
express the density function in terms of the scale parameters as

A(t) = A_0(t) = A_0(t;m,\beta) = f_0(t;m_1,\beta_1) * f_0(t;m_2,\beta_2) * \cdots * f_0(t;m_n,\beta_n)

     = \frac{t^{m_1-1} e^{-t/\beta_1}}{\beta_1^{m_1} \Gamma(m_1)} * \frac{t^{m_2-1} e^{-t/\beta_2}}{\beta_2^{m_2} \Gamma(m_2)} * \cdots * \frac{t^{m_n-1} e^{-t/\beta_n}}{\beta_n^{m_n} \Gamma(m_n)}

where \beta_j = 1/\lambda_j. We can write

(1)

A(t;m,\lambda) = \lambda^m E(t;m,\lambda)

where
(2)

\lambda^m = (\lambda_1)^{m_1} (\lambda_2)^{m_2} \cdots (\lambda_n)^{m_n}

(3)

E(t;m,\lambda) = \frac{t^{m_1-1} e^{-\lambda_1 t}}{\Gamma(m_1)} * \frac{t^{m_2-1} e^{-\lambda_2 t}}{\Gamma(m_2)} * \cdots * \frac{t^{m_n-1} e^{-\lambda_n t}}{\Gamma(m_n)}

Note that if the m_j are positive, then A(t;m,\lambda) and E(t;m,\lambda) are defined for all real \lambda_j.
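Carrying out the convolution defining E(t;m,\lambda) in the simplest non-trivial case n = 2, m = (1,1), gives E(t;(1,1),(\lambda_1,\lambda_2)) = (e^{-\lambda_1 t} - e^{-\lambda_2 t})/(\lambda_2 - \lambda_1) for \lambda_1 \ne \lambda_2. The Python sketch below checks this closed form against a direct numerical convolution; the function names and parameter values are ours, chosen only for illustration.

```python
import math

def E2(t, lam1, lam2, steps=20000):
    """E(t;(1,1),(lam1,lam2)): numerically convolve the kernels
    s -> e^{-lam1*s} and s -> e^{-lam2*s} at the point t (midpoint rule)."""
    h = t / steps
    total = 0.0
    for k in range(steps):
        s = (k + 0.5) * h
        total += math.exp(-lam1 * s) * math.exp(-lam2 * (t - s))
    return total * h

lam1, lam2, t = 1.0, 3.0, 2.0
# Closed form obtained by evaluating the convolution integral directly
closed = (math.exp(-lam1 * t) - math.exp(-lam2 * t)) / (lam2 - lam1)
assert abs(E2(t, lam1, lam2) - closed) < 1e-6
```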


Theorem 1. Let m = (m_1,...,m_n) be a vector of positive numbers and (\lambda,...,\lambda) be a vector
of n identical real numbers. Then
(4)

A(t;m,(\lambda,...,\lambda)) = \frac{\lambda^{|m|} t^{|m|-1} e^{-\lambda t}}{\Gamma(|m|)}

(5)

E(t;m,(\lambda,...,\lambda)) = \frac{t^{|m|-1} e^{-\lambda t}}{\Gamma(|m|)}

where
(6)

|m| = m_1 + ... + m_n
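According to (4), the sum of two independent gamma variables with a common rate \lambda again has a gamma density, with shape m_1 + m_2. The Python sketch below checks this by comparing a direct numerical convolution of two gamma densities against the closed form; the helper names and parameter values are ours.

```python
import math

def f(t, m, lam):
    """Gamma density with shape m and rate lam."""
    return lam**m * t**(m - 1) * math.exp(-lam * t) / math.gamma(m)

def conv(t, m1, m2, lam, steps=20000):
    """(f(.;m1,lam) * f(.;m2,lam))(t) by the midpoint rule."""
    h = t / steps
    return sum(f((k + 0.5) * h, m1, lam) * f(t - (k + 0.5) * h, m2, lam)
               for k in range(steps)) * h

m1, m2, lam, t = 2.0, 3.0, 1.5, 1.7
# Theorem 1: the convolution equals the gamma density with shape m1 + m2
assert abs(conv(t, m1, m2, lam) - f(t, m1 + m2, lam)) < 1e-6
```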

Proposition 2. (a) E(S) = m_1\beta_1 + m_2\beta_2 + ... + m_n\beta_n

(b) \sigma(S)^2 = m_1(\beta_1)^2 + m_2(\beta_2)^2 + ... + m_n(\beta_n)^2

Proof. E(S) = E(T_1) + E(T_2) + ... + E(T_n) = m_1\beta_1 + m_2\beta_2 + ... + m_n\beta_n since E(T_j) = m_j\beta_j by
Proposition 2 in Section 2.2. Since the T_j are independent, \sigma(S)^2 = \sigma(T_1)^2 + \sigma(T_2)^2 + ... + \sigma(T_n)^2 = m_1(\beta_1)^2 + m_2(\beta_2)^2 +
... + m_n(\beta_n)^2 since \sigma(T_j)^2 = m_j(\beta_j)^2 by Proposition 3 in Section 2.2.
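Proposition 2 can be checked by simulation. The Python sketch below uses the standard library's random.gammavariate, which is parametrized by shape and scale (so its mean is shape times scale); the particular shapes and scales are ours, chosen only for illustration.

```python
import random

random.seed(0)
m    = [2.0, 3.0, 1.5]   # shapes m_j
beta = [0.5, 1.0, 2.0]   # scales beta_j = 1/lambda_j
N = 100_000

# S = T_1 + T_2 + T_3 with T_j ~ Gamma(shape m_j, scale beta_j)
samples = [sum(random.gammavariate(mj, bj) for mj, bj in zip(m, beta))
           for _ in range(N)]

mean = sum(samples) / N
var  = sum((s - mean) ** 2 for s in samples) / N

exp_mean = sum(mj * bj for mj, bj in zip(m, beta))        # (a): 7.0
exp_var  = sum(mj * bj * bj for mj, bj in zip(m, beta))   # (b): 9.5

assert abs(mean - exp_mean) / exp_mean < 0.02
assert abs(var - exp_var) / exp_var < 0.05
```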

Proposition 3. Let L(s) and M(r) be the Laplace transform and moment generating
function of A(t;m,\lambda). Also, let L_E(s) and M_E(r) be the Laplace transform and moment
generating function of E(t;m,\lambda). Then

L(s) = \prod_{j=1}^{n} \left( \frac{\lambda_j}{\lambda_j + s} \right)^{m_j}

M(r) = \prod_{j=1}^{n} \left( \frac{\lambda_j}{\lambda_j - r} \right)^{m_j}

L_E(s) = \prod_{j=1}^{n} \frac{1}{(\lambda_j + s)^{m_j}}

M_E(r) = \prod_{j=1}^{n} \frac{1}{(\lambda_j - r)^{m_j}}


Proof. These formulas follow from Proposition 6 in section 2.2 and the fact that the
Laplace transform and moment generating function of a convolution are the products of the
Laplace transforms and moment generating functions of the factors.
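The single-factor case of the formula for L(s) can be checked numerically: the Laplace transform of the gamma density f(t;m,\lambda) should equal (\lambda/(\lambda + s))^m. A Python sketch (the helper names, quadrature step, and truncation point are ad hoc choices of ours):

```python
import math

def f(t, m, lam):
    """Gamma density with shape m and rate lam."""
    return lam**m * t**(m - 1) * math.exp(-lam * t) / math.gamma(m)

def laplace(m, lam, s, T=60.0, steps=100000):
    """Numerical Laplace transform of f(.;m,lam) at s,
    midpoint rule with the integral truncated at T."""
    h = T / steps
    return sum(math.exp(-s * (k + 0.5) * h) * f((k + 0.5) * h, m, lam)
               for k in range(steps)) * h

m, lam, s = 2.5, 1.2, 0.7
# Proposition 3 (single factor): L(s) = (lam / (lam + s))^m
assert abs(laplace(m, lam, s) - (lam / (lam + s)) ** m) < 1e-5
```

Since the Laplace transform of a convolution is the product of the Laplace transforms, the n-factor formula follows from this single-factor identity.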
Let
(7)

c(m,\lambda|\mu) = \prod_{j=1}^{n} \left( \frac{\lambda_j}{\lambda_j - \mu} \right)^{m_j}

provided \mu is different from all the \lambda_j.

Theorem 4. Let m = (m_1,...,m_n) be a vector of positive numbers, \lambda = (\lambda_1,...,\lambda_n) be a vector
of real numbers, (\lambda,...,\lambda) be a vector of n identical real numbers and \mu be a real number.
Let \lambda - \mu = (\lambda_1 - \mu,...,\lambda_n - \mu), \lambda_* = \min\{\lambda_1,...,\lambda_n\} and \lambda^* = \max\{\lambda_1,...,\lambda_n\}. For (9) assume \mu is
different from all the \lambda_j.
(8)

e^{\mu t} E(t;m,\lambda) = E(t;m,\lambda-\mu)

(9)

e^{\mu t} A(t;m,\lambda) = c(m,\lambda|\mu) A(t;m,\lambda-\mu)

(10)

E(t;m,(\lambda,...,\lambda)) = \frac{t^{|m|-1} e^{-\lambda t}}{\Gamma(|m|)}

(11)

\frac{t^{|m|-1} e^{-\lambda^* t}}{\Gamma(|m|)} \le E(t;m,\lambda) \le \frac{t^{|m|-1} e^{-\lambda_* t}}{\Gamma(|m|)}

Proof. (8) follows from Proposition 2 in section 1.5. (9) follows from (8) and (1). (10)
follows from (1) and (4). To prove (11) note that

E(t;m,(\lambda^*,...,\lambda^*)) \le E(t;m,\lambda) \le E(t;m,(\lambda_*,...,\lambda_*))

since e^{-\lambda^* t} \le e^{-\lambda_j t} \le e^{-\lambda_* t} for all j and all t \ge 0. (11) follows
from this and (10).
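Identity (8) can be checked numerically for n = 2 with distinct rates: multiplying E(t;m,\lambda) by e^{\mu t} should shift every rate by \mu. A Python sketch (the helper names and parameter values are ours):

```python
import math

def kernel(t, m, lam):
    """g(t) = t^{m-1} e^{-lam t} / Gamma(m); lam may be any real number."""
    return t**(m - 1) * math.exp(-lam * t) / math.gamma(m)

def E(t, m, lam, steps=20000):
    """E(t;(m1,m2),(lam1,lam2)) as a numerical convolution (midpoint rule)."""
    h = t / steps
    return sum(kernel((k + 0.5) * h, m[0], lam[0]) *
               kernel(t - (k + 0.5) * h, m[1], lam[1])
               for k in range(steps)) * h

m, lam, mu, t = (1.0, 2.0), (2.0, 0.5), 0.3, 1.4
shifted = (lam[0] - mu, lam[1] - mu)

# (8): e^{mu t} E(t;m,lam) = E(t;m,lam-mu)
lhs = math.exp(mu * t) * E(t, m, lam)
assert abs(lhs - E(t, m, shifted)) < 1e-6
```

(9) then follows by multiplying both sides by \lambda^m, since A = \lambda^m E and \lambda^m / (\lambda-\mu)^m = c(m,\lambda|\mu).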

One question of interest is: for which \mu is e^{\mu t} A(t;m,\lambda) an increasing function of t? If n = 1
then A(t;m,\lambda) = \frac{\lambda^m t^{m-1} e^{-\lambda t}}{\Gamma(m)} is the density function of a gamma random variable. If \mu \ge \lambda and m \ge 1
then e^{\mu t} A(t;m,\lambda) = \frac{\lambda^m t^{m-1} e^{(\mu-\lambda) t}}{\Gamma(m)} is a non-decreasing function of t for t \ge 0. If n \ge 2 then the following
theorem provides a partial answer to the question.

Theorem 5. Let \lambda = (\lambda_1,...,\lambda_n) be a vector of real numbers with n \ge 2, E(t) = E(t;m,\lambda) and
1 \le j \le n. If m_j = 1 then
(12)

\frac{d}{dt}\left( e^{\lambda_j t} E(t) \right) = e^{\lambda_j t} E(t; (m_1,...,m_{j-1},m_{j+1},...,m_n), (\lambda_1,...,\lambda_{j-1},\lambda_{j+1},...,\lambda_n))

If m_j > 1 then
(13)

\frac{d}{dt}\left( e^{\lambda_j t} E(t) \right) = e^{\lambda_j t} E(t; (m_1,...,m_{j-1},m_j-1,m_{j+1},...,m_n), \lambda)

If m_j \ge 1 and \mu \ge \lambda_j then
(14)

e^{\mu t} E(t) is a non-decreasing function of t.

Proof. Since E(t) is a symmetric function of the pairs (m_j,\lambda_j), it suffices to prove (12) and (13)
for j = n, in which case (8) and (3) give

e^{\lambda_n t} E(t) = E_{n-1}(t) * \frac{t^{m_n-1}}{\Gamma(m_n)}

where E_{n-1}(t) = E(t; (m_1,...,m_{n-1}), (\lambda_1-\lambda_n,...,\lambda_{n-1}-\lambda_n)). If m_n = 1 then this gives

e^{\lambda_n t} E(t) = \int_0^t E_{n-1}(s)\, ds

from which one obtains (e^{\lambda_n t} E(t))' = E_{n-1}(t). Using (8) again gives (12). If m_n > 1 then using (f * g)' = f' *
g + f(0) g gives

(e^{\lambda_n t} E(t))' = E_{n-1}(t) * \frac{t^{m_n-2}}{\Gamma(m_n-1)} = E(t; (m_1,...,m_{n-1},m_n-1), (\lambda_1-\lambda_n,...,\lambda_{n-1}-\lambda_n,0))

Using (8) again gives (13). If m_j \ge 1 then e^{\lambda_j t} E(t) is a non-decreasing function of
t, since the right sides of (12) and (13) are non-negative. (14) then follows, since for \mu \ge \lambda_j the function e^{\mu t} E(t) = e^{(\mu-\lambda_j) t} \cdot e^{\lambda_j t} E(t) is a product of two non-negative non-decreasing functions.
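Equation (13) can be checked numerically in the equal-rate case, where (5) gives E in closed form and a central difference approximates the derivative. A Python sketch (the helper names and parameter values are ours):

```python
import math

def E_equal(t, mtot, lam):
    """E(t;m,(lam,...,lam)) = t^{|m|-1} e^{-lam t} / Gamma(|m|), eq. (5)."""
    return t**(mtot - 1) * math.exp(-lam * t) / math.gamma(mtot)

m, lam, t, h = (2.0, 3.0), 2.0, 1.3, 1e-5
mtot = sum(m)

def g(u):
    # e^{lam_j u} E(u), with all rates equal to lam
    return math.exp(lam * u) * E_equal(u, mtot, lam)

deriv = (g(t + h) - g(t - h)) / (2 * h)              # numerical d/dt
rhs = math.exp(lam * t) * E_equal(t, mtot - 1, lam)  # (13): m_j -> m_j - 1
assert abs(deriv - rhs) < 1e-6
```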
