Random-Number Generation
Properties of Random Numbers
Random numbers should be uniformly distributed and independent.
Uniformity: If we divide (0,1) into n equal intervals, then we expect the number
of observations in each sub-interval to be N/n where N is the total number of
observations.
Independence: The probability of observing a value in any sub-interval is not
influenced by any previous value drawn.
Each random number Ri must be an independent draw from the uniform distribution on (0,1), with pdf:
f(x) = 1, 0 ≤ x ≤ 1
f(x) = 0, otherwise
Generation of Pseudo-Random Numbers
“Pseudo”, because generating numbers using a known method removes the potential for
true randomness.
Pseudo-random numbers are used in place of true random numbers for simulation purposes.
Goal: To produce a sequence of numbers in [0,1] that simulates, or imitates, the ideal
properties of random numbers (RN).
Potential issues:
Non-uniformity
Discrete valued, not continuously valued
Inaccurate mean
Inaccurate variance
Dependence
Autocorrelation between numbers
Runs of numbers with skewed values, with respect to previous numbers or mean
value
Important considerations in RN routines:
Fast
Portable to different computers
Have sufficiently long cycle
Replicable
Closely approximate the ideal statistical properties of uniformity and independence.
Linear congruential method: produce a sequence of integers X1, X2, ... between 0 and m−1 via the recurrence
Xi+1 = (a Xi + c) mod m, i = 0, 1, 2, ...
where
X0 - seed
a - constant multiplier
c - increment
m - modulus
c = 0: multiplicative congruential method; otherwise: mixed congruential method
The selection of the values for a, c, m, and X0 drastically affects the statistical properties
and the cycle length.
The random integers Xi are generated in [0, m−1]; to convert them to random numbers:
Ri = Xi / m, i = 1, 2, ...
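The recurrence above can be sketched in a few lines of Python. The parameter values X0 = 27, a = 17, c = 43, m = 100 are small illustrative choices (note the short cycle: the fifth value already repeats the first), not production-quality settings:

```python
def lcg(seed, a, c, m, n):
    """Generate n random numbers Ri = Xi/m via Xi+1 = (a*Xi + c) mod m."""
    rs = []
    x = seed
    for _ in range(n):
        x = (a * x + c) % m  # congruential recurrence on integers in [0, m-1]
        rs.append(x / m)     # scale to a random number in [0, 1)
    return rs

# Mixed congruential example (c != 0): X0 = 27, a = 17, c = 43, m = 100
print(lcg(27, 17, 43, 100, 5))  # [0.02, 0.77, 0.52, 0.27, 0.02]
```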
The chi-square frequency test uses the statistic
χ0² = Σ(i=1 to n) (Oi − Ei)² / Ei
which approximately follows the chi-square distribution with n − 1 degrees of freedom (the
critical values are tabulated in Table A.6).
For the uniform distribution, Ei, the expected number in each class, is
Ei = N / n, where N is the total number of observations
Valid only for large samples, e.g., N ≥ 50
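A minimal sketch of the frequency test in Python (the class count n = 10 is an illustrative choice; compare the returned statistic against the chi-square critical value with n − 1 degrees of freedom):

```python
def chi_square_uniformity(rs, n=10):
    """Chi-square frequency test statistic: sum over classes of (Oi - Ei)^2 / Ei."""
    N = len(rs)
    observed = [0] * n
    for r in rs:
        observed[min(int(r * n), n - 1)] += 1  # class index for r in [0, 1)
    e = N / n                                  # Ei = N/n under uniformity
    return sum((o - e) ** 2 / e for o in observed)

# A perfectly balanced sample of N = 100 gives a statistic of 0
stat = chi_square_uniformity([(i + 0.5) / 100 for i in range(100)])
```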
Tests for Autocorrelation [Tests for RN]
Testing the autocorrelation between every m numbers (m is a.k.a. the lag), starting with
the ith number.
The autocorrelation ρim between the numbers Ri, Ri+m, Ri+2m, ..., Ri+(M+1)m,
where M is the largest integer such that i + (M+1)m ≤ N.
Hypothesis:
H0: ρim = 0, if the numbers are independent
H1: ρim ≠ 0, if the numbers are dependent

The estimator:
ρ̂im = [1/(M+1)] Σ(k=0 to M) R(i+km) R(i+(k+1)m) − 0.25

with standard deviation:
σ(ρ̂im) = √(13M + 7) / [12(M+1)]

For large M, the test statistic Z0 = ρ̂im / σ(ρ̂im) is approximately N(0,1) under H0.
If ρ̂im > 0, the subsequence has positive autocorrelation:
high random numbers tend to be followed by high ones, and low ones by low ones.
If ρ̂im < 0, the subsequence has negative autocorrelation:
low random numbers tend to be followed by high ones, and vice versa.
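The test computation can be sketched in Python; the indices follow the text's 1-based convention, with rs a plain 0-based list:

```python
import math

def autocorrelation_z(rs, i, m):
    """Z0 for the lag-m autocorrelation test starting at the ith number."""
    N = len(rs)
    M = (N - i) // m - 1  # largest M with i + (M+1)*m <= N
    s = sum(rs[i - 1 + k * m] * rs[i - 1 + (k + 1) * m] for k in range(M + 1))
    rho_hat = s / (M + 1) - 0.25
    sigma = math.sqrt(13 * M + 7) / (12 * (M + 1))
    return rho_hat / sigma  # reject H0 at level alpha if |Z0| > z_{alpha/2}
```

For a constant sequence of 0.5's, every product is 0.25, so ρ̂im = 0 and Z0 = 0.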
Shortcomings:
The test is not very sensitive for small values of M, particularly when the numbers being
tested are on the low side.
There is also a problem when "fishing" for autocorrelation by performing numerous tests:
If α = 0.05, there is a probability of 0.05 of rejecting a true hypothesis.
If 10 independent sequences are examined,
• The probability of finding no significant autocorrelation, by chance alone, is
0.95^10 ≈ 0.60.
• Hence, the probability of detecting significant autocorrelation when it does
not exist is about 40%.
Summary
In this chapter, we described:
Generation of random numbers
Testing for uniformity and independence
Caution:
Even generators that have been used for years, some of which are still in use, have been
found to be inadequate.
This chapter provides only the basics; also, even if generated numbers pass all the tests,
some underlying pattern might have gone undetected.
CHAPTER 8: Random Variate Generation
Inverse-transform Technique
• Find x such that r = F(x), i.e., x = F⁻¹(r)
• Graphically: enter the cdf at height r1 on the vertical axis and read off x1 = F⁻¹(r1) on the horizontal axis.
Exponential Distribution
Exponential cdf:
r = F(x) = 1 − e^(−λx), for x ≥ 0
Inverting,
Xi = F⁻¹(Ri) = −(1/λ) ln(1 − Ri), i = 1, 2, ...
Generate 200 Ri's from U(0,1) and apply the equation above; the histogram of the resulting Xi's approximates the exponential density.
Check: does the random variable X1 have the desired distribution?
P(X1 ≤ x0) = P(R1 ≤ F(x0)) = F(x0)
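The exponential inverse transform can be sketched in Python; λ = 1 and the sample size 200 are the illustrative values from the text:

```python
import math
import random

def exp_variate(r, lam):
    """Inverse transform for the exponential: X = -(1/lam) * ln(1 - R)."""
    return -math.log(1.0 - r) / lam

random.seed(0)  # reproducible illustration
xs = [exp_variate(random.random(), 1.0) for _ in range(200)]
# a histogram of xs approximates the exponential density with rate 1
```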
Other Distributions
• Examples of other distributions for which inverse cdf works are:
1. Uniform distribution
2. Weibull distribution
3. Triangular distribution
X̂ = F̂⁻¹(R) = x(i−1) + ai (R − (i−1)/n), for (i−1)/n < R ≤ i/n
where
ai = [x(i) − x(i−1)] / [i/n − (i−1)/n] = [x(i) − x(i−1)] / (1/n)
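A sketch of the piecewise-linear empirical inverse cdf in Python, assuming the observations are sorted and taking x(0) = 0 as the lower endpoint (a boundary assumption, natural for nonnegative repair times):

```python
import math

def empirical_inverse(r, data):
    """Piecewise-linear empirical inverse cdf; data is sorted, x(0) taken as 0."""
    n = len(data)
    i = max(1, math.ceil(r * n))             # interval with (i-1)/n < r <= i/n
    x_prev = data[i - 2] if i >= 2 else 0.0  # x(i-1), with x(0) = 0 assumed
    a_i = (data[i - 1] - x_prev) * n         # slope: (x(i) - x(i-1)) / (1/n)
    return x_prev + a_i * (r - (i - 1) / n)
```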
Example: Suppose the data collected for 100 broken-widget repair times are:
Example: Suppose the number of shipments, x, on the loading dock of IHW company is 0, 1, or 2. Internal
consultants have been asked to improve the efficiency of loading and hauling operation.
x    p(x)    F(x)
0    0.50    0.50
1    0.30    0.80
2    0.20    1.00
F(x) = 0,    x < 0
       0.5,  0 ≤ x < 1
       0.8,  1 ≤ x < 2
       1.0,  x ≥ 2

Consider R1 = 0.73: find xi such that F(xi−1) < R ≤ F(xi).
Here F(x0) = 0.5 < 0.73 ≤ 0.8 = F(x1); hence, x1 = 1.
The cdf of a discrete random variable consists of horizontal line segments with jumps of size p(x)
at those points x that the random variable can assume.
x = 0, if R ≤ 0.5
    1, if 0.5 < R ≤ 0.8
    2, if 0.8 < R ≤ 1.0
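The table lookup for this shipment example can be sketched as:

```python
def discrete_inverse(r):
    """Inverse transform for the shipment example: F(0)=0.50, F(1)=0.80, F(2)=1.00."""
    values = [0, 1, 2]
    cdf = [0.50, 0.80, 1.00]
    for x, f in zip(values, cdf):
        if r <= f:        # smallest x with R <= F(x)
            return x

print(discrete_inverse(0.73))  # 1, matching the example R1 = 0.73 -> x1 = 1
```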
Acceptance-Rejection technique
• Useful particularly when the inverse cdf does not exist in closed form; a.k.a. thinning
• Illustration: to generate random variates X ~ U(1/4, 1)
• Procedure:
Step 1. Generate R ~ U[0,1]
Step 2. If R ≥ 1/4, accept X = R (output R′); otherwise reject R and return to Step 1.
• R does not have the desired distribution, but R conditioned on the event {R ≥ 1/4}, denoted R′, does.
Probability of rejection: 1/4
Probability of acceptance: 3/4
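The two-step procedure as a Python sketch:

```python
import random

def u_quarter_to_one():
    """Acceptance-rejection for X ~ U(1/4, 1): regenerate until R >= 1/4."""
    while True:
        r = random.random()   # Step 1: R ~ U[0, 1)
        if r >= 0.25:         # Step 2: accept when the condition holds
            return r          # the accepted R has the U(1/4, 1) distribution

random.seed(1)
sample = [u_quarter_to_one() for _ in range(1000)]
# on average 1/4 of the candidate R's are rejected along the way
```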
Poisson Distribution:
N can be interpreted as the number of arrivals from a Poisson arrival process in one unit of
time. Recall that the interarrival times A1, A2, ... of successive customers are exponentially
distributed with rate α (i.e., α is the mean number of arrivals per unit time); in addition, an
exponential variate can be generated by Equation (5.3). Thus, there is a relationship between
the (discrete) Poisson distribution and the (continuous) exponential distribution, namely
N = n
if and only if
A1 + A2 + ... + An ≤ 1 < A1 + A2 + ... + An+1     (8.30)
Relation (5.29), N = n, says there were exactly n arrivals during one unit of time; relation
(8.30) says that the nth arrival occurred before time 1 while the (n+1)st arrival occurred after
time 1. Clearly, these two statements are equivalent. Proceed now by generating exponential
interarrival times until some arrival, say n + 1, occurs after time 1; then set N = n.
For efficient generation, relation (8.30) is usually simplified by first using Equation (5.3),
Ai = (−1/α) ln Ri, to obtain
Σ(i=1 to n) (−1/α) ln Ri ≤ 1 < Σ(i=1 to n+1) (−1/α) ln Ri
Next, multiplying through by −α (which reverses the sign of the inequality) and using the
fact that a sum of logarithms is the logarithm of a product gives
R1 R2 ··· Rn ≥ e^(−α) > R1 R2 ··· Rn+1     (5.31)
which is equivalent to relation (8.30). The procedure for generating a Poisson random
variate N is then:
Step 1. Set n = 0, P = 1.
Step 2. Generate a random number Rn+1 and replace P by P · Rn+1.
Step 3. If P < e^(−α), then accept N = n. Otherwise, reject the current n, increase n by one, and return to Step 2.
Notice that, upon completion of Step 2, P is equal to the rightmost expression in relation (5.31). The basic idea
of a rejection technique is again exhibited: if P ≥ e^(−α) in Step 3, then n is rejected and the generation process
must proceed through at least one more trial.
How many random numbers will be required, on average, to generate one Poisson variate N? If N = n, then
n + 1 random numbers are required, so the average number is given by E(N + 1) = α + 1,
which is quite large if the mean α of the Poisson distribution is large.
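The three steps as a Python sketch (α = 5 in the usage line is an illustrative value):

```python
import math
import random

def poisson_variate(alpha):
    """Generate N ~ Poisson(alpha) by multiplying uniforms until P < e^(-alpha)."""
    n, p = 0, 1.0                # Step 1: n = 0, P = 1
    threshold = math.exp(-alpha)
    while True:
        p *= random.random()     # Step 2: P = P * R_{n+1}
        if p < threshold:        # Step 3: accept N = n ...
            return n
        n += 1                   # ... otherwise reject n and continue

random.seed(3)
ns = [poisson_variate(5.0) for _ in range(10000)]
# the sample mean should be near alpha = 5
```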
NSPP: Non-stationary Poisson Process
• A Poisson arrival process with an arrival rate λ(t) that varies with time.
• Idea behind thinning:
1. Generate a stationary Poisson arrival process at the fastest rate, λ* = max λ(t):
generate E ~ Exp(λ*) and set t = t + E.
2. But "accept" only a portion of the arrivals, thinning out just enough to get the
desired time-varying rate: generate R ~ U(0,1) and accept the arrival at time t
only if R ≤ λ(t)/λ*.
Here λ* = max λ(t) = 1/5, so the mean interarrival time at the fastest rate is 5 minutes.

t (min)   Mean time between arrivals (min)   Arrival rate λ(t) (#/min)
0         15                                  1/15
60        12                                  1/12
120       7                                   1/7
180       5                                   1/5
240       8                                   1/8
300       10                                  1/10
360       15                                  1/15
420       20                                  1/20
480       20                                  1/20

Step 2. E = −5 ln(0.213) = 13.13, so t = 13.13.
Step 3. Generate R = 0.8830. λ(13.13)/λ* = (1/15)/(1/5) = 1/3.
Since R > 1/3, do not generate the arrival.
Step 2. For random number R = 0.5530, E = −5 ln(0.553) = 2.96, so t = 13.13 + 2.96 = 16.09.
Step 3. λ(16.09)/λ* = (1/15)/(1/5) = 1/3, and i = i + 1 = 2.
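The thinning algorithm as a Python sketch; the piecewise-constant rate function encodes the table above, and the breakpoint handling (rates held constant between table entries) is an assumption about how the table is meant to be read:

```python
import math
import random

RATES = [(0, 1/15), (60, 1/12), (120, 1/7), (180, 1/5), (240, 1/8),
         (300, 1/10), (360, 1/15), (420, 1/20), (480, 1/20)]
LAM_STAR = 1/5  # fastest rate in the table

def lam(t):
    """Piecewise-constant arrival rate from the table."""
    rate = RATES[0][1]
    for start, r in RATES:
        if t >= start:
            rate = r
    return rate

def nspp_arrivals(horizon):
    """Thinning: stationary arrivals at rate lam*, accepted with prob lam(t)/lam*."""
    t, arrivals = 0.0, []
    while True:
        t += -math.log(1.0 - random.random()) / LAM_STAR  # E ~ Exp(lam*)
        if t > horizon:
            return arrivals
        if random.random() <= lam(t) / LAM_STAR:          # accept ("thin")
            arrivals.append(t)

random.seed(11)
arr = nspp_arrivals(480.0)
```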
Gamma Distribution:
Special Properties
• Some random variates can be generated by exploiting special properties, for example:
1. Direct transformation
2. Convolution
Direct Transformation
1. Approach for normal(0,1):
Consider two standard normal random variables, Z1 and Z2, plotted as a point in the plane.
In polar coordinates:
Z1 = B cos φ
Z2 = B sin φ
B² = Z1² + Z2² has a chi-square distribution with 2 degrees of freedom (which is the same
as an exponential distribution with mean 2). Hence,
B = (−2 ln R)^(1/2)
The angle φ is uniform on (0, 2π) and independent of B, so with R1 and R2 independent U(0,1):
Z1 = (−2 ln R1)^(1/2) cos(2π R2)
Z2 = (−2 ln R1)^(1/2) sin(2π R2)
2. Approach for normal(µ, σ²): generate Zi ~ N(0,1), then
Xi = µ + σ Zi
3. Approach for lognormal(µ, σ²):
Generate Xi ~ N(µ, σ²)
Yi = e^(Xi)
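The direct transformation can be sketched in Python (this is the polar-coordinates construction described above, often called the Box-Muller method):

```python
import math
import random

def direct_normal(mu=0.0, sigma=1.0):
    """Return two independent N(mu, sigma^2) variates from two uniforms."""
    r1 = 1.0 - random.random()               # in (0, 1], so the log is defined
    r2 = random.random()
    b = math.sqrt(-2.0 * math.log(r1))       # radius: B^2 ~ exponential, mean 2
    z1 = b * math.cos(2.0 * math.pi * r2)    # angle phi = 2*pi*R2 ~ U(0, 2pi)
    z2 = b * math.sin(2.0 * math.pi * r2)
    return mu + sigma * z1, mu + sigma * z2  # Xi = mu + sigma * Zi

random.seed(5)
zs = [z for _ in range(5000) for z in direct_normal()]
# sample mean should be near 0 and sample variance near 1
```

For a lognormal(µ, σ²) variate, exponentiate: y = math.exp(direct_normal(mu, sigma)[0]).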