
Markov Models

Prof C S P Rao
Dept. of Mechanical Engg
N I T, Warangal
Who was Markov?
• Andrei A. Markov graduated from Saint Petersburg University in 1878 and subsequently became a professor there.

• His early work dealt mainly with number theory and analysis.

• Markov is particularly remembered for his study of Markov chains.
Markov Process
• A Markov process is a simple stochastic process in which the distribution of future states depends only on the present state and not on how the process arrived in the present state.

• A Markov chain is a discrete-time stochastic process with the Markov property.
Terminology
• A Markov chain is irreducible if every state can be reached from every other state.
• Each state of a Markov chain is either transient or recurrent.
• A state is called absorbing if, once the chain reaches that state, it stays there forever.
• A Markov chain is acyclic if, once it leaves any state, it never returns to that state.
Markov Property
• Many systems have the property that, given the present state, the past states have no influence on the future. This property is called the Markov property.

• We can say a Markov process is a process or simulation that satisfies the Markov property.
Markov Chain
Let {Xt : t ∈ T} be a stochastic process with discrete state space S and discrete time set T satisfying

P(Xn+1 = j | Xn = i, Xn-1 = in-1, ..., X0 = i0) = P(Xn+1 = j | Xn = i)

for any set of states i0, i1, ..., in-1, i, j in S and any n ≥ 0; such a process is called a Markov chain.
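Since a Markov chain is fully specified by its one-step transition probabilities, a short simulation sketch may help make the definition concrete. The following Python snippet is not part of the original slides, and the two-state matrix P in it is purely illustrative; it generates a sample path in which each step depends only on the current state.

import numpy as np

def simulate_chain(P, start, n_steps, rng=None):
    """Generate a sample path X0, X1, ..., Xn of a Markov chain with TPM P."""
    rng = np.random.default_rng() if rng is None else rng
    P = np.asarray(P, dtype=float)
    path = [start]
    for _ in range(n_steps):
        # the next state depends only on the current state (Markov property)
        path.append(rng.choice(len(P), p=P[path[-1]]))
    return path

# illustrative two-state chain (e.g., machine working / under repair)
P = [[0.9, 0.1],
     [0.4, 0.6]]
print(simulate_chain(P, start=0, n_steps=10))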
Markov Model
• The term Markov model sometimes refers only to Markov chains with stationary transition probabilities, although some authors avoid this narrower usage to prevent confusion.
• Markov model is also used to refer to all Markov processes, i.e., all processes satisfying the Markov property.
Components of Stochastic Processes

The state space of a stochastic process is the set of all values that the Xt's can take. (We will be concerned with stochastic processes having a finite number of states.)

Time: t = 0, 1, 2, . . .
State: m-dimensional vector, s = (s1, s2, . . . , sm) or (s0, s1, . . . , sm-1)
Sequence {Xt}: each Xt takes one of the m values in s, so Xt ↔ s.
Markov Chain Definition

A stochastic process { Xt } is called a Markov Chain if

Pr{ Xt+1 = j | X0 = k0, . . . , Xt-1 = kt-1, Xt = i }

= Pr{ Xt+1 = j | Xt = i } ← transition probabilities

for every i, j, k0, . . . , kt-1 and for every t.


The future behavior of the system depends only on the
current state i and not on any of the previous states.

Discrete-Time Markov Chains
Discrete-Time Markov Chain
A stochastic process { Xn } where n ∈ N = { 0, 1, 2, . . . } is
called a discrete-time Markov chain if
Pr{ Xn+1 = j | X0 = k0, . . . , Xn-1 = kn-1, Xn = i }

= Pr{ Xn+1 = j | Xn = i } ← transition probabilities

for every i, j, k0, . . . , kn-1 and for every n.


The future behavior of the system depends only on the
current state i and not on any of the previous states.

Stationary Transition Probabilities
Pr{ Xn+1 = j | Xn = i } = Pr{ X1 = j | X0 = i } for all n
(They don’t change over time.)
We will only consider stationary Markov chains.

The one-step transition matrix for a Markov chain with states S = { 0, 1, 2 } is

    | p00  p01  p02 |
P = | p10  p11  p12 |
    | p20  p21  p22 |

where pij = Pr{ X1 = j | X0 = i }
Example

• Consider a manufacturing facility comprising a single machine that is prone to failures.
• At any instant of time the machine is either
  – working properly (state ‘0’), or
  – undergoing repair (state ‘1’).
• Suppose that the state of the machine is examined once every hour.
• Let ‘a’ be the probability that the machine fails in a given hour, and ‘b’ the probability that the failed machine gets repaired in a given hour.
• More specifically, ‘a’ is the probability that the machine is in the failed condition at the next observation given that it is working at the current observation.

• Similarly, ‘b’ is the probability that the machine has been repaired by the next observation given that it is in the failed state at the current observation.

• Assume that these probabilities remain the same regardless of the history of the machine's operation.

• We also assume that these probabilities are the same at every observation epoch.
Formulation of the system
The above system can be formulated as a homogeneous DTMC

{Xn ∈ S : n ∈ N}, where S = {0, 1} and the time instants t0, t1, t2, … correspond to 0, 1h, 2h, … respectively.

The TPM is given by

P = | 1-a    a  |        0 ≤ a, b ≤ 1
    |  b    1-b |
DTMC models for a single machine system
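As a rough sketch of how this TPM can be used (the values a = 0.1 and b = 0.6 below are assumptions for illustration, not data from the slides), the n-step state probabilities are obtained by repeatedly multiplying the initial distribution by P:

import numpy as np

a, b = 0.1, 0.6                      # assumed P(fail in an hour), P(repair in an hour)
P = np.array([[1 - a, a],
              [b, 1 - b]])

pi = np.array([1.0, 0.0])            # machine starts in state 0 (working)
for n in range(1, 9):
    pi = pi @ P                      # distribution after n hours
    print(f"hour {n}: P(working) = {pi[0]:.4f}, P(repair) = {pi[1]:.4f}")

# as n grows, pi approaches the limiting distribution (b, a) / (a + b)
print("limit:", np.array([b, a]) / (a + b))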
Example

• Consider an NC machine processing parts that may belong to one of three part types.
• Assume that raw parts of each type are always available and wait in separate queues.
• We consider the system evolution as constituting a homogeneous Markov chain, with state space S = {1, 2, 3}, where each state corresponds to the type of part being processed.
• The observation epochs t0, t1, t2, … are the time instants at which the ith part (i = 1, 2, 3, …) is taken up for processing.
An NC machine processing three types of parts

• Suppose that the NC machine processes the parts in a cyclic way, i.e. a type 1 part followed by a type 2 part, followed by a type 3 part, with the same sequence repeated again and again.
• The state transition diagram is shown below.

The TPM of the system is

    | 0  1  0 |
P = | 0  0  1 |
    | 1  0  0 |
Suppose instead that the type of the next part to be processed is selected probabilistically: part type i is chosen next with probability qi (i = 1, 2, 3), where 0 < qi < 1 and q1 + q2 + q3 = 1.
In this case, the state transition diagram and TPM are shown below.

    | q1  q2  q3 |
P = | q1  q2  q3 |
    | q1  q2  q3 |
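A quick check, as a sketch (the particular values q = (0.5, 0.3, 0.2) below are assumed for illustration): because every row of this TPM equals (q1, q2, q3), the distribution (q1, q2, q3) is reached after a single step and is stationary.

import numpy as np

q = np.array([0.5, 0.3, 0.2])        # assumed q1, q2, q3 with q1 + q2 + q3 = 1
P = np.tile(q, (3, 1))               # every row of the TPM is (q1, q2, q3)

print(q @ P)                         # equals q, so (q1, q2, q3) is stationary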
Example 3
• Consider the queuing network shown in the figure below.

Closed central server queuing network model

After transportation by the AGV, the part leaves the system with probability q0 or undergoes one more operation on one of the machines with probabilities q1, q2, …, qm.
Let there be a single job in this system and let us look at the progress of this job in the system.
It is easy to note that the job can be in (m+1) states:
0 (if being served by the AGV)
i (if being served by Mi, i = 1, 2, …, m)
The flow of the job can be described by a DTMC model. The TPM of this model is

    | q0  q1  q2  ...  qm |
    |  1   0   0  ...   0 |
P = |  1   0   0  ...   0 |
    | ................... |
    |  1   0   0  ...   0 |
DTMC model for central server network

An Example Problem -- Gambler’s Ruin
At time zero, I have X0 = $2, and each day I make a $1 bet.
I win with probability p and lose with probability 1– p.
I’ll quit if I ever obtain $4 or if I lose all my money.
Xt = amount of money I have after the bet on day t.

So X1 = 3 with probability p, and X1 = 1 with probability 1 – p.

If Xt = 4 then Xt+1 = Xt+2 = ••• = 4.
If Xt = 0 then Xt+1 = Xt+2 = ••• = 0.

The set of possible values of Xt is S = { 0, 1, 2, 3, 4 }.


Property of Transition Matrix

If the state space is S = { 0, 1, . . . , m–1 } then we have

∑j pij = 1 ∀ i (we must go somewhere)  and  pij ≥ 0 ∀ i, j (each transition has probability ≥ 0)

The stationary property assumes that these values do not change with time.
Transition Matrix of the Gambler’s Problem
At time zero I have X0 = $2, and each day I make a $1 bet.
I win with probability p and lose with probability 1– p.
I’ll quit if I ever obtain $4 or if I lose all my money.
Xt = amount of money I have after the bet on day t.

Transition Matrix of Gambler’s Ruin Problem

        0     1     2     3     4
0       1     0     0     0     0
1      1-p    0     p     0     0
2       0    1-p    0     p     0
3       0     0    1-p    0     p
4       0     0     0     0     1
State Transition Diagram

Node for each state; arc from node i to node j if pij > 0.

The state-transition diagram of the Gambler’s Ruin problem:

[Figure: states 0, 1, 2, 3, 4 in a row; for i = 1, 2, 3 there is an arc i → i+1 with probability p and an arc i → i–1 with probability 1–p; states 0 and 4 have self-loops with probability 1.]

Notice that nodes 0 and 4 are “trapping” (absorbing) nodes.
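As an aside (not on the original slides), the trapping behaviour can be explored with a short Monte Carlo sketch; the win probability p below is an assumption for illustration:

import numpy as np

def ruin_probability(p=0.5, start=2, target=4, n_runs=100_000, seed=0):
    """Estimate the probability of ending at $0 rather than $4."""
    rng = np.random.default_rng(seed)
    ruined = 0
    for _ in range(n_runs):
        x = start
        while 0 < x < target:
            x += 1 if rng.random() < p else -1   # win $1 w.p. p, lose $1 otherwise
        ruined += (x == 0)
    return ruined / n_runs

print(ruin_probability(p=0.5))   # for p = 0.5 the exact answer is 0.5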

Printer Repair Problem
• Two printers are used for Russ Center.

• When both are working in the morning, there is a 30% chance that one will fail by evening and a 10% chance that both will fail.

• If only one printer is in service at the beginning of the day, there is a 20% chance that it will fail by the close of business.

• If neither is working in the morning, the office sends all work to a printing service.

• A printer that fails during the day is repaired overnight and returned early the next day.
States for the Printer Repair Example

Index   State       State definition

0       s0 = (0)    No printer has failed. The office starts the day with both printers functioning properly.

1       s1 = (1)    One printer has failed. The office starts the day with one working printer and the other in the shop until the next morning.

2       s2 = (2)    Both printers have failed. All work must be sent out for the day.
Events and Probabilities for the Printer Repair Example

Index   Current state   Event                                          Probability   Next state

0       s0 = (0)        Neither printer fails.                         0.6           s = (0)
                        One printer fails.                             0.3           s = (1)
                        Both printers fail.                            0.1           s = (2)

1       s1 = (1)        The printer does not fail and the other
                        is returned.                                   0.8           s = (0)
                        The printer fails and the other is returned.   0.2           s = (1)

2       s2 = (2)        Both printers are returned.                    1.0           s = (0)


State-Transition Matrix and Network
State-Transition Matrix
The major properties of a Markov chain can be described by the m × m matrix P = (pij). For the printer repair example:

    | 0.6  0.3  0.1 |
P = | 0.8  0.2   0  |
    |  1    0    0  |

State-Transition Network:
Node for each state; arc from node i to node j if pij > 0.

[Figure for the printer repair example: arcs 0→0 (0.6), 0→1 (0.3), 0→2 (0.1), 1→0 (0.8), 1→1 (0.2), 2→0 (1).]
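One way (not shown on the slides) to turn this matrix into long-run information is to compute the left eigenvector of P for eigenvalue 1; a minimal Python sketch with the printer-repair P:

import numpy as np

P = np.array([[0.6, 0.3, 0.1],
              [0.8, 0.2, 0.0],
              [1.0, 0.0, 0.0]])

vals, vecs = np.linalg.eig(P.T)               # left eigenvectors of P
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
pi /= pi.sum()                                # normalize to a probability vector
print(np.round(pi, 4))                        # long-run fractions of days in states 0, 1, 2

With these numbers the office spends roughly 68% of days with both printers up, about 25% with one down, and about 7% with both down (all work sent out).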
Market Share/Brand Switching Problem
Market Share Problem:
You are given the original market shares of three companies. The following table gives the number of consumers that switch from brand i to brand j in two consecutive weeks.

Brand (i \ j)    1     2     3    Total
1               90     7     3     100
2                5   205    40     250
3               30    18   102     150
Total          125   230   145     500
How to model the problem as a stochastic process ?
Empirical Transition Probabilities for Brand Switching, pij
Transition Matrix

Brand (i \ j)        1                 2                 3
1               90/100 = 0.90     7/100 = 0.07      3/100 = 0.03
2                5/250 = 0.02   205/250 = 0.82     40/250 = 0.16
3               30/150 = 0.20    18/150 = 0.12    102/150 = 0.68
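These empirical pij are simply the switching counts divided by their row totals; a one-line computation, sketched in Python with the counts from the table above:

import numpy as np

counts = np.array([[90,   7,   3],
                   [ 5, 205,  40],
                   [30,  18, 102]], dtype=float)

P = counts / counts.sum(axis=1, keepdims=True)   # divide each row by its total
print(np.round(P, 2))
# [[0.9  0.07 0.03]
#  [0.02 0.82 0.16]
#  [0.2  0.12 0.68]]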

Assumption Revisited
• Markov Property
Pr{ Xt+1 = j | X0 = k0, . . . , Xt-1 = kt-1, Xt = i }

= Pr{ Xt+1 = j | Xt = i } ← transition probabilities

for every i, j, k0, . . . , kt-1 and for every t.

• Stationary Property

Pr{ Xt+1 = j | Xt = i } = Pr{ X1 = j | X0 = i } for all t

(They don’t change over time)


We will only consider stationary Markov chains.

Transform a Process to a Markov Chain

Sometimes a non-Markovian stochastic process can be transformed into a Markov chain by expanding the state space.

Example: Suppose that the chance of rain tomorrow depends on the weather conditions for the previous two days (yesterday and today).

Specifically,
P{ rain tomorrow | rain last 2 days (RR) } = .7
P{ rain tomorrow | rain today but not yesterday (NR) } = .5
P{ rain tomorrow | rain yesterday but not today (RN) } = .4
P{ rain tomorrow | no rain in last 2 days (NN) } = .2

Does the Markovian property hold?


The Weather Prediction Problem

How to model this problem as a Markovian Process ??

The state space: 0 (RR), 1 (NR), 2 (RN), 3 (NN), where a state records (yesterday, today).

The transition matrix:

          0 (RR)   1 (NR)   2 (RN)   3 (NN)
0 (RR)      .7        0       .3        0
1 (NR)      .5        0       .5        0
2 (RN)       0       .4        0       .6
3 (NN)       0       .2        0       .8

This is a discrete-time Markov process.
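A small sketch of the state-space expansion (using only the four conditional probabilities given above) builds the same 4 × 4 matrix programmatically:

import numpy as np

states = ["RR", "NR", "RN", "NN"]          # (yesterday, today)
p_rain = {"RR": 0.7, "NR": 0.5, "RN": 0.4, "NN": 0.2}

P = np.zeros((4, 4))
for i, st in enumerate(states):
    rain = p_rain[st]                      # P(rain tomorrow | current two-day state)
    today = st[1]
    P[i, states.index(today + "R")] = rain       # tomorrow's state if it rains
    P[i, states.index(today + "N")] = 1 - rain   # tomorrow's state if it stays dry

print(P)                                   # matches the transition matrix above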

Choosing Balls from an Urn
An urn contains two unpainted balls at present. We
choose a ball at random and flip a coin.

If the chosen ball is unpainted and the coin comes


up heads, we paint the chosen unpainted ball red

If the chosen ball is unpainted and the coin comes up


tails, we paint the chosen unpainted ball blue.
If the chosen ball has already been painted, then (regardless of whether the coin comes up heads or tails) we change the color of the ball (from red to blue or from blue to red).

Model this problem as a Discrete Time Markov Chain


(represent it using state diagram & transition matrix)
Insurance Company Example
An insurance company charges customers annual
premiums based on their accident history
in the following fashion:

• No accident in last 2 years: $250 annual premium


• Accidents in each of last 2 years: $800 annual premium
• Accident in only 1 of last 2 years: $400 annual premium

Historical statistics:
1. If a customer had an accident last year, then they have a 10% chance of having one this year;
2. If they had no accident last year, then they have a 3% chance of having one this year.
Find the steady-state probabilities and the long-run average annual premium paid by the customer.

Solution approach: Construct a Markov chain with four states: (N, N), (N, Y), (Y, N), (Y, Y), where these indicate (accident last year, accident this year).

              (N,N)  (N,Y)  (Y,N)  (Y,Y)
P =  (N,N)     .97    .03     0      0
     (N,Y)      0      0     .90    .10
     (Y,N)     .97    .03     0      0
     (Y,Y)      0      0     .90    .10

State-Transition Network for Insurance Company

[Figure: arcs (N,N)→(N,N) 0.97, (N,N)→(N,Y) 0.03, (N,Y)→(Y,N) 0.90, (N,Y)→(Y,Y) 0.10, (Y,N)→(N,N) 0.97, (Y,N)→(N,Y) 0.03, (Y,Y)→(Y,N) 0.90, (Y,Y)→(Y,Y) 0.10.]

This is an ergodic Markov chain:


All states communicate (irreducible);
Each state is recurrent (you will return, eventually);
Each state is aperiodic.

Solving the steady-state equations:

π(N,N) = 0.97 π(N,N) + 0.97 π(Y,N)
π(N,Y) = 0.03 π(N,N) + 0.03 π(Y,N)
π(Y,N) = 0.9 π(N,Y) + 0.9 π(Y,Y)
π(N,N) + π(N,Y) + π(Y,N) + π(Y,Y) = 1

Solution:
π(N,N) = 0.939, π(N,Y) = 0.029, π(Y,N) = 0.029, π(Y,Y) = 0.003

and the long-run average annual premium is

0.939×250 + 0.029×400 + 0.029×400 + 0.003×800 = 260.5
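A sketch of the same calculation done numerically (replacing one balance equation with the normalization condition):

import numpy as np

P = np.array([[0.97, 0.03, 0.00, 0.00],     # rows/columns: (N,N), (N,Y), (Y,N), (Y,Y)
              [0.00, 0.00, 0.90, 0.10],
              [0.97, 0.03, 0.00, 0.00],
              [0.00, 0.00, 0.90, 0.10]])

A = np.vstack([(P.T - np.eye(4))[:-1], np.ones(4)])   # pi P = pi plus sum(pi) = 1
b = np.array([0.0, 0.0, 0.0, 1.0])
pi = np.linalg.solve(A, b)

premiums = np.array([250, 400, 400, 800])
print(np.round(pi, 3))                                # ~ [0.939, 0.029, 0.029, 0.003]
print("long-run average premium:", round(pi @ premiums, 1))   # ~ 260.5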
First Passage Times
Let µij = expected number of steps to transition from state i to state j.

If the probability that we will eventually visit state j given that we start in i is less than one, then µij = +∞.

For example, in the Gambler’s Ruin problem, µ20 = +∞ because there is a positive probability that, starting in state 2, we will be absorbed in state 4 and hence never visit state 0.

Computations for All States Recurrent
If the probability of eventually visiting state j given that we start in i is 1, then the expected number of steps until we first visit j is given by

µij = 1 + ∑r≠j pir µrj,   for i = 0, 1, . . . , m–1

The 1 accounts for the first step (it always takes at least one step); we go from i to r in the first step with probability pir, and from r it takes µrj steps on average to reach j.

For fixed j, this is a linear system of m equations in the m unknowns µij, i = 0, 1, . . . , m–1.
First-Passage Analysis for Insurance Company
Suppose that we start in state (N,N) and want to find
the expected number of years until we have accidents
in two consecutive years (Y,Y).

This transition will occur with probability 1, eventually.


For convenience number the states
0 1 2 3
(N,N) (N,Y) (Y,N) (Y,Y)

Then, µ03 = 1 + p00 µ03 + p01 µ13 + p02µ23

µ13 = 1 + p10 µ03 + p11 µ13 + p12µ23

µ23 = 1 + p20 µ03 + p21 µ13 + p22µ23


Using

              (N,N)  (N,Y)  (Y,N)  (Y,Y)
P =  0 (N,N)   .97    .03     0      0
     1 (N,Y)    0      0     .90    .10
     2 (Y,N)   .97    .03     0      0
     3 (Y,Y)    0      0     .90    .10

these become

µ03 = 1 + 0.97µ03 + 0.03µ13
µ13 = 1 + 0.9µ23
µ23 = 1 + 0.97µ03 + 0.03µ13

Solution: µ03 = 343.3, µ13 = 310, µ23 = 343.3

So, on average it takes 343.3 years to transition


from (N,N) to (Y,Y).
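The same three equations can be solved mechanically: writing them as (I − P restricted to the non-target states) µ = 1, a short sketch is:

import numpy as np

P = np.array([[0.97, 0.03, 0.00, 0.00],     # states 0..3 = (N,N), (N,Y), (Y,N), (Y,Y)
              [0.00, 0.00, 0.90, 0.10],
              [0.97, 0.03, 0.00, 0.00],
              [0.00, 0.00, 0.90, 0.10]])

others = [0, 1, 2]                          # every state except the target state 3 = (Y,Y)
A = np.eye(3) - P[np.ix_(others, others)]   # from mu_i3 = 1 + sum_{r != 3} p_ir mu_r3
mu = np.linalg.solve(A, np.ones(3))
print(np.round(mu, 1))                      # ~ [343.3, 310.0, 343.3] years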
Continuous Time Markov Chains
Continuous Time Markov Chains

A stochastic process is a sequence of random


variables indexed by an ordered set T.
Generally, T records time.
A discrete-time stochastic process
is a sequence of random variables
X0, X1, X2, . . . typically denoted by { Xn }.
where index T or n takes countable discrete values

Discrete Time Markov chains


1) discrete: we do not care how long each step takes
2) Markov Property:
What happens only depends on the previous state,
not on how you get there.
Continuous Time Markov Chain
A Continuous Time Markov Chain
is a sequence of random variables Xt , t ≥ 0
which satisfies the following
Markovian Property
Pr{ Xt+s = j | Xu, 0 ≤ u ≤ s } = Pr{ Xt+s = j | Xs }

At time s, the future behavior of the system Xt+s depends only on the current state Xs and not on the past values Xu, 0 ≤ u < s.
Stationary Property
Pr{ Xt+s = j | Xs } = Pr { X t = j | X0 }

Property of Continuous Time Markov Chain

Let the state space be S = { 0, 1, . . . , m–1 }. Suppose we entered state i about s minutes ago; what is the probability that a transition will not occur in the next t time units?
According to the Markovian property, looking at the inter-arrival (holding) time Ti,

Pr{ Ti > t + s | Ti > s } = Pr{ Ti > t }

The exponential distribution is the only continuous distribution that satisfies this memoryless property.
Continuous Time Markov Chain Definition

Examples
Poisson process
Birth/death processes in general
The M/M/s queue
Brownian motion

Illustration
The M/M/s queue
An ATM example (M/M/1/5 Queue)

Consider an ATM located at the foyer of a bank. Only one person can use the machine, so a queue forms when two or more customers are present.
The foyer is limited in size and can hold only five people; arriving customers balk when the foyer is full.

Statistics indicate that the average time between arrivals is 30 seconds (2 customers per minute), whereas the service time averages 24 seconds (2.5 customers per minute). Both times follow exponential distributions.

Try to help the manager of the bank answer the following questions.
An ATM example (M/M/1/5 Queue)

Manager's Questions

a) The proportion of time that the ATM is idle?
b) The efficiency of the ATM?
c) The throughput rate of the system?

Customer's Questions

d) The proportion of customers that obtain immediate service?
e) The proportion of customers who arrive and find the system full?
f) The average time in the system?
CTMC Model for the ATM Example
What is the state space of the system?
Number of customers in the system: {0}, {1}, {2}, {3}, {4}, {5}

For a DTMC we would use a transition matrix, P.
(BTW, can you model this problem as a DTMC?)

For a CTMC we use a rate matrix, R.

[State-transition diagram: states 0-5 with arrival rates λ0, …, λ4 moving right and service rates µ1, …, µ5 moving left.]
Steady State Probability
What is the long-term steady-state probability?

We will investigate steady-state (not transient) results for the CTMC based on the same flow balance principle (rate in = rate out).

Let πn = steady-state probability of being in state n.

π0 + π1 + π2 + π3 + π4 + π5 = 1

Balance Equations

[Birth-death diagram: states 0-5 with arrival rate λ between neighbouring states moving right and service rate µ moving left.]

Flow into 0 → µπ1 = λπ0 ← flow out of 0
Flow into 1 → λπ0 + µπ2 = (λ + µ)π1 ← flow out of 1
Flow into 2 → λπ1 + µπ3 = (λ + µ)π2 ← flow out of 2
...
Flow into 5 → λπ4 = µπ5 ← flow out of 5


Rate Matrix & Solution for M/M/1/5
Rate Matrix
States     0     1     2     3     4     5
0          0     2     0     0     0     0
1        2.5     0     2     0     0     0
2          0   2.5     0     2     0     0
3          0     0   2.5     0     2     0
4          0     0     0   2.5     0     2
5          0     0     0     0   2.5     0
Solution
State 0   State 1   State 2   State 3   State 4   State 5
0.271     0.217     0.173     0.139     0.111     0.0888
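The solution row can be reproduced with the birth-death recursion πn = (λ/µ) πn−1 followed by normalization; a minimal Python sketch with λ = 2 and µ = 2.5:

lam, mu, N = 2.0, 2.5, 5

c = [1.0]
for n in range(1, N + 1):
    c.append(c[-1] * lam / mu)          # unnormalized pi_n
total = sum(c)
pi = [x / total for x in c]

print([round(p, 3) for p in pi])        # ~ [0.271, 0.217, 0.173, 0.139, 0.111, 0.089]
print("idle fraction      :", round(pi[0], 3))
print("throughput         :", round(lam * (1 - pi[-1]), 3))
print("mean number in shop:", round(sum(n * p for n, p in enumerate(pi)), 3))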
Solution Analysis
Manager's Questions
a) The proportion of time that the ATM is idle? π0 = 27%
b) The efficiency of the ATM? 1 − π0 = 73%
c) The throughput rate of the system? λ(1 − π5) = 1.822
   The average number of customers in the system?
   1π1 + 2π2 + 3π3 + 4π4 + 5π5 = 1.868
Customer's Questions
d) The proportion of customers who obtain immediate service? π0 = 27%
e) The proportion of customers who find the system full? π5 ≈ 9%
f) The average time in the system? Use Little's Law (see queuing theory).
Additional Questions
What if we add a new ATM machine? How will the system perform? M/M/2/5
What if we add two new ATM machines? How will the system perform? M/M/3/5

What if we add more space so that 8 customers can wait? How will the system perform? M/M/1/8

What if we add more space so that 12 customers can wait? How will the system perform? M/M/1/12

What if we add a human teller with service time exponentially distributed at 1 minute per customer? How will the system perform? This is not a standard queue.

What if we want to add a new ATM machine there, what


will the system perform? M/M/2/5
What will the state-transition network looks like?

λ λ λ λ λ

0 1 2 3 4 5

µ 2µ 2µ 2µ 2µ

59
Rate Matrix & Solution for M/M/2/5
Rate Matrix
States     0     1     2     3     4     5
0          0     2     0     0     0     0
1        2.5     0     2     0     0     0
2          0     5     0     2     0     0
3          0     0     5     0     2     0
4          0     0     0     5     0     2
5          0     0     0     0     5     0
Solution
State 0   State 1   State 2   State 3   State 4   State 5
0.431     0.345     0.137     0.055     0.022     0.009
M/M/3/5

What if we add two new ATM machines? How will the system perform? M/M/3/5
What will the state-transition network look like?

[Birth-death diagram: states 0-5, arrival rate λ moving right; service rates µ, 2µ, 3µ, 3µ, 3µ moving left.]
Rate Matrix & Solution for M/M/3/5
Rate Matrix
States     0     1     2     3     4     5
0          0     2     0     0     0     0
1        2.5     0     2     0     0     0
2          0     5     0     2     0     0
3          0     0   7.5     0     2     0
4          0     0     0   7.5     0     2
5          0     0     0     0   7.5     0
Solution
State 0   State 1   State 2   State 3   State 4   State 5
0.448     0.359     0.143     0.038     0.010     0.0027
Comparison of different alternatives: M/M/1/5, M/M/2/5 and M/M/3/5

System Measure                     1 ATM     2 ATM     3 ATM
Efficiency                         0.729     0.396     0.265
Throughput                         1.822     1.982     1.995
Average # in queue                 1.14      0.1258    0.016
Average time in queue              0.625     0.064     0.008
Proportion of customers who wait   0.64      0.2152    0.0484
Proportion of customers lost       0.0888    0.0088    0.0027
The addition of spaces to the foyer

What if we add more space so that 8 customers can wait? How will the system perform? M/M/1/8

[Birth-death diagram: states 0, 1, 2, 3, …, 8 with arrival rate λ and service rate µ.]

What if we add more space so that 12 customers can wait? How will the system perform? M/M/1/12

[Birth-death diagram: states 0, 1, 2, 3, …, 12 with arrival rate λ and service rate µ.]
Comparison of different alternatives: M/M/1/5, M/M/1/8 and M/M/1/12

System Measure                     M/M/1/5   M/M/1/8   M/M/1/12
Efficiency                         0.729     0.769     0.7884
Throughput                         1.822     1.923     1.971
Average # in queue                 1.14      1.84      2.46
Average time in queue              0.625     0.955     1.246
Proportion of customers who wait   0.64      0.73      0.77
Proportion of customers lost       0.0888    0.038     0.015
Adding a Human Server
The manager decides to add a human teller with a service time exponentially distributed at 1 minute per customer.

However, when a customer enters the system, he/she prefers to go to the human server first if that server is available.
In this case, how would the system perform?

Approach
Notice that you now have to differentiate between the two servers.
Let us use (HS, MS, # waiting) to represent the system, and suppose the foyer can hold at most 5 people:
(0,0,0), (0,1,0), (1,0,0), (1,1,0), (1,1,1), (1,1,2), (1,1,3)
Addition of a Human Server
State: (HS, MS, # waiting)

[State-transition diagram: (0,0,0) → (1,0,0) at rate λ; (1,0,0) → (0,0,0) at µh and → (1,1,0) at λ; (0,1,0) → (0,0,0) at µm and → (1,1,0) at λ; (1,1,0) → (0,1,0) at µh, → (1,0,0) at µm and → (1,1,1) at λ; (1,1,k) → (1,1,k+1) at λ and (1,1,k) → (1,1,k−1) at µm + µh for the waiting states.]

µm = 2.5, ATM service rate
µh = 1, human service rate
Addition of Human Server
Rate Matrix
States    0,0,0   0,1,0   1,0,0   1,1,0   1,1,1   1,1,2   1,1,3
0,0,0       0       0       2       0       0       0       0
0,1,0     2.5       0       0       2       0       0       0
1,0,0       1       0       0       2       0       0       0
1,1,0       0       1     2.5       0       2       0       0
1,1,1       0       0       0     3.5       0       2       0
1,1,2       0       0       0       0     3.5       0       2
1,1,3       0       0       0       0       0     3.5       0
Solution
States     0,0,0   0,1,0   1,0,0   1,1,0   1,1,1   1,1,2   1,1,3
Solution   0.214   0.046   0.313   0.205   0.117   0.067   0.038
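The solution row can be checked numerically from the rate matrix: form the generator Q = R − diag(row sums) and solve πQ = 0 together with Σπ = 1. A sketch in Python:

import numpy as np

R = np.array([
    #  000  010  100  110  111  112  113
    [0.0, 0.0, 2.0, 0.0, 0.0, 0.0, 0.0],   # (0,0,0)
    [2.5, 0.0, 0.0, 2.0, 0.0, 0.0, 0.0],   # (0,1,0)
    [1.0, 0.0, 0.0, 2.0, 0.0, 0.0, 0.0],   # (1,0,0)
    [0.0, 1.0, 2.5, 0.0, 2.0, 0.0, 0.0],   # (1,1,0)
    [0.0, 0.0, 0.0, 3.5, 0.0, 2.0, 0.0],   # (1,1,1)
    [0.0, 0.0, 0.0, 0.0, 3.5, 0.0, 2.0],   # (1,1,2)
    [0.0, 0.0, 0.0, 0.0, 0.0, 3.5, 0.0]])  # (1,1,3)

Q = R - np.diag(R.sum(axis=1))              # infinitesimal generator
A = np.vstack([Q.T[:-1], np.ones(len(R))])  # drop one equation, add normalization
b = np.zeros(len(R)); b[-1] = 1.0
pi = np.linalg.solve(A, b)
print(np.round(pi, 3))   # ~ [0.214, 0.046, 0.313, 0.205, 0.117, 0.067, 0.038]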
A Queue With Finite Input Sources

A taxi company has a fleet of 6 cabs and a repair shop to handle breakdowns.

Assume that the taxis are identical and that times to breakdown are exponentially distributed with rate 1/3 per month.
The company is thinking of setting up several service bays; the estimated repair time is exponentially distributed with rate 4 per month.

Do an analysis to help the company figure out how many service bays to set up.
A Queue With Finite Input Sources

One Bay
[Birth-death diagram: states 0, 1, 2, 3, …, 6 (number of broken taxis); breakdown rates 6×1/3, 5×1/3, 4×1/3, …, 1×1/3 moving right; repair rate 4 from every state moving left.]

Two Bays
[Birth-death diagram: same breakdown rates; repair rate 4 from state 1 and 2×4 from states 2 through 6.]
A Queue With Finite Input Sources (1 Bay)
Rate Matrix
States     0      1       2       3       4       5       6
0          0      2       0       0       0       0       0
1          4      0     1.667     0       0       0       0
2          0      4       0     1.333     0       0       0
3          0      0       4       0       1       0       0
4          0      0       0       4       0     0.667     0
5          0      0       0       0       4       0     0.333
6          0      0       0       0       0       4       0
Solution
States     0       1       2       3       4       5       6
Solution   0.556   0.278   0.116   0.039   0.010   0.002   0.000
A Queue With Finite Input Sources (2 Bays)
Rate Matrix
States     0      1       2       3       4       5       6
0          0      2       0       0       0       0       0
1          4      0     1.667     0       0       0       0
2          0      8       0     1.333     0       0       0
3          0      0       8       0       1       0       0
4          0      0       0       8       0     0.667     0
5          0      0       0       0       8       0     0.333
6          0      0       0       0       0       8       0
Solution
States     0       1       2       3       4       5       6
Solution   0.616   0.308   0.064   0.011   0.001   0.000   0.000
Economics With These Results

Suppose each taxi on average brings in revenue of $1200 a day. What is the expected revenue for each configuration?

One Bay: 7200×π0 + 6000×π1 + 4800×π2 + 3600×π3 + 2400×π4 + 1200×π5 + 0×π6 = $6392

Two Bays: 7200×π0 + 6000×π1 + 4800×π2 + 3600×π3 + 2400×π4 + 1200×π5 + 0×π6 = $6630

If it costs $300 a day to operate a bay, would a second bay be beneficial to the company?
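A sketch of the revenue arithmetic, using the (rounded) steady-state probabilities from the tables above; with n taxis broken, the remaining 6 − n each earn $1200 per day:

one_bay = [0.556, 0.278, 0.116, 0.039, 0.010, 0.002, 0.000]
two_bay = [0.616, 0.308, 0.064, 0.011, 0.001, 0.000, 0.000]

def revenue(pi, fleet=6, per_taxi=1200):
    # n broken taxis leave (fleet - n) taxis earning revenue
    return per_taxi * sum((fleet - n) * p for n, p in enumerate(pi))

print("one bay :", round(revenue(one_bay)))   # ~ $6392 (small differences come from rounding)
print("two bays:", round(revenue(two_bay)))   # ~ $6630
# The extra ~$240/day of revenue is less than the $300/day cost of a second bay.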

Probability Transitions: Service with Rework
Consider a machine operation in which there is a 0.4 probability that, on completion, a processed part will not be within tolerance.
If the part is unacceptable, the operation is repeated immediately. This is called rework.
Assume that the second try is always successful.
What will the system look like if
a) arrivals can occur only when the machine is idle?
b) arrivals can occur at any time?
Probability Transitions: Service with Rework
a) Arrivals can occur only when the machine is idle

[State-transition diagram with states i (idle), w (in service) and rw (in rework); transition rates involve the arrival rate a, the service/rework completion rates d1 and d2, and the rework probability 0.4.]

b) Arrivals can occur at any time

[State-transition diagram with states (n, i), (n, w) and (n, r) for n = 0, 1, 2, … jobs; transition rates again involve a, 0.4d1, 0.6d1 and 0.6d2.]
An ATM with a Human Server
Consider an ATM located together with a human server at the foyer of a bank. The foyer is limited in size, and arriving customers balk when it is full (five people present).
Statistics indicate that the time between arrivals is exponentially distributed with an average of 30 seconds, or 2 customers per minute.
The service time of the ATM is exponentially distributed with an average of 24 seconds, or 2.5 customers per minute.
The service time of the human server is exponentially distributed with an average of 1 minute per customer.
It is further assumed that when a customer enters the system, he/she prefers to go to the human server first if that server is available.
CTMC Model for ATM and Human Server
State: (HS, MS, # waiting)

[State-transition diagram: (0,0,0) → (1,0,0) at rate λ; (1,0,0) → (0,0,0) at µh and → (1,1,0) at λ; (0,1,0) → (0,0,0) at µm and → (1,1,0) at λ; (1,1,0) → (0,1,0) at µh, → (1,0,0) at µm and → (1,1,1) at λ; (1,1,k) → (1,1,k+1) at λ and (1,1,k) → (1,1,k−1) at µm + µh for the waiting states.]

µm = 2.5, ATM service rate
µh = 1, human service rate
CTMC Model for ATM and Human Server
Rate Matrix
States    0,0,0   0,1,0   1,0,0   1,1,0   1,1,1   1,1,2   1,1,3
0,0,0       0       0       2       0       0       0       0
0,1,0     2.5       0       0       2       0       0       0
1,0,0       1       0       0       2       0       0       0
1,1,0       0       1     2.5       0       2       0       0
1,1,1       0       0       0     3.5       0       2       0
1,1,2       0       0       0       0     3.5       0       2
1,1,3       0       0       0       0       0     3.5       0
Solution
States     0,0,0   0,1,0   1,0,0   1,1,0   1,1,1   1,1,2   1,1,3
Solution   0.214   0.046   0.313   0.205   0.117   0.067   0.038
Birth & Death Process
Pure birth process; e.g., hurricanes

[Diagram: 0 → 1 → 2 → 3 → 4 → … with birth rates a0, a1, a2, a3, …]

Pure death process; e.g., delivery of a truckload of parcels

[Diagram: … → 4 → 3 → 2 → 1 → 0 with death rates d4, d3, d2, d1.]

Birth-death process; M/M/s/k queues will be picked up in queuing theory

[Diagram: states 0, 1, 2, 3, 4, … with birth rates a0, a1, a2, a3, … and death rates d1, d2, d3, d4, …]
Pure Birth Process – Poisson Process

[Diagram: 0 → 1 → 2 → 3 → 4 → … with birth rates a0, a1, a2, a3, …]

Poisson Process Rate Matrix

States   0    1    2    3    …    …
0        0    λ    0    0    0    0
1        0    0    λ    0    0    0
2        0    0    0    λ    0    0
3        0    0    0    0    λ    0
…        0    0    0    0    0    λ
…        0    0    0    0    0    0
Poisson Process
No steady-state probability:
The embedded Markov chain is not ergodic; the number in the system increases with time.

Transient probability: the number of events within time t is a random variable with a Poisson distribution.

A general scheme of this kind is what we call a counting process (exponential random variables and the Poisson process).
Pure Death Process

[Diagram: … → 4 → 3 → 2 → 1 → 0 with death rates d4, d3, d2, d1.]

No steady-state probability:
The embedded Markov chain is not ergodic; the number in the system decreases with time.

Transient probability: the number of events within time t is again a random variable with a Poisson distribution.
An Example

Suppose that you arrive at a single-teller bank to find five other customers in the bank: one being served and the other four waiting in line. You join the end of the line. The service times are all exponential with a mean of 5 minutes.

What is the probability that you will be served in 10 minutes?
What is the probability that you will be served in 20 minutes?
What is the expected waiting time before you are served?
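A sketch of the calculation, taking the 5 minutes as the mean service time and reading "served within t minutes" as "your service has begun by time t": your wait is the sum of 5 independent exponential(mean 5) times, i.e. an Erlang(5, rate 1/5) random variable, so P(wait ≤ t) = P(Poisson(t/5) ≥ 5).

from math import exp, factorial

def prob_served_by(t, k=5, rate=1/5):
    # P(Erlang(k, rate) <= t) = 1 - sum_{n < k} e^{-rate t} (rate t)^n / n!
    x = rate * t
    return 1 - sum(exp(-x) * x**n / factorial(n) for n in range(k))

print(round(prob_served_by(10), 4))     # ~ 0.053
print(round(prob_served_by(20), 4))     # ~ 0.371
print("expected wait:", 5 * 5, "minutes")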


Assumption Revisited
Markov property:
Inter-arrival and service times have to be exponentially distributed.
Steady-state probability:
Flow balance (rate in = rate out); solve a set of linear equations.

Arrival times: exponential?
With a large population of n potential customers, each having a small probability p of entering the store, the exponential inter-arrival assumption is a good approximation when n is large and p is small.
Assumption Revisited
Service time: exponential?

Grocery store: might still be valid.
Haircut: might not be exponential.

What if they are not exponentially distributed?
The Markovian property no longer holds.
We might not be able to use the rate in = rate out principle.
It becomes much more difficult to get analytical results; often, simulation has to be used.
A Machine Repair Example
A factory contains two major machines which fail independently, with exponentially distributed times to failure with a mean of 10 hours.

The repair of a machine takes an average of 8 hours, and the repair time also follows an exponential distribution.
Model the problem as a CTMC or a queuing model and give analytic results.
State-Transition Diagram
λ = rate at which a single machine breaks down = 1/10 per hour
µ = rate at which machines are repaired = 1/8 per hour

State of the system = number of broken machines.

State-transition diagram:
[Diagram: 0 → 1 at rate 2λ, 1 → 2 at rate λ; 1 → 0 at rate µ, 2 → 1 at rate µ.]
Balance Equations for Repair Example
µπ1 = 2λπ0
2λπ0 + µπ2 = (λ + µ)π1
λπ1 = µπ2

We can solve these balance equations for π0, π1 and π2, but in this case we can simply use the formulas that solve the general birth-and-death equations:

Cn = (λn-1 ··· λ0) / (µn ··· µ1),   πn = Cn π0,   π0 = 1 / Σn Cn
Here, λ0 = 2λ, λ1 = λ, λ2 = 0 and µ1 = µ2 = µ, so

C1 = λ0/µ1 = 2λ/µ,   C2 = (λ1 λ0)/(µ2 µ1) = 2λ²/µ²

and C0 = 1 (by definition). Thus

π0 = 1 / (1 + 2λ/µ + 2λ²/µ²) = 0.258
π1 = (2λ/µ) π0 = 0.412
π2 = (2λ²/µ²) π0 = 0.330

L  = 0π0 + 1π1 + 2π2 = 1.072   (average # of machines in the system)
Lq = 0π1 + 1π2 = 0.330          (average # waiting for repair)

Average arrival rate λ̄ = Σn λn πn = λ0π0 + λ1π1 + λ2π2 = (2λ)π0 + λπ1 = 0.0928

W  = L / λ̄  = 1.072 / 0.0928 = 11.55 hours
Wq = Lq / λ̄ = 0.330 / 0.0928 = 3.56 hours

W  = average amount of time that a machine has to wait to be repaired, including the time until the repairman initiates the work.
Wq = average amount of time that a machine has to wait until the repairman initiates the work.

Proportion of time the repairman is busy = π1 + π2 = 0.742
Proportion of time that machine #1 is working = π0 + (1/2)π1 = 0.258 + (1/2)(0.412) = 0.464
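A sketch that reproduces these numbers from the general birth-death formulas (λ = 1/10, µ = 1/8 per hour as above):

lam, mu = 1/10, 1/8                      # breakdown and repair rates (per hour)
birth = [2*lam, lam]                     # lambda_0, lambda_1 (lambda_2 = 0)
death = [mu, mu]                         # mu_1, mu_2

C = [1.0]                                # C_0 = 1
for n in range(2):
    C.append(C[-1] * birth[n] / death[n])
total = sum(C)
pi = [c / total for c in C]
print([round(p, 3) for p in pi])         # ~ [0.258, 0.412, 0.330]

L_sys   = sum(n * p for n, p in enumerate(pi))      # ~ 1.072 machines down on average
Lq      = pi[2]                                     # ~ 0.330 waiting for repair
lam_bar = 2*lam*pi[0] + lam*pi[1]                   # effective breakdown rate ~ 0.0928
print("W  =", round(L_sys / lam_bar, 2), "hours")   # ~ 11.55
print("Wq =", round(Lq / lam_bar, 2), "hours")      # ~ 3.56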
