
Chapter 6 slides, Computer Networking, 3rd edition

Advanced Network Technologies
Queueing Theory
School of Computer Science
Dr. | Lecturer

1

Outline
Markov Chain
Queueing System and Little’s Theorem
M/M/1 Queue foundations
M/M/1 Queue

Markov Chain

Markov Chain
A stochastic process
X1, X2, X3, X4…
{Xn, n = 1, 2, . . .}
Xn takes on a finite or countable number of possible values.
Xn  ∈  {1,2, …, S}
i: ith state
Markov Property: The state of the system at time n+1 depends only on the state of the system at time n, not on any earlier states.

X1 → X2 → X3 → X4 → X5
• Stationary Assumption: Transition probabilities are independent of time (n)

Markov Chain

An example
6
Weather:
raining today 40% rain tomorrow
60% no rain tomorrow

not raining today 20% rain tomorrow
80% no rain tomorrow

Stochastic FSM:

[state diagram: rain → rain 0.4, rain → no rain 0.6; no rain → no rain 0.8, no rain → rain 0.2]

An example
7
Weather:
raining today 40% rain tomorrow
60% no rain tomorrow

not raining today 20% rain tomorrow
80% no rain tomorrow

Matrix:

P = | 0.4  0.6 |
    | 0.2  0.8 |
(rows/columns ordered: rain, no rain)

Stochastic matrix:
Rows sum up to 1
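The stochastic-matrix check and a single step of the chain can be sketched in plain Python; the state ordering (0 = rain, 1 = no rain) is an assumption of this sketch, not stated on the slide:

```python
# Weather chain from the slides; states 0 = rain, 1 = no rain (assumed order).
P = [[0.4, 0.6],
     [0.2, 0.8]]

# A stochastic matrix: each row sums to 1.
assert all(abs(sum(row) - 1.0) < 1e-12 for row in P)

def step(dist, P):
    """One step of the chain: new_dist[j] = sum_i dist[i] * P[i][j]."""
    return [sum(dist[i] * P[i][j] for i in range(len(P)))
            for j in range(len(P[0]))]

# If it rains today, tomorrow: 40% rain, 60% no rain.
print(step([1.0, 0.0], P))  # → [0.4, 0.6]
```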

Transition matrix
8

P = | p_11  p_12  …  p_1S |
    | p_21  p_22  …  p_2S |
    |  ⋮     ⋮         ⋮  |
    | p_S1  p_S2  …  p_SS |

where p_ij = P(X_{n+1} = j | X_n = i), and every row sums to 1.

9
– Gambler starts with $10
– At each play we have one of the following:
• Gambler wins $1 with probability p
• Gambler loses $1 with probability 1-p
– Game ends when gambler goes broke, or gains a fortune of $100
(Both 0 and 100 are absorbing states)
[state diagram: states 0, 1, 2, …, 99, 100; from state i (0 < i < 100), go to i+1 with probability p and to i−1 with probability 1−p; start at state 10 ($10)]
Gambler’s Example


10
– transient state 
if, given that we start in state i, there is a non-zero probability that we will never return to i

– recurrent state
Non-transient

– absorbing state
impossible to leave this state.

Gambler’s Example


11
Given that a person’s last cola purchase was Coke, there is a 90% chance that his next cola purchase will also be Coke.
If a person’s last cola purchase was Pepsi, there is an 80% chance that his next cola purchase will also be Pepsi.
[state diagram: Coke → Coke 0.9, Coke → Pepsi 0.1; Pepsi → Pepsi 0.8, Pepsi → Coke 0.2]

transition matrix:

P = | 0.9  0.1 |
    | 0.2  0.8 |
(rows/columns ordered: Coke, Pepsi)

Coke vs. Pepsi Example
state transition diagram:


12
Given that a person is currently a Pepsi purchaser, what is the probability that he will purchase Coke two purchases from now?
Pr[ Pepsi?Coke ] =
Pr[ PepsiCokeCoke ] + Pr[ Pepsi Pepsi Coke ] =
0.2 * 0.9 + 0.8 * 0.2 = 0.34

Pepsi  ?

?  vs. Pepsi Example

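The two-step probability can be double-checked by squaring the transition matrix and reading the (Pepsi → Coke) entry; the row/column ordering (Coke first) is an assumption of this sketch:

```python
# Coke/Pepsi transition matrix; rows/columns ordered Coke, Pepsi (assumed).
P = [[0.9, 0.1],
     [0.2, 0.8]]

def matmul(A, B):
    """Plain-Python matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

P2 = matmul(P, P)
print(round(P2[1][0], 4))  # Pr[Pepsi -> ? -> Coke] → 0.34
```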

13
Given that a person is currently a Coke purchaser, what is the probability that he will purchase Pepsi three purchases from now?

P³ = | 0.781  0.219 |
     | 0.438  0.562 |

Pr[Coke → ? → ? → Pepsi] = (P³)_{Coke,Pepsi} = 0.219

Coke vs. Pepsi Example

13

Assume each person makes one cola purchase per week
Suppose 60% of all people now drink Coke, and 40% drink Pepsi
What fraction of people will be drinking Coke three weeks from now?

Pr[X3=Coke] = 0.6 * 0.781 + 0.4 * 0.438 = 0.6438

Qi – the distribution in week i
Q0=(0.6,0.4) – initial distribution
Q3= Q0 * P3 =(0.6438,0.3562)
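The three-week computation Q3 = Q0 · P³ can be sketched by propagating the initial distribution one step at a time (state ordering Coke, Pepsi assumed):

```python
# Propagate the initial distribution three steps: Q_{i+1} = Q_i * P.
P = [[0.9, 0.1],
     [0.2, 0.8]]
Q = [0.6, 0.4]          # 60% Coke, 40% Pepsi now
for _ in range(3):
    Q = [sum(Q[i] * P[i][j] for i in range(2)) for j in range(2)]
print([round(q, 4) for q in Q])  # → [0.6438, 0.3562]
```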

Coke vs. Pepsi Example

14

15

Simulation:

[plot: Pr[Xi = Coke] vs. week i, converging to the stationary value 2/3]

Coke vs. Pepsi Example

stationary distribution: π = (2/3, 1/3)

[state diagram: Coke → Coke 0.9, Coke → Pepsi 0.1; Pepsi → Pepsi 0.8, Pepsi → Coke 0.2]

Steady State and Stationary distribution

A distribution π is stationary if π = π·P and Σ_i π_i = 1.
If the chain converges to steady state, lim_{n→∞} P(X_n = i) = π_i, regardless of the initial distribution.

16

Steady State and Stationary distribution

Coke vs. Pepsi:
P^10 ≈ | 0.6478  0.3522 |
       | 0.6761  0.3239 |

P^100 ≈ | 0.6667  0.3333 |
        | 0.6667  0.3333 |

π = (2/3, 1/3)

17
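Finding the stationary distribution by repeated multiplication (power iteration) can be sketched as follows for the Coke/Pepsi chain; convergence to (2/3, 1/3) follows because the second eigenvalue of P has modulus 0.7 < 1:

```python
# Power iteration toward the stationary distribution pi = pi * P.
P = [[0.9, 0.1],
     [0.2, 0.8]]
pi = [0.5, 0.5]   # any starting distribution works for this chain
for _ in range(200):
    pi = [sum(pi[i] * P[i][j] for i in range(2)) for j in range(2)]
print([round(x, 4) for x in pi])  # → [0.6667, 0.3333]
```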

Steady State and Stationary distribution
PageRank: A Web surfer browses pages in the five-page Web universe shown in the figure. The surfer selects the next page to view by choosing, with equal probability, among the pages pointed to by the current page. If a page has no outgoing link (e.g., page 2), the surfer selects any page in the universe with equal probability. Find the probability that the surfer views page i.

18

Steady State and Stationary distribution

Transition matrix P

19

Steady State and Stationary distribution
Stationary Distribution:
Solve the following equations:
π = π·P,  Σ_{i=1}^{5} π_i = 1

π = (0.12195, 0.18293, 0.25610, 0.12195, 0.31707)

Search engine page rank (highest to lowest): 5, 3, 2, 1, 4

20

Queueing System and Little’s Theorem

Queueing System

• Customers = Data packets
• Service Time = Packet Transmission Time (function of packet length and transmission speed)
• Queueing delay = time spent in buffer before transmission
• Average number of customers in systems
– Typical number of customers either waiting in queue or undergoing service
• Average delay per customer
– Typical time a customer spends waiting in queue + service time

22

Queueing System

• W: average waiting time in queue
• X: average service time
• T: average time spent in system (T = W + X)
• NQ = average number of customers in queue
• ρ = utilization = average number of customers in service
• N = average number of customer in system (N = NQ + ρ)
• Want to show later: N = λT (Little’s theorem)
• λ: average arrival rate

[diagram: Number: NQ + ρ = N;  Time: W + X = T]

23

Little’s Theorem
α(t) = Number of customers who arrived in the interval [0, t]
β(t) = Number of customers who departed in the interval [0, t]
N(t) = Number of customers in the system at time t, N(t) = α(t) – β(t)
Ti = Time spent in the system by the i-th arriving customer
[figure: staircase plots of cumulative arrivals α(τ) and departures β(τ) vs. time; the vertical distance between the curves is N(τ), and the horizontal distance at customer i is its delay T_i (Delay T1 marked for Customer 1)]

24

Little’s Theorem

Average # of customers up to t: N_t = (1/t) ∫_0^t N(τ) dτ
Average # of customers in the long term: N = lim_{t→∞} (1/t) ∫_0^t N(τ) dτ

25

Little’s Theorem

Average arrival rate up to t: λ_t = α(t)/t
Average arrival rate in the long term: λ = lim_{t→∞} α(t)/t


26

Little’s Theorem

Average customer delay up to t: T_t = (1/α(t)) Σ_{i=1}^{α(t)} T_i
Average customer delay in the long term: T = lim_{t→∞} (1/α(t)) Σ_{i=1}^{α(t)} T_i


27

Little’s Theorem
Shaded area (assuming the system is empty at time t): two ways to compute
Way 1: integrate the gap between arrivals and departures: Area = ∫_0^t N(τ) dτ


28

Little’s Theorem
Shaded area (assuming the system is empty at time t): two ways to compute
Way 2: sum the per-customer delays: Area = Σ_{i=1}^{α(t)} T_i


29

Little’s Theorem
Shaded area (assuming the system is empty at time t): two ways to compute
Equating the two: ∫_0^t N(τ) dτ = Σ_{i=1}^{α(t)} T_i


30

Little’s Theorem
Shaded area (assuming the system is empty at time t): two ways to compute
Divide by t: (1/t) ∫_0^t N(τ) dτ = (α(t)/t) · (1/α(t)) Σ_{i=1}^{α(t)} T_i
That is, N_t = λ_t · T_t; letting t → ∞ gives N = λT.


31

Little’s Theorem
Note that Little’s Theorem holds for any service discipline (e.g., first-in-first-out, last-in-first-out), any interarrival-time distribution, and any service-time distribution.

32

Little’s Theorem
• N = λT
• NQ = λW
• ρ = proportion of time that the server is busy = λX
• T = W + X
• N = NQ + ρ

[diagram: Little’s theorem applied three ways — Number = Rate × Time: N = λT, NQ = λW, ρ = λX]
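The bookkeeping above can be illustrated numerically; the inputs here (arrival rate, mean service time, mean wait) are made-up numbers for the sketch, not values from the slides:

```python
# Hypothetical inputs: lam (arrival rate), X (mean service time),
# W (mean wait in queue).
lam = 5.0   # customers per second (hypothetical)
X = 0.1     # mean service time, seconds (hypothetical)
W = 0.3     # mean wait in queue, seconds (hypothetical)

T = W + X        # mean time in system
N = lam * T      # Little's theorem: mean number in system
NQ = lam * W     # Little's theorem applied to the queue alone
rho = lam * X    # Little's theorem applied to the server: utilization

print(round(T, 2), round(N, 2), round(NQ, 2), round(rho, 2))  # → 0.4 2.0 1.5 0.5
```

Note that N = NQ + ρ holds automatically (2.0 = 1.5 + 0.5), mirroring the diagram.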

33

M/M/1 Queue foundations

Exponential Distribution
• The cumulative distribution function F(x) and probability density function f(x) are:
F(x) = 1 − e^(−λx),  f(x) = λ e^(−λx),  x ≥ 0, λ > 0

The mean is equal to its standard deviation: E[X] = σX = 1/λ

Memoryless Property
P(X > s + t| X > t) = P(X > s) for all s, t ≥ 0
The only continuous distribution with this property
Practice Q2 in Tutorial Week 4
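The memoryless property can be checked by Monte Carlo with Python's standard library; the values of s, t, and the sample size are arbitrary choices for the sketch:

```python
# Monte Carlo check of P(X > s + t | X > t) = P(X > s) for exponential X.
import random

random.seed(0)
lam, s, t, n = 1.0, 0.5, 1.0, 200_000
samples = [random.expovariate(lam) for _ in range(n)]

survive_t = [x for x in samples if x > t]
lhs = sum(x > s + t for x in survive_t) / len(survive_t)  # conditional
rhs = sum(x > s for x in samples) / n                     # unconditional
print(round(lhs, 2), round(rhs, 2))  # both ≈ exp(-0.5) ≈ 0.61
```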

Other Properties of Exponential Distribution
Let X1, …, Xn be i.i.d. exponential r.v.s with mean 1/λ;
then X1 + X2 + … + Xn follows a gamma distribution with parameters n and λ (Practice Q1 in Tutorial Week 4).

Suppose X1 and X2 are independent exponential r.v.s with means 1/λ1 and 1/λ2, respectively; then
P(X1 < X2) = λ1 / (λ1 + λ2)

37

Counting Process
A stochastic process {N(t), t ≥ 0} is a counting process if N(t) represents the total # of events that have occurred up to time t.
1. N(t) ≥ 0 and N(t) is integer valued.
2. If s < t, then N(s) ≤ N(t)
3. For s < t, N(t) − N(s) = # of events that have occurred in (s, t]

• Examples:
– # of people who have entered a particular store by time t
– # of packets sent by a mobile phone

• A counting process has independent increments if the numbers of events occurring in disjoint time intervals are independent.
• A counting process has stationary increments if the distribution of the number of events occurring in any time interval depends only on the length of the interval.

38

Poisson Process
The counting process {N(t), t ≥ 0} is said to be a Poisson process with rate λ > 0 if
1. N(0) = 0
2. The process has independent increments (i.e., # of events which occur in disjoint time intervals are independent)
– for 0 < t1 < t2 < t3 < t4, for all n, j ≥ 0:
P{N(t4) − N(t3) = n | N(t2) − N(t1) = j} = P{N(t4) − N(t3) = n}
3. The number of events in any interval of length t is Poisson distributed with mean λt. That is, for all s, t ≥ 0,
P{N(s + t) − N(s) = n} = e^(−λt) (λt)^n / n!,  n = 0, 1, …
so E[N(s + t) − N(s)] = λt.

39

Poisson Process

[figure: arrivals marked on a time axis, N(0) = 0; arrivals in [t1, t2] are independent of arrivals in [t3, t4]; mean # of arrivals λ(t2 − t1) and λ(t4 − t3)]

40

Poisson Process: Interarrival time distribution

[figure: time axis with successive interarrival times T1, T2, T3]

P(T1 > t) = P(N(t) = 0) = e^(−λt)
P(T2 > t | T1 = t1) = e^(−λt)
Interarrival times follow an exponential distribution with parameter λ (mean 1/λ)

41

Number of arrivals in a short period of time
Number of arrival events in a very short period h:

P{N(t+h) − N(t) = 0} = 1 − λh + o(h);
P{N(t+h) − N(t) = 1} = λh + o(h); and
P{N(t+h) − N(t) ≥ 2} = o(h).

Small-o notation: a function f(·) is said to be o(h) if lim_{h→0} f(h)/h = 0.

42
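The link between exponential interarrival times and Poisson counts can be illustrated by simulation; the rate, horizon, and number of runs are arbitrary choices for the sketch:

```python
# Build a Poisson process from exponential interarrivals and
# check E[N(t)] ≈ lambda * t.
import random

random.seed(1)
lam, t_end, runs = 2.0, 10.0, 5000

def count_arrivals(lam, t_end):
    """Count arrivals in [0, t_end] using exponential interarrival times."""
    t, n = 0.0, 0
    while True:
        t += random.expovariate(lam)
        if t > t_end:
            return n
        n += 1

mean_n = sum(count_arrivals(lam, t_end) for _ in range(runs)) / runs
print(round(mean_n, 1))  # ≈ lambda * t_end = 20.0
```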

Summary
Poisson process:
• Independent increments
• # of arrivals in an interval: Poisson distributed
• # of arrivals in a small period h: 1 arrival with probability λh + o(h)
• Interarrival time distribution: exponential

43

M/M/1 Queue

Queues: Kendall’s notation
Notations Used in Queueing Systems
X/Y/Z
– X refers to the distribution of the interarrival times
– Y refers to the distribution of service times
– Z refers to the number of servers
Common distributions:
– M = Memoryless = exponential distribution
– D = Deterministic arrivals or fixed-length service
– G = General distribution of interarrival times or service times
M/M/1 refers to a single-server queuing model with exponential interarrival times (i.e., Poisson arrivals) and exponential service times.
In all cases, successive interarrival times and service times are assumed to be statistically independent of each other.

M/M/1 Queue
Arrival:
Poisson arrival with rate λ
Service:
Service time: exponential distribution with mean 1/μ
μ: service rate
λ < μ: incoming rate < outgoing rate

Markov Chain Formulation
δ: a small value
Nk = Number of customers in the system at time kδ
N0, N1, N2, … is a Markov Chain!
Q: How to compute the transition probability?

Markov Chain Formulation
In one period of length δ:
P(0 customers arrive) = 1 − λδ + o(δ)
P(1 customer arrives) = λδ + o(δ)
P(≥ 2 customers arrive) = o(δ)

Markov Chain Formulation
No one in the system (i = 0): no departure is possible.
P(0 customers leave) = 1 − μδ + o(δ) if i ≥ 1; 1 if i = 0
P(1 customer leaves) = μδ + o(δ) if i ≥ 1; 0 if i = 0
P(≥ 2 customers leave) = o(δ)

Markov Chain Formulation
Aim: to compute Pij = P{Nk+1 = j | Nk = i}
For example, for i > 0:
P{Nk+1 = i | Nk = i} = P(0 arrive)·P(0 depart) + P(1 arrives)·P(1 departs)
= (1 − λδ)(1 − μδ) + (λδ)(μδ) + o(δ)
= 1 − λδ − μδ + o(δ)

Markov Chain Formulation

Result: for i > 0,
P{Nk+1 = i | Nk = i} = 1 − λδ − μδ + o(δ)
P{Nk+1 = i+1 | Nk = i} = λδ + o(δ)
P{Nk+1 = i−1 | Nk = i} = μδ + o(δ)

Stationary distribution of state i:
p_i = the probability that there are i units in the system

Stationary Distribution Derivation

How to derive p_0, p_1, p_2, … ?

Balance equation satisfied at state 0 (incoming = outgoing):
λδ p_0 = μδ p_1

Stationary Distribution Derivation

How to derive p_0, p_1, p_2, … ?

Balance equation is performed at each state:
State 1: λδ p_0 + μδ p_2 = λδ p_1 + μδ p_1  ⇒  λδ p_1 = μδ p_2
State 2: λδ p_1 + μδ p_3 = λδ p_2 + μδ p_2  ⇒  λδ p_2 = μδ p_3

Stationary Distribution Derivation

How to derive p_0, p_1, p_2, … ?

Balance equation is performed at each state.
For any i: λδ p_i = μδ p_{i+1}
Hence p_{i+1} = (λ/μ) p_i = ρ p_i, so p_i = ρ^i p_0, where ρ = λ/μ.

Stationary Distribution Derivation

Balance equation is performed at each state.

Sum of geometric sequence:
Σ_{i=0}^∞ p_i = 1  ⇒  p_0 Σ_{i=0}^∞ ρ^i = 1

Stationary Distribution Derivation

Sum of geometric sequence (requires ρ = λ/μ < 1):
p_0 · 1/(1 − ρ) = 1  ⇒  p_0 = 1 − ρ

57

Stationary Distribution Derivation

Result:
p_i = ρ^i (1 − ρ),  i = 0, 1, 2, …
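The stationary probabilities can be sanity-checked numerically; the utilization ρ = 0.6 here is a made-up value for the sketch:

```python
# M/M/1 stationary probabilities p_i = (1 - rho) * rho**i; check that
# they sum to ~1 and give E[N] = rho / (1 - rho).
rho = 0.6   # hypothetical utilization lambda/mu
p = [(1 - rho) * rho**i for i in range(200)]   # truncated tail is negligible

print(round(sum(p), 6))                                  # → 1.0
print(round(sum(i * pi for i, pi in enumerate(p)), 4))   # E[N] → 1.5
```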

Average number of users in the system
E(N) = Σ_{n=0}^∞ n p_n = Σ_{n=0}^∞ n (1 − ρ) ρ^n = ρ/(1 − ρ) = λ/(μ − λ)

Average waiting time (time in system)
By Little’s Theorem: E(T) = E(N)/λ = 1/(μ − λ)

Stationary Distribution Derivation (without the δ-steps)

Balance equation is performed at each state:
μ p_1 = λ p_0
λ p_0 + μ p_2 = λ p_1 + μ p_1
…
λ p_{i−1} = μ p_i

Following the same steps, we derive the same result.

Stationary Distribution Derivation

Queueing delay goes to infinity as the arrival rate approaches the service rate: E(T) = 1/(μ − λ) → ∞ as λ → μ.
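The formula E(T) = 1/(μ − λ) can be checked against a simulation; this is a standard single-server FIFO sketch (Lindley recursion), not code from the slides, and the rates are made-up values:

```python
# Simulate an M/M/1 queue and compare mean time in system with 1/(mu - lambda).
import random

random.seed(2)
lam, mu, n_customers = 3.0, 5.0, 200_000

arrival = 0.0
depart_prev = 0.0     # departure time of the previous customer
total_delay = 0.0
for _ in range(n_customers):
    arrival += random.expovariate(lam)
    start = max(arrival, depart_prev)          # wait if server is busy
    depart_prev = start + random.expovariate(mu)
    total_delay += depart_prev - arrival       # time in system

T_sim = total_delay / n_customers
print(round(T_sim, 2), round(1 / (mu - lam), 2))  # both ≈ 0.5
```

Re-running with λ closer to μ (say λ = 4.9) makes T_sim blow up, matching the slide's point.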

M/M/m Queue

M/M/m Queue
Arrival:
Poisson arrival with rate λ
Service:
Service time for one server: exponential distribution with mean 1/μ
Service rate of the system is iμ if there are i ≤ m users in the system, and mμ if i > m

M/M/m Queue

Then,
p_n = p_0 (mρ)^n / n!,   n ≤ m
p_n = p_0 m^m ρ^n / m!,  n > m
where ρ = λ/(mμ) < 1.

p_0 can be solved from Σ_n p_n = 1.

67

M/M/∞ Queue

M/M/m/m Queue

M/M/m/n Queue

Buffer size is n-m
Arrivals will be dropped if there are n users in the system.
How do you derive its stationary distribution?

Practice
Analyze M/M/ ∞, M/M/m/n queues
Draw the state transition diagrams
Derive their stationary distributions
For M/M/m/n queue, calculate the probability that an incoming user is dropped. Calculate the probability that the queue is empty (i.e., all users are served in the servers or there are no users at all.)
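As a starting point for the M/M/m/m part of the practice, the blocking (drop) probability has a well-known recursive form (Erlang B); the parameters in the example call are arbitrary:

```python
# Erlang-B recursion for the blocking probability of an M/M/m/m queue,
# with offered load a = lambda / mu:
#   B(0, a) = 1;  B(k, a) = a*B(k-1, a) / (k + a*B(k-1, a))
def erlang_b(m, a):
    b = 1.0
    for k in range(1, m + 1):
        b = a * b / (k + a * b)
    return b

print(round(erlang_b(2, 1.0), 4))  # → 0.2
```

For m = 2 servers and a = 1: p_n ∝ a^n/n!, so B = (1/2) / (1 + 1 + 1/2) = 0.2, agreeing with the recursion.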
