
Capacity Planning for Computer Systems and Networks
Week 4A: Non-Markovian queueing models. Processor sharing.

Week 3A: Queues with Poisson arrivals (1)


• Single-server M/M/1
• Exponential inter-arrival times (rate λ)
• Exponential service times (rate μ)
• By using a Markov chain, we can show that the mean response time is
T = 1 / (μ − λ)

Week 3A: Queues with Poisson arrivals (2)
• Multi-server M/M/m
• Exponential inter-arrival times (rate λ)
• Exponential service times (rate μ per server)
• By using a Markov chain, we know the mean response time is
T = 1/μ + C(m, ρ) / (mμ − λ)
where C(m, ρ) is the Erlang C formula (the probability that an arriving customer has to wait) and ρ = λ/(mμ)

Week 3B: Closed-queueing networks
• Analyse a closed-queueing network with a Markov chain
• The transition between states is caused by an arrival or a departure according to an exponential distribution
• General procedure
• Identify the states
• Find the state transition rates
• Set up the balance equations
• Solve for the steady-state probabilities
• Find the response time, etc.

This lecture: Road Map
• Single-server queues
• What if the inter-arrival times and/or the service times are not exponentially distributed?
• Multi-server queues
• What if the inter-arrival times and/or the service times are not exponentially distributed?
• Processor sharing

General single-server queues
• Need to specify the
• Inter-arrival time probability distribution
• Service time probability distribution
• Independence assumptions
• All inter-arrival times are independent
• All service times are independent
• The amount of service customer A needs is independent of the amount of service customer B needs
• The inter-arrival time and service time are independent of each other
• Under the independence assumption, we can analyse a number of types of single-server queues
• Without the independence assumption, queueing problems are very difficult to solve!

Classification of single-server queues
• Recall Kendall’s notation: “M/M/1” means
• “M” in the 1st place means inter-arrival time is exponentially distributed
• “M” in the 2nd place means service time probability is exponentially distributed
• “1” in 3rd position means 1 server
• We use a “G” to denote a general probability distribution
• Meaning any probability distribution
• Classification of single-server queues:

  Service time distribution:          Exponential    General
  Inter-arrival time: Exponential     M/M/1          M/G/1
  Inter-arrival time: General         G/M/1          G/G/1

Example M/G/1 queue problem
• Consider an e-mail server
• E-mails arrive at the mail server according to a Poisson process with a mean arrival rate of 1.2 messages/s
• The service time distribution of the emails are:
• 30% of messages processed in 0.1 s, 50% in 0.3 s, 20% in 2 s
• Average waiting time for a message?
• Average response time for a message?
• Average number of messages in the mail system?
• This is an M/G/1 queue problem
• Arrival process is Poisson
• Service time is not exponential
• In order to solve an M/G/1 queue, we need to understand what the moments of a probability distribution are.

Revision: moment of a probability distribution (1)
• Consider a discrete probability distribution
• There are n possible outcomes: x1, x2, …, xn
• The probability that xi occurs is pi
• Example: For a fair die
• The possible outcomes are 1, 2, …, 6
• The probability that each outcome occurs is 1/6
• The first moment (also known as the mean or expected value) is
E[X] = x1 p1 + x2 p2 + … + xn pn
• For a fair die, the first moment is
E[X] = 1 × 1/6 + 2 × 1/6 + … + 6 × 1/6 = 3.5

Revision: moment of a probability distribution (2)
• The second moment of a discrete probability distribution is
E[X²] = x1² p1 + x2² p2 + … + xn² pn
• For a fair die, the second moment is
E[X²] = 1² × 1/6 + 2² × 1/6 + … + 6² × 1/6
• You can prove that
• Second moment of X = (E[X])² + Variance of X
• Note: The above definitions are for discrete probability distributions. We will look at continuous probability distributions shortly.
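As a small illustrative sketch (my addition, not from the slides), the two moments of the fair-die example can be checked numerically:

```python
# Sketch: first and second moments of a discrete distribution (fair-die example).
outcomes = [1, 2, 3, 4, 5, 6]
probs = [1 / 6] * 6

first_moment = sum(x * p for x, p in zip(outcomes, probs))        # E[X]   = 3.5
second_moment = sum(x ** 2 * p for x, p in zip(outcomes, probs))  # E[X^2] = 91/6 ≈ 15.17
variance = second_moment - first_moment ** 2                      # Var(X) = E[X^2] - (E[X])^2

print(first_moment, second_moment, variance)
```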

Solution to M/G/1 queue
• M/G/1 analysis is still tractable
• The queue length process of an M/G/1 queue is no longer a Markov chain
• For an M/G/1 queue with the characteristics
• Arrival process is Poisson with rate λ
• Service time S has
• Mean = 1/μ = E[S] = first moment
• Second moment = E[S²]
• The mean waiting time W of an M/G/1 queue is given by the Pollaczek-Khinchin (P-K) formula:
W = λ E[S²] / (2(1 − ρ)), where ρ = λ E[S] = λ/μ
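As a hedged sketch (my own code; the function and argument names are not from the slides), the P-K formula can be written directly as a Python function:

```python
def pk_waiting_time(lam, es, es2):
    """Mean waiting time W of an M/G/1 queue via the Pollaczek-Khinchin formula.

    lam : mean arrival rate (lambda)
    es  : mean service time E[S] = 1/mu
    es2 : second moment of the service time E[S^2]
    """
    rho = lam * es  # utilisation; the queue is stable only if rho < 1
    if rho >= 1:
        raise ValueError("unstable queue: utilisation must be below 1")
    return lam * es2 / (2 * (1 - rho))
```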

Back to our example queueing problem (1)
• Consider an e-mail server
• E-mails arrive at the mail server according to a Poisson process with a mean arrival rate of 1.2 messages/s
• The service time distribution of the emails are:
• 30% of messages processed in 0.1 s, 50% in 0.3 s, 20% in 2 s
• Exercise: In order to find the mean waiting time using the P-K formula, we need to know
• Mean arrival rate,
• Mean service time, and
• Second moment of the service time.
• Can you find them?

Back to our example queueing problem (2)
• Consider an e-mail server
• E-mails arrive at the mail server according to a Poisson process with a mean arrival rate of 1.2 messages/s
• The service time distribution of the emails are:
• 30% of messages processed in 0.1 s, 50% in 0.3 s, 20% in 2 s
• Solution
• Mean arrival rate = 1.2 messages/s
• Mean service time E[S]
= 0.1 × 0.3 + 0.3 × 0.5 + 2 × 0.2 = 0.58 s
• Second moment of the service time E[S²]
= 0.1² × 0.3 + 0.3² × 0.5 + 2² × 0.2 = 0.848 s²
• You now have everything you need to compute the mean
waiting time using the P-K formula

Back to our example queueing problem (3)
• Mean arrival rate λ = 1.2 messages/s
• Mean service time (E[S] or 1/μ) = 0.58 s
• Second moment of the service time E[S²] = 0.848 s²
• Utilisation ρ = λ/μ = λ E[S] = 1.2 × 0.58 = 0.696
• Substituting these values into the P-K formula:
W = λ E[S²] / (2(1 − ρ)) = 1.2 × 0.848 / (2 × 0.304) ≈ 1.673 s
• How about:
• Average response time for a message
• Average number of messages in the mail system

Back to our example queueing problem (4)
Since the mean waiting time W = 1.673 s, the mean response time T is
T = W + E[S] = 1.673 + 0.58 = 2.253 s
By Little’s Law,
Average number of messages in the system
= Throughput × mean response time = λT = 1.2 × 2.253 = 2.704 messages
Exercise: Can you use mean waiting time and Little’s Law to determine the mean number of messages in the queue?
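A quick numerical check (my addition), reusing the pk_waiting_time sketch defined earlier; the last line applies Little’s Law to the waiting line, which is also the idea behind the exercise above.

```python
# Sketch: the e-mail server example, using the pk_waiting_time sketch from above.
lam = 1.2                                              # messages/s
es = 0.1 * 0.3 + 0.3 * 0.5 + 2 * 0.2                   # E[S]   = 0.58 s
es2 = 0.1 ** 2 * 0.3 + 0.3 ** 2 * 0.5 + 2 ** 2 * 0.2   # E[S^2] = 0.848 s^2

W = pk_waiting_time(lam, es, es2)  # mean waiting time          ≈ 1.673 s
T = W + es                         # mean response time         ≈ 2.253 s
N = lam * T                        # mean number in the system  ≈ 2.704 (Little's Law)
Nq = lam * W                       # mean number in the queue   ≈ 2.008 (Little's Law on the waiting line)
print(W, T, N, Nq)
```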

Understanding the P-K formula
• Since the second moment of S is E[S]² + Variance of S
• We can write the P-K formula as
W = λ (E[S]² + Variance of S) / (2(1 − ρ))
• Meaning: smaller variance in service time → smaller waiting time
• M/D/1 is a special case of M/G/1
• “D” stands for deterministic: constant service time E[S] and Variance of S = 0
• For the same value of ρ and E[S], deterministic service has the smallest mean response time
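As a worked comparison (my addition) using the P-K formula: for the same λ and E[S],

\[
W_{M/D/1} = \frac{\lambda E[S]^2}{2(1-\rho)} = \frac{\rho\,E[S]}{2(1-\rho)},
\qquad
W_{M/M/1} = \frac{\lambda \cdot 2E[S]^2}{2(1-\rho)} = \frac{\rho\,E[S]}{1-\rho},
\]

so deterministic service gives exactly half the mean waiting time of exponential service at the same utilisation.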

Moments for continuous probability density
• The exponential distribution is a continuous probability distribution
• If a random variable X has continuous probability density function f(x), then its
• first moment (= mean, expected value) E[X] and
• second moment E[X²]
are given by
E[X] = ∫ x f(x) dx and E[X²] = ∫ x² f(x) dx (integrated over the range of X)
• If the service time S is exponential with rate μ,
• E[S] = 1/μ
• E[S²] = 2/μ²
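For completeness (my addition), the two moments quoted above follow from the exponential density f(x) = μe^(−μx), x ≥ 0:

\[
E[S] = \int_0^{\infty} x\,\mu e^{-\mu x}\,dx = \frac{1}{\mu},
\qquad
E[S^2] = \int_0^{\infty} x^2\,\mu e^{-\mu x}\,dx = \frac{2}{\mu^2},
\]

both obtained by integration by parts (in general, E[Sⁿ] = n!/μⁿ).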

M/M/1 as a special case of M/G/1
• Let us apply the result of the M/G/1 queue to exponential service time
• Let us put E[S] = 1/μ and E[S²] = 2/μ² into the P-K formula:
W = λ (2/μ²) / (2(1 − ρ)) = ρ / (μ − λ)
• This is the same as the M/M/1 queue waiting time formula that we derived in Week 3A

Remark on M/G/1

Deriving the P-K formula (1)
[Figure: Alice, an arriving customer, joins a queue. One waiting customer needs 5 min of service, another needs 6 min, and the customer currently in service still needs 3 min when Alice joins the queue (the residual service time).]
• How long does Alice (the arriving customer) need to wait before she gets served?

Deriving the P-K formula (2)
• Applying Little’s Law to the queue (arrival rate λ)
• W = Mean waiting time
• N = Mean number of customers in the queue
• 1/μ = Mean service time
• R = Mean residual service time
• We can prove that
• W = N × (1/μ) + R
• Substitution: by Little’s Law applied to the waiting line, N = λW
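Spelling out the substitution step (my own wording of the algebra the slide refers to):

\[
W = \frac{N}{\mu} + R
\quad\text{and}\quad
N = \lambda W
\;\Longrightarrow\;
W = \frac{\lambda}{\mu}W + R = \rho W + R
\;\Longrightarrow\;
W = \frac{R}{1-\rho}.
\]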

Deriving P-K formula (3)
• The P-K formula says
W = λ E[S²] / (2(1 − ρ))
• We have just shown that the mean waiting time in an M/G/1 queue is
W = R / (1 − ρ)
• We can prove the P-K formula if we can show that the mean residual time R is
R = λ E[S²] / 2

How does the residual service time change over time?
[Figure: a table of job arrival times and required processing times, the times at which each job is being served, and a plot of the residual service time seen by a customer arriving at time t, for t from 0 to 14.]

What is the mean residual time …
[Figure: the same residual service time plot. The mean residual time seen by an arriving customer over [0, 14] is the area under the curve divided by 14; the area consists of one triangle per job, with both the base and the height of each triangle equal to that job's service time.]

In general
[Figure: the residual service time seen by a customer arriving at time t, for jobs with service times S1, S2, S3, S4, …; job i contributes a triangle of area Si²/2.]
• Assuming M jobs are completed in time T, the mean residual time is
R = (1/T) × (S1²/2 + S2²/2 + … + SM²/2) = (M/T) × [(S1² + S2² + … + SM²)/M] / 2
• As T → ∞, M/T → λ and (S1² + S2² + … + SM²)/M → E[S²], so R = λ E[S²] / 2

The P-K formula
• Thus, the mean residual time R is
R = λ E[S²] / 2
• By substituting this into W = R / (1 − ρ)
• We get the P-K formula
W = λ E[S²] / (2(1 − ρ))
• This derivation also shows that the waiting time is proportional to the residual service time
• The residual service time is proportional to the 2nd moment of service time

G/G/1 queue
• G/G/1 queues are harder to analyse
• Generally, we cannot find an explicit formula for the
waiting time or response time of a G/G/1 queue
• Results on G/G/1 queues include
• Approximation results
• Bounds on waiting time

Approximate G/G/1 waiting time
• There are many different methods to find the approximate waiting time for a G/G/1 queue
• Most of the approximations work well when the traffic is heavy, i.e. when the utilisation ρ is high
• Mean arrival rate = λ
• Variance of inter-arrival time = σa²
• Service time S has mean 1/μ = E[S]
• Variance of service time = σs²
• The approximate waiting time for a G/G/1 queue is
W ≈ λ (σa² + σs²) / (2(1 − ρ))
• Note: as ρ → 1, W → ∞
• Large variance means large waiting time
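A minimal sketch (my own code, assuming the heavy-traffic expression shown above) of the G/G/1 approximation:

```python
def gg1_waiting_time_approx(lam, es, var_a, var_s):
    """Heavy-traffic approximation of the mean waiting time of a G/G/1 queue.

    lam   : mean arrival rate (lambda)
    es    : mean service time E[S] = 1/mu
    var_a : variance of the inter-arrival time (sigma_a^2)
    var_s : variance of the service time (sigma_s^2)
    """
    rho = lam * es
    if rho >= 1:
        raise ValueError("unstable queue: utilisation must be below 1")
    return lam * (var_a + var_s) / (2 * (1 - rho))
```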

Bounds for G/G/1 waiting time
• Mean arrival rate = λ
• Variance of inter-arrival time = σa²
• Service time S has mean 1/μ = E[S]
• Variance of service time = σs²
• A bound on the waiting time of a G/G/1 queue is
W ≤ λ (σa² + σs²) / (2(1 − ρ))
• Note that the bound suggests that large variance means large waiting time

Approximation for G/G/m queue
• Only approximate waiting times are available for G/G/m
• The approximate waiting time is
W(G/G/m) ≈ (Ca² + Cs²)/2 × W(M/M/m)
where W(M/M/m) is the mean waiting time of an M/M/m queue with the same arrival and service rates
• Coefficient of variation of a random variable X = standard deviation of X / mean of X
• Note: variance in inter-arrival or service times increases queueing delay
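A sketch (my own code) of the scaling approximation reconstructed above: compute the M/M/m waiting time via the standard Erlang C formula, then scale it by (Ca² + Cs²)/2. The function names are mine.

```python
import math

def erlang_c(m, rho):
    """Erlang C: probability that an arriving customer has to queue in an M/M/m queue.

    m   : number of servers
    rho : per-server utilisation, rho = lambda / (m * mu), must be < 1
    """
    a = m * rho                                          # offered load
    top = a ** m / (math.factorial(m) * (1 - rho))
    bottom = sum(a ** k / math.factorial(k) for k in range(m)) + top
    return top / bottom

def mmm_waiting_time(lam, mu, m):
    """Mean waiting time of an M/M/m queue."""
    rho = lam / (m * mu)
    return erlang_c(m, rho) / (m * mu - lam)

def ggm_waiting_time_approx(lam, mu, m, ca2, cs2):
    """Approximate G/G/m mean waiting time: the M/M/m waiting time scaled by
    (Ca^2 + Cs^2)/2, where Ca and Cs are the coefficients of variation of the
    inter-arrival and service times."""
    return (ca2 + cs2) / 2 * mmm_waiting_time(lam, mu, m)
```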

Processor sharing (PS)
• We have so far assumed that the processor performs work on a first-come-first-served basis
• However, this is not how CPUs perform tasks
• Consider an example: a CPU has a job queue with three tasks called Tasks 1, 2 and 3 in it
• CPU works on Task 1 for a certain amount of time (called a quantum) and then returns the task to the job queue if it is not yet finished
• CPU works on Task 2 for a quantum and returns the task to the job queue if it is not yet finished
• CPU works on Task 3 for a quantum and returns the task to the job queue if it is not yet finished

Modelling processor sharing
• We assume the context switching time is negligible
• We assume the quantum is small compared with the length of a task, so we can think of the processing as continuous rather than discrete
• During any period when there are n jobs in the job queue, each job receives 1/n of the server's capacity

PS: Example 1
• Example 1:
• At time 0, there are 2 jobs in the job queue
• Job 1 still needs 5 seconds of service
• Job 2 still needs 3 seconds of service
• Assuming no more jobs will arrive, determine the time at which the jobs will be completed

PS: Example 2
• Example 2:
• At time 0, there are 2 jobs in the job queue
• Job 1 still needs 5 seconds of service
• Job 2 still needs 3 seconds of service
• Job 3 arrives at time = 1 second and requires 4 seconds of service
• Job 4 arrives at time = 2 seconds and requires 1 second of service
• No more jobs will arrive after Job 4
• Questions:
Without computing the finished times for Jobs 1 and 3, are you able to tell which of these two jobs will finish first?
Determine the time at which the jobs will be completed
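A small sketch (my own code) of the continuous processor-sharing model described above; the demo input at the bottom is Example 2's (arrival time, service demand) pairs, so the printed values are the completion times the exercise asks for.

```python
def ps_completion_times(jobs):
    """Processor sharing with negligible quantum/context-switch overhead.

    jobs : list of (arrival_time, service_demand) pairs.
    Returns the completion time of each job, in input order.

    Between events (an arrival or a departure) the n jobs present each
    receive service at rate 1/n, so every remaining demand drops by dt/n.
    """
    arrival = {i: a for i, (a, _) in enumerate(jobs)}
    remaining = {i: s for i, (_, s) in enumerate(jobs)}
    done = {}
    t = 0.0
    while len(done) < len(jobs):
        active = [i for i in remaining if i not in done and arrival[i] <= t]
        future = [a for a in arrival.values() if a > t]
        if not active:                     # server idle: jump to the next arrival
            t = min(future)
            continue
        n = len(active)
        to_finish = min(remaining[i] for i in active) * n   # first departure if nothing arrives
        to_arrival = (min(future) - t) if future else float("inf")
        dt = min(to_finish, to_arrival)
        for i in active:
            remaining[i] -= dt / n
            if remaining[i] <= 1e-9:
                done[i] = t + dt
        t += dt
    return [done[i] for i in range(len(jobs))]

# Example 2: Jobs 1-4 as (arrival time, service demand) in seconds.
print(ps_completion_times([(0, 5), (0, 3), (1, 4), (2, 1)]))   # ≈ [13.0, 9.5, 12.5, 6.0]
```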

M/M/1/PS queues
• Jobs arrive according to a Poisson process
• Exponential service time
• One processor using processor sharing
• State n = there are n jobs in the job queue
• State diagram: same as the M/M/1 queue, and there is a reason for that
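A brief justification (my own, relying only on the memoryless property): in state n, each job is served at rate 1/n of the server, so the wall-clock time for any particular job to finish is Exponential(μ/n), and the time to the next departure is the minimum of n such independent times:

\[
\min\{T_1,\dots,T_n\} \sim \mathrm{Exp}\!\left(n \cdot \tfrac{\mu}{n}\right) = \mathrm{Exp}(\mu).
\]

So every state n ≥ 1 has departure rate μ and arrival rate λ, exactly as in the M/M/1 (FCFS) chain, which is why the state diagram, and hence the steady-state distribution, is the same.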

• We have studied a few types of non-Markovian queues
• M/G/1, G/G/1, G/G/m
• The key method to derive the M/G/1 waiting time is via the residual service time
• Processor sharing (PS)

References
• Recommended reading
• Bertsekas and Gallager, “Data Networks”
• Section 3.5 for M/G/1 queue
• The result on G/G/1 bound is taken from Section 3.5.4
• Processor sharing
• Harchol-Balter Section 22.2.2
