Introduction to Queuing Analysis
Queuing Analysis:
Hongwei Zhang
http://www.cs.wayne.edu/~hzhang
Acknowledgement: this lecture is partially based on the slides of Dr. Yannis A. Korilis.
Outline
• Delay in packet networks
• Introduction to queuing theory
• Exponential and Poisson distributions
• Poisson process
• Little's Theorem
Sources of Network Delay?
• Processing Delay
  • Time between receiving a packet and assigning the packet to an outgoing link queue
• Queueing Delay
  • Time buffered waiting for transmission
• Transmission Delay
  • Time between transmitting the first and the last bit of the packet
• Propagation Delay
  • Time spent on the link – transmission of the electrical signal
  • Independent of traffic carried by the link
Focus: Queueing & Transmission Delay
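As a quick sanity check on the two deterministic components: transmission delay is packet length divided by link rate, and propagation delay is link length divided by signal speed. The numbers below (packet size, link rate, link length, propagation speed) are illustrative assumptions, not from the slides.

```python
# Illustrative numbers (not from the slides): a 1500-byte packet on a
# 10 Mb/s, 2000 km link.  Queueing and processing delays depend on traffic
# and hardware, so only the two deterministic components are computed here.
packet_bits = 1500 * 8
link_rate_bps = 10e6
link_length_m = 2_000_000
prop_speed_mps = 2e8  # roughly 2/3 of the speed of light, typical for fiber

transmission_delay = packet_bits / link_rate_bps    # first bit sent -> last bit sent
propagation_delay = link_length_m / prop_speed_mps  # time on the link, traffic-independent

print(transmission_delay, propagation_delay)  # 0.0012 s and 0.01 s
```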
Outline
• Delay in packet networks
• Introduction to queuing theory
• Exponential and Poisson distributions
• Poisson process
• Little's Theorem
Basic Queueing Model
[Figure: arrivals → buffer (queued customers) → server(s) (customer in service) → departures]

• A queue models any service station with:
  • One or multiple servers
  • A waiting area or buffer
• Customers arrive to receive service
• A customer that upon arrival does not find a free server waits in the buffer
Characteristics of a Queue
• Number of servers m: one, multiple, infinite
• Buffer size b
• Service discipline (scheduling)
  • FCFS, LCFS, Processor Sharing (PS), etc.
• Arrival process
• Service statistics
Arrival Process
[Figure: timeline of arrivals n − 1, n, n + 1, with interarrival time τn]

• τn: interarrival time between customers n and n + 1
• τn is a random variable
• {τn, n ≥ 1} is a stochastic process
• Interarrival times are identically distributed and have a common mean E[τn] = E[τ] = 1/λ, where λ is called the arrival rate
Service-Time Process
[Figure: timeline of service times sn of customers n − 1, n, n + 1]

• sn: service time of customer n at the server
• {sn, n ≥ 1} is a stochastic process
• Service times are identically distributed with common mean E[sn] = E[s] = 1/µ, where µ is called the service rate
• For packets, are the service times really random?
Queue Descriptors
• Generic descriptor: A/S/m/k
• A denotes the arrival process
  • For Poisson arrivals we use M (for Markovian)
• S denotes the service-time distribution
  • M: exponential distribution
  • D: deterministic service times
  • G: general distribution
• m is the number of servers
• k is the max number of customers allowed in the system – either in the buffer or in service
  • k is omitted when the buffer size is infinite
Queue Descriptors: Examples
• M/M/1: Poisson arrivals, exponentially distributed service times, one server, infinite buffer
• M/M/m: same as previous with m servers
• M/M/m/m: Poisson arrivals, exponentially distributed service times, m servers, no buffering
• M/G/1: Poisson arrivals, identically distributed service times following a general distribution, one server, infinite buffer
• */D/∞: a constant delay system
Outline
• Delay in packet networks
• Introduction to queuing theory
• Exponential and Poisson distributions
• Poisson process
• Little's Theorem
Some probability distributions and random process
• Exponential Distribution
  • Memoryless Property
• Poisson Distribution
• Poisson Process
  • Definition and Properties
  • Interarrival Time Distribution
  • Modeling Arrival Statistics
Exponential Distribution
• A continuous R.V. X follows the exponential distribution with parameter µ, if its pdf is:

  f_X(x) = µ e^{−µx}, if x ≥ 0;   f_X(x) = 0, if x < 0

=> Probability distribution function:

  F_X(x) = P{X ≤ x} = 1 − e^{−µx}, if x ≥ 0;   F_X(x) = 0, if x < 0

• Usually used for modeling service time
Exponential Distribution (contd.)
• Mean and Variance:

  E[X] = 1/µ,   Var(X) = 1/µ²

Proof:

  E[X] = ∫₀^∞ x f_X(x) dx = ∫₀^∞ x µ e^{−µx} dx
       = [−x e^{−µx}]₀^∞ + ∫₀^∞ e^{−µx} dx = 1/µ

  E[X²] = ∫₀^∞ x² µ e^{−µx} dx = [−x² e^{−µx}]₀^∞ + 2 ∫₀^∞ x e^{−µx} dx = (2/µ) E[X] = 2/µ²

  Var(X) = E[X²] − (E[X])² = 2/µ² − 1/µ² = 1/µ²
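The closed forms above are easy to check numerically. A minimal Monte Carlo sketch using Python's standard library (the rate µ and sample size are arbitrary choices):

```python
import random

random.seed(1)
mu = 2.0       # service rate (arbitrary); theory says E[X] = 1/mu, Var(X) = 1/mu^2
n = 200_000

# expovariate takes the rate parameter, so mean service time is 1/mu = 0.5
samples = [random.expovariate(mu) for _ in range(n)]
mean = sum(samples) / n
var = sum((x - mean) ** 2 for x in samples) / n

print(mean, var)  # should be close to 1/mu = 0.5 and 1/mu^2 = 0.25
```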
Memoryless Property
• Past history has no influence on the future:

  P{X > x + t | X > t} = P{X > x}

Proof:

  P{X > x + t | X > t} = P{X > x + t, X > t} / P{X > t} = P{X > x + t} / P{X > t}
                       = e^{−µ(x+t)} / e^{−µt} = e^{−µx} = P{X > x}

• Exponential: the only continuous distribution with the memoryless property
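The memoryless identity can be observed empirically: conditioning exponential samples on having already exceeded t does not change the remaining-life distribution. The parameter values below are arbitrary.

```python
import random

random.seed(2)
mu, x, t = 1.0, 0.7, 1.5   # arbitrary rate and offsets
n = 200_000

samples = [random.expovariate(mu) for _ in range(n)]

p_tail = sum(s > x for s in samples) / n                      # P{X > x}
survivors = [s for s in samples if s > t]                     # condition on X > t
p_cond = sum(s > x + t for s in survivors) / len(survivors)   # P{X > x+t | X > t}

print(p_tail, p_cond)  # both should be near e^{-mu*x} = e^{-0.7} ≈ 0.497
```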
Poisson Distribution
• A discrete R.V. X follows the Poisson distribution with parameter λ if its probability mass function is:

  P{X = k} = e^{−λ} λᵏ / k!,   k = 0, 1, 2, ...

• Wide applicability in modeling the number of random events that occur during a given time interval (=> Poisson process)
  • Customers that arrive at a post office during a day
  • Wrong phone calls received during a week
  • Students that go to the instructor's office during office hours
  • Packets that arrive at a network switch
  • etc.
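The pmf above is simple to evaluate directly; truncating the infinite sums at a large k confirms that it sums to 1 and that its mean and variance both equal λ (the choice λ = 4 is arbitrary):

```python
import math

def poisson_pmf(lam, k):
    """P{X = k} = e^{-lam} * lam^k / k!"""
    return math.exp(-lam) * lam ** k / math.factorial(k)

lam = 4.0        # arbitrary example parameter
ks = range(120)  # enough terms that the truncated tail is negligible

total = sum(poisson_pmf(lam, k) for k in ks)
mean = sum(k * poisson_pmf(lam, k) for k in ks)
second = sum(k * k * poisson_pmf(lam, k) for k in ks)
var = second - mean ** 2

print(total, mean, var)  # ≈ 1, lam, lam
```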
Poisson Distribution (contd.)
• Mean and Variance:

  E[X] = λ,   Var(X) = λ

Proof:

  E[X] = Σ_{k=0}^∞ k P{X = k} = e^{−λ} Σ_{k=1}^∞ k λᵏ / k!
       = λ e^{−λ} Σ_{k=1}^∞ λ^{k−1} / (k − 1)! = λ e^{−λ} Σ_{j=0}^∞ λʲ / j! = λ

  E[X²] = Σ_{k=0}^∞ k² P{X = k} = e^{−λ} Σ_{k=1}^∞ k² λᵏ / k!
        = λ e^{−λ} Σ_{k=1}^∞ k λ^{k−1} / (k − 1)! = λ e^{−λ} Σ_{j=0}^∞ (j + 1) λʲ / j!
        = λ e^{−λ} ( Σ_{j=0}^∞ j λʲ / j! + Σ_{j=0}^∞ λʲ / j! ) = λ (λ + 1) = λ² + λ

  Var(X) = E[X²] − (E[X])² = λ² + λ − λ² = λ
Sum of Poisson Random Variables
• Xi, i = 1, 2, ..., n, are independent R.V.s
• Xi follows Poisson distribution with parameter λi
• Sum: Sn = X1 + X2 + ... + Xn
• Sn follows Poisson distribution with parameter λ = λ1 + λ2 + ... + λn
Sum of Poisson Random Variables (cont.)
Proof: For n = 2; generalization by induction. The pmf of S = X1 + X2 is

  P{S = m} = Σ_{k=0}^m P{X1 = k, X2 = m − k}
           = Σ_{k=0}^m P{X1 = k} P{X2 = m − k}
           = Σ_{k=0}^m (e^{−λ1} λ1ᵏ / k!) · (e^{−λ2} λ2^{m−k} / (m − k)!)
           = e^{−(λ1+λ2)} (1/m!) Σ_{k=0}^m [m! / (k! (m − k)!)] λ1ᵏ λ2^{m−k}
           = e^{−(λ1+λ2)} (λ1 + λ2)ᵐ / m!

i.e., Poisson with parameter λ = λ1 + λ2.
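The convolution step in the proof can be checked numerically: convolving two Poisson pmfs term by term reproduces the Poisson(λ1 + λ2) pmf exactly. The rates below are arbitrary examples.

```python
import math

def poisson_pmf(lam, k):
    """P{X = k} = e^{-lam} * lam^k / k!"""
    return math.exp(-lam) * lam ** k / math.factorial(k)

lam1, lam2 = 2.0, 3.0  # arbitrary example rates

# Convolve the pmfs of X1 ~ Poisson(lam1) and X2 ~ Poisson(lam2),
# and compare against the claimed Poisson(lam1 + lam2) pmf.
max_err = max(
    abs(sum(poisson_pmf(lam1, k) * poisson_pmf(lam2, m - k) for k in range(m + 1))
        - poisson_pmf(lam1 + lam2, m))
    for m in range(20)
)
print(max_err)  # numerically zero
```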
Sampling a Poisson Variable
• X follows Poisson distribution with parameter λ
• Each of the X arrivals is of type i with probability pi, i = 1, 2, ..., n, independent of other arrivals; p1 + p2 + ... + pn = 1
• Xi denotes the number of type i arrivals; then
  • X1, X2, ..., Xn are independent
  • Xi follows Poisson distribution with parameter λi = λpi
=> Splitting of Poisson process (later)
Sampling a Poisson Variable (contd.)
Proof: For n = 2; generalize by induction. Joint pmf:

  P{X1 = k1, X2 = k2}
    = P{X1 = k1, X2 = k2 | X = k1 + k2} P{X = k1 + k2}
    = [(k1 + k2)! / (k1! k2!)] p1^{k1} p2^{k2} · e^{−λ} λ^{k1+k2} / (k1 + k2)!
    = (1 / (k1! k2!)) (λp1)^{k1} (λp2)^{k2} e^{−λ(p1+p2)}
    = [e^{−λp1} (λp1)^{k1} / k1!] · [e^{−λp2} (λp2)^{k2} / k2!]

• X1 and X2 are independent
• P{X1 = k1} = e^{−λp1} (λp1)^{k1} / k1!,   P{X2 = k2} = e^{−λp2} (λp2)^{k2} / k2!

i.e., Xi follows Poisson distribution with parameter λpi.
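A simulation sketch of this "thinning" result: draw X ~ Poisson(λ), mark each arrival type 1 with probability p, and check that the type-1 count has mean and variance near λp (a Poisson signature). The sampler and the parameter values are assumptions for illustration.

```python
import random, math

random.seed(3)
lam, p = 5.0, 0.3  # arbitrary rate and type-1 probability
n = 100_000

def poisson_sample(lam):
    """Knuth's method: count uniforms whose running product stays above e^{-lam}."""
    limit, prod, k = math.exp(-lam), random.random(), 0
    while prod > limit:
        prod *= random.random()
        k += 1
    return k

type1 = []
for _ in range(n):
    x = poisson_sample(lam)
    # Each of the x arrivals is type 1 with probability p, independently
    type1.append(sum(random.random() < p for _ in range(x)))

mean1 = sum(type1) / n
var1 = sum((c - mean1) ** 2 for c in type1) / n
print(mean1, var1)  # both should be near lam * p = 1.5 (mean == variance for Poisson)
```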
Poisson Approximation to Binomial
• Binomial distribution with parameters (n, p):

  P{X = k} = C(n, k) pᵏ (1 − p)^{n−k}

• As n → ∞ and p → 0, with np = λ moderate, the binomial distribution converges to Poisson with parameter λ
• Proof:

  P{X = k} = C(n, k) pᵏ (1 − p)^{n−k}
           = [n (n − 1) ... (n − k + 1) / k!] (λ/n)ᵏ (1 − λ/n)^{n−k}
           = [n (n − 1) ... (n − k + 1) / nᵏ] · (λᵏ / k!) · (1 − λ/n)ⁿ · (1 − λ/n)^{−k}

  As n → ∞:

    n (n − 1) ... (n − k + 1) / nᵏ → 1
    (1 − λ/n)ⁿ → e^{−λ}
    (1 − λ/n)^{−k} → 1

  => P{X = k} → e^{−λ} λᵏ / k!
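The convergence is visible numerically: holding np = λ fixed, the worst-case pointwise gap between the binomial and Poisson pmfs shrinks as n grows. The specific n values below are arbitrary.

```python
import math

lam = 2.0  # fixed np = lambda

def binom_pmf(n, p, k):
    return math.comb(n, k) * p ** k * (1 - p) ** (n - k)

def poisson_pmf(lam, k):
    return math.exp(-lam) * lam ** k / math.factorial(k)

# Max pointwise gap between Binomial(n, lam/n) and Poisson(lam), small n vs large n
err_small = max(abs(binom_pmf(10, lam / 10, k) - poisson_pmf(lam, k)) for k in range(10))
err_large = max(abs(binom_pmf(10_000, lam / 10_000, k) - poisson_pmf(lam, k)) for k in range(10))
print(err_small, err_large)  # the gap shrinks as n grows
```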
Outline
• Delay in packet networks
• Introduction to queuing theory
• Exponential and Poisson distributions
• Poisson process
• Little's Theorem
Poisson Process with Rate λ
• {A(t): t ≥ 0} counting process
  • A(t) is the number of events (arrivals) that have occurred from time 0 to time t, with A(0) = 0
  • A(t) − A(s): number of arrivals in interval (s, t]
• Numbers of arrivals in disjoint intervals are independent
• Number of arrivals in any interval (t, t + τ] of length τ
  • Depends only on its length τ
  • Follows Poisson distribution with parameter λτ:

    P{A(t + τ) − A(t) = n} = e^{−λτ} (λτ)ⁿ / n!,   n = 0, 1, ...

=> Average number of arrivals is λτ; λ is the arrival rate
Interarrival-Time Statistics
• Interarrival times for a Poisson process are independent and follow the exponential distribution with parameter λ
  • tn: time of the nth arrival; τn = t(n+1) − tn: the nth interarrival time

  P{τn ≤ s} = 1 − e^{−λs},   s ≥ 0

Proof:
• Probability distribution function:

  P{τn ≤ s} = 1 − P{τn > s} = 1 − P{A(tn + s) − A(tn) = 0} = 1 − e^{−λs}

• Independence follows from the independence of the numbers of arrivals in disjoint intervals
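The two characterizations are consistent: generating a Poisson process by summing iid exponential interarrival times yields window counts with mean and variance near λτ. Rate, window, and replication count below are arbitrary.

```python
import random

random.seed(4)
lam, tau = 3.0, 2.0   # arbitrary arrival rate and window length
n_windows = 50_000

counts = []
for _ in range(n_windows):
    # Build arrivals in [0, tau) by summing iid Exp(lam) interarrival times
    t, count = random.expovariate(lam), 0
    while t < tau:
        count += 1
        t += random.expovariate(lam)
    counts.append(count)

mean = sum(counts) / n_windows
var = sum((c - mean) ** 2 for c in counts) / n_windows
print(mean, var)  # both should be near lam * tau = 6 (Poisson mean == variance)
```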
Small Interval Probabilities
• Interval (t, t + δ] of length δ:

  P{A(t + δ) − A(t) = 0} = 1 − λδ + o(δ)
  P{A(t + δ) − A(t) = 1} = λδ + o(δ)
  P{A(t + δ) − A(t) ≥ 2} = o(δ)

Proof: expand e^{−λδ} = 1 − λδ + (λδ)²/2! − ... in the Poisson pmf for n = 0 and n = 1; the remaining probability mass is o(δ).
Merging & Splitting Poisson Processes
[Figure: streams of rates λ1 and λ2 merged into one stream of rate λ1 + λ2; a stream of rate λ split with probabilities p and 1 − p into streams of rates λp and λ(1 − p)]

• A1, ..., Ak independent Poisson processes with rates λ1, ..., λk
  • Merged into a single process A = A1 + ... + Ak
  • A is a Poisson process with rate λ = λ1 + ... + λk
• A: Poisson process with rate λ
  • Split into processes A1 and A2 independently, with probabilities p and 1 − p respectively
  • A1 is Poisson with rate λ1 = λp
  • A2 is Poisson with rate λ2 = λ(1 − p)
Modeling Arrival Statistics
• Poisson process widely used to model packet arrivals in numerous networking problems
• Justification: provides a good model for aggregate traffic of a large number of “independent” users
  • n traffic streams, with independent identically distributed (iid) interarrival times with PDF F(s) – not necessarily exponential
  • Arrival rate of each stream: λ/n
  • As n → ∞, the combined stream can be approximated by Poisson under mild conditions on F(s) – e.g., F(0) = 0, F′(0) > 0
☺ Most important reason for the Poisson assumption: analytic tractability of queueing models
Outline
• Delay in packet networks
• Introduction to queuing theory
• Exponential and Poisson distributions
• Poisson process
• Little's Theorem
Little’s Theorem
[Figure: system with arrival rate λ, N customers inside, and delay T per customer]

• λ: customer arrival rate
• N: average number of customers in system
• T: average delay per customer in system

Little's Theorem: for a system in steady-state,

  N = λT
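A simulation sketch of the theorem for a single-server FCFS queue with Poisson arrivals and exponential service (an M/M/1 example; the rates are arbitrary choices). The time-average number in system, computed as area under N(t) divided by elapsed time, matches the measured λ times the measured T, and both agree with M/M/1 steady-state theory.

```python
import random

random.seed(5)
lam, mu = 0.5, 1.0   # arrival and service rates (utilization rho = 0.5)
n = 200_000          # customers to simulate

t = 0.0              # current arrival time
prev_depart = 0.0    # departure time of the previous customer
delays = []          # system times T_i
for _ in range(n):
    t += random.expovariate(lam)                   # Poisson arrivals
    start = max(t, prev_depart)                    # FCFS: wait if server busy
    prev_depart = start + random.expovariate(mu)   # exponential service
    delays.append(prev_depart - t)

total_time = prev_depart
T_avg = sum(delays) / n             # average delay per customer
lam_hat = n / total_time            # measured arrival rate
N_avg = sum(delays) / total_time    # time-average number in system (area / time)

print(N_avg, lam_hat * T_avg)       # Little: N = lambda * T
# M/M/1 steady state: T = 1/(mu - lam) = 2, N = rho/(1 - rho) = 1
```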
Counting Processes of a Queue
[Figure: staircase graphs of α(t), β(t), and N(t) versus time t]

• N(t): number of customers in system at time t
• α(t): number of customer arrivals till time t
• β(t): number of customer departures till time t
• Ti: time spent in system by the ith customer
Time Averages
• Time averages over interval [0, t]:

  Nt = (1/t) ∫₀ᵗ N(s) ds,   λt = α(t)/t
  Tt = (1/α(t)) Σ_{i=1}^{α(t)} Ti,   δt = β(t)/t

• Steady-state time averages:

  N = lim_{t→∞} Nt,   λ = lim_{t→∞} λt,   T = lim_{t→∞} Tt,   δ = lim_{t→∞} δt

• Little's theorem: N = λT
• Applies to any queueing system provided that:
  • the limits T, λ, and δ exist, and
  • λ = δ
• We give a simple graphical proof under a set of more restrictive assumptions
Proof of Little’s Theorem for FCFS
[Figure: staircase graphs of α(t) and β(t); the shaded area between them is built from strips of height 1 and lengths T1, T2, ..., Ti]

• FCFS system, N(0) = 0
  • α(t) and β(t): staircase graphs
  • N(t) = α(t) − β(t)
  • Shaded area between the graphs: S(t) = ∫₀ᵗ N(s) ds
• Assumption: N(t) = 0 infinitely often. For any such t:

  ∫₀ᵗ N(s) ds = Σ_{i=1}^{α(t)} Ti
  => (1/t) ∫₀ᵗ N(s) ds = (α(t)/t) · (1/α(t)) Σ_{i=1}^{α(t)} Ti   =>   Nt = λt Tt

• If the limits Nt → N, Tt → T, λt → λ exist, Little's formula follows
• We will relax the last assumption (i.e., that N(t) = 0 infinitely often)
Proof of Little’s for FCFS (contd.)
[Figure: staircase graphs of α(t) and β(t) with strips T1, T2, ..., Ti]

• In general – even if the queue is not empty infinitely often:

  Σ_{i=1}^{β(t)} Ti ≤ ∫₀ᵗ N(s) ds ≤ Σ_{i=1}^{α(t)} Ti
  => (β(t)/t) · (1/β(t)) Σ_{i=1}^{β(t)} Ti ≤ (1/t) ∫₀ᵗ N(s) ds ≤ (α(t)/t) · (1/α(t)) Σ_{i=1}^{α(t)} Ti
  => δt · (1/β(t)) Σ_{i=1}^{β(t)} Ti ≤ Nt ≤ λt Tt

• The result follows assuming the limits Tt → T, λt → λ, and δt → δ exist, and λ = δ
Probabilistic Form of Little’s Theorem
• Have considered a single sample function for a stochastic process
• Now will focus on the probabilities of the various sample functions of a stochastic process
• Probability of n customers in system at time t:

  pn(t) = P{N(t) = n}

• Expected number of customers in system at t:

  E[N(t)] = Σ_{n=0}^∞ n P{N(t) = n} = Σ_{n=0}^∞ n pn(t)
Probabilistic Form of Little (contd.)
• pn(t) and E[N(t)] depend on t and on the initial distribution at t = 0
• We will consider systems that converge to steady-state, where there exist pn independent of the initial distribution:

  lim_{t→∞} pn(t) = pn,   n = 0, 1, ...

• Expected number of customers in steady-state [stochastic average]:

  EN = Σ_{n=0}^∞ n pn = lim_{t→∞} E[N(t)]

• For an ergodic process, the time average of a sample function is equal to the steady-state expectation, with probability 1:

  N = lim_{t→∞} Nt = lim_{t→∞} E[N(t)] = EN
Probabilistic Form of Little (contd.)
• In principle, we can find the probability distribution of the delay Ti for customer i, and from that the expected value E[Ti], which converges to steady-state:

  ET = lim_{i→∞} E[Ti]

• For an ergodic system:

  T = lim_{k→∞} (1/k) Σ_{i=1}^k Ti = lim_{i→∞} E[Ti] = ET

• Probabilistic form of Little's formula:

  EN = λ ET

  where the arrival rate is defined as λ = lim_{t→∞} E[α(t)] / t
Time vs. Stochastic Averages
• “Time averages = stochastic averages” for all systems of interest in this course
• It holds if a single sample function of the stochastic process contains all possible realizations of the process as t → ∞
• Can be justified on the basis of general properties of Markov chains
Example 0: a single line
For a transmission line,
• λ: packet arrival rate
• NQ: average number of packets waiting in queue (i.e., not under transmission)
• W: average time spent by a packet waiting in queue (i.e., not including transmission time)

=> NQ = λW

Similarly, if X is the average transmission time, then the average number of packets under transmission is

  ρ = λX

ρ is also called the utilization factor
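Plugging in illustrative numbers (my own, not from the slides) shows how the two applications of Little's theorem work for a transmission line:

```python
# Hypothetical line: lam packets/s arriving, W seconds average queueing wait,
# X seconds average transmission time.
lam = 100.0   # packets per second
W = 0.005     # average queueing wait: 5 ms
X = 0.002     # average transmission time: 2 ms

N_Q = lam * W   # average number of packets waiting in queue
rho = lam * X   # average number under transmission = utilization factor

print(N_Q, rho)  # 0.5 packets queued on average, utilization 0.2
```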
Example 1: a network
• Given
  • A network with packets arriving at n different nodes, with arrival rates λ1, ..., λn respectively
  • N: average # of packets inside the network
• Then
  • Average delay per packet (regardless of packet length distribution and routing algorithms) is

    T = N / (λ1 + ... + λn)

  • Ni = λi Ti for each node i
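A tiny numeric instance of this formula, with hypothetical per-node rates and network population:

```python
# Hypothetical 3-node network: apply T = N / (lam_1 + ... + lam_n)
lams = [50.0, 30.0, 20.0]  # packet arrival rates at each node (packets/s)
N = 25.0                   # average number of packets inside the network

T = N / sum(lams)          # average delay per packet, by Little's theorem
print(T)                   # 25 / 100 = 0.25 s
```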
Example 2: data transport (congestion control)
• Consider
  • a window flow control system with a window of size W for each session
  • λ: per-session packet arrival rate
  • T: average packet delay in the network
• Then W ≥ λT
  => if congestion builds up (i.e., T increases), λ must eventually decrease
• Now suppose
  • the network is congested and capable of maintaining a delivery rate of λ; then W ≈ λT
  => increasing W only increases the delay T
Summary
• Delay in packet networks
• Introduction to queuing theory
• A few more points about probability theory
• The Poisson process
• Little's Theorem
Homework #7
• Problems 3.1, 3.4, and 3.6 of R1
• Grading:
  • Overall points: 130
  • 20 points for Prob. 3.1
  • 50 points for Prob. 3.4
  • 60 points for Prob. 3.6