Networking Theory and Fundamentals - Lectures 9 & 10


1. TCOM 501: Networking Theory & Fundamentals
   Lectures 9 & 10: The M/G/1 Queue
   Prof. Yannis A. Korilis
2. Topics
   - M/G/1 Queue
   - Pollaczek-Khinchin (P-K) Formula
   - Embedded Markov Chain Observed at Departure Epochs
   - Pollaczek-Khinchin Transform Equation
   - Queues with Vacations
   - Priority Queueing
3. M/G/1 Queue
   - Arrival process: Poisson with rate λ
   - Single server, infinite waiting room
   - Service times: independent, identically distributed, following a general distribution; independent of the arrival process
   Main results:
   - Determine the average time a customer spends in the queue waiting for service (Pollaczek-Khinchin formula)
   - Calculation of the stationary distribution, for special cases only
4. M/G/1 Queue: Notation
   - W_i: waiting time of customer i
   - X_i: service time of customer i
   - Q_i: number of customers waiting in queue (excluding the one in service) upon arrival of customer i
   - R_i: residual service time seen by customer i, i.e., the time until the customer found in service by customer i completes service
   - A_i: number of arrivals during the service time X_i of customer i
   Service times X_1, X_2, ... are:
   - independent, identically distributed random variables
   - independent of the inter-arrival times
   - distributed according to a general distribution characterized by its pdf f_X(x) or cdf F_X(x), with common mean E[X] = 1/μ and common second moment E[X^2]
5. M/G/1 Queue
   State representation:
   - {N(t) : t ≥ 0} is not a Markov process: the time spent at each state is not exponential
   - Let R(t) be the time until the customer in service at time t completes service. {(N(t), R(t)) : t ≥ 0} is a continuous-time Markov process, but its state space is not a countable set
   - Finding the stationary distribution can be a rather challenging task
   Goals:
   - Calculate the average number of customers and the average time delay without first calculating the stationary distribution, via the Pollaczek-Khinchin (P-K) formula:

     E[W] = \frac{λE[X^2]}{2(1 - λE[X])}

   - To find the stationary distribution, we use the embedded Markov chain obtained by observing N(t) at departure epochs only, together with transform methods
6. A Result from Probability Theory
   Proposition (sum of a random number of random variables): Let N be a random variable taking values 0, 1, 2, ... with mean E[N], and let X_1, X_2, ..., X_N be iid random variables, independent of N, with common mean E[X]. Then:

     E[X_1 + ... + X_N] = E[X] E[N]

   Proof: Given N = n, the expected value of the sum is

     E[\sum_{j=1}^{N} X_j | N = n] = E[\sum_{j=1}^{n} X_j] = \sum_{j=1}^{n} E[X_j] = n E[X]

   Then:

     E[\sum_{j=1}^{N} X_j] = \sum_{n=1}^{∞} E[\sum_{j=1}^{N} X_j | N = n] P\{N = n\} = \sum_{n=1}^{∞} n E[X] P\{N = n\} = E[X] \sum_{n=1}^{∞} n P\{N = n\} = E[X] E[N]
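The proposition is easy to test numerically. Below is a quick Monte Carlo sketch (our own illustration, not from the slides), taking N Poisson with mean 4 and the X_j exponential with mean 0.5:

```python
# Monte Carlo check of E[X_1 + ... + X_N] = E[X] E[N], with N ~ Poisson(4)
# and X_j ~ Exponential(mean 0.5). Parameter choices are illustrative only.
import random

random.seed(1)
trials = 200_000
mean_N, mean_X = 4.0, 0.5

total = 0.0
for _ in range(trials):
    # Sample N ~ Poisson(mean_N) by counting Exp(mean_N) inter-event times in [0, 1].
    n = 0
    t = random.expovariate(mean_N)
    while t < 1.0:
        n += 1
        t += random.expovariate(mean_N)
    # Sum N iid exponential variables with mean mean_X.
    total += sum(random.expovariate(1.0 / mean_X) for _ in range(n))

print("empirical:", total / trials)   # ~ 2.0
print("E[X]E[N] :", mean_X * mean_N)  # = 2.0
```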
7. Pollaczek-Khinchin Formula
   Assume the FCFS discipline. The waiting time of customer i is

     W_i = R_i + X_{i-1} + X_{i-2} + ... + X_{i-Q_i} = R_i + \sum_{j=i-Q_i}^{i-1} X_j

   Take expectations on both sides and let i → ∞, assuming the limits exist (using the proposition of the previous slide):

     E[W_i] = E[R_i] + E[\sum_{j=i-Q_i}^{i-1} X_j] = E[R_i] + E[X] E[Q_i]  ⇒  E[W] = E[R] + E[X] E[Q]

   The averages E[Q] and E[R] in this equation are those seen by an arriving customer. With Poisson arrivals and lack of anticipation, the averages seen by an arriving customer equal the averages seen by an outside observer (PASTA property).
   Little's theorem applied to the waiting area only gives E[Q] = λE[W], so:

     E[W] = E[R] + λE[X] E[W] = E[R] + ρE[W]  ⇒  E[W] = \frac{E[R]}{1 - ρ}

   where ρ = λE[X] = λ/μ is the utilization factor, the proportion of time the server is busy: ρ = λE[X] = 1 − p_0.
   It remains to calculate the average residual time E[R] = \lim_{i→∞} E[R_i].
8. Average Residual Time
   [Figure: sawtooth plot of the residual service time R(t) versus t. R(t) jumps to X_i when the service of customer i begins and decreases linearly to 0 at its departure, forming D(t) triangles of heights X_1, X_2, ...]
   Graphical calculation of the long-term average of the residual time. The time-average residual time over [0, t] is t^{-1} \int_0^t R(s) ds. Consider a time t such that R(t) = 0, let D(t) be the number of departures in [0, t], and assume R(0) = 0. From the figure, the area under R(s) is the sum of the triangle areas X_i^2 / 2, so:

     \frac{1}{t} \int_0^t R(s) ds = \frac{1}{t} \sum_{i=1}^{D(t)} \frac{X_i^2}{2} = \frac{1}{2} \cdot \frac{D(t)}{t} \cdot \frac{\sum_{i=1}^{D(t)} X_i^2}{D(t)}
     ⇒ \lim_{t→∞} \frac{1}{t} \int_0^t R(s) ds = \frac{1}{2} \cdot \lim_{t→∞} \frac{D(t)}{t} \cdot \lim_{t→∞} \frac{\sum_{i=1}^{D(t)} X_i^2}{D(t)}

   Ergodicity: long-term time averages equal steady-state averages (with probability 1):

     E[R] = \lim_{i→∞} E[R_i] = \lim_{t→∞} \frac{1}{t} \int_0^t R(s) ds
9. Average Residual Time (cont.)
   From the previous slide:

     \lim_{t→∞} \frac{1}{t} \int_0^t R(s) ds = \frac{1}{2} \cdot \lim_{t→∞} \frac{D(t)}{t} \cdot \lim_{t→∞} \frac{\sum_{i=1}^{D(t)} X_i^2}{D(t)}

   \lim_{t→∞} D(t)/t is the long-term average departure rate, which must equal the long-term average arrival rate; since long-term averages equal steady-state averages (with probability 1):

     \lim_{t→∞} \frac{D(t)}{t} = λ

   By the law of large numbers (noting D(t) → ∞ as t → ∞):

     \lim_{t→∞} \frac{\sum_{i=1}^{D(t)} X_i^2}{D(t)} = \lim_{n→∞} \frac{\sum_{i=1}^{n} X_i^2}{n} = E[X^2]

   Therefore the average residual time is

     E[R] = \frac{1}{2} λE[X^2]

   and the P-K formula follows:

     E[W] = \frac{E[R]}{1 - ρ} = \frac{λE[X^2]}{2(1 - ρ)}
10. P-K Formula
   P-K formula:

     E[W] = \frac{E[R]}{1 - ρ} = \frac{λE[X^2]}{2(1 - ρ)}

   Average time a customer spends in the system:

     E[T] = E[X] + E[W] = \frac{1}{μ} + \frac{λE[X^2]}{2(1 - ρ)}

   Average number of customers waiting for service:

     E[Q] = λE[W] = \frac{λ^2 E[X^2]}{2(1 - ρ)}

   Average number of customers in the system (waiting or in service):

     E[N] = λE[T] = ρ + \frac{λ^2 E[X^2]}{2(1 - ρ)}

   The averages E[W], E[T], E[Q], E[N] depend only on the first two moments of the service time.
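These formulas translate directly into a small calculator; a minimal sketch follows, with function and variable names of our own choosing:

```python
# Evaluate the P-K formula and the derived averages from the arrival rate
# and the first two moments of the service time.

def pk_averages(lam: float, EX: float, EX2: float) -> dict:
    """Return rho, E[W], E[T], E[Q], E[N] for a stable M/G/1 queue."""
    rho = lam * EX                      # utilization factor rho = lambda E[X]
    if rho >= 1:
        raise ValueError("unstable queue: rho >= 1")
    EW = lam * EX2 / (2 * (1 - rho))    # P-K formula
    ET = EX + EW                        # time in system
    return {"rho": rho, "E[W]": EW, "E[T]": ET,
            "E[Q]": lam * EW, "E[N]": lam * ET}

# Example: lambda = 0.8, exponential service with mean 1 (so E[X^2] = 2).
print(pk_averages(0.8, 1.0, 2.0))   # E[T] = 5.0, E[N] = 4.0, as for M/M/1
```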
11. P-K Formula: Examples
   M/D/1 queue: deterministic service times, all equal to 1/μ:

     E[X] = \frac{1}{μ},  E[X^2] = \frac{1}{μ^2}

     E[W] = \frac{λE[X^2]}{2(1 - ρ)} = \frac{ρ}{2μ(1 - ρ)},   E[Q] = \frac{λ^2 E[X^2]}{2(1 - ρ)} = \frac{ρ^2}{2(1 - ρ)}

     E[T] = \frac{1}{μ} + \frac{λE[X^2]}{2(1 - ρ)} = \frac{1}{μ} + \frac{ρ}{2μ(1 - ρ)} = \frac{2 - ρ}{2μ(1 - ρ)},   E[N] = λE[T] = \frac{ρ(2 - ρ)}{2(1 - ρ)}

   M/M/1 queue: exponential service times with mean 1/μ:

     E[X] = \frac{1}{μ},  E[X^2] = \frac{2}{μ^2}

     E[W] = \frac{λE[X^2]}{2(1 - ρ)} = \frac{ρ}{μ(1 - ρ)},   E[Q] = \frac{λ^2 E[X^2]}{2(1 - ρ)} = \frac{ρ^2}{1 - ρ}

     E[T] = \frac{1}{μ} + \frac{λE[X^2]}{2(1 - ρ)} = \frac{1}{μ} + \frac{ρ}{μ(1 - ρ)} = \frac{1}{μ - λ},   E[N] = λE[T] = \frac{λ}{μ - λ}
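Both examples can be checked by simulation. The sketch below uses the classical Lindley recursion W_{i+1} = max(0, W_i + X_i − T_{i+1}) for the FCFS waiting time, where T_{i+1} is an exponential inter-arrival time; the recursion itself is standard but not derived in these slides, and the parameters are our own:

```python
# Simulate the FCFS waiting time of an M/G/1 queue via the Lindley recursion
# and compare with the P-K predictions for deterministic and exponential service.
import random

def mean_wait(lam: float, draw_service, n: int = 500_000) -> float:
    random.seed(0)
    w, total = 0.0, 0.0
    for _ in range(n):
        total += w                     # accumulate W_i (warmup bias is small for large n)
        w = max(0.0, w + draw_service() - random.expovariate(lam))
    return total / n

lam, mu = 0.8, 1.0
rho = lam / mu
print("M/D/1 sim:", mean_wait(lam, lambda: 1.0 / mu))
print("M/D/1 P-K:", rho / (2 * mu * (1 - rho)))               # = 2.0
print("M/M/1 sim:", mean_wait(lam, lambda: random.expovariate(mu)))
print("M/M/1 P-K:", rho / (mu * (1 - rho)))                   # = 4.0
```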
12. Distribution Upon Arrival or Departure
   Theorem 1: For an M/G/1 queue at steady-state, the distribution of the number of customers seen by an arriving customer is the same as that left behind by a departing customer.
   Proof: Customers arrive one at a time and depart one at a time. Let:
   - A(t), D(t): number of arrivals and departures, respectively, in (0, t)
   - U_n(t): number of (n, n+1) transitions in (0, t) = number of arrivals that find the system at state n
   - V_n(t): number of (n+1, n) transitions in (0, t) = number of departures that leave the system at state n
   U_n(t) and V_n(t) differ by at most 1: when an (n, n+1) transition occurs, another (n, n+1) transition can occur only after the state has moved back to n, i.e., after an (n+1, n) transition has occurred.
   The stationary probability that an arriving customer finds the system at state n is

     α_n = \lim_{t→∞} P\{N(t) = n \mid \text{arrival at } t^+\}

   and α_n is the proportion of arrivals that find the system at state n:

     α_n = \lim_{t→∞} \frac{U_n(t)}{A(t)}

   Similarly, the stationary probability that a departing customer leaves the system at state n is

     β_n = \lim_{t→∞} \frac{V_n(t)}{D(t)}

   Noting that \lim_{t→∞} A(t)/t = \lim_{t→∞} D(t)/t = λ, and that U_n(t)/t and V_n(t)/t must have the same limit:

     \lim_{t→∞} \frac{U_n(t)}{t} = \lim_{t→∞} \frac{V_n(t)}{t} ⇒ \lim_{t→∞} \frac{U_n(t)}{A(t)} \lim_{t→∞} \frac{A(t)}{t} = \lim_{t→∞} \frac{V_n(t)}{D(t)} \lim_{t→∞} \frac{D(t)}{t} ⇒ α_n = β_n
13. Distribution Upon Arrival or Departure (cont.)
   Theorem 2: For an M/G/1 queue at steady-state, the probability that an arriving customer finds n customers in the system equals the proportion of time that there are n customers in the system. Therefore, the distribution seen by an arriving customer is identical to the stationary distribution.
   Proof: Identical to the PASTA theorem, which applies due to:
   - Poisson arrivals
   - Lack of anticipation: future arrivals are independent of the current state N(t)
   Theorem 3: For an M/G/1 queue at steady-state, the system appears statistically identical to an arriving and to a departing customer. Both, at steady-state, see a system that is statistically identical to the one seen by an observer looking at the system at an arbitrary time (see the simulation sketch below).
   Analysis of the M/G/1 queue:
   - Consider the embedded Markov chain obtained by observing the system at departure epochs
   - At steady-state, the embedded Markov chain and {N(t)} are statistically identical
   - The stationary distribution p_n equals the stationary distribution of the embedded Markov chain
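As a numerical illustration of Theorems 1-3, the sketch below simulates an M/M/1 queue (a special case of M/G/1, chosen because its exponential clocks make the simulation trivial) and compares the distribution seen by arrivals, the distribution left behind by departures, and the time-average distribution; all three should agree. Setup and names are our own:

```python
# Compare three distributions of the number in system for an M/M/1 queue:
# found by arrivals, left behind by departures, and time-averaged.
import random

random.seed(2)
lam, mu, horizon = 0.8, 1.0, 200_000.0
t, n = 0.0, 0
arr_seen, dep_left, time_in = {}, {}, {}
while t < horizon:
    # Competing exponential clocks; memorylessness justifies resampling each step.
    ta = random.expovariate(lam)
    td = random.expovariate(mu) if n > 0 else float("inf")
    dt = min(ta, td)
    time_in[n] = time_in.get(n, 0.0) + dt      # time spent with n in system
    t += dt
    if ta < td:
        arr_seen[n] = arr_seen.get(n, 0) + 1   # state found by this arrival
        n += 1
    else:
        n -= 1
        dep_left[n] = dep_left.get(n, 0) + 1   # state left by this departure

for k in range(4):                             # all three columns ~ (1-rho) rho^k
    print(k,
          round(arr_seen.get(k, 0) / sum(arr_seen.values()), 3),
          round(dep_left.get(k, 0) / sum(dep_left.values()), 3),
          round(time_in.get(k, 0.0) / t, 3))
```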
14. Embedded Markov Chain
   Let s_j be the time of the jth departure and L_j = N(s_j) the number of customers left behind by the jth departing customer. We show that {L_j : j ≥ 1} is a Markov chain.
   - If L_{j-1} ≥ 1: customer j enters service immediately at time s_{j-1}. Then:

       L_j = L_{j-1} - 1 + A_j,  if L_{j-1} ≥ 1

   - If L_{j-1} = 0: customer j arrives after time s_{j-1} and departs at time s_j. Then:

       L_j = A_j,  if L_{j-1} = 0

   Combining the two cases:

     L_j = L_{j-1} + A_j - 1\{L_{j-1} > 0\}

   where A_j is the number of arrivals during the service time X_j:

     P\{A_j = k\} = \int_0^∞ P\{A_j = k \mid X_j = t\} f_X(t) dt = \frac{1}{k!} \int_0^∞ e^{-λt} (λt)^k f_X(t) dt

   A_1, A_2, ... are independent, being arrivals in disjoint intervals. L_j depends on the past only through L_{j-1}; thus {L_j : j ≥ 1} is a Markov chain.
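For concreteness, the distribution of A_j can be evaluated for exponential service, where the integral has the closed form a_k = (μ/(λ+μ))(λ/(λ+μ))^k, a geometric distribution (a standard computation, not shown on the slide). The crude numerical integration below is only for illustration:

```python
# Numeric check of P{A = k} = (1/k!) Int_0^inf e^{-lam t} (lam t)^k f_X(t) dt
# for exponential service f_X(t) = mu e^{-mu t}, against the geometric closed form.
import math

lam, mu = 0.8, 1.0

def a_numeric(k: int, dt: float = 1e-3, tmax: float = 60.0) -> float:
    """Midpoint Riemann sum of the defining integral (illustrative accuracy only)."""
    s, t = 0.0, dt / 2
    while t < tmax:
        s += math.exp(-lam * t) * (lam * t) ** k * mu * math.exp(-mu * t) * dt
        t += dt
    return s / math.factorial(k)

for k in range(4):
    closed = (mu / (lam + mu)) * (lam / (lam + mu)) ** k
    print(k, round(a_numeric(k), 5), round(closed, 5))
```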
15. Number of Arrivals During a Service Time
   A_1, A_2, ... are iid; we drop the index j, which is equivalent to considering the system at steady-state:

     a_k = P\{A = k\} = \int_0^∞ P\{A = k \mid X = t\} f_X(t) dt = \frac{1}{k!} \int_0^∞ e^{-λt} (λt)^k f_X(t) dt,  k = 0, 1, ...

   We find the first two moments of A.
   Proposition: For the number of arrivals A during service time X:

     E[A] = λE[X] = ρ,   E[A^2] = λ^2 E[X^2] + λE[X]

   Proof: Given X = t, the number of arrivals A follows the Poisson distribution with parameter λt. Hence:

     E[A] = \int_0^∞ E[A \mid X = t] f_X(t) dt = \int_0^∞ λt f_X(t) dt = λ \int_0^∞ t f_X(t) dt = λE[X]

     E[A^2] = \int_0^∞ E[A^2 \mid X = t] f_X(t) dt = \int_0^∞ (λ^2 t^2 + λt) f_X(t) dt = λ^2 \int_0^∞ t^2 f_X(t) dt + λ \int_0^∞ t f_X(t) dt = λ^2 E[X^2] + λE[X]

   Lemma: Let Y be a RV following the Poisson distribution with parameter α > 0. Then:

     E[Y] = \sum_{k=1}^∞ k e^{-α} \frac{α^k}{k!} = α,   E[Y^2] = \sum_{k=1}^∞ k^2 e^{-α} \frac{α^k}{k!} = α^2 + α
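A Monte Carlo check of the proposition; taking the service time X uniform on (0, 2) is our own arbitrary choice, purely for illustration (so E[X] = 1 and E[X^2] = 4/3):

```python
# Check E[A] = lam E[X] and E[A^2] = lam^2 E[X^2] + lam E[X] by sampling
# X ~ Uniform(0, 2) and then A ~ Poisson(lam * X).
import math
import random

random.seed(3)
lam, trials = 0.8, 200_000

def poisson(rate: float) -> int:
    """Knuth's method: multiply uniforms until the product drops below e^{-rate}."""
    limit, k, p = math.exp(-rate), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

s1 = s2 = 0
for _ in range(trials):
    a = poisson(lam * random.uniform(0.0, 2.0))   # arrivals during one service time
    s1 += a
    s2 += a * a

print("E[A]   sim:", s1 / trials, " theory:", lam * 1.0)                     # 0.8
print("E[A^2] sim:", s2 / trials, " theory:", lam**2 * (4 / 3) + lam * 1.0)  # ~1.653
```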
16. Embedded Markov Chain
   [Figure: state transition diagram of the embedded chain; from state i ≥ 1 the chain moves to states i-1, i, i+1, i+2, ... with probabilities α_0, α_1, α_2, α_3, ...]
   Transition probabilities P_{ij} = P\{L_{n+1} = j \mid L_n = i\}:

     P_{0j} = α_j,  j = 0, 1, ...
     P_{ij} = α_{j-i+1} if j ≥ i-1, and P_{ij} = 0 if j < i-1,  for i ≥ 1

   Stationary distribution π_j = \lim_{n→∞} P\{L_n = j\}, satisfying π = πP with

     P = \begin{pmatrix} α_0 & α_1 & α_2 & \cdots \\ α_0 & α_1 & α_2 & \cdots \\ 0 & α_0 & α_1 & \cdots \\ \vdots & & & \ddots \end{pmatrix}

   that is:

     π_0 = π_0 α_0 + π_1 α_0
     π_j = π_0 α_j + π_1 α_j + π_2 α_{j-1} + ... + π_j α_1 + π_{j+1} α_0,  j ≥ 1

   These equations have a unique solution: π_j is the fraction of departing customers that leave j customers behind. From Theorem 3, π_j is also the proportion of time that there are j customers in the system.
17. Calculating the Stationary Distribution
   Applying Little's theorem to the server alone, the proportion of time the server is busy is

     1 - π_0 = λE[X] = ρ  ⇒  π_0 = 1 - ρ

   The stationary distribution can then be calculated iteratively from the balance equations:

     π_0 = π_0 α_0 + π_1 α_0
     π_1 = π_0 α_1 + π_1 α_1 + π_2 α_0
     ...

   but the iterative calculation might be prohibitively involved (a sketch is given after this slide). Often we want to find only the first few moments of the distribution, e.g., E[N] and E[N^2]. We will present a general methodology based on z-transforms that can be used to:
   1. Find the moments of the stationary distribution without calculating the distribution itself
   2. Find the stationary distribution, in special cases
   3. Derive approximations of the stationary distribution
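A sketch of the iterative calculation: rearranging the balance equation for π_j gives π_{j+1} = (π_j − π_0 α_j − \sum_{i=1}^{j} π_i α_{j-i+1}) / α_0. For exponential service, where a_k is geometric as computed earlier, the iteration should reproduce the M/M/1 distribution π_j = (1−ρ)ρ^j:

```python
# Iteratively solve pi = pi P for the embedded chain, starting from pi_0 = 1 - rho,
# using the geometric a_k of exponential service as a test case.
lam, mu = 0.8, 1.0
rho = lam / mu

def a(k: int) -> float:
    """a_k for exponential service: geometric with ratio lam/(lam+mu)."""
    return (mu / (lam + mu)) * (lam / (lam + mu)) ** k

K = 6
pi = [1.0 - rho]                        # pi_0 = 1 - rho (Little's theorem)
pi.append(pi[0] * (1 - a(0)) / a(0))    # from pi_0 = (pi_0 + pi_1) a_0
for j in range(1, K):
    s = pi[0] * a(j) + sum(pi[i] * a(j - i + 1) for i in range(1, j + 1))
    pi.append((pi[j] - s) / a(0))       # solve the j-th balance equation for pi_{j+1}

for j, p in enumerate(pi):
    print(j, round(p, 5), round((1 - rho) * rho**j, 5))   # columns should match
```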
18. Moment Generating Functions
   Definition: The moment generating function of a random variable X is, for any t ∈ R:

     M_X(t) = E[e^{tX}] = \int_{-∞}^{∞} e^{tx} f_X(x) dx,  X continuous
     M_X(t) = E[e^{tX}] = \sum_j e^{t x_j} P\{X = x_j\},  X discrete

   Theorem 1: If the moment generating function M_X(t) exists and is finite in some neighborhood of t = 0, it determines the distribution (pdf or pmf) of X uniquely.
   Theorem 2: For any positive integer n:
   1. \frac{d^n}{dt^n} M_X(t) = E[X^n e^{tX}]
   2. \frac{d^n}{dt^n} M_X(0) = E[X^n]
   Theorem 3: If X and Y are independent random variables:

     M_{X+Y}(t) = M_X(t) M_Y(t)
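Theorem 2 can be illustrated symbolically. The sketch below uses sympy and the exponential distribution, whose MGF M_X(t) = μ/(μ−t) for t < μ is standard (not stated on the slide); repeated differentiation at t = 0 yields E[X^n] = n!/μ^n:

```python
# Moments of an exponential random variable from its MGF via Theorem 2.
import sympy as sp

t, mu = sp.symbols("t mu", positive=True)
M = mu / (mu - t)                 # MGF of Exponential(mu), valid for t < mu

for n in (1, 2, 3):
    moment = sp.simplify(sp.diff(M, t, n).subs(t, 0))   # d^n/dt^n M_X(0) = E[X^n]
    print(f"E[X^{n}] =", moment)  # 1/mu, 2/mu**2, 6/mu**3
```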
19. Z-Transforms of Discrete Random Variables
   For a discrete random variable, the moment generating function is a polynomial in e^t. It is more convenient to set z = e^t and define the z-transform (or characteristic function):

     G_X(z) = E[z^X] = \sum_j z^{x_j} P\{X = x_j\}

   Let X be a discrete random variable taking values 0, 1, 2, ..., and let p_n = P\{X = n\}. The z-transform is well-defined for |z| < 1:

     G_X(z) = p_0 + p_1 z + p_2 z^2 + p_3 z^3 + ... = \sum_{n=0}^{∞} p_n z^n

   The z-transform uniquely determines the distribution of X. If X and Y are independent random variables:

     G_{X+Y}(z) = G_X(z) G_Y(z)

   Calculating factorial moments:

     \lim_{z→1} G'_X(z) = \lim_{z→1} \sum_{n=1}^{∞} n p_n z^{n-1} = \sum_{n=1}^{∞} n p_n = E[X]

     \lim_{z→1} G''_X(z) = \lim_{z→1} \sum_{n=2}^{∞} n(n-1) p_n z^{n-2} = \sum_{n=2}^{∞} n(n-1) p_n = E[X(X-1)]

   Higher factorial moments can be calculated similarly.
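A symbolic illustration of the factorial-moment calculation, using the Poisson distribution, whose z-transform G(z) = e^{α(z−1)} is standard (not stated on the slide); this also recovers the Lemma of slide 15:

```python
# Factorial moments of a Poisson(alpha) variable from its z-transform.
import sympy as sp

z, alpha = sp.symbols("z alpha", positive=True)
G = sp.exp(alpha * (z - 1))                  # z-transform of Poisson(alpha)

EY = sp.diff(G, z).subs(z, 1)                # G'(1) = E[Y]
EY_fact2 = sp.diff(G, z, 2).subs(z, 1)       # G''(1) = E[Y(Y-1)]
print("E[Y]   =", EY)                        # alpha
print("E[Y^2] =", sp.expand(EY_fact2 + EY))  # alpha**2 + alpha
```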
  20. Continuous Random Variables 10-20 Distribution Prob. Density Fun. Moment Gen. Fun. Mean Variance (parameters) fX (x) MX (t) E [X ] Var(X ) (b¡a)2 etb ¡eta 1 a+b Uniform over b¡a t(b¡a) 2 12 (a; b) a