11.3.4 Solved Problems
Problem
Consider a continuous-time Markov chain $X(t)$ with the jump chain shown in Figure 11.25. Assume $\lambda_1=2$, $\lambda_2=3$, and $\lambda_3=4$.
- Find the stationary distribution of the jump chain $\tilde{\pi}=\big[ \tilde{\pi}_1, \tilde{\pi}_2, \tilde{\pi}_3 \big]$.
- Using $\tilde{\pi}$, find the stationary distribution for $X(t)$.
Note: Although this jump chain has self-transitions, you can still use the discussed methods. In fact, $X(t)$ can be equivalently represented by a jump chain with no self-transitions along with appropriate holding times.
- Solution
- To find the stationary distribution of the jump chain, $\tilde{\pi}=\big[ \tilde{\pi}_1, \tilde{\pi}_2, \tilde{\pi}_3 \big]$, we need to solve \begin{align*} & \tilde{\pi}_1 =\frac{1}{2} \tilde{\pi}_3, \\ & \tilde{\pi}_2 =\frac{1}{2} \tilde{\pi}_1+\frac{1}{3} \tilde{\pi}_2\\ & \tilde{\pi}_3 =\frac{1}{2} \tilde{\pi}_1+\frac{2}{3} \tilde{\pi}_2+\frac{1}{2} \tilde{\pi}_3, \\ & \tilde{\pi}_1+\tilde{\pi}_2+\tilde{\pi}_3=1. \end{align*} We find \begin{align*} \tilde{\pi}_1 =\frac{4}{15}, \; \tilde{\pi}_2=\frac{3}{15}, \; \tilde{\pi}_3=\frac{8}{15}. \end{align*}
- We have obtained $$\tilde{\pi}=\frac{1}{15} [4, \; 3, \; 8].$$ We can find the limiting distribution of $X(t)$ using \begin{align*} \pi_j=\frac{\frac{\tilde{\pi}_j}{\lambda_j}}{\sum_{k \in S} \frac{\tilde{\pi}_k}{\lambda_k}}. \end{align*} We obtain \begin{align*} \pi_1&=\frac{\frac{\tilde{\pi}_1}{\lambda_1}}{\frac{\tilde{\pi}_1}{\lambda_1}+\frac{\tilde{\pi}_2}{\lambda_2}+\frac{\tilde{\pi}_3}{\lambda_3}}\\ &=\frac{\frac{4}{2}}{\frac{4}{2}+\frac{3}{3}+\frac{8}{4}}\\ &=\frac{2}{5}. \end{align*} \begin{align*} \pi_2&=\frac{\frac{\tilde{\pi}_2}{\lambda_2}}{\frac{\tilde{\pi}_1}{\lambda_1}+\frac{\tilde{\pi}_2}{\lambda_2}+\frac{\tilde{\pi}_3}{\lambda_3}}\\ &=\frac{\frac{3}{3}}{\frac{4}{2}+\frac{3}{3}+\frac{8}{4}}\\ &=\frac{1}{5}. \end{align*} \begin{align*} \pi_3&=\frac{\frac{\tilde{\pi}_3}{\lambda_3}}{\frac{\tilde{\pi}_1}{\lambda_1}+\frac{\tilde{\pi}_2}{\lambda_2}+\frac{\tilde{\pi}_3}{\lambda_3}}\\ &=\frac{\frac{8}{4}}{\frac{4}{2}+\frac{3}{3}+\frac{8}{4}}\\ &=\frac{2}{5}. \end{align*} Thus, we conclude that $\pi=\frac{1}{5}[2, 1, 2]$ is the limiting distribution of $X(t)$.
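As a check, the two steps above (solving $\tilde{\pi} P = \tilde{\pi}$ for the jump chain, then rescaling by the holding-time rates) can be sketched numerically. This is a minimal sketch assuming NumPy; the matrix $P$ is read off the balance equations above, since Figure 11.25 is not reproduced here.

```python
import numpy as np

# Jump-chain transition matrix implied by the balance equations above
# (self-transitions included).
P = np.array([[0.0, 1/2, 1/2],
              [0.0, 1/3, 2/3],
              [1/2, 0.0, 1/2]])

# Solve pi_tilde P = pi_tilde together with sum(pi_tilde) = 1:
# stack (P^T - I) with a normalization row and use least squares.
A = np.vstack([P.T - np.eye(3), np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi_tilde, *_ = np.linalg.lstsq(A, b, rcond=None)

# Convert to the stationary distribution of X(t) by dividing by the
# holding-time rates and renormalizing.
lam = np.array([2.0, 3.0, 4.0])
pi = (pi_tilde / lam) / np.sum(pi_tilde / lam)

print(pi_tilde)   # ~ [4/15, 3/15, 8/15]
print(pi)         # ~ [2/5, 1/5, 2/5]
```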
Problem
Consider a continuous-time Markov chain $X(t)$ that has the jump chain shown in Figure 11.26 (this is the same Markov chain given in Example 11.19). Assume $\lambda_1=2$, $\lambda_2=1$, and $\lambda_3=3$.
- Find the generator matrix for this chain.
- Find the limiting distribution for $X(t)$ by solving $\pi G=0$.
- Solution
- The jump chain is irreducible and the transition matrix of the jump chain is given by \begin{equation} \nonumber P = \begin{bmatrix} 0 & 1 & 0 \\[5pt] 0 & 0 & 1\\[5pt] \frac{1}{2} & \frac{1}{2} & 0 \\[5pt] \end{bmatrix}. \end{equation} The generator matrix can be obtained using \begin{align} \nonumber g_{ij} = \left\{ \begin{array}{l l} \lambda_i p_{ij} & \quad \textrm{ if }i \neq j \\ & \quad \\ -\lambda_i & \quad \textrm{ if }i = j \end{array} \right. \end{align} We obtain \begin{equation} \nonumber G = \begin{bmatrix} -2 & 2 & 0 \\[5pt] 0 & -1 & 1\\[5pt] \frac{3}{2} & \frac{3}{2} & -3 \\[5pt] \end{bmatrix}. \end{equation} Solving $$\pi G=0, \quad \textrm{and} \quad \pi_1+\pi_2+\pi_3=1$$ we obtain $\pi=\frac{1}{19}[3, 12, 4]$, which is the same answer that we obtained in Example 11.19.
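The same computation can be sketched numerically: build $G$ from $P$ and the holding-time rates using $g_{ij}=\lambda_i p_{ij}$ ($i \neq j$), $g_{ii}=-\lambda_i$, then solve $\pi G=0$ with the normalization constraint. A minimal sketch assuming NumPy:

```python
import numpy as np

lam = np.array([2.0, 1.0, 3.0])    # holding-time rates lambda_1, lambda_2, lambda_3
P = np.array([[0.0, 1.0, 0.0],     # jump-chain transition matrix from above
              [0.0, 0.0, 1.0],
              [0.5, 0.5, 0.0]])

# g_ij = lambda_i * p_ij for i != j, and g_ii = -lambda_i.
G = lam[:, None] * P - np.diag(lam)

# Solve pi G = 0 together with sum(pi) = 1 via least squares.
A = np.vstack([G.T, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

print(pi)   # ~ [3/19, 12/19, 4/19]
```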
Problem
(A queuing system) Suppose that customers arrive according to a Poisson process with rate $\lambda$ at a service center that has a single server. Customers are served one at a time in order of arrival. Service times are assumed to be i.i.d. $Exponential (\mu)$ random variables and independent of the arrival process. Customers leave the system after being served. Our goal in this problem is to model the above system as a continuous-time Markov chain. Let $X(t)$ be the number of customers in the system at time $t$, so the state space is $S=\{0, 1, 2, \cdots \}$. Assume $i>0$. If the system is in state $i$ at time $t$, then the next state would either be $i+1$ (if a new customer arrives) or state $i-1$ (if a customer leaves).
- Suppose that the system is in state $0$, so there are no customers in the system and the next transition will be to state $1$. Let $T_0$ be the time until the next transition. Show that $T_0 \sim Exponential (\lambda)$.
- Suppose that the system is currently in state $i$, where $i>0$. Let $T_i$ be the time until the next transition. Show that $T_i \sim Exponential (\lambda+\mu)$.
- Suppose that the system is at state $i$. Find the probability that the next transition will be to state $i+1$.
- Draw the jump chain, and provide the holding time parameters $\lambda_i$.
- Find the generator matrix.
- Draw the transition rate diagram.
- Solution
- Note that to solve this problem, we use several results from the Poisson process section. In particular, you might want to review merging and splitting of Poisson processes before reading the solution to this problem.
- If there are no customers in the system, the next transition occurs when a new customer arrives. Since the customers arrive according to a Poisson process, and the interarrival times in the Poisson process have $Exponential (\lambda)$ distribution, we conclude $T_0 \sim Exponential (\lambda)$.
- Suppose that the system is in state $i$, where $i>0$. Thus, there is a customer being served. We assume the service times have $Exponential (\mu)$ distribution. The next transition occurs either when a new customer arrives, or when the service of the current customer ends. Thus, we can express $T_i$ as \begin{align*} T_i=\min(X,Y), \end{align*} where $X \sim Exponential(\lambda)$ and $Y \sim Exponential(\mu)$, and $X$ and $Y$ are independent. We claim that $T_i \sim Exponential (\lambda+\mu)$. One way to see this is as follows. Here, you can imagine two independent Poisson processes. The first one is the customer arrival process. The second one is the process that has interarrival times equal to the service times. Now, the merged process has rate $\lambda+\mu$. Since $T_i$ can be thought of as the first arrival in the merged process, we conclude that $T_i \sim Exponential (\lambda+\mu)$.
- Suppose that the system is in state $i$. We would like to find the probability that the next transition will be to state $i+1$, denoted by $p_{i,i+1}$. Again consider the two Poisson processes defined above. We can model this system as follows. Each arrival in the merged process is of type 1 (customer arrival) with probability $\frac{\lambda}{\lambda+\mu}$, or of type 2 (customer departure) with probability $\frac{\mu}{\lambda+\mu}$. The probability $p_{i,i+1}$ is the probability that the first arrival in the merged process is of type 1. This happens with probability $\frac{\lambda}{\lambda+\mu}$, so we conclude \begin{align*} p_{i,i+1}&=\frac{\lambda}{\lambda+\mu},\\ p_{i,i-1}&=1-p_{i,i+1}=\frac{\mu}{\lambda+\mu}. \end{align*}
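Both claims, that $\min(X,Y) \sim Exponential(\lambda+\mu)$ and that $P(X<Y)=\frac{\lambda}{\lambda+\mu}$, can be checked by simulation. A minimal sketch using only the Python standard library; the rates $\lambda=2$, $\mu=3$ are illustrative values chosen for this check, not part of the problem.

```python
import random

random.seed(0)
lam, mu, n = 2.0, 3.0, 200_000   # illustrative rates and sample size

mins, wins = [], 0
for _ in range(n):
    x = random.expovariate(lam)  # time until the next arrival
    y = random.expovariate(mu)   # remaining service time (memoryless)
    mins.append(min(x, y))       # time until the next transition
    wins += x < y                # next transition goes to state i+1

print(sum(mins) / n)   # ~ 1/(lam + mu) = 0.2, the mean of Exponential(lam + mu)
print(wins / n)        # ~ lam/(lam + mu) = 0.4
```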
- From the above, we can draw the jump chain as in Figure 11.27.
- The generator matrix can be obtained using \begin{align} \nonumber g_{ij} = \left\{ \begin{array}{l l} \lambda_i p_{ij} & \quad \textrm{ if }i \neq j \\ & \quad \\ -\lambda_i & \quad \textrm{ if }i = j \end{array} \right. \end{align} We obtain \begin{equation} \nonumber G = \begin{bmatrix} -\lambda & \lambda & 0 & 0 & \cdots \\[5pt] \mu & -(\mu+\lambda) & \lambda & 0 & \cdots \\[5pt] 0 & \mu & -(\mu+\lambda) & \lambda & \cdots \\[5pt] \vdots & \vdots & \vdots & \vdots \end{bmatrix}. \end{equation}
- Remember that in the transition rate diagram, the values $g_{ij}$ are shown on the edges (the values of $g_{ii}$'s are not usually shown). The transition rate diagram for this chain is shown in Figure 11.28.
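The infinite generator above can also be explored numerically by truncating the chain at a finite number of states and solving $\pi G = 0$. For $\lambda < \mu$ the resulting stationary distribution is (up to truncation error) geometric, $\pi_i = (1-\rho)\rho^i$ with $\rho = \lambda/\mu$, the classical M/M/1 result. A sketch assuming NumPy; the rates $\lambda=1$, $\mu=2$ and the truncation level are illustrative choices.

```python
import numpy as np

lam, mu, N = 1.0, 2.0, 50   # illustrative rates; truncate the chain to N states

# Generator of the birth-death chain on {0, ..., N-1}: rate lam up, mu down.
G = np.zeros((N, N))
for i in range(N - 1):
    G[i, i + 1] = lam       # arrival:   i -> i+1
    G[i + 1, i] = mu        # departure: i+1 -> i
for i in range(N):
    G[i, i] = -G[i].sum()   # diagonal makes each row sum to zero

# Solve pi G = 0 with sum(pi) = 1 via least squares.
A = np.vstack([G.T, np.ones(N)])
b = np.zeros(N + 1)
b[-1] = 1.0
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

# Compare with the geometric distribution (1 - rho) * rho**i.
rho = lam / mu
geometric = (1 - rho) * rho ** np.arange(N)
```

With $\rho = 1/2$ the truncation error $\rho^N$ is negligible, so the two vectors agree to numerical precision.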
Source: https://www.probabilitycourse.com/chapter11/11_3_4_solved_probs.php