
Consider a Markov chain with state space {0, 1, 2, 3} and a
transition matrix


so P0,3 = 3/5 is the probability of moving from
state 0 to state 3.

(a) Find the stationary distribution of the Markov chain.
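
Since the transition matrix itself is not reproduced above, here is a minimal numerical sketch of how (a) could be computed, assuming a stand-in 4x4 row-stochastic matrix P (chosen only so that its (0, 3) entry equals 3/5; it is not the problem's actual matrix). The stationary distribution satisfies pi P = pi with its entries summing to 1, which can be solved as a small linear system:

```python
import numpy as np

# Stand-in transition matrix: NOT the matrix from the problem (which is not
# reproduced above); chosen only so that P[0, 3] = 3/5 matches the statement.
P = np.array([
    [0.0,  0.2,  0.2,  0.6],
    [0.25, 0.25, 0.25, 0.25],
    [0.1,  0.4,  0.4,  0.1],
    [0.5,  0.3,  0.1,  0.1],
])

# (a) The stationary distribution pi satisfies pi @ P = pi and sum(pi) = 1.
# Stack (P^T - I) pi = 0 with the normalization row and solve by least squares.
A = np.vstack([P.T - np.eye(4), np.ones((1, 4))])
b = np.concatenate([np.zeros(4), [1.0]])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print("stationary distribution:", pi)
```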

(b) Find the probability of being in state 3 after 32 steps
if the chain begins at state 0.
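
Continuing the same sketch (same stand-in P and import as above): the quantity in (b) is the (0, 3) entry of the 32-step transition matrix, i.e. of P raised to the 32nd power.

```python
# (b) Probability of being in state 3 after 32 steps, starting from state 0:
# this is the (0, 3) entry of the 32-step transition matrix P^32.
P32 = np.linalg.matrix_power(P, 32)
print("P^32[0, 3] =", P32[0, 3])
```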

(c) Find the probability of being in state 3 after 128 steps
if the chain begins at a state chosen uniformly at random from the four states.
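
For (c) only the initial distribution changes: start from the uniform distribution (1/4, 1/4, 1/4, 1/4), advance it 128 steps, and read off the probability of state 3 (still with the stand-in P from the sketch above).

```python
# (c) Start from the uniform distribution over the four states, take 128 steps,
# and read off the probability of state 3.
mu0 = np.full(4, 0.25)
mu128 = mu0 @ np.linalg.matrix_power(P, 128)
print("P(state 3 after 128 steps) =", mu128[3])
```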

(d) Suppose that the chain begins in state 0. What is the
smallest value of t for which max_s |P^t_{0,s} - π̄_s| ≤ 0.01? Here π̄ is the
stationary distribution.
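
For (d), one can step the chain forward from state 0 and stop at the first t where every coordinate of the current distribution is within 0.01 of the stationary distribution found in (a). A sketch reusing P and pi from above (the numerical answer of course depends on the problem's actual matrix, not the stand-in):

```python
# (d) Smallest t with max_s |P^t_{0,s} - pi_s| <= 0.01, starting from state 0.
eps = 0.01
mu = np.array([1.0, 0.0, 0.0, 0.0])   # point mass on state 0
t = 0
while np.max(np.abs(mu - pi)) > eps:
    mu = mu @ P                        # one more step of the chain
    t += 1
print("smallest t with max-distance <= 0.01:", t)
```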