A Markov chain {X0, X1, ...} has state space Z = {0, 1, 2} and transition matrix
(1) Determine P(X2 = 2 | X1 = 0, X0 = 1) and P(X2 = 2, X1 = 0 | X0 = 1) (see the note after part (3)).
(2) Determine P(X2 = 2, X1 = 0 | X0 = 0) and, for n > 1,
(3) Assuming the initial distribution
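Note: the transition matrix itself (and the continuations of parts (2) and (3)) did not survive extraction, so no numerical answers can be given here. As a sketch for part (1) and the analogous quantity in part (2), the display below records how these probabilities factor under the Markov property and time-homogeneity, writing p_{ij} for the one-step transition probability from state i to state j; the p_{ij} are entries of the missing matrix and are left symbolic.

\[
P(X_2 = 2 \mid X_1 = 0, X_0 = 1) = P(X_2 = 2 \mid X_1 = 0) = p_{02}
\]
\[
P(X_2 = 2, X_1 = 0 \mid X_0 = 1) = P(X_1 = 0 \mid X_0 = 1)\,P(X_2 = 2 \mid X_1 = 0) = p_{10}\,p_{02}
\]
\[
P(X_2 = 2, X_1 = 0 \mid X_0 = 0) = P(X_1 = 0 \mid X_0 = 0)\,P(X_2 = 2 \mid X_1 = 0) = p_{00}\,p_{02}
\]

The first identity is the Markov property; the second and third combine the multiplication rule with the Markov property, and time-homogeneity is what would let the same factorization be reused for a general index n in part (2).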