A Markov chain X0, X1, X2, ... has the transition probability matrix P [matrix not reproduced in the extracted text] and is known to start in state X0 = 0. Eventually, the process will end up in state 2. What is the probability that, when the process moves into state 2, it does so from state 1?
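
Since the matrix itself did not survive in the text above, here is how a question of this type is answered in general, with a sketch under an assumed matrix. Label the states 0, 1, 2, with 2 absorbing, and let z_i denote the probability that the transition which first brings the chain into state 2 is made from state 1, given X0 = i. Conditioning on the first step gives the linear system

z0 = P00*z0 + P01*z1
z1 = P10*z0 + P11*z1 + P12

and the requested probability is z0. The Python sketch below solves this system; the numerical matrix P in it is an illustrative placeholder, not the one from the original problem.

```python
import numpy as np

# Placeholder 3-state transition matrix (rows sum to 1, state 2 absorbing).
# The matrix from the original problem is not reproduced above, so these
# entries are assumed values for illustration; substitute the actual P.
P = np.array([
    [0.3, 0.2, 0.5],
    [0.5, 0.1, 0.4],
    [0.0, 0.0, 1.0],
])

# First-step analysis: z[i] = P(chain enters state 2 from state 1 | X0 = i).
#   z0 = P00*z0 + P01*z1            (a direct jump 0 -> 2 contributes 0)
#   z1 = P10*z0 + P11*z1 + P12      (a jump 1 -> 2 contributes 1)
# In matrix form: (I - Q) z = b, with Q the transient block of P.
Q = P[:2, :2]                  # transitions among the transient states {0, 1}
b = np.array([0.0, P[1, 2]])   # probability collected on the jump 1 -> 2
z = np.linalg.solve(np.eye(2) - Q, b)

print(f"P(enters state 2 from state 1 | X0 = 0) = {z[0]:.4f}")
```

With the placeholder matrix above this prints 0.1509 (= 8/53); plugging in the matrix from the original problem in place of the assumed P yields the intended answer.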