A Markov chain has three states, A, B, and C. The probability of going from state A to state B in one trial is .2, the probability of going from state A to state C in one trial is .5, the probability of going from state B to state A in one trial is .8, the probability of going from state B to state C in one trial is .2, the probability of going from state C to state A in one trial is .1, and the probability of going from state C to state B in one trial is .3. Draw a transition diagram and write a transition matrix for this chain.
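As a sketch of the kind of answer expected (ordering the states A, B, C and using the fact that each row of a transition matrix must sum to 1, so the "stay in the same state" probabilities are the remaining amounts), the matrix would be:

\[
P =
\begin{pmatrix}
.3 & .2 & .5 \\
.8 & 0 & .2 \\
.1 & .3 & .6
\end{pmatrix}
\]

Here row i gives the probabilities of moving from state i to states A, B, C in one trial; for example, the first row reads A→A = .3, A→B = .2, A→C = .5, with .3 obtained as 1 − .2 − .5. The transition diagram would show three nodes labeled A, B, C with an arrow for each nonzero entry of P, labeled by that probability.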