Markov Chains

The following is the transition probability matrix of a Markov chain with states 1, 2, 3, 4:

P = ( 0.40  0.30  0.20  0.10 )
    ( 0.20  0.20  0.20  0.40 )
    ( 0.25  0.25  0.50  0.00 )
    ( 0.20  0.10  0.40  0.30 )


If X0 = 1,
(a) find the probability that state 3 is entered before state 4;
(b) find the mean number of transitions until either state 3 or state 4 is entered.

Solution

(a) Let p_i be the probability that state 3 is entered before state 4, given that the chain is currently in state i, for i = 1, 2. Conditioning on the first transition (first-step analysis) gives

p1 = 0.4 p1 + 0.3 p2 + 0.2
p2 = 0.2 p1 + 0.2 p2 + 0.2

The second equation gives p2 = (p1 + 1)/4; substituting into the first yields 21 p1 = 11, so

p1 = 11/21 ≈ 0.524.

(b) Let m_i be the expected number of transitions until the chain enters state 3 or state 4, starting from state i, for i = 1, 2. Each step costs 1 transition, so conditioning on the first transition gives

m1 = 1 + 0.4 m1 + 0.3 m2
m2 = 1 + 0.2 m1 + 0.2 m2

The second equation gives m2 = (m1 + 5)/4; substituting into the first yields 21 m1 = 55, so

m1 = 55/21 ≈ 2.62.
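
As a quick numerical check, the following is a minimal Python sketch (using numpy, which is not part of the original problem) that solves the two 2x2 linear systems above in matrix form, p = (I - Q)^(-1) b and m = (I - Q)^(-1) 1, and then cross-checks both answers by Monte Carlo simulation:

import numpy as np

# Transition matrix from the problem (states 1-4 -> indices 0-3).
P = np.array([
    [0.40, 0.30, 0.20, 0.10],
    [0.20, 0.20, 0.20, 0.40],
    [0.25, 0.25, 0.50, 0.00],
    [0.20, 0.10, 0.40, 0.30],
])

Q = P[:2, :2]          # transitions among the transient states {1, 2}
I = np.eye(2)

# (a) p = (I - Q)^(-1) b, where b_i is the one-step probability i -> 3.
b = P[:2, 2]
p = np.linalg.solve(I - Q, b)
print("P(enter 3 before 4 | X0 = 1) =", p[0])           # 11/21 ~ 0.5238

# (b) m = (I - Q)^(-1) 1 is the expected number of steps to absorption.
m = np.linalg.solve(I - Q, np.ones(2))
print("E[transitions until 3 or 4 | X0 = 1] =", m[0])   # 55/21 ~ 2.6190

# Monte Carlo cross-check.
rng = np.random.default_rng(0)
hits, steps, n = 0, 0, 100_000
for _ in range(n):
    s = 0                        # start in state 1
    while s < 2:                 # run until state 3 or 4 is entered
        s = rng.choice(4, p=P[s])
        steps += 1               # count the transition just made
    hits += (s == 2)             # state 3 was reached first
print("simulated:", hits / n, steps / n)

Both printed values agree with the hand computation, since making states 3 and 4 absorbing does not change either the probability of which is hit first or the time until one of them is hit.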
