
Consider a Markov chain (X_n)_{n ≥ 0} with three states 1, 2, 3 and the transition matrix

A = [0.5 0.3 0.2
     0.1 0.4 0.5
     0   0.2 0.8]

(Python) For the Markov chain in Problem (1), choose the initial state using the uniform initial distribution [1/3, 1/3, 1/3] and simulate ten steps of this chain. Repeat this procedure 1000 times to find, approximately, the distribution after 10 steps. Is it close to the stationary distribution from (1)(iii)?

Solution

import numpy as np
from matplotlib import pyplot

# transition matrix of the three-state chain from Problem (1)
P = np.matrix([[0.5, 0.3, 0.2],
               [0.1, 0.4, 0.5],
               [0.0, 0.2, 0.8]])

# exact distribution after n = 10 steps, starting from the uniform distribution
n = 10
Pn = P**n
v = np.matrix([[1/3, 1/3, 1/3]])
print(v * Pn)

# evolution of the distribution over the first 20 steps
plot_data = []
for step in range(20):
    result = v * P**step
    plot_data.append(np.array(result).flatten())
plot_data = np.array(plot_data)

# plot the probability of each state against the step number
pyplot.figure(1)
pyplot.xlabel('Steps')
pyplot.ylabel('Probability')
lines = []
for i, shape in zip(range(3), ['x', 'h', 's']):
    line, = pyplot.plot(plot_data[:, i], shape, label="S%i" % (i + 1))
    lines.append(line)
pyplot.legend(handles=lines, loc=1)
pyplot.show()
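The code above gives the exact 10-step distribution by matrix powers. The problem also asks for a simulation-based estimate (1000 independent runs of ten steps each) and a comparison with the stationary distribution. The following is a minimal sketch of that Monte Carlo approach, assuming NumPy's random generator is used to draw the transitions; the names n_runs and counts and the eigenvector computation of pi are illustrative, not part of the original solution.

import numpy as np

P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.4, 0.5],
              [0.0, 0.2, 0.8]])

rng = np.random.default_rng()
n_runs, n_steps = 1000, 10
counts = np.zeros(3)

for _ in range(n_runs):
    # initial state drawn from the uniform distribution [1/3, 1/3, 1/3]
    state = rng.choice(3, p=[1/3, 1/3, 1/3])
    # ten steps of the chain, each transition drawn from the current row of P
    for _ in range(n_steps):
        state = rng.choice(3, p=P[state])
    counts[state] += 1

print("Empirical distribution after 10 steps:", counts / n_runs)

# stationary distribution: left eigenvector of P for eigenvalue 1, normalised to sum to 1
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
pi = pi / pi.sum()
print("Stationary distribution:", pi)

Since the chain is irreducible and aperiodic, ten steps are already enough for it to mix, so with 1000 runs the empirical frequencies should lie close to the stationary distribution, which answers the last part of the question.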
