Probability and Markov Chains.
Suppose that on each play of the game a gambler either wins 1 with probability p or loses 1 with probability 1 − p.
The gambler continues betting until she or he is either up n or down m.
What is the probability that the gambler quits a winner?
Solution
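The worked solution is not reproduced in this excerpt; what follows is a sketch of the standard gambler's-ruin argument under the usual setup. Write q = 1 − p, and let P_i denote the probability of reaching fortune m + n before fortune 0 when the current fortune is i, so the gambler starts at i = m, quits a winner at i = m + n, and is ruined at i = 0. Conditioning on the outcome of the first play gives the recurrence

\[
P_i = p\,P_{i+1} + q\,P_{i-1}, \qquad P_0 = 0, \quad P_{m+n} = 1 .
\]

Solving this boundary-value recurrence in the usual way, the probability that the gambler quits a winner is

\[
P_m =
\begin{cases}
\dfrac{1 - (q/p)^{m}}{\,1 - (q/p)^{m+n}\,}, & p \neq \tfrac12,\\[2ex]
\dfrac{m}{m+n}, & p = \tfrac12 .
\end{cases}
\]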

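As a quick numerical check, not part of the original text, the short Python sketch below evaluates the closed form above and compares it against a Monte Carlo simulation of the game; the function names and the sample values of p, n, and m are illustrative assumptions.

```python
import random

def win_probability(p, n, m):
    """Closed-form probability of going up n before going down m,
    when each play wins 1 with probability p and loses 1 with probability 1 - p.
    (Illustrative sketch of the standard gambler's-ruin formula.)"""
    q = 1.0 - p
    if abs(p - 0.5) < 1e-12:           # fair game: probability is m / (m + n)
        return m / (m + n)
    r = q / p
    return (1.0 - r**m) / (1.0 - r**(m + n))

def simulate(p, n, m, trials=100_000, seed=0):
    """Monte Carlo estimate: play until the gambler is up n or down m."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        fortune = 0
        while -m < fortune < n:
            fortune += 1 if rng.random() < p else -1
        wins += (fortune == n)
    return wins / trials

if __name__ == "__main__":
    p, n, m = 0.6, 5, 10                # hypothetical example values
    print("closed form:", win_probability(p, n, m))
    print("simulation :", simulate(p, n, m))
```

For these example values the simulated frequency should agree with the closed-form value to within ordinary Monte Carlo error.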