*Bounty: 50*

I have the following problem:

Bob is a salesman. Each week, Bob either makes no money, a small amount of money ($100), or a large amount of money ($1000). If Bob made no money in the previous week, then this week he makes a small amount with probability 0.6 and a large amount with probability 0.3. If he made a small amount in the previous week, then this week he makes a small amount with probability 0.4 and a large amount with probability 0.2. If he made a large amount in the previous week, then this week he makes a small amount with probability 0.3 and no money with probability 0.7. Model this as a Markov chain $(X_n, n \ge 0)$ with state space $\{0, 1, 2\}$, where $0$ denotes making no money, $1$ making a small amount, and $2$ making a large amount.
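For concreteness, here is how I would write down the transition matrix (a sketch using NumPy; the "no money" probabilities not stated explicitly are filled in by complementarity, e.g. $1 - 0.6 - 0.3 = 0.1$ from state $0$):

```python
import numpy as np

# Transition matrix on states {0: no money, 1: small amount, 2: large amount}.
# Rows = current week's state, columns = next week's state.
P = np.array([
    [0.1, 0.6, 0.3],  # from 0: remaining mass 1 - 0.6 - 0.3 = 0.1 stays at 0
    [0.4, 0.4, 0.2],  # from 1: remaining mass 1 - 0.4 - 0.2 = 0.4 goes to 0
    [0.7, 0.3, 0.0],  # from 2: only states 0 and 1 are given positive probability
])

assert np.allclose(P.sum(axis=1), 1.0)  # each row is a probability distribution
```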

Assume that Bob earned a small amount of money this week. How would we calculate the expected number of weeks in which he earns money (either a small or a large amount) over the next two weeks (after this week)? Is this a so-called 'hitting times' problem, or something else? These things often go by different names in the field of stochastic processes, so I'm not sure what to search for; I would appreciate a link to some learning material, such as online lecture notes or something similar.
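To make the quantity I have in mind unambiguous: I believe it can be written, by linearity of expectation, as $\sum_{n=1}^{2} P(X_n \in \{1, 2\} \mid X_0 = 1)$, which I would compute numerically like this (a sketch assuming NumPy and the transition matrix read off from the problem statement):

```python
import numpy as np

P = np.array([[0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2],
              [0.7, 0.3, 0.0]])

# Start in state 1: Bob earned a small amount this week.
mu = np.array([0.0, 1.0, 0.0])

# Expected number of earning weeks among the next two:
# sum over n = 1, 2 of P(X_n in {1, 2} | X_0 = 1).
expected = 0.0
for _ in range(2):
    mu = mu @ P                # distribution one week later
    expected += mu[1] + mu[2]  # probability of earning money that week

print(expected)  # ≈ 0.6 + 0.66 = 1.26
```

If this is the right way to set it up, I would still like to know the standard name for this kind of quantity (it does not look like a hitting time to me, since nothing stops the chain).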