Thursday, October 11, 2012

Lecture 3

You should now be able to do all of Example Sheet 1 (except that #12 will be easier after seeing Section 4.1 in the next lecture).

I mentioned that Theorem 3.4 in this lecture is similar to the result that you were taught in Probability IA, regarding the probability of ultimate extinction in a branching process. Remember that in a branching process each individual independently produces offspring in the next generation, according to a distribution in which there are $k$ offspring with probability $p_k$ ($k=0,1,\dots$). Given that we start with one individual, the probability of ultimate extinction, say $u$, is the minimal solution to
\[
u = G(u) = \sum_k p_k u^k,
\]
where $G$ is the probability generating function of the number of offspring.
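The minimal-solution characterisation can be seen numerically: iterating $u_{n+1}=G(u_n)$ from $u_0=0$ converges upwards to the extinction probability. The offspring distribution below, $(p_0,p_1,p_2)=(1/4,1/4,1/2)$, is my own illustrative choice; for it $u=G(u)$ has roots $1/2$ and $1$, and the iteration picks out the smaller.

```python
# A sketch of the minimal-solution characterisation of extinction probability.
# Offspring distribution (p_0, p_1, p_2) = (1/4, 1/4, 1/2) is an illustrative
# example, so G(u) = 1/4 + u/4 + u**2/2 and u = G(u) has roots 1/2 and 1.
def G(u, p=(0.25, 0.25, 0.5)):
    return sum(pk * u**k for k, pk in enumerate(p))

u = 0.0
for _ in range(200):
    u = G(u)   # u_n = P(extinct by generation n), increasing in n

print(u)       # converges to the minimal root 1/2, not to the root 1
```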

Do you remember the proof that $u$ is the minimal solution to $u=G(u)$, and do you see how similar it is to the proof in Theorem 3.4?

I have called equations of the form $x=Px$, i.e. $x_i = \sum_j p_{ij}x_j$, the right-hand equations, because $x$ appears on the right-hand side of $P$. Later in the course we will find a use for left-hand equations, of the form $x=xP$. So far as I can tell, this terminology does not appear in any modern books on Markov chains. However, it was the language used by David Kendall in the course I took from him in 1972, and I have always found it a helpful mnemonic.
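In matrix terms the distinction is simply $Px$ versus $xP$. Here is a small sketch of my own (the matrix $P$ is an arbitrary example): the invariant distribution, which we will meet later, solves the left-hand equations, while any constant vector solves the right-hand equations because the rows of $P$ sum to 1.

```python
import numpy as np

# An illustrative 2-state transition matrix (my own example).
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

# Left-hand equations x = xP: solved by the invariant distribution (2/3, 1/3).
pi = np.array([2/3, 1/3])
print(np.allclose(pi @ P, pi))   # True

# Right-hand equations x = Px: solved by any constant vector, as rows sum to 1.
x = np.ones(2)
print(np.allclose(P @ x, x))     # True
```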

I hope you enjoy question #13 on Example Sheet 1. It contains a result that many people find surprising. In a Red-Black game in which $p < q$, the strategy of bold play is optimal (but not necessarily uniquely so). This fact is proved in the Part II course Optimization and Control (see Section 4.3, "Optimal gambling", in the Optimization and Control course notes). Part of that course is about Markov Decision Processes, which are Markov chains in which we have some control over the transitions that occur, and we try to minimize (or maximize) the costs (or rewards) that accrue as we move through states.
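One can check the bold-versus-timid comparison numerically. The sketch below (not the Part II proof; the values of $p$, $q$, $N$ are my own) solves the linear equations for the win probabilities under bold play, where the gambler stakes $\min(x, N-x)$, and compares them with the familiar gambler's ruin formula for timid (unit-stake) play.

```python
import numpy as np

# A numerical sketch: bold play beats timid play in the Red-Black game when
# p < q.  States are fortunes 0..N; the gambler wins at N, is ruined at 0.
p, q, N = 0.4, 0.6, 10

# Bold play stakes s = min(x, N-x), so from x we move to x+s or x-s.
# The win probabilities satisfy f(x) = p f(x+s) + q f(x-s), with
# f(0) = 0 and f(N) = 1: a linear system we can solve directly.
A = np.zeros((N + 1, N + 1))
b = np.zeros(N + 1)
A[0, 0] = 1.0   # f(0) = 0
A[N, N] = 1.0   # f(N) = 1
b[N] = 1.0
for x in range(1, N):
    s = min(x, N - x)
    A[x, x] = 1.0
    A[x, x + s] -= p
    A[x, x - s] -= q
bold = np.linalg.solve(A, b)

# Timid play: stake 1 each game -- the usual gambler's ruin formula.
r = q / p
timid = (r ** np.arange(N + 1) - 1) / (r ** N - 1)

# Bold play is optimal, so it does at least as well from every fortune.
for x in range(1, N):
    assert bold[x] >= timid[x] - 1e-12
print(bold[N // 2], timid[N // 2])
```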

Question #14 on Example Sheet 1 extends the idea of a gambling game between two players to a game amongst three players. I think it is remarkable that there exists such a simple formula for the expected number of games that will be played until one of the three players becomes bankrupt. No one has ever discovered a simple formula for the analogous game with $\geq 4$ players.
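If the game is the one I have in mind (each round a pair of players is chosen uniformly at random and plays a fair unit-stake game), the simple formula is $3abc/(a+b+c)$, where $a,b,c$ are the initial fortunes; this is the well-known "three tower" result. The sketch below, under that reading of the rules, solves the expected-duration equations exactly on small instances and compares with the formula.

```python
import numpy as np

# A sketch, assuming the rules: each round a pair is chosen uniformly at
# random and one of the pair, by a fair coin, pays the other 1; play stops
# when someone is bankrupt.  Claimed formula: E = 3abc/(a+b+c).
def expected_duration(a, b, c):
    n = a + b + c
    # Interior states (x, y) track two fortunes; the third is z = n - x - y.
    states = [(x, y) for x in range(1, n) for y in range(1, n - x)]
    idx = {s: i for i, s in enumerate(states)}
    A = np.eye(len(states))
    rhs = np.ones(len(states))
    for (x, y), i in idx.items():
        # Six equally likely moves: each of the 3 pairs, each coin outcome.
        moves = [(x + 1, y - 1), (x - 1, y + 1), (x + 1, y),
                 (x - 1, y), (x, y + 1), (x, y - 1)]
        for nx, ny in moves:
            if nx >= 1 and ny >= 1 and nx + ny <= n - 1:
                A[i, idx[(nx, ny)]] -= 1 / 6
    E = np.linalg.solve(A, rhs)
    return E[idx[(a, b)]]

print(expected_duration(2, 1, 1))   # formula gives 3*2*1*1/(2+1+1) = 1.5
print(expected_duration(1, 1, 1))   # formula gives 3*1*1*1/3 = 1.0
```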