## Tuesday, October 23, 2012

### Corrections to notes

This page maintains a list of changes made to the notes since the first lecture, on 4 October, 2012. I continually update the M.pdf file to reflect these changes.

Lecture 11. Page 41. Proof: first we check that $\hat P$ is a stochastic matrix.
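
Assuming $\hat P$ here is the time-reversal $\hat p_{ji} = \pi_i p_{ij}/\pi_j$ of an irreducible chain with invariant distribution $\pi$ (a guess at the context; the definition lives in the notes, not on this page), the stochasticity check can be reproduced numerically. The matrix $P$ below is an illustrative choice of my own, not an example from the notes.

```python
import numpy as np

# A small irreducible stochastic matrix (illustrative choice).
P = np.array([[0.0, 0.7, 0.3],
              [0.4, 0.0, 0.6],
              [0.5, 0.5, 0.0]])

# Invariant distribution pi: the left eigenvector of P for eigenvalue 1.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmin(np.abs(w - 1))])
pi = pi / pi.sum()

# Time-reversal: p_hat[j, i] = pi[i] * p[i, j] / pi[j].
P_hat = (P * pi[:, None]).T / pi[:, None]

print(P_hat.sum(axis=1))  # each row sums to 1, so P_hat is stochastic
```

The row sums are $\sum_i \pi_i p_{ij}/\pi_j = (\pi P)_j/\pi_j = 1$, which is exactly the calculation the proof starts with.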

Lecture 10. On page 39, in (10.2), I have changed an $E$ to $E_i$, and similarly in the next displayed line the $E$s to $E_i$ and $E_j$.

Lecture 9. I have slightly streamlined the proof in Step 2 on page 36.

Lecture 8. On page 31, $m_i=E(T_i)$ has been corrected to $m_i=E_i(T_i)$. See the blog post for some further comments on this lecture.

Lecture 6. In the final line of page 22 it should be

$p_{00}^{(2n)}=\cdots$

Lecture 4. Subsequent to the lecture I have made some improvements in the middle of page 14, to better explain where and how we use Fubini's theorem to interchange $\sum_{j\in I}$ and $\sum_{t\geq 1}$ in the proof of Theorem 4.3.

I also expanded the explanation of why $P_2(\text{hit 0})=P_1(\text{hit 0})^2.$ Notice that any path which starts in state $2$ and eventually hits state $0$ must at some point hit state $1$. That is why the strong Markov property gives us $P_2(\text{hit 0})=P_2(\text{hit 1})P_1(\text{hit 0})$.

Furthermore, $P_2(\text{hit 1})=P_1(\text{hit 0})$. This is because paths that go from $2$ to $1$ are in one-to-one correspondence with paths that go from $1$ to $0$ (just shifted $1$ to the right).
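
As a quick sanity check (not part of the notes), the identity $P_2(\text{hit }0)=P_1(\text{hit }0)^2$ can be verified by Monte Carlo simulation of a biased random walk. The walk and its parameters below are illustrative assumptions of mine, not the example from the lecture.

```python
import random

def hits_zero(start, p_up=2/3, max_steps=500):
    """Simulate a right-biased walk on the integers from `start`;
    return True if it reaches 0 within max_steps.  With upward drift,
    a walk that survives 500 steps has escaped with overwhelming
    probability, so the truncation error is negligible."""
    x = start
    for _ in range(max_steps):
        if x == 0:
            return True
        x += 1 if random.random() < p_up else -1
    return False

random.seed(1)
n = 10_000
p1 = sum(hits_zero(1) for _ in range(n)) / n  # theory: q/p = 1/2
p2 = sum(hits_zero(2) for _ in range(n)) / n  # theory: (q/p)^2 = 1/4
print(p1, p2, p1 ** 2)
```

With up-probability $p=2/3$ and down-probability $q=1/3$, the hitting probability from $1$ is $q/p=1/2$, so the estimate from $2$ should come out near $1/4$, matching $P_1(\text{hit }0)^2$.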

I made a change to Appendix C. Some students correctly pointed out to me at the end of the lecture that if in state $(2,3,3,0,0)$ we fire a chip from location $2$ then we reach $(3,0,4,1,0)$ and cannot subsequently fire $3$. So I should have said "If we fire 2 then we reach $(3,0,4,1,0)$ and we next must fire $0$ again, but ultimately the final configuration will be the same." This is not at all obvious! Its proof is on page 55 of the notes.
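
The claim that the final configuration does not depend on the firing order (the abelian property) can be illustrated with a small generic chip-firing sketch. The graph, the firing rule (a vertex holding at least as many chips as its degree fires, sending one chip to each neighbour), and the sink vertex below are my own assumptions for illustration, not the exact set-up of Appendix C.

```python
def stabilize(chips, neighbours, sink, order):
    """Fire unstable non-sink vertices until none remain.

    A vertex v is unstable when it holds at least deg(v) chips; firing
    sends one chip to each neighbour.  `order` chooses which unstable
    vertex fires first at each step.
    """
    chips = list(chips)
    while True:
        unstable = [v for v in order(range(len(chips)))
                    if v != sink and chips[v] >= len(neighbours[v])]
        if not unstable:
            return chips
        v = unstable[0]
        chips[v] -= len(neighbours[v])
        for w in neighbours[v]:
            chips[w] += 1

# 5-cycle with vertex 0 as the sink (an assumed example graph).
cycle = {0: [1, 4], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 0]}
a = stabilize([0, 3, 3, 0, 0], cycle, sink=0, order=list)
b = stabilize([0, 3, 3, 0, 0], cycle, sink=0,
              order=lambda vs: list(reversed(list(vs))))
print(a, b)  # the two firing orders reach the same stable configuration
```

Firing left-to-right and right-to-left perform the individual firings in quite different sequences, yet the stable configurations agree, which is the (non-obvious) fact proved on page 55.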

Lecture 3. I have made a few trivial changes at the end of page 12.

Lecture 2. Spotted by a student:

Page 5. for some constants $a_1,\dotsc,a_M$ $\to$ for some constants $a_1,\dotsc,a_m$

Page 7, line 1. indentify $\to$ identify.

Page 7. In Theorem 2.4 (ii) we should say, "either $i=j$, or $\dots$". (The given condition does not cover the case of $i=j$. "Communicates" is supposed to be reflexive, i.e. $i\leftrightarrow i$.)

I have decided to revert Theorem 2.4 to a statement only about $i$ and $j$ that are distinct. So:

Theorem 2.4. For distinct states $i$ and $j$ the following are equivalent.
$\begin{array}{cl} (i) & i\rightarrow j;\\[6pt] (ii) & p_{i_0i_1}p_{i_1i_2}\dotsm p_{i_{n-1}i_n}>0 \text{ for some states }i_0,i_1,\dotsc,i_n, \text{where }n\geq 1,\ i_0=i\text{ and }i_n=j;\\[6pt] (iii) & p_{ij}^{(n)}>0\text{ for some }n\geq 1. \end{array}$
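
Condition (iii) is easy to test numerically: for distinct states, a positive-probability path from $i$ to $j$ can be taken to visit no state twice, so it suffices to check the entries of $P^n$ for $n$ up to the number of states. A small sketch, with an illustrative transition matrix of my own choosing:

```python
import numpy as np

def leads_to(P, i, j):
    """Does i -> j, i.e. is p_ij^(n) > 0 for some n >= 1?

    For distinct i, j a positive-probability path can avoid repeating
    states, so summing the first len(P) powers of P suffices.
    """
    total = np.zeros_like(P)
    Pn = np.eye(len(P))
    for _ in range(len(P)):
        Pn = Pn @ P
        total += Pn
    return total[i, j] > 0

# Illustrative chain: state 1 is absorbing, so 1 does not lead to 0.
P = np.array([[0.5, 0.5, 0.0],
              [0.0, 1.0, 0.0],
              [0.3, 0.3, 0.4]])
print(leads_to(P, 0, 1), leads_to(P, 1, 0))  # True False
```

Here $0\rightarrow 1$ holds already at $n=1$ since $p_{01}>0$, while every power of $P$ has a zero $(1,0)$ entry because state $1$ is absorbing.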
Lecture 1. A student has spotted 3 mistakes in the notes for Lecture 1. I have uploaded a corrected file M.pdf. It is listed under RESOURCES in the menu at the right of this page.

Page 1, Question (a): should be $\lim_{n\to\infty} p_{ij}^{(n)}=1/5$ instead of $\to 1/5$.

Page 2, final equation should start $P(X_n = i_n \mid X_0 = i_0, \dotsc, X_{n-1}=i_{n-1})=\dots$

Page 3, section 1.4: The second line should end with: "$\implies J = 1$".