Today's lecture was an introduction to the ideas of an **invariant measure** and an **invariant distribution**. I mentioned that the invariant distribution is also called a stationary distribution, equilibrium distribution, or steady-state distribution.

You might like to think about the implications of today's lecture for cards and card-shuffling (a subject which has fascinated me from childhood, and in my undergraduate days when I was secretary of the Pentacle Club). It is always good to have a few standard models in your mathematical locker (such as card-shuffling, birth-death chains, random walk on $\mathbb{Z}$ and $\mathbb{Z}^2$, etc.) with which you can test your intuition and make good guesses about what might or might not be true. The state space of a deck of cards is of size

\[
52! = 80,658,175,170,943,878,571,660,636,856,403,766,975,289,505,440,883,277,824,000,000,000,000.
\]
The equilibrium distribution is one in which each state has equal probability, $\pi_i = 1/52!$.
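As a quick sanity check (not part of the lecture), you can compute this number exactly in a couple of lines:

```python
# The size of the state space of a 52-card deck: the number of orderings, 52!.
import math

n_states = math.factorial(52)
print(n_states)            # the 68-digit number quoted above
print(len(str(n_states)))  # → 68
```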

I mentioned today that $m_i=1/\pi_i$ (which is intuitively obvious, and which we prove rigorously in Lecture 9). This means that if you start with a new deck of cards, freshly unwrapped, and then shuffle once every 5 seconds (night and day), it will take you about $1.28 \times 10^{61}$ years (on average) before the deck returns to its starting order. Notice that this result holds for any sensible meaning of a shuffle, provided that it turns the state space into one closed class (an irreducible chain).
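A back-of-envelope check of that figure, assuming one shuffle every 5 seconds and a 365.25-day year:

```python
# Mean return time of the deck to its starting order, converted to years.
import math

m = math.factorial(52)               # mean return time m_i = 1/pi_i = 52! shuffles
seconds = 5 * m                      # one shuffle every 5 seconds
years = seconds / (365.25 * 24 * 3600)
print(f"{years:.3e} years")          # on the order of 10^61 years
```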

A very important question in many applications of Markov chains is "how long does it take to reach equilibrium?" (i.e. how large must $n$ be so that the distribution of $X_n$ is almost totally independent of $X_0$?). You might enjoy reading about the work of mathematician/magician Persi Diaconis in answering the question "how many shuffles does it take to randomize a deck of cards?". Here is the paper *Trailing the Dovetail Shuffle to its Lair*.
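You can watch this convergence happen on a toy example (my own illustration, not from the lecture): a 3-card deck under the "top-to-random" shuffle, whose state space is the $3! = 6$ permutations and whose invariant distribution is uniform. The total variation distance to uniform shrinks rapidly with the number of shuffles:

```python
# Convergence to equilibrium for a 3-card deck under the top-to-random shuffle.
from itertools import permutations

states = list(permutations("ABC"))
index = {s: i for i, s in enumerate(states)}
n = len(states)  # 3! = 6

# Transition matrix: remove the top card and reinsert it at a uniformly
# chosen position (3 choices, each with probability 1/3).
P = [[0.0] * n for _ in range(n)]
for s in states:
    top, rest = s[0], list(s[1:])
    for pos in range(3):
        t = tuple(rest[:pos] + [top] + rest[pos:])
        P[index[s]][index[t]] += 1 / 3

# Start from a fixed ordering and iterate the distribution mu_{k+1} = mu_k P.
mu = [0.0] * n
mu[index[tuple("ABC")]] = 1.0
for k in range(1, 11):
    mu = [sum(mu[i] * P[i][j] for i in range(n)) for j in range(n)]
    tv = 0.5 * sum(abs(p - 1 / n) for p in mu)
    print(f"after {k:2d} shuffles, total variation distance = {tv:.6f}")
```

This chain is irreducible and aperiodic (the top card can be reinserted on top, giving a self-loop), so the distribution of $X_n$ converges to uniform regardless of the starting order.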
