Tuesday, November 15, 2011

Lecture 12

The last lecture was on Tuesday November 15. I have enjoyed giving this course and I hope you have enjoyed it too.

The notes are now finalized and probably safe for you to download as a final version. I made a change to page 48 around 9am today. If there are any more changes I will put a remark on the course page. You can now also look at the overhead projector slides that I sometimes used in lectures, such as the ones I used today to summarise four Part II courses that you may like to study next year.

Some students say that the notation is one of the most difficult things about this course. I recommend that you make for yourself a one-page crib sheet of all the notation: 
(X_n)_{n≥0},   I,   Markov(λ, P),   P = (p_{ij}),   P(n) = (p_{ij}(n)),   h_i^A,   k_i^A,   H_i,   T_i,   V_i,   V_i(n),   f_i,   λ,   π,   γ_i^k,   m_i .
Write a little explanation for yourself as to what each notation means, and how it is used in our theorems about right-hand equations, recurrence/transience, left-hand equations, existence/uniqueness of invariant measures, aperiodicity/periodicity, positive/null recurrence and detailed balance. It should all seem pretty straightforward and memorable once you summarise it on one page and make some notes to place it in context.

Of course I could easily typeset a page like this for you — but I think that you'll learn more, and it will be more memorable for you personally, if you create this crib sheet yourself!

In my discussion of random walk and electrical networks in Section 12.4 I appealed to Rayleigh's Monotonicity Law: "if some resistances of a circuit are increased (decreased) the resistance between any two points of the circuit can only increase (decrease)." A proof of this "obvious" fact can be constructed by (i) proving Thomson's Principle: "Flows determined by Kirchhoff's Laws minimize energy dissipation", and then (ii) showing that Thomson's Principle implies Rayleigh's Monotonicity Law. You can read the details in Doyle and Snell, Random Walks and Electric Networks, pages 51–52.
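Rayleigh's law is also easy to check numerically. Here is a little sketch of my own (not from Doyle and Snell; the network and resistor values are arbitrary illustrative choices): it computes the effective resistance between two nodes by solving Kirchhoff's equations for the conductance Laplacian, and then shows that increasing one resistance can only increase the effective resistance.

```python
# Illustration of Rayleigh's Monotonicity Law on a small resistor network.
# Effective resistance between nodes a and b: inject unit current at a,
# ground b, solve Kirchhoff's equations (conductance Laplacian), and read
# off the potential at a.

def effective_resistance(n, edges, a, b):
    """edges: list of (u, v, resistance). Returns R_eff between a and b."""
    # Build the Laplacian of conductances.
    L = [[0.0] * n for _ in range(n)]
    for u, v, r in edges:
        g = 1.0 / r
        L[u][u] += g
        L[v][v] += g
        L[u][v] -= g
        L[v][u] -= g
    # Ground node b (delete its row and column); unit current into a.
    idx = [i for i in range(n) if i != b]
    A = [[L[i][j] for j in idx] for i in idx]
    x = [1.0 if i == a else 0.0 for i in idx]
    m = len(A)
    # Gaussian elimination with partial pivoting.
    for c in range(m):
        p = max(range(c, m), key=lambda r_: abs(A[r_][c]))
        A[c], A[p] = A[p], A[c]
        x[c], x[p] = x[p], x[c]
        for r_ in range(c + 1, m):
            f = A[r_][c] / A[c][c]
            for k in range(c, m):
                A[r_][k] -= f * A[c][k]
            x[r_] -= f * x[c]
    v = [0.0] * m
    for r_ in range(m - 1, -1, -1):
        s = x[r_] - sum(A[r_][k] * v[k] for k in range(r_ + 1, m))
        v[r_] = s / A[r_][r_]
    return v[idx.index(a)]

# A Wheatstone bridge on nodes 0..3 with all unit resistors: by symmetry
# the bridge edge (1,2) carries no current and R_eff(0,3) = 1.
bridge = [(0, 1, 1.0), (0, 2, 1.0), (1, 3, 1.0), (2, 3, 1.0), (1, 2, 1.0)]
r1 = effective_resistance(4, bridge, 0, 3)

# Increase one resistance; the effective resistance can only go up.
bigger = [(0, 1, 2.0), (0, 2, 1.0), (1, 3, 1.0), (2, 3, 1.0), (1, 2, 1.0)]
r2 = effective_resistance(4, bigger, 0, 3)
print(r1, r2)   # r1 = 1.0, and r2 > r1
```

Note that the monotonicity can be an equality: increasing the bridge resistor (1, 2) in the balanced bridge changes nothing, because that edge carries no current.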

In Section 12.2 I mentioned Burke's output theorem (1956), which says that the output process of an M/M/1 queue in equilibrium is a Poisson process with the same rate as the input. In writing "M/M/1", the "M"s mean Markovian (i.e. a Poisson input process of rate λ and i.i.d. exponentially distributed service times with parameter μ, where μ > λ), and the "1" means a single server. In queueing theory this very useful notation is known as Kendall's notation.
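Burke's theorem is fun to test by simulation. In this sketch of my own (λ, μ and the sample sizes are arbitrary illustrative choices) departures are generated by the recursion D_n = max(D_{n−1}, A_n) + S_n; after a warm-up period the inter-departure times should have mean 1/λ and variance 1/λ², consistent with the output being Poisson of rate λ.

```python
import random

# Simulation check of Burke's theorem: the output of an M/M/1 queue in
# equilibrium is Poisson with the arrival rate lam.  (lam, mu and the
# sample sizes are arbitrary illustrative choices.)
random.seed(1)
lam, mu, n = 1.0, 2.0, 200_000

# Arrival times A_n (Poisson process of rate lam) and departure times via
# the recursion D_n = max(D_{n-1}, A_n) + S_n with S_n ~ Exp(mu).
a = d = 0.0
departures = []
for _ in range(n):
    a += random.expovariate(lam)            # next arrival
    d = max(d, a) + random.expovariate(mu)  # its departure
    departures.append(d)

# Discard a warm-up period, then look at inter-departure times.
gaps = [t2 - t1 for t1, t2 in zip(departures[10_000:], departures[10_001:])]
mean = sum(gaps) / len(gaps)
var = sum((g - mean) ** 2 for g in gaps) / len(gaps)
print(mean, var)   # mean ~ 1/lam and var ~ 1/lam**2, i.e. both ~ 1 here
```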

I remarked (just for fun) that queueing is the only common word in the OED with five vowels in a row. Obscure words are ones like "miaoued" (what the cat did).

I once proved a generalization of Burke's output theorem that holds even when the queue has not reached equilibrium (see: The interchangeability of ·/M/1 queues in series, Weber, 1979). Suppose we have two single-server queues in series, which we might write as /M/1 → /M/1. The customers' service times in the first queue are i.i.d. exponentially distributed with parameter λ, and in the second queue they are i.i.d. exponentially distributed with parameter μ. On finishing service in the first queue a customer immediately joins the second queue. Suppose the system starts with N customers in the first (upstream) queue and no customers in the second (downstream) queue. My theorem says that all statistics that we might measure about the departure process from the second queue are the same if λ and μ are interchanged. Thus by observing the process of departures from the second queue we cannot figure out which way around the two /M/1 servers are ordered. For example, the time at which we see the first departure leave the second queue has expected value 1/λ + 1/μ (which is symmetric in λ and μ). All other statistics are also symmetric in λ and μ. Burke's theorem is a corollary of this that can be obtained by thinking about N tending to infinity (can you see how?).
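The symmetry shows up nicely in a quick Monte Carlo experiment. In this sketch of my own (N, the rates and the number of runs are arbitrary illustrative choices), while the first queue is still busy the i-th departure from it occurs at a sum of i Exp(λ) service times, and the second queue then obeys D_i = max(D_{i−1}, T_i) + S_i; the statistics of the second queue's departures should be unchanged when λ and μ are swapped.

```python
import random

# Monte Carlo check of the interchangeability of /M/1 -> /M/1 queues in
# series: start with N customers in the first queue and none in the
# second, and compare departure statistics from the second queue when the
# service rates (lam, mu) are swapped.  (N, the rates and the number of
# runs are arbitrary illustrative choices.)
random.seed(2)

def kth_departure(rate1, rate2, N, k):
    """Time of the k-th departure from the second queue (valid for k <= N,
    while the first queue is continuously busy)."""
    t = d = 0.0
    for _ in range(k):
        t += random.expovariate(rate1)             # next departure from queue 1
        d = max(d, t) + random.expovariate(rate2)  # its departure from queue 2
    return d

lam, mu, N, runs = 1.0, 2.0, 3, 100_000

m1 = sum(kth_departure(lam, mu, N, 2) for _ in range(runs)) / runs
m2 = sum(kth_departure(mu, lam, N, 2) for _ in range(runs)) / runs
first = sum(kth_departure(lam, mu, N, 1) for _ in range(runs)) / runs
print(first, m1, m2)
# first ~ 1/lam + 1/mu = 1.5, and m1 ~ m2 (symmetry in lam and mu)
```

Here only the mean of the second departure time is compared, but the theorem says the whole departure process has the same distribution under the swap.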