Hi. This is our second lecture on Markov chains. We will concentrate on developing further the general principles and tools behind Markov chains.

We will first briefly review the main definitions and apply them to illustrate some additional calculations one can do, as a way to warm up.

We will then spend most of the time on the central topic of today, the steady-state behavior of chains: that is, on what a Markov chain does when it runs for a long time, and on what we can say about the probabilities of being in the different states. In order to do that, we will review recurrent states, transient states, and recurrent classes, talk a bit more about periodic states, and concentrate most of the time on the convergence theorem and the associated balance equations of Markov chains.

And finally, we will end the lecture with an important special class, the so-called birth-death processes.

So let us start.
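To make the idea of steady-state behavior concrete before the lecture proper, here is a minimal sketch (not part of the original lecture) using a hypothetical two-state chain with assumed transition probabilities. It shows the two views the lecture will connect: running the chain for many steps, and solving the balance equations directly.

```python
# Illustrative sketch only: a hypothetical two-state Markov chain.
import numpy as np

# Assumed transition matrix: P[i][j] = probability of moving from state i to state j.
P = np.array([[0.5, 0.5],
              [0.2, 0.8]])

# Long-run behavior: the rows of P^n approach the steady-state probabilities,
# independently of the starting state.
Pn = np.linalg.matrix_power(P, 50)
print(Pn)  # each row is approximately [2/7, 5/7]

# The same probabilities solve the balance equations pi = pi * P,
# together with the normalization sum(pi) = 1.
A = np.vstack([P.T - np.eye(2), np.ones(2)])
b = np.array([0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(pi)  # approximately [0.2857, 0.7143]
```

The two printouts agree: whether we simulate the chain for a long time or solve the balance equations, we arrive at the same steady-state probabilities. This is the kind of statement the convergence theorem in this lecture makes precise.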