The following content is provided under a Creative Commons license. Your support will help MIT OpenCourseWare continue to offer high-quality educational resources for free. To make a donation or view additional materials from hundreds of MIT courses, visit MIT OpenCourseWare at ocw.mit.edu.

PROFESSOR: So today, what we're going to do is talk about this idea of autoregulation, what happens when a gene regulates its own expression. I think this is a really nice topic, because it encapsulates several of the big themes that we're going to be seeing throughout the semester. First, there's this idea of a network motif. This motif, a gene regulating itself, is the simplest possible motif. It occurs more frequently than you would expect by chance; we'll explain what we mean by that. And a possible evolutionary explanation is that it may be advantageous in one way or another.

Now, there are two basic kinds of autoregulation: negative and positive. Negative means that the protein represses its own expression.
Positive autoregulation would mean that it activates or upregulates its own expression. So down here is positive autoregulation.

And these things have very different possible purposes. We'll see that there are basically two things that negative autoregulation may do for us. First, it speeds the on time relative to simple, unregulated expression. The other thing is kind of interesting, which is that the protein concentration is somehow robust. Robust means that it's comparatively insensitive to something. In particular, here it means that the protein concentration is robust to, say, fluctuations in the production rate.

In various contexts, we're going to see how robustness is an important goal that biology, that a particular cell, might have, and that you would like this protein concentration to be robust to a wide range of things. But anytime you talk about robustness, you have to say: what is robust, and against what? In this case, it's the protein concentration that is robust to fluctuations in the production rate.
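Both claims can be illustrated with a minimal simulation. This is a sketch, not part of the lecture: the model dx/dt = production(x) − αx and every parameter value (β, K, the Hill coefficient n, α) are illustrative assumptions. It compares a negatively autoregulated gene against an unregulated gene tuned to the same steady state, and checks both the faster rise time and the weaker sensitivity of the steady state to the production rate β:

```python
import numpy as np

def simulate(production, alpha=1.0, t_max=10.0, dt=1e-3):
    """Euler-integrate dx/dt = production(x) - alpha*x from x(0) = 0."""
    steps = int(t_max / dt)
    x = np.zeros(steps + 1)
    for i in range(steps):
        x[i + 1] = x[i] + dt * (production(x[i]) - alpha * x[i])
    return np.arange(steps + 1) * dt, x

beta, K, n = 100.0, 1.0, 4                    # strong promoter, repression threshold K
t, x_nar = simulate(lambda x: beta / (1.0 + (x / K) ** n))
x_ss = x_nar[-1]                              # steady state of the autoregulated gene

# Unregulated gene tuned to the same steady state (beta_simple = alpha * x_ss).
t, x_simple = simulate(lambda x: x_ss)

def response_time(t, x, target):
    """First time the trajectory reaches half of the target steady state."""
    return t[np.argmax(x >= 0.5 * target)]

t_nar = response_time(t, x_nar, x_ss)         # much less than ln(2)/alpha
t_simple = response_time(t, x_simple, x_ss)   # analytically ln(2)/alpha, about 0.69

# Robustness: double beta and see how much the steady state moves.
t, x_nar2 = simulate(lambda x: 2 * beta / (1.0 + (x / K) ** n))
ratio = x_nar2[-1] / x_ss                     # ~2**(1/(n+1)), versus a full factor 2 without regulation
```

The intuition the numbers confirm: with negative autoregulation the gene can afford a very strong promoter (fast initial rise) because the repression, not β, sets where expression levels off, and the same effect is why doubling β barely moves the steady state.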
Now, positive autoregulation, on the other hand, in some ways makes these two things worse; we'll discuss what we mean by that. But it perhaps performs some other functions. In particular, it allows for the possibility of bistability, and hence memory. And we'll explain how memory appears in this process.

But before we get into autoregulation, I just want to do a quick review of getting intuition for the binding of activators, for example, to the promoter, and also of how we can use sequestration in order to generate what we call an ultrasensitive response.

Now, when we talk about this, we're often thinking about a situation where we'd like the rate of expression of some gene to be sensitive, or ultrasensitive, to the concentration of something. So if you have x activating y, then what you might like, in many cases, is an essentially digital response, where if you think about the production rate of y as a function of this input x, you might like it to be very sharp: essentially 0,
then quickly coming up, and then saturating at some level, beta maybe. Now, the question is, how can you do that? One thing we discussed could get you something kind of like this. Does anybody remember what the basic solution might be?

AUDIENCE: Cooperative regulation.

PROFESSOR: Right, cooperative regulation. So if you have, for example, a tetramer of x that is somehow activating y, then you can get something that looks, maybe not this sharp, but much sharper than the simple Michaelis-Menten-type curve you would get if you just had a single monomer of x activating y. So one solution is indeed cooperativity.

But I think there's another interesting one. The second solution is this idea of molecular titration. I explained the basic idea at the end of the lecture on Tuesday. But what we want to do now is try to get a sense of what the requirements on the binding might be in order to generate this effect. So what we have is a situation where, again, x is regulating y.
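To put a number on the cooperativity answer above: for a Hill function x^n/(K^n + x^n), the fold change in input needed to drive the output from 10% to 90% of maximum is 81^(1/n). A short sketch (the function names are mine; n = 4 stands in for the tetramer case):

```python
def hill_input_at(fraction, K, n):
    """Input x at which the Hill curve x**n / (K**n + x**n) reaches `fraction` of max."""
    return K * (fraction / (1.0 - fraction)) ** (1.0 / n)

def ten_to_ninety(n, K=1.0):
    """Fold change in input needed to go from 10% to 90% of maximal output."""
    return hill_input_at(0.9, K, n) / hill_input_at(0.1, K, n)

print(ten_to_ninety(1))   # monomer (Michaelis-Menten): 81-fold
print(ten_to_ninety(4))   # tetramer: 3-fold, a much more switch-like response
```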
But it's a little bit more complicated, because this protein x, in addition to binding the promoter and activating expression of y, can also bind to something else. In particular, there might be some other protein, maybe w, with which it can bind reversibly into some complex wx.

Now, if we describe this binding by some dissociation constant Kw and this one by some Kd, we can ask in what regime this will generate an ultrasensitive response, where, as a function of the total concentration of x-- so we might want to call that x total, just to be clear-- we'd like there to be very little expression of y, and then all of a sudden to get maximal, if you like, expression of y. And the question is, well, what do we need in terms of these Kw's and Kd's, and so forth?

So there are going to be three different conditions that we're going to play with. The first might be the relationship between this Kd, the affinity of binding of that transcription factor to the promoter, as compared to this Kw. I will write some options, but as always, you should start thinking. You don't need to watch me write.
"Don't know" is an option again, so I'll give you that. Just 30 seconds to think about it. Yes?

AUDIENCE: Is Kw the dissociation constant? Like, the concentration of w times the concentration of x over the--

PROFESSOR: Right, so it's the one that has units of concentration. So lower, in both these cases, corresponds to tighter binding.

I'll let you think for 20 seconds, maybe. Do you need more time? And it's OK if you are not 100% convinced of something. It's useful to just make a guess, and then we can discuss. Shall we go for it? Everybody have your cards in front of you? All right, ready. Three, two, one.

So we have, I'd say, a fair amount of disagreement, actually. I'd say we got some A's, B's, C's. I don't know, there might be one or two D's. Why don't we go ahead and turn to a neighbor there. You should definitely be able to find a neighbor that disagrees with you, and try to convince them that you--

AUDIENCE: What's the question?

PROFESSOR: Oh, yeah, sorry.
No, no, all right. So the question is, what conditions do these things need to satisfy in order to have an ultrasensitive response, where the rate of production looks like what?

[SIDE CONVERSATIONS]

PROFESSOR: All right, why don't we go ahead and reconvene, and maybe I'll just get a sense of what the state of your thinking is right now. So once again, the question is, we want to know what the relationship between all these binding affinities and concentrations has to be in order to get an ultrasensitive response, where, as a function of the total amount of x, as you add x, at first you don't really get much of any expression of y, but all of a sudden, you get a lot. And first, we want to know the relationship between Kd and Kw. Let's go ahead and vote. Ready? Three, two, one.

Oh, I'm sorry. We're still on this one, so you can ignore these; it's just that it takes me time to write, so I wanted to take advantage of your discussion. We're still on this one. Do you all understand what I'm trying to ask? OK, all right, ready.
Three, two, one.

All right, I'm going to cover this so nobody gets confused. So we have a fair pocket of C's back there, but then some other distributions, and we've got some discussion over here. So maybe somebody can offer their insight? What did your neighbor think?

AUDIENCE: My neighbor thought that at first, there should be no binding between x and the gene. So x should bind much more strongly with w. That's why Kw should be--

PROFESSOR: All right, so you're arguing that you want Kw to be much smaller than Kd here, because as you add x, you want it initially to all be sequestered by the molecule w. Is that OK? So if somebody's neighbor thought it should instead be B, do you want to offer their argument? Everybody's neighbors are now convinced that C is--

So I'm going to side with that. So the idea here is that what you'd really like is-- the idealized version of this is that Kw is perfect, and that corresponds to Kw going to 0. That means that if you plot the free x as a function of x total, then what does it start out being if Kw is 0? 0, right?
And when thinking about the concentration of free x, do we have to worry about how much x is bound to the promoter? No. We'll say that so long as you have any reasonable concentration of x, the one x that binds to that promoter, that binds that DNA, is not going to affect the concentration of free x. So we really just have to worry about how much is bound to w. So it's going to be 0, and this is in the limit of Kw going to 0. And so when is it that something starts happening?

AUDIENCE: When x is the concentration of w.

PROFESSOR: Right, so it's when you get there. And is it the concentration of w, or-- maybe we can be a little bit more precise? What shall I actually write here? Should I just write w, or should I write-- w total, right. Because the free w is changing all the time. And indeed, as you approach here, the free w goes to--

AUDIENCE: 0.

PROFESSOR: 0, right. So over here, the free w is equal to w total. But then it decreases linearly until you get here, and now there's no free w. And it's at that stage that this free x is going to go up with a slope of what?
1, right? Right, exactly. All the x that you add just turns into free x, so this goes up with slope 1.

So this is the idea behind this mechanism. What I'm drawing is the case of perfect sequestration. And in this case, what happens is, if you look at the production rate of y, again as a function of x total, what is the production rate in here going to be equal to? 0, until you get to this w total, and then it's going to come up. And how is it going to come up? Does it immediately come all the way up to maximal? No. So what determines how rapidly that's going to happen?

AUDIENCE: The concentration of free x.

PROFESSOR: The concentration of free x. And we actually know now what the concentration of free x is here.

AUDIENCE: The Kd.

PROFESSOR: Right, so it's the Kd of the binding. So this is again a Michaelis-Menten-looking kind of curve, where the half-maximal point here is just going to be this w total plus that Kd.
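In this perfect-sequestration limit, the whole picture on the board comes down to two lines: free x = max(0, x_total − w_total), and a Michaelis-Menten curve in that free x. A minimal sketch (the parameter values are arbitrary, chosen only for illustration):

```python
def free_x(x_total, w_total):
    """Perfect sequestration (Kw -> 0): no free x until w is used up."""
    return max(0.0, x_total - w_total)

def production_rate(x_total, w_total=10.0, Kd=1.0, beta=1.0):
    """Michaelis-Menten in the free x: zero below w_total, then saturating."""
    x = free_x(x_total, w_total)
    return beta * x / (Kd + x)

print(production_rate(5.0))    # below the w_total threshold: 0
print(production_rate(11.0))   # one unit of free x = Kd, so output is beta/2
```

The half-maximal output indeed sits at x_total = w_total + Kd, matching the expression on the board.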
All right, so there's a sense in which this is ultrasensitive, because initially you don't get much of any expression, and then all of a sudden you start getting significant amounts.

Are there any questions about the basic intuition here? We still have to come back and think a little bit more carefully, kind of quantitatively, about what this mechanism means for the other comparisons. But at least, based on this idea of the perfect situation, where the sequestration of x by w is complete, this is the idea of what's going on. Is it clear right now? Yeah?

AUDIENCE: What's the definition of Kw in this situation? Is that--

PROFESSOR: In what I'm drawing here?

AUDIENCE: What you drew there.

PROFESSOR: This one? Oh, OK. So the definition of Kw is going to be that Kw-- and this is a definition; it has units of concentration-- so that means we have to put up the concentration of w times the concentration of x, over the concentration of wx.

AUDIENCE: Oh, [INAUDIBLE].

PROFESSOR: Any other questions about this? Yeah.
AUDIENCE: Why does the free x grow with slope 1? Isn't the free x now binding with the gene?

PROFESSOR: OK, right, so this is what we were saying: there's just one copy of this gene. So given that, in this case, there might be 1,000 of these proteins, that one bound copy is just not significant. Yeah, in the back?

AUDIENCE: So whenever you have a K, does it always have units of concentration?

PROFESSOR: OK, so in this class, whenever possible, at least in the lectures, we'll always try to stick with the dissociation constant, which is the one that has units of concentration. You can also talk about the association constant, and people also use K for that. Horribly confusing, right? So whenever possible, I'll be referring to the dissociation constant.

AUDIENCE: Is that written as a lowercase k or a big K?

PROFESSOR: Oh, that's right. No, it's a good thing that my K's are so clearly big K's that there's no possible source of confusion there. Yes. But if you're ever confused about which K I'm referring to, please just ask.
All right, any other questions about the basic mechanism? Yeah.

AUDIENCE: Just a quick one. If we take the other limit, of Kd going to 0, would it be the same-- for free x versus x total, would it be the same slope, but starting from 0, shifted to the left?

PROFESSOR: Oh, OK, the other limit, of Kd going to 0. Yeah, this is an interesting point. I think this plot is independent of Kd, because binding to that promoter doesn't affect the free concentration of x anyway, right? It does affect this one, though, because if Kd gets smaller and smaller, then this curve actually gets more and more steep. Does that answer your question?

AUDIENCE: Isn't it more sensitive if the curve is steeper?

PROFESSOR: OK, right. Yes, that's a good point. But now you're taking two limits of things going to 0, so we have to be a little bit careful, because it may still depend on the relationship between those two as they go to 0. No, that's fair.
But the idea is that if you're in this other limit, then you end up not having significant sequestration, in the sense that as you start adding x, you start getting expression of that protein early on. And so then the whole mechanism doesn't work. So it's true that in principle that curve is steep, but expression was never inhibited to begin with, because the moment you start adding x, you start getting expression from the gene.

Other questions about that? Let's try out this next one. Now, this is a question about Kw-- so this is the binding affinity of x to that sequesterer-- as compared to the total amount of that w. All right, so what's going to be the requirement there? We'll go ahead and give you 30 seconds to--

AUDIENCE: Excuse me, but don't we already have what we want? What's the more specific question?

PROFESSOR: Right, OK. So here, this is the case where Kw went to 0. Right? And indeed, this works. But the question is, in general-- Kw is not actually going to be equal to 0, and it might still be relevant for this question.
Yeah, because in some ways, this is the idealized version, and any real system is going to have just some numbers for these things. The question is, will those numbers work?

Do you need more time? Let's go ahead and make our best guess. In order to get an ultrasensitive response of y to the concentration of x, what is it that you need here? All right, ready. Three, two, one.

So we got some A's, some B's, some D's. C is not very popular. OK, so it seems like-- well, why don't we just go ahead and spend 30 seconds? Turn to your neighbor. You should be able to find someone who disagrees with you.

[SIDE CONVERSATIONS]

PROFESSOR: OK, why don't we go ahead and reconvene, and you can give your argument to the group? All right, does somebody want to volunteer an explanation? Yeah.

AUDIENCE: I haven't had time to test this explanation.

PROFESSOR: That's just fine. Well, your neighbor will appreciate it.

AUDIENCE: So I was thinking that the time scale of that axis, really on the second graph, is set by how quickly that slope, the curving slope, rises?
PROFESSOR: OK, so "time scale" I'm a little bit worried about.

AUDIENCE: Sorry, concentration scale.

PROFESSOR: OK, all right.

AUDIENCE: It's set by how quickly the second curve rises once you go above 0. That's the only feature on that graph. And for a Michaelis-Menten curve, that should be determined by K.

PROFESSOR: K-- which K?

AUDIENCE: Oh, the Kd.

PROFESSOR: Yes.

AUDIENCE: Oh, no.

PROFESSOR: OK, right. So I think I like everything you're saying, although I think you're about to answer the next one-- although the next one, I think, is in some ways the hardest one, so you're ready.

AUDIENCE: Right, which we determined should be a lot more than Kw.

PROFESSOR: Right, so we know that Kd should be a lot more than Kw.

AUDIENCE: And I think that this should be a decent amount less than-- sorry, I think that Kd should be a decent amount smaller than w total.

PROFESSOR: All right, so you think Kd should be a decent amount smaller than w total. OK, so you really are answering the next question.

AUDIENCE: But no, I think this is important.
PROFESSOR: Oh, no, I agree it's important. But--

AUDIENCE: Basically speaking, if you let w total become too small compared to everything else, all the relevant quantities, then this feature, which is really what determines the strength of the ultrasensitivity, shrinks to zero size. That's what I wanted to say.

PROFESSOR: OK, yeah. I very much like that explanation. And since we have it on tape, we can now play it again in a few minutes, and then-- I think you're making a very nice argument, and actually, we'll go ahead and even--

Does anybody want to argue against? Yeah.

So what he's arguing is that-- so he's actually arguing that it's this over here: what sets the scale over which this thing increases is the Kd.
446 00:26:05,090 --> 00:26:07,694 So there's some sense that you want 447 00:26:07,694 --> 00:26:09,860 this scale to be of the order of this scale or maybe 448 00:26:09,860 --> 00:26:11,630 even a bit shorter. 449 00:26:11,630 --> 00:26:17,040 Is that-- and I agree that this is actually one that generates 450 00:26:17,040 --> 00:26:19,090 a lot of argument and discussion in general, 451 00:26:19,090 --> 00:26:25,180 because I think that reasonable people can disagree. 452 00:26:25,180 --> 00:26:28,144 I think this is actually-- I will side with you on this. 453 00:26:35,250 --> 00:26:41,250 But what about this, because you haven't said anything about Kw. 454 00:26:41,250 --> 00:26:45,140 AUDIENCE: I think that combining one and three-- 455 00:26:45,140 --> 00:26:46,285 PROFESSOR: OK, OK, fine. 456 00:26:46,285 --> 00:26:48,160 AUDIENCE: That's why I was trying to answer-- 457 00:26:48,160 --> 00:26:49,000 PROFESSOR: OK, no. 458 00:26:49,000 --> 00:26:50,860 OK, fair. 459 00:26:50,860 --> 00:26:52,500 But it's just too many logical leaps. 460 00:26:52,500 --> 00:26:53,320 I think it's true. 461 00:26:53,320 --> 00:26:55,710 And there's a reason I ordered the questions in this way, 462 00:26:55,710 --> 00:26:58,720 so that is so this one you could-- Otherwise, 463 00:26:58,720 --> 00:27:01,390 I agree that if you combine these things, you do get-- 464 00:27:01,390 --> 00:27:03,256 and which one do you get? 465 00:27:03,256 --> 00:27:05,332 AUDIENCE: I think A. 466 00:27:05,332 --> 00:27:07,040 PROFESSOR: Is that what other people got? 467 00:27:10,960 --> 00:27:13,820 OK, I agree that actually you can get to A here 468 00:27:13,820 --> 00:27:16,490 from the combination of C and B here. 469 00:27:16,490 --> 00:27:18,940 But it's a little bit crazy. 470 00:27:18,940 --> 00:27:21,520 Is there a more direct explanation we can get? 471 00:27:21,520 --> 00:27:22,380 Yeah, go ahead.
472 00:27:22,380 --> 00:27:26,756 AUDIENCE: Well, if you don't have a lot of w, 473 00:27:26,756 --> 00:27:30,853 then your binding is too sensitive in a way. 474 00:27:30,853 --> 00:27:32,436 You'll add a little bit of x, and then 475 00:27:32,436 --> 00:27:34,227 the binding will be all used up, and you'll 476 00:27:34,227 --> 00:27:36,290 be making y immediately. 477 00:27:36,290 --> 00:27:40,190 You want a lot of w to be able to soak up a sizeable amount-- 478 00:27:40,190 --> 00:27:42,920 PROFESSOR: Yeah, and then maybe in the back. 479 00:27:42,920 --> 00:27:48,318 AUDIENCE: So originally, I agreed, but that just 480 00:27:48,318 --> 00:27:52,800 makes you switch, like Evy said, you can sequester less, 481 00:27:52,800 --> 00:27:58,415 but that still gives you a sense of [INAUDIBLE]. 482 00:27:58,415 --> 00:28:00,456 You're more sensitive to fluctuations if you have 483 00:28:00,456 --> 00:28:04,970 a lower sequestration capacity, but it doesn't really change-- 484 00:28:04,970 --> 00:28:06,627 You still have that switch-- 485 00:28:06,627 --> 00:28:07,460 PROFESSOR: Sure, OK. 486 00:28:07,460 --> 00:28:09,610 So I think that this is actually [INAUDIBLE], 487 00:28:09,610 --> 00:28:13,610 because there are two ways in which having more of this wt helps. 488 00:28:13,610 --> 00:28:17,220 One is that it pushes this boundary further to the right. 489 00:28:17,220 --> 00:28:20,809 But the other is that it makes it a better sequestering agent 490 00:28:20,809 --> 00:28:22,850 because there are just more of these w's to bind. 491 00:28:22,850 --> 00:28:24,891 And if you go and do the calculations, 492 00:28:24,891 --> 00:28:28,030 and I encourage you to do it, what you actually find 493 00:28:28,030 --> 00:28:31,310 is that the concentration of free x 494 00:28:31,310 --> 00:28:35,280 is given approximately by x total divided 495 00:28:35,280 --> 00:28:40,870 by wt over K sub w.
496 00:28:40,870 --> 00:28:45,370 So what you see is that the free concentration of x-- and this 497 00:28:45,370 --> 00:28:46,900 is in the region. 498 00:28:46,900 --> 00:28:54,360 This is for x total less than and maybe significantly 499 00:28:54,360 --> 00:28:57,030 less than w total. 500 00:28:57,030 --> 00:28:59,350 But the idea is that when you're in this sequestering 501 00:28:59,350 --> 00:29:02,200 regime, what we see is that the concentration of x 502 00:29:02,200 --> 00:29:05,020 is going to increase linearly with the total x. 503 00:29:05,020 --> 00:29:10,800 But it's going to be sort of suppressed by this amount 504 00:29:10,800 --> 00:29:16,060 here, where this ratio is much larger than 1. 505 00:29:16,060 --> 00:29:19,200 You want to be much more than 1, wt over Kw. 506 00:29:19,200 --> 00:29:22,280 And this is telling you how good of a sequesterer this guy, 507 00:29:22,280 --> 00:29:25,160 this w is, because you want to bind tightly 508 00:29:25,160 --> 00:29:26,787 and you need to have kind of enough 509 00:29:26,787 --> 00:29:28,828 to keep the free concentration of x from growing. 510 00:29:38,140 --> 00:29:39,050 Yes. 511 00:29:39,050 --> 00:29:41,050 AUDIENCE: So that's what I was thinking as well, 512 00:29:41,050 --> 00:29:45,018 but then I was wondering if there's too much of w, 513 00:29:45,018 --> 00:29:49,482 then wouldn't x never bind completely, but always 514 00:29:49,482 --> 00:29:50,735 get sequestered? 515 00:29:50,735 --> 00:29:52,610 PROFESSOR: So this is a very important point, 516 00:29:52,610 --> 00:29:57,790 which is that it needs to be possible for the cell to make 517 00:29:57,790 --> 00:29:59,830 enough x to overcome the w. 
518 00:29:59,830 --> 00:30:03,700 If there's so much of the sequestering protein that you 519 00:30:03,700 --> 00:30:07,016 cannot even get beyond that, then you're never-- 520 00:30:07,016 --> 00:30:09,640 It's true that you might have a nice ultra-sensitive switch out 521 00:30:09,640 --> 00:30:11,280 there, but you just can't get there. 522 00:30:11,280 --> 00:30:15,737 So yeah, this is certainly relevant as well. 523 00:30:21,210 --> 00:30:26,430 All right, incidentally, if you plot free x as a function of xT 524 00:30:26,430 --> 00:30:35,830 on a log-log scale, the question is 525 00:30:35,830 --> 00:30:38,020 what is it going to look like? 526 00:30:38,020 --> 00:30:40,770 Now, we know that on a linear scale, 527 00:30:40,770 --> 00:30:44,330 it looks something like that. 528 00:30:44,330 --> 00:30:51,591 So if it's on a log-log scale, the question is-- 529 00:30:51,591 --> 00:30:53,340 And log-log is nice because you can really 530 00:30:53,340 --> 00:30:56,270 get the full dynamic range of what's going on. 531 00:30:56,270 --> 00:31:00,280 If you plot x, and this is free x, 532 00:31:00,280 --> 00:31:16,379 as a function of x total in this regime for large x and x total, 533 00:31:16,379 --> 00:31:18,045 what is it going to end up looking like? 534 00:31:25,242 --> 00:31:27,230 AUDIENCE: It's a straight line. 535 00:31:27,230 --> 00:31:29,401 PROFESSOR: It's a straight line. 536 00:31:29,401 --> 00:31:31,900 And eventually, it's going to be a straight line with slope 1, 537 00:31:31,900 --> 00:31:38,320 actually, because they're just linearly related to each other. 538 00:31:38,320 --> 00:31:41,680 Now, down in the low regime, what 539 00:31:41,680 --> 00:31:46,880 is it going to look like well below this region of wt? 540 00:31:51,200 --> 00:31:55,245 AUDIENCE: You should also get a straight line with slope 1. 541 00:31:55,245 --> 00:31:56,370 PROFESSOR: With what slope? 542 00:31:56,370 --> 00:32:00,420 AUDIENCE: With 1 over [INAUDIBLE].
543 00:32:00,420 --> 00:32:03,340 PROFESSOR: OK, so why does everybody agree? 544 00:32:03,340 --> 00:32:08,840 No, we have a disagreeable-- Right, so what's the-- 545 00:32:08,840 --> 00:32:11,976 AUDIENCE: It's going to be also slope 1, but just lower. 546 00:32:11,976 --> 00:32:12,850 PROFESSOR: OK, right. 547 00:32:12,850 --> 00:32:14,680 So it's actually also slope 1, but it has 548 00:32:14,680 --> 00:32:17,349 a lower-- This is dangerous. 549 00:32:17,349 --> 00:32:19,140 This is why I brought this up, because it's 550 00:32:19,140 --> 00:32:21,580 really easy to look at these things and think that-- Yeah, 551 00:32:21,580 --> 00:32:25,940 because they're still linearly related to each other. 552 00:32:25,940 --> 00:32:30,310 So when you take the logs, it just affects the level. 553 00:32:30,310 --> 00:32:32,950 So you still get the same slope, but it's 554 00:32:32,950 --> 00:32:34,950 going to be down here somewhere. 555 00:32:34,950 --> 00:32:38,130 And then in this regime, you get this ultra-- this thing 556 00:32:38,130 --> 00:32:40,120 where it goes up suddenly. 557 00:32:40,120 --> 00:32:42,280 One thing that I strongly encourage everyone to do 558 00:32:42,280 --> 00:32:44,070 is, in these sorts of problems, it's 559 00:32:44,070 --> 00:32:48,420 wonderful to spend time-- Oh, sorry. 560 00:32:48,420 --> 00:32:49,035 This is log 561 00:32:52,810 --> 00:32:55,600 of free x, log of xT. 562 00:32:55,600 --> 00:32:58,930 It's really very valuable to plot things 563 00:32:58,930 --> 00:33:02,939 in multiple different ways, by hand or by the computer or both 564 00:33:02,939 --> 00:33:04,980 or whatnot, just to make sure that you're keeping 565 00:33:04,980 --> 00:33:06,580 track of what's going on. 566 00:33:06,580 --> 00:33:10,400 Because often what you see and what you think 567 00:33:10,400 --> 00:33:12,880 is very different depending on what you plot.
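Since the discussion above encourages computing and plotting this in multiple ways, here is a minimal Python sketch of the same point. This is my own illustration, not course code, and the parameter values are made up: it solves the exact binding quadratic for free x, checks the suppression by wT/Kw in the deep-sequestration regime, and shows free x jumping between the two slope-1 branches once x total passes w total.

```python
import math

def free_x(x_total, w_total, Kw):
    """Free x for the binding reaction x + w <-> xw with dissociation
    constant Kw, from conservation of x and w (exact quadratic root)."""
    b = w_total + Kw - x_total
    return (-b + math.sqrt(b * b + 4.0 * Kw * x_total)) / 2.0

Kw, wT = 0.01, 100.0  # tight binder, lots of it: wT/Kw = 10^4 >> 1

# Deep-sequestration regime (x_total << w_total): free x is suppressed
# by roughly the factor wT/Kw, but still grows linearly in x_total --
# slope 1 on a log-log plot, just shifted way down.
for xT in (0.01, 0.1, 1.0):
    print(xT, free_x(xT, wT, Kw), xT / (1 + wT / Kw))

# Well past the threshold, free x ~ x_total - w_total: slope 1 again,
# now at full level.
print(free_x(1000.0, wT, Kw))  # close to 900
```

The two computed columns agree in the low regime, and the sudden rise between the two slope-1 branches is the ultrasensitive region being discussed.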
568 00:33:12,880 --> 00:33:15,510 And you'd like to be able to see your problem from as many 569 00:33:15,510 --> 00:33:16,718 different angles as possible. 570 00:33:19,487 --> 00:33:21,070 All right, I think that we've probably 571 00:33:21,070 --> 00:33:23,160 spent about as much time on this as we ought to. 572 00:33:23,160 --> 00:33:26,850 But are there any other questions 573 00:33:26,850 --> 00:33:32,150 about how this is going to come about? 574 00:33:32,150 --> 00:33:32,650 Yes. 575 00:33:32,650 --> 00:33:35,560 AUDIENCE: Are there any negative autoregulation systems 576 00:33:35,560 --> 00:33:39,440 that can use sequestration to get a switch-like [INAUDIBLE]? 577 00:33:39,440 --> 00:33:41,990 PROFESSOR: OK, that's an interesting question, 578 00:33:41,990 --> 00:33:44,814 although we have to be careful about-- If you really 579 00:33:44,814 --> 00:33:46,730 want it to be more switch-like, you'd probably 580 00:33:46,730 --> 00:33:49,050 use positive autoregulation. 581 00:33:49,050 --> 00:33:50,960 And I'm not aware of a case where 582 00:33:50,960 --> 00:33:58,404 this has been combined, although-- It's 583 00:33:58,404 --> 00:33:59,320 likely there are some. 584 00:33:59,320 --> 00:34:00,278 I just don't know them. 585 00:34:04,040 --> 00:34:06,829 I'm going to switch gears to this autoregulation, which 586 00:34:06,829 --> 00:34:09,120 is something, of course, that you guys just read about. 587 00:34:09,120 --> 00:34:13,440 And it looked like your understanding of it was solid. 588 00:34:13,440 --> 00:34:16,719 But we want to move through these different ideas. 589 00:34:16,719 --> 00:34:19,340 First, this idea of a network motif. 590 00:34:19,340 --> 00:34:22,500 And this is just the simplest example of a network motif. 591 00:34:22,500 --> 00:34:25,480 And it's so simple, we often don't even 592 00:34:25,480 --> 00:34:28,850 call it a network motif.
593 00:34:28,850 --> 00:34:33,850 But the idea here is that we have some network, 594 00:34:33,850 --> 00:34:37,690 and it has maybe N nodes and E edges. 595 00:34:42,550 --> 00:34:46,310 And the example that they give, that Uri gives in his book, N 596 00:34:46,310 --> 00:34:55,760 was 420, and E was 520. 598 00:34:56,934 --> 00:34:58,600 And there's a basic question that if you 599 00:34:58,600 --> 00:35:03,140 have a network with N nodes and then you have directed edges-- 600 00:35:03,140 --> 00:35:07,010 these are edges that have an arrow pointing on one end. 601 00:35:07,010 --> 00:35:11,180 Now, in that case, and if you allow self-directed 602 00:35:11,180 --> 00:35:14,870 edges, how many possible edges are there? 603 00:35:14,870 --> 00:35:17,920 Does anybody remember what this looks like? 604 00:35:22,840 --> 00:35:24,820 AUDIENCE: More than 20 possible, right? 605 00:35:24,820 --> 00:35:25,780 PROFESSOR: What's that? 606 00:35:25,780 --> 00:35:26,780 AUDIENCE: There'd be more than 20 possible. 607 00:35:26,780 --> 00:35:29,154 PROFESSOR: Right, not even in terms of the actual number. 608 00:35:29,154 --> 00:35:32,157 Just in terms of N, for example. 609 00:35:32,157 --> 00:35:36,380 AUDIENCE: Total or just self-directed? 610 00:35:36,380 --> 00:35:38,810 PROFESSOR: Total self-directed. 611 00:35:38,810 --> 00:35:43,510 Or sorry, total directed edges, total number 612 00:35:43,510 --> 00:35:46,307 of possible directed edges, if we included self edges, 613 00:35:46,307 --> 00:35:47,530 I guess. 614 00:35:47,530 --> 00:35:49,330 N squared. 615 00:35:49,330 --> 00:35:52,060 And you can think about this in multiple ways. 616 00:35:52,060 --> 00:35:55,730 One is just that, well, you can start at any of the N nodes. 617 00:35:55,730 --> 00:35:57,350 And you can end at any of the N nodes, 618 00:35:57,350 --> 00:35:59,120 and that gives you N squared. 619 00:35:59,120 --> 00:36:00,170 Right?
620 00:36:00,170 --> 00:36:03,220 But there's another way that you could-- For instance, 621 00:36:03,220 --> 00:36:03,965 this is E max. 622 00:36:07,070 --> 00:36:08,610 E max is N squared. 623 00:36:08,610 --> 00:36:12,190 You could also think about this, if you like, as just-- say, 624 00:36:12,190 --> 00:36:20,650 well, as they point out, there's sort of-- 1/2 N times N minus 1 625 00:36:20,650 --> 00:36:23,760 is the total number of pairs in the network. 626 00:36:23,760 --> 00:36:25,449 And then for each of the pairs, you 627 00:36:25,449 --> 00:36:27,240 can have an edge pointing either direction, 628 00:36:27,240 --> 00:36:30,100 so that gives you a 2. 629 00:36:30,100 --> 00:36:33,510 Plus, you can have N self edges, right? 630 00:36:35,985 --> 00:36:37,860 Now, of course, these are just different ways 631 00:36:37,860 --> 00:36:39,957 of counting N squared. 632 00:36:39,957 --> 00:36:42,290 But it's useful to just think about it in different ways 633 00:36:42,290 --> 00:36:43,790 to make sure that you're comfortable 634 00:36:43,790 --> 00:36:45,670 with the combinatorics of it. 635 00:36:45,670 --> 00:36:50,136 In particular, because next week or the week after that, 636 00:36:50,136 --> 00:36:51,760 we're going to be talking about network 637 00:36:51,760 --> 00:36:54,160 motifs in more detail, in particular the feed-forward 638 00:36:54,160 --> 00:36:54,660 loop. 639 00:36:54,660 --> 00:36:55,990 And then we really have to keep track 640 00:36:55,990 --> 00:36:57,614 of these sorts of combinatorics better. 641 00:37:00,500 --> 00:37:02,530 The way that Uri thinks about this is he says, 642 00:37:02,530 --> 00:37:04,340 all right, well, we're going to invoke 643 00:37:04,340 --> 00:37:11,070 this null model of a network, this Erdos-Renyi network, where 644 00:37:11,070 --> 00:37:18,000 we just simply say we're going to assign the E edges randomly 645 00:37:18,000 --> 00:37:20,039 across the E max possible.
646 00:37:20,039 --> 00:37:21,580 So of all possible edges, we're going 647 00:37:21,580 --> 00:37:23,980 to place the edges randomly and then 648 00:37:23,980 --> 00:37:25,570 that generates some random network. 649 00:37:29,290 --> 00:37:31,820 So this is basically what we typically 650 00:37:31,820 --> 00:37:34,890 call a random network. 651 00:37:34,890 --> 00:37:37,540 And that's going to allow us to define some null model, 652 00:37:37,540 --> 00:37:39,340 so that if everything were random, we 653 00:37:39,340 --> 00:37:43,515 can ask, for example, how many self edges do we expect to get? 654 00:37:49,544 --> 00:37:51,960 Well, one way to construct this sort of Erdos-Renyi network-- 655 00:37:51,960 --> 00:37:52,456 Yeah. 656 00:37:52,456 --> 00:37:53,955 AUDIENCE: But how do you know, if you 657 00:37:53,955 --> 00:37:56,672 have other kinds of constraints in the system, how do 658 00:37:56,672 --> 00:38:02,609 you know that transcription might require in some cases 659 00:38:02,609 --> 00:38:03,275 autoregulation-- 660 00:38:06,230 --> 00:38:09,791 The answer that was given that was a good answer was that 661 00:38:09,791 --> 00:38:12,152 evolution is the only thing that can-- 662 00:38:12,152 --> 00:38:14,610 if you find a network motif, that has to do with evolution-- 663 00:38:14,610 --> 00:38:16,660 PROFESSOR: So that is the argument that Uri makes, 664 00:38:16,660 --> 00:38:19,034 and you're maybe saying maybe that's not a good argument. 665 00:38:19,034 --> 00:38:23,449 And what Uri is saying is, well, if you see these networks 666 00:38:23,449 --> 00:38:24,990 more frequently than you would expect 667 00:38:24,990 --> 00:38:26,406 by chance-- and of course, you can 668 00:38:26,406 --> 00:38:29,000 define what you mean by chance-- then 669 00:38:29,000 --> 00:38:31,930 you can say, oh, maybe it was selected for, it was evolved.
670 00:38:31,930 --> 00:38:34,850 And I think that, in the case of this autoregulation, 671 00:38:34,850 --> 00:38:37,070 I think the results are not very sensitive. 672 00:38:37,070 --> 00:38:39,630 But I think that this question of what the right null model 673 00:38:39,630 --> 00:38:44,960 is, is a real issue, especially when you're talking about some 674 00:38:44,960 --> 00:38:45,960 of these other networks. 675 00:38:45,960 --> 00:38:47,918 And what we'll see for the feed-forward loop is 676 00:38:47,918 --> 00:38:50,820 that you have to decide-- well, one thing we're going to see 677 00:38:50,820 --> 00:38:55,610 is that an Erdos-Renyi random network is very much not an accurate 678 00:38:55,610 --> 00:38:58,335 description of real transcription networks. 679 00:38:58,335 --> 00:39:00,710 So then you could say, well, that's not a good null model 680 00:39:00,710 --> 00:39:01,720 to use. 681 00:39:01,720 --> 00:39:04,771 And so we'll definitely spend some more time thinking 682 00:39:04,771 --> 00:39:05,270 about this. 683 00:39:08,821 --> 00:39:10,570 In the context of the Erdos-Renyi network, 684 00:39:10,570 --> 00:39:13,700 though, one way that you can generate it 685 00:39:13,700 --> 00:39:16,140 is that for each of the E max possible, 686 00:39:16,140 --> 00:39:18,140 each of these total number of edges, 687 00:39:18,140 --> 00:39:20,040 there is some probability that you're 688 00:39:20,040 --> 00:39:21,980 going to actually place a real edge there. 689 00:39:21,980 --> 00:39:25,659 And that probability is just E over N squared. 690 00:39:25,659 --> 00:39:26,700 E is the number of edges. 691 00:39:26,700 --> 00:39:29,040 N squared is the number of possible edges.
692 00:39:29,040 --> 00:39:32,720 So if you just create a random network in that way, 693 00:39:32,720 --> 00:39:37,640 then this is a manifestation of a random network that 694 00:39:37,640 --> 00:39:41,320 has at least the basic properties, the same number 695 00:39:41,320 --> 00:39:45,740 of edges, as our network. 696 00:39:45,740 --> 00:39:50,520 So from this, you say, well, how many self edges would 697 00:39:50,520 --> 00:39:55,070 you expect in this world? 698 00:39:55,070 --> 00:40:01,380 And you'd say, well, in that case-- 699 00:40:01,380 --> 00:40:03,760 There are two ways of thinking about this. 700 00:40:03,760 --> 00:40:08,150 So you can either say, we're going to take, 701 00:40:08,150 --> 00:40:13,490 for each of the N nodes, there's one 702 00:40:13,490 --> 00:40:18,040 possible self-directed arrow. 703 00:40:18,040 --> 00:40:20,990 And for each of those cases, we can just multiply this by P. 704 00:40:20,990 --> 00:40:27,380 And this gives us E over N. 705 00:40:27,380 --> 00:40:33,370 You could also think about it as-- There 706 00:40:33,370 --> 00:40:36,387 are multiple ways, once again, of doing the counting. 707 00:40:36,387 --> 00:40:38,470 In an Erdos-Renyi network, you can say, all right, 708 00:40:38,470 --> 00:40:44,300 you would expect to get roughly E over N autoregulatory loops. 709 00:40:44,300 --> 00:40:45,360 And this is of order 1. 710 00:40:48,130 --> 00:40:53,320 So this is 1.2, in the case of the network that Uri analyzes 711 00:40:53,320 --> 00:40:56,482 in his book and his paper. 712 00:40:56,482 --> 00:40:58,815 Whereas, how many were actually observed in the network 713 00:40:58,815 --> 00:40:59,481 that he studied? 714 00:41:03,034 --> 00:41:03,950 AUDIENCE: 40. 715 00:41:03,950 --> 00:41:05,910 PROFESSOR: There were 40, right? 716 00:41:05,910 --> 00:41:07,820 So in the observed transcriptional network-- 717 00:41:07,820 --> 00:41:13,550 and this is in E. coli-- he found that there were 40.
718 00:41:13,550 --> 00:41:15,730 And the basic statement here is that 40 is just 719 00:41:15,730 --> 00:41:18,904 much larger than 1.2. 720 00:41:18,904 --> 00:41:20,820 And you can quantify this a little bit better, 721 00:41:20,820 --> 00:41:25,310 because really you would expect 1.2 plus or minus 722 00:41:25,310 --> 00:41:29,310 the square root of this, in a random network like this. 723 00:41:29,310 --> 00:41:33,960 That is, you'd expect 0, 1, 2, maybe 3. 724 00:41:33,960 --> 00:41:36,620 So 40 is definitely not what you would expect based 725 00:41:36,620 --> 00:41:38,885 on an Erdos-Renyi network. 726 00:41:38,885 --> 00:41:41,010 So this is the sense in which it's a network motif. 727 00:41:41,010 --> 00:41:44,870 It's that the observed network just 728 00:41:44,870 --> 00:41:47,820 doesn't look like a random network in this very 729 00:41:47,820 --> 00:41:50,910 particular sense. 730 00:41:50,910 --> 00:41:54,310 And of these 40, does anybody remember kind of the distribution 731 00:41:54,310 --> 00:41:58,290 between negative autoregulation and positive autoregulation? 732 00:41:58,290 --> 00:41:59,790 AUDIENCE: It was 30-10 or something. 733 00:41:59,790 --> 00:42:00,980 PROFESSOR: Yeah, I think it was like 34 734 00:42:00,980 --> 00:42:02,130 and 6 was my recollection. 735 00:42:02,130 --> 00:42:03,330 I didn't write this down. 736 00:42:03,330 --> 00:42:08,190 So most of these guys have a form of x inhibiting x. 737 00:42:08,190 --> 00:42:10,850 But some had the form of x activating x. 738 00:42:10,850 --> 00:42:14,809 So this was something like 34 and 6. 739 00:42:14,809 --> 00:42:17,100 What you would say then is that negative autoregulation 740 00:42:17,100 --> 00:42:20,830 is a very strong network motif, whereas positive autoregulation 741 00:42:20,830 --> 00:42:23,610 is a weaker network motif, but still 742 00:42:23,610 --> 00:42:25,452 something that occurs perhaps more than you 743 00:42:25,452 --> 00:42:26,410 would expect by chance.
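Those expected-by-chance numbers are easy to check numerically. Here is a short sketch of my own, not from the course (the random seed and trial count are arbitrary choices): it verifies the N-squared counting, then repeatedly draws just the N self-edge slots of an Erdos-Renyi network with probability p = E/N² and tallies how many self edges appear.

```python
import random

N, E = 420, 520   # nodes and edges in the E. coli transcription network
p = E / (N * N)   # Erdos-Renyi: each possible directed edge drawn w.p. p

# Counting check from above: ordered distinct pairs plus self edges = N^2.
assert 2 * (N * (N - 1) // 2) + N == N * N

expected_self = N * p  # = E/N, about 1.24 self edges expected by chance

# Simulate only the N diagonal (self-edge) slots, many times over.
random.seed(1)
trials = 2000
counts = [sum(random.random() < p for _ in range(N)) for _ in range(trials)]

print(expected_self)        # ~1.24
print(sum(counts) / trials) # sample mean, close to expected_self
print(max(counts))          # a handful at the very most -- never near 40
```

The fluctuation scale is about the square root of 1.24, so observing 40 self edges is wildly incompatible with this null model, which is exactly the network-motif claim.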
744 00:42:32,060 --> 00:42:34,570 Are there any questions about where that argument came 745 00:42:34,570 --> 00:42:37,980 from, other concerns about it? 746 00:42:44,216 --> 00:42:46,340 All right, then from this, then Uri says, OK, well, 747 00:42:46,340 --> 00:42:49,290 maybe these things evolved for a reason. 748 00:42:49,290 --> 00:42:53,690 And so what we'll do in the next half hour is just argue or discuss 749 00:42:53,690 --> 00:42:58,260 what possible evolutionary advantages 750 00:42:58,260 --> 00:43:00,913 such a network motif might have. 751 00:43:00,913 --> 00:43:01,412 Yeah. 752 00:43:01,412 --> 00:43:04,196 AUDIENCE: Can you really propose evolutionary explanations 753 00:43:04,196 --> 00:43:04,909 like that? 754 00:43:04,909 --> 00:43:06,950 PROFESSOR: Oh, you can propose whatever you want. 755 00:43:06,950 --> 00:43:10,934 AUDIENCE: I know, but it doesn't have much value. 756 00:43:10,934 --> 00:43:12,375 It's unquantifiable. 757 00:43:12,375 --> 00:43:13,000 PROFESSOR: Yes. 758 00:43:13,000 --> 00:43:15,090 No, I'd say this is a major issue in a lot 759 00:43:15,090 --> 00:43:17,160 of evolutionary arguments. 760 00:43:17,160 --> 00:43:22,690 And I would say that the purpose of ideas and hypotheses 761 00:43:22,690 --> 00:43:25,570 is to get you to go and make new measurements. 762 00:43:25,570 --> 00:43:27,410 And so right now, we're at the stage of, OK, well, 763 00:43:27,410 --> 00:43:29,440 maybe these things are occurring more frequently than you 764 00:43:29,440 --> 00:43:30,650 would expect by chance. 765 00:43:30,650 --> 00:43:32,830 So now, we can just sit down and think, oh, 766 00:43:32,830 --> 00:43:34,450 what advantage might it give. 767 00:43:34,450 --> 00:43:36,800 And then we can go and try to experimentally ask 768 00:43:36,800 --> 00:43:39,449 whether those advantages are at least manifested 769 00:43:39,449 --> 00:43:40,115 in real systems.
770 00:43:40,115 --> 00:43:42,430 It doesn't prove that that's why they evolved, 771 00:43:42,430 --> 00:43:45,460 but it makes you feel more comfortable 772 00:43:45,460 --> 00:43:45,960 with the argument. 773 00:43:45,960 --> 00:43:48,702 Ultimately, we assign some-- we have some Bayesian probability 774 00:43:48,702 --> 00:43:49,660 somewhere in our brain. 775 00:43:49,660 --> 00:43:51,330 And the more evidence that we can 776 00:43:51,330 --> 00:43:54,700 accumulate that's consistent with these ideas, the more 777 00:43:54,700 --> 00:43:56,510 likely that we think it is. 778 00:43:56,510 --> 00:43:59,100 But in general, you don't prove things 779 00:43:59,100 --> 00:44:01,120 in this sort of evolutionary space the way 780 00:44:01,120 --> 00:44:04,426 you prove things in many other fields. 781 00:44:04,426 --> 00:44:04,925 Yeah. 782 00:44:04,925 --> 00:44:08,760 AUDIENCE: I feel like it's hard to call this an argument. 783 00:44:08,760 --> 00:44:11,754 It feels more like just an observation. 784 00:44:11,754 --> 00:44:13,670 PROFESSOR: Which thing is an argument versus-- 785 00:44:13,670 --> 00:44:18,445 AUDIENCE: I guess the thing is it should be evolutionarily 786 00:44:18,445 --> 00:44:19,885 advantageous, that's an argument, 787 00:44:19,885 --> 00:44:22,010 but essentially, the whole thing is an observation, 788 00:44:22,010 --> 00:44:25,340 and then there's a little bit of an argument in the end. 789 00:44:25,340 --> 00:44:27,400 PROFESSOR: Yeah, I will let each person decide 790 00:44:27,400 --> 00:44:28,650 what fraction is argument and what fraction observation. 791 00:44:31,010 --> 00:44:33,010 Yeah, I don't feel especially strongly about it. 792 00:44:35,630 --> 00:44:38,980 My guess is that it did evolve because it 793 00:44:38,980 --> 00:44:40,970 provides some useful function. 794 00:44:40,970 --> 00:44:44,210 And therefore, I think it's valuable to explore 795 00:44:44,210 --> 00:44:45,830 what those useful functions might be.
796 00:44:45,830 --> 00:44:48,650 But for example, it's very hard to know 797 00:44:48,650 --> 00:44:52,580 which of these explanations-- this thing about 798 00:44:52,580 --> 00:44:54,800 increasing the response time, or sorry, 799 00:44:54,800 --> 00:44:56,690 increasing the response rate as compared 800 00:44:56,690 --> 00:45:00,370 to increasing robustness, how do you decide 801 00:45:00,370 --> 00:45:02,369 which one's more important? 802 00:45:02,369 --> 00:45:04,160 Then I think, once again, reasonable people 803 00:45:04,160 --> 00:45:05,743 can disagree about these things, yeah. 804 00:45:11,984 --> 00:45:13,900 So first, negative autoregulation, because this 805 00:45:13,900 --> 00:45:15,816 is the one that is the stronger network motif. 806 00:45:18,910 --> 00:45:22,910 I think that the book gives a nice explanation of why 807 00:45:22,910 --> 00:45:27,445 it decreases the response time. 808 00:45:30,580 --> 00:45:31,880 OK, we can just ask. 809 00:45:34,620 --> 00:45:39,370 OK, response time-- and this is for negative autoregulation. 810 00:45:42,450 --> 00:45:45,360 Response time goes down. 811 00:45:45,360 --> 00:45:59,130 And is this for turning on, off, both, or maybe neither, or E, 812 00:45:59,130 --> 00:46:00,220 don't know? 813 00:46:00,220 --> 00:46:03,550 And I'll give you just 10 seconds to think about this. 814 00:46:03,550 --> 00:46:05,470 It's nice if you just remember it, 815 00:46:05,470 --> 00:46:08,810 but it's also maybe even better if you can figure it out. 816 00:46:08,810 --> 00:46:10,821 Because in a week, you're probably 817 00:46:10,821 --> 00:46:12,320 not going to just have it memorized, 818 00:46:12,320 --> 00:46:15,610 but you should be able to think through the logic of it 819 00:46:15,610 --> 00:46:18,734 and understand why this is going to be what it is. 820 00:46:26,150 --> 00:46:29,580 All right, so the question is, negative autoregulation, 821 00:46:29,580 --> 00:46:30,740 maybe it does something.
822 00:46:30,740 --> 00:46:33,283 Maybe it decreases the response time. 823 00:46:33,283 --> 00:46:34,657 But does it decrease the response 824 00:46:34,657 --> 00:46:39,610 time for turning a gene on, for turning it off, for both, 825 00:46:39,610 --> 00:46:41,013 neither, or don't know? 826 00:46:41,013 --> 00:46:42,554 AUDIENCE: When you say turning it off, 827 00:46:42,554 --> 00:46:44,800 what exactly is the process you're imagining? 828 00:46:44,800 --> 00:46:49,340 PROFESSOR: I'm imagining a process where the expression 829 00:46:49,340 --> 00:46:51,540 turns off immediately. 830 00:46:51,540 --> 00:46:53,110 So there's a signal that just stops-- 831 00:46:53,110 --> 00:46:55,520 AUDIENCE: So that no transcription can go ahead. 832 00:46:55,520 --> 00:46:59,270 PROFESSOR: Right, so then it's just-- 833 00:46:59,270 --> 00:47:04,080 I chop up all the polymerases, and no more expression. 834 00:47:04,080 --> 00:47:07,180 But so a signal comes and tells the polymerases to stop making. 835 00:47:07,180 --> 00:47:07,680 Yeah. 836 00:47:12,850 --> 00:47:14,350 All right, so do you need more time? 837 00:47:18,430 --> 00:47:18,930 No. 838 00:47:18,930 --> 00:47:21,270 Ready, three, two, one. 839 00:47:25,630 --> 00:47:28,240 All right, so we actually are all over the place on this. 840 00:47:31,290 --> 00:47:33,210 OK, turn to your neighbor. 841 00:47:33,210 --> 00:47:35,885 And you should be able to explain 842 00:47:35,885 --> 00:47:37,885 one way or the other why this-- what is going on. 843 00:47:37,885 --> 00:49:06,880 [SIDE CONVERSATIONS] 844 00:49:06,880 --> 00:49:09,050 Let's go ahead and reconvene. 845 00:49:09,050 --> 00:49:11,150 I just want to remind everybody that when 846 00:49:11,150 --> 00:49:15,620 I say simple regulation, there's no autoregulation. 847 00:49:15,620 --> 00:49:18,140 It's just responding to a signal.
848 00:49:18,140 --> 00:49:24,050 That for a stable protein, the time to get to, say, 849 00:49:24,050 --> 00:49:26,840 for example, half saturating concentration 850 00:49:26,840 --> 00:49:29,210 here is defined by the cell generation time. 851 00:49:29,210 --> 00:49:33,170 And that's true for turning on and for turning off. 852 00:49:33,170 --> 00:49:34,810 And what was the strategy that you 853 00:49:34,810 --> 00:49:37,550 could use if you wanted to decrease the response 854 00:49:37,550 --> 00:49:41,344 time in this situation? 855 00:49:41,344 --> 00:49:43,010 AUDIENCE: Increase the degradation rate. 856 00:49:43,010 --> 00:49:46,140 PROFESSOR: Right, so you could increase the degradation rate. 857 00:49:46,140 --> 00:49:48,925 And does that affect the on, off, or both? 858 00:49:48,925 --> 00:49:49,550 AUDIENCE: Both. 859 00:49:49,550 --> 00:49:51,470 PROFESSOR: Both. 860 00:49:51,470 --> 00:49:52,992 But there's a cost, which was what? 861 00:49:52,992 --> 00:49:54,450 AUDIENCE: You have to make protein. 862 00:49:54,450 --> 00:49:55,960 PROFESSOR: Right, you have to make a bunch of protein, 863 00:49:55,960 --> 00:49:56,920 and then you're just going to chop it 864 00:49:56,920 --> 00:49:58,045 up right after you make it. 865 00:50:01,230 --> 00:50:04,540 There is a reasonable-- there is a way to make things faster, 866 00:50:04,540 --> 00:50:05,850 but it has a significant cost. 867 00:50:05,850 --> 00:50:08,710 The question is, if you have negative autoregulation-- 868 00:50:08,710 --> 00:50:14,520 so in this case, you have x that is repressing itself-- 869 00:50:14,520 --> 00:50:16,830 what is it that it's going to do? 870 00:50:16,830 --> 00:50:19,664 Is it going to affect the on time, the off time, or both? 871 00:50:19,664 --> 00:50:20,830 Let's just see where we are. 872 00:50:20,830 --> 00:50:23,390 Ready, three, two, one. 873 00:50:27,140 --> 00:50:30,200 OK, so it's interesting. 874 00:50:30,200 --> 00:50:32,620 We're moving towards C, it seems.
875 00:50:32,620 --> 00:50:34,370 OK, so can somebody give me an explanation 876 00:50:34,370 --> 00:50:44,214 for C. Did we read the chapter? 877 00:50:44,214 --> 00:50:45,672 AUDIENCE: Well, the chapter doesn't 878 00:50:45,672 --> 00:50:48,528 discuss the effect of negative autoregulation on turning off. 879 00:50:48,528 --> 00:50:50,432 I don't think it does. 880 00:50:50,432 --> 00:50:52,360 PROFESSOR: Wow, it's a good thing 881 00:50:52,360 --> 00:50:55,200 we're doing that here then. 882 00:50:55,200 --> 00:50:55,700 All right. 883 00:50:58,300 --> 00:51:00,760 So first of all, can somebody give the explanation? 884 00:51:00,760 --> 00:51:04,305 Does T on go up, down, or sideways? 885 00:51:04,305 --> 00:51:05,550 AUDIENCE: Up. 886 00:51:05,550 --> 00:51:08,035 PROFESSOR: So T on-- It's the time that goes down. 887 00:51:08,035 --> 00:51:09,160 I always get this confused. 888 00:51:09,160 --> 00:51:12,500 So time is the one that goes down, so the rate goes up. 889 00:51:12,500 --> 00:51:16,101 Negative autoregulation is faster turning on, we decided. 890 00:51:16,101 --> 00:51:16,600 Right? 891 00:51:21,664 --> 00:51:23,580 And does somebody want to give the explanation 892 00:51:23,580 --> 00:51:24,340 for why this is? 893 00:51:27,602 --> 00:51:32,060 AUDIENCE: Well, your equilibrium level is lower. 894 00:51:32,060 --> 00:51:33,930 PROFESSOR: Yeah, right. 895 00:51:33,930 --> 00:51:34,605 Yeah, exactly. 896 00:51:34,605 --> 00:51:36,438 Yes, this is actually surprisingly difficult 897 00:51:36,438 --> 00:51:40,290 to explain even though it's not a deep concept. 898 00:51:40,290 --> 00:51:43,640 But the idea is that you start out expressing a lot, 899 00:51:43,640 --> 00:51:46,502 so that if you had kept on expressing that high level, 900 00:51:46,502 --> 00:51:48,210 you would have done some exponential-- It 901 00:51:48,210 --> 00:51:51,140 would have taken a cell generation time from way up here. 
902 00:51:51,140 --> 00:51:54,040 But instead, what happens is that you shoot on up. 903 00:51:54,040 --> 00:51:57,510 But then, once you get up here, you repress expression. 904 00:51:57,510 --> 00:52:00,490 So then you get an effective thing, 905 00:52:00,490 --> 00:52:05,790 where the time it takes you to get to half of your equilibrium, that 906 00:52:05,790 --> 00:52:08,510 goes down. 907 00:52:08,510 --> 00:52:10,540 So T 1/2 here is shorter than there. 908 00:52:13,282 --> 00:52:13,781 Yes. 909 00:52:13,781 --> 00:52:16,560 AUDIENCE: So in negative autoregulation, 910 00:52:16,560 --> 00:52:20,440 you're increasing what the book calls beta, 911 00:52:20,440 --> 00:52:22,450 to have the same steady state? 912 00:52:22,450 --> 00:52:23,450 PROFESSOR: That's right. 913 00:52:23,450 --> 00:52:26,160 The initial beta, that rate, that maximal rate 914 00:52:26,160 --> 00:52:28,100 of expression, that goes up in a case 915 00:52:28,100 --> 00:52:29,330 of negative autoregulation. 916 00:52:29,330 --> 00:52:31,750 But then you start repressing expression 917 00:52:31,750 --> 00:52:35,301 once your concentration of x here 918 00:52:35,301 --> 00:52:36,550 gets to some reasonable level. 919 00:52:41,670 --> 00:52:48,090 So now we're just talking about production rate of x. 920 00:52:48,090 --> 00:52:51,900 And that's as a function of x. 921 00:52:51,900 --> 00:52:56,770 And the logic approximation is that it just 922 00:52:56,770 --> 00:52:59,850 is maximal until it gets to some K 923 00:52:59,850 --> 00:53:02,060 and then is completely repressed. 924 00:53:02,060 --> 00:53:04,370 So real versions will be much smoother, 925 00:53:04,370 --> 00:53:07,320 but this is just useful to start getting the intuition. 926 00:53:11,420 --> 00:53:14,780 And the idea is that you shoot up to this K, 927 00:53:14,780 --> 00:53:17,110 and then you stop expressing. 
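One way to check this speed-up claim numerically is to integrate the two models and compare half-rise times. The sketch below is my own illustration, not from the lecture; the parameter values are illustrative assumptions, with both circuits tuned to the same equilibrium.

```python
# Compare the half-rise time of simple regulation (constant production)
# with the logic approximation of negative autoregulation (production is
# beta until x reaches the repression threshold K, then zero).

def time_to_half(production, alpha, x_eq, dt=1e-4, t_max=10.0):
    """Euler-integrate dx/dt = production(x) - alpha*x from x = 0 and
    return the first time x reaches half of its equilibrium x_eq."""
    x, t = 0.0, 0.0
    while t < t_max:
        if x >= x_eq / 2:
            return t
        x += dt * (production(x) - alpha * x)
        t += dt
    return t_max

alpha = 1.0                    # dilution rate, ~1 per cell generation
K = 1.0                        # repression threshold = NAR equilibrium
beta_simple = alpha * K        # chosen so both share the same equilibrium
beta_nar = 10.0 * beta_simple  # strong promoter, shut off once x hits K

t_simple = time_to_half(lambda x: beta_simple, alpha, x_eq=K)
t_nar = time_to_half(lambda x: beta_nar if x < K else 0.0, alpha, x_eq=K)

print(t_simple)  # ~ ln(2)/alpha, about 0.69 generations
print(t_nar)     # much shorter: x climbs at the full rate beta_nar
```

With these assumed numbers the simply regulated gene needs about ln(2)/alpha to reach half saturation, while the autoregulated one gets there roughly ten times sooner, because it rises at the full rate of the strong promoter until the repression kicks in at K.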
928 00:53:17,110 --> 00:53:21,711 In this limit, actually, it's not even-- it's like a kink 929 00:53:21,711 --> 00:53:22,210 here. 930 00:53:22,210 --> 00:53:26,210 Because it just shoots up and then it turns around. 931 00:53:26,210 --> 00:53:29,040 But any real system will be smoother. 932 00:53:29,040 --> 00:53:30,045 Yes, question. 933 00:53:30,045 --> 00:53:32,520 AUDIENCE: So if you get to a certain equilibrium level, 934 00:53:32,520 --> 00:53:36,490 then in autoregulation, you would need a stronger promoter. 935 00:53:36,490 --> 00:53:39,040 PROFESSOR: Yes, you want a stronger promoter, because you 936 00:53:39,040 --> 00:53:41,750 really want to have high expression initially 937 00:53:41,750 --> 00:53:45,700 and then later repress that. 938 00:53:45,700 --> 00:53:50,070 So negative autoregulation allows you to speed up 939 00:53:50,070 --> 00:53:51,450 turning on, so T on goes down. 940 00:53:51,450 --> 00:53:54,075 AUDIENCE: Without increasing the promoter, 941 00:53:54,075 --> 00:53:56,075 which is a good thing, because someone would die 942 00:53:56,075 --> 00:53:58,110 if you increase the promoter. 943 00:53:58,110 --> 00:54:02,604 PROFESSOR: This is a very important point, 944 00:54:02,604 --> 00:54:04,020 which I was about to get to, which 945 00:54:04,020 --> 00:54:10,050 is that this is something that was done-- We could have done 946 00:54:10,050 --> 00:54:11,720 that without negative autoregulation 947 00:54:11,720 --> 00:54:13,512 by increasing the degradation rate. 948 00:54:13,512 --> 00:54:15,470 So the question that we're bringing up here 949 00:54:15,470 --> 00:54:17,860 is, is there that same cost that we 950 00:54:17,860 --> 00:54:20,915 were referring to before of this futile expression of protein 951 00:54:20,915 --> 00:54:21,540 at equilibrium? 952 00:54:24,456 --> 00:54:25,337 AUDIENCE: No. 953 00:54:25,337 --> 00:54:26,670 PROFESSOR: So it's actually not. 
954 00:54:31,580 --> 00:54:33,770 In a cell, you start out expressing a lot, 955 00:54:33,770 --> 00:54:36,440 but then later, you actually bring down 956 00:54:36,440 --> 00:54:37,645 your rate of expression. 957 00:54:37,645 --> 00:54:42,490 And in any case, there's no degradation 958 00:54:42,490 --> 00:54:44,140 in this. 959 00:54:44,140 --> 00:54:45,810 The only effective degradation is 960 00:54:45,810 --> 00:54:48,610 due to the dilution, the growth of the cell. 961 00:54:48,610 --> 00:54:50,640 So if you have the same concentration, 962 00:54:50,640 --> 00:54:53,179 then actually, you don't make any more protein 963 00:54:53,179 --> 00:54:55,595 than you did here, because you have the same concentration 964 00:54:55,595 --> 00:54:57,810 at equilibrium. 965 00:54:57,810 --> 00:55:01,140 So this is neat because this speeds up 966 00:55:01,140 --> 00:55:02,640 the response when you're turning on, 967 00:55:02,640 --> 00:55:06,249 without the associated cost of making 968 00:55:06,249 --> 00:55:07,540 that protein then degrading it. 969 00:55:11,662 --> 00:55:13,120 Any questions about that statement? 970 00:55:15,640 --> 00:55:18,140 So now what about off? 971 00:55:18,140 --> 00:55:20,300 Is the off time the same as the on time here? 972 00:55:28,610 --> 00:55:31,741 So what sets how fast-- 973 00:55:31,741 --> 00:55:33,865 AUDIENCE: Should the off time be slower because you 974 00:55:33,865 --> 00:55:37,032 don't have lots of degradation? 975 00:55:37,032 --> 00:55:38,490 PROFESSOR: Right, and in principle, 976 00:55:38,490 --> 00:55:42,644 is there any active degradation that we've invoked on this? 977 00:55:42,644 --> 00:55:43,970 AUDIENCE: I think not. 978 00:55:43,970 --> 00:55:45,700 PROFESSOR: Of course, we could have 979 00:55:45,700 --> 00:55:48,800 both negative autoregulation and active degradation. 
980 00:55:48,800 --> 00:55:50,610 But in principle right now, you can 981 00:55:50,610 --> 00:55:53,600 have the negative autoregulation without any active degradation. 982 00:55:53,600 --> 00:55:56,110 In that case, how long does it take for that concentration 983 00:55:56,110 --> 00:55:59,989 to go away when you stop expressing? 984 00:55:59,989 --> 00:56:01,530 AUDIENCE: The cell generation time. 985 00:56:01,530 --> 00:56:03,113 PROFESSOR: The cell generation time. 986 00:56:03,113 --> 00:56:05,360 So this thing actually looks the exact same as this. 987 00:56:11,090 --> 00:56:17,000 So these guys are the same, whereas this one is faster 988 00:56:17,000 --> 00:56:17,583 than that one. 989 00:56:21,130 --> 00:56:23,380 Because the idea is that the best that-- unless you're 990 00:56:23,380 --> 00:56:25,978 actively degrading, all you can do is you 991 00:56:25,978 --> 00:56:27,270 can shut off expression. 992 00:56:27,270 --> 00:56:29,060 But then if you turn off expression 993 00:56:29,060 --> 00:56:30,560 with the negative autoregulation, it's 994 00:56:30,560 --> 00:56:32,530 the exact same thing as turning off expression 995 00:56:32,530 --> 00:56:34,530 in the absence of the neg-- in either case, 996 00:56:34,530 --> 00:56:36,170 you just stop making protein. 997 00:56:36,170 --> 00:56:38,560 So the concentration just goes down 998 00:56:38,560 --> 00:56:43,140 because it's being diluted away during cell growth. 999 00:56:43,140 --> 00:56:45,800 So this is saying that the response time 1000 00:56:45,800 --> 00:56:49,530 goes down only when turning on in the case 1001 00:56:49,530 --> 00:56:51,049 of negative autoregulation. 1002 00:56:57,195 --> 00:56:58,861 Are there any questions about that idea? 1003 00:57:02,980 --> 00:57:03,480 Yes. 
1004 00:57:03,480 --> 00:57:05,452 AUDIENCE: With the negative autoregulation, 1005 00:57:05,452 --> 00:57:08,903 in order to reach the same protein levels, 1006 00:57:08,903 --> 00:57:14,542 you'd need much greater production rates, correct? 1007 00:57:14,542 --> 00:57:21,070 PROFESSOR: Yeah, so the idea is that this beta might be-- 1008 00:57:21,070 --> 00:57:23,680 so this is the beta of negative autoregulation. 1009 00:57:23,680 --> 00:57:29,269 It could be much larger than the beta of simple regulation 1010 00:57:29,269 --> 00:57:30,935 in order to get to the same equilibrium. 1011 00:57:39,480 --> 00:57:43,600 OK, so what about this idea of robustness? 1012 00:57:43,600 --> 00:57:48,830 Well, this is production rate and then degradation rate. 1013 00:57:51,400 --> 00:57:52,630 So this is an alpha x. 1014 00:57:55,630 --> 00:57:59,370 And so my question here is, I told you 1015 00:57:59,370 --> 00:58:02,420 that robustness-- something is robust-- Yeah, question. 1016 00:58:02,420 --> 00:58:06,108 AUDIENCE: My question is in this case, 1017 00:58:06,108 --> 00:58:11,900 you're saying that the off means the signal disappears, right? 1018 00:58:11,900 --> 00:58:15,960 PROFESSOR: OK, T off is this idea. 1019 00:58:15,960 --> 00:58:16,880 It's the T 1/2. 1020 00:58:16,880 --> 00:58:21,870 So this is the time that it takes for the protein 1021 00:58:21,870 --> 00:58:24,999 concentration to reach half-- to go 1022 00:58:24,999 --> 00:58:27,540 halfway the distance from where you were to where you're 1023 00:58:27,540 --> 00:58:28,000 going to end up. 1024 00:58:28,000 --> 00:58:29,992 AUDIENCE: But what if the signal does not disappear, 1025 00:58:29,992 --> 00:58:33,870 but goes to half of the original signal? 1026 00:58:33,870 --> 00:58:37,051 PROFESSOR: So the signal could do a range of different things. 
1027 00:58:37,051 --> 00:58:38,550 And it could be that the signal just 1028 00:58:38,550 --> 00:58:40,730 changes so that instead of going down to 0, 1029 00:58:40,730 --> 00:58:42,860 you go down to some other value. 1030 00:58:42,860 --> 00:58:44,110 Is that what you're imagining? 1031 00:58:44,110 --> 00:58:44,693 AUDIENCE: Yes. 1032 00:58:44,693 --> 00:58:46,900 PROFESSOR: In that case, you still go exponentially 1033 00:58:46,900 --> 00:58:50,720 to this new value, so actually the T-- the response time 1034 00:58:50,720 --> 00:58:53,160 there is still actually the cell generation time. 1035 00:58:53,160 --> 00:58:55,130 So it doesn't matter, in the absence 1036 00:58:55,130 --> 00:58:58,000 of any of these, for example, autoregulation. 1037 00:58:58,000 --> 00:59:00,160 The time, the characteristic timescale, 1038 00:59:00,160 --> 00:59:03,000 is always the cell generation time if it's a stable protein. 1039 00:59:03,000 --> 00:59:04,999 It doesn't matter whether you're going up, down, 1040 00:59:04,999 --> 00:59:07,190 or all the way to 0 or not. 1041 00:59:13,360 --> 00:59:20,450 So the question here is-- OK, x equilibrium is robust to what? 1042 00:59:23,090 --> 00:59:25,340 And this is to small changes in what? 1043 00:59:32,120 --> 00:59:34,752 It's going to be A, alpha. 1044 00:59:48,160 --> 00:59:53,410 So this is going to be our first example of an advanced use 1045 00:59:53,410 --> 00:59:55,540 of our cards. 1046 00:59:55,540 --> 00:59:58,600 So the way that it works is that you can choose more than one. 1047 00:59:58,600 --> 01:00:01,575 OK, now, this requires some manual dexterity. 1048 01:00:01,575 --> 01:00:03,950 So what you have to do is if you think that the answer is 1049 01:00:03,950 --> 01:00:06,241 more than one of these things, then what you have to do 1050 01:00:06,241 --> 01:00:07,705 is show me more than one card. 1051 01:00:12,090 --> 01:00:13,700 These cards are amazing, right? 
1052 01:00:13,700 --> 01:00:17,960 You can do so many different combinations. 1053 01:00:17,960 --> 01:00:19,790 I'll give you 20 seconds to think about it. 1054 01:00:25,670 --> 01:00:27,140 AUDIENCE: So what's e? 1055 01:00:27,140 --> 01:00:31,070 What do those things mean? 1056 01:00:31,070 --> 01:00:34,540 PROFESSOR: OK, the question is, the equilibrium 1057 01:00:34,540 --> 01:00:37,330 concentration of protein x being robust means it does not 1058 01:00:37,330 --> 01:00:42,000 change in response to small changes in what quantities? 1059 01:00:42,000 --> 01:00:43,885 So if I change the degradation rate, 1060 01:00:43,885 --> 01:00:45,010 does it change the equilibrium? 1061 01:00:45,010 --> 01:00:46,380 If I change the beta. 1062 01:00:46,380 --> 01:00:48,670 And I'm asking about this case here, 1063 01:00:48,670 --> 01:00:51,150 perfect negative autoregulation, just 1064 01:00:51,150 --> 01:00:55,060 so we can try to establish our intuition here. 1065 01:00:55,060 --> 01:00:59,840 K is this repression threshold. 1066 01:00:59,840 --> 01:01:02,210 None means that it's not robust to any of these things. 1067 01:01:02,210 --> 01:01:15,600 DK, as always, means "don't know." 1068 01:01:15,600 --> 01:01:17,130 I'll give you an extra 30 seconds. 1069 01:01:17,130 --> 01:01:17,755 This might be-- 1070 01:01:34,964 --> 01:01:36,380 So this one's the production rate. 1071 01:01:36,380 --> 01:01:38,452 This one's the degradation rate. 1072 01:01:38,452 --> 01:01:39,910 This figure might be useful to you. 1073 01:01:45,230 --> 01:01:47,200 AUDIENCE: Can you define K again? 1074 01:01:47,200 --> 01:01:52,190 PROFESSOR: Yes, so K is the concentration of the protein x 1075 01:01:52,190 --> 01:01:56,970 at which this super effective repression kicks in. 1076 01:01:56,970 --> 01:02:00,080 So we're assuming perfect negative autoregulation. 1077 01:02:00,080 --> 01:02:03,600 Beta is the rate of expression for low concentrations. 
1078 01:02:03,600 --> 01:02:06,245 The moment you get to concentration K, 1079 01:02:06,245 --> 01:02:08,328 you get perfect repression and no more expression. 1080 01:02:18,540 --> 01:02:19,621 Do you need more time? 1081 01:02:19,621 --> 01:02:20,120 Question. 1082 01:02:20,120 --> 01:02:23,940 AUDIENCE: By saying that x equilibrium is robust, 1083 01:02:23,940 --> 01:02:27,640 so you mean that when you change these parameters, 1084 01:02:27,640 --> 01:02:33,510 x equilibrium stays exactly the same, or will x equilibrium-- 1085 01:02:33,510 --> 01:02:36,680 PROFESSOR: For now, what we'll mean right now 1086 01:02:36,680 --> 01:02:39,255 is that a small change in this parameter 1087 01:02:39,255 --> 01:02:41,940 leads to no change in x equilibrium. 1088 01:02:41,940 --> 01:02:44,675 Now, for any real example, what we'll typically mean 1089 01:02:44,675 --> 01:02:46,860 is going to be some sort of sensitivity analysis. 1090 01:02:46,860 --> 01:02:51,080 For example, where you'll say oh, a 1% change in a parameter 1091 01:02:51,080 --> 01:02:53,390 leads to a less than 1% change, for example. 1092 01:02:53,390 --> 01:02:59,350 But in this case, there's going to be no change, I'll tell you, 1093 01:02:59,350 --> 01:03:02,890 just so we can get the intuition clear here. 1094 01:03:02,890 --> 01:03:05,090 All right, do you need more time? 1095 01:03:05,090 --> 01:03:06,270 Let's go ahead and vote. 1096 01:03:06,270 --> 01:03:08,728 Remember, you can vote for more than one thing if you like. 1097 01:03:08,728 --> 01:03:12,230 Ready, three, two, one. 1098 01:03:12,230 --> 01:03:17,580 All right, some people are using our more than one. 1099 01:03:17,580 --> 01:03:19,300 And of course, I can give you a hint. 
1100 01:03:19,300 --> 01:03:22,000 The reason that I'm letting you vote more than once 1101 01:03:22,000 --> 01:03:26,060 is because more than one thing is going to be-- All right, 1102 01:03:26,060 --> 01:03:30,380 so the majority of the group has got this, but not everyone. 1103 01:03:30,380 --> 01:03:32,020 So let's discuss. 1104 01:03:32,020 --> 01:03:35,229 Can somebody give an explanation for why both alpha and beta are 1105 01:03:35,229 --> 01:03:36,020 going to work here? 1106 01:03:36,020 --> 01:03:39,310 AUDIENCE: So the equilibrium is basically 1107 01:03:39,310 --> 01:03:41,734 when degradation balances production. 1108 01:03:41,734 --> 01:03:43,400 PROFESSOR: I want to make sure we're OK. 1109 01:03:43,400 --> 01:03:47,160 The equilibrium is when the production rate is 1110 01:03:47,160 --> 01:03:48,730 equal to the degradation rate. 1111 01:03:48,730 --> 01:03:51,230 So this is a very important thing 1112 01:03:51,230 --> 01:03:53,010 to make sure we're on top of. 1113 01:03:53,010 --> 01:03:57,075 And in this case, we have very sharp-- this production. 1114 01:03:57,075 --> 01:03:57,950 So then what happens? 1115 01:03:57,950 --> 01:04:00,116 AUDIENCE: Well, [INAUDIBLE] is the intersection of-- 1116 01:04:00,116 --> 01:04:02,170 PROFESSOR: Right, so in this case, 1117 01:04:02,170 --> 01:04:04,086 what is the equilibrium concentration? 1118 01:04:04,086 --> 01:04:06,030 AUDIENCE: K. 1119 01:04:06,030 --> 01:04:13,900 PROFESSOR: It's equal to K. Now, I strongly encourage you, 1120 01:04:13,900 --> 01:04:16,196 whenever possible, to draw things out. 1121 01:04:16,196 --> 01:04:18,570 Because this is a problem that, when you have the drawing, 1122 01:04:18,570 --> 01:04:19,944 is reasonable to do. 1123 01:04:19,944 --> 01:04:21,360 And if you don't have the drawing, 1124 01:04:21,360 --> 01:04:25,440 you're going to get yourself tied up into weird knots. 
1125 01:04:25,440 --> 01:04:29,140 And indeed, we can see that if we change alpha, 1126 01:04:29,140 --> 01:04:32,090 what happens in this spot? 1127 01:04:32,090 --> 01:04:35,110 Right, it changes the slope. 1128 01:04:35,110 --> 01:04:37,140 And you can see that if we change the slope 1129 01:04:37,140 --> 01:04:39,920 by small amounts, we get no change where 1130 01:04:39,920 --> 01:04:42,170 this crossing point is. 1131 01:04:42,170 --> 01:04:44,532 And even for a real system, if it came around, 1132 01:04:44,532 --> 01:04:45,990 you'd see that it's going to end up 1133 01:04:45,990 --> 01:04:50,230 being a less than proportional change in the equilibrium. 1134 01:04:50,230 --> 01:04:53,480 And what about beta? 1135 01:04:53,480 --> 01:04:56,020 That just raises and lowers this. 1136 01:04:56,020 --> 01:04:58,640 And again, that doesn't change the equilibrium. 1137 01:04:58,640 --> 01:05:03,110 Of course, if we changed K, then we get a 1 to 1 change. 1138 01:05:03,110 --> 01:05:07,030 So a 10% change in K leads to a 10% change in the equilibrium 1139 01:05:07,030 --> 01:05:07,984 concentration of x. 1140 01:05:11,940 --> 01:05:15,920 So this is the sense in which the equilibrium concentration 1141 01:05:15,920 --> 01:05:20,744 in negative autoregulation is robust to changes in both-- 1142 01:05:20,744 --> 01:05:22,660 in the book, they say oh, the production rate, 1143 01:05:22,660 --> 01:05:24,118 but it's actually also in principle 1144 01:05:24,118 --> 01:05:27,230 the degradation rate over some range. 1145 01:05:27,230 --> 01:05:30,390 And this could be useful, because there 1146 01:05:30,390 --> 01:05:32,840 are lots of things that are going to affect the production 1147 01:05:32,840 --> 01:05:37,360 rate of a protein, and also the degradation rate, 1148 01:05:37,360 --> 01:05:38,730 for that matter. 1149 01:05:38,730 --> 01:05:40,750 The division rate, it changes it. 
1150 01:05:40,750 --> 01:05:43,100 Whereas it may be that K is subjected 1151 01:05:43,100 --> 01:05:45,740 to less severe changes, because that's 1152 01:05:45,740 --> 01:05:49,610 determined by, for example, the kinetics 1153 01:05:49,610 --> 01:05:52,930 of binding of this protein to this promoter. 1154 01:05:52,930 --> 01:05:56,020 And that is perhaps less subject to changes. 1155 01:05:56,020 --> 01:05:59,450 It can still change depending upon the pH 1156 01:05:59,450 --> 01:06:03,440 and so forth of the interior of the cell. 1157 01:06:03,440 --> 01:06:07,080 But at least it's probably not subject to the big changes 1158 01:06:07,080 --> 01:06:11,210 that alpha and beta are going to experience. 1159 01:06:11,210 --> 01:06:15,370 So the argument that Uri makes for why it is we see so much 1160 01:06:15,370 --> 01:06:19,460 negative autoregulation in the cell is because it both 1161 01:06:19,460 --> 01:06:24,200 increases the rate that the cell can respond to changes, 1162 01:06:24,200 --> 01:06:27,340 in the on direction, at least, but also that it makes 1163 01:06:27,340 --> 01:06:30,420 the concentration of protein more robust to changes 1164 01:06:30,420 --> 01:06:34,260 in several of the parameters that govern the equilibrium 1165 01:06:34,260 --> 01:06:37,030 concentration. And once again, you 1166 01:06:37,030 --> 01:06:41,140 could argue about which one of these is more important, 1167 01:06:41,140 --> 01:06:44,262 but I think they're both likely playing a significant role 1168 01:06:44,262 --> 01:06:45,053 in different cases. 1169 01:06:50,860 --> 01:06:54,150 I'm going to want to move on, but I will tell you that only 1170 01:06:54,150 --> 01:06:58,560 over some range of these-- alpha, beta, K-- 1171 01:06:58,560 --> 01:07:02,340 will this thing be robust. 1172 01:07:02,340 --> 01:07:06,100 So for example, if this comes up too high, 1173 01:07:06,100 --> 01:07:09,350 we're going to lose this phenomenon of robustness. 
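The graphical argument here is easy to verify numerically. The sketch below is my own illustration, not from the lecture, using assumed parameter values: in the logic approximation, the equilibrium sits at x = K whenever the production plateau beta exceeds the degradation line alpha*K at that point, so small changes in alpha or beta leave it untouched, while changes in K move it one to one.

```python
def x_equilibrium(beta, alpha, K):
    """Equilibrium of the logic approximation of negative autoregulation:
    production is beta for x < K and 0 for x >= K; degradation is alpha*x.
    The curves cross at x = K as long as beta > alpha*K; otherwise the
    robustness is lost and the equilibrium is just beta/alpha."""
    return K if beta > alpha * K else beta / alpha

base = x_equilibrium(beta=10.0, alpha=1.0, K=1.0)
print(x_equilibrium(beta=12.0, alpha=1.0, K=1.0) == base)  # True: robust to beta
print(x_equilibrium(beta=10.0, alpha=1.2, K=1.0) == base)  # True: robust to alpha
print(x_equilibrium(beta=10.0, alpha=1.0, K=1.1))          # 1.1: tracks K 1 to 1
```

The guard `beta > alpha * K` is exactly the caveat about robustness holding only over some range of the parameters: if the degradation line steepens enough to cross the production curve below K, the equilibrium starts following beta/alpha again.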
1174 01:07:09,350 --> 01:07:13,940 So I expect you to be able to tell me in some later date 1175 01:07:13,940 --> 01:07:17,880 the conditions in which that might happen. 1176 01:07:17,880 --> 01:07:20,516 And I'm available for the next half hour 1177 01:07:20,516 --> 01:07:22,890 after class, so if you do not know what I'm talking about 1178 01:07:22,890 --> 01:07:24,801 right there, please hang out with me after, 1179 01:07:24,801 --> 01:07:27,730 and I'll tell you the solution to that question on the exam. 1180 01:07:27,730 --> 01:07:28,260 OK? 1181 01:07:28,260 --> 01:07:28,760 All right. 1182 01:07:32,620 --> 01:07:34,870 But I do want to talk about positive autoregulation, 1183 01:07:34,870 --> 01:07:38,710 because this is another interesting beast. 1184 01:07:38,710 --> 01:07:41,660 So if negative autoregulation has those nice properties, 1185 01:07:41,660 --> 01:07:44,000 then you can imagine that positive autoregulation 1186 01:07:44,000 --> 01:07:46,855 will have some drawbacks in the same kind of ways. 1187 01:07:49,470 --> 01:07:54,351 But it leads to some other very interesting, just qualitative 1188 01:07:54,351 --> 01:07:54,850 features. 1189 01:08:03,100 --> 01:08:04,100 Positive autoregulation. 1190 01:08:09,600 --> 01:08:15,440 So we have some x that is activating itself. 1191 01:08:15,440 --> 01:08:17,140 And often we think about cases where 1192 01:08:17,140 --> 01:08:19,109 it's activating its own expression 1193 01:08:19,109 --> 01:08:21,880 in a cooperative fashion. 1194 01:08:21,880 --> 01:08:23,840 In particular, we might assume that x 1195 01:08:23,840 --> 01:08:27,520 dot is equal to, for example, some beta 0 1196 01:08:27,520 --> 01:08:30,600 plus some beta 1 times some cooperative thing 1197 01:08:30,600 --> 01:08:40,149 here, where n might be 2, 3, 4, and then again minus alpha x. 1198 01:08:40,149 --> 01:08:41,470 Right? 
1199 01:08:41,470 --> 01:08:43,080 Now, if you just look at this, you 1200 01:08:43,080 --> 01:08:45,010 might think oh, I don't know what this is going to do 1201 01:08:45,010 --> 01:08:45,551 and so forth. 1202 01:08:45,551 --> 01:08:47,430 But you've got to draw things out. 1203 01:08:47,430 --> 01:08:50,810 Once you draw it, then you'll see that it's 1204 01:08:50,810 --> 01:08:52,861 pretty straightforward. 1205 01:08:52,861 --> 01:08:55,319 So again, this is the production and the degradation rates. 1206 01:09:03,710 --> 01:09:05,790 So that's production. 1207 01:09:05,790 --> 01:09:08,130 And degradation, for example, might look like this. 1208 01:09:10,660 --> 01:09:13,350 So this is the production. 1209 01:09:13,350 --> 01:09:15,359 This is the degradation. 1210 01:09:15,359 --> 01:09:16,550 So that's the alpha x term. 1211 01:09:21,450 --> 01:09:25,399 One question would be, how many fixed points 1212 01:09:25,399 --> 01:09:27,720 does this system have? 1213 01:09:31,590 --> 01:09:35,020 So a fixed point means that if you started right there, 1214 01:09:35,020 --> 01:09:38,580 and in the absence of any noise, you would stay right there. 1215 01:09:38,580 --> 01:09:41,467 And these fixed points can be either stable or unstable. 1216 01:09:57,818 --> 01:09:58,568 Can you read that? 1217 01:10:02,060 --> 01:10:04,000 I'll give you 15 seconds to count them. 1218 01:10:18,060 --> 01:10:22,000 Ready, three, two, one. 1219 01:10:26,880 --> 01:10:29,190 All right, it seems like we have pretty good agreement. 1220 01:10:29,190 --> 01:10:32,160 There are indeed 3 fixed points. 1221 01:10:32,160 --> 01:10:37,670 Once again, the fixed point is where these curves cross. 1222 01:10:37,670 --> 01:10:44,560 So we have one right here, one here, and one here. 1223 01:10:44,560 --> 01:10:47,170 Now, how many are stable? 1224 01:10:47,170 --> 01:10:48,660 We're going to do this verbally. 1225 01:10:48,660 --> 01:10:52,360 Ready, three, two, one. 
1226 01:10:52,360 --> 01:10:52,860 AUDIENCE: 2. 1227 01:10:52,860 --> 01:10:53,445 PROFESSOR: 2. 1228 01:10:53,445 --> 01:10:54,320 Let's try that again. 1229 01:10:54,320 --> 01:10:56,113 Ready, three, two, one. 1230 01:10:56,113 --> 01:10:56,612 AUDIENCE: 2. 1231 01:10:56,612 --> 01:10:57,153 PROFESSOR: 2. 1232 01:10:57,153 --> 01:11:01,940 Yeah, you get so used to the card, it's hard to speak. 1233 01:11:01,940 --> 01:11:05,580 So the ones on the ends are the stable ones. 1234 01:11:05,580 --> 01:11:08,754 And you see here that around this point, 1235 01:11:08,754 --> 01:11:11,420 the production rate over here is more than the degradation rate. 1236 01:11:11,420 --> 01:11:13,580 That means that if you leave that fixed point, 1237 01:11:13,580 --> 01:11:15,560 you're going to get pushed away. 1238 01:11:15,560 --> 01:11:18,290 So it's very nice to draw these little arrows here 1239 01:11:18,290 --> 01:11:19,600 to make one happy. 1240 01:11:23,520 --> 01:11:27,200 So this thing here is stable, unstable, and again stable. 1241 01:11:32,150 --> 01:11:35,030 Now, the reason we call this bistability 1242 01:11:35,030 --> 01:11:38,350 is because there are 2 stable fixed points. 1243 01:11:38,350 --> 01:11:43,600 This is important because this phenomenon 1244 01:11:43,600 --> 01:11:51,830 is the basic dynamical-systems origin of memory. 1245 01:11:51,830 --> 01:11:57,330 Now, it's not obvious how memory comes from this. 1246 01:11:57,330 --> 01:12:01,380 So memory is a general idea that the gene network or the cell 1247 01:12:01,380 --> 01:12:06,836 can retain a memory of its past state. 1248 01:12:06,836 --> 01:12:08,460 And we're going to see examples of this 1249 01:12:08,460 --> 01:12:10,840 over the next few weeks. 1250 01:12:10,840 --> 01:12:13,585 But just to be clear, if, for example, we 1251 01:12:13,585 --> 01:12:17,100 imagine a situation where the alpha changes. 
1252 01:12:17,100 --> 01:12:20,220 And it could be division rate, for example, 1253 01:12:20,220 --> 01:12:22,700 high-food, low-food environments. 1254 01:12:22,700 --> 01:12:29,520 What we do is we can plot-- Often, you can plot, 1255 01:12:29,520 --> 01:12:31,194 for example, the equilibrium, but that's 1256 01:12:31,194 --> 01:12:32,110 a little bit trickier. 1257 01:12:32,110 --> 01:12:33,776 So I'm just going to plot the production 1258 01:12:33,776 --> 01:12:39,820 rate as a function of alpha. 1259 01:12:43,280 --> 01:12:47,010 Now, the question is, if we change alpha, 1260 01:12:47,010 --> 01:12:50,089 what's going to happen? 1261 01:12:50,089 --> 01:12:51,880 Now, for a fixed alpha, you can see already 1262 01:12:51,880 --> 01:12:53,963 that there are two different production rates that 1263 01:12:53,963 --> 01:12:56,010 are stable in this case. 1264 01:12:56,010 --> 01:12:59,060 But what happens if we increase alpha? 1265 01:12:59,060 --> 01:13:02,440 So we increase the growth rate so it goes like this. 1266 01:13:02,440 --> 01:13:05,890 Can that change the number of fixed points? 1267 01:13:05,890 --> 01:13:10,700 And indeed, what we can see is that as this line gets steeper 1268 01:13:10,700 --> 01:13:14,280 here, eventually you only have a single fixed point, 1269 01:13:14,280 --> 01:13:16,100 and it's stable. 1270 01:13:16,100 --> 01:13:17,475 And that's known as a bifurcation 1271 01:13:17,475 --> 01:13:19,860 of the dynamics of the system. 1272 01:13:19,860 --> 01:13:24,540 So this is for large alpha, you end up-- And just to be clear, 1273 01:13:24,540 --> 01:13:27,250 this is beta 0 down here. 1274 01:13:27,250 --> 01:13:30,820 And then up here is the beta 1. 1275 01:13:30,820 --> 01:13:35,050 So what we do is we know that beta 0 is 1276 01:13:35,050 --> 01:13:38,840 what we get for large alpha. 1277 01:13:38,840 --> 01:13:41,900 Now, for small alpha, do we end up 1278 01:13:41,900 --> 01:13:45,460 getting another-- We get another bifurcation. 
1279 01:13:45,460 --> 01:13:48,370 So actually, there's only again one stable point 1280 01:13:48,370 --> 01:13:52,679 up here at small alpha. 1281 01:13:52,679 --> 01:13:54,220 And what we're going to get is what's 1282 01:13:54,220 --> 01:14:00,740 known as a fold bifurcation where 1283 01:14:00,740 --> 01:14:04,780 solid lines denote stable points, stable fixed points. 1284 01:14:04,780 --> 01:14:08,360 Dashed lines represent unstable fixed points. 1285 01:14:08,360 --> 01:14:12,806 So stable, and the dash is unstable. 1286 01:14:19,480 --> 01:14:21,930 There are some regions of alpha 1287 01:14:21,930 --> 01:14:23,540 where the system is bistable. 1288 01:14:23,540 --> 01:14:27,170 But then outside of that, it's just monostable. 1289 01:14:27,170 --> 01:14:29,710 Can somebody explain why this thing-- 1290 01:14:29,710 --> 01:14:31,960 why I might make the argument that this thing displays 1291 01:14:31,960 --> 01:14:32,460 memory? 1292 01:14:37,520 --> 01:14:39,730 Well, one of those two is fine, but any new people 1293 01:14:39,730 --> 01:14:43,245 want to explain my thought process? 1294 01:14:46,060 --> 01:14:46,726 No. 1295 01:14:46,726 --> 01:14:47,600 All right, maybe you. 1296 01:14:47,600 --> 01:14:50,840 AUDIENCE: All right, well depending on 1297 01:14:50,840 --> 01:14:53,318 whether we had high degradation or low degradation 1298 01:14:53,318 --> 01:14:57,340 rates in the past, we'll be on the lower or the upper range 1299 01:14:57,340 --> 01:14:59,520 of that if we return to normal. 1300 01:14:59,520 --> 01:15:00,520 PROFESSOR: That's right. 1301 01:15:00,520 --> 01:15:02,603 So the argument here is that-- let's say that this 1302 01:15:02,603 --> 01:15:05,740 is some normal condition. 1303 01:15:05,740 --> 01:15:08,020 This is where you are right now, for example. 
1304 01:15:08,020 --> 01:15:11,510 Now, depending upon whether you're sitting here or here, 1305 01:15:11,510 --> 01:15:13,400 that's perhaps giving some information 1306 01:15:13,400 --> 01:15:16,030 about the past state of the cell. 1307 01:15:16,030 --> 01:15:19,060 Because if you were here, that means oh, maybe in the past 1308 01:15:19,060 --> 01:15:22,340 you were out at high degradation rates, whereas if you're here, 1309 01:15:22,340 --> 01:15:23,410 maybe you were at low. 1310 01:15:23,410 --> 01:15:26,400 In particular, you could reset things. 1311 01:15:26,400 --> 01:15:30,430 If you start here, then you can reset this memory module 1312 01:15:30,430 --> 01:15:32,846 by coming over here. 1313 01:15:32,846 --> 01:15:34,470 Once you get to this point here, that's 1314 01:15:34,470 --> 01:15:37,290 the bifurcation dynamics, the fold bifurcation. 1315 01:15:37,290 --> 01:15:41,800 Then you come up here, and now you'll retain this state. 1316 01:15:41,800 --> 01:15:43,560 In principle, until you get over here. 1317 01:15:43,560 --> 01:15:45,893 Of course, there could be stochastic switching dynamics. 1318 01:15:45,893 --> 01:15:48,760 We're going to talk a lot about that in the coming weeks. 1319 01:15:48,760 --> 01:15:51,120 But at least in the limit of low rates 1320 01:15:51,120 --> 01:15:53,270 of stochastic switching, then this 1321 01:15:53,270 --> 01:15:56,280 represents some sort of memory module, the simplest 1322 01:15:56,280 --> 01:15:57,210 version of it. 1323 01:15:57,210 --> 01:16:01,240 I'd say that in the cell, most examples of such memory modules 1324 01:16:01,240 --> 01:16:03,520 involve not just positive feedback of one 1325 01:16:03,520 --> 01:16:06,120 protein activating itself, although this happens, 1326 01:16:06,120 --> 01:16:08,120 but often through a whole network, where the one 1327 01:16:08,120 --> 01:16:09,430 protein activates another, activates another, 1328 01:16:09,430 --> 01:16:10,420 and then you come back. 
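[The reset-and-retain behavior described here is hysteresis, and it can be demonstrated with a quasi-static parameter sweep. This is a sketch only, assuming the same kind of Hill-type positive-feedback production term (beta0 = 1, beta1 = 10, K = 5, n = 4 -- illustrative values, not numbers from the lecture):]

```python
import numpy as np

def production(x, beta0=1.0, beta1=10.0, K=5.0, n=4):
    # Basal production plus Hill-type positive autoregulation (assumed parameters).
    return beta0 + beta1 * x**n / (K**n + x**n)

def relax(x, alpha, dt=0.01, steps=10000):
    # Integrate dx/dt = production(x) - alpha*x to a (quasi-)steady state.
    for _ in range(steps):
        x += dt * (production(x) - alpha * x)
    return x

# Sweep the dilution rate alpha up and then back down, relaxing at each step.
alphas = np.linspace(0.3, 3.0, 28)   # step 0.1; includes alpha = 1.0
x = 40.0                             # start on the high branch
up = []
for a in alphas:
    x = relax(x, a)
    up.append(x)
down = []
for a in alphas[::-1]:
    x = relax(x, a)
    down.append(x)
# In the bistable window (e.g. alpha = 1.0) the up-sweep sits on the high
# branch while the down-sweep sits on the low branch: the state depends on
# the history of alpha, which is exactly the memory being described.
```

[Past the fold on either side the two sweeps collapse onto the same single branch, which is the "reset" step in the argument above.]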
1329 01:16:10,420 --> 01:16:14,010 Or it could be repressing, repressing. 1330 01:16:14,010 --> 01:16:18,220 Two negatives is a positive. 1331 01:16:18,220 --> 01:16:19,540 Just like two lefts is a right. 1332 01:16:22,386 --> 01:16:23,760 Right, so are there any questions 1333 01:16:23,760 --> 01:16:25,940 about the sense in which this thing can 1334 01:16:25,940 --> 01:16:29,551 serve as a basic memory module? 1335 01:16:29,551 --> 01:16:31,925 And this is maybe not the most interesting example of it, 1336 01:16:31,925 --> 01:16:33,850 because alpha is such a global parameter. 1337 01:16:33,850 --> 01:16:35,900 But you can also get similar dynamics 1338 01:16:35,900 --> 01:16:39,350 as a function of, for example, the concentration 1339 01:16:39,350 --> 01:16:43,270 of galactose or some other sugar in your media. 1340 01:16:43,270 --> 01:16:47,564 So given that different small molecules such as food sources 1341 01:16:47,564 --> 01:16:49,230 can act as inputs into these gene networks, 1342 01:16:49,230 --> 01:16:51,180 you can also get these sorts of dynamics 1343 01:16:51,180 --> 01:16:54,630 as a function of what you might call really 1344 01:16:54,630 --> 01:16:59,540 some simple, external molecule, which is nice, because that 1345 01:16:59,540 --> 01:17:02,976 means that you can have memory modules that are really 1346 01:17:02,976 --> 01:17:04,975 independent of all the other memory modules that 1347 01:17:04,975 --> 01:17:06,550 are going on in your cell. 1348 01:17:06,550 --> 01:17:09,790 Whereas if you vary alpha, then this changes everything. 1349 01:17:09,790 --> 01:17:12,730 Whereas if it's just a concentration of some sugar 1350 01:17:12,730 --> 01:17:14,610 outside, then you can imagine that that 1351 01:17:14,610 --> 01:17:17,620 could be very useful to retain a memory of what the cell has 1352 01:17:17,620 --> 01:17:18,620 encountered in the past. 
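[The "repressing, repressing" loop mentioned here -- two negatives making a positive -- is the classic two-gene toggle. A minimal sketch of the standard symmetric toggle model; the repression strength a = 10 and Hill coefficient n = 2 are assumed for illustration and are not from the lecture:]

```python
def toggle_steady_state(u, v, a=10.0, n=2, dt=0.01, steps=20000):
    # Each protein represses the other's production; two repressions in a
    # loop give net positive feedback, and hence bistability.
    for _ in range(steps):
        du = a / (1.0 + v**n) - u
        dv = a / (1.0 + u**n) - v
        u, v = u + dt * du, v + dt * dv
    return u, v

# Which protein wins depends only on the initial condition:
high_u = toggle_steady_state(5.0, 1.0)   # settles with u high, v low
high_v = toggle_steady_state(1.0, 5.0)   # settles with v high, u low
```

[As with the single-gene case, this two-state system retains whichever state its history put it in, so it too can serve as a memory module.]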
1353 01:17:25,720 --> 01:17:27,220 So today, what we've been able to do 1354 01:17:27,220 --> 01:17:31,720 is analyze something about a possible evolutionary 1355 01:17:31,720 --> 01:17:33,825 explanation for why autoregulation 1356 01:17:33,825 --> 01:17:36,770 is as commonly observed as it is. 1357 01:17:36,770 --> 01:17:38,430 So negative autoregulation is the one 1358 01:17:38,430 --> 01:17:41,160 that's observed perhaps most frequently. 1359 01:17:41,160 --> 01:17:43,190 And that, I think, has some very clear purposes. 1360 01:17:43,190 --> 01:17:46,620 And this idea of the concentration being robust 1361 01:17:46,620 --> 01:17:51,020 to other biochemical parameters I think is a big idea. 1362 01:17:51,020 --> 01:17:53,543 We're going to see this idea of robustness 1363 01:17:53,543 --> 01:17:56,440 crop up multiple times over the course of this semester. 1364 01:17:56,440 --> 01:17:58,890 And I think that it's nice to think about robustness 1365 01:17:58,890 --> 01:18:01,340 in this case, because it's perhaps the simplest 1366 01:18:01,340 --> 01:18:04,660 example of how robustness as an approach can 1367 01:18:04,660 --> 01:18:07,349 be useful as a way of thinking about a problem. 1368 01:18:07,349 --> 01:18:09,390 We're later going to be thinking about robustness 1369 01:18:09,390 --> 01:18:12,780 in the context of perfect adaptation 1370 01:18:12,780 --> 01:18:16,557 in chemotaxis, where bacteria try to find food. 1371 01:18:16,557 --> 01:18:18,390 And there, I think everything's more subtle, 1372 01:18:18,390 --> 01:18:21,470 because already the base phenomenon that is robust 1373 01:18:21,470 --> 01:18:22,740 is a form of robustness. 1374 01:18:22,740 --> 01:18:25,450 And so it kind of gets you mixed up. 
1375 01:18:25,450 --> 01:18:28,310 So I think that it's good to be very clear about what 1376 01:18:28,310 --> 01:18:30,477 robustness means here, so that we 1377 01:18:30,477 --> 01:18:32,060 can use that to think about robustness 1378 01:18:32,060 --> 01:18:34,700 in other biological functions. 1379 01:18:34,700 --> 01:18:36,520 With that, have a good weekend. 1380 01:18:36,520 --> 01:18:39,536 Good luck on the problem set, and I'll see you on Tuesday.