Class 20, Jan 9, 2026
AI Assistant
Transcript
00:07:440Paolo Guiotto: About today: we finish this part of the material. We start with some exercises that are left.
00:18:680Paolo Guiotto: So, in particular, the 9… Exercise 943.
00:26:420Paolo Guiotto: 944 and 945.
00:29:330Paolo Guiotto: which are exercises on the weak and strong laws of large numbers. Exercise 943 says: we have a sequence Xn of random variables, which are independent, and for which we know that the probability that Xn is equal to plus or minus n
00:49:650Paolo Guiotto: is equal to 1 over 2n log n.
00:54:820Paolo Guiotto: Yeah, and n is a natural number; I suppose n greater or equal than 2, though it is not said.
01:02:570Paolo Guiotto: However, since we have to do…
01:06:20Paolo Guiotto: So, just to have something fixed, maybe we can take X1 identically equal to 0.
01:16:290Paolo Guiotto: No. Yeah.
01:18:890Paolo Guiotto: You don't see?
01:20:430Paolo Guiotto: Cool.
01:22:150Paolo Guiotto: What's the problem?
02:12:290Paolo Guiotto: I want to look back.
02:23:260Paolo Guiotto: Okay.
02:25:370Paolo Guiotto: And the probability that Xn is equal to 0 is 1
02:35:680Paolo Guiotto: minus 1 over… Well, maybe there is a typo here.
02:44:370Paolo Guiotto: Should be n log n, I suppose. Because,
02:49:210Paolo Guiotto: Yeah, when we sum the two, there is a typo in the… in the text.
02:55:430Paolo Guiotto: So this variable Xn takes three values: 0, plus n, minus n.
03:00:750Paolo Guiotto: And the two probabilities that Xn is plus or minus n are both equal to 1 over 2n log N.
03:09:390Paolo Guiotto: So, the probability that Xn is different from 0 is the sum of the two, so it is 1 over n log n, and that's why here, in the
03:21:470Paolo Guiotto: assignment, there is a 2, but it's wrong.
03:26:410Paolo Guiotto: So, now the question one is, the following.
03:32:90Paolo Guiotto: check that Xn bar, the average of the Xn, goes to zero in probability.
03:41:520Paolo Guiotto: So it should be the,
03:49:460Paolo Guiotto: the L2 weak law. So, we recall that,
04:02:300Paolo Guiotto: if we have variables Xk in L2… such that they are independent,
04:17:700Paolo Guiotto: They have the same mean:
04:21:649Paolo Guiotto: the expectation of XK is constantly equal to M,
04:26:280Paolo Guiotto: So, for every K. And there is a bound for the variance. The variance of XK is controlled by a constant M for every K.
04:36:30Paolo Guiotto: Then, the average of the…
04:41:280Paolo Guiotto: Xk, so this is 1 over n, sum for k going from 1 to n of the Xk. This thing goes in probability to the mean value m.
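The L2 weak law just recalled can be illustrated numerically. A minimal Python sketch (an editorial illustration, the lecture itself contains no code; the normal distribution and all parameter values are arbitrary choices satisfying the theorem's hypotheses) estimates the deviation probability of the average and shows it shrinking as n grows:

```python
import random

def deviation_prob(n, m=1.0, eps=0.2, trials=2000, seed=0):
    """Estimate P(|Xbar_n - m| > eps) for X_k iid N(m, 1) by simulation.

    By Chebyshev, this probability is at most M / (n * eps^2) with M = Var = 1,
    which is the bound behind the L2 weak law of large numbers.
    """
    rng = random.Random(seed)
    bad = 0
    for _ in range(trials):
        xbar = sum(rng.gauss(m, 1.0) for _ in range(n)) / n
        if abs(xbar - m) > eps:
            bad += 1
    return bad / trials

p10 = deviation_prob(10)     # fairly large for small n
p500 = deviation_prob(500)   # essentially zero for large n
```

As n grows the empirical deviation probability drops, consistent with the 1/n Chebyshev bound.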
04:55:660Paolo Guiotto: Now, let's see if this sequence verifies this setup. So, in our case…
05:10:310Paolo Guiotto: Well, the variables are… they take a finite number of values, like Bernoulli, even if they are not Bernoulli. In our case, however, since we have to compute the variance, we will do these calculations, so we noticed that the expected value of XK squared
05:30:730Paolo Guiotto: to show that this is in L2, we show that this number is finite. Now, Xk can take just three values, so we have that it is 0 times the probability that Xk
05:46:120Paolo Guiotto: squared is 0, which is the event Xk equals 0,
05:53:530Paolo Guiotto: plus when XK is equal to plus or minus k, so we will have XK squared is equal to k squared times the probability that
06:05:360Paolo Guiotto: If you want, XK is, K.
06:09:450Paolo Guiotto: plus K squared, the probability that XK is equal to minus K.
06:17:220Paolo Guiotto: Now, this term is 0, and the other two are finite, because they are just a finite sum of finite values. Moreover, the two probabilities have the same value, 1 over 2 k log k,
06:31:980Paolo Guiotto: so it is twice one of the two, k squared times 1 over k log k, so…
06:35:750Paolo Guiotto: It is equal to k divided by the log of k.
06:51:650Paolo Guiotto: Yeah.
06:59:30Paolo Guiotto: In any case, this quantity is finite, so the variables are in L2.
07:07:750Paolo Guiotto: So, they are in L2, but…
07:11:80Paolo Guiotto: If you look at the variance, though…
07:26:500Paolo Guiotto: Am I doing something wrong?…
07:31:400Paolo Guiotto: Yeah, the text is correct.
07:47:460Paolo Guiotto: It's… well, the problem is that this quantity
07:51:90Paolo Guiotto: But if you compute the variance of the Xk…
07:57:550Paolo Guiotto: This is the expected value of Xk
08:00:890Paolo Guiotto: squared, minus the square of the expected value of Xk.
08:06:150Paolo Guiotto: Now, the expected value of XK, it will be equal to 0.
08:11:550Paolo Guiotto: Because, you see that it is 0 times the probability that XK is equal to 0.
08:19:290Paolo Guiotto: plus, K times the probability that XK is equal to K.
08:25:710Paolo Guiotto: plus minus K times the probability that XK is equal to minus K, but since the two probabilities are the same, this sum is zero, so this quantity is zero, and therefore the variance is the expected value of XK squared.
08:44:660Paolo Guiotto: And this wouldn't be bounded, so… sorry, probably… We have to…
08:56:880Paolo Guiotto: Yeah, k over log k is not bounded, so probably I copied the…
09:08:740Paolo Guiotto: Well, let's make it, let's modify the exercise, otherwise,
09:13:330Paolo Guiotto: This… this conclusion is not true, so we probably have to take, here,
09:20:890Paolo Guiotto: plus or minus root of n.
09:24:640Paolo Guiotto: I'm sorry.
09:27:590Paolo Guiotto: Well, let me… one second to verify…
09:40:820Paolo Guiotto: Yeah.
09:41:910Paolo Guiotto: We should take this one, because,
09:44:920Paolo Guiotto: And with this… with this, choice, now we… here we get, K…
09:52:10Paolo Guiotto: Times the probability, because it's,
09:54:770Paolo Guiotto: XK is the root of K, so inside here, we have… XK, well, root of K.
10:05:300Paolo Guiotto: and minus root of K.
10:07:590Paolo Guiotto: And the value of the quantity becomes K.
10:11:260Paolo Guiotto: So if that's the probability, here we have this.
10:15:690Paolo Guiotto: And now this is bounded, because this is just 1 over log of k,
10:26:100Paolo Guiotto: which for k greater or equal than 2 is bounded, so we have to correct this here.
10:37:330Paolo Guiotto: And nothing changes for the expected value of XK.
10:42:600Paolo Guiotto: So the variance of XK is equal to 1 over log K.
10:52:210Paolo Guiotto: for K greater or equal than 2. This quantity, since log K is, increasing with K, this is, bounded by 1 over log… log 2.
11:05:160Paolo Guiotto: So, I had not solved this before.
11:09:910Paolo Guiotto: I have still to write the solution, however.
11:13:790Paolo Guiotto: Okay, so this says that condition 3 is verified, condition 2 is verified, and the expected value of the Xk is the common value m.
11:25:790Paolo Guiotto: So, the L2 weak law
11:31:560Paolo Guiotto: applies.
11:35:650Paolo Guiotto: And,
11:37:660Paolo Guiotto: the average, 1 over n, sum for k going from 1 to n of the Xk, goes to the common value m, which is, in this case, equal to 0, in probability.
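The corrected version of exercise 943, with P(Xk = plus or minus root of k) = 1/(2 k log k) for k at least 2 and X1 = 0, can be simulated. A Python sketch (illustrative only; it assumes exactly this corrected distribution):

```python
import math
import random

def draw_x(k, rng):
    """X_k = +sqrt(k) or -sqrt(k), each with probability 1/(2 k log k); else 0."""
    if k < 2:                                     # X_1 taken identically 0, as in the lecture
        return 0.0
    if rng.random() < 1.0 / (k * math.log(k)):    # P(X_k != 0) = 1/(k log k)
        return math.sqrt(k) * rng.choice([-1.0, 1.0])
    return 0.0

def average(n, seed):
    rng = random.Random(seed)
    return sum(draw_x(k, rng) for k in range(1, n + 1)) / n

# Var(X_k) = 1/log k <= 1/log 2, so the L2 weak law gives Xbar_n -> 0 in probability;
# across independent runs the averages should all be tiny.
samples = [abs(average(20000, s)) for s in range(20)]
```

All twenty simulated averages land very close to 0, matching the weak-law conclusion.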
11:53:390Paolo Guiotto: Now, about the second question, there is a second question here.
11:59:910Paolo Guiotto: Which is: show that Xn bar does not
12:15:80Paolo Guiotto: converge
12:18:130Paolo Guiotto: Almost surely.
12:24:550Paolo Guiotto: Okay.
12:26:180Paolo Guiotto: So here, we may start by noticing that Xn bar converges in probability to zero.
12:36:690Paolo Guiotto: So, if Xn bar would converge almost surely
12:42:210Paolo Guiotto: to something, let's say to a quantity X,
12:47:120Paolo Guiotto: Since almost sure convergence implies convergence in probability, we would have that Xn bar should converge in probability to the same X, but it converges to zero, so…
13:00:610Paolo Guiotto: Xn bar should converge almost surely to 0.
13:05:900Paolo Guiotto: So the unique possibility is that,
13:10:50Paolo Guiotto: This XN average converges to zero.
13:17:670Paolo Guiotto: Okay.
14:46:430Paolo Guiotto: Let's see.
14:50:320Paolo Guiotto: Because, of course,
15:00:280Paolo Guiotto: But this was supposed to be correct.
15:45:280Paolo Guiotto: Let me think about, because in this…
16:03:110Paolo Guiotto: Because this means that the prob…
16:06:100Paolo Guiotto: This happens if and only if the probability of the limsup
16:13:460Paolo Guiotto: of the events that the modulus of Xn bar is greater or equal than epsilon,
16:19:620Paolo Guiotto: that set, is equal to zero.
16:24:900Paolo Guiotto: So, we should… This would be implied.
16:32:770Paolo Guiotto: by the fact that the sum of the probabilities that the modulus of Xn bar is
16:40:580Paolo Guiotto: greater or equal than epsilon were convergent, by Borel-Cantelli.
16:49:150Paolo Guiotto: That is a condition to have that this converges almost surely.
16:56:800Paolo Guiotto: The problem is that the variables Xn bar are not independent, because each average contains the previous terms,
17:06:470Paolo Guiotto: So I cannot say that Xn+1 bar is independent of Xn bar.
17:18:609Paolo Guiotto: And moreover… let me just take one second. 1 over n, sum for k going from 1 to…
17:28:119Paolo Guiotto: n of the Xk… Because this is,
17:40:970Paolo Guiotto: okay, going from 1 to n minus 1.
18:22:620Paolo Guiotto: The average Xn… which one?
19:48:210Paolo Guiotto: I, I suspect that…
19:56:460Paolo Guiotto: Let me… just one thing… Let's go back to the initial version.
21:18:110Paolo Guiotto: No, that's not the right modification we have to do.
21:23:410Paolo Guiotto: So we have to keep this like that.
21:36:60Paolo Guiotto: And,
21:47:760Paolo Guiotto: Okay, I have to review this exercise, because there is something wrong, I… I'm not able to…
21:55:990Paolo Guiotto: To see how to handle this, huh?
22:01:700Paolo Guiotto: Because this could work if this were the distribution. The problem is this.
22:10:770Paolo Guiotto: When I attack the second part, I have to prove that the average does not converge almost surely.
22:17:570Paolo Guiotto: Since I know that it converges in probability to zero, if it converged almost surely it would converge to 0. So the goal should be to prove that this is false.
22:26:810Paolo Guiotto: Now, I noticed that if you rewrite the average in this way, separating the Xn… now, probably in the exercise, the Xn should be equal to plus or minus n,
22:40:800Paolo Guiotto: in such a way that, by this algebraic identity, you have that Xn over n is Xn bar minus (n minus 1) over n times Xn-1 bar. But if this goes to 0, this goes to 0, this goes to 1, everything here should go to 0. While on the other side, you have Xn over n, which is, with some positive probability,
23:05:700Paolo Guiotto: plus or minus 1. Now, the question would be: is this possible? Is it possible that this Xn over n goes to 0?
23:15:530Paolo Guiotto: If you have this, that the probability that Xn is equal to plus or minus n is 1 over 2 n log n, then you could say that,
23:27:90Paolo Guiotto: by the… by the Borel-Cantelli lemma,
23:32:200Paolo Guiotto: the probability that modulus Xn Is greater or equal than epsilon.
23:39:210Paolo Guiotto: This would be less or equal than, since,
23:45:80Paolo Guiotto: for n large, epsilon fixed, this would coincide with the probability that Xn is equal to N.
23:57:40Paolo Guiotto: union with Xn equal minus n, because if you are larger than epsilon, since the nonzero values go to infinity, from a certain n
24:09:700Paolo Guiotto: on, if you are greater than epsilon, you have to be either n or minus n. So this probability is just 1 over n log n, because it's the sum of the two probabilities.
24:21:240Paolo Guiotto: So when you do the sum of these,
24:23:890Paolo Guiotto: you get this, and since this is a divergent series,
24:27:880Paolo Guiotto: and the Xn are independent,
24:35:190Paolo Guiotto: the second Borel-Cantelli lemma would say that the probability of the limsup set (this is the case of the divergent series),
24:45:420Paolo Guiotto: the limsup of the events that the modulus of Xn is greater or equal than epsilon times n…
24:58:490Paolo Guiotto: So this is n epsilon. In any case, epsilon is less than 1, so you have to be larger than a number which is less than n, and the only way is to be equal to plus or minus n. So the probability of the limsup would be equal to 1.
25:14:540Paolo Guiotto: And this would mean that the sequence Xn over n cannot go to zero.
25:21:820Paolo Guiotto: So we would get a contradiction. So if Xn bar goes to zero almost surely, we would have that this should go to zero almost surely, but the argument
25:32:130Paolo Guiotto: would say that this is not possible. But this is if this distribution holds.
25:39:380Paolo Guiotto: With the root, it doesn't work.
25:43:00Paolo Guiotto: So, since I took this exercise from somewhere, probably, I do not remember where it was, I have to review this…
25:56:660Paolo Guiotto: It's strange, because if we put the plus or minus n, the variance is not bounded.
26:03:320Paolo Guiotto: The variance, if we take Xk equal to plus or minus k, this becomes k squared, so, as we have seen before, this is like k over log k, which is not bounded, so you cannot apply the
26:20:580Paolo Guiotto: weak law. So… I have to think about this problem, so let's do the next one.
26:32:470Paolo Guiotto: And, I will return on this,
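The obstruction just discussed, in the plus-or-minus-n version, rests on the divergence of the series of 1/(n log n), which is what makes the second Borel-Cantelli lemma bite. A quick numeric check of that divergence (an editorial sketch; the log log N comparison is the standard integral estimate):

```python
import math

def partial_sum(N):
    """Partial sum of sum_{n=2}^{N} 1/(n log n); grows like log log N."""
    return sum(1.0 / (n * math.log(n)) for n in range(2, N + 1))

# The partial sums keep growing without bound; since the X_n are independent,
# the second Borel-Cantelli lemma then gives P(|X_n| = n infinitely often) = 1,
# so X_n / n cannot converge to 0 almost surely in the plus/minus n version.
s3 = partial_sum(10**3)
s6 = partial_sum(10**6)
```

The growth from N = 10^3 to N = 10^6 matches log log 10^6 minus log log 10^3, about 0.69, so the series has no finite sum.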
26:35:880Paolo Guiotto: So the next one is the 944.
26:43:00Paolo Guiotto: It says we have a sequence XN of L1,
26:47:340Paolo Guiotto: random variables, IID, independent, identically distributed.
26:53:260Paolo Guiotto: with the mean value M, with common mean value M.
27:00:850Paolo Guiotto: So, discuss… Almost sure convergence.
27:09:990Paolo Guiotto: for this average: 1 over n, sum for k going from 1 to n of Xk times Xk+1.
27:22:110Paolo Guiotto: So, here…
27:27:10Paolo Guiotto: We have to apply some strong law. So, let's do a reminder
27:39:470Paolo Guiotto: of… we have seen two strong laws: the L4 one, which is not the case here.
27:46:290Paolo Guiotto: The other one is the
27:49:850Paolo Guiotto: L1 strong law of large numbers. It says that if… well, it won't be applied to the XN, because you see we have the average of these things.
28:02:230Paolo Guiotto: So, probably, it should be applied to these variables. So, if we have variables YK, which are in L1.
28:14:190Paolo Guiotto: Independent, identically distributed.
28:18:700Paolo Guiotto: with a common mean value… well, let's give it another name, because little m is the common value of the X
28:29:720Paolo Guiotto: n, so let's call it, I don't know, capital M.
28:34:640Paolo Guiotto: Then, the strong law says that the average of this YK
28:43:670Paolo Guiotto: Converges, almost surely, to the common mean value.
28:49:620Paolo Guiotto: So, we may try to see if, Taking… YK…
28:58:150Paolo Guiotto: equal XK times XK plus 1.
29:03:770Paolo Guiotto: We verify the condition of this theorem.
29:07:570Paolo Guiotto: Let's see…
29:11:870Paolo Guiotto: if the L1 strong law applies.
29:23:440Paolo Guiotto: Okay, so first of all, are these variables in L1?
29:30:410Paolo Guiotto: YK is in L1, because, yes, it is the product of two L1 variables, so in principle, it is in L2, if you want. L2 is contained in L1, and we are done. So, since XK and XK plus 1
29:49:330Paolo Guiotto: No, sorry, I said something wrong. They are not L2, they are only L1, so this argument doesn't work. So we have to compute the expected value directly; it's not a big problem here.
30:04:20Paolo Guiotto: The expected value of the modulus of Yk is the expected value of the modulus of Xk
30:10:400Paolo Guiotto: times Xk+1. So, in general, the product of two L1 functions
30:17:260Paolo Guiotto: is not in L1. The product of two L2 functions is in L1, that's the Cauchy-Schwarz inequality, but not vice versa, okay?
30:27:690Paolo Guiotto: However, here, we can say that this is the modulus of XK times the modulus of XK plus 1, and since they are independent.
30:38:930Paolo Guiotto: this expectation splits into the product of the expectations: the expectation of the modulus of Xk times the expectation of the modulus of Xk+1.
30:50:440Paolo Guiotto: And now, since XK is in L1, both these are finite values.
30:57:140Paolo Guiotto: Because Xk
31:00:110Paolo Guiotto: and Xk+1 are in L1. So that number is finite, and this says that Yk is in L1.
31:12:420Paolo Guiotto: We also need, to apply the law,
31:17:910Paolo Guiotto: to know the mean value of Yk. Moreover…
31:28:500Paolo Guiotto: We have that the expected value of Yk, for the same reason, is the product of the expected value of Xk
31:37:960Paolo Guiotto: times the expected value of XK plus 1.
31:42:920Paolo Guiotto: Each of them is equal to little m, so this is m squared.
31:53:380Paolo Guiotto: Number two: we should check if the Yk are IID, independent, identically distributed.
32:02:400Paolo Guiotto: Well, we can say immediately that they are not independent, because if you take two consecutive YK,
32:13:920Paolo Guiotto: the Yk are not independent.
32:19:500Paolo Guiotto: We may notice…
32:26:30Paolo Guiotto: That.
32:27:910Paolo Guiotto: If you take, I don't know, YK, which is XK,
32:33:180Paolo Guiotto: times XK plus 1, and YK plus 1, two consecutive YK, which is XK plus 1 times XK plus 2,
32:44:990Paolo Guiotto: You see that they both contain the XK plus 1.
32:50:420Paolo Guiotto: So even if DXK are independent, but YK and YK plus 1 have some common factors, they won't be independent.
33:00:120Paolo Guiotto: So… Yk and Yk+1 are not
33:08:900Paolo Guiotto: independent.
33:11:900Paolo Guiotto: However, for this same remark, if we take, for example, Y1,
33:22:960Paolo Guiotto: If I take Y1, which is X1, X2,
33:26:570Paolo Guiotto: Then, I skip Y2, which is X2, X3, and I take Y3, which is, X3X4.
33:38:660Paolo Guiotto: And then I have Y5,
33:42:30Paolo Guiotto: Which is, X5, X6, and so on.
33:47:600Paolo Guiotto: All these are independent, because X1, X2 are independent of X3, X4. All the variables XK are independent. So.
34:00:660Paolo Guiotto: since the Xk
34:07:150Paolo Guiotto: are independent,
34:10:790Paolo Guiotto: in particular, X1 X2 is independent
34:18:690Paolo Guiotto: off.
34:19:969Paolo Guiotto: X3, X4… And also, X5, X6, and so on.
34:28:90Paolo Guiotto: So these Y's are independent, so Y1,
34:34:769Paolo Guiotto: Y3,
34:36:599Paolo Guiotto: Y5, and so on, are independent.
34:44:859Paolo Guiotto: And the same for the Y with the even index: Y2, Y4, Y6, and so on are independent.
35:06:379Paolo Guiotto: So, if we split the family of Y variables into
35:12:169Paolo Guiotto: The variables with the odd index, and the variables with the even index, these two are separately, independent, huh?
35:23:319Paolo Guiotto: Okay… Within each subsequence, they are also IID.
35:30:819Paolo Guiotto: Because, and moreover…
35:41:689Paolo Guiotto: the, Y1, Y3, etc.
35:46:389Paolo Guiotto: have the same distribution.
35:56:369Paolo Guiotto: Because, you see, each of them is a product of two consecutive X's, so, if you want, the CDF of Y1 at a point y is the probability that
36:12:879Paolo Guiotto: X1, X2 is less or equal than Y.
36:17:809Paolo Guiotto: Now, what is this? We can write this as the integral on the set of points where x1 x2 is less or equal than y,
36:32:539Paolo Guiotto: the integral of the, with respect to the law of X1, X2.
36:41:669Paolo Guiotto: But since they are independent.
36:44:699Paolo Guiotto: The joint law is just the product of the laws, so d mu X1, d mu X2.
36:53:729Paolo Guiotto: So this will be a double integral in X1, X2,
36:58:589Paolo Guiotto: And this splits into the product of two measures in X1, X2, and since they are IID,
37:07:129Paolo Guiotto: These, are independent of the index 1 and 2, so they are the same measure.
37:16:819Paolo Guiotto: I'm talking about the Xk… the Xk are identically
37:22:279Paolo Guiotto: distributed, which means they have the same… the same law.
37:35:429Paolo Guiotto: So these laws do not depend on the indices 1 and 2, so if you take any
37:42:159Paolo Guiotto: variable, you get the same. So you see that, at the end, this is the same that you would get for Y3, or for Y5,
37:53:919Paolo Guiotto: So they have the same, distribution.
37:59:309Paolo Guiotto: So these variables are independent, identically distributed.
38:06:589Paolo Guiotto: They are in L1, they have the same mean value, so we can apply the strong law to this set and to this set, and now we have to understand how to
38:19:589Paolo Guiotto: deduce something on this, okay?
38:23:459Paolo Guiotto: So what I can say is that if I apply the strong law to the family Y1, Y3, Y5,
38:31:69Paolo Guiotto: So… the L1 strong law applies to
38:44:359Paolo Guiotto: Y1,
38:46:309Paolo Guiotto: Y3,
38:52:99Paolo Guiotto: etc. And I have that 1 over n, sum for k going from 1 to n of these Y's; they are the Y's of the form…
39:05:19Paolo Guiotto: So we start from one, so technically, we should say 2 times,
39:11:149Paolo Guiotto: maybe, (k minus 1), plus 1,
39:14:289Paolo Guiotto: In such a way that when K is 1, we get the index 1, when K is 2, we get the index 3, and so on.
39:20:789Paolo Guiotto: This goes to the, almost surely…
39:26:949Paolo Guiotto: to the mean value of these Y's, which is, as we computed above, m squared.
39:39:259Paolo Guiotto: And the same…
39:47:389Paolo Guiotto: for the variables Y2, Y4, etc.
39:52:759Paolo Guiotto: So we have 1 over N, the sum for K going from 1 to N. This time, there will be…
40:01:519Paolo Guiotto: Y2k. This, again, goes almost surely to m squared.
40:09:789Paolo Guiotto: So the average of,
40:13:289Paolo Guiotto: odd Y's goes to m squared, the average of even Y's goes to m squared. Now, we have to put these two together to get the average,
40:22:959Paolo Guiotto: a generic average. How can we do so?
40:27:599Paolo Guiotto: When we take an average sum.
40:29:979Paolo Guiotto: 1 over n, sum for k going from 1 to n of these YKs.
40:36:299Paolo Guiotto: So as you may imagine, this average contains all the Y's, no? So it's 1 over N…
40:43:629Paolo Guiotto: Then there will be Y1 plus Y2 plus Y3, etc. plus YN.
40:52:689Paolo Guiotto: So here you see that we could separate the odd Y's from the even Y's.
41:03:449Paolo Guiotto: And split this into two sums.
41:06:719Paolo Guiotto: So, let's see how to write it, but intuitively it should be more or less easy. So, to fix ideas, let's take, for example, an n which is even. Okay, so say that n is equal to
41:24:759Paolo Guiotto: 2 times capital N.
41:30:369Paolo Guiotto: So this is, 1 over 2 capital N, so this is the sum Y1
41:36:649Paolo Guiotto: Plus, it ends exactly at Y2N.
41:41:439Paolo Guiotto: So we can split this into Y1 plus Y3 plus… and the last one will be Y2N minus 1.
41:54:699Paolo Guiotto: plus the remaining variables, Y2 plus Y4, up to the last one, Y2N.
42:04:669Paolo Guiotto: And then splitting this into two sub-averages. So we have 1 over 2N, sum, here, for k going from, what, from 1 to N,
42:20:509Paolo Guiotto: of Y2k.
42:23:559Paolo Guiotto: Right?
42:25:229Paolo Guiotto: Then the other one is 1 over 2N. This one…
42:29:459Paolo Guiotto: would be the sum for k going from 1 to, let's see, what is it? Of Y 2(k minus 1) plus 1, in such a way that when k is 1, we get 1.
42:44:469Paolo Guiotto: So the last K, will be what?
42:50:549Paolo Guiotto: If I put k equal to N, I get exactly 2N minus 2 plus 1, so 2N minus 1, the last odd index. So the upper limit is just N, this one.
43:03:569Paolo Guiotto: However, that's a detail. Now, you see that we have a one half that multiplies the average of the Y2k:
43:12:599Paolo Guiotto: okay, 1 over N, sum for k from 1 to N of Y2k,
43:16:469Paolo Guiotto: plus 1 over N, sum for k going from
43:21:189Paolo Guiotto: 1 to N of the Y 2(k minus 1) plus 1.
43:28:349Paolo Guiotto: So these are the two averages of the Y with the even index and the Y with the odd index. But since we know that this one almost surely goes to m squared, and this one, for the same reason, almost surely goes to m squared,
43:47:459Paolo Guiotto: everything will go almost surely to one half of m squared plus m squared.
43:54:929Paolo Guiotto: So, at the end, it will go to m squared.
44:00:209Paolo Guiotto: So we obtained that the average of the Yk
44:06:189Paolo Guiotto: goes, almost surely, to m squared.
44:11:329Paolo Guiotto: And this is the answer.
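The conclusion of exercise 944 is easy to check by simulation. A Python sketch (the Exponential(1) distribution, with m = 1, is an arbitrary illustrative choice; the lecture's Xk are a generic iid L1 sequence):

```python
import random

def pair_average(n, seed=0):
    """(1/n) * sum_{k=1}^{n} X_k X_{k+1} for X_k iid Exponential(1), so m = 1."""
    rng = random.Random(seed)
    xs = [rng.expovariate(1.0) for _ in range(n + 1)]
    return sum(xs[k] * xs[k + 1] for k in range(n)) / n

# The Y_k = X_k X_{k+1} are not independent (consecutive ones share a factor),
# but the odd- and even-indexed subfamilies are each iid, so the average
# converges almost surely to m^2 = 1, as argued in the lecture.
est = pair_average(200_000)
```

For large n the simulated average sits close to m squared = 1, matching the odd/even splitting argument.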
44:20:879Paolo Guiotto: Okay.
44:24:789Paolo Guiotto: Now, 945…
44:30:989Paolo Guiotto: asks to compute this thing. Compute… the limit…
44:38:879Paolo Guiotto: when n goes to plus infinity, of this integral on (0,1) to the n:
44:47:109Paolo Guiotto: We have this ratio: X1 squared plus X2 squared plus… plus Xn squared, divided by X1
44:58:569Paolo Guiotto: plus X2, plus etc, plus XN.
45:02:689Paolo Guiotto: dX1…
45:04:719Paolo Guiotto: DXN.
45:08:839Paolo Guiotto: Now, this is just an n-dimensional integral.
45:17:629Paolo Guiotto: It's, however…
45:19:629Paolo Guiotto: Maybe it's not so, simple to be computed.
45:28:699Paolo Guiotto: I don't know if there is a direct method to do this.
45:32:669Paolo Guiotto: However, there is this hint: it says, think that this dX1, dX…
45:47:89Paolo Guiotto: dXn can be seen as the joint distribution of
45:53:709Paolo Guiotto: an array of variables where the Xk are independent,
46:01:759Paolo Guiotto: Independent, identically distributed, with the distribution uniform on 01.
46:10:869Paolo Guiotto: So, use this idea and see what happens.
46:15:399Paolo Guiotto: So, let's first follow the hint.
46:19:839Paolo Guiotto: So imagine that we have these variables, XK, which are independent, identically distributed, with uniform distribution in 01.
46:31:159Paolo Guiotto: what happens to the joint law? Now, the joint law would be
46:37:119Paolo Guiotto: In this case, there would be… since,
46:40:49Paolo Guiotto: each Xk is uniform in (0,1), and it means that there is a density
46:47:569Paolo Guiotto: for the XK, and the density for the XK is, in this case, it is the indicator of the interval, 01 divided by the length of the interval, which is just 1.
46:59:579Paolo Guiotto: So this is the density.
47:01:709Paolo Guiotto: Now, when we say that the Xk are independent,
47:10:879Paolo Guiotto: We know that there is a joint density for the joint distribution.
47:17:879Paolo Guiotto: And the joint density, which is a product, which is a function of an array of variables, is just the product of one variable densities.
47:32:729Paolo Guiotto: So this means that it becomes: indicator of (0,1)
47:38:159Paolo Guiotto: of X1, indicator 01 of X2, etc, indicator 01 of XN.
47:45:569Paolo Guiotto: So, in particular, if you have to compute the expected value.
47:53:159Paolo Guiotto: of a generic function phi of the variables X1, Xn.
47:59:69Paolo Guiotto: This would be the integral on Rn.
48:02:919Paolo Guiotto: of the function phi as a numerical function. This is a…
48:07:819Paolo Guiotto: an ordinary integral on Rn in the low, in the joint law of X1XN.
48:16:699Paolo Guiotto: But since there is a density, no, this means that here we have a density f of X1, …, Xn, at
48:27:589Paolo Guiotto: little x1, …, little xn, times the Lebesgue measure dx1 … dxn.
48:35:469Paolo Guiotto: And since, in particular, this density is this.
48:39:919Paolo Guiotto: It means that we have an integral on Rn of the function phi X1.
48:46:159Paolo Guiotto: XN indicator 01 for X1, indicator 01 for X2, indicator 01 for XN.
48:57:439Paolo Guiotto: in DX1.
49:01:769Paolo Guiotto: DXN.
49:04:69Paolo Guiotto: And this means that the integral is, in fact, an integral for x1 between 0 and 1,
49:10:689Paolo Guiotto: for X2, between 0 and 1.
49:14:309Paolo Guiotto: et cetera, for XN between 0 and 1.
49:19:789Paolo Guiotto: of the function phi, x1, xn, in DX1, DXN.
49:29:119Paolo Guiotto: And that's exactly the integral on the cube.
49:34:449Paolo Guiotto: hypercube, (0,1) to the n, of the function phi of x1,
49:40:619Paolo Guiotto: …, xn, in dx1,
49:44:509Paolo Guiotto: …, dxn.
49:47:589Paolo Guiotto: Okay, so this is where…
49:50:469Paolo Guiotto: what comes from the hint. It comes that whenever we have to compute an expectation of phi, capital X1, capital XN, this can be written as an integral on the
50:06:569Paolo Guiotto: Hypercube, 01 to the n of the numerical function phi, X1.
50:12:129Paolo Guiotto: XN… DX1 DXN.
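The identity just restated, that E[phi(X1, …, Xn)] equals the integral of phi over the cube (0,1)^n when the Xk are iid uniform, can be sanity-checked by Monte Carlo against a case where the integral is known in closed form (phi(x1, x2) = x1 x2, whose integral over (0,1)^2 is 1/4; this test function is an illustrative choice, not from the lecture):

```python
import random

def mc_expectation(phi, n_vars, samples=200_000, seed=1):
    """Monte Carlo estimate of E[phi(X_1,...,X_n)] for X_k iid Uniform(0,1).

    By the identity E[phi] = integral of phi over (0,1)^n, this also
    estimates the n-dimensional integral of phi over the unit cube.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(samples):
        total += phi([rng.random() for _ in range(n_vars)])
    return total / samples

# For phi(x1, x2) = x1 * x2 the integral over (0,1)^2 is exactly 1/4.
val = mc_expectation(lambda x: x[0] * x[1], 2)
```

The estimate lands near 1/4, as the expectation-integral identity predicts.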
50:16:949Paolo Guiotto: And this is where there is the attack point to solve the exercise, because we have to compute an object like this one.
50:25:149Paolo Guiotto: And the trick is to use this in… to rewrite this in this form. Let's see why.
50:31:769Paolo Guiotto: So, when we have the integral on the domain 01 to the n of X1 squared plus
50:40:739Paolo Guiotto: xn squared, divided by x1 plus etc., plus xn,
50:47:759Paolo Guiotto: in DX1.
50:49:499Paolo Guiotto: DXN.
50:51:709Paolo Guiotto: We can look at this as the function phi.
50:55:669Paolo Guiotto: So this can be written as an expectation.
51:00:779Paolo Guiotto: of a ratio where we have the numerator is capital X1 squared plus capital X2 squared plus capital XN
51:09:939Paolo Guiotto: squared, divided by X1 plus … plus Xn.
51:17:439Paolo Guiotto: Now, what difference is there?
51:21:789Paolo Guiotto: Of course, here we are doing
51:24:19Paolo Guiotto: An exercise, on,
51:27:399Paolo Guiotto: The law of large numbers, so this will have to do with some average.
51:32:499Paolo Guiotto: Now the point is that if I want to compute the limit, Let's say directly.
51:39:839Paolo Guiotto: I could first try to compute the integral, and then to do the limit. This could be one strategy.
51:48:69Paolo Guiotto: You notice that, yes, it is a limit of integrals, but here it's not easy to apply things like dominated convergence or monotone convergence, because n is,
52:01:939Paolo Guiotto: In many parts.
52:04:469Paolo Guiotto: It is changing the domain, but that might not necessarily be a problem. The problem is that it is also changing the number of variables, so it is changing this measure.
52:14:869Paolo Guiotto: Normally, when we apply.
52:17:29Paolo Guiotto: these limit theorems, we need to have a fixed measure and a moving function inside the integral. We can move the domain into the function by using the indicator, but here it is not a domain that gets bigger or smaller,
52:31:899Paolo Guiotto: it changes dimension, so it's as if we are changing space, so it's not a fixed integral with respect to a fixed measure.
52:40:729Paolo Guiotto: Well, down here… This expectation…
52:47:49Paolo Guiotto: is now a fixed-measure problem, because this is an integral with respect to the probability measure, which is independent of n. What is moving with n is inside: it is this function. So here we may think that, perhaps,
53:04:659Paolo Guiotto: some limit theorem can be used. Let's see if this is the case. Now, since we are in the…
53:15:219Paolo Guiotto: the part with law of large numbers, what we do is we divide, we transform this into averages by dividing by n, both numerator and denominator, so multiplying by 1 over n.
53:30:419Paolo Guiotto: So, as you can see, this now becomes the expected value of… numerator is the average of the Xn square.
53:39:639Paolo Guiotto: divided… denominator is the average of the XN.
53:46:449Paolo Guiotto: So let's see what happens to these averages.
53:50:199Paolo Guiotto: Since the Xk are in L1, because they are uniform…
53:59:79Paolo Guiotto: Xk is a
54:03:139Paolo Guiotto: uniformly distributed random variable, so it has a finite expectation.
54:10:759Paolo Guiotto: Also, you know that the expected value of the Xk
54:15:879Paolo Guiotto: is, in this case, is the integral from 0 to 1 of X dx, so it is equal to 1 half.
54:24:879Paolo Guiotto: So, these variables are independent, identically distributed, with the common mean value equal 1 half.
54:35:409Paolo Guiotto: The average of the Xn:
54:39:89Paolo Guiotto: the strong law, the L1 strong law of large numbers, says that this goes almost surely to the common mean value, which is 1 half.
54:53:689Paolo Guiotto: So this denominator goes, almost surely, to one half.
55:00:659Paolo Guiotto: About the numerator, well, we have here the XK square.
55:06:709Paolo Guiotto: But they also are in L1, as we will see in a second.
55:11:149Paolo Guiotto: Still, because they are uniform, the expected value of XK squared is the integral of x squared, so
55:31:99Paolo Guiotto: x cubed over 3 between 0 and 1, so it is 1 third.
55:35:939Paolo Guiotto: It is finite, and it is a constant.
55:39:759Paolo Guiotto: Again, the XK squared are independent, and they have the same distribution.
55:46:669Paolo Guiotto: So they are IID. So the L1 strong law applies to them.
55:53:819Paolo Guiotto: And we have that the average of the Xn squared
55:59:69Paolo Guiotto: converges, almost surely, to the common mean value, which is, in this case, 1 third.
56:05:659Paolo Guiotto: So this means that in this fraction, also the numerator converges almost surely
56:12:679Paolo Guiotto: to… to something.
56:14:399Paolo Guiotto: So we can say that…
56:17:759Paolo Guiotto: This is the first conclusion: the ratio, the average of the squares divided by the average,
56:25:179Paolo Guiotto: converges almost surely, the numerator to 1 third, the denominator to 1 half.
56:32:439Paolo Guiotto: The fraction converges to 2 thirds.
56:37:549Paolo Guiotto: So we have that the quantities inside the expectation converge pointwise to 2 thirds.
56:46:749Paolo Guiotto: Now, to apply one of the known limit theorems, which are dominated convergence or monotone convergence… for example, dominated convergence, which is
56:59:639Paolo Guiotto: in general more flexible.
57:03:139Paolo Guiotto: We need to dominate the convergence.
57:06:429Paolo Guiotto: So… to apply it.
57:13:909Paolo Guiotto: Or better, we may say…
57:15:949Paolo Guiotto: So… this would say that when we take the expected value of that ratio,
57:22:899Paolo Guiotto: The average of the square divided by the average of the variables.
57:28:719Paolo Guiotto: Since this fraction goes to 2 thirds, we may think, let's say, that this expectation will go to the expectation of 2 thirds.
57:40:319Paolo Guiotto: Right? And this, of course, is a constant, so it gives the value 2 thirds.
57:46:269Paolo Guiotto: Now, this means, equivalently, that we can do the limit in N of the expectation of these fractions, Xn squared average divided XN average.
58:01:109Paolo Guiotto: This is true if we can carry…
58:04:569Paolo Guiotto: the limit inside. So the point is, can we do that?
58:10:319Paolo Guiotto: If we can do that, We have now the conclusion.
58:18:569Paolo Guiotto: And this would be 2 thirds, and therefore we would have the conclusion.
58:23:209Paolo Guiotto: So the problem is, can we justify this passage? To justify this, we need some limit theorem,
58:37:599Paolo Guiotto: like dominated convergence.
58:45:999Paolo Guiotto: So, we already know that we have the, almost sure, the pointwise convergence.
58:51:399Paolo Guiotto: We know that the Xn squared average divided by the Xn
59:02:09Paolo Guiotto: average goes almost surely, so with probability 1, to 2 thirds.
59:08:619Paolo Guiotto: Which is the first requirement of the theorem. What you have inside, convergence point-wise.
59:14:839Paolo Guiotto: Almost surely, almost everywhere, to something.
59:19:399Paolo Guiotto: Second, we need a domination of this, so we need to take the absolute value of this, Xn squared average divided by Xn average, and say that it is less or equal than some Y which is in L1 of Omega. Here, the space is the probability space.
59:37:969Paolo Guiotto: Recall that since we are in a probability space, also constants are good bounds.
59:44:249Paolo Guiotto: We're not dealing with the Lebesgue measure on R, where a constant is not integrable; on a probability space, this works.
59:51:939Paolo Guiotto: And this is important, because if we go back to this ratio, well, this ratio, first of all, everything is positive.
00:00:399Paolo Guiotto: Because, recall that the XK are uniformly distributed in (0, 1), so they take values in (0, 1) with probability 1, so in particular they are positive.
00:15:89Paolo Guiotto: So each XK is between 0 and 1 with
00:22:109Paolo Guiotto: probability 1.
00:24:929Paolo Guiotto: So this ratio is the ratio between the averages.
00:29:269Paolo Guiotto: the average of the squares divided by the average of the Xn, but going back, this was this ratio, this one.
00:40:329Paolo Guiotto: So, it is…
00:48:869Paolo Guiotto: X1 squared plus… plus Xn squared, divided by X1 plus… plus Xn.
00:59:969Paolo Guiotto: Okay, now, here this information is, essential.
01:05:909Paolo Guiotto: Because since each of the XK is between 0 and 1, what can be said about the square of XK? It is clearly positive.
01:16:229Paolo Guiotto: But also less than 1.
01:18:479Paolo Guiotto: A bit more than this.
01:21:769Paolo Guiotto: When you do the square of a number between 0 and 1, that square will be less than the number.
01:28:549Paolo Guiotto: Okay, so I can say that this is actually less than XK, which is less than 1.
01:36:749Paolo Guiotto: And this is important, because it means that
01:40:319Paolo Guiotto: If I take this one, this is less or equal than this one.
01:46:279Paolo Guiotto: If I take the second one, this is less or equal than the second one.
01:50:929Paolo Guiotto: So each of the elements in the sum of the numerator is less or equal than the corresponding elements of the sum of the denominator. So it means, in particular, that the denominator will be bigger than the numerator. So that sum is less or equal than 1.
02:09:09Paolo Guiotto: And therefore this is the Y we need there, because a constant is in L1
02:16:169Paolo Guiotto: of the probability space, Omega,
02:20:679Paolo Guiotto: whatever the probability measure P is. So, this means that dominated… convergence…
02:28:739Paolo Guiotto: applies, and the conclusion follows.
02:37:849Paolo Guiotto: So, at the end, the answer is: the limit
02:42:549Paolo Guiotto: of these integrals over (0, 1) to the n of X1 squared plus… plus Xn
02:50:579Paolo Guiotto: squared, divided by X1 plus… plus Xn,
02:56:169Paolo Guiotto: in dx1…
02:58:129Paolo Guiotto: dxn, this limit is 2 thirds.
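The limit just computed lends itself to a quick numerical sanity check. Below is a minimal Monte Carlo sketch (the function names `ratio_sample` and `estimate` are ours, not from the exercise) that samples the ratio for uniform variables and averages it; by the argument above, for large n the result should sit near 2/3.

```python
import random

def ratio_sample(n, rng):
    # One draw of (X1^2 + ... + Xn^2) / (X1 + ... + Xn) with Xk ~ U(0, 1).
    # Note the ratio is at most 1, since x^2 <= x on (0, 1): the DCT bound.
    xs = [rng.random() for _ in range(n)]
    return sum(x * x for x in xs) / sum(xs)

def estimate(n, trials, seed=0):
    # Monte Carlo estimate of E[(X1^2 + ... + Xn^2) / (X1 + ... + Xn)]
    rng = random.Random(seed)
    return sum(ratio_sample(n, rng) for _ in range(trials)) / trials

# Should approach (1/3) / (1/2) = 2/3 as n grows.
print(estimate(n=2_000, trials=100))
```

Each sample is itself very concentrated near 2/3 for large n, which is exactly the almost-sure convergence the strong law provides.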
03:09:649Paolo Guiotto: This is something that, it's interesting to have seen once, because it's a very… Strange and unusual.
03:20:449Paolo Guiotto: Application of probability to a problem that seems to have nothing to do with the probability in principle.
03:30:409Paolo Guiotto: Okay, so, do the exercise.
03:38:639Paolo Guiotto: 946, and, I will review the 94… 3?
03:45:589Paolo Guiotto: So, I will probably modify the assignment, so…
03:51:229Paolo Guiotto: wait, until I check clearly that exercise.
03:57:469Paolo Guiotto: We have to talk about the last topic here. So we've talked about the law of large numbers, and we have another important fact, which is the central limit
04:15:549Paolo Guiotto: theorem.
04:22:819Paolo Guiotto: It starts from the law of large numbers, so… So assume… I don't want to be…
04:37:579Paolo Guiotto: particularly precise for the moment about the assumptions. However, assume that we have variables XK, which are, at least in L1,
04:48:349Paolo Guiotto: IID…
04:53:459Paolo Guiotto: So, in particular, we have a common mean value. The expectation of XK is constantly equal to the value M. So we know that the L1 strong law of large numbers says that the average
05:13:399Paolo Guiotto: 1 over n sum for k, going from 1 to n of the XK.
05:17:779Paolo Guiotto: Goes, almost surely, to this number, M.
05:22:849Paolo Guiotto: Or, carrying the M to the left-hand side, we have that
05:27:99Paolo Guiotto: 1 over n sum for k going from 1 to n of XK minus M goes to 0,
05:36:739Paolo Guiotto: almost surely.
05:39:849Paolo Guiotto: Now, if we, compute the, variance of this thing.
05:47:989Paolo Guiotto: So, if we take the variance, we notice that…
06:06:399Paolo Guiotto: Okay, so we noticed that if we do the variance.
06:11:779Paolo Guiotto: of this variable, 1 over n, sum for k going from 1 to N.
06:17:799Paolo Guiotto: of XK. The mean of this is 0, we already know, because we subtracted the mean value. We take the variance of this.
06:29:239Paolo Guiotto: Now, the variance is the expected value of the square.
06:34:629Paolo Guiotto: Minus this square of the expected value, right?
06:39:449Paolo Guiotto: But, we subtracted the mean value, and easily this quantity is zero, okay?
06:46:359Paolo Guiotto: So we have to compute only the expected value of the square. So we have this 1 squared, 1 over n squared, then we have the expectation of the square of the sum over K of XK minus M.
07:04:989Paolo Guiotto: Well, this square gives the sum of the squares, so we have the sum over K of XK minus M,
07:13:429Paolo Guiotto: squared. Then we have the mixed products, so the sum over K different from J of XK minus M times XJ
07:25:339Paolo Guiotto: minus M.
07:27:569Paolo Guiotto: And when we take the expectation of all this thing.
07:32:299Paolo Guiotto: we get. Now, expectation of this sum goes out, so it becomes the sum of a K of the expectations, XK minus m squared. But as you can see, this is the variance of XK.
07:48:829Paolo Guiotto: Okay? And since the XK are not only independent, but they have the same distribution, this is a fixed value, okay? This is the variance of XK
08:03:89Paolo Guiotto: that we can denote by sigma squared, a constant, to emphasize the fact that it is positive.
08:10:379Paolo Guiotto: Plus, then here we have the sum of a K different from J. In this case, we should have the covariance of XK and XJ, so XK minus M, XJ minus M. But since they are independent, here the two indexes are different.
08:29:29Paolo Guiotto: So by independence, this splits into the product of the expectations, expectation of XK minus M times the expectation of XJ minus M.
08:40:539Paolo Guiotto: And these are both zero, because M is just the expectation of XK.
08:45:119Paolo Guiotto: So, at the end, we get…
08:47:989Paolo Guiotto: 1 over n… sorry, n squared.
08:53:99Paolo Guiotto: Right, n squared,
08:55:279Paolo Guiotto: times the sum for k going from 1 to n of this constant sigma squared, so this yields 1 over n squared times n sigma squared.
09:07:379Paolo Guiotto: So, the, variance… of this average, 1 over n sum.
09:15:859Paolo Guiotto: for K going from 1 to n of XK,
09:20:119Paolo Guiotto: minus M, this variance is sigma square over n.
09:26:759Paolo Guiotto: So now, the trick is that if we carry the N on the left side.
09:31:589Paolo Guiotto: And we put inside the variance. The variance is the expected value of a square, so when you go inside, this becomes a root of n.
09:44:269Paolo Guiotto: So this says that, the variance… of…
09:49:319Paolo Guiotto: 1 over root of n. Well, we can also divide by sigma squared, so carry sigma square here, and carry inside, this becomes a sigma.
10:00:439Paolo Guiotto: So, sigma root of n sum for k going from 1 to n of XK minus the mean value.
10:09:19Paolo Guiotto: This variance is now constantly equal to 1, for every n. Okay?
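The two variance identities just derived can be summarized compactly (a LaTeX transcription of the computation above, with $m$ the common mean and $\sigma^2$ the common variance of the i.i.d. $X_k$):

```latex
\operatorname{Var}\!\left(\frac{1}{n}\sum_{k=1}^{n}(X_k-m)\right)
  = \frac{1}{n^{2}}\sum_{k=1}^{n}\operatorname{Var}(X_k)
  = \frac{\sigma^{2}}{n},
\qquad
\operatorname{Var}\!\left(\frac{1}{\sigma\sqrt{n}}\sum_{k=1}^{n}(X_k-m)\right) = 1.
```

The cross terms vanish by independence, which is why only the diagonal sum of variances survives.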
10:17:639Paolo Guiotto: So we have this, and we remind also that the expected value of this quantity is 1 over sigma root of n.
10:26:599Paolo Guiotto: sum for k going from 1 to n of XK minus M.
10:33:449Paolo Guiotto: Now, the expected value, this constant goes outside, the sum goes outside, we have the sum of the expectations of XK minus M, which is 0, because M is the expectation of the XK, so this is 0.
10:47:399Paolo Guiotto: So these variables here… These particular variables.
10:53:879Paolo Guiotto: have mean 0 and variance 1.
10:57:529Paolo Guiotto: Now, what is remarkable is the following fact.
11:02:129Paolo Guiotto: that when we send n to infinity, it turns out that they have a non-trivial limit with mean 0 and variance 1, and that's the standard Gaussian.
11:13:359Paolo Guiotto: So, it happens that if the XK… now, to prove… to give a proof, we will take a little bit more restrictive assumption, so we assume that they are in L2.
11:25:429Paolo Guiotto: Instead of just L1.
11:29:119Paolo Guiotto: However, I, I…
11:32:789Paolo Guiotto: Yeah, here, also, I needed that they are in L2, because I'm talking about their variances, so, in fact, but, so, let's say that DXKR in L2 omega.
11:46:529Paolo Guiotto: independent, identically distributed random variables with common mean value. As we said, we call it M,
11:57:849Paolo Guiotto: And, common variants.
12:01:429Paolo Guiotto: for this XK equal Sigma square.
12:07:329Paolo Guiotto: Okay, so it happens that when you take
12:13:819Paolo Guiotto: 1 over sigma root of n, the sum for k going from 1 to n of XK minus M,
12:22:59Paolo Guiotto: This is a sequence of variables in N that converges in distribution to a standard normal
12:33:989Paolo Guiotto: So this in particular, normally, this statement is also written in this way.
12:44:549Paolo Guiotto: You may recall that if we have a convergence in distribution… if YN…
12:53:729Paolo Guiotto: converges in distribution to some, say, Y,
13:00:179Paolo Guiotto: the original idea of convergence in distribution was to say that the probability that YN belongs to some interval, AB,
13:11:619Paolo Guiotto: converges to the probability that Y belongs to the interval AB. Now, we have seen that this is,
13:19:229Paolo Guiotto: not a good definition, because, for example, the Dirac delta that
13:24:749Paolo Guiotto: is concentrated on some point that goes somewhere won't converge in this sense. But this becomes true,
13:32:369Paolo Guiotto: we have not proved it, if the points A, B, the endpoints of the interval, are continuity points for the CDF. So, this is for every A and B
13:47:949Paolo Guiotto: Continuity. Points.
13:51:719Paolo Guiotto: for… the CDF of the limit.
13:56:759Paolo Guiotto: Now, in this particular case, since the limit is an absolutely continuous random variable, a Gaussian has a density is absolutely continuous, in particular.
14:07:469Paolo Guiotto: the CDF is always continuous, it's actually differentiable, it has the derivative, and the derivative is the density. So it means that this conclusion is verified.
14:19:349Paolo Guiotto: If Y
14:21:489Paolo Guiotto: is absolutely
14:24:309Paolo Guiotto: continuous, so if the limit random variable, not necessarily the YN, is absolutely continuous, this… is… always
14:39:859Paolo Guiotto: true.
14:42:09Paolo Guiotto: So… In particular, for the case of the central limit theorem, we can say that
14:48:59Paolo Guiotto: So, the probability that… that… it's not an average, because you see that the rescaling is through the root of n, not by n.
14:59:139Paolo Guiotto: So 1 over sigma root of n, summed for K going from 1 to N, XK minus
15:07:599Paolo Guiotto: M,
15:08:749Paolo Guiotto: the probability that this is between A and B,
15:12:489Paolo Guiotto: So that variable belongs to the interval AB, converges to the probability that the limit, so a standard Gaussian is between A and B, which is the integral from A to B of E to minus X squared over 2 divided root of 2 pi.
15:31:489Paolo Guiotto: in dx, so…
15:32:869Paolo Guiotto: This is how often is written the central limit theorem. It's basically equivalent to that formulation we have seen here.
15:42:699Paolo Guiotto: Okay? It's the same thing written in two different ways. Here it emphasizes a bit that the point is that it doesn't matter what the XK are.
15:55:519Paolo Guiotto: They can be whatever, because the only condition is that they are independent, with identical distribution.
16:01:899Paolo Guiotto: And this is not an average. If you do the average, this is the strong law of large numbers, and the average of this will go to zero.
16:16:839Paolo Guiotto: Okay, so they concentrate at zero. If you rescale a bit less, with the root of n, the distributions do not trivialize, and the limit turns out to be a Gaussian.
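The interval form of the theorem is easy to illustrate numerically. Here is a simulation sketch under our own choices (Xk uniform on (0, 1), and the helper name `clt_probability` is hypothetical, not from the lecture), comparing the empirical probability with the Gaussian integral:

```python
import random
from statistics import NormalDist

def clt_probability(n, trials, a, b, seed=1):
    # Empirical P(a <= (1/(sigma*sqrt(n))) * sum_{k=1}^n (Xk - m) <= b)
    # for Xk ~ U(0, 1), which has m = 1/2 and sigma^2 = 1/12.
    m, sigma = 0.5, (1 / 12) ** 0.5
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        z = sum(rng.random() - m for _ in range(n)) / (sigma * n ** 0.5)
        hits += a <= z <= b
    return hits / trials

a, b = -1.0, 1.0
# The CLT says the empirical value approaches Phi(b) - Phi(a), the integral
# of the standard Gaussian density over [a, b].
print(clt_probability(n=200, trials=20_000, a=a, b=b))
print(NormalDist().cdf(b) - NormalDist().cdf(a))
```

For [a, b] = [-1, 1] the Gaussian integral is about 0.68, and the simulated frequency should be close to it already for moderate n, in line with the remark that the XK themselves "can be whatever".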
16:28:49Paolo Guiotto: Okay.
16:29:629Paolo Guiotto: Oh, the proof is
16:33:289Paolo Guiotto: not particularly complicated, and nice, let's say. Let's at least see how it works.
16:44:999Paolo Guiotto: So to show the convergence in distribution, we do, we, we work with characteristic functions. So let's call yn the 1 over sigma root of n.
16:59:269Paolo Guiotto: sum for k going from 1 to N of XK minus M.
17:06:539Paolo Guiotto: Well, for simplicity, for computational simplicity, we assume M equals 0, okay? Otherwise, you replace the variable XK by XK minus M, and now you have mean zero variables, okay? So, we assume…
17:26:999Paolo Guiotto: M equals 0.
17:28:949Paolo Guiotto: So, let's compute the characteristic function of this YN.
17:34:249Paolo Guiotto: And let's show that we can send N to infinity and prove that this goes to the characteristic function of the standard Gaussian. So the goal is…
17:44:579Paolo Guiotto: The goal is to show
17:46:869Paolo Guiotto: that this thing goes to the characteristic function of a standard Gaussian, mean zero variance 1, which is easy, it is E minus C squared over 2. That's the characteristic function for the standard Gaussian.
18:03:419Paolo Guiotto: Now, if we do this calculation, we get this is the expected value of e to iC, then we have to put this 1 over sigma root of n, the sum for k going from 1 to n of the XK, so we set that M equals 0.
18:23:119Paolo Guiotto: Now, for the first part, it's a straightforward calculation. First, we carry out the sum on this exponent, and down here, it becomes a product. So we get the expectation of the product on K e to i.
18:37:859Paolo Guiotto: Well, let's put this C over sigma root of n here, XK, in this way. Then we use independence.
18:48:389Paolo Guiotto: We know that the XK are independent, so we carry outside this product. So this becomes the product of the expectations.
18:57:579Paolo Guiotto: e to iC over sigma root of n xk. And that's the characteristic function of the XK evaluated at C over sigma root of n.
19:13:219Paolo Guiotto: Now, these characteristic functions are independent of K, because the variables are identically distributed, so they have the same phi, okay? So this is a unique phi, because the XK are
19:29:639Paolo Guiotto: not only independent, but also identically distributed, evaluated at C over sigma root of n.
19:38:679Paolo Guiotto: Now, the idea is that, so, phi Yn, C,
19:44:09Paolo Guiotto: is the product over K of these functions phi evaluated at C divided by sigma root of n.
19:53:419Paolo Guiotto: Remind that k is going from 1 to n, and we have to send n to infinity. So when we send n to infinity, when n goes to infinity, the idea is that the argument of this characteristic function is going to zero.
20:06:979Paolo Guiotto: So we now use the Maclaurin expansion to see what that is. So phi of, say for a second, U, is equal to phi of 0,
20:17:839Paolo Guiotto: plus phi prime 0. U plus 1 half, we need to go to the second derivative, as you will see, phi second zero, u squared plus a little correction of u squared.
20:33:709Paolo Guiotto: What are the values here? Well, each characteristic function, phi of 0 is always equal to 1.
20:40:979Paolo Guiotto: So that's one.
20:43:279Paolo Guiotto: What is this phi prime of 0? We computed it in an exercise last time. We have also seen in general that when you do the derivative with respect to C of phi of C, which is the expected value of
20:59:489Paolo Guiotto: e to iXC, for a variable X… this is
21:04:909Paolo Guiotto: an expectation, so we do not justify it again, but it's the usual differentiation theorem under the integral sign. So down comes an iX: it becomes iX e to iXC. So when you evaluate the derivative at 0, you get the expected value of iX, or i times
21:26:769Paolo Guiotto: the expected value of X,
21:28:949Paolo Guiotto: which is, in this case, equal to 0, because we are assuming that M is 0. Otherwise, you will have that M.
21:36:849Paolo Guiotto: to carry around. In any case here, we would have X minus M for the true proof, so we will have always zero there, okay?
21:46:29Paolo Guiotto: So this number is zero, and that's why we need the next term, because otherwise this would not be a precise
21:56:99Paolo Guiotto: approximation formula. For the second derivative at zero, well, as you can imagine, when I differentiate this formula again, down comes another iX. So when you set C equals 0, you get the expectation of
22:09:449Paolo Guiotto: iX, squared.
22:12:519Paolo Guiotto: And this yields minus the expected value of X squared.
22:17:239Paolo Guiotto: Because i squared is minus 1. So this quantity is 1 half times minus the expected value of X squared, and the expected value of X squared is the variance, sigma squared, as we know. So the second derivative at zero is
22:32:919Paolo Guiotto: minus sigma squared.
22:35:649Paolo Guiotto: So…
22:37:99Paolo Guiotto: Let's say that the formula is: phi of u equals phi of 0, which is 1, plus phi prime 0 times u, which is 0, and then we have minus sigma squared over 2, u squared, plus a little o of u squared.
22:54:659Paolo Guiotto: That's the Maclaurin formula for a generic characteristic function where the mean value is 0 and the variance is sigma squared.
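Writing the expansion just described out in LaTeX (for a variable X with mean 0 and variance sigma squared):

```latex
\varphi(u) = \varphi(0) + \varphi'(0)\,u + \tfrac{1}{2}\varphi''(0)\,u^{2} + o(u^{2})
           = 1 - \frac{\sigma^{2}}{2}\,u^{2} + o(u^{2}),
```

since $\varphi(0)=1$, $\varphi'(0)=i\,\mathbb{E}[X]=0$, and $\varphi''(0)=\mathbb{E}[(iX)^2]=-\mathbb{E}[X^{2}]=-\sigma^{2}$.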
23:04:909Paolo Guiotto: Okay?
23:06:429Paolo Guiotto: Good. So, let's go back to this formula. So when we now compute that phi in this value, C over sigma root of n, we get
23:19:239Paolo Guiotto: So the phi evaluated at C divided by sigma root of n is equal to 1 minus sigma squared over 2, then I have to put as U this quantity, so C
23:32:919Paolo Guiotto: over sigma root of n, squared, plus a little o of the same quantity. Let's wait a second to write it there. It's a little o of that square.
23:43:479Paolo Guiotto: So, if you look… well, let's write it now. If you look at this square, think that the limit is in n, so C and sigma are parameters.
23:53:399Paolo Guiotto: So we just throw away… when you do the square, this becomes a 1 over n here, okay?
24:00:229Paolo Guiotto: And there are also a C squared and a sigma squared; they are important here. Because we get 1 minus sigma squared over 2 times C squared over sigma squared n, plus a little o of 1 over n.
24:16:569Paolo Guiotto: So, as you can see, this sigma square cancels, and what remains is 1 minus, let's write this way, C squared over 2 divided by N plus a little o of 1 over n.
24:32:999Paolo Guiotto: And that's the value of the phi evaluated at this point, C over sigma root of n.
24:41:599Paolo Guiotto: Okay, now we can return to our initial calculation. The characteristic function of YN, where YN was this rescaled sum.
24:54:729Paolo Guiotto: So, phi of YN evaluated at C is the product for k going from 1 to n of these values, which are
25:07:879Paolo Guiotto: 1… Minus C squared over 2 divided by N, plus a little o of 1 over n.
25:17:259Paolo Guiotto: As you can see, there is no more K here.
25:20:819Paolo Guiotto: So, it's a constant, it means that you are multiplying by itself the same quantity and times, so this becomes a power to the n.
25:31:329Paolo Guiotto: Okay, now we have to compute this limit, and this is easy, because forget the little o, which is just a small correction. It's as if you have 1 plus a constant divided by n, to the n. That's the remarkable exponential limit.
25:45:479Paolo Guiotto: And therefore, this goes to E to this number here, minus C squared over 2, and this happens
25:54:769Paolo Guiotto: for every C real.
25:59:349Paolo Guiotto: But that's exactly the conclusion, because it says that phi YN of C, the characteristic function, is going to the characteristic function of a standard Gaussian. So this is the phi of a standard Gaussian
26:14:319Paolo Guiotto: evaluated at point C, and this is the conclusion.
26:19:719Paolo Guiotto: It's important to have this version of the central limit theorem, because maybe the convergence in distribution can
26:29:799Paolo Guiotto: can be a little bit, say, a little bit too abstract for the interpretation. This one is more concrete, you know, that the probability that that quantity falls between A and B can be computed for N large in this way.
26:46:629Paolo Guiotto: For example, this can give some indication about the probability of the average. Let's see this in a remark, and we finish here.
27:03:749Paolo Guiotto: If you take that probability…
27:07:999Paolo Guiotto: but I want to have the average, to know…
27:12:679Paolo Guiotto: So, I want to have the factor 1 over n here, okay, to have the average. So I multiply by sigma root of n, I take everything to the other side, and I have to multiply and divide by n. So,
27:31:39Paolo Guiotto: or rather, multiply by 1 over n. So this becomes… the conclusion is equivalent to saying this, that… so the conclusion…
27:44:859Paolo Guiotto: implies… that. The probability that, so we said we put sigma, root of n.
27:56:719Paolo Guiotto: sigma root of n times a, less or equal than the sum for k going from 1 to n of XK minus M,
28:08:879Paolo Guiotto: less or equal than sigma root of n times b.
28:13:559Paolo Guiotto: Then we add the scaling factor 1 over n, so we divide by N.
28:18:999Paolo Guiotto: Everything.
28:21:839Paolo Guiotto: So it implies that this quantity, which is the same as the previous one, goes to the integral from A to B of e to minus x squared over 2,
28:36:179Paolo Guiotto: DX over root of 2 pi.
28:42:529Paolo Guiotto: So, this is… this quantity here. When I split this back out, I get the average of the Xn,
28:55:559Paolo Guiotto: and when I do 1 over n times the sum for k from 1 to n of the constant M, I get n times M divided by n, so I get M. So this is a way to assess how far I am from the mean. It says that the probability that this is between sigma a divided by root of n
29:15:79Paolo Guiotto: and sigma b divided by root of n…
29:19:219Paolo Guiotto: this thing is going to this quantity, the integral from A to B,
29:27:69Paolo Guiotto: A to B, of e to minus
29:30:819Paolo Guiotto: x squared over 2, dx, divided by root of 2 pi.
29:48:339Paolo Guiotto: There's something… strange here…
30:01:149Paolo Guiotto: Let me see.
30:22:209Paolo Guiotto: No. Okay, well, we got this bound.
30:28:229Paolo Guiotto: Okay, so, I leave a couple of exercises on the,
30:36:909Paolo Guiotto: In the central limit theorem, do…
30:40:689Paolo Guiotto: Well, let's say: do problems 947 and 948.
30:49:729Paolo Guiotto: The number 947… the number 948 has a second part, which is a general property of convergence, which you can accept in order to do the exercise and, if you want, you can prove. You will see; we'll read what it says.
31:09:229Paolo Guiotto: It's not completely easy to do this second part's proof, but you can accept it to solve the exercise, okay? So if you want, you can…
31:19:659Paolo Guiotto: You can try to do that. Okay, let's stop here.
31:24:549Paolo Guiotto: Sorry for the inconvenience before.