Class 22, Jan 13 2026
AI Assistant
Transcript
00:06:970Paolo Guiotto: Oh, no, no.
00:11:210Paolo Guiotto: Okay.
00:12:550Paolo Guiotto: We start with… the exercises… Basically…
00:21:240Paolo Guiotto: on the definition of Brownian motion, I take this… This one, which is… from the…
00:33:940Paolo Guiotto: exam exercises. It is the exercise number one.
00:39:810Paolo Guiotto: Exam… exercises… That's it.
00:48:290Paolo Guiotto: So, let WT,
00:53:840Paolo Guiotto: T greater or equal than 0, be a Brownian motion,
01:00:120Paolo Guiotto: for… S, T positive,
01:04:810Paolo Guiotto: compute.
01:08:320Paolo Guiotto: these quantities. So the first one is the expected value of WT times WS.
01:17:980Paolo Guiotto: Let's write one by one the questions.
01:21:240Paolo Guiotto: Okay, so here… We have to distinguish, basically, the case T equal to S from the case T different from S.
01:35:170Paolo Guiotto: Because if T is equal to S, we have EWT squared.
01:35:170Paolo Guiotto: We know that WT is normal, mean 0 and variance T, because, you remember, the increment WT minus WS is normal, mean zero, and variance T minus S.
01:57:560Paolo Guiotto: In addition, W0 is just 0. Take S equals 0, and you have this. So, this is exactly equal to this, while for T different from S,
02:12:930Paolo Guiotto: For example, let's say that S is less than T,
02:17:760Paolo Guiotto: So we have that. The expected value of WTWS,
02:24:990Paolo Guiotto: we basically use the independence of the increment. We can say that this is, for example, WT minus WS plus WS.
02:36:220Paolo Guiotto: That's WT.
02:38:140Paolo Guiotto: times WS.
02:40:660Paolo Guiotto: So then we split this into the expected value of WT minus WS, times WS…
02:49:940Paolo Guiotto: plus expected value of WS square.
02:55:890Paolo Guiotto: This one is what we computed here above. It is equal to S.
03:01:540Paolo Guiotto: While this one, remind that the increments of the Brownian motions are independent, so WT minus WS is independent, I can say, of WS minus W0.
03:18:400Paolo Guiotto: Which is WS.
03:21:520Paolo Guiotto: So, this means that this expectation splits into the product of the expectations, WT minus WS,
03:31:180Paolo Guiotto: expectation of WS, and all these are equal to zero, because these are, mean zero Gaussians, so this is equal to zero.
03:42:270Paolo Guiotto: So we have that when S is less than T, the expectation of WT times WS is S.
03:51:50Paolo Guiotto: So we could say that, expectation of WT times WS
03:59:340Paolo Guiotto: Well, actually, if T is equal to S, it comes S, so we can say that it is equal to S when S is less or equal than t.
04:11:430Paolo Guiotto: And if S is greater or equal… is greater than T, it will be equal to T, because we can just flip the two variables. So, in general, we get this formula: the expectation of WT times WS
04:29:680Paolo Guiotto: is what? It is S when S is less than T; it is T when T is less than S; so it is the minimum
04:36:540Paolo Guiotto: between S and T.
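[Editor: the identity just derived, that the expectation of WT times WS is the minimum of S and T, is easy to sanity-check numerically. A minimal Monte Carlo sketch in Python using numpy; the sample size, seed, and the values s = 1, t = 2 are arbitrary choices, not from the lecture.]

```python
import numpy as np

# Monte Carlo check of E[W_t W_s] = min(s, t), here for s < t.
# Build W_s and W_t from independent Gaussian increments:
# W_s ~ N(0, s), and W_t - W_s ~ N(0, t - s), independent of W_s.
rng = np.random.default_rng(0)
s, t, n = 1.0, 2.0, 1_000_000

w_s = rng.normal(0.0, np.sqrt(s), n)
w_t = w_s + rng.normal(0.0, np.sqrt(t - s), n)

estimate = np.mean(w_s * w_t)   # should be close to min(s, t) = 1.0
```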
04:41:140Paolo Guiotto: This was the number one. Then we have the number 2, which is similar to complete the expectation of
04:49:830Paolo Guiotto: WS times WT squared.
04:56:470Paolo Guiotto: We can proceed
04:59:230Paolo Guiotto: as in the previous case, so if S is equal to T,
05:04:90Paolo Guiotto: We have the expectation of WT cubed.
05:09:260Paolo Guiotto: Now, remember that WT is a normal, mean zero, variance T.
05:16:170Paolo Guiotto: So what is the expectation of WT cubed? Well, this would be the integral on R of X cubed times the density, which is here, E to minus X squared over
05:32:170Paolo Guiotto: 2T, T the variance, divided by the root of 2 pi T,
05:38:630Paolo Guiotto: DX.
05:41:820Paolo Guiotto: As you can see, this integral is zero, because the function is…
05:46:180Paolo Guiotto: is odd, no? And the interval is symmetric with respect to the origin, so the integral is just 0.
05:55:240Paolo Guiotto: for S less than T, as before; but here there is not a symmetry between the two cases, so we have to see what happens for S greater than T. So we have WS, WT,
06:09:620Paolo Guiotto: We are… equals… We could use a similar trick.
06:17:780Paolo Guiotto: as we did above, so we write WT as WT minus WS plus WS,
06:24:580Paolo Guiotto: to use the fact that increments are independent. And here we have the increment, WT minus WS, which is independent.
06:37:90Paolo Guiotto: of WS minus W0, where W0 is 0, okay? So, we have just to do a little bit of calculation, because we have the expectation of WS… the cube, the cube of WT minus WS
06:57:560Paolo Guiotto: plus triple product, the square of WT minus WS times WS.
07:05:910Paolo Guiotto: plus, 3 times, WT minus WS,
07:11:450Paolo Guiotto: times WS squared, plus… there is the cube of WS.
07:18:400Paolo Guiotto: All this is multi… yeah.
07:22:890Paolo Guiotto: Whatever.
07:24:400Paolo Guiotto: Yeah, that's a good question, because…
07:28:40Paolo Guiotto: Probably I was thinking of previous case, that's right, better, because also the calculation is easier.
07:36:720Paolo Guiotto: Okay.
07:38:170Paolo Guiotto: So… This is square, this is the double product, and then we have only this one.
07:49:460Paolo Guiotto: Squared. Okay, so 2 times this, plus WS squared. Okay. Now, we have,
07:58:720Paolo Guiotto: when we do this product, WS times WT minus WS squared, the two variables are independent, because also WT minus WS squared will be independent of WS, so this will split into the product of these two expectations.
08:16:830Paolo Guiotto: And, of course, the first one
08:19:620Paolo Guiotto: is 0, the second one would be T minus S, so we get 0.
08:25:370Paolo Guiotto: plus the same. When we do this times WS, we get the expectation of WS squared times the increment. We split into the product, so we get 2 times the expectation of WS squared times the expectation of the increment, WT
08:43:900Paolo Guiotto: minus WS, and now this is 0, this is S.
08:49:160Paolo Guiotto: And then we have the expectation of WS cubed.
08:56:70Paolo Guiotto: Which is equal to 0.
08:59:930Paolo Guiotto: Sorry, this one is equal to zero. We computed it above.
09:06:280Paolo Guiotto: So, this means that we get just zero.
09:09:380Paolo Guiotto: Okay?
09:10:500Paolo Guiotto: So this is when S is less than T.
09:14:700Paolo Guiotto: When S is greater than t, the expectation of WS.
09:21:950Paolo Guiotto: WT square.
09:24:830Paolo Guiotto: Now, it is convenient to do the increment in WS, so we write this as WS minus WT plus WT.
09:36:630Paolo Guiotto: WT squared.
09:40:680Paolo Guiotto: So this exercise is basically solved using independence of the increments. That's the point of the exercise. So we have, again, the expectation of this times this; they are independent now.
09:55:300Paolo Guiotto: So, expectation of WS minus WT times expectation of
10:00:890Paolo Guiotto: WT squared, this is 0, this is T, so in any case, the product will be 0. And then we have this times this, which is 0, because it is the expectation of WT cubed.
10:14:510Paolo Guiotto: So we get zero. So whatever we computed, we got just zero. So the expectation of…
10:26:280Paolo Guiotto: In all cases, WS times WT squared is equal to 0 for every S and T.
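[Editor: a quick Monte Carlo sanity check of the conclusion that the expectation of WS times WT squared vanishes. Checked here for s < t; as in the lecture, the case s > t works the same way. Parameters and seed are arbitrary choices.]

```python
import numpy as np

# Monte Carlo check of E[W_s W_t^2] = 0, here for s < t.
rng = np.random.default_rng(1)
s, t, n = 1.0, 2.0, 1_000_000

w_s = rng.normal(0.0, np.sqrt(s), n)
w_t = w_s + rng.normal(0.0, np.sqrt(t - s), n)   # independent increment

estimate = np.mean(w_s * w_t**2)   # should be close to 0
```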
10:35:80Paolo Guiotto: Then there is another one, which I think,
10:40:80Paolo Guiotto: you could do by yourself. Question 3 is to compute the expected value of, now, WS squared, WT squared.
10:52:270Paolo Guiotto: Okay, same calculation, so same ideas. And Q4 is a little different, because it asks us to compute the expectation of WS exponential of WT.
11:09:350Paolo Guiotto: Let's see what this is.
11:13:640Paolo Guiotto: Okay.
11:16:40Paolo Guiotto: Well, let's see if we can, now do something,
11:24:240Paolo Guiotto: Yeah, because I could say, if the two are different, no?
11:29:150Paolo Guiotto: we should always reduce to the case when they are the same, because if T is different from S,
11:37:160Paolo Guiotto: for example, T larger than S. I could say, expectation of WS e to WT, I add and subtract WS.
11:52:650Paolo Guiotto: Nope.
11:54:50Paolo Guiotto: So this comes: the expectation of WS,
11:59:610Paolo Guiotto: E to WS, times the expectation of E to the increment. They are independent, so it comes the product of the two.
12:14:570Paolo Guiotto: So let's keep it there, just to have an idea of what we have to compute at the end. If T is less than S, I could do…
12:23:40Paolo Guiotto: So this is equal, let's put a star.
12:26:960Paolo Guiotto: Yeah.
12:29:640Paolo Guiotto: In this case, I would say it is, expectation of, WS… minus WT plus WT.
12:41:170Paolo Guiotto: E to WT, so when we split, We get this times this.
12:49:190Paolo Guiotto: They are independent, so they split into the product, and the expectation of WS minus WT is 0, so 0. And then what remains is WT…
12:59:480Paolo Guiotto: E to WT.
13:02:660Paolo Guiotto: And when T is equal to S, of course, we get,
13:07:520Paolo Guiotto: That we have to compute the expectation of
13:11:90Paolo Guiotto: WTE to WT. So at the end, we need to compute this quantity here.
13:19:260Paolo Guiotto: Because we have this, and we have also this, and then we have to compute this quantity here. So, basically, we have to compute these two quantities.
13:28:770Paolo Guiotto: So, let's start with the expectation of WT, E to WT.
13:39:380Paolo Guiotto: Now, here, there is nothing to do. We use the formula: this is normal, mean 0, variance T.
13:45:880Paolo Guiotto: So, this is the integral on R of X, E to
13:53:740Paolo Guiotto: X, notice that now this is not 0, E to minus X squared over 2T,
14:01:550Paolo Guiotto: divided the root of 2 pi T,
14:05:530Paolo Guiotto: DX.
14:10:410Paolo Guiotto: So we can put them together to get a sort of mean value; we keep the scaling factor here, root of 2 pi T.
14:20:760Paolo Guiotto: Then, X, E to… so if we put the exponentials together, the exponent is minus, X squared minus 2TX, over 2T.
14:36:530Paolo Guiotto: So minus, minus, minus here: it should be 2TX, right? We can see this as part of a square, but this is the double product, so we need a plus T squared here, minus T squared
14:55:660Paolo Guiotto: to balance. Now, this is a square, so we can say that it is the integral of X, E to minus, X minus T, squared, divided 2T.
15:08:510Paolo Guiotto: And then there is a factor: minus, minus gives plus, T squared over 2T, so E to T half.
15:16:390Paolo Guiotto: DX divided the root of 2 pi t.
15:21:280Paolo Guiotto: This is, this is a scalar, goes outside, and then we have, the mean value of the Gaussian centered at T, so mean T, and variance,
15:32:150Paolo Guiotto: And variance T, right. But this is the integral, so it is the mean value. So we don't need to compute that integral; it comes out T, so we have E to T half times T.
15:44:530Paolo Guiotto: This is what… what we obtain. You see?
15:49:870Paolo Guiotto: Okay, so if you have an integral of X e to minus X minus m squared over 2 sigma squared, divide the root of 2 pi sigma squared dx, this is the mean value of this distribution, it is equal to m at the end.
16:07:950Paolo Guiotto: Okay, so we get this. It is E to T half, times T.
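[Editor: the value just obtained by completing the square, that the expectation of WT E to WT equals T E to T half, can be checked by simulation. A small sketch with numpy; t = 1 and the seed are arbitrary choices.]

```python
import numpy as np

# Monte Carlo check of E[W_t e^{W_t}] = t e^{t/2}.
rng = np.random.default_rng(2)
t, n = 1.0, 2_000_000

w_t = rng.normal(0.0, np.sqrt(t), n)   # W_t ~ N(0, t)
estimate = np.mean(w_t * np.exp(w_t))
exact = t * np.exp(t / 2)              # the completed-square value
```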
16:13:320Paolo Guiotto: So… We need also to compute this one.
16:19:330Paolo Guiotto: So let's do this: the expectation of E to WT minus WS.
16:26:300Paolo Guiotto: Again, we use the distribution of WT minus WS.
16:30:960Paolo Guiotto: is normal, mean 0, variance T minus S.
16:36:190Paolo Guiotto: So this is the integral on R of E to X.
16:40:530Paolo Guiotto: And then we have to put the density of this, which is E to minus…
16:47:60Paolo Guiotto: X squared divided 2, T minus S.
16:53:100Paolo Guiotto: Then we have the scaling, root of 2 pi, T minus S, DX.
17:00:830Paolo Guiotto: Again, we put together the two exponentials. We have a unique,
17:06:770Paolo Guiotto: exponential: E to minus, over 2, T minus S, of X squared minus, as above, 2, T minus S, X.
17:21:30Paolo Guiotto: And then we complete the square with plus, T minus S, squared, minus, T minus S, squared.
17:30:860Paolo Guiotto: And then DX divided the root… Now, splitting this, this is…
17:37:830Paolo Guiotto: X, X minus, T minus S.
17:42:610Paolo Guiotto: square, so we get the integral on R.
17:47:740Paolo Guiotto: E to minus X minus T minus S squared divided 2T minus S
17:56:420Paolo Guiotto: Then we have the scaling, root of 2 pi.
18:01:240Paolo Guiotto: T minus S, DX, and then this constant: minus, minus gives plus, E to plus
18:09:350Paolo Guiotto: T minus S of 2. Now, this integral is equal to 1, because it's the Gaussian density centered at T minus S with the variance T minus S.
18:23:770Paolo Guiotto: So we get E to, T minus S, over 2.
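[Editor: this is the mean of a lognormal variable whose underlying Gaussian is the increment, and it can also be sanity-checked by simulation. A sketch with numpy; s = 1, t = 3 and the seed are arbitrary choices.]

```python
import numpy as np

# Monte Carlo check of E[e^{W_t - W_s}] = e^{(t-s)/2}.
rng = np.random.default_rng(3)
s, t, n = 1.0, 3.0, 1_000_000

incr = rng.normal(0.0, np.sqrt(t - s), n)   # W_t - W_s ~ N(0, t - s)
estimate = np.mean(np.exp(incr))
exact = np.exp((t - s) / 2)
```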
18:28:320Paolo Guiotto: So, now let's put together, things. So, we say that…
18:39:420Paolo Guiotto: for T greater… so, this expectation.
18:43:20Paolo Guiotto: Let's see if we can give a uniform formula. WS, E to WT, is equal to…
18:51:670Paolo Guiotto: We have, for T greater than S…
18:58:510Paolo Guiotto: for T greater than S, huh?
19:03:260Paolo Guiotto: We have… the expectation of WS, E to WS,
19:15:160Paolo Guiotto: which is equal to S E to S half.
19:23:170Paolo Guiotto: times, then, the expectation of E to WT minus WS, which we just computed: E to, T minus S, over 2.
19:39:10Paolo Guiotto: for T equals S, what is it?
19:43:380Paolo Guiotto: It is just the expectation of WTE to WT.
19:48:650Paolo Guiotto: So, it is T, E to T half.
19:53:60Paolo Guiotto: So, T… when I say S E to S half, for T equals S it is the same thing.
19:58:860Paolo Guiotto: And for T less than S.
20:05:640Paolo Guiotto: It is, it is this one. So again, the value of that expectation.
20:11:960Paolo Guiotto: So, T, E to T half for this.
20:16:980Paolo Guiotto: And you see that here something happens, because the exponential with exponent S half cancels with this one.
20:27:360Paolo Guiotto: So we get E to 0, which is 1, and what remains is E to T half.
20:34:660Paolo Guiotto: So, we have, we have S, E to T half, when T is greater than S,
20:44:50Paolo Guiotto: S, E to S half, when S is equal to T, and T, E to T half
20:50:970Paolo Guiotto: when T is less than S. I don't know, maybe there is a unique way to write this.
21:01:100Paolo Guiotto: But it seems, we could say, if we use the letter T here instead of letter S,
21:09:740Paolo Guiotto: Which is the same.
21:13:260Paolo Guiotto: You see that, huh?
21:16:180Paolo Guiotto: The second and the third case are a unique case, so we can…
21:20:230Paolo Guiotto: reduce a bit this writing. And then the factor is the same, E to T half, and then we have S when T is larger, and T when T is smaller.
21:32:770Paolo Guiotto: But in any case, it is actually the minimum, should be the minimum between S and T times e to t half, if I'm not wrong.
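[Editor: the final formula, that the expectation of WS E to WT equals the minimum of S and T, times E to T half, can be confirmed numerically. A sketch with numpy, here in the case s < t; parameters and seed are arbitrary choices.]

```python
import numpy as np

# Monte Carlo check of E[W_s e^{W_t}] = min(s, t) e^{t/2}, for s < t.
rng = np.random.default_rng(4)
s, t, n = 1.0, 2.0, 2_000_000

w_s = rng.normal(0.0, np.sqrt(s), n)
w_t = w_s + rng.normal(0.0, np.sqrt(t - s), n)   # independent increment

estimate = np.mean(w_s * np.exp(w_t))
exact = min(s, t) * np.exp(t / 2)
```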
21:44:940Paolo Guiotto: Okay.
21:48:800Paolo Guiotto: Okay, so… Then.
21:52:320Paolo Guiotto: Let's see if we can do another…
22:06:280Paolo Guiotto: I left you also some exercise to do.
22:10:970Paolo Guiotto: Only… The end of the chapter.
22:15:900Paolo Guiotto: But, for example, this is, again, basically, exercise 11.3.1…
22:25:630Paolo Guiotto: So, W is a Brownian motion.
22:30:840Paolo Guiotto: And lambda is a number different from zero. Define this W tilde
22:38:00Paolo Guiotto: as lambda times W at T divided by lambda squared.
22:45:230Paolo Guiotto: check that this W tilde is a Brownian motion.
22:53:870Paolo Guiotto: Now, we have to check, what, the characteristic properties of the Brownian motion. So we have to check…
23:01:540Paolo Guiotto: Wheat.
23:02:580Paolo Guiotto: Av.
23:04:110Paolo Guiotto: Let me check.
23:07:180Paolo Guiotto: So, number one… the trajectories are continuous. So these are, well, let's say…
23:17:660Paolo Guiotto: while I use this notation, maybe I should have uniformed the notation in this way. So, put the T here.
23:27:250Paolo Guiotto: So, the W tilde, as a function of T, is a continuous function.
23:34:460Paolo Guiotto: on 0 plus infinity, for almost every omega in capital Omega. But that's easy, because,
23:46:930Paolo Guiotto: W tilde is just lambda, which is a scalar, times W at T over lambda squared.
23:55:240Paolo Guiotto: So, since, the function W is a continuous function.
24:04:40Paolo Guiotto: on 0 plus infinity, because it is a Brownian motion.
24:08:90Paolo Guiotto: With, let's say, shortly: with probability equal to 1, that means for almost every omega.
24:16:720Paolo Guiotto: This is just a rescaling with a constant factor, so the W tilde
24:23:660Paolo Guiotto: is also a continuous function on 0 plus infinity.
24:29:160Paolo Guiotto: Then, number 2… I…
24:37:880Paolo Guiotto: I do not remember exactly how we stated the properties of the Brownian motion, if,
24:45:700Paolo Guiotto: The increment is Gaussian, okay.
24:49:930Paolo Guiotto: So, we have to take the increment W tilde, T minus W tilde S, and show that it is a normal, mean zero, variance T minus S.
25:03:280Paolo Guiotto: Now, how can we check that? Well, for Gaussians, for example, the use of characteristic functions is,
25:13:460Paolo Guiotto: is a standard way, so we compute the characteristic function of this thing, so the phi of WT tilde minus WS tilde.
25:26:410Paolo Guiotto: at Xi, is equal to… it is the expectation of E to I Xi times…
25:35:710Paolo Guiotto: Then we have, each of them has a lambda.
25:40:640Paolo Guiotto: Yeah, W of T over lambda squared… well, let's use this.
27:15:840Paolo Guiotto: mute the…
27:23:130Paolo Guiotto: You should now be back, huh?
27:25:910Paolo Guiotto: Recording, we are recording, okay.
27:29:430Paolo Guiotto: So that's your type of shit.
27:32:30Paolo Guiotto: So this is lambda times, W at T over lambda squared minus W at S over lambda squared. Now, we know that this thing is an increment of W, the Brownian motion, so it is normal, mean zero, and variance the increment of the times, which is T over lambda squared minus S over lambda squared.
27:54:410Paolo Guiotto: So this means that this is the characteristic function evaluated at this point, lambda Xi, of a Gaussian, mean zero, and that variance. So the formula says it is e to minus 1 half
28:10:720Paolo Guiotto: the variance, so T minus S over lambda squared, times the square of this guy, lambda Xi, squared.
28:20:820Paolo Guiotto: And as you can see, the lambda square disappeared, here.
28:25:740Paolo Guiotto: And this is E to minus 1 half, T minus S, Xi squared, which says that the increment W tilde T minus W tilde S is normal, mean zero, variance T minus S.
28:44:810Paolo Guiotto: The number 3 is the independence of the increments, so… Increments…
28:57:330Paolo Guiotto: R.
28:58:830Paolo Guiotto: Independent.
29:02:280Paolo Guiotto: So we have to take times:
29:07:300Paolo Guiotto: we write down 0, less or equal than T1, less than T2, less than T3, etc., less than TN.
29:14:690Paolo Guiotto: And we do the increments W tilde, so T2 minus W tilde T1, etc.
29:25:40Paolo Guiotto: For the generic increment, W tilde TJ plus 1 minus W tilde TJ.
29:33:850Paolo Guiotto: But now, because of the definition, this is lambda times W of, TJ plus 1 over lambda squared,
29:46:420Paolo Guiotto: minus WTJ over lambda squared.
29:51:440Paolo Guiotto: So as you can see, these increments, the increments of W tilde, are functions of increments of W, and the times for W are ordered, because T1 over lambda squared will be greater or equal than zero, less than T2 over lambda squared, etc.,
30:10:630Paolo Guiotto: TJ over lambda square, less than TJ plus 1 over lambda square, because we divide all times by the same positive factor lambda squared.
30:23:420Paolo Guiotto: And so these increments are independent, so these increments here are independent.
30:30:190Paolo Guiotto: And therefore these ones are also independent.
30:34:470Paolo Guiotto: And the last condition is that W tilde at 0 is 0, which is evident, because W tilde at 0 is lambda, W at 0 over lambda squared, which is 0, because W at 0 is 0.
30:58:180Paolo Guiotto: for example.
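[Editor: the scaling property just proved, that W tilde T = lambda W at T over lambda squared is again a Brownian motion, can be checked empirically on the increment law. A sketch with numpy; lambda = 2, s = 1, t = 3 and the seed are arbitrary choices.]

```python
import numpy as np

# Empirical check: for lam != 0, W~_t = lam * W_{t/lam^2} has
# increments W~_t - W~_s ~ N(0, t - s).
rng = np.random.default_rng(5)
lam, s, t, n = 2.0, 1.0, 3.0, 1_000_000

# Simulate W at the rescaled times s/lam^2 < t/lam^2 via independent increments.
w_a = rng.normal(0.0, np.sqrt(s / lam**2), n)               # W_{s/lam^2}
w_b = w_a + rng.normal(0.0, np.sqrt((t - s) / lam**2), n)   # W_{t/lam^2}

incr = lam * (w_b - w_a)   # W~_t - W~_s
mean_est = incr.mean()     # should be close to 0
var_est = incr.var()       # should be close to t - s = 2.0
```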
31:02:980Paolo Guiotto: Well, let's do another… Fast Exercise, the 11.
31:10:690Paolo Guiotto: Three.
31:11:840Paolo Guiotto: Three.
31:13:400Paolo Guiotto: So we have WT is a Brownian motion.
31:20:780Paolo Guiotto: So, it asks which of the following… which… of the… following
31:34:690Paolo Guiotto: processes…
31:41:650Paolo Guiotto: are Brownian motions.
31:45:30Paolo Guiotto: So, the number 1 is minus WT.
31:50:200Paolo Guiotto: Which is basically a particular case of the previous example, because you take minus 1, times W of T divided by, minus 1, squared.
32:04:440Paolo Guiotto: So, it's a particular case, if you wish.
32:12:90Paolo Guiotto: It is a case of the
32:15:280Paolo Guiotto: previous… exercise.
32:19:800Paolo Guiotto: The number 2 is, this, huh?
32:23:810Paolo Guiotto: The root of T, times W1.
32:29:10Paolo Guiotto: Of course, this doesn't look to be a Brownian motion.
32:33:890Paolo Guiotto: Well, we can notice that, of course, the trajectories are continuous, no? Because it's just the function root of T multiplied by a random number. If you fix an omega and you look at it as a function of T, this is clearly root of T times the value of W1, a continuous function
32:53:350Paolo Guiotto: On 0 plus infinity.
32:56:810Paolo Guiotto: With probability, One.
33:00:310Paolo Guiotto: It is also… well, it is also… no. If you look at the increment, root of T, W1, minus root of S, W1, this would be, say, the W tilde T minus W tilde S.
33:19:620Paolo Guiotto: Well, this is just, root of T minus root of S, times W1.
33:25:660Paolo Guiotto: as a random value, this is Gaussian, because it's a factor times Gaussian. Its mean value is 0, and since the variance of W1 is 1,
33:38:320Paolo Guiotto: This has variance, if we assume that T is… S is less than T, is… its variance is root of T minus root of S, which is, of course, not, T minus S.
33:53:120Paolo Guiotto: Well, actually, sorry, sorry, sorry. When you multiply by a factor, the variance is multiplied by the square of the factor, so there should be a square.
34:03:660Paolo Guiotto: when you do the square of this, you get T plus S, but then you have the double product, 2 root of TS, so this makes this different from T minus S, so it's not…
34:18:580Paolo Guiotto: and normal, mean zero variance T minus S.
34:22:540Paolo Guiotto: So we can conclude here. Also, the increments won't be independent, because they all depend on the same random variable.
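[Editor: the failure of the increment law for root of T times W1 can be seen numerically: the variance of the increment is the square of, root of T minus root of S, not T minus S. A sketch with numpy; s = 1, t = 4 and the seed are arbitrary choices.]

```python
import numpy as np

# Check that X_t = sqrt(t) * W_1 fails the Brownian increment law:
# Var(X_t - X_s) = (sqrt(t) - sqrt(s))^2, not t - s.
rng = np.random.default_rng(6)
s, t, n = 1.0, 4.0, 1_000_000

w1 = rng.normal(0.0, 1.0, n)            # the single random variable W_1
incr = (np.sqrt(t) - np.sqrt(s)) * w1   # X_t - X_s

var_est = incr.var()   # close to (2 - 1)^2 = 1.0, while t - s = 3.0
```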
34:30:870Paolo Guiotto: So the third case is this one.
34:35:90Paolo Guiotto: say, W tilde T equal W 2T, minus WT.
34:42:230Paolo Guiotto: Maybe this is more interesting. So, in this case, what can be said? Of course, since the W is a Brownian motion, as function of T, when you freeze the omega, you get a continuous function, so clearly the W tilde is continuous in time.
35:02:890Paolo Guiotto: Almost every omega in the sample space.
35:09:560Paolo Guiotto: about the distribution, if you do W tilde T minus W tilde S, say, for example, S less than t.
35:20:650Paolo Guiotto: Well, this is, of course, W2T… minus W2S.
35:28:890Paolo Guiotto: minus WT minus WS.
35:35:820Paolo Guiotto: well, what is this? This is a combination of two increments.
35:45:90Paolo Guiotto: So, drawing 0, S, T… well, we cannot say where exactly 2S is with respect to T. We could have that 2S is here, for example, or 2S could be here.
35:58:630Paolo Guiotto: So how can we treat this, huh?
36:04:210Paolo Guiotto: Well, we could also say that this is a combination of Gaussian, and when you do a linear combination of Gaussian, you get a Gaussian random variable, so we just need to compute mean and variance, and to see if we get mean 0 and variance T minus S.
36:22:460Paolo Guiotto: Or we could always use the characteristic function, but still then we have to… we need some information about the,
36:34:830Paolo Guiotto: the position of these times, I would say that, WT…
36:41:100Paolo Guiotto: tilde minus WS tilde is a linear… combination… of Gaussians.
36:54:280Paolo Guiotto: So, this is a general fact that, you can, you can easily see by…
37:06:380Paolo Guiotto: So it's a linear map of a Gaussian vector, so it is Gaussian.
37:18:710Paolo Guiotto: Or if you want to use the independence of increments, you have to… to do cases, no?
37:24:660Paolo Guiotto: the case when 2S is less than T, 2S greater than T, and then to introduce the appropriate point, you know, because if this is the situation, you would say, okay, 2… so if we are in this case.
37:40:690Paolo Guiotto: And say that this is 2T. So the increment, from 2S to 2T, would be split into this plus this, you see?
37:49:190Paolo Guiotto: And then you reconstruct independent increments, and you have the… something that probably we will have to do later when we have to discuss the independence of increments.
38:00:920Paolo Guiotto: If we look at the expected value, well, the expected value of this WT tilde minus WS tilde.
38:09:610Paolo Guiotto: This is clearly zero, because it is the sum of the two expected values.
38:14:250Paolo Guiotto: Well, if we do the expected value of WT tilde minus WS tilde, there you see that sooner or later, we have to… you have to face that problem, because
38:26:360Paolo Guiotto: When you do this, we get the expected value of the square of the first, so we have, W2T minus W…
38:43:50Paolo Guiotto: I'm thinking whether it is even convenient to do that. Okay: W2S, squared.
38:49:20Paolo Guiotto: Then we have the square of the second, WT minus WS.
38:55:780Paolo Guiotto: And then we have the double product with the minus W2T… minus W2S times WT
39:04:740Paolo Guiotto: minus WS. To compute this one, we need to know the position of these times, so this is square. So we have, this one is 2T minus 2S.
39:18:170Paolo Guiotto: This one is T minus S.
39:21:250Paolo Guiotto: And now we have minus 2, the expected value of this, W2T minus W2S,
39:31:190Paolo Guiotto: times, WT minus WS.
39:40:790Paolo Guiotto: Well, let's see what this is. So we have 0, S, T… So I'd say…
39:56:150Paolo Guiotto: So, 2T is definitely here.
40:00:350Paolo Guiotto: And basically, I think we have two cases. Either 2S is below T or above.
40:06:890Paolo Guiotto: Let's see what happens if it is below.
40:10:170Paolo Guiotto: If it is below, that big increment will be split into W2T minus WT, plus WT minus
40:20:930Paolo Guiotto: W2S.
40:23:650Paolo Guiotto: We want independent increments, so this one, WT minus WS, has to be split into two, you see?
40:31:880Paolo Guiotto: So I want to have independent things, so I have this… sorry, let's put a round parentheses.
40:39:60Paolo Guiotto: this, times, WT minus W2S, and then, sorry, plus… plus W2S…
40:56:00Paolo Guiotto: minus WS. And then let's try to do the calculation
40:59:840Paolo Guiotto: by excluding all terms that are zero, because when we do the product, we have this times this, then we do the expectation of the product. You see that one is the increment between 2T and T, and the other is between T and 2S, so they are two consecutive increments.
41:19:300Paolo Guiotto: The increments are independent, so the expectation splits into the product of the expectation, and this will be zero, because each of them are zero.
41:28:300Paolo Guiotto: Then, when we do this times this… well, let's use a different color, maybe. This times this. It is this increment in blue versus this one. Again, independent increment, the expectation splits into the product of the expectations, we get zero.
41:47:710Paolo Guiotto: Then we have, this times this, WT minus W, 2S, this.
41:55:00Paolo Guiotto: But this is a square, so this is not 0, so we get a value here: minus 2, the expectation
42:02:350Paolo Guiotto: of WT minus W2S squared.
42:07:890Paolo Guiotto: And the last one is this times this, so we have…
42:12:860Paolo Guiotto: the increment here, WT minus W2S, and then W2S
42:19:220Paolo Guiotto: minus WS. And these two now are independent, so the expectation splits, we get 0.
42:25:810Paolo Guiotto: So the unique term that survives is this one, and this is normal, mean 0, variance T minus 2S, so the value is T minus 2S, here.
42:37:740Paolo Guiotto: So the final value is…
42:40:950Paolo Guiotto: 2 times, T minus S, the first, plus T minus S, so we can say 3 times, T minus S,
42:51:920Paolo Guiotto: minus 2 times, T minus 2S, and now let's hope that all this gives T minus S.
42:59:450Paolo Guiotto: So we have 3T minus 2T is T, then we have minus 3S plus 4S…
43:15:940Paolo Guiotto: So it's pluses.
43:19:400Paolo Guiotto: It's not T-minus S.
43:24:40Paolo Guiotto: I was sure of the calculation; let's check, because now it's important: if this is the result, we can say the increment is not mean zero, variance T minus S. Well, let's check again.
43:37:560Paolo Guiotto: So we have this, WT minus… so first we wrote this, okay, then we did the square.
43:48:420Paolo Guiotto: And we have square first, square second, and then double product.
43:52:520Paolo Guiotto: Then here we are assuming that 2S is less than 2T, so that's the normal increment, 2T minus 2S, yes.
44:02:90Paolo Guiotto: The other one is T minus S, no doubt. Then we have the double product.
44:07:160Paolo Guiotto: And now the problem is that we wrote the double product in this form. WT minus W2S, we introduce WT.
44:16:580Paolo Guiotto: So minus WT is plus WT, okay, and the same for the other increment, T minus S, so WT minus W2S plus W, so there is no problem here.
44:28:230Paolo Guiotto: Now, we did the products, and then the expectations. So, we take the first one, the red: the W2T minus WT, times WT minus W2S; they are increments, independent, the expectation is the product: zero.
44:45:590Paolo Guiotto: The same for the blue.
44:49:400Paolo Guiotto: Yes.
44:52:500Paolo Guiotto: The red one, the green one is the same, so undoubtedly, you get this.
44:59:20Paolo Guiotto: And the purple is,
45:02:880Paolo Guiotto: T minus 2S: WT minus W2S… yes, there is no error here. So we get minus 2, this is T minus 2S, the difference of the two times.
45:13:470Paolo Guiotto: So at the end, we get 2 times, T minus S, plus another T minus S: 3 times, T minus S, minus 2,
45:21:890Paolo Guiotto: then, T minus 2S. So we get 3T minus 2T is T,
45:27:160Paolo Guiotto: minus 3S plus 4S, it's plus S, so it is this one.
45:34:270Paolo Guiotto: So, I think that there is no question, and this means that,
45:39:800Paolo Guiotto: It seems that this W tilde
45:45:720Paolo Guiotto: is… not,
45:48:900Paolo Guiotto: yeah, well, a Brownian motion.
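[Editor: the computed variance, T plus S instead of T minus S, can be confirmed by simulation, which settles the doubt about the calculation. A sketch with numpy, in the case 2S less than T treated above; s = 1, t = 3 and the seed are arbitrary choices.]

```python
import numpy as np

# Monte Carlo check that X_t = W_{2t} - W_t has Var(X_t - X_s) = t + s
# (here with 2s < t, as in the computation above), so X is not a BM.
rng = np.random.default_rng(7)
s, t, n = 1.0, 3.0, 1_000_000   # note 2s = 2 < t = 3

# Build W at the ordered times s < 2s < t < 2t from independent increments.
w_s = rng.normal(0.0, np.sqrt(s), n)
w_2s = w_s + rng.normal(0.0, np.sqrt(s), n)            # variance 2s - s = s
w_t = w_2s + rng.normal(0.0, np.sqrt(t - 2 * s), n)
w_2t = w_t + rng.normal(0.0, np.sqrt(t), n)            # variance 2t - t = t

x = (w_2t - w_t) - (w_2s - w_s)   # X_t - X_s
var_est = x.var()                 # close to t + s = 4.0, while t - s = 2.0
```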
45:52:510Paolo Guiotto: Okay.
45:57:360Paolo Guiotto: Okay, so still… we have still about 45 minutes. I want to tell you something about the trajectories. So, what are, let's say, Brownian
46:13:140Paolo Guiotto: paths?
46:17:530Paolo Guiotto: Now, we know that… let's take W, the Brownian motion.
46:27:560Paolo Guiotto: So we know that,
46:32:250Paolo Guiotto: We said this since the beginning. Now, this process arises as a process that describes this irregular motion, and we said that one of the features observed in
46:46:80Paolo Guiotto: the real model is the extreme irregularity of the trajectories, so they are non-differentiable, okay?
46:55:440Paolo Guiotto: Now, I want to tell you something about this.
46:59:120Paolo Guiotto: Well, the starting point could be the following. We know that at any increment, WT minus WS,
47:06:810Paolo Guiotto: has normal distribution, mean 0, variance T minus S. In particular, this says that the expected value of WT minus WS squared
47:20:720Paolo Guiotto: is T minus S.
47:23:870Paolo Guiotto: Now, a naive argument would say the following. Now, reminder, I don't know, for me, it's more intuitive. I don't know if the same is also for you, but this would say that if we take the integral of this quantity.
47:41:990Paolo Guiotto: Or maybe, let's try to use a probabilistic idea. So, this says that the mean value of this quantity is T minus S.
47:52:720Paolo Guiotto: Now, saying that the mean value of something is T minus S does not mean that that something is T minus S. Of course, if the something is constant, the mean value coincides with the value of the constant.
48:07:800Paolo Guiotto: But in general, that's a random variable, so I'm saying the mean value of this is T minus S does not mean at all that WT minus WS square is like T minus S.
48:25:690Paolo Guiotto: Okay.
48:26:950Paolo Guiotto: But the naive argument would suggest that if the average of (WT minus WS) squared is T minus S, then, with high probability, that squared increment will be something like T minus S. That would mean,
48:46:350Paolo Guiotto: speaking improperly, that WT minus WS should be of the order of the root of T minus S.
48:57:630Paolo Guiotto: If this is the case, you would have an intuitive
49:07:690Paolo Guiotto: explanation of why the trajectories are not differentiable. You would say: take WT minus WS; to compute the derivative, for example at time S, you divide by T minus S and take the limit as T goes to S. This would be the derivative with respect to
49:31:470Paolo Guiotto: time,
49:34:880Paolo Guiotto: at time S, of W. But if the heuristic holds, this would be the limit, as T goes to S,
49:45:620Paolo Guiotto: of a numerator of order root of T minus S divided by a denominator T minus S, so you get 1 over the root of T minus S, which goes to infinity. So this would explain, more or less, why there is no derivative.
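The heuristic can be illustrated numerically. This is an editor's sketch in Python, not part of the lecture; the function name is invented. Brownian increments over a step dt are N(0, dt), so the average squared increment is about dt, and the difference quotient scales like 1 over the root of dt.

```python
import random

# Editor's illustration (not from the lecture): increments W_{s+dt} - W_s
# of Brownian motion are N(0, dt).  We check empirically that the mean
# of the squared increment is about dt, so the typical increment is of
# order sqrt(dt) and the difference quotient grows like dt ** -0.5.
random.seed(0)

def mean_squared_increment(dt, n=200_000):
    """Monte Carlo estimate of E[(W_{s+dt} - W_s)^2], which equals dt."""
    sd = dt ** 0.5
    return sum(random.gauss(0.0, sd) ** 2 for _ in range(n)) / n

for dt in (1.0, 0.04, 0.0001):
    est = mean_squared_increment(dt)
    print(f"dt={dt}: E[dW^2] ~ {est:.6f}, |dW|/dt ~ {est ** 0.5 / dt:.1f}")
```

The last column grows as dt shrinks, which is the blow-up of the difference quotient the lecture describes.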
50:00:600Paolo Guiotto: Of course, the weak point is the first step: knowing that an expectation is equal to something does not tell me how close the variable inside is to that something.
50:13:760Paolo Guiotto: It's just an average, so you could obtain that value by averaging two opposite values, no? You cannot say whether the variable is bigger or smaller.
50:24:710Paolo Guiotto: The values just balance out in the average. So it's not evident how to make this step rigorous.
50:36:70Paolo Guiotto: We would then need to know, for example, whether it is true for almost every omega, or for which omegas.
50:45:10Paolo Guiotto: Now, the point is how to make this thing a precise and rigorous statement. Well, a first important fact is a proposition, which is,
50:59:20Paolo Guiotto: the behavior of the so-called quadratic variation. So, instead of computing a single variation, WT minus WS squared, we do this. So, let…
51:13:340Paolo Guiotto: pi be a subdivision
51:22:470Paolo Guiotto: of the interval from S to T.
51:26:80Paolo Guiotto: So we say this is the interval from S to T.
51:30:250Paolo Guiotto: And we divide it into sub-intervals, possibly of unequal lengths.
51:34:960Paolo Guiotto: So we call the points of this subdivision TK,
51:39:790Paolo Guiotto: So, let's say that T0 is, by convenience, S, and the last one, TN, is T. And in general, TK is less than TK plus 1.
51:51:420Paolo Guiotto: So, the subdivision is the set made of these points. So, T0 equal to S, less than T1, less than T2, etc, less than TN, which is equal to the final T.
52:06:970Paolo Guiotto: Now, associated to a subdivision, we compute, we call the quadratic variation. Relative to that subdivision, we write S2 pi.
52:18:490Paolo Guiotto: This quantity, by definition, is this… it is the sum for k going… technically, it is from 0 to n minus 1 of the squares of the increments of the Brownian motion, WTK plus 1 minus WTK,
52:35:740Paolo Guiotto: Square
52:37:920Paolo Guiotto: So we sum the squares of the increments between S and T. Of course, among all the subdivisions there is the one made by just the two points, the initial and the final, and in that case the sum is the full increment squared.
52:52:330Paolo Guiotto: But we can, actually, for any subdivision of the interval between S and T, introduce this quantity. It is well-defined.
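The definition just given can be written as a few lines of Python. This is an editor's sketch with invented names, not course code: given the path's values at the subdivision points, sum the squared increments.

```python
# Editor's sketch: the quadratic variation S^2(pi) of a path, relative
# to a subdivision s = t_0 < t_1 < ... < t_n = t, computed from the
# path's values at the subdivision points.

def quadratic_variation(values):
    """Sum of squared increments (w(t_{k+1}) - w(t_k))^2, k = 0..n-1."""
    return sum((b - a) ** 2 for a, b in zip(values, values[1:]))

# The coarsest subdivision {s, t} gives back the full increment squared:
print(quadratic_variation([0.0, 1.5]))       # (1.5 - 0.0)^2 = 2.25
print(quadratic_variation([0.0, 1.0, 3.0]))  # 1^2 + 2^2 = 5.0
```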
53:03:200Paolo Guiotto: Now, what happens? It is the following. Assume that
53:12:630Paolo Guiotto: We have a sequence pi_1, pi_2, and so on, pi_m,
53:27:260Paolo Guiotto: a sequence of subdivisions
53:32:750Paolo Guiotto: of the interval [S, T], such that,
53:39:110Paolo Guiotto: Well, the key point is that, at the end, the number of points must go to infinity, but we also want that, as m goes to infinity, the subdivisions invade the whole interval [S, T]. This can be made explicit as follows: such that this quantity, the modulus of pi_m,
54:03:00Paolo Guiotto: Which is the maximum
54:05:250Paolo Guiotto: distance between two consecutive TKs, so TK plus 1 minus TK. Now, these are the points of the m subdivision, so I have to put an index to say that these are the points of the subdivision pi m.
54:22:810Paolo Guiotto: So, pi M is made by points TKM, okay?
54:27:590Paolo Guiotto: So, you have one subdivision. Each subdivision is a set of points between S and T.
54:36:310Paolo Guiotto: Since here we are considering a sequence of subdivisions for each of these elements of the sequence, we have a family of points that divide the interval st, well, such that this quantity, which is the maximum amplitude between two consecutive points of the subdivision, goes to zero.
54:56:170Paolo Guiotto: So this is a way to say what? That when M is bigger, these points must divide the interval from S to T into small parts, because each part is not greater than the size of the subdivision, which is going to zero, no?
55:16:280Paolo Guiotto: So, saying that this size, the maximum distance between two consecutive points is going to zero, we are saying that the number of points is increasing
55:29:230Paolo Guiotto: For example, you could take the subdivision that divides the interval into M equal parts, no? The pi_m defined in this way is made of the points S plus K over
55:48:140Paolo Guiotto: M times (T minus S), for K equal to 0, 1, 2, up to M.
55:57:190Paolo Guiotto: If you look at these points, this is the subdivision
56:03:280Paolo Guiotto: of
56:04:330Paolo Guiotto: [S, T] into M equal parts.
56:15:730Paolo Guiotto: So this is the subdivision that takes S,
56:19:40Paolo Guiotto: T, and it divides in equal parts each of length what? The total length, T minus S, divided by the number of parts I want to divide.
56:32:290Paolo Guiotto: So if you start from the first point, K equals 0, it is S, then you add one part, T minus S over M, so if you want, you can write it this way to see better this. So you can write it as plus K,
56:48:400Paolo Guiotto: times T minus S over M. So you add, k units of this little length.
56:57:720Paolo Guiotto: So, when you have done M times, you arrive at the final point.
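The equal-parts subdivision described here can be sketched directly; an editor's illustration in Python with invented function names. Its mesh, the maximum gap between consecutive points, is (t minus s) over m, so it goes to zero as m grows.

```python
# Editor's sketch of the equal-parts subdivision pi_m of [s, t] from
# the lecture: t_k = s + k*(t - s)/m for k = 0, ..., m.  Its mesh
# |pi_m| = max(t_{k+1} - t_k) equals (t - s)/m.

def equal_subdivision(s, t, m):
    return [s + k * (t - s) / m for k in range(m + 1)]

def mesh(points):
    """|pi|: the largest gap between consecutive subdivision points."""
    return max(b - a for a, b in zip(points, points[1:]))

pts = equal_subdivision(0.0, 2.0, 4)
print(pts)        # [0.0, 0.5, 1.0, 1.5, 2.0]
print(mesh(pts))  # 0.5, shrinking like 1/m as m grows
```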
57:03:720Paolo Guiotto: Okay.
57:04:870Paolo Guiotto: Now, what happens if we take a sequence of subdivisions of ST with the maximum amplitude going to zero.
57:14:660Paolo Guiotto: Then, when we compute the quadratic variations
57:19:970Paolo Guiotto: of the Brownian motion along this sequence, this is now a sequence of what? These quantities are functions of the Brownian motion, so they are random variables, okay? So this is a sequence of random variables. It turns out that it converges in L2
57:39:160Paolo Guiotto: to T minus S.
57:44:30Paolo Guiotto: Now, this will be the crucial step to prove, for example, that the trajectories cannot be differentiable, because if they are differentiable, this quantity should go to zero.
57:55:70Paolo Guiotto: This is the idea. Well, let's see why. So, to check this, to prove this.
58:02:260Paolo Guiotto: We have to do a calculation.
58:06:860Paolo Guiotto: We need
58:08:480Paolo Guiotto: to show
58:14:120Paolo Guiotto: that the L2 norm of this minus this goes to zero. Now, the L2 norm here is with respect to the probability, so it is the expected value of the square of S2(pi_m)
58:31:00Paolo Guiotto: minus… T minus S.
58:35:270Paolo Guiotto: and this must go to zero, because, you see, this is the L2 norm of the quantity S2(pi_m) minus (T minus S). Squared, it is the squared distance in L2 norm between these two quantities.
58:52:620Paolo Guiotto: Now, this is a calculation that uses the independence and the Gaussian nature of the increments. So, if we write it down, this is the expectation of… well, let's use some
59:08:420Paolo Guiotto: light notation. S2 is a sum of squared increments of the Brownian motion, so, for brevity, I will call delta K W
59:22:700Paolo Guiotto: the increment WTK+1 minus WTK between two consecutive times of the Brownian motion. So S2 is the sum of the delta K W squared.
59:37:760Paolo Guiotto: So this is the sum over K (I don't want to keep writing "K going from 0 to" and so on), minus (T minus S)… we could give T minus S a name too, so as not to carry it around, but we may need it, so for the moment let's keep T minus S,
59:56:900Paolo Guiotto: Square.
59:58:230Paolo Guiotto: So what is the calculation? We expand the square and check that everything goes to zero. We have to be a little patient: square of the first,
00:09:850Paolo Guiotto: Plus square of second minus the double product, so we have the expectation of
00:15:510Paolo Guiotto: This square of the sum of the delta KW square.
00:22:390Paolo Guiotto: So the square of this plus the square of the second, T minus S square.
00:27:660Paolo Guiotto: minus the double product, 2T minus S, the sum of the delta KW squared.
00:36:690Paolo Guiotto: And we have to take the expectation of all this.
00:39:750Paolo Guiotto: Of course, most of the problems will be dealing with this term, but you will see it's not complicated.
00:46:480Paolo Guiotto: So, let's start from the easiest. This is a constant, okay? So the expectation of the constant is the constant, so I have a T minus S squared factor.
00:56:980Paolo Guiotto: Then let's deal with this one, which is also easy, because I have minus 2T minus S, this comes out of the expected value, it's a constant. Then I have the expectation of the sum, I do the sum of the expectations. So, expectation of the delta KW
01:15:440Paolo Guiotto: square.
01:17:200Paolo Guiotto: And then, plus, we have the expectation of the square of that sum. I will treat in a moment.
01:28:70Paolo Guiotto: But first, let's solve this. This is easy, because this is the expected value of the square of an increment, so it's expectation of WTK plus 1 minus WTK squared.
01:43:350Paolo Guiotto: We know that the increments are Gaussian, mean zero, variance, the difference between the two times, and this is the variance, so this is equal TK plus 1 minus TK.
01:56:50Paolo Guiotto: So we have: this is (T minus S) squared, minus 2(T minus S) times the sum over K of these quantities, which are TK+1 minus TK.
02:10:980Paolo Guiotto: And what are these? These are the sum of consecutive times from S to T.
02:17:750Paolo Guiotto: So these are the lengths of all the intervals of the subdivisions, no? So when I sum all these lengths, I get the total length. So this quantity is nothing but
02:31:520Paolo Guiotto: T minus S again.
02:34:70Paolo Guiotto: Okay?
02:35:840Paolo Guiotto: And then we still have the expectation of the square of the sum. That's more complicated; we will treat it in a second.
02:44:460Paolo Guiotto: So this says that you should look at the first two terms. They are T minus S squared minus 2, T minus S times another T minus S, so T minus S squared. So together, they give minus T minus S squared.
03:00:850Paolo Guiotto: And then we have, finally, to treat this term.
03:04:970Paolo Guiotto: the expectation of the sum of the square of that sum. So now let's focus on this.
03:13:480Paolo Guiotto: This is the expectation of the square of the sum. Remember,
03:19:190Paolo Guiotto: in that sum there are the squares of the increments of the Brownian motion. Not the increments: the squares of the increments.
03:27:580Paolo Guiotto: So here we have the delta KW square. All this must be squared.
03:35:760Paolo Guiotto: So how can we write this? Well, it's a square, so we have the sum of the squares.
03:42:40Paolo Guiotto: So we have expectation.
03:44:330Paolo Guiotto: sum over K of the squares of the squares, so this means the fourth power, delta KW, Power 4…
03:53:890Paolo Guiotto: Plus, there are the mixed product.
03:59:190Paolo Guiotto: with two of these terms with different indexes. So I could say sum for K different from J of a delta KW square
04:11:780Paolo Guiotto: times delta JW squared.
04:17:60Paolo Guiotto: If I write it this way, I don't need to write a double product, because I will have both the pair (K, J) and the pair (J, K), which produce the same term, okay? Otherwise, in double-product form, you have to put a 2 and count each pair once, for example summing over K less than J,
04:36:670Paolo Guiotto: delta KW square.
04:40:00Paolo Guiotto: delta JW squared. But it's more convenient to keep that form instead of this one.
04:46:560Paolo Guiotto: Now, let's compute this. Of course, we carry the sums outside the expectation, so: sum over K of the expectation of delta K W
04:56:330Paolo Guiotto: Fourth power.
04:58:540Paolo Guiotto: plus the sum over K different from J of the expectation of this product. Fortunately, since K is different from J, you have an increment
05:09:170Paolo Guiotto: of the Brownian motion on one interval times an increment of the Brownian motion on another interval, and they are independent, so the expectation splits into the product:
05:19:960Paolo Guiotto: by the independence of the increments, expectation of delta K W squared times expectation of delta J W
05:29:10Paolo Guiotto: squared.
05:31:690Paolo Guiotto: You see?
05:33:130Paolo Guiotto: Now, these are easy, because we already computed. This is the expected value of the square of an increment. It is here, right? It is equal to the length of the interval on which the increment is based. So this is TK plus 1 minus TK.
05:53:220Paolo Guiotto: And similarly, this will be TJ plus 1 minus TJ.
05:58:350Paolo Guiotto: Okay.
06:00:330Paolo Guiotto: What about this?
06:02:160Paolo Guiotto: This is the expectation of not the square but the fourth power of the increment, WTK+1 minus WTK
06:13:420Paolo Guiotto: to the power 4.
06:14:810Paolo Guiotto: Well, here you can use a standard fact: since WT minus WS is normal, mean zero and variance T minus S, the expected value of its fourth power
06:34:510Paolo Guiotto: can be computed via a Gaussian integral, and it comes out 3 times (T minus S) squared.
06:44:190Paolo Guiotto: We accept it, okay?
06:47:280Paolo Guiotto: You write the integral: the integral of x to the power 4, times e to the minus x squared over 2(T minus S), divided by the root of 2 pi (T minus S), dx. You integrate by parts, reduce to the case of the variance, and you get the result easily.
07:07:640Paolo Guiotto: Okay?
07:09:800Paolo Guiotto: We actually know all the powers of a Gaussian variable. There is a general formula
07:16:240Paolo Guiotto: for the so-called moments of a Gaussian variable, E of X to the n, for X normal with mean 0 and variance sigma squared. There is a formula with factorials; I don't remember it, and we don't need it here.
07:31:970Paolo Guiotto: So the key point is that this is 3,
07:34:950Paolo Guiotto: TK plus 1 minus TK squared.
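The fourth-moment fact accepted here, E[X^4] = 3 v^2 for X normal with mean 0 and variance v, can be checked numerically. This is an editor's sketch, not the lecture's integration-by-parts computation: a midpoint Riemann sum of x^4 times the Gaussian density, over an interval wide enough that the tails are negligible.

```python
import math

# Editor's numerical check (not the lecture's by-parts proof): for
# X ~ N(0, v), E[X^4] = 3 * v**2.  Midpoint Riemann sum of
# x^4 * exp(-x^2 / (2v)) / sqrt(2*pi*v) over [-12 sd, 12 sd].

def gaussian_fourth_moment(v, half_width_sds=12.0, n=200_000):
    sd = math.sqrt(v)
    a = -half_width_sds * sd
    h = 2 * half_width_sds * sd / n
    norm = math.sqrt(2 * math.pi * v)
    total = 0.0
    for i in range(n):
        x = a + (i + 0.5) * h
        total += x ** 4 * math.exp(-x * x / (2 * v))
    return total * h / norm

for v in (0.5, 1.0, 2.0):
    print(v, gaussian_fourth_moment(v))  # close to 3 * v**2
```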
07:40:230Paolo Guiotto: Okay, now let's plug all this into this formula, so equals star…
07:47:200Paolo Guiotto: Let's see what we get, huh?
07:50:20Paolo Guiotto: Okay, let's go down here.
07:54:760Paolo Guiotto: So we have… the first sum is 3 times…
07:59:780Paolo Guiotto: the sum of a K of TK plus 1 minus TK to the power fourth.
08:06:420Paolo Guiotto: And then we have a plus.
08:09:910Paolo Guiotto: I'm just copying. The sum for K different from J. Here we have the simple increments, TK plus 1 minus TK.
08:19:450Paolo Guiotto: And TJ plus 1 minus TJ.
08:24:220Paolo Guiotto: Okay.
08:25:479Paolo Guiotto: Now, let's work on this sum.
08:29:520Paolo Guiotto: Of course, this is a double sum, sum of a K, sum of a J, okay? So we may write this sum in this form.
08:37:630Paolo Guiotto: Choose one index to sum first and one index to sum second. Let's say we leave the sum over K outside and the sum over J inside.
08:47:500Paolo Guiotto: Now, this is the sum over the pairs (K, J) where the two are different and both vary between 0 and the final index. So it is the same as summing over K, and then over J different from K,
09:04:450Paolo Guiotto: With the point that now this thing is a constant for the inner sum. So I can say that I can factorize, put outside TK plus 1 minus TK, and then sum over J, TJ plus 1 minus TJ, right?
09:23:60Paolo Guiotto: But now be careful, this sum is not T minus S.
09:27:350Paolo Guiotto: Why?
09:29:290Paolo Guiotto: Because you see that this is the sum of all the consecutive intervals.
09:34:950Paolo Guiotto: of this subdivision, indexed by J,
09:39:50Paolo Guiotto: But we are missing one of them, you see, because there is not J equal K, so it means that you have all the consecutive intervals, but at a certain point, there is TK, TK plus 1,
09:52:470Paolo Guiotto: Which is not… so the length of this is not among them. But you are summing the length of this, this, plus this, plus this, plus this, plus this, plus this, and all the others.
10:05:70Paolo Guiotto: Except that red one.
10:08:210Paolo Guiotto: So the trick is: let's add what's missing. If I add also this one, I get the total length T minus S, but then I have to subtract the excess, which is the red length. So this is equal to
10:26:80Paolo Guiotto: T minus S, minus the length TK+1 minus TK, okay?
10:35:130Paolo Guiotto: You see?
10:36:380Paolo Guiotto: Because that's the sum of all the… the length of all the intervals of the subdivision, except one.
10:42:780Paolo Guiotto: the interval from TK, because you see the index J is different from the index K, so it means that here, among all of them, it is missing TK plus 1 minus TK.
10:53:610Paolo Guiotto: So, if I want the total length, I need that one too: I can add TK+1 minus TK, but then I also have to subtract TK+1 minus TK. Grouping the added term with this part of the sum, I get the total length T minus S, minus that.
11:13:660Paolo Guiotto: So, this means that this is…
11:17:810Paolo Guiotto: Now, doing the operation: this factor T minus S is independent of K, so I write it outside, T minus S times the sum over K of the lengths TK+1 minus TK.
11:32:880Paolo Guiotto: And then, we have minus…
11:36:40Paolo Guiotto: the sum over K of, you see, TK+1 minus TK times itself, so (TK+1
11:44:230Paolo Guiotto: minus TK) squared.
11:49:110Paolo Guiotto: Now, this first sum is, again, the sum of the length of all the consecutive intervals, so when I sum up all these lengths, I get, once again, T minus S, so this is T minus S.
12:01:20Paolo Guiotto: So, at the end, I get (T minus S)
12:04:590Paolo Guiotto: squared, minus the sum over K of (TK+1 minus TK) squared. So that's the formula for what?
12:16:770Paolo Guiotto: For the expectation of
12:25:620Paolo Guiotto: this one, it is this one, okay? So, the expectation
12:33:410Paolo Guiotto: of the square of the sum of the delta K W squared turns out to be… wait.
12:55:130Paolo Guiotto: So… here there is a mistake, because these should be squared. It comes from this… oh, sorry.
13:08:30Paolo Guiotto: We said that the expected value of the fourth power of the increment of the Brownian motion is this: 3 times the time increment squared. So we have 3 times this, squared, okay?
13:23:130Paolo Guiotto: Plus that sum. We worked out that sum; it becomes this. So, putting things together: 3
13:33:250Paolo Guiotto: times the sum over K of the squares of the time increments, (TK+1 minus TK) squared.
13:41:370Paolo Guiotto: Then we have that double sum that we worked out here, and it gives plus (T minus S) squared
13:52:310Paolo Guiotto: minus the sum over K of (TK+1 minus TK) squared.
14:01:510Paolo Guiotto: So that takes care of that term.
14:04:20Paolo Guiotto: Putting the two together, you see that 3 minus 1 makes 2, so it is (T minus S) squared plus 2 times the sum over K of (TK+1 minus TK)
14:19:630Paolo Guiotto: squared.
14:21:170Paolo Guiotto: All this is this quantity here. So now we go back to the beginning.
14:27:380Paolo Guiotto: So, let's review what we did. So, we started computing this expectation here.
14:37:240Paolo Guiotto: which is the distance in L2 norm between S2,
14:47:680Paolo Guiotto: the quadratic variation, and T minus S. So we had to compute that expectation, and we have seen that, after some steps,
14:59:300Paolo Guiotto: Yes, you see, what we got was minus T minus S squared, plus the expectation of the square of that sum. The expectation of the square of the sum is down here.
15:11:410Paolo Guiotto: Okay, so we can conclude now that the expectation of the square of
15:21:770Paolo Guiotto: S… sorry, S2(pi_m)
15:27:440Paolo Guiotto: minus (T minus S),
15:30:470Paolo Guiotto: squared,
15:32:490Paolo Guiotto: So this quantity is equal to…
15:37:830Paolo Guiotto: So, the first part… Well, let me just color the formula. So, this is the initial calculation, okay?
15:46:930Paolo Guiotto: It is this one.
15:48:660Paolo Guiotto: It is this.
15:50:250Paolo Guiotto: We did the chain of equalities.
15:52:800Paolo Guiotto: And we arrived there. Yeah. You see?
15:55:870Paolo Guiotto: So we have a minus T minus S squared plus that expectation. So we start writing.
16:03:300Paolo Guiotto: minus T minus S squared, plus the expectations.
16:11:730Paolo Guiotto: The expectation, the one circled in yellow, this part here.
16:15:930Paolo Guiotto: Now, we took the expectation; it is here.
16:19:880Paolo Guiotto: We said that at the end, the expectation is equal to this box here.
16:27:930Paolo Guiotto: So we plug this into the formula. T minus S squared plus 2 times sum of k of TK plus 1
16:37:350Paolo Guiotto: minus TK squared.
16:40:200Paolo Guiotto: As you can see, these two cancel.
16:44:540Paolo Guiotto: And what remains is this formula. So, the expectation.
16:49:660Paolo Guiotto: off.
16:51:70Paolo Guiotto: of S2(pi_m) minus (T minus S), squared, that is, the distance in L2 norm between the quadratic variation and T minus S, is equal exactly to 2 times the sum over K of (TK+1 minus TK)
17:10:970Paolo Guiotto: square.
17:12:530Paolo Guiotto: And now we can conclude the final, because…
17:16:830Paolo Guiotto: Now, look at this sum. It is not the sum of the lengths TK+1 minus TK, because they are squared. But take (TK+1 minus TK) squared:
17:30:960Paolo Guiotto: this is the product of TK+1
17:34:170Paolo Guiotto: minus TK times itself.
17:39:880Paolo Guiotto: Now, you take one of them.
17:42:220Paolo Guiotto: and say that this is bounded by the maximum amplitude of the interval, so that quantity we call the modulus of pi m, the maximum size of the subdivision.
17:57:20Paolo Guiotto: If you recall the definition:
18:00:200Paolo Guiotto: modulus of pi m is this quantity, is the maximum of the length of the intervals of the subdivision, okay?
18:10:00Paolo Guiotto: So I can estimate this one of the two with modulus pi m, and leave the other as it is. So, from this, I get that this quantity, the expectation of
18:24:220Paolo Guiotto: the distance in L2 norm between the quadratic variation and T minus S.
18:32:440Paolo Guiotto: the distance in L2 norm is now less or equal (it is less or equal because I'm bounding one of the two factors)
18:41:220Paolo Guiotto: then 2 sum of k, the maximum amplitude of the subdivision, times TK plus 1 minus TK.
18:52:590Paolo Guiotto: Now, this is a constant that you can take out of the sum, and what remains is the sum of the lengths TK+1 minus TK, which, once again, is the length of the total interval, T minus S. So this is equal to 2 times the maximum amplitude of the subdivision times T minus S.
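Putting the pieces together, the chain of identities and estimates just derived can be summarized, in the lecture's notation, as:

```latex
\mathbb{E}\!\left[\bigl(S^2(\pi_m)-(t-s)\bigr)^{2}\right]
 \;=\; 2\sum_{k}\bigl(t^{m}_{k+1}-t^{m}_{k}\bigr)^{2}
 \;\le\; 2\,\lvert\pi_m\rvert \sum_{k}\bigl(t^{m}_{k+1}-t^{m}_{k}\bigr)
 \;=\; 2\,\lvert\pi_m\rvert\,(t-s)\;\xrightarrow[m\to\infty]{}\;0 .
```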
19:13:440Paolo Guiotto: And therefore we have the conclusion, because if this quantity goes to zero, so does this one. And this proves that the quadratic
19:25:810Paolo Guiotto: variation
19:27:130Paolo Guiotto: of the Brownian motion along the subdivisions pi_m goes, in L2, to T minus S.
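A small simulation makes the proposition tangible. This is an editor's sketch, not course material: sample Brownian increments on finer and finer equal subdivisions of [s, t] and compute S2(pi_m); for large m a single sample already lands near t minus s, since the L2 error just computed is 2 (t minus s)^2 / m for equal parts.

```python
import random

# Editor's simulation (not from the lecture): sample independent
# Brownian increments on the equal subdivision pi_m of [s, t] and
# compute the quadratic variation S^2(pi_m).  The proposition says it
# converges in L2 to t - s; its variance here is 2*(t - s)**2 / m.
random.seed(42)

def sampled_quadratic_variation(s, t, m):
    dt = (t - s) / m
    sd = dt ** 0.5
    return sum(random.gauss(0.0, sd) ** 2 for _ in range(m))

for m in (10, 1_000, 100_000):
    print(m, sampled_quadratic_variation(0.0, 1.0, m))  # settles near 1.0
```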
19:40:860Paolo Guiotto: Okay, so now let's see what follows from this. I will limit myself to a very strong version, but in fact this result can be weakened a lot. So, the corollary is:
20:01:330Paolo Guiotto: the probability that W, as a function of time, is a C1 function is equal to zero.
20:12:380Paolo Guiotto: So this is a way to say that the trajectories of the Brownian motion are almost never regular trajectories.
20:22:860Paolo Guiotto: And this is, let's say, also a deep fact, because you have to imagine that these trajectories form
20:30:300Paolo Guiotto: an enormous family of continuous functions, no? This says that, among the continuous functions, the C1 functions, those with a continuous derivative, are negligible.
20:44:80Paolo Guiotto: Which is intuitive, no? It's as if there were a big box containing all these trajectories: if I pick a trajectory at random, what do I get? I will never get a regular one.
20:57:850Paolo Guiotto: So they are very rare. Regular trajectories are rare, because they have probability zero. From the probabilistic point of view, they are impossible events, okay?
21:08:180Paolo Guiotto: Now, why is this? Because… suppose,
21:18:130Paolo Guiotto: suppose for a second that W is a C1 function
21:25:190Paolo Guiotto: on an interval.
21:29:910Paolo Guiotto: Actually, the result can be weakened much further; this assumption is very strong, no?
21:36:80Paolo Guiotto: A C1 trajectory is differentiable with continuous derivative everywhere. But the statement can be weakened to say that,
21:45:370Paolo Guiotto: whatever is the time t you fix, so at one single point, still the probability to pick a trajectory that is differentiable at that single point is zero.
21:56:260Paolo Guiotto: So even the trajectories with a tangent at a single point are impossible trajectories. So, suppose W is C1.
22:13:40Paolo Guiotto: Then take an increment, WTK+1 minus WTK.
22:25:850Paolo Guiotto: Now, you know there is the Lagrange formula,
22:31:460Paolo Guiotto: the mean value theorem for finite increments. It says that F(B) minus F(A) is equal to the derivative somewhere in between, times B minus A. This is the formula I'm talking about, okay?
22:48:150Paolo Guiotto: Or, if you want, you can even use the fundamental theorem of integral calculus. You say that this increment is the integral from TK to TK+1
23:08:610Paolo Guiotto: of the derivative of W with respect to S, no?
23:15:660Paolo Guiotto: If the trajectory were C1, we could write this. But then, if you take the modulus of the increment, WTK plus 1 minus WTK,
23:27:760Paolo Guiotto: This would be controlled by the modulus of that integral, which is bounded by the integral, from TK to TK+1, of the modulus of the derivative.
23:38:830Paolo Guiotto: And if that function is continuous, and it is, because the trajectory is supposed to be C1, I can bound it by the infinity norm of the derivative
23:51:870Paolo Guiotto: of W, times what remains, the integral of 1, which is TK+1 minus TK.
24:01:90Paolo Guiotto: So, in particular, you would have that the event, the set of omegas in capital omega, such that the trajectory
24:11:630Paolo Guiotto: is a C1.
24:15:110Paolo Guiotto: function.
24:17:450Paolo Guiotto: is contained in the set of trajectories with this property: the increments of the Brownian motion, WTK+1 minus WTK,
24:29:70Paolo Guiotto: are controlled by a constant times TK+1 minus TK.
24:38:90Paolo Guiotto: Functions with this property are what are called Lipschitz functions, if you have ever heard this word, a condition a little weaker than C1.
24:49:550Paolo Guiotto: But if this happens, look what happens to the quadratic variation. Take the quadratic variation, which is the sum of the increments WTK plus 1 minus WTK squared.
25:03:190Paolo Guiotto: Right?
25:04:550Paolo Guiotto: If you know that there is a constant that bounds the increment of W by the increment of time, this would be less or equal than sum of that constant squared, TK plus 1
25:18:70Paolo Guiotto: minus TK squared.
25:21:790Paolo Guiotto: And now it's the same trick we used above: of the two factors TK+1 minus TK, you take one
25:38:350Paolo Guiotto: and control it by the maximum size of the subdivision, keeping the other, in such a way that this becomes less or equal than C squared times the maximum amplitude
25:49:570Paolo Guiotto: times the sum of the lengths, which is once more T minus S.
25:56:390Paolo Guiotto: So you would have that S2(pi_m), which is
26:01:860Paolo Guiotto: also positive by definition, would be bounded by this. But since this goes to zero, you would have what? That S2
26:11:710Paolo Guiotto: (pi_m)
26:13:720Paolo Guiotto: should go to zero.
26:17:390Paolo Guiotto: But we know that S2 pi m goes to T minus S.
26:23:140Paolo Guiotto: Which is never 0, unless T is equal to S.
26:26:850Paolo Guiotto: So, well, to make the argument precise, the point is this: here we have convergence in L2,
26:34:820Paolo Guiotto: and this other convergence, to zero, would hold for the omegas for which the trajectory is differentiable, the omegas such that the trajectory is differentiable.
26:50:300Paolo Guiotto: But then, how can these two things stand together? We have
26:55:700Paolo Guiotto: a convergence in L2; this is not a pointwise convergence, but if we pass to a subsequence, it becomes also an almost sure convergence. So, along
27:08:410Paolo Guiotto: a subsequence,
27:11:750Paolo Guiotto: this becomes also an almost sure convergence to the same limit, T minus S.
27:17:610Paolo Guiotto: And therefore, if this sequence converges almost surely to T minus S, what is the probability of the set that it converges to zero?
27:29:120Paolo Guiotto: Well, if it converges with probability 1 to T minus S, it cannot converge to 0 with any positive probability: outside the probability-1 set there remains only a set of probability 0 where it could converge to anything else,
27:48:490Paolo Guiotto: and among those possibilities there is the limit zero.
27:51:180Paolo Guiotto: So the probability that this happens must be equal to zero.
27:59:220Paolo Guiotto: So, it's actually stronger than what the statement says.
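To see the corollary's mechanism in the other direction, here is an editor's sketch, not from the lecture: for a smooth (C1, hence Lipschitz) path such as sin(t), the quadratic variation along equal subdivisions shrinks like the mesh, instead of tending to t minus s as the Brownian case does.

```python
import math

# Editor's example: a C^1 path like sin satisfies
# |f(t_{k+1}) - f(t_k)| <= C * (t_{k+1} - t_k), so its quadratic
# variation is at most C^2 * |pi_m| * (t - s), which shrinks with the
# mesh -- unlike Brownian motion, whose quadratic variation tends
# to t - s.

def quad_var(f, s, t, m):
    pts = [s + k * (t - s) / m for k in range(m + 1)]
    vals = [f(p) for p in pts]
    return sum((b - a) ** 2 for a, b in zip(vals, vals[1:]))

for m in (10, 100, 1000):
    print(m, quad_var(math.sin, 0.0, 1.0, m))  # decreases roughly like 1/m
```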
28:06:800Paolo Guiotto: Okay, so…
28:08:900Paolo Guiotto: I wanted just to, to show you at least this, important feature of the Brownian motion.
28:17:650Paolo Guiotto: Then you can find, but these are a little bit
28:23:400Paolo Guiotto: more complicated, refined results that study exactly the regularity of these trajectories. This is not just a mathematical point, because
28:37:20Paolo Guiotto: when we work with these paths, we want to know what kind of regularity they have; it depends on what we are doing with the models. So it's important to have some information about the trajectories of the Brownian motion.
28:54:230Paolo Guiotto: Okay, so if you are curious just to see the statements, not the proofs (unless you are particularly interested, because the proofs are quite difficult), you can see, in chapter 12,
29:13:40Paolo Guiotto: What kind of things can be said about
29:17:730Paolo Guiotto: The regularity properties of this trajectory.
29:21:790Paolo Guiotto: Good. Let's stop here, and let's close the course here. Let me stop recording first, before
29:29:970Paolo Guiotto: we say a few more things.