Transcript
00:10:910Paolo Guiotto: So, yes, yes. Last time, we introduced the
00:18:30Paolo Guiotto: concept of orthogonal projection. We have
00:21:430Paolo Guiotto: seen that the orthogonal projection on a closed
00:25:670Paolo Guiotto: subspace of a Hilbert space is always well-defined and uniquely characterized.
00:32:540Paolo Guiotto: Now, let's introduce an important definition.
00:39:530Paolo Guiotto: This is, for the moment, just for a set. Let V
00:47:970Paolo Guiotto: be a vector space equipped with an inner product,
00:59:510Paolo Guiotto: so with a scalar or Hermitian product, and let S be a subset of V, just a subset.
01:07:830Paolo Guiotto: We define
01:13:120Paolo Guiotto: the orthogonal
01:20:20Paolo Guiotto: complement
01:27:630Paolo Guiotto: of S, this set:
01:30:680Paolo Guiotto: S perpendicular is the set of vectors f in V which are orthogonal to all vectors of S. So f scalar u equals 0,
01:43:230Paolo Guiotto: for every u in S.
01:48:670Paolo Guiotto: Now, it turns out, and this is just a simple remark, but we write it as a proposition, that this set is always a closed subspace of V. So, whatever S is, if I take any set,
02:06:290Paolo Guiotto: S perpendicular is always a closed subspace of V.
02:28:740Paolo Guiotto: This is just a simple, straightforward proof.
02:34:50Paolo Guiotto: First of all, it is a subspace. Now, I have to prove that if I take a linear combination of vectors of this set,
02:43:320Paolo Guiotto: I am staying in the set, okay?
02:46:20Paolo Guiotto: So, first, S perpendicular is a subspace.
02:53:470Paolo Guiotto: Subspace for us means a vector space contained in V.
02:59:900Paolo Guiotto: Because, if you take f and g in S perpendicular, and two scalars, alpha, beta in R,
03:14:120Paolo Guiotto: or C, it is the same if we are in the complex case. So now, this means that f scalar u is 0 for every u in S, and similarly, g scalar u is 0 for every u in S.
03:32:470Paolo Guiotto: So…
03:34:190Paolo Guiotto: when you do the linear combination alpha f plus beta g, and you do the scalar product with a generic u of S,
03:42:820Paolo Guiotto: then, whatever kind of product it is, it is always linear in the first argument.
03:49:600Paolo Guiotto: So, for the scalar case, or for the Hermitian case, they are both linear. So, this is alpha f scalar u plus beta
04:00:160Paolo Guiotto: g scalar u.
04:01:980Paolo Guiotto: But since these two are zero, we get that all this is equal to 0 for every u in S. So it means that also alpha f plus beta g belongs to S perpendicular. So, if we assume that we have two
04:20:459Paolo Guiotto: vectors in that set, and two scalars,
04:25:700Paolo Guiotto: real or complex, then their linear combination belongs to the set. This means the set is a vector subspace.
04:33:200Paolo Guiotto: And the second part: we have to prove that this set is closed.
04:40:640Paolo Guiotto: S perpendicular is closed
04:45:450Paolo Guiotto: if and only if, for every sequence of vectors f_n in S perpendicular
04:54:730Paolo Guiotto: such that this sequence converges in the norm, which here is the norm induced by the scalar product, the canonical norm for this setup, to some f,
05:09:640Paolo Guiotto: we have that this f is also a vector of this set S perpendicular.
05:17:760Paolo Guiotto: But this is also quite straightforward, because if we know this, it means that f_n scalar u is equal to 0 for every u in S, and also for every n, because each f_n is in this set S perpendicular.
05:36:120Paolo Guiotto: Now, the natural thing to do is to pass to the limit. Since this f_n is supposed to go to f, and we know that the inner product, scalar or Hermitian, is always a continuous function of
05:48:990Paolo Guiotto: its arguments, this quantity will go to the scalar product between f and u, or the Hermitian product between f and u. And since it remains equal to 0 along the sequence, the limit must be equal to zero
06:02:30Paolo Guiotto: for every u in S, and this exactly implies that f is in S perpendicular. This finishes the proof.
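Since the lecture works abstractly, here is a finite-dimensional sketch of the proposition (my own illustration, not from the lecture): in R^n with the usual dot product, the orthogonal complement of any finite set S of vectors is the null space of the matrix whose rows are those vectors, and linear combinations of its elements stay inside it. The helper name `orthogonal_complement` and the sample set are made up for the demo.

```python
# Illustrative sketch: S-perp in R^n via the SVD null space of the matrix
# whose rows are the vectors of S.
import numpy as np

def orthogonal_complement(S):
    """Return an orthonormal basis (as rows) of S-perp for vectors S in R^n."""
    A = np.array(S, dtype=float)            # rows are the vectors of S
    _, sing, Vt = np.linalg.svd(A)
    rank = int(np.sum(sing > 1e-12))
    return Vt[rank:]                        # rows orthogonal to every row of A

S = [[1.0, 0.0, 0.0], [1.0, 1.0, 0.0]]     # just a set, not a subspace
B = orthogonal_complement(S)               # here S-perp = span{e3}

# Every basis vector of S-perp is orthogonal to every vector of S ...
assert np.allclose(np.array(S) @ B.T, 0.0)
# ... and a linear combination alpha*f + beta*g of elements of S-perp
# is still orthogonal to S, so S-perp is a subspace, as in the proof.
f, g = 2.0 * B[0], -3.0 * B[0]
assert np.allclose(np.array(S) @ (0.5 * f + 4.0 * g), 0.0)
```

Closedness is automatic here: in finite dimension every subspace is closed, which is exactly why the infinite-dimensional statement needs the continuity argument above.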
06:13:600Paolo Guiotto: Well, notice that, in general, the orthogonal complement is never empty. Maybe we should remark this:
06:26:990Paolo Guiotto: S perpendicular is always different from empty, because there is for sure a vector here. You may guess which, because this is a linear subspace, and in a linear subspace there is always the zero.
06:39:900Paolo Guiotto: And in fact, if you take the vector 0, then 0 scalar u is equal to zero, whatever u is.
06:48:910Paolo Guiotto: You may wonder why this should be true.
06:52:600Paolo Guiotto: Okay, let's say it contains at least the zero. The zero here means the zero of the vector space, because
07:03:430Paolo Guiotto: zero scalar u
07:06:260Paolo Guiotto: is equal to zero, so we have two zeros here: one is the zero of the space, and the other is the zero of the scalars.
07:14:400Paolo Guiotto: No?
07:22:400Paolo Guiotto: You know, this is not written in the axioms, no? So why should 0 scalar u be 0?
07:30:840Paolo Guiotto: Well, of course, if you think of the examples: the scalar product in R^d is a sum of products of components, so if the components of one vector are zero, you get zero. Or if you think of the L2 scalar product, the integral of f times g: if f is zero, the integral is zero. But
07:48:720Paolo Guiotto: why should it be zero in general?
07:51:320Paolo Guiotto: Well, there is
07:57:360Paolo Guiotto: a simple argument: you can write the vector 0 as the scalar 0 times the vector 0,
08:06:670Paolo Guiotto: because, of course, this product is still the vector zero.
08:10:130Paolo Guiotto: But then you can carry out the scalar zero, so the product of this vector with u is the scalar zero times the product of the vector 0 of V with u,
08:20:870Paolo Guiotto: and this is, of course, equal to zero, no? So here you read the zero of the space scalar u equal to zero. Okay.
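The argument on the board amounts to this one-line computation (a restatement in LaTeX, nothing beyond what was just said):

```latex
\langle 0_V, u \rangle
  = \langle 0 \cdot 0_V,\, u \rangle
  = 0 \cdot \langle 0_V, u \rangle
  = 0
\qquad \text{for every } u \in V .
```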
08:32:179Paolo Guiotto: So, in any case, this set, this orthogonal complement,
08:37:870Paolo Guiotto: is always well defined. It contains something, at least the zero of the vector space.
08:45:180Paolo Guiotto: For example, if you do the orthogonal of the full space, so let's say, another remark:
08:58:500Paolo Guiotto: if you do the orthogonal of the entire space, you get exactly the set made of only the vector 0. This is because if f is in
09:11:490Paolo Guiotto: V orthogonal, it must be that f scalar u equals 0 for every u in the space, and among them there is also f. So in particular,
09:27:120Paolo Guiotto: f scalar f is equal to 0, because I can take f itself, which is in V. And this means that the norm of f squared is equal to 0, and because the norm vanishes, this means f equals 0.
09:43:490Paolo Guiotto: So, the orthogonal of the full space is just the trivial subspace made of the zero.
09:52:360Paolo Guiotto: In general, this will be a vector subspace of V.
09:58:400Paolo Guiotto: Now, since it is always closed,
10:02:840Paolo Guiotto: if the space is complete, that is, a Hilbert space, there will always be the orthogonal projection on that subspace. Okay, so in particular, another remark, or if you want, a corollary:
10:22:40Paolo Guiotto: if the space V equipped with the inner product is a Hilbert
10:33:370Paolo Guiotto: space, and S is any subset of V,
10:40:890Paolo Guiotto: then the orthogonal
10:49:230Paolo Guiotto: projection
10:52:190Paolo Guiotto: on S perpendicular, what we call P S-perpendicular of f,
11:00:540Paolo Guiotto: is well-defined
11:05:880Paolo Guiotto: for every f in V.
11:11:760Paolo Guiotto: Now, an interesting fact is that if, moreover, the set S is itself a closed subspace, then there is also the orthogonal projection on S. So let's say that this is a new proposition, or let's say: if, moreover,
11:34:750Paolo Guiotto: S,
11:35:990Paolo Guiotto: let's call it U, to use the standard notation, is itself a closed
11:46:790Paolo Guiotto: subspace of
11:51:650Paolo Guiotto: V, then we know that there is the orthogonal projection also on U, and we have this relation: the orthogonal projection on U perpendicular
12:02:960Paolo Guiotto: of f is exactly f minus the orthogonal projection on U of f, which is
12:09:290Paolo Guiotto: quite natural, let's say, in the geometric sense. Let's say that this is our space U, and let's imagine that the space V is the plane, no? So U perpendicular would be this one.
12:25:230Paolo Guiotto: Now, you take a vector f here.
12:27:920Paolo Guiotto: And suppose that the orthogonal projection is well-defined. For us, in finite dimension, we know that finite-dimensional subspaces are always closed, so this operation is always well defined.
12:41:870Paolo Guiotto: And so this is the vector that we call the orthogonal projection, the PU of f. This vector here, which is perpendicular to the space U, is exactly f minus PU of f.
12:57:240Paolo Guiotto: Now, as you can see from this figure, this seems to be exactly the same as the projection on the orthogonal of U, the PU perpendicular of f.
13:09:130Paolo Guiotto: And so in particular, if you want, you have also this other way to write it, or equivalently:
13:16:830Paolo Guiotto: the vector f can be written as the orthogonal projection on U of f plus the orthogonal projection on U perpendicular of f.
13:28:60Paolo Guiotto: And since these two vectors, by definition, one is in U and the other is in U perpendicular, they are themselves perpendicular.
13:37:50Paolo Guiotto: So this is called the orthogonal decomposition of f into two components: one is the component in U, and the other is the perpendicular one.
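A small numerical sketch of the orthogonal decomposition just described (an illustration of mine, not from the lecture): in R^3, if Q has an orthonormal basis of a subspace U as columns, then Q Q^T is the projection onto U and I minus Q Q^T is the projection onto U perpendicular, so f splits into two mutually perpendicular components.

```python
# Illustrative: f = P_U f + P_{U-perp} f, with P_{U-perp} = I - P_U.
import numpy as np

Q, _ = np.linalg.qr(np.array([[1.0, 2.0], [0.0, 1.0], [1.0, 0.0]]))
P_U = Q @ Q.T                    # orthogonal projection onto U = span of the columns
P_Uperp = np.eye(3) - P_U        # the relation P_{U-perp} f = f - P_U f, as a matrix

f = np.array([3.0, -1.0, 2.0])
pu, pup = P_U @ f, P_Uperp @ f

assert np.allclose(f, pu + pup)          # the orthogonal decomposition of f
assert np.isclose(pu @ pup, 0.0)         # the two components are perpendicular
```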
13:52:50Paolo Guiotto: Okay, what do we have to prove here? Well, the first part is just a consequence, because we noticed that S perpendicular is always a closed subspace of V. V is a Hilbert space, so the orthogonal projection is well-defined.
14:10:490Paolo Guiotto: So there is nothing to prove. For the second part, suppose that U is itself a closed subspace of V, and we want to deduce this relation. So in this case, when U is a closed subspace
14:24:160Paolo Guiotto: of V, with V a Hilbert space, both projections exist: the one on U, and the one on U perpendicular.
14:32:790Paolo Guiotto: We want to show that there is this relation. So, we don't have to show that they exist, we already know they exist. The unique fact is that we want to show that this relation holds, okay? So…
14:47:840Paolo Guiotto: So, since both U and U perpendicular are
14:57:70Paolo Guiotto: closed subspaces
15:03:110Paolo Guiotto: of V, a Hilbert space,
15:09:470Paolo Guiotto: both
15:11:00Paolo Guiotto: projections, PU of f
15:16:840Paolo Guiotto: and PU perpendicular of f, are well-defined.
15:27:940Paolo Guiotto: So the unique thing to be proved is this relation:
15:32:360Paolo Guiotto: the unique
15:39:00Paolo Guiotto: thing, I mean, to be proved
15:47:230Paolo Guiotto: is that
15:49:310Paolo Guiotto: one can be obtained from the other through this formula: PU perpendicular of f is f minus PU of f, or, if you want, equivalently, PU of f is f minus PU perpendicular of f.
16:03:230Paolo Guiotto: Now, to show that this relation holds, we recall that there is a characterization of the orthogonal projection on any subspace, provided by the orthogonality condition that we have in the main theorem. The main theorem says that,
16:22:750Paolo Guiotto: let's quickly review it, the orthogonal projection is characterized as the unique solution of this
16:32:300Paolo Guiotto: condition, this equation, no? f minus the candidate orthogonal projection, in scalar product with any vector of the space on which you are projecting, equal to zero.
16:45:350Paolo Guiotto: So what we do is, for example, if we want to show that PU perpendicular of f is f minus PU of f, we plug f minus PU of f there, and we take a vector which is in the perpendicular of U, okay? So let's just copy this condition.
17:05:589Paolo Guiotto: We recall that
17:14:190Paolo Guiotto: PU perpendicular of f is characterized,
17:21:460Paolo Guiotto: hmm, perhaps,
17:25:819Paolo Guiotto: well, let's do PU of f first, otherwise I have to write a lot of orthogonals. PU of f is characterized by this fact, no? f
17:38:280Paolo Guiotto: minus PU of f,
17:40:720Paolo Guiotto: scalar u, equals 0 for every u in capital U, okay?
17:47:790Paolo Guiotto: Now, if you look at this identity, this identity is what we use to say that PU of f is f minus PU perpendicular of
18:00:390Paolo Guiotto: f, okay? So what we do is, let's see what happens if we plug into that relation the vector f minus PU perpendicular of f, no? So we want to plug this
18:12:510Paolo Guiotto: thing inside here, to show that f minus that vector, scalar u, is 0. And this is trivial, because we have
18:24:920Paolo Guiotto: that if we do f minus, let's say, the vector which is f minus PU perpendicular of f,
18:33:710Paolo Guiotto: scalar
18:36:720Paolo Guiotto: u, let's see what this product is. Well, now you see that something cancels here, f and f. So, what remains is the scalar product PU perpendicular of f, scalar u. And what about this?
18:54:40Paolo Guiotto: Well, this is definitely equal to 0, because this vector here belongs to the space U perpendicular. And what is U perpendicular? It's the set of vectors v such that v scalar u is 0
19:11:80Paolo Guiotto: for every u in capital U, just by definition of what U orthogonal is, no? So now we see that the relation is verified, and this means that the vector
19:23:60Paolo Guiotto: f minus PU perpendicular of f
19:26:180Paolo Guiotto: plays the role of the orthogonal projection
19:29:640Paolo Guiotto: on U of f; equivalently, PU perpendicular of f is f minus PU of f.
19:37:10Paolo Guiotto: Okay, let's see an example of application of this fact.
19:42:40Paolo Guiotto: So I take the exercise 1436.
19:52:610Paolo Guiotto: It says: let H, capital H, be the space L2
20:01:760Paolo Guiotto: (0,1),
20:06:220Paolo Guiotto: equipped with the usual scalar product, so we are in the L2 case, okay?
20:12:590Paolo Guiotto: Equipped with the usual
20:22:20Paolo Guiotto: scalar product:
20:27:220Paolo Guiotto: that means f scalar g in L2 is the integral between 0 and 1 of f times g.
20:35:300Paolo Guiotto: Now, we define U as this: the set of f in H such that the integral between 0 and 1 of f of x is equal to 0.
20:51:740Paolo Guiotto: Now, question one:
20:54:210Paolo Guiotto: is U a closed subspace
21:00:570Paolo Guiotto: of H? Question two: it says, if yes, determine the orthogonal
21:17:460Paolo Guiotto: of U.
21:19:60Paolo Guiotto: Let's put it in this way, but we also have to determine the orthogonal projection.
21:24:820Paolo Guiotto: Well, let's add this. I don't know why it's written like that, but let's see. Question two: determine first
21:40:430Paolo Guiotto: the projection on U. Question three: determine the orthogonal of U.
21:49:990Paolo Guiotto: Okay, so let's examine this case. First of all, let's look at the first question:
21:56:840Paolo Guiotto: is U a closed subspace?
21:59:600Paolo Guiotto: Well, the first thing to be checked is, you see, there is a condition here. It says: I take f in H such that the integral equals 0.
22:08:440Paolo Guiotto: We have to verify that the integral makes sense. Otherwise, the condition as written is nonsense, okay? What does it mean? We have to show that if f is in the space, namely in L2, that integral is well-defined. Otherwise, the condition is not well-defined. So, take f in L2
22:29:850Paolo Guiotto: (0,1). Now, when does the integral make sense?
22:40:400Paolo Guiotto: The integral makes sense when the function is
22:46:180Paolo Guiotto: L1, okay? So, the goal should be: show that f is in L1, okay?
22:54:130Paolo Guiotto: So, to have the integral between 0 and 1 of f well defined,
23:03:960Paolo Guiotto: we need to show the following.
23:09:720Paolo Guiotto: You know that here we do not have continuous functions, so we cannot just say it is continuous, hence it is integrable, okay? We have integrable functions, so this is a little bit delicate. We know that, in general, you can be in L2 and not be in L1, okay? It happens. But in this case, as we will see, it is not possible.
23:32:370Paolo Guiotto: We need to show that f is in L1(0,1),
23:38:440Paolo Guiotto: which means that the integral between 0 and 1 of the modulus of f is finite,
23:49:880Paolo Guiotto: and we need to show it knowing that
23:55:430Paolo Guiotto: the integral between 0 and 1 of the modulus of f squared, dx, is finite.
24:01:660Paolo Guiotto: This was one of the exercises I left. I don't know if you tried. It is a general fact, indeed.
24:10:350Paolo Guiotto: It says that when the measure of the space is finite, L2 is contained in L1, and actually the L2 norm is stronger than the L1 norm. That was an exercise. However, I have published also the solutions. I hope that you are checking the solutions;
24:29:820Paolo Guiotto: I'm continuously updating them, so you have a bunch of exercises to do now, and I think the next step will be exercises on the convolution.
24:41:770Paolo Guiotto: These are apart from the homeworks, the exam exercises, and so on. However, if you have not seen it, well, the question is: how can I bound the L1 norm with the L2 norm?
24:56:670Paolo Guiotto: Okay? So…
25:00:430Paolo Guiotto: We may have seen this sometime. This is a sort of standard trick: I use the Cauchy-Schwarz inequality. I look at this as the product between f and 1,
25:12:560Paolo Guiotto: the modulus of f and 1. And then I apply the Cauchy-Schwarz inequality, which says it is less than or equal to the product of the two norms. In fact, I can write: it is less than or equal to the L2 norm of f times the L2 norm of 1.
25:31:570Paolo Guiotto: And the L2 norm of 1 is just the integral from 0 to 1 of 1 squared, dx, to the exponent one half. So in this case, it is equal exactly to 1.
25:43:250Paolo Guiotto: So you see that this bounds the L1 norm of f.
25:47:290Paolo Guiotto: So it means that when the L2 norm of f is finite, the L1 norm of f is finite, in this case. This is because the domain has a finite measure.
25:57:490Paolo Guiotto: This can be extended to the general case, okay? So…
26:02:190Paolo Guiotto: In general, this is the statement of the fact:
26:06:630Paolo Guiotto: if the measure of the space is finite,
26:12:750Paolo Guiotto: the L1 norm is always controlled by the L2 norm up to a constant, which turns out to be the square root of the measure of X. This inequality holds.
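The inequality just stated can be checked numerically (an illustration, not a proof): on X = [0,1] the measure is 1, so the bound reads: L1 norm of f at most the L2 norm of f. Here the integrals are approximated by Riemann sums on a grid; the grid and sample functions are my own choices.

```python
# Numerical check of ||f||_1 <= sqrt(mu(X)) * ||f||_2 on X = [0,1], mu(X) = 1.
import numpy as np

x = np.linspace(0.0, 1.0, 100_000, endpoint=False)
dx = 1.0 / len(x)

def norm_L1(values):
    return float(np.sum(np.abs(values)) * dx)             # approximates the integral of |f|

def norm_L2(values):
    return float(np.sqrt(np.sum(np.abs(values) ** 2) * dx))

samples = [np.sin(x), np.exp(x), x ** 2 - 0.5]
checks = [norm_L1(v) <= norm_L2(v) + 1e-9 for v in samples]  # sqrt(mu([0,1])) = 1
assert all(checks)
```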
26:26:960Paolo Guiotto: Okay, so now this says that that condition makes sense, no?
26:33:470Paolo Guiotto: So we don't have to put any further restriction.
26:37:430Paolo Guiotto: And so let's see if this U is a closed subspace of H.
26:46:180Paolo Guiotto: Here
26:47:850Paolo Guiotto: you can proceed directly: check that a linear combination of vectors of U stays in U, and then that it is closed, proving it directly.
27:01:290Paolo Guiotto: But if I am a little bit smart, here I may notice that this condition means what? It can be seen as:
27:13:700Paolo Guiotto: look at this f,
27:16:300Paolo Guiotto: and look at this scalar product,
27:19:630Paolo Guiotto: between f
27:23:170Paolo Guiotto: and 1: it equals 0, exactly. So we notice that our U
27:39:310Paolo Guiotto: is the set of f
27:41:930Paolo Guiotto: in H such that the scalar product between f and 1, in our L2 product, is 0.
27:50:900Paolo Guiotto: So these are the functions
27:53:30Paolo Guiotto: which are orthogonal to 1.
27:56:30Paolo Guiotto: So this is exactly the orthogonal
28:00:00Paolo Guiotto: of the set made of 1.
28:07:290Paolo Guiotto: Okay, so we are, not yet exactly, but we are somehow in this case, no? It says the projection on the orthogonal is f minus the projection on U. Why this?
28:24:250Paolo Guiotto: Because, remember, we computed, and we have done a couple of examples last time, where we computed the orthogonal projection on certain subspaces.
28:38:600Paolo Guiotto: And in those cases, in those examples, we had to compute the orthogonal projection on some finite-dimensional spaces.
28:49:850Paolo Guiotto: And then the strategy is clear, no? If the space is generated by a number of vectors,
28:56:800Paolo Guiotto: the orthogonality condition, that is, the condition that should be verified by the orthogonal projection, reduces to a system of conditions, no? Your vector is a linear combination of these vectors. So, suppose that you have 10 vectors that generate, just 10 linearly independent vectors. You know that the orthogonal projection will be a linear combination of these 10 vectors.
29:20:920Paolo Guiotto: So you have to identify the 10 coefficients.
29:24:90Paolo Guiotto: How do you identify them? Well, you impose the orthogonality condition. And it is sufficient to impose the orthogonality with the vectors of the basis,
29:34:50Paolo Guiotto: because by linearity you then have it for all vectors there, no? And this provides
29:39:290Paolo Guiotto: exactly 10 conditions, since we have 10 parameters in this example. So 10 linear equations, or let's say elementary equations without any particular technical difficulty, that, once solved, provide the coefficients, no?
29:57:00Paolo Guiotto: So this is the general case: whenever you have a space which is generated by a finite number of vectors, the problem consists in determining that particular linear combination of vectors
30:10:940Paolo Guiotto: by solving this linear system, okay? So this was the case in the first example; the second example was different, okay?
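The finite-dimensional recipe just described can be sketched numerically (names and data are illustrative, not from the lecture): write the projection as a linear combination of the generators, impose orthogonality of the residual against each generator, and solve the resulting linear system, whose matrix is the Gram matrix of the generators.

```python
# Illustrative: project f onto span{v_1, ..., v_k} by solving the Gram system
# G a = b, with G_ij = <v_j, v_i> and b_i = <f, v_i>, exactly the "k conditions
# for k coefficients" described in the lecture.
import numpy as np

def project(f, generators):
    V = np.array(generators, dtype=float)    # rows v_1, ..., v_k
    G = V @ V.T                              # Gram matrix of the generators
    b = V @ f                                # right-hand side <f, v_i>
    a = np.linalg.solve(G, b)                # k linear equations, k unknowns
    return V.T @ a                           # the combination sum_j a_j v_j

v1, v2 = [1.0, 0.0, 1.0], [0.0, 1.0, 1.0]
f = np.array([1.0, 2.0, 4.0])
p = project(f, [v1, v2])

# The residual f - P f is orthogonal to every generator, hence to the span:
assert np.allclose([np.dot(f - p, v1), np.dot(f - p, v2)], 0.0)
```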
30:19:430Paolo Guiotto: But in the second example, which was not finite-dimensional, we only had to check, not to determine. In this case, what can be done? Well, if you look at the set U, the set U is the orthogonal of a set made of one single vector.
30:38:200Paolo Guiotto: And you may imagine that if you are perpendicular to 1, you will be perpendicular to, well, a set made of only 1 is not a subspace, it's not a linear subspace, no? It's just one single vector.
30:51:320Paolo Guiotto: If you have one single vector and you want it to be a linear subspace, it must be zero, no, because the zero is always in a subspace. Now, the subspace generated by this is what we call the span of 1.
31:07:410Paolo Guiotto: So, I claim that this is the case because, of course, if you are perpendicular to 1, you are perpendicular to every function which is proportional to 1. By the way, the functions proportional to 1 are the constants, no?
31:21:710Paolo Guiotto: And in fact, you can see it here, because if you replace 1 by a constant, you have the integral of f of x times a constant:
31:32:170Paolo Guiotto: it is zero.
31:35:490Paolo Guiotto: So, basically, we have that our U is the orthogonal of this linear space, and this one is generated by one function. So it will be easy to determine the orthogonal projection on that space.
31:51:270Paolo Guiotto: So, I can determine the projection on this one.
31:55:770Paolo Guiotto: And now the idea is: let's use this relation to determine the projection on our U, because it is easy to compute this one, and I want this one, you see?
32:10:930Paolo Guiotto: You see what the point is? So let's compute. Well, let's give a name:
32:21:90Paolo Guiotto: let's call this space, I don't know, U tilde. Let U tilde be the space generated by 1,
32:31:940Paolo Guiotto: which is a finite-dimensional, even one-dimensional,
32:40:980Paolo Guiotto: subspace.
32:43:990Paolo Guiotto: Okay, and let's compute
32:54:460Paolo Guiotto: the orthogonal projection on this U tilde
32:58:590Paolo Guiotto: of f.
33:00:530Paolo Guiotto: Then, by what we said above,
33:06:170Paolo Guiotto: the orthogonal projection on U tilde perpendicular,
33:14:250Paolo Guiotto: since a finite-dimensional space is closed,
33:20:950Paolo Guiotto: we know, by that general fact, will be f minus P U tilde of f.
33:29:260Paolo Guiotto: So, if you want the projection on the orthogonal, you just do this.
33:34:170Paolo Guiotto: So, let's compute it. Now, P U tilde of f is an element of U tilde.
33:42:470Paolo Guiotto: In this case, there is only one vector that generates, so it will be something like alpha times 1. Let's put alpha star times 1. So the unique point is to determine this alpha star,
33:57:30Paolo Guiotto: where alpha star is determined, if you want,
34:09:179Paolo Guiotto: by imposing
34:11:780Paolo Guiotto: the orthogonality, which says: f minus P U tilde of f, scalar u tilde,
34:21:870Paolo Guiotto: equals 0 for every u tilde in capital U tilde. But U tilde is generated by
34:30:739Paolo Guiotto: a single vector, 1, so it is sufficient to check this condition on the generator of the space: f minus, we said that P U tilde of f is alpha star times 1, scalar 1, equals 0.
34:47:639Paolo Guiotto: So from this, you see, we can say that f scalar 1 minus alpha star times
34:57:410Paolo Guiotto: 1 scalar 1 equals 0. Now, what is 1 scalar 1? 1 scalar 1 is the integral between 0 and 1 of 1 times 1, dx, so it is equal to 1, and we get that alpha star is just f scalar 1.
35:16:610Paolo Guiotto: So now we have the formula for the orthogonal projection. So, P U tilde of
35:23:680Paolo Guiotto: f is f scalar 1,
35:28:670Paolo Guiotto: times 1. So, it is the integral from 0 to 1 of f, dx, this is the scalar product, times 1.
35:38:280Paolo Guiotto: That is, it is this number. It is a number because U tilde is a linear space generated by a constant: it is a space of constants, so it is basically the real line, and that is the value of the orthogonal projection. So, about U,
35:56:230Paolo Guiotto: we will have that P U of f is f minus, sorry, yes:
36:08:960Paolo Guiotto: our U is the orthogonal of U tilde, so P U of f is P U tilde perpendicular of f, which is equal to f minus P U tilde of f, so f minus the integral between 0 and 1 of f.
36:28:820Paolo Guiotto: So this is the orthogonal projection
36:36:320Paolo Guiotto: of f
36:37:860Paolo Guiotto: on U.
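The formula just derived, projection of f on U equals f minus the integral of f over (0,1), can be checked numerically. This is an illustration: the discretization of L2(0,1) by a grid and left Riemann sums is my own device, not part of the exercise.

```python
# Illustrative check: in L2(0,1) with U = {f : integral of f = 0},
# P_{U-tilde} f = (integral of f) * 1 and P_U f = f - P_{U-tilde} f.
import numpy as np

x = np.linspace(0.0, 1.0, 100_000, endpoint=False)
dx = 1.0 / len(x)

def integral(values):
    return float(np.sum(values) * dx)       # approximates the integral on (0,1)

f = np.exp(x)                    # any f in L2(0,1); e^x is just a sample choice
mean_f = integral(f)             # alpha star = f scalar 1, a constant function
p_U = f - mean_f                 # P_U f = f - P_{U-tilde} f

assert abs(integral(p_U)) < 1e-9                    # P_U f lies in U
assert abs(integral((f - p_U) * p_U)) < 1e-6        # the two components are perpendicular
```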
36:43:460Paolo Guiotto: So, what was it? Determine the orthogonal projection on U,
36:50:60Paolo Guiotto: and determine the U perpendicular.
36:53:970Paolo Guiotto: What is U perpendicular?
36:59:480Paolo Guiotto: Point three.
37:00:620Paolo Guiotto: Now, we say that,
37:09:250Paolo Guiotto: well, basically here, we said that U is the orthogonal of the space, or rather of the set, generated by 1, no? The orthogonal of this subspace. So we proved that
37:23:930Paolo Guiotto: U is the orthogonal of the span of 1.
37:29:870Paolo Guiotto: So the orthogonal of U will be the orthogonal of the orthogonal of this span of 1:
37:37:990Paolo Guiotto: the double orthogonal.
37:39:400Paolo Guiotto: Now…
37:43:340Paolo Guiotto: Is it true that, with this double orthogonal, if I start with something, take its orthogonal, and then the orthogonal of this orthogonal, I come back to the original? If so, this should be the span of 1.
37:59:580Paolo Guiotto: Now, there is something here that we should answer. So, this holds provided,
38:10:160Paolo Guiotto: provided that, let's say, U tilde orthogonal orthogonal is U tilde.
38:20:280Paolo Guiotto: When will this be true?
38:27:540Paolo Guiotto: Okay, let's see. So…
38:40:290Paolo Guiotto: Okay, so there is a condition.
38:45:210Paolo Guiotto: If you want to have this identity, then U tilde must be the orthogonal of something, and we know that the orthogonal of whatever set is a closed subspace. So this identity, where on one side you take orthogonals and get a closed subspace, cannot hold for whatever U tilde.
39:03:640Paolo Guiotto: It must be a closed linear subspace, otherwise the two sides cannot be equal. You see? So, for example, this cannot hold with the set made of just the vector 1, because the set made of the vector 1 is not a linear subspace,
39:21:110Paolo Guiotto: while its double orthogonal is, so the equality would not be correct.
39:26:900Paolo Guiotto: So, necessarily, this holds only if U tilde is a closed
39:37:670Paolo Guiotto: linear subspace.
39:42:840Paolo Guiotto: So, we expect that this should be the case here.
39:47:920Paolo Guiotto: Let's prove this fact.
39:52:10Paolo Guiotto: Indeed:
39:56:990Paolo Guiotto: So, we prove, I guess, a double implication.
40:03:550Paolo Guiotto: So, let's take a vector f which is in the double orthogonal.
40:21:410Paolo Guiotto: I'm wondering if there is any reason to say that this direction is trivial.
40:32:310Paolo Guiotto: Well, let's see what happens. If you are in the double orthogonal, it means that f scalar,
40:43:320Paolo Guiotto: say, v tilde is equal to zero, for every v tilde in U tilde perpendicular.
41:08:970Paolo Guiotto: I mean,
41:13:500Paolo Guiotto: yes, I have to prove that f is in U tilde,
41:19:920Paolo Guiotto: because the other way is probably the trivial one, the way where f is in U tilde.
41:28:310Paolo Guiotto: Yes, okay, let's see the trivial part. The trivial fact is: if f is in U tilde,
41:36:250Paolo Guiotto: then I do the product between f and v tilde, with v tilde in U tilde perpendicular.
41:48:780Paolo Guiotto: Since I want to show that f belongs to this double orthogonal, I have to prove that f is perpendicular to every vector of the orthogonal.
41:58:330Paolo Guiotto: So I have to show that this is zero.
42:01:320Paolo Guiotto: Okay? Now, this is evident, because since v tilde is in U tilde perpendicular, what does it mean? It means that v tilde, in scalar product with u tilde, is 0 for every u tilde in
42:19:100Paolo Guiotto: U tilde, capital U tilde. And since my f is in U tilde, this means that this product is 0. So this means that from f in U tilde, I get f in
42:33:770Paolo Guiotto: the orthogonal of the orthogonal of U tilde. So this inclusion is immediate;
42:43:960Paolo Guiotto: it comes from the definition.
42:45:480Paolo Guiotto: Okay, so this is the
42:47:750Paolo Guiotto: proof of one inclusion. The difficult one is the vice versa. So take f in U tilde
42:56:260Paolo Guiotto: perpendicular,
42:58:540Paolo Guiotto: perpendicular, in the double perpendicular. Then you know that f scalar v
43:07:320Paolo Guiotto: is equal to 0 for every v in U tilde perpendicular.
43:13:180Paolo Guiotto: So, you know this, and the goal is: f is in U tilde.
43:30:150Paolo Guiotto: Okay, so how to show this?
43:40:540Paolo Guiotto: If it were false, what would happen? If f
43:46:980Paolo Guiotto: is not in U tilde…
44:42:730Paolo Guiotto: So I have a space U tilde,
44:47:610Paolo Guiotto: and I have, let's say, the orthogonal,
44:51:260Paolo Guiotto: U tilde perpendicular.
45:06:510Paolo Guiotto: So, I want to show that this is the implication
45:11:700Paolo Guiotto: We have to show that if F is in the double orthogoland, then F is in U.
45:17:620Paolo Guiotto: Now, take an F, which is in the orthogonal of your perpendicular. If you look at the figure.
45:24:270Paolo Guiotto: This app cannot be… let's say here, no? It must be here on the UT displays.
45:31:590Paolo Guiotto: However, since, well, let's say that we don't need to argue by contradiction. If we take an F, we just say that we can always decompose that F into components, the projection on U plus the projection on U perpendicular, right?
45:49:790Paolo Guiotto: No? These are the two components, the two vectors. This is the projection on u tinda of F, and this is the projection on u tilde perpendicular of F.
46:04:760Paolo Guiotto: Right.
46:05:690Paolo Guiotto: Now, if I prove that
46:09:110Paolo Guiotto: this component is equal to zero, then automatically F coincides with its orthogonal projection on U tilde, and this in particular means that F is in U tilde. So the goal is to prove that this is zero.
46:27:460Paolo Guiotto: And… Now, this is a vector of U tilde Perpendicular, right?
46:37:470Paolo Guiotto: So, let's do the product of this by a generic vector of U tilde.
46:44:280Paolo Guiotto: perpendicular.
46:47:270Paolo Guiotto: So this is the vector I called here V, okay? So let's take a V, which is V tilde, which is in U tilde perpendicular, and let's calculate this product.
47:01:740Paolo Guiotto: Now, when we do this, the product is additive, whatever it is, real or complex. So this comes
47:10:660Paolo Guiotto: P U tilde F, scalar V tilde, plus
47:18:830Paolo Guiotto: P U tilde perpendicular F, scalar V tilde.
47:24:600Paolo Guiotto: But…
47:26:160Paolo Guiotto: my vector V tilde belongs to U tilde perpendicular, and this vector, P U tilde F, belongs to U tilde, so this first product is zero, right?
47:39:700Paolo Guiotto: No? Because if you are in U tilde perpendicular, when you multiply by a vector of U tilde, you get 0. So this is 0. And this means that this is equal to…
47:50:170Paolo Guiotto: P, U tilde, perpendicular F, scalar V tilde.
47:59:370Paolo Guiotto: And remember that,
48:02:590Paolo Guiotto: since we are assuming that our F is itself a vector of the perpendicular of U tilde perpendicular, F belongs to U tilde perpendicular perpendicular. So F is orthogonal to V tilde, and so F scalar V tilde is zero.
48:23:770Paolo Guiotto: So what we get, in conclusion, is that the product of the projection on U tilde perpendicular of F, scalar V tilde, is 0 for every vector V tilde of
48:38:510Paolo Guiotto: U tilde perpendicular. Now, as vector, take
48:45:350Paolo Guiotto: V tilde exactly equal to the projection on U tilde perpendicular of F, which belongs to that space, since it is the projection. What you read there is that the vector P U tilde perpendicular F, scalar itself,
49:06:340Paolo Guiotto: is zero.
49:08:230Paolo Guiotto: But this scalar product is exactly the square of the norm of P U tilde perpendicular F.
49:16:260Paolo Guiotto: So if this is 0, this means that P U tilde perpendicular F must be 0.
49:23:610Paolo Guiotto: And therefore, returning back to the decomposition, since this vector is 0, now we get that
49:32:500Paolo Guiotto: So, F is equal to the projection on u tilde of F, and this belongs to U tilde.
49:42:190Paolo Guiotto: Questions, please.
49:47:60Paolo Guiotto: So, in fact, this seems to be an important proposition. So, we proved that, if…
49:58:630Paolo Guiotto: Well, actually, to show this, what we used is that V with the scalar product
50:07:340Paolo Guiotto: is a Hilbert space.
50:12:500Paolo Guiotto: And, well, we did it for U tilde, right?
50:17:920Paolo Guiotto: And U is
50:21:140Paolo Guiotto: a closed
50:24:490Paolo Guiotto: subspace of V; then the double perpendicular of U coincides with U.
50:36:610Paolo Guiotto: That's what we have shown here.
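[Editor's note] A minimal numerical sketch of this proposition and of the decomposition used in the proof. The finite-dimensional model and all names are illustrative, not from the lecture: V = R^3 with the Euclidean inner product, U the xy-plane, so U perpendicular is the z-axis.

```python
# Sketch: F = P_U F + P_{U-perp} F, and a vector orthogonal to U-perp
# (i.e. in (U-perp)-perp) has zero U-perp component, hence lies in U.

def dot(f, g):
    return sum(a * b for a, b in zip(f, g))

def proj(f, onb):
    """Orthogonal projection of f onto span(onb); onb assumed orthonormal."""
    out = [0.0] * len(f)
    for e in onb:
        c = dot(f, e)  # Fourier coefficient of f along e
        out = [x + c * y for x, y in zip(out, e)]
    return out

U = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]   # orthonormal basis of U
U_perp = [[0.0, 0.0, 1.0]]               # orthonormal basis of U-perp

F = [2.0, -1.0, 3.0]
print(proj(F, U))       # [2.0, -1.0, 0.0], the component in U
print(proj(F, U_perp))  # [0.0, 0.0, 3.0], the component in U-perp

# G is orthogonal to every vector of U_perp, so G is in (U-perp)-perp;
# its U-perp component vanishes and G = P_U G, i.e. G is in U:
G = [5.0, 7.0, 0.0]
print(proj(G, U_perp))  # [0.0, 0.0, 0.0]
```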
50:41:460Paolo Guiotto: So, returning back to the exercise, all this arise… arose from the question, what is the orthogonal? And the orthogonal of U, in this case, is the space generated by 1, so it's the space of constant. So, going back to the exercise.
51:07:680Paolo Guiotto: In the exercise,
51:14:160Paolo Guiotto: U perpendicular is the span
51:19:280Paolo Guiotto: of 1, so it's the set of constants: the alpha, with alpha real, in this case.
51:32:930Paolo Guiotto: Okay.
51:38:540Paolo Guiotto: So, let me give you an exercise: there are similar problems on the exam exercise sets.
52:51:550Paolo Guiotto: Do exercise 10 from the exam sets.
53:08:380Paolo Guiotto: So we continue until the end, right?
53:14:890Paolo Guiotto: Okay, so, the next topic is the
53:28:300Paolo Guiotto: definition of an orthonormal
53:41:750Paolo Guiotto: basis.
53:45:400Paolo Guiotto: So, you know what is the concept of a basis for a vector space. It is a set of vectors.
53:54:140Paolo Guiotto: And normally it is a finite set of vectors for which we can express any vector as a finite linear combination of these, no?
54:03:210Paolo Guiotto: In linear algebra.
54:09:730Paolo Guiotto: So, a basis for a vector space V is a set — let's say, for the moment we call them v 1, …, v d — of linearly independent vectors.
54:47:290Paolo Guiotto: Linearly independent means the only possible linear combination of them that equals zero is the combination with all coefficients equal to zero.
54:56:360Paolo Guiotto: It is a set of linearly independent vectors for which,
55:05:900Paolo Guiotto: for every vector v of this space, there exist coefficients, say c 1,
55:17:660Paolo Guiotto: …, c d — real or complex, this depends on what is
55:23:320Paolo Guiotto: the set of scalars — such that the vector v can be expressed as
55:29:550Paolo Guiotto: the linear combination, sum for j from 1 to d of c j v j.
55:36:90Paolo Guiotto: Now, if they are linear independent, this combination is unique.
55:41:290Paolo Guiotto: Now, this definition works nicely for finite-dimensional spaces.
55:46:80Paolo Guiotto: For infinite-dimensional spaces, it is not the good definition.
55:53:840Paolo Guiotto: Because, first of all, since the space is infinite-dimensional, the set of vectors will be infinite. And the problem is what? If you pretend to represent every vector with a finite combination of vectors,
56:10:670Paolo Guiotto: the number of vectors you need is huge.
56:15:560Paolo Guiotto: Imagine that we work with functions, no?
56:19:320Paolo Guiotto: So, for example, we may imagine that,
56:24:310Paolo Guiotto: We can, if we use powers, no, 1, X, X squared, X cubed, and so on, by doing a combination of these functions, we can just get the class of the polynomials.
56:36:50Paolo Guiotto: With finite combinations of these, we will never leave this class. Imagine that now I take a function
56:44:730Paolo Guiotto: which is not a polynomial — for example, the exponential. There is no way of representing the exponential as a finite sum of powers, so…
56:53:580Paolo Guiotto: Either I use an infinite sum, or I must add the exponential to the class of powers — and then another function, and another. So, in this way, I should use a huge basis, probably an uncountable basis.
57:13:180Paolo Guiotto: Well, we know how to deal with infinite sums, which are not as easy as finite sums, because a finite sum is just an algebraic operation,
57:24:390Paolo Guiotto: while an infinite sum is a limit of finite sums, so it is more complex.
57:35:220Paolo Guiotto: But we know how to handle them; we have a definition. So it is better to consider the smallest possible family of vectors,
57:47:130Paolo Guiotto: but using a larger way to sum — summing maybe infinitely many of them. So, that's why…
57:58:850Paolo Guiotto: We say this — well, I don't give a definition yet. If the dimension
58:11:330Paolo Guiotto: of V is infinite,
58:14:790Paolo Guiotto: it is convenient
58:28:100Paolo Guiotto: to consider
58:33:890Paolo Guiotto: infinite
58:39:90Paolo Guiotto: linear
58:40:410Paolo Guiotto: combinations
58:47:480Paolo Guiotto: of vectors.
58:51:290Paolo Guiotto: So, something like,
58:53:440Paolo Guiotto: sum… now, that would be something like sumj goes from 1 to infinity, CJ, VJ,
59:02:390Paolo Guiotto: Now, this is an infinite sum, so we have never given a definition here, so let's just open a parenthesis on what is the meaning of this infinite sum.
59:16:10Paolo Guiotto: This is
59:23:50Paolo Guiotto: an infinite sum, or better — you know that we use the term — a series
59:35:170Paolo Guiotto: of vectors.
59:38:470Paolo Guiotto: So, let's just give a definition of what this operation means.
59:44:640Paolo Guiotto: So, definition… This definition is for a generic norm space, so V…
59:50:550Paolo Guiotto: With a certain norm, a normed… space.
59:57:30Paolo Guiotto: We take a sequence of vectors, FN in V, We define the infinite sum.
00:05:410Paolo Guiotto: For N going… oh, let's keep using the letter J. For J going…
00:11:780Paolo Guiotto: from 1 to infinity of FJ, so we treat this as vectors FJ.
00:20:970Paolo Guiotto: Well, exactly as you do with the infinite sum of numbers, what you do, you do the finite sum, sum for j, that goes from 1 to sum n of these vectors of j, and this is well-defined because it is a finite sum, no?
00:36:190Paolo Guiotto: Then you take the limit when n goes to infinity.
00:41:460Paolo Guiotto: And now, the point is that this is the limit of a sequence of vectors, no? So, it's something for which we have a definition of what the limit means, no? So, if we call these SN, the SN themselves form a sequence of vectors of V, so this is the limit
01:05:470Paolo Guiotto: in V of the sequence, SN.
01:14:630Paolo Guiotto: So, the limit is, is a certain S in V,
01:22:100Paolo Guiotto: If and only if the distance between SN and S goes… goes to zero.
01:32:530Paolo Guiotto: when n goes to infinity. This is the definition. So, let's say that, at least formally, to define an infinite sum is relatively easy. The operations I do are: first, I compute a partial sum, which is this S N, the
01:52:240Paolo Guiotto: partial
01:54:750Paolo Guiotto: sum.
01:57:350Paolo Guiotto: This is an operation that you can do, because a finite sum of vectors is a vector. And then, once you have done this, you do the limit for n going to plus infinity.
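[Editor's note] A minimal sketch of the definition just given, with example data of my choosing: V = R^2 with the Euclidean norm and f_j = (1/2^j, 1/3^j), whose series converges to S = (1, 1/2) (two geometric series). The series is the limit of the partial sums S_N.

```python
import math

def norm(v):
    """Euclidean norm on R^2."""
    return math.sqrt(sum(x * x for x in v))

S = (1.0, 0.5)          # the known limit of the two geometric series
partial = [0.0, 0.0]    # the partial sum S_N, built term by term
for j in range(1, 60):
    f_j = (0.5 ** j, (1.0 / 3.0) ** j)
    partial = [p + x for p, x in zip(partial, f_j)]

# convergence means exactly ||S_N - S|| -> 0 in the norm of V
print(norm([p - s for p, s in zip(partial, S)]))  # very close to 0
```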
02:08:970Paolo Guiotto: This, in principle, seems easy, but in practice is extremely difficult, because even for numerical sequences, you have already seen that numerical series
02:21:120Paolo Guiotto: in general, you can hardly compute the sum or establish its convergence
02:29:840Paolo Guiotto: by doing these steps, no? You compute partial sums, and then you do the limit of partial sums. This can be done in very exceptional cases. One remarkable case is the geometric series, as you may remember.
02:43:920Paolo Guiotto: But apart from a few cases, this is basically impossible. So, in general, you may remember that to study convergence, there are a number of tests — checks that you do on the generic term of the series, relatively easy checks — that do not require computing these partial sums, because this is a very difficult,
03:08:470Paolo Guiotto: If not impossible task.
03:10:390Paolo Guiotto: Now, something similar happens here. There is a very important test, which is the following proposition. This is called the Weierstrass test.
03:27:530Paolo Guiotto: That says that if the space is a Banach space.
03:41:420Paolo Guiotto: And… You know that this series
03:47:890Paolo Guiotto: for J going from 1 to infinity of norms of FJ,
03:54:40Paolo Guiotto: is convergent. Now, the advantage here is that you have a numerical series, so that in principle you can use the tools you know for numerical series. The only point is that you should compute this quantity, the norm of FJ. So let's assume that you compute the norm of FJ.
04:13:690Paolo Guiotto: Now, if you know that the sum of the series of norms is convergent — this is a series convergent
04:23:200Paolo Guiotto: in R, so that's a numerical series, not a series of vectors — then your series
04:31:180Paolo Guiotto: sum for J going from 1 to infinity of FJ
04:38:860Paolo Guiotto: converges
04:40:420Paolo Guiotto: in V.
04:47:190Paolo Guiotto: Well, let's see the little proof.
04:51:900Paolo Guiotto: And the idea is that, since we are in a Banach space, what we are going to prove is not that the limit exists because we can't compute it, but because we can show that the sequence of partial sums is a Cauchy sequence.
05:06:80Paolo Guiotto: So, the existence of the limit will be granted by completeness — the fact that Banach space means Cauchy sequences are convergent.
05:14:810Paolo Guiotto: So, let's check that the sequence SN of partial
05:30:130Paolo Guiotto: sums
05:33:540Paolo Guiotto: is convergent.
05:37:390Paolo Guiotto: Well, take SN and SM, you do the difference, and you assess the norm of this.
05:45:690Paolo Guiotto: Now, to fix ideas, we take two indexes. Well, if they are the same, you get zero. So let's say they are different, and, for example, n is larger than m.
06:01:40Paolo Guiotto: So, since Sn is the sum of all the vectors between 1 and n, and Sm is the sum of all the vectors between 1 and m, when you do the difference, you cancel the common vectors, and you get
06:15:20Paolo Guiotto: the sum of vectors from M plus 1 to n of the FJ.
06:23:130Paolo Guiotto: Now, from the triangular inequality, this is less or equal than the sum of the norms.
06:33:810Paolo Guiotto: And I can see this sum, again, as a difference, precisely as the difference between sum for J that goes from 1 to N of norm of FJ, minus where I subtract the sum from 1 to M of norms of this FJ, you see?
06:54:350Paolo Guiotto: Now, call this, I don't know, sigma N, call this, of course, sigma M,
07:01:690Paolo Guiotto: So you have this bound: the norm of SN minus SM is less or equal than sigma n
07:13:890Paolo Guiotto: minus sigma m.
07:18:940Paolo Guiotto: I can actually write absolute value of this, because this quantity is positive.
07:28:430Paolo Guiotto: You see?
07:37:430Paolo Guiotto: Because this difference, this guy here, is a sum of positive numbers.
07:37:430Paolo Guiotto: Now, where is the point? Is that sigma n is exactly the partial sum of this series, you see?
07:46:130Paolo Guiotto: So, since the series of norms of FJ is convergent, and this is a numerical series,
07:59:900Paolo Guiotto: its partial sums
08:10:530Paolo Guiotto: are exactly the sigma n — the sequence of the sigma n.
08:19:420Paolo Guiotto: Now, convergence for a series means convergence for the sequence of partial sums, so it means that sigma
08:28:950Paolo Guiotto: converges, has some limit, let's say sigma,
08:34:340Paolo Guiotto: when n goes to infinity.
08:37:420Paolo Guiotto: In particular, this sigma n itself
08:42:40Paolo Guiotto: is Cauchy.
08:45:100Paolo Guiotto: So, you can write the Cauchy property: for every epsilon positive, there exists an initial index N such that the distance between sigma n and sigma m is less or equal than epsilon for every n and m larger
09:02:20Paolo Guiotto: then N… And because of this bound here, this says that also the distance between SN and SM
09:14:500Paolo Guiotto: in norm will be less or equal than epsilon for every n and m larger than capital N. So this says that the sequence of the Sn is Cauchy — but now not in R, because this is a sequence of vectors in V.
09:32:200Paolo Guiotto: And therefore, since V is Banach, there exists the limit
09:42:180Paolo Guiotto: of the SN in V, and this exactly means convergence.
09:47:729Paolo Guiotto: Okay.
09:49:130Paolo Guiotto: So this is an important test of convergence for a series of vectors.
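[Editor's note] A numerical illustration of the Weierstrass test; the example vectors are my choice, not from the lecture. In V = R^2, take f_j = ((-1)^j / j^2, 1/2^j): the series of norms is dominated by Σ(1/j^2 + 1/2^j), which converges, so the vector series must converge, and the partial sums S_n visibly stabilize (Cauchy behaviour).

```python
import math

def norm(v):
    return math.sqrt(sum(x * x for x in v))

sigma = 0.0        # partial sums of the numerical series of norms
S = [0.0, 0.0]     # partial sums of the vector series
history = []       # keep each S_n to compare far-apart partial sums
for j in range(1, 2001):
    f_j = ((-1) ** j / j ** 2, 0.5 ** j)
    sigma += norm(f_j)
    S = [s + x for s, x in zip(S, f_j)]
    history.append(tuple(S))

# the series of norms stays bounded, and far-apart partial sums are close:
tail_gap = norm([a - b for a, b in zip(history[-1], history[1000])])
print(sigma)     # bounded above (series of norms converges)
print(tail_gap)  # small: ||S_2000 - S_1001|| is tiny
```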
09:56:900Paolo Guiotto: Now, this is a test for a generic Banach space.
10:04:110Paolo Guiotto: Now, if the space is a space with an inner product, we can even get a characterization for convergence, so a test which is an if-and-only-if test.
10:16:590Paolo Guiotto: So the difference with the Weierstrass test is that the Weierstrass test is a sufficient condition: if the series of norms is convergent, then you have convergence of the series of vectors.
10:27:910Paolo Guiotto: But if this condition is not verified, you don't know. However, this is the unique general condition you can have on a norm space. If the space has an inner product, so the norm is induced by an inner product, we can do better. So this is the following statement.
10:46:40Paolo Guiotto: Well, let's say that if the norm
10:56:830Paolo Guiotto: is induced
11:02:550Paolo Guiotto: by an inner product,
11:14:100Paolo Guiotto: we can
11:17:50Paolo Guiotto: give
11:20:350Paolo Guiotto: a better characterization.
11:30:840Paolo Guiotto: Which is the following.
11:35:70Paolo Guiotto: So, let's say that the space will be now a space with an inner product, and again, it should be complete. So, in this case, a Hilbert space.
11:52:470Paolo Guiotto: We consider a series of vectors,
11:58:960Paolo Guiotto: And, with the particular feature that these vectors are perpendicular.
12:04:440Paolo Guiotto: Such that,
12:06:690Paolo Guiotto: FI is perpendicular to FJ for every iJ, of course, i different from J. So this means the scalar product between Fi and FJ is equal to 0 for I different from J.
12:25:540Paolo Guiotto: Then, for this type of series, we have…
12:29:930Paolo Guiotto: An if and only if test.
12:32:900Paolo Guiotto: which is the series of the FJ is convergent.
12:39:60Paolo Guiotto: in V, if and only if the series of their norms squared is finite.
12:50:240Paolo Guiotto: So it sounds like the Weierstrass test. The nice thing is that it is an if and only if.
12:56:130Paolo Guiotto: So it's a true-false: it's a characterization of convergence.
13:01:140Paolo Guiotto: But it requires this extra structure that we cannot
13:07:00Paolo Guiotto: have in a mere normed space: the orthogonality of vectors.
13:13:70Paolo Guiotto: Let's see the proof, which is based on the same idea. So we check that the sequence of partial sums of this series is a Cauchy sequence.
13:25:360Paolo Guiotto: We check
13:31:30Paolo Guiotto: that the sequence SN, made of the partial sums, sum for J from 1 to N of FJ, is Cauchy in V.
13:47:760Paolo Guiotto: So to do that, we compute the norm, again, of SN minus SM.
13:56:370Paolo Guiotto: Since we are in a, in an inner product case.
14:01:90Paolo Guiotto: you know that the norm of F comes from the root of the scalar product of F with F.
14:11:500Paolo Guiotto: So what we have to compute is this quantity, and, since there is the root, I take the square of this to eliminate the root. So this becomes the scalar product of SN minus SM with itself.
14:31:160Paolo Guiotto: As above, let's take N larger than M,
14:36:300Paolo Guiotto: So because when n is equal to M, the difference is 0, so the norm will be 0.
14:41:700Paolo Guiotto: And so it will be small. For N different from M, we choose N greater than M, otherwise we switch the two indexes. So the sum, SN minus SM, as above, comes the sum for J going from M plus 1
14:57:830Paolo Guiotto: to N of the FJ.
15:00:740Paolo Guiotto: So, when we replace this into this double product, we have a sum for J from M plus 1 to N of the FJ, and then I have the same sum, but here it is better to use another letter.
15:17:480Paolo Guiotto: So let's say sum for K going from M plus 1 to N of FK.
15:27:40Paolo Guiotto: Now, the product is additive in both factors, whatever is the kind of product, even for the Hermitian case, it is additive. So this means that we can carry out the sum, and we do this both sides, so we have this double sum.
15:42:40Paolo Guiotto: We have a sum of J and K, they both go from M plus 1,
15:47:950Paolo Guiotto: to N of the scalar product between FJ and FK.
15:54:840Paolo Guiotto: But this product, now, because, you see, it is here where the condition on the orthogonality enters, it is zero if J is different from K.
16:08:100Paolo Guiotto: It is equal to FJ scalar FJ, so norm of FJ squared, if J is equal to K.
16:17:440Paolo Guiotto: So it means that, huh?
16:19:530Paolo Guiotto: A lot of this… a lot of terms here are zero.
16:23:340Paolo Guiotto: What is saved is only the case when J is equal to K, so we reduce to a single sum, for J going from M plus 1 to N, and in that case we have FJ scalar FJ, which is norm of FJ squared.
16:39:280Paolo Guiotto: And here we have basically the same situation as before. Before, we had the norm of SN minus S,
16:48:50Paolo Guiotto: M, after the application of the triangular inequality, came less or equal than this type of sum: sum for J going from M plus 1 to N of the norm of FJ. You see?
17:01:410Paolo Guiotto: Here we have an exact identity. SN minus SM squared norm is equal to this, okay? So let's copy norm of SN minus SM squared equals sum
17:15:340Paolo Guiotto: for J going from M plus 1 to N, norm of FJ squared.
17:21:880Paolo Guiotto: And we do now the same kind of trick. We look at this as a difference between a sigma N and a sigma M, maybe to distinguish
17:34:550Paolo Guiotto: from the previous case, because it's not the same sigma, let's call it sigma tilde, where sigma tilde n is the sum for j going from 1 to n of the norm of FJ, but now squared.
17:50:120Paolo Guiotto: So when you do the difference, sigma tilde n minus sigma tilde m, you get exactly this guy.
17:57:800Paolo Guiotto: And so, since this, again, is positive, we can say that this is the absolute value of sigma tilde n minus sigma tilde m. So we get
18:12:560Paolo Guiotto: that norm of Sn minus Sm squared equals the modulus of sigma tilde n minus sigma tilde m.
18:23:580Paolo Guiotto: At this point, you basically say that the quantity on the left is small if and only if the quantity on the right is small.
18:32:100Paolo Guiotto: So, basically, this is a Cauchy sequence if and only if this is a Cauchy sequence. So, from this, we easily
18:49:950Paolo Guiotto: get that the sequence SN is Cauchy if and only if the sequence of the sigma tilde n is Cauchy.
19:02:400Paolo Guiotto: But since here we are, on the left-hand side, the sequence SN is in a Hilbert space, it is Cauchy, because
19:12:950Paolo Guiotto: in a Hilbert
19:14:700Paolo Guiotto: space,
19:17:450Paolo Guiotto: a sequence is Cauchy if and only if it is convergent. So, SN converges.
19:23:790Paolo Guiotto: And this exactly means that the series of vectors, FJ is, convergent.
19:33:610Paolo Guiotto: On the right-hand side, you have that sigma tilde n is Cauchy — but Cauchy in R, and R is
19:44:480Paolo Guiotto: Complete.
19:47:940Paolo Guiotto: so a Banach space itself. This means that the series of norms of FJ squared
19:57:930Paolo Guiotto: is convergent, because a Cauchy sequence in R must be convergent. And that's the conclusion. You see that this is convergent if and only if this is convergent.
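[Editor's note] A sketch of the orthogonal-series test just proved; the finite truncation and the dict model of little l2 are my devices. With pairwise orthogonal f_j, the Pythagorean identity gives ||S_n||^2 = Σ ||f_j||^2, which is exactly why the vector series converges if and only if the series of squared norms is finite.

```python
import math

# model (finitely supported) little-l2 elements as dicts {index: value}
def dot(f, g):
    return sum(v * g.get(k, 0.0) for k, v in f.items())

def norm(f):
    return math.sqrt(dot(f, f))

# f_j = (1/j) e_j: pairwise orthogonal, and sum ||f_j||^2 = sum 1/j^2 < infinity
# (by contrast f_j = (1/sqrt(j)) e_j would give the divergent sum of 1/j)
n = 500
S_n = {j: 1.0 / j for j in range(1, n + 1)}           # partial sum of the f_j
sum_sq = sum((1.0 / j) ** 2 for j in range(1, n + 1)) # partial sum of squared norms

# Pythagorean identity: ||S_n||^2 equals the sum of the squared norms
print(norm(S_n) ** 2)
print(sum_sq)
```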
20:18:160Paolo Guiotto: Okay, so now we are ready to introduce the definition of orthonormal basis for the space.
20:29:440Paolo Guiotto: So, let
20:35:410Paolo Guiotto: V be, for the moment, just an inner product space.
20:51:750Paolo Guiotto: A family —
20:58:110Paolo Guiotto: normally, we will use the letter e n,
21:03:960Paolo Guiotto: n in the naturals —
21:07:960Paolo Guiotto: is an ortho-
21:12:510Paolo Guiotto: normal
21:16:210Paolo Guiotto: basis
21:19:420Paolo Guiotto: of V if we can write every vector: for every
21:27:30Paolo Guiotto: F in V, there exist coefficients — now there will be infinitely many coefficients — so coefficients CN in the field of scalars, which can be R or C, such that F can be written as the infinite sum
21:45:180Paolo Guiotto: for J from 1 to infinity of CJ FJ — sorry, EJ.
21:55:870Paolo Guiotto: So, if we can express, always with the same vectors,
22:02:820Paolo Guiotto: any vector — in general through an infinite linear combination of these vectors.
22:12:260Paolo Guiotto: Now, for… let's see a first example, simple example.
22:18:940Paolo Guiotto: Which is not at all a trivial example, because, as I told you, this is basically the prototype of a Hilbert space. So let's take as a space…
22:32:140Paolo Guiotto: V be the little l2. Let me remind you, this is the set of sequences, F, FJ.
22:41:490Paolo Guiotto: contained, for example, in R, such that the sum of the FJ squared is finite. This is the little l2, which is the big L2 on the set of naturals
22:54:300Paolo Guiotto: with the sigma algebra of all the subsets of N, and with the counting measure, counting… measure.
23:04:110Paolo Guiotto: So it's a special case of a big L2 space.
23:10:00Paolo Guiotto: This scalar product is the standard scalar product for this, so for a vector FJ, scalar, a vector GJ,
23:19:440Paolo Guiotto: Well, now I know that I will confuse you with this kind of notations, because
23:27:120Paolo Guiotto: So you have to remind that this is… I have not written, this is a family of vector of V, okay?
23:35:200Paolo Guiotto: So these are vectors of the space, while the CJ are scalars. Okay, so here, if you have two vectors of that space — this is a vector F, this is a vector G — they both belong to the space little l2.
23:55:670Paolo Guiotto: Now, the scalar product is the sum of a J of the products between the components, FJGJ.
24:02:580Paolo Guiotto: Okay?
24:03:770Paolo Guiotto: Now, in this case, there is a natural, very natural orthonormal basis, which is the canonical basis of this space, and this is the, let's say, the extension of the canonical basis of RD. So, let now EJ…
24:21:10Paolo Guiotto: in l2, so EJ is a sequence.
24:25:400Paolo Guiotto: Of what? Of zeros and a 1 — a unique 1 at place J. So it's a sequence 0, 0, etc.; when you arrive at position J, you put a 1, and then zeros forever.
24:42:430Paolo Guiotto: Okay?
24:43:890Paolo Guiotto: So this is a family of vectors in L2. It is clear that they are in L2, because the infinite sum of the squares of the component is 1.
24:53:270Paolo Guiotto: So, then, this family here, the EJ,
25:03:810Paolo Guiotto: J in N,
25:06:560Paolo Guiotto: is an orthonormal…
25:13:670Paolo Guiotto: Yeah, I forgot to say…
25:18:280Paolo Guiotto: I'm sorry, I'm a little bit disordered, I don't understand why. However, I missed something, yeah. It's an orthonormal basis,
25:28:900Paolo Guiotto: of l2.
25:32:460Paolo Guiotto: Okay, what is missing here? Of course, I have not said what does it mean, orthonormal. Basis means you can write any vector as a linear combination of these vectors. What I missed here is the specification of this orthonormal. So, orthonormal
25:53:820Paolo Guiotto: means that these vectors are first perpendicular when they are different, so EI scala EJ is 0 for I different from J.
26:08:710Paolo Guiotto: And this is the orthogonal, the ortho part of this name. Normal means that they have length 1. So, the norm of each EJ is equal to 1.
26:22:440Paolo Guiotto: These two conditions can be summarized in a unique one, because the norm of EJ is the root of EJ scala… EJ scalar EJ, right?
26:36:60Paolo Guiotto: So that root is 1 if and only if EJ scalar EJ is 1. So we may say it in a unique notation.
26:44:880Paolo Guiotto: An orthonormal set of vectors is characterized by this: when you do EI scalar EJ, this is 0 for I different from J,
26:59:840Paolo Guiotto: And 1 for I equal to J.
27:05:550Paolo Guiotto: There is a classical symbol, the Kronecker symbol, delta IJ, which is just this, no? 0 when the two indexes are different, 1 when they coincide.
27:18:130Paolo Guiotto: Okay, so that's what is an orthonormal, what does mean orthonormal? So let's check that this EJ is an orthonormal base, indeed.
27:32:670Paolo Guiotto: One: the family EJ is an orthonormal
27:43:490Paolo Guiotto: set of vectors. Because if you compute the scalar product between EI
27:52:920Paolo Guiotto: and EJ — be careful, because here, you see, there are these parentheses, because the parentheses mean the vector F is itself a sequence of numbers, no? Here there are no parentheses, because EI is a vector. So, EI is a sequence of zeros with a unique 1 at place I.
28:19:00Paolo Guiotto: While…
28:20:420Paolo Guiotto: EJ is the same, a vector with infinitely many zeros, with a unique 1, and this 1 is at place J.
28:32:680Paolo Guiotto: So, when you do, by definition of the scalar product, this times this, plus this times this, and so on,
28:41:120Paolo Guiotto: you can see that if I is different from J, you always multiply a zero by something, from one part or the other, okay? So…
28:56:500Paolo Guiotto: So this sum is 0 when I is different from J. If I is equal to J,
29:05:60Paolo Guiotto: It means that you have exactly 1 times 1 when you arrive at index i, and this produces a 1 into the sum. So this is exactly our delta ij we mentioned about. So this is an orthonormal set of vectors. Number two.
29:24:470Paolo Guiotto: EJ… is a basis
29:30:00Paolo Guiotto: So, we have to show that each vector, F, is a linear combination, infinite linear combination of these vectors. But that's easy, because
29:40:830Paolo Guiotto: F in little l2 is a sequence of numbers, F1, F2, F3, etc. In general, FJ,
29:51:10Paolo Guiotto: etc. Now, of course, it's just like in R d. You can see this as F1 times the vector 1, 0, etc., forever.
30:00:650Paolo Guiotto: Then you have plus F2 times the vector 0, 1, 0 forever.
30:06:330Paolo Guiotto: And what are these vectors? These are nothing but E1, E2, E3, etc, so this turns out to be the sum of FJ, the coordinates of F, times vector EJ.
30:21:920Paolo Guiotto: Be careful, because the FJ are numbers, and the EJ are vectors here.
30:30:120Paolo Guiotto: And this is an infinite sum from 1… from 1 to infinity.
30:34:480Paolo Guiotto: So this is a very simple example of an orthonormal basis.
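[Editor's note] A small sketch of this canonical basis of little l2, truncated to finitely supported sequences (the truncation is my simplification): the e_j are pairwise orthonormal, and every F is the sum of its components times the e_j.

```python
def e(j, length):
    """Canonical basis vector: a 1 at place j (1-based), zeros elsewhere."""
    return [1.0 if i == j else 0.0 for i in range(1, length + 1)]

def dot(f, g):
    return sum(a * b for a, b in zip(f, g))

N = 5

# orthonormality: e_i . e_j = delta_ij (the Kronecker symbol)
ortho = all(dot(e(i, N), e(j, N)) == (1.0 if i == j else 0.0)
            for i in range(1, N + 1) for j in range(1, N + 1))
print(ortho)  # True

# basis property: F = sum_j F_j e_j, the coefficients are the components
F = [3.0, -1.0, 0.5, 2.0, 4.0]
recon = [0.0] * N
for j in range(1, N + 1):
    recon = [r + F[j - 1] * x for r, x in zip(recon, e(j, N))]
print(recon == F)  # True
```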
30:41:460Paolo Guiotto: A couple of, remarks before we stop? That's one remark,
30:49:550Paolo Guiotto: So, if the EJ are an orthonormal
31:01:200Paolo Guiotto: basis…
31:03:600Paolo Guiotto: And we know that this means that for every F in the space V, F will be written as infinite sum J
31:14:120Paolo Guiotto: goes from 1 to infinity, of certain coefficients CJ times EJ. Now I'm in general, no more in little l2. In little l2, we know what the shape of the CJ is: the CJ are just the components of the array F.
31:31:530Paolo Guiotto: Okay? A sequence of numbers F1, F2, F3, etc. And the CJ, if EJ is the canonical basis, are exactly those components.
31:44:210Paolo Guiotto: Now, in the definition here, we say that it is an orthonormal basis if there exist coefficients such that…
31:53:780Paolo Guiotto: How can these coefficients be computed?
31:56:460Paolo Guiotto: The nice thing is that we have a formula. In fact, if we take this relation.
32:02:780Paolo Guiotto: and we do the scalar product with one of the vectors of the canonical basis — so we compute F scalar EI, for example —
32:11:550Paolo Guiotto: This is the infinite sum, J goes from 1 to infinity, of CJ EJ scalar EI.
32:21:810Paolo Guiotto: Now, this is an infinite sum, so in principle, it's not a finite sum, and we cannot immediately say we can carry outside the sum, no?
32:32:700Paolo Guiotto: We can… we cannot immediately say this, because it's not a finite sum. If it is a finite sum, we can do that.
32:39:250Paolo Guiotto: But that's a limit of finite sums: it is the limit in N of (the sum for j from 1 to N of cj ej) scalar ei,
32:52:610Paolo Guiotto: and now we can carry the limit outside; that's because we know that the inner product is continuous in each of its arguments, so the limit can be carried outside.
33:04:960Paolo Guiotto: Continuity of the inner product.
33:10:550Paolo Guiotto: So we have the limit outside. And once we have the limit outside, inside we have just a finite sum.
33:18:830Paolo Guiotto: And therefore, by additivity,
33:26:130Paolo Guiotto: or linearity in the first argument, this is the sum for j going from 1 to N of the scalar product of cj ej with ei. But, again, by linearity
33:44:320Paolo Guiotto: we can carry the coefficient outside as well, so we have cj, and what remains is the scalar product between ej and ei.
33:53:150Paolo Guiotto: But that's the Kronecker symbol, delta ij, which is equal
34:00:00Paolo Guiotto: to 0 when i is different from j, and 1 when i is equal to j. Now, since this index N is going to infinity,
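[Editor's note: a small sketch of the Kronecker-delta pattern used at this step. The basis here is a hypothetical orthonormal basis of R^4 built from a QR factorisation, not anything specific from the lecture.]

```python
import numpy as np

rng = np.random.default_rng(0)

# Any orthonormal basis works, not just the canonical one: the columns of Q
# from a QR factorisation of a random matrix form an orthonormal basis of R^4.
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))

# Entry (i, j) of the Gram matrix is <e_i, e_j>; orthonormality says it is
# delta_ij, i.e. the identity matrix.
gram = Q.T @ Q
print(np.allclose(gram, np.eye(4)))  # True
```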
34:09:650Paolo Guiotto: sooner or later N will be big enough in such a way that our fixed index i is between 1 and N, no? These are numbers: j ranges over 1, 2, 3, 4, etc.
34:27:150Paolo Guiotto: For N big, I can say that at a certain point N reaches i.
34:33:830Paolo Guiotto: And since I'm taking the limit, this is eventually what happens. So I have that this is the limit in N of what? Well, you see that the delta is always 0 except when j is i, which leaves just the term ci. So it is the limit of ci,
34:52:190Paolo Guiotto: which is constant, independent of N, and therefore we get ci.
34:58:210Paolo Guiotto: So we get this formula that says the coefficient ci can be computed by taking the scalar product between f and ei,
35:08:620Paolo Guiotto: and therefore, the formula for f is the following: the sum
35:14:310Paolo Guiotto: for j that goes from 1 to infinity of cj — now, cj is f
35:23:220Paolo Guiotto: scalar ej — times ej, and this is the representation formula for f, which is also called the abstract Fourier
35:40:60Paolo Guiotto: series.
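[Editor's note: a minimal finite-dimensional sketch of the abstract Fourier series just derived. The orthonormal basis and the vector f are hypothetical stand-ins (an orthonormal basis of R^6 from a QR factorisation), chosen only to illustrate the formula c_j = ⟨f, e_j⟩, f = Σ_j c_j e_j.]

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6

# Hypothetical orthonormal basis of R^6: the columns of Q.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
f = rng.standard_normal(n)  # an arbitrary vector f

# c_j = <f, e_j> : the abstract Fourier coefficients of f.
c = np.array([f @ Q[:, j] for j in range(n)])

# f = sum_j c_j e_j : the representation (abstract Fourier series) of f.
reconstruction = sum(c[j] * Q[:, j] for j in range(n))
print(np.allclose(reconstruction, f))  # True
```

In the infinite-dimensional case the sum is a limit of partial sums, which is exactly where the continuity-of-the-inner-product argument from the lecture is needed.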
35:44:770Paolo Guiotto: Okay.
35:46:80Paolo Guiotto: We stop here.
35:49:110Paolo Guiotto: Let me see…
36:10:480Paolo Guiotto: Do the exercises:
36:14:700Paolo Guiotto: 15… 4… 1… The first one is an exercise
36:22:100Paolo Guiotto: on the convergence of series, and the other two, these two exercises.