Introduction to von Neumann algebras, Lecture 6 (tensor products of Hilbert spaces and vN algebras; the GNS representation, the hyperfinite II_1 factor)
by Orr Shalit
In this lecture we will introduce tensor products of Hilbert spaces. This construction is very useful for exhibiting various operators, and, in particular, it will enable us to introduce new von Neumann algebras. Notably, we will construct the so-called hyperfinite $II_1$ factor.
1. Hausdorff completion and tensor products of Hilbert spaces
Let $H$ and $K$ be two Hilbert spaces. Our goal is to construct a new Hilbert space, formed from $H$ and $K$, called the Hilbert space tensor product and denoted $H \otimes K$.
Definition 1: Let $V$ be a vector space. A semi-inner product on $V$ is a function $\langle \cdot, \cdot \rangle : V \times V \to \mathbb{C}$ such that for all $u, v, w \in V$ and all $a, b \in \mathbb{C}$:

1. $\langle a u + b v, w \rangle = a \langle u, w \rangle + b \langle v, w \rangle$,
2. $\langle u, v \rangle = \overline{\langle v, u \rangle}$,
3. $\langle v, v \rangle \geq 0$.

Of course, if $\langle v, v \rangle = 0$ occurs only for $v = 0$, then $\langle \cdot, \cdot \rangle$ is said to be an inner product.

Definition 2: Given a semi-inner product, we define the associated semi-norm by $\|v\| = \langle v, v \rangle^{1/2}$.

Exercise A: A semi-inner product satisfies the Cauchy-Schwarz inequality:

$|\langle u, v \rangle| \leq \|u\| \|v\| \quad \text{for all } u, v \in V .$

Consequently, the semi-norm arising from a semi-inner product is really a semi-norm. It follows that $N = \{v \in V : \|v\| = 0\}$ is a subspace, and that $\langle u, v \rangle = 0$ for all $u \in V$ and $v \in N$. Therefore, on the quotient $V/N$ we can define an inner product

$\langle u + N, v + N \rangle = \langle u, v \rangle .$

Finally, the inner product space $V/N$ can be completed in a unique way to form a Hilbert space $\mathcal{H}$.

Definition 3: Given a semi-inner product on a vector space $V$, the Hilbert space $\mathcal{H}$ constructed above is called the Hausdorff completion of $V$.
Definition 4: Given Hilbert spaces $H$ and $K$, let $V$ denote the free vector space with basis $H \times K$; that is, $V$ is just the space of all finite (formal) linear combinations of pairs $(h, k) \in H \times K$. On $V$ define a semi-inner product by setting

$\langle (h, k), (h', k') \rangle = \langle h, h' \rangle_H \langle k, k' \rangle_K$

and extending sesquilinearly.
Exercise B: This is indeed a semi-inner product. (Hint: The only thing that requires proof is positive semi-definiteness. You can find a proof in all kinds of books, e.g. Takesaki. But I think the following might be an elegant approach: Given two finite dimensional Hilbert spaces $\mathbb{C}^n$ and $\mathbb{C}^m$, and given $h \in \mathbb{C}^n$ and $k \in \mathbb{C}^m$, one has the rank one operator $h k^t$ given by matrix multiplication. Observe that $\langle S, T \rangle = \mathrm{tr}(T^* S)$ defines a semi-inner product (which is actually an inner product) on the linear maps from $\mathbb{C}^m$ to $\mathbb{C}^n$. Notice further, that $(h, k) \mapsto h k^t$ extends to a semi-inner product preserving map from $V$ to the linear maps from $\mathbb{C}^m$ to $\mathbb{C}^n$.)
Definition 4: The Hilbert space tensor product of two Hilbert spaces $H$ and $K$, denoted $H \otimes K$, is the Hausdorff completion of $V$. The image of $(h, k)$ in $H \otimes K$ is denoted $h \otimes k$. Vectors of the form $h \otimes k$ are called simple tensors.
Example: The Hilbert space tensor product of $\mathbb{C}^n$ and $\mathbb{C}^m$ can be identified with the space of $n \times m$ matrices endowed with the Hilbert-Schmidt inner product $\langle S, T \rangle = \mathrm{tr}(T^* S)$, as in the hint of Exercise B.
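This identification can be checked numerically. Here is a small sketch (the convention $h \otimes k \mapsto h k^t$, linear in both slots, and all variable names are choices made for this illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

def rand_vec(d):
    return rng.standard_normal(d) + 1j * rng.standard_normal(d)

n, m = 3, 4
h, hp = rand_vec(n), rand_vec(n)
k, kp = rand_vec(m), rand_vec(m)

# Identify h ⊗ k with the rank-one n x m matrix h k^t (linear in both slots).
simple = lambda x, y: np.outer(x, y)
# Hilbert-Schmidt inner product <S, T> = tr(T* S) on n x m matrices.
hs = lambda S, T: np.trace(T.conj().T @ S)

lhs = hs(simple(h, k), simple(hp, kp))
rhs = np.vdot(hp, h) * np.vdot(kp, k)   # <h, h'> <k, k'>  (np.vdot conjugates its first argument)
print(np.allclose(lhs, rhs))  # True
```

The check confirms that the rank-one map preserves the semi-inner product, which is exactly the content of the hint in Exercise B.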
Exercise C: For every $h, h' \in H$, $k \in K$ and $c \in \mathbb{C}$, it holds that

$(h + h') \otimes k = h \otimes k + h' \otimes k , \quad (ch) \otimes k = c (h \otimes k)$

(likewise with the roles of $H$ and $K$ reversed) and

$\|h \otimes k\| = \|h\| \|k\| .$
Exercise D: If $\{e_i\}_{i \in I}$ is an orthonormal basis for $H$ and $\{f_j\}_{j \in J}$ is an orthonormal basis for $K$, then $\{e_i \otimes f_j\}_{(i,j) \in I \times J}$ is an orthonormal basis for $H \otimes K$.
Exercise E: If $E \subseteq H$ and $F \subseteq K$ are sets such that $\overline{\operatorname{span}}\, E = H$ and $\overline{\operatorname{span}}\, F = K$, then

$\overline{\operatorname{span}} \{h \otimes k : h \in E, k \in F\} = H \otimes K .$
Let us fix notation for what follows. Let $H, K$ be Hilbert spaces, let $\{e_i\}_{i \in I}$ be an orthonormal basis for $H$ and $\{f_j\}_{j \in J}$ be an orthonormal basis for $K$. By Exercise D, every element $x \in H \otimes K$ can be written as the norm convergent sum $x = \sum_{i,j} c_{ij} \, e_i \otimes f_j$. Rearranging, we see that every element in $H \otimes K$ can be written as the norm convergent sum $x = \sum_{j \in J} h_j \otimes f_j$, where $h_j = \sum_i c_{ij} e_i \in H$ and the summands are all orthogonal. In fact $\|x\|^2 = \sum_j \|h_j\|^2$. This gives rise to an identification

$H \otimes K \cong \bigoplus_{j \in J} H_j ,$

where every $H_j$ is a copy of $H$.
2. Tensor products of von Neumann algebras
Keep the notation from above. Given $A \in B(H)$ and $B \in B(K)$, we define the tensor product of $A$ and $B$, denoted $A \otimes B$, by first defining it on simple tensors:

$(A \otimes B)(h \otimes k) = Ah \otimes Bk .$
One then wishes to extend this definition from simple tensors, first to finite sums of simple tensors, and then to the whole space $H \otimes K$. It suffices to show that $A \otimes B$ defines a bounded operator on finite linear combinations of simple tensors. In fact, it is enough to consider $A \otimes I$, because proving that $I \otimes B$ is bounded is analogous, and then $A \otimes B = (A \otimes I)(I \otimes B)$.
We shall make one more reduction: what we will actually work to show is that $U \otimes I$ is an isometry whenever $U \in B(H)$ is a unitary (in fact $U \otimes I$ will be unitary, which is very easy to see once one knows that it is well defined). This will suffice because every bounded operator is the sum of four unitaries (Lecture 1). Now, if $U$ is unitary, then for a finite sum $\sum_j h_j \otimes f_j$ (with the $f_j$s members of the orthonormal basis fixed above),

$\left\| (U \otimes I) \sum_j h_j \otimes f_j \right\|^2 = \left\| \sum_j U h_j \otimes f_j \right\|^2 = \sum_j \|U h_j\|^2 = \sum_j \|h_j\|^2 = \left\| \sum_j h_j \otimes f_j \right\|^2 .$

This shows that $U \otimes I$ preserves norms (and, by polarization, inner products), hence is an isometry on the space of finite sums of simple tensors. Whence $U \otimes I$ extends to an isometry (actually a unitary) on $H \otimes K$.
Remark: The explanation we gave here is different than the one I gave in class. If you attended the lecture, can you see why I gave a different explanation?
Now that we know that $A \otimes B$ makes sense as a bounded operator on $H \otimes K$, it is a simple matter to obtain

$\|A \otimes B\| = \|A\| \|B\| .$
The tensor product of operators enjoys some other nice properties:

- $(cA + A') \otimes B = c(A \otimes B) + A' \otimes B$ (and likewise it is linear in the right factor),
- $(A \otimes B)(A' \otimes B') = AA' \otimes BB'$, and
- $(A \otimes B)^* = A^* \otimes B^*$.
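For matrices, the tensor product of operators is the Kronecker product, and the properties above can be checked numerically; a quick sketch (all names here are ad hoc):

```python
import numpy as np

rng = np.random.default_rng(1)

def rand_mat(d):
    return rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))

A, Ap = rand_mat(2), rand_mat(2)   # operators on H = C^2
B, Bp = rand_mat(3), rand_mat(3)   # operators on K = C^3
h = rng.standard_normal(2) + 1j * rng.standard_normal(2)
k = rng.standard_normal(3) + 1j * rng.standard_normal(3)

# On C^2 ⊗ C^3 ≅ C^6, simple tensors and operator tensor products are Kronecker products.
print(np.allclose(np.kron(A, B) @ np.kron(h, k), np.kron(A @ h, B @ k)))      # (A⊗B)(h⊗k) = Ah ⊗ Bk
print(np.allclose(np.kron(A, B) @ np.kron(Ap, Bp), np.kron(A @ Ap, B @ Bp)))  # multiplicativity
print(np.allclose(np.kron(A, B).conj().T, np.kron(A.conj().T, B.conj().T)))   # (A⊗B)* = A* ⊗ B*
```

All three checks print True; they are instances of the standard mixed-product and adjoint identities for Kronecker products.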
Definition 5: Let $M \subseteq B(H)$ and $N \subseteq B(K)$ be von Neumann algebras. The von Neumann algebra tensor product (or simply: the tensor product) of $M$ and $N$ is the algebra

$M \overline{\otimes} N = W^* \left( \{ A \otimes B : A \in M, B \in N \} \right) ,$

i.e., $M \overline{\otimes} N$ is the von Neumann algebra generated by all tensor products $A \otimes B$, where $A \in M$ and $B \in N$.
Now for all $j \in J$, let $V_j : H \to H \otimes K$ be the operator

$V_j h = h \otimes f_j .$

Letting $W : H \otimes K \to \bigoplus_{j \in J} H_j$ be the isomorphism mentioned above (given by $\sum_j h_j \otimes f_j \mapsto (h_j)_{j \in J}$), $V_j$ can be considered also as the inclusion of $H$ onto the summand $H_j \subseteq \bigoplus_j H_j$, so every $T \in B(H \otimes K)$ can be considered as an operator on $\bigoplus_j H_j$.
For every $T \in B(H \otimes K)$ and every $i, j \in J$ define $T_{ij} = V_i^* T V_j \in B(H)$. This gives a (usually infinite) “operator block matrix” $(T_{ij})_{i,j \in J}$. If $T = A \otimes B$, then $T_{ij} = \langle B f_j, f_i \rangle A$. There is a bijection between $B(H \otimes K)$ and the set of all operator block matrices that define bounded operators on $\bigoplus_{j \in J} H_j$.
Operator block matrices follow the usual algebraic rules, and act on elements $(h_j)_{j \in J} \in \bigoplus_j H_j$ by matrix-versus-column multiplication.
Proposition 6: $B(H) \otimes \mathbb{C}I := \{A \otimes I : A \in B(H)\}$ is a von Neumann algebra. Moreover, $T \in B(H \otimes K)$ is in the above set if and only if there is some $A \in B(H)$ such that $T_{ij} = \delta_{ij} A$ for all $i, j \in J$.
Proof: A direct calculation shows “$\Rightarrow$”: if $T = A \otimes I$, then multiplying by $V_i^*$ and $V_j$ from both sides, and using the fact that the $V_j$s are isometries with orthogonal ranges, one sees that $T_{ij} = V_i^* (A \otimes I) V_j = \delta_{ij} A$. Conversely, an operator with such a diagonal block operator matrix is easily seen to operate as $A \otimes I$. Finally, the set of all such operators is a unital, self-adjoint, WOT closed subalgebra of $B(H \otimes K)$, hence a von Neumann algebra.
Now define a *-representation $\pi : B(H) \to B(H \otimes K)$ by $\pi(A) = A \otimes I$.
Exercise F: Is $\pi$ WOT/SOT continuous?
Now let $e_{ij}$ be the rank one operator on $K$, given by $e_{ij} k = \langle k, f_j \rangle f_i$. A direct calculation shows that $I \otimes e_{ij} = V_i V_j^*$.
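These relations are easy to check in coordinates. Here is a small numerical sketch for $H = \mathbb{C}^2$, $K = \mathbb{C}^3$ with the standard basis of $K$, realizing $h \otimes k$ as the Kronecker product of coordinate vectors (the realization of $V_j$ as a matrix is a choice made for this illustration):

```python
import numpy as np

n, m = 2, 3                       # dim H = n, dim K = m
I_H = np.eye(n)
f = np.eye(m)                     # standard orthonormal basis f_0, ..., f_{m-1} of K

# V_j : H -> H ⊗ K, V_j h = h ⊗ f_j, as an (n*m) x n matrix.
V = [np.kron(I_H, f[:, [j]]) for j in range(m)]

# The V_j are isometries with orthogonal ranges: V_i* V_j = δ_ij I_H.
for i in range(m):
    for j in range(m):
        expected = I_H if i == j else np.zeros((n, n))
        assert np.allclose(V[i].T @ V[j], expected)

# V_i V_j* = I ⊗ e_ij, where e_ij is the rank one operator f_j ↦ f_i on K.
i, j = 0, 2
e_ij = np.outer(f[:, i], f[:, j])
print(np.allclose(V[i] @ V[j].T, np.kron(I_H, e_ij)))  # True
```

The same arrays also let one compute block entries $T_{ij} = V_i^* T V_j$ of any matrix $T$ on $\mathbb{C}^6$.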
Proposition 7: Let $M \subseteq B(H)$ be a von Neumann algebra. Then $\{T \in B(H \otimes K) : T_{ij} \in M \text{ for all } i, j \in J\}$ is a von Neumann algebra. To be precise:

$(M' \otimes \mathbb{C}I)' = \{T \in B(H \otimes K) : T_{ij} \in M \text{ for all } i, j \in J\} .$

Proof: A direct calculation shows that $(A' \otimes I) V_j = V_j A'$ and $V_i^* (A' \otimes I) = A' V_i^*$ for every $A' \in B(H)$. Therefore, for $T \in B(H \otimes K)$ and $A' \in M'$,

$\big( T (A' \otimes I) \big)_{ij} = T_{ij} A' \quad \text{and} \quad \big( (A' \otimes I) T \big)_{ij} = A' T_{ij} ,$

so $T$ commutes with $A' \otimes I$ for all $A' \in M'$ if and only if every $T_{ij}$ commutes with every element of $M'$, that is, if and only if $T_{ij} \in M'' = M$ for all $i, j$ (by the double commutant theorem). This proves the asserted equality; the left-hand side, being a commutant, is a von Neumann algebra, and the proof is complete.
As a consequence, we obtain
Theorem 8: Let $M \subseteq B(H)$ be a von Neumann algebra. Then

$M \overline{\otimes} B(K) = (M' \otimes \mathbb{C}I)' = \{T \in B(H \otimes K) : T_{ij} \in M \text{ for all } i, j \in J\} .$
Proof: See the following exercise.
Exercise G: Complete the details.
Project C: What about commutants of tensor products? It is very reasonable and elegant to conjecture that $(M \overline{\otimes} N)' = M' \overline{\otimes} N'$. This is true, but (maybe surprisingly) highly non-trivial. For Project C, show that
- $M' \overline{\otimes} N' \subseteq (M \overline{\otimes} N)'$, and
- equality holds in the special case $N = B(K)$.
Exercise H: Let $M \subseteq B(H)$ be a von Neumann algebra, and suppose that there exists a family $\{e_{ij}\}_{i,j=1}^{n} \subseteq M$ such that

1. $e_{ij} e_{kl} = \delta_{jk} e_{il}$ for all $i, j, k, l$,
2. $e_{ij}^* = e_{ji}$ for all $i, j$,
3. $\sum_{i=1}^{n} e_{ii} = 1$.

(Such a family is called a system of matrix units in $M$). Let $N = e_{11} M e_{11}$, for some system of matrix units as above. Then $N$ is a von Neumann algebra acting on $e_{11} H$. Prove that

$M \cong N \overline{\otimes} M_n(\mathbb{C}) .$
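For a concrete instance, the elementary matrices in $M_n(\mathbb{C})$ form a system of matrix units; here is a quick numerical verification of the three relations (a sketch, with ad hoc names):

```python
import numpy as np

n = 3
# Elementary matrix units e[i][j] in M_n: 1 in entry (i, j), zero elsewhere.
e = [[np.zeros((n, n)) for _ in range(n)] for _ in range(n)]
for i in range(n):
    for j in range(n):
        e[i][j][i, j] = 1.0

# (1) e_ij e_kl = δ_jk e_il
assert all(
    np.allclose(e[i][j] @ e[k][l], (1.0 if j == k else 0.0) * e[i][l])
    for i in range(n) for j in range(n) for k in range(n) for l in range(n)
)
# (2) e_ij* = e_ji
assert all(np.allclose(e[i][j].conj().T, e[j][i]) for i in range(n) for j in range(n))
# (3) sum_i e_ii = 1
assert np.allclose(sum(e[i][i] for i in range(n)), np.eye(n))
print("matrix unit relations hold")
```

In this example $e_{11} M_n e_{11} \cong \mathbb{C}$, so the exercise recovers $M_n \cong \mathbb{C} \overline{\otimes} M_n(\mathbb{C})$.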
Using the above exercise, you can now prove:
Exercise I: If $M$ is a type $II_\infty$ factor on a separable Hilbert space, then there is a type $II_1$ factor $N$ and a separable Hilbert space $K$ such that

$M \cong N \overline{\otimes} B(K) .$
3. The hyperfinite II_1 factor
We now meet a special factor, called the hyperfinite $II_1$ factor. It is constructed as follows.
For every $n \in \mathbb{N}$, let $A_n = M_{2^n}(\mathbb{C})$, and let $\mathrm{tr}_n = 2^{-n} \mathrm{Tr}$ be the unique unital trace on $A_n$. The algebra $A_n$ can be identified as a unital subalgebra of $A_{n+1}$, via the unital, injective and trace preserving *-homomorphism $A_n \to A_{n+1}$ given by

$a \mapsto \begin{pmatrix} a & 0 \\ 0 & a \end{pmatrix} .$

Let $\mathcal{A} = \bigcup_{n=1}^{\infty} A_n$ be the normed *-algebra formed as the increasing union of the $A_n$s (all the algebraic operations are performed in one of the $A_n$s; that is, if $a \in A_m$ and $b \in A_n$, we find some $k$ so that $a, b \in A_k$, and then we define $ab$ in $A_k$; same for $a + b$, $a^*$, $\|a\|$). On $\mathcal{A}$, we define a functional $\mathrm{tr}$ by $\mathrm{tr}(a) = \mathrm{tr}_n(a)$, if $a \in A_n$. Since the inclusions are trace preserving ($\mathrm{tr}_{n+1}\big|_{A_n} = \mathrm{tr}_n$), the functional $\mathrm{tr}$ is well defined.
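The trace-preserving property of the inclusions can be checked directly. Here is a sketch of one step, $M_{2^n}(\mathbb{C}) \to M_{2^{n+1}}(\mathbb{C})$, with the normalized trace $2^{-n}\mathrm{Tr}$ and the embedding realized as $a \mapsto a \otimes I_2$ (unitarily equivalent to $\operatorname{diag}(a, a)$; function names are ad hoc):

```python
import numpy as np

def tr_n(a):
    """Normalized (unital) trace on a matrix algebra: tr(identity) = 1."""
    return np.trace(a) / a.shape[0]

def embed(a):
    """The unital *-homomorphism M_d -> M_{2d}, a -> a ⊗ I_2."""
    return np.kron(a, np.eye(2))

rng = np.random.default_rng(2)
a = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))   # a in M_4 = M_{2^2}

b = embed(a)                                  # b in M_8 = M_{2^3}
print(np.allclose(tr_n(b), tr_n(a)))          # True: the inclusion is trace preserving
print(np.allclose(embed(a @ a), b @ b))       # True: it is multiplicative
```

Iterating `embed` realizes any $A_n$ inside all the later $A_k$s, which is how the union $\mathcal{A}$ is formed.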
The hyperfinite factor is defined using the ingredients $(\mathcal{A}, \mathrm{tr})$. The construction itself – called the GNS construction – is recurrent in the theory of operator algebras, so let us give it special attention.
3.1 The GNS representation
In this subsection, we let $\mathcal{A}$ denote a unital *-algebra satisfying some of the nice properties of the algebra we just considered above. Thus, $\mathcal{A}$ does not have to be an increasing union of matrix algebras; it can also be a unital C*-algebra, or the increasing union (or direct limit) of unital C*-algebras. (I am not sure what is the optimal category of *-algebras for which the construction works.)
We also let $\varphi$ denote, not necessarily the tracial state treated above, but any state on $\mathcal{A}$, by which we mean a positive ($\varphi(a^* a) \geq 0$ for all $a$) and unital ($\varphi(1) = 1$) linear functional on $\mathcal{A}$. A state $\varphi$ is said to be faithful if $\varphi(a^* a) = 0 \Rightarrow a = 0$.
Example: If $\mathcal{A} \subseteq B(H)$ is a *-subalgebra and $h \in H$ is a unit vector, then $\varphi(a) = \langle a h, h \rangle$ is a state. Such a state is called a vector state. $\varphi$ is faithful if and only if $h$ is separating for $\mathcal{A}$.
The GNS representation will show that, essentially, all states are vector states (of course, not every state is literally a vector state).
Theorem 9 (GNS representation): Given a pair $(\mathcal{A}, \varphi)$ consisting of a nice unital normed *-algebra as above and a state on it, there exist a Hilbert space $\mathcal{H}$, a *-representation $\pi : \mathcal{A} \to B(\mathcal{H})$, and a unit vector $\Omega \in \mathcal{H}$ such that

(*) $\overline{\pi(\mathcal{A}) \Omega} = \mathcal{H}$ (“$\Omega$ is cyclic“), and

(**) $\varphi(a) = \langle \pi(a) \Omega, \Omega \rangle$ for all $a \in \mathcal{A}$.

The triple $(\mathcal{H}, \pi, \Omega)$ is called the GNS representation of $(\mathcal{A}, \varphi)$, and is the unique such triple satisfying (*) and (**).
Proof: We begin by defining a semi-inner product on $\mathcal{A}$:

$\langle a, b \rangle = \varphi(b^* a) .$

It is plain to see that $\langle \cdot, \cdot \rangle$ is a sesquilinear form, and it is positive because $\varphi$ is positive. We define $\mathcal{H}$ to be the Hausdorff completion of $\mathcal{A}$, and we write $\hat{a}$ for the image of $a \in \mathcal{A}$ in $\mathcal{H}$. Put $\Omega = \hat{1}$.
Define for every $a \in \mathcal{A}$, the linear map $\pi(a)$ given by

$\pi(a) \hat{b} = \widehat{ab} .$

After showing that $\pi(a)$ is bounded on $\{\hat{b} : b \in \mathcal{A}\}$, one can extend it to a bounded linear operator on $\mathcal{H}$. Boundedness follows from

$\|\pi(a) \hat{b}\|^2 = \varphi(b^* a^* a b) \leq \|a\|^2 \varphi(b^* b) = \|a\|^2 \|\hat{b}\|^2 ,$

which follows from $b^* a^* a b \leq \|a\|^2 b^* b$ (because $a^* a \leq \|a\|^2 1$).
It is then routine to check that $\pi$ is a *-representation, and we omit this. We cannot omit the gratifying step of verifying that it satisfies (**):

$\langle \pi(a) \Omega, \Omega \rangle = \langle \hat{a}, \hat{1} \rangle = \varphi(1^* a) = \varphi(a) .$
The uniqueness is left as an exercise.
Exercise J: Prove the uniqueness of the GNS representation (make sure you first explain what uniqueness means).
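Before moving on, here is a finite dimensional sanity check of the construction: for $(M_2(\mathbb{C}), \mathrm{tr})$ the GNS space is just $M_2(\mathbb{C})$ with the inner product $\langle a, b \rangle = \mathrm{tr}(b^* a)$, $\pi(a)$ is left multiplication, and $\Omega = \hat{1}$. A numerical sketch (the vectorization conventions and names are choices made for this illustration):

```python
import numpy as np

tr = lambda a: np.trace(a) / a.shape[0]        # normalized trace state on M_2

# hat(a): the image of a in the GNS space, vectorized so that
# <hat(a), hat(b)> = tr(b* a)  (the 1/sqrt(2) accounts for the normalization of tr).
hat = lambda a: a.reshape(-1) / np.sqrt(2.0)

# π(a) is left multiplication b ↦ ab; with row-major flattening this is a ⊗ I_2 on C^4.
pi = lambda a: np.kron(a, np.eye(2))

rng = np.random.default_rng(3)
a = rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))
b = rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))

omega = hat(np.eye(2))                         # Ω = hat(1), a unit vector
assert np.isclose(np.vdot(omega, omega).real, 1.0)

# π is multiplicative on the dense set of hats: π(a) hat(b) = hat(ab).
assert np.allclose(pi(a) @ hat(b), hat(a @ b))

# (**): tr(a) = <π(a) Ω, Ω>   (np.vdot conjugates its first argument)
print(np.allclose(tr(a), np.vdot(omega, pi(a) @ omega)))  # True
```

Since here $\mathrm{tr}$ is faithful, no quotient occurs in the Hausdorff completion, mirroring what happens for the hyperfinite factor below.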
Example: Let $\mathcal{A} = L^\infty(\mu)$, where $\mu$ is a probability measure. Let $\varphi$ be the state

$\varphi(f) = \int f \, d\mu .$

Then $\mathcal{H}$ is the completion of $L^\infty(\mu)$ with respect to the inner product

$\langle f, g \rangle = \int f \bar{g} \, d\mu ,$

that is, $\mathcal{H} = L^2(\mu)$. Also, $\Omega = 1$, and the GNS representation is the representation by multiplication operators, given by $\pi(f) g = fg$.
The above example leads to the following notation: given a von Neumann algebra $M$ and a state $\varphi$, one writes $L^2(M, \varphi)$ for the GNS space $\mathcal{H}$, and $L^2(M)$ when $\varphi$ is understood.
Example: Consider $L(\Gamma) \subseteq B(\ell^2(\Gamma))$, for a countable group $\Gamma$, with the state $\varphi(a) = \langle a \delta_e, \delta_e \rangle$. Then the GNS representation is the identity representation.
3.2 The definition of $R$ and its properties
Now we leave the general GNS construction and return to the particular choice of $\mathcal{A} = \bigcup_n A_n$ (where $A_n = M_{2^n}(\mathbb{C})$) with its state $\mathrm{tr}$. Letting $(\mathcal{H}, \pi, \Omega)$ be the GNS representation of $(\mathcal{A}, \mathrm{tr})$, we now define $R$ to be the von Neumann algebra generated by $\pi(\mathcal{A})$:

$R = W^*(\pi(\mathcal{A})) .$

$R$ is called the hyperfinite $II_1$ factor. Since $\mathrm{tr}$ is faithful on $\mathcal{A}$, there is no quotient required in the Hausdorff completion, just completion; moreover $\pi$ is faithful:

$\pi(a) = 0 \Rightarrow \|\hat{a}\|^2 = \mathrm{tr}(a^* a) = 0 \Rightarrow a = 0 ,$

since if $\pi(a) = 0$ then in particular $\pi(a) \Omega = \hat{a} = 0$. By Theorem 1 in Lecture 3, every $\pi\big|_{A_n}$ is isometric (being an injective *-homomorphism of C*-algebras), so $\pi$ is isometric. We can therefore push $\mathrm{tr}$ to $\pi(\mathcal{A})$:

$\mathrm{tr}(\pi(a)) := \mathrm{tr}(a) = \langle \pi(a) \Omega, \Omega \rangle .$
Being a vector state, $\mathrm{tr}$ extends from $\pi(\mathcal{A})$ to its WOT/SOT closure $R$. This state is, in fact, a trace: if $x, y \in R$, we invoke Kaplansky’s density theorem to find bounded nets $(a_i), (b_j)$ in $\mathcal{A}$ so that $\pi(a_i) \to x$ and $\pi(b_j) \to y$ in the strong operator topology, and then

$\mathrm{tr}(xy) = \lim_i \lim_j \mathrm{tr}(a_i b_j) = \lim_i \lim_j \mathrm{tr}(b_j a_i) = \mathrm{tr}(yx) .$
$\mathrm{tr}$ is faithful on $R$: if $\mathrm{tr}(x^* x) = 0$ then $x \Omega = 0$, and since $\mathrm{tr}(x x^*) = \mathrm{tr}(x^* x) = 0$, also $x^* \Omega = 0$. But then for all $a \in \mathcal{A}$, using that $\mathrm{tr}$ is a trace,

$\|x \pi(a) \Omega\|^2 = \mathrm{tr}(\pi(a)^* x^* x \pi(a)) = \mathrm{tr}(x \pi(a a^*) x^*) = \langle \pi(a a^*) x^* \Omega, x^* \Omega \rangle = 0 .$

Since $\pi(\mathcal{A}) \Omega$ is dense in $\mathcal{H}$, $x = 0$.
Now, we will show that $R$ is a factor. It will follow that it is a $II_1$ factor, since it is an infinite dimensional factor with a faithful (normal) trace.
Let $z \in Z(R)$ be a central projection. Define $\varphi(x) = \mathrm{tr}(zx)$ for all $x \in R$. The functional $\varphi$ is also a trace, because $z$ is central. Therefore $\varphi \circ \pi\big|_{A_n}$ is also a trace, and by uniqueness of the trace (in finite factors, in particular uniqueness of the trace in $M_{2^n}(\mathbb{C})$), it must hold that $\varphi \circ \pi\big|_{A_n} = c_n \mathrm{tr}_n$ for some constant $c_n$. Since the inclusions are trace preserving, it must be that $c_n = c$ for some constant $c$ independent of $n$. Since $\pi(\mathcal{A})$ is WOT dense in $R$, $\varphi = c \, \mathrm{tr}$. But $\varphi(1) = \mathrm{tr}(z)$, so $c = \mathrm{tr}(z)$, and applying $\varphi = \mathrm{tr}(z) \, \mathrm{tr}$ to $x = z$ gives

$\mathrm{tr}(z) = \mathrm{tr}(z^2) = \mathrm{tr}(z)^2 .$

We conclude that either $\mathrm{tr}(z) = 0$ or $\mathrm{tr}(z) = 1$. Since $\mathrm{tr}$ is faithful, we have that either $z = 0$ or $z = 1$ (in the latter case, apply faithfulness to $1 - z$).
This shows that $Z(R)$ contains no projections other than $0$ and $1$, hence $Z(R) = \mathbb{C} 1$, so $R$ is a factor.
Definition 10: A von Neumann algebra $M$ is said to be hyperfinite (or AFD – approximately finite dimensional) if it contains a SOT dense increasing union of finite dimensional C*-algebras, that is, $M = \overline{\bigcup_{n=1}^{\infty} M_n}^{SOT}$, where $M_1 \subseteq M_2 \subseteq \cdots$ are all finite dimensional C*-algebras.
The hyperfinite factor $R$ is hyperfinite, by construction. By Exercise B in Lecture 4, $B(H)$ is also hyperfinite (and also a factor). The reason that $R$ is called THE hyperfinite $II_1$ factor is that, as it turns out, every hyperfinite $II_1$ factor is *-isomorphic to $R$. This is not trivial – it was proved in Murray and von Neumann’s fourth joint paper on the subject. We don’t have time to cover the proof. If you want a heavy project, this is a good choice.
Project D: Uniqueness of the hyperfinite factor (this project might be too big, and may spill over into the summer break. But if you are interested this can be a nice experience, we can discuss it, and see how much of it you can do).
4. Infinite tensor products
Here is another way to look at the hyperfinite factor.
The imbedding $A_n \subseteq A_{n+1}$ is given by

$M_{2^n}(\mathbb{C}) \ni a \mapsto a \otimes I_2 \in M_{2^n}(\mathbb{C}) \otimes M_2(\mathbb{C}) \cong M_{2^{n+1}}(\mathbb{C}) .$

Therefore, one thinks of $\mathcal{A}$ as the infinite tensor product $\bigotimes_{n=1}^{\infty} M_2(\mathbb{C})$, and of $R$ as the von Neumann algebra $\bigotimes_{n=1}^{\infty} (M_2(\mathbb{C}), \mathrm{tr})$ generated in the GNS representation of the infinite tensor product trace.
Using different finite dimensional algebras and different states (not necessarily traces), one gets different kinds of von Neumann algebras with states. Replacing the trace on $M_2(\mathbb{C})$ with the state

$\varphi_\lambda \left( \begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix} \right) = \frac{a_{11} + \lambda a_{22}}{1 + \lambda}$

for $0 < \lambda < 1$, Powers obtained infinitely many mutually non-isomorphic type $III$ factors.
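For concreteness, here is a quick check that the Powers state (written here as a density-matrix state, an equivalent formulation) is unital and positive but, unlike the trace, is not tracial:

```python
import numpy as np

lam = 0.5
rho = np.diag([1.0, lam]) / (1.0 + lam)     # density matrix of the Powers state φ_λ
phi = lambda a: np.trace(rho @ a)           # φ_λ(a) = Tr(ρ_λ a)

# Unital: φ_λ(1) = 1.
print(np.isclose(phi(np.eye(2)), 1.0))      # True

# Not a trace for λ != 1: with a = e_12, b = e_21 we have ab = e_11, ba = e_22, so
# φ_λ(ab) = 1/(1+λ) while φ_λ(ba) = λ/(1+λ).
a = np.array([[0.0, 1.0], [0.0, 0.0]])
b = np.array([[0.0, 0.0], [1.0, 0.0]])
print(phi(a @ b), phi(b @ a))               # 2/3 versus 1/3 when λ = 1/2
```

The failure of the trace property is precisely what pushes the resulting infinite tensor product factors out of the finite (type $II_1$) setting.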