### The remarkable Hilbert space H^2 (Part I – definition and interpolation theory)

#### by Orr Shalit

This series of posts is based on the colloquium talk that I was supposed to give on November 20, at our department. As fate had it, that week studies were cancelled.

Several people in our department thought that it would be a nice idea if, alongside the usual colloquium talks given by invited speakers to highlight their recent achievements, we would also have some talks by department members that are more of an exposition of the fields they work in. So my talk was supposed to be an exposition of the setting in which much of my research goes on.

The topic of the “talk” is the Hilbert space $H^2_d$. There will be three parts to this series:

- Definition and interpolation theory.
- Multivariate operator theory and model theory.
- Current research problems.

#### 1. Introduction

What is $H^2_d$?

$H^2_d$ is a Hilbert space. One may ask: *what could be remarkable about a Hilbert space? A Hilbert space is a Hilbert space, and they are all isomorphic, are they not?*

This is a real question, and I was actually asked it by a member of my department a week before the talk, after the abstract was published. I have two answers to this question.

The first answer is that $H^2_d$ is not just a Hilbert space, it is a Hilbert *function* space, so it has a much richer structure than a mere Hilbert space. The function theory that arises in the context of the space $H^2_d$ connects in a very fruitful way with the Hilbert space structure. More on this soon.

The second answer is that whenever we pick a particular construction of a Hilbert space to work in, we are choosing a representation for some object of interest. In other words, the particular Hilbert space we choose to work with comes along with a set of natural operators. As an example, consider a countable group $G$, and consider $\ell^2(G)$. This is the same Hilbert space as $\ell^2(\mathbb{N})$, but $\ell^2(G)$ invites us to represent $G$ on it in a very natural way, while $\ell^2(\mathbb{N})$ makes no such invitation. The operator theory that **naturally** arises in the context of the space $H^2_d$ is what makes this Hilbert space special. More on this later.

#### 2. The function space $H^2_d$

In fact, there is a sequence of spaces $H^2_d$ which interest us. Let $d$ be a positive integer or $\infty$. Let $B_d$ denote the open unit ball in $\mathbb{C}^d$ (where $\mathbb{C}^d$ is understood to be $\ell^2(\mathbb{N})$ when $d = \infty$). I ask of you, for this talk, *don't get your mind bogged down with questions about the $d = \infty$ case*; in fact, if you are not an operator theorist you may as well *take $d = 1$, things are interesting enough*. We define $H^2_d$ to be the space of holomorphic functions $f : B_d \to \mathbb{C}$ with Taylor series

$$f(z) = \sum_\alpha a_\alpha z^\alpha ,$$

which satisfies

(*) $\quad \|f\|^2 := \sum_\alpha \frac{\alpha!}{|\alpha|!}\, |a_\alpha|^2 < \infty .$

Here we are using the standard multi-index notation: for $\alpha = (\alpha_1, \ldots, \alpha_d) \in \mathbb{N}^d$ we put $z^\alpha = z_1^{\alpha_1} \cdots z_d^{\alpha_d}$ and $\alpha! = \alpha_1! \cdots \alpha_d!$ (and of course $0! = 1$; also, $|\alpha| = \alpha_1 + \cdots + \alpha_d$ is the sum, not the product).

Equation (*) defines a Hilbert space norm on $H^2_d$, and it is very easy to figure out what the inner product has to be: if $f = \sum_\alpha a_\alpha z^\alpha$ and $g = \sum_\alpha b_\alpha z^\alpha$, then $\langle f, g \rangle = \sum_\alpha \frac{\alpha!}{|\alpha|!}\, a_\alpha \overline{b_\alpha}$.
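To make the weights in (*) concrete, here is a small Python sketch (my own illustration, not part of the talk; the function names are mine) that computes $\|f\|^2$ for a polynomial from its Taylor coefficients:

```python
import math

def multiindex_weight(alpha):
    """The weight alpha!/|alpha|! appearing in the H^2_d norm."""
    total = math.factorial(sum(alpha))   # |alpha|!
    for a in alpha:
        total //= math.factorial(a)      # now total = |alpha|!/alpha!, a multinomial coefficient
    return 1.0 / total

def h2d_norm_squared(coeffs):
    """coeffs maps multi-indices (tuples) to Taylor coefficients a_alpha."""
    return sum(multiindex_weight(alpha) * abs(a) ** 2
               for alpha, a in coeffs.items())

# f(z) = z_1 z_2 in d = 2: the only multi-index is (1,1), with weight 1!1!/2! = 1/2
print(h2d_norm_squared({(1, 1): 1.0}))  # 0.5
```

In particular the monomials are pairwise orthogonal, with $\|z^\alpha\|^2 = \alpha!/|\alpha|!$.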

So $H^2_d$ can be very naturally identified with a weighted $\ell^2$-space, but we really want to think about it as a space of functions. These are not equivalence classes of functions, like we have when we look at the space $L^2[0,1]$; these are honest-to-God analytic functions that have well defined values at every point. The crucial fact is that *point evaluation is a bounded functional* on $H^2_d$. The easiest way to show this is to exhibit, for every $w \in B_d$, an element $k_w \in H^2_d$ such that for all $f \in H^2_d$,

$$f(w) = \langle f, k_w \rangle .$$

A simple computation (using the orthogonality of the monomials) shows that the unique function that satisfies this is

$$k_w(z) = \frac{1}{1 - \langle z, w \rangle} .$$

Let us carry out the computation in the case $d = 1$. Let $f(z) = \sum_{n=0}^\infty a_n z^n$, and let $w \in \mathbb{D} = B_1$. Then

$$\langle f, k_w \rangle = \Big\langle \sum_n a_n z^n , \sum_n \bar{w}^n z^n \Big\rangle = \sum_n a_n w^n = f(w) .$$
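For $d = 1$ all the weights equal $1$, so this computation is easy to check numerically; the following sketch (my illustration; the truncation degree and sample data are arbitrary) verifies $\langle f, k_w \rangle = f(w)$ for a polynomial $f$:

```python
import numpy as np

N = 50                              # truncation degree for the Taylor series
w = 0.3 + 0.2j                      # a point in the unit disc
a = np.zeros(N, dtype=complex)
a[:3] = [1.0, 2.0, 0.5]             # f(z) = 1 + 2z + 0.5 z^2

k_w = np.conj(w) ** np.arange(N)    # Taylor coefficients of k_w(z) = sum conj(w)^n z^n
inner = np.sum(a * np.conj(k_w))    # <f, k_w> in the (unweighted) l^2 inner product
f_at_w = np.polyval(a[::-1], w)     # direct evaluation of f at w
print(np.isclose(inner, f_at_w))    # True
```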

The fact that point evaluation is a bounded linear functional is the starting point of an intimate relationship between the Hilbert space structure and the operator theory of $H^2_d$, on the one hand, and the function theory on the ball $B_d$, on the other hand. It is remarkable that both sides have benefited from this relationship.

I personally find the more interesting (or surprising) side of this story to be that operator theory has applications to complex function theory. I will tell you about my favorite example.

#### 3. Nevanlinna–Pick interpolation

Let $z_1, \ldots, z_n$ be points in the unit disc $\mathbb{D}$, and let $w_1, \ldots, w_n$ be complex numbers. One can always find an analytic function $f$ that interpolates this data, meaning that $f(z_i) = w_i$ for $i = 1, \ldots, n$. This is easy to do with polynomials. However, for some applications such as control theory (and also for the glory of humankind) it is desirable to find an optimal solution to this interpolation problem. For example, one would like to find an analytic function that interpolates the data and has the smallest possible sup norm

$$\|f\|_\infty = \sup_{z \in \mathbb{D}} |f(z)| .$$

It is not hard to see that we will be able to figure out what the minimal norm of an interpolating function is if we know how to solve the following problem. Denote by $H^\infty$ the Banach algebra of bounded analytic functions on the disc, with the sup norm.

**Nevanlinna–Pick interpolation problem:** Given $z_1, \ldots, z_n \in \mathbb{D}$ and $w_1, \ldots, w_n \in \mathbb{C}$, does there exist $f \in H^\infty$, with $\|f\|_\infty \leq 1$, that satisfies $f(z_i) = w_i$ for $i = 1, \ldots, n$?

G. Pick (1916) and R. Nevanlinna (1919) independently solved this problem. They provided the following very satisfying solution.

**Theorem 1:** *The Nevanlinna–Pick problem has a solution if and only if the matrix*

$$\left( \frac{1 - w_i \overline{w_j}}{1 - z_i \overline{z_j}} \right)_{i,j=1}^n$$

*is positive definite.*

This is a very satisfying solution because given the data you can actually form this matrix and check whether or not it is positive definite.
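Indeed, checking the Pick condition is a finite computation; here is a Python sketch (my own illustration; the helper names are mine) that forms the matrix and tests whether its eigenvalues are nonnegative:

```python
import numpy as np

def pick_matrix(z, w):
    """The Pick matrix ((1 - w_i conj(w_j)) / (1 - z_i conj(z_j)))_{i,j}."""
    z = np.asarray(z, dtype=complex)
    w = np.asarray(w, dtype=complex)
    return (1 - np.outer(w, np.conj(w))) / (1 - np.outer(z, np.conj(z)))

def np_solvable(z, w, tol=1e-12):
    """Solvable with norm <= 1 iff the Pick matrix has no negative eigenvalues."""
    return bool(np.all(np.linalg.eigvalsh(pick_matrix(z, w)) >= -tol))

# f(z) = z^2 has sup norm 1 and sends 0.3 -> 0.09, 0.5 -> 0.25, so this is solvable:
print(np_solvable([0.3, 0.5], [0.09, 0.25]))  # True
# No function of norm <= 1 can move 0.1 and 0.2 to values this far apart:
print(np_solvable([0.1, 0.2], [0.0, 0.9]))    # False
```

(The Pick matrix is Hermitian, which is why `eigvalsh` applies.)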

In 1967 D. Sarason introduced a new approach to this problem, which could simultaneously treat Nevanlinna–Pick interpolation problems as well as other interpolation problems of interest. His approach used operator theory on $H^2 = H^2_1$ (the case $d = 1$) in an essential way, and among other things, it gave the following result (first proved by Sz.-Nagy–Korányi, also by operator theoretic techniques).

**Theorem 2:** *Given $z_1, \ldots, z_n \in \mathbb{D}$ and $W_1, \ldots, W_n \in M_k(\mathbb{C})$, there exists a bounded analytic matrix-valued function $F : \mathbb{D} \to M_k(\mathbb{C})$, with $\sup_{z \in \mathbb{D}} \|F(z)\| \leq 1$, that satisfies $F(z_i) = W_i$ for $i = 1, \ldots, n$, if and only if the matrix*

$$\left( \frac{I - W_i W_j^*}{1 - z_i \overline{z_j}} \right)_{i,j=1}^n$$

*is positive definite.*

So what do these beautiful theorems have to do with our space $H^2_d$? It seems as if the problem is in the wrong space: we just introduced the Hilbert space $H^2$, but in the NP problem we are looking for a function in $H^\infty$. It turns out that $H^\infty$ is very closely related to $H^2$: $H^\infty$ is equal to the so-called *multiplier algebra* of $H^2$, that is,

$$H^\infty = \mathrm{Mult}(H^2) := \{ f : \mathbb{D} \to \mathbb{C} \,:\, fh \in H^2 \ \text{for all } h \in H^2 \} .$$

Moreover, the space of bounded analytic $M_k(\mathbb{C})$-valued functions is the multiplier algebra of the space $H^2 \otimes \mathbb{C}^k$ of vector-valued functions. This simple connection allows us to harness all the power of operator theory to the function theoretic NP problem.
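To see a multiplier acting concretely: for $d = 1$, multiplication by the coordinate function $z$ sends $\sum a_n z^n$ to $\sum a_n z^{n+1}$, i.e. it shifts the Taylor coefficients. A finite truncation (my illustration, not from the talk) already exhibits its norm:

```python
import numpy as np

N = 8
S = np.eye(N, k=-1)          # M_z truncated: shifts (a_0, ..., a_{N-1}) down one slot
print(np.linalg.norm(S, 2))  # operator norm of the truncation; equals ||z||_infty = 1
```

This is the unilateral shift, the prototypical example of how multiplier norms match sup norms in $H^2$.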

#### 4. Reproducing kernel Hilbert spaces and complete NP kernels

Our discussion fits in a larger framework.

**Definition 3:** *Let $X$ be a set. A reproducing kernel Hilbert space (RKHS for short, also called a Hilbert function space) is a Hilbert space $H$ that consists of functions $f : X \to \mathbb{C}$, in which point evaluation at any point $x \in X$ is a bounded functional on $H$.*

Since point evaluation is bounded, we have, for any $x \in X$, a unique $k_x \in H$ such that

(**) $\quad f(x) = \langle f, k_x \rangle \ \text{ for all } f \in H .$

So $k_x$ is itself a function on $X$. The functions $k_x$ are called **kernel functions.** Denote $k(x, y) = k_y(x) = \langle k_y, k_x \rangle$. The function $k$ satisfies that for every $n$ and every $x_1, \ldots, x_n \in X$, the matrix

$$\big( k(x_i, x_j) \big)_{i,j=1}^n$$

is positive definite. Such a function is said to be a **positive definite kernel**. It is also referred to as a reproducing kernel, because the kernel functions “reproduce” the functions in $H$ by (**). It is a fact (known as Aronszajn's Theorem) that every positive definite kernel is the reproducing kernel of a RKHS. If $k$ is a positive definite kernel, one sometimes denotes by $H(k)$ the RKHS that has $k$ as its reproducing kernel.
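As a sanity check, one can sample any kernel at finitely many points and test positivity of the resulting matrix; here it is done (my illustration; the points are chosen arbitrarily) for the kernel $k(z, w) = 1/(1 - z\overline{w})$ of $H^2$ on the disc:

```python
import numpy as np

def k(z, w):
    """The reproducing kernel k(z, w) = 1 / (1 - z conj(w)) of H^2 on the disc."""
    return 1.0 / (1.0 - z * np.conj(w))

pts = np.array([0.1, -0.4 + 0.2j, 0.7j, 0.5 - 0.3j])  # points in the unit disc
K = k(pts[:, None], pts[None, :])                     # the matrix (k(x_i, x_j))_{i,j}
print(np.all(np.linalg.eigvalsh(K) >= -1e-12))        # True: the matrix is positive
```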

Every RKHS $H$ has a **multiplier algebra**, defined

$$\mathrm{Mult}(H) = \{ f : X \to \mathbb{C} \,:\, fh \in H \ \text{for all } h \in H \} .$$

The multiplier algebra has a natural norm:

$$\|f\|_{\mathrm{Mult}(H)} = \sup \{ \|fh\| \,:\, h \in H , \ \|h\| \leq 1 \} .$$

The matrix valued NP problem makes sense in any multiplier algebra:

**Matrix valued NP interpolation problem:** Given $x_1, \ldots, x_n \in X$ and $W_1, \ldots, W_n \in M_k(\mathbb{C})$, does there exist $F \in M_k(\mathrm{Mult}(H))$, with $\|F\| \leq 1$, that satisfies $F(x_i) = W_i$ for $i = 1, \ldots, n$?

To clarify, $M_k(\mathrm{Mult}(H))$ can simply be considered as the algebra of $k \times k$ matrices with entries in $\mathrm{Mult}(H)$. This algebra acts naturally on the Hilbert space $H \oplus \cdots \oplus H$ ($k$ times), and the norm is the operator norm.

Many RKHSs are known, and many have been studied. In some of them there is a nice solution to the NP interpolation problem, in some of them there is a solution but it is not nice, and in some of them nobody knows a characterization of when the problem is solvable. The most favorable case is the following one:

**Definition 4:** *A kernel $k$ is said to be a complete Nevanlinna–Pick kernel (or, for short, a complete NP kernel) if for all $n$ and $k$, the matrix valued NP interpolation problem for $x_1, \ldots, x_n \in X$ and $W_1, \ldots, W_n \in M_k(\mathbb{C})$ has a solution in $M_k(\mathrm{Mult}(H))$ of norm less than or equal to $1$ if and only if the matrix*

$$\big( (I - W_i W_j^*)\, k(x_i, x_j) \big)_{i,j=1}^n$$

*is positive definite. In this case, $H(k)$ is said to be a complete NP space. A multiplier algebra of a complete NP space is said to be a complete NP algebra.*

I hope nobody will confuse “complete NP” with “NP complete”.

**Remark:** Sometimes one uses the terminology “complete Pick” instead of “complete Nevanlinna–Pick”.

Theorem 2 can be restated by saying that the kernel $k(z, w) = \frac{1}{1 - z \overline{w}}$ (known as *the Szegő kernel*) is a complete NP kernel. Now, this kernel is the kernel for $H^2 = H^2_1$. It is a fact (proved by Arias–Popescu, Davidson–Pitts, and Agler–McCarthy) that the kernel of $H^2_d$ is a complete NP kernel for all $d$.

**Theorem 5:** *For all $d$, $H^2_d$ is a complete NP space.*

Thus, the NP interpolation problem in $\mathrm{Mult}(H^2_d)$ has a very nice solution.
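Concretely, for $d = 2$ the matrix to test is $\big((1 - w_i\overline{w_j})\,k(z_i, z_j)\big)_{i,j}$ with the kernel $k(z, z') = 1/(1 - \langle z, z' \rangle)$. A numerical sketch (my illustration; the sample points are chosen arbitrarily inside the ball):

```python
import numpy as np

def da_pick_matrix(Z, w):
    """((1 - w_i conj(w_j)) / (1 - <z_i, z_j>))_{i,j} for points z_i in the ball of C^d."""
    Z = np.asarray(Z, dtype=complex)   # shape (n, d): row i is the point z_i
    w = np.asarray(w, dtype=complex)
    G = Z @ Z.conj().T                 # Gram matrix of the inner products <z_i, z_j>
    return (1 - np.outer(w, np.conj(w))) / (1 - G)

# The coordinate function z -> z_1 is a multiplier of norm <= 1, so interpolating
# its own values must pass the positivity test:
Z = np.array([[0.2, 0.1], [0.5, -0.3]])
w = [0.2, 0.5]
print(np.all(np.linalg.eigvalsh(da_pick_matrix(Z, w)) >= -1e-12))  # True
```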

#### 5. Universality of $H^2_d$

Are there any other complete NP spaces besides $H^2_d$? Yes, there are. The Sobolev space $W^{1,2}[0,1]$ as well as the Dirichlet space are complete NP spaces, for example. These spaces look very different from $H^2_d$; especially the Sobolev space, which is not even a space of analytic functions. However, the following remarkable theorem of Agler and McCarthy shows that $H^2_d$ is the **universal** complete NP space.

Let us say that a kernel $k$ on $X$ is **irreducible** if for all $x \neq y$ in $X$, $k_x$ and $k_y$ are linearly independent but not orthogonal.

**Theorem 6:** *Let $k$ be an irreducible kernel on the set $X$, and suppose that $H(k)$ is separable. Then $k$ has the complete NP property if and only if there exist $d \in \{1, 2, \ldots, \infty\}$, an injection $b : X \to B_d$ and a nowhere vanishing function $\delta$ on $X$ such that*

$$k(x, y) = \frac{\delta(x)\, \overline{\delta(y)}}{1 - \langle b(x), b(y) \rangle} .$$

The theorem has the following consequence:

**Corollary 7:** *Let $\mathcal{A}$ be a complete NP multiplier algebra. Then there is a $d$ and an analytic variety $V \subseteq B_d$ such that*

$$\mathcal{A} \cong \mathrm{Mult}(H^2_d)\big|_V .$$

The symbol $\cong$ stands for completely isometrically isomorphic.
