Operator theory

From Wikipedia, the free encyclopedia

In mathematics, operator theory is the study of linear operators on function spaces, beginning with differential operators and integral operators. The operators may be presented abstractly by their characteristics, such as bounded linear operators or closed operators, and consideration may be given to nonlinear operators. The study, which depends heavily on the topology of function spaces, is a branch of functional analysis.

If a collection of operators forms an algebra over a field, then it is an operator algebra. The description of operator algebras is part of operator theory.

Single operator theory

Single operator theory deals with the properties and classification of operators, considered one at a time. For example, the classification of normal operators in terms of their spectra falls into this category.

Spectrum of operators

The spectral theorem is any of a number of results about linear operators or about matrices.[1] In broad terms the spectral theorem provides conditions under which an operator or a matrix can be diagonalized (that is, represented as a diagonal matrix in some basis). This concept of diagonalization is relatively straightforward for operators on finite-dimensional spaces, but requires some modification for operators on infinite-dimensional spaces. In general, the spectral theorem identifies a class of linear operators that can be modelled by multiplication operators, which are as simple as one can hope to find. In more abstract language, the spectral theorem is a statement about commutative C*-algebras. See also spectral theory for a historical perspective.

Examples of operators to which the spectral theorem applies are self-adjoint operators or more generally normal operators on Hilbert spaces.

The spectral theorem also provides a canonical decomposition, called the spectral decomposition, eigenvalue decomposition, or eigendecomposition, of the underlying vector space on which the operator acts.
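As a concrete finite-dimensional illustration (a sketch using NumPy, not part of the article's formal development; the matrix is an arbitrary choice), the spectral decomposition of a Hermitian matrix can be computed and verified numerically:

```python
import numpy as np

# A Hermitian (hence normal) matrix: the spectral theorem guarantees an
# orthonormal eigenbasis with real eigenvalues.
A = np.array([[2.0, 1.0 - 1.0j],
              [1.0 + 1.0j, 3.0]])

# eigh is specialized to Hermitian matrices: w is real, and the columns
# of V are orthonormal eigenvectors.
w, V = np.linalg.eigh(A)

# Spectral decomposition: A = V diag(w) V*
reconstructed = V @ np.diag(w) @ V.conj().T
assert np.allclose(reconstructed, A)
assert np.allclose(V.conj().T @ V, np.eye(2))  # eigenvectors are orthonormal
```

The diagonal matrix diag(w) is the multiplication-operator model of A promised by the spectral theorem, expressed in the eigenbasis given by the columns of V.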

Normal operators

A normal operator on a complex Hilbert space H is a continuous linear operator N : H → H that commutes with its Hermitian adjoint N*, that is: NN* = N*N.[2]

Normal operators are important because the spectral theorem holds for them. Today, the class of normal operators is well understood. Examples of normal operators are

  • unitary operators: N* = N⁻¹
  • Hermitian operators (i.e., self-adjoint operators): N* = N
  • skew-Hermitian operators: N* = −N
  • positive operators: N = MM* for some operator M

The spectral theorem extends to a more general class of matrices. Let A be an operator on a finite-dimensional inner product space. A is said to be normal if A* A = A A*. One can show that A is normal if and only if it is unitarily diagonalizable: By the Schur decomposition, we have A = U T U*, where U is unitary and T upper triangular. Since A is normal, T T* = T* T. Therefore, T must be diagonal since normal upper triangular matrices are diagonal. The converse is obvious.

In other words, A is normal if and only if there exists a unitary matrix U such that A = UDU*, where D is a diagonal matrix. The entries of the diagonal of D are then the eigenvalues of A. The column vectors of U are the eigenvectors of A and they are orthonormal. Unlike the Hermitian case, the entries of D need not be real.
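This unitary diagonalization can be checked numerically for a normal but non-Hermitian matrix; the rotation matrix and angle below are illustrative choices, not part of the article:

```python
import numpy as np

# A rotation matrix is unitary, hence normal, but not Hermitian; its
# eigenvalues e^{±iθ} are not real.
theta = 0.7  # arbitrary angle
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
assert np.allclose(A.conj().T @ A, A @ A.conj().T)  # A is normal

# For a normal matrix with distinct eigenvalues, the normalized
# eigenvectors returned by eig are automatically orthogonal, so U is unitary.
w, U = np.linalg.eig(A)
assert np.allclose(U.conj().T @ U, np.eye(2))       # U is unitary
assert np.allclose(U @ np.diag(w) @ U.conj().T, A)  # A = U D U*
assert not np.allclose(w.imag, 0)                   # entries of D need not be real
```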

Polar decomposition

The polar decomposition of any bounded linear operator A between complex Hilbert spaces is a canonical factorization as the product of a partial isometry and a non-negative operator.[3]

The polar decomposition for matrices generalizes as follows: if A is a bounded linear operator then there is a unique factorization of A as a product A = UP where U is a partial isometry, P is a non-negative self-adjoint operator and the initial space of U is the closure of the range of P.

The operator U must be weakened to a partial isometry, rather than unitary, because of the following issue. If A is the one-sided shift on l^2(N), then |A| = (A*A)^(1/2) = I. So if A = U|A|, then U must be A, which is not unitary.

The existence of a polar decomposition is a consequence of Douglas' lemma:

Lemma — If A, B are bounded operators on a Hilbert space H, and A*A ≤ B*B, then there exists a contraction C such that A = CB. Furthermore, C is unique if Ker(B*) ⊂ Ker(C).

The operator C can be defined by C(Bh) = Ah, extended by continuity to the closure of Ran(B), and by zero on the orthogonal complement of Ran(B). The operator C is well-defined since A*A ≤ B*B implies Ker(B) ⊂ Ker(A). The lemma then follows.
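A finite-dimensional sketch of Douglas' lemma (the matrices below are illustrative assumptions): when B is invertible, the contraction is simply C = AB⁻¹, and the hypothesis A*A ≤ B*B forces ||C|| ≤ 1:

```python
import numpy as np

# Finite-dimensional sketch of Douglas' lemma. If B is invertible, the
# contraction C with A = C B is just A B^{-1}; the choices below force
# A*A <= B*B.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
A = A / np.linalg.norm(A, 2)   # rescale so the operator norm ||A|| = 1
B = 2.0 * np.eye(3)            # then A*A <= I <= 4I = B*B

# B*B - A*A is positive semidefinite, i.e. A*A <= B*B.
assert np.all(np.linalg.eigvalsh(B.T @ B - A.T @ A) >= -1e-12)

C = A @ np.linalg.inv(B)
assert np.allclose(C @ B, A)              # A = C B
assert np.linalg.norm(C, 2) <= 1 + 1e-12  # ||C|| <= 1: C is a contraction
```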

In particular, if A*A = B*B, then C is a partial isometry, which is unique if Ker(B*) ⊂ Ker(C). In general, for any bounded operator A one can take B = (A*A)^(1/2), the unique positive square root of A*A given by the usual functional calculus; then A*A = B*B, so by the lemma we have A = U(A*A)^(1/2) for some partial isometry U, which is unique if Ker(A) ⊂ Ker(U). (Note Ker(A) = Ker(A*A) = Ker(B) = Ker(B*), where B = B* = (A*A)^(1/2).) Take P to be (A*A)^(1/2) and one obtains the polar decomposition A = UP. Notice that an analogous argument can be used to show A = P′U′, where P′ is positive and U′ a partial isometry.

When H is finite dimensional, U can be extended to a unitary operator; this is not true in general (see example above). Alternatively, the polar decomposition can be shown using the operator version of singular value decomposition.
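The SVD route to the polar decomposition can be sketched in NumPy for a small matrix (an illustrative example, not the general operator-theoretic argument):

```python
import numpy as np

# Polar decomposition via the singular value decomposition: if A = W S V*,
# then A = U P with U = W V* unitary and P = V S V* = (A*A)^(1/2) >= 0.
A = np.array([[1.0, 2.0],
              [0.0, 3.0]])

W, s, Vh = np.linalg.svd(A)
U = W @ Vh                          # unitary polar part
P = Vh.conj().T @ np.diag(s) @ Vh   # positive part, equals |A|

assert np.allclose(U @ P, A)                   # A = U P
assert np.allclose(U.conj().T @ U, np.eye(2))  # U is unitary
assert np.all(np.linalg.eigvalsh(P) >= 0)      # P is positive semidefinite
assert np.allclose(P @ P, A.conj().T @ A)      # P = (A*A)^(1/2)
```

Since this A is invertible, the polar part comes out unitary, consistent with the finite-dimensional statement above; for a non-invertible A the same construction yields a partial isometry.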

By property of the continuous functional calculus, |A| is in the C*-algebra generated by A. A similar but weaker statement holds for the partial isometry: the polar part U is in the von Neumann algebra generated by A. If A is invertible, U will be in the C*-algebra generated by A as well.

Connection with complex analysis

Many operators that are studied are operators on Hilbert spaces of holomorphic functions, and the study of the operator is intimately linked to questions in function theory. For example, Beurling's theorem describes the invariant subspaces of the unilateral shift in terms of inner functions, which are bounded holomorphic functions on the unit disk with unimodular boundary values almost everywhere on the circle. Beurling interpreted the unilateral shift as multiplication by the independent variable on the Hardy space.[4] The success in studying multiplication operators, and more generally Toeplitz operators (which are multiplication, followed by projection onto the Hardy space) has inspired the study of similar questions on other spaces, such as the Bergman space.

Operator algebras

The theory of operator algebras brings algebras of operators such as C*-algebras to the fore.

C*-algebras

A C*-algebra, A, is a Banach algebra over the field of complex numbers, together with a map * : A → A. One writes x* for the image of an element x of A. The map * has the following properties:[5]

  • It is an involution: for every x in A, (x*)* = x.
  • For all x, y in A: (x + y)* = x* + y* and (xy)* = y*x*.
  • For every λ in C and every x in A: (λx)* = λ̄x*, where λ̄ denotes the complex conjugate of λ.
  • For all x in A: ||x*x|| = ||x|| ||x*||.

Remark. The first three identities say that A is a *-algebra. The last identity is called the C*-identity and is equivalent to ||x*x|| = ||x||².

The C*-identity is a very strong requirement. For instance, together with the spectral radius formula, it implies that the C*-norm is uniquely determined by the algebraic structure: ||x||² = ||x*x|| = sup{|λ| : λ ∈ C, x*x − λ1 is not invertible}.
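The C*-identity can be checked numerically in the C*-algebra of n × n complex matrices with the operator norm (an illustrative sketch; the random matrix is an arbitrary choice):

```python
import numpy as np

# The n x n complex matrices with the operator (spectral) norm form a
# C*-algebra; the C*-identity ||x* x|| = ||x||^2 holds for any matrix x.
rng = np.random.default_rng(1)
x = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))

op_norm = lambda m: np.linalg.norm(m, 2)  # largest singular value
assert np.isclose(op_norm(x.conj().T @ x), op_norm(x) ** 2)
# Equivalent form of the axiom: ||x* x|| = ||x|| ||x*||
assert np.isclose(op_norm(x.conj().T @ x), op_norm(x) * op_norm(x.conj().T))
```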

References

  1. ^ Sunder, V. S. (1997), Functional Analysis: Spectral Theory, Birkhäuser Verlag
  2. ^ Hoffman, Kenneth; Kunze, Ray (1971), Linear algebra (2nd ed.), Englewood Cliffs, N.J.: Prentice-Hall, Inc., p. 312, MR 0276251
  3. ^ Conway, John B. (2000), A Course in Operator Theory, Graduate Studies in Mathematics, American Mathematical Society, ISBN 0821820656
  4. ^ Nikolski, Nikolai (1986), A treatise on the shift operator, Springer-Verlag, ISBN 0-387-90176-0. A sophisticated treatment of the connections between Operator theory and Function theory in the Hardy space.
  5. ^ Arveson, William (1976), An Invitation to C*-Algebra, Springer-Verlag, ISBN 0-387-90176-0. An excellent introduction to the subject, accessible for those with a knowledge of basic functional analysis.
