
# Spectra of Hermitian and Unitary Operators

The following is central to the workings of quantum mechanics.

## Eigenvalues and eigenvectors for Hermitian operators

Consider a Hermitian operator $A = A^{\dagger}$.

1. Theorem. The eigenvalues of $A$ are real.

Proof. Consider a normalized eigenvector $\ket{a}$ ($\|a\| = 1$) such that $A \ket{a} = a \ket{a}$. This means that $\bra{a} A \ket{a} = a$. Taking the complex conjugate and using the definition of the adjoint,

$$a^* = \bra{a} A \ket{a}^* = \bra{a} A^{\dagger} \ket{a} = \bra{a} A \ket{a} = a$$
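As a quick numerical illustration (not part of the proof), one can check with NumPy that a randomly generated Hermitian matrix has real eigenvalues; the matrix size and seed below are arbitrary choices:

```python
import numpy as np

# Build a random Hermitian matrix A = M + M^dagger (arbitrary illustration).
rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = M + M.conj().T

# np.linalg.eig assumes nothing about Hermiticity, yet the imaginary
# parts of the eigenvalues it returns vanish to machine precision.
eigvals = np.linalg.eig(A)[0]
print(np.max(np.abs(eigvals.imag)))
```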
1. Theorem. Eigenvectors of $A = A^{\dagger}$ with distinct eigenvalues are orthogonal.

Proof. Consider eigenvectors $\ket{a}, \ket{b}$ such that

$$A\ket{a} = a \ket{a}\ ; \ \ A \ket{b} = b \ket{b}$$

Then

$$\bra{b} A \ket{a} = \brket{b}{a}\, a = b \brket{b}{a}$$

For $a \neq b$ this is only possible if $\brket{b}{a} = 0$.
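A numerical spot check of this orthogonality (assuming, as is generically true for a random Hermitian matrix, that the eigenvalues are all distinct):

```python
import numpy as np

# A random Hermitian matrix; generically its eigenvalues are all distinct.
rng = np.random.default_rng(1)
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = M + M.conj().T

# np.linalg.eig does not orthogonalize its output, so the near-identity
# Gram matrix below reflects a property of A, not of the algorithm.
vals, vecs = np.linalg.eig(A)
gram = vecs.conj().T @ vecs  # entries <a_i|a_j>
print(np.max(np.abs(gram - np.eye(4))))
```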

1. Theorem. The eigenvectors of $A = A^{\dagger}$ can be constructed to form an orthonormal basis $\ket{a_i}$ such that $A\ket{a_i} = a_i \ket{a_i}$.

I will only sketch a proof. The characteristic polynomial of $A$ factors over $\CC$ as

$$\det(A - a {\bf 1}) = c \prod_{i = 1}^d (a - a_i)$$

There are always $d$ complex roots of this equation. If $A = A^{\dagger}$, all eigenvalues are real, so all $a_i$ are real; there are still $d$ roots. Here the $a_i$ are the eigenvalues. If $a_i \neq a_j$ for $i \neq j$, we are done, as $\brket{a_i}{a_j} = \delta_{ij}$ and there are $d$ such eigenvectors. If some eigenvalues are degenerate, one can still construct an orthonormal basis by applying the Gram–Schmidt procedure within each eigenspace.

1. Theorem. For any Hermitian operator $A$, there exists a unitary matrix $U$ and a diagonal matrix $\Lambda$ such that $A = U^{\dagger} \Lambda U$, and the entries of $\Lambda$ are the eigenvalues, such that an eigenvalue $a_k$ with degree of degeneracy $d_k$ appears in $d_k$ entries.

Proof (sketch). Given an orthonormal basis $\ket{i}$, $A_{ij} = \bra{i} A \ket{j}$. On the other hand, there is an orthonormal basis of eigenvectors $\ket{a_k,\alpha_k}$, where $A \ket{a_k,\alpha_k} = a_k \ket{a_k,\alpha_k}$, $\alpha_k = 1,\ldots,d_k$, and $\brket{a_k,\alpha_k}{a_{\ell},\beta_{\ell}} = \delta_{k\ell} \delta_{\alpha_k,\beta_{\ell}}$. Let us relabel $(a_k,\alpha_k) \to {\tilde a}$. Then $\bra{\tilde a} A \ket{\tilde b} = \Lambda_{{\tilde a}{\tilde b}}$, where $\Lambda$ is diagonal with the entries equal to the eigenvalues in the way described. Now using ${\bf 1} = \sum_{{\tilde a}} \ket{{\tilde a}}\bra{{\tilde a}}$, we can write

$$A_{ij} = \sum_{{\tilde a},{\tilde b}} \brket{i}{\tilde a}\bra{\tilde a} A \ket{\tilde b} \brket{\tilde b}{j} = U^{\dagger}_{i{\tilde a}} \Lambda_{{\tilde a}{\tilde b}} U_{{\tilde b} j}$$

where $U_{{\tilde a}i} = \brket{{\tilde a}}{i}$ and $U^{\dagger}_{j{\tilde a}} = \brket{j}{{\tilde a}}$. You can convince yourselves that these are Hermitian conjugates of each other. We have already shown in Changes of basis that these are inverses of each other, so $U$ is a unitary matrix and the statement holds.

For those familiar with the singular value decomposition of matrices, this is of course a special case.
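In NumPy this decomposition is exactly what `np.linalg.eigh` computes, up to convention: it returns $A = V \Lambda V^{\dagger}$ with the eigenvectors as the columns of $V$, so $U = V^{\dagger}$ in the notation above. A sketch with an arbitrary random Hermitian matrix:

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.standard_normal((5, 5)) + 1j * rng.standard_normal((5, 5))
A = M + M.conj().T  # arbitrary Hermitian matrix for the check

# eigh returns the eigenvalues and a matrix V whose columns are orthonormal
# eigenvectors; U = V^dagger matches the convention U_{a,i} = <a|i> above.
vals, V = np.linalg.eigh(A)
U = V.conj().T
Lam = np.diag(vals)

print(np.max(np.abs(A - U.conj().T @ Lam @ U)))  # reconstruction error
```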

The next theorem will be central to the uncertainty principle.

1. Theorem. Two Hermitian operators $A,B$ are diagonal in the same orthonormal basis if and only if $[A,B] = 0$.

Proof (partial). For the “only if” part, if $A,B$ are diagonal in the same basis, we use the fact that we can write $A = U^{\dagger} \Lambda_A U$, $B = U^{\dagger} \Lambda_B U$, following the construction above. $A,B$ are diagonalized by the same unitary matrix because this matrix implements the change of basis, and by supposition the basis of eigenvectors is the same. Then, since diagonal matrices commute,

$$AB = U^{\dagger} \Lambda_A U U^{\dagger} \Lambda_B U = U^{\dagger} \Lambda_A \Lambda_B U = U^{\dagger} \Lambda_B \Lambda_A U = U^{\dagger}\Lambda_B U U^{\dagger}\Lambda_A U = BA$$

For the “if” part, let $A,B$ commute. We take the case that the eigenvalues of $A,B$ are nondegenerate (for the general case see, for example, Bellac, 2006). Then if $A \ket{a_i} = a_i \ket{a_i}$, $A(B \ket{a_i}) = BA \ket{a_i} = a_i B \ket{a_i}$. That is, $B\ket{a_i}$ is an eigenvector of $A$ with eigenvalue $a_i$. But by supposition the subspace of eigenvectors with a fixed eigenvalue is one-dimensional, so $B\ket{a_i} = b_i \ket{a_i}$ for some $b_i$, which is clearly an eigenvalue of $B$. Thus the $\ket{a_i}$ form a common basis of eigenvectors, in which both operators are diagonal.
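A numerical sketch of both directions at once: build two Hermitian matrices that are diagonal in the same (randomly chosen) basis, confirm that they commute, and confirm that the eigenvectors of one diagonalize the other. The specific diagonals and seed are arbitrary choices.

```python
import numpy as np

# Two Hermitian matrices built to be diagonal in the same basis: a random
# unitary V (from a QR decomposition) conjugating two real diagonals.
rng = np.random.default_rng(3)
V, _ = np.linalg.qr(rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4)))
A = V.conj().T @ np.diag([1.0, 2.0, 3.0, 4.0]) @ V
B = V.conj().T @ np.diag([5.0, 6.0, 7.0, 8.0]) @ V

# They commute,
comm = np.max(np.abs(A @ B - B @ A))
print(comm)

# and the eigenvectors of A (nondegenerate eigenvalues) also diagonalize B.
_, W = np.linalg.eigh(A)
B_in_A_basis = W.conj().T @ B @ W
off = np.max(np.abs(B_in_A_basis - np.diag(np.diag(B_in_A_basis))))
print(off)
```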

## Unitary operators

Theorem. For any unitary operator $U$ we can write $U = V^{\dagger}\Lambda V$, where $V$ is a unitary matrix, and $\Lambda$ is diagonal with entries of the form $e^{i\lambda}$, $\lambda \in \CR$.
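A quick numerical check that the eigenvalues of a unitary matrix have unit modulus, i.e. are of the form $e^{i\lambda}$ (the random construction below is an arbitrary way to sample a unitary):

```python
import numpy as np

# A random unitary from the QR decomposition of a complex Gaussian matrix.
rng = np.random.default_rng(4)
U, _ = np.linalg.qr(rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4)))

# Every eigenvalue has the form e^{i*lambda}, i.e. unit modulus.
vals = np.linalg.eigvals(U)
print(np.max(np.abs(np.abs(vals) - 1.0)))
```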

## Example: Operators on the space of functions

For functions with periodic boundary conditions, the functions $\psi_n(x) = \frac{1}{\sqrt{L}} e^{2\pi i n x/L}$, $n \in \CZ$, are a basis (since Fourier modes are a basis for periodic functions). In this case, ${\hat p} \psi_n = \frac{2\pi \hbar n}{L} \psi_n$; this basis is a basis of eigenfunctions of ${\hat p}$. Note that ${\hat x}:\psi(x) \to x\psi(x)$ is not an operator in this space, since $x\psi(x)$ is not periodic.
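The eigenvalue equation ${\hat p}\,\psi_n = \frac{2\pi\hbar n}{L}\psi_n$ can be verified numerically on a periodic grid, taking the derivative spectrally via the FFT (exact for a single Fourier mode). Here $\hbar = 1$ is a unit choice made for the check, and $L$, $N$, $n$ are arbitrary:

```python
import numpy as np

# Check p_hat psi_n = (2 pi hbar n / L) psi_n with hbar = 1 (unit choice).
L, N, n = 2.0, 64, 3
x = np.arange(N) * L / N
psi = np.exp(2j * np.pi * n * x / L) / np.sqrt(L)

# Spectral derivative via the FFT: exact for a single Fourier mode.
k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)  # grid wavenumbers
dpsi = np.fft.ifft(1j * k * np.fft.fft(psi))
p_psi = dpsi / 1j  # p_hat = (hbar/i) d/dx with hbar = 1

err = np.max(np.abs(p_psi - (2 * np.pi * n / L) * psi))
print(err)
```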

For $L^2(\CR)$, ${\hat x}$ is almost such an operator, if we focus on $\psi(x)$ falling off fast enough (faster than $\frac{1}{|x|^{3/2}}$). The fact that it fails in some cases means that ${\hat x}$ is not a bounded operator. One can also show the same for the momentum operator ${\hat p} = \frac{\hbar}{i} \frac{\del}{\del x}$. Nonetheless, these operators are useful and we will use them. I will table a discussion of them for a week or two, when we get to wave mechanics.

## Projection operators and spectral representations

### Projection operators

1. Definition. Let $V$ be a vector space over $\CC$ and $W_{1,2} \subset V$ be vector subspaces of $V$. $W_1$, $W_2$ are orthogonal (denoted $W_1 \perp W_2$) if $\forall \ket{w_1} \in W_1, \ket{w_2} \in W_2$, $\brket{w_1}{w_2} = 0$.

2. Theorem. For every vector subspace $W \subset V$, there exists a subspace $W^{\perp} \subset V$ such that $\forall \ket{v} \in V$ there exists a unique expression

$$\ket{v} = \ket{w} + \ket{w^{\perp}}$$

with $\ket{w} \in W$, $\ket{w^{\perp}} \in W^{\perp}$. This is known as an orthogonal decomposition of $\ket{v}$. I will forgo a proof here.

Note that we can form orthogonal/orthonormal bases $\ket{i}, \ket{I}$ for $W, W^{\perp}$ respectively; by the above theorem, since any vector can be written as a sum of vectors in the subspaces $W, W^{\perp}$, the collection of basis elements $\ket{i},\ket{I}$ is a basis for $V$. Thus if $d = \dim V$, $d_W = \dim(W)$, $d_{W^{\perp}} = \dim(W^{\perp})$, we have

$$d = d_W + d_{W^{\perp}}$$
1. Definition. A projection operator $\CP$ is a Hermitian operator such that $\CP^2 = \CP$. The reasons for calling this a projection operator will become clear as we explore its properties. First, the eigenvalues: if $\CP \ket{a} = a \ket{a}$, then

$$\CP^2 \ket{a} = \CP a \ket{a} \Rightarrow \CP \ket{a} = a^2 \ket{a} \Rightarrow a^2 = a$$

The only solutions are $a = 1$ or $a = 0$. Thus, in an orthonormal basis of eigenvectors, $\CP$ takes the block form

$$\CP = \begin{pmatrix} {\bf 1}_{d_P\times d_P} & {\bf 0}_{d_P \times d_{P^{\perp}}} \\ {\bf 0}_{d_{P^{\perp}}\times d_P} & {\bf 0}_{d_{P^{\perp}} \times d_{P^{\perp}}} \end{pmatrix}$$

where $\dim(\text{range}(\CP)) = d_P$ and $\dim(\text{Ker}(\CP)) = d_{P^{\perp}}$. If the eigenvectors of $\CP$ with unit eigenvalue are $\ket{i}$, $i \in \{1, \ldots , d_P\}$, then we can also write

$$\CP = \sum_{i = 1}^{d_P} \ket{i}\bra{i}$$

In the latter case, we can show this is Hermitian by first realizing that

$$\left(\ket{\chi}\bra{\psi}\right)^{\dagger} = \ket{\psi}\bra{\chi}$$

(can you show this?), and then showing that

$$\CP^2 = \sum_i \ket{i}\bra{i} \sum_j \ket{j}\bra{j} = \sum_{i,j} \ket{i}\brket{i}{j}\bra{j} = \sum_{i,j} \ket{i}\delta_{ij} \bra{j} = \CP$$
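These properties are easy to verify numerically: build $\CP = \sum_i \ket{i}\bra{i}$ from a few orthonormal vectors and check Hermiticity, idempotence, and that the trace counts the dimension of the range. The construction of the orthonormal vectors below is an arbitrary choice.

```python
import numpy as np

# Two orthonormal vectors |1>, |2> in C^4: the first two columns of a
# random unitary (an arbitrary construction for this check).
rng = np.random.default_rng(5)
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4)))
kets = Q[:, :2]

# P = sum_i |i><i|
P = kets @ kets.conj().T

herm = np.max(np.abs(P - P.conj().T))  # Hermitian
idem = np.max(np.abs(P @ P - P))       # idempotent: P^2 = P
print(herm, idem)
print(np.trace(P).real)                # trace = d_P, the dimension of the range
```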

### Spectral representation of Hermitian operators

Consider a vector space $V$ and a Hermitian operator $A: V\to V$ with a basis of eigenstates $\ket{a_i,\alpha_i}$, where $A\ket{a_i,\alpha_i} = a_i \ket{a_i,\alpha_i}$, $\alpha_i \in \{1,\ldots,d_i\}$, and $\brket{a_i,\alpha_i}{a_j,\alpha_j} = \delta_{ij}\delta_{\alpha_i,\alpha_j}$. Here, if $d_i > 1$ then $a_i$ is a degenerate eigenvalue and $d_i$ is the degree of degeneracy. We can write a projection operator

$$\CP_i = \sum_{\alpha_i = 1}^{d_i} \ket{a_i,\alpha_i}\bra{a_i,\alpha_i}$$

which projects onto the subspace $V_i \subset V$ of vectors with eigenvalue $a_i$. We can then write the operator $A$ as

$$A = \sum_i a_i \CP_i$$

The right-hand side has the same eigenvalues and eigenvectors as $A$, by construction; thus it has the same matrix elements in a basis of eigenvectors of $A$, and so must be the same operator as $A$.

Note that the fact that $\ket{a_i,\alpha_i}$ are a complete orthonormal basis for $V$ means that

$${\bf 1} = \sum_{i,\alpha_i} \ket{a_i,\alpha_i}\bra{a_i,\alpha_i} = \sum_i \CP_i$$
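A numerical sketch of the spectral representation and completeness, for a Hermitian matrix with a deliberately degenerate eigenvalue (the diagonal entries and seed below are arbitrary choices):

```python
import numpy as np

# A Hermitian matrix with a degenerate eigenvalue: diag(1, 1, 2) conjugated
# by a random unitary, so a_1 = 1 has degree of degeneracy d_1 = 2.
rng = np.random.default_rng(6)
V, _ = np.linalg.qr(rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3)))
A = V.conj().T @ np.diag([1.0, 1.0, 2.0]) @ V

# eigh sorts the eigenvalues in ascending order: (1, 1, 2).
vals, W = np.linalg.eigh(A)
P1 = W[:, :2] @ W[:, :2].conj().T  # projector onto the a_1 = 1 eigenspace
P2 = W[:, 2:] @ W[:, 2:].conj().T  # projector onto the a_2 = 2 eigenspace

# Spectral representation A = sum_i a_i P_i, and completeness sum_i P_i = 1.
spec_err = np.max(np.abs(A - (1.0 * P1 + 2.0 * P2)))
comp_err = np.max(np.abs(P1 + P2 - np.eye(3)))
print(spec_err, comp_err)
```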
## References
1. Bellac, M. L. (2006). *Quantum Physics*. Cambridge University Press. https://books.google.com/books?id=uSQ-jwEACAAJ