# Eigenvalues and eigenvectors
This lecture covers eigenvalues and eigenvectors; the corresponding problems can be found at the end of the lecture.
In the previous lecture we discussed a number of operator equations, which are equations of the form \hat{A}|\psi\rangle = |\psi'\rangle \, , where the operator \hat{A} maps a state vector into another state vector. A specific class of operator equations, which appears frequently in quantum mechanics, are equations of the form \hat{A}|\psi\rangle = \lambda_{\psi}|\psi\rangle \, , known as eigenvalue equations.
In this lecture we present the main ingredients of these equations and how we can apply them to quantum systems.
## Eigenvalue equations in linear algebra
First of all, let us review eigenvalue equations in linear algebra. Assume that we have a (square) n\times n matrix A. Its eigenvalue equation reads A\vec{v} = \lambda \vec{v} \, , where \vec{v} is called an eigenvector of A and \lambda the corresponding eigenvalue.

The key feature of such equations is that applying the matrix A to one of its eigenvectors \vec{v} returns the same vector, rescaled by the eigenvalue \lambda.

In order to determine the eigenvalues of the matrix A, we need to solve the characteristic equation {\rm det}\left( A - \lambda \mathbb{1}\right) = 0 \, , with \mathbb{1} the n\times n identity matrix. This relation follows from the eigenvalue equation in terms of components: the system \left( A - \lambda \mathbb{1}\right)\vec{v} = 0 admits non-trivial solutions \vec{v} \ne 0 only if the determinant of \left( A - \lambda \mathbb{1}\right) vanishes.

Once we have solved the characteristic equation, we end up with (up to) n eigenvalues \lambda_1, \ldots, \lambda_n. We can then determine the corresponding eigenvector \vec{v}_k for each eigenvalue \lambda_k by solving the linear system \left( A - \lambda_k \mathbb{1}\right)\vec{v}_k = 0.
Let us remind ourselves that in three dimensions the determinant can be evaluated by means of the cofactor expansion along the first row: $$ \left| \begin{array}{ccc} A_{11} & A_{12} & A_{13} \\ A_{21} & A_{22} & A_{23} \\ A_{31} & A_{32} & A_{33} \end{array} \right| = A_{11} \left| \begin{array}{cc} A_{22} & A_{23} \\ A_{32} & A_{33} \end{array} \right| - A_{12} \left| \begin{array}{cc} A_{21} & A_{23} \\ A_{31} & A_{33} \end{array} \right| + A_{13} \left| \begin{array}{cc} A_{21} & A_{22} \\ A_{31} & A_{32} \end{array} \right| $$
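As a quick numerical cross-check of this cofactor expansion, one can compare it against numpy's determinant routine; the matrix entries below are purely illustrative.

```python
import numpy as np

# An arbitrary 3x3 matrix; the entries are purely illustrative
A = np.array([[2.0, 0.0, -1.0],
              [0.0, 3.0, 1.0],
              [1.0, 0.0, 4.0]])

def det2(m):
    """Determinant of a 2x2 matrix."""
    return m[0, 0] * m[1, 1] - m[0, 1] * m[1, 0]

# Cofactor expansion along the first row: + A11 M11 - A12 M12 + A13 M13,
# where M1j is the minor obtained by deleting row 1 and column j
cofactor = (A[0, 0] * det2(A[np.ix_([1, 2], [1, 2])])
            - A[0, 1] * det2(A[np.ix_([1, 2], [0, 2])])
            + A[0, 2] * det2(A[np.ix_([1, 2], [0, 1])]))

assert np.isclose(cofactor, np.linalg.det(A))
```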
Let us illustrate how to compute eigenvalues and eigenvectors by considering a concrete example: first we solve the characteristic equation to obtain the eigenvalues, and next we determine the associated eigenvectors by solving the corresponding linear system for each eigenvalue.
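This two-step procedure can be sketched numerically. A minimal example, assuming an illustrative symmetric 2\times 2 matrix (not the matrix of the lecture's worked example): the eigenvalues follow from the characteristic polynomial \lambda^2 - {\rm tr}(A)\,\lambda + {\rm det}(A) = 0, and each eigenvector satisfies the original eigenvalue equation.

```python
import numpy as np

# Illustrative symmetric 2x2 matrix (not the lecture's worked example)
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Characteristic equation: det(A - lam*1) = lam^2 - tr(A) lam + det(A) = 0
lams = np.roots([1.0, -np.trace(A), np.linalg.det(A)])

# Cross-check against numpy's eigensolver
evals, evecs = np.linalg.eig(A)
assert np.allclose(sorted(lams), sorted(evals))

# Each eigenpair satisfies the eigenvalue equation A v = lam v
for lam, v in zip(evals, evecs.T):
    assert np.allclose(A @ v, lam * v)
```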
## Eigenvalue equations in quantum mechanics
We can now extend the ideas of eigenvalue equations from linear algebra to the case of quantum mechanics. The starting point is the eigenvalue equation for the operator \hat{A}, \hat{A}|\psi\rangle= \lambda_{\psi}|\psi\rangle \, , where the state vector |\psi\rangle is the eigenvector of the equation and \lambda_{\psi} is the corresponding eigenvalue, in general a complex scalar.
In general this equation will have multiple solutions, which for a Hilbert space \mathcal{H} with n dimensions can be labelled as \hat{A}|\psi_k\rangle= \lambda_{\psi_k}|\psi_k\rangle \, , \quad k =1,\ldots, n \, .
In order to determine the eigenvalues and eigenvectors of a given operator \hat{A} we will have to solve the corresponding eigenvalue problem for this operator, which we called above the characteristic equation. This is most efficiently done in the matrix representation of this operator, where the previous operator equation can be expressed in terms of its components as \begin{pmatrix} A_{11} & A_{12} & A_{13} & \ldots \\ A_{21} & A_{22} & A_{23} & \ldots\\A_{31} & A_{32} & A_{33} & \ldots \\\vdots & \vdots & \vdots & \end{pmatrix} \begin{pmatrix} \psi_{k,1}\\\psi_{k,2}\\\psi_{k,3} \\\vdots\end{pmatrix}= \lambda_{\psi_k}\begin{pmatrix} \psi_{k,1}\\\psi_{k,2}\\\psi_{k,3} \\\vdots\end{pmatrix} \, , \quad k=1,\ldots,n \, .
As discussed above, this condition is identical to solving a set of linear equations of the form \begin{pmatrix} A_{11}- \lambda_{\psi_k} & A_{12} & A_{13} & \ldots \\ A_{21} & A_{22}- \lambda_{\psi_k} & A_{23} & \ldots\\A_{31} & A_{32} & A_{33}- \lambda_{\psi_k} & \ldots \\\vdots & \vdots & \vdots & \end{pmatrix} \begin{pmatrix} \psi_{k,1}\\\psi_{k,2}\\\psi_{k,3} \\\vdots\end{pmatrix}=0 \, , \quad k=1,\ldots,n \, . This set of linear equations only has a non-trivial set of solutions provided that the determinant of the matrix vanishes, as follows from the Cramer condition: {\rm det} \begin{pmatrix} A_{11}- \lambda_{\psi} & A_{12} & A_{13} & \ldots \\ A_{21} & A_{22}- \lambda_{\psi} & A_{23} & \ldots\\A_{31} & A_{32} & A_{33}- \lambda_{\psi} & \ldots \\\vdots & \vdots & \vdots & \end{pmatrix}= \left| \begin{array}{cccc}A_{11}- \lambda_{\psi} & A_{12} & A_{13} & \ldots \\ A_{21} & A_{22}- \lambda_{\psi} & A_{23} & \ldots\\A_{31} & A_{32} & A_{33}- \lambda_{\psi} & \ldots \\\vdots & \vdots & \vdots & \end{array} \right| = 0 which in general will have n independent solutions, which we label as \lambda_{\psi_k}.
Once we have determined the n eigenvalues \{ \lambda_{\psi_k} \}, we can insert each of them in the original eigenvalue equation and determine the components of each of the eigenvectors, which we can express as column vectors |\psi_1\rangle = \begin{pmatrix} \psi_{1,1} \\ \psi_{1,2} \\ \psi_{1,3} \\ \vdots \end{pmatrix} \,, \quad |\psi_2\rangle = \begin{pmatrix} \psi_{2,1} \\ \psi_{2,2} \\ \psi_{2,3} \\ \vdots \end{pmatrix} \,, \quad \ldots \, , |\psi_n\rangle = \begin{pmatrix} \psi_{n,1} \\ \psi_{n,2} \\ \psi_{n,3} \\ \vdots \end{pmatrix} \, .
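For a Hermitian operator on a finite-dimensional Hilbert space, this procedure is exactly what a numerical eigensolver carries out. A minimal sketch, assuming an illustrative matrix representation in a 3-dimensional Hilbert space:

```python
import numpy as np

# Illustrative matrix representation A_ij of a Hermitian operator
A = np.array([[1.0, 1j, 0.0],
              [-1j, 2.0, 0.0],
              [0.0, 0.0, 3.0]])

# eigh solves A|psi_k> = lam_k |psi_k> for Hermitian matrices; the k-th
# column of psi holds the components (psi_{k,1}, psi_{k,2}, ...) of |psi_k>
lam, psi = np.linalg.eigh(A)

# Each eigenpair satisfies the original eigenvalue equation
for k in range(len(lam)):
    assert np.allclose(A @ psi[:, k], lam[k] * psi[:, k])
```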
An important property of eigenvalue equations is that if two eigenvectors |\psi_i\rangle and |\psi_j\rangle of a Hermitian operator \hat{A} have different associated eigenvalues, \lambda_{\psi_i} \ne \lambda_{\psi_j}, then these two eigenvectors are orthogonal to each other, that is \langle \psi_j | \psi_i\rangle =0 \, \quad {\rm for} \quad {i \ne j} \, . This property is extremely important, since it suggests that we can use the eigenvectors of an eigenvalue equation as a set of basis elements for this Hilbert space.
Recall from the discussion of eigenvalue equations in linear algebra that the eigenvectors |\psi_i\rangle are defined up to an overall normalisation constant. Clearly, if |\psi_i\rangle is a solution of \hat{A}|\psi_i\rangle = \lambda_{\psi_i}|\psi_i\rangle then c|\psi_i\rangle will also be a solution, with c some constant. In the context of quantum mechanics, we need to choose this overall rescaling constant to ensure that the eigenvectors are normalised, that is, that they satisfy \langle \psi_i | \psi_i\rangle = 1 \, \quad {\rm for~all}~i \, . With such a choice of normalisation, one says that the set of eigenvectors is orthonormal.
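Both statements — that a rescaled eigenvector c|\psi\rangle is still an eigenvector, and that a normalised set of eigenvectors of a Hermitian operator satisfies \langle \psi_i | \psi_j\rangle = \delta_{ij} — can be checked numerically. A sketch with an illustrative real symmetric (hence Hermitian) matrix:

```python
import numpy as np

# Illustrative Hermitian matrix with three distinct eigenvalues
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])
lam, psi = np.linalg.eigh(A)

# <psi_i|psi_j> = delta_ij: eigh returns a normalised, mutually
# orthogonal set of eigenvectors (the columns of psi)
assert np.allclose(psi.conj().T @ psi, np.eye(3))

# If |psi> solves the eigenvalue equation, so does c|psi> for any constant c
c = 3.7
v = c * psi[:, 0]
assert np.allclose(A @ v, lam[0] * v)
```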
The set of all the eigenvalues of an operator is called the eigenvalue spectrum of the operator. Note that different eigenvectors can also have the same eigenvalue; if this is the case, the eigenvalue is said to be degenerate.
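As a simple illustration of a degenerate spectrum, consider the diagonal matrix below (entries chosen for illustration only), where one eigenvalue appears twice and two independent basis vectors share it:

```python
import numpy as np

# Diagonal matrix whose eigenvalue 2 is twofold degenerate
A = np.diag([2.0, 2.0, 5.0])
lam = np.linalg.eigvalsh(A)  # the eigenvalue spectrum, in ascending order

# The spectrum {2, 2, 5} contains the eigenvalue 2 twice; the two basis
# vectors e1 and e2 are independent eigenvectors with that same eigenvalue
assert np.allclose(lam, [2.0, 2.0, 5.0])
e1 = np.eye(3)[:, 0]
assert np.allclose(A @ e1, 2.0 * e1)
```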
## Problems
1) Eigenvalues and Eigenvectors
Find the characteristic polynomial and eigenvalues for each of the following matrices:
A=\begin{pmatrix} 5&3\\2&10 \end{pmatrix}\, \quad B=\begin{pmatrix} 7i&-1\\2&6i \end{pmatrix} \, \quad C=\begin{pmatrix} 2&0&-1\\0&3&1\\1&0&4 \end{pmatrix}
2) The Hamiltonian for a two-state system is given by H=\begin{pmatrix} \omega_1&\omega_2\\ \omega_2&\omega_1\end{pmatrix} \, . A basis for this system is |{0}\rangle=\begin{pmatrix}1\\0 \end{pmatrix}\, ,\quad|{1}\rangle=\begin{pmatrix}0\\1 \end{pmatrix} \, .
Find the eigenvalues and eigenvectors of the Hamiltonian H, and express the eigenvectors in terms of \{|0 \rangle,|1\rangle \}.
3) Find the eigenvalues and eigenvectors of the matrices
A=\begin{pmatrix} -2&-1&-1\\6&3&2\\0&0&1 \end{pmatrix}\, \quad B=\begin{pmatrix} 1&1&2\\2&2&2\\-1&-1&-1 \end{pmatrix} .
4) The Hadamard gate
In one of the problems of the previous section we discussed that an important operator used in quantum computation is the Hadamard gate, which is represented by the matrix: \hat{H}=\frac{1}{\sqrt{2}}\begin{pmatrix}1&1\\1&-1\end{pmatrix} \, . Determine the eigenvalues and eigenvectors of this operator.
5) Show that the Hermitian matrix
\begin{pmatrix} 0&0&i\\0&1&0\\-i&0&0 \end{pmatrix}
has only two distinct (real) eigenvalues, and find an orthonormal set of three eigenvectors.
6) Confirm, by explicit calculation, that the eigenvalues of the real, symmetric matrix
\begin{pmatrix} 2&1&2\\1&2&2\\2&2&1 \end{pmatrix}
are real, and its eigenvectors are orthogonal.