Addressing #16

Merged Maciej Topyla requested to merge maciejedits into master
@@ -47,7 +47,7 @@ of vector space). We can express the previous equation in terms of its components
assuming as usual some specific choice of basis, by using
the rules of matrix multiplication:
!!! tip "Eigenvalue equation: Eigenvalue and Eigenvector"
!!! info "Eigenvalue equation: Eigenvalue and Eigenvector"
$$
\sum_{j=1}^n A_{ij} v_j = \lambda v_i \, .
$$
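A minimal numerical illustration of this component form (added here as a sketch, assuming NumPy is available; the matrix `A` below is an arbitrary example, not one from the notes):

```python
import numpy as np

# Arbitrary 2x2 example matrix, chosen only for illustration.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# np.linalg.eig returns the eigenvalues and the eigenvectors (columns of V).
eigenvalues, V = np.linalg.eig(A)

# Check the component form: sum_j A_ij v_j = lambda * v_i for every eigenpair.
for k, lam in enumerate(eigenvalues):
    v = V[:, k]
    assert np.allclose(A @ v, lam * v)
    print(f"lambda = {lam:.4f}: A @ v equals lambda * v")
```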
@@ -57,7 +57,7 @@ the rules of matrix multiplication:
!!! warning "Number of solutions"
In general, there will be multiple solutions to the eigenvalue equation $A \vec{v} =\lambda \vec{v}$, each one characterised by a specific eigenvalue and eigenvector. Note that in some cases one has *degenerate solutions*, whereby two or more distinct eigenvectors of a given matrix share the same eigenvalue.
!!! info "Characteristic equation:"
!!! tip "Characteristic equation:"
In order to determine the eigenvalues of the matrix $A$, we need to evaluate the solutions of the so-called *characteristic equation*
of the matrix $A$, defined as
$$
@@ -78,7 +78,7 @@ $$
\sum_{j=1}^n\left( A_{ij} - \lambda \delta_{ij}\right) v_j =0 \, , \qquad i=1,2,\ldots,n\, ,
$$
which only admit non-trivial solutions if the determinant of the matrix $A-\lambda\mathbb{I}$ vanishes
(the so-called Cramer condition), thus leading to the characteristic equation.
(the so-called Cramer's condition), thus leading to the characteristic equation.
Once we have solved the characteristic equation, we end up with $n$ eigenvalues $\lambda_k$, $k=1,\ldots,n$.
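As an added sketch of this procedure (assuming SymPy is available; the example matrix is arbitrary and not taken from the notes), the characteristic equation can be built and solved symbolically:

```python
import sympy as sp

lam = sp.symbols('lambda')

# Arbitrary 2x2 example matrix, not taken from the notes.
A = sp.Matrix([[4, 1],
               [2, 3]])

# Characteristic equation: det(A - lambda * I) = 0.
char_poly = (A - lam * sp.eye(2)).det()
print(sp.expand(char_poly))                  # lambda**2 - 7*lambda + 10

# Its roots are the n eigenvalues lambda_k.
print(sp.solve(sp.Eq(char_poly, 0), lam))    # [2, 5]
```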
@@ -183,7 +183,7 @@ $$
!!! info "Cramer's rule"
This set of linear equations only has a non-trivial set of solutions provided that
the determinant of the matrix vanishes, as follows from the Cramer condition:
the determinant of the matrix vanishes, as follows from Cramer's condition:
$$
{\rm det} \begin{pmatrix} A_{11}- \lambda_{\psi} & A_{12} & A_{13} & \ldots \\ A_{21} & A_{22}- \lambda_{\psi} & A_{23} & \ldots\\A_{31} & A_{32} & A_{33}- \lambda_{\psi} & \ldots \\\vdots & \vdots & \vdots & \end{pmatrix}=
\left| \begin{array}{cccc}A_{11}- \lambda_{\psi} & A_{12} & A_{13} & \ldots \\ A_{21} & A_{22}- \lambda_{\psi} & A_{23} & \ldots\\A_{31} & A_{32} & A_{33}- \lambda_{\psi} & \ldots \\\vdots & \vdots & \vdots & \end{array} \right| = 0
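A short added check of this condition (assuming NumPy; the Hermitian example matrix is arbitrary): the determinant of $A-\lambda\mathbb{I}$ should vanish, up to numerical precision, at every eigenvalue.

```python
import numpy as np

# Arbitrary Hermitian example matrix standing in for the operator.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

eigenvalues = np.linalg.eigvalsh(A)   # eigenvalues of a Hermitian matrix

# The determinant of (A - lambda * I) vanishes at each eigenvalue.
for lam in eigenvalues:
    d = np.linalg.det(A - lam * np.eye(A.shape[0]))
    print(f"lambda = {lam:+.4f}   det(A - lambda*I) = {d:.2e}")
```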
@@ -198,69 +198,66 @@ $$
|\psi_2\rangle = \begin{pmatrix} \psi_{2,1} \\ \psi_{2,2} \\ \psi_{2,3} \\ \vdots \end{pmatrix} \,, \quad \ldots \, , |\psi_n\rangle = \begin{pmatrix} \psi_{n,1} \\ \psi_{n,2} \\ \psi_{n,3} \\ \vdots \end{pmatrix} \, .
$$
!!! tip "Orthogonality of eigenvectors"
An important property of eigenvalue equations is that if you have two eigenvectors
$ |\psi_i\rangle$ and $ |\psi_j\rangle$ associated with *different* eigenvalues,
$\lambda_{\psi_i} \ne \lambda_{\psi_j} $, then these two eigenvectors are orthogonal to each
other, that is
$$
\langle \psi_j | \psi_i\rangle =0 \, \quad {\rm for} \quad {i \ne j} \, .
$$
This property is extremely important, since it suggests that we could use the eigenvectors
of an eigenvalue equation as a *set of basis elements* for this Hilbert space.
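The orthogonality property can be illustrated numerically. The sketch below (an addition, assuming NumPy; the Hermitian matrix is an arbitrary stand-in for $\hat{A}$) checks that $\langle \psi_j | \psi_i\rangle \approx 0$ whenever the eigenvalues differ:

```python
import numpy as np

# Arbitrary Hermitian matrix playing the role of the operator A.
A = np.array([[1.0, 1.0j, 0.0],
              [-1.0j, 2.0, 0.0],
              [0.0, 0.0, 3.0]])

eigenvalues, V = np.linalg.eigh(A)    # columns of V are the eigenvectors

# <psi_j | psi_i> should vanish whenever the eigenvalues differ.
n = len(eigenvalues)
for i in range(n):
    for j in range(i + 1, n):
        if not np.isclose(eigenvalues[i], eigenvalues[j]):
            overlap = np.vdot(V[:, j], V[:, i])   # <psi_j | psi_i>
            print(f"|<psi_{j}|psi_{i}>| = {abs(overlap):.2e}")
```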
Recall from the discussions of eigenvalue equations in linear algebra that
the eigenvectors $|\psi_i\rangle$ are defined *up to an overall normalisation constant*. Clearly, if $|\psi_i\rangle$ is a solution of $\hat{A}|\psi_i\rangle = \lambda_{\psi_i}|\psi_i\rangle$
then $c|\psi_i\rangle$ will also be a solution, with $c$ some constant. In the context of quantum mechanics, we need to choose this overall rescaling constant
to ensure that the eigenvectors are normalised, that is, that they satisfy
then $c|\psi_i\rangle$ will also be a solution, with $c$ being a constant. In the context of quantum mechanics, we need to choose this overall rescaling constant to ensure that the eigenvectors are normalised, so that they satisfy
$$
\langle \psi_i | \psi_i\rangle = 1 \, \quad {\rm for~all}~i \, .
$$
With such a choice of normalisation, one says that the set of eigenvectors
With such a choice of normalisation, one says that the eigenvectors in a set
are *orthonormal*.
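As a brief added check (assuming NumPy, with an arbitrary example matrix): the eigenvectors returned by `numpy.linalg.eigh` already carry this normalisation, so the matrix $V$ whose columns are the eigenvectors satisfies $V^\dagger V = \mathbb{I}$:

```python
import numpy as np

# Arbitrary real symmetric (hence Hermitian) example matrix.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

_, V = np.linalg.eigh(A)              # eigenvectors as columns, already normalised

print(np.linalg.norm(V, axis=0))                 # each column has unit norm: [1. 1.]
print(np.allclose(V.conj().T @ V, np.eye(2)))    # orthonormal set: True
```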
The set of all the eigenvalues of an operator is called *eigenvalue spectrum* of the operator. Note that different eigenvectors can also have the same eigenvalue. If this is the case the eigenvalue is said to be *degenerate*.
!!! tip "Eigenvalue spectrum and degeneracy"
The set of all eigenvalues of an operator is called the *eigenvalue spectrum* of the operator. Note that different eigenvectors can also have the same eigenvalue. If this is the case, the eigenvalue is said to be *degenerate*.
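To illustrate a degenerate spectrum (an added example; the matrix is hypothetical and chosen only for this purpose):

```python
import numpy as np

# Hypothetical example with a repeated (degenerate) eigenvalue.
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 2.0]])

eigenvalues, V = np.linalg.eigh(A)
print(eigenvalues)          # [1. 1. 2.] -> the eigenvalue 1 is degenerate

# Two different (orthogonal) eigenvectors share the eigenvalue 1.
print(V[:, 0], V[:, 1])
```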
***
##6.3. Problems

1. *Eigenvalues and eigenvectors I*
Find the characteristic polynomial and eigenvalues for each of the following matrices,
$$A=\begin{pmatrix} 5&3\\2&10 \end{pmatrix}\, \quad
B=\begin{pmatrix} 7i&-1\\2&6i \end{pmatrix} \, \quad C=\begin{pmatrix} 2&0&-1\\0&3&1\\1&0&4 \end{pmatrix}$$

2. *Hamiltonian*
The Hamiltonian for a two-state system is given by
$$H=\begin{pmatrix} \omega_1&\omega_2\\ \omega_2&\omega_1\end{pmatrix}$$
A basis for this system is
$$|{0}\rangle=\begin{pmatrix}1\\0 \end{pmatrix}\, ,\quad|{1}\rangle=\begin{pmatrix}0\\1 \end{pmatrix}$$
Find the eigenvalues and eigenvectors of the Hamiltonian $H$, and express the eigenvectors in terms of $\{|0 \rangle,|1\rangle \}$.

3. *Eigenvalues and eigenvectors II*
Find the eigenvalues and eigenvectors of the matrices
$$A=\begin{pmatrix} -2&-1&-1\\6&3&2\\0&0&1 \end{pmatrix}\, \quad B=\begin{pmatrix} 1&1&2\\2&2&2\\-1&-1&-1 \end{pmatrix} \, .$$

4. *The Hadamard gate*
In one of the problems of the previous section we discussed that an important operator used in quantum computation is the *Hadamard gate*, which is represented by the matrix:
$$\hat{H}=\frac{1}{\sqrt{2}}\begin{pmatrix}1&1\\1&-1\end{pmatrix} \, .$$
Determine the eigenvalues and eigenvectors of this operator.

5. *Hermitian matrix*
Show that the Hermitian matrix
$$\begin{pmatrix} 0&0&i\\0&1&0\\-i&0&0 \end{pmatrix}$$
has only two real eigenvalues and find an orthonormal set of three eigenvectors.

6. *Orthogonality of eigenvectors*
Confirm, by explicit calculation, that the eigenvalues of the real, symmetric matrix
$$\begin{pmatrix} 2&1&2\\1&2&2\\2&2&1 \end{pmatrix}$$
are real, and its eigenvectors are orthogonal.
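For readers who want to check their answers numerically, here is a minimal added sketch (not part of the original notes; the helper name `check_eigensystem` is ours). It assumes NumPy is available and uses the Hadamard gate of Problem 4 as the demonstration input, but any of the matrices above can be passed in instead.

```python
import numpy as np

def check_eigensystem(A):
    """Print the eigenvalues and eigenvectors of A and verify A v = lambda v."""
    A = np.asarray(A, dtype=complex)
    eigenvalues, V = np.linalg.eig(A)
    for k, lam in enumerate(eigenvalues):
        v = V[:, k]
        assert np.allclose(A @ v, lam * v)
        print(f"lambda = {np.round(lam, 4)}   eigenvector = {np.round(v, 4)}")

# Hadamard gate from Problem 4; any matrix above can be passed in instead.
H = (1 / np.sqrt(2)) * np.array([[1, 1],
                                 [1, -1]])
check_eigensystem(H)
```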