Commit aa9ac63c authored by Maciej Topyla

Update src/6_eigenvectors_QM.md
---
title: Eigenvalues and eigenvectors
---

# 6. Eigenvalues and eigenvectors
The lecture on eigenvalues and eigenvectors consists of the following parts:
- [6.1. Eigenvalue equations in linear algebra](#61-eigenvalue-equations-in-linear-algebra)
- [6.2. Eigenvalue equations in quantum mechanics](#62-eigenvalue-equations-in-quantum-mechanics)

and at the end of the lecture notes, there is a set of corresponding exercises:

- [6.3. Problems](#63-problems)

The contents of this lecture are summarised in the following **video**:

- [Eigenvalues and eigenvectors](https://www.dropbox.com/s/n6hb5cu2iy8i8x4/linear_algebra_09.mov?dl=0)

In the previous lecture, we discussed a number of *operator equations*, which have the form
$$
\hat{A}|\psi\rangle=|\varphi\rangle \, ,
$$
where $|\psi\rangle$ and $|\varphi\rangle$ are state vectors
belonging to the Hilbert space of the system $\mathcal{H}$.
!!! info "Eigenvalue equation"
    A specific class of operator equations, which appear frequently in quantum mechanics, consists of equations of the form
    $$
    \hat{A}|\psi\rangle= \lambda_{\psi}|\psi\rangle \, ,
    $$
    where $\lambda_{\psi}$ is a scalar (in general complex). These are equations where the action of the operator $\hat{A}$ on the state vector $|\psi\rangle$ returns *the same state vector* multiplied by the scalar $\lambda_{\psi}$. This type of operator equation is known as an *eigenvalue equation*; such equations are of great importance for the description of quantum systems.

In this lecture, we present the main ingredients of these equations and how we can apply them to quantum systems.
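As a concrete numerical illustration of such an eigenvalue equation (a sketch added here, not part of the original notes), one can diagonalise the Pauli matrix $\sigma_z$, the matrix representation of the spin-$z$ operator, whose eigenvalues are $\pm 1$:

```python
import numpy as np

# Pauli matrix sigma_z: the matrix representation of the spin-z operator
sigma_z = np.array([[1.0, 0.0],
                    [0.0, -1.0]])

# Solve the eigenvalue equation sigma_z |psi> = lambda |psi> numerically
eigenvalues, eigenvectors = np.linalg.eig(sigma_z)

print(sorted(eigenvalues.real))  # the two eigenvalues, -1 and +1
```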
## 6.1. Eigenvalue equations in linear algebra
First of all, let us review eigenvalue equations in linear algebra. Assume that we have a square matrix $A$ of dimensions $n\times n$ and that $\vec{v}$ is a column vector in $n$ dimensions. The corresponding eigenvalue equation is of the form
$$
A \vec{v} =\lambda \vec{v} \, ,
$$
with $\lambda$ being a scalar number (real or complex, depending on the type
of vector space). We can express the previous equation in terms of its components,
assuming as usual some specific choice of basis, by using
the rules of matrix multiplication:

!!! tip "Eigenvalue and eigenvector"
    $$
    \sum_{j=1}^n A_{ij} v_j = \lambda v_i \, .
    $$
    The scalar $\lambda$ is known as the *eigenvalue* of the equation, while the vector $\vec{v}$ is known as the associated *eigenvector*. The key feature of such equations is that applying the matrix $A$ to the vector $\vec{v}$ returns *the original vector* up to an overall rescaling, $\lambda \vec{v}$.

!!! warning "Number of solutions"
    In general, there will be multiple solutions to the eigenvalue equation $A \vec{v} =\lambda \vec{v}$, each characterised by its own eigenvalue and eigenvector. Note that in some cases one has *degenerate solutions*, whereby two or more eigenvectors share the same eigenvalue.

!!! info "Characteristic equation"
    In order to determine the eigenvalues of the matrix $A$, we need to find the solutions of the so-called *characteristic equation* of the matrix $A$, defined as
    $$
    {\rm det}\left( A-\lambda \mathbb{I} \right)=0 \, ,
    $$
    where $\mathbb{I}$ is the identity matrix of dimensions $n\times n$, and ${\rm det}$ is the determinant.
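As an aside, the characteristic-equation recipe can be sketched numerically; the following snippet (an illustration with an arbitrary symmetric matrix, not taken from the lecture) builds the coefficients of ${\rm det}\left(A - \lambda\mathbb{I}\right)$ and finds its roots:

```python
import numpy as np

# An arbitrary 2x2 matrix used purely as an illustration
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Coefficients of the characteristic polynomial det(A - lambda*I),
# here lambda^2 - 4*lambda + 3
coeffs = np.poly(A)

# The eigenvalues are the roots of the characteristic polynomial
eigenvalues = np.roots(coeffs)
```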
This relation follows from the eigenvalue equation written in terms of components. The determinant of a matrix in a vector space of $n=3$ dimensions is given, in terms of the previous expression, by
$$
{\rm det}\left( A \right) = \left| \begin{array}{ccc} A_{11} & A_{12} & A_{13} \\ A_{21} & A_{22} & A_{23} \\ A_{31} & A_{32} & A_{33} \end{array} \right| =
A_{11} \left| \begin{array}{cc} A_{22} & A_{23} \\ A_{32} & A_{33} \end{array} \right|
- A_{12} \left| \begin{array}{cc} A_{21} & A_{23} \\ A_{31} & A_{33} \end{array} \right|
+ A_{13} \left| \begin{array}{cc} A_{21} & A_{22} \\ A_{31} & A_{32} \end{array} \right| \, .
$$
!!! check "Example"
    Let us illustrate how to compute eigenvalues and eigenvectors by considering an $n=2$ vector space. Consider the following matrix
    $$
    A = \left( \begin{array}{cc} 1 & 2 \\ -1 & 4 \end{array} \right) \, ,
    $$
    whose characteristic equation is
    $$
    {\rm det}\left( A-\lambda \mathbb{I} \right) = \left| \begin{array}{cc} 1-\lambda & 2 \\ -1 & 4-\lambda \end{array} \right| = (1-\lambda)(4-\lambda)+2 = \lambda^2 -5\lambda + 6=0 \, .
    $$
    This is a quadratic equation which we know how to solve exactly, and we find that the two eigenvalues are $\lambda_1=3$ and $\lambda_2=2$.

    Next, we can determine the associated eigenvectors $\vec{v}_1$ and $\vec{v}_2$. For the first one, the equation that needs to be solved is
    $$
    \left( \begin{array}{cc} 1 & 2 \\ -1 & 4 \end{array} \right)
    \left( \begin{array}{c} v_{1,1} \\ v_{1,2} \end{array} \right)=\lambda_1
    \left( \begin{array}{c} v_{1,1} \\ v_{1,2} \end{array} \right) = 3 \left( \begin{array}{c} v_{1,1} \\ v_{1,2} \end{array} \right) \, ,
    $$
    from which we find the condition $v_{1,1}=v_{1,2}$. An important property of eigenvalue equations is that the eigenvectors are only fixed up to an *overall normalisation constant*. This should be clear from the definition: if a vector $\vec{v}$ satisfies $A\vec{v}=\lambda\vec{v}$, then the vector $\vec{v}'=c\,\vec{v}$, with $c$ some constant, will also satisfy the same equation. We thus find that the eigenvalue $\lambda_1$ has the associated eigenvector
    $$
    \vec{v}_1 = \left( \begin{array}{c} 1 \\ 1 \end{array} \right) \, ,
    $$
    and indeed one can check that
    $$
    A\vec{v}_1 = \left( \begin{array}{cc} 1 & 2 \\ -1 & 4 \end{array} \right)
    \left( \begin{array}{c} 1 \\ 1 \end{array} \right) = \left( \begin{array}{c} 3 \\ 3 \end{array} \right)=
    3 \vec{v}_1 \, ,
    $$
    as we wanted to demonstrate.

!!! note "Exercise"
    As an exercise, try to obtain the eigenvector corresponding to the second eigenvalue $\lambda_2=2$.
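The worked example above can be verified with a few lines of NumPy (a numerical cross-check added here, not part of the original notes):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [-1.0, 4.0]])

# Eigenvector found above for lambda_1 = 3
v1 = np.array([1.0, 1.0])

# Applying A returns the same vector rescaled by the eigenvalue
print(A @ v1)  # [3. 3.]

# np.linalg.eig recovers both eigenvalues, 2 and 3
eigenvalues = np.linalg.eig(A)[0]
```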
## 6.2. Eigenvalue equations in quantum mechanics