where $\lambda_{\psi}$ is a scalar (in general complex). These are equations where the action of the operator $\hat{A}$
on the state vector $|\psi\rangle$ returns *the same state vector* multiplied by the scalar $\lambda_{\psi}$.
Operator equations of this type are known as *eigenvalue equations*, and they are of great importance for the description of quantum systems.
In this lecture, we present the main ingredients of these equations and show how they can be applied to quantum systems.
## Eigenvalue equations in linear algebra
First of all, let us review eigenvalue equations in linear algebra. Assume that we have a (square) matrix $A$ with dimensions $n\times n$ and that $\vec{v}$ is a column vector in $n$ dimensions. The corresponding eigenvalue equation will be of the form
$$
A \vec{v} = \lambda \vec{v} \, ,
$$
with $\lambda$ being a scalar number (real or complex, depending on the type
of vector space). We can express the previous equation in terms of its components,
assuming as usual some specific choice of basis, by using
the rules of matrix multiplication:
!!! tip "Eigenvalue equation: Eigenvalue and Eigenvector"

    $$
    \sum_{j=1}^n A_{ij} v_j = \lambda v_i \, .
    $$

    The scalar $\lambda$ is known as the *eigenvalue* of the equation, while the vector $\vec{v}$ is known as the associated *eigenvector*.
    The key feature of such equations is that applying the matrix $A$ to the vector $\vec{v}$ returns *the original vector* up to an overall rescaling, $\lambda \vec{v}$.
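
As a concrete illustration, here is a minimal numerical sketch of this property. It uses NumPy, which is an assumption (the notes do not prescribe a library), and the $2\times 2$ matrix is a hypothetical example; the check confirms that each eigenpair satisfies $A\vec{v}=\lambda\vec{v}$.

```python
# Minimal sketch (NumPy assumed): verify A v = lambda v for each eigenpair.
import numpy as np

# Hypothetical 2x2 symmetric matrix, chosen only for illustration
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and the eigenvectors (as columns)
eigenvalues, eigenvectors = np.linalg.eig(A)

for i, lam in enumerate(eigenvalues):
    v = eigenvectors[:, i]
    # A acting on v returns the original vector up to the rescaling lambda
    assert np.allclose(A @ v, lam * v)
    print(f"lambda = {lam:+.3f}, v = {v}")
```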
!!! warning "Number of solutions"

    In general, there will be multiple solutions to the eigenvalue equation $A \vec{v} =\lambda \vec{v}$, each one characterised by a specific eigenvalue and its associated eigenvector. Note that in some cases one has *degenerate solutions*, whereby two or more linearly independent eigenvectors share the same eigenvalue.
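
The degenerate case can also be checked numerically. In the sketch below (NumPy assumed, and the diagonal matrix is a hypothetical example), the eigenvalue $2$ is shared by two linearly independent eigenvectors, and any linear combination of them is again an eigenvector with the same eigenvalue.

```python
# Sketch of a degenerate spectrum (NumPy assumed, matrix hypothetical):
# the eigenvalue 2 is shared by two linearly independent eigenvectors.
import numpy as np

A = np.diag([2.0, 2.0, 5.0])

eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)  # [2. 2. 5.] -- lambda = 2 appears twice

# Any linear combination of the two degenerate eigenvectors is
# again an eigenvector with the same eigenvalue
v = 3.0 * eigenvectors[:, 0] + 4.0 * eigenvectors[:, 1]
assert np.allclose(A @ v, 2.0 * v)
```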
!!! info "Characteristic equation"

    In order to determine the eigenvalues of the matrix $A$, we need to evaluate the solutions of the so-called *characteristic equation*

    $$
    {\rm det}\left( A - \lambda \mathbb{I} \right) = 0 \, ,
    $$

    where $\mathbb{I}$ is the identity matrix of dimensions $n\times n$, and ${\rm det}$ is the determinant.
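
One way to see the characteristic equation at work is to compare the roots of the characteristic polynomial with a direct eigenvalue computation. In this sketch both NumPy and the $2\times 2$ matrix are assumptions made for illustration only.

```python
# Sketch (NumPy assumed, matrix hypothetical): the roots of the
# characteristic polynomial det(A - lambda I) = 0 are the eigenvalues.
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
n = A.shape[0]

# np.poly(A) gives the coefficients of the characteristic polynomial
coeffs = np.poly(A)
roots = np.roots(coeffs)
print(np.sort(roots))                 # [1. 3.]
print(np.sort(np.linalg.eigvals(A)))  # [1. 3.] -- same spectrum

# Direct check: det(A - lambda I) vanishes at each eigenvalue
for lam in roots:
    assert np.isclose(np.linalg.det(A - lam * np.eye(n)), 0.0)
```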
From this we find the condition that $v_{1,1}=v_{1,2}$. This illustrates an important property of eigenvalue equations: the eigenvectors are fixed only up to an *overall normalisation constant*. This should be clear from the definition: if a vector $\vec{v}$ satisfies $A\vec{v}=\lambda\vec{v}$, then the vector $\vec{v}'=c\,\vec{v}$, with $c$ some constant, will also satisfy the same equation.
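
A short sketch of this rescaling freedom follows; NumPy is assumed, and the matrix is a hypothetical example whose eigenvector for $\lambda=3$ indeed has equal components, $v_1=v_2$.

```python
# Sketch (NumPy assumed, matrix hypothetical): if v is an eigenvector,
# then so is c v for any nonzero constant c.
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam = 3.0
v = np.array([1.0, 1.0])     # components satisfy v_1 = v_2
assert np.allclose(A @ v, lam * v)

c = -2.5                     # any nonzero rescaling works
assert np.allclose(A @ (c * v), lam * (c * v))

# A common convention fixes c by requiring unit norm
v_hat = v / np.linalg.norm(v)
assert np.allclose(A @ v_hat, lam * v_hat)
```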
We thus find that the eigenvalue $\lambda_1$ has an associated eigenvector