# Hermitian matrices

### Definitions and basic theorems

From now on, unless stated otherwise, $M$ is a complex linear operator, i.e. a square matrix (of size $N$, say),

$$M = [m_{ij}], \text{ where } 1 \le i,j \le N,\: m_{ij} \in \mathbb{C}$$

#### Transpose of a square matrix

The transpose of $M$, written $M^T$, is defined by $${m^T}_{ij} = m_{ji}$$

For example, $$\begin{bmatrix}a & b \\ c & d\end{bmatrix}^T = \begin{bmatrix} a & c \\ b & d \end{bmatrix}$$

#### Symmetric square matrix

A square matrix, $M$, is symmetric if it is equal to its transpose. That is, $$M = M^T$$ or in terms of components, $$m_{ij} = {m^T}_{ij} = m_{ji}$$

For example, $$\begin{bmatrix}a & b \\ b & d \end{bmatrix}$$

#### Hermitian conjugate of a square matrix

The Hermitian conjugate of a square matrix, $M$, written $M^\dagger$, is defined by $$m_{ij}^\dagger = m_{ji}^*$$

For example, $$\begin{bmatrix}a & b \\ c & d \end{bmatrix}^\dagger = \begin{bmatrix}a^* & c^* \\ b^* & d^*\end{bmatrix}$$
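As a quick numerical sketch of this definition in pure Python (using the built-in `complex` type; the function name `dagger` and the sample matrix are illustrative, not part of the notes' formalism):

```python
# Hermitian conjugate (dagger): transpose, then complex-conjugate each
# entry, so that (M†)[i][j] = conjugate(M[j][i]).

def dagger(M):
    N = len(M)
    return [[M[j][i].conjugate() for j in range(N)] for i in range(N)]

M = [[1 + 2j, 3 - 1j],
     [0 + 4j, 5 + 0j]]

print(dagger(M))
```

Applying `dagger` twice returns the original matrix, since transposition and conjugation are each their own inverse.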

#### Hermitian matrix

We say that a square matrix, $M$, is Hermitian if it is equal to its Hermitian conjugate. That is, $$M = M^\dagger$$ or in terms of components, $$m_{ij}=m_{ij}^\dagger=m_{ji}^*$$

So, transposed elements of a Hermitian matrix are complex conjugates of each other. Furthermore, the diagonal elements are complex conjugates of themselves, hence are real, $$m_{ii} = m_{ii}^* \: \Rightarrow \: m_{ii} \in \mathbb{R}$$

Also notice that the trace of a Hermitian matrix is real, $$\mathrm{Tr}(M) = \sum_{i = 1}^N m_{ii} \: \in \mathbb{R}$$
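A quick numerical check of these two observations (pure Python; `H` is a made-up $2 \times 2$ Hermitian matrix):

```python
# A made-up 2x2 Hermitian matrix: off-diagonal entries are conjugates of
# each other, and the diagonal entries (hence the trace) are real.

H = [[2 + 0j, 1 - 1j],
     [1 + 1j, 3 + 0j]]

def dagger(M):
    N = len(M)
    return [[M[j][i].conjugate() for j in range(N)] for i in range(N)]

assert dagger(H) == H                       # H equals its Hermitian conjugate
trace = sum(H[i][i] for i in range(len(H)))
assert trace.imag == 0                      # the trace is real
print(trace)
```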

We shall see that the determinant of a Hermitian matrix is also real (once we have considered eigenvalues below).

### Hermitian matrices as operators on vectors

Nb. From now on, unless stated otherwise, we use the Einstein summation convention.

Recall, ket vectors are represented by columns and their corresponding conjugates, bra vectors, as rows (of conjugates), \begin{align*} \ket{a} &= \begin{bmatrix} a_1 \\ \vdots \\ a_N \end{bmatrix}\\ \bra{a} &= \begin{bmatrix} a_1^*, \cdots, a_N^* \end{bmatrix} \end{align*}

Also, in general, inner products are complex, $$\braket{b}{a} = b_i^* a_i \in \mathbb{C}$$ (using the Einstein summation convention).

We can think of a square matrix, $M$, as a linear operator on the space of ket vectors, $$M: \mathbb{V} \to \mathbb{V}$$

Thus, for any ket vector, $\ket{a}$, $$M \ket{a} = \ket{c} \in \mathbb{V}$$ is also a ket vector. In component form, $$c_i = M_{ij} a_j \in \mathbb{C}$$
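The component formula $c_i = M_{ij} a_j$ can be sketched directly as an explicit sum (pure Python; the matrix and ket here are made-up examples):

```python
# c_i = M_ij a_j : a square matrix acting on a ket (column) vector,
# with the Einstein sum over j written out explicitly.

def apply(M, a):
    N = len(M)
    return [sum(M[i][j] * a[j] for j in range(N)) for i in range(N)]

M = [[0j, 1 + 0j],
     [1 + 0j, 0j]]       # this particular M swaps the two components
a = [1 + 0j, 2j]
c = apply(M, a)
print(c)
```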

#### Matrix elements

We can form an inner product of a bra vector, $\bra{b}$ with this new ket vector, $$\bra{b} M \ket{a} = b_i^* M_{ij} a_j \in \mathbb{C}$$ (using the Einstein summation convention on both indices). This quantity is called the ${b-a}^\mathrm{th}$ matrix element of $M$.
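A sketch of the double sum $b_i^* M_{ij} a_j$ in pure Python (the function name `matrix_element` and the sample vectors are illustrative; choosing basis vectors for $\bra{b}$ and $\ket{a}$ picks out a single component of $M$):

```python
# The b-a'th matrix element: <b|M|a> = b_i* M_ij a_j, summed over both
# indices.

def matrix_element(b, M, a):
    N = len(M)
    return sum(b[i].conjugate() * M[i][j] * a[j]
               for i in range(N) for j in range(N))

M = [[2 + 0j, 1 - 1j],
     [1 + 1j, 3 + 0j]]
b = [1 + 0j, 0j]   # first basis bra
a = [0j, 1 + 0j]   # second basis ket
print(matrix_element(b, M, a))   # with basis vectors this picks out m_12
```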

#### Average value

The ${a-a}^\mathrm{th}$ matrix element of $M$ is called the average value (or expectation value) of $M$ in the state $\ket{a}$. We will show that, for Hermitian $M$, this average is real, $$\bra{a} M \ket{a} \in \mathbb{R}$$

To show why this is true, first note that, for any square matrix, $M$, we have $$\bra{b} M \ket{a} = \bra{a} M^\dagger \ket{b}^*$$ where $M^\dagger$ is the Hermitian conjugate of $M$, since \begin{align*}\bra{b} M \ket{a} &= b_i^* m_{ij} a_j\\&= \left(b_i m_{ij}^* a_j^* \right)^*\\&= \left(a_j^* m_{ij}^* b_i \right)^*\\&= \left(a_i^* m_{ji}^* b_j \right)^* & (\text{swapping dummy indices})\\&= \left(a_i^* m_{ij}^\dagger b_j \right)^*\\&= \bra{a} M^\dagger \ket{b}^*\end{align*}

For Hermitian matrices, we have $M = M^\dagger$, so we can write $$\bra{b} M \ket{a} = \bra{a} M \ket{b}^*$$ which says that the ${a-b}^\mathrm{th}$ element is the complex conjugate of the ${b-a}^\mathrm{th}$ element.
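A numerical check of this conjugate symmetry, $\bra{b} M \ket{a} = \bra{a} M \ket{b}^*$, for a Hermitian matrix (pure Python; `H`, `a`, `b` are made-up examples):

```python
# For Hermitian H, the b-a'th element equals the conjugate of the
# a-b'th element.

def element(u, M, v):
    N = len(M)
    return sum(u[i].conjugate() * M[i][j] * v[j]
               for i in range(N) for j in range(N))

H = [[2 + 0j, 1 - 1j],
     [1 + 1j, 3 + 0j]]   # Hermitian
a = [1 + 0j, 2j]
b = [1 - 1j, 3 + 0j]

assert element(b, H, a) == element(a, H, b).conjugate()
```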

Thus, for the average value of $M$ in state $\ket{a}$, we find that $$\bra{a} M \ket{a} = \bra{a} M \ket{a}^* \in \mathbb{R}$$
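A quick check that the average value comes out real (pure Python; `H` and `a` are made-up examples, with `a` deliberately complex):

```python
# The average value <a|M|a> of a Hermitian M is real for any ket |a>.

H = [[2 + 0j, 1 - 1j],
     [1 + 1j, 3 + 0j]]   # Hermitian
a = [1 + 2j, 3 - 1j]

avg = sum(a[i].conjugate() * H[i][j] * a[j]
          for i in range(2) for j in range(2))
assert avg.imag == 0     # the imaginary parts of the cross terms cancel
print(avg)
```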

### Eigenvalues and eigenvectors

#### Square Matrices

Given any square matrix, $M$, over $\mathbb{C}$, if there exists $\lambda \in \mathbb{C}$, and a ket vector, $\ket{a}$, such that $$M \ket{a}=\lambda \ket{a}$$ then

- $\lambda$ is called an eigenvalue of $M$
- $\ket{a}$ is called an eigenvector of $M$ associated with $\lambda$.

#### Eigenvalues of Hermitian matrices are real

Suppose $\lambda$ is an eigenvalue of a Hermitian matrix, $M$, and let $\ket{a}$ be an eigenvector associated with $\lambda$.

By definition, $$\bra{a} M \ket{a} = \bra{a} \lambda \ket{a} = \lambda\braket{a}{a}$$

Now, both of these quantities are real, \begin{align*}\braket{a}{a} &= a_i^* a_i \in \mathbb{R}\\\bra{a} M \ket{a} &\in \mathbb{R} & (\text{as shown above, since } M \text{ is Hermitian})\end{align*}

By definition, an eigenvector is non-zero (the zero vector satisfies $M \ket{a} = \lambda \ket{a}$ trivially, for every $\lambda$, so it is excluded), hence $$\braket{a}{a} \neq 0$$

Hence, $$\lambda = \frac{\bra{a} M \ket{a}}{\braket{a}{a}} \in \mathbb{R}$$
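For a $2 \times 2$ matrix this can be checked concretely: the characteristic equation (introduced below) is the quadratic $\lambda^2 - \mathrm{Tr}(M)\,\lambda + \mathrm{Det}(M) = 0$, and for a Hermitian matrix both roots come out real. A pure-Python sketch with a made-up Hermitian `H`:

```python
import cmath

# Eigenvalues of a 2x2 matrix from the quadratic formula applied to
#     lam^2 - Tr(H)*lam + Det(H) = 0.
# For Hermitian H the discriminant is non-negative, so both roots are real.

H = [[2 + 0j, 1 - 1j],
     [1 + 1j, 3 + 0j]]

tr = H[0][0] + H[1][1]
det = H[0][0] * H[1][1] - H[0][1] * H[1][0]
root = cmath.sqrt(tr * tr - 4 * det)
lam1 = (tr + root) / 2
lam2 = (tr - root) / 2
print(lam1, lam2)
```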

#### Eigenvectors associated with distinct eigenvalues are orthogonal

Suppose a Hermitian matrix has two distinct eigenvalues, $\lambda_a \neq \lambda_b$, with associated eigenvectors $\ket{a}$ and $\ket{b}$. We claim that these eigenvectors are orthogonal, meaning $$\braket{b}{a} = 0$$

To see why, note that, by definition, $$\bra{b} M \ket{a} = \bra{b} \lambda_a \ket{a} = \lambda_a \braket{b}{a}$$

Also, we can write the same quantity in terms of the other eigenvalue, \begin{align*}\bra{b} M \ket{a} &= \bra{a} M \ket{b}^*\\&= \bra{a} \lambda_b \ket{b}^*\\&= \left(\lambda_b \braket{a}{b}\right)^*\\&= \lambda_b^* \braket{a}{b}^*\\&= \lambda_b \braket{b}{a} & (\text{since } \lambda_b \in \mathbb{R})\end{align*}

Hence, we can write $$\lambda_a \braket{b}{a} = \bra{b} M \ket{a} = \lambda_b \braket{b}{a}$$ or $$\left( \lambda_b - \lambda_a \right) \braket{b}{a} = 0$$

Since we chose the eigenvalues to be distinct, we must have $$\braket{b}{a} = 0$$
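This orthogonality can be verified numerically for a made-up $2 \times 2$ Hermitian matrix whose eigenvalues ($4$ and $1$) are distinct (pure-Python sketch; the eigenvectors were found by hand from $(M - \lambda \mathbf{I})\ket{a} = 0$):

```python
# Eigenvectors of a Hermitian matrix for distinct eigenvalues are
# orthogonal: <b|a> = 0.

H = [[2 + 0j, 1 - 1j],
     [1 + 1j, 3 + 0j]]
a = [1 + 0j, 1 + 1j]    # eigenvector for lam = 4
b = [-1 + 1j, 1 + 0j]   # eigenvector for lam = 1

def apply(M, v):
    N = len(M)
    return [sum(M[i][j] * v[j] for j in range(N)) for i in range(N)]

assert apply(H, a) == [4 * x for x in a]   # H|a> = 4|a>
assert apply(H, b) == [1 * x for x in b]   # H|b> = 1|b>

inner = sum(b[i].conjugate() * a[i] for i in range(2))
assert inner == 0                          # <b|a> = 0
```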

#### Characteristic equation

Suppose a square matrix, $M$, satisfies $$M \ket{a} = \lambda \ket{a}$$ for some $\lambda$ and $\ket{a}$. Then, we can rewrite this as $$(M - \lambda \mathbf{I}) \ket{a} = 0$$ where $\mathbf{I}$ is the identity matrix.

We require $\ket{a} \neq 0$. If $(M - \lambda \mathbf{I})$ were invertible, the only solution would be $\ket{a} = 0$; so $(M - \lambda \mathbf{I})$ must be singular, which means its determinant must vanish. That is, $$\mathrm{Det}(M - \lambda \mathbf{I}) = 0$$

This identity is called the characteristic equation of $M$; its roots are precisely the eigenvalues of $M$.
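As a final numerical check, we can evaluate $\mathrm{Det}(M - \lambda \mathbf{I})$ for a made-up $2 \times 2$ Hermitian matrix (eigenvalues $4$ and $1$) and confirm that the determinant vanishes exactly at the eigenvalues (pure-Python sketch):

```python
# Det(H - lam*I) is zero at the eigenvalues of H, and non-zero elsewhere.

H = [[2 + 0j, 1 - 1j],
     [1 + 1j, 3 + 0j]]

def char_det(lam):
    # determinant of the 2x2 matrix H - lam*I
    return (H[0][0] - lam) * (H[1][1] - lam) - H[0][1] * H[1][0]

assert char_det(4) == 0   # lam = 4 is an eigenvalue
assert char_det(1) == 0   # lam = 1 is an eigenvalue
assert char_det(2) != 0   # lam = 2 is not
```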