Review of projections (YouTube)

Vector spaces

We are interested in finite-dimensional complex vector spaces (ket vectors), along with their corresponding conjugate spaces (bra vectors).

As noted before, the linear span of a set of vectors, $\left\{\ket{x_i}: i=1,\cdots,N\right\}$, is the set of all their linear combinations, $$\mathbf{LS}\left\{\ket{x_1},\cdots,\ket{x_N}\right\} = \left\{\sum_{i=1}^N \alpha_i \ket{x_i}: \alpha_i \in \mathbb{C}\right\}$$

We say that a set of vectors, $\left\{\ket{x_i}: i=1,\cdots,N\right\}$ are linearly dependent if there exist numbers, $\alpha_i \in \mathbb{C}$, such that not all of them are zero and $$\sum_{i=1}^N \alpha_i \ket{x_i} = 0$$

A set of vectors, $\left\{\ket{x_i}: i=1,\cdots,N\right\}$, are linearly independent if they are not linearly dependent. That is, for any numbers $\alpha_i \in \mathbb{C}$ $$\sum_{i=1}^N \alpha_i \ket{x_i} = 0 \:\Rightarrow\: \alpha_i = 0,\: \forall i=1,\cdots,N$$
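As a quick numerical check, linear independence can be tested by stacking the vectors as the columns of a matrix and computing its rank. A minimal NumPy sketch, with illustrative vectors of my own choosing:

```python
import numpy as np

# Stack candidate vectors as columns: the set is linearly independent
# exactly when the matrix has full column rank.
x1 = np.array([1.0, 0.0, 0.0])
x2 = np.array([1.0, 1.0, 0.0])
x3 = np.array([2.0, 1.0, 0.0])   # x3 = x1 + x2, so the full set is dependent

M = np.column_stack([x1, x2, x3])
print(np.linalg.matrix_rank(M))          # 2 < 3: linearly dependent
print(np.linalg.matrix_rank(M[:, :2]))   # 2: {x1, x2} alone are independent
```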

For example, consider two vectors in normal space, $\hat{x}, \hat{w} = \hat{x} + \hat{y}$ (where $\hat{x}, \hat{y}$ are unit vectors in the $x,y$-directions). Suppose there exist numbers $\alpha, \beta$ such that $$\alpha\hat{x} + \beta\hat{w}= 0$$

Then, taking inner products (dot products in normal space), $$\hat{y} \cdot \left(\alpha\hat{x} + \beta\hat{w}\right) = \left(\alpha + \beta\right)\hat{y}\cdot \hat{x} + \beta\hat{y}\cdot\hat{y} = \beta$$ which implies that $\beta = 0$. Also, $$\hat{x}\cdot\left(\alpha\hat{x} + \beta\hat{w}\right) = \left(\alpha + \beta\right)\hat{x}\cdot\hat{x} + \beta\hat{x}\cdot\hat{y} = \alpha + \beta = \alpha$$ which, using $\beta = 0$ from above, implies that $\alpha = 0$. Hence, the vectors $\hat{x}$ and $\hat{w} = \hat{x} + \hat{y}$ are linearly independent.

The dimension of a vector space is the maximum number of linearly independent vectors that can be selected from that space.

A set of vectors, $\left\{\ket{x_i}: i=1,\cdots,N\right\}$, is said to be orthonormal if $\forall i,j = 1,\cdots,N$, $$\braket{x_i}{x_j} = \delta_{ij} = \begin{cases}1 & \text{ if } i = j \\ 0 & \text{ if } i \neq j\end{cases}$$

Notice that a linearly independent set of vectors is not necessarily an orthonormal set. In the example above, the set $\left\{\hat{x}, \hat{w} = \hat{x} + \hat{y}\right\}$ is linearly independent but $\hat{x}\cdot\hat{w} \neq 0$ and $\hat{w}\cdot\hat{w} \neq 1$. An orthonormal set, however, is always linearly independent.
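This distinction is easy to see numerically: the Gram matrix of inner products for $\left\{\hat{x}, \hat{w}\right\}$ is not the identity, even though the vectors are independent. A small NumPy sketch (the coordinates are just the obvious ones for the example above):

```python
import numpy as np

x_hat = np.array([1.0, 0.0])
w_hat = np.array([1.0, 1.0])            # w = x + y, as in the example above

V = np.column_stack([x_hat, w_hat])
print(V.T @ V)                          # Gram matrix [[1, 1], [1, 2]]: not orthonormal
print(np.linalg.matrix_rank(V))         # 2: but still linearly independent
```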

However, it is always possible to find a basis for any vector space, defined here to be a set of orthonormal vectors that span the space. That is, the set $\left\{\ket{n_i}: i=1,\cdots,N\right\}$ is a basis for a vector space $\mathbb{V}$ $$\begin{align*}\text{if } & \mathbf{LS}\left\{\ket{n_1},\cdots,\ket{n_N}\right\} = \mathbb{V}\\\text{and } & \braket{n_i}{n_j} = \delta_{ij}\end{align*}$$ in which case, $N$ is the dimension of $\mathbb{V}$. Thus, we can write $$\mathbb{V} = \mathbb{C}^N$$
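One standard construction of such an orthonormal basis from any linearly independent set is the Gram-Schmidt procedure; a minimal sketch:

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalise a linearly independent list of (complex) vectors."""
    basis = []
    for v in vectors:
        v = v.astype(complex)
        # Remove the component of v along each basis vector found so far.
        for n in basis:
            v = v - np.vdot(n, v) * n
        basis.append(v / np.linalg.norm(v))
    return basis

# Applied to {x, w} from the example above, this recovers {x_hat, y_hat}.
for n in gram_schmidt([np.array([1.0, 0.0]), np.array([1.0, 1.0])]):
    print(n)
```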

Given a basis, $\left\{\ket{n_i}: i=1,\cdots,N\right\}$, for $\mathbb{C}^N$, any vector $\ket{\psi}$ can then be written as a linear combination of the basis vectors, simply by noting that if we let $\alpha_i = \braket{n_i}{\psi}$ then $$\ket{\psi} = \sum_{i=1}^N \braket{n_i}{\psi} \ket{n_i} = \sum_{i=1}^N \alpha_i \ket{n_i}$$
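A quick numerical illustration of this expansion, using an orthonormal basis of $\mathbb{C}^2$ (the basis and the state here are my own arbitrary choices):

```python
import numpy as np

# An orthonormal basis of C^2 (the Hadamard pair) and an arbitrary state |psi>.
n1 = np.array([1.0, 1.0]) / np.sqrt(2)
n2 = np.array([1.0, -1.0]) / np.sqrt(2)
psi = np.array([0.6, 0.8j])

alphas = [np.vdot(n, psi) for n in (n1, n2)]    # alpha_i = <n_i|psi>
psi_rebuilt = sum(a * n for a, n in zip(alphas, (n1, n2)))
print(np.allclose(psi, psi_rebuilt))            # True: |psi> = sum_i alpha_i |n_i>
```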

Eigenspaces

A subset of a vector space is called a subspace if it forms a vector space itself. For example, it is straightforward to prove that the linear span of any vector, or set of vectors, is a subspace.

We can think of subspaces as constraints corresponding to certain propositions about a vector space.

In our case, by propositions we mean observables, or Hermitian operators, and the subspaces we are interested in are eigenspaces associated with observables.

Suppose that $K$ is some observable and that $\lambda$ is an eigenvalue of $K$. Then, associated with $\lambda$ is a set of eigenvectors, $\left\{\ket{e_\lambda^i}:i=1,\cdots,N_\lambda\right\}$ say, such that $$K\ket{e_\lambda^i} = \lambda \ket{e_\lambda^i}$$

The eigenspace associated with $\lambda$ is the linear span of the eigenvectors, $$\left\{K \equiv \lambda\right\} = \mathbf{LS}\left\{\ket{e_\lambda^1},\cdots,\ket{e_\lambda^{N_\lambda}}\right\}$$ and any vector in this eigenspace is an eigenvector of $K$ associated with $\lambda$.

For a given eigenvalue, a (full) set of eigenvectors will be a linearly independent set, but it won't necessarily be a basis (i.e., orthonormal). Since an eigenspace is a subspace, however, we can construct a basis, $\left\{\ket{n_\lambda^i}: i=1,\cdots,N_\lambda\right\}$, where $N_\lambda$ is the dimension of the eigenspace, and then we can write $$\left\{K \equiv \lambda\right\} = \mathbf{LS}\left\{\ket{n_\lambda^1},\cdots,\ket{n_\lambda^{N_\lambda}}\right\}$$
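For concreteness, here is a small NumPy sketch using an illustrative Hermitian matrix (my own choice) whose eigenvalue $\lambda = 1$ is doubly degenerate; `eigh` conveniently returns an orthonormal eigenbasis:

```python
import numpy as np

# An illustrative Hermitian observable whose eigenvalue lambda = 1
# is doubly degenerate.
K = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 1.0]])

evals, evecs = np.linalg.eigh(K)     # eigh returns an orthonormal eigenbasis
print(evals)                         # [1. 1. 3.]
# The columns with eigenvalue 1 form a basis of the eigenspace {K = 1}.
ns = evecs[:, np.isclose(evals, 1.0)]
print(ns.shape[1])                   # 2 = N_lambda, the dimension of the eigenspace
```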

Projections

Given two vectors, $\ket{a}, \ket{b}$, a dyad is a linear operator which maps vectors to vectors, $$\begin{align*}\ket{a}\bra{b}: &\mathbb{V} \rightarrow \mathbb{V}\\& \ket{\psi} \mapsto \left(\ket{a}\bra{b}\right)\ket{\psi} = \left(\braket{b}{\psi}\right) \ket{a}\end{align*}$$
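In coordinates, a dyad is just an outer product. A minimal sketch with arbitrary illustrative vectors:

```python
import numpy as np

a = np.array([1.0, 2.0])
b = np.array([0.0, 1.0j])
psi = np.array([3.0, 4.0])

dyad = np.outer(a, b.conj())                           # the matrix of |a><b|
print(np.allclose(dyad @ psi, np.vdot(b, psi) * a))    # True: (|a><b|)|psi> = <b|psi>|a>
```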

Given a basis, $\left\{\ket{n_i}: i=1,\cdots,N\right\}$ then $$\begin{align*}\ket{\psi} &= \sum_{i=1}^N \braket{n_i}{\psi} \ket{n_i}\\ &= \sum_{i=1}^N \left(\ket{n_i}\bra{n_i}\right)\ket{\psi}\\ &= \left(\sum_{i=1}^N \ket{n_i}\bra{n_i}\right)\ket{\psi}\end{align*}$$ hence the identity operator is the sum of the dyads of all basis vectors $$\mathbf{I} = \sum_{i=1}^N \ket{n_i}\bra{n_i}$$
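This completeness relation is easy to verify numerically for the illustrative basis used earlier:

```python
import numpy as np

n1 = np.array([1.0, 1.0]) / np.sqrt(2)      # the same illustrative basis as above
n2 = np.array([1.0, -1.0]) / np.sqrt(2)

I = np.outer(n1, n1.conj()) + np.outer(n2, n2.conj())   # sum_i |n_i><n_i|
print(np.allclose(I, np.eye(2)))                        # True: the completeness relation
```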

In particular, given an observable $K$ on a state-space $\mathbb{C}^N$, and an eigenvalue $\lambda$ of $K$, with corresponding eigenspace $\left\{K \equiv \lambda\right\}$, we can define an associated projection operator, $\Pi_{K \equiv \lambda}$, which maps vectors onto the eigenspace $$\Pi_{K \equiv \lambda} = \sum_{\left\{K \equiv \lambda\right\}} \ket{n}\bra{n}$$ where here we are summing over the basis vectors of the eigenspace, $\ket{n} \in \left\{K \equiv \lambda\right\}$.

Thus, arbitrary vectors, $\ket{\psi} \in \mathbb{C}^N$, are projected onto the eigenspace, $$\Pi_{K \equiv \lambda}\ket{\psi} = \sum_{\left\{K \equiv \lambda\right\}} \left(\braket{n}{\psi}\right) \ket{n}\in \left\{K \equiv \lambda\right\}$$ and the projection operator acts as an identity on the eigenspace itself. That is, $$\forall \ket{n} \in \left\{K \equiv \lambda\right\},\, \Pi_{K \equiv \lambda}\ket{n} = \ket{n}$$
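Continuing the illustrative observable from the Eigenspaces section, the projector built this way is idempotent and fixes every vector in the eigenspace:

```python
import numpy as np

# Projector onto the lambda = 1 eigenspace of the illustrative K from above.
K = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 1.0]])
evals, evecs = np.linalg.eigh(K)
ns = evecs[:, np.isclose(evals, 1.0)]       # orthonormal basis of {K = 1}

Pi = ns @ ns.conj().T                       # Pi = sum_n |n><n|
print(np.allclose(Pi @ Pi, Pi))             # True: projecting twice changes nothing
print(np.allclose(Pi @ ns, ns))             # True: Pi is the identity on the eigenspace
```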

Probabilities

The probability postulate can be stated in terms of projection operators.

The probability that an arbitrary state, $\ket{\psi}$, will be measured to have a certain property, $K \equiv \lambda$, is postulated to be the average of the projection operator associated with that property. That is, $$\begin{align*}\bra{\psi}\Pi_{K \equiv \lambda}\ket{\psi} &= \bra{\psi} \left(\sum_{\left\{K \equiv \lambda\right\}} \ket{n}\bra{n}\right)\ket{\psi}\\&= \sum_{\left\{K \equiv \lambda\right\}} \braket{\psi}{n}\braket{n}{\psi}\\&= \sum_{\left\{K \equiv \lambda\right\}} \braket{n}{\psi}^* \braket{n}{\psi}\\&= \sum_{\left\{K \equiv \lambda\right\}} \left|\braket{n}{\psi}\right|^2\end{align*}$$
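A numerical sketch, reusing the illustrative observable from above with an arbitrary normalised state, confirms that the average of the projector equals the sum $\sum \left|\braket{n}{\psi}\right|^2$:

```python
import numpy as np

K = np.array([[2.0, 1.0, 0.0],              # the same illustrative observable
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 1.0]])
evals, evecs = np.linalg.eigh(K)
ns = evecs[:, np.isclose(evals, 1.0)]
Pi = ns @ ns.conj().T

psi = np.array([1.0, 0.0, 0.0])             # a normalised state
print(np.vdot(psi, Pi @ psi).real)          # <psi|Pi|psi> = 0.5
print(sum(abs(np.vdot(ns[:, i], psi))**2    # the same value as sum |<n|psi>|^2
          for i in range(ns.shape[1])))
```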

The dimension of the eigenspace is the number of distinct ways in which the observable can be measured to have the property $K \equiv \lambda$, and the probability that the property has that particular value is the sum of the probabilities that any one of these will occur.

We can measure two properties, $K \equiv \lambda, L \equiv \mu$ simultaneously if two conditions are satisfied.

The first condition is that the two observables are compatible, meaning the associated projection operators commute, $$\Pi_{K \equiv \lambda, L \equiv \mu} = \Pi_{K \equiv \lambda} \Pi_{L \equiv \mu} = \Pi_{L \equiv \mu} \Pi_{K \equiv \lambda}$$

The compound projection operator associated with a simultaneous measurement can then be considered in two ways: either as a projection onto the $\left\{K \equiv \lambda\right\}$ eigenspace or as a projection onto the $\left\{L \equiv \mu\right\}$ eigenspace, $$\begin{align*}\Pi_{K \equiv \lambda, L \equiv \mu} \ket{\psi} = \Pi_{K \equiv \lambda} \left(\Pi_{L \equiv \mu} \ket{\psi}\right) &\in \left\{K \equiv \lambda\right\}\\\Pi_{K \equiv \lambda, L \equiv \mu} \ket{\psi} = \Pi_{L \equiv \mu} \left(\Pi_{K \equiv \lambda} \ket{\psi}\right) &\in \left\{L \equiv \mu\right\}\end{align*}$$

The second condition is that the two eigenspaces have at least one non-trivial (non-zero) state in common; otherwise, the compound projection maps every state to zero and the two properties can never be observed together.

The probability that a state $\ket{\psi}$ will be observed to have both properties is then the average of the compound operator on that state, $$\bra{\psi}\Pi_{K \equiv \lambda, L \equiv \mu}\ket{\psi}$$
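A minimal sketch of both conditions, using two commuting projectors on $\mathbb{C}^3$ of my own construction (their eigenspaces share the single non-trivial state $\ket{e_1}$):

```python
import numpy as np

Pi_K = np.diag([1.0, 1.0, 0.0])     # projector onto span{e1, e2}
Pi_L = np.diag([1.0, 0.0, 1.0])     # projector onto span{e1, e3}

print(np.allclose(Pi_K @ Pi_L, Pi_L @ Pi_K))    # True: the projectors commute
Pi_KL = Pi_K @ Pi_L                 # projects onto the common subspace span{e1}

psi = np.array([1.0, 1.0, 1.0]) / np.sqrt(3)
print(np.vdot(psi, Pi_KL @ psi).real)           # 1/3: probability of both properties
```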