Rank-two tensors

Motivation from electromagnetism

Consider the motion of an electrically-charged particle in electric and magnetic fields, described by the Lorentz force, $$\vec{F} = q\left(\vec{E} + \vec{v} \times \vec{B}\right)$$
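As a concrete illustration, here is a minimal numerical sketch of this force law; the charge, fields and velocity below are made-up values, not taken from the lecture.

```python
import numpy as np

# Minimal sketch: evaluate the Lorentz force F = q (E + v x B)
# for illustrative, made-up values of the charge, fields and velocity.
q = 1.0                          # charge (arbitrary units)
E = np.array([0.0, 0.0, 0.0])    # electric field (zero, as in the example below)
B = np.array([0.0, 0.0, 1.0])    # magnetic field along the z-axis
v = np.array([1.0, 0.0, 0.0])    # particle velocity along the x-axis

F = q * (E + np.cross(v, B))
print(F)                         # [ 0. -1.  0.]  -> the force is along -y
```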

We can show that the two fields, $\vec{E}, \vec{B}$, are not independent variables and cannot be de-coupled.

To see why, imagine that, in some frame, we have $\vec{E} = 0$, so that only the magnetic field contributes to the force, $$\vec{F} = q\vec{v} \times \vec{B}$$

Now consider another (primed) frame moving at exactly the same velocity, $\vec{v}$, as the particle. In this frame, the velocity of the particle is $\vec{v'} = 0$, so the term in the Lorentz force coming from the magnetic field is zero. The behaviour of the magnetic field is already highly relativistic, depending, as it does, on the velocity of the frame in which it is observed.

If that were the end of the story then this would break one of Newton's laws, since we would have non-zero acceleration in one frame and zero acceleration in a different frame. The resolution is to say that there must be, in the primed frame, some contribution from the electric field, $\vec{E}$. That is, the two fields are not independent and cannot be de-coupled.

Thus, written in this three-vector form, the Lorentz force law and Maxwell's equations are not manifestly invariant. Still, it was widely believed that they were laws of nature - this was the issue that led Einstein to re-evaluate Newton's laws of motion and to write them in terms of invariant quantities - tensors.

We need to describe a new, single object that has at least six components, so that we can include the three components from each of the two fields.

Rank-n contravariant tensors

We say that a rank-n contravariant tensor is an object with $n$ indices and $4^n$ components, which transform from frame to frame in the same way that a product of $n$ four-vectors would.

For example, suppose that $L_{\nu}^{\mu'}$ is a Lorentz transformation matrix from one frame to another. That means that the components of a four-vector - a rank-one tensor - in one frame, $A^\mu$, transform into the components in the other (primed) frame, $A^{\mu'}$, according to $$A^{\mu'} = L_{\nu}^{\mu'} A^\nu$$

This is the prototype transformation which we use to define all tensors. So, a set of $16 = 4^2$ quantities, $T^{\mu \nu}$ form the components of a rank-two contravariant tensor if the components in the other frame, $T^{\mu' \nu'}$ are given by $$T^{\mu' \nu'} = L_{\rho}^{\mu'} L_{\sigma}^{\nu'} T^{\rho \sigma}$$
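To see what this rule means in practice, here is a small numerical sketch (with an illustrative boost along the x-axis, $\beta = 0.6$, not taken from the lecture) checking that the index rule is the same statement as the matrix equation $T' = L T L^{\top}$.

```python
import numpy as np

# Sketch (not from the lecture): check that the index rule
#   T'^{mu' nu'} = L^{mu'}_rho L^{nu'}_sigma T^{rho sigma}
# agrees with the matrix statement T' = L T L^T, using an illustrative
# Lorentz boost along the x-axis with beta = 0.6.
beta = 0.6
gamma = 1.0 / np.sqrt(1.0 - beta**2)

# boost matrix: rows are the primed index, columns the unprimed index
L = np.array([
    [ gamma,        -gamma * beta, 0.0, 0.0],
    [-gamma * beta,  gamma,        0.0, 0.0],
    [ 0.0,           0.0,          1.0, 0.0],
    [ 0.0,           0.0,          0.0, 1.0],
])

T = np.arange(16.0).reshape(4, 4)               # any sixteen numbers as T^{rho sigma}

T_index  = np.einsum('ur,vs,rs->uv', L, L, T)   # transform each index separately
T_matrix = L @ T @ L.T                          # equivalent matrix form

print(np.allclose(T_index, T_matrix))           # True
```

The matrix form is only available because there are exactly two indices; for higher-rank tensors the index rule is the one to use.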

The archetypal rank-two tensor is the product of two four-vectors, $T^{\mu \nu} = A^\mu B^\nu$. It is straightforward to show that it satisfies the definition, in that it transforms correctly.
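Spelling the check out, each four-vector transforms separately, $$A^{\mu'} B^{\nu'} = \left(L_{\rho}^{\mu'} A^\rho\right)\left(L_{\sigma}^{\nu'} B^\sigma\right) = L_{\rho}^{\mu'} L_{\sigma}^{\nu'} \left(A^\rho B^\sigma\right)$$ which is exactly the rank-two transformation rule above.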

Nb. Not every set of sixteen quantities forms a rank-two tensor - only those that transform in this way do.

And rank-zero tensors, or scalars, have zero indices and hence don't change at all under Lorentz transformations.

Rank-two tensors

We can conveniently write rank-two tensors as matrices, where the rows and columns are indicated by the first and second indices, respectively, $$T^{\mu \nu} \rightarrow \begin{bmatrix}T^{00} & T^{01} & T^{02} & T^{03}\\T^{10} & T^{11} & T^{12} & T^{13}\\T^{20} & T^{21} & T^{22} & T^{23}\\T^{30} & T^{31} & T^{32} & T^{33}\end{bmatrix}$$

Symmetric rank-two tensors

We note that it is not generally true that a rank-two tensor is symmetric. That is, in general, $$T^{\mu \nu} \neq T^{\nu \mu}$$

However, we can always build a symmetric tensor from any tensor. It is easy to show that $$T^{\mu \nu} + T^{\nu \mu}$$ is symmetric.

A symmetric tensor $T^{\mu \nu}$ has ten independent components $$\begin{bmatrix}T^{00} & - & - & -\\T^{01} & T^{11} & - & -\\T^{02} & T^{12} & T^{22} & -\\T^{03} & T^{13} & T^{23} & T^{33}\end{bmatrix}$$ where the components with dashes are the same as their counterparts, $T^{\mu \nu} = T^{\nu \mu}$.
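A quick numerical sketch of both statements (the random entries are illustrative):

```python
import numpy as np

# Sketch: the combination T + T^T of an arbitrary 4x4 array of components is
# symmetric, and a symmetric 4x4 array has 4 + 3 + 2 + 1 = 10 independent
# entries (the diagonal plus one triangle).
rng = np.random.default_rng(0)
T = rng.normal(size=(4, 4))              # any sixteen numbers

S = T + T.T                              # symmetrised combination
print(np.allclose(S, S.T))               # True: S is symmetric
print(len(np.triu_indices(4)[0]))        # 10 independent components
```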

Anti-symmetric rank-two tensors

More interesting for us are anti-symmetric rank-two tensors.

Again, we can build an anti-symmetric tensor from any tensor. For example $$T^{\mu \nu} - T^{\nu \mu}$$ is anti-symmetric.

An anti-symmetric tensor $T^{\mu \nu}$ has only six independent components $$\begin{bmatrix}0 & - & - & -\\T^{10} & 0 & - & -\\T^{20} & T^{21} & 0 & -\\T^{30} & T^{31} & T^{32} & 0\end{bmatrix}$$ since the diagonal components are always zero and the components with dashes are the negative of their counterparts, $T^{\mu \nu} = -T^{\nu \mu}$.
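Again, a brief numerical sketch with illustrative random entries:

```python
import numpy as np

# Sketch: the combination T - T^T of an arbitrary 4x4 array of components is
# anti-symmetric, its diagonal vanishes, and only the 6 entries above (or
# below) the diagonal are independent.
rng = np.random.default_rng(1)
T = rng.normal(size=(4, 4))

A = T - T.T                              # anti-symmetrised combination
print(np.allclose(A, -A.T))              # True: A is anti-symmetric
print(np.allclose(np.diag(A), 0.0))      # True: zero diagonal
print(len(np.triu_indices(4, k=1)[0]))   # 6 independent components
```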

One useful result to note is that, if $T^{\mu \nu}$ is anti-symmetric and $X_\mu$ is any covariant four-vector, then $$T^{\mu \nu} X_\mu X_\nu = 0$$

This is simply because we can write $$\begin{align*}T^{\mu \nu} X_\mu X_\nu &= T^{\nu \mu} X_\nu X_\mu &\text{ swapping dummy indices}\\&= -T^{\mu \nu} X_\nu X_\mu &\text{ using anti-symmetry}\\&= -T^{\mu \nu} X_\mu X_\nu &\text{ re-arranging four-vectors}\end{align*}$$ from which the result follows.
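The same result can be checked numerically: in component terms the double contraction is just $x \cdot T \cdot x$, where $x$ is the array of components of $X_\mu$ (the random values below are illustrative).

```python
import numpy as np

# Sketch: for an anti-symmetric T^{mu nu}, contracting both indices with the
# same covariant four-vector X_mu gives zero.
rng = np.random.default_rng(2)
M = rng.normal(size=(4, 4))
T = M - M.T                          # an anti-symmetric set of components
x = rng.normal(size=4)               # components X_mu of an arbitrary four-vector

print(np.isclose(x @ T @ x, 0.0))    # True (up to rounding)
```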

Faraday tensor

It turns out that the only (interesting) object with exactly six independent components is an anti-symmetric rank-two tensor, and this is precisely the prototype tensor we need to describe the electric and magnetic fields as a single object.

For historical reasons, we always start with the following definition of the Faraday tensor, $F^{\mu \nu}$, $$F^{\mu \nu} \rightarrow \begin{bmatrix}0 & -E_1 & -E_2 & -E_3\\E_1 & 0 & -B_3 & B_2\\E_2 & B_3 & 0 & -B_1\\E_3 & -B_2 & B_1 & 0\end{bmatrix}$$ but there is a second, equivalent version, $\tilde{F}^{\mu \nu}$, that we shall see later.

Nb. Recall, we use Latin subscripts for the indices of three-vector components - subscripts because there is no difference between contravariance and covariance in three-dimensional space, and Latin to indicate only the three spatial components.

In the next lecture, we shall show how this tensor describes electromagnetism and, if written in terms of this tensor, that the electromagnetic laws of motion are invariant.

For now, we can pick out the components of the electric field, $\vec{E} = (E_1,E_2,E_3)$, $$E_m = F^{m 0} \text{ where } m = 1,2,3$$ and the magnetic field, $\vec{B} = (B_1,B_2,B_3)$, $$\begin{align*}B_1 &= F^{3 2}\\B_2 &= -F^{3 1}\\B_3 &= F^{2 1}\end{align*}$$

The pattern in the magnetic components becomes clearer if we convert to all negative components, $$\begin{align*}B_1 &= -F^{2 3}\\B_2 &= -F^{3 1}\\B_3 &= -F^{1 2}\end{align*}$$ which we could summarise by $$B_i = -F^{j k}, \text{ where } (i,j,k) \text{ is a cycle of } (1,2,3)$$
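Putting the definition and the component relations together, here is a short sketch that builds $F^{\mu \nu}$ from illustrative field values (in the unit convention of the matrix above) and checks the anti-symmetry, the relation $E_m = F^{m 0}$, and the cyclic relation $B_i = -F^{j k}$.

```python
import numpy as np

# Sketch: assemble F^{mu nu} from given field components, following the
# matrix definition above (illustrative field values).
def faraday(E, B):
    E1, E2, E3 = E
    B1, B2, B3 = B
    return np.array([
        [0.0, -E1, -E2, -E3],
        [E1,  0.0, -B3,  B2],
        [E2,  B3,  0.0, -B1],
        [E3, -B2,  B1,  0.0],
    ])

E = np.array([1.0, 2.0, 3.0])
B = np.array([4.0, 5.0, 6.0])
F = faraday(E, B)

print(np.allclose(F, -F.T))              # True: F is anti-symmetric

# E_m = F^{m0}: the spatial tensor indices m = 1, 2, 3 are also array indices,
# since row/column 0 is the time component
print([F[m, 0] for m in (1, 2, 3)])      # [1.0, 2.0, 3.0]

# B_i = -F^{jk} for each cycle (i, j, k) of (1, 2, 3); B is a plain
# three-component array, so B_i lives at position i - 1
for i, j, k in [(1, 2, 3), (2, 3, 1), (3, 1, 2)]:
    print(B[i - 1], -F[j, k])            # the two numbers agree in each case
```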

This is reminiscent of how a cross-product, $\vec{a}$, is formed from two three-vectors, $\vec{b}, \vec{c}$, $$\vec{a} = \vec{b} \times \vec{c}$$ where, in index terms, we have $$a_i = b_j c_k - b_k c_j, \text{ where } (i,j,k) \text{ is a cycle of } (1,2,3)$$

We can define a corresponding anti-symmetric three-dimensional matrix, $$C_{jk} = b_j c_k - b_k c_j \rightarrow \begin{bmatrix}0 & - & -\\b_2 c_1 - b_1 c_2 & 0 & -\\b_3 c_1 - b_1 c_3 & b_3 c_2 - b_2 c_3 & 0\end{bmatrix} = \begin{bmatrix}0 & - & -\\-a_3 & 0 & -\\a_2 & -a_1 & 0\end{bmatrix}$$ so that, except for a minus-sign, the comparison with the magnetic components of the Faraday tensor is complete, since we can now write $$a_i = C_{jk}, \text{ where } (i,j,k) \text{ is a cycle of } (1,2,3)$$
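A corresponding sketch for the three-dimensional case (with illustrative vectors) confirms that $a_i = C_{jk}$ reproduces the ordinary cross-product.

```python
import numpy as np

# Sketch: build C_{jk} = b_j c_k - b_k c_j from two illustrative three-vectors
# and check that a_i = C_{jk} reproduces a = b x c for each cycle (i, j, k).
b = np.array([1.0, 2.0, 3.0])
c = np.array([4.0, 5.0, 6.0])

a = np.cross(b, c)                       # a = b x c
C = np.outer(b, c) - np.outer(c, b)      # C[j, k] = b_j c_k - b_k c_j

for i, j, k in [(1, 2, 3), (2, 3, 1), (3, 1, 2)]:
    print(a[i - 1], C[j - 1, k - 1])     # the two numbers agree in each case
```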

We shall develop this further in the next lecture.