# Useful theorems

In order to proceed, we'll need the following two mathematical results.

### Integration by parts

If $f, g$ are functions of $x$ on a given interval $[x_1, x_2]$ (for our purposes, we can assume all functions are smooth - that is, continuous and differentiable to any order), then $$\int_{x_1}^{x_2} f(x) g'(x) \textrm{d}x = [f(x) g(x)]_{x_1}^{x_2} - \int_{x_1}^{x_2} f'(x) g(x) \textrm{d}x$$

To see why, note that \begin{align*}[f(x) g(x)]_{x_1}^{x_2} &= \int_{x_1}^{x_2} \frac{\textrm{d}}{\textrm{d}x}(f(x) g(x)) \textrm{d}x\\ &= \int_{x_1}^{x_2} f'(x) g(x) + f(x) g'(x) \textrm{d}x\end{align*}
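As a concrete sanity check, we can verify the identity symbolically with sympy. The choices $f(x) = x$, $g(x) = \sin x$ on $[0, \pi]$ are our own illustrative example, not anything the theorem singles out:

```python
import sympy as sp

x = sp.symbols('x')
x1, x2 = 0, sp.pi

# Illustrative choices: f(x) = x, g(x) = sin(x)
f = x
g = sp.sin(x)

# Left-hand side: integral of f * g'
lhs = sp.integrate(f * sp.diff(g, x), (x, x1, x2))

# Right-hand side: boundary term minus integral of f' * g
boundary = (f * g).subs(x, x2) - (f * g).subs(x, x1)
rhs = boundary - sp.integrate(sp.diff(f, x) * g, (x, x1, x2))

assert sp.simplify(lhs - rhs) == 0  # both sides equal -2 here
```

Any other smooth pair $f, g$ would work equally well; the identity is just the fundamental theorem of calculus applied to the product rule.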

Further, if we know that $f(x_1) = 0$ or $g(x_1) = 0$ and $f(x_2) = 0$ or $g(x_2) = 0$ then $$\left[f(x) g(x)\right]_{x_1}^{x_2} = 0$$ hence $$\int_{x_1}^{x_2} f(x) g'(x) \textrm{d}x = -\int_{x_1}^{x_2} f'(x) g(x) \textrm{d}x$$
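We can check this special case too. Taking $f(x) = \sin x$ on $[0, \pi]$, which vanishes at both endpoints, the boundary term drops out for any $g$ (the choice $g(x) = e^x$ below is ours, purely for illustration):

```python
import sympy as sp

x = sp.symbols('x')

# f vanishes at both endpoints of [0, pi], so the boundary term is zero
f = sp.sin(x)
g = sp.exp(x)  # arbitrary illustrative choice of g

# With the boundary term gone, these two integrals should agree
lhs = sp.integrate(f * sp.diff(g, x), (x, 0, sp.pi))
rhs = -sp.integrate(sp.diff(f, x) * g, (x, 0, sp.pi))

assert sp.simplify(lhs - rhs) == 0
```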

### Zero integrals

If $f(x)$ is a (continuous) function on $[x_1, x_2]$ and $$\int_{x_1}^{x_2} f(x) g(x) \textrm{d}x = 0$$ for arbitrary functions $g = g(x)$, then $$f \equiv 0$$

By arbitrary, we mean that for any $g$ we care to try, $\int_{x_1}^{x_2} f(x) g(x) \textrm{d}x = 0$ and the theorem states that the only way this can be true is if $\forall x \in [x_1, x_2] \quad f(x) = 0$.

To see why, suppose that $\int_{x_1}^{x_2} f(x) g(x) \textrm{d}x = 0$ for every $g$, but that $f(c) \neq 0$ for some $c \in [x_1, x_2]$. For simplicity, we consider the case where $f(c) > 0$.

This would imply that we could find a subinterval $[a, b] \subset [x_1, x_2]$ where $\forall x \in [a, b] \quad f(x) > 0$ (this is a consequence of $\: f \:$ being continuous). Since $g$ is arbitrary, we can choose it in a very specific way, where $$g(x) = \begin{cases} 1 & x \in [a, b]\\0 & x \notin [a, b]\end{cases}$$ (If $g$ is required to be continuous or smooth, a smooth "bump" function supported in $[a, b]$ works just as well.)

Then, we get the following contradiction, $$0 = \int_{x_1}^{x_2} f(x) g(x) \textrm{d}x = \int_{a}^{b} f(x) \textrm{d}x > 0$$

So, $f(c) > 0$ is impossible. Similarly, we can show that $f(c) < 0$ is impossible, and the result follows.
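The contradiction can be made concrete with sympy. Here $f(x) = x - 1$ on $[0, 2]$ and the subinterval $[5/4, 7/4]$ are our own illustrative choices; $f$ is positive on that subinterval, so the indicator-style $g$ from the proof forces the integral to be strictly positive:

```python
import sympy as sp

x = sp.symbols('x')
x1, x2 = 0, 2

# An illustrative f that is NOT identically zero: f(3/2) = 1/2 > 0
f = x - 1

# By continuity, f > 0 on a whole subinterval around c = 3/2
a, b = sp.Rational(5, 4), sp.Rational(7, 4)

# The indicator-style choice of g from the proof
g = sp.Piecewise((1, (x >= a) & (x <= b)), (0, True))

integral = sp.integrate(f * g, (x, x1, x2))

# The product integral reduces to the integral of f over [a, b] ...
assert integral == sp.integrate(f, (x, a, b))
# ... which is strictly positive, contradicting "the integral is always 0"
assert integral > 0
```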

We can generalise this to sums of products of functions. That is, if $g_i$ is chosen arbitrarily, and we have $$\int_{x_1}^{x_2} \sum_{i = 1}^{N} \left\{ f_i(x) g_i(x) \right\} \textrm{d}x = 0$$ for some functions $f_i$, then it must follow that $$f_i \equiv 0 \quad \forall i$$

To show this is true, fix $i$ and choose $g_j \equiv 0$ for all $j \neq i$, leaving $g_i$ arbitrary. The sum then collapses to $$\int_{x_1}^{x_2} f_i(x) g_i(x) \textrm{d}x = 0$$ and the theorem above gives $f_i \equiv 0$. Repeating this for each $i$ gives the result.