
Distributions

Distributions are a relatively new idea in mathematics (they were only invented in the 1940s and 1950s). However, they have become extremely useful in analysis, especially for understanding partial differential equations.

The basic idea behind distributions is a little like the trick for getting the weak form of a differential equation: multiply something nasty by something nice and integrate. Suppose f(x) is an unpleasant function; then often we can pick a nice function g(x) so that the integral

\begin{displaymath}
\int f(x) g(x) dx \end{displaymath}

makes sense.

For distributions we want f(x) to be something that is not even really a function, but we will start with ordinary functions.

The function g(x) is called a test function, which acts much like our v(x) in the weak form of a differential equation. The function g(x) is not a particular function, but rather one of many functions that we use to ``test'' the behavior of f(x). To do this we make g(x) as nice as possible while still being able to probe f(x) thoroughly.

The way to do this is to allow test functions to be any function g(x) that is infinitely smooth -- meaning that all derivatives exist and are continuous -- and that is eventually zero in both directions (as x goes to $+\infty$ and to $-\infty$). (I don't mean merely that the limit is zero; rather g(x)=0 for $\vert x\vert$ ``large enough''.)
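A standard concrete example of such a test function (added here for illustration) is the ``bump'' function:

```latex
\begin{displaymath}
g(x) = \left\{\begin{array}{r@{\quad}l}
 e^{-1/(1-x^2)} & \vert x\vert < 1,\\
 0 & \vert x\vert \ge 1.
\end{array}\right. \end{displaymath}
```

All derivatives of this g(x) exist and are continuous, even at $x=\pm1$, because $e^{-1/(1-x^2)}$ and all of its derivatives tend to zero as $\vert x\vert$ approaches 1 from inside; and g(x) is exactly zero for $\vert x\vert\ge1$.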

If f(x) is an ordinary integrable function, we can define the functional on test functions:

\begin{displaymath}
f^*(g) = \int f(x) g(x) dx. \end{displaymath}

But we can do much better: by transferring a derivative from f to g we can actually ``differentiate'' functions that we can't usually differentiate. Here is how it works:

\begin{displaymath}
\int f'(x) g(x) dx = f(x) g(x)\vert^{x=+\infty}_{x=-\infty}
 - \int f(x) g'(x) dx = - \int f(x) g'(x) dx \end{displaymath}

is true provided f'(x) is integrable; the boundary term vanishes because g(x) is zero for $\vert x\vert$ large enough. For general ``functions'', we define f' as

\begin{displaymath}
(f')^*(g) = -\int f(x) g'(x) dx. \end{displaymath}
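As a quick worked example of this definition (not in the original notes, but a standard computation): take f(x)=|x|, which has no classical derivative at x=0. Splitting the integral at zero,

```latex
\begin{displaymath}
(f')^*(g) = -\int \vert x\vert\, g'(x)\, dx
 = \int_{-\infty}^{0} x\, g'(x)\, dx - \int_{0}^{\infty} x\, g'(x)\, dx
\end{displaymath}
\begin{displaymath}
 = -\int_{-\infty}^{0} g(x)\, dx + \int_{0}^{\infty} g(x)\, dx
 = \int {\rm sgn}(x)\, g(x)\, dx,
\end{displaymath}
```

where each piece was integrated by parts and the boundary terms vanish because g is eventually zero. So the distributional derivative of |x| is the sign function, as one would hope.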

Then we can differentiate practically any function! We just have to be a little careful about what this ``distributional derivative'' means!

Heaviside and $\delta$-functions

  A standard example of a function which is not differentiable is the Heaviside function:
H(x) = +1 for x>0, and H(x) = 0 for $x\le 0$.
It can be represented as a distribution:

\begin{displaymath}
H^*(g) = \int H(x) g(x) dx = \int_0^\infty g(x) dx. \end{displaymath}

This distribution has a distributional derivative

\begin{displaymath}
(H')^*(g) = - \int H(x) g'(x) dx = -\int_0^\infty g'(x) dx
 = g(0)-g(+\infty) = g(0). \end{displaymath}

This distribution is known as the $\delta$-function:

\begin{displaymath}
\delta^*(g) = g(0) = \int \delta(x) g(x) dx. \end{displaymath}

Actually, the $\delta$-function isn't a function at all; no function has the property that $\int\delta(x) g(x) dx=g(0)$ for all test functions g(x). Rather, $\delta$ is a limit of functions like

\begin{displaymath}
\delta_\epsilon(x) = \left\{\begin{array}{r@{\quad}l}
 1/\epsilon & 0\le x\le \epsilon,\\
 0 & \mbox{otherwise}.\end{array}\right. \end{displaymath}

That is, for any test function g(x),

\begin{displaymath}
\lim_{\epsilon\downarrow0}\int\delta_\epsilon(x) g(x) dx = g(0). \end{displaymath}
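This limit is easy to check numerically. The sketch below is a supplementary illustration (the function name and the choice $g(x)=e^{-x^2}$ are mine; this g is smooth but not compactly supported, which doesn't matter for the behavior near zero). It approximates $\int\delta_\epsilon(x) g(x) dx = (1/\epsilon)\int_0^\epsilon g(x) dx$ by the midpoint rule:

```python
import math

def delta_eps_pair(g, eps, n=1000):
    # (1/eps) * integral of g over [0, eps], midpoint rule with n panels.
    h = eps / n
    return sum(g((k + 0.5) * h) for k in range(n)) / n

# Illustrative smooth test function with g(0) = 1.
g = lambda x: math.exp(-x * x)

for eps in (1.0, 0.1, 0.001):
    print(eps, delta_eps_pair(g, eps))  # tends to g(0) = 1 as eps shrinks
```

As $\epsilon$ shrinks, the averages approach g(0)=1, as the limit predicts.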

A more sophisticated example...

Normally, we would not consider f(x)=1/x to be an integrable function, so $\int f(x) g(x) dx$ ordinarily would not make sense. But by considering it as the distributional derivative of $F(x)=\ln\vert x\vert$, we can find a suitable distribution for ``1/x'':

\begin{displaymath}
f^*(g) = -\int \ln\vert x\vert\, g'(x) dx = -F^*(g'). \end{displaymath}

The only problem is near x=0. So we can approximate the integral $F^*(g')$ with the interval $[-\epsilon,+\epsilon]$ removed:

\begin{displaymath}
\left(\int_{-\infty}^{-\epsilon}+\int_{+\epsilon}^{+\infty}\...
 ...epsilon}+\int_{+\epsilon}^{+\infty}\right)
 \frac{g(x)}{x} dx. \end{displaymath}

Taking $\epsilon\downarrow0$, we find that $\ln(\epsilon)[g(\epsilon)-g(-\epsilon)]=O(\epsilon \ln(\epsilon))\to 0$, and that

\begin{displaymath}
f^*(g) = \lim_{\epsilon\downarrow0}
 \left(\int_{-\infty}^{-\epsilon}+\int_{+\epsilon}^{+\infty}\right)
 \frac{g(x)}{x} dx, \end{displaymath}

which is also known as the Cauchy principal value of $\int g(x)/x dx$.
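The principal value can also be checked numerically. Folding the symmetric truncation into a single integral, $(\int_{-A}^{-\epsilon}+\int_{+\epsilon}^{+A}) g(x)/x\, dx = \int_{\epsilon}^{A} [g(x)-g(-x)]/x\, dx$, avoids the cancellation of large values near zero. The sketch below (names and the choice of g are mine) uses $g(x)=(1+x)e^{-x^2}$; the even part $e^{-x^2}/x$ contributes zero by symmetry, and the odd part gives $\int e^{-x^2} dx = \sqrt{\pi}$:

```python
import math

def pv_recip_pair(g, eps=1e-6, A=8.0, n=200000):
    # Principal value of int g(x)/x dx, computed as the folded integral
    # int_{eps}^{A} (g(x) - g(-x))/x dx by the midpoint rule.
    h = (A - eps) / n
    total = 0.0
    for k in range(n):
        x = eps + (k + 0.5) * h
        total += (g(x) - g(-x)) / x
    return total * h

g_pv = lambda x: (1 + x) * math.exp(-x * x)
print(pv_recip_pair(g_pv), math.sqrt(math.pi))  # should be close to sqrt(pi)
```

The folded form works because, for smooth g, $[g(x)-g(-x)]/x$ stays bounded as $x\to0$ (it tends to $2g'(0)$), so the truncated integrals converge as $\epsilon\downarrow0$.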



David Stewart
9/11/1998