Continuity of the Derivative
We show now that the differential approximation
f(x + δx) = f(x) + f'(x)·δx + ε·δx
forces the derivative function to be continuous,
x1 ≈ x2  ⇒  f'(x1) ≈ f'(x2).
Let x1 ≈ x2, but x1 ≠ x2. Use the differential approximation with x = x1 and δx = x2 − x1, and also with x = x2 and δx = x1 − x2, geometrically looking at the tangent approximation from both endpoints:
f(x2) = f(x1) + f'(x1)·(x2 − x1) + ε1·(x2 − x1)
f(x1) = f(x2) + f'(x2)·(x1 − x2) + ε2·(x1 − x2)
Adding, we obtain
0 = [f'(x1) − f'(x2) + (ε1 − ε2)]·(x2 − x1)
Dividing by the nonzero term (x2 − x1) and adding f'(x2) to both sides, we obtain
f'(x1) = f'(x2) + (ε2 − ε1)
or f'(x1) ≈ f'(x2), since the difference between two small errors is small.
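The two-endpoint expansion can be illustrated numerically. A minimal Python sketch, taking f = sin as an assumed smooth example (not from the text), computes the two error terms ε1 and ε2 and confirms the identity f'(x1) = f'(x2) + (ε2 − ε1):

```python
import math

# Differential approximation f(x + dx) = f(x) + f'(x)*dx + eps*dx,
# applied from both endpoints of [x1, x2], for f = sin, f' = cos.
f, df = math.sin, math.cos
x1 = 0.7
x2 = x1 + 1e-6          # x2 close to x1 (standing in for x1 ~ x2)
dx = x2 - x1

eps1 = (f(x2) - f(x1)) / dx - df(x1)      # error seen from the x1 endpoint
eps2 = (f(x1) - f(x2)) / (-dx) - df(x2)   # error seen from the x2 endpoint

# Adding the two expansions and dividing by the nonzero (x2 - x1) gives
# f'(x2) - f'(x1) = eps1 - eps2: both sides are small and agree to rounding.
print(df(x2) - df(x1), eps1 - eps2)
```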
This fact can be used to prove:
Theorem: The Inverse Function Theorem
If f'(x0) = m ≠ 0, then y = f(x)
has an inverse function in a small neighborhood of
x0, that is, if
y ≈ y0 = f(x0), then there is a unique
x ≈ x0 so that
y = f(x).
We saw above that the differential approximation makes a microscopic view of the graph look linear. If y ≈ y0, the linear equation
y = y0 + m·(x − x0)
with m = f'(x0) can be inverted to find a first approximation to the inverse,
x1 = x0 + (y − y0)/m
We test to see if f(x1) = y. If not, examine the graph microscopically at
x1. Since the graph appears the same as its tangent to within
ε·δx and since
f'(x1) ≈ m, the local coordinates at
x1 look like a line of slope
m:
dy = m·dx
Solving for the x-value which gives output
y, we get
x2 = x1 + (y − f(x1))/m
Continue in this way, generating a sequence of approximations, x1 = G(x0),
x_{n+1} = G(x_n), where the recursion function is
G(ξ) = ξ + (y − f(ξ))/m
The distance between successive approximations is
|x_{n+1} − x_n| = |G(x_n) − G(x_{n−1})| = |G'(ζ)|·|x_n − x_{n−1}|
by the Mean Value Theorem for Derivatives. Notice that G'(ξ) = 1 − f'(ξ)/m ≈ 0, for
ξ ≈ x0, so
|G'(ζ)| ≤ 1/2 in particular, and
|x_{n+1} − x_n| ≤ (1/2)·|x_n − x_{n−1}| ≤ ··· ≤ (1/2)^n·|x1 − x0|
A geometric series estimate shows that the series x0 + Σ (x_{n+1} − x_n) converges, x_n → x, and the limit satisfies
f(x) = y.
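The recursion above can be sketched in Python. The function name invert_near, the example function f(x) = x + x³, and the stopping tolerance are illustrative choices, not from the text; the iteration itself is x_{n+1} = x_n + (y − f(x_n))/m with the slope m frozen at the starting point:

```python
def invert_near(f, m, x0, y, tol=1e-12, max_iter=100):
    """Approximate the x near x0 with f(x) = y, iterating
    x_{n+1} = G(x_n), where G(xi) = xi + (y - f(xi))/m
    and m is the (fixed) slope f'(x0) of the tangent at x0."""
    x = x0
    for _ in range(max_iter):
        x_next = x + (y - f(x)) / m
        if abs(x_next - x) < tol:   # successive approximations have converged
            return x_next
        x = x_next
    return x

# Example: invert f(x) = x + x^3 near x0 = 1, where m = f'(1) = 4.
f = lambda x: x + x**3
x = invert_near(f, m=4.0, x0=1.0, y=2.5)
print(x, f(x))  # f(x) should be ~2.5
```

Here |G'(ξ)| = |1 − f'(ξ)/m| stays well below 1/2 near x0 = 1, so the contraction estimate applies and the iteration converges geometrically.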
To complete this proof we need to show that G is a contraction on some nonzero interval. The function
G must map the interval into itself and have derivative of magnitude less than 1/2
on the interval. The precise definition of the derivative matters, because the result is false if
f'(x) is defined by a pointwise limit. The function
f(x) = x + x²·sin(π/x)
with f(0) = 0
has pointwise derivative 1 at zero, but is not increasing in any neighborhood of zero. (If you "move the microscope an infinitesimal amount" when looking at
f, the graph will look nonlinear.)
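The pointwise counterexample can be checked numerically. The sketch below assumes the text's function is f(x) = x + x²·sin(π/x) with f(0) = 0, as the surrounding discussion suggests; it verifies that the difference quotient at zero tends to 1 while f still decreases at points arbitrarily close to zero:

```python
import math

def f(x):
    # The counterexample (as reconstructed): f(x) = x + x^2 sin(pi/x), f(0) = 0.
    return 0.0 if x == 0 else x + x**2 * math.sin(math.pi / x)

# The pointwise difference quotient at zero tends to 1,
# since (f(h) - f(0))/h = 1 + h*sin(pi/h).
for h in (1e-2, 1e-4, 1e-6):
    print(h, (f(h) - f(0)) / h)   # each quotient is within h of 1

# Yet f is not increasing on any neighborhood of zero: at x_k = 1/(2k),
# f'(x_k) = 1 + 2*x_k*sin(2k*pi) - pi*cos(2k*pi) = 1 - pi < 0.
k = 1000
xk = 1.0 / (2 * k)
print(f(xk + 1e-9) < f(xk))   # f decreases just to the right of x_k
```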
Created by Mathematica (September 22, 2004)