next up previous
Next: Differential Equations and Computational Up: No Title Previous: Taylor Series

Continuity and Limits; Derivatives and Anti-Derivatives

The most fundamental notion in continuous mathematics is the idea of a limit: the value that an expression inexorably approaches, possibly from below, possibly from above, possibly oscillating around it, tending ever closer but possibly never actually reaching it. We have already encountered limits in the power series definitions of transcendental functions. When a computer calculates something as seemingly straightforward as $\cos(37^{\circ})$, it merely approximates it by truncating (taking a finite number of terms of) an infinite series whose limit is $\cos(37^{\circ})$. The entire monumental edifice of The Calculus - invented in the decades before 1700 independently by Isaac Newton and Gottfried Leibniz, and described by John von Neumann as ``the first achievement of modern mathematics, and the greatest technical advance in exact thinking'' - is built upon the notion of the limit.

Here are some properties of limits, for continuous functions f(x) and g(x):

$\displaystyle \lim_{x \rightarrow c}[f(x)+g(x)] = \lim_{x \rightarrow c}f(x) + \lim_{x \rightarrow c}g(x)$ (14)

$\displaystyle \lim_{x \rightarrow c}[f(x)-g(x)] = \lim_{x \rightarrow c}f(x) - \lim_{x \rightarrow c}g(x)$ (15)

$\displaystyle \lim_{x \rightarrow c}[f(x)g(x)] = \lim_{x \rightarrow c}f(x) \; \lim_{x \rightarrow c}g(x)$ (16)

$\displaystyle \lim_{x \rightarrow c}[kf(x)] = k\lim_{x \rightarrow c}f(x)$ (17)

$\displaystyle \lim_{x \rightarrow c}\frac{f(x)}{g(x)} = \frac{\lim_{x \rightarrow c}f(x)}{\lim_{x \rightarrow c}g(x)} \;\; \mbox{provided} \;\; \lim_{x \rightarrow c}g(x) \ne 0$ (18)
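These properties can be checked numerically. A minimal Python sketch, using the illustrative example functions f(x) = sin(x)/x and g(x) = cos(x) near c = 0 (both chosen here, not taken from the notes), illustrates the product rule (16):

```python
import math

# Check the product rule for limits (equation 16) numerically:
# lim[f(x)g(x)] as x -> c should equal (lim f)(lim g).
# Here f(x) = sin(x)/x and g(x) = cos(x), with c = 0, where
# lim f = 1 and lim g = 1, so the product should tend to 1.
def f(x):
    return math.sin(x) / x

def g(x):
    return math.cos(x)

for dx in (1e-1, 1e-3, 1e-6):
    x = 0.0 + dx               # approach c = 0 from above
    print(x, f(x) * g(x))      # tends to 1 * 1 = 1
```

Note that f(0) itself is undefined (0/0), yet the limit of the product exists and equals the product of the limits.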

The derivative of a function f(x), denoted f'(x) or $\frac{df(x)}{dx}$, signifies its instantaneous rate of change at a point x. It is defined as the limit of its Newton Quotient at that point:

\begin{displaymath}f'(x) \equiv \lim_{\Delta x \rightarrow 0}\frac{f(x+\Delta x)-f(x)}{\Delta x}
\end{displaymath} (19)

The derivative of f(x) exists wherever the above limit exists. A necessary condition is that f(x) be continuous at the point, i.e. that for the point c in the domain of f(x) we have $\lim_{x \rightarrow c} f(x) = f(c)$. Note, however, that continuity alone is not sufficient: f(x) = |x| is continuous at x = 0 but has no derivative there.
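A short Python sketch of the Newton Quotient (19), using the illustrative function f(x) = x^2 (whose exact derivative is 2x) and a shrinking step $\Delta x$:

```python
# Newton Quotient (equation 19) with a shrinking step dx.
# For f(x) = x**2 the exact derivative is f'(x) = 2x, so at
# x = 3 the quotient should approach 6.
def newton_quotient(f, x, dx):
    return (f(x + dx) - f(x)) / dx

f = lambda t: t**2
for dx in (1e-1, 1e-3, 1e-6):
    print(dx, newton_quotient(f, 3.0, dx))   # approaches 6
```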


Review of Rules of Differentiation (material not lectured)
For a continuous function f(x) that is sampled only at a set of discrete points $\{ x_{1},x_{2},\ldots,x_{n} \}$, the derivative can be estimated by a finite difference, defined as you might expect:

\begin{displaymath}f'(x_{k}) \approx \frac{f(x_{k}) - f(x_{k-1})}{x_{k}-x_{k-1}}
\end{displaymath} (28)

When a computer is used to calculate derivatives of continuous data or signals, they must be sampled at a finite number of points; the above finite difference then becomes an estimator of the instantaneous derivative. Clearly, the finite difference approaches the instantaneous derivative in the limit that the sampling interval becomes small: $x_{k} \rightarrow x_{k-1}$.
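A Python sketch of this finite-difference estimator (28), using an illustrative signal f(x) = sin(x) sampled uniformly (the signal and sampling rate are assumptions for the example):

```python
import math

# Finite-difference estimate (equation 28) of the derivative of a
# sampled signal, here f(x) = sin(x) on [0, 2*pi] at n + 1 points.
n = 1000
xs = [k * (2 * math.pi / n) for k in range(n + 1)]
fs = [math.sin(x) for x in xs]

# Backward difference at each sample k >= 1; it estimates cos(x).
dfs = [(fs[k] - fs[k - 1]) / (xs[k] - xs[k - 1]) for k in range(1, n + 1)]

print(dfs[0], math.cos(xs[0]))   # the two values nearly agree
```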

The area A under a function between two definite points is called its definite integral, and it can be calculated in several ways. Numerically, it can be estimated as the limit of a sum of small rectangular areas inscribed under the function, each of whose height is equal to the value of the function at that point, and whose width $\Delta x$ shrinks to zero:

\begin{displaymath}A = \lim_{n \rightarrow \infty} \sum_{k=1}^{n} f(x_{k}) \Delta x
\end{displaymath} (29)

Such a summation, in the limit, is the definite integral of the function over the domain covered by the shrinking rectangles; the integral sign ${\displaystyle \int}$ originates from the letter S of the Latin word summa, meaning sum. Thus we denote

\begin{displaymath}\int_{a}^{b} f(x) dx \equiv \lim_{n \rightarrow \infty} \sum_{k=1}^{n} f(x_{k}) \Delta x
\end{displaymath} (30)

where the set of samples $f(x_{k})$ is taken uniformly from $x_{1}=a$ to $x_{n}=b$, so that $\Delta x = (b-a)/n$. The above expression is also termed a Riemann Integral.
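The Riemann sum of equation (30) can be sketched directly in Python, here using the illustrative integrand f(x) = sin(x), whose integral from 0 to $\pi$ is exactly 2:

```python
import math

# Riemann sum (equation 30): n uniform rectangles of width
# dx = (b - a)/n, sampled at the right-hand endpoints.
def riemann(f, a, b, n):
    dx = (b - a) / n
    return sum(f(a + k * dx) * dx for k in range(1, n + 1))

# The integral of sin(x) from 0 to pi is exactly 2.
print(riemann(math.sin, 0.0, math.pi, 10000))
```

Increasing n drives the sum towards the true area, exactly as the limit in (30) prescribes.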

Many of the properties we noted earlier for limits obviously apply to definite integrals, since they are themselves defined as limits. For example:

$\displaystyle \int_{a}^{b} k f(x)\, dx = k\int_{a}^{b} f(x)\, dx$ (31)

$\displaystyle \int_{a}^{b} [f(x) + g(x)]\, dx = \int_{a}^{b} f(x)\, dx + \int_{a}^{b} g(x)\, dx$ (32)

$\displaystyle \int_{a}^{b} [f(x) - g(x)]\, dx = \int_{a}^{b} f(x)\, dx - \int_{a}^{b} g(x)\, dx$ (33)

$\displaystyle \int_{a}^{b} f(x)\, dx \le \int_{a}^{b} g(x)\, dx \;\; \mbox{if} \;\; f(x) \le g(x) \;\; \mbox{on} \;\; [a,b]$ (34)

$\displaystyle \int_{a}^{b} f(x)\, dx + \int_{b}^{c} f(x)\, dx = \int_{a}^{c} f(x)\, dx$ (35)

The antiderivative of f(x) is denoted F(x): it is the function whose derivative is f(x), i.e. the function which satisfies

\begin{displaymath}\frac{dF(x)}{dx} = f(x)
\end{displaymath} (36)

Often one can find the antiderivative of f(x) simply by applying the rules for differentiation in reverse. For example, since we know that if n is a positive integer

\begin{displaymath}\frac{d}{dx} (x^{n})= n x^{n-1}
\end{displaymath} (37)

we can infer that if $f(x) = x^{n}$, then its antiderivative is:

\begin{displaymath}F(x) = \frac{1}{n+1} x^{n+1}
\end{displaymath} (38)

Because these are relatively simple symbol-manipulating rules, they can easily be programmed into symbolic maths packages such as Stephen Wolfram's famous Mathematica, or Macsyma, to generate the antiderivatives of even very complicated expressions.
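As an illustration, the open-source SymPy package (chosen here simply as a freely available stand-in for the packages named above, and assumed installed) performs the same reversal of the differentiation rules:

```python
import sympy

x = sympy.symbols('x')

# The reversed power rule, equation (38), recovered symbolically:
print(sympy.integrate(x**3, x))               # x**4/4

# More complicated integrands are handled the same way:
print(sympy.integrate(x * sympy.exp(x), x))   # an antiderivative of x*e^x
```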

Remarkably, the First Fundamental Theorem of Integral Calculus asserts that in order to calculate the integral of a function f(x) between two points a and b, we need only evaluate its antiderivative F(x) at those two points, and subtract!

\begin{displaymath}\int_{a}^{b}f(x) dx = F(b) - F(a)
\end{displaymath} (39)
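A Python sketch comparing the two routes, using the illustrative function f(x) = x^2 on [1, 2] with the antiderivative F(x) = x^3/3 from equation (38):

```python
# Equation (39): a Riemann sum for the integral of f(x) = x**2
# over [1, 2] should agree with F(b) - F(a), where F(x) = x**3/3
# is the antiderivative from equation (38).
def riemann(f, a, b, n):
    dx = (b - a) / n
    return sum(f(a + k * dx) * dx for k in range(1, n + 1))

numeric = riemann(lambda t: t**2, 1.0, 2.0, 100000)
exact = 2.0**3 / 3 - 1.0**3 / 3   # F(2) - F(1) = 7/3
print(numeric, exact)
```

The laborious limit of sums and the two evaluations of F agree, which is precisely the content of the theorem.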


Neil Dodgson
2000-10-23