Numerical Differentiation


For a given data set, it may be important to determine the rates of change - or higher derivatives - of the information in the set. Unless you have a symbolic formula for the system in question, you will need to approximate these derivatives numerically. The following page looks at two ways to find formulas for numerical approximations to derivatives: using polynomial interpolation and using Taylor series approximations. Note: this page uses 0-indexing for data sets in order to be consistent with languages like Python; if you are using MATLAB, move everything over by 1 and go to Numerical Differentiation (1-indexed)!

Polynomial Interpolation

One method for determining the numerical approximation to a derivative is to generate a symbolic representation of the data points and then take the analytical derivative of that representation. The following sections demonstrate this process for data sets containing two or three points.

Two Point Approximations

Assume you have two data points, where \(x\) represents the independent values and \(y\) represents the dependent values:

\( \begin{align} \begin{array}{c|c} x & y \\ \hline x_0 & y_0 \\ x_1 & y_1 \end{array} \end{align} \)

You can write a first-order interpolating polynomial using Newton Polynomials as:

\( \begin{align} \hat{y}(x)&=y_0+\left(\frac{y_1-y_0}{x_1-x_0}\right)\left(x-x_0\right) \end{align} \)

Taking the first derivative yields:

\( \begin{align} \frac{d\hat{y}(x)}{dx}&=\left(\frac{y_1-y_0}{x_1-x_0}\right) \end{align} \)

which is the same for any \(x\) between \(x_0\) and \(x_1\).

Generalizing this, you can use any two points in your data set to calculate a derivative using this formula. You must first decide whether to use the point you are at and the point behind it (backward) or the point you are at and the point ahead of it (forward). Also, if the independent values are evenly spaced, you can simplify by noting that \(x_j-x_{j-1}=\Delta x\) for all \(j\). In summary:

\( \begin{align} \begin{array}{|c|c|c|}\hline \mbox{2-pt 1st derivative} & \mbox{Backward} & \mbox{Forward} \\ \hline \mbox{General} & \left.\frac{dy}{dx}\right|_{x_k}=\frac{y_{k}-y_{k-1}}{x_k-x_{k-1}} & \left.\frac{dy}{dx}\right|_{x_k}=\frac{y_{k+1}-y_{k}}{x_{k+1}-x_{k}} \\ \hline \mbox{Evenly Spaced} & \left.\frac{dy}{dx}\right|_{x_k}=\frac{y_{k}-y_{k-1}}{\Delta x} & \left.\frac{dy}{dx}\right|_{x_k}=\frac{y_{k+1}-y_{k}}{\Delta x} \\ \hline \end{array} \end{align} \)

Note that each formula fails at one endpoint of the data set: the backward-looking first derivative cannot be applied at the first point, and the forward-looking first derivative cannot be applied at the last point. In those cases, use the other formula instead.

Note that two-point models can only yield approximations to the first derivative; the approximation to the second derivative requires more points.
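As a concrete illustration, here is a minimal Python sketch, assuming a made-up data set sampled from \(y=x^2\) (so the exact derivative is \(2x\)), that applies the forward difference everywhere it works and falls back to the backward difference at the last point:

<syntaxhighlight lang="python">
import numpy as np

# Hypothetical example data: y = x**2 sampled at evenly spaced points;
# the exact derivative is 2x.
x = np.linspace(0, 2, 5)
y = x**2

dydx = np.zeros_like(y)
# Forward difference at every point that has a neighbor ahead of it...
dydx[:-1] = (y[1:] - y[:-1]) / (x[1:] - x[:-1])
# ...and a backward difference at the last point, which does not.
dydx[-1] = (y[-1] - y[-2]) / (x[-1] - x[-2])

print(dydx)
</syntaxhighlight>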

Three Point Approximations

Assume you have three data points, where \(x\) represents the independent values and \(y\) represents the dependent values:

\( \begin{align} \begin{array}{c|c} x & y \\ \hline x_0 & y_0 \\ x_1 & y_1 \\ x_2 & y_2 \end{array} \end{align} \)

You can write a second-order interpolating polynomial using Newton Polynomials as:

\( \begin{align} \hat{y}(x)&=y_0+\left(\frac{y_1-y_0}{x_1-x_0}\right)\left(x-x_0\right) + \left( \frac{\left(\frac{y_2-y_1}{x_2-x_1}\right)-\left(\frac{y_1-y_0}{x_1-x_0}\right)}{x_2-x_0} \right)\left(x-x_0\right)\left(x-x_1\right) \end{align} \)

While this can be used for unevenly spaced data, it is more common to look at even spacing; replacing the spacings between neighboring \(x\) values with \(\Delta x\) yields the more manageable:

\( \begin{align} \hat{y}(x)&=y_0+\left(\frac{y_1-y_0}{\Delta x}\right)\left(x-x_0\right) + \left( \frac{y_2-2y_1+y_0}{2\left(\Delta x\right)^2} \right) \left(\left(x-x_0\right)^2-\Delta x\left(x-x_0\right)\right) \end{align} \)

Taking the first derivative of this polynomial gives:

\( \begin{align} \frac{d\hat{y}(x)}{dx}&=\frac{y_1-y_0}{\Delta x} + \left( \frac{y_2-2y_1+y_0}{2\left(\Delta x\right)^2} \right)\left(2\left(x-x_0\right)-\Delta x\right) \end{align} \)

Evaluating this at \(x_0\), \(x_1\), and \(x_2\) produces the three-point forward, centered, and backward approximations to the first derivative, respectively:

\( \begin{align} \left.\frac{dy}{dx}\right|_{x_0}&=\frac{-3y_0+4y_1-y_2}{2\Delta x} & \left.\frac{dy}{dx}\right|_{x_1}&=\frac{y_2-y_0}{2\Delta x} & \left.\frac{dy}{dx}\right|_{x_2}&=\frac{y_0-4y_1+3y_2}{2\Delta x} \end{align} \)

Taking the derivative a second time yields a single three-point approximation to the second derivative, valid at any of the three points:

\( \begin{align} \frac{d^2\hat{y}(x)}{dx^2}&=\frac{y_2-2y_1+y_0}{\left(\Delta x\right)^2} \end{align} \)
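A minimal Python sketch of these formulas, assuming an evenly spaced grid and an arbitrary test function \(y=\sin(x)\) (exact derivative \(\cos(x)\)), might look like:

<syntaxhighlight lang="python">
import numpy as np

# Hypothetical test case: y = sin(x) on an evenly spaced grid;
# the exact derivative is cos(x).
dx = 0.1
x = np.arange(0, 1 + dx/2, dx)
y = np.sin(x)

dydx = np.empty_like(y)
dydx[0] = (-3*y[0] + 4*y[1] - y[2]) / (2*dx)      # three-point forward
dydx[1:-1] = (y[2:] - y[:-2]) / (2*dx)            # three-point centered
dydx[-1] = (y[-3] - 4*y[-2] + 3*y[-1]) / (2*dx)   # three-point backward

print(np.max(np.abs(dydx - np.cos(x))))           # small: error is O(dx**2)
</syntaxhighlight>

For evenly spaced data, NumPy's built-in np.gradient(y, dx, edge_order=2) applies these same centered and one-sided formulas and should produce the same results.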

Taylor Series

Another way to determine the numerical approximation to derivatives is based on the Taylor Series:

\( \begin{align} f(x\pm \Delta x) &= \sum_{n=0}^{\infty}\frac{f^{(n)}(x)}{n!}\left(\pm \Delta x\right)^n \end{align} \)

where \(f^{(n)}(x)\) represents the \(n\)th derivative of \(f(x)\). Imagine three discrete points next to each other on a line, one at \(x\) and the other two \(\Delta x\) away on either side.

\( \begin{align} f(x-\Delta x) &\approx f(x) - \Delta x \frac{df(x)}{dx} + \frac{\Delta x^2}{2}\frac{d^2f(x)}{dx^2} - \frac{\Delta x^3}{6}\frac{d^3f(x)}{dx^3}+O(\Delta x^4)\\ f(x) &= f(x)\\ f(x+\Delta x) &\approx f(x) + \Delta x \frac{df(x)}{dx} + \frac{\Delta x^2}{2}\frac{d^2f(x)}{dx^2} + \frac{\Delta x^3}{6}\frac{d^3f(x)}{dx^3}+O(\Delta x^4)\\ \end{align} \)

where the Order operator \(O()\) signifies the leading order of the remaining terms.

From this, we can find approximations to various derivatives of \(f(x)\) at \(x\). Assume that the \(n\)th derivative can be approximated by a weighted sum of the three values above:

\( \begin{align} \frac{d^nf(x)}{dx^n} &\approx af(x-\Delta x)+bf(x)+cf(x+\Delta x)\\ \frac{d^nf(x)}{dx^n} &\approx a\left(f(x) - \Delta x \frac{df(x)}{dx} + \frac{\Delta x^2}{2}\frac{d^2f(x)}{dx^2} - \frac{\Delta x^3}{6}\frac{d^3f(x)}{dx^3}+O(\Delta x^4)\right)+bf(x)+\\ ~ &~ ~~c\left(f(x) + \Delta x \frac{df(x)}{dx} + \frac{\Delta x^2}{2}\frac{d^2f(x)}{dx^2} + \frac{\Delta x^3}{6}\frac{d^3f(x)}{dx^3}+O(\Delta x^4)\right)\\ \frac{d^nf(x)}{dx^n} &\approx (a+b+c)f(x)+(-a+c)\Delta x\frac{df(x)}{dx}+ \frac{(a+c)\Delta x^2}{2}\frac{d^2f(x)}{dx^2}+\frac{(-a+c)\Delta x^3}{6}\frac{d^3f(x)}{dx^3}+(a+c)O(\Delta x^4) \end{align} \)

To get the first derivative only, solve for the coefficients that zero out the terms you do not want and keep just the first derivative term:

\( \begin{align} a+b+c&=0 & \mbox{Eliminate 0th derivative}\\ (-a+c)\Delta x&= 1&\mbox{Keep 1st derivative}\\ (a+c)&=0&\mbox{Eliminate 2nd derivative} \end{align} \)

Notice that there is no way to keep the first derivative and get rid of the third derivative, since the coefficients of both depend on \(-a+c\). This means that a three-point approximation to the first derivative will have some error, the leading term of which comes from the third derivative. It turns out that three-point approximations to the second derivative have their leading error come from the fourth derivative term instead. In any event, the solution to the above system of three equations and three unknowns is:

\( \begin{align} a&=\frac{-1}{2\Delta x} & b&=0 & c&=\frac{1}{2\Delta x} \end{align} \)
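If you would rather not solve the system by hand, a short symbolic check is possible; here is a sketch using SymPy (the symbol names are arbitrary) that sets up and solves the same three equations:

<syntaxhighlight lang="python">
import sympy as sp

a, b, c, dx = sp.symbols('a b c Delta_x')

# The three conditions from matching coefficients above:
eqs = [
    sp.Eq(a + b + c, 0),        # eliminate the 0th-derivative term
    sp.Eq((-a + c) * dx, 1),    # keep the 1st-derivative term
    sp.Eq(a + c, 0),            # eliminate the 2nd-derivative term
]

print(sp.solve(eqs, (a, b, c)))
# {a: -1/(2*Delta_x), b: 0, c: 1/(2*Delta_x)}
</syntaxhighlight>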

Substituting these coefficients back into the weighted sum, the approximation to the first derivative is:

\( \begin{align} \frac{df(x)}{dx} &\approx \frac{f(x+\Delta x)-f(x-\Delta x)}{2\Delta x} + O(\Delta x^2) \end{align} \)

where the \(\Delta x^2\) error comes from the combination of the \(\Delta x^3\) in front of the third derivative and the \(\Delta x^{-1}\) in the \(a\) and \(c\) terms. Proceeding in the same way to get the second derivative yields:

\( \begin{align} \frac{d^2f(x)}{dx^2} &\approx \frac{f(x+\Delta x)-2f(x)+f(x-\Delta x)}{\Delta x^2}+ O(\Delta x^2) \end{align} \)
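As a quick sanity check on this error order, here is a Python sketch (using \(f(x)=\sin(x)\) at the arbitrary point \(x=1\)) that halves \(\Delta x\) and watches the error drop by roughly a factor of four:

<syntaxhighlight lang="python">
import numpy as np

# Arbitrary test case: f(x) = sin(x) at x = 1, where the exact second
# derivative is -sin(1).
x = 1.0
for dx in (0.1, 0.05, 0.025):
    d2 = (np.sin(x + dx) - 2 * np.sin(x) + np.sin(x - dx)) / dx**2
    print(dx, abs(d2 + np.sin(x)))
# Each halving of dx cuts the error by about a factor of 4, consistent
# with O(dx**2) accuracy.
</syntaxhighlight>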

You can get higher-order accuracy by including more points, but you increase computational complexity. You can also use only two points to get an approximation, but you sacrifice accuracy. For instance, the two-point backward-looking first derivative uses \(f(x)\) and \(f(x-\Delta x)\) to get the first derivative:

\( \begin{align} \frac{df(x)}{dx}&\approx af(x-\Delta x)+bf(x)\\ \frac{df(x)}{dx}&\approx a\left(f(x) - \Delta x \frac{df(x)}{dx} + \frac{\Delta x^2}{2}\frac{d^2f(x)}{dx^2} - \frac{\Delta x^3}{6}\frac{d^3f(x)}{dx^3}\right)+bf(x)\\ \frac{df(x)}{dx}&\approx (a+b)f(x)-a\Delta x \frac{df(x)}{dx} +a\frac{\Delta x^2}{2}\frac{d^2f(x)}{dx^2} -a\frac{\Delta x^3}{6}\frac{d^3f(x)}{dx^3}\\ \end{align} \)

Eliminating the \(f(x)\) term and keeping the first derivative requires \(b=-a=\frac{1}{\Delta x}\), but the approximation then becomes

\( \begin{align} \frac{df(x)}{dx}&\approx \frac{df(x)}{dx} - \frac{\Delta x}{2}\frac{d^2f(x)}{dx^2} + \frac{\Delta x^2}{6}\frac{d^3f(x)}{dx^3}\\ \end{align} \)

which is only first-order accurate. Trying to use two points for the second derivative would be catastrophic: with only two weights, there is no way to keep the second derivative while eliminating the first, so the error term would end up being larger than the derivative term!
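Here is a companion sketch (again using the arbitrary test case \(f(x)=\sin(x)\) at \(x=1\), with exact derivative \(\cos(1)\)) comparing the two-point backward difference with the three-point centered difference:

<syntaxhighlight lang="python">
import numpy as np

# Arbitrary test case: f(x) = sin(x) at x = 1; exact derivative cos(1).
x = 1.0
for dx in (0.1, 0.05, 0.025):
    backward = (np.sin(x) - np.sin(x - dx)) / dx
    centered = (np.sin(x + dx) - np.sin(x - dx)) / (2 * dx)
    print(dx, abs(backward - np.cos(x)), abs(centered - np.cos(x)))
# The backward error only halves as dx halves (first order), while the
# centered error drops by a factor of four (second order).
</syntaxhighlight>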