
11/3/12

Taylor Series

In mathematics, a Taylor series is a representation of a function as an infinite sum of terms that are calculated from the values of the function's derivatives at a single point.

The Taylor series of a real or complex-valued function ƒ(x) that is infinitely differentiable in a neighborhood of a real or complex number a is the power series:

f(a) + (f '(a) / 1!)*(x-a) + (f ''(a) / 2!)*(x-a)^2 + (f '''(a) / 3!)*(x-a)^3 + ...
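As a concrete sketch of the formula above, here is how the series can be evaluated numerically for e^x about a = 0, where every derivative equals e^a = 1, so the k-th term is simply x^k / k! (the function names here are illustrative, not from any particular library):

```python
import math

def taylor_exp(x, n_terms):
    # Partial sum of the Taylor series of e^x about a = 0:
    # sum of x^k / k! for k = 0 .. n_terms - 1
    return sum(x**k / math.factorial(k) for k in range(n_terms))

# Ten terms already agree closely with math.exp at x = 1
print(taylor_exp(1.0, 10))
print(math.exp(1.0))
```

With ten terms the partial sum matches e = 2.71828... to several decimal places, since the first omitted term is 1/10!.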

It is common practice to approximate a function by using a finite number of terms of its Taylor series. Taylor's theorem gives quantitative estimates on the error in this approximation. Any finite number of initial terms of the Taylor series of a function is called a Taylor polynomial. The Taylor series of a function is the limit of that function's Taylor polynomials, provided that the limit exists. A function may not be equal to its Taylor series, even if its Taylor series converges at every point.

As the degree of the Taylor polynomial rises, it generally becomes a better approximation of the function near the expansion point; for analytic functions, the Taylor polynomials converge to the function on the interval of convergence.
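The improvement with degree can be checked numerically. The sketch below (an illustration, with hypothetical function names) builds Taylor polynomials of sin(x) about a = 0 and shows that the error at x = 1 shrinks as the degree grows:

```python
import math

def taylor_sin(x, degree):
    # Taylor polynomial of sin(x) about a = 0, keeping terms up to x^degree.
    # Only odd powers appear, with alternating signs: x - x^3/3! + x^5/5! - ...
    total = 0.0
    for k in range(degree + 1):
        if k % 2 == 1:
            total += (-1) ** (k // 2) * x**k / math.factorial(k)
    return total

x = 1.0
for degree in (1, 3, 5, 7):
    err = abs(taylor_sin(x, degree) - math.sin(x))
    print(degree, err)
```

Each printed error is smaller than the last, illustrating the convergence described above.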