Power series

In mathematics, a power series is an infinite series whose terms involve successive powers of a variable, typically with real or complex coefficients. If the series converges, its value determines a function of the variable involved. Conversely, given a function, it may be possible to form a power series from the successive derivatives of the function: this Taylor series is then a power series in its own right.

Formally, let z be a variable and $$a_n$$ be a sequence of real or complex coefficients. The associated power series is


 * $$\sum_{n=0}^\infty a_n z^n . \,$$
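As a numerical illustration (the helper name `partial_sum` is ours, not standard), the value of a power series inside its region of convergence can be approximated by its partial sums; with $$a_n = 1/n!$$ the series converges to $$e^z$$:

```python
import math

def partial_sum(coeffs, z):
    """Evaluate the truncated power series sum of a_n * z**n."""
    return sum(a * z**n for n, a in enumerate(coeffs))

# With a_n = 1/n!, the partial sums at z = 1 approach e = 2.71828...
coeffs = [1 / math.factorial(n) for n in range(30)]
approx = partial_sum(coeffs, 1.0)
```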

Radius of convergence
Over the complex numbers the series has a radius of convergence R, a real number with the property that the series converges for every complex number z with $$\vert z \vert < R$$, and R is the "largest" number with this property (the supremum of all numbers with this property). If the series converges for all complex numbers, we formally say that the radius of convergence is infinite.

For example


 * $$\sum n! z^n$$ converges only for $$z=0$$ and has radius of convergence zero.
 * $$\sum z^n$$ converges for all $$\vert z \vert < 1$$, but diverges for $$z=1$$ and so has radius of convergence 1.
 * $$\sum z^n / n!$$ converges for all complex numbers z and so has radius of convergence infinity.
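The behavior of the second example is easy to observe numerically; a minimal sketch (the helper name is ours):

```python
def partial_sum(coeffs, z):
    """Evaluate the truncated power series sum of a_n * z**n."""
    return sum(a * z**n for n, a in enumerate(coeffs))

# sum z^n inside the radius of convergence: at z = 0.5 the partial
# sums approach 1/(1 - 0.5) = 2.
inside = partial_sum([1.0] * 60, 0.5)
# At z = 1 every term equals 1, so the partial sums are 1, 2, 3, ...
# and diverge; with 60 terms the partial sum is already 60.
diverging = partial_sum([1.0] * 60, 1.0)
```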

More generally we may consider power series in a complex variable $$z-a$$ for a fixed complex number a.

Within the radius of convergence, a power series determines an analytic function of z. Derivatives of all orders exist, and the Taylor series exists and is equal to the original power series.
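Termwise differentiation of the coefficient sequence is immediate; a sketch (the function name is illustrative):

```python
def differentiate(coeffs):
    """Termwise derivative: sum a_n z^n maps to sum n a_n z^(n-1)."""
    return [n * a for n, a in enumerate(coeffs)][1:]

# The geometric series 1 + z + z^2 + ... differentiates to
# 1 + 2z + 3z^2 + ..., which is the series of 1/(1-z)^2.
deriv = differentiate([1, 1, 1, 1, 1])  # [1, 2, 3, 4]
```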

Convergence tests
Some of the standard tests for convergence of series translate into computations of the radius of convergence R.
 * D'Alembert ratio test: if the limit of the sequence $$\left\vert \frac{a_{n+1}}{a_n} \right\vert$$ exists, then this is equal to 1/R.
 * Cauchy n-th root test: if the limit of the sequence $$\vert a_n \vert^{1/n}$$ exists, then this is equal to 1/R.
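Both tests can be applied numerically to the earlier examples; a sketch (the estimator functions are illustrative, not library routines):

```python
import math

def ratio_estimate(a, n):
    """Approximate 1/R by the ratio |a_{n+1} / a_n| at a large index n."""
    return abs(a(n + 1) / a(n))

def root_estimate(a, n):
    """Approximate 1/R by the n-th root |a_n|**(1/n) at a large index n."""
    return abs(a(n)) ** (1.0 / n)

# a_n = 1/n!: the ratio is 1/(n+1), which tends to 0, so R is infinite.
r1 = ratio_estimate(lambda n: 1.0 / math.factorial(n), 50)  # 1/51
# a_n = 1: both estimates equal 1, so R = 1.
r2 = root_estimate(lambda n: 1.0, 1000)
```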

Algebra of power series
Power series may be added and multiplied. If $$\sum a_n z^n$$ and $$\sum b_n z^n$$ are power series, we may define their sum and product


 * $$ \left(\sum a_n z^n\right) + \left(\sum b_n z^n \right) = \sum (a_n+b_n) z^n \, $$
 * $$ \left(\sum a_n z^n\right) \cdot \left(\sum b_n z^n \right) = \sum_{n=0}^\infty \left(\sum_{k=0}^n a_k b_{n-k}\right) z^n . \, $$

and these purely algebraic definitions are consistent with the values achieved within the region of convergence.
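These coefficient formulas translate directly into code; a sketch over truncated coefficient lists (helper names ours):

```python
def add_series(a, b):
    """Coefficientwise sum: (a + b)_n = a_n + b_n."""
    n = max(len(a), len(b))
    a = list(a) + [0] * (n - len(a))
    b = list(b) + [0] * (n - len(b))
    return [x + y for x, y in zip(a, b)]

def multiply_series(a, b, terms):
    """Cauchy product: c_n = sum_{k=0}^{n} a_k * b_{n-k}, truncated."""
    return [sum(a[k] * b[n - k]
                for k in range(n + 1) if k < len(a) and n - k < len(b))
            for n in range(terms)]

# (sum z^n) * (sum z^n) = sum (n+1) z^n, since c_n has n+1 summands.
product = multiply_series([1] * 6, [1] * 6, 6)  # [1, 2, 3, 4, 5, 6]
```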

If a power series g has constant term $$b_0 = 0$$, then the n-th power of g involves only powers of z with exponent at least n. Hence if f denotes the series $$\sum a_n z^n$$ it makes sense to consider the composite


 * $$f(g) = \sum_{n=0}^\infty a_n g^n \, $$

as a power series in z, since any given power of z will appear in only finitely many of the terms $$g^n$$. Again this purely algebraic definition is consistent with function composition within the region of convergence.
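The finiteness argument above makes truncated composition computable; a sketch (the function name is ours):

```python
def compose_series(f, g, terms):
    """Truncated coefficients of f(g(z)), assuming g[0] == 0 so that
    each power of z receives contributions from finitely many g**n."""
    assert g[0] == 0, "composition needs a g with zero constant term"
    g = list(g) + [0] * (terms - len(g))
    result = [0] * terms
    power = [1] + [0] * (terms - 1)   # holds g**n, starting at g**0 = 1
    for a in f[:terms]:
        result = [r + a * p for r, p in zip(result, power)]
        # multiply 'power' by g, truncating at 'terms' coefficients
        power = [sum(power[k] * g[m - k] for k in range(m + 1))
                 for m in range(terms)]
    return result

# With f = sum z^n and g = z + z^2, f(g) = 1/(1 - z - z^2), whose
# coefficients are the Fibonacci numbers.
fib = compose_series([1] * 6, [0, 1, 1], 6)  # [1, 1, 2, 3, 5, 8]
```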

Formal power series
Let R be any ring. A formal power series over R, in the variable X, is a formal sum $$\sum a_n X^n$$ with coefficients $$a_n \in R$$. Addition and multiplication are now defined purely formally, with no questions of convergence, by the formulae above. The formal power series form another ring, denoted $$R[[X]]$$.

Inversion of power series
The power series $$f$$ is called the inverse series of the power series g if and only if all coefficients of the expansion of $$f(g(z))-z$$ in powers of $$z$$ are zero.

To simplify the formulas, it is assumed that the constant term vanishes and that the first coefficient is unity: $$f_0=0$$ and $$f_1=1$$. Then $$g_0=0$$ and $$g_1=1$$, and
 * $$g_2=-f_2$$
 * $$g_3=2f_2^2-f_3$$
 * $$g_4=-5f_2^3+5f_3f_2-f_4$$
 * $$g_5=6f_4f_2+14f_2^4-21f_3f_2^2+3f_3^2-f_5$$
 * $$g_6=7f_5f_2+84f_3f_2^3-28f_3^2f_2+7f_3f_4-28f_4f_2^2-42f_2^5-f_6$$
 * $$g_7=-36f_5f_2^2+8f_5f_3+8f_6f_2+120f_4f_2^3-72f_4f_3f_2+4f_4^2+132f_2^6-330f_3f_2^4+180f_3^2f_2^2-12f_3^3-f_7$$

and so on.
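These coefficients can be generated mechanically by solving $$f(g(z))=z$$ one order at a time: with $$g_n$$ still unknown, the $$z^n$$ coefficient of $$f(g(z))$$ is off by exactly the $$f_1 g_n = g_n$$ contribution, so $$g_n$$ is chosen to cancel the rest. A sketch using exact rational arithmetic (helper names ours):

```python
from fractions import Fraction

def _compose(f, g, terms):
    """Truncated coefficients of f(g(z)); assumes g[0] == 0."""
    g = list(g) + [Fraction(0)] * (terms - len(g))
    result = [Fraction(0)] * terms
    power = [Fraction(1)] + [Fraction(0)] * (terms - 1)  # g**0 = 1
    for a in f[:terms]:
        result = [r + a * p for r, p in zip(result, power)]
        power = [sum(power[k] * g[m - k] for k in range(m + 1))
                 for m in range(terms)]
    return result

def inverse_series(f, terms):
    """Coefficients g_0 .. g_{terms-1} of the inverse series of f,
    assuming f[0] == 0 and f[1] == 1, solved term by term from
    f(g(z)) = z."""
    f = [Fraction(c) for c in f] + [Fraction(0)] * max(0, terms - len(f))
    g = [Fraction(0), Fraction(1)]
    for n in range(2, terms):
        g.append(Fraction(0))
        # With g_n still 0, the z^n coefficient of f(g) differs from its
        # true value by exactly g_n (the f_1 * g_n term), so cancel it.
        g[n] = -_compose(f, g, n + 1)[n]
    return g

# f(z) = z + z^2: the formulas above give g_2 = -f_2 = -1,
# g_3 = 2f_2^2 - f_3 = 2, g_4 = -5f_2^3 + 5f_3 f_2 - f_4 = -5.
inv = inverse_series([0, 1, 1], 5)  # [0, 1, -1, 2, -5]
```

Exact rationals avoid the rounding error that floating point would accumulate in the repeated convolutions.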