Sums of Powers 

Let S_{k}(n) denote the sum of the kth powers of the integers 0 through n. (For convenience this makes S_{0}(n) = n+1, counting the j = 0 term as 0^0 = 1.) It’s fairly easy to determine the explicit formula for these sums directly from the definition. For example, by the definition of the sum of cubes function we have

$$S_3(n) \;=\; 0^3 + 1^3 + 2^3 + \cdots + n^3 \;=\; S_3(n-1) + n^3$$

so it's plausible that S_{3} is a polynomial in n of finite degree. Also, since the sum consists of n+1 terms, each no larger than n^3, the asymptotic growth rate doesn't exceed the 4th power of n, so we expect that S_{3} can be expressed in the form

$$S_3(n) \;=\; An^4 + Bn^3 + Cn^2 + Dn + E$$

for some constants A,B,C,D,E. We immediately see that E must be zero to give S_{3}(0) = 0. Furthermore, from the defining relation we have 

$$An^4 + Bn^3 + Cn^2 + Dn \;=\; A(n-1)^4 + B(n-1)^3 + C(n-1)^2 + D(n-1) \;+\; n^3$$

Expanding each of the binomials, collecting terms by powers of n, and setting the coefficient of each power to zero, we find that A = 1/4, B = 1/2, C = 1/4, and D = 0, so the formula for the sum of the first n cubes is 

$$S_3(n) \;=\; \frac{n^4}{4} + \frac{n^3}{2} + \frac{n^2}{4} \;=\; \left(\frac{n(n+1)}{2}\right)^2$$
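As a sanity check on this algebra, the closed form can be compared against the defining sum by brute force. The following Python sketch (the function names are mine, not from the text) uses exact rational arithmetic so no floating-point error can creep in:

```python
from fractions import Fraction

def S3_closed(n):
    # Conjectured closed form: n^4/4 + n^3/2 + n^2/4 = (n(n+1)/2)^2
    n = Fraction(n)
    return n**4 / 4 + n**3 / 2 + n**2 / 4

def S3_direct(n):
    # Definition: sum of the cubes of 0, 1, ..., n
    return sum(j**3 for j in range(n + 1))

# The two agree for every n we try
for n in range(100):
    assert S3_closed(n) == S3_direct(n)
print(S3_direct(10))  # 3025 = (10*11/2)^2
```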

The formulas for the sums of other powers can be derived in the same purely algebraic way. (For a discussion of a generalization of this approach, see the note on Sums and Differences of Discrete Functions.) However, it’s natural to wonder whether these formulas might be derivable in terms of the integral of x^{k} from x = 0 to n, since evaluating this integral is trivial. In fact, we can see that the first term in the formula for S_{3}(n) is simply the integral of x^{3} from x = 0 to n. (In a sense, this was the justification for our presumption that S_{k}(n) is a polynomial of degree k+1.) On the other hand, the summation has some additional terms that don’t appear in the integral, so we can’t simply equate the sum to the corresponding integral. 

To see exactly how sums of powers are related to integrals of powers, note that the integral of (x+j)^k over the interval from 0 to 1 equals the integral of x^k over the interval from j to j+1, so summing over j = 0 to n we have

$$\int_0^1 \left[\, x^k + (x+1)^k + (x+2)^k + \cdots + (x+n)^k \,\right] dx \;=\; \int_0^{n+1} x^k \, dx$$

Letting C_{k,m} denote the binomial coefficient k!/(m!(k-m)!), we can expand the binomials and collect terms by powers of x to give

$$\int_0^1 \sum_{m=0}^{k} C_{k,m}\, S_m\, x^{k-m} \, dx \;=\; \int_0^{n+1} x^k \, dx$$

where we have omitted the arguments on S_{k}, with the understanding that they represent S_{k}(n). Carrying out the integrations on both sides, we get 

$$\sum_{m=0}^{k} \frac{C_{k,m}}{k-m+1}\, S_m \;=\; \frac{(n+1)^{k+1}}{k+1}$$

This equation is valid for any nonnegative integer k. Multiplying through by k+1, and noting that (k+1)C_{k,m}/(k-m+1) = C_{k+1,m}, it gives the following infinite sequence of equations involving the sums of powers:

$$\begin{aligned}
S_0 &= (n+1)\\
S_0 + 2S_1 &= (n+1)^2\\
S_0 + 3S_1 + 3S_2 &= (n+1)^3\\
S_0 + 4S_1 + 6S_2 + 4S_3 &= (n+1)^4\\
&\;\;\vdots
\end{aligned}$$
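This family of identities is easy to verify numerically. A short Python sketch (the helper name is mine) checks that the coefficients of each equation are the binomial coefficients C_{k+1,m}:

```python
from math import comb

def S(m, n):
    # S_m(n): direct sum of the m-th powers of 0, 1, ..., n
    # (Python evaluates 0**0 as 1, so S(0, n) == n + 1 as required.)
    return sum(j**m for j in range(n + 1))

# Check that sum_{m=0}^{k} C(k+1, m) * S_m(n) == (n+1)^(k+1) for every k, n
for k in range(8):
    for n in range(12):
        lhs = sum(comb(k + 1, m) * S(m, n) for m in range(k + 1))
        assert lhs == (n + 1) ** (k + 1)
print("identity verified for k < 8, n < 12")
```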

We can solve the first N of these equations to give expressions for S_{0}, S_{1}, …, S_{N-1}. For example, to find formulas for S_{0} through S_{6}, we simply solve the system of equations

$$\begin{bmatrix}
1 & 0 & 0 & 0 & 0 & 0 & 0\\
1 & 2 & 0 & 0 & 0 & 0 & 0\\
1 & 3 & 3 & 0 & 0 & 0 & 0\\
1 & 4 & 6 & 4 & 0 & 0 & 0\\
1 & 5 & 10 & 10 & 5 & 0 & 0\\
1 & 6 & 15 & 20 & 15 & 6 & 0\\
1 & 7 & 21 & 35 & 35 & 21 & 7
\end{bmatrix}
\begin{bmatrix} S_0\\ S_1\\ S_2\\ S_3\\ S_4\\ S_5\\ S_6 \end{bmatrix}
\;=\;
\begin{bmatrix} (n+1)\\ (n+1)^2\\ (n+1)^3\\ (n+1)^4\\ (n+1)^5\\ (n+1)^6\\ (n+1)^7 \end{bmatrix}$$

Solving this system of equations by inverting the coefficient matrix on the left side, we get 

$$\begin{bmatrix} S_0\\ S_1\\ S_2\\ S_3\\ S_4\\ S_5\\ S_6 \end{bmatrix}
\;=\;
\begin{bmatrix}
1 & 0 & 0 & 0 & 0 & 0 & 0\\[2pt]
-\frac{1}{2} & \frac{1}{2} & 0 & 0 & 0 & 0 & 0\\[2pt]
\frac{1}{6} & -\frac{1}{2} & \frac{1}{3} & 0 & 0 & 0 & 0\\[2pt]
0 & \frac{1}{4} & -\frac{1}{2} & \frac{1}{4} & 0 & 0 & 0\\[2pt]
-\frac{1}{30} & 0 & \frac{1}{3} & -\frac{1}{2} & \frac{1}{5} & 0 & 0\\[2pt]
0 & -\frac{1}{12} & 0 & \frac{5}{12} & -\frac{1}{2} & \frac{1}{6} & 0\\[2pt]
\frac{1}{42} & 0 & -\frac{1}{6} & 0 & \frac{1}{2} & -\frac{1}{2} & \frac{1}{7}
\end{bmatrix}
\begin{bmatrix} (n+1)\\ (n+1)^2\\ (n+1)^3\\ (n+1)^4\\ (n+1)^5\\ (n+1)^6\\ (n+1)^7 \end{bmatrix}$$
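The inverse can be reproduced with exact rational arithmetic. The following Python sketch (the helper names are mine) builds the binomial-coefficient matrix and inverts it by Gauss-Jordan elimination:

```python
from fractions import Fraction
from math import comb

N = 7
# Coefficient matrix of the system: row k has entries C(k+1, m) for m = 0..k
M = [[Fraction(comb(k + 1, m)) if m <= k else Fraction(0) for m in range(N)]
     for k in range(N)]

def invert(A):
    # Gauss-Jordan elimination on [A | I] with exact rational arithmetic
    n = len(A)
    aug = [row[:] + [Fraction(int(i == j)) for j in range(n)]
           for i, row in enumerate(A)]
    for col in range(n):
        piv = next(r for r in range(col, n) if aug[r][col] != 0)
        aug[col], aug[piv] = aug[piv], aug[col]
        p = aug[col][col]
        aug[col] = [x / p for x in aug[col]]
        for r in range(n):
            if r != col and aug[r][col] != 0:
                f = aug[r][col]
                aug[r] = [x - f * y for x, y in zip(aug[r], aug[col])]
    return [row[n:] for row in aug]

B = invert(M)
print(B[4][:5])  # row for S_4: -1/30, 0, 1/3, -1/2, 1/5
```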

Letting b_{k,j} denote the element in the jth column of the kth row of the coefficient matrix (indexing rows and columns from 0), we can write the sums of the kth powers of the first n integers as

$$S_k(n) \;=\; \sum_{j=0}^{k} b_{k,j}\, (n+1)^{j+1} \qquad (1)$$

Thus we have 

$$\begin{aligned}
S_1(n) &= -\tfrac{1}{2}(n+1) + \tfrac{1}{2}(n+1)^2\\[2pt]
S_2(n) &= \tfrac{1}{6}(n+1) - \tfrac{1}{2}(n+1)^2 + \tfrac{1}{3}(n+1)^3\\[2pt]
S_3(n) &= \tfrac{1}{4}(n+1)^2 - \tfrac{1}{2}(n+1)^3 + \tfrac{1}{4}(n+1)^4\\[2pt]
S_4(n) &= -\tfrac{1}{30}(n+1) + \tfrac{1}{3}(n+1)^3 - \tfrac{1}{2}(n+1)^4 + \tfrac{1}{5}(n+1)^5
\end{aligned}$$

and so on. Expanding the binomials and simplifying, these expressions reduce to 

$$\begin{aligned}
S_1(n) &= \frac{n(n+1)}{2}\\[2pt]
S_2(n) &= \frac{n(n+1)(2n+1)}{6}\\[2pt]
S_3(n) &= \frac{n^2(n+1)^2}{4}\\[2pt]
S_4(n) &= \frac{n(n+1)(2n+1)(3n^2+3n-1)}{30}
\end{aligned}$$

and so on. It appears that S_{k} for every positive even value of k is divisible by the factors n, (n+1), and (2n+1). It would be interesting to know what other divisibility patterns these sums possess. Also, notice that the elements of the coefficient array satisfy the relation

$$b_{k,j} \;=\; \frac{k}{j+1}\; b_{k-1,j-1}$$

and applying this relation repeatedly, so that each b_{k,j} is expressed in terms of the first-column value b_{k-j,0}, we can rewrite equation (1) as

$$S_k(n) \;=\; \frac{1}{k+1} \sum_{j=0}^{k} C_{k+1,\,j+1}\; b_{k-j,0}\; (n+1)^{j+1} \qquad (2)$$

Therefore, letting B_{k}[x] denote the polynomial 

$$B_k[x] \;=\; \sum_{j=0}^{k} C_{k,j}\; b_{j,0}\; x^{k-j}$$

we can write the sum of the kth powers of the nonnegative integers 0 through n succinctly as

$$S_k(n) \;=\; \frac{B_{k+1}[n+1] \,-\, B_{k+1}[0]}{k+1}$$
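Taking the first-column values b_{j,0} as read off the inverted matrix above, this compact formula can be tested directly. A Python sketch (the function names are mine):

```python
from fractions import Fraction
from math import comb

# First-column entries b_{j,0} of the inverted coefficient matrix
b0 = [Fraction(1), Fraction(-1, 2), Fraction(1, 6), Fraction(0),
      Fraction(-1, 30), Fraction(0), Fraction(1, 42)]

def Bpoly(k, x):
    # B_k[x] = sum_j C(k,j) b_{j,0} x^(k-j)
    return sum(comb(k, j) * b0[j] * Fraction(x) ** (k - j) for j in range(k + 1))

def S(k, n):
    # S_k(n) = (B_{k+1}[n+1] - B_{k+1}[0]) / (k+1)
    return (Bpoly(k + 1, n + 1) - Bpoly(k + 1, 0)) / (k + 1)

# Agrees with the direct sums for every k, n tried
for k in range(6):
    for n in range(15):
        assert S(k, n) == sum(j**k for j in range(n + 1))
```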

Notice also that equation (2) can be written as 

$$S_k(n) \;=\; \sum_{j=0}^{k} \left[\, \frac{C_{k+1,\,j+1}}{k+1}\; b_{k-j,0} \,\right] (n+1)^{j+1}$$

from which, comparing coefficients with equation (1), it follows that

$$b_{k,j} \;=\; \frac{C_{k+1,\,j+1}}{k+1}\; b_{k-j,0}$$

Thus we can express all the b_{k,j} values in terms of binomial coefficients and the values of b_{0,0}, b_{1,0}, b_{2,0}, b_{3,0}, and so on. These numbers are called the Bernoulli numbers, usually denoted by B_{k} = b_{k,0}. To easily generate these numbers (without inverting large matrices), consider our previous expression for S_{k} in the special case of n = 0, so that S_{k} = 0 for all k > 0. In this case we have 

$$0 \;=\; \sum_{j=0}^{k} C_{k+1,\,j+1}\, B_{k-j} \;=\; \sum_{j=0}^{k} C_{k+1,\,j}\, B_j \qquad (k \ge 1)$$

Hence we have the recurrence relation 

$$B_k \;=\; -\frac{1}{k+1} \sum_{j=0}^{k-1} C_{k+1,\,j}\, B_j \qquad (k \ge 1)$$

For example, given the values of B_{0} = 1, B_{1} = -1/2, B_{2} = 1/6, B_{3} = 0, we can compute B_{4} from the relation

$$B_0 + 5B_1 + 10B_2 + 10B_3 + 5B_4 \;=\; 0$$

Inserting the values of B_{0} through B_{3}, we find that B_{4} equals -1/30.
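The recurrence makes it easy to generate as many Bernoulli numbers as desired, and with them the sums of powers. A Python sketch (the helper names are mine), using the convention B_1 = -1/2 as above:

```python
from fractions import Fraction
from math import comb

def bernoulli(kmax):
    # B_0 = 1; for k >= 1, sum_{j=0}^{k} C(k+1, j) B_j = 0
    B = [Fraction(1)]
    for k in range(1, kmax + 1):
        s = sum(comb(k + 1, j) * B[j] for j in range(k))
        B.append(-s / (k + 1))
    return B

B = bernoulli(10)
print(B[1], B[4], B[6])  # -1/2 -1/30 1/42

def S(k, n):
    # S_k(n) = (1/(k+1)) * sum_j C(k+1, j) B_j (n+1)^(k+1-j)
    return sum(comb(k + 1, j) * B[j] * (n + 1) ** (k + 1 - j)
               for j in range(k + 1)) / (k + 1)

# Reproduces the sum of 10th powers of the first 1000 numbers quoted below
assert S(10, 1000) == sum(j**10 for j in range(1001))
```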

The Bernoulli numbers were introduced by Jacques Bernoulli around 1690 in a short paper on computing the sums of powers. He was obviously pleased with his formulas when he wrote 

With the help of [these formulas] it took me less than half of a quarter of an hour to find that the 10th powers of the first 1000 numbers being added together will yield the sum 
91409924241424243424241924242500 
From this it will become clear how useless was the work of Ismael Bullialdus spent on the compilation of his voluminous Arithmetica Infinitorum in which he did nothing more than compute with immense labor the sums of the first six powers, which is only a part of what we have accomplished in the space of a single page. 

(One senses how distasteful Jacques found it to belittle one of his colleagues.) Even with the formula for the sums of 10th powers, it’s impressive that Bernoulli could compute that number by hand in under 7.5 minutes. Of course, working with N = 1000 in base 10 probably made the arithmetic easier. In any case, one must agree that the labor Bullialdus put into his tables of sums of powers was wasted, considering how easy it is to derive the explicit formulas for these sums.

Incidentally, given the formulas for the sums of powers, it follows that if f(k) is any polynomial function of k, we can easily express the sum of f(k) for k = 0 to n by considering the sums of each separate term, which are pure sums of powers. This is the origin of the Euler-Maclaurin formula, which is just a generalization of Bernoulli’s formula, replacing x^{k} with an arbitrary function f(x).
