Vieta's Formulas and $\zeta(2)$
03 Oct 2018

Today I came up with a new idea to prove the famous identity

$$\sum_{n=1}^\infty\frac1{n^2}=\frac{\pi^2}6.$$
There is no shortage of existing proofs of this identity; see this math.SE question, or this collection of proofs compiled by Robin Chapman. I was inspired by this illuminating video, where 3Blue1Brown explains an elementary proof based on Euclidean geometry, via a physical interpretation involving systems of lighthouses. My new idea comes from trying to translate this proof into the complex plane.
Vieta’s formulas
The main idea in the video is to get an expression for the sum of inverse squared distances from the origin to the vertices of a certain regular $n$-gon. We first observe that when viewed on the complex plane, the vertices of this $n$-gon are the roots of some polynomial of degree $n$, and the sum of inverse squares can be computed by Vieta’s formulas.
More explicitly, suppose that $\rho_1,\rho_2,\ldots,\rho_n$ are the roots of the polynomial

$$f(z)=a_nz^n+a_{n-1}z^{n-1}+\cdots+a_1z+a_0,$$
with $a_n\neq0$ (so that there are $n$ roots) and $a_0\neq0$ (so the roots are nonzero), and we want to find the sum of inverse squares $\sum_j\frac1{\rho_j^2}$.
The astute reader might have noticed that this objective is different from the video, which computes $\sum_j\frac1{\lvert\rho_j\rvert^2}$ instead. We chose this reformulation because it is easier to handle algebraically.
If we had wanted to find the sum of squares $\sum_j\rho_j^2$ instead, we could use the following calculation:

$$\sum_j\rho_j^2=\Big(\sum_j\rho_j\Big)^2-2\sum_{j<k}\rho_j\rho_k\overset{(*)}{=}\frac{a_{n-1}^2}{a_n^2}-\frac{2a_{n-2}}{a_n},$$

where for $(* )$ we used Vieta’s formulas

$$\sum_j\rho_j=-\frac{a_{n-1}}{a_n},\qquad\sum_{j<k}\rho_j\rho_k=\frac{a_{n-2}}{a_n}.$$
To get the sum of inverse squares instead, we need to apply the above formula to some polynomial with roots $\frac1{\rho_1},\frac1{\rho_2},\ldots,\frac1{\rho_n}$. But this is simple: note that

$$0=\frac{f(\rho_j)}{\rho_j^n}=a_0\left(\frac1{\rho_j}\right)^n+a_1\left(\frac1{\rho_j}\right)^{n-1}+\cdots+a_{n-1}\frac1{\rho_j}+a_n,$$

so the polynomial $a_0z^n+a_1z^{n-1}+\cdots+a_{n-1}z+a_n$ has roots $\frac1{\rho_j}$. Now the formula above gives

$$\sum_j\frac1{\rho_j^2}=\frac{a_1^2}{a_0^2}-\frac{2a_2}{a_0}.$$
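As a quick sanity check of this formula, take for instance the cubic $f(z)=(z-1)(z-2)(z-3)=z^3-6z^2+11z-6$, so that $a_0=-6$, $a_1=11$ and $a_2=-6$: then

$$\frac{a_1^2}{a_0^2}-\frac{2a_2}{a_0}=\frac{121}{36}-2=\frac{49}{36}=\frac1{1^2}+\frac1{2^2}+\frac1{3^2},$$

as expected.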
An Eulerian “proof”
In the video, a sequence of regular polygons is chosen such that the vertices tend to the sequence of odd integers on the real line. In our reformulation, we should thus choose a sequence of polynomials $f_n$ which tends to some power series $f$ with regularly spaced roots. For our purposes, we will choose

$$f(z)=\frac{e^z-1}z=1+\frac z{2!}+\frac{z^2}{3!}+\frac{z^3}{4!}+\cdots,$$
which has roots $2\pi ik$ for $k\in\bb Z\backslash\{0\}$.
If we pretend that our sum-of-inverse-squares formula works not just for polynomials, but for general power series, then applying it to $f$ (with $a_0=1$, $a_1=\frac12$, $a_2=\frac16$) gives

$$\sum_{k\in\bb Z\backslash\{0\}}\frac1{(2\pi ik)^2}=\frac{a_1^2}{a_0^2}-\frac{2a_2}{a_0}=\frac14-\frac13=-\frac1{12}.$$

Since the left-hand side equals $-\frac1{2\pi^2}\sum_{k\geq1}\frac1{k^2}$, this rearranges to $\sum_{k\geq1}\frac1{k^2}=\frac{\pi^2}6$,
which is a “proof” in the style of Euler that I think he might have enjoyed.
A dead end?
To make the argument rigorous, we need polynomials $f_n$ whose roots all lie on a circle, with $f_n\to f$ as $n\to\infty$; a very nice choice is given by

$$f_n(z)=\frac{\left(1+\frac zn\right)^n-1}z,$$

which has roots at $n(e^{2\pi ik/n}-1)$ for $n\nmid k$, i.e. lying on a circle with radius $n$ centred at $z=-n$. As a sanity check, note that for fixed $k$, the above expression tends to $2\pi ik$; hence the roots of $f_n$ “tend pointwise” to the roots of $f$.
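Indeed, one way to see this is to expand the exponential for fixed $k$:

$$n\left(e^{2\pi ik/n}-1\right)=n\left(\frac{2\pi ik}n+O\!\left(\frac1{n^2}\right)\right)=2\pi ik+O\!\left(\frac1n\right)\longrightarrow2\pi ik\quad\text{as }n\to\infty.$$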
However, this is not enough to show that the sum-of-inverse-squares of the $n-1$ roots of $f_n$ tends to the sum-of-inverse-squares of the roots of $f$. This is where our argument falls short of the geometric proof, which had good control on the error of the finite approximation.
A tour de force
To finish the argument, we have to abandon our initial hopes for a quick proof, and get some exact expressions from the sum-of-inverse-squares formula.
We fix $n$, and set $\omega=e^{2\pi i/n}$. Then the roots of $f_n$ are $\rho_{k,n}=n(\omega^k-1)$ for $1\leq k\leq n-1$. Note that

$$\omega^k-1=e^{\pi ik/n}\left(e^{\pi ik/n}-e^{-\pi ik/n}\right)=2ie^{\pi ik/n}\sin\frac{\pi k}n.$$

Hence

$$\frac1{\rho_{k,n}^2}=\frac1{n^2(\omega^k-1)^2}=-\frac{e^{-2\pi ik/n}}{4n^2\sin^2\frac{\pi k}n}.$$
We can sum this over $k$, take the real part, and compare with the sum-of-inverse-squares formula to obtain

$$\sum_{k=1}^{n-1}\frac{1-\cot^2\frac{\pi k}n}{4n^2}=\mathrm{Re}\sum_{k=1}^{n-1}\frac1{\rho_{k,n}^2}=\frac{a_1^2}{a_0^2}-\frac{2a_2}{a_0}=\frac{(n-1)^2}{4n^2}-\frac{(n-1)(n-2)}{3n^2},$$

where $a_0=1$, $a_1=\frac1{n^2}\binom n2$ and $a_2=\frac1{n^3}\binom n3$ are the first three coefficients of $f_n$. This rearranges to

$$\sum_{1\leq k<n/2}\cot^2\frac{\pi k}n=\frac12\sum_{k=1}^{n-1}\cot^2\frac{\pi k}n=\frac{(n-1)(n-2)}6,$$

since $\cot^2(\pi-\theta)=\cot^2\theta$ (and the term $k=\frac n2$, present only when $n$ is even, vanishes).
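As a quick check of this identity, take for instance $n=4$: the left-hand side is $\cot^2\frac\pi4=1$, and indeed $\frac{(4-1)(4-2)}6=1$; likewise the full sum is $\cot^2\frac\pi4+\cot^2\frac\pi2+\cot^2\frac{3\pi}4=1+0+1=2=\frac{(4-1)(4-2)}3$.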
Now we are in a position to finish as in Cauchy’s proof of the identity, by invoking the following trick: adding 1 to each term above (there are at most $\frac{n-1}2$ of them, and $1+\cot^2\theta=\csc^2\theta$) gives

$$\sum_{1\leq k<n/2}\csc^2\frac{\pi k}n\leq\frac{(n-1)(n-2)}6+\frac{n-1}2=\frac{n^2-1}6.$$
By the elementary inequality $\sin\theta<\theta<\tan\theta$ for $\theta\in(0,\frac\pi2)$, we have $\cot^2\theta<\frac1{\theta^2}<\csc^2\theta$, so taking $\theta=\frac{\pi k}n$ and summing over $1\leq k<\frac n2$ gives

$$\frac{(n-1)(n-2)}6=\sum_{1\leq k<n/2}\cot^2\frac{\pi k}n<\sum_{1\leq k<n/2}\frac{n^2}{\pi^2k^2}<\sum_{1\leq k<n/2}\csc^2\frac{\pi k}n\leq\frac{n^2-1}6,$$
and we finish by taking $n\to\infty$.
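Explicitly, multiplying through by $\frac{\pi^2}{n^2}$ gives

$$\frac{\pi^2}{n^2}\cdot\frac{(n-1)(n-2)}6<\sum_{1\leq k<n/2}\frac1{k^2}<\frac{\pi^2}{n^2}\cdot\frac{n^2-1}6,$$

and as $n\to\infty$ both outer bounds tend to $\frac{\pi^2}6$ while the partial sums in the middle tend to $\sum_{k\geq1}\frac1{k^2}$, which completes the proof.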