Approximating pi by Binomial Averaging
Consider this generator, which implements the Taylor series for arctan(1):
```js
// Yields successive partial sums of 4 * (1 - 1/3 + 1/5 - 1/7 + ...).
function* piTaylor() {
  let res = 0;
  let sign = 1;
  for (let i = 1; true; i += 2) {
    res += sign / i; // add the next term, ±1/i
    sign = -sign;    // alternate the sign
    yield 4 * res;   // the series converges to pi/4, so scale by 4
  }
}
```
This function generates successive approximations of pi using the formula:
π/4 = 1 - 1/3 + 1/5 - 1/7 + 1/9 - ...
Each term alternates between positive and negative, and we multiply the result by 4 to get pi. The sequence converges quite slowly. Here's what the first few terms look like (showing the difference from the actual value of pi):
```
0  0.8584073464102069
1 -0.4749259869231261
2  0.3250740130768736
3 -0.2463545583516975
4  0.1980898860927471
5 -0.1655464775436166
...
```
As we can see, the error alternates between being positive and negative, with the absolute value getting smaller. This means that the sequence alternates between being too high and too low, while slowly getting closer to pi.
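A tiny driver along these lines is enough to reproduce the listings (printErrors is just a throwaway helper name, nothing special):

```js
// Print the first n approximation errors (value minus Math.PI).
function printErrors(gen, n) {
  for (let i = 0; i < n; i++) {
    console.log(i, gen.next().value - Math.PI);
  }
}

printErrors(piTaylor(), 6);
```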
Since the sequence goes up and down, we could conjecture that the average of consecutive terms will give a better approximation.
Averaging Pairs
Enter the pairAvg function, a simple transformation that takes a sequence and returns the average of each pair of consecutive terms:
```js
// Yields the average of each pair of consecutive values from gen.
function* pairAvg(gen) {
  let prv = gen.next().value;
  while (true) {
    const now = gen.next().value;
    yield (prv + now) / 2;
    prv = now;
  }
}
```
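Composing the two generators is then a one-liner (reusing the printErrors sketch from above):

```js
printErrors(pairAvg(piTaylor()), 6);
```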
Here's what the error terms look like after applying pairAvg to piTaylor:
```
0  0.19174067974354037
1 -0.07492598692312624
2  0.03935972736258808
3 -0.02413233612947518
4  0.01627170427456548
5 -0.01170032369746287
...
```
As conjectured, the new sequence converges much faster, but, interestingly, the error again alternates between positive and negative.
This suggests that we might get better results if we apply pairAvg a second time, and indeed we do:
```
0  0.05840734641020706
1 -0.01778312978026885
2  0.00761369561655644
3 -0.00393031592745485
4  0.00228569028855130
5 -0.00144391344105265
...
```
Curiously, after averaging consecutive elements twice, the error keeps switching sign on each step.
Binomial Averaging
Here's where things get really interesting: if we apply pairAvg any number of times to our original sequence, the result still alternates between being too high and too low on each step, and the convergence rate improves dramatically with each application.
I call this operation "binomial averaging" because k iterations of pairAvg are equivalent to taking a weighted sum over a window of k+1 consecutive terms, with weights given by a binomial distribution with p = 1/2.
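For instance, two passes turn the terms a0, a1, a2, ... into (a0 + 2·a1 + a2)/4, (a1 + 2·a2 + a3)/4, and so on. Here's a quick sketch that checks this against a generic sliding-window average (windowAvg is mine, added just for the comparison):

```js
// Average a sliding window of values from gen with the given weights.
function* windowAvg(gen, weights) {
  const buf = [];
  const total = weights.reduce((a, b) => a + b, 0);
  for (const x of gen) {
    buf.push(x);
    if (buf.length > weights.length) buf.shift();
    if (buf.length === weights.length) {
      yield buf.reduce((acc, v, j) => acc + v * weights[j], 0) / total;
    }
  }
}

// Two pairAvg passes vs. binomial weights (1, 2, 1): the two columns
// agree, up to floating-point rounding order.
const twice = pairAvg(pairAvg(piTaylor()));
const binom = windowAvg(piTaylor(), [1, 2, 1]);
for (let i = 0; i < 5; i++) {
  console.log(twice.next().value, binom.next().value);
}
```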
The natural question, then: how fast can we get to pi if we apply pairAvg more times? For example, with 10 passes we reach the limits of double-precision floating point in just 50 steps (this time showing the approximation itself rather than the error):
```
0  3.141633873522418
1  3.141587973355371
2  3.141593481375418
3  3.141592461371706
...
49 3.141592653589793
```
```
$ node
Welcome to Node.js v18.19.0.
Type ".help" for more information.
> 3.141592653589793 == Math.PI
true
```
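For reference, here's the loop that stacks the passes (a minimal sketch, assuming piTaylor and pairAvg from above):

```js
// Apply pairAvg 10 times, then print the first 50 approximations.
let gen = piTaylor();
for (let k = 0; k < 10; k++) gen = pairAvg(gen);
for (let i = 0; i < 50; i++) console.log(i, gen.next().value);
```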
Note that this actually had to run the underlying piTaylor generator for 60 steps, since each averaging pass "delays" the generator output by one step.
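You can check the step count by wrapping the base generator with a counter (countCalls is a hypothetical instrumentation helper, not part of the method):

```js
// Count how many values are pulled from the underlying generator.
function* countCalls(gen, counter) {
  while (true) {
    counter.count++;
    yield gen.next().value;
  }
}

const counter = { count: 0 };
let g = countCalls(piTaylor(), counter);
for (let k = 0; k < 10; k++) g = pairAvg(g);
for (let i = 0; i < 50; i++) g.next();
console.log(counter.count); // 60: 50 outputs plus 10 one-step delays
```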
By trying different numbers of passes, I found that with 16 averaging passes we get a value identical to Node's Math.PI in 16 iterations, or 32 steps of the underlying generator that implements the Taylor series.
```
0  3.141593003936528
1  3.141592626342273
2  3.141592656958023
3  3.141592653032927
4  3.141592653703066
5  3.141592653562805
6  3.141592653597091
7  3.141592653587607
8  3.141592653590510
9  3.141592653589542
10 3.141592653589889
11 3.141592653589756
12 3.141592653589810
13 3.141592653589788
14 3.141592653589797
15 3.141592653589793
```
I conjecture that running k averaging passes for k steps gives us 4k bits of precision.
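A quick way to probe that conjecture is to measure the bits of precision directly, as -log2 of the absolute error (bits and bitsAfter are my own helper names):

```js
// Bits of precision of an approximation of pi.
const bits = (x) => -Math.log2(Math.abs(x - Math.PI));

// Apply pairAvg k times, take k steps, and measure the final error.
function bitsAfter(k) {
  let g = piTaylor();
  for (let i = 0; i < k; i++) g = pairAvg(g);
  let v;
  for (let i = 0; i < k; i++) v = g.next().value;
  return bits(v);
}

for (const k of [2, 4, 6, 8]) console.log(k, bitsAfter(k));
```

Note that once the error hits exactly zero, as it does with 16 passes, this measure comes out as Infinity.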
The reason behind this whole thing remains a mystery to me. If you have insights into why it works, I'd love to hear your explanation.