### How fast can you estimate \(\pi\)?

###### posted by Nicolai Baldin on January 25, 2017

There are quite a few ways to calculate the number \(\pi\). Here I discuss one elegant way based on Monte Carlo simulation of independent uniformly distributed random variables. It is a toy illustrative application of the results of our recent paper with Markus Reiss, "Unbiased estimation of the volume of a convex body", available here.

Let us draw points \(X_1, \dots, X_N\) independently from the uniform distribution over the square \([0,1] \times [0,1]\)
and count the number \(n\) of points that fall inside the quarter disc \(C\) of radius \(1\) centred at the origin.
Let \(\hat{\pi} := 4n/N\) denote four times the fraction of points that fall inside the circle.
It approximates \(\pi\) well because it is an unbiased estimator:
$$\E[\hat{\pi}] = \frac{4}{N} \E[n] = \frac{4}{N} \E\big[\sum_{i = 1}^{N} \Ind(X_i \in C)\big] = 4 \cdot \frac{\pi}{4} = \pi\,,
$$
and its mean squared risk is therefore governed by the variance:
$$
\E\big[(\hat{\pi} - \pi)^2\big] = \Var(\hat{\pi}) = \frac{16}{N^2}\Var(n) = \frac{16}{N^2} \Var\big(\sum_{i = 1}^{N} \Ind(X_i \in C)\big) = \frac{16}{N}\frac{\pi}{4} \big(1 - \frac{\pi}{4}\big)\,.
$$
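In code, this basic Monte Carlo estimator takes only a few lines. Here is a minimal sketch in pure Python (the function name `mc_pi` is my own, not from the paper):

```python
import random

def mc_pi(N, seed=0):
    """Basic Monte Carlo estimate of pi: four times the fraction of
    uniform points in the unit square that land in the quarter disc."""
    rng = random.Random(seed)
    # Count points (x, y) with x^2 + y^2 <= 1, i.e. inside the quarter disc.
    n = sum(1 for _ in range(N)
            if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4 * n / N

print(mc_pi(100_000))
```

With \(N = 10^5\) points the standard deviation of the estimate is roughly \(0.005\), so one typically gets only two to three correct digits.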
It turns out that \(\hat{\pi}\) is even the maximum likelihood estimator. Surprisingly, the same data points allow us to
estimate \(\pi\) at a much faster rate. Following the results of Section 4 in the paper,
we define our optimal estimator of \(\pi\) as
$$
\hat{\pi}_{opt} = 4 \frac{n + 1}{n_\circ + 1} |\hat{C}|\,,
$$
where \(n_\circ\) is the number of points lying strictly inside the convex hull \(\hat{C}\) of the points falling inside the circle (the hull vertices themselves, lying on the boundary of \(\hat{C}\), are not counted).
Theorem 3.2 together with Theorem 4.5 in the paper states that the
mean squared risk of the estimator \( \hat{\pi}_{opt} \) converges at the rate
\(\E\big[(\hat{\pi}_{opt} - \pi)^2\big] = \bigO(N^{-5/3})\); see the figure below for a numerical comparison of the two estimators.
Note that both estimators can be computed in polynomial time.
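As a sketch of how \(\hat{\pi}_{opt}\) might be computed, the snippet below builds the convex hull with Andrew's monotone chain algorithm and measures its area with the shoelace formula, all in pure Python; a practical implementation might instead use `scipy.spatial.ConvexHull`. The function names and the strict-interior reading of \(n_\circ\) are my assumptions, not code from the paper:

```python
import random

def convex_hull(points):
    """Andrew's monotone chain: hull vertices in counter-clockwise order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def polygon_area(v):
    """Shoelace formula for the area of a simple polygon."""
    return 0.5 * abs(sum(v[i][0] * v[(i + 1) % len(v)][1]
                         - v[(i + 1) % len(v)][0] * v[i][1]
                         for i in range(len(v))))

def mc_pi_opt(N, seed=0):
    """Sketch of the hull-based estimator: 4 * (n+1)/(n_o+1) * |hull area|."""
    rng = random.Random(seed)
    pts = [(rng.random(), rng.random()) for _ in range(N)]
    inside = [p for p in pts if p[0] ** 2 + p[1] ** 2 <= 1.0]
    n = len(inside)
    hull = convex_hull(inside)
    n_o = n - len(hull)  # points strictly inside the hull (vertices excluded)
    return 4 * (n + 1) / (n_o + 1) * polygon_area(hull)

print(mc_pi_opt(50_000))
```

In my runs this version lands noticeably closer to \(\pi\) than the plain estimator for the same \(N\), consistent with the faster \(N^{-5/3}\) risk rate.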