Monte Carlo as Intuition

I am a teaching assistant for a Monte Carlo methods course. I explained the following exercise as an application of the Monte Carlo principle. It is nice because it coincides with a very intuitive guess.

Suppose we have a continuous random variable $X \sim p(x)$ and we would like to estimate its tail probability, i.e. the probability of the event $\{X \geq t\}$ for some $t \in \mathbb{R}$. What is the most intuitive way to do this? I bet that a child could give an answer (along with a reasonable explanation) in the following way: Take many samples of $X$, say $N$ of them, i.e. take $X^{(i)}$ for $i \in \{1,\ldots,N\}$. Then simply count the ones that are bigger than $t$; the following quantity gives you an estimate of the tail probability: $$T = \frac{\#\{X^{(i)} \geq t\}}{N}.$$ So among $N$ samples, you count the ones bigger than $t$ and simply calculate the ratio. Very intuitive.
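This counting estimator is a one-liner in practice. Here is a minimal sketch in which I assume, purely as an illustration, that $p(x)$ is the standard normal density (the argument works for any distribution we can sample from):

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw N exact samples from p(x); a standard normal is assumed here as an example.
N = 100_000
t = 1.0
samples = rng.standard_normal(N)

# Count the samples that are >= t and divide by N.
tail_estimate = np.count_nonzero(samples >= t) / N

# For the standard normal, the true tail probability P(X >= 1) is about 0.1587,
# so the estimate should land close to that value.
print(tail_estimate)
```

With $N = 10^5$ samples the standard error of this estimator is roughly $\sqrt{0.16 \cdot 0.84 / N} \approx 0.001$, so the printed value should agree with the true tail probability to about two decimal places.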

Let's see how this procedure arises from a rigorous Monte Carlo perspective.

Using Monte Carlo methods, one can estimate the expectation of any test function $\varphi(x)$ provided that we can draw exact samples from $p(x)$, in other words, if we can take many $X^{(i)}$'s. So let us take $N$ i.i.d. copies of the random variable as above, and try to calculate $\mathbb{E}[\varphi(X)]$. We recall the convention $\mathbb{P}(\mbox{d}x) = p(x)\mbox{d}x$, and write: $$\mathbb{E}[\varphi(X)] = \int \varphi(x) \mathbb{P}(\mbox{d}x) = \int \varphi(x) p(x) \mbox{d}x$$ Using the random variables $X^{(i)}$, we construct the following particle approximation of the measure $\mathbb{P}(\mbox{d}x)$: $$\mathbb{P}^N(\mbox{d}x) = \frac{1}{N} \sum_{i = 1}^N \delta_{X^{(i)}}(\mbox{d}x)$$ The Monte Carlo estimate of the expectation is then given by $$\bar{\varphi} = \int \varphi(x) p(x) \mbox{d}x \approx \int \varphi(x) \frac{1}{N} \sum_{i=1}^N \delta_{X^{(i)}}(x) \mbox{d}x$$ By the sifting property of the Dirac delta, the empirical expectation can be computed as $$\bar{\varphi} = \frac{1}{N} \sum_{i=1}^N \varphi(X^{(i)})$$

The quantity we set out to estimate above was precisely $\mathbb{P}(X \geq t)$. This is of course given by $$\mathbb{P}(X\geq t) = \int_t^\infty p(x) \mbox{d}x$$ Let us rewrite the limits of the integral using an indicator function: $$\int_t^\infty p(x) \mbox{d}x = \int_{-\infty}^\infty \mathbf{1}_{\{x \geq t\}}(x) p(x) \mbox{d}x$$ hence we can write the tail probability as an expectation: $$\mathbb{P}(X \geq t) = \mathbb{E}[\mathbf{1}_{\{X\geq t\}}]$$ and the story follows given the samples from $p(x)$. Just by putting $\varphi(x) = \mathbf{1}_{\{x \geq t\}}(x)$ and computing $\bar{\varphi}$ as above, one can verify as an exercise that applying the MC principle to this expectation results exactly in counting the samples above $t$ and dividing by $N$.
