Commit 1fa2427a authored by David

Wrote up to the simple example of MC integration

## Introduction
The Monte Carlo method is a very powerful tool of statistical physics. Monte Carlo methods are as useful as they are widespread; for example, one can even perform [molecular dynamics using Monte Carlo methods](). There's a reason the method is named after Monaco's famous casino: it utilises probability and randomness. In most cases, a system is evolved to a new state that is chosen from a randomly generated ensemble of possible future states. This new state is then accepted or rejected with a certain probability, according to some criterion. This approach is used in many different areas, with end goals ranging from reaching a Bose-Einstein ground state, to minimizing the risk of an investment portfolio, to [optimizing the boarding process of an airplane](). Considering this breadth of applications, we have chosen to center this second project on Monte Carlo methods.
## Monte Carlo integration
While there are multiple categories of Monte Carlo methods, we will focus on Monte Carlo integration. To see the advantage of this technique, consider a system described by a Hamiltonian $H(R)$, where $R$ denotes the set of all degrees of freedom of the system. This might include terms like a magnetic field $B$, a potential $V$, etc. We are interested in a specific observable of this system, $A(R)$; in particular, we would like to know its expectation value $\langle A\rangle$. From statistical physics, the likelihoods of all system states are summarized in the partition function:
$$Z=\int e^{-\beta H(R)}dR$$
where $\beta=1/(k_B T)$ and the integral runs over all degrees of freedom of the system. The Boltzmann factor $e^{-\beta H(R)}$ weighs the probability of each state. The expectation value can then be expressed as:
$$\langle A\rangle = \frac{1}{Z} \int e^{-\beta H(R)} A(R)\,dR$$
For most systems, $R$ is a collection of many parameters, so this is a high-dimensional integral for which an analytic solution is often impossible. A numerical approach is therefore required to compute the expectation value. In the next section, we demonstrate how to sample an integral and convert it into a sum, which is much easier for a computer to evaluate. A bit later, you will see why Monte Carlo integration quickly becomes beneficial.
### A simple example
Take a general, one-dimensional integral $I=\int_a^bf(x)dx$. We can rewrite this integral into a summation as follows:
$$\int_a^bf(x)dx = (b-a)\int_a^b \frac{1}{b-a} f(x)dx= \lim_{N \rightarrow \infty} \frac{b-a}{N} \sum_i^N f(x_i)$$
where $x_i = a + i\,\frac{b-a}{N}$. One could say the $\{x_i\}$ are distributed uniformly in the interval $[a,b]$.
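As a quick numerical check of this finite sum, here is a minimal sketch; the integrand $f(x) = x^2$ on $[0,1]$ is a hypothetical choice (not taken from the notes), with exact integral $1/3$:

```python
import numpy as np

# Hypothetical integrand: f(x) = x^2 on [a, b] = [0, 1]; the exact integral is 1/3.
def f(x):
    return x**2

a, b = 0.0, 1.0
N = 100_000

# Evenly spaced points x_i = a + i * (b - a) / N, as in the sum above
x = a + np.arange(N) * (b - a) / N

# Finite-N approximation: (b - a) / N * sum_i f(x_i)
I_grid = (b - a) / N * np.sum(f(x))
print(I_grid)
```

For large $N$ the result approaches the exact value.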
The limit is only needed for the integral and the summation to be exactly equal. From probability theory, we learn that:
$$\int p(x)f(x)dx \approx \frac{1}{N}\sum_i f(x_i)$$
Now, the $x_i$ are randomly drawn from the probability distribution $p(x)$. In other words: we are sampling the function $f(x)$ with values drawn from $p(x)$. This way, the result of the integral can be estimated from a finite summation. In the previous example, the $x_i$ were not random but evenly spaced.
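The same one-dimensional integral can be estimated with randomly drawn $x_i$. Taking $p(x) = 1/(b-a)$, the uniform distribution on $[a,b]$, the factor $(b-a)$ reappears in front of the sum; the integrand $f(x)=x^2$ is again a hypothetical choice for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)  # fixed seed for reproducibility

# Hypothetical integrand; its exact integral over [0, 1] is 1/3
def f(x):
    return x**2

a, b = 0.0, 1.0
N = 100_000

# Draw the x_i from the uniform distribution p(x) = 1/(b - a) on [a, b]
x = rng.uniform(a, b, size=N)

# (1/N) * sum_i f(x_i) estimates the integral of p(x) f(x),
# so multiply by (b - a) to undo the factor p(x)
I_mc = (b - a) * np.mean(f(x))
print(I_mc)
```

The statistical error of such an estimate shrinks like $1/\sqrt{N}$, a point we return to below.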
### Why Monte Carlo integration becomes beneficial for high-dimensional integrals
$$\int_a^bf(x)dx = (b-a)\int_a^b \frac{1}{b-a} f(x)dx=\frac{b-a}{N} \sum_i^N f(x_i)$$
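The advantage in many dimensions can be sketched with a hypothetical integrand $f(x) = \sum_{i=1}^d x_i^2$ over the unit hypercube $[0,1]^d$, whose exact integral is $d/3$: a grid with a mere 10 points per axis would need $10^d$ function evaluations, while the Monte Carlo estimate uses a fixed sample budget regardless of $d$:

```python
import numpy as np

rng = np.random.default_rng(0)

d = 10        # dimension: a grid with 10 points per axis needs 10**10 evaluations
N = 100_000   # Monte Carlo sample budget, independent of d

# Hypothetical integrand on the unit hypercube [0, 1]^d; exact integral is d / 3
def f(x):
    return np.sum(x**2, axis=1)

# N random points, each a vector of d independent uniform coordinates
x = rng.uniform(0.0, 1.0, size=(N, d))

# The volume of [0, 1]^d is 1, so the estimate is simply the sample mean
I_mc = np.mean(f(x))
print(I_mc, d / 3)
```

With only $10^5$ samples, the estimate is already within a fraction of a percent of $d/3 \approx 3.33$, far cheaper than the $10^{10}$ grid evaluations.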
## Importance sampling