### From uniform to Gaussian

On one of the homework sets for my statistical mechanics class, we have to juggle probability distributions. Why is that interesting? First of all, individual particles don't behave randomly, but it would take far too much computational power to track every particle in a system - imagine storing six values for each of 10^23 particles and then starting to calculate something. So there are some fancy and useful probability distributions out there that describe a macroscopic system without calculating everything from the microscopic details. Furthermore, your computer needs random numbers all the time (in general, for encryption etc.), but also to run the simulations of such particle systems. Thus, we should care about probability distributions, and I started writing the following blog entry:
It's somehow "simple" but there's still some magic about it: the Box-Muller algorithm transforms random numbers drawn from a uniform distribution on the interval [0, 1] into numbers that follow a Gaussian distribution. What I mean is: you draw two numbers from the interval [0, 1], and any number from this interval is equally likely. You put those numbers into the Box-Muller algorithm and you get back two numbers that follow a Gaussian distribution, i.e. anything from minus infinity up to plus infinity, but most likely something near your mean value. How often you land close to the mean is set by the standard deviation. For a detailed mathematical description you can look here.
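To make this concrete, here is a minimal sketch of the transform in Python using only the standard library (the function name `box_muller` and the sample count are my own choices, not from the homework): it turns pairs of uniform numbers into pairs of standard normal numbers with mean 0 and standard deviation 1.

```python
import math
import random

def box_muller(u1, u2):
    """Map two uniform samples from (0, 1] to two independent
    standard normal samples (mean 0, standard deviation 1)."""
    r = math.sqrt(-2.0 * math.log(u1))  # -log maps (0, 1] to [0, inf): a radius
    theta = 2.0 * math.pi * u2          # a uniform angle in [0, 2*pi)
    return r * math.cos(theta), r * math.sin(theta)

random.seed(0)
samples = []
for _ in range(50000):
    # 1 - random.random() lies in (0, 1], avoiding log(0)
    z0, z1 = box_muller(1.0 - random.random(), random.random())
    samples.append(z0)
    samples.append(z1)

mean = sum(samples) / len(samples)
var = sum((x - mean) ** 2 for x in samples) / len(samples)
```

With 100,000 samples, `mean` should come out close to 0 and `var` close to 1, as expected for a standard Gaussian.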
Our homework asks for the derivation of that mapping, so how do you get to that idea? Well, first I thought about how to get from the interval (0, 1] to any number between 0 and infinity. The natural logarithm can do that for you: ln maps (0, 1] to anything from minus infinity to 0, and its negative maps it to anything from 0 to infinity. The missing signs are then supplied by a cosine and a sine function, and "that's it" :)
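Put together, these pieces give the standard Box-Muller formulas (with $u_1$, $u_2$ drawn uniformly from $(0, 1]$):

```latex
z_0 = \sqrt{-2 \ln u_1}\,\cos(2\pi u_2), \qquad
z_1 = \sqrt{-2 \ln u_1}\,\sin(2\pi u_2)
```

Here $\sqrt{-2 \ln u_1}$ plays the role of a radius and $2\pi u_2$ of an angle; $z_0$ and $z_1$ are two independent standard normal numbers, which you can shift and scale to any mean $\mu$ and standard deviation $\sigma$ via $\mu + \sigma z$.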