One of the practical challenges of Bayesian statistics is dealing with the complex probability distributions involved. You begin with the likelihood function of interest, but once you combine it with the prior distributions of all the parameters, you end up with a complex posterior distribution that you need to characterize. Since you usually can't calculate that distribution analytically, the best you can do is simulate from it (generally using Markov chain Monte Carlo techniques). Packages like RStan handle the simulation for you, but it's fairly easy to code the Metropolis-Hastings algorithm yourself, at least for simple problems.

PhD student Maxwell Joseph did just that, using the R language. Beginning with a data set of 50 points, he set out to estimate the joint posterior distribution of the mean and variance, given simple priors (Normal for the mean; Uniform for the variance). He ran three chains of the M-H algorithm simultaneously and created the animation below. You can see each of the chains (purple, red, and blue) as they progress through the joint distribution of the mean (horizontal axis) and variance (vertical axis), and watch the posterior distribution evolve over time in the 3-D image to the right.
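To give a flavor of what such a sampler looks like, here is a minimal single-chain Metropolis-Hastings sketch in R for the same kind of model: a Normal likelihood with a Normal prior on the mean and a Uniform prior on the variance. The data, prior hyperparameters, proposal scale, and iteration counts below are all illustrative assumptions, not taken from Joseph's actual code.

```r
set.seed(42)
x <- rnorm(50, mean = 5, sd = 2)  # hypothetical data set of 50 points

# Log posterior (up to a constant): Normal(0, 100) prior on the mean,
# Uniform(0, 100) prior on the variance -- illustrative choices.
log_post <- function(mu, sigma2) {
  if (sigma2 <= 0 || sigma2 > 100) return(-Inf)         # outside Uniform prior support
  sum(dnorm(x, mu, sqrt(sigma2), log = TRUE)) +         # log-likelihood
    dnorm(mu, mean = 0, sd = 100, log = TRUE)           # vague Normal prior on mu
}

n_iter <- 5000
chain <- matrix(NA, n_iter, 2, dimnames = list(NULL, c("mu", "sigma2")))
chain[1, ] <- c(0, 1)  # arbitrary starting values

for (i in 2:n_iter) {
  current  <- chain[i - 1, ]
  proposal <- current + rnorm(2, sd = 0.5)  # symmetric random-walk proposal
  # Accept with probability min(1, posterior ratio); on the log scale,
  # compare log(U) against the difference of log posteriors.
  if (log(runif(1)) < log_post(proposal["mu"], proposal["sigma2"]) -
                      log_post(current["mu"],  current["sigma2"])) {
    chain[i, ] <- proposal
  } else {
    chain[i, ] <- current
  }
}

colMeans(chain[-(1:1000), ])  # posterior means after discarding burn-in
```

Running three copies of this loop from different starting values (as Joseph does) gives the multiple chains whose mixing you can see in the animation.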

Read the Full Story.