9 MCMC
1. In this exercise, you will see how the Metropolis algorithm operates with a multimodal prior. So that things don't keep changing on you each time you run the code, start with `set.seed(12345)`.

    a. Define the function \(p(\theta) = (\cos(4 \pi \theta) + 1)^2 / 1.5\) in R.
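One way to define the function (a sketch; the function name `p` is just a convenient choice, and R's `cos()` expects radians):

```r
# The multimodal prior from part (a)
p <- function(theta) (cos(4 * pi * theta) + 1)^2 / 1.5

p(0.5)  # cos(2 * pi) = 1, so p(0.5) = 4 / 1.5
```

Evaluating `p()` at a point or two where the cosine is easy to compute by hand is a quick sanity check before moving on.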
    b. Use `gf_function()` to plot \(p(\theta)\) on the interval from 0 to 1. [Hint: Use the `xlim` argument.]
    c. Use `integrate()` to confirm that \(p\) is a pdf.
    d. Run `metro_bern()` (in the `CalvinBayes` package) with \(p\) as your prior, with no data (`x = 0`, `n = 0`), and with `step_size = 0.2`. Plot the posterior distribution of \(\theta\) and explain why it looks the way it does. (Don't just say what it looks like; be sure to explain why it looks that way.)
    e. Now create a posterior histogram or density plot using `x = 20` and `n = 30`. Do the results look reasonable? Explain.
    f. Now create a posterior histogram or density plot with `x = 20`, `n = 30`, and `step_size = 0.02`. Comment on how this compares to the plot you made in the previous item.
    g. Repeat the previous item, but with `start = 0.15` and then with `start = 0.95`. Why is it good practice to run MCMC algorithms with several different starting values as part of the diagnostic process?
    h. Create trace plots for each of the situations in parts f and g. How can looking at trace plots from multiple starting points help you detect the sorts of issues we are seeing here with the Metropolis algorithm? That is, what will the trace plots look like when things are good, and what will they look like when things are bad?
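If you want to see what a sampler like `metro_bern()` is doing under the hood, here is a minimal base-R random-walk Metropolis sampler for a Bernoulli likelihood with this prior. This is a stand-in sketch only, not the `CalvinBayes` code: the actual `metro_bern()` interface (how the prior is supplied, what the function returns) may differ, so use the package's documentation for the exercise itself.

```r
# A minimal random-walk Metropolis sampler for a Bernoulli likelihood
# with an arbitrary prior -- a stand-in sketch, NOT the CalvinBayes code.
p <- function(theta) (cos(4 * pi * theta) + 1)^2 / 1.5

metro_sketch <- function(x, n, step_size, start = 0.5, num_steps = 5000) {
  # unnormalized posterior: likelihood * prior, zero outside (0, 1)
  target <- function(theta) {
    if (theta <= 0 || theta >= 1) return(0)
    theta^x * (1 - theta)^(n - x) * p(theta)
  }
  theta <- numeric(num_steps)
  theta[1] <- start
  for (i in 2:num_steps) {
    # symmetric (normal) proposal centered at the current value
    proposal <- theta[i - 1] + rnorm(1, sd = step_size)
    # accept with probability min(1, target ratio); otherwise stay put
    if (runif(1) < target(proposal) / target(theta[i - 1])) {
      theta[i] <- proposal
    } else {
      theta[i] <- theta[i - 1]
    }
  }
  theta
}

set.seed(12345)
draws <- metro_sketch(x = 20, n = 30, step_size = 0.2)
hist(draws, breaks = 50, main = "Posterior draws")  # posterior histogram
plot(draws, type = "l", ylab = "theta")             # trace plot
```

Rerunning the last few lines with `step_size = 0.02` and different `start` values reproduces, in miniature, the mixing problems the exercise asks you to diagnose: with a small step size the chain moves slowly and can get stuck near one mode of the multimodal target.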