9 MCMC

1. In this exercise, you will see how the Metropolis algorithm operates with a multimodal prior. So that things don’t keep changing on you each time you run the code, start with

set.seed(12345)
  1. Define the function \(p(\theta) = (\cos(4 \pi \theta) + 1)^2 / 1.5\) in R.

  2. Use gf_function() to plot \(p(\theta)\) on the interval from 0 to 1. [Hint: Use the xlim argument.]

  3. Use integrate() to confirm that \(p\) is a pdf.

  4. Run metro_bern() (in the CalvinBayes package) with \(p\) as your prior, with no data (x = 0, n = 0), and with step_size = 0.2. Plot the posterior distribution of \(\theta\) and explain why it looks the way it does. (Don’t say what it looks like, be sure to explain why it looks that way.)

  5. Now create a posterior histogram or density plot using x = 20, n = 30. Do the results look reasonable? Explain.

  6. Now create a posterior histogram or density plot with x = 20, n = 30, and step_size = 0.02. Comment on how this compares to the plot you made in the previous item.

  7. Repeat the previous item but with start = 0.15 and start = 0.95. Why is it good practice to run MCMC algorithms with several different starting values as part of the diagnostic process?

  8. Create trace plots for each of the situations in the previous two items.

  9. How can looking at trace plots from multiple starting points help you detect the sorts of issues we are seeing here with the Metropolis algorithm? That is, what will the trace plots look like when things are good? What would they look like when things are bad?
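For the first three parts, here is a minimal sketch. It assumes the ggformula package (for gf_function()) is installed; the function name p is just a convenient choice.

```r
library(ggformula)  # provides gf_function()

set.seed(12345)

# The multimodal prior from the exercise; the 1.5 normalizes it to a pdf.
p <- function(theta) (cos(4 * pi * theta) + 1)^2 / 1.5

# Plot p on [0, 1] using the xlim argument, as the hint suggests.
gf_function(p, xlim = c(0, 1))

# A pdf must integrate to 1 over its support; integrate() should
# return a value very close to 1 here.
integrate(p, lower = 0, upper = 1)
```

Note that integrate() returns an object, not a bare number; the estimate is in its value component if you want to compare it to 1 programmatically.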
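For the Metropolis runs, a hedged sketch follows. It assumes metro_bern() accepts x, n, step_size, start, and a prior function, as the exercise text suggests, and that it returns a data frame with columns named theta and step; check the CalvinBayes documentation for the actual argument and column names.

```r
library(CalvinBayes)  # provides metro_bern() (argument names below are assumptions)
library(ggformula)

set.seed(12345)

# Prior from the earlier parts of the exercise.
p <- function(theta) (cos(4 * pi * theta) + 1)^2 / 1.5

# Small step size with a particular starting value, as in the exercise.
chain <- metro_bern(x = 20, n = 30, step_size = 0.02, start = 0.15, prior = p)

# Posterior density plot (assumes a theta column in the result).
gf_dens(~ theta, data = chain)

# Trace plot: sampled theta against iteration (assumes a step column).
gf_line(theta ~ step, data = chain)
```

Re-running this with start = 0.95 and overlaying or comparing the trace plots is one way to see whether the chains explore the same region or get stuck near their starting modes.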