Summary of Bayes Seminar (Fall 2012)

Wed 12 December 2012 by Adrian Brasoveanu
  1. Basic Probability Theory (I)
  2. Basic Probability Theory (II)
  3. Bayesian Inference (I)
  4. Bayesian Inference (II)
  5. Beta-Bernoulli updates (scripts available on eCommons)
  6. Recap of statistical modeling in general and Bayesian modeling in particular; introduction to Markov Chain Monte Carlo (MCMC) and the Metropolis-Hastings family of algorithms (scripts available on eCommons)
  7. The mean model: simulated data, R analysis, JAGS analysis; the structure of JAGS models, reexpressing parameters, number of chains, number of iterations, burn-in, thinning, the Brooks-Gelman-Rubin (BGR) convergence diagnostic (a.k.a. Rhat), graphical summaries of posterior distributions; binomial proportion inference with JAGS instead of the Metropolis algorithm we built “by hand” for this purpose; comparison of three models for the same binomial proportion data with different uniform priors: posterior estimation with JAGS and computation of the evidence / marginal likelihood for each model from the JAGS posterior samples; inference for two binomial proportions with JAGS instead of the Metropolis algorithm we built “by hand” for this purpose (scripts available on eCommons)
  8. Essentials of linear models; t-tests with equal and unequal variances (simulated data, R analysis, JAGS analysis) (scripts available on eCommons)
  9. Simple linear regression (simulated data, R analysis, JAGS analysis); goodness-of-fit assessment in Bayesian analyses (posterior predictive distributions and Bayesian p-values); interpretation of confidence vs. credible intervals; fixed-effects 1-way ANOVA (simulated data, R analysis, JAGS analysis); random-effects 1-way ANOVA (simulated data, R analysis, JAGS analysis); inferring binomial proportions with hierarchical priors (random effects for “coins”, i.e., essentially a random-effects “binomial” ANOVA) (scripts available on eCommons)
  10. 2-way ANOVA without and with interactions (simulated data, R analysis, JAGS analysis); linear mixed-effects models: random intercepts only, independent random intercepts and slopes, correlated random intercepts and slopes (simulated data, R analysis, JAGS analysis) (scripts available on eCommons)
  11. A different approach to introducing random-effects models (random effects by subject), closely following a chapter of Lee & Wagenmakers 2012 (“Bayesian Cognitive Modeling: A Practical Course”) and the associated R & BUGS code: the approach is driven by the need to simultaneously satisfy (i) the goal of an informative posterior predictive distribution for the “average” subject and (ii) the goal of a good model fit to each of the subjects we have data from (scripts available on eCommons)
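
The Beta-Bernoulli updates of item 5 are conjugate: a Beta(a, b) prior combined with k successes and n − k failures yields a Beta(a + k, b + n − k) posterior. The seminar scripts are in R; as a minimal stand-in, here is a Python sketch (the prior and data values are illustrative):

```python
# Conjugate Beta-Bernoulli update: Beta(a, b) prior + Bernoulli data -> Beta posterior.

def beta_bernoulli_update(a, b, successes, failures):
    """Return the posterior Beta parameters after observing the data."""
    return a + successes, b + failures

# Uniform prior Beta(1, 1), then observe 7 successes and 3 failures.
a_post, b_post = beta_bernoulli_update(1, 1, 7, 3)
posterior_mean = a_post / (a_post + b_post)  # = 8 / 12
print(a_post, b_post, round(posterior_mean, 3))
```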
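The Metropolis algorithm of item 6, built “by hand” for a binomial proportion, can be sketched as a random-walk sampler. This Python version is a stand-in for the seminar's R script; the proposal width, data (7 successes in 10 trials), and iteration count are all illustrative choices:

```python
import math
import random

def log_posterior(theta, k, n):
    """Log posterior for a binomial proportion under a uniform Beta(1, 1) prior."""
    if theta <= 0.0 or theta >= 1.0:
        return float("-inf")  # zero density outside (0, 1)
    return k * math.log(theta) + (n - k) * math.log(1.0 - theta)

def metropolis(k, n, n_iter=5000, width=0.1, seed=0):
    """Random-walk Metropolis sampler over theta in (0, 1)."""
    rng = random.Random(seed)
    theta = 0.5
    samples = []
    for _ in range(n_iter):
        proposal = theta + rng.uniform(-width, width)
        log_ratio = log_posterior(proposal, k, n) - log_posterior(theta, k, n)
        # Accept with probability min(1, posterior ratio).
        if rng.random() < math.exp(min(0.0, log_ratio)):
            theta = proposal
        samples.append(theta)
    return samples

samples = metropolis(k=7, n=10)
print(sum(samples) / len(samples))  # should be near the exact posterior mean 8/12
```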
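The Brooks-Gelman-Rubin diagnostic mentioned in item 7 compares between-chain and within-chain variance; values close to 1 suggest the chains have mixed. A pure-Python sketch of the basic (non-split) statistic, assuming equal-length chains:

```python
import math

def gelman_rubin(chains):
    """Basic (non-split) Gelman-Rubin Rhat for a list of equal-length chains."""
    m = len(chains)          # number of chains
    n = len(chains[0])       # iterations per chain
    chain_means = [sum(c) / n for c in chains]
    grand_mean = sum(chain_means) / m
    # Between-chain variance B and mean within-chain variance W.
    B = n / (m - 1) * sum((mu - grand_mean) ** 2 for mu in chain_means)
    W = sum(sum((x - mu) ** 2 for x in c) / (n - 1)
            for c, mu in zip(chains, chain_means)) / m
    var_hat = (n - 1) / n * W + B / n  # pooled estimate of the posterior variance
    return math.sqrt(var_hat / W)

# Two identical chains: B = 0, so Rhat = sqrt((n - 1) / n), slightly below 1.
print(gelman_rubin([[1, 2, 3, 4], [1, 2, 3, 4]]))
```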
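For the unequal-variance t-tests of item 8, the Welch t statistic has a simple closed form (the seminar used R's t.test; this Python function and its toy data are illustrative):

```python
import math

def welch_t(x, y):
    """Welch's t statistic for two samples with possibly unequal variances."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)  # sample variance of x
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)  # sample variance of y
    return (mx - my) / math.sqrt(vx / nx + vy / ny)

print(welch_t([1, 2, 3], [2, 3, 4]))  # -sqrt(3/2), about -1.2247
```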
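For the simple linear regression of item 9, the least-squares estimates also have a closed form, which is handy as a sanity check against JAGS posterior means. A Python sketch with illustrative data:

```python
def ols_fit(x, y):
    """Closed-form least-squares intercept a and slope b for y = a + b * x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    b = sxy / sxx
    a = my - b * mx
    return a, b

# Points on the exact line y = 1 + 2x recover intercept 1 and slope 2.
a, b = ols_fit([0, 1, 2, 3], [1, 3, 5, 7])
print(a, b)  # 1.0 2.0
```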
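The mixed-effects sessions (item 10) simulated data before fitting; here is a Python sketch of random-intercepts data generation, with all parameter values illustrative:

```python
import random

def simulate_random_intercepts(n_subj=10, n_obs=20, beta0=2.0, beta1=0.5,
                               sd_subj=1.0, sd_resid=0.3, seed=1):
    """Generate y_ij = (beta0 + b_i) + beta1 * x_ij + eps_ij, b_i ~ N(0, sd_subj)."""
    rng = random.Random(seed)
    rows = []
    for i in range(n_subj):
        b_i = rng.gauss(0.0, sd_subj)  # subject-specific intercept deviation
        for _ in range(n_obs):
            x = rng.uniform(0.0, 10.0)
            y = beta0 + b_i + beta1 * x + rng.gauss(0.0, sd_resid)
            rows.append((i, x, y))
    return rows

data = simulate_random_intercepts()
print(len(data))  # 10 subjects * 20 observations = 200 rows
```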