Tentative Class Schedule
Week 1: Introductions
- Monday August 26:
    - Introductions & Course Overview
    - Weekly Notes (R Markdown Source Code)
- Wednesday August 28:
    - Mechanics of Bayesian Statistics
- Friday August 30:
    - Philosophy of Bayesian Statistics (Confidence vs. Probability)
Week 2: Bayesian Foundations
- Monday September 2: Labor Day, NO CLASS
- Wednesday September 4:
    - Belief, Probability, and Exchangeability
    - Quiz 2 due
    - Weekly Notes (R Markdown Source Code)
- Friday September 6:
    - Belief, Probability, and Exchangeability
    - HW 1 due (R Markdown file)
    - Philosophy and the practice of Bayesian statistics
Week 3: One-Parameter Models
- Monday September 9:
    - Binomial Model
    - Quiz 3 due
    - Weekly Notes (R Markdown Source Code)
- Wednesday September 11:
    - Poisson Model / Exponential Families
- Friday September 13:
    - A Little About Priors
    - HW 2 due (R Markdown file)
Week 4: Computational Statistics
- Monday September 16:
    - Computational Statistics - Monte Carlo
    - Quiz 4 due
    - Weekly Notes (R Markdown Source Code)
- Wednesday September 18:
    - MCMC Intro
- Friday September 20:
Week 5: Normal Model
- Monday September 23:
    - Normal Model w/ known variance
    - Quiz 5 due
    - Weekly Notes (R Markdown Source Code)
- Wednesday September 25:
    - Normal Model, joint inference
- Friday September 27:
    - Normal Model, Gibbs Sampling
    - HW 4 due (R Markdown file)
Week 6: Markov Chain Monte Carlo
- Monday September 30:
    - Normal Model and Gibbs Sampling
    - Quiz 6 due
    - Weekly Notes (R Markdown Source Code)
- Wednesday October 2:
    - MCMC diagnostics
    - MCMC Demo 2 (R Markdown Source Code)
- Friday October 4:
Week 7: Midterm Week
- Monday October 7: No Class, Work on Take Home Exam
- Wednesday October 9: In Class Midterm
- Friday October 11: No Class, MT ASA Chapter Meeting; Take Home Midterm due 11:59 PM (R Markdown Source Code)
Week 8: Multivariate Normal
- Monday October 14: Multivariate Normal Distribution
- Wednesday October 16: Multivariate Normal Distribution
- Friday October 18: Wishart distribution and Gibbs Sampling
Week 9: Hierarchical Modeling
- Monday October 21: Hierarchical Modeling
    - HW 6 due
    - No Quiz
    - Weekly Notes (R Markdown Source Code)
- Wednesday October 23: Hierarchical Modeling
- Friday October 25: Stein’s paradox
Week 10: Regression Introduction
- Monday October 28: Point Mass Priors, Bayes Factors, and model selection
- Wednesday October 30: Bayes Factors and model selection
- Friday November 1: Regression
Week 11: Regression
- Monday November 4: Regression
- Wednesday November 6: Regression Demo (R Markdown Source Code)
- Friday November 8: Bayesian GLMs
Week 12: GLMs and Metropolis-Hastings
- Monday November 11: No Class, Veterans Day
- Wednesday November 13: Generalized Linear Models
- Friday November 15: Metropolis-Hastings
Week 13: Hierarchical Regression
- Monday November 18: Hierarchical Regression
- Wednesday November 20: Hierarchical Regression
- Friday November 22: Exam Question Day
Week 14:
- Monday November 25: In Class Final
- Wednesday November 27: Thanksgiving Break NO CLASS
- Friday November 29: Thanksgiving Break NO CLASS
Week 15:
- Sunday December 1: Take Home Exam assigned (R Markdown Source Files)
- Monday December 2: Latent Variable Methods
- Wednesday December 4: Hierarchical GLMs
- Friday December 6: Predictive Models (Bayesian Trees) (R Markdown Source Code)
Week 16: Final Exam Week
- Monday December 9: Take Home Exam due
- Friday December 13: Final Exam (Presentations), 8:00-9:50 AM
Course Description
Fundamentals of Bayesian inference, methods of Bayesian data analysis, computational methods for posterior simulation, fundamentals of hierarchical modeling.
Learning Outcomes
At the end of the course students will be able to:
- Demonstrate a basic understanding of the fundamental concepts underlying Bayesian inference
- Demonstrate connections and make comparisons among frequentist, likelihood, and Bayesian methods, from both a practical and a philosophical perspective
- Demonstrate an understanding of the complex issues involved in specifying prior distributions and recognize there is no default prior
- Demonstrate ability to program methods for taking samples from posterior distributions, including rejection sampling, the Metropolis-Hastings algorithm, and Gibbs sampling (see the brief sketch after this list)
- Understand the concepts underlying these computational approaches, including Hamiltonian Monte Carlo techniques
- Demonstrate ability to use available and common software to carry out Bayesian data analysis
- Demonstrate ability to write about conceptual issues, describe and justify assumptions and decisions, and interpret results
- Demonstrate ability to use creative and appropriate graphics to display raw data and results from statistical models
- Demonstrate ability to write down sophisticated models with standard notation, recognizing there are multiple ways to write the same analysis
- Explain the idea behind multi-level, or hierarchical, models and how they relate to models used in traditional linear models classes
- Demonstrate an understanding of posterior predictive checks, as well as the ability to use them meaningfully in practice
- Demonstrate a willingness to think about and discuss the foundations of statistical inference
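As a rough illustration of the sampler-programming outcome above (a minimal sketch with invented data, not course material), the base R snippet below uses a random-walk Metropolis-Hastings algorithm to draw from the posterior of a binomial success probability under a uniform prior; the data (y = 12 successes in n = 20 trials), the proposal scale, and the burn-in length are all hypothetical.

```r
# Minimal random-walk Metropolis-Hastings sketch (hypothetical data and tuning values)
set.seed(532)

y <- 12; n <- 20                   # hypothetical binomial data
log_post <- function(theta) {      # log posterior up to a constant, Uniform(0, 1) prior
  if (theta <= 0 || theta >= 1) return(-Inf)
  dbinom(y, n, theta, log = TRUE)  # log prior is 0 on (0, 1), so it drops out
}

n_iter <- 5000
theta <- numeric(n_iter)
theta[1] <- 0.5                    # starting value
for (i in 2:n_iter) {
  prop <- theta[i - 1] + rnorm(1, sd = 0.1)           # symmetric random-walk proposal
  log_ratio <- log_post(prop) - log_post(theta[i - 1])
  theta[i] <- if (log(runif(1)) < log_ratio) prop else theta[i - 1]
}

mean(theta[-(1:1000)])             # posterior mean estimate after discarding burn-in
```

The tuning choices here (proposal standard deviation, number of iterations, burn-in) are placeholders for illustration rather than recommendations.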
Prerequisites
- Required: STAT 422 or STAT 502, and STAT 506
- Preferred: extensive experience with R
Textbooks
- A First Course in Bayesian Statistical Methods, by Peter Hoff
- Bayesian Data Analysis (3rd Edition), by Gelman, Carlin, Stern, Dunson, Vehtari, & Rubin (Optional)
Additional Resources
Analysis, data visualization, and version control procedures will be implemented with:
- R / RStudio
- JAGS
- Stan
- Git / GitHub
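For orientation on how these tools relate to the "posterior simulation" in the course description (a minimal sketch with invented data and prior, not an assigned workflow), the base R example below draws directly from a conjugate Beta posterior and plots the draws; tools like JAGS and Stan provide the same kind of simulation for models without a closed-form posterior.

```r
# Direct Monte Carlo draws from a conjugate Beta posterior (hypothetical data and prior)
set.seed(506)

y <- 12; n <- 20                        # hypothetical binomial data
a <- 1; b <- 1                          # Beta(1, 1) prior
theta_draws <- rbeta(10000, a + y, b + n - y)   # posterior is Beta(a + y, b + n - y)

quantile(theta_draws, c(0.025, 0.975))  # approximate 95% credible interval
hist(theta_draws, breaks = 50,          # quick base R visualization of the posterior
     main = "Posterior draws", xlab = expression(theta))
```

Code like this would live in the R Markdown files referenced throughout the schedule, with Git/GitHub handling version control.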
Course Policies
Grading Policy
- 10% of your grade will be determined by weekly quizzes to be completed prior to class on Mondays.
- 30% of your grade will be determined by weekly homework assignments. Students are allowed and encouraged to work with classmates on homework assignments, but each student is required to complete their own homework.
- 20% of your grade will be determined by a midterm exam. The midterm exam will have two parts: an in-class exam and a take-home portion. The in-class portion will be largely conceptual, including some short mathematical derivations. The take-home portion will focus on analysis of data and implementation of Bayesian computational methods.
- 20% of your grade will be determined by a final exam. The final exam will have two parts: an in-class exam and a take-home portion. The in-class portion will be largely conceptual, including some short mathematical derivations. The take-home portion will focus on analysis of data and implementation of Bayesian computational methods.
- 20% of your grade will be determined by a project. The project will be a case study in which students apply Bayesian methods to a data set agreed upon by the instructor and student.
Collaboration
University policy states that, unless otherwise specified, students may not collaborate on graded material. Any exceptions to this policy will be stated explicitly for individual assignments. If you have any questions about the limits of collaboration, you are expected to ask for clarification.
In this class students are encouraged to collaborate on homework assignments, but quizzes should be completed without collaboration.
Academic Misconduct
Section 420 of the Student Conduct Code describes academic misconduct as including but not limited to plagiarism, cheating, multiple submissions, or facilitating others’ misconduct. Possible sanctions for academic misconduct range from an oral reprimand to expulsion from the university.
Disabilities Policy
Federal law mandates the provision of services at the university level to qualified students with disabilities.