Introduction to Bayesian Methods

Instructor: Gary Rosner

Description:

Illustrates current approaches to Bayesian modeling and computation in statistics. Uses simple, familiar models, such as those based on the normal and binomial distributions, to illustrate concepts such as conjugate and noninformative prior distributions. Covers more advanced tools, including linear regression, hierarchical models (random-effects models), generalized linear models, and mixed models. Discusses modern Bayesian computational methods, including Markov chain Monte Carlo (the Gibbs sampler and the Metropolis-Hastings algorithm), their implementation and monitoring, and examples of real statistical analyses.
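
As a concrete illustration of the conjugacy idea mentioned above (a standard textbook result, not taken from the course materials): with a binomial likelihood and a Beta prior on the success probability, the posterior is again a Beta distribution, so the update is available in closed form.

  \[
    y \mid \theta \sim \mathrm{Binomial}(n, \theta), \qquad \theta \sim \mathrm{Beta}(a, b)
    \quad\Longrightarrow\quad
    \theta \mid y \sim \mathrm{Beta}(a + y,\; b + n - y).
  \]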

Learning Objectives:

Upon successfully completing this course, students will be able to: 1) develop Bayesian models for combining information across data sources; 2) write WinBUGS programs to run analyses; 3) calculate posterior distributions for parameters of scientific interest; and 4) conduct Bayesian analyses of complex data sets.
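
For a sense of what the posterior-calculation objectives involve computationally, the following is a minimal sketch (hypothetical data and code, not course material; the course itself uses WinBUGS) of a random-walk Metropolis-Hastings sampler for a binomial success probability, with the Monte Carlo draws checked against the closed-form conjugate posterior shown above.

  import numpy as np

  # Hypothetical data: y successes out of n trials (illustrative only).
  y, n = 7, 20
  a, b = 1.0, 1.0          # Beta(1, 1) prior, i.e. uniform on (0, 1)

  def log_posterior(theta):
      """Unnormalized log posterior: binomial likelihood times Beta prior."""
      if not 0.0 < theta < 1.0:
          return -np.inf
      return (y + a - 1) * np.log(theta) + (n - y + b - 1) * np.log(1 - theta)

  rng = np.random.default_rng(42)
  n_iter, step = 10_000, 0.1
  theta = 0.5                                  # starting value
  draws = np.empty(n_iter)

  for t in range(n_iter):
      proposal = theta + step * rng.normal()   # symmetric random-walk proposal
      log_accept = log_posterior(proposal) - log_posterior(theta)
      if np.log(rng.uniform()) < log_accept:   # Metropolis-Hastings accept/reject
          theta = proposal
      draws[t] = theta

  burned = draws[2000:]                        # discard burn-in
  print("MCMC posterior mean:     ", burned.mean())
  print("Conjugate posterior mean:", (a + y) / (a + b + n))  # mean of Beta(a+y, b+n-y)

Because the proposal is symmetric, the acceptance ratio reduces to the ratio of (unnormalized) posterior densities; the sampled mean should agree closely with the exact conjugate answer, which is a useful check when moving on to models without closed-form posteriors.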