Bayesian modeling using WinBUGS

A hands-on introduction to the principles of Bayesian modeling using WinBUGS. Bayesian Modeling Using WinBUGS provides an easily accessible introduction to the use of WinBUGS programming techniques in a variety of Bayesian modeling settings.
| Main Author: | |
|---|---|
| Format: | eBook |
| Language: | English |
| Published: | Hoboken, N.J.: Wiley, 2009 |
| Edition: | 1 |
| Series: | Wiley series in computational statistics |
| Subjects: | |
| ISBN: | 047014114X, 9780470141144, 0470434546, 9780470434543, 9780470434567, 0470434562 |
| Online Access: | Get full text |
Table of Contents:
- Intro -- Bayesian Modeling Using WinBUGS -- Contents -- Preface -- Acknowledgments -- Acronyms -- Chapter 1. Introduction to Bayesian Inference -- 1.1 Introduction: Bayesian modeling in the 21st century -- 1.2 Definition of statistical models -- 1.3 Bayes theorem -- 1.4 Model-based Bayesian inference -- 1.5 Inference using conjugate prior distributions -- 1.5.1 Inference for the Poisson rate of count data -- 1.5.2 Inference for the success probability of binomial data -- 1.5.3 Inference for the mean of normal data with known variance -- 1.5.4 Inference for the mean and variance of normal data -- 1.5.5 Inference for normal regression models -- 1.5.6 Other conjugate prior distributions -- 1.5.7 Illustrative examples -- 1.6 Nonconjugate analysis -- Problems -- Chapter 2. Markov Chain Monte Carlo Algorithms in Bayesian Inference -- 2.1 Simulation, Monte Carlo integration, and their implementation in Bayesian inference -- 2.2 Markov chain Monte Carlo methods -- 2.2.1 The algorithm -- 2.2.2 Terminology and implementation details -- 2.3 Popular MCMC algorithms -- 2.3.1 The Metropolis-Hastings algorithm -- 2.3.2 Componentwise Metropolis-Hastings -- 2.3.3 The Gibbs sampler -- 2.3.4 Metropolis within Gibbs -- 2.3.5 The slice Gibbs sampler -- 2.3.6 A simple example using the slice sampler -- 2.4 Summary and closing remarks -- Problems -- Chapter 3. WinBUGS Software: Introduction, Setup, and Basic Analysis -- 3.1 Introduction and historical background -- 3.2 The WinBUGS environment -- 3.2.1 Downloading and installing WinBUGS -- 3.2.2 A short description of the menus -- 3.3 Preliminaries on using WinBUGS -- 3.3.1 Code structure and type of parameters/nodes -- 3.3.2 Scalar, vector, matrix, and array nodes -- 3.4 Building Bayesian models in WinBUGS -- 3.4.1 Function description -- 3.4.2 Using the for syntax and array, matrix, and vector calculations
- 3.4.3 Use of parentheses, brackets, and curly braces in WinBUGS -- 3.4.4 Differences between WinBUGS and R/Splus syntax -- 3.4.5 Model specification in WinBUGS -- 3.4.6 Data and initial value specification -- 3.4.7 An example of a complete model specification -- 3.4.8 Data transformations -- 3.5 Compiling the model and simulating values -- 3.6 Basic output analysis using the sample monitor tool -- 3.7 Summarizing the procedure -- 3.8 Chapter summary and concluding comments -- Problems -- Chapter 4. WinBUGS Software: Illustration, Results, and Further Analysis -- 4.1 A complete example of running MCMC in WinBUGS for a simple model -- 4.1.1 The model -- 4.1.2 Data and initial values -- 4.1.3 Compiling and running the model -- 4.1.4 MCMC output analysis and results -- 4.2 Further output analysis using the inference menu -- 4.2.1 Comparison of nodes -- 4.2.2 Calculation of correlations -- 4.2.3 Using the summary tool -- 4.2.4 Evaluation and ranking of individuals -- 4.2.5 Calculation of deviance information criterion -- 4.3 Multiple chains -- 4.3.1 Generation of multiple chains -- 4.3.2 Output analysis -- 4.3.3 The Gelman-Rubin convergence diagnostic -- 4.4 Changing the properties of a figure -- 4.4.1 General graphical options -- 4.4.2 Special graphical options -- 4.5 Other tools and menus -- 4.5.1 The node info tool -- 4.5.2 Monitoring the acceptance rate of the Metropolis-Hastings algorithm -- 4.5.3 Saving the current state of the chain -- 4.5.4 Setting the starting seed number -- 4.5.5 Running the model as a script -- 4.6 Summary and concluding remarks -- Problems -- Chapter 5. Introduction to Bayesian Models: Normal Models -- 5.1 General modeling principles -- 5.2 Model specification in normal regression models -- 5.2.1 Specifying the likelihood -- 5.2.2 Specifying a simple independent prior distribution
- 5.2.3 Interpretation of the regression coefficients -- 5.2.4 A regression example using WinBUGS -- 5.3 Using vectors and multivariate priors in normal regression models -- 5.3.1 Defining the model using matrices -- 5.3.2 Prior distributions for normal regression models -- 5.3.3 Multivariate normal priors in WinBUGS -- 5.3.4 Continuation of Example 5.1 -- 5.4 Analysis of variance models -- 5.4.1 The one-way ANOVA model -- 5.4.2 Parametrization and parameter interpretation -- 5.4.3 One-way ANOVA model in WinBUGS -- 5.4.4 A one-way ANOVA example using WinBUGS -- 5.4.5 Two-way ANOVA models -- 5.4.6 Multifactor analysis of variance -- Problems -- Chapter 6. Incorporating Categorical Variables in Normal Models and Further Modeling Issues -- 6.1 Analysis of variance models using dummy variables -- 6.2 Analysis of covariance models -- 6.2.1 Models using one quantitative variable and one qualitative variable -- 6.2.2 The parallel lines model -- 6.2.3 The separate lines model -- 6.3 A bioassay example -- 6.3.1 Parallel lines analysis -- 6.3.2 Slope ratio analysis: Models with common intercept and different slope -- 6.3.3 Comparison of the two approaches -- 6.4 Further modeling issues -- 6.4.1 Extending the simple ANCOVA model -- 6.4.2 Using binary indicators to specify models in multiple regression -- 6.4.3 Selection of variables using the deviance information criterion (DIC) -- 6.5 Closing remarks -- Problems -- Chapter 7. Introduction to Generalized Linear Models: Binomial and Poisson Data -- 7.1 Introduction -- 7.1.1 The exponential family -- 7.1.2 Common distributions as members of the exponential family -- 7.1.3 Link functions -- 7.1.4 Common generalized linear models -- 7.1.5 Interpretation of GLM coefficients -- 7.2 Prior distributions -- 7.3 Posterior inference -- 7.3.1 The posterior distribution of a generalized linear model
- 7.3.2 GLM specification in WinBUGS -- 7.4 Poisson regression models -- 7.4.1 Interpretation of Poisson log-linear parameters -- 7.4.2 A simple Poisson regression example -- 7.4.3 A Poisson regression model for modeling football data -- 7.5 Binomial response models -- 7.5.1 Interpretation of model parameters in binomial response models -- 7.5.2 A simple example -- 7.6 Models for contingency tables -- Problems -- Chapter 8. Models for Positive Continuous Data, Count Data, and Other GLM-Based Extensions -- 8.1 Models with nonstandard distributions -- 8.1.1 Specification of arbitrary likelihood using the zeros-ones trick -- 8.1.2 The inverse Gaussian model -- 8.2 Models for positive continuous response variables -- 8.2.1 The gamma model -- 8.2.2 Other models -- 8.2.3 An example -- 8.3 Additional models for count data -- 8.3.1 The negative binomial model -- 8.3.2 The generalized Poisson model -- 8.3.3 Zero inflated models -- 8.3.4 The bivariate Poisson model -- 8.3.5 The Poisson difference model -- 8.4 Further GLM-based models and extensions -- 8.4.1 Survival analysis models -- 8.4.2 Multinomial models -- 8.4.3 Additional models and further reading -- Problems -- Chapter 9. Bayesian Hierarchical Models -- 9.1 Introduction -- 9.1.1 A simple motivating example -- 9.1.2 Why use a hierarchical model? -- 9.1.3 Other advantages and characteristics -- 9.2 Some simple examples -- 9.2.1 Repeated measures data -- 9.2.2 Introducing random effects in performance parameters -- 9.2.3 Poisson mixture models for count data -- 9.2.4 The use of hierarchical models in meta-analysis -- 9.3 The generalized linear mixed model formulation -- 9.3.1 A hierarchical normal model: A simple crossover trial -- 9.3.2 Logit GLMM for correlated binary responses -- 9.3.3 Poisson log-linear GLMMs for correlated count data -- 9.4 Discussion, closing remarks, and further reading -- Problems
- Chapter 10. The Predictive Distribution and Model Checking -- 10.1 Introduction -- 10.1.1 Prediction within Bayesian framework -- 10.1.2 Using posterior predictive densities for model evaluation and checking -- 10.1.3 Cross-validation predictive densities -- 10.2 Estimating the predictive distribution for future or missing observations using MCMC -- 10.2.1 A simple example: Estimating missing observations -- 10.2.2 An example of Bayesian prediction using a simple model -- 10.3 Using the predictive distribution for model checking -- 10.3.1 Comparison of actual and predictive frequencies for discrete data -- 10.3.2 Comparison of cumulative frequencies for predictive and actual values for continuous data -- 10.3.3 Comparison of ordered predictive and actual values for continuous data -- 10.3.4 Estimation of the posterior predictive ordinate -- 10.3.5 Checking individual observations using residuals -- 10.3.6 Checking structural assumptions of the model -- 10.3.7 Checking the goodness-of-fit of a model -- 10.4 Using cross-validation predictive densities for model checking, evaluation, and comparison -- 10.4.1 Estimating the conditional predictive ordinate -- 10.4.2 Generating values from the leave-one-out cross-validatory predictive distributions -- 10.5 Illustration of a complete predictive analysis: Normal regression models -- 10.5.1 Checking structural assumptions of the model -- 10.5.2 Detailed checks based on residual analysis -- 10.5.3 Overall goodness-of-fit of the model -- 10.5.4 Implementation using WinBUGS -- 10.5.5 An illustrative example -- 10.5.6 Summary of the model checking procedure -- 10.6 Discussion -- Problems -- Chapter 11. Bayesian Model and Variable Evaluation -- 11.1 Prior predictive distributions as measures of model comparison: Posterior model odds and Bayes factors
- 11.2 Sensitivity of the posterior model probabilities: The Lindley-Bartlett paradox
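The chapters listed above center on specifying models directly in the BUGS language. As a flavor of what that looks like, here is a minimal sketch of a simple normal (linear) regression model of the kind covered in Chapter 5; the node names (`y`, `x`, `beta0`, `beta1`, `tau`) and the vague priors are illustrative assumptions, not code taken from the book.

```
model {
  for (i in 1:n) {
    # likelihood: normal response with mean mu[i] and precision tau
    y[i] ~ dnorm(mu[i], tau)
    mu[i] <- beta0 + beta1 * x[i]
  }
  # vague (low-precision) normal priors on the regression coefficients
  beta0 ~ dnorm(0, 0.001)
  beta1 ~ dnorm(0, 0.001)
  # vague gamma prior on the precision; sigma is the derived std. deviation
  tau ~ dgamma(0.001, 0.001)
  sigma <- 1 / sqrt(tau)
}
```

Note that WinBUGS parameterizes `dnorm` by mean and precision (1/variance), not standard deviation, and that the data (`y`, `x`, `n`) and initial values are loaded separately through the Model/Specification tool rather than inside the model block.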

