






We sell 100% Genuine & New Books only!

Bayesian Statistical Methods 2021 Edition at Meripustak

Bayesian Statistical Methods 2021 Edition by Brian J. Reich, Sujit K. Ghosh, Taylor & Francis Ltd

Books from same Author: Brian J. Reich, Sujit K. Ghosh

Books from same Publisher: Taylor & Francis Ltd

Related Category: Author List / Publisher List


  • Price: ₹ 4733.00/- [ 11.00% off ]

    Seller Price: ₹ 4212.00

Estimated Delivery Time: 4-5 Business Days

Sold By: Meripustak      Click for Bulk Order

Free Shipping (for orders above ₹ 499) *T&C apply.

In Stock

We deliver across all postal codes in India

Orders Outside India




Outside India Order Estimated Delivery Time: 7-10 Business Days


  • We Deliver Across 100+ Countries

  • MeriPustak’s Books are 100% New & Original
  • General Information
    Author(s): Brian J. Reich, Sujit K. Ghosh
    Publisher: Taylor & Francis Ltd
    ISBN: 9781032093185
    Pages: 288
    Binding: Paperback
    Language: English
    Publish Year: June 2021

    Description

    Bayesian Statistical Methods, 2021 Edition, by Brian J. Reich and Sujit K. Ghosh (Taylor & Francis Ltd)

    Bayesian Statistical Methods provides data scientists with the foundational and computational tools needed to carry out a Bayesian analysis. This book focuses on Bayesian methods applied routinely in practice, including multiple linear regression, mixed effects models and generalized linear models (GLM). The authors include many examples with complete R code and comparisons with analogous frequentist procedures.

    In addition to the basic concepts of Bayesian inferential methods, the book covers many general topics:

      • Advice on selecting prior distributions
      • Computational methods including Markov chain Monte Carlo (MCMC)
      • Model-comparison and goodness-of-fit measures, including sensitivity to priors
      • Frequentist properties of Bayesian methods

    Case studies covering advanced topics illustrate the flexibility of the Bayesian approach:

      • Semiparametric regression
      • Handling of missing data using predictive distributions
      • Priors for high-dimensional regression models
      • Computational techniques for large datasets
      • Spatial data analysis

    The advanced topics are presented with sufficient conceptual depth that the reader will be able to carry out such analyses and argue the relative merits of Bayesian and classical methods. A repository of R code, motivating data sets, and complete data analyses are available on the book's website.

    Brian J. Reich, Associate Professor of Statistics at North Carolina State University, is currently the editor-in-chief of the Journal of Agricultural, Biological, and Environmental Statistics and was awarded the LeRoy & Elva Martin Teaching Award. Sujit K. Ghosh, Professor of Statistics at North Carolina State University, has over 22 years of research and teaching experience in conducting Bayesian analyses, received the Cavell Brownie mentoring award, and served as the Deputy Director at the Statistical and Applied Mathematical Sciences Institute.
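    As a taste of the style of analysis the book teaches, here is a minimal R sketch of the beta-binomial conjugate model listed in Chapter 2. It is not taken from the book; the data values and the uniform Beta(1, 1) prior are assumptions made purely for illustration.

        # Conjugate Beta-Binomial update for a proportion (illustrative sketch, not the book's code)
        y <- 12; n <- 20                              # hypothetical data: 12 successes in 20 trials
        a <- 1;  b <- 1                               # assumed Beta(1, 1) (uniform) prior
        post_a <- a + y                               # posterior is Beta(a + y, b + n - y)
        post_b <- b + n - y
        post_a / (post_a + post_b)                    # posterior mean of the proportion
        qbeta(c(0.025, 0.975), post_a, post_b)        # 95% equal-tailed credible interval

    The book's website hosts the complete R code for its own worked examples.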


    Table of contents:

    1. Basics of Bayesian Inference: Probability background; Univariate distributions; Discrete distributions; Continuous distributions; Multivariate distributions; Marginal and conditional distributions; Bayes' Rule; Discrete example of Bayes' Rule; Continuous example of Bayes' Rule; Introduction to Bayesian inference; Summarizing the posterior; Point estimation; Univariate posteriors; Multivariate posteriors; The posterior predictive distribution; Exercises
    2. From Prior Information to Posterior Inference: Conjugate priors; Beta-binomial model for a proportion; Poisson-gamma model for a rate; Normal-normal model for a mean; Normal-inverse gamma model for a variance; Natural conjugate priors; Normal-normal model for a mean vector; Normal-inverse Wishart model for a covariance matrix; Mixtures of conjugate priors; Improper priors; Objective priors; Jeffreys prior; Reference priors; Maximum entropy priors; Empirical Bayes; Penalized complexity priors; Exercises
    3. Computational Approaches: Deterministic methods; Maximum a posteriori estimation; Numerical integration; Bayesian Central Limit Theorem (CLT); Markov chain Monte Carlo (MCMC) methods; Gibbs sampling; Metropolis-Hastings (MH) sampling; MCMC software options in R; Diagnosing and improving convergence; Selecting initial values; Convergence diagnostics; Improving convergence; Dealing with large datasets; Exercises
    4. Linear Models: Analysis of normal means; One-sample/paired analysis; Comparison of two normal means; Linear regression; Jeffreys prior; Gaussian prior; Continuous shrinkage priors; Predictions; Example: Factors that affect a home's microbiome; Generalized linear models; Binary data; Count data; Example: Logistic regression for NBA clutch free throws; Example: Beta regression for microbiome data; Random effects; Flexible linear models; Nonparametric regression; Heteroskedastic models; Non-Gaussian error models; Linear models with correlated data; Exercises
    5. Model Selection and Diagnostics: Cross validation; Hypothesis testing and Bayes factors; Stochastic search variable selection; Bayesian model averaging; Model selection criteria; Goodness-of-fit checks; Exercises
    6. Case Studies Using Hierarchical Modeling: Overview of hierarchical modeling; Case study: Species distribution mapping via data fusion; Case study: Tyrannosaurid growth curves; Case study: Marathon analysis with missing data
    7. Statistical Properties of Bayesian Methods: Decision theory; Frequentist properties; Bias-variance tradeoff; Asymptotics; Simulation studies; Exercises
    Appendices: Probability distributions (Univariate discrete; Multivariate discrete; Univariate continuous; Multivariate continuous; List of conjugacy pairs); Derivations (Normal-normal model for a mean; Normal-normal model for a mean vector; Normal-inverse Wishart model for a covariance matrix; Jeffreys' prior for a normal model; Jeffreys' prior for multiple linear regression; Convergence of the Gibbs sampler; Marginal distribution of a normal mean under Jeffreys' prior; Marginal posterior of the regression coefficients under Jeffreys' prior; Proof of posterior consistency); Computational algorithms (Integrated nested Laplace approximation (INLA); Metropolis-adjusted Langevin algorithm; Hamiltonian Monte Carlo (HMC); Delayed Rejection and Adaptive Metropolis; Slice sampling); Software comparison (Example: Simple linear regression; Example: Random slopes model)
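    For readers curious what the MCMC material in Chapter 3 involves, a random-walk Metropolis sampler for the mean of normal data can be written in a few lines of R. This is an illustrative sketch under assumed data (known standard deviation 1, Normal(0, 10^2) prior), not code from the book.

        # Random-walk Metropolis for a normal mean (illustrative sketch, assumed data and prior)
        set.seed(1)
        y <- rnorm(30, mean = 2, sd = 1)                   # hypothetical data with known sd = 1
        log_post <- function(mu) {                         # log posterior up to a constant
          sum(dnorm(y, mu, 1, log = TRUE)) + dnorm(mu, 0, 10, log = TRUE)
        }
        S <- 5000
        mu <- numeric(S)                                   # chain starts at mu[1] = 0
        for (s in 2:S) {
          prop <- rnorm(1, mu[s - 1], 0.5)                 # symmetric random-walk proposal
          log_r <- log_post(prop) - log_post(mu[s - 1])    # log acceptance ratio
          mu[s] <- if (log(runif(1)) < log_r) prop else mu[s - 1]
        }
        mean(mu[-(1:1000)])                                # posterior mean after discarding burn-in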


