Bayesian Analysis with Python
Table of Contents
Bayesian Analysis with Python
Credits
About the Author
About the Reviewer
www.PacktPub.com
eBooks, discount offers, and more
Why subscribe?
Preface
What this book covers
What you need for this book
Who this book is for
Conventions
Reader feedback
Customer support
Downloading the example code
Downloading the color images of this book
Errata
Piracy
Questions
1. Thinking Probabilistically – A Bayesian Inference Primer
Statistics as a form of modeling
Exploratory data analysis
Inferential statistics
Probabilities and uncertainty
Probability distributions
Bayes' theorem and statistical inference
Single parameter inference
The coin-flipping problem
The general model
Choosing the likelihood
Choosing the prior
Getting the posterior
Computing and plotting the posterior
Influence of the prior and how to choose one
Communicating a Bayesian analysis
Model notation and visualization
Summarizing the posterior
Highest posterior density
Posterior predictive checks
Installing the necessary Python packages
Summary
Exercises
2. Programming Probabilistically – A PyMC3 Primer
Probabilistic programming
Inference engines
Non-Markovian methods
Grid computing
Quadratic method
Variational methods
Markovian methods
Monte Carlo
Markov chain
Metropolis-Hastings
Hamiltonian Monte Carlo/NUTS
Other MCMC methods
PyMC3 introduction
Coin-flipping, the computational approach
Model specification
Pushing the inference button
Diagnosing the sampling process
Convergence
Autocorrelation
Effective size
Summarizing the posterior
Posterior-based decisions
ROPE
Loss functions
Summary
Keep reading
Exercises
3. Juggling with Multi-Parametric and Hierarchical Models
Nuisance parameters and marginalized distributions
Gaussians, Gaussians, Gaussians everywhere
Gaussian inferences
Robust inferences
Student's t-distribution
Comparing groups
The tips dataset
Cohen's d
Probability of superiority
Hierarchical models
Shrinkage
Summary
Keep reading
Exercises
4. Understanding and Predicting Data with Linear Regression Models
Simple linear regression
The machine learning connection
The core of linear regression models
Linear models and high autocorrelation
Modifying the data before running
Changing the sampling method
Interpreting and visualizing the posterior
Pearson correlation coefficient
Pearson coefficient from a multivariate Gaussian
Robust linear regression
Hierarchical linear regression
Correlation, causation, and the messiness of life
Polynomial regression
Interpreting the parameters of a polynomial regression
Polynomial regression – the ultimate model?
Multiple linear regression
Confounding variables and redundant variables
Multicollinearity or when the correlation is too high
Masking effect variables
Adding interactions
The GLM module
Summary
Keep reading
Exercises
5. Classifying Outcomes with Logistic Regression
Logistic regression
The logistic model
The iris dataset
The logistic model applied to the iris dataset
Making predictions
Multiple logistic regression
The decision boundary
Implementing the model
Dealing with correlated variables
Dealing with unbalanced classes
How do we solve this problem?
Interpreting the coefficients of a logistic regression
Generalized linear models
Softmax regression or multinomial logistic regression
Discriminative and generative models
Summary
Keep reading
Exercises
6. Model Comparison
Occam's razor – simplicity and accuracy
Too many parameters lead to overfitting
Too few parameters lead to underfitting
The balance between simplicity and accuracy
Regularizing priors
Regularizing priors and hierarchical models
Predictive accuracy measures
Cross-validation
Information criteria
The log-likelihood and the deviance
Akaike information criterion
Deviance information criterion
Widely applicable information criterion
Pareto smoothed importance sampling leave-one-out cross-validation
Bayesian information criterion
Computing information criteria with PyMC3
A note on the reliability of WAIC and LOO computations
Interpreting and using information criteria measures
Posterior predictive checks
Bayes factors
Analogy with information criteria
Computing Bayes factors
Common problems computing Bayes factors
Bayes factors and information criteria
Summary
Keep reading
Exercises
7. Mixture Models
Mixture models
How to build mixture models
Marginalized Gaussian mixture model
Mixture models and count data
The Poisson distribution
The Zero-Inflated Poisson model
Poisson regression and ZIP regression
Robust logistic regression
Model-based clustering
Fixed component clustering
Non-fixed component clustering
Continuous mixtures
Beta-binomial and negative binomial
The Student's t-distribution
Summary
Keep reading
Exercises
8. Gaussian Processes
Non-parametric statistics
Kernel-based models
The Gaussian kernel
Kernelized linear regression
Overfitting and priors
Gaussian processes
Building the covariance matrix
Sampling from a GP prior
Using a parameterized kernel
Making predictions from a GP
Implementing a GP using PyMC3
Posterior predictive checks
Periodic kernel
Summary
Keep reading
Exercises
Index