Bayesian Analysis with Python (eBook)


Author: Osvaldo Martin

Publisher: Packt Publishing

Publication date: 2016-11-01

Word count: 2.474 million

Category: Imported Books > Foreign-Language Originals > Computers/Internet


Book description
Unleash the power and flexibility of the Bayesian framework.

About This Book

  • Simplify the Bayesian workflow for solving complex statistical problems using Python
  • A tutorial guide that takes you through Bayesian analysis with the help of sample problems and practice exercises
  • Learn how and when to use Bayesian analysis in your applications

Who This Book Is For

Students, researchers, and data scientists who wish to learn Bayesian data analysis with Python and implement probabilistic models in their day-to-day projects. Programming experience with Python is essential. No previous statistical knowledge is assumed.

What You Will Learn

  • Understand the essential Bayesian concepts from a practical point of view
  • Learn how to build probabilistic models using the Python library PyMC3
  • Acquire the skills to sanity-check your models and modify them if necessary
  • Add structure to your models and get the advantages of hierarchical models
  • Find out how different models can be used to answer different data analysis questions
  • When in doubt, learn to choose between alternative models
  • Predict continuous target outcomes using regression analysis, or assign classes using logistic and softmax regression
  • Learn how to think probabilistically and unleash the power and flexibility of the Bayesian framework

In Detail

The purpose of this book is to teach the main concepts of Bayesian data analysis. We will learn how to effectively use PyMC3, a Python library for probabilistic programming, to perform Bayesian parameter estimation and to check and validate models. The book begins by presenting the key concepts of the Bayesian framework and the main advantages of this approach from a practical point of view. Moving on, we will explore the power and flexibility of generalized linear models and how to adapt them to a wide array of problems, including regression and classification. We will also look into mixture models and clustering data, and we will finish with advanced topics such as non-parametric models and Gaussian processes. With the help of Python and PyMC3, you will learn to implement, check, and expand Bayesian models to solve data analysis problems.

Style and Approach

Bayesian methods are widely used in statistics, machine learning, artificial intelligence, and data mining. This is a practical guide that enables readers to use Bayesian methods for statistical modelling and analysis using Python.
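To give a flavor of the workflow the book teaches, the sketch below shows a minimal coin-flipping model in PyMC3, the probabilistic-programming library used throughout. It is an illustrative sketch, not an excerpt from the book: the data (4 heads in 10 tosses) and the variable names are invented for this example.

    # Minimal Bayesian coin-flip model in PyMC3 (illustrative sketch;
    # data and variable names are invented, not taken from the book).
    import pymc3 as pm

    heads, tosses = 4, 10  # hypothetical observed data

    with pm.Model() as coin_model:
        # Prior: Beta(1, 1) is uniform over the probability of heads
        theta = pm.Beta('theta', alpha=1, beta=1)
        # Likelihood: number of heads in `tosses` flips, given theta
        y = pm.Binomial('y', n=tosses, p=theta, observed=heads)
        # "Pushing the inference button": draw posterior samples via MCMC
        trace = pm.sample(1000)

    # Posterior mean, credible interval, and convergence diagnostics
    pm.summary(trace)

This is the pattern the chapters build on: declare priors and a likelihood inside a model context, sample, then summarize and diagnose the posterior.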
Table of contents

Bayesian Analysis with Python

Table of Contents

Bayesian Analysis with Python

Credits

About the Author

About the Reviewer

www.PacktPub.com

eBooks, discount offers, and more

Why subscribe?

Preface

What this book covers

What you need for this book

Who this book is for

Conventions

Reader feedback

Customer support

Downloading the example code

Downloading the color images of this book

Errata

Piracy

Questions

1. Thinking Probabilistically – A Bayesian Inference Primer

Statistics as a form of modeling

Exploratory data analysis

Inferential statistics

Probabilities and uncertainty

Probability distributions

Bayes' theorem and statistical inference

Single parameter inference

The coin-flipping problem

The general model

Choosing the likelihood

Choosing the prior

Getting the posterior

Computing and plotting the posterior

Influence of the prior and how to choose one

Communicating a Bayesian analysis

Model notation and visualization

Summarizing the posterior

Highest posterior density

Posterior predictive checks

Installing the necessary Python packages

Summary

Exercises

2. Programming Probabilistically – A PyMC3 Primer

Probabilistic programming

Inference engines

Non-Markovian methods

Grid computing

Quadratic method

Variational methods

Markovian methods

Monte Carlo

Markov chain

Metropolis-Hastings

Hamiltonian Monte Carlo/NUTS

Other MCMC methods

PyMC3 introduction

Coin-flipping, the computational approach

Model specification

Pushing the inference button

Diagnosing the sampling process

Convergence

Autocorrelation

Effective size

Summarizing the posterior

Posterior-based decisions

ROPE

Loss functions

Summary

Keep reading

Exercises

3. Juggling with Multi-Parametric and Hierarchical Models

Nuisance parameters and marginalized distributions

Gaussians, Gaussians, Gaussians everywhere

Gaussian inferences

Robust inferences

Student's t-distribution

Comparing groups

The tips dataset

Cohen's d

Probability of superiority

Hierarchical models

Shrinkage

Summary

Keep reading

Exercises

4. Understanding and Predicting Data with Linear Regression Models

Simple linear regression

The machine learning connection

The core of linear regression models

Linear models and high autocorrelation

Modifying the data before running

Changing the sampling method

Interpreting and visualizing the posterior

Pearson correlation coefficient

Pearson coefficient from a multivariate Gaussian

Robust linear regression

Hierarchical linear regression

Correlation, causation, and the messiness of life

Polynomial regression

Interpreting the parameters of a polynomial regression

Polynomial regression – the ultimate model?

Multiple linear regression

Confounding variables and redundant variables

Multicollinearity or when the correlation is too high

Masking effect variables

Adding interactions

The GLM module

Summary

Keep reading

Exercises

5. Classifying Outcomes with Logistic Regression

Logistic regression

The logistic model

The iris dataset

The logistic model applied to the iris dataset

Making predictions

Multiple logistic regression

The decision boundary

Implementing the model

Dealing with correlated variables

Dealing with unbalanced classes

How do we solve this problem?

Interpreting the coefficients of a logistic regression

Generalized linear models

Softmax regression or multinomial logistic regression

Discriminative and generative models

Summary

Keep reading

Exercises

6. Model Comparison

Occam's razor – simplicity and accuracy

Too many parameters lead to overfitting

Too few parameters lead to underfitting

The balance between simplicity and accuracy

Regularizing priors

Regularizing priors and hierarchical models

Predictive accuracy measures

Cross-validation

Information criteria

The log-likelihood and the deviance

Akaike information criterion

Deviance information criterion

Widely applicable information criterion

Pareto smoothed importance sampling leave-one-out cross-validation

Bayesian information criterion

Computing information criteria with PyMC3

A note on the reliability of WAIC and LOO computations

Interpreting and using information criteria measures

Posterior predictive checks

Bayes factors

Analogy with information criteria

Computing Bayes factors

Common problems computing Bayes factors

Bayes factors and information criteria

Summary

Keep reading

Exercises

7. Mixture Models

Mixture models

How to build mixture models

Marginalized Gaussian mixture model

Mixture models and count data

The Poisson distribution

The Zero-Inflated Poisson model

Poisson regression and ZIP regression

Robust logistic regression

Model-based clustering

Fixed component clustering

Non-fixed component clustering

Continuous mixtures

Beta-binomial and negative binomial

The Student's t-distribution

Summary

Keep reading

Exercises

8. Gaussian Processes

Non-parametric statistics

Kernel-based models

The Gaussian kernel

Kernelized linear regression

Overfitting and priors

Gaussian processes

Building the covariance matrix

Sampling from a GP prior

Using a parameterized kernel

Making predictions from a GP

Implementing a GP using PyMC3

Posterior predictive checks

Periodic kernel

Summary

Keep reading

Exercises

Index
