Ensemble Machine Learning Cookbook (e-book)


Author: Dipayan Sarkar

Publisher: Packt Publishing

Publication date: 2019-01-31

Word count: 329,000


Implement machine learning algorithms to build ensemble models using Keras, H2O, scikit-learn, pandas, and more.

Key Features

  • Apply popular machine learning algorithms using a recipe-based approach
  • Implement boosting, bagging, and stacking ensemble methods to improve machine learning models
  • Discover real-world ensemble applications and tackle complex challenges in Kaggle competitions

Book Description

Ensemble modeling is an approach used to improve the performance of machine learning models. It combines two or more similar or dissimilar machine learning algorithms to deliver superior predictive power. This book will help you implement popular machine learning algorithms covering the different paradigms of ensemble machine learning, such as boosting, bagging, and stacking.

The Ensemble Machine Learning Cookbook starts by getting you acquainted with the basics of ensemble techniques and exploratory data analysis. You'll then implement tasks based on statistical and machine learning algorithms to understand ensembles of multiple heterogeneous algorithms, and you won't miss out on key topics such as resampling methods. As you progress, you'll gain a better understanding of bagging, boosting, stacking, and the Random Forest algorithm through real-world examples. The book highlights how these ensemble methods use multiple models to improve machine learning results compared to a single model.

In the concluding chapters, you'll delve into advanced ensemble models using neural networks, natural language processing, and more. You'll also build models for applications such as fraud detection, text categorization, and sentiment analysis. By the end of this book, you'll be able to harness ensemble techniques and the working mechanisms of machine learning algorithms to build intelligent models using individual recipes.
What you will learn

  • Understand how to use machine learning algorithms for regression and classification problems
  • Implement ensemble techniques such as averaging, weighted averaging, and max-voting
  • Get to grips with advanced ensemble methods, such as bootstrapping, bagging, and stacking
  • Use Random Forest for tasks such as classification and regression
  • Implement ensembles of homogeneous and heterogeneous machine learning algorithms
  • Learn and implement various boosting techniques, such as AdaBoost, Gradient Boosting Machine, and XGBoost

Who this book is for

This book is designed for data scientists, machine learning developers, and deep learning enthusiasts who want to delve into machine learning algorithms to build powerful ensemble models. Working knowledge of Python programming and basic statistics is a must to help you grasp the concepts in the book.
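The averaging and max-voting techniques listed above can be sketched in a few lines of plain Python. This is a minimal illustration, not code from the book; the per-model predictions below are made-up labels from three hypothetical classifiers:

```python
from collections import Counter

def max_vote(predictions):
    """Hard voting: return the majority class label across models for each sample.

    predictions is a list of per-model prediction lists, all the same length.
    """
    return [Counter(sample).most_common(1)[0][0] for sample in zip(*predictions)]

def average(probabilities):
    """Soft voting: average predicted probabilities across models for each sample."""
    return [sum(sample) / len(sample) for sample in zip(*probabilities)]

# Labels predicted by three hypothetical classifiers for four samples:
model_a = [1, 0, 1, 1]
model_b = [1, 1, 0, 1]
model_c = [0, 1, 1, 1]
print(max_vote([model_a, model_b, model_c]))  # [1, 1, 1, 1]
```

In practice you would use scikit-learn's `VotingClassifier` (with `voting="hard"` or `voting="soft"`) rather than hand-rolling the combination step, but the underlying logic is exactly this per-sample majority count or probability average.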
Table of Contents

Title Page

Copyright and Credits

Ensemble Machine Learning Cookbook

About Packt

Why subscribe?

Packt.com

Foreword

Contributors

About the authors

About the reviewers

Packt is searching for authors like you

Preface

Who this book is for

What this book covers

To get the most out of this book

Download the example code files

Download the color images

Conventions used

Sections

Getting ready

How to do it…

How it works…

There's more…

See also

Get in touch

Reviews

Get Closer to Your Data

Introduction

Data manipulation with Python

Getting ready

How to do it...

How it works...

There's more...

See also

Analyzing, visualizing, and treating missing values

How to do it...

How it works...

There's more...

See also

Exploratory data analysis

How to do it...

How it works...

There's more...

See also

Getting Started with Ensemble Machine Learning

Introduction to ensemble machine learning

Max-voting

Getting ready

How to do it...

How it works...

There's more...

Averaging

Getting ready

How to do it...

How it works...

Weighted averaging

Getting ready

How to do it...

How it works...

See also

Resampling Methods

Introduction to sampling

Getting ready

How to do it...

How it works...

There's more...

See also

k-fold and leave-one-out cross-validation

Getting ready

How to do it...

How it works...

There's more...

See also

Bootstrapping

Getting ready

How to do it...

How it works...

See also

Statistical and Machine Learning Algorithms

Technical requirements

Multiple linear regression

Getting ready

How to do it...

How it works...

There's more...

See also

Logistic regression

Getting ready

How to do it...

How it works...

See also

Naive Bayes

Getting ready

How to do it...

How it works...

There's more...

See also

Decision trees

Getting ready

How to do it...

How it works...

There's more...

See also

Support vector machines

Getting ready

How to do it...

How it works...

There's more...

See also

Bag the Models with Bagging

Introduction

Bootstrap aggregation

Getting ready

How to do it...

How it works...

See also

Ensemble meta-estimators

Bagging classifiers

How to do it...

How it works...

There's more...

See also

Bagging regressors

Getting ready

How to do it...

How it works...

See also

When in Doubt, Use Random Forests

Introduction to random forests

Implementing a random forest for predicting credit card defaults using scikit-learn

Getting ready

How to do it...

How it works...

There's more...

See also

Implementing random forest for predicting credit card defaults using H2O

Getting ready

How to do it...

How it works...

There's more...

See also

Boosting Model Performance with Boosting

Introduction to boosting

Implementing AdaBoost for disease risk prediction using scikit-learn

Getting ready

How to do it...

How it works...

There's more...

See also

Implementing a gradient boosting machine for disease risk prediction using scikit-learn

Getting ready

How to do it...

How it works...

There's more...

Implementing the extreme gradient boosting method for glass identification using XGBoost with scikit-learn

Getting ready

How to do it...

How it works...

There's more...

See also

Blend It with Stacking

Technical requirements

Understanding stacked generalization

Implementing stacked generalization by combining predictions

Getting ready

How to do it...

How it works...

There's more...

See also

Implementing stacked generalization for campaign outcome prediction using H2O

Getting ready

How to do it...

How it works...

There's more...

See also

Homogeneous Ensembles Using Keras

Introduction

An ensemble of homogeneous models for energy prediction

Getting ready

How to do it...

How it works...

There's more...

See also

An ensemble of homogeneous models for handwritten digit classification

Getting ready

How to do it...

How it works...

Heterogeneous Ensemble Classifiers Using H2O

Introduction

Predicting credit card defaulters using heterogeneous ensemble classifiers

Getting ready

How to do it...

How it works...

There's more...

See also

Heterogeneous Ensemble for Text Classification Using NLP

Introduction

Spam filtering using an ensemble of heterogeneous algorithms

Getting ready

How to do it...

How it works...

Sentiment analysis of movie reviews using an ensemble model

Getting ready

How to do it...

How it works...

There's more...

Homogeneous Ensemble for Multiclass Classification Using Keras

Introduction

An ensemble of homogeneous models to classify fashion products

Getting ready

How to do it...

How it works...

See also

Other Books You May Enjoy

Leave a review - let other readers know what you think
