Machine Learning Algorithms (e-book)


Author: Giuseppe Bonaccorso

Publisher: Packt Publishing

Publication date: 2017-07-24

Word count: 380,000


  • Book Description
  • Table of Contents
  • Reviews (0)
Build a strong foundation for entering the world of machine learning and data science with the help of this comprehensive guide.

About This Book
  • Get started in the field of machine learning with the help of this solid, concept-rich, yet highly practical guide.
  • Your one-stop solution for everything that matters in mastering the whats and whys of machine learning algorithms and their implementation.
  • Get a solid foundation for your entry into machine learning by strengthening your roots (algorithms) with this comprehensive guide.

Who This Book Is For
This book is for IT professionals who want to enter the field of data science and are very new to machine learning. Familiarity with languages such as R and Python will be invaluable here.

What You Will Learn
  • Acquaint yourself with the important elements of machine learning
  • Understand the feature selection and feature engineering process
  • Assess performance and error trade-offs for linear regression
  • Build a data model and understand how it works by using different types of algorithms
  • Learn to tune the parameters of support vector machines
  • Apply clustering to a dataset
  • Explore the concepts of natural language processing and recommendation systems
  • Create an ML architecture from scratch

In Detail
As the amount of data continues to grow at an almost incomprehensible rate, being able to understand and process data is becoming a key differentiator for competitive organizations. Machine learning applications are everywhere, from self-driving cars, spam detection, document search, and trading strategies to speech recognition. This makes machine learning well suited to the present-day era of big data and data science. The main challenge is how to transform data into actionable knowledge.

In this book you will learn all the important machine learning algorithms that are commonly used in the field of data science. These algorithms can be used for supervised as well as unsupervised learning, reinforcement learning, and semi-supervised learning. Among the algorithms and techniques covered are linear regression, logistic regression, SVM, naive Bayes, k-means, random forests, TensorFlow, and feature engineering. You will also learn how these algorithms work and how to implement them in practice to solve your own problems. The book also introduces natural language processing and recommendation systems, which help you run multiple algorithms simultaneously. On completing the book you will have mastered selecting machine learning algorithms for clustering, classification, or regression based on your problem.

Style and approach
An easy-to-follow, step-by-step guide that will help you get to grips with real-world applications of machine learning algorithms.
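For orientation, here is a minimal sketch (not taken from the book) of the kind of scikit-learn workflow the description refers to: load a toy dataset, split it into training and test sets, fit a classifier, and evaluate it. The dataset, model, and parameter values below are illustrative assumptions rather than the book's own examples.

# Minimal illustrative sketch; dataset, model, and parameters are arbitrary choices.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

# Load a built-in toy dataset and split it into training and test sets
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=1)

# Fit a support vector classifier with an RBF kernel and score it on the held-out data
clf = SVC(kernel='rbf', C=1.0)
clf.fit(X_train, y_train)
print(accuracy_score(y_test, clf.predict(X_test)))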
Table of Contents

Title Page

Copyright

Machine Learning Algorithms

Credits

About the Author

About the Reviewers

www.PacktPub.com

Why subscribe?

Customer Feedback

Preface

What this book covers

What you need for this book

Who this book is for

Conventions

Reader feedback

Customer support

Downloading the example code

Downloading the color images of this book

Errata

Piracy

Questions

A Gentle Introduction to Machine Learning

Introduction - classic and adaptive machines

Only learning matters

Supervised learning

Unsupervised learning

Reinforcement learning

Beyond machine learning - deep learning and bio-inspired adaptive systems

Machine learning and big data

Further reading

Summary

Important Elements in Machine Learning

Data formats

Multiclass strategies

One-vs-all

One-vs-one

Learnability

Underfitting and overfitting

Error measures

PAC learning

Statistical learning approaches

MAP learning

Maximum-likelihood learning

Elements of information theory

References

Summary

Feature Selection and Feature Engineering

scikit-learn toy datasets

Creating training and test sets

Managing categorical data

Managing missing features

Data scaling and normalization

Feature selection and filtering

Principal component analysis

Non-negative matrix factorization

Sparse PCA

Kernel PCA

Atom extraction and dictionary learning

References

Summary

Linear Regression

Linear models

A bidimensional example

Linear regression with scikit-learn and higher dimensionality

Regressor analytic expression

Ridge, Lasso, and ElasticNet

Robust regression with random sample consensus

Polynomial regression

Isotonic regression

References

Summary

Logistic Regression

Linear classification

Logistic regression

Implementation and optimizations

Stochastic gradient descent algorithms

Finding the optimal hyperparameters through grid search

Classification metrics

ROC curve

Summary

Naive Bayes

Bayes' theorem

Naive Bayes classifiers

Naive Bayes in scikit-learn

Bernoulli naive Bayes

Multinomial naive Bayes

Gaussian naive Bayes

References

Summary

Support Vector Machines

Linear support vector machines

scikit-learn implementation

Linear classification

Kernel-based classification

Radial Basis Function

Polynomial kernel

Sigmoid kernel

Custom kernels

Non-linear examples

Controlled support vector machines

Support vector regression

References

Summary

Decision Trees and Ensemble Learning

Binary decision trees

Binary decisions

Impurity measures

Gini impurity index

Cross-entropy impurity index

Misclassification impurity index

Feature importance

Decision tree classification with scikit-learn

Ensemble learning

Random forests

Feature importance in random forests

AdaBoost

Gradient tree boosting

Voting classifier

References

Summary

Clustering Fundamentals

Clustering basics

K-means

Finding the optimal number of clusters

Optimizing the inertia

Silhouette score

Calinski-Harabasz index

Cluster instability

DBSCAN

Spectral clustering

Evaluation methods based on the ground truth

Homogeneity

Completeness

Adjusted rand index

References

Summary

Hierarchical Clustering

Hierarchical strategies

Agglomerative clustering

Dendrograms

Agglomerative clustering in scikit-learn

Connectivity constraints

References

Summary

Introduction to Recommendation Systems

Naive user-based systems

User-based system implementation with scikit-learn

Content-based systems

Model-free (or memory-based) collaborative filtering

Model-based collaborative filtering

Singular Value Decomposition strategy

Alternating least squares strategy

Alternating least squares with Apache Spark MLlib

References

Summary

Introduction to Natural Language Processing

NLTK and built-in corpora

Corpora examples

The bag-of-words strategy

Tokenizing

Sentence tokenizing

Word tokenizing

Stopword removal

Language detection

Stemming

Vectorizing

Count vectorizing

N-grams

Tf-idf vectorizing

A sample text classifier based on the Reuters corpus

References

Summary

Topic Modeling and Sentiment Analysis in NLP

Topic modeling

Latent semantic analysis

Probabilistic latent semantic analysis

Latent Dirichlet Allocation

Sentiment analysis

VADER sentiment analysis with NLTK

References

Summary

A Brief Introduction to Deep Learning and TensorFlow

Deep learning at a glance

Artificial neural networks

Deep architectures

Fully connected layers

Convolutional layers

Dropout layers

Recurrent neural networks

A brief introduction to TensorFlow

Computing gradients

Logistic regression

Classification with a multi-layer perceptron

Image convolution

A quick glimpse inside Keras

References

Summary

Creating a Machine Learning Architecture

Machine learning architectures

Data collection

Normalization

Dimensionality reduction

Data augmentation

Data conversion

Modeling/Grid search/Cross-validation

Visualization

scikit-learn tools for machine learning architectures

Pipelines

Feature unions

References

Summary
