Title Page
Copyright
Machine Learning for OpenCV
Credits
Foreword
About the Author
About the Reviewers
www.PacktPub.com
Why subscribe?
Customer Feedback
Dedication
Preface
What this book covers
What you need for this book
Who this book is for
Conventions
Reader feedback
Customer support
Downloading the example code
Errata
Piracy
Questions
A Taste of Machine Learning
Getting started with machine learning
Problems that machine learning can solve
Getting started with Python
Getting started with OpenCV
Installation
Getting the latest code for this book
Getting to grips with Python's Anaconda distribution
Installing OpenCV in a conda environment
Verifying the installation
Getting a glimpse of OpenCV's ML module
Summary
Working with Data in OpenCV and Python
Understanding the machine learning workflow
Dealing with data using OpenCV and Python
Starting a new IPython or Jupyter session
Dealing with data using Python's NumPy package
Importing NumPy
Understanding NumPy arrays
Accessing single array elements by indexing
Creating multidimensional arrays
Loading external datasets in Python
Visualizing the data using Matplotlib
Importing Matplotlib
Producing a simple plot
Visualizing data from an external dataset
Dealing with data using OpenCV's TrainData container in C++
Summary
First Steps in Supervised Learning
Understanding supervised learning
Having a look at supervised learning in OpenCV
Measuring model performance with scoring functions
Scoring classifiers using accuracy, precision, and recall
Scoring regressors using mean squared error, explained variance, and R squared
Using classification models to predict class labels
Understanding the k-NN algorithm
Implementing k-NN in OpenCV
Generating the training data
Training the classifier
Predicting the label of a new data point
Using regression models to predict continuous outcomes
Understanding linear regression
Using linear regression to predict Boston housing prices
Loading the dataset
Training the model
Testing the model
Applying Lasso and ridge regression
Classifying iris species using logistic regression
Understanding logistic regression
Loading the training data
Making it a binary classification problem
Inspecting the data
Splitting the data into training and test sets
Training the classifier
Testing the classifier
Summary
Representing Data and Engineering Features
Understanding feature engineering
Preprocessing data
Standardizing features
Normalizing features
Scaling features to a range
Binarizing features
Handling the missing data
Understanding dimensionality reduction
Implementing Principal Component Analysis (PCA) in OpenCV
Implementing Independent Component Analysis (ICA)
Implementing Non-negative Matrix Factorization (NMF)
Representing categorical variables
Representing text features
Representing images
Using color spaces
Encoding images in RGB space
Encoding images in HSV and HLS space
Detecting corners in images
Using the Scale-Invariant Feature Transform (SIFT)
Using Speeded Up Robust Features (SURF)
Summary
Using Decision Trees to Make a Medical Diagnosis
Understanding decision trees
Building our first decision tree
Understanding the task by understanding the data
Preprocessing the data
Constructing the tree
Visualizing a trained decision tree
Investigating the inner workings of a decision tree
Rating the importance of features
Understanding the decision rules
Controlling the complexity of decision trees
Using decision trees to diagnose breast cancer
Loading the dataset
Building the decision tree
Using decision trees for regression
Summary
Detecting Pedestrians with Support Vector Machines
Understanding linear support vector machines
Learning optimal decision boundaries
Implementing our first support vector machine
Generating the dataset
Visualizing the dataset
Preprocessing the dataset
Building the support vector machine
Visualizing the decision boundary
Dealing with nonlinear decision boundaries
Understanding the kernel trick
Knowing our kernels
Implementing nonlinear support vector machines
Detecting pedestrians in the wild
Obtaining the dataset
Taking a glimpse at the histogram of oriented gradients (HOG)
Generating negatives
Implementing the support vector machine
Bootstrapping the model
Detecting pedestrians in a larger image
Further improving the model
Summary
Implementing a Spam Filter with Bayesian Learning
Understanding Bayesian inference
Taking a short detour on probability theory
Understanding Bayes' theorem
Understanding the naive Bayes classifier
Implementing your first Bayesian classifier
Creating a toy dataset
Classifying the data with a normal Bayes classifier
Classifying the data with a naive Bayes classifier
Visualizing conditional probabilities
Classifying emails using the naive Bayes classifier
Loading the dataset
Building a data matrix using Pandas
Preprocessing the data
Training a normal Bayes classifier
Training on the full dataset
Using n-grams to improve the result
Using tf-idf to improve the result
Summary
Discovering Hidden Structures with Unsupervised Learning
Understanding unsupervised learning
Understanding k-means clustering
Implementing our first k-means example
Understanding expectation-maximization
Implementing our own expectation-maximization solution
Knowing the limitations of expectation-maximization
First caveat: No guarantee of finding the global optimum
Second caveat: We must select the number of clusters beforehand
Third caveat: Cluster boundaries are linear
Fourth caveat: k-means is slow for a large number of samples
Compressing color spaces using k-means
Visualizing the true-color palette
Reducing the color palette using k-means
Classifying handwritten digits using k-means
Loading the dataset
Running k-means
Organizing clusters as a hierarchical tree
Understanding hierarchical clustering
Implementing agglomerative hierarchical clustering
Summary
Using Deep Learning to Classify Handwritten Digits
Understanding the McCulloch-Pitts neuron
Understanding the perceptron
Implementing your first perceptron
Generating a toy dataset
Fitting the perceptron to data
Evaluating the perceptron classifier
Applying the perceptron to data that is not linearly separable
Understanding multilayer perceptrons
Understanding gradient descent
Training multilayer perceptrons with backpropagation
Implementing a multilayer perceptron in OpenCV
Preprocessing the data
Creating an MLP classifier in OpenCV
Customizing the MLP classifier
Training and testing the MLP classifier
Getting acquainted with deep learning
Getting acquainted with Keras
Classifying handwritten digits
Loading the MNIST dataset
Preprocessing the MNIST dataset
Training an MLP using OpenCV
Training a deep neural net using Keras
Preprocessing the MNIST dataset
Creating a convolutional neural network
Fitting the model
Summary
Combining Different Algorithms into an Ensemble
Understanding ensemble methods
Understanding averaging ensembles
Implementing a bagging classifier
Implementing a bagging regressor
Understanding boosting ensembles
Implementing a boosting classifier
Implementing a boosting regressor
Understanding stacking ensembles
Combining decision trees into a random forest
Understanding the shortcomings of decision trees
Implementing our first random forest
Implementing a random forest with scikit-learn
Implementing extremely randomized trees
Using random forests for face recognition
Loading the dataset
Preprocessing the dataset
Training and testing the random forest
Implementing AdaBoost
Implementing AdaBoost in OpenCV
Implementing AdaBoost in scikit-learn
Combining different models into a voting classifier
Understanding different voting schemes
Implementing a voting classifier
Summary
Selecting the Right Model with Hyperparameter Tuning
Evaluating a model
Evaluating a model the wrong way
Evaluating a model in the right way
Selecting the best model
Understanding cross-validation
Manually implementing cross-validation in OpenCV
Using scikit-learn for k-fold cross-validation
Implementing leave-one-out cross-validation
Estimating robustness using bootstrapping
Manually implementing bootstrapping in OpenCV
Assessing the significance of our results
Implementing Student's t-test
Implementing McNemar's test
Tuning hyperparameters with grid search
Implementing a simple grid search
Understanding the value of a validation set
Combining grid search with cross-validation
Combining grid search with nested cross-validation
Scoring models using different evaluation metrics
Choosing the right classification metric
Choosing the right regression metric
Chaining algorithms together to form a pipeline
Implementing pipelines in scikit-learn
Using pipelines in grid searches
Summary
Wrapping Up
Approaching a machine learning problem
Building your own estimator
Writing your own OpenCV-based classifier in C++
Writing your own scikit-learn-based classifier in Python
Where to go from here?
Summary