Title Page
Copyright and Credits
Hands-On Natural Language Processing with Python
Packt Upsell
Why subscribe?
PacktPub.com
Foreword
Contributors
About the authors
About the reviewer
Packt is searching for authors like you
Preface
Who this book is for
What this book covers
To get the most out of this book
Download the example code files
Download the color images
Conventions used
Get in touch
Reviews
Getting Started
Basic concepts and terminologies in NLP
Text corpus or corpora
Paragraph
Sentences
Phrases and words
N-grams
Bag-of-words
Applications of NLP
Analyzing sentiment
Recognizing named entities
Linking entities
Translating text
Natural language inference
Semantic role labeling
Relation extraction
SQL query generation, or semantic parsing
Machine comprehension
Textual entailment
Coreference resolution
Searching
Question answering and chatbots
Converting text to voice
Converting voice to text
Speaker identification
Spoken dialog systems
Other applications
Summary
Text Classification and POS Tagging Using NLTK
Installing NLTK and its modules
Text preprocessing and exploratory analysis
Tokenization
Stemming
Removing stop words
Exploratory analysis of text
POS tagging
What is POS tagging?
Applications of POS tagging
Training a POS tagger
Training a sentiment classifier for movie reviews
Training a bag-of-words classifier
Summary
Deep Learning and TensorFlow
Deep learning
Perceptron
Activation functions
Sigmoid
Hyperbolic tangent
Rectified linear unit
Neural network
One-hot encoding
Softmax
Cross-entropy
Training neural networks
Backpropagation
Gradient descent
Stochastic gradient descent
Regularization techniques
Dropout
Batch normalization
L1 and L2 normalization
Convolutional neural network
Kernel
Max pooling
Recurrent neural network
Long short-term memory
TensorFlow
General-Purpose Graphics Processing Unit (GPGPU)
CUDA
cuDNN
Installation
Hello world!
Adding two numbers
TensorBoard
The Keras library
Summary
Semantic Embedding Using Shallow Models
Word vectors
The classical approach
Word2vec
The CBOW model
The skip-gram model
A comparison of skip-gram and CBOW model architectures
Building a skip-gram model
Visualization of word embeddings
From word to document embeddings
Sentence2vec
Doc2vec
Visualization of document embeddings
Summary
Text Classification Using LSTM
Data for text classification
Topic modeling
Topic modeling versus text classification
Deep learning meta architecture for text classification
Embedding layer
Deep representation
Fully connected part
Identifying spam in YouTube video comments using RNNs
Classifying news articles by topic using a CNN
Transfer learning using GloVe embeddings
Multi-label classification
Binary relevance
Deep learning for multi-label classification
Attention networks for document classification
Summary
Searching and Deduplicating Using CNNs
Data
Data description
Training the model
Encoding the text
Modeling with CNN
Training
Inference
Summary
Named Entity Recognition Using Character LSTM
NER with deep learning
Data
Model
Word embeddings
Walking through the code
Input
Word embedding
The effects of different pretrained word embeddings
Neural network architecture
Decoding predictions
The training step
Scope for improvement
Summary
Text Generation and Summarization Using GRUs
Generating text using RNNs
Generating Linux kernel code with a GRU
Text summarization
Extractive summarization
Summarization using gensim
Abstractive summarization
Encoder-decoder architecture
Encoder
Decoder
News summarization using GRU
Data preparation
Encoder network
Decoder network
Sequence to sequence
Building the graph
Training
Inference
TensorBoard visualization
State-of-the-art abstractive text summarization
Summary
Question-Answering and Chatbots Using Memory Networks
The question-answering task
Question-answering datasets
Memory networks for question-answering
Memory network pipeline overview
Writing a memory network in TensorFlow
Class constructor
Input module
Question module
Memory module
Output module
Putting it together
Extending memory networks for dialog modeling
Dialog datasets
The bAbI dialog dataset
Raw data format
Writing a chatbot in TensorFlow
Loading dialog datasets in the QA format
Vectorizing the data
Wrapping the memory network model in a chatbot class
Class constructor
Building a vocabulary for word embedding lookup
Training the chatbot model
Evaluating the chatbot on the testing set
Interacting with the chatbot
Putting it all together
Example of an interactive conversation
Literature on and related to memory networks
Summary
Machine Translation Using the Attention-Based Model
Overview of machine translation
Statistical machine translation
English to French using NLTK SMT models
Neural machine translation
Encoder-decoder network
Encoder-decoder with attention
NMT for French to English using attention
Data preparation
Encoder network
Decoder network
Sequence-to-sequence model
Building the graph
Training
Inference
TensorBoard visualization
Summary
Speech Recognition Using DeepSpeech
Overview of speech recognition
Building an RNN model for speech recognition
Audio signal representation
LSTM model for spoken digit recognition
TensorBoard visualization
Speech to text using the DeepSpeech architecture
Overview of the DeepSpeech model
Speech recordings dataset
Preprocessing the audio data
Creating the model
TensorBoard visualization
State-of-the-art in speech recognition
Summary
Text-to-Speech Using Tacotron
Overview of text-to-speech
Naturalness versus intelligibility
How is the performance of a TTS system evaluated?
Traditional techniques – concatenative and parametric models
A few reminders on spectrograms and the mel scale
TTS in deep learning
WaveNet, in brief
Tacotron
The encoder
The attention-based decoder
The Griffin-Lim-based postprocessing module
Details of the architecture
Limitations
Implementation of Tacotron with Keras
The dataset
Data preparation
Preparation of text data
Preparation of audio data
Implementation of the architecture
Pre-net
Encoder and postprocessing CBHG
Attention RNN
Decoder RNN
The attention mechanism
Full architecture, with attention
Training and testing
Summary
Deploying Trained Models
Increasing performance
Quantizing the weights
MobileNets
TensorFlow Serving
Exporting the trained model
Serving the exported model
Deploying in the cloud
Amazon Web Services
Google Cloud Platform
Deploying on mobile devices
iPhone
Android
Summary
Other Books You May Enjoy
Leave a review - let other readers know what you think