Hands-On Natural Language Processing with Python (eBook)

Authors: Rajesh Arumugam, Rajalingappaa Shanmugamani

Publisher: Packt Publishing

Publication date: 2018-07-18

Word count: 363,000

Category: Imported Books > Foreign-Language Originals > Computers/Internet

Foster your NLP applications with the help of deep learning, NLTK, and TensorFlow.

Key Features

  • Weave neural networks into linguistic applications across various platforms
  • Perform NLP tasks and train models using NLTK and TensorFlow
  • Boost your NLP models with strong deep learning architectures such as CNNs and RNNs

Book Description

Natural language processing (NLP) has found application in various domains, such as web search, advertising, and customer service, and with the help of deep learning, we can enhance its performance in these areas. Hands-On Natural Language Processing with Python teaches you how to leverage deep learning models for various NLP tasks, along with best practices for dealing with today's NLP challenges.

To begin with, you will understand the core concepts of NLP and deep learning, such as convolutional neural networks (CNNs), recurrent neural networks (RNNs), semantic embedding, Word2vec, and more. You will learn how to perform NLP tasks with neural networks, training and deploying them in your NLP applications. You will become accustomed to using RNNs and CNNs in application areas such as text classification and sequence labeling, which are essential for sentiment analysis, customer-service chatbots, and anomaly detection. You will gain the practical knowledge needed to implement deep learning in your linguistic applications using Python's popular deep learning library, TensorFlow.

By the end of this book, you will be well versed in building deep learning-backed NLP applications and in overcoming NLP challenges with best practices developed by domain experts.

What you will learn

  • Implement semantic embedding of words to classify and find entities
  • Convert words to vectors by training, in order to perform arithmetic operations
  • Train a deep learning model to classify tweets and news articles
  • Implement a question-answering model with search and RNN models
  • Train models for various text classification datasets using CNNs
  • Implement WaveNet, a deep generative model, for producing a natural-sounding voice
  • Convert voice to text and text to voice
  • Train a model to convert speech to text using DeepSpeech

Who this book is for

Hands-On Natural Language Processing with Python is for you if you are a developer or a machine learning or NLP engineer who wants to build deep learning applications that leverage NLP techniques. This comprehensive guide is also useful for deep learning practitioners who want to extend their deep learning skills to building NLP applications. All you need are the basics of machine learning and Python to enjoy the book.
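To give a concrete flavor of the word-vector arithmetic mentioned under "What you will learn", here is a minimal sketch using the gensim library's Word2Vec class (one common implementation; the book itself builds its skip-gram model in TensorFlow). The toy corpus and all hyperparameter values below are illustrative assumptions, not code from the book.

    # Minimal Word2vec sketch (illustrative only, not taken from the book).
    # Trains a tiny skip-gram model and performs vector arithmetic on words.
    from gensim.models import Word2Vec

    # Toy corpus of pre-tokenized sentences; a real model needs far more text.
    sentences = [
        ["the", "king", "rules", "the", "kingdom"],
        ["the", "queen", "rules", "the", "kingdom"],
        ["a", "man", "walks", "in", "the", "city"],
        ["a", "woman", "walks", "in", "the", "city"],
    ]

    # sg=1 selects the skip-gram architecture; vector_size, window, and
    # min_count are illustrative settings for this toy example.
    model = Word2Vec(sentences, vector_size=16, window=2, min_count=1, sg=1)

    # The classic analogy: king - man + woman should land near queen.
    # On a corpus this small the result is not meaningful; the call only
    # demonstrates the arithmetic API.
    print(model.wv.most_similar(positive=["king", "woman"], negative=["man"], topn=1))

With a sufficiently large training corpus, the same most_similar call returns "queen" as the nearest neighbor, which is the behavior the bullet point above refers to.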
Table of Contents

Title Page

Copyright and Credits

Hands-On Natural Language Processing with Python

Packt Upsell

Why subscribe?

PacktPub.com

Foreword

Contributors

About the authors

About the reviewer

Packt is searching for authors like you

Preface

Who this book is for

What this book covers

To get the most out of this book

Download the example code files

Download the color images

Conventions used

Get in touch

Reviews

Getting Started

Basic concepts and terminologies in NLP

Text corpus or corpora

Paragraph

Sentences

Phrases and words

N-grams

Bag-of-words

Applications of NLP

Analyzing sentiment

Recognizing named entities

Linking entities

Translating text

Natural Language Inference

Semantic Role Labeling

Relation extraction

SQL query generation, or semantic parsing

Machine Comprehension

Textual Entailment

Coreference resolution

Searching

Question answering and chatbots

Converting text-to-voice

Converting voice-to-text

Speaker identification

Spoken dialog systems

Other applications

Summary

Text Classification and POS Tagging Using NLTK

Installing NLTK and its modules

Text preprocessing and exploratory analysis

Tokenization

Stemming

Removing stop words

Exploratory analysis of text

POS tagging

What is POS tagging?

Applications of POS tagging

Training a POS tagger

Training a sentiment classifier for movie reviews

Training a bag-of-words classifier

Summary

Deep Learning and TensorFlow

Deep learning

Perceptron

Activation functions

Sigmoid

Hyperbolic tangent

Rectified linear unit

Neural network

One-hot encoding

Softmax

Cross-entropy

Training neural networks

Backpropagation

Gradient descent

Stochastic gradient descent

Regularization techniques

Dropout

Batch normalization

L1 and L2 normalization

Convolutional Neural Network

Kernel

Max pooling

Recurrent neural network

Long Short-Term Memory

TensorFlow

General-Purpose Graphics Processing Unit

CUDA

cuDNN

Installation

Hello world!

Adding two numbers

TensorBoard

The Keras library

Summary

Semantic Embedding Using Shallow Models

Word vectors

The classical approach

Word2vec

The CBOW model

The skip-gram model

A comparison of skip-gram and CBOW model architectures

Building a skip-gram model

Visualization of word embeddings

From word to document embeddings

Sentence2vec

Doc2vec

Visualization of document embeddings

Summary

Text Classification Using LSTM

Data for text classification

Topic modeling

Topic modeling versus text classification

Deep learning meta architecture for text classification

Embedding layer

Deep representation

Fully connected part

Identifying spam in YouTube video comments using RNNs

Classifying news articles by topic using a CNN

Transfer learning using GloVe embeddings

Multi-label classification

Binary relevance

Deep learning for multi-label classification

Attention networks for document classification

Summary

Searching and Deduplicating Using CNNs

Data

Data description

Training the model

Encoding the text

Modeling with CNN

Training

Inference

Summary

Named Entity Recognition Using Character LSTM

NER with deep learning

Data

Model

Word embeddings

Walking through the code

Input

Word embedding

The effects of different pretrained word embeddings

Neural network architecture

Decoding predictions

The training step

Scope for improvement

Summary

Text Generation and Summarization Using GRUs

Generating text using RNNs

Generating Linux kernel code with a GRU

Text summarization

Extractive summarization

Summarization using gensim

Abstractive summarization

Encoder-decoder architecture

Encoder

Decoder

News summarization using GRU

Data preparation

Encoder network

Decoder network

Sequence to sequence

Building the graph

Training

Inference

TensorBoard visualization

State-of-the-art abstractive text summarization

Summary

Question-Answering and Chatbots Using Memory Networks

The Question-Answering task

Question-Answering datasets

Memory networks for Question-Answering

Memory network pipeline overview

Writing a memory network in TensorFlow

Class constructor

Input module

Question module

Memory module

Output module

Putting it together

Extending memory networks for dialog modeling

Dialog datasets

The bAbI dialog dataset

Raw data format

Writing a chatbot in TensorFlow

Loading dialog datasets in the QA format

Vectorizing the data

Wrapping the memory network model in a chatbot class

Class constructor

Building a vocabulary for word embedding lookup

Training the chatbot model

Evaluating the chatbot on the testing set

Interacting with the chatbot

Putting it all together

Example of an interactive conversation

Literature on and related to memory networks

Summary

Machine Translation Using the Attention-Based Model

Overview of machine translation

Statistical machine translation

English to French using NLTK SMT models

Neural machine translation

Encoder-decoder network

Encoder-decoder with attention

NMT for French to English using attention

Data preparation

Encoder network

Decoder network

Sequence-to-sequence model

Building the graph

Training

Inference

TensorBoard visualization

Summary

Speech Recognition Using DeepSpeech

Overview of speech recognition

Building an RNN model for speech recognition

Audio signal representation

LSTM model for spoken digit recognition

TensorBoard visualization

Speech to text using the DeepSpeech architecture

Overview of the DeepSpeech model

Speech recordings dataset

Preprocessing the audio data

Creating the model

TensorBoard visualization

State-of-the-art in speech recognition

Summary

Text-to-Speech Using Tacotron

Overview of text to speech

Naturalness versus intelligibility

How is the performance of a TTS system evaluated?

Traditional techniques – concatenative and parametric models

A few reminders on spectrograms and the mel scale

TTS in deep learning

WaveNet, in brief

Tacotron

The encoder

The attention-based decoder

The Griffin-Lim-based postprocessing module

Details of the architecture

Limitations

Implementation of Tacotron with Keras

The dataset

Data preparation

Preparation of text data

Preparation of audio data

Implementation of the architecture

Pre-net

Encoder and postprocessing CBHG

Attention RNN

Decoder RNN

The attention mechanism

Full architecture, with attention

Training and testing

Summary

Deploying Trained Models

Increasing performance

Quantizing the weights

MobileNets

TensorFlow Serving

Exporting the trained model

Serving the exported model

Deploying in the cloud

Amazon Web Services

Google Cloud Platform

Deploying on mobile devices

iPhone

Android

Summary

Other Books You May Enjoy

Leave a review - let other readers know what you think
