TensorFlow Natural Language Processing (ISBN 9787564182892)
Preface
Chapter 1: Introduction to Natural Language Processing
What is Natural Language Processing?
Tasks of Natural Language Processing
The traditional approach to Natural Language Processing
Understanding the traditional approach
Example - generating football game summaries
Drawbacks of the traditional approach
The deep learning approach to Natural Language Processing
History of deep learning
The current state of deep learning and NLP
Understanding a simple deep model - a Fully-Connected Neural Network
The roadmap - beyond this chapter
Introduction to the technical tools
Description of the tools
Installing Python and scikit-learn
Installing Jupyter Notebook
Installing TensorFlow
Summary
Chapter 2: Understanding TensorFlow
What is TensorFlow?
Getting started with TensorFlow
TensorFlow client in detail
TensorFlow architecture - what happens when you execute the client?
Cafe Le TensorFlow - understanding TensorFlow with an analogy
Inputs, variables, outputs, and operations
Defining inputs in TensorFlow
Feeding data with Python code
Preloading and storing data as tensors
Building an input pipeline
Defining variables in TensorFlow
Defining TensorFlow outputs
Defining TensorFlow operations
Comparison operations
Mathematical operations
Scatter and gather operations
Neural network-related operations
Reusing variables with scoping
Implementing our first neural network
Preparing the data
Defining the TensorFlow graph
Running the neural network
Summary
Chapter 3: Word2vec - Learning Word Embeddings
What is a word representation or meaning?
Classical approaches to learning word representation
WordNet - using an external lexical knowledge base for learning word representations
Tour of WordNet
Problems with WordNet
One-hot encoded representation
The TF-IDF method
Co-occurrence matrix
Word2vec - a neural network-based approach to learning word representation
Exercise: is queen = king - he + she?
Designing a loss function for learning word embeddings
The skip-gram algorithm
From raw text to structured data
Learning the word embeddings with a neural network
Formulating a practical loss function
Efficiently approximating the loss function
Implementing skip-gram with TensorFlow
The Continuous Bag-of-Words algorithm
Implementing CBOW in TensorFlow
Summary
Chapter 4: Advanced Word2vec
The original skip-gram algorithm
Implementing the original skip-gram algorithm
……
Chapter 5: Sentence Classification with Convolutional Neural Networks
Chapter 6: Recurrent Neural Networks
Chapter 7: Long Short-Term Memory Networks
Chapter 8: Applications of LSTM - Generating Text
Chapter 9: Applications of LSTM - Image Caption Generation
Chapter 10: Sequence-to-Sequence Learning - Neural Machine Translation
Chapter 11: Current Trends and the Future of Natural Language Processing
Appendix: Mathematical Foundations and Advanced TensorFlow
Other Books You May Enjoy
Index