Natural Language Processing with Deep Learning
Stanford, Prof. Chris Manning
Updated on 02 Feb 2019
Rating: 4.1 (11 reviews)
Lecture 4 introduces single and multilayer neural networks, and how they can be used for classification purposes.
Key phrases: neural networks, forward computation, backpropagation, neuron units, max-margin loss, gradient checks, Xavier parameter initialization, learning rates, Adagrad.
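To make those key phrases concrete, here is a minimal NumPy sketch (illustrative only, not taken from the lecture materials) that ties several of them together: Xavier initialization, a forward pass through a single hidden layer, backpropagation of the softmax cross-entropy gradient for the output weights, and a central-difference gradient check. All function and variable names are invented for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

def xavier(n_in, n_out):
    # Xavier/Glorot uniform initialization: bound scaled by fan-in + fan-out
    bound = np.sqrt(6.0 / (n_in + n_out))
    return rng.uniform(-bound, bound, size=(n_in, n_out))

def forward(x, W1, b1, W2, b2):
    # forward computation: one tanh hidden layer, linear output scores
    h = np.tanh(x @ W1 + b1)
    return h @ W2 + b2, h

def loss_fn(x, y, W1, b1, W2, b2):
    # softmax cross-entropy loss for a single example with label y
    scores, _ = forward(x, W1, b1, W2, b2)
    scores = scores - scores.max()          # numerical stability
    p = np.exp(scores) / np.exp(scores).sum()
    return -np.log(p[y])

# tiny network: 3 inputs, 4 hidden units, 2 classes
W1, b1 = xavier(3, 4), np.zeros(4)
W2, b2 = xavier(4, 2), np.zeros(2)
x, y = rng.normal(size=3), 1

# backpropagation: gradient of the loss w.r.t. the output weights W2
scores, h = forward(x, W1, b1, W2, b2)
p = np.exp(scores - scores.max())
p /= p.sum()
dscores = p.copy()
dscores[y] -= 1.0                           # d(loss)/d(scores) = p - onehot(y)
dW2 = np.outer(h, dscores)

# gradient check: compare against central differences
eps = 1e-5
num_dW2 = np.zeros_like(W2)
for i in range(W2.shape[0]):
    for j in range(W2.shape[1]):
        W2[i, j] += eps
        plus = loss_fn(x, y, W1, b1, W2, b2)
        W2[i, j] -= 2 * eps
        minus = loss_fn(x, y, W1, b1, W2, b2)
        W2[i, j] += eps                     # restore the parameter
        num_dW2[i, j] = (plus - minus) / (2 * eps)

print(np.max(np.abs(dW2 - num_dW2)))        # should be very small
```

A gradient check like this is the standard way to verify a hand-derived backward pass: if the maximum absolute difference between the analytic and numerical gradients is not tiny, the derivation has a bug.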
-------------------------------------------------------------------------------
Natural Language Processing with Deep Learning
Instructors:
- Chris Manning
- Richard Socher
Natural language processing (NLP) deals with the key artificial intelligence technology of understanding complex human language communication. This lecture series provides a thorough introduction to the cutting-edge research in deep learning applied to NLP, an approach that has recently obtained very high performance across many different NLP tasks including question answering and machine translation. It emphasizes how to implement, train, debug, visualize, and design neural network models, covering the main technologies of word vectors, feed-forward models, recurrent neural networks, recursive neural networks, convolutional neural networks, and recent models involving a memory component.
For additional learning opportunities please visit:
http://stanfordonline.stanford.edu/
Sam
Sep 12, 2018
Excellent course; it helped me understand topics that I couldn't while attending my college.
Dembe
March 29, 2019
Great course. Thank you very much.