Neural Networks for NLP, CMU CS 11-747, Fall 2017
CMU Neural Nets for NLP 2017 (YouTube playlist)
[Lecture-1]
CMU Neural Nets for NLP 2017 (1): Class Introduction & Why Neural Nets?
https://www.youtube.com/watch?v=Sss2EA4hhBQ&list=PL8PYTP1V4I8ABXzdqtOpB_eqBlVAz_xPT&index=1&t=374s
[Lecture-4]
Why is word2vec So Fast?: Speed Tricks for Neural Nets
https://www.youtube.com/watch?v=9ERZsx__rBM&index=4&list=PL8PYTP1V4I8ABXzdqtOpB_eqBlVAz_xPT
Slides
Efficiency Slides
Sample Code
Efficiency Code Examples
Lecture Video
Efficiency Lecture Video
Reading Material
Reference:
Importance Sampling
(Bengio and Senécal, 2003)
Reference:
Noise Contrastive Estimation
(Mnih and Teh, 2012)
Highly Recommended Reading:
Notes on Noise Contrastive Estimation and Negative Sampling (Dyer 2014)
Reference:
Negative Sampling
(Goldberg and Levy, 2014)
Reference:
Mini-batching Sampling-based Softmax Approximations
(Zoph et al., 2015)
Reference:
Class-based Softmax
(Goodman 2001)
Reference:
Hierarchical Softmax
(Morin and Bengio 2005)
Reference:
Error Correcting Codes
(Dietterich and Bakiri 1995)
Reference:
Binary Code Prediction for Language
(Oda et al. 2017)
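The reading list above covers several sampling-based softmax approximations. As a rough illustration of the simplest of them, negative sampling (Goldberg and Levy, 2014), the sketch below computes the per-pair objective -log σ(w·c) - Σ_k log σ(-w·n_k). This is only a toy sketch with made-up function names and vectors, not code from the course:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def negative_sampling_loss(w_vec, c_vec, neg_vecs):
    """Negative-sampling loss for one (word, context) pair:
    -log sigma(w . c) - sum_k log sigma(-w . n_k).
    w_vec/c_vec are embedding vectors; neg_vecs are sampled
    "noise" context vectors (names are illustrative)."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    # pull the true context pair together ...
    loss = -math.log(sigmoid(dot(w_vec, c_vec)))
    # ... and push the sampled noise contexts apart
    for n in neg_vecs:
        loss -= math.log(sigmoid(-dot(w_vec, n)))
    return loss
```

The speed trick is that this replaces a normalization over the whole vocabulary with a handful of sampled dot products.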
[Lecture-3]
Distributional Semantics and Word Vectors
https://www.youtube.com/watch?v=xCAtxcc0KIE&feature=youtu.be
Content
Describing a word by the company that it keeps
Skip-grams and CBOW
Evaluating/Visualizing Word Vectors
Counting and predicting
Advanced Methods for Word Vectors
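As a minimal illustration of the CBOW model listed above (predict a center word from its averaged context embeddings), here is a toy forward pass. The embedding tables and function name are my own assumptions, not the lecture's code:

```python
import math

def cbow_scores(context_ids, emb_in, emb_out):
    """Toy CBOW forward pass: average the input embeddings of the
    context words, then softmax their dot products with every
    output embedding to get a distribution over the vocabulary."""
    dim = len(emb_in[0])
    h = [sum(emb_in[i][d] for i in context_ids) / len(context_ids)
         for d in range(dim)]
    scores = [sum(h[d] * row[d] for d in range(dim)) for row in emb_out]
    # numerically stable softmax over vocabulary scores
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]
```

Skip-gram inverts the direction: the center word's embedding scores each context word instead.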
Slides
Word Embedding Slides
Sample Code
Word Embedding Code Examples
Lecture Video
Word Embedding Lecture Video
Reading Material
Reference:
WordNet
Reference:
Linguistic Regularities in Continuous Representations
(Mikolov et al. 2013)
Required Reading (for quiz):
Goldberg Book Chapters 10-11
Reference:
t-SNE
(van der Maaten and Hinton 2008)
Reference:
Visualizing w/ PCA vs. t-SNE
(Derksen 2016)
Reference:
How to use t-SNE effectively
(Wattenberg et al. 2016)
Reference:
Evaluating Word Embeddings
(Schnabel et al. 2015)
Reference:
Morphology-based Embeddings
(Luong et al. 2013)
Reference:
Character-based Embeddings
(Ling et al. 2015)
Reference:
Subword-based Embeddings
(Bojanowski et al. 2017)
Reference:
Multi-prototype Embeddings
(Reisinger and Mooney 2010)
Reference:
Non-parametric Multi-prototype Embeddings
(Neelakantan et al. 2014)
Reference:
Cross-lingual Embeddings
(Faruqui et al. 2014)
Reference:
Retrofitting to Lexicons
(Faruqui et al. 2015)
Reference:
Sparse Word Embeddings
(Murphy et al. 2012)
Reference:
De-biasing Word Embeddings
(Bolukbasi et al. 2016)
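The "Linguistic Regularities in Continuous Representations" reading above (Mikolov et al. 2013) is the source of the well-known king - man + woman ≈ queen analogy test. A hedged sketch of that evaluation, with a toy hand-built embedding table (the vectors and function names are mine, purely for illustration):

```python
import math

def cosine(a, b):
    """Cosine similarity between two dense vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def analogy(emb, a, b, c):
    """Answer 'a is to b as c is to ?' by returning the word whose
    vector is closest (by cosine) to b - a + c, excluding a, b, c."""
    dim = len(emb[a])
    target = [emb[b][i] - emb[a][i] + emb[c][i] for i in range(dim)]
    cands = {w: cosine(v, target)
             for w, v in emb.items() if w not in (a, b, c)}
    return max(cands, key=cands.get)
```

With real embeddings the same loop runs over the full vocabulary; excluding the three query words is a standard detail of the evaluation.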
[Resource]
NLP
INF4820, Fall 2017
INF4820 - Algorithms for artificial intelligence and natural language processing, Fall 2017
Oxford Deep NLP 2017 course
CS224n: Natural Language Processing with Deep Learning
DeepNLP-models-Pytorch
Pytorch implementations of various Deep NLP models in cs-224n (Stanford Univ)
reading_comprehension-cs224n
Automatic Speech Recognition – An Overview
Lecture Collection | Convolutional Neural Networks for Visual Recognition (Spring 2017)
Neural Networks for NLP, CMU CS 11-747, Fall 2017
Noah Smith
CSEP 517: NLP (for professional M.S. students), taught spring 2017
CSEP 517, Spring 2017 Lecture Video
5 Best Deep Learning in Python videos for a Beginner
Deep Learning with Keras- Python
Deep Learning with Python
Deep Learning by Andrew Ng (Full course)
TensorFlow tutorial
PyTorch Zero to All
Machine Learning
Tom Mitchell and Maria-Florina Balcan
10-601, Spring 2015, Carnegie Mellon University