Stanford CS224n: the latest 2019 Stanford CS224n learning materials, with personal study notes, commentary, and assignment solutions. http://web.stanford.edu/class/cs224n/

cs224n_learning_note

I will keep my learning notes and related materials here. The schedule follows Stanford's own timetable, starting on June 19, 2019.

This is to say I did not do A3. I just could not get interested in that part; I struggled with it for two weeks, and every time I opened it I gave up. Then I remembered that I am not actually enrolled in the course, so I dropped it and let it go. I am moving on to A4, and A5 will depend on my mood. My main goal is to finish the final project.

http://web.stanford.edu/class/cs224n/

The timetable is in Schedule.md

Schedule

Lecture slides will be posted here shortly before each lecture. If you wish to view slides further in advance, refer to last year's slides, which are mostly similar.

This is the timetable from the Stanford website. I only use it to manage my study schedule and avoid procrastination.

Usually, I study in the evening to ensure I have enough time.

This schedule is subject to change.

All the materials can be found here.

| Date | Description | Course Materials | Events | Deadlines | Note |
| --- | --- | --- | --- | --- | --- |
| Tue June 19 | Introduction and Word Vectors [slides] [video] [notes] | Gensim word vectors example: [code] [[preview](http://web.stanford.edu/class/cs224n/materials/Gensim%20word%20vector%20visualization.html)] (see the word-vector sketch after this table). Suggested Readings: Word2Vec Tutorial - The Skip-Gram Model; Efficient Estimation of Word Representations in Vector Space (original word2vec paper); Distributed Representations of Words and Phrases and their Compositionality (negative sampling paper) | Assignment 1 out [code] [preview] | | done |
| Thu June 21 | Word Vectors 2 and Word Senses [slides] [video] [notes] | Suggested Readings: GloVe: Global Vectors for Word Representation (original GloVe paper); Improving Distributional Similarity with Lessons Learned from Word Embeddings; Evaluation methods for unsupervised word embeddings. Additional Readings: A Latent Variable Model Approach to PMI-based Word Embeddings; Linear Algebraic Structure of Word Senses, with Applications to Polysemy; On the Dimensionality of Word Embedding | | | 1 done |
| Fri June 22 | Python review session [slides] | 1:30 - 2:50pm, Skilling Auditorium [[map](https://maps.google.com/maps?hl=en&q=Skilling%20Auditorium%2C%20494%20Lomita%20Mall%2C%20Stanford%2C%20CA%2094305%2C%20USA)] | | | 1 Done |
| Tue June 26 | Word Window Classification, Neural Networks, and Matrix Calculus [slides] [video] [matrix calculus notes] [notes (lectures 3 and 4)] | Suggested Readings: CS231n notes on backprop; Review of differential calculus. Additional Readings: Natural Language Processing (Almost) from Scratch | Assignment 2 out [code] [handout] | Assignment 1 due | done |
| Thu June 28 | Backpropagation and Computation Graphs [slides] [video] [notes (lectures 3 and 4)] | Suggested Readings: CS231n notes on network architectures; Learning Representations by Backpropagating Errors; Derivatives, Backpropagation, and Vectorization; Yes you should understand backprop | | | done |
| Tue July 2 | Linguistic Structure: Dependency Parsing [slides] [scrawled-on slides] [video] [notes] | Suggested Readings: Incrementality in Deterministic Dependency Parsing; A Fast and Accurate Dependency Parser using Neural Networks; Dependency Parsing; Globally Normalized Transition-Based Neural Networks; Universal Stanford Dependencies: A cross-linguistic typology; Universal Dependencies website | Assignment 3 out [code] [handout] | Assignment 2 due | done |
| Thu July 4 | The probability of a sentence? Recurrent Neural Networks and Language Models [slides] [video] [notes (lectures 6 and 7)] | Suggested Readings: N-gram Language Models (textbook chapter); The Unreasonable Effectiveness of Recurrent Neural Networks (blog post overview); Sequence Modeling: Recurrent and Recursive Neural Nets (Sections 10.1 and 10.2); On Chomsky and the Two Cultures of Statistical Learning | | | Added a mid-term summary |
| Tue July 9 | Vanishing Gradients and Fancy RNNs [slides] [video] [notes (lectures 6 and 7)] | Suggested Readings: Sequence Modeling: Recurrent and Recursive Neural Nets (Sections 10.3, 10.5, 10.7-10.12); Learning long-term dependencies with gradient descent is difficult (one of the original vanishing gradient papers); On the difficulty of training Recurrent Neural Networks (proof of vanishing gradient problem); Vanishing Gradients Jupyter Notebook (demo for feedforward networks); Understanding LSTM Networks (blog post overview) | Assignment 4 out [code] [handout] [Azure Guide] [Practical Guide to VMs] | Assignment 3 due | |
| Thu July 11 | Machine Translation, Seq2Seq and Attention [slides] [video] [notes] | Suggested Readings: Statistical Machine Translation slides, CS224n 2015 (lectures 2/3/4); Statistical Machine Translation (book by Philipp Koehn); BLEU (original paper); Sequence to Sequence Learning with Neural Networks (original seq2seq NMT paper); Sequence Transduction with Recurrent Neural Networks (early seq2seq speech recognition paper); Neural Machine Translation by Jointly Learning to Align and Translate (original seq2seq+attention paper); Attention and Augmented Recurrent Neural Networks (blog post overview); Massive Exploration of Neural Machine Translation Architectures (practical advice for hyperparameter choices) | | | |
| Tue July 16 | Practical Tips for Final Projects [slides] [video] [notes] | Suggested Readings: Practical Methodology (Deep Learning book chapter) | | | |
| Thu July 18 | Question Answering and the Default Final Project [slides] [video] [notes] | | Project Proposal out [instructions]; Default Final Project out [handout] [code] | Assignment 4 due | |
| Tue July 23 | ConvNets for NLP [slides] [video] [notes] | Suggested Readings: Convolutional Neural Networks for Sentence Classification; A Convolutional Neural Network for Modelling Sentences | | | |
| Thu July 25 | Information from parts of words: Subword Models [slides] [video] | | Assignment 5 out [original code (requires Stanford login) / public version] [handout] | Project Proposal due | |
| Tue July 30 | Modeling contexts of use: Contextual Representations and Pretraining [slides] [video] | Suggested Readings: Smith, Noah A. Contextual Word Representations: A Contextual Introduction. (Published just in time for this lecture!) | | | |
| Thu Aug 1 | Transformers and Self-Attention For Generative Models (guest lecture by Ashish Vaswani and Anna Huang) [slides] [video] | Suggested Readings: Attention is all you need; Image Transformer; Music Transformer: Generating music with long-term structure | | | |
| Fri Aug 2 | | | Project Milestone out [instructions] | Assignment 5 due | |
| Tue Aug 6 | Natural Language Generation [slides] [video] | | | | |
| Thu Aug 8 | Reference in Language and Coreference Resolution [slides] [video] | | | | |
| Tue Aug 13 | Multitask Learning: A general model for NLP? (guest lecture by Richard Socher) [slides] [video] | | | Project Milestone due | |
| Thu Aug 15 | Constituency Parsing and Tree Recursive Neural Networks [slides] [video] [notes] | Suggested Readings: Parsing with Compositional Vector Grammars; Constituency Parsing with a Self-Attentive Encoder | | | |
| Tue Aug 20 | Safety, Bias, and Fairness (guest lecture by Margaret Mitchell) [slides] [video] | | | | |
| Thu Aug 22 | Future of NLP + Deep Learning [slides] [video] | | | | |
| Sun Aug 24 | | | | Final Project Report due [instructions] | |
| Wed Aug 28 | Final project poster session [details], 5:15 - 8:30pm, McCaw Hall at the Alumni Center [map] | | | Project Poster/Video due [instructions] | |
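
The first lecture links the course's Gensim word-vector demo, which needs the course materials. As a stand-in, here is a minimal sketch of the same idea using Gensim's public pretrained vectors; the model name `glove-wiki-gigaword-100` and the example words are my own choices, not part of the course code.

```python
# Minimal sketch of the lecture-1 word-vector demo idea (not the official course code).
# Assumption: the pretrained GloVe model "glove-wiki-gigaword-100" from gensim-data.
import gensim.downloader as api

# Download (and cache) the 100-dimensional GloVe vectors on first use.
wv = api.load("glove-wiki-gigaword-100")

# Nearest neighbours by cosine similarity.
print(wv.most_similar("king", topn=5))

# The classic analogy: king - man + woman ≈ queen.
print(wv.most_similar(positive=["king", "woman"], negative=["man"], topn=1))

# Direct cosine similarity between two words.
print(wv.similarity("stanford", "university"))
```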

References:

  1. https://github.com/ankit-ai/cs224n-natural-language-processing-winter2019
  2. https://github.com/ZacBi/CS224n-2019-solutions
  3. https://blog.csdn.net/bqw18744018044/article/details/83120425
  4. https://github.com/Observerspy/CS224n
