| Lecture | Presenter | Link | Date |
| --- | --- | --- | --- |
| Lecture 1 - Introduction and Word Vectors | 최지원 | Link | 2022.01.05 |
| Lecture 2 - Neural Classifiers | 최지원 | Link | 2022.01.05 |
| Lecture 3 - Backprop and Neural Networks | 김보민 | Link | 2022.01.12 |
| Lecture 4 - Dependency Parsing | 정소연 | Link | 2022.01.12 |
| Lecture 5 - Language Models and RNNs | 최재석 | Link | 2022.01.19 |
| Lecture 6 - Simple and LSTM RNNs | 신동수 | Link | 2022.01.19 |
| Lecture 7 - Translation, Seq2Seq, Attention | 하유진 | Link | 2022.01.26 |
| Lecture 8 - Final Projects; Practical Tips | | Skip | |
| Lecture 9 - Self-Attention and Transformers | 김준영 | Link | 2022.01.26 |
| Lecture 10 - Transformers and Pretraining | 최지원 | Link | 2022.02.02 |
| Lecture 11 - Question Answering | 장지원 | Link | 2022.02.02 |
| Lecture 12 - Natural Language Generation | 김준영 | Link | 2022.02.09 |
| Lecture 13 - Coreference Resolution | 신동수 | Link | 2022.02.09 |
| Lecture 14 - T5 and Large Language Models | 최재석 | Link | 2022.02.16 |
| Lecture 15 - Add Knowledge to Language Models | 정소연 | Link | 2022.02.16 |
| Lecture 16 - Social & Ethical Considerations | | Skip | |
| Lecture 17 - Model Analysis and Explanation | 하유진 | Link | 2022.02.23 |
| Lecture 18 - Future of NLP + Deep Learning | 최지원 | Link | 2022.02.23 |
| Lecture 19 - Low Resource Machine Translation | 김보민 | Link | 2022.03.02 |
| Lecture 20 - BERT and Other Pre-trained Language Models | 장지원 | Link | 2022.03.02 |