Lecture 4: Transformers
Video
Slides
Notes
Lecture by Sergey Karayev.
In this video, you will learn about the origin of transfer learning in computer vision, its application to NLP in the form of embeddings, NLP's ImageNet moment, and the Transformer model family.
- 00:00 - Introduction
- 00:42 - Transfer Learning in Computer Vision
- 04:00 - Embeddings and Language Models
- 10:09 - NLP's ImageNet moment: ELMo and ULMFiT on datasets like SQuAD, SNLI, and GLUE
- 16:49 - Rise of Transformers
- 18:20 - Attention in Detail: (Masked) Self-Attention, Positional Encoding, and Layer Normalization
- 27:33 - Transformer Variants: BERT, GPT/GPT-2/GPT-3, DistilBERT, T5, etc.
- 36:20 - GPT-3 Demos
- 42:53 - Future Directions
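The masked self-attention covered at 18:20 can be sketched in a few lines of numpy. This is a minimal single-head illustration, not the lecture's code: the matrices `Wq`, `Wk`, `Wv` and the shapes are assumptions chosen for the example, and the causal mask keeps each position from attending to later positions, as in GPT-style decoders.

```python
import numpy as np

def masked_self_attention(X, Wq, Wk, Wv):
    """Single-head masked (causal) self-attention.

    X: (seq_len, d_model) token representations.
    Wq, Wk, Wv: (d_model, d_k) projection matrices (illustrative, untrained).
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)            # (seq_len, seq_len) similarities
    # Causal mask: position i may only attend to positions <= i.
    mask = np.triu(np.ones_like(scores, dtype=bool), k=1)
    scores = np.where(mask, -np.inf, scores)
    # Row-wise softmax (masked entries get weight 0).
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                         # weighted mix of value vectors

rng = np.random.default_rng(0)
seq_len, d_model, d_k = 4, 8, 8
X = rng.normal(size=(seq_len, d_model))
Wq, Wk, Wv = (rng.normal(size=(d_model, d_k)) for _ in range(3))
out = masked_self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (4, 8)
```

Because of the mask, the first output row is exactly the first value vector (it can attend only to itself); a full Transformer block would add multiple heads, positional encodings, residual connections, and layer normalization around this core.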