Full Stack Deep Learning - Spring 2021

We've updated and improved our materials and can't wait to share them with you!

Every Monday, we will post videos of our lectures and lab sessions. You can follow along on our Twitter or YouTube, or sign up via email below.

Synchronous Online Course

We also offered a paid option for those who wanted weekly assignments, a capstone project, Slack discussion, and a certificate of completion. This synchronous option is now full, but you can enter your email above to be the first to hear about future offerings.

If you are enrolled, please see the course details.

Week 1: Fundamentals

The week of February 1, we do a blitz review of the fundamentals of deep learning, and introduce the codebase we will be working on in labs for the remainder of the class.


How the backpropagation algorithm works

Week 2: CNNs

The week of February 8, we cover CNNs and Computer Vision Applications, and introduce a CNN in lab.


A brief introduction to Neural Style Transfer

Improving the way neural networks learn

Week 3: RNNs

The week of February 15, we cover RNNs and applications in Natural Language Processing, and start doing sequence processing in lab.


The Unreasonable Effectiveness of Recurrent Neural Networks

Attention Craving RNNS: Building Up To Transformer Networks

Week 4: Transformers

The week of February 22, we talk about the successes of transfer learning and the Transformer architecture, and start using it in lab.


Transformers from Scratch

Week 5: ML Projects

The week of March 1, our synchronous online course begins with the first "Full Stack" lecture: Setting up ML Projects.


Rules of Machine Learning

ML Yearning (and subscribe to Andrew Ng's newsletter)

Those in the synchronous online course will have their first weekly assignment: Assignment 1, available on Gradescope.

Week 6: Infra & Tooling

The week of March 7, we tour the landscape of infrastructure and tooling for deep learning.


Machine Learning: The High-Interest Credit Card of Technical Debt

Those in the synchronous online course will work on Assignment 2.

Week 7: Troubleshooting

The week of March 14, we talk about how to best troubleshoot training. In lab, we learn to manage experiments.


Why is machine learning hard?

Those in the synchronous online course will work on Assignment 3.

Week 8: Data

The week of March 21, we talk about Data Management, and label some data in lab.


Emerging architectures for modern data infrastructure

Those in the synchronous online course will work on Assignment 4.

Week 9: Ethics

The week of March 28, we discuss ethical considerations. In lab, we move from lines to paragraphs.

Those in the synchronous online course will have to submit their project proposals.

Week 10: Testing

The week of April 5, we talk about Testing and Explainability, and set up Continuous Integration in lab.

Those in the synchronous online course will work on their projects.

✨Week 11: Deployment✨

The week of April 12, we cover Deployment and Monitoring, and deploy our model to AWS Lambda in lab.

Those in the synchronous online course will work on their projects.

Week 12: Research

The week of April 19, we talk about research directions, and set up robust monitoring for our model.

  • Lecture 12: Research Directions
  • Lab 10: Monitoring

Those in the synchronous online course will work on their projects.

Week 13: Teams

The week of April 26, we discuss ML roles and team structures, as well as big companies vs startups.

  • Lecture 13: ML Teams & Startups

Those in the synchronous online course will submit 5-minute videos of their projects and associated write-ups.

Week 14: Projects

The week of May 3, we watch the best course project videos together, and give out awards.

There are rumors of a fun conference in the air, too...

Other Resources

fast.ai is a great free two-course sequence aimed at first getting hackers to train state-of-the-art models as quickly as possible, and only afterward delving into how things work under the hood. Highly recommended for anyone.

Dive Into Deep Learning is a great free textbook with Jupyter notebooks for every part of deep learning.

NYU’s Deep Learning course has excellent PyTorch breakdowns of everything important going on in deep learning.

Stanford’s ML Systems Design course has lectures that parallel those in this course.

The Batch by Andrew Ng is a great weekly update on progress in the deep learning world.

/r/MachineLearning/ is the best community for staying up to date with the latest developments.