11-785 Introduction to Deep Learning

Bulletin and Active Deadlines

This piece is performed by the Chinese Music Institute at Peking University (PKU), together with PKU's Chinese orchestra. It is an adaptation of Beethoven's Serenade in D major, Op. 25, 1. Entrata (Allegro), for Chinese transverse flute (Dizi), clarinet and flute.
Fall 2020 Project Gallery

The Course

“Deep Learning” systems, typified by deep neural networks, are increasingly taking over all AI tasks, ranging from language understanding, speech and image recognition, to machine translation, planning, and even game playing and autonomous driving. As a result, expertise in deep learning is fast changing from an esoteric desirable to a mandatory prerequisite in many advanced academic settings, and a large advantage in the industrial job market.

In this course we will learn about the basics of deep neural networks, and their applications to various AI tasks. By the end of the course, it is expected that students will have significant familiarity with the subject, and be able to apply Deep Learning to a variety of tasks. They will also be positioned to understand much of the current literature on the topic and extend their knowledge through further study.

If you are only interested in the lectures, you can watch them on the YouTube channel.

Course description from a student's point of view

The course is well rounded in terms of concepts. It helps us understand the fundamentals of Deep Learning. The course starts off gradually with MLPs and progresses to more complicated concepts such as attention and sequence-to-sequence models. We get complete hands-on experience with PyTorch, which is very important for implementing Deep Learning models. As a student, you will learn the tools required for building Deep Learning models. The homeworks usually have two components: Autolab and Kaggle. The Kaggle components allow us to explore multiple architectures and understand how to fine-tune and continuously improve models. The tasks for all the homeworks were similar, and it was interesting to learn how the same task can be solved using multiple Deep Learning approaches. Overall, by the end of this course you will be confident enough to build and tune Deep Learning models.

Prerequisites

  1. We will be using NumPy and PyTorch in this class, so you will need to be able to program in Python 3 (a quick self-check is sketched after this list).
  2. You will need familiarity with basic calculus (differentiation, chain rule), linear algebra, and basic probability.
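
As a quick self-check (a hypothetical snippet, not course material), you should be comfortable reading and running something like the following, which exercises NumPy array math and PyTorch autograd; if it is unfamiliar, start with recitations 0A-0C:

    # Hypothetical warm-up: NumPy linear algebra and PyTorch autograd.
    import numpy as np
    import torch

    # Linear algebra: a matrix-vector product.
    W = np.array([[1.0, 2.0], [3.0, 4.0]])
    x = np.array([0.5, -1.0])
    print(W @ x)                 # -> [-1.5 -2.5]

    # Calculus: d(t^2)/dt at t = 3 is 6 by the chain rule; PyTorch's
    # autograd computes the same derivative automatically.
    t = torch.tensor(3.0, requires_grad=True)
    (t ** 2).backward()
    print(t.grad)                # -> tensor(6.)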

Units

Courses 11-785, 18-786, and 11-685 are equivalent 12-unit graduate courses, and have a final project. Course 11-485 is the undergraduate version worth 9 units, the only difference being that there is no final project.

Acknowledgments

Your Supporters

Instructors:

TAs:

Pittsburgh Schedule (Eastern Time)

Lecture: Monday and Wednesday, 8:00 a.m. - 9:20 a.m.

Recitation: Friday, 8:00 a.m. - 9:20 a.m.

Office hours:

We will be using OHQueue and Zoom links listed on Piazza to manage office hours. The tentative schedule is as follows:

Day Time TA
Sunday 3:00 - 4:00 pm Anxiang, Kushal, Aiswarya
Sunday 6:00 - 7:00 pm Jingwei, Sam
Monday 10:00 - 11:00 am Reshmi, Yizhuo
Monday 11:00 am - 12:00 pm Yizhuo, Reshmi, Jiachen
Monday 7:00 - 8:00 pm Anti
Tuesday 5:00 - 6:00 pm Kushal, Miya
Wednesday 9:20 - 11:20 am Tony, Akshat
Wednesday 12:00 - 2:00 pm Abrham
Wednesday 4:00 - 5:00 pm David, Faris
Wednesday 5:00 - 6:00 pm David, Anxiang
Wednesday 6:00 - 7:00 pm David
Thursday 3:00 - 4:00 pm Sean
Thursday 4:00 - 5:00 pm Sean, Anti
Thursday 7:00 - 8:00 pm Jiachen, Miya
Friday 12:00 - 2:00 pm Abrham
Friday 3:00 - 5:00 pm Mansi, Jacob, Bharat
Saturday 3:00 - 4:00 pm Faris, Aiswarya
Saturday 6:00 - 7:00 pm Jingwei

Course Work

Policy
Breakdown
Grading will be based on weekly quizzes (24%), homeworks (51%) and a course project (25%); these weights are collected into a short sketch at the end of this policy section.
Quizzes
There will be weekly quizzes.
  • Quiz 0 is mandatory; we will retain your best 12 out of the remaining 14 quizzes.
  • Quizzes will generally (but not always) be released on Friday and due 48 hours later.
  • Quizzes are scored by the number of correct answers.
  • Quizzes will be worth 24% of your overall score (one reading of these rules is sketched below).
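
A minimal sketch of how these rules might combine, with made-up scores (the exact averaging is set by the course staff, not by this example):

    # Illustrative only: Quiz 0 always counts, plus the best 12 of the
    # remaining 14 weekly quizzes (all scores below are made up).
    quiz0 = 0.90
    weekly = [0.70, 0.85, 0.60, 1.00, 0.75, 0.95, 0.80, 0.90,
              0.65, 0.70, 0.88, 0.92, 0.55, 0.82]     # 14 weekly quizzes

    best_12 = sorted(weekly, reverse=True)[:12]
    quiz_avg = (quiz0 + sum(best_12)) / 13            # 13 quizzes count
    quiz_component = 24.0 * quiz_avg                  # 24% of overall score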
Assignments
There will be five assignments in all. Assignments will include Autolab components, where you must complete designated tasks, and a Kaggle component, where you compete with your colleagues.
  • Autolab components are scored according to the number of correctly completed parts.
  • We will post performance cutoffs for A (100%), B (80%), C (60%), D (40%) and F (0%) for Kaggle competitions. Scores will be interpolated linearly between these cutoffs.
  • Assignments will have an “early submission deadline”, an “on-time submission deadline” and a “late submission deadline.”
    • Early submission deadline: You are required to make at least one submission to Kaggle by this deadline. People who miss this deadline will automatically lose 10% of any subsequent marks they earn on the homework. This is intended to encourage students to begin working on their assignments early.
    • On-time deadline: People who submit by this deadline are eligible for up to five bonus points. These points will be computed by interpolation between the A cutoff and the highest performance obtained for the HW. The highest performance will get 105.
    • Late deadline: People who miss the on-time deadline can still submit until the late deadline. A 10% penalty is applied to your final score for submitting late.
    • Slack days: Everyone gets up to 7 slack days, which may be used on homework P2s only. Once you use up your slack days, you fall into the late-submission category by default. Slack days are counted across all parts of all homeworks, except HW0, to which no slack applies.
    • Kaggle scoring: We will use max(best on-time score, best slack-day score, 0.9 × best late-submission score) as your final score for the HW. If this happens to be a slack-day submission, the slack days corresponding to the selected submission will be counted (see the scoring sketch after this list).
  • Assignments carry 51% of your total score. HW0 is worth 1%, while each of the subsequent four is worth 12.5%.
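
Under stated assumptions, the Kaggle scoring rules above can be sketched as follows: the cutoffs are the illustrative HW1p2 values posted later on this page, NumPy's interp performs the linear interpolation, and the final-score rule is the max(...) expression from the Kaggle-scoring bullet (the up-to-105 on-time bonus is omitted for brevity):

    # Sketch of Kaggle scoring: linear interpolation between published
    # cutoffs, then a max over submission categories with a 10% late penalty.
    import numpy as np

    # F/D/C/B/A cutoffs -> 0/40/60/80/100 points (HW1p2's published values).
    perf_cutoffs = [0.00, 0.30, 0.53, 0.66, 0.72]
    points       = [0.0, 40.0, 60.0, 80.0, 100.0]

    def kaggle_points(performance):
        # np.interp clamps at the ends, so performance above the A cutoff
        # stays at 100 in this sketch.
        return float(np.interp(performance, perf_cutoffs, points))

    best_on_time = kaggle_points(0.69)   # best on-time submission
    best_slack   = kaggle_points(0.70)   # best submission using slack days
    best_late    = kaggle_points(0.74)   # best late submission
    final_score  = max(best_on_time, best_slack, 0.9 * best_late)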
Project
All students taking a graduate version of the course are required to do a course project. The project is worth 25% of your grade. These points are distributed as follows: 10% - Proposal; 15% - Midterm Report; 30% - Project Video; 5% - Responding to comments on Piazza; 40% - Paper peer review.
Final grade
The end-of-term grade is curved. Your overall grade will depend on your performance relative to your classmates.
Pass/Fail
Students registered for pass/fail must complete all quizzes, HWs and, if they are in the graduate course, the project. A grade equivalent to B- is required to pass the course.
Auditing
Auditors are not required to complete the course project, but must complete all quizzes and homeworks. We encourage doing a course project regardless.
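
Collecting the weights above into one place (a sketch; the function and variable names are mine, and all inputs are fractions in [0, 1]):

    # Sketch of the overall grade: quizzes 24%, homeworks 51% (HW0 at 1%
    # plus four HWs at 12.5% each), project 25%, where the project splits
    # 10/15/30/5/40 across proposal, midterm report, video, Piazza
    # responses, and paper peer reviews.
    def overall_score(quiz_avg, hw0, hws, proposal, midterm, video,
                      piazza, reviews):
        assert len(hws) == 4                  # HW1-HW4
        hw_part = 1.0 * hw0 + 12.5 * sum(hws)
        project = (0.10 * proposal + 0.15 * midterm + 0.30 * video
                   + 0.05 * piazza + 0.40 * reviews)
        return 24.0 * quiz_avg + hw_part + 25.0 * project   # out of 100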
End Policy

Study groups

This semester we will be implementing study groups. It is highly recommended that you join a study group; see the forms on the bulletin.

Piazza: Discussion Board

Piazza is what we use for discussions. You should be automatically signed up if you're enrolled at the start of the semester. If not, please sign up. Also, please follow the Piazza Etiquette when you use Piazza.

AutoLab: Software Engineering

AutoLab is what we use to test your understanding of low-level concepts, such as engineering your own libraries, implementing important algorithms, and developing optimization methods from scratch.
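
For a flavor of what “from scratch” means here, consider this toy example of my own (not an actual assignment): an affine layer's forward and backward passes in NumPy, with the analytic gradient verified against a finite-difference check.

    # Toy from-scratch exercise: affine layer forward/backward in NumPy,
    # plus a numerical gradient check (illustrative, not an assignment).
    import numpy as np

    def affine_forward(x, W, b):
        return x @ W + b

    def affine_backward(x, W, dout):
        # Gradients w.r.t. x, W and b, given dout = dLoss/dOutput.
        return dout @ W.T, x.T @ dout, dout.sum(axis=0)

    rng = np.random.default_rng(0)
    x = rng.normal(size=(4, 3))
    W = rng.normal(size=(3, 2))
    b = rng.normal(size=2)
    dout = np.ones((4, 2))               # pretend the loss is sum(output)

    dx, dW, db = affine_backward(x, W, dout)

    # Finite-difference check of dW[0, 0].
    eps = 1e-6
    E = np.zeros_like(W); E[0, 0] = 1.0
    num = (affine_forward(x, W + eps * E, b).sum()
           - affine_forward(x, W - eps * E, b).sum()) / (2 * eps)
    print(np.isclose(num, dW[0, 0]))     # -> True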

Kaggle: Data Science

Kaggle is where we test your understanding of, and ability to extend, the neural network architectures discussed in lecture. Like AutoLab, Kaggle shows scores, so don't feel intimidated; we're here to help. We work on hot AI topics, like speech recognition, face recognition, and neural machine translation.

Media Services/YouTube: Lecture and Recitation Recordings

CMU students who are not in the live lectures should watch the uploaded lectures at Media Services in order to get attendance credit. Links to individual videos will be posted as they are uploaded.

YouTube is where non-CMU folks can view all lecture and recitation recordings. Videos marked “Old” are not current, so please check the video title.

Books and Other Resources

The course will not follow a specific book, but will draw from a number of sources. We list relevant books at the end of this page. We will also put up links to relevant reading material for each class. Students are expected to familiarize themselves with the material before the class. The readings will sometimes be arcane and difficult to understand; if so, do not worry, we will present simpler explanations in class.

You can also find a nice catalog of models that are current in the literature here. We expect that you will be in a position to interpret, if not fully understand, many of the architectures on the wiki and in the catalog by the end of the course.

Academic Integrity

You are expected to comply with the University Policy on Academic Integrity and Plagiarism.
  • You are allowed to talk with and work with other students on homework assignments.
  • You can share ideas but not code. You should submit your own code.
Your course instructor reserves the right to determine an appropriate penalty for any violation of academic integrity that occurs. Violations of the university policy can result in severe penalties, including failing this course and possible expulsion from Carnegie Mellon University. If you have any questions about this policy or any work you are doing in the course, please feel free to contact your instructor for help.

Class Notes

A book containing class notes is being developed in tandem with this course; check it out.

Tentative Schedule of Lectures

Lecture Date Topics Slides and Video Additional Materials Quiz
0 -
  • Course Logistics
  • Learning Objectives
  • Grading
  • Deadlines
Slides (*.pdf)
Video (YT)
Quiz 0
1 Monday
August 31
  • Learning Objectives
  • History and cognitive basis of neural computation
  • Connectionist Machines
  • McCulloch and Pitts model
  • Hebb’s learning rule
  • Rosenblatt’s perceptron
  • Multilayer Perceptrons
Slides (*.pdf)
Video (YT)
Video (MT)
Chat (*.txt)
The New Connectionism (1988)
On Alan Turing's Anticipation of Connectionism
Quiz 1
2 Wednesday
September 2
  • The neural net as a universal approximator
Slides (*.pdf)
Video (YT)
Video (MT)
Chat (*.txt)
Hornik et al. (1989)
Shannon (1949)
On the Bias-Variance Tradeoff
Monday
September 7
  • Labor Day, no class
Quiz 2
3 Wednesday
September 9
  • Training a neural network
  • Perceptron learning rule
  • Empirical Risk Minimization
  • Optimization by gradient descent
Slides (*.pdf)
Video (YT)
Video (MT)
Chat (*.txt)
Widrow and Lehr (1992)
Convergence of perceptron algorithm
4 Monday
September 14
  • Back propagation
  • Calculus of back propagation
Slides (*.pdf)
Video (YT)
Video (MT)
Chat (*.txt)
Werbos (1990)
Rumelhart, Hinton and Williams (1986)
Quiz 3
5 Wednesday
September 16
  • Back propagation continued
  • Calculus of back propagation continued
Slides (*.pdf)
Video1 (YT)
Video1 (MT)
Video2 (YT)
Video2 (MT)
Chat (*.txt)
Werbos (1990)
Rumelhart, Hinton and Williams (1986)
6 Monday
September 21
  • Convergence issues in back propagation
  • Second order methods
  • Momentum and Nesterov
Slides (*.pdf)
Video (YT)
Video (MT)
Chat (*.txt)
Backprop fails to separate where perceptrons succeed, Brady et al. (1989)
Why Momentum Really Works
Quiz 4
7 Wednesday
September 23
  • Convergence in neural networks
  • Rates of convergence
  • Loss surfaces
  • Learning rates, and optimization methods
  • RMSProp, Adagrad, Momentum
Slides (*.pdf)
Video (YT)
Video (MT)
Chat (*.txt)
Momentum, Polyak (1964)
Nesterov (1983)
8 Monday
September 28
  • Stochastic gradient descent
  • Acceleration
  • Overfitting and regularization
  • Tricks of the trade:
    • Choosing a divergence (loss) function
    • Batch normalization
    • Dropout
Slides (*.pdf)
Video1 (YT)
Video1 (MT)
Video2 (YT)
Video2 (MT)
Chat (*.txt)
ADAGRAD, Duchi, Hazan and Singer (2011)
Adam: A method for stochastic optimization, Kingma and Ba (2014)
Quiz 5
9 Wednesday
September 30
  • Convolutional Neural Networks (CNNs)
  • Weights as templates
  • Translation invariance
  • Training with shared parameters
  • Arriving at the convolutional model
Slides (*.pdf)
Video (YT)
Video (MT)
Chat (*.txt)
10 Monday
October 5
  • Models of vision
  • Neocognitron
  • Mathematical details of CNNs
Slides (*.pdf)
Video (YT)
Video (MT)
Chat (*.txt)
Quiz 6
11 Wednesday
October 7
  • CNNs continued
Slides (*.pdf)
Video (YT)
Video (MT)
Chat (*.txt)
CNN Explainer
12 Monday
October 12
  • CNNs continued
See lecture 11 slides
Video (YT)
Video (MT)
Video P2 (MT)
Chat (*.txt)
Quiz 7
- Monday
October 12
  • CNNs Office Hours
Video (YT)
13* Wednesday
October 14
  • Cascade Correlation Filters (Scott Fahlman and Dean Alderucci)
Slides (*.pptx)
Video (YT)
Video (MT)
Chat (*.txt)
Fahlman and Lebiere (1990)
13 Monday
October 19
  • Recurrent Neural Networks (RNNs)
  • Modeling series
  • Back propagation through time
  • Bidirectional RNNs
Slides (*.pdf)
Video1 (YT)
Video2 (YT)
Video (MT)
Bidirectional Recurrent Neural Networks
Quiz 8
14 Wednesday
October 21
  • Stability
  • Exploding/vanishing gradients
  • Long Short-Term Memory Units (LSTMs) and variants
Slides (*.pdf)
Video (YT)
Video (MT)
Chat (*.txt)
LSTM
15 Monday
October 26
  • Loss functions for recurrent networks
  • Sequence prediction
Slides (*.pdf)
Video (YT)
Video (MT)
Chat (*.txt)
See recitation 2 on computing derivatives
Quiz 9
16 Wednesday
October 28
  • Sequence To Sequence Methods
  • Connectionist Temporal Classification (CTC)
Slides (*.pdf)
Video (YT)
Video (MT)
Chat (*.txt)
Labelling Unsegmented Sequence Data with Recurrent Neural Networks
17 Monday
November 2
  • Sequence To Sequence Methods
  • Connectionist Temporal Classification (CTC)
See lecture 16 slides
Video1 (YT)
Video2 (YT)
Video (MT)
Chat (*.txt)
Quiz 10
18 Wednesday
November 4
  • Sequence-to-sequence models
  • Example models from speech and language
  • Attention
Slides (*.pdf)
Video (YT)
Video (MT)
Chat (*.txt)
19 Monday
November 9
  • Representations and Autoencoders
Slides (*.pdf)
Video (YT)
Video (MT)
Chat (*.txt)
Quiz 11
20 Wednesday
November 11
  • Shinji Watanabe Guest Lecture
Slides (*.pdf)
Video (YT)
Video (MT)
Chat (*.txt)
21 Monday
November 16
  • Variational Autoencoders
Slides (*.pdf)
Video (YT)
Video (MT)
Chat (*.txt)
Tutorial on VAEs (Doersch)
Auto-Encoding Variational Bayes (Kingma)
Quiz 12
22* Wednesday
November 18
  • Variational Autoencoders Part 2
Slides (*.pdf)
Video (YT)
Video (MT)
Chat (*.txt)
23 Monday
November 23
  • Generative Adversarial Networks (GANs) Part 1
Slides (*.pdf)
Video (YT)
Video (MT)
Chat (*.txt)
Quiz 13
- Wednesday
November 25
  • Thanksgiving, no class
24 Monday
November 30
  • Generative Adversarial Networks (GANs) Part 2
Slides (*.pdf)
Video (YT)
Video (MT)
Chat (*.txt)
Quiz 14
25 Wednesday
December 2
  • Experimental Design for Machine Learning by Gerald Friedland
Slides (*.pdf)
Video (YT)
Video (MT)
26 Monday
December 7
  • Hopfield / Boltzmann 1
Slides (*.pdf)
Video (YT)
27 Wednesday
December 9
  • Hopfield / Boltzmann 2
Slides (*.pdf)
Video (YT)
28 Friday
December 11
  • Hopfield / Boltzmann 3
Slides (*.pdf)
Video (YT)

Tentative Schedule of Recitations

Recitation Date Topics Materials Videos Instructor
0A Due Aug. 31 Fundamentals of Python Notebook (*.zip) Video (YT) Mansi
0B Due Aug. 31 Fundamentals of NumPy Notebook (*.zip) Video (YT) Reshmi
0C Due Aug. 31 Fundamentals of PyTorch Notebook (*.zip) Video (YT) Kushal
0D Due Aug. 31 Fundamentals of AWS Document (url)
Video (YT) Faris and Miya
0E Due Aug. 31 Fundamentals of Google Colab Document (url)
Video (YT) Mansi
0F Due Aug. 31 Debugging Notebook (*.zip) Video (YT) Akshat
0G Due Aug. 31 Remote Notebooks Notebook (*.zip)
Document (*.pdf)
Video (YT) Anxiang
1 September 4 Your First Deep Learning Code Slides (*.pdf)
Notebooks (url)
Video (MT)
Chat (*.txt)
David and Miya
2 September 11 Computing Derivatives Slides (part 1) (*.pdf)
Slides (part 2) (*.pdf)
Video (YT)
Video (MT)
Chat (*.txt)
Jacob and Anti
- September 13 HW1P1 Bootcamp Video (YT)
Chat (*.txt)
3 September 18 Optimizing the network Notebook1 (*.zip)
Slides2 (*.pdf)
Notebook2 (*.zip)
Chat (*.txt)
Video (MT) Kushal and Anxiang
4 September 25 CNN: Basics Slides (*.pptx)
Notebook (*.zip)
Video (YT)
Video (MT)
Reshmi and Mansi
- September 30 HW2 Bootcamp Video (YT)
Anxiang and David
5 October 2 CNN: Losses, transfer learning Slides (*.pdf)
Notebook (*.zip)
Video (YT)
Video (MT)
Yizhuo and Bharat
6 October 9 RNN: Basics Slides (*.pdf)
Notebook (*.zip)
Video (YT)
Samuel and Miya
7 October 16 CTC Slides (*.pdf)
Notebook (*.zip)
Video (YT)
Video (MT)
Chat (*.txt)
Tony and Sean
8p1 October 23 Attention Slides (*.pdf)
Notebook (*.zip)
Video (YT)
Video (MT)
Jingwei and Bharat
- October 27 HW3 Bootcamp
Video (YT)
Chat (*.txt)
8p2 October 30 Listen Attend Spell Slides (*.pdf)
Notebook (*.zip)
Video (YT)
Video (MT)
Anxiang and Samuel
- October 30 AMA Video (YT)
9 November 6 Representation and Autoencoders Slides (*.pdf)
Video (YT)
Video (MT)
Jiachen and Abrham
10 November 13 Variational Autoencoders (VAEs) Slides (*.pdf) Video (YT)
Video (MT)
Akshat and Jiachen
- November 17 HW4 Bootcamp Video (YT)
11 November 20 Generative Adversarial Networks (GANs) Slides (*.pdf) Video (YT) Kushal and Akshat
12 December 4 Hopfield Nets / Boltzmann Machines Slides (*.pdf)
Notebook (*.zip)
Video (YT)
Video (MT)
Sean and Jingwei
13 December 11 Reinforcement Learning Slides (*.pdf) Video (YT) Tony
14 December 18 Image Segmentation, RCNN, YOLO Faris, Yizhuo and David

Assignments

Date P1 HWs P2 HWs Bonus HWs Project (Project ideas) Quizzes
Summer HW0p1 out
Autolab
Handout
(see recitation 0s)
HW0p2 out
Autolab
Handout
(see recitation 0s)
Quiz 0 out
Mon. Aug. 31 HW0p1 due HW0p2 due
Fri. Sept. 4 Quiz 1 out
Sun. Sept. 6 HW1p1 out
Autolab
Writeup (*.pdf)
Handout (*.zip)
HW1p2 out
Kaggle
Writeup (*.pdf)
Quiz
Autolab
Quiz 0 due
Quiz 1 due
Fri. Sept. 11 Quiz 2 out
Sun. Sept. 13 Quiz 2 due
Wed. Sept. 16 HW1p2 early deadline
Fri. Sept. 18 Quiz 3 out
Sun. Sept. 20 Quiz 3 due
Fri. Sept. 25 Quiz 4 out
Sun. Sept. 27 HW1p1 due HW1p2 on-time deadline
A: ≥ 72%, B ≥ 66%,
C ≥ 53%, D ≥ 30%
HW1 Bonus out
Autolab
Writeup (*.pdf)
Handout (*.zip)
HW2p1 out
Autolab
Writeup (*.pdf)
Handout (*.zip)
HW2p2 out
Kaggle
Writeup (*.pdf)
Quiz
Autolab
Mon. Sept. 28 Quiz 4 due
Fri. Oct. 2 Quiz 5 out
Sun. Oct. 4 HW1p2 code submission deadline Team formations due
(or be randomly grouped)
Quiz 5 due
Wed. Oct. 7 HW2p2 early deadline
Fri. Oct. 9 Quiz 6 out
Sun. Oct. 11 Project proposals due
Proposal Guidelines
Canvas
Template
Quiz 6 due
Fri. Oct. 16 Quiz 7 out
Sun. Oct. 18 HW2p1 due HW2p2 on-time deadline
Baseline
A: ≥ 0.92, B ≥ 0.85,
C ≥ 0.77, D ≥ 0.50
Quiz 7 due
HW3p1 out
Autolab
Writeup (*.pdf)
Handout (*.zip)
HW3p2 out
Kaggle
Writeup (*.pdf)
Autolab
HW2 Bonus out
Autolab
Writeup (*.pdf)
Handout (*.zip)
Fri. Oct. 23 Quiz 8 out
Sun. Oct. 25 HW2p2 code submission deadline Quiz 8 due
Fri. Oct. 30 HW3p2 early deadline Quiz 9 out
Sun. Nov. 1 HW4p1 out
Autolab
Writeup (*.pdf)
Handout (*.zip)
Quiz 9 due
Fri. Nov. 6 Quiz 10 out
Sun. Nov. 8 HW3p1 due HW3p2 on-time deadline
Baseline
A: ≤ 8, B ≤ 12,
C ≤ 20, D ≤ 30
Quiz 10 due
HW4p2 out
Kaggle
Writeup (*.pdf)
Autolab
Tue. Nov. 10 Midterm report due
Instructions
Canvas
Report template
Fri. Nov. 13 Quiz 11 out
Sun. Nov. 15 HW3p2 code submission deadline Quiz 11 due
Fri. Nov. 20 Quiz 12 out
Sun. Nov. 22 HW4p2 early deadline Quiz 12 due
Fri. Nov. 27 Quiz 13 out
Sun. Nov. 29 HW4p1 due HW4p2 on-time deadline Quiz 13 due
Tue. Dec. 1 Bonuses due
Fri. Dec. 4 Start project uploads
Video/other instructions
Quiz 14 out
Sun. Dec. 6 HW4p2 code submission deadline Quiz 14 due
Tue. Dec. 8 Preliminary report due
Wed. Dec. 9 Project uploads due
Thu. Dec. 10 Start peer reviews
Sat. Dec. 12 Peer reviews due
Sun. Dec. 13 Final report due

Documentation and Tools

Textbooks

This is a selection of optional textbooks you may find useful.

Dive Into Deep Learning by Aston Zhang, Zachary C. Lipton, Mu Li, and Alexander J. Smola. PDF, 2020.
Deep Learning by Ian Goodfellow, Yoshua Bengio, and Aaron Courville. Online book, 2017.
Neural Networks and Deep Learning by Michael Nielsen. Online book, 2016.