11-785 Introduction to Deep Learning
Spring 2020

Bulletin and Active Deadlines

Assignment Deadline Description Links
Homework 4 Part 1 May 3 Word-Level Neural Language Models Writeup (*.pdf)
Homework 4 Part 2 May 3 Attention Mechanisms and Memory Networks Writeup (*.pdf)
Midterm Report April 5 A report template is provided to detail your initial experiments
Final Project Video May 6 This is the 5-minute video for the course project Video Instructions
Final Project Report May 8 This should be the final document for the course project
Final Project Playlist Please enjoy the final project videos from our students YouTube Playlist

“Deep Learning” systems, typified by deep neural networks, are increasingly taking over all AI tasks, ranging from language understanding, speech and image recognition, machine translation, and planning, to game playing and autonomous driving. As a result, expertise in deep learning is fast changing from an esoteric desirable skill to a mandatory prerequisite in many advanced academic settings, and a large advantage in the industrial job market.

In this course we will learn about the basics of deep neural networks, and their applications to various AI tasks. By the end of the course, it is expected that students will have significant familiarity with the subject, and be able to apply Deep Learning to a variety of tasks. They will also be positioned to understand much of the current literature on the topic and extend their knowledge through further study.

If you are only interested in the lectures, you can watch them on the YouTube channel listed below.

Course description from a student's point of view

The course is well rounded in terms of concepts. It helps us understand the fundamentals of Deep Learning. The course starts off gradually with MLPs and progresses into more complicated concepts such as attention and sequence-to-sequence models. We get complete hands-on experience with PyTorch, which is very important for implementing Deep Learning models. As a student, you will learn the tools required for building Deep Learning models. The homeworks usually have two components: Autolab and Kaggle. The Kaggle components allow us to explore multiple architectures and understand how to fine-tune and continuously improve models. The tasks for all the homeworks were similar, and it was interesting to learn how the same task can be solved using multiple Deep Learning approaches. Overall, at the end of this course you will be confident enough to build and tune Deep Learning models.

Acknowledgments

Your Supporters

Instructor:

TAs:

Pittsburgh Schedule (Eastern Time)

Lecture: Monday and Wednesday, 9:00 a.m. - 10:20 a.m.

Zoom Link: Meeting Link, Meeting ID: 403 746 7921

Recitation: Friday, 9:00 a.m. - 10:20 a.m. @ BH A51

Office hours:
Day Time Location TA
Monday 10:30 - 11:30 am GHC 5417 Bhuvan Agrawal
1:00 - 2:00 pm GHC 6404 Advait Gadhikar
3:00 - 5:00 pm GHC 6708 Zhefan Xu
3:30 - 4:30 pm GHC 6708 Yang Xia
Tuesday 12:00 - 1:00 pm GHC 6708 Amala Sanjay Deshmukh, Soumya Vadlamannati
1:00 - 2:00 pm GHC 5417 Hao Liang
3:00 - 4:00 pm GHC 5417 Jianfeng Xia, Yuying Zhu
Wednesday 10:30 - 11:30 am GHC 5417 Bhuvan Agrawal
2:00 - 3:00 pm LTI Commons Jianfeng Xia
6:30 - 8:30 pm GHC 5417 Christopher George
Thursday 12:00 - 2:00 pm GHC 6404 David Park
1:00 - 2:00 pm GHC 6404 Hao Liang
2:00 - 3:00 pm GHC 6404 Amala Sanjay Deshmukh, Soumya Vadlamannati
Friday 10:30 - 11:30 am LTI Commons Yang Xia, Yuying Zhu
11:00 am - 12:00 pm LTI Commons Rohit Prakash Barnwal
1:30 - 2:30 pm LTI Commons Rohit Prakash Barnwal
3:00 - 4:00 pm GHC 6708 David Park
Saturday 2:00 - 4:00 pm GHC 5417 Advait Gadhikar, Vedant Sanil, Yash Belhe
Sunday 3:00 - 5:00 pm GHC 5417 Vedant Sanil

Kigali Schedule (Central Africa Time)

Lecture: Monday and Wednesday, 3:00 p.m. – 4:20 p.m. @ CMR C421

Office hours:

Silicon Valley Schedule (Pacific Time)

Office hours:

Prerequisites

  1. We will be using one of several toolkits (the primary toolkit for recitations/instruction is PyTorch). The toolkits are largely programmed in Python. You will need to be able to program in at least one of these languages. Alternatively, you will be responsible for finding and learning a toolkit that requires programming in a language you are comfortable with. (A short illustrative sketch of typical PyTorch usage follows this list.)
  2. You will need familiarity with basic calculus (differentiation, chain rule), linear algebra and basic probability.
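To give a concrete sense of what the PyTorch programming in this course looks like, here is a minimal, illustrative sketch of a single training step. The model shape, data, and hyperparameters are invented for illustration and do not come from any specific assignment.

import torch
import torch.nn as nn

# A tiny MLP: 40 input features, one hidden layer, 10 output classes.
model = nn.Sequential(nn.Linear(40, 128), nn.ReLU(), nn.Linear(128, 10))
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)
criterion = nn.CrossEntropyLoss()

x = torch.randn(32, 40)            # a batch of 32 random 40-dim inputs
y = torch.randint(0, 10, (32,))    # random class labels in [0, 10)

optimizer.zero_grad()              # clear gradients from the previous step
loss = criterion(model(x), y)      # forward pass and loss computation
loss.backward()                    # backpropagate
optimizer.step()                   # one gradient-descent update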

Units

11-785 is a graduate course worth 12 units. 11-485 is an undergraduate course worth 9 units.

Course Work

Grading

Grading will be based on weekly quizzes (24%), homeworks (51%) and a course project (25%).

Policy
Quizzes: There will be weekly quizzes.
  • There are 14 quizzes in all. We will retain your best 12 scores.
  • Quizzes will generally (but not always) be released on Friday and due 48 hours later.
  • Quizzes are scored by the number of correct answers.
  • Quizzes will be worth 24% of your overall score.
Assignments: There will be five assignments in all. Assignments will include Autolab components, where you must complete designated tasks, and a Kaggle component, where you compete with your colleagues.
  • Autolab components are scored according to the number of correctly completed parts.
  • We will post performance cutoffs for A, B, C, D and F for Kaggle competitions. These will translate to scores of 100, 80, 60, 40 and 0 respectively. Scores will be interpolated linearly between these cutoffs.
  • Assignments will have an “early-submission deadline,” an “on-time submission deadline,” and a “late-submission deadline.”
    • Early-submission deadline: You are required to make at least one submission to Kaggle by this deadline. People who miss this deadline will automatically lose 10% of any subsequent marks they earn on the homework. This is intended to encourage students to begin working on their assignments early.
    • On-time deadline: People who submit by this deadline are eligible for up to five bonus points. These points will be computed by interpolation between the A cutoff and the highest performance obtained for the HW. The highest performance will get 105.
    • Late deadline: People who submit after the on-time deadline can still submit until the late deadline. A 10% penalty is applied to your final score for submitting late.
    • Slack days: Everyone gets up to 7 slack days, which they can distribute across all their homeworks. Once you use up your slack days you will fall into the late-submission category by default. Slack days are accumulated over all parts of all homeworks, except HW0, to which no slack applies.
    • Kaggle scoring: We will use max(max(on-time score), max(slack-day score), 0.9 × max(late-submission score)) as your final score for the HW. If this happens to be a slack-days submission, slack days corresponding to the selected submission will be counted. A small sketch illustrating this arithmetic appears after this policy section.
  • Assignments carry 51% of your total score. HW0 is worth 1%, while each of the subsequent four is worth 12.5%.
Project: All students are required to do a course project. The project is worth 25% of your grade.
Final grade: The end-of-term grade is curved. Your overall grade will depend on your performance relative to your classmates.
Pass/Fail: Students registered for pass/fail must complete all quizzes, HWs and the project. A grade equivalent to B- is required to pass the course.
Auditing: Auditors are not required to complete the course project, but must complete all quizzes and homeworks. We encourage doing a course project regardless.
End Policy
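To make the Kaggle scoring arithmetic above concrete, here is a small illustrative sketch in Python. The cutoff and performance numbers are hypothetical; actual cutoffs are posted per homework, and the bonus interpolation above the A cutoff is omitted for brevity.

def interpolated_score(perf, cutoffs):
    # cutoffs: ascending (performance, score) pairs for the F, D, C, B, A cutoffs
    if perf <= cutoffs[0][0]:
        return cutoffs[0][1]
    for (p0, s0), (p1, s1) in zip(cutoffs, cutoffs[1:]):
        if perf <= p1:
            return s0 + (perf - p0) / (p1 - p0) * (s1 - s0)
    return cutoffs[-1][1]   # at or above the A cutoff

# Hypothetical cutoffs: accuracy -> score for F, D, C, B, A.
cutoffs = [(0.50, 0), (0.60, 40), (0.70, 60), (0.80, 80), (0.90, 100)]

final = max(interpolated_score(0.82, cutoffs),        # best on-time submission
            interpolated_score(0.85, cutoffs),        # best slack-day submission
            0.9 * interpolated_score(0.88, cutoffs))  # best late submission, 10% penalty
print(final)  # 90.0 -- the slack-day submission wins in this example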

Piazza: Discussion Board

Piazza is what we use for discussions. You should be automatically signed up if you're enrolled at the start of the semester. If not, please sign up. Also, please follow the Piazza Etiquette when you use Piazza.

AutoLab: Software Engineering

AutoLab is what we use to test your understanding of low-level concepts, such as engineering your own libraries, implementing important algorithms, and developing optimization methods from scratch; a short illustrative sketch follows.
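As a rough illustration of what "from scratch" means here, the sketch below implements a ReLU activation with forward and backward passes in plain NumPy. The class interface is hypothetical and does not reflect the actual handout API.

import numpy as np

class ReLU:
    def forward(self, x):
        self.mask = x > 0              # remember where the input was positive
        return np.maximum(x, 0.0)

    def backward(self, grad_out):
        return grad_out * self.mask    # gradient flows only where x > 0

relu = ReLU()
out = relu.forward(np.array([-1.0, 2.0, -0.5, 3.0]))
grad = relu.backward(np.ones(4))
print(out, grad)  # [0. 2. 0. 3.] [0. 1. 0. 1.]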

Kaggle: Data Science

Kaggle is where we test your understanding of, and ability to extend, neural network architectures discussed in lecture. Like AutoLab, Kaggle shows scores, so don't feel intimidated -- we're here to help. We work on hot AI topics, like speech recognition, face recognition, and neural machine translation.

YouTube: Lecture and Recitation Recordings

YouTube is where all lecture and recitation recordings will be uploaded. Links to individual lectures and recitations will also be posted below as they are uploaded. Videos marked “Old” are not current, so please be aware of the video title.

CMU students can also access the videos Live from Media Services or Recorded from Media Services.

Books and Other Resources

The course will not follow a specific book, but will draw from a number of sources. We list relevant books at the end of this page. We will also put up links to relevant reading material for each class. Students are expected to familiarize themselves with the material before the class. The readings will sometimes be arcane and difficult to understand; if so, do not worry, we will present simpler explanations in class.

You can also find a nice catalog of models that are current in the literature here. We expect that you will be in a position to interpret, if not fully understand, many of the architectures on the wiki and in the catalog by the end of the course.

Academic Integrity

You are expected to comply with the University Policy on Academic Integrity and Plagiarism.
  • You are allowed to talk with and work with other students on homework assignments.
  • You can share ideas but not code. You should submit your own code.
Your course instructor reserves the right to determine an appropriate penalty for any violation of academic integrity. Violations of the university policy can result in severe penalties, including failing this course and possible expulsion from Carnegie Mellon University. If you have any questions about this policy or any work you are doing in the course, please feel free to contact your instructor for help.

Tentative Schedule of Lectures

Lecture Date Topics Lecture Slides and Video Additional Readings (if any) Homework & Assignments
0 -
  • Course Logistics
  • Learning Objectives
  • Grading
  • Deadlines
Slides
Video (url)
1 January 15
  • Learning Objectives
  • History and cognitive basis of neural computation
  • Connectionist Machines
  • McCulloch and Pitts model
  • Hebb’s learning rule
  • Rosenblatt’s perceptron
  • Multilayer Perceptrons
Slides
Video (url)
2 January 17
  • The neural net as a universal approximator
Slides
Video (url)
Hornik et al. (1989)
Shannon (1949)
January 20
  • MLK Day, no class
3 January 22
  • Training a neural network
  • Perceptron learning rule
  • Empirical Risk Minimization
  • Optimization by gradient descent
Slides
Video (url)
Widrow and Lehr (1992)
Convergence of perceptron algorithm
4 January 27
  • Back propagation
  • Calculus of back propagation
Slides
Video (url)
Werbos (1990)
Rumelhart, Hinton and Williams (1986)
4.5 January 28
  • More on back propagation
  • More on the calculus of back propagation
Slides
Video (url)
5 January 29
  • Convergence issues in back propagation
  • Second order methods
  • Momentum and Nesterov
Slides
Video (url)
Backprop fails to separate, where perceptrons succeed, Brady et al. (1989)
6 February 3
  • Convergence in neural networks
  • Rates of convergence
  • Loss surfaces
  • Learning rates, and optimization methods
  • RMSProp, Adagrad, Momentum
Slides
Video (url)
Momentum, Polyak (1964)
Nesterov (1983)
7 February 5
  • Stochastic gradient descent
  • Acceleration
  • Overfitting and regularization
  • Tricks of the trade:
    • Choosing a divergence (loss) function
    • Batch normalization
    • Dropout
Slides
Video (url)
ADAGRAD, Duchi, Hazan and Singer (2011)
Adam: A method for stochastic optimization, Kingma and Ba (2014)
8 February 10
  • Convolutional Neural Networks (CNNs)
  • Weights as templates
  • Translation invariance
  • Training with shared parameters
  • Arriving at the convolutional model
Slides
Video (url)
9 February 12
  • Cascade Correlation Filters (Scott Fahlman and Dean Alderucci)
Slides
Slides
Video (url)
Fahlman and Lebiere (1990)
10 February 17
  • Models of vision
  • Neocognitron
  • Mathematical details of CNNs
Slides
Video (url)
11 February 19
  • Guest Lecture by Mike Tarr
Slides
Video (url)
12 February 24
  • Backpropagation in CNNs
  • Variations in the basic model
  • Some history of ImageNet
Slides
Video (url)
13 February 26
  • "Recurrent Neural Networks (RNNs)
  • Modeling series
  • Back propogation through time
  • Bidirectional RNNs"
Slides
Video (url)
Bidirectional Recurrent Neural Networks
14 March 2
  • Stability
  • Exploding/vanishing gradients
  • Long Short-Term Memory Units (LSTMs) and variants
Slides
Video (url)
LSTM
15 March 4
  • Loss functions for recurrent networks
  • Sequence prediction
Slides
Video (url)
How to compute a derivative (*.pdf)
16 March 6
  • Sequence To Sequence Methods
  • Connectionist Temporal Classification (CTC)
Slides
Video1
Video2
Labelling Unsegmented Sequence Data with Recurrent Neural Networks
March 9
  • Spring Break, no class
March 11
  • Spring Break, no class
March 16
  • Coronavirus break, no class, recitation posted
17 March 18
  • Sequence-to-sequence models
  • Models examples from speech and language
  • Transformers and self attention
Slides
Video (url)
18 March 23
  • Representations and Autoencoders
Slides
Video (url)
19 March 25
  • Hopfield Networks
Slides
Video (url)
20 March 30
  • Hopfield Networks
  • Boltzmann Machines
Slides
Video (url)
21 April 1
  • Boltzmann Machines
Slides
Video (url)
22 April 6
  • Learning about Language with Normalizing Flows (Graham Neubig)
Slides
Video (url)
23 April 8
  • Sizing Neural Network Experiments (Gerald)
Slides
Video (url)
24 April 13
  • Variational Autoencoders Part 1
Slides
Video (url)
25 April 15
  • Variational Autoencoders Part 2
Slides
Video (url)
Tutorial on VAEs (Doersch)
Autoencoding variational Bayes (Kingma)
26 April 20
  • Generative Adversarial Networks (GANs) Part 1
Slides
Video (url)
27 April 22
  • Generative Adversarial Networks (GANs) Part 2
Slides
Video (url)
28 April 27
  • Reinforcement Learning 1
Slides
Video (url)
29 April 29
  • Reinforcement Learning
Slides
Video (url)

Tentative Schedule of Recitations

Recitation Date Topics Notebook Videos Instructor
0 - Part A January 5 Fundamentals of Python Notebook (*.tar.gz)
YouTube (url)
Joseph Konan
0 - Part B January 5 Fundamentals of NumPy Notebook (*.tar.gz) YouTube (url) Joseph Konan
0 - Part C January 5 Fundamentals of Jupyter Notebook Notebook (*.tar.gz) YouTube (url) Joseph Konan
0 - Part D January 5 AWS: includes a tutorial, with Google Doc polling to check student status Doc (url) YouTube (url) Christopher George
1 January 13 Your First Deep Learning Code Notebook (*.zip) Video (url) Bhuvan, Soumya
2 January 24 How to compute a derivative Slides (*.pdf) Video (url) Amala, Yang
3 January 31 Optimizing the network Slides (*.pdf)
Notebook (*.zip)
Video (url) Advait, Rohit
4 February 7 TensorBoard, t-SNE, visualizing network parameters and outputs at every layer Slides (*.pdf)
Notebook (*.ipynb)
Video (url) Yash, Zhefan
5 February 14 CNN: Basics Notebook (*.zip) Video (url) Advait, Hao
6 February 21 CNN: Losses, transfer learning Slides (*.pdf)
Notebook (*.zip)
Video (url) Rohit, David
7 February 28 RNN: Basics Slides (*.pdf)
Notebook (*.zip)
Video (url) Vedant, Zhefan
8 March 16 CTC Slides (*.pdf)
Notebook (*.zip)
YouTube (url)
Video (url)
Amala, Soumya
9 March 20 Attention Slides (*.pdf)
Notebook (*.zip)
YouTube (url)
Video (url)
Jianfeng, Yuying
10 March 27 Listen, Attend and Spell Slides (*.pdf)
Notebook (*.zip)
YouTube P1(url)
YouTube P2(url)
Vedant, Chris
11 April 3 Hopfield nets / Boltzmann machines Slides (*.pdf)
Notebook (*.zip)
YouTube (url) Vedant, Yang
12 April 10 VAEs Slides (*.pdf) YouTube (url) Yuying, Chris
13 April 17 Generative Adversarial Networks (GANs) Slides (*.pdf) YouTube (url) Hao, Yash
14 April 24 Reinforcement Learning Slides (*.pdf) Video (url) Bhuvan, Jianfeng

Homework Schedule

Number Part Topics Release Date Early-submission Deadline On-time Deadline Links
HW0 January 5 January 19 Handout (*.tar.gz)
HW1 P1 MLP in PyTorch January 19 February 8 Handout (*.tar)
Writeup (*.pdf)
P1-bonus Dropout and Adam April 29
P2 MLP, phoneme recognition January 19 January 29 February 8 Writeup (*.pdf)
HW2 P1 CNN as scanning MLP, backprop February 9 March 7 Handout (*.tar)
Writeup (*.pdf)
P1-bonus Conv2D and Pooling May 3
P2 Face Recognition: Classification and Verification February 9 February 23 March 8 Writeup (*.pdf)
HW3 P1 RNN: forward/backward/CTC beam search March 8 April 5 Handout (*.tar)
Writeup (*.pdf)
Submission Link
P1-bonus CTC Loss and RNN BPTT May 7
P2 Connectionist Temporal Classification March 8 March 25 April 5 Writeup (*.pdf)
Code Submission Link
Writeup Submission Link
HW4 P1 Word-Level Neural Language Models April 5 May 3 Writeup (*.pdf)
P1-bonus TBD
P2 Attention Mechanisms and Memory Networks April 5 April 22 May 3 Writeup (*.pdf)

Course Project Timeline

Assignment Deadline Description Links
Team Formation February 8 Teams will be formed in groups of four
*If you do not have a team after this point, you will be grouped randomly
Project Proposal February 15 Project Description Guidelines
Midterm Report April 5 A report template is provided to detail your initial experiments
Final Project Video May 6 This is the final video for the course project Video Instructions
Final Project Report May 8 This should be the final document for the course project
Final Project Playlist Please enjoy the final project videos from our students YouTube Playlist

Grade Breakdown: 10% - Proposal; 15% - Midterm Report; 30% - Project Video and follow-up; 45% - Project Report.

Documentation and Tools

Textbooks

Deep Learning by Ian Goodfellow, Yoshua Bengio, and Aaron Courville. Online book, 2017.
Neural Networks and Deep Learning by Michael Nielsen. Online book, 2016.
Deep Learning with Python by J. Brownlee.