11-785 Introduction to Deep Learning
Spring 2024
Class Streaming Link

In-Person Venue: Giant Eagle Auditorium, Baker Hall (A51)

Active Deadlines and Bulletin

Assignment Deadline Description Links
HW1P1 Bonus Final Submission: 2nd March, 11:59 PM
Adam, AdamW Optimizers and Dropout


HW1P1 Autograd Final Submission: 9th March, 11:59 PM
Automatic Differentiation Engine


HW2P1 Early Submission: 23rd February, 11:59 PM
Final Submission: 8th March, 11:59 PM
1D and 2D CNNs Autolab
HW2P2 Early Submission: 23rd February, 11:59 PM
Final Submission: 8th March, 11:59 PM
Face Classification and Verification Piazza
HW2P1 Bonus Final Submission: 23rd March, 11:59 PM
Face Classification and Verification Autolab
HW2P1 Autograd Final Submission: 30th March, 11:59 PM
Face Classification and Verification Autolab
Project Gallery

The Course

“Deep Learning” systems, typified by deep neural networks, are increasingly taking over AI tasks, ranging from language understanding, speech and image recognition, to machine translation, planning, and even game playing and autonomous driving. As a result, expertise in deep learning is fast changing from a desirable esoteric skill to a mandatory prerequisite in many advanced academic settings, and a large advantage in the industrial job market.

In this course we will learn about the basics of deep neural networks, and their applications to various AI tasks. By the end of the course, it is expected that students will have significant familiarity with the subject, and be able to apply Deep Learning to a variety of tasks. They will also be positioned to understand much of the current literature on the topic and extend their knowledge through further study.

If you are only interested in the lectures, you can watch them on the YouTube channel.

Course Description from a Student's Perspective

The course is well rounded in terms of concepts. It helps us understand the fundamentals of Deep Learning. The course starts off gradually with MLPs and progresses to more complicated concepts such as attention and sequence-to-sequence models. We get complete hands-on experience with PyTorch, which is very important for implementing Deep Learning models. As a student, you will learn the tools required for building Deep Learning models. The homeworks usually have two components: Autolab and Kaggle. The Kaggle components allow us to explore multiple architectures and understand how to fine-tune and continuously improve models. The tasks for all the homeworks were similar, and it was interesting to learn how the same task can be solved using multiple Deep Learning approaches. Overall, at the end of this course you will be confident enough to build and tune Deep Learning models.


Prerequisites

  1. We will be using NumPy and PyTorch in this class, so you will need to be able to program in Python 3.
  2. You will need familiarity with basic calculus (differentiation, chain rule), linear algebra, and basic probability.
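As a quick self-check of this background (illustrative only, not part of any assignment), the short NumPy sketch below exercises the chain rule and a matrix-vector product:

```python
import numpy as np

# Chain rule: for f(x) = sin(x^2), f'(x) = cos(x^2) * 2x
def f(x):
    return np.sin(x ** 2)

def f_prime(x):
    return np.cos(x ** 2) * 2 * x

# Check the hand-derived gradient against a central finite difference
x, eps = 1.5, 1e-6
numeric = (f(x + eps) - f(x - eps)) / (2 * eps)
assert abs(numeric - f_prime(x)) < 1e-5

# Basic linear algebra: one affine map y = Wx + b
W = np.array([[1.0, 2.0],
              [3.0, 4.0]])
b = np.array([0.5, 0.5])
y = W @ np.array([1.0, -1.0]) + b  # -> [-0.5, -0.5]
```

If this reads comfortably, you have roughly the programming and math background the course assumes.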


Courses 11-785 and 11-685 are equivalent 12-unit graduate courses; 11-785 requires a final project, while 11-685 requires HW5 instead.
Course 11-485 is the 9-unit undergraduate version, the only difference being that it has no final project or HW5.

Your Supporters



Past TA Acknowledgments

Pittsburgh Schedule (Eastern Time)

Lecture: Monday and Wednesday, 8:00 a.m. - 9:20 a.m. - Good times :)

Recitation Labs: Friday, 8:00 a.m. - 9:20 a.m.

Office Hours: We will be using OHQueue (11-785) for both Zoom and in-person office hours. Please refer to the OH Calendar below or Piazza for up-to-date information.

Homework Hackathon: During 'Homework Hackathons', the course staff will assist students with their homework. It is recommended to come as study groups. Hackathons start Jan. 20th and are held every other Saturday.

Event Calendar: The Google Calendar below contains all course events and deadlines for students' convenience. Please feel free to add it to your own Google Calendar by clicking the plus (+) button in the bottom-right corner of the calendar below. Any ad hoc changes to the schedule will be reflected in this calendar first.

OH Calendar: The Google Calendar below contains the schedule for Office Hours. Please feel free to add it to your own Google Calendar by clicking the plus (+) button in the bottom-right corner of the calendar below. Any ad hoc changes to the schedule will be reflected in this calendar first.


Score Assignment      Grading will be based on weekly quizzes (24%), homeworks (50%), and a course project (25%). The remaining 1% of your grade is assigned to attendance.
Quizzes      There will be weekly quizzes.
  • We will retain your best 12 scores out of the 14 quizzes.
  • Quizzes will generally (but not always) be released on Friday and due 48 hours later.
  • Quizzes are scored by the number of correct answers.
  • Quizzes will be worth 24% of your overall score.
Assignments There will be five assignments in all, plus a Peer Review assignment during the last week of the semester. Assignments will include Autolab components, where you implement low-level operations, and Kaggle components, where you compete with your colleagues on relevant DL tasks.
  • Autolab components are scored according to the number of correctly completed parts.
  • We will post performance cutoffs for HIGH (90%), MEDIUM (70%), LOW (50%), and VERY LOW (30%) for Kaggle competitions.
    Scores will be interpolated linearly between these cutoffs.
  • Assignments will have an “early submission deadline,” an “on-time submission deadline,” and a “late submission deadline.”
    • Early submission deadline: You are required to make at least one submission to Kaggle by this deadline. People who miss this deadline will automatically lose 10% of subsequent marks they may get on the homework. This is intended to encourage students to begin working on their assignments early.
    • On-time deadline: People who submit by this deadline are eligible for up to five bonus points. These points are computed by interpolating between the A cutoff and the highest performance obtained on the HW; the highest performance receives 105.
    • Late deadline: People who submit after the on-time deadline can still submit until the late deadline. There is a 10% penalty applied to your final score, for submitting late.
    • Slack days: Everyone gets up to 10 slack days, which they can distribute across all their homework P2s only. Once you use up your slack days you will fall into the late-submission category by default. Slack days are accumulated over all parts of all homeworks.
    • Kaggle scoring: We will use max(max(on-time scores), max(slack-day scores), 0.9 * max(late-submission scores)) as your final score for the HW. If the selected submission is a slack-day submission, the corresponding slack days will be counted.
  • Assignments carry 50% of your total score, with each of the four HWs being worth 12.5%.
  • A fifth HW, HW5, will be released later in the course and will have the same weight as a course project. Please see Project section below for more details.
  • Bonus HWs will count towards the score of the corresponding HWP1 assignment. (For example, Bonus1 points go towards HW1P1.)
  • The Peer Review assignment is required of all students in 11-485/685/785. The task is to review and grade 4-6 of the project videos. It is to be completed over the last weekend, after classes finish (but before finals week). We will tell you which projects you have been assigned to review; each review should take around 15-20 minutes. Here is what we expect you to do for each review:
  • Watch the video carefully. As you watch the video, jot down some notes/concerns/questions that you might have.
  • Reference the initial report to clear up any confusion.
  • You must post at least one comment to the corresponding Piazza post of your reviewee. This comment must be a meaningful question or concern that demonstrates you have understood the material.
  • Finally, fill out the project review form carefully. More details will be shared over Piazza.
  • All students taking a graduate version of the course are required to do a course project. The project is worth 25% of your grade. These points are distributed as follows: 20% - Midterm Report; 35% - Project Video; 5% - Responding to comments on Piazza; 40% - Project report.
  • Note that a Project is mandatory for 11-785 students. In the event of a catastrophe (remember Spring 2020), the Project may be substituted with HW5. 11-685 Students may choose to do a Project instead of HW5. Either your Project OR HW5 will be graded.
  • Important information for project reports and video presentations (including midterm report rubric, final report rubric, video timeline, and video grading): Link.
  • If you are in section A you are expected to attend in-person lectures. We will track attendance.
  • If you are in any of the other (out-of-timezone) sections, you must watch lectures live on Zoom; real-time viewing is mandatory unless you are in an inconvenient time zone, in which case you must obtain specific permission to watch the pre-recorded lectures (on MediaServices).
  • If viewed on MediaServices, each week's lectures must be watched before 8:00 a.m. on the following Monday (otherwise, they do not count).
  • At the end of the semester, we will select a random subset of lectures and tabulate attendance.
  • If you have attended at least 70% of these (randomly chosen) lectures, you get the attendance point.
  • Final grade: The end-of-term grade is curved. Your overall grade will depend on your performance relative to your classmates.
    Pass/Fail Students registered for pass/fail must complete all quizzes, HWs and if they are in the graduate course, the project. A grade equivalent to B- is required to pass the course.
    Auditing Auditors are not required to complete the course project, but must complete all quizzes and homeworks. We encourage doing a course project regardless.
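The cutoff interpolation and Kaggle scoring rules above can be sketched in Python as follows. This is an illustration only, not the official grading code: the mapping of leaderboard performance (taken here as a 0..1 fraction) onto the HIGH/MEDIUM/LOW/VERY LOW cutoffs, and all function names, are assumptions.

```python
import numpy as np

# Assumed cutoff points: VERY LOW=30, LOW=50, MEDIUM=70, HIGH=90,
# with scores interpolated linearly between them.
CUTOFFS_X = [0.0, 0.30, 0.50, 0.70, 0.90, 1.00]   # leaderboard performance
CUTOFFS_Y = [0.0, 30.0, 50.0, 70.0, 90.0, 100.0]  # resulting score

def kaggle_score(performance):
    """Linearly interpolate a raw performance into a homework score."""
    return float(np.interp(performance, CUTOFFS_X, CUTOFFS_Y))

def final_hw_score(on_time, slack_day, late):
    """Best submission across categories; late submissions take a 10% penalty."""
    return max(max(on_time, default=0.0),
               max(slack_day, default=0.0),
               0.9 * max(late, default=0.0))

# Halfway between the MEDIUM (70) and HIGH (90) cutoffs:
print(kaggle_score(0.80))                      # -> 80.0 (up to float rounding)
print(final_hw_score([85.0], [88.0], [95.0]))  # -> 88.0 (late 95 * 0.9 = 85.5)
```

Note how a late submission can still win if its penalized score beats every on-time and slack-day score.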

    Study groups

    We believe that effective collaboration can greatly enhance student learning. Thus, this course employs study groups for both quizzes and homework ablations. It is highly recommended that you join a study group; check Piazza for further updates.

    Piazza: Discussion Board

    Piazza is what we use for discussions. You should be automatically signed up if you're enrolled at the start of the semester. If not, please sign up here. Also, please follow the Piazza Etiquette when you use the forum.

    AutoLab: Software Engineering

    AutoLab is what we use to test your understanding of low-level concepts, such as engineering your own libraries, implementing important algorithms, and developing optimization methods from scratch.

    Kaggle: Data Science

    Kaggle is where we test your understanding of, and ability to extend, the neural network architectures discussed in lecture. Like AutoLab, Kaggle shows scores, so don't feel intimidated -- we're here to help. We work on hot AI topics, like speech recognition, face recognition, and neural machine translation.

    MediaServices/YouTube: Lecture and Recitation Recordings

    CMU students who are not in the live lectures should watch the uploaded lectures at MediaServices in order to get attendance credit. Links to individual videos will be posted as they are uploaded.

    Our YouTube Channel is where non-CMU folks can view all lecture and recitation recordings. Videos marked “Old” are not current, so please pay attention to the video title.

    Books and Other Resources

    The course will not follow a specific book, but will draw from a number of sources. We list relevant books at the end of this page. We will also put up links to relevant reading material for each class. Students are expected to familiarize themselves with the material before the class. The readings will sometimes be arcane and difficult to understand; if so, do not worry, we will present simpler explanations in class.

    You can also find a nice catalog of models that are current in the literature here. We expect that by the end of the course you will be in a position to interpret, if not fully understand, many of the architectures on the wiki and in the catalog.

    Academic Integrity

    You are expected to comply with the University Policy on Academic Integrity and Plagiarism.
    • You are allowed to talk with and work with other students on homework assignments.
    • You can share ideas but not code. You should submit your own code.
    Your course instructor reserves the right to determine an appropriate penalty based on the act of academic dishonesty that occurs. Violations of the university policy can result in severe penalties, including failing this course and possible expulsion from Carnegie Mellon University. If you have any questions about this policy and any work you are doing in the course, please feel free to contact your instructor for help.

    Class Notes

    A book containing class notes is being developed in tandem with this course; check it out.

    Schedule of Lectures

    You can watch the recorded lectures on MediaServices.
    Lecture Date Topics Slides, Videos Additional Materials Quiz
    0 Monday,
    Jan. 8th
    • Course Logistics
    • Learning Objectives
    • Grading
    • Deadlines
    No Quiz
    1 Wednesday,
    Jan. 17th
    • Introduction
    Slides (PDF)
    The New Connectionism (1988)
    On Alan Turing's Anticipation of Connectionism
    McCulloch and Pitts paper
    Rosenblatt: The perceptron
    Bain: Mind and body
    Hebb: The Organization Of Behaviour
    Quiz 1
    2 Friday,
    Jan. 19th
    • Neural Nets As Universal Approximators
    Slides (PDF)
    Shannon (1949)
    Boolean Circuits
    On the Bias-Variance Tradeoff
    3 Monday,
    Jan. 22nd
    • Training Part I
      • The Problem of Learning
      • Empirical Risk Minimization
    Slides (PDF)
    Widrow and Lehr (1992)
    Adaline and Madaline
    Quiz 2
    4 Friday,
    Jan. 26th
    • Training Part II
      • Gradient Descent
      • Training the Network
      • Backpropagation
    Slides (PDF)
    Widrow and Lehr (1992)
    Adaline and Madaline
    Convergence of perceptron algorithm
    Threshold Logic
    TC (Complexity)
    AC (Complexity)
    5 Monday,
    Jan. 29th
    • Training Part III
      • Backpropagation
      • Calculus of Backpropagation
    Slides (PDF)
    Werbos (1990)
    Rumelhart, Hinton and Williams (1986)
    Quiz 3
    6 Wednesday,
    Jan. 31st
    • Training Part IV
      • Convergence issues
      • Loss Surfaces
      • Momentum
    Slides (PDF)
    Backprop fails to separate, where perceptrons succeed, Brady et al. (1989)
    Why Momentum Really Works
    7 Monday,
    Feb. 5th
    • Training Part V
      • Optimization
      • Batch Size, SGD, Mini-batch, Second-order Methods
    Slides (PDF)
    Momentum, Polyak (1964)
    Nesterov (1983)
    Derivatives and Influences
    Quiz 4
    8 Wednesday,
    Feb. 7th
    • Training Part VI
      • Optimizers and Regularizers
      • Choosing a Divergence (Loss) Function
      • Batch Normalization
      • Dropout
    Slides (PDF)
    Derivatives and Influence Diagrams
    ADAGRAD, Duchi, Hazan and Singer (2011)
    Adam: A method for stochastic optimization, Kingma and Ba (2014)
    9 Monday,
    Feb. 12th
    • Shift Invariance
    • Convolutional Neural Networks (CNNs)
    Slides (PDF)
    Quiz 5
    10 Friday,
    Feb. 16th
    • Models of Vision and CNNs
    Slides (PDF)
    11 Monday,
    Feb. 19th
    • Learning in CNNs
    Slides (PDF)
    CNN Explainer Quiz 6
    12 Wednesday,
    Feb. 21st
    • Learning in CNNs
    • Transpose Convolution
    • CNN Stories
    Slides (PDF)
    13 Wednesday,
    Feb. 28th
    • Time Series and Recurrent Networks
    Slides (PDF)
    Fahlman and Lebiere (1990)
    How to compute a derivative, extra help for HW3P1 (*.pptx)
    Quiz 7
    14 Friday,
    Mar. 1st
    • Stability and Memory, LSTMs
    Slides (PDF)
    Bidirectional Recurrent Neural Networks
    - Monday,
    Mar. 4th
    • No Class - Spring Break
    Quiz 8
    - Wednesday,
    Mar. 6th
    • No Class - Spring Break
    15 Monday,
    Mar. 11th
    • Sequence Prediction
    • Alignments and Decoding
    TBA LSTM Quiz 9
    16 Wednesday,
    Mar. 13th
    • Sequence Prediction
    • Connectionist Temporal Classification
      • Blanks
      • Beam Search
    17 Monday,
    Mar. 18th
    • Language Models
    • Sequence To Sequence Predictions
    TBA Labeling Unsegmented Sequence Data with Recurrent Neural Networks Quiz 10
    18 Wednesday,
    Mar. 20th
    • Sequence To Sequence Models
    • Attention
    TBA Attention Is All You Need
    The Annotated Transformer - Attention is All You Need paper, but annotated and coded in PyTorch!
    19 Monday,
    Mar. 25th
    • Transformers and Newer Architectures
    TBA Quiz 11
    20 Wednesday,
    Mar. 27th
    • Large Language Models
    21 Monday,
    Apr. 1st
    • Representation and Autoencoders
    TBA Quiz 12
    22 Wednesday,
    Apr. 3rd
    • Variational Auto Encoders I
    TBA Tutorial on VAEs (Doersch)
    Autoencoding variational Bayes (Kingma)
    23 Monday,
    Apr. 8th
    • Variational Autoencoders II
    TBA Quiz 13
    24 Wednesday,
    Apr. 10th
    • Flow and Diffusion
    25 Monday,
    Apr. 15th
    • Generative Adversarial Networks I
    TBA Quiz 14
    26 Wednesday,
    Apr. 17th
    • Generative Adversarial Networks II
    27 Monday,
    Apr. 22nd
    • Graph Neural Networks
    TBA A Comprehensive Survey of GNNs No Quiz
    28 Wednesday,
    Apr. 24th
    • Hopfield Networks
    • Boltzmann Machines

    Recitations and Bootcamps

    Recitation Date Group Topics Materials Youtube Videos Instructor
    0A Tuesday,
    Jan. 9th
    Python Programming Python Fundamentals Slides
    Colab Notebook


    Yuzhou Wang,
    Harini Subramanyan
    0B OOP Fundamentals Colab Notebook

    Links: 1, 2

    Kateryna Shapovalenko,
    Aarya Makwana
    0C Numpy Fundamentals Broadcasting
    Colab Notebook (Part I, II, III)
    Broadcasting Pitfalls (Part IV) Numpy Exercises

    Links: 1, 2, 3, 4

    Heena Chandak,
    Shreya Ajay Kale
    0D PyTorch PyTorch Fundamentals Colab Notebook
    PyTorch Cheatsheet


    Jeel Shah,
    Rucha Manoj Kulkarni
    0E Computational Resources Available Compute and Google Colab Colab Notebook


    Yu-Cheng "Samuel" Lin,
    Aarya Makwana
    0F Google Cloud Setup Shell File

    Links: 1, 2

    Syed Abdul Hannan,
    Ishan Mamadapur
    0G AWS


    Liangze "Josh" Li
    0H Kaggle


    Alexander Moker
    0I Data Handling and Processing Datasets Colab Notebooks: 1, 2

    Links: 1, 2

    Quentin Auster,
    Heena Chandak
    0J Dataloaders Colab Notebook


    Chetan Chilkunda,
    Dareen Safar Alharthi
    0K Data Preprocessing Colab Notebook

    Links: 1, 2

    Kateryna Shapovalenko,
    Gabrial Zencha Ashungafac
    0L Debugging and Problem Solving Debugging Colab Notebook (Part I)
    Colab Notebook (Part III)

    Links: 1, 2, 3, 4

    Syed Abdul Hannan,
    Harshith Arun Kumar
    0M What to Do When Struggling

    Links: 1, 2

    Quentin Auster,
    Ishan Mamadapur
    0N Cheating


    Harini Subramanyan,
    Pavitra Kadiyala
    0O HWs and Project Workflow Management Workflow of HWs Colab Notebook


    Shreya Ajay Kale,
    Pavitra Kadiyala
    0P Weights and Biases (WandB) Colab Notebook Exercises


    Chetan Chilkunda,
    Liangze "Josh" Li
    0Q Git


    Harshit Mehrotra
    0R Flow of the Project


    Denis Musinguzi,
    Sarthak Bisht
    0S Writing a Report


    Jinhyung David Park,
    Rukayat Sadiq
    0T Algorithmic Techniques Losses Colab Notebook

    Links: 1, 2

    Jeel Shah,
    Dareen Safar Alharthi
    0U Block Processing Colab Notebook


    Alexander Moker,
    Denis Musinguzi
    Lab 1 Saturday,
    Jan. 20th
    Your First MLP Slides
    Colab Notebook
    Link Yu-Cheng "Samuel" Lin,
    Alexander Moker,
    Rucha Manoj Kulkarni
    HW1 Bootcamp HW1P1, HW1P2 Slides
    Lab 2 Saturday,
    Jan. 27th
    Ablations, Hyperparameter Tuning Methods, Normalizations Slides
    Link Jeel Shah,
    Harini Subramanyan,
    Yu-Cheng "Samuel" Lin
    Lab 3 Friday,
    Feb. 2nd
    Debugging Deep Learning Networks Slides Link Kateryna Shapovalenko,
    Harshith Arun Kumar,
    Ishan Mamadapur,
    Quentin Auster
    Lab 4 Friday,
    Feb. 9th
    Computing Derivatives and Autograd Slides Link Liangze Li,
    Dareen Safar Alharthi,
    Shreya Ajay Kale
    HW2 Bootcamp Saturday,
    Feb. 10th
    HW2P1, HW2P2 HW2P1 Slides,
    HW2P2 Slides
    Link Chetan Chilkunda,
    Heena Chandak,
    Ishan Mamadapur,
    Kateryna Shapovalenko,
    Syed Abdul Hannan
    Lab 5 Saturday,
    Feb. 17th
    CNN: Basics and Backprop Slides
    Link Denis Musinguzi,
    Syed Abdul Hannan,
    Miya Sylvester
    Lab 6 Friday,
    Feb. 23rd
    CNN: Classification and Verification Slides Link Ishan Mamadapur,
    Shreya Ajay Kale,
    Aarya Makwana,
    Sarthak Bisht
    Lab 7 Friday,
    Mar. 1st
    RNN Basics TBA TBA Aarya Makwana,
    Alexander Moker,
    Harshit Mehrotra


    Assignment Release Date (EST) Due Date (EST) Related Materials / Links
    HW1P1 Friday, Jan. 19th 11:59 PM Early Deadline: Friday, Jan. 26th 11:59 PM
    On-Time Deadline: Friday, Feb. 9th 11:59 PM




    HW1P1 Bonus Saturday, Mar. 2nd 11:59 PM


    HW1P1 Autograd Saturday, Mar. 9th 11:59 PM


    HW2P1 Friday, Feb. 9th 11:59 PM Early Deadline: Friday, Feb. 23rd 11:59 PM
    On-Time Deadline: Friday, Mar. 8th 11:59 PM




    HW2P1 Bonus Saturday, Mar. 23rd 11:59 PM


    HW2P1 Autograd Saturday, Mar. 30th 11:59 PM


    HW3P1 Friday, Mar. 8th 11:59 PM Early Deadline: Friday, Mar. 15th 11:59 PM
    On-Time Deadline: Friday, Mar. 29th 11:59 PM
    HW3P2 TBA
    HW4P1 Friday, Mar. 29th 11:59 PM Early Deadline: Friday, Apr. 12th 11:59 PM
    On-Time Deadline: Friday, Apr. 26th 11:59 PM
    HW4P2 TBA

    Documentation and Tools


    This is a selection of optional textbooks you may find useful.

    Dive Into Deep Learning, by Aston Zhang, Zachary C. Lipton, Mu Li, and Alexander J. Smola (PDF, 2020)
    Deep Learning, by Ian Goodfellow, Yoshua Bengio, and Aaron Courville (online book, 2017)
    Neural Networks and Deep Learning, by Michael Nielsen (online book, 2016)