“Deep Learning” systems, typified by deep neural networks, are increasingly taking over AI tasks ranging from language understanding, speech recognition, and image recognition to machine translation, planning, game playing, and autonomous driving. As a result, expertise in deep learning is fast changing from an esoteric specialty into a mandatory prerequisite in many advanced academic settings, and a large advantage in the industrial job market.
In this course we will learn about the basics of deep neural networks and their applications to various AI tasks. By the end of the course, students are expected to have significant familiarity with the subject and to be able to apply Deep Learning to a variety of tasks. They will also be positioned to understand much of the current literature on the topic and to extend their knowledge through further study.
The course is well rounded in terms of concepts: it covers the fundamentals of Deep Learning, starting gradually with MLPs and progressing to more complex topics such as attention and sequence-to-sequence models. You also get thorough hands-on experience with PyTorch, which is essential for implementing Deep Learning models; as a student, you will learn the tools required to build them. The homeworks usually have two components, Autolab and Kaggle. The Kaggle component lets you explore multiple architectures and learn how to fine-tune and continuously improve your models. The tasks across the homeworks were similar, and it was interesting to see how the same task can be solved using several different Deep Learning approaches. Overall, by the end of this course you will be confident enough to build and tune Deep Learning models.
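To give a concrete flavor of such code, here is a minimal sketch of an MLP in PyTorch. It is illustrative only: the layer sizes and random data are invented for this example and do not come from any course homework.

```python
# A minimal multilayer perceptron (MLP) in PyTorch -- the kind of model
# the course starts with. All sizes and data here are made up.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(784, 256),  # e.g., a flattened 28x28 input
    nn.ReLU(),
    nn.Linear(256, 10),   # 10 output classes
)

x = torch.randn(32, 784)                      # a random batch of 32 inputs
labels = torch.randint(0, 10, (32,))          # random target classes
logits = model(x)                             # forward pass
loss = nn.CrossEntropyLoss()(logits, labels)  # classification loss
loss.backward()                               # backpropagation
```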
Instructor: Bhiksha Raj
Lecture: Monday and Wednesday, 9:00am-10:20am
Location: Baker Hall A51
Recitation: Friday, 9:00am-10:20am
Location: Baker Hall A51
This course is worth 12 units.
Grading will be based on weekly quizzes (24%), homeworks (51%) and a course project (25%).
| Component | Details |
| --- | --- |
| Quizzes | There will be weekly quizzes. |
| Assignments | There will be five assignments in all. Assignments will include Autolab components, where you must complete designated tasks, and a Kaggle component, where you compete with your colleagues. |
| Project | All students are required to do a course project. The project is worth 25% of your grade. |
| Final grade | The end-of-term grade is curved. Your overall grade will depend on your performance relative to your classmates. |
| Pass/Fail | Students registered for pass/fail must complete all quizzes, homeworks, and the project. A grade equivalent to B- is required to pass the course. |
| Auditing | Auditors are not required to complete the course project, but must complete all quizzes and homeworks. We encourage doing a course project regardless. |
The course will not follow a specific book, but will draw from a number of sources. We list relevant books at the end of this page. We will also put up links to relevant reading material for each class. Students are expected to familiarize themselves with the material before class. The readings will sometimes be arcane and difficult to understand; if so, do not worry: we will present simpler explanations in class.
We will use Piazza for discussions. Here is the link. You should be automatically signed up if you're enrolled at the start of the semester. If not, please sign up.
You can also find a nice catalog of models that are current in the literature here. We expect that, by the end of the course, you will be in a position to interpret, if not fully understand, many of the architectures on the wiki and in the catalog.
Kaggle is a popular data science platform where visitors compete to produce the best model for learning or analyzing a data set.
For assignments you will be submitting your evaluation results to a Kaggle leaderboard.
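As a rough illustration of what a leaderboard submission looks like, most Kaggle competitions accept a CSV file of predictions. The sketch below is hypothetical: the column names ("Id", "Category") and values are placeholders, and each competition's page specifies the actual format it expects.

```python
# Hypothetical sketch of writing a Kaggle submission file.
# Column names and predictions are placeholders; check the competition
# page for the real format.
import csv

predictions = [(0, 3), (1, 7), (2, 1)]  # (example id, predicted label) -- dummy values

with open("submission.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["Id", "Category"])  # assumed header row
    writer.writerows(predictions)
```

The resulting submission.csv is then uploaded on the competition's submission page to appear on the leaderboard.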
All recitations and lectures will be recorded and uploaded to YouTube. Here is a link to the YouTube channel. Links to individual lectures and recitations will also be posted below as they are uploaded. All videos for the Spring 2019 edition are tagged “S19”. CMU students can also access the videos on Panopto from this link.
Lecture schedule: each lecture entry lists the date, topics, lecture notes/slides, additional readings, quizzes/assignments, and a shadow instructor. Assignment milestones across the schedule:

- Assignment 0 due on January 20.
- Assignment 1 released on January 24; due on February 16.
- Assignment 2 released on February 16; due on March 10.
- Assignment 3 released on March 10; due on March 31.
- Assignment 4 released on April 1; due on April 28.
| Recitation | Date | Topic | Materials | Instructors |
| --- | --- | --- | --- | --- |
| 0, Part 1 | January 2 | Python coding for the deep learning student | Notebook, Part 1 video | |
| 0, Part 2 | January 2 | Python coding for the deep learning student | Notebook, Part 2 video | Simral Chaudhary |
| 1 | January 16 | Amazon Web Services (AWS) | | David Bick, Cody Smith |
| 2 | January 25 | Your First Deep Learning Code | | Alex Litzenberger, Daanish Ali Khan |
| 3 | February 1 | Efficient Deep Learning/Optimization Methods | | Kai Hu, Cody Smith |
| 4 | February 8 | Debugging and Visualization | | Raphael Olivier, Sarvesh D. |
| 5 | February 15 | Convolutional Neural Networks | | Simral Chaudhary, Hengrui Lui |
| 6 | February 22 | CNNs: HW2 | | Hira Dhamyal, Hengrui Lui |
| 7 | March 1 | Recurrent Neural Networks | | |
| 8 | March 8 | RNN: CTC | | Kai Hu, Alex Litzenberger |
| 9 | March 22 | Attention | | Daanish Ali Khan |
| 10 | March 29 | Variational Autoencoders | | |
| 12 | April 19 | Boltzmann machines | | |
| 13 | April 26 | Reinforcement Learning | | |
Most homeworks require submissions to Autolab. If you are an Autolab novice, here is an “autolab for dummies” document to help you.
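If you have not used Autolab before, submissions are typically a single archive of your handin files. The sketch below assumes a directory named handin/ and an archive named handin.tar; both names are hypothetical, and each homework handout specifies the exact files and archive it expects.

```python
# Hypothetical sketch of packaging an Autolab handin with Python's tarfile module.
# "handin/" and "handin.tar" are placeholder names for this example only.
import tarfile

with tarfile.open("handin.tar", "w") as tar:
    tar.add("handin")  # recursively adds the directory and its contents
```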
| Number | Part | Topics | Release date | Early-submission deadline | On-time deadline | Links |
| --- | --- | --- | --- | --- | --- | --- |
| HW0 | - | Python coding for DL | 2 Jan | none | 20 Jan | pdf |