Deep Learning

CS 6353 · 3.0 Credits (3 Lectures/Week)

    CS 6353: Deep Learning

    Fall 2023: Tue/Thu, 3:40–5:00 pm; GC 2900

    • Instructor: Anna Fariha
      Office Location: WEB 2851
      Office hours: Tue/Thu, after class.
    • TA hours:
      • Join the TA queue and wait for your turn.
      • Zoom hours may overlap with the in-person office hours; in-person students will be given priority, so you may have to wait a bit during overlapping Zoom hours.
      • Additional TA hours: available during the weeks when assignments are due.

    Exam, Quiz, and Homework Dates

    | Event | Note | Due/Event Date (all due dates are 11:59 PM Mountain Time) | Weight |
    | --- | --- | --- | --- |
    | Homework 1 | Image Classification, kNN, SVM, Softmax, Neural Network | September 22 (Friday) | 10% |
    | Quiz 1 | In-class, based on Homework 1; duration: 30 mins | September 26 (Tuesday) | 10% |
    | Homework 2 | Part 1: Fully-Connected Nets, Batch Normalization. Part 2: Dropout, Convolutional Nets | October 20 (Friday) | 10% |
    | Quiz 2 | In-class, based on Homework 2; duration: 30 mins | October 24 (Tuesday) | 10% |
    | Midterm | In-class; duration: 70 minutes | October 31 (Tuesday) | 20% |
    | Homework 3 | Image Captioning, Fooling Images, Image Generation | November 17 (Friday) | 10% |
    | Quiz 3 | In-class, based on Homework 3; duration: 30 mins | November 21 (Tuesday) | 10% |
    | Final | In-class; duration: 70 minutes (can be used to replace the midterm) | December 11 (Monday), 3:30–5:30 pm | 20% |
    | Project (Optional) | See the Project Guideline. You may either do a project or take the final exam; if you do both, the better of the two counts toward your final grade. | December 11 (Monday) | 20% |

    Lectures

    | Event | Date | Topic | Additional Materials |
    | --- | --- | --- | --- |
    | Lecture 1 | Aug 22 | Course Organization and Structure; Introduction and Background | Jupyter Notebook Tutorial; Python and numpy tutorial; Google Colab Tutorial; Accessing compute resources in the CADE lab |
    | Lecture 2 | Aug 24 | Image classification and the data-driven approach; k-nearest neighbor; Linear classification | |
    | Lecture 3 | Aug 29 | Loss Functions | |
    | Lecture 4 | Aug 31 | Optimization | Interactive Web Demo |
    | Lecture 5 | Sep 5 | Backpropagation | Vector, Matrix, and Tensor Derivatives (Erik Learned-Miller, UMass Amherst); Derivatives, Backpropagation, and Vectorization (Justin Johnson) |
    | Lecture 6 | Sep 7 | Neural Networks; Activation Functions | Calculus on Computational Graphs: Backpropagation; How the backpropagation algorithm works; Learning: Neural Nets, Back Propagation; Efficient Backpropagation |
    | Lecture 7 | Sep 12 | Training Neural Networks I: Weight Initialization, Batch Normalization | deeplearning.net tutorial with Theano; ConvNetJS demos; Michael Nielsen’s tutorials; Additional Slides for BatchNorm |
    | Lecture 8 | Sep 14 | Training Neural Networks II: Network Architecture and Learning | Dropout: A Simple Way to Prevent Neural Networks from Overfitting; Dropout Training as Adaptive Regularization; DropConnect; ElasticNet |
    | Lecture 9 | Sep 19 | Training Neural Networks III: Review of Backpropagation for Softmax, Hyperparameter Tuning | Large Scale Distributed Deep Networks; RMSProp; SGD tips and tricks; Efficient BackProp; Practical Recommendations for Gradient-Based Training of Deep Architectures |
    | Lecture 10 | Sep 21 | Training Neural Networks IV: Learning and Evaluation, Parameter Updates, Hyperparameter Optimization | A Visual Explanation of Gradient Descent Methods (Momentum, AdaGrad, RMSProp, Adam); Momentum and Learning Rate Decay; Gradient Descent Visualization |
    | | Sep 26 | Quiz 1; Course Feedback; Minimal Neural Network Case Study: 2D toy data example | Minimal Neural Network Case Study: 2D toy data example |
    | Lecture 11 | Sep 28 | Convolutional Neural Networks I | Demo; CIFAR-10 Demo |
    | Lecture 12 | Oct 3 | Convolutional Neural Networks II | ImageNet Large Scale Visual Recognition Challenge |
    | Lecture 13 | Oct 5 | Convolutional Neural Networks III; Data Augmentation | LeNet-5; AlexNet; ZFNet; VGGNet; GoogLeNet; ResNet; AlphaGo |
    | Lecture 14 | Oct 17 | Recurrent Neural Networks (RNN) | RNN Book Chapter; More reading; RNN Illustration |
    | Lecture 15 | Oct 19 | Long Short-Term Memory (LSTM); Generative Adversarial Networks (GAN) | Tutorial on LSTM; Understanding LSTM; StyleGAN; Alias-Free GAN |
    | | Oct 24 | Quiz 2; Project discussion (report writing) | Taking a project to a paper |
    | | Oct 26 | Midterm Review | |
    | | Oct 31 | Midterm | Sample Midterm; Solution |
    | | Nov 2 | Guest Lecture (Shraddha Barke, UCSD): How Large Language Models Revolutionized Program Synthesis | DocPrompting; MuseNet; AlphaGo; GitHub Copilot; OpenAI GPT-4; Stable Diffusion; DALL·E 2; Chain-of-Thought Reasoning in LLMs |

    Course Description

    Course Type: In Person
    Description: This course focuses on modern, practical methods for deep learning, with an emphasis on computer vision. Students will learn to implement, train, and debug their own neural networks, and will be introduced to the ideas underlying state-of-the-art research in computer vision and natural language processing. We will cover learning algorithms, optimization, neural network architectures, and techniques for training and fine-tuning networks for tasks such as object recognition, image captioning, natural language processing, and data management.

    The course will begin with simple classifiers such as perceptrons and logistic regression, and move on to standard neural networks, convolutional neural networks, and some elements of recurrent neural networks, such as long short-term memory (LSTM) networks. The emphasis will be on understanding the basics and on practical application more than on theory. Most applications will be in computer vision, but we will make an effort to cover some natural language processing (NLP) applications as well. The current plan is to use Python and associated packages such as NumPy. Prerequisites include linear algebra, probability and statistics, and multivariate calculus. Assignments will be in Python, with potentially some in C++. This is a required course for the Deep Learning in AI & Robotics certificate program.
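
    To give a concrete flavor of the from-scratch implementation work in the early homeworks, here is a minimal sketch of one training step of a softmax classifier in NumPy. It is illustrative only, not taken from the course materials; all shapes, data, and hyperparameters are made up.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(5, 4))          # 5 examples, 4 features (toy data)
    y = np.array([0, 2, 1, 2, 0])        # class labels in {0, 1, 2}
    W = 0.01 * rng.normal(size=(4, 3))   # weight matrix: 4 features -> 3 classes

    scores = X @ W                                   # unnormalized class scores
    scores -= scores.max(axis=1, keepdims=True)      # stabilize the exponentials
    probs = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)

    loss = -np.log(probs[np.arange(len(y)), y]).mean()   # cross-entropy loss

    dscores = probs.copy()
    dscores[np.arange(len(y)), y] -= 1               # gradient of loss w.r.t. scores
    dscores /= len(y)
    dW = X.T @ dscores                               # backprop through the linear layer

    W -= 0.1 * dW                                    # one SGD step
    print(f"loss = {loss:.4f}")
    ```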

    Prerequisites

    • Python (e.g. numpy and Jupyter notebooks, tutorial)
    • Calculus and linear algebra (e.g. MATH 2210 & 2270, review)
    • Basic probability and statistics (e.g. MATH 1040, review)
    • Familiarity with machine learning would be really helpful (e.g. CS 5350, Stanford’s CS 229)

    Course Materials

    All course materials will be available on Canvas. All materials for this course are copyrighted. Do not distribute or share course resources without instructor permission.

    Slides used in lectures will be made available on Canvas.

    There is no required textbook; however, the following are some of the many online references you may find useful:


    Course Outcomes

    By the end of the course, you will:

    • understand the key concepts of deep learning, such as neural networks, loss functions, backpropagation, stochastic optimization, regularization, and data augmentation (see the gradient-check sketch after this list).
    • be familiar with important neural networks in computer vision and natural language processing.
    • be able to use deep learning in research and other applications.
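
    As one concrete illustration of the backpropagation-related skills above, the sketch below shows a numerical gradient check of the kind commonly used to debug hand-derived gradients: compare an analytic gradient against a centered finite difference. This is a hypothetical example, not course-provided code; the toy loss `f` is made up.

    ```python
    import numpy as np

    def f(w):
        # toy scalar loss; stands in for any differentiable loss function
        return np.sum(w ** 2) + np.sin(w).sum()

    def analytic_grad(w):
        # hand-derived gradient of f
        return 2 * w + np.cos(w)

    def numerical_grad(f, w, h=1e-5):
        # centered finite-difference approximation, one coordinate at a time
        grad = np.zeros_like(w)
        for i in range(w.size):
            w[i] += h
            fp = f(w)
            w[i] -= 2 * h
            fm = f(w)
            w[i] += h                      # restore the original value
            grad[i] = (fp - fm) / (2 * h)
        return grad

    w = np.random.default_rng(1).normal(size=5)
    err = np.abs(analytic_grad(w) - numerical_grad(f, w)).max()
    print(f"max abs error = {err:.2e}")    # should be tiny for a correct gradient
    ```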

    Teaching and Learning Methods

    The course lectures will be held in person during regularly scheduled class periods. We will cover both theoretical and practical aspects of deep learning, so the ability both to do math (calculus, linear algebra) and to program (Python, PyTorch) is critical. Lectures will mostly use slides, though there will be occasional experiential opportunities. Homeworks will be done outside of class and focus on gaining both theoretical understanding and practical experience.
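
    For example, the homework programming will exercise patterns like the following minimal PyTorch training loop. This is an illustrative sketch only, with a placeholder model and random data, not code from the course.

    ```python
    import torch
    from torch import nn

    # placeholder model and data; the homeworks supply their own
    model = nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Linear(16, 3))
    opt = torch.optim.SGD(model.parameters(), lr=1e-2, momentum=0.9)
    loss_fn = nn.CrossEntropyLoss()

    X = torch.randn(64, 4)                 # fake batch of 64 examples
    y = torch.randint(0, 3, (64,))         # fake class labels

    for step in range(100):
        opt.zero_grad()                    # clear gradients from the last step
        loss = loss_fn(model(X), y)        # forward pass + loss
        loss.backward()                    # backpropagation
        opt.step()                         # parameter update
    print(f"final loss: {loss.item():.4f}")
    ```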

    As with all learning, the more time and thought you put into this course, the more you will get out of it. Deep learning is an enormous and rapidly growing subject and research area. We can only cover a small part of it in a course like this. However, if you put in the time, by the end of the course you should be familiar enough with deep learning to use it in research and industrial applications.

    Lectures (PDF)

    Homeworks

    Homework 1

    Homework 2

    Homework 3
