Lectures and Materials


  • Mar 17 Lec 9
    Epilogue Lecture.

    Part 1: Midterm 2
    Part 2: Prof. Li's concluding remarks on the lectures.
  • Mar 3 Lec 8
    Going deeper into convolutional networks.
    [Slides] [Audio] [Doc Cam]
    Part 1:
    1. Dark knowledge
    2. Mentor nets and FitNets
    3. Autoencoders

    Part 2:
    1. Denoising autoencoders
    2. Generative adversarial networks

  • Feb 24 Lec 7
    Going deeper into convolutional networks.
    [Slides-I] [Slides of Residual Networks] [Slides of Generality] [Slides of Mentor Nets] [Audio - I] [Audio - II]
    Part 1:
    1. Dropout and Maxout for CNNs
    2. GoogLeNet and VGG
    3. Guest Lecture on ResNet

    Part 2:
    1. Generality of neural image features
    2. Dark knowledge

  • Feb 17 Lec 6
    The convolutional layer
    [Slides] [Doc Cam] [Audio - I] [Audio - II]
    Part 1:
    1. Sparsely connected networks.
    2. Weight sharing between neurons.
    3. Convolutional neuron.
    4. The max pooling operation.
    5. The conv-pool-activation (convpool) layer.
    6. The convolutional neural network.
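
The convolution and max-pooling operations listed in Part 1 can be sketched in a few lines of NumPy. This is a minimal illustration of the math only, not the course's or the Yann toolbox's code:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D cross-correlation (what CNN libraries call 'convolution')."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # Each output pixel is the weighted sum of one kh x kw patch.
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(x, size=2):
    """Non-overlapping max pooling with a size x size window."""
    h, w = x.shape[0] // size, x.shape[1] // size
    return x[:h * size, :w * size].reshape(h, size, w, size).max(axis=(1, 3))

img = np.arange(16, dtype=float).reshape(4, 4)
feat = conv2d(img, np.ones((2, 2)))   # 3x3 feature map
pooled = max_pool(feat, size=2)       # 1x1 after 2x2 pooling
```

A real convpool layer would add a bias, an activation, and many such kernels (one per feature map), but the sliding-window sum and the pooling reduction are exactly these two loops.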

    Part 2:
    1. LeNet
    2. AlexNet
    3. Dropout and Ensemble Learning (revisited in the next lecture).
    4. Miniproject 4 discussions.

  • Feb 10 Lec 5
    Midterm Exam and the Yann Toolbox.
    [Yann Toolbox] [Full Audio]
    Part 1:
    1. Midterm exam.

    Part 2:
    0. Surprise quiz!
    1. Solutions to midterm exam discussed.
    2. Tutorial on the Yann toolbox.


  • Feb 3 Lec 4
    Perceptron, Backpropagation and Multi-Layer Neural Networks.
    [Slides] [Audio - I] [Audio - II]

    Part 1:
    1. The biological neuron
    2. Linear regression is a neuron
    3. Rosenblatt perceptron
    4. Logistic neuron
    5. Single-layer multi-neuron multi-class networks
    6. Cross-entropy loss.
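
The logistic neuron and cross-entropy loss from Part 1 can be written out directly. A minimal NumPy sketch, for illustration only:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logistic_neuron(x, w, b):
    """A single logistic neuron: linear combination followed by a sigmoid."""
    return sigmoid(np.dot(w, x) + b)

def cross_entropy(y_true, y_pred, eps=1e-12):
    """Binary cross-entropy loss for one prediction (clipped for stability)."""
    y_pred = np.clip(y_pred, eps, 1 - eps)
    return -(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

x = np.array([1.0, 2.0])
w = np.zeros(2)
p = logistic_neuron(x, w, 0.0)   # 0.5 with zero weights and bias
loss = cross_entropy(1.0, p)     # -log(0.5)
```

With zero weights the neuron is maximally uncertain (p = 0.5), so the loss is log 2; training drives the weights to push p toward the true label.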

    Part 2:
    1. Modular backpropagation
    2. Multi-layer neural networks
    3. ReLU activations
    4. Maxout networks
    5. Batch normalization
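
The ReLU activation and the batch-normalization forward pass from Part 2 are simple enough to sketch in NumPy. This is an illustrative training-time forward pass only (no learned running statistics, no backward pass):

```python
import numpy as np

def relu(x):
    """Rectified linear unit: elementwise max(0, x)."""
    return np.maximum(0.0, x)

def batch_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    """Batch normalization: standardize each feature over the batch axis,
    then apply a learnable scale (gamma) and shift (beta)."""
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

batch = np.array([[1.0, -2.0],
                  [3.0,  4.0]])
h = relu(batch)          # negatives clipped to 0
hn = batch_norm(batch)   # each column roughly zero mean, unit variance
```
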
  • Jan 27 Lec 3
    Regularization and Optimization
    [Nuts and Bolts of DL Lecture] [Overfitting] [Slides] [Audio]

    Some slides appear twice: one copy is the notes from the doc cam and the other is my personal notes. I also forgot to restart the audio at some point, so it misses a few slides; as I said at the beginning, I don't guarantee the recordings will always work.

    Part 1:
    1. Review of maximum likelihood estimation (MLE).
    2. Basis Function Expansion.
    3. Regularization.
    4. Cross Validation.
    5. Optimization through Gradient Descent.

    Part 2:
    1. Geometry of Regularization.
    2. Non-Convex Error Surfaces.
    3. Stochastic, Batch and Online Gradient Descent.
    4. Second Order Methods.
    5. Momentum.
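
Gradient descent with momentum, the last item in Part 2, can be sketched in a few lines. A minimal illustration on a toy quadratic objective (not the course's code):

```python
import numpy as np

def gd_momentum(grad_fn, w0, lr=0.1, mu=0.9, steps=200):
    """Gradient descent with classical momentum:
    v <- mu * v - lr * grad(w);  w <- w + v."""
    w = np.array(w0, dtype=float)
    v = np.zeros_like(w)
    for _ in range(steps):
        v = mu * v - lr * grad_fn(w)
        w = w + v
    return w

# Minimize f(w) = ||w||^2, whose gradient is 2w; the minimum is at the origin.
w_final = gd_momentum(lambda w: 2.0 * w, [5.0, -3.0])
```

The velocity term v accumulates past gradients, which damps oscillation across steep directions and accelerates progress along shallow ones; swapping the full gradient for a minibatch gradient gives stochastic gradient descent with momentum.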
  • Jan 20 Lec 2
    Wrapping up intro to CV and moving on to Linear Regression
    [Edge Detection Code] [Hoggles Code] [DPM Code] [Employee Compensation Dataset] [Slides] [Audio] [Doc Cam Notes]

    Part 1:
    1. Sampling and quantization
    2. Pixel Representations
    3. Edges and Gradients
    4. Image Histograms
    5. Histogram of Oriented Gradients
    6. Deformable Part Models

    Part 2:
    1. Intro to Supervised Learning
    2. Linear Models, Linear Regression
    3. Objective Functions
    4. Flow of learning systems
    5. Least Squares and Analytical Solution
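
The analytical least-squares solution from Part 2 follows from the normal equations, w = (XᵀX)⁻¹Xᵀy. A minimal NumPy sketch fitting a line to exactly linear data, for illustration only:

```python
import numpy as np

# Fit y = w0 + w1*x by solving the normal equations (X^T X) w = X^T y.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 2.0 * x + 1.0                          # noiseless linear data
X = np.column_stack([np.ones_like(x), x])  # design matrix with a bias column
w = np.linalg.solve(X.T @ X, X.T @ y)      # intercept and slope
```

Since the data are exactly linear, the solver recovers the intercept 1 and slope 2; with noisy data the same formula gives the least-squares fit. In practice `np.linalg.lstsq` is preferred over forming XᵀX explicitly, for numerical stability.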

  • Jan 13 Lec 1
    Course logistics and introduction to deep learning and image representations.
    [Slides I] [Slides II]

    Part 1:
    1. What is Computer Vision?
    2. CV then and now, a historical perspective.
    3. Classification, then and now - non-neural vs. neural methods on ImageNet.
    4. Some results of CNNs. What is really new? What is really wrong? What all can they do?

    Part 2:
    1. Course Logistics and course plan.
    2. Basics of Image Acquisition (sampling and quantization).
    3. Fourier representations.
    4. Histogram representations.
    5. Edge map representations.