9.520: Statistical Learning Theory and Applications, Spring 2007

Class Times: Monday and Wednesday 10:30-12:00
Units: 3-0-9 H,G
Location: 46-5193
Instructors: Tomaso Poggio (TP), Ryan Rifkin (RR), Jake Bouvrie (JB), Lorenzo Rosasco (LR)
Office Hours: By appointment
Email Contact: 9.520@mit.edu
Previous Class: SPRING 06

Course description

The 2007 spring edition is an updated version of the course, which has been taught for several years. It focuses on the problems of supervised and unsupervised learning from the perspective of modern statistical learning theory, starting with the theory of multivariate function approximation from sparse data. The course develops basic tools such as regularization, including support vector machines for regression and classification, and derives generalization bounds using stability. It discusses current research topics such as manifold regularization, feature selection, Bayesian connections and techniques, and online learning. More than in previous years, it emphasizes applications in several areas: computer vision, speech recognition, and bioinformatics. It also discusses advances in the neuroscience of cortex and their impact on learning theory and applications. The course includes final projects and hands-on applications and exercises.
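To give a concrete flavor of the regularization tools mentioned above, here is a minimal sketch of Tikhonov-regularized least squares with a Gaussian kernel (kernel ridge regression). It is an illustration only, not taken from the course materials; the function names, the toy data, and the parameter choices (lam, sigma) are ours.

```python
# Illustrative sketch: Tikhonov regularization for least squares
# with a Gaussian kernel. All names and data here are made up for
# demonstration; see the class slides for the actual treatment.
import numpy as np

def gaussian_kernel(X1, X2, sigma=1.0):
    """Gram matrix K[i, j] = exp(-||x_i - x_j||^2 / (2 sigma^2))."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def train_rls(X, y, lam=0.1, sigma=1.0):
    """Solve (K + lam * n * I) c = y for the coefficients c."""
    n = X.shape[0]
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + lam * n * np.eye(n), y)

def predict_rls(X_train, c, X_test, sigma=1.0):
    """Evaluate f(x) = sum_i c_i k(x_i, x) at the test points."""
    return gaussian_kernel(X_test, X_train, sigma) @ c

# Toy usage: fit a noisy sine curve on 50 points.
rng = np.random.default_rng(0)
X = np.linspace(0.0, 2.0 * np.pi, 50)[:, None]
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)
c = train_rls(X, y, lam=1e-3, sigma=0.5)
y_hat = predict_rls(X, c, X, sigma=0.5)
```

The regularization parameter lam trades off fit against smoothness; the closed-form linear solve is what makes regularized least squares a convenient baseline next to SVMs.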


Prerequisites

6.867 or permission of instructor. In practice, a substantial level of mathematical maturity is necessary. Familiarity with probability and functional analysis will be very helpful. We try to keep the mathematical prerequisites to a minimum, but we will introduce complicated material at a fast pace.


Grading

There will be two problem sets, a Matlab assignment, and a final project. To receive credit, you must attend regularly and put effort into all problem sets and the project.

Problem sets

Problem set #1: PDF -- Due Mon., March 19th.
Problem set #2: PDF -- Due Mon., April 23rd.


Project ideas: PDF (list as of 03/01/07)


Syllabus

Follow the link for each class to find a detailed description, suggested readings, and class slides. Some of the later classes may be subject to reordering or rescheduling.

Class Date Title Instructor(s)
Class 01 Wed 07 Feb The Course at a Glance TP
Class 02 Mon 12 Feb The Learning Problem and Regularization
Introduction to Fenchel Duality
Class 03 Wed 14 Feb Reproducing Kernel Hilbert Spaces
Tikhonov Regularization, Value Regularization, and Fenchel Duality
Mon 19 Feb - Presidents' Day (no class)
Class 04 Tue 20 Feb Regularized Least Squares RR
Class 05 Wed 21 Feb Several Views Of Support Vector Machines RR
Class 06 Mon 26 Feb Manifold Regularization LR
Class 07 Wed 28 Feb Sparse Approximation and Variable Selection LR
Class 08 Mon 05 Mar Iterative Optimization Techniques Ross Lippert
Class 09 Wed 07 Mar Generalization Bounds, Intro to Stability Sasha Rakhlin
Class 10 Mon 12 Mar Stability of Tikhonov Regularization Sasha Rakhlin
Class 11 Wed 14 Mar Bayesian Methods (TP's slides) TP+Vikash
Class 12 Mon 19 Mar Online Learning Sanmay Das
Class 13 Wed 21 Mar Loose ends, Project discussions
Class 14 Mon 02 Apr Multiclass Classification RR
Class 15 Wed 04 Apr Iterative Optimization Techniques RR
Class 16 Mon 09 Apr Vision and Visual Neuroscience TP
Class 17 Wed 11 Apr A Somewhat Unified Approach to Semi- and Un-supervised Learning Ben Recht
Class 18 Wed 18 Apr Learning in Circuits of Spiking Neurons - Hedonistic Synapses and Dynamic Conductance Perturbation Sebastian Seung
Class 19 Mon 23 Apr Computer Vision, Object Detection Stan Bileschi
Class 20 Wed 25 Apr Speech, Audio, and Auditory Neuroscience JB
Class 21 Mon 30 Apr Vision and Visual Neuroscience Thomas Serre
Class 22 Wed 02 May Energy-Based Models: the cure against Bayesian fundamentalism - slides | paper Yann LeCun
Class 23 Mon 07 May Conversion to the Bayesian Cult Rev. Sayan Mukherjee
Class 24 Wed 09 May Morphable Models for Video Tony Ezzat
Class 25 Mon 14 May Project Presentations
Class 26 Wed 16 May Project Presentations

Math Camp 1 Functional analysis
Math Camp 2 Probability theory

Reading List

There is no textbook for this course. All the required information will be presented in the slides associated with each class. The books/papers listed below are useful general reference reading, especially from the theoretical viewpoint. A list of suggested readings will also be provided separately for each class.

Primary References

Secondary References

Background Mathematics References