9.520: Statistical Learning Theory and Applications, Spring 2009


Class Times: Monday and Wednesday 10:30-12:00
Units: 3-0-9 H,G
Location: 46-5193
Instructors: Tomaso Poggio (TP), Ryan Rifkin (RR), Jake Bouvrie (JB),
Lorenzo Rosasco (LR), Charlie Frogner (CF)
Office Hours: By appointment
Email Contact: 9.520@mit.edu
Previous Class: SPRING 08

Course description

Focuses on the problem of supervised and unsupervised learning from the perspective of modern statistical learning theory, starting with the theory of multivariate function approximation from sparse data. Develops basic tools such as regularization, including support vector machines for regression and classification. Derives generalization bounds using stability. Discusses current research topics such as manifold regularization, sparsity, feature selection, Bayesian connections and techniques, and online learning. Emphasizes applications in several areas: computer vision, speech recognition, and bioinformatics. Discusses advances in the neuroscience of the cortex and their impact on learning theory and applications. The course is graded on the basis of final projects and hands-on applications and exercises.
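For readers new to the regularization framework that runs through the course (see Class 04, Regularized Least Squares), the following is a minimal, illustrative Python sketch of kernel regularized least squares. The Gaussian kernel, toy data, and parameter values are assumptions chosen for the example, not course-provided code.

    # Minimal sketch of kernel regularized least squares (Tikhonov regularization).
    # The kernel choice, toy data, and parameters (sigma, lam) are illustrative only.
    import numpy as np

    def gaussian_kernel(X1, X2, sigma=1.0):
        # K[i, j] = exp(-||x_i - x_j||^2 / (2 * sigma^2))
        sq = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
        return np.exp(-sq / (2 * sigma**2))

    def rls_fit(X, y, lam=0.1, sigma=1.0):
        # Coefficients of the kernel expansion: solve (K + lam * n * I) c = y,
        # which minimizes (1/n) * sum_i (y_i - f(x_i))^2 + lam * ||f||_K^2.
        n = X.shape[0]
        K = gaussian_kernel(X, X, sigma)
        return np.linalg.solve(K + lam * n * np.eye(n), y)

    def rls_predict(X_train, X_test, c, sigma=1.0):
        # f(x) = sum_i c_i * K(x, x_i)
        return gaussian_kernel(X_test, X_train, sigma) @ c

    # Toy usage: fit a noisy sine curve.
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(50, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)
    c = rls_fit(X, y, lam=0.01, sigma=0.5)
    y_hat = rls_predict(X, X, c, sigma=0.5)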

Prerequisites

6.867 or permission of instructor. In practice, a substantial level of mathematical maturity is necessary. Familiarity with probability and functional analysis will be very helpful. We try to keep the mathematical prerequisites to a minimum, but we will introduce complicated material at a fast pace.

Grading

There will be two problem sets and a final project. To receive credit, you must attend class regularly and put in effort on both problem sets and the final project.

Problem sets

Problem set #1: PDF --
Problem set #2: PDF -- Due Mon. April 13th (in class)

Projects

Project ideas: PDF

Syllabus

Follow the link for each class to find a detailed description, suggested readings, and class slides. Some of the later classes may be reordered or rescheduled.



Class  Date  Title  Instructor(s)
Class 01 Wed 04 Feb The Course at a Glance TP
Class 02 Mon 09 Feb The Learning Problem and Regularization TP
Class 03 Wed 11 Feb Reproducing Kernel Hilbert Spaces LR
Mon 16 Feb - Presidents' Day (no class)
Class 04 Tue 17 Feb Regularized Least Squares RR
Class 05 Wed 18 Feb Several Views Of Support Vector Machines RR
Class 06 Mon 23 Feb Multiclass Classification RR
Class 07 Wed 25 Feb Spectral Regularization LR
Class 08 Mon 02 Mar Manifold Regularization LR
Class 09 Wed 04 Mar Generalization Bounds, Intro to Stability LR/TP
Class 10 Mon 09 Mar Stability of Tikhonov Regularization LR/TP
Class 11 Wed 11 Mar Sparsity Based Regularization I LR
Class 12 Mon 16 Mar Regularization for Multi-Output Learning LR
Class 13 Wed 18 Mar Loose ends, Project discussions
SPRING BREAK March 23-27
Class 14 Mon 30 Mar Sparsity, rank, and all that Ben Recht
Class 15 Wed 01 Apr Bayesian Interpretations of Regularization CF
Class 16 Mon 06 Apr A Bayesian Perspective on Statistical Learning Theory Dan Roy
Class 17 Wed 08 Apr Nonparametric Bayesian Regression and Density Estimation Vikash
Class 18 Mon 13 Apr Hierarchical Bayesian Modeling for Unsupervised Learning Vikash
Class 19 Wed 15 Apr Geometry and Learning Partha Niyogi
Mon 20 Apr - Patriots' Day (no class)
Class 20 Wed 22 Apr Demographic forecasting and the role of priors Federico Girosi
Class 21 Mon 27 Apr Vision and Visual Neuroscience TP
Class 22 Wed 29 Apr Vision and Visual Neuroscience Thomas Serre
Class 23 Mon 04 May Derived Kernels JB
Class 24 Wed 06 May Application of Belief Nets to Modelling Attention Sharat/Thomas
Class 25 Mon 11 May Project Presentations
Class 26 Wed 13 May Project Presentations

Math Camp: Tue 09 Feb, 5:00pm-7:00pm
Probability theory notes
Old Math Camp slides: Functional analysis
Old Math Camp slides: Probability theory

Reading List

There is no textbook for this course. All the required information will be presented in the slides associated with each class. The books/papers listed below are useful general reference reading, especially from the theoretical viewpoint. A list of suggested readings will also be provided separately for each class.

Primary References

Secondary References

Background Mathematics References

Neuroscience Related References