9.520/6.860: Statistical Learning Theory and Applications, Fall 2018



Units: 3-0-9 H,G
Class Times: Monday and Wednesday: 1:00 pm - 2:30 pm
Location: 46-3310 (PILM Seminar Room)
Instructors:

Tomaso Poggio (TP), Lorenzo Rosasco (LR), Alexander Rakhlin (AR), Andrzej Banburski (AB)

TAs:

Michael Lee, Nhat Le, David Zhou

Office Hours: Friday 1:00 pm - 2:00 pm, 46-5156 (Poggio lab lounge) and/or 46-5165 (MIBR Reading Room)
Email Contact: 9.520@mit.edu
Previous Class: Fall 2017, 2017 lecture videos
Registration: Please register for 9.520/6.860 by filling out this registration form
Mailing list: Registered students will be added to the course mailing list (9520students)
Stellar page: http://stellar.mit.edu/S/course/9/fa18/9.520/

Course description

The course covers foundations and recent advances of machine learning from the point of view of statistical learning and regularization theory.

Understanding intelligence and how to replicate it in machines is arguably one of the greatest problems in science. Learning, its principles and computational implementations, is at the very core of intelligence. During the last decade, for the first time, we have been able to develop artificial intelligence systems that can solve complex tasks that until recently were the exclusive domain of biological organisms, such as computer vision, speech recognition, and natural language understanding: cameras recognize faces, smartphones understand voice commands, smart speakers/assistants answer questions, and cars can see and avoid obstacles. The machine learning algorithms at the root of these success stories are trained with examples rather than programmed to solve a task.

Among the different approaches in modern machine learning, the course focuses on a regularization perspective and includes both shallow and deep networks. The content is roughly divided into three parts. In the first part, key algorithmic ideas are introduced, with an emphasis on the interplay between modeling and optimization aspects. Algorithms that will be discussed include classical regularization networks (regularized least squares, SVM, logistic regression), stochastic gradient methods, implicit regularization, sketching, sparsity-based methods, and deep neural networks. In the second part, key ideas in statistical learning theory will be developed to analyze the properties of the various algorithms previously introduced. Classical concepts like generalization, uniform convergence, and Rademacher complexities will be developed, together with topics such as bounds based on margin, stability, and privacy. The final part of the course focuses on deep learning networks. It will introduce an emerging theoretical framework addressing three key puzzles in deep learning: approximation theory -- which functions can be represented more efficiently by deep networks than by shallow networks -- optimization theory -- why stochastic gradient descent can easily find global minima -- and machine learning -- whether classical learning theory can explain generalization in deep networks. It will also discuss connections with the architecture of the visual cortex, which was the original inspiration for the layered, local connectivity of modern networks and may provide ideas for future developments of deep learning.
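As an illustration of the regularization perspective mentioned above, the following is a minimal sketch of regularized least squares (Tikhonov regularization) in MATLAB/Octave, the environment students are expected to be familiar with. The synthetic data and variable names are assumptions made for illustration only; this sketch is not part of the official course material.

% Minimal regularized least squares (ridge regression) sketch.
% Synthetic data; all names and values here are illustrative assumptions.
n = 100; d = 10;                       % number of examples, input dimension
X = randn(n, d);                       % inputs
w_true = randn(d, 1);                  % ground-truth weights
y = X * w_true + 0.1 * randn(n, 1);    % noisy linear targets

lambda = 0.1;                          % regularization parameter
% Closed-form solution of min_w (1/n)*||X*w - y||^2 + lambda*||w||^2
w = (X' * X + lambda * n * eye(d)) \ (X' * y);

mse = mean((X * w - y).^2);            % training mean squared error
fprintf('training MSE: %g\n', mse);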

The goal of the course is to provide students with the theoretical knowledge and the basic intuitions needed to use and develop effective machine learning solutions to challenging problems.

Prerequisites

We will make extensive use of basic notions of calculus, linear algebra, and probability. The essentials are covered in class and in the math camp material. We will introduce a few concepts in functional/convex analysis and optimization. Note that this is an advanced graduate course and some exposure to introductory machine learning concepts or courses is expected. Students are also expected to have basic familiarity with MATLAB/Octave.

Grading

Requirements for grading are attending lectures/participation (10%), four problem sets (60%), and a final project (30%).

Grading policies and tentative problem set and project dates: (slides)

Problem Sets

Problem Set 1, out: Sep. 19, due: Tue., Sep. 25 (Class 07).
Problem Set 2, out: Oct. 03, due: Tue., Oct. 09 (Class 10).
Problem Set 3, out: Oct. 31, due: Sat., Nov. 10 (Class 18).
Problem Set 4, out: Nov. 14, due: Tue., Nov. 20 (Class 21).

Submission instructions: Follow the instructions included with the problem set. Use the LaTeX template for the report (there is a maximum page limit). Submit your report online through Stellar by the due date/time, and hand in a printout in the first class after the due date.

Projects

Guidelines and key dates. Online form for project proposal (complete by Nov. 01).

Reports are expected to be within 5 pages, written as extended abstracts using the NIPS style files.

Projects archive

List of Wikipedia entries created or edited as part of projects during previous course offerings.


Syllabus

Follow the link for each class to find a detailed description, suggested readings, and class slides. Some of the later classes may be subject to reordering or rescheduling.

Class Date Title Instructor(s)

Reading List

Notes covering the classes will be provided in the form of independent chapters of a book currently in draft format. Additional information will be given through the slides associated with classes (where applicable). The books/papers listed below are useful general reference reading, especially from the theoretical viewpoint. A list of additional suggested readings will also be provided separately for each class.

Book (draft)

Primary References

Resources and links



Announcements