6.863J/9.611J Natural Language Processing
 
 
Course home
 

Staff
Prof. Robert C. Berwick
berwick@csail.mit.edu
32-D728, x3-8918
Office hours: W4:30-5:30pm

Course Support
Lynne Dell
lynne@mit.edu
32-D724, 617-324-1543
TA: Olga Wichrowska
olgaw@mit.edu
32-D740
Office hrs: Tu, 3-4pm.

Course Time & Place
Lectures: M, W 3-4:30 PM
Room: 32-144

Level & Prerequisites
Undergrad/Graduate; 6.034 or permission of instructor

Policies
Textbooks & readings
Grading marks guide
Style guide

Course Description

A laboratory-oriented course in the theory and practice of building computer systems for human language processing, with an emphasis on how human knowledge of language can be integrated into natural language processing.

This subject qualifies as an Artificial Intelligence and Applications concentration subject, Grad H level credit.

Textbook required for purchase or reference (on library reserve, Barker P98.J87 2009):
Jurafsky, D. and Martin, J.H., Speech and Language Processing
2nd edition, Prentice-Hall: 2008. 
Some of the chapters from the revised edition may be posted in PDF form, as per the schedule shown on the homepage.

Announcements:
• Please fill out the course evaluation form, here.
• Week 13: Under the volcano! No classes.
• Week 12: No class on Monday (Patriots' Day). Class on Wednesday!
• Week 11: Laboratory 5 on statistical parsing and lexical semantics released here. This is the last lab for the course.
Note that you will receive by email a particular verb to analyze as part of this assignment. Only the final project remains!
• Week 11: Free week. Focus on doing lab 4, and thinking about projects.
• Week 10: Laboratory 4 on Feature parsing released here.
• Week 9: Project ideas posted here.
• Week 7: Laboratory 3 on Parsing released here.
• Week 5: Laboratory 2 on Word parsing here.
• Week 4: Laboratory 1 on Statistical Language Models here.
• Week 2: CGW assignment posted here.
• Week 2: Slides posted; CGW assignment will be out Weds (please come to class!)
• Week 1: Please fill out the course questionnaire here.
• Week 1: Fun NLP link of the week: Postmodernist paper generator. Try 'writing' a new paper by following this link.
• Week 1: And then, if you think the 'hard' sciences are immune, you can follow this link.
• Week 1: Reading & response 1, available here, due next Monday for in-class discussion.


Class days in blue, holidays in green, reg add/drop dates in orange.

[Monthly calendars for February, March, April, and May 2010, with class days, holidays, and add/drop dates highlighted.]
Course schedule at a glance
Date
Topic
Slides & Reference Readings
Laboratory/Assignments

2/3
Weds

Introduction: walking the walk, talking the talk
Lecture 1 pdf slides; pdf 4-up; Jurafsky & Martin (JM), ch. 1.
If you don't know Python, read the NLTK book, chs. 1–3; otherwise, skim NLTK book, chs. 2–3.
Background Reading (for RR 1): Jurafsky & Martin ch.4 on ngrams. (pp. 83-94; p. 114-116)
Background Reading (for RR 1): Abney on statistics and language.
Background Reading (for RR 1): Chomsky, Extract on grammaticality, 1955.
(Optional) Background chapters on NLP from Russell & Norvig, ch. 22.
Reading & response (RR) 1 OUT
(Ngrams; NLTK Python warmup)
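For those warming up with Python ahead of the ngram readings, counting ngrams needs only the standard library. This sketch is illustrative, not from the lab handout; the corpus and function names are made up:

```python
from collections import Counter

def bigrams(tokens):
    """Return the list of adjacent token pairs in a token sequence."""
    return list(zip(tokens, tokens[1:]))

corpus = "the cat sat on the mat".split()
counts = Counter(bigrams(corpus))
print(counts[("the", "cat")])  # each adjacent pair counted once in this toy corpus
```

NLTK provides the same idea via `nltk.bigrams`, but the plain-Python version makes the bookkeeping visible.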

2/8
Mon
RR1 discussion; Bayes' rule and smoothing; from words to parsing
Lecture 2 pdf slides; pdf 4-up
• JM, ch. 5 (skim)
• NLTK book, part of speech tagging, ch. 4

Reading & response 1 DUE
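The add-one (Laplace) smoothing idea from the JM ch. 4 reading fits in a few lines. The counts and vocabulary size below are invented for illustration; real estimates would come from a training corpus:

```python
from collections import Counter

def laplace_prob(word, counts, vocab_size):
    """Unigram P(word) with add-one smoothing: (c(w) + 1) / (N + V)."""
    total = sum(counts.values())
    return (counts[word] + 1) / (total + vocab_size)

counts = Counter("the cat sat on the mat".split())
vocab_size = 1000  # assumed vocabulary size, for illustration only

p_unseen = laplace_prob("dog", counts, vocab_size)  # unseen word gets nonzero mass
p_the = laplace_prob("the", counts, vocab_size)     # seen words still rank higher
```

The point of the sketch: smoothing steals a little probability mass from seen events so that unseen events are never assigned zero.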


2/10
Weds
Parts of speech, parsers, and statistical parsing
Intro to Competitive Grammar Writing
• JM, ch. 13 (parsing), pp. 427-435; ch. 14, pp. 459-467
• (Optional) NLTK book on advanced parsing (skim)

Competitive Grammar Writing (CGW): teams assigned; CGW checkpoint OUT WEDS
Read CGW handout
CGW Checkpoint DUE FRI

2/16
Tues
Parsing & competitive grammar writing I

Bring notebook computer to class (at least 1 per team)


Competitive Grammar Writing I
2/17
Weds
Parsing & competitive grammar writing II

Bring notebook computer to class (at least 1 per team)


Competitive Grammar Writing II
2/22
Mon
Competitive Grammar Evaluation & Wrap-up
Bring notebook computer to class (at least 1 per team)

Competitive Grammar AWARDS
Lab 1 Statistical Language Models OUT

2/24
Weds
Word parsing and morphology

Lecture 5 pdf slides; pdf 4-up
• JM ch. 3

 
3/1
Mon

Word Parsing & POS tagging



3/3
Weds
Context-free parsing I

Lab 1 Statistical Language Models Lab DUE
Lab 2 Computational Morphology OUT
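The dynamic-programming idea behind the context-free parsing lectures can be seen in a compact CKY recognizer. The toy grammar (in Chomsky normal form) and sentence below are invented for the example:

```python
def cky_recognize(tokens, grammar, lexicon, start="S"):
    """CKY recognition for a grammar in Chomsky normal form.

    grammar: set of (parent, left_child, right_child) binary rules
    lexicon: set of (preterminal, word) unary rules
    """
    n = len(tokens)
    # chart[i][j] holds the nonterminals that can span tokens[i:j]
    chart = [[set() for _ in range(n + 1)] for _ in range(n + 1)]
    for i, w in enumerate(tokens):
        chart[i][i + 1] = {a for (a, word) in lexicon if word == w}
    for span in range(2, n + 1):            # widths, smallest first
        for i in range(n - span + 1):       # left edge
            j = i + span
            for k in range(i + 1, j):       # split point
                for (a, b, c) in grammar:
                    if b in chart[i][k] and c in chart[k][j]:
                        chart[i][j].add(a)
    return start in chart[0][n]

# Toy CNF grammar: S -> NP VP, NP -> Det N, VP -> V NP
grammar = {("S", "NP", "VP"), ("NP", "Det", "N"), ("VP", "V", "NP")}
lexicon = {("Det", "the"), ("N", "cat"), ("N", "mat"), ("V", "saw")}
print(cky_recognize("the cat saw the mat".split(), grammar, lexicon))  # True
```

Turning the recognizer into a parser (or a probabilistic one, as in the statistical parsing lectures) means storing backpointers or rule probabilities in the chart instead of bare nonterminals.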

3/8
Mon

Context-free parsing II



3/10
Weds
Parsing tricks


3/15
Mon
Statistical Parsing

Lecture 10 pdf slides; pdf 4-up
• JM ch. 14
• Background Reading: deMarcken, Lexical heads, phrase structure, & the induction of grammar, 1995.
• Background Reading: Collins, Head-driven statistical models for natural language processing, 2003.


 
3/17
Weds

Modern statistical parsers I; Evaluating Treebank parsers

3/29
Mon
Modern statistical parsers II
3/31
Weds
Lexical Semantics & Treebank parsers

4/5
Mon
Semantics I: the lambda calculus view
4/7
Weds
Semantics II: SQL
Lecture 15 pdf slides; pdf 4-up

Project proposal checkpoint
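The lambda-calculus view of semantics can be mimicked directly with Python lambdas: word meanings are functions, and sentence meanings come from applying them. The tiny model-theoretic fragment below is invented for illustration:

```python
# Word meanings as functions over a small fixed domain, Montague-style.
domain = {"fido", "rex", "felix"}
dog = lambda x: x in {"fido", "rex"}        # [[dog]]: e -> t
barks = lambda x: x in {"fido", "rex"}      # [[barks]]: e -> t
every = lambda p: lambda q: all(q(x) for x in domain if p(x))

# [[every dog barks]] = every(dog)(barks), by function application
print(every(dog)(barks))  # True: every individual satisfying dog also barks
```

The composition mirrors the syntax: the determiner takes the noun meaning, then the verb-phrase meaning, just as beta-reduction would in the lambda calculus.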

4/12
Mon

Patriots' Day





4/14
Weds



Laboratory 4 DUE
Laboratory 5 Lexical Semantics OUT
4/21
Weds
Volcano week


4/26
Mon
Discourse
Laboratory 5 Lexical Semantics DUE
4/28
Weds
Discourse
 
5/4
Mon
Language Learning
• Lecture 18 pdf slides; pdf 4-up
• Background Reading: Niyogi & Berwick, A language learning model for finite parameter spaces, 1996.
 
5/6
Weds
Language Learning & Language Change
• Lecture 19 pdf slides; pdf 4-up
Background Reading: Niyogi & Berwick, A dynamical systems model for language change, 1997.
5/11
Mon
Evolution of language
• Lecture 20 pdf slides; pdf 4-up
• Background Reading: Chomsky, Fitch, Hauser, The Faculty of Language
Background Reading: Berwick, Syntax Facit Saltum, 2008.
 
5/13
Weds
Evolution of language
• Lecture 21 pdf slides; pdf 4-up
 

 
