Deep-learning and symbolic models of human language, Spring 2020

1 Course information

Course number 9.S916

2 Instructor information

Instructor Roger Levy (rplevy@mit.edu)
Instructor's office 46-3033
Instructor's office hours Mondays 2-4pm

3 Course Description

This once-weekly advanced reading seminar will be devoted to recent and ongoing work on computational models of human language. Deep learning, symbolic models, and a range of approaches bringing the two together will be covered. There will be special focus on research that sheds light on language acquisition and processing in human minds and brains. This seminar offers a regular forum for cross-disciplinary interaction among cognitive scientists, linguists, and computer scientists interested in understanding and improving contemporary computational models of language. Participating in this seminar will offer you opportunities to gain expertise in state-of-the-art methods in computational linguistics, develop and implement your own novel ideas for models, and forge relationships and collaborations with peers and colleagues.

4 Course organization

We'll meet once a week; seminar participants will take turns leading open discussions of recent papers, including both technical details and how the work fits into the larger current research landscape.

5 Intended Audience

Undergraduate or graduate students in Brain & Cognitive Sciences, Linguistics, Electrical Engineering & Computer Science, and any of a number of related disciplines. Postdocs and faculty are also welcome to participate. Participants should have prior experience with contemporary computational models of language, through an advanced class such as 9.19 (Computational Psycholinguistics), 6.864 (Natural Language Processing), or their own research. If you are interested in participating but unsure about preparation, or if you cannot make it to the first meeting, please contact the instructor.

6 Readings

To be determined based on the interests of seminar participants. We will make sure to cover core techniques widely used in the field (recurrent neural networks, Transformers, probabilistic symbolic grammars, …), as well as newer approaches still in development.

Readings are accessible on Stellar.

| Week | Day  | Topic                  | Readings                                                            | Presenter         | Related readings                                                                                                                                       |
|------|------|------------------------|---------------------------------------------------------------------|-------------------|--------------------------------------------------------------------------------------------------------------------------------------------------------|
| 1    | 2/4  | Introduction           | None                                                                |                   |                                                                                                                                                          |
| 2    | 2/14 | RNNs                   | Goldberg 2017, 14.1-14.3, 14.5, 15.2-15.3; Weiss et al., 2018       |                   | RNN Tutorial; Elman, 1991; Christiansen & Chater, 1999; Chen et al., 2018; Peng et al., 2018; Understanding LSTMs; Neural History of NLP                 |
| 3    | 2/21 | Grammar                | Jurafsky & Martin, Chapter 12; Eisenstein, Chapter 9; Dyer et al., 2016 |               | Linzen et al., 2016; Kuncoro et al., 2017; Wilcox et al., 2018; Hale et al., 2018; Hale et al., 2019                                                     |
| 4    | 2/28 | No Meeting             |                                                                     |                   |                                                                                                                                                          |
| 5    | 3/6  | Compositionality       | Hupkes et al., 2020                                                 |                   | Soulos et al., 2019; Andreas, 2019; Eisenstein 2018, Section 18.3; Luong et al., 2015                                                                    |
| 6    | 3/13 | Classes Cancelled      | Luong et al., 2015                                                  |                   | Bahdanau et al., 2015                                                                                                                                    |
| 7    | 3/20 | Classes Cancelled      |                                                                     |                   |                                                                                                                                                          |
|      | 3/27 | Spring Break           |                                                                     |                   |                                                                                                                                                          |
| 8    | 4/3  | Contrastive Estimation | Kong et al., 2020                                                   | Peng              | van den Oord et al., 2018                                                                                                                                |
| 9    | 4/10 | Priming                | Prasad et al., 2019                                                 | Hector            |                                                                                                                                                          |
| 10   | 4/17 | Attention              | Jain & Wallace, 2019; Wiegreffe & Pinter, 2019                      | Natasha & Dariusz |                                                                                                                                                          |
| 11   | 4/24 | Probes                 | Pimentel et al., 2020                                               | Noga              | Hewitt & Manning, 2019; Hewitt & Liang, 2019                                                                                                             |
| 12   | 5/1  | Pragmatics             | Monroe et al., 2017; McDowell & Goodman, 2019                       | Cathy             | Achlioptas et al., 2019                                                                                                                                  |
| 13   | 5/8  | Morphology             | Corkery et al., 2019; McCurdy et al., 2020 (on Stellar; optional)   | Roger             | Kirov & Cotterell, 2018; Elsner et al., 2019; Albright & Hayes, 2003                                                                                     |

7 Background Resources

The following texts are all useful background resources for this seminar:

  1. Yoav Goldberg. 2017. Neural network methods for natural language processing. Synthesis Lectures on Human Language Technologies, 10(1), 1-309. You can get a PDF version of this textbook through MIT Libraries.

    You'll find this text most accessible if you have some background in machine learning, but there is a brisk introductory section that will get you up to speed regardless.

  2. Daniel Jurafsky and James H. Martin. Speech and Language Processing. Third edition (draft). Draft chapters can be found at https://web.stanford.edu/~jurafsky/slp3/.

    This textbook is the single most comprehensive and up-to-date introduction available to the field of computational linguistics.

  3. Christopher D. Manning and Hinrich Schütze. 1999. Foundations of Statistical Natural Language Processing. Cambridge, MA: MIT Press. Book chapter PDFs can be obtained through the MIT library website.

    This is an older but still very useful book on natural language processing (NLP).

  4. Jacob Eisenstein. 2019. Introduction to Natural Language Processing. MIT Press.

    This is a brand-new textbook; I haven't used it in teaching before, but it looks very good. You can get a very usable draft from the author's GitHub repository (https://github.com/jacobeisenstein/gt-nlp-class), and I will be pointing to chapters in that draft.

  5. Ian Goodfellow, Yoshua Bengio, and Aaron Courville. 2016. Deep Learning. MIT Press.

    This book is not NLP-specific but may be useful to consult.

8 Requirements & grading

You need to do the readings in advance, show up to class, lead discussions when it is your turn, and complete a final project. There will be ample opportunities to collaborate with fellow seminar participants and to get guidance and advice from the instructor.

9 Mailing list

There is a mailing list for this seminar: http://mailman.mit.edu/mailman/listinfo/9.s916-spring2020. This mailing list will be used both for organizational purposes and for communication about seminar content. If you plan to participate in the seminar, please subscribe!

Author: Roger Levy (rplevy@mit.edu)

Created: 2020-05-07 Thu 11:58
