
About

I am a Postdoctoral Associate in the Linguistics and Philosophy Department at the Massachusetts Institute of Technology. I received my Ph.D. from the Department of Linguistics at Cornell University. While at Cornell, I was also affiliated with C.Psyd, the Cornell Computational Linguistics Lab, the Cornell NLP group, and the Cornell Phonetics Lab. I received my B.A. in Computer Science and Mathematics from Columbia University.

Research

I am interested in computational psycholinguistics. In particular, I want to understand the representational content of language and how language is processed incrementally. Within linguistics, I am interested in the interaction of levels of linguistic representation, such as the mapping of phonetics to phonology, the relationship between prosodic and morphological structure, and how pragmatics and discourse shape syntactic representations. Within psycholinguistics and cognitive science, I am interested in event representations and how linguistic structure follows from dynamical systems.

My dissertation, "On the Limitations of Data: Mismatches between Neural Models of Language and Humans," argued for using neural models of language as evidence of mismatches between linguistic data and human linguistic knowledge. I explored three key phenomena: implicit causality, ambiguous relative clause attachment, and the interaction between coreference and the Binding Principles from Chomsky (1981). These case studies show concrete instances where linguistic data leads models to non-human-like linguistic systems.

Recently, I have been working on limits of computational modeling that stem from the distinction between language comprehension and production, using relative clause attachment as a case study; on the phonological representation of compounds and stress, using corpora; and on which aspects of pragmatic and discourse knowledge can be acquired from raw text. More broadly, I have been focused on developing a framework for interpreting neural models in light of linguistic theory. Code for aspects of my previous work can be found on GitHub.

[CV]

Upcoming Presentations

TBA
Harvard Language & Cognition (LangCog)
February 28, 2023

Publications

Incremental Processing of Principle B: Mismatches Between Neural Models and Humans
Forrest Davis
Proceedings of the 2022 Conference on Computational Natural Language Learning (CoNLL 2022)
[preprint]

Uncovering Constraint-Based Behavior in Neural Models via Targeted Fine-Tuning
Forrest Davis and Marten van Schijndel
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics (ACL 2021)
[paper]

Finding Event Structure in Time: What Recurrent Neural Networks can tell us about Event Structure in Mind
Forrest Davis and Gerry T.M. Altmann
Cognition (in press)
[preprint]

Discourse structure interacts with reference but not syntax in neural language models
Forrest Davis and Marten van Schijndel
Proceedings of the 2020 Conference on Computational Natural Language Learning (CoNLL 2020)
[paper]

Interaction with Context During Recurrent Neural Network Sentence Processing
Forrest Davis and Marten van Schijndel
Proceedings of the 42nd Annual Meeting of the Cognitive Science Society (CogSci 2020)
[preprint] [proceedings]

Recurrent Neural Networks Always Learn English-like Relative Clause Attachment
Forrest Davis and Marten van Schijndel
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics (ACL 2020)
[paper]

Linguistically Rich Vector Representations of Supertags for TAG Parsing
Dan Friedman, Jungo Kasai, R. Thomas McCoy, Robert Frank, Forrest Davis, Owen Rambow
Proceedings of the 13th International Workshop on Tree Adjoining Grammars and Related Formalisms, pages 122-131, 2017.
[paper]

Presentations

Incremental Processing of Principle B: Mismatches Between Neural Models and Humans
Forrest Davis
2022 Conference on Computational Natural Language Learning (CoNLL 2022)
Abu Dhabi, United Arab Emirates. December 7-8, 2022.

Uncovering Constraint-Based Behavior in Neural Models via Targeted Fine-Tuning
Forrest Davis and Marten van Schijndel
59th Annual Meeting of the Association for Computational Linguistics (ACL 2021)
Virtually. August 2-4, 2021. [poster] [video]

Discourse structure interacts with reference but not syntax in neural language models
Forrest Davis and Marten van Schijndel
2020 Conference on Computational Natural Language Learning (CoNLL 2020)
Virtually. November 19-20, 2020. [video]

Interaction with context during recurrent neural network sentence processing
Forrest Davis and Marten van Schijndel
42nd Annual Meeting of the Cognitive Science Society (CogSci 2020).
Virtually, my office. July 29-August 1, 2020.
[poster] [video] [transcript]

Recurrent Neural Networks Always Learn English-like Relative Clause Attachment
Forrest Davis and Marten van Schijndel
58th Annual Meeting of the Association for Computational Linguistics (ACL 2020).
Virtually, my office. July 6-8, 2020.
[slides] [video]

Recurrent neural networks use discourse context in human-like garden path alleviation
Forrest Davis and Marten van Schijndel
33rd Annual CUNY Conference on Human Sentence Processing (CUNY 2020).
Amherst, MA. March 19-21, 2020.
[abstract] [poster] [osf (includes video)]

Categorical and gradient dimensions of stress in English compounds
Forrest Davis and Abigail C Cohn
2020 Workshop of the Berkeley Linguistics Society (BLSW 2020).
Berkeley, CA. February 7-8, 2020.
[abstract] [poster]

The relationship between lexical frequency, compositionality, and phonological reduction in English compounds
Forrest Davis and Abigail C Cohn
94th Annual Meeting of the Linguistic Society of America (LSA 2020).
New Orleans, LA. January 2-5, 2020.
[abstract] [poster]

Effects of lexical frequency and compositionality on phonological reduction in English compounds
Forrest Davis and Abigail C Cohn
25th Architectures and Mechanisms of Language Processing Conference (AMLaP 2019).
Moscow, Russia. September 6-8, 2019.
[abstract] [poster]

The pragmatics of single wh-in-situ questions in English
Forrest Davis
93rd Annual Meeting of the Linguistic Society of America (LSA 2019).
New York, NY. January 3-6, 2019.
[poster]

Teaching

SPRING 2023 @ MIT
Lead Instructor
Topics in Computational Linguistics

FALL 2022 @ MIT
Lead Instructor
Special Seminar: Methods in Computational Linguistics
[link]

FALL 2021 @ Cornell
Teaching Assistant
LING 2223: Language and Law
Professor: Molly Diesing

SPRING 2021 @ Cornell
Teaching Assistant
COGST 1101: Introduction to Cognitive Science
Professor: Khena Swallow

SPRING 2020 @ Cornell
Teaching Assistant
LING 4424: Computational Linguistics
Professor: Marten van Schijndel

FALL 2019 @ Cornell
Research Assistant
Professor: Marten van Schijndel

SPRING 2019 @ Cornell
Teaching Assistant
LING 4424: Computational Linguistics
Professor: Natalie DelBusso

FALL 2018 @ Cornell
Teaching Assistant
LING 1101: Introduction to Linguistics
Professor: Miloje Despic

Bonus Content

Outside of academics, I swam competitively through college. I now swim leisurely, as well as rock climb, hike, and cross-country ski. I cohabitate with a cat named Figaro, who spends his time sleeping throughout our apartment, sitting on my keyboard, and meowing for food.

I have served as a senior editor for SALT 29, an editor for SALT 28, and a member of the organizing committee for NELS 49. Additionally, I have reviewed for ACL, CoNLL, EMNLP, HSP, and PLC.

Last Updated: 29 January 2023