Kyle Wade Grove

Linguistics, Cornell University
email: kwg33 "at" name-of-school "dot" edu

Activities

At Cornell University, my work has examined psycholinguistic phenomena by building computational models of the kinds of difficulty humans encounter during structural language processing. I am specifically interested in how statistical and grammatical knowledge are used in online sentence processing. I model this knowledge by building probabilistic multiple context-free grammars (Nakanishi 1994) with rich lexical models. The dependent variables of interest are information-theoretic metrics (Shannon 1948) that predict what kinds of sentence processing difficulty a human comprehender will encounter.

As of July 2012, I will be working as Director of Artificial Intelligence/Senior Computational Linguist at AskZiggy, Inc. in Sacramento, California. AskZiggy is a cross-platform voice-activated personal assistant and NLPaaS provider for mobile applications, and I will work directly on enhancing features, infrastructure, and recognition.

Semisupervised Discriminative Machine Learning

My most recent work examines how discriminative machine learning models for named entity recognition and domain identification can be built for novel domains and registers with a minimum of annotated data.
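The general bootstrapping idea can be illustrated with a minimal self-training sketch. This is a toy count-based scorer standing in for a real discriminative classifier, and all names, data, and thresholds below are invented for illustration:

```python
from collections import Counter

def train(docs):
    """docs: (tokens, domain) pairs -> per-domain word counts."""
    model = {}
    for tokens, domain in docs:
        model.setdefault(domain, Counter()).update(tokens)
    return model

def predict(model, tokens):
    """Score each domain by overlapping word counts; return (best, confidence)."""
    scores = {d: sum(counts[t] for t in tokens) for d, counts in model.items()}
    best = max(scores, key=scores.get)
    total = sum(scores.values())
    return best, (scores[best] / total if total else 0.0)

def self_train(seed, unlabeled, rounds=2, threshold=0.6):
    """Each round, label the unlabeled docs the current model is confident
    about and fold them into the training set for the next round."""
    labeled = list(seed)
    for _ in range(rounds):
        model = train(labeled)
        remaining = []
        for tokens in unlabeled:
            domain, conf = predict(model, tokens)
            if conf >= threshold:
                labeled.append((tokens, domain))
            else:
                remaining.append(tokens)
        unlabeled = remaining
    return train(labeled)

# Two seed-labeled utterances; three unlabeled ones (invented data).
seed = [(["book", "a", "flight", "to", "boston"], "travel"),
        (["play", "some", "jazz"], "music")]
unlabeled = [["flight", "from", "denver"],
             ["jazz", "radio", "station"],
             ["radio", "station", "in", "denver"]]  # labelable only after bootstrapping
model = self_train(seed, unlabeled)
```

The third unlabeled utterance shares no words with the seeds, but once the first two are confidently labeled in round one, their vocabulary lets round two label it as well.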

Information Theoretic Metrics and Linking Theories of Cognitive Difficulty

My graduate work compares the information-theoretic metrics of surprisal (Hale 2001) and entropy, both formally and empirically. I have been pursuing the hypothesis that surprisal and entropy map onto different kinds of psycholinguistic phenomena: surprisal onto surprising garden-path continuations, and entropy onto center-embedded structures and weak islands, which stress the resource limitations of a limited-parallelism sentence processor.
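As a toy illustration of the two metrics (the probabilities below are invented, not drawn from any fitted grammar), both can be computed from a conditional next-word distribution:

```python
import math

def surprisal(p_next, word):
    """Surprisal in bits: -log2 of the word's conditional probability."""
    return -math.log2(p_next[word])

def entropy(p_next):
    """Shannon entropy in bits of the distribution over continuations."""
    return -sum(p * math.log2(p) for p in p_next.values() if p > 0)

# Invented next-word distribution after an ambiguous prefix.
p_next = {"past": 0.5, "quickly": 0.4, "fell": 0.1}
print(surprisal(p_next, "fell"))  # high: an unexpected continuation
print(entropy(p_next))            # the parser's uncertainty about what comes next
```

Surprisal is a property of the word that actually arrives; entropy is a property of the distribution before it arrives, which is why the two can dissociate across construction types.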

Processing Verbal Structure in Garden Path Sentences

I have been examining the reduced relative clause (RRC) processing asymmetry, first reported by Stevenson and Merlo (1997):

"The horse raced past the barn fell."   Unergative RRC
"The cakes baked in the oven fell."     Unaccusative RRC

My account is that the unaccusative case is the standard reduced relative, and that the unergative case is made difficult by a co-occurrence restriction on the PP "past the barn". In the unergative case, the intransitive directed-motion construction exhibits an adjunct/argument ambiguity for the PP, attested in Zubizarreta and Oh (2007) and others, while the causative case allows only an argument attachment of the PP. Thus, in the unergative case the PP ambiguity compounds the main-clause ambiguity. This account avoids appeal to the lexicon and other strategy-based explanations, since the same causative head is implicated both in the unaccusative case and in the unergative case (the directed-motion construction).
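The compounding effect can be sketched with invented analysis probabilities (purely illustrative numbers, not model output): "fell" survives only under the reduced-relative analysis of the prefix, so its surprisal reflects how little probability that analysis retains.

```python
import math

# Invented probabilities over analyses of each prefix; not fitted values.
# Unergative "The horse raced past the barn": the PP argument/adjunct
# ambiguity further inflates the main-clause analyses.
unergative = {"main_arg_pp": 0.90, "main_adj_pp": 0.08, "rrc": 0.02}
# Unaccusative "The cakes baked in the oven": only the main/RRC ambiguity.
unaccusative = {"main": 0.80, "rrc": 0.20}

def rrc_surprisal(analyses):
    """Surprisal in bits of a continuation ('fell') compatible only with
    the reduced-relative-clause analysis of the prefix."""
    return -math.log2(analyses["rrc"])

print(rrc_surprisal(unergative))    # larger: the classic strong garden path
print(rrc_surprisal(unaccusative))  # smaller: the milder unaccusative case
```

Because two ambiguities rather than one drain probability away from the RRC analysis in the unergative case, the disambiguating verb is predicted to be more surprising there, matching the reported asymmetry.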



Presentations

Surprisal Derives the Recent Filler Heuristic in Mildly Context Sensitive Grammars. Poster, 10th Annual Conference on Tree Adjoining Grammars and Related Formalisms, New Haven, CT. 6/11/10.

Why Unaccusatives Have It Easy: Processing Lexical Semantics without Lexical Encoding. Talk, Penn Linguistics Colloquium, Philadelphia, PA. 3/22/09.

Why Unaccusatives Have It Easy: Reduced Relative Garden Path Effects and Verb Type. Concordia Verb Concepts, Concordia University, Montreal, Canada. 10/3/2008.

Why Unergatives Have it Hard: Garden Path Asymmetries as Co-occurrence Restrictions. EALING Fall School, Paris, France. 9/22/2008.

Classifying Garden Path Constructions by Verb Type: Why Unaccusatives Have it Easy. with John Hale. CUNY, UNC 3/14/2008.

Why Unergatives Select Themselves a Fake Reflexive and Unaccusatives Don't. LSA, Chicago. 1/6/2008.

Why Unergatives Select Themselves a Fake Reflexive. MALC 2007, Kansas City. 10/24/2007.

Why Unergatives Select Themselves a Fake Reflexive. Midwest Semantics Workshop, East Lansing. 10/6/2007.

Software

MCFGCKY v. 1.0. doc: web pdf. A chart parser written in OCaml; uses the Guillaumin compiler to translate MGs to MCFGs. Computes surprisals and entropies for arbitrary probabilistic CFGs and MCFGs trained from a treebank.
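The surprisal values such a chart parser reports follow from prefix probabilities. A minimal sketch of that identity, with invented prefix probabilities for a toy sentence:

```python
import math

def surprisals(prefix_probs):
    """Per-word surprisal from prefix probabilities:
    surprisal(w_i) = -log2( P(w_1..w_i) / P(w_1..w_{i-1}) ),
    where the first entry is P(empty prefix) = 1."""
    return [-math.log2(cur / prev)
            for prev, cur in zip(prefix_probs, prefix_probs[1:])]

# Invented prefix probabilities for a three-word sentence.
print(surprisals([1.0, 0.5, 0.25, 0.03125]))  # [1.0, 1.0, 3.0]
```

In a real run these prefix probabilities would come from summing inside-chart probabilities over all analyses of each prefix; here they are hand-picked so the final word is the costly one.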

Publications

Surprisal Derives the Recent Filler Heuristic in Mildly Context Sensitive Grammars. Proceedings of the 10th Annual Conference on Tree Adjoining Grammars and Related Formalisms.

Why Unaccusatives Have It Easy: Processing Lexical Semantics without Lexical Encoding. Penn Working Papers in Linguistics (PWPL) 17.1, Proceedings of PLC 34.

with Michael Putnam. Verbal alternation as an interface property: Harmony in S-M and C-I, Proceedings of DEAL II.

Why Unergatives Select Themselves a Fake Reflexive. Proceedings of MALC 2007.


