Computational Accounts of Production

Synopsis:

  • Connectionist and spreading-activation models of language production (lexical and syntactic production, but with a focus on speech errors)
  • Information theoretic models of incremental language production (phonetic, morphological, syntactic, and extra-syntactic preferences)
  • Computational models of adaptation in language processing (implicit learning, ideal observer models)

We will start with a quick refresher (written for language researchers) on probability theory and information theory, and then read a range of papers on how information content, entropy, and related measures affect language production. The goal of the class is to provide a thorough introduction to these topics, but also to discuss the shortcomings of these types of accounts and their relation to other mechanistic accounts of language production.

Prerequisites

The seminar is intended for graduate students, though I may consider advanced undergraduate students with a psycholinguistics background and a strong interest. Only a very basic background in probability theory is assumed, and we will go through the basics at the beginning of the class.

Requirements

This will be a reading/discussion seminar (not a lecture), so even if you plan to audit, I would appreciate it if you did the readings (see the webpage for more detail on requirements, etc.).

Students who are taking the class for credit will have to prepare for every discussion. I plan to use the Blackboard forum feature, and students taking the class for credit will have to post two questions or comments about the readings at least one day before each class. Additionally, they will have to lead some discussions. There will also be a final project, which can be a discussion paper or a proposal for an experiment (or grant ;). The final write-up should be about 4-10 pages.

Readings

There will be a lot of readings for each day, but the goal is not for everyone to read all of them. Instead, we will have a short obligatory reading and then distribute the additional readings across people in the class. Discussion leaders will have to have read all of the papers.

Syllabus

This is a very rough draft of a syllabus. I am also blatantly stealing parts of a great class taught by Dan Jurafsky and Michael Ramscar at Stanford (Fall 2009). The list below is meant as a superset of suggestions (covering all topics would take more than a semester). Please feel free to suggest additional topics or to tell me your favorites.

Computational Approaches to Production

  1. Background in probability theory and information theory (see the code sketch below)

    • Robert A. Rescorla. 1988. Pavlovian Conditioning: It's Not What You Think It Is. American Psychologist, 43(3), 151-160, plus:
    • For those with no probability theory or information theory, start with: John A. Goldsmith. 2007. Probability for linguists.
    • For those with no information theory, the above plus: Sheldon Ross. 2010. A First Course in Probability. Eighth Edition. Section 9.3 "Surprise, Uncertainty, and Entropy", pages 425-429.
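
    To make the basic quantities concrete before diving into the readings, here is a minimal Python sketch of surprisal and entropy (the toy word distribution is made up purely for illustration, not taken from the readings):

      from math import log2

      # Toy word distribution; words and probabilities are invented.
      p = {"the": 0.5, "dog": 0.25, "barked": 0.125, "loudly": 0.125}

      # Surprisal (Shannon information) of an outcome: I(w) = -log2 p(w), in bits.
      surprisal = {w: -log2(pr) for w, pr in p.items()}

      # Entropy is expected surprisal: H(P) = sum_w p(w) * I(w).
      entropy = sum(pr * surprisal[w] for w, pr in p.items())

      print(surprisal)  # {'the': 1.0, 'dog': 2.0, 'barked': 3.0, 'loudly': 3.0}
      print(entropy)    # 1.75 (bits)

    Note that rarer words carry more bits, and that entropy is just the average surprisal under the distribution itself.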
  2. Early applications of information theory to natural language: The entropy of English (see the code sketch below)

    • Shannon, C. 1951. Prediction and entropy of printed English. Bell System Technical Journal, 30, 50-64.
    • Thomas M. Cover and Roger C. King. 1978. A Convergent Gambling Estimate of the Entropy of English. IEEE Transactions on Information Theory 24:4, 413-421.
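
    As a rough illustration of what these papers estimate, the sketch below computes simple plug-in n-gram upper bounds on the per-character entropy of a text. The toy text is a made-up stand-in for a real corpus; Shannon's human-prediction experiment and Cover and King's gambling estimate are much more sophisticated ways of tightening these bounds.

      from collections import Counter
      from math import log2

      # Toy stand-in for an English corpus; any plain text would do.
      text = "the quick brown fox jumps over the lazy dog " * 50

      # Order-0 (unigram) entropy: H = -sum_c p(c) log2 p(c), in bits/character.
      counts = Counter(text)
      total = sum(counts.values())
      h0 = -sum((n / total) * log2(n / total) for n in counts.values())

      # Order-1 conditional entropy H(X_n | X_{n-1}): conditioning on the
      # previous character gives a tighter upper bound on the entropy rate.
      pairs = Counter(zip(text, text[1:]))
      contexts = Counter(text[:-1])
      h1 = -sum((n / (len(text) - 1)) * log2(n / contexts[a])
                for (a, b), n in pairs.items())

      print(f"order-0 entropy: {h0:.2f} bits/char")
      print(f"order-1 conditional entropy: {h1:.2f} bits/char")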
  3. Least Effort

    • Zipf (1929/49)
    • Manin (2006, 2007)
  4. Shannon Information and Sub-Phonemic/Phonemic Reduction

    • Duration reduction (Bell et al. 03, 09; Aylett and Turk 04; Pluymaekers et al. 05)
    • Vowel weakening (van Son and van Santen 05)
  5. Shannon Information and Sub-Phonemic/Phonemic Reduction (continued)

    • Phone deletion (Cohen Priva 08)
    • Fluency (Shriberg and Stolcke 96)
  6. Shannon Information and Morpho-syntactic Reduction

    • Auxiliary reduction and omission (Frank and Jaeger 08)
    • Prefix deletion (Norcliffe and Jaeger 10)
    • Case-marker omission
  7. Connectionist Models of Lexical Production

    • Speech errors (Dell, 86)
  8. Connectionist Models of Syntactic Production

    • Chang et al
  9. Shannon Information and Syntactic Reduction

    • Wasow et al 07; Jaeger 10a,b
  10. Relative Entropy and Argument Omission (see the code sketch below)

    • Argument drop (Resnik 96)
    • Ellipsis
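
    Since relative entropy is the key quantity in this unit, here is a minimal sketch of how it is computed. The argument-class example is invented, loosely in the spirit of selectional preference strength, and is not taken from Resnik 96.

      from math import log2

      def kl_divergence(p, q):
          # Relative entropy D(P || Q) = sum_x p(x) log2(p(x) / q(x)), in bits.
          # Assumes q(x) > 0 wherever p(x) > 0.
          return sum(px * log2(px / q[x]) for x, px in p.items() if px > 0)

      # Hypothetical prior over argument classes vs. the distribution of
      # classes observed with one particular verb.
      prior = {"food": 0.25, "person": 0.25, "artifact": 0.25, "event": 0.25}
      with_verb = {"food": 0.7, "person": 0.1, "artifact": 0.1, "event": 0.1}

      print(f"D(P || Q) = {kl_divergence(with_verb, prior):.3f} bits")  # ~0.643

    The further the verb-specific distribution departs from the prior, the larger the divergence; a verb whose arguments matched the prior exactly would score 0 bits.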
  11. Uncertainty Reduction and Referring Expressions

    • Beaver et al
    • Tily and Piantadosi
  12. Shannon Information and Neighborhood Entropy across the Discourse

    • Genzel and Charniak (2002, 2003)
    • Piantadosi and Gibson (2008)
    • Qian and Jaeger (2009, 2010a,b)
  13. Optimal Lexica

    • Information density, Neighborhood density, Ambiguity (Piantadosi et al 09; Plotkin and Nowak; Gassner 04; Wasow, Perfors, and Beaver)
    • Phonological optimality (Graff and Jaeger 09)
  14. Information theoretic approaches to Morphological Paradigms

    • Baayen
    • Moscoso del Prado Martín

Computational Models of Priming, Implicit Learning, and Adaptation

  1. Priming and Implicit Learning

  2. Computational Models of Skill Maintenance

    • Huber et al
  3. Connectionist Models of Syntactic Priming

    • Chang et al
  4. ACT-R Models of Syntactic Priming

  5. Surprisal and Surprisal-based Models of Syntactic Priming

    • Hale 01; Levy 08
    • Snider & Jaeger

  6. Phonetic Adaptation

    • Clayards et al 09; Kraljic and Samuel
  7. Syntactic Adaptation

    • Wells et al 09; Sauerland et al 09
  8. Ideal Observer Approaches to Adaptation
