Deletions are marked like this. | Additions are marked like this. |
Line 27: | Line 27: |
'''The Basics of Efficiency''' | |
Line 38: | Line 38: |
'''More than a Curious Phenomenon? Constant Entropy Rate''' | |
Line 77: | Line 78: |
* Tily, H., Gahl, S., Arnon, I., Kothari, A., Snider, N., and Bresnan, J. (2009). Pronunciation reflects syntactic probabilities: Evidence from spontaneous speech. Language and Cognition, 1, XX-XX. |
* Tily, H., Gahl, S., Arnon, I., Kothari, A., Snider, N., and Bresnan, J. (2009). Pronunciation reflects syntactic probabilities: Evidence from spontaneous speech. Language and Cognition, 1, XX-XX. |
Line 83: | Line 83: |
* also covered: * Johnson, K. (2004). Massive reduction in conversational American English. In Spontaneous speech: Data and analysis, Proceedings of the 1st session of the 10th international symposium (The National International Institute for Japanese Language, Tokyo, Japan), 29-54. |
|
Line 98: | Line 100: |
* Ellipsis | |
Line 99: | Line 102: |
12. Class: Information Density and Planning Beyond the Clause | 12. Class: Information Density and Planning Beyond the Clause, Inference, Differences across Languages |
Line 101: | Line 105: |
* Hagoort, P., and Van Berkum, J. J. A. (2007). Beyond the sentence given. Philosophical Transactions of the Royal Society B: Biological Sciences, 362, 801-811. | |
Line 111: | Line 116: |
14. Class: Information Theory and Information Structure * Prince, Ellen F. 1992. The ZPG letter: Subjects, definiteness, and information-status. In S. Thompson and W. Mann, eds., Discourse Description: Diverse Analyses of a Fundraising Text. * pp. 1-7 of Rosenfeld, R. (1996). A maximum entropy approach to adaptive statistical language modelling. Computer Speech and Language, 10(3), 187- * Arnold, J. (I assume) CUNY talk I just reviewed that argues that givenness cannot be reduced to predictability/information density. * Tily & Piantadosi (2009). Refer efficiently: Use less informative expressions for more predictable meanings. Proceedings of the Workshop on Production of Referring Expressions, CogSci 2009. * Also covered: * Excerpts from Givon, T. (1995). Functionalism and grammar. Amsterdam: John Benjamins. * Wasow, Perfors, and Beaver
|
Line 112: | Line 126: |
14. Class: Connectionist Accounts of Production * Dell, G. S., Chang, F., & Griffin, Z. M. (1999). Connectionist models of language production: Lexical access and grammatical encoding. Cognitive Science: A Multidisciplinary Journal, 23 (4), 517-542.
15. Class: Connectionist Accounts of Production * Dell, G. S., Chang, F., & Griffin, Z. M. (1999). Connectionist models of language production: Lexical access and grammatical encoding. Cognitive Science: A Multidisciplinary Journal, 23 (4), 517-542.
Line 123: | Line 136: |
13. Class: Zipf continued, early evidence from phonology and speech | 16. Class: Zipf continued, early evidence from phonology and speech |
Line 128: | Line 142: |
14. Class: Functionalist Theories of Language Change | 17. Class: Functionalist Theories of Language Change |
Line 136: | Line 150: |
9. Class: More on Optimal Lexica and Multiple Functional Pressures | 18. Class: More on Optimal Lexica and Multiple Functional Pressures |
Line 141: | Line 155: |
* Graff, P. and Jaeger, T.F. (submitted). Locality and Feature Specificity in OCP Effects: Evidence from Aymara, Dutch, and Javanese. CLS. |
* Graff, P. and Jaeger, T.F. (submitted). Locality and Feature Specificity in OCP Effects: Evidence from Aymara, Dutch, and Javanese. CLS. * Bi-directional OT approaches |
Line 144: | Line 158: |
== Topics == | 19. Class: Entropy, Neighborhood, and Paradigms * Milin, P., Kuperman, V., Kostic, A. & Baayen, R.H. Paradigms bit by bit: an information-theoretic approach to the processing of inflection and derivation. In press in Blevins, James P. and Juliette Blevins (eds.), Analogy in Grammar: Form and Acquisition. Oxford: Oxford University Press. '''The End''' 20. Final Discussion and Wonders == Additional Topics ==
Line 146: | Line 167: |
1. '''Background in probability theory and information theory''' 2. '''Early applications of information theory to natural language: The entropy of English''' 3. '''Least Effort''' * Zipf (1929/49) * Manin (2006, 2007) 4. '''Shannon Information and Sub-Phonemic/Phonemic Reduction''' * Duration reduction (Bell et al. 03, 09; Aylett and Turk 04; Pluymaekers et al. 05) * Vowel weakening (Van Son and Van Santen, 05) 5. '''Shannon Information and Sub-Phonemic/Phonemic Reduction''' * Phone deletion (Cohen Priva, 08) * Fluency (Shriberg and Stolcke 96) 6. '''Shannon Information and Morpho-syntactic Reduction''' * Auxiliary reduction and omission (Frank and Jaeger 08) * Prefix deletion (Norcliffe and Jaeger 10) * Case-marker omission 7. '''Connectionist Models of Lexical Production'''
* '''Connectionist Models of Lexical Production''' |
Line 163: | Line 169: |
8. '''Connectionist Models of Syntactic Production''' * Chang et al 9. '''Shannon Information and Syntactic Reduction''' * Wasow et al 07; Jaeger 10a,b 10. '''Relative Entropy and Argument Omission''' * Argument drop (Resnik 96) * Ellipsis 11. '''Uncertainty Reduction and Referring Expressions''' * Wasow, Perfors, and Beaver * Tily and Piantadosi 12. '''Shannon Information and Neighborhood Entropy across the Discourse''' * Genzel and Charniak (2002, 2003) * Piantadosi and Gibson (2008) * Qian and Jaeger (2009, 2010a,b) 13. '''Optimal Lexica''' * Information density, Neighborhood density, Ambiguity (Piantadosi et al 09; Plotkin and Nowak; Gassner 04) * Phonological optimality (Graff and Jaeger 09) 14. '''Information theoretic approaches to Morphological Paradigms''' |
* '''Information theoretic approaches to Morphological Paradigms''' |
Line 184: | Line 173: |
=== Computational Models of Priming, Implicit Learning, Adaptation === 1. '''Priming and Implicit Learning''' 2. '''Computational Models of Skill Maintenance''' * Huber et al 3. '''Connectionist Models of Syntactic Priming''' * Chang et al 4. '''ACT-R Models of Syntactic Priming''' 5. '''Surprisal and Surprisal-based Models of Syntactic Priming''' * Hale 01; Levy 08 * Snider & Jaeger 6. '''Phonetic Adaptation''' |
* '''Priming and Implicit Learning''' * '''Computational Models of Skill Maintenance''' * Huber et al * '''Connectionist Models of Syntactic Priming''' * Chang et al * '''ACT-R Models of Syntactic Priming''' * '''Surprisal and Surprisal-based Models of Syntactic Priming''' * Snider & Jaeger * '''Phonetic Adaptation''' |
Line 196: | Line 184: |
7. '''Syntactic Adaptation''' |
* '''Syntactic Adaptation''' |
Line 200: | Line 186: |
8. '''Ideal Observer Approaches to Adaptation''' | * '''Ideal Observer Approaches to Adaptation''' |
Computational Accounts of Production
Synopsis:
- Connectionist and spreading-activation models of language production (lexical and syntactic production, but with a focus on speech errors)
- Information theoretic models of incremental language production (phonetic, morphological, syntactic, and extra-syntactic preferences)
- Computational models of adaptation in language processing (implicit learning, ideal observer models)
We will start with a quick refresher (written for language researchers) on probability theory and information theory, and then read papers on how information content, entropy, and related measures affect language production. The goal of the class is to provide a thorough introduction to these topics, but also to discuss the shortcomings of these types of accounts and their relation to other mechanistic accounts of language production.
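To make those two quantities concrete before the readings, here is a minimal sketch (mine, not taken from any of the papers) of the Shannon information content, or surprisal, of a word and the entropy of a word distribution, estimated from a toy corpus with simple relative frequencies. The readings use much richer language models; the toy corpus and all names below are placeholders.
{{{#!python
import math
from collections import Counter

# Toy corpus; in the readings, probabilities come from real language models,
# but relative frequency is enough to illustrate the two quantities.
corpus = "the dog chased the cat and the cat ran".split()
counts = Counter(corpus)
total = sum(counts.values())

def probability(word):
    """Relative-frequency estimate p(word) from the toy corpus."""
    return counts[word] / total

def surprisal(word):
    """Shannon information content in bits: -log2 p(word)."""
    return -math.log2(probability(word))

def entropy():
    """Expected surprisal (in bits) over the word distribution."""
    return sum(probability(w) * surprisal(w) for w in counts)

for w in ["the", "cat", "ran"]:
    print(f"p({w}) = {probability(w):.2f}, surprisal = {surprisal(w):.2f} bits")
print(f"entropy of the distribution = {entropy():.2f} bits")
}}}
Rarer words get higher surprisal, and the entropy is just the frequency-weighted average of those surprisals; that is the sense in which the papers below talk about "more" or "less" information being transmitted per word.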
Prerequisites
The seminar is intended for graduate students, though I may consider advanced undergraduate students with a psycholinguistics background and strong interest. A very basic background in probability theory is assumed, but we'll go through the basics at the beginning of the class.
Requirements
This will be a reading/discussion seminar (not a lecture), so even if you plan to audit, I would appreciate it if you do the readings (see the webpage for more detail on requirements, etc.).
Students who are taking the class for credit will have to prepare for every discussion. I plan to use the Blackboard forum feature, and students taking the class for credit will have to post 2 questions or comments about the readings at least 1 day before each class. Additionally, they will have to lead some discussions. There will also be a final project, which can be a discussion paper or a proposal for an experiment (or grant ;). The final write-up should be about 4-10 pages.
Readings
There will be a lot of readings for each day, but the goal is not for everyone to read all of them. Instead, we will have a short obligatory reading and then distribute additional readings across people in the class. Discussion leaders have to have read all of the papers.
Syllabus
This is a very rough draft of a syllabus. I am also blatantly stealing parts of a great class taught by Dan Jurafsky and Michael Ramscar at Stanford (Fall 2009). The list below is meant as a superset suggestion (covering all topics would take more than a semester). Please feel free to suggest additional topics or to tell me your favorites.
The Basics of Efficiency
* Zipf 1949 (1-22) and Zipf 1935 (20-39, 172-176) on the inverse frequency-form link. Zipf35-49.pdf
* John A. Goldsmith. 2007. Probability for linguists. Goldsmith07.pdf
* Sheldon Ross. 2010. A First Course in Probability. Eighth Edition. Section 9.3 "Surprise, Uncertainty, and Entropy", pages 425-429. See [http://onlinestatbook.com/]
* Shannon, C.E. (1951). Prediction and entropy of printed English. Bell System Technical Journal, 30, 50-64. shannon51.pdf
* Thomas M. Cover and Roger C. King. 1978. A Convergent Gambling Estimate of the Entropy of English. IEEE Transactions on Information Theory 24:4, 413-421. coverking78.pdf
* Manin, D. 2006. Experiments on predictability of word in context and information rate in natural language. manin06.pdf
* Genzel, D. and Charniak, E. (2002). Entropy rate constancy in text. In Proceedings of ACL-02. genzelcharniak02.pdf (see the sketch after this list)
* Shannon, C.E. (1948). A Mathematical Theory of Communication. Reprinted with corrections from the Bell System Technical Journal in Mobile Computing and Communications Review, Volume 5, Number 1. shannon48.pdf
* Keller, F. (2004). The entropy rate principle as a predictor of processing effort: An evaluation against eye-tracking data. In Proceedings of the Conference on Empirical Methods in Natural Language Processing, Barcelona, pp. 317-324. keller04.pdf
* Qian, T. and Jaeger, T.F. (submitted). Entropy profiles in Language: A cross-linguistic investigation. Entropy. qianjaeger10.pdf
* Qian, T. and Jaeger, T.F. (2009). Evidence for Efficient Language Production in Chinese. In CogSci Proceedings. qianjaeger09.pdf
* Zipf 1935 (283-287) on speech rate (velocity of speech). Zipf35-49_sound.pdf
* Pluymaekers, M., Ernestus, M., and Baayen, R. (2005). Lexical frequency and acoustic reduction in spoken Dutch. The Journal of the Acoustical Society of America, 118, 2561-2569. pluymaekersetal05.pdf
* Alan Bell, Jason Brenier, Michelle Gregory, Cynthia Girand, and Dan Jurafsky. (2009). Predictability Effects on Durations of Content and Function Words in Conversational English. Journal of Memory and Language, 60:1, 92-111. belletal09.pdf
* Aylett, M. and Turk, A. (2004). The smooth signal redundancy hypothesis: A functional explanation for relationships between redundancy, prosodic prominence, and duration in spontaneous speech. Language and Speech, 47(1), 31-56. aylettturk04.pdf
* Gahl, S., Garnsey, S. M., Fisher, C., & Matzen, L. (2006). "That sounds unlikely": Syntactic probabilities affect pronunciation. Proceedings of the 28th Annual Conference of the Cognitive Science Society (pp. 1334-1339).
* Van Son, R., and Pols, L. (2003). How efficient is speech? Proceedings of the Institute of Phonetic Sciences, 25, 171-184. vansonpols03.pdf
* Aylett, M.P. and Turk, A. (2006). Language redundancy predicts syllabic duration and the spectral characteristics of vocalic syllable nuclei. The Journal of the Acoustical Society of America, 119, 3048-3058. aylettturk06.pdf
* van Son, R. and van Santen, J. (2005). Duration and spectral balance of intervocalic consonants: A case for efficient communication. Speech Communication, 47(1), 100-123. vansonvansanten05.pdf
* Shriberg, E., & Stolcke, A. (1996). Word predictability after hesitations: A corpus-based study. In Proceedings of ICSLP '96.
* Cook, S. W., Jaeger, T. F., & Tanenhaus, M. K. (2009). Producing less preferred structures: More gestures, less fluency. In Proceedings of the 31st conference of the Cognitive Science Society. Vancouver, BC.
* Clark, H. H., & Fox Tree, J. E. (2002). Using "uh" and "um" in spontaneous speech. Cognition, 84, 73-111.
* Frank, A., & Jaeger, T.F. (2008, July). Speaking rationally: Uniform information density as an optimal strategy for language production. In The 30th annual meeting of the Cognitive Science Society (CogSci08) (pp. 933-938). Washington, D.C.
* Wasow, T., Jaeger, T.F., & Orr, D. (in press). Lexical variation in relativizer frequency. In H. Wiese & H. Simon (Eds.), Proceedings of the workshop on expecting the unexpected: Exceptions in grammar at the 27th annual meeting of the German Linguistic Association. Berlin and New York: Mouton de Gruyter.
* Brown, P., & Dell, G.S. (1987). Adapting production to comprehension: The explicit mention of instruments. Cognitive Psychology, 19(4), 441-472.
* Gomez Gallo, C., Jaeger, T. F., & Smyth, R. (2008, July). Incremental syntactic planning across clauses. In The 30th annual meeting of the Cognitive Science Society (CogSci08) (pp. 845-850). Washington, D.C.
* Levy, R., & Jaeger, T.F. (2007). Speakers optimize information density through syntactic reduction. In B. Schölkopf, J. Platt, & T. Hofmann (Eds.), Advances in neural information processing systems (NIPS) 19 (pp. 849-856). Cambridge, MA: MIT Press.
* Smith, N., & Levy, R. (2008, July). Optimal processing times in reading: A formal model and empirical investigation. In The 30th annual meeting of the Cognitive Science Society (CogSci08). Washington, D.C.
* Tily & Piantadosi (2009). Refer efficiently: Use less informative expressions for more predictable meanings. Proceedings of the Workshop on Production of Referring Expressions, CogSci 2009.
* Dell, G. S., Chang, F., & Griffin, Z. M. (1999). Connectionist models of language production: Lexical access and grammatical encoding. Cognitive Science: A Multidisciplinary Journal, 23(4), 517-542.
* Chang, F., Dell, G. S., & Bock, J. K. (2006). Becoming syntactic. Psychological Review, 113(2), 234-272.
* Focusing on the discussion of competition accounts, have another look at: Cook, S. W., Jaeger, T. F., & Tanenhaus, M. K. (2009). Producing less preferred structures: More gestures, less fluency. In Proceedings of the 31st conference of the Cognitive Science Society. Vancouver, BC.
* Zipf 1935 (73-81, 109-121) and Zipf 1949 (98-108) on phonological change. Zipf35-49_sound.pdf
* Schuchardt, H. (1885). On sound laws: Against the Neogrammarians. Translated by T. Vennemann and T.H. Wilbur. schuchardt1885.pdf
* Bates, E. and MacWhinney, B. (1982)
* Milin, P., Kuperman, V., Kostic, A. & Baayen, R.H. Paradigms bit by bit: an information-theoretic approach to the processing of inflection and derivation. In press in Blevins, James P. and Juliette Blevins (eds.), Analogy in Grammar: Form and Acquisition. Oxford: Oxford University Press.
* More than a Curious Phenomenon? Constant Entropy Rate
* Probability and Information in Online Language Production
* Computational Accounts and Mechanisms
* Language Change: The Link between Processing and Grammar
* The End
* Additional Topics
* Computational Approaches to Production
* Connectionist Models of Lexical Production
* Computational Models of Skill Maintenance