Lab Meeting, Autumn 2015, Week 11

Awe- and Aw-inspiring readings (or just stuff you think deserves a mention)

What we did over the last week

Florian

  1. In press: Yildirim et al. at JML.

  2. Accepted: Alex's paper on the role of verb repetition during cumulative priming (none ;)), at JEP:LMC.

  3. Edited Karuza et al. for resubmission after the rejection from JEP:General (which was kinda expected). I think we'll need another two rounds of edits.

  4. Feedback on Toscano et al. after the major revision request. The reviewers seemed to actually quite like the study, but our framing bombed it. I think, though, that this is mostly good news: we just need to adjust the framing to focus less on the critical period.

  5. Letters of recommendation: 20 (4 new recommendees)
  6. Feedback on VALP (1) and CUNY (1) abstracts.

  7. Met to discuss CUHK-UR collaboration:
    • Provost Lennie and Vice Provost Gatewood
    • URNI Director Brad Berk, and folks over at the Stroke unit at Strong

Kodi

  1. Analyzed data for Bradlow & Bent (2008) replication. So far, so good!

  2. Revised perspective/review paper on language processing as inference under uncertainty based on social-indexical knowledge

  3. Helped Zach analyze data for subtitle project

  4. Edited stimuli for the undergrad replication of Clarke & Garrett (2004): normalized accented and unaccented sentence pairs to have the same duration (a sketch of this kind of normalization follows this list).
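For reference, a minimal sketch of one way to equate the durations of a sentence pair, assuming the Python Praat wrapper parselmouth (praat-parselmouth) is installed. The file names, the pairwise-mean target duration, and the 75-600 Hz pitch range are placeholder assumptions, not the lab's actual pipeline:

{{{#!python
# Hypothetical sketch, not the actual script used for the stimuli.
import parselmouth
from parselmouth.praat import call

accented = parselmouth.Sound("accented_sentence.wav")      # placeholder file name
unaccented = parselmouth.Sound("unaccented_sentence.wav")  # placeholder file name

# Placeholder choice of target: the mean duration of the two recordings.
target = (accented.get_total_duration() + unaccented.get_total_duration()) / 2

for sound, out_name in [(accented, "accented_norm.wav"),
                        (unaccented, "unaccented_norm.wav")]:
    factor = target / sound.get_total_duration()
    # Praat's "Lengthen (overlap-add)" (PSOLA) rescales duration without
    # shifting pitch; 75-600 Hz is Praat's default pitch search range.
    stretched = call(sound, "Lengthen (overlap-add)", 75, 600, factor)
    stretched.save(out_name, "WAV")
}}}

Overlap-add (PSOLA) lengthening is a common choice here because it changes duration without shifting pitch, so the pair differs only in the dimension being normalized.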

Andrew

  1. More BCS 152
  2. Started the Language Survey everyone wants. A preview version is available on the web; I update it as I'm happy with each section.

  3. Added Wednesday to the lab people page

Olga

  1. Finally wrote up the Audit wiki page. Please read it over: https://wiki.bcs.rochester.edu/HlpLab/Audit#preview (I still need to link the form to it.)

Esteban

  1. Read through and left feedback for BCS 152 experimental proposals.

  2. Helped prepare review and exam materials.

  3. Taught one of my RAs how to script in Praat. Unclear if this is a good idea, as it encourages more use of Praat scripting.

  4. Sent Guillermo some tips on what to do with disfluency data.

  5. Continued revisions to my VOT feedback study.

Dave

  1. Analysis of behavioral part of phonetic adaptation experiment in the scanner

  2. Analysis (and explanation thereof) of some phonetic adaptation data from a paper I reviewed and volunteered to help with the stats for.

  3. Wrote (and submitted) SVALP abstract on inferring prior beliefs.

  4. Made a 2D belief-updating schematic showing changes in belief about two category means with a correlated prior and a known variance (a necessary simplification for an analytic solution), including working out a bit of the necessary math; the conjugate update is sketched after this list.

  5. (Hastily) completed my NRSA progress report.
  6. Met with Dick and Florian about DP5 topics.
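The analytic update behind such a schematic is the standard conjugate Gaussian one. Below is a minimal numpy sketch with made-up numbers (my reconstruction under those assumptions, not the code behind the schematic): a correlated bivariate Gaussian prior over the two category means, a known likelihood covariance, and n observations with sample mean xbar.

{{{#!python
# Minimal sketch with made-up numbers; the schematic's actual values differ.
import numpy as np

mu0 = np.array([0.0, 0.0])            # prior mean over the two category means
Sigma0 = np.array([[1.0, 0.8],
                   [0.8, 1.0]])       # correlated prior covariance
Sigma = np.array([[0.5, 0.0],
                  [0.0, 0.5]])        # known (fixed) likelihood covariance
n, xbar = 10, np.array([1.0, 0.2])    # number of observations and their sample mean

prior_prec = np.linalg.inv(Sigma0)    # prior precision
data_prec = n * np.linalg.inv(Sigma)  # precision contributed by the data
post_cov = np.linalg.inv(prior_prec + data_prec)
post_mean = post_cov @ (prior_prec @ mu0 + data_prec @ xbar)

print(post_mean)  # pulled from mu0 toward xbar; the correlated prior couples the two means
print(post_cov)   # tighter than Sigma0
}}}

Because the prior covariance is correlated, evidence about one category mean also shifts the belief about the other, which is the point the 2D schematic illustrates.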

Sarah

Dan

Amanda

  1. Tried to find subjects for my kid study that I may or may not present next weekend at BUCLD.
  2. Helped run subjects for my kid study.
  3. Rewrote my talk twice; scrapped both versions.
  4. Thought a lot about this talk, reviewed a lot of kid papers, and tried to organize my thoughts.
  5. Thought about what studies would make this line of work worthwhile.

Zach

  1. Presented in Robbie's class

  2. Homework in Robbie's class

  3. Coded stuff for CUNY abstract

  4. Analyzed and wrote for CUNY abstract

  5. Graded BCS 152 Exams

  6. Coded up normalization stuff for the undergrad replication of Clarke and Garrett.

Linda

  1. Modified various experiments to incorporate new questions & feedback.

  2. Ran Bradlow and Bent Exp 1.
  3. Ran Subtitle Exp 1 (subtitled vs. unsubtitled condition).
  4. Wrote code to parse the above data into a form suitable for analysis.
  5. Wrote methods for CUNY Subtitle project.
  6. Worked on Robbie's homework and the homework for Language.
  7. Organized new lists to run the rest of B&B Exp1.

  8. Need to finish responding to the undergrads' email about the BCS Clarke & Garrett replication and incorporate the new parts of their experiment.

Maryam

Wednesday

  1. Finished coding stimulus presentation for binomials experiment

  2. Did some additional analyses of the last binomials experiment. Also cleaned up old analysis files so that they're actually halfway usable by people who aren't me (this took a surprisingly long time).

  3. Revised CUNY abstract

  4. Got a bunch of pointwise mutual information estimates for the uncertainty maintenance project (lots of fighting with Google ngrams in the process); the estimate itself is sketched after this list.
  5. Read some NLP papers on cache models and maximum entropy models

  6. Wrote stimuli for Bicknell et al. extension

  7. Classes etc
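The PMI estimate itself is simple once the counts are in hand; here is a minimal sketch with made-up counts (the Google-ngrams wrangling is the hard part and is not shown):

{{{#!python
# Minimal sketch with toy counts; not the actual Google-ngrams pipeline.
import math

def pmi(count_xy, count_x, count_y, total_bigrams, total_unigrams):
    """Pointwise mutual information, log2 of p(x,y) / (p(x) * p(y)),
    with probabilities estimated as relative frequencies from ngram counts."""
    p_xy = count_xy / total_bigrams
    p_x = count_x / total_unigrams
    p_y = count_y / total_unigrams
    return math.log2(p_xy / (p_x * p_y))

# Toy numbers only:
print(pmi(count_xy=1200, count_x=50000, count_y=30000,
          total_bigrams=10**7, total_unigrams=10**7))
}}}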

