CUNY

CUNY 2015 plenary


As requested by some, here are the slides from my plenary at last week's 2015 CUNY Sentence Processing Conference.

I'm posting them here for discussion purposes only. During the Q&A, several interesting points were raised.

HLP Lab at CUNY 2015


We hope to see y’all at CUNY in a few weeks. In the interest of hopefully luring you to some of our posters, here’s an overview of the work we’ll be presenting. In particular, we invite our reviewers, who so boldly claimed (but did not provide references for) the triviality of our work ;), to visit our posters and help us mere mortals understand.

  • Articulation and hyper-articulation
  • Unsupervised and supervised learning during speech perception
  • Syntactic priming and implicit learning during sentence comprehension
  • Uncovering the biases underlying language production through artificial language learning

Interested in more details? Read on. And, as always, I welcome feedback. (To prevent spam, first-time posters are moderated; after that, your posts will appear immediately.)


Fine-grained linguistic knowledge, CUNY poster


And here is one more poster from CUNY. This one presents work by Robin Melnick at Stanford, together with Tom Wasow. Robin ran forced-choice and 100-point-preference norming experiments on that-mentioning in relative and complement clauses to investigate the extent to which the factors that affect processing correlate with the factors affecting acceptability judgments. Going beyond previous work, he directly correlates the effect sizes of individual predictors in the processing and acceptability models. All experiments were run both in the lab and over the web using Mechanical Turk.

Psycholinguistics in the field 2, CUNY Poster


And here is one more poster on Yucatec, following Lindsay’s example. This is work by Lis Norcliffe, who just graduated from Stanford and joined the MPI in Nijmegen. Her thesis work is on the (possibly resumptive) morphology discussed in this poster, and the experiments were part of that thesis, too. You’ll find effects of definiteness and dependency length, which we investigated because (in our view) they provide evidence that this morphological reduction alternation is affected by both a preference for uniform information density and a preference for dependency minimization. Feedback welcome.

Psycholinguistics in the field, CUNY Sentence Processing poster


We presented the results of the animacy and accessibility study on Yucatec on March 18, 2010 at the CUNY Sentence Processing Conference in New York (see image below, or download the poster PDF here: CUNY 2010 Sentence Processing poster Yucatec; the PDF includes additional data, abbreviations, and references). We received a lot of support for our project and a lot of enthusiasm for continuing research. Soooooooo, stay tuned for more production studies on Yucatec to be carried out this summer.

Pre-CUNY workshop on regression and multilevel modeling (cont’d)


Some time ago, I announced that some folks have been thinking about organizing a small workshop on common issues and standards in regression modeling (including multilevel models) in psycholinguistic research, to be held the day before CUNY 2009 (i.e., 03/25 at UC Davis). Here’s an update on this “workshop”, along with some thoughts for planning.

CUNY was fun


That’s all. I mean, I had fun. Oh, and before there are more questions: the stats program mentioned during one of the sessions (for multilevel or mixed models) is R, which is freely available; you can learn more about it by following the R-lang list linked on the right of this blog. R is shell-based, which means you will need some time to get used to it, but it is a powerful program with implementations of most of the types of models that everyone is talking about these days, and there is already a rather large community of R users interested in language research (again, see the R-lang email list). Enjoy.
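For the curious, here is a minimal sketch of what fitting a mixed model in R might look like, using the lme4 package; the data frame and variable names (`mydata`, `rt`, `condition`, `subject`, `item`) are purely hypothetical placeholders, not from any study discussed here:

```r
# Load lme4 (run install.packages("lme4") first if it is not installed)
library(lme4)

# Hypothetical reading-time data: one row per trial, with a fixed effect
# of condition and crossed random intercepts for subjects and items.
m <- lmer(rt ~ condition + (1 | subject) + (1 | item), data = mydata)

# Inspect fixed-effect estimates and variance components
summary(m)
```

The `(1 | subject)` and `(1 | item)` terms give each subject and each item its own random intercept, which is the typical starting point for psycholinguistic data with crossed subjects and items.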