
The (in)dependence of pronunciation variation on the time course of lexical planning


Language, Cognition, and Neuroscience just published Esteban Buz’s paper on the relation between the time course of lexical planning and the detail of articulation (as hypothesized by production ease accounts).

Several recent proposals hold that much, if not all, of explainable pronunciation variation (variation in the realization of a word) can be reduced to effects on the ease of lexical planning. Such production ease accounts have been proposed, for example, for the effects of frequency, predictability, givenness, or phonological overlap with recently produced words on the articulation of a word. According to these accounts, these effects on articulation are mediated through parallel effects on the time course of lexical planning (e.g., recent research by Jennifer Arnold, Jason Kahn, Duane Watson, and others; see references in the paper).

 

This would indeed offer a parsimonious explanation of pronunciation variation. However, the critical test for this claim is a mediation analysis.
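To make the idea concrete, here is a minimal sketch of what such a mediation analysis could look like in R, on simulated data. The variable names, effect sizes, and the Baron-and-Kenny-style model comparison are my own illustration, not the analysis from the paper:

```r
# Simulated data in which planning time only *partially* mediates a
# frequency effect on duration (all names and numbers are illustrative).
set.seed(1)
n <- 500
frequency <- rnorm(n)                          # standardized log word frequency
planning  <- -0.5 * frequency + rnorm(n)       # lexical planning time
duration  <- -0.3 * frequency + 0.4 * planning + rnorm(n)  # word duration

m_total    <- lm(duration ~ frequency)             # total effect (path c)
m_mediator <- lm(planning ~ frequency)             # path a: frequency -> planning
m_direct   <- lm(duration ~ frequency + planning)  # paths b and c'

# Under full mediation, the frequency coefficient in m_direct would be
# near zero; here a direct effect remains by construction.
coef(m_direct)
```

If production ease accounts were right, the direct path (c') should vanish once planning time is controlled for; the simulated data above are set up so that it does not.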

Ways of plotting map data in R (and Python)


Thanks to Scott Jackson, Daniel Ezra Johnson, David Morris, Michael Shvartzman, and Nathanial Smith for the recommendations and pointers to the packages mentioned below.

  • R:
  • R:
    • The maps, mapsextra, and maptools packages provide data and tools to plot world, US, and a variety of regional maps (see also mapproj and mapdata). This, combined with ggplot2, is also what we used in Jaeger et al. (2011, 2012) to plot distributions over world maps. Here’s an example of ggplot2 with maps.
    Example use of ggplot2 combined with the maps package (similar to the graphs created for Jaeger et al., 2011, 2012).
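For readers who want a starting point, here is a minimal sketch of the ggplot2-plus-maps approach. This is a generic world map, not the actual code behind the figure from Jaeger et al. (2011, 2012):

```r
# A generic world map drawn with ggplot2 + the maps package.
library(ggplot2)
library(maps)

world <- map_data("world")  # polygon outlines as a plain data frame

p <- ggplot(world, aes(x = long, y = lat, group = group)) +
  geom_polygon(fill = "grey80", colour = "white") +
  coord_quickmap() +        # approximately correct aspect ratio for lat/long
  theme_minimal()
# print(p) draws the map; geom_point() layers can then overlay, e.g.,
# the locations of languages in a sample.
```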

Is my analysis problematic? A simulation-based example


This post is in reply to a recent question on ling-R-lang by Meredith Tamminga. Meredith was wondering whether an analysis she had in mind for her project was circular, i.e., whether the analysis itself would cause the pattern of results predicted by the hypothesis she was interested in testing. I felt her question (described below in more detail) was an interesting example that might best be answered with some simulations. Reasoning through an analysis can, of course, help a lot in understanding (or better, as in Meredith’s case, anticipating) problems with the interpretation of its results. Not infrequently, however, I find that intuition fails or isn’t sufficiently conclusive. In those cases, simulations can be a powerful tool for understanding your analysis. So I decided to give it a go and use this as an example of how one might approach this type of question.

Figure 1: Results of 16 simulated priming experiments with a robust priming effect (see title for the true relative frequency of each variant in the population). For explanation see text below.
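As a rough illustration of what such a simulation can look like (the effect size, trial counts, and model below are made up for exposition, not Meredith’s actual design):

```r
# One simulated priming experiment: the prime shifts the log-odds of
# producing variant A (effect size and trial count are illustrative).
set.seed(42)
simulate_experiment <- function(n_trials = 200, base_logodds = 0, prime_effect = 1) {
  prime <- rbinom(n_trials, 1, 0.5)                    # primed on ~half the trials
  p     <- plogis(base_logodds + prime_effect * prime) # P(produce variant A)
  resp  <- rbinom(n_trials, 1, p)
  coef(glm(resp ~ prime, family = binomial))["prime"]  # estimated priming effect
}

# 16 simulated experiments, analogous to the 16 panels of Figure 1
estimates <- replicate(16, simulate_experiment())
mean(estimates)  # average estimated effect across the simulations
```

Running the planned analysis over many such simulated data sets, generated with and without the hypothesized effect, shows whether the analysis recovers effects that are really there and, crucially, whether it manufactures them when they are not.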


A few reflections on “Gradience in Grammar”


In my earlier post I provided a summary of last week’s workshop on Gradience in Grammar at Stanford. The workshop prompted many interesting discussions, but here I want to talk about an (admittedly long-ongoing) discussion it didn’t prompt. Several of the presentations at the workshop talked about prediction/expectation and how they are a critical part of language understanding. One implication of these talks is that understanding the nature and structure of our implicit knowledge of linguistic distributions (linguistic statistics) is crucial to advancing linguistics. As I was told later, there were, however, a number of people in the audience who thought that this type of data doesn’t tell us anything about linguistics and, in particular, grammar (unfortunately, this opinion was expressed outside the Q&A session and not towards the people giving the talks, so it didn’t contribute to the discussion).

Congratulations to Dr. Alex B. Fine


It’s my great pleasure to announce to the world (i.e., all 4 subscribed readers to this blog) that Alex B. Fine successfully defended his thesis entitled “Prediction, Error, and Adaptation During Online Sentence Comprehension” jointly advised by Jeff Runner and me. Alex is the first HLP lab graduate (who started his graduate studies in the lab), so we gave him a very proper send-off and roasted the heck out of him. Alex will be starting his post-doc at the University of Illinois Psychology Department in June, working with Gary Dell, Sarah Brown-Schmidt, and Duane Watson.

Dr. Fine’s defending


NSF post-doctoral funding opportunities


Thanks to Jeff Runner, I just became aware of this post-doctoral program of the NSF (in SBE, i.e., the Social, Behavioral & Economic Sciences directorate, which includes psychology, cognitive science, and linguistics). The program also recently underwent some changes. It provides 2 years of funding. As for eligibility, let me quote the linked page: “Ph.D. degree of the fellowship candidate must have been obtained within 24 months before application deadline (previously was within 30 months) or within 10 months after the application deadline (previously was 12 months).”

Good luck to everyone interested.

Transferring installed packages to a different installation of R


It used to take me a while to reinstall all the R packages that I use after upgrading to a new version of R.  I couldn’t think of another way to do this than to create a list of installed packages by examining the R package directory, and to manually select and install each one of those packages in the new version of R.  In order to ensure that my home and office installation of R had the same packages installed, I did something similar.

I recently discovered that there is a much, much easier way to transfer the packages that you have installed to a different installation of R.  I found some R code on the web that I adapted to my needs.  Here is what you need to do:

1. Run the script “store_packages.R” in your current version of R.

# store_packages.R
#
# stores a list of your currently installed packages

tmp <- installed.packages()

# packages with a non-NA Priority ship with R (base/recommended),
# so only the remaining, user-installed packages need to be stored
installedpackages <- as.vector(tmp[is.na(tmp[, "Priority"]), 1])
save(installedpackages, file = "~/Desktop/installed_packages.rda")

(Make sure that all the quotation marks in the script are straight.  The scripts will generate an error if they include any curly quotation marks.  For some reason, when I saved this blog entry, some quotation marks changed to curly ones.  WordPress is probably to blame for this problem, which I have not been able to fix.)

2. Close R.  Open the installation of R that you want the packages to be installed in.

3. Run the script “restore_packages.R”.

# restore_packages.R
#
# installs each package from the stored list of packages

load("~/Desktop/installed_packages.rda")

# install.packages() accepts a character vector, so no loop is needed
install.packages(installedpackages)

Note that if you want to install the list of packages in an installation of R on a different computer, you should transfer the .rda file that is created by the store_packages script to that computer, and make sure that the path for the “load” command in the restore_packages script is set to the right location.
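As an optional refinement (my own tweak, not part of the original scripts), you can also install only the packages that are missing from the target installation, rather than reinstalling everything:

```r
# Illustrative list; in practice it comes from
# load("~/Desktop/installed_packages.rda"), as in restore_packages.R.
installedpackages <- c("ggplot2", "maps", "maptools")

# rownames(installed.packages()) lists what is already installed here
missing <- setdiff(installedpackages, rownames(installed.packages()))
# install.packages(missing)  # uncomment to fetch only what is missing
```

This saves time when the two installations already share most of their packages.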