Resources

Target Confusability Competition (TCC) Model

The TCC offers a new conceptualization of visual memory and calls into question the basis of the dominant theories of visual working memory over the past decade.

I have several resources for those interested in applying the TCC to the study of visual memory:

Object Stimuli Set

Below is a link to download the stimuli used in the majority of my papers (Schurgin & Flombaum, 2018; 2017; 2015; Schurgin, Reagh, Yassa & Flombaum, 2013). These include hundreds of categorized objects, with both matched foil pairs for similarity comparisons and categorically distinct new foils.


Feel free to use these in academic settings. Of course, we appreciate it if you cite their source when you do.

Noise Code

I developed the following noise code as a way of introducing variability into object stimuli. This code randomly shuffles a percentage of pixels, creating a "rain-like" effect as noise increases.


An advantage of this method is that it preserves the pixel-level content of the image, since every original pixel value is retained and only relocated. Previous research (Schurgin & Flombaum, in press; Schurgin & Flombaum, 2015) has found that it produces linear decreases in performance when added to test items during long-term memory (LTM) retrieval.
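
For illustration only, here is a minimal R sketch of the pixel-shuffling idea described above. This is not the original noise code; the function name add_pixel_noise, the proportion argument, and the height x width x channel array format are all assumptions.

```r
# Minimal illustrative sketch (not the original noise code).
# Assumes img is a height x width x channels numeric array
# (e.g., as returned by png::readPNG).
add_pixel_noise <- function(img, prop = 0.25) {
  dims <- dim(img)
  n_pixels  <- dims[1] * dims[2]
  n_shuffle <- round(prop * n_pixels)

  # Pick which pixel locations to shuffle, then permute them among themselves
  idx  <- sample(n_pixels, n_shuffle)
  perm <- sample(idx)

  # Relocate the selected pixels in every color channel, so all original
  # pixel values are kept but a proportion of them change position
  for (ch in seq_len(dims[3])) {
    channel <- img[, , ch]
    channel[idx] <- channel[perm]
    img[, , ch] <- channel
  }
  img
}
```

Increasing prop shuffles a larger proportion of pixel locations, which is what produces the "rain-like" appearance at higher noise levels.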

Bayesian Memory Model

To better assess memory performance, I created a Bayesian model of memory signal strength. For more details on the specifics of this model, such as its structure, assumptions, and parameters, please see:

Schurgin & Flombaum, 2017.


I have provided the model code for Experiment 1a for anyone interested in using this analysis in their own experiments. The code runs in R using the rstan package.
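
As a schematic illustration of the general workflow (this is not the Schurgin & Flombaum, 2017 model itself), fitting a hierarchical model of memory signal strength with rstan typically follows the pattern below; the Stan program and the variable names (N, d_obs, mu, sigma) are placeholders.

```r
# Schematic example of the rstan workflow, NOT the actual model from
# Schurgin & Flombaum (2017); the Stan program below is a placeholder.
library(rstan)

stan_code <- "
data {
  int<lower=1> N;        // number of participants
  vector[N] d_obs;       // observed memory-strength estimate per participant
}
parameters {
  real mu;               // group-level mean signal strength
  real<lower=0> sigma;   // between-participant variability
}
model {
  mu ~ normal(0, 5);
  sigma ~ cauchy(0, 2.5);
  d_obs ~ normal(mu, sigma);
}
"

# Placeholder data: per-participant strength estimates
stan_data <- list(N = 20, d_obs = rnorm(20, mean = 1.5, sd = 0.5))

fit <- stan(model_code = stan_code, data = stan_data,
            chains = 4, iter = 2000, warmup = 1000, seed = 1)

print(fit, pars = c("mu", "sigma"))
```

The downloadable Experiment 1a code uses this same R + rstan toolchain, but with the full model described in the paper.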

If you have any questions about the resources provided here, or are looking for resources not available on my website, please feel free to contact me.