Erin M. Sparck
e: emsparck@gmail.com
w: erinmsparck.com


Ph.D., Cognitive Psychology, UCLA

Based in Austin, TX

My Research

After graduating from Rice University and reflecting on an academic career that, to that point, had lasted 17 years, I realized that despite earning all A's through high school and graduating magna cum laude from an elite institution, I could not remember much of what I had learned. I had invested a great deal of time and effort in my education and struggled to understand why I couldn't remember material I thought I had learned well. I started to do some reading and realized that I (along with most other students) approached studying with many incorrect ideas that led to a suboptimal learning experience. Given my background in cognitive psychology, I decided the best way to combat this problem was to contribute to research that promotes effective learning and to disseminate this wealth of knowledge to learners everywhere.

Broadly, my interests lie in researching desirable difficulties. I concentrate on metacognition and why we don't always seem to appreciate strategies that promote learning, as well as on techniques that can increase active learning both in self-regulated study sessions and in classrooms. For example, highlighting is a commonly used study strategy that can be done either in an active manner that encourages deep processing and retrieval of information or in a passive manner that shows little benefit for learning. One of my current lines of research focuses on highlighting strategies that support active engagement and long-term retention of information.

Another line of research focuses on a more nuanced area of retrieval practice. Multiple-choice practice tests (when well designed, with competitive incorrect alternatives) may offer a benefit over other testing formats: by selecting against closely related alternatives, the learner retrieves relevant information that may help answer questions about that related information on the final test (Little, Bjork, Bjork, & Angello, 2012). Many students, however, report that they do not always think critically about the incorrect alternatives. My research has therefore focused on confidence-weighted multiple-choice tests as a way of increasing the benefit for related information without explicit instruction (see Sparck, Bjork, & Bjork, 2016). A recent line of research also involves using multiple-choice tests to optimize vocabulary learning.

Skinner

Skinner is a JavaScript framework for creating behavioral psychology experiments, designed by Dustin Bachrach, that I use for many of my projects. Get it on GitHub.