Thomas Wilschut
 

Projects

Speak Smart! Speech-Based Adaptive Vocabulary Learning 

Memorizing second-language (L2) vocabulary is one of the most important aspects of learning a new language. Because vocabulary learning is tedious and effortful, methods that improve the efficiency of this process are valuable for anyone learning a new language. In recent years, advances in cognitive psychology have led to the development of computerized adaptive learning systems that aim to improve word learning by determining optimal learning strategies for individual learners in real time. In practice, using such adaptive learning systems results in higher learning efficiency than traditional, non-adaptive methods, which translates into better retention at the end of a study session. Although adaptive learning methods have made written word learning more efficient, the possibilities for adaptive pronunciation- or speech-based learning have received little scientific attention so far. Even though some language learning systems employ speech-recognition software to automatically assess the accuracy of pronunciations, this measure is typically used solely for learner feedback, not for more refined adaptation. In my PhD project, I examine the possibilities for speech-based adaptive learning. Such an examination is important for two reasons. First, the project can contribute to a better understanding of the mechanisms involved in speech-based learning and of the nature of phonological (sound-based) representations in memory. Second, examining the opportunities for speech-based adaptive learning could lead to the development of speech-based systems that are applicable in a wide range of educational settings.
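To give a flavor of how such systems work (a minimal illustrative sketch, not the model used in this project): many cognitive-psychology-based learning systems estimate a memory-activation value for each word from the timing of its past encounters, and select the word whose memory trace is currently weakest for rehearsal. The function names, decay values, and example words below are illustrative assumptions.

```python
import math

def activation(encounter_times, decays, now):
    """ACT-R-style activation: the log of summed, decaying traces
    left by past encounters with a word (simplified for illustration)."""
    return math.log(sum((now - t) ** -d for t, d in zip(encounter_times, decays)))

def next_item(items, now):
    """Select the word with the weakest (lowest) activation for rehearsal.
    items maps each word to (encounter times, per-encounter decay rates)."""
    scores = {word: activation(ts, ds, now) for word, (ts, ds) in items.items()}
    return min(scores, key=scores.get)
```

For example, a word seen once 20 seconds ago has a lower activation than a word seen twice in that span, so the once-seen word would be rehearsed first. Real adaptive systems additionally estimate each learner's decay rates from response times and accuracy, which is what makes the scheduling adaptive.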

Interactions Between Visual Working Memory, Attention and Color Categories

Recent studies have found that visual working memory (VWM) for color shows a categorical bias: observers typically remember colors as more prototypical of the category they belong to than they actually are. In this project, we further examine color-category effects on VWM using pupillometry. Participants remembered a color for later reproduction on a color wheel. During the retention interval, a colored probe was presented, and we measured the pupil constriction in response to this probe, assuming that the strength of constriction reflects the visual saliency of the probe. We found that the pupil initially constricted most strongly for non-matching colors that were maximally different from the memorized color; this likely reflects a lack of visual adaptation to these colors, which renders them more salient than memory-matching colors, which had been shown before. Strikingly, this effect reversed later in time, such that pupil constriction was more prolonged for memory-matching colors than for non-matching colors; this likely reflects that memory-matching colors capture attention more strongly, and perhaps for longer, than non-matching colors do. We found no effects of color categories on pupil constriction: after controlling for color distance, (non-matching) colors from the same category as the memory color did not result in a different pupil response than colors from a different category. However, we did find that behavioral responses were biased by color categories. In summary, pupil constriction to colored probes reflects both visual adaptation and VWM content but, unlike behavioral measures, is not notably affected by color categories.
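As a sketch of the kind of analysis this involves (the window boundaries and sample counts below are arbitrary placeholders, not the actual parameters of the study): pupil traces are typically expressed relative to a pre-probe baseline, and constriction is then compared between early and late time windows for memory-matching versus non-matching probes.

```python
import statistics

def baseline_correct(trace, n_baseline=10):
    """Express each pupil sample relative to the mean of the
    pre-probe baseline (the first n_baseline samples)."""
    baseline = statistics.mean(trace[:n_baseline])
    return [sample - baseline for sample in trace]

def window_mean(trace, start, end):
    """Mean pupil-size change within a sample window, e.g. an early
    window (initial constriction) or a later one (sustained response)."""
    return statistics.mean(trace[start:end])
```

A stronger initial constriction for maximally different non-matching probes would then appear as a more negative early-window mean on those trials, while the later reversal would appear as a more negative late-window mean on memory-matching trials.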