On the temporal dynamics of language-mediated vision and vision-mediated language
Recent converging evidence suggests that language and vision interact immediately in non-trivial ways, although the exact nature of this interaction is still unclear. Not only does linguistic information influence visual perception in real time, but visual information also influences language comprehension in real time. For example, in visual search tasks, incremental spoken delivery of the target features (e.g., "Is there a red vertical?") can increase the efficiency of conjunction search because only one feature is heard at a time. Moreover, in spoken word recognition tasks, the visual presence of an object whose name is similar to the word being spoken (e.g., a candle present when instructed to "pick up the candy") can alter the process of comprehension. Dense sampling methods, such as eye-tracking and reach-tracking, richly illustrate the nature of this interaction, providing a semi-continuous measure of the temporal dynamics of individual behavioral responses. We review a variety of studies that demonstrate how these methods are particularly promising in further elucidating the dynamic competition that takes place between underlying linguistic and visual representations in multimodal contexts, and we conclude with a discussion of the consequences that these findings have for theories of embodied cognition.
Anderson, S., Chiu, E., Huette, S., & Spivey, M. (2011). On the temporal dynamics of language-mediated vision and vision-mediated language. Acta Psychologica, 137 (2), 181-189. https://doi.org/10.1016/j.actpsy.2010.09.008