Research

Uncovering the format and content of mental representations

Revealing the features of mental representations is a longstanding goal of cognitive psychology, yet there is currently no general framework for investigating representations of high-level visual concepts. I developed a method that relates the semantic space of category labels to the space of visual features and reconstructs the internal representations of many visual concepts. This framework should make it possible to answer general questions about visual representations. An ongoing project aims to extend this method to questions about the format of representations, specifically their level of abstraction.
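
As a rough illustration of the general idea (a minimal sketch with placeholder data, not the published method), the code below maps a hypothetical semantic embedding space onto a hypothetical visual feature space with ridge regression and then predicts the visual features of a held-out concept as a stand-in for its reconstructed representation. All array names and dimensions are assumptions.

    # Illustrative sketch only: random placeholder data, not the published method.
    import numpy as np
    from sklearn.linear_model import Ridge

    rng = np.random.default_rng(0)
    n_concepts, sem_dim, vis_dim = 200, 300, 512
    semantic = rng.standard_normal((n_concepts, sem_dim))  # e.g., word embeddings of category labels
    visual = rng.standard_normal((n_concepts, vis_dim))    # e.g., deep-network features of images

    # Learn the semantic-to-visual mapping on all concepts but one
    train, test = np.arange(n_concepts - 1), [n_concepts - 1]
    mapping = Ridge(alpha=1.0).fit(semantic[train], visual[train])

    # Predicted visual features of the held-out concept: a stand-in for its
    # reconstructed internal representation
    reconstructed = mapping.predict(semantic[test])
    print(reconstructed.shape)  # (1, vis_dim)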

Selected works

Caplette, L., & Turk-Browne, N. B. (in preparation). Assessing the abstractness of visual representations using deep image synthesis and behavior. [related poster]

Caplette, L., & Turk-Browne, N. B. (forthcoming). Computational reconstruction of mental representations using human behavior. Nature Communications. [preprint]

Awards

MAIN 2021 Best Postdoctoral Abstract Submission

Disentangling stimulus time and processing time in the brain

While we fixate an object, information is not only processed in the brain over time: it also arrives on the retina continuously over time. Because of several factors such as brain oscillations, attention, and temporal integration, information is likely to be processed differently depending on when it is received on the retina. I developed an experimental paradigm that allows us to observe the processing, through time and across the brain, of specific information received at specific moments. By visualizing stimulus time and processing time simultaneously, we can advance our understanding of how brain oscillations interact with a continuous visual input, assess how information sampling is modulated by top-down processing across the brain, and directly observe the temporal integration of information for a behavioral response. We have demonstrated the potential of this approach in two recent papers and are currently writing an opinion paper about the framework.
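
To convey the two-dimensional logic with a toy example (simulated data, not the published pipeline), the sketch below randomizes when a feature is presented within each trial and regresses a recorded signal at every processing-time point onto the presentation matrix, yielding a stimulus-time by processing-time map. All variables are hypothetical.

    # Illustrative sketch only: simulated data, not the published pipeline.
    import numpy as np

    rng = np.random.default_rng(0)
    n_trials, n_stim_t, n_proc_t = 500, 20, 100

    # 1 = feature shown in that stimulus-time bin on that trial
    presentation = rng.integers(0, 2, size=(n_trials, n_stim_t)).astype(float)

    # Simulated brain signal: information from stimulus-time bin 5 is
    # (arbitrarily) processed around processing-time points 40-60
    signal = rng.standard_normal((n_trials, n_proc_t))
    signal[:, 40:60] += presentation[:, 5:6] * 0.5

    # Ordinary-least-squares map: stimulus time x processing time
    X = presentation - presentation.mean(axis=0)
    Y = signal - signal.mean(axis=0)
    betas = np.linalg.lstsq(X, Y, rcond=None)[0]  # shape (n_stim_t, n_proc_t)
    print(betas.shape)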

Selected works

Caplette, L., Jerbi, K., & Gosselin, F. (2023). Rhythmic information sampling in the brain during visual recognition. Journal of Neuroscience, 43(24), 4487–4497. [paper] [code/data]

Caplette, L., Ince, R. A. A., Jerbi, K., & Gosselin, F. (2020). Disentangling presentation and processing times in the brain. NeuroImage, 218, 116994. [paper] [code/data]

Awards

SQRP 2021 Guy-Bégin Award for Best Fundamental Paper

VSS 2018 Student Travel Award

Methods for cognitive neuroscience and psychology

Throughout my projects, I have developed new methods and experimental paradigms to explore various aspects of human cognition (e.g., to disentangle stimulus time and processing time, or to visualize mental representations of concepts, as discussed in other sections). More recently, some of my projects have focused specifically on methodological development. For example, my colleagues and I have developed methods to (i) improve the signal-to-noise ratio in psychological experiments that analyze differences (in behavior or brain activity) between correct and incorrect trials; (ii) translate between different neuroimaging modalities and integrate them; and (iii) fit models of brain activity that generalize to new participants and datasets. These methods should provide new ways of investigating the human mind and improve existing ones.
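
To give a flavor of (iii) with a minimal, hypothetical sketch (random placeholder data; ridge regression and a Procrustes alignment standing in for the actual model), the code below fits an encoding model on one participant and tests it on a second participant whose responses were rotated into the first participant's functional space using shared stimuli.

    # Illustrative sketch only: random placeholder data, not the published model.
    import numpy as np
    from scipy.linalg import orthogonal_procrustes
    from sklearn.linear_model import Ridge

    rng = np.random.default_rng(0)
    n_stim, n_feat, n_vox = 300, 100, 400
    features = rng.standard_normal((n_stim, n_feat))  # stimulus features
    sub1 = rng.standard_normal((n_stim, n_vox))       # participant 1 responses
    sub2 = rng.standard_normal((n_stim, n_vox))       # participant 2 responses

    train, test = np.arange(200), np.arange(200, 300)

    # Align participant 2 to participant 1's functional space (training stimuli only)
    R, _ = orthogonal_procrustes(sub2[train], sub1[train])

    # Fit the encoding model on participant 1; evaluate on held-out stimuli
    # against participant 2's aligned responses
    model = Ridge(alpha=10.0).fit(features[train], sub1[train])
    pred = model.predict(features[test])
    target = sub2[test] @ R
    r = np.corrcoef(pred.ravel(), target.ravel())[0, 1]
    print(f"cross-participant prediction r = {r:.3f}")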

Selected works

Caplette, L., & Turk-Browne, N. B. (2023). An encoding model in shared functional space to reconstruct representations in multiple datasets. VSS. [poster]

Gosselin, F., Daigneault, V., Larouche, J.-M., & Caplette, L. (2024). Reclassifying guesses to increase signal-to-noise ratio in psychological experiments. Behavior Research Methods. [paper] [code]

Expectations and object recognition

Our expectations influence how we recognize objects and, more generally, how we see the world. However, the perceptual and neuronal mechanisms underlying this influence are unclear, especially when the expected objects are complex real-world objects. Notably, how expectations influence the information represented and used to recognize objects is still largely unknown. In a recent study, we observed that expecting a specific object leads to an object-specific sampling of information, and that expectations overall accelerate the successful use of coarse information. We are currently running a neuroimaging project to investigate how, when, and where in the brain object expectations are represented and integrated with sensory information.

Selected works

Caplette, L., Gosselin, F., & West, G. L. (2021). Object expectations alter information use during visual recognition. Cognition, 214, 104803. [paper] [code/data]

Caplette, L., Gosselin, F., Mermillod, M., & Wicker, B. (2020). Real-world expectations and their affective value modulate object processing. NeuroImage, 213, 116736. [paper] [code/data]

The time course of information use for visual recognition

Because of their limited processing capacity, among other things, humans do not use all available visual information at once when fixating a visual object or scene: different features must be attended and used at different moments. We can uncover this time course of information use with great precision using cutting-edge behavioral methods. For example, we observed that neurotypical subjects use coarse information throughout recognition and fine information only later when recognizing everyday objects, and that this pattern is reversed in autistic subjects.
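
The sketch below conveys the flavor of such a reverse-correlation analysis on simulated data (the toy observer and all variables are hypothetical, not an actual experiment): spatial-frequency content is sampled randomly across time bins on each trial, and correlating each sample weight with accuracy across trials yields a time-by-spatial-frequency map of information use.

    # Illustrative sketch only: simulated observer, not an actual experiment.
    import numpy as np

    rng = np.random.default_rng(0)
    n_trials, n_time, n_sf = 1000, 10, 5

    samples = rng.random((n_trials, n_time, n_sf))  # random SF-by-time sampling weights

    # Toy observer: relies on coarse SF (band 0) early and fine SF (last band) late
    evidence = samples[:, :5, 0].mean(axis=1) + samples[:, 5:, -1].mean(axis=1)
    accuracy = (evidence + rng.standard_normal(n_trials) * 0.1 > evidence.mean()).astype(float)

    # Classification "image": correlation of each (time, SF) sample with accuracy
    z_samples = (samples - samples.mean(axis=0)) / samples.std(axis=0)
    z_acc = (accuracy - accuracy.mean()) / accuracy.std()
    ci = (z_samples * z_acc[:, None, None]).mean(axis=0)  # shape (n_time, n_sf)
    print(ci.shape)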

Selected works

Wiesmann, S. L.*, Caplette, L.*, Willenbockel, V., Gosselin, F., & Võ, M. L.-H. (2021). Flexible time course of spatial frequency use during scene categorization. Scientific Reports, 11, 14079. [paper] *co-first-authors

Caplette, L., Wicker, B., & Gosselin, F. (2016). Atypical time course of object recognition in Autism Spectrum Disorder. Scientific Reports, 6, 35494. [paper]

Awards

CPA Certificate of Academic Excellence

UdeM Psychology Best Graduate Student Talk