Stimulus Value Gates Multisensory Integration

[Concepts in Sensorimotor Research]


Application to Autism. 

The advantage of a good filtering system is that you are less likely to get overwhelmed by your sensory environment in the real world; the disadvantage is that you could lose out on critical information. This almost automatic value-based filtering ability is, I think, something many autistics like me struggle with. But even with extensive experience, value-based filters could vary with task domain, context, or even things like predictability. Which is still, I think, why we face challenges in trying to understand what exactly is going on in non-neurotypical populations.




Bean, N. L., Stein, B. E., & Rowland, B. A. (2021). Stimulus value gates multisensory integration. European Journal of Neuroscience, 53(9), 3142-3159.



Summary of the paper we discussed in my Multisensory Integration seminar this week. 


Stimulus Value Gates Multisensory Integration 


I had three key takeaways from this paper.


One is that multisensory stimuli evoked enhanced responses compared to single-modality stimuli. Responses were also enhanced when the two modalities were spatiotemporally congruent and/or had a high coupling probability. Kind of like when we listen to a book's audio recording while we read it. There are neural correlates in the midbrain superior colliculus (SC), where overlapping topographic sensory maps allow cross-modal neural signals to be routed onto common target neurons.
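To make "enhanced responses" concrete, here is a small sketch (my own, not code from the paper) of the standard multisensory enhancement index used in this literature, which expresses the cross-modal response as a percentage change over the best single-modality response. The numbers below are made up for illustration.

```python
def multisensory_enhancement(crossmodal, visual_alone, auditory_alone):
    """Standard enhancement index from the multisensory literature:
    percent change of the cross-modal response relative to the best
    unimodal response. Positive = enhancement, negative = depression."""
    best_unimodal = max(visual_alone, auditory_alone)
    return 100.0 * (crossmodal - best_unimodal) / best_unimodal

# Hypothetical numbers (not the paper's data): if the visual cue alone is
# detected 50% of the time, the sound alone 20%, and the pair 80%,
# the pair shows a 60% enhancement over the best single cue.
print(multisensory_enhancement(0.80, 0.50, 0.20))  # -> 60.0
```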


A second takeaway was that reward/value played a big role in cross-modal integration. If a specific modality, like the auditory input, had little or no value, it basically got filtered out before it even reached the multisensory integration stage. It did not matter whether the stimuli were spatiotemporally congruent.
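To picture this second takeaway, here is a toy caricature (again my own sketch, not the authors' model): an input whose learned value falls below some threshold simply never reaches the combination step, so spatiotemporal congruence alone cannot rescue it.

```python
def gated_response(visual_drive, auditory_drive,
                   visual_value, auditory_value, value_threshold=0.5):
    """Toy caricature of value-gated integration (my sketch, not the paper's
    model). Inputs whose learned value is below threshold are dropped before
    the combination step, mimicking the pre-integration filter described above."""
    inputs = []
    if visual_value >= value_threshold:
        inputs.append(visual_drive)
    if auditory_value >= value_threshold:
        inputs.append(auditory_drive)
    if not inputs:
        return 0.0
    # Crude stand-in for integration: summed drive with a small
    # superadditive bonus when more than one modality survives the gate.
    bonus = 1.2 if len(inputs) > 1 else 1.0
    return bonus * sum(inputs)

# A congruent pair where the sound carries no learned value behaves
# like a visual-only stimulus: the auditory input is filtered out.
print(gated_response(0.6, 0.4, visual_value=1.0, auditory_value=0.0))  # -> 0.6
print(gated_response(0.6, 0.4, visual_value=1.0, auditory_value=1.0))  # -> 1.2
```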


The third takeaway was that experience can selectively block or filter out inputs to multisensory integration. When a modality in a stimulus pair loses its reward value, it gets filtered out, even if it is congruent. But when it is re-associated with reward, multisensory integration and the enhanced response bounce back, and this effect persists for a while, even if the reward is again taken away.


Individual Experiments

The study comprised three experiments with cats using visual and auditory stimuli. Rewards were associated with the visual stimuli, with four pellets constituting a high reward and two pellets a low reward. Approach behavior in response to cross-modal and unimodal stimuli was recorded.


Experiment 1 looked at coupling probability and reward history. Cats were trained to approach the cross-modal stimuli associated with rewards, whether the reward was high or low, for example A1-V1 (high) and A2-V2 (low), and to withhold approach to the no-reward auditory-only stimuli.
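To keep that shorthand straight, here are the Experiment 1 contingencies as I understood them, written out in my own notation (pellet counts follow the description above):

```python
# My own shorthand for the Experiment 1 contingencies, not the paper's notation.
contingencies = {
    ("A1", "V1"): {"reward_pellets": 4, "correct_response": "approach"},      # high-reward pair
    ("A2", "V2"): {"reward_pellets": 2, "correct_response": "approach"},      # low-reward pair
    ("A1", None): {"reward_pellets": 0, "correct_response": "no approach"},   # auditory alone
    ("A2", None): {"reward_pellets": 0, "correct_response": "no approach"},   # auditory alone
}

for stimulus, outcome in contingencies.items():
    print(stimulus, outcome)
```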


As expected, the cats learned the response contingencies and behaved appropriately, and this response behavior continued even when the V1 or V2 intensities were decreased. This shows how motivated the cats were: the rewards were sufficient to offset the time and energy of the orientation response. More specifically, there were two results. One is that coupling probability had no effect on multisensory integration. Two is that reward association had a significant effect on multisensory integration.


(I think the cats went for the pellets whether there were two of them or four.)


Experiment 2 asked whether the auditory stimulus had to be explicitly associated with the visual stimulus in order to see enhanced cross-modal multisensory integration, or whether cross-modal stimuli were integrated by default unless explicitly dissociated from reward. So a third, novel auditory stimulus was added: V1-A3 (high), V2-A3 (low), and A3 alone (no reward).


What they found was that the novel stimulus combinations were integrated and enhanced responses, just like the familiar combinations from before.


Experiment 3 looked at how multisensory integration operates in dynamic circumstances where stimulus value can change rapidly. This was done by reversing the coupling probabilities and switching the visual reward associations the cats had earlier adapted to.


The result showed that reversing the reward contingencies reversed the multisensory enhancement pattern. That is to say, when a modality lost its reward value, the cats ceased to integrate it, even if that modality pair was congruent (or mutually informative).


But when that stimulus was re-associated with reward, it regained its ability to integrate and enhance responses, and this effect persisted for a while even when the pair was no longer linked with reward. This suggests that experience can modify the seemingly automatic process of multisensory integration. That is, experience can selectively block or filter out inputs to the multisensory computation.


Discussion.

In addition to conclusions about the findings, the discussion section of the paper also touched upon neural correlates for the filter. For instance, in the primary visual cortex, neurons sensitive to reward association show enhanced representations and higher BOLD responses around the time of reward. Responses in visual area V4 were modulated by increasing reward across targets. Responses in primary auditory cortex A1 were also modulated by reward and aversive cues.


The SC was also discussed with respect to experience. Cats developed the ability to integrate cross-modal stimuli only at locations at which they had experience with them. This influence reaches the SC via a corticotectal projection originating in an area of association cortex, the anterior ectosylvian sulcus, and this component is crucial for the expression of multisensory integration in detection/localization behaviors. Such experience is necessary for the SC to be able to integrate visual and auditory inputs.


The discussion then makes an important distinction: while experience with covariant cross-modal stimuli is essential during development, it appears to establish a general capability to integrate information across those senses. This means that once the capability is formed, stimuli with different features, like those involved in the value-based filter, appear to be integrated to the same degree, and extensive experience with every specific feature combination is not essential. Extensive additional experience with covariant stimulus features has no further facilitating effect. However, this may not be universally true in all multisensory domains.

 



====


Some terms explained: 


Multisensory Integration: This is the process by which the brain combines information from multiple senses to make decisions. It is believed to be pre-conscious, effortless, and highly efficient. 

Covariant Cross-Modal Stimuli: This refers to pairs of auditory-visual stimuli that are congruent and mutually informative.

The Superior Colliculus (SC) is a midbrain structure involved in the control of eye movements and visual attention. It integrates visual and auditory inputs and generates enhanced responses to combined auditory-visual stimuli.

Covariant auditory-visual experience refers to the experience of two stimuli that are presented together, such as a sound and a visual stimulus. This experience is necessary for the SC to be able to integrate visual and auditory inputs. 

The anterior ectosylvian sulcus is an area of association cortex in the cat. Its projection to the SC is crucial for the expression of multisensory integration in detection/localization behaviors.


