Why Sensory Overload Isn’t About “Too Much”

A neuroscientist’s view of sensory effort in autism and ADHD

Key points

  • The brain works harder when sensory information is unclear, and eases off when it’s clear.
  • Sensory overload often reflects sustained effort, not oversensitivity.
  • Autism and ADHD can involve carrying that effort for longer periods of time.
  • Predictability often reduces sensory strain more than reducing stimulation.

When autistic or ADHD people talk about sensory overload, the responses are usually meant to be reassuring.

“Everyone gets overwhelmed sometimes.”
“Try to tune it out.”
“You’ll get used to it.”

What these comments quietly assume is that sensory overload is a problem of quantity. Too much noise. Too much light. Too much stimulation.

As a neuroscientist, I think that framing is incomplete.

What strains the brain is not simply intensity. It’s uncertainty.

The brain is not a passive receiver of sensory input. It doesn’t wait for the world to arrive and then react. Instead, it is constantly combining information from sight, sound, touch, movement, and timing to answer a basic question: What is happening right now, and how important is it?

In neuroscience, this process is called multisensory integration. It refers to how the brain fuses information across the senses into a single interpretation of an event.

You experience it all the time. A voice becomes linked to a face. Footsteps paired with motion feel more urgent. A room feels calm or chaotic before you can explain why. Most of the time, this integration happens smoothly and outside awareness.

Until it doesn’t.

One detail that’s often overlooked is that multisensory integration isn’t guaranteed. The brain doesn’t always fuse information just because it arrives through more than one sense. Integration depends on how trustworthy the signals feel and whether combining them actually reduces uncertainty.

That distinction matters, because it means integration itself can be effortful.

One of the core principles governing multisensory integration is known as inverse effectiveness. Despite the technical name, the idea is intuitive.

When sensory signals are weak, ambiguous, or unreliable, the brain boosts their combination more strongly. When signals are already clear and robust, adding more information helps less.

Neuroscientists describe this using terms like superadditive and subadditive integration.

Superadditive integration means the brain’s response to two signals together is greater than the sum of each signal on its own. Two weak cues can suddenly feel urgent when they occur together. Imagine hearing a faint sound in a quiet house. On its own, you might ignore it. Now imagine that same faint sound paired with a slight movement in your peripheral vision. Neither signal is strong, but together they demand attention. The brain amplifies the combination because it reduces uncertainty: something is happening.

Subadditive integration, by contrast, occurs when signals are already strong and clear. In that case, adding more information doesn’t help much—and can even interfere. If someone is speaking loudly and clearly right in front of you, adding background music or visual clutter doesn’t improve understanding. It makes the experience more effortful, because the brain has to sort out what’s relevant and what isn’t.

These aren’t abstract math concepts. They describe how the brain allocates effort. Superadditive responses reflect a system working hard to extract meaning from uncertainty. Subadditive responses reflect efficiency—the brain already has enough information and doesn’t need to amplify further.
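For readers who like to see the numbers, here is a toy sketch of how single-neuron studies quantify this: an enhancement index that compares the combined response to the strongest single-sense response alone. The firing rates below are invented for illustration, not data from any study.

```python
# Toy sketch: quantifying super- vs subadditive integration.
# All firing rates here are made up for illustration only.

def enhancement_index(r_auditory, r_visual, r_combined):
    """Percent change of the combined response relative to the best
    single-sense response. Positive values indicate enhancement."""
    best_single = max(r_auditory, r_visual)
    return 100.0 * (r_combined - best_single) / best_single

# Two weak cues: the combined response dwarfs either cue alone.
weak = enhancement_index(r_auditory=4, r_visual=5, r_combined=18)

# Two strong cues: the combined response barely beats the best cue.
strong = enhancement_index(r_auditory=40, r_visual=50, r_combined=55)

print(weak, strong)  # 260.0 10.0 -- weak signals gain far more from fusion
```

The asymmetry in those two numbers is inverse effectiveness in miniature: fusion pays off most when each signal is weak on its own.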

This distinction helps explain why sensory experiences can feel so different across people and contexts.

Many everyday environments are not just stimulating; they are informationally messy. Loud, but not meaningful. Busy, but unpredictable. Full of signals that don’t line up cleanly in space or time.

In those conditions, the brain may remain in a more superadditive mode—continually amplifying combined sensory input in an effort to reduce uncertainty. That amplification is adaptive. But it is also costly.

For autistic and ADHD individuals, whose sensory systems often place greater weight on incoming information, that cost can accumulate quickly.

Multisensory integration also depends on expectations about space and time. Signals that come from the same location and occur close together in time are easier for the brain to bind. Neuroscientists refer to these constraints as spatial alignment and temporal alignment.

When these expectations are met, integration tends to be efficient and often subadditive. When they are violated—when sound and sight drift apart, when timing is inconsistent—integration becomes less efficient, and amplification increases.

Modern environments introduce many small misalignments: overlapping conversations, asynchronous audiovisual cues, subtle visual flicker, unpredictable movement. None of these is necessarily overwhelming on its own. Over time, however, they can push the brain toward sustained superadditive processing—constantly boosting signals to maintain coherence.

What looks like “overreaction” from the outside is often ongoing neural problem-solving.

Another important insight from neuroscience is that multisensory integration isn’t just a reflex. Even rapid orienting responses are shaped by top-down influences—meaning the brain’s expectations, goals, and current state help decide when sensory signals should be amplified and when they should be restrained.

In other words, the brain actively regulates multisensory gain. State matters. Fatigue matters. Context matters. The system can dial integration up or down—but doing so requires resources.

This helps explain why sensory tolerance often collapses when someone is tired, stressed, or already carrying a heavy cognitive load.

This is why sensory challenges are not well captured by the idea of “hypersensitivity” alone.

A more precise concept from neuroscience is gain control—essentially the brain’s volume knob. When information is unclear, the brain turns the signal up to extract meaning. The tradeoff is that everything feels louder, including variability and noise.
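A toy simulation (my own sketch, with made-up numbers) makes the tradeoff concrete: turning up the gain scales the signal and the noise by exactly the same factor, so the world gets louder without getting any clearer.

```python
import random

def perceive(signal, noise_sd, gain):
    """One noisy observation, scaled by the brain's current gain setting."""
    return gain * (signal + random.gauss(0, noise_sd))

random.seed(42)                                  # fix the noise draws...
low_gain  = [perceive(1.0, 0.5, gain=1.0) for _ in range(5)]
random.seed(42)                                  # ...and reuse them here
high_gain = [perceive(1.0, 0.5, gain=3.0) for _ in range(5)]

# Every high-gain observation is exactly 3x its low-gain counterpart:
# the signal is amplified, but so is the moment-to-moment variability.
assert all(abs(h - 3 * l) < 1e-9 for l, h in zip(low_gain, high_gain))
```

The gain values and noise level are arbitrary; the point is only that amplification cannot separate signal from noise by itself.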

From this perspective, heightened sensory responses can reflect a nervous system operating in a high-gain, superadditive state for extended periods. The system isn’t malfunctioning. It’s compensating.

This helps explain patterns many autistic and ADHD people recognize immediately: why unfamiliar environments are harder than familiar ones; why fatigue collapses tolerance; why predictability can be regulating even in stimulating settings; and why recovery often involves restoring coherence rather than eliminating input entirely.

People are often told to “tune out” unwanted sensory input. But multisensory integration happens automatically, at early stages of processing. It is not something one can switch off through effort or intention.

The brain binds information because that is how perception is built. Asking someone to simply ignore conflicting sensory cues is asking their nervous system to suspend a fundamental operation.

What can look like avoidance or rigidity may instead reflect strategic regulation—an attempt to move the system back toward a lower-gain, more subadditive state.

Another detail rarely discussed outside research contexts is that multisensory integration is not fully mature at birth. The brain learns not just how sensory signals relate to one another, but how much weight to give them. This calibration process is shaped by repeated experience.

This learning continues across the lifespan and is shaped by both individual processing styles and repeated experience. Seen this way, sensory patterns in autism and ADHD reflect how people adapt to their environments, rather than fixed characteristics.

Change the context, and the balance between superadditive and subadditive processing can shift—sometimes dramatically.

When sensory challenges are framed solely as personal limitations, responsibility rests entirely on the individual: cope more, adapt faster, tolerate longer.

Looking at sensory overload this way adds nuance without replacing one explanation with another. It shifts attention to the fit between how a person processes information and what their surroundings ask of them—and to how long anyone can reasonably sustain that level of effort.

Some individuals can carry that load with little cost. Others can do so only briefly, or only in certain contexts.

Neither response is a failure.

Rather than treating sensitivity as something to overcome, neuroscience invites us to see it as information—a signal about how a particular nervous system is interacting with a particular set of demands.

Some brains amplify uncertainty more strongly. Some situations generate more uncertainty than others. When those factors align, overload can emerge—not as a sign of weakness, but as a sign that regulatory limits have been reached.

I’m curious how readers recognize this in their own lives.
Are there environments where your sensory system feels efficient and supportive—and others where it feels effortful or draining? What kinds of predictability, timing, or alignment make the biggest difference for you?

I’d welcome your reflections in the comments.

References

Stein, B. E., & Stanford, T. R. (2008).
Multisensory integration: Current issues from the perspective of the single neuron.
Nature Reviews Neuroscience, 9(4), 255–266.
https://doi.org/10.1038/nrn2331


How the Brain Fuses the Senses

A classic paper that changed how we think about perception

Imagine walking down the street when you hear a dog bark and, at the same moment, see something moving toward you. You don’t experience these as two separate events—sound first, vision second. Instead, your brain delivers a single, urgent message: something is coming—pay attention.

That seamless fusion is so natural we rarely question it. But how the brain actually pulls this off—how it combines sight, sound, touch, and movement into a coherent experience—turns out to be one of the deepest questions in neuroscience.

In 2008, neuroscientists Barry Stein and Terrence Stanford published a paper that fundamentally reshaped how scientists think about this process. Rather than talking about perception in abstract terms, they asked a far more concrete question: what does a single neuron do when it receives information from more than one sense?

The answer changed everything.

Multisensory integration is not just “many senses at once”

At first glance, multisensory integration sounds obvious. We see and hear at the same time, so of course the brain combines those signals. But Stein and Stanford were very precise about what counts as integration.

From a neural perspective, integration only occurs when a neuron responds differently to a combined stimulus than it does to the strongest single stimulus alone. If a neuron fires the same way whether it hears a sound or hears a sound plus sees a flash, nothing special is happening. But if the combined input changes the response—boosting it, suppressing it, or reshaping it—then the brain is doing real computation.

This distinction matters because it shows perception isn’t about stacking sensory channels side by side. It’s about transformation.
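That criterion can be written down in a few lines. This is my own toy illustration with invented firing rates, not code or data from the paper:

```python
# Toy form of the criterion: a neuron is said to integrate only when its
# combined response differs from its response to the strongest single
# stimulus alone. Firing rates are invented for illustration.

def classify_response(r_best_single, r_combined):
    if r_combined > r_best_single:
        return "enhancement"      # combined input boosts the response
    if r_combined < r_best_single:
        return "depression"       # combined input suppresses it
    return "no integration"      # nothing special is happening

print(classify_response(10, 25))  # enhancement
print(classify_response(10, 4))   # depression
print(classify_response(10, 10))  # no integration
```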

Inside the multisensory neuron

Some neurons are wired to respond to more than one sensory modality. A single neuron might fire to a sound, fire to a visual cue, and then respond even more strongly when those cues occur together.

What Stein and Stanford showed is that this extra response follows rules. Sometimes the combined signal produces a dramatic boost. Other times, the neuron actually responds less when multiple senses are involved.

That might seem counterintuitive. Why would the brain ever dampen a response when it has more information? Because integration isn’t about maximizing input—it’s about deciding what matters.

When more becomes less—and why that’s useful

One of the most influential insights from the paper is that multisensory integration can enhance or depress neural responses. A combined sight-and-sound signal might amplify activity, or it might suppress it, depending on context.

This led to a deeper realization: neurons don’t combine signals in a single way. Sometimes the response to two senses is greater than the sum of their parts. Sometimes it’s roughly equal. And sometimes it’s far less than you’d expect.

Out of this came a principle that now appears everywhere in multisensory research: inverse effectiveness. The weaker or noisier the individual signals, the more the brain gains by combining them. When each sense is already clear and strong, integration adds little. But when information is uncertain—dim light, background noise, ambiguity—the benefits of fusion become dramatic.

This helps explain why multisensory processing plays such a powerful role in development, in challenging environments, and in many clinical contexts. Integration is not a luxury. It’s a strategy for dealing with uncertainty.

Space and time set the boundaries

The brain doesn’t integrate signals indiscriminately. Stein and Stanford showed that multisensory neurons obey strict spatial and temporal constraints.

Signals are most likely to be fused when they come from the same place. If a sound originates on the left and a visual cue appears on the right, neurons are far less likely to treat them as part of the same event. This spatial rule reflects a basic assumption built into the nervous system: things that belong together tend to happen together in space.

Timing matters just as much. The brain operates with a temporal binding window—a span of time during which signals can still be linked even if they don’t arrive simultaneously. This window accounts for the fact that sound, light, and touch travel at different speeds and are processed at different rates. Integration works best when neural responses overlap in time, not merely when stimuli occur at the exact same instant.

Together, these spatial and temporal rules ensure that integration supports coherent perception rather than confusion.
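As a rough sketch, the two constraints amount to a simple gate on binding. This is my own toy rule; both thresholds below are illustrative placeholders, not measured values, and real binding windows vary by person, task, and modality.

```python
# Toy binding rule: fuse two signals only if they roughly agree in
# space AND their onsets fall within a temporal binding window.
# Both thresholds are illustrative, not measured values.

MAX_SPATIAL_GAP_DEG = 15    # rough spatial tolerance, in visual degrees
BINDING_WINDOW_MS   = 100   # rough temporal window width

def fused(location_a_deg, location_b_deg, onset_a_ms, onset_b_ms):
    same_place = abs(location_a_deg - location_b_deg) <= MAX_SPATIAL_GAP_DEG
    same_time  = abs(onset_a_ms - onset_b_ms) <= BINDING_WINDOW_MS
    return same_place and same_time

print(fused(0, 5, 0, 60))     # aligned in space and time -> True
print(fused(-40, 40, 0, 60))  # sound left, flash right   -> False
print(fused(0, 5, 0, 400))    # too far apart in time     -> False
```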

A midbrain structure with outsized influence

Much of Stein and Stanford’s work focused on the superior colliculus, a midbrain structure involved in orienting the eyes and head, shifting attention, and responding quickly to important events.

The superior colliculus turned out to be densely packed with multisensory neurons, making it an ideal place to study integration at the level of individual cells. When integration occurs here, behavior improves: responses are faster, localization is sharper, reactions are more efficient.

But one of the paper’s most striking findings is that the superior colliculus doesn’t work alone.

Integration is not a reflex—it’s a circuit

When researchers temporarily deactivated certain cortical areas, superior colliculus neurons still responded to sights and sounds. But something crucial disappeared. The extra boost from combining senses vanished. So did the behavioral advantages.

This showed that multisensory integration is not a simple bottom-up reflex. It depends on communication between cortex and midbrain. Integration emerges from a distributed circuit, shaped by experience, context, and higher-level processing.

Learning to fuse the senses

Perhaps the most surprising insight in the paper is that multisensory integration is not fully present at birth. Early in development, neurons may respond to multiple senses, but they don’t yet integrate them effectively.

Integration has to be learned.

Animals raised without normal cross-sensory experience—such as visual input paired with sound—fail to develop typical multisensory integration. The brain needs correlated experience to discover which signals belong together.

This makes multisensory integration a powerful example of experience-dependent plasticity. The brain doesn’t just receive the world. It learns how to bind it.

Cortex adds meaning, not just alignment

In higher cortical areas, multisensory integration becomes less about where and when, and more about meaning. Signals are evaluated for context, relevance, and semantic fit.

A voice paired with a matching face enhances neural responses. A mismatch can suppress them. Integration here reflects interpretation, not just detection.

This reveals an important shift: multisensory integration is not one process but many. Each brain region integrates information in ways that serve its goals—action, communication, prediction, understanding.

Rethinking “unisensory” cortex

The paper ends with a question that still unsettles neuroscience. If even early sensory areas receive input from other senses, does it still make sense to talk about purely visual or auditory cortex?

Stein and Stanford don’t argue for abandoning these labels altogether. Instead, they suggest a more nuanced view—one that recognizes gradients, transitional zones, and widespread multisensory influence.

Perception, in this view, is never purely unisensory. It is shaped by context from the start.

Why this paper still matters

Nearly two decades later, this work remains foundational because it demonstrated that multisensory integration is nonlinear, rule-governed, learned, and behaviorally meaningful. It showed that perception is not passive reception but active synthesis—built from circuits that balance signal strength, uncertainty, experience, and purpose.

That insight continues to shape how we think about attention, development, peripersonal space, predictive processing, and sensory differences in autism and ADHD.

In short, the paper taught us that the brain doesn’t simply sense the world. It actively constructs it—one integrated moment at a time.

Reference
Stein, B. E., & Stanford, T. R. (2008). Multisensory integration: Current issues from the perspective of the single neuron. Nature Reviews Neuroscience, 9(4), 255–266. https://doi.org/10.1038/nrn2331


My TEDx Talk

My TEDx talk is titled “Pebbles in the Pond of Change.”

Hari Srinivasan shares a powerful message about how small actions create ever-widening ripples in the pond of change. Drawing from personal experiences and the legacy of disability rights leaders, he redefines progress as a journey that starts with simple, accessible steps. His inspiring message encourages everyone to identify and act on their own “small pebbles” to drive societal transformation.

2025, Quietly


On neuroscience, disability, staying with the mess, and the work that continues anyway.


I don’t always write year-in-review posts; my last one was the year I graduated from UC Berkeley, a giddy experience. But this one felt worth writing down—not as a highlight reel, but as a record of how a year actually unfolded.


2025 unfolded quietly—with a calendar that filled up faster than I expected and a body that occasionally reminded me (loudly) that plans are always provisional. Coming off a stressful 2024 dominated by qualifying exams, the year felt different—more open. Looking back, 2025 was a year of steady academic progress, the kind that doesn’t feel dramatic in the moment but adds up when you stop and actually take stock.


I love knowledge. The neuroscientist part of me is always looking for patterns, mechanisms, explanations—how systems adapt, where things break. That instinct hasn’t gone anywhere. But disability doesn’t work like a clean equation. And life doesn’t reduce to mechanisms.


There were health bumps this year. The kind that don’t ask permission. They slowed me down, forced adjustments, and reminded me—again—that there’s no clean story here. 

Disability is messy—messy, messy. Many days are just downright hard.


Academically, 2025 was one of my strongest grounding years yet. Not in a flashy way—but in the steady, settled sense that the work is maturing, finding its voice, and reaching the people it’s meant to reach.


Publishing entered the year quietly, through drafts, revisions, and the unfamiliar process of seeing early work move toward completion.


Journal publications: 3 first-author, 2 co-author.

Several other manuscripts are still winding their way through peer review—living in that space of waiting and cautious optimism.


Alongside writing, I spent time on the other side of the process too: completing peer reviews, joining the editorial team of Autism in Adulthood, and serving as incoming Student Editor for Vanderbilt Reviews Neuroscience. Reviewing isn’t glamorous, but it’s where a research field quietly decides what counts—a responsibility that should not be taken lightly.


On the research front, something important happened: I completed an initial pilot study for my own work. Careful steps, but real data. Not only did my research task measure what it was supposed to measure; it also turned out to be a novel measure in multisensory science. And its simple, intuitive design makes it accessible to a wider range of autistic participants. I also had my first dissertation committee meeting, which helped guide the direction of the research moving forward. Along the way, I presented my research at multiple conferences, received two research poster awards, and was invited to the Sigma Xi scientific honor society.


In July, I traveled to the UK with my research lab for the IMRF conference in Durham—a meeting devoted entirely to multisensory science. It was the kind of conference where no one needs convincing that perception is embodied, contextual, and relational. Everyone spoke the same sensory language, and that made the science feel both rigorous and expansive.


Getting there was part of the experience. The long train ride from the rather chaotic King’s Cross station up to Durham gave the trip a sense of gradual arrival—watching the landscape shift before the intensity of the conference began. The journey even spilled into a small moment of levity: a poem I wrote, “We All Live in a Multisensory World,” loosely set to “Yellow Submarine” and echoing a phrase my research mentor often repeats. Evenings found their own rhythm: dinners with lab mates (including a Turkish one), narrow alleyways, and dancing at Durham Castle, which still serves as a student dorm at Durham University. Imagine getting to live in a castle.


Durham Cathedral itself is hard to stand in without thinking of Harry Potter—so steeped in that imagery that you half expect a spell to echo through the nave. I also saw the Magna Carta—all surviving versions in one place. Impressive, yes, but what also came to mind was a long-running version-control problem. When the guide pointed out how rosaries in portraits were painted over or restored depending on the era, it felt like a visual changelog of belief systems being edited to fit the moment.


After the conference, I spent a couple of quieter days in Kent with the lively Aunty Bessie, who is Tongan, enjoying her stories and updates from Tonga. I needed that shift in pace.


One of the unexpected highlights of the year was serving on the Scientific Advisory Committee for the Autism Europe Congress in Ireland in September. It was an honor—but also a reminder of how powerful it is to be in rooms where disability-centered research is treated as foundational, not peripheral. I also got to witness how autism research is unfolding in Europe. I’m not the most “social” person in face-to-face interactions, but it was meaningful to meet folks who had been just email voices before.


Outside the conference halls, Ireland itself left an imprint. The grass really does have a special shade of green—and the rainbows don’t arc politely across the sky; they stretch a full 180 degrees, as if insisting on being seen. You half expect a leprechaun to appear at the edge of the light, digging for gold where the rainbow meets the meadow. I also saw the rugged coasts of Northern Ireland, including basalt rock formations believed to be bridges built by giants, and heard tales of Irish and Welsh giants from a tour bus driver who delivered them with complete sincerity, along with a rendering of the Irish ballad “Cockles and Mussels,” about a fishmonger calling out her wares.


Dare I say it: that was my best conference yet. And for once, I wasn’t stressed at all for an entire trip.


2025 continued as a media year. Two pieces in particular traveled further than I expected:

  • The Physics of Autistic Inertia
  • Do You Grow Out of Autism?

One of the most meaningful milestones was co-writing a book foreword, “Unique Journeys, Common Ground,” with Dr. Temple Grandin for the book Autism for Dummies. That collaboration mattered to me not just intellectually, but personally. I've grown up hearing her name.


Invited presentations:

  • NSF ERVA: Engineering Visioning the Future for Neurodivergence (Vanderbilt)
  • UC Berkeley Neurodiversity Symposium (Keynote)
  • Teaching two classes on autism research at the Stanford SNP-REACH Summer Program
  • Autism Tree Global Neurodiversity Conference (UCTV)
  • Chennai & Bangalore autism community events (India)

In November, I traveled to India to see grandma. India trips are always layered. In recent years, several visits have coincided with the loss of grandparents—I’d lost two in close succession—and that changes how you experience time, family, and return. This visit carried that weight too, alongside moments of connection, memory, and grounding.


And then December arrived. Quietly. Without fanfare.


I was named an awardee of the SfN Neuroscience Scholars Program.


What stayed with me wasn’t the recognition itself, but what it symbolized. Coming on the heels of last year’s NSF GRFP, along with other awards and the Sigma Xi invitation earlier this year, it felt less like a single moment and more like a steady throughline: that disability-centered neuroscience belongs inside the field, shaping how it moves forward.


If there’s a theme to the year, it’s this: momentum doesn’t always roar. Sometimes it arrives quietly, carrying both grief and growth, and asks you to keep going—with care.


None of this happened in isolation. Whatever worked this year did so because of an ecosystem around me—my superstar research mentors Mark Wallace and Carissa Cascio, my teammates in the Wallace Lab, and Keivan Stassun and Tim Vogus at the Frist Center for Autism and Innovation. Their patience, flexibility, and steady support made it possible for me to keep working through challenges that aren’t always visible, and to do work that would not have been possible on my own.


I’m an “awe-tistic neuroscientist” and a writer who loves ideas—but I don’t want ideas to float above people. I’m still learning how to turn knowledge into care, and how to stay curious without losing humility. I’m a thinker with a very deep inner world; I experience the outside sensory world intensely—and I’m still figuring out how to bridge the two. Disability is messy, and solutions often feel far away. The work continues not because it’s clear, but because it’s necessary.


2025, quietly. 2026, still unfolding.