I want to start with a moment most of us recognize, even if we’ve never named it.
You’re waiting to cross the street. Your eyes are fixed on the signal. Somewhere in the background, there’s a faint beeping sound. You’re not consciously deciding which one to trust. You’re just waiting—and the instant something tells you it’s time, you move.
Now imagine the light changes and the beep sounds at the same moment. You step forward a little faster than usual.
At first glance, it feels obvious why. Two senses together must be “working better,” right? Vision and hearing combine, reinforce each other, and speed things up.
But neuroscience has a habit of questioning things that feel obvious.
This is where the idea known as the race effect comes in, and it quietly complicates how we think about multisensory processing—especially in autism.
The race effect starts with a surprisingly modest question: what if your senses aren’t collaborating at all? What if they’re competing?
Instead of vision and hearing merging into a single unified signal, imagine them running in parallel, like two runners heading toward the same finish line. Whichever one gets there first triggers your response. When both are present, you’re faster not because your brain fused them, but because you gave it two chances to succeed.
This isn’t a metaphor neuroscientists use casually. It’s formalized in what’s called the race model, which acts as a kind of skeptic inside multisensory research. It asks whether the benefits of seeing and hearing something together can be explained by simple probability alone, an effect known as statistical facilitation. Two independent processes, racing side by side, will naturally produce faster responses some of the time. No communication required.
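It’s easy to underestimate how much speed a pure race buys. Here’s a minimal simulation sketch in Python; the reaction-time distributions are invented for illustration, not drawn from any real dataset:

```python
import numpy as np

rng = np.random.default_rng(seed=1)
n_trials = 100_000

# Hypothetical reaction times in milliseconds for each sense alone.
# These distributions are made up purely for illustration.
rt_vision = rng.normal(loc=300, scale=40, size=n_trials)
rt_hearing = rng.normal(loc=320, scale=50, size=n_trials)

# A pure race: on each audiovisual trial, whichever process finishes
# first triggers the response, so the response time is the minimum.
rt_race = np.minimum(rt_vision, rt_hearing)

print(f"vision alone : {rt_vision.mean():.0f} ms")
print(f"hearing alone: {rt_hearing.mean():.0f} ms")
print(f"race (min)   : {rt_race.mean():.0f} ms")  # reliably faster than either
```

The race comes out faster than either sense alone every time you run this, even though the two processes never exchange a single bit of information.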
Why does this matter? Because for years, faster responses to multisensory input were often taken as automatic evidence of integration. The race model forces a pause. It draws a line in the sand and says: up to this point, speed can be explained without the senses ever talking to each other. Only when responses are faster than that line allows do we have strong evidence that the brain is truly integrating information across senses.
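That line has a standard mathematical form, the race model inequality (Miller, 1982): at every time t, the probability of having responded by t on an audiovisual trial can be at most the sum of the single-sense probabilities, P(RT ≤ t | AV) ≤ P(RT ≤ t | A) + P(RT ≤ t | V). Here’s a simplified sketch of that comparison using empirical CDFs; the function names are mine, and published analyses use more careful quantile-based statistics:

```python
import numpy as np

def ecdf(sample, t):
    """Empirical cumulative probability P(RT <= t) at each time in t."""
    return np.searchsorted(np.sort(sample), t, side="right") / len(sample)

def max_race_violation(rt_av, rt_a, rt_v, n_points=100):
    """Largest amount by which the audiovisual CDF exceeds the race bound.

    A positive value means some responses were faster than any race of
    independent processes could produce: the usual evidence that the
    senses are genuinely being integrated."""
    t = np.linspace(min(map(np.min, (rt_av, rt_a, rt_v))),
                    max(map(np.max, (rt_av, rt_a, rt_v))),
                    n_points)
    # Miller's bound, capped at 1 since it is a probability.
    bound = np.minimum(ecdf(rt_a, t) + ecdf(rt_v, t), 1.0)
    return float(np.max(ecdf(rt_av, t) - bound))
```

If the audiovisual CDF stays at or below the bound everywhere, the speedup is fully consistent with a race. Only where it pokes above the bound do we have responses no race could explain.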
This distinction turns out to be especially important when we talk about autism.
Autistic sensory processing is often described using blunt language. Too sensitive. Not sensitive enough. Overwhelmed. Delayed. But the race effect invites a more careful question: when autistic people respond differently to multisensory input, is that because integration is impaired—or because the brain is doing something else entirely?
Across studies, autistic participants don’t reliably show strong violations of the race model. Sometimes multisensory cues don’t speed things up as much as expected. Sometimes they help only under specific timing conditions. Sometimes they don’t help at all.
It’s tempting to interpret this as a deficit. But that interpretation assumes that faster is always better, and that automatic integration is always the goal.
What if it isn’t?
If your brain is less inclined to fuse sensory signals automatically, you may rely more on each sense independently. That can mean slower responses in simple lab tasks—but it can also mean greater precision, reduced susceptibility to misleading cues, and more control over when and how information is combined.
From this perspective, autistic sensory processing isn’t broken integration. It’s selective integration.
And selective integration comes with a cost that doesn’t show up neatly in reaction-time graphs: effort.
Many everyday environments are designed around the assumption that multisensory integration happens effortlessly. Classrooms, offices, restaurants, and public spaces bombard us with overlapping sounds, lights, movements, and social signals. If your nervous system doesn’t automatically collapse all of that into a single, coherent stream, you’re left doing continuous sensory arbitration—deciding, moment by moment, what to trust, what to ignore, and what to act on.
The race effect helps explain why this can be exhausting. When senses are racing rather than fusing, the brain stays on high alert. It doesn’t take shortcuts. It doesn’t assume redundancy is helpful. It waits.
Slower responses, in that light, aren’t signs of disengagement. They’re signs of caution.
This reframing matters because it challenges a quiet moral judgment that often sneaks into discussions of autism: that efficiency equals health, and speed equals competence. The race model reminds us that nervous systems are not optimizing for speed alone. They are optimizing for survival in specific contexts.
In uncertain or overwhelming environments, automatic integration can backfire. Ignoring redundant cues, delaying decisions, or keeping sensory channels separate may actually be protective. Sometimes, letting senses race instead of forcing them to merge is the safer strategy.
Autism makes this tradeoff visible. It reveals the hidden labor that most brains perform invisibly—and reminds us that what looks like delay from the outside may reflect careful computation on the inside.
Once you see the race effect this way, the question shifts. It’s no longer “Why don’t autistic people integrate senses automatically?” It becomes “What kinds of environments assume automatic integration—and who do those environments leave behind?”
That’s not just a neuroscience question. It’s a design question. A social question. And, ultimately, an ethical one.