I recently wrote a Psychology Today piece, “Why Perception Is Not Just What We Sense,” about a simple idea: perception isn’t something we receive. It’s something the brain builds. I used a few familiar illusions—the McGurk effect, the stream–bounce illusion, the sound-induced flash illusion, and the parchment-skin illusion—to show what happens when the building process becomes visible.
What I couldn’t fit into that article is the part I think about most: illusions aren’t one category of party trick. They’re a toolkit. Different illusions probe different “decisions” the brain has to make—about timing, about cause, about whether signals belong together, about what counts as part of the body, and about how much certainty is “enough.”
And once you see that, a lot of so-called sensory problems start to look less like raw sensitivity and more like sustained interpretation.
The brain’s job isn’t accuracy. It’s a usable story.
In the PT article, I described how a click sound can make two ambiguous moving dots feel like they bounced rather than streamed through each other. Or how two quick beeps can make a single flash look like two flashes. These aren’t failures. They’re examples of something the brain does constantly: it takes incomplete data and tries to settle on a stable explanation.
Most people never notice the settling. That’s because in everyday life, the senses usually cooperate. The world is redundant. Events line up. The brain can lean on shortcuts.
But when signals are noisy, conflicting, or slightly out of sync, the shortcuts matter more. Some nervous systems rely on those shortcuts heavily. Others apply them more cautiously. Either way, the same problem exists: the brain has to decide what belongs together.
That decision is where the work is.
A quick map: what different illusions are actually testing
If you want to understand multisensory processing—especially in autism and ADHD—it helps to sort illusions by what kind of decision they stress-test.
1) “Where did it happen?” (spatial capture + recalibration)
Classic example: the ventriloquism effect, where vision pulls perceived sound location. With repeated exposure, the brain can even recalibrate (“ventriloquism aftereffect”).
What it tests: cue weighting (which sense do you trust more for location?) and the brain’s willingness to “retune” when cues conflict.
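Cue weighting has a standard formal description in the perception literature: each sense’s estimate is weighted by its reliability (the inverse of its variance), and the precise sense wins. Here is a minimal sketch of that idea; the function name and all the numbers are my own illustrative choices, not values from any study.

```python
# Illustrative sketch of reliability-weighted cue combination, the model
# often used to describe ventriloquism-style "visual capture" of sound.
# All variances and locations below are invented for illustration.

def fuse_cues(loc_vision, var_vision, loc_audio, var_audio):
    """Combine two location estimates, weighting each by its reliability
    (1 / variance). The more reliable cue pulls the fused estimate toward it."""
    w_vision = (1 / var_vision) / (1 / var_vision + 1 / var_audio)
    w_audio = 1 - w_vision
    return w_vision * loc_vision + w_audio * loc_audio

# Vision is usually far more precise for location than hearing, so the
# fused estimate lands almost on the visual location:
fused = fuse_cues(loc_vision=0.0, var_vision=1.0, loc_audio=10.0, var_audio=9.0)
print(round(fused, 2))  # close to 1.0: the sound is "pulled" toward the face/screen
```

The point of the sketch is just the weighting: a sense doesn’t win by decree, it wins by being less noisy, which is why location is usually “captured” by vision but speech timing often isn’t.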
2) “When did it happen?” (temporal binding + temporal recalibration)
This includes the sound-induced flash illusion, and a whole family of tasks measuring how wide or narrow someone’s binding window is.
What it tests: temporal grouping rules—how strict your brain is about what counts as “the same moment.”
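The “binding window” can be caricatured as a single threshold: two signals count as one moment only if they arrive close enough together. This toy sketch (my own simplification; the millisecond values are invented, and real windows are graded rather than hard cutoffs) shows how a wider window fuses more aggressively:

```python
# Toy "temporal binding window" rule: fuse two signals into one event
# only if they arrive within the window. Real binding is probabilistic
# and graded; this hard threshold is a deliberate caricature.

def binds(t_flash_ms, t_beep_ms, window_ms):
    """Return True if a flash and a beep fall inside the binding window."""
    return abs(t_flash_ms - t_beep_ms) <= window_ms

# The same 80 ms flash-beep gap, judged by two different nervous systems:
print(binds(0, 80, window_ms=100))  # True  — wide window: "one moment"
print(binds(0, 80, window_ms=50))   # False — narrow window: "two events"
```

Same input, different verdicts: this is what tasks measuring the width of someone’s binding window are getting at.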
3) “What happened?” (causal structure under ambiguity)
The stream/bounce illusion is the cleanest example: same visual data, different interpretation. A click nudges the brain toward “collision.”
What it tests: causal inference—is this one event or two? Did they interact? Did something cause something else?
4) “What am I looking at?” (identity and speech binding)
The McGurk effect lives here. It’s not about where or when—it’s about what the percept is.
What it tests: how strongly the brain fuses cues into a single identity when they disagree.
5) “What is my body?” (body ownership + body schema)
Think rubber hand illusion–type paradigms.
What it tests: which signals win when defining “mine”—vision, touch, proprioception, agency.
6) “What does it feel like?” (material and surface properties)
The parchment-skin illusion is a great example: sound changes perceived texture/dryness.
What it tests: how the brain constructs material qualities—often from cross-sensory priors about what roughness should sound like.
This map matters because it shows something subtle: “sensory issues” aren’t one thing. You can struggle with timing but not localization. Or be great at spatial integration but get wrecked by causal ambiguity. The word “sensitivity” flattens all of that.
Autism, ADHD, and the cost of unresolved decisions
Here’s the extension I wanted to make more explicitly.
A lot of the exhausting moments aren’t “too much input.” They’re moments where the brain can’t quickly settle: Is that voice and that face one event, or two? Is that sound part of this movement, or background? Is this touch coming from me, or from something external? Is this mismatch important, or ignorable? Do these signals belong together enough to fuse?
When those decisions resolve quickly, you get smooth perception: one coherent world.
When they don’t, you get something else: attention sticks. Not because you’re dramatic, but because the brain is still doing the job.
This is where autism and ADHD can look similar—both can involve distractibility, overload, and fatigue—but for different reasons. In broad strokes:
- Autistic perception is often described (in some lines of research) as more cautious about fusing when cues conflict—less reliance on “automatic unity.” That can preserve fidelity, but it can also make the world feel less forgiving when signals don’t match.
- ADHD can involve instability of attention and salience, where the system has trouble holding one interpretation steady long enough for it to become background.
Those are not diagnoses in a sentence. They’re just a way of naming what many people recognize: the brain is not only sensing; it’s negotiating. And the negotiation has a metabolic cost.
The real accessibility lever: alignment, not elimination
If perception is “work under uncertainty,” the goal isn’t to remove all stimuli. That’s impossible, and it’s not even always desirable. The lever is simpler:
Reduce unnecessary conflict. Reduce forced decision points.
That can look like:
- Better audio-video sync (even tiny lags matter).
- Cleaner acoustics (less masking, fewer competing streams).
- Predictable rhythms (consistent pacing in speech, predictable transitions).
- Fewer simultaneous demands (don’t pair complex listening with complex navigation).
- Environmental design that minimizes “sensory disagreements” (e.g., harsh lighting + echo + crowd movement is a perfect storm).
Sometimes the most supportive change isn’t dimmer lights or quieter rooms. It’s coherence. Less mismatch. Less ambiguity. Less “invent reality just to keep going.”
A different reframe
The point of illusions isn’t that we’re easily fooled. It’s that the brain is always choosing between interpretations, and it usually chooses the one that keeps the world usable.
So when someone says, “I’m sensory,” I increasingly hear: “My brain is doing more interpretation, more often.” And when someone looks “overwhelmed,” I don’t assume weakness. I assume workload.
Sometimes, the illusion isn’t the problem. It’s the clue.