I’ve been thinking a lot about who gets heard by AI—and who doesn’t. We tend to talk about artificial intelligence as if it’s neutral. Objective. Just math and data. But for many autistic people—especially those who are minimally speaking or nonspeaking—AI systems don’t just fail sometimes. They quietly shut people out. That’s what my paper (currently under peer review) is about: something I call “engineered exclusion.”
What do I mean by “engineered exclusion”?
Most AI communication tools assume a very specific kind of user:
- Speaks fluently
- Speaks quickly
- Uses “standard” English
- Communicates in neat, predictable ways
Users who don’t fit that profile run into the same failures again and again:
- Voice assistants that don’t recognize their speech at all
- Text-to-speech voices that mispronounce basic words or names
- Systems that require extra labor just to be understood
- Interfaces designed more for caregivers than for the user themselves
“Nonspeaking” doesn’t mean “no language”
Nonspeaking and minimally speaking people communicate in many ways: typing, letterboards, AAC devices, gesture, partial or situational speech. AI systems, however, tend to flatten all of that variation into a single question: Does this look like typical speech or not? If the answer is no, the system often treats the user as noise.
Why this keeps happening
These failures aren’t random. Most AI systems are trained and tuned on data from:
- Fluent speakers
- Neurotypical communicators
- Majority-language users
- Western norms of “clear” expression
Then we evaluate those systems using benchmarks that reward speed, fluency, and predictability. So when a system fails to understand a nonspeaking autistic user, the problem isn’t labeled exclusion. It’s labeled error.
And the burden to fix it gets pushed onto the user—who has to type things phonetically, add extra spaces, reword sentences, or give up altogether. From the system’s perspective, everything looks fine. From the user’s perspective, communication becomes exhausting.
Designed dignity: a different way forward
In the paper, I argue for the opposite design stance: building systems that treat every user’s communication as legitimate by default. That means:
- Valuing expressive access as much as input accuracy
- Designing for communication that changes over time and state
- Measuring whether people can be heard, not just whether the system performs well on average
- Including nonspeaking autistic people (and their families) as co-designers, not edge cases
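The measurement point can be made concrete. Here is a minimal sketch, with entirely invented numbers and user labels, of how an average-based benchmark can look fine while a per-user view reveals that one person is effectively shut out:

```python
# Hypothetical example: average performance vs. whether each user is heard.
# WER = word error rate (lower is better). All values below are invented.
wer = {
    "fluent_user_1": 0.05,
    "fluent_user_2": 0.04,
    "fluent_user_3": 0.06,
    "nonspeaking_user": 0.70,  # the system barely understands this user
}

mean_wer = sum(wer.values()) / len(wer)   # what a typical benchmark reports
worst_wer = max(wer.values())             # what exclusion actually looks like

print(f"mean WER:  {mean_wer:.2f}")   # looks acceptable on a leaderboard
print(f"worst WER: {worst_wer:.2f}")  # one user cannot be heard
```

A leaderboard that only reports the mean would call this system a success; a worst-case (or per-group) metric makes the excluded user visible.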
Accessibility isn’t a bonus feature. It’s part of whether AI can honestly claim to be fair.