Jeremy is a 31-year-old autistic man who loves music and biking. He’s highly sensitive to lights, sounds, and textures, has difficulty initiating movement, and can say only a few words. Throughout his schooling, it was assumed he was incapable of learning to read and write. But for the past 30 minutes, he’s been wearing an augmented-reality (AR) headset and spelling single words on the HoloBoard, a virtual keyboard that hovers in the air in front of him. And now, at the end of a study session, a researcher asks Jeremy (not his real name) what he thought of the experience.
Deliberately, poking one virtual letter at a time, he types, “That was good.”
It was not obvious that Jeremy would be able to wear an AR headset, let alone use it to communicate. The headset we use,
Microsoft’s HoloLens 2, weighs 566 grams (more than a pound), and the straps that encircle the head can be uncomfortable. Interacting with virtual objects requires precise hand and finger movements. What’s more, some skeptics doubt that people like Jeremy can even understand a question or produce a response. And yet, in study after study, we have found that most nonspeaking autistic teenage and adult participants can wear the HoloLens 2, and most can type short words on the HoloBoard.
Nonspeaking autistic people can use the HoloBoard to type independently.
The HoloBoard prototype that Jeremy first used in 2023 was three years in the making. It had its origins in an interdisciplinary
feasibility study that considered whether individuals like Jeremy could tolerate a commercial AR headset. That study was led by the three of us: a developmental psychologist (Vikram Jaswal at the University of Virginia), an electrical and software engineer (Diwakar Krishnamurthy at the University of Calgary), and a computer scientist (Mea Wang, also at the University of Calgary).
Our journey to this point was not smooth. Some autism researchers told us that nonspeaking autistic people “do not have language” and so couldn’t possibly communicate by typing. They also said that nonspeaking autistic people are so sensitive to sensory experiences that they would be overwhelmed by augmented reality. But our data, from more than a half-dozen peer-reviewed studies, have shown both assumptions to be wrong. And those results have informed the tools we’re creating, like the HoloBoard, to enable nonspeaking autistic people to communicate more effectively.
What Is Nonspeaking Autism?
Autism is a lifelong neurological condition that affects people in very different ways. It’s most commonly associated with social differences, but many autistic people also have difficulty with communication. In fact, about
one-third of autistic children and adults are nonspeaking: Even after years or decades of speech therapy, they cannot communicate effectively using speech. We don’t yet know why, but it may be related to the significant motor challenges associated with producing speech. As with autism in general, nonspeaking autistic people have a range of abilities and language skills: Some are comfortable typing, while others struggle to communicate at all.
Nonspeaking autistic people may also appear inattentive, engage in impulsive behavior, and score poorly on standard intelligence tests (many of which require spoken responses within a set amount of time). Historically, these challenges have led to unfounded assumptions about these individuals’ ability to understand language and their capacity for symbolic thought. To put it bluntly, it has sometimes been assumed that someone who can’t talk
is also incapable of thinking.
Most attempts to provide nonspeaking autistic people with an alternative to speech have been rudimentary.
Picture-based communication systems, often implemented on an iPad or tablet, are frequently used in schools and therapy clinics. If a user wants a cookie, they can tap a picture of a cookie. But the vocabulary of these systems is limited to the concepts that can be represented by a simple picture.
When asked what he thought of a HoloBoard session, a user typed out a positive review. Ethereal Research Group
There are other options. Some nonspeaking autistic people have learned, over the course of many years and guided by parents and professionals, to communicate by spelling words and sentences on a letterboard that’s held by a trained human assistant—a communication and regulation partner, or CRP. Part of the CRP’s role is to provide attentional and emotional support, which can help with conditions that commonly accompany severe autism and that interfere with communication, including anxiety, attention-deficit hyperactivity disorder, and obsessive-compulsive disorder. Having access to such assisted methods of communication has allowed nonspeaking autistic people to
graduate from college, write poetry, and publish a best-selling memoir.
But the role of the CRP has generated considerable controversy.
Critics contend that the assistants can subtly guide users to point to particular letters, which would make the CRP, rather than the user, the author of any words produced. If nonspeaking autistic people who use a letterboard really know how to spell, critics ask, why is the CRP necessary? Some professional organizations, including the American Speech-Language-Hearing Association, have even cautioned against teaching nonspeaking autistic people communication methods that involve assistance from another person.
And yet, research suggests that CRP-aided methods can teach users the skills to communicate
without assistance; indeed, some individuals who previously required support now type independently. And a recent study by coauthor Jaswal, conducted without a CRP, showed that, contrary to critics’ assumptions, most of the nonspeaking autistic participants knew how to spell. For example, in a string of text without any spaces, they knew where one word ended and the next word began. Using eye tracking, Jaswal’s team also showed that nonspeaking autistic people who use a letterboard look at and point to letters too quickly and accurately to be responding to subtle cues from a human assistant.
Our AR Approach to Autism
So how can technology help nonspeaking autistic people communicate? It’s not unusual for researchers to look at a platform technology like AR and imagine how it could be used to help a group of people. However, the ultimate success of any such project isn’t judged by technical innovation or elegance. Rather, the main criterion for success is whether the end result is used and useful. An amazing technology that is, say, too delicate or expensive to escape the laboratory is of limited value. And a raft of innovations that misses the mark in meeting the needs of the people it’s meant to help is similarly limited.
Our focus, then, was not on improving the underlying AR hardware and system software but on finding the most productive ways to adapt them for our users.
We knew we wanted to design a typing system that would allow users to convey anything they wanted. And given the ongoing controversy about assisted communication, we wanted a system that could build the skills needed to type independently. We envisioned a system that would give users more agency, and potentially more privacy, when used outside a research setting.
Geoff Ondrich [left] uses the Meta Quest 3 headset to type letters independently via the HoloBoard system. The augmented-reality system can be configured to use either hand tracking or eye tracking to determine which letter the user intends to press. Madison Imber
Augmented reality has various features that,
we reasoned, make it attractive for these purposes. AR’s eye- and hand-tracking capabilities could be leveraged in activities that train users in the motor skills needed to type, such as isolating and tapping targets. Some of the CRP’s tasks, like offering encouragement to a user, could be automated and rolled into an AR device. Also, AR allows users to move around freely as they engage with virtual objects, which may be more suitable for autistic people who have trouble staying still: A HoloBoard can “follow” the user around a room using head tracking. What’s more, virtual objects in AR are overlaid on a user’s actual environment, making it safer and less immersive than virtual reality (VR)—and potentially less overwhelming for our target population.
We carefully considered our choice of hardware. While lightweight AR glasses like the Ray-Ban Meta AI glasses and
Snap’s AI Spectacles would have been less cumbersome for users, they don’t have the high-fidelity hand tracking and gaze tracking we needed. Headsets like the HoloLens 2 and Meta’s Quest 3 provide greater computing power and support a broader range of interaction modalities.
We aren’t the first researchers to consider how AR can help autistic people. Other groups have used AR to offer autistic children real-time information about the
emotions people show on their faces, for example, and to gamify social- and motor-skill training. We drew inspiration from those efforts as we took on the new idea of using AR to help nonspeaking autistic people communicate.
A Collaborative Design Project
Our efforts have been powered by our close collaboration with nonspeaking autistic people. They are, after all, the experts about their condition, and they’re the people best suited to guide the design of any tools intended for them. Everything we do is informed by their input, including the design of prototypes and the studies to test those prototypes.
When neurotypical people see someone who cannot talk, whose body moves in unusual ways, and who acts in socially unconventional ways, they may assume that the person wouldn’t be interested in collaborating or wouldn’t be able to do so. But, as noted by
Anne M. Donnellan and others who conduct research with disabled people, behavioral differences don’t necessarily reflect underlying capacities or a lack of interest in social engagement. These researchers have emphasized the importance of presuming competence—in our case, that means expecting nonspeaking autistic people to be able to learn, think, and participate.
Thus, throughout our project, we have invited nonspeaking autistic people to offer suggestions and feedback in whatever manner they prefer, including by pointing to letters on a physical letterboard while supported by a CRP. Although critics of assisted forms of communication may object to this inclusive approach, we have found the contributions of nonspeakers invaluable. Through Zoom meetings, email correspondence, comments after research sessions, and shared Google docs, these participants have provided essential input about whether and how the AR technology we’re developing could be a useful communication tool. In keeping with the community’s interest in more independent communication, our tests of the technology have focused on nonspeakers’ performance
without the assistance of a CRP.
A user selects a letter on the HoloBoard by “pushing” it toward a virtual backplate. Successful activation is accompanied by a click and a recorded voice saying the letter aloud. Ethereal Research Group
In early conversations, our collaborators raised several concerns about using AR. For example, they worried that wearing a head-mounted device wouldn’t be comfortable. Our
first study investigated this topic and found that, with appropriate support and sufficient time, 15 of 17 nonspeakers wore the device without difficulty. We now have 3D-printed models that replicate the shape and weight of the HoloLens 2, to allow participants to build up tolerance before they participate in actual experiments.
Some users also expressed concern about the potential for sensory overload, and their concerns made us realize that we hadn’t adequately explained the difference between AR and VR. We now provide a
video before each study that explains exactly what participants will do and see and shows how AR is less immersive than VR.
Some participants told us that they like the tactile input from interacting with physical objects, including physical letterboards, and were concerned that virtual objects wouldn’t replicate that experience. We currently address this concern using
sensory substitution: Letters on the HoloBoard hover slightly in front of a semitransparent virtual backplate. Activating a letter requires the user to “push” it approximately 3 centimeters toward the backplate, and successful activation is accompanied by an audible click and a recorded voice saying the letter aloud.
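For the technically curious, here is a minimal Python sketch of that push-to-activate logic. The 3-centimeter activation depth comes from the description above; everything else (the class and callback names, the release threshold) is an illustrative assumption, not the actual HoloBoard code.

```python
# Illustrative sketch of push-to-activate, not the actual HoloBoard code.
# A letter fires once the fingertip pushes it ~3 cm toward the backplate;
# feedback is an audible click plus a recorded voice saying the letter.

ACTIVATION_DEPTH_M = 0.03  # ~3 cm of travel toward the backplate (from the text)

class VirtualKey:
    def __init__(self, letter):
        self.letter = letter
        self.activated = False

    def update(self, push_depth_m, play_click, speak):
        """Call every frame with the fingertip's push depth along the key axis."""
        if not self.activated and push_depth_m >= ACTIVATION_DEPTH_M:
            self.activated = True
            play_click()        # sensory substitution: audible click...
            speak(self.letter)  # ...and the letter spoken aloud
        elif self.activated and push_depth_m < ACTIVATION_DEPTH_M / 2:
            self.activated = False  # hysteresis: require a partial release to re-arm
```

In a real headset app, `update` would be driven by the hand-tracking data each frame; the hysteresis keeps a single press from registering twice.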
Nonspeakers’ Preferences and Goals
Our users’ needs and preferences have helped us set priorities for our research program. One person noted that an AR communication system seemed “cool,” but worried that the motor skills required to interact in AR might not be possible without practice. So from the
very first app we developed, we built in activities to let users practice the motor skills they needed to succeed.
Participants also told us they wanted to be able to customize the holograms—not just to suit their aesthetic preferences but also to better fit their unique sensory, motor, and attentional profiles. As a result, users of the HoloBoard can choose its color scheme and the size of the virtual letterboard, and whether the letters are said aloud as they’re pressed. We’ve also provided several ways to
activate letters: by pressing them, looking at them, or looking at them while using a physical clicker.
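Conceptually, these options amount to a per-user settings object along the lines of this sketch (the field names are hypothetical, not our actual data model):

```python
from dataclasses import dataclass
from enum import Enum, auto

class ActivationMode(Enum):
    PRESS = auto()               # push a letter with a fingertip
    GAZE = auto()                # select by looking at a letter
    GAZE_PLUS_CLICKER = auto()   # look at a letter, confirm with a physical clicker

@dataclass
class HoloBoardSettings:
    color_scheme: str = "default"     # aesthetic and sensory preference
    board_scale: float = 1.0          # size of the virtual letterboard
    speak_letters_aloud: bool = True  # say each letter as it's activated
    activation_mode: ActivationMode = ActivationMode.PRESS
```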
We had initially assumed that users would be interested in predictive text capabilities for the HoloBoard—having it autofill likely words based on the first letters typed. However, several people explained that although such a system could theoretically speed up communication, they would find it distracting. We’ve put this idea on the back burner for now; it may eventually become an option that users can toggle on if they wish.
To make things easier for users, we’ve investigated whether the HoloBoard could be positioned automatically in space, dynamically adjusting to the user’s motor skills and movement patterns throughout a session. To this end, we used a
behavioral cloning approach: During real-world interactions between nonspeakers and their CRPs, we observed the position of the user’s fingers, palms, head, and physical letterboard. We then used that data to train a machine learning model to automatically adapt the placement of a virtual letterboard for a specific user.
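In outline, the approach looks like the sketch below, which trains on synthetic stand-in data so it runs end to end; the exact features, targets, and model architecture are simplifications chosen for illustration, not our production pipeline.

```python
# Simplified sketch of behavioral cloning for letterboard placement.
# Real training data would come from recorded sessions between nonspeakers
# and their CRPs; random arrays stand in for it here.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 9))  # per-frame features: fingertip, palm, head (x, y, z each)
y = rng.normal(size=(5000, 5))  # per-frame target: board position (x, y, z) + yaw, pitch

model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=500)
model.fit(X, y)  # "clone" how the physical board was positioned for this user

def place_board(fingertip, palm, head):
    """Predict where the virtual letterboard should sit for the current frame."""
    features = np.concatenate([fingertip, palm, head]).reshape(1, -1)
    return model.predict(features)[0]
```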
Many nonspeaking participants who currently communicate with human assistance see the HoloBoard as providing a way to communicate with more autonomy. Indeed, we’ve found that after a 10-minute training procedure, most users of the HoloBoard can, like Jeremy,
use it to type short words independently. We recently began a six-month study in which five participants have regular sessions to build their typing skills on the HoloBoard.
One of the most common questions from our nonspeaking participants, as well as from parents and professionals, is whether AR could teach the skills needed to type on a standard keyboard. It seems possible, in theory. As a first step, we’re creating other types of AR teaching tools, including an
educational AR app that teaches typing in the context of engaging and age-appropriate lessons.
We’ve also begun developing a
virtual CRP that can offer support and feedback as a user interacts with the virtual letterboard. This virtual assistant, named ViC, can demonstrate motor movements as a user is learning to spell with the HoloBoard, and also offers verbal prompts and encouragement during a training session. There aren’t many professionals who know how to teach nonspeakers typing skills, so a virtual CRP could be a game changer for this population.
Practical and Technical Challenges of AR
Although nonspeakers have responded enthusiastically to our AR communication tools, our conversations and studies have revealed a number of practical challenges with the current technology.
For starters, most people can’t afford Microsoft’s HoloLens 2, which
costs US $3,500. (It’s also recently been discontinued!) So we’ve begun testing our software on less expensive mixed-reality products such as Meta’s $500 Quest 3, and preliminary results have been promising. But regardless of which device is used, most headsets are bulky and heavy. It’s unlikely that someone would wear one throughout a school day, for example. One idea we’re pursuing is to design a pair of AR glasses that’s just for virtual typing; a device customized for a single function would weigh much less than a general-purpose headset.
Illustration: Shonagh Rae
We’ve also encountered technical challenges. For example, the HoloLens 2’s field of view is only 52 degrees. This restricts the size and placement of holograms, as larger holograms or those positioned incorrectly may be partially or entirely invisible to the user. So when participants use their fingers to point at virtual letters on the HoloBoard, some letters near the edges of the board may fall outside the visible area, which is frustrating to users. To address these issues, we used a vertical layout in our
educational app so that the multiple-choice buttons always remain within a user’s field of view. Our systems also allow a researcher or caregiver to monitor an AR session and, if necessary, adjust the size of virtual objects so they’re always in view.
We have a few other ideas for dealing with the field-of-view issue, including deploying devices that have a larger field of view. Another strategy is to use eye tracking to select letters, which would eliminate the reliance on hand movements and the problem of the user’s pointing fingers obscuring the letters. And some users might prefer using a joystick or other handheld controller to navigate and select letters. Together, these techniques should make the system more accessible while working within hardware constraints.
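One common way to implement gaze selection is a dwell timer: a letter is chosen once the gaze has rested on it for a fixed interval. The sketch below illustrates that general idea; the one-second dwell time and the class design are assumptions, not our exact implementation.

```python
# Illustrative dwell-based gaze selection (parameters are assumptions).
DWELL_SECONDS = 1.0

class GazeSelector:
    def __init__(self):
        self.target = None    # letter currently under the user's gaze
        self.elapsed = 0.0    # how long the gaze has rested on it

    def update(self, gazed_letter, dt):
        """Call each frame with the letter under the gaze ray (or None)."""
        if gazed_letter != self.target:
            self.target, self.elapsed = gazed_letter, 0.0  # gaze moved: restart timer
            return None
        if self.target is None:
            return None
        self.elapsed += dt
        if self.elapsed >= DWELL_SECONDS:
            self.elapsed = 0.0    # re-arm so the same letter can be chosen again
            return self.target    # dwell complete: this letter is selected
        return None
```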
We have also been developing
cross-reality apps, which allow two or more people wearing AR headsets to interact within the same virtual space. That’s the setup we use to enable researchers to monitor study sessions in real time. Based on our development experience, we created an open-source tool called SimpleShare for the development of multiuser extended-reality apps in a device-agnostic way. A related issue is that many of our users make sudden movements; a quick shake of the head can interfere with the sensors on the AR headset and upset the spatial alignment between multiple headsets. So our apps and SimpleShare instruct the headset to routinely scan the environment and use that data to automatically realign multiple devices, if necessary.
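Schematically, that realignment works like the loop below. The headset and anchor calls shown are hypothetical stand-ins for platform-specific APIs, and the five-second interval is likewise an assumption.

```python
# Schematic periodic realignment across headsets (hypothetical API names).
import time

REALIGN_INTERVAL_S = 5.0

def realignment_loop(headset, shared_anchor):
    """Rescan the room on a timer and re-anchor this device's coordinates."""
    while True:
        scan = headset.scan_environment()      # fresh spatial map of the room
        pose = shared_anchor.locate_in(scan)   # shared anchor's pose in local frame
        if pose is not None:                   # anchor found: snap back into alignment
            headset.set_world_origin(pose)
        time.sleep(REALIGN_INTERVAL_S)
```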
We’ve had to find solutions to cope with the limited computing power available on AR headsets. Running the AI model that automates the
custom placement of the HoloBoard for each user can cause a lag in letterboard interactions and can cause the headset to heat up. We solved this problem by simplifying the AI model and decreasing the frequency of the model’s interventions. Rendering a realistic virtual CRP via a headset is also computationally intensive. In our virtual CRP work, we’re now rendering the avatar on an edge device, such as a laptop with a state-of-the-art GPU, and streaming it to the display.
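As an example of the second fix, decreasing the model’s intervention frequency can be as simple as running inference every Nth frame and reusing the cached prediction in between, as in this sketch (the interval shown is an assumption):

```python
# Illustrative throttling of the placement model to save on-headset compute.
PREDICT_EVERY_N_FRAMES = 30  # e.g., once per second at 30 frames per second

class ThrottledPlacement:
    def __init__(self, model):
        self.model = model
        self.frame = 0
        self.cached_pose = None

    def on_frame(self, features):
        """Run the model only every Nth frame; otherwise reuse the last pose."""
        if self.frame % PREDICT_EVERY_N_FRAMES == 0:
            self.cached_pose = self.model.predict([features])[0]
        self.frame += 1
        return self.cached_pose
```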
As we continue to tackle these technology challenges, we’re well aware that we don’t have all the answers. That’s why we discuss the problems that we’re working on with the nonspeaking autistic people who will use the technology. Their perspectives are helping us make progress toward a truly usable and useful device.
Everyone Deserves to Be Heard
So many assumptions are made about people who cannot speak, including that they don’t have anything to say. We went into this project presuming competence in nonspeaking people, and yet we still weren’t sure if our participants would be able to adapt to our technology. In our
initial work, we were unsure whether nonspeakers could wear the AR device or interact with virtual buttons. They easily did both. In our evaluation of the HoloBoard prototype, we didn’t know if users could type on a virtual letterboard hovering in front of them. They did so while we watched. In a recent study investigating whether nonspeakers could select letters using eye-gaze tracking, we wondered if they could complete the built-in gaze-calibration procedure. They did.
The ability to communicate—to share information, memories, opinions—is essential to well-being. Unfortunately, most autistic people who can’t communicate using speech are
never provided an effective alternative. Without a way to convey their thoughts, they are deprived of educational, social, community, and employment opportunities.
We aren’t so naïve as to think that AR is a silver bullet. But we’re hopeful that there will be more community collaborations like ours, which take seriously the lived experiences of nonspeaking autistic people and lead to new technologies to support them. Their voices may be stuck inside, but they deserve to be heard.