XR and Accessibility

Feature written by Shanna Finnigan of USC Dornsife

One of the biggest criticisms leveled at Extended Reality is its lack of attention to accessibility. The fact of the matter is that none of us can count on using our senses to the same extent as everyone else around us, so we will all experience XR in different ways. Some people are at an even greater disadvantage because of motor, visual, or auditory impairments. XR is all about engaging with the world around you through your senses; it depends heavily on motion, visuals, and audio to create a memorable experience. So what happens when someone is unable to engage with this medium because of a disability?

VR and AR designers are entering the age of new mediums, in which the usual accessibility solutions no longer apply as neatly. Dylan Fox and Isabel Thornton bring up a great example of this issue in their overview of ethical practices in the space: “features like captions are well understood on rectangular 2D screens but present many new challenges in XR” (IEEE Global Initiative). Designers have found it challenging to completely reimagine user interfaces and experiences to accommodate as many people as possible. VR is already taxing on an able body; for people with motor dysfunctions, whether that means constant involuntary head movement or paralysis, it is even more so. Many XR products have turned to eye tracking to solve this problem, but even that is not enough: many people who have suffered strokes or developed ALS, for example, cannot move their eyes at all. Because we are so early in the process, not many solutions exist yet.

Ventures like Cognixion are trying to change this. Cognixion is a growing company that specializes in “Assistive Reality” solutions. Its main product, Cognixion One, is an AR headset that aids communication between users who are non-verbal (or cannot communicate clearly by speaking) and the people around them: caretakers, family members, friends, and so on. In a very simplified sense, this is how it works: an EEG at the back of the headset is secured against the user’s head and reads the brain waves in their occipital lobe. On the sunglass-like frames in front of the user’s eyes, they see a stereoscopic, hologram-like projection of a keyboard and other UI elements. The EEG detects minute changes in brain activity and deduces which letter the user is looking at on the display in front of them. Essentially, they are typing out words to the little computer inside their headset using their brain! Once a phrase is fully formulated, the words are played back as audio and mirrored on the outward-facing side of the headset’s shades, so whoever the user is holding a conversation with can both see and hear what they are saying.
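
To make that letter-selection step a little more concrete, here is a minimal, purely illustrative sketch in Python of how an EEG-driven keyboard pipeline of this kind could be wired together. It is not Cognixion’s actual implementation: the SSVEP-style correlation scoring, the eight-key layout, and every function name below are assumptions introduced only to show the flow from occipital EEG windows to a spoken, outwardly displayed word.

import numpy as np

KEYBOARD = ["A", "B", "C", "D", "E", "F", "G", "H"]  # hypothetical 8-key layout

def classify_gaze_target(eeg_window, templates):
    # Score each key by correlating the occipital EEG window with that key's
    # visual-stimulus template (an SSVEP-style assumption, not Cognixion's method).
    scores = [np.corrcoef(eeg_window, template)[0, 1] for template in templates]
    return int(np.argmax(scores))

def spell_word(eeg_windows, templates):
    # One letter per EEG window: the user "types" by attending to keys in sequence.
    return "".join(KEYBOARD[classify_gaze_target(w, templates)] for w in eeg_windows)

def speak_and_display(word):
    # Stand-in for the headset's output step: play the word as audio and mirror
    # it on the outward-facing side of the visor for the conversation partner.
    print(f"[audio] {word}")
    print(f"[outward display] {word}")

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    templates = rng.standard_normal((len(KEYBOARD), 256))  # one template per key
    # Simulated EEG windows that happen to resemble the templates for B, A, D.
    windows = [templates[i] + 0.1 * rng.standard_normal(256) for i in (1, 0, 3)]
    speak_and_display(spell_word(windows, templates))

In the real headset the signal classification, the stereoscopic keyboard rendering, and the speech output are of course far more sophisticated, but the overall data flow sketched above is the same: read brain activity, pick a letter, assemble words, then speak and show them.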

The field of XR is growing rapidly, and with this fast growth comes a responsibility for designers and engineers to be mindful of who they are designing for.


For the full feature and to see more of Shanna’s great work, head over to USC Dornsife here