Microsoft researchers and university collaborators have developed a set of tools to help people with low vision experience virtual reality (VR).
Intern Yuhang Zhao, a graduate student at Cornell Tech, will present a paper next month at the CHI Conference on Human Factors in Computing Systems in Glasgow, Scotland, summarizing joint research with Microsoft’s Ed Cutrell, Christian Holz, Eyal Ofek, Meredith Ringel Morris and Andrew Wilson on enhancing the accessibility of VR technologies.
SeeingVR, a set of 14 tools for Unity developers, is designed to make VR usable by the estimated 217 million people worldwide with low vision. In its accommodations for visual disabilities, VR still lags well behind desktop software.
End-users can activate different combinations of the SeeingVR tools depending on their abilities and the context of the current application and task. Example tools include a magnifier and bifocal views, brightness and contrast adjustment for the scene, edge-enhancement to make virtual objects more salient from their backgrounds, depth measurement tools, and the ability to point at text or objects in a virtual scene to have them read or described aloud.
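The paper does not detail SeeingVR's rendering internals, but the edge-enhancement tool mentioned above rests on a standard image-processing idea: compute intensity gradients and overlay their magnitude so object boundaries stand out. As a loose, hypothetical illustration (not SeeingVR's actual implementation), a Sobel-style gradient magnitude over a grayscale image looks like this:

```python
import numpy as np

def sobel_edges(image):
    """Return the gradient magnitude of a 2-D grayscale image
    using 3x3 Sobel filters (valid region only, no padding)."""
    kx = np.array([[-1, 0, 1],
                   [-2, 0, 2],
                   [-1, 0, 1]], dtype=float)  # horizontal gradient
    ky = kx.T                                  # vertical gradient
    h, w = image.shape
    gx = np.zeros((h - 2, w - 2))
    gy = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            patch = image[i:i + 3, j:j + 3]
            gx[i, j] = np.sum(patch * kx)
            gy[i, j] = np.sum(patch * ky)
    return np.hypot(gx, gy)

# A vertical step edge: magnitude peaks along the brightness boundary.
img = np.zeros((5, 6))
img[:, 3:] = 1.0
edges = sobel_edges(img)
```

In a real-time engine such as Unity, the equivalent filter would run in a post-processing shader over the rendered frame rather than in a Python loop, with the resulting edge map blended back over the scene.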
Microsoft principal researcher and research manager Meredith Ringel Morris explained in a blog post that most of the SeeingVR tools can be applied to existing Unity applications post-hoc, to support easy adoption.
She said: “Evaluation with 11 people with low vision completing a variety of tasks in VR (for example, menu selection, grasping objects, shooting moving targets) found that all participants could complete tasks more quickly and accurately when using SeeingVR tools as compared to the default VR experience. All participants chose different combinations of the available tools, reinforcing the value of allowing flexibility and customization of low vision accessibility options.”
Experience a demo of the work in the video below:
Microsoft researchers are also exploring non-visual representations of VR for people who are completely blind.
Using Microsoft Soundscape, a smartphone application that deploys spatial audio to deliver a rich, non-visual navigation experience, intern and University of Washington student Anne Spencer Ross has crafted an audio-only VR experience that can allow people to rehearse a walking route virtually before experiencing the route in the physical world.
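Soundscape conveys direction through spatial audio, typically rendered with full head-related transfer functions (HRTFs). As a much-simplified, hypothetical sketch of the core idea, a constant-power stereo pan maps a sound source's azimuth to left/right channel gains so the listener hears which side a landmark is on:

```python
import math

def stereo_gains(azimuth_deg):
    """Constant-power pan: map a source azimuth in degrees
    (-90 = hard left, +90 = hard right, 0 = straight ahead)
    to (left, right) channel gains. Illustrative only; real
    spatial audio engines use HRTF-based rendering."""
    # Map azimuth from [-90, 90] onto a pan angle in [0, pi/2].
    theta = (azimuth_deg + 90.0) / 180.0 * (math.pi / 2)
    return math.cos(theta), math.sin(theta)

centre = stereo_gains(0)    # equal gains, source straight ahead
right = stereo_gains(90)    # all energy in the right channel
```

Constant-power panning keeps the summed acoustic energy roughly constant as the source moves, which is why the cosine/sine pair is preferred over a simple linear fade.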
Her paper, Use Cases and Impact of Audio-Based Virtual Exploration, is a collaboration between engineers from the Soundscape team and researchers in the Microsoft Research Redmond Lab.
Image credits: Microsoft Research