Google Research

Investigating Cursor-based Interactions to Support Non-Visual Exploration in the Real World

  • Anhong Guo
  • Saige L. McVea
  • Xu Wang
  • Patrick Clary
  • Ken Goldman
  • Yang Li
  • Yu Zhong
  • Jeffrey Bigham
Proceedings of the 20th International ACM SIGACCESS Conference on Computers and Accessibility (2018)

Abstract

The human visual system processes complex scenes to focus attention on relevant items. However, blind people cannot visually skim for an area of interest. Instead, they use a combination of contextual information, knowledge of the spatial layout of their environment, and interactive scanning to find and attend to specific items. In this paper, we define and compare three cursor-based interactions to help blind people attend to items in a complex visual scene: window cursor (move their phone to scan), finger cursor (point their finger to read), and touch cursor (drag their finger on the touchscreen to explore). We conducted a user study with 12 participants to evaluate the three techniques on four tasks, and found that: window cursor worked well for locating objects on large surfaces, finger cursor worked well for accessing control panels, and touch cursor worked well for helping users understand spatial layouts. A combination of multiple techniques will likely be best for supporting a variety of everyday tasks for blind users.
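The touch-cursor interaction described above can be pictured as a simple hit-test: the system detects items in the camera frame, and as the user drags a finger across the touchscreen, it announces whichever detected item lies under the touch point. The sketch below illustrates that lookup only; the detection results, item names, and the smallest-region tie-breaking rule are illustrative assumptions, not details from the paper.

```python
# Minimal sketch of a "touch cursor" lookup: map a touchscreen coordinate
# to the detected item under the user's finger so it can be spoken aloud.
# The detections and item names below are hypothetical examples.

from typing import Optional

# Each detected item: (name, (x_min, y_min, x_max, y_max)) in screen pixels.
DETECTIONS = [
    ("control panel", (40, 120, 200, 300)),
    ("start button", (60, 260, 120, 300)),
]

def item_under_touch(x: int, y: int, detections=DETECTIONS) -> Optional[str]:
    """Return the name of the most specific detected item containing (x, y)."""
    hits = [
        (name, (x2 - x1) * (y2 - y1))
        for name, (x1, y1, x2, y2) in detections
        if x1 <= x <= x2 and y1 <= y <= y2
    ]
    if not hits:
        return None
    # Prefer the smallest containing region, e.g. a button over its panel,
    # so dragging over nested items announces the most specific one.
    return min(hits, key=lambda h: h[1])[0]
```

A screen reader loop would call `item_under_touch` on each touch-move event and speak the result whenever it changes, which is one plausible way the exploration-by-dragging behavior could be realized.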
