Paper Reading #29 : Usable Gestures for Blind People: Understanding Preference and Performance


Reference
Authors and Affiliations:
Shaun K. Kane, Jacob O. Wobbrock - The Information School, DUB Group, University of Washington, Seattle, WA 98195 USA
Richard E. Ladner - Computer Science & Engineering, DUB Group, University of Washington, Seattle, WA 98195 USA
Presentation: CHI 2011, May 7–12, 2011, Vancouver, BC, Canada

Summary
Hypothesis

The paper suggests that blind people have different gesture preferences than sighted people, including preferences for edge-based gestures and for gestures that involve tapping virtual keys on a keyboard.

They hypothesize differences in the speed, size, and shape of gestures performed by blind people versus those performed by sighted people.

Contents

Accessible touch screens still present challenges to both users and designers. Users must be able to learn new touch screen applications quickly and effectively, while designers must be able to implement accessible touch screen interaction techniques for a diverse range of devices and applications. Because most user interface designers are sighted, they may have a limited understanding of how blind people experience technology. The authors argue that accessible touch screen interfaces can be improved substantially if designers better understand how blind people actually use touch screens.

The authors conduct two user studies that explore how blind and sighted people interact with touch screens, and then present design principles based on the results.


Motivation
Apple’s VoiceOver for iPhone and Google's Eyes-Free Shell use completely different layouts and interaction primitives, so there is currently no lingua franca for touch screen interaction for blind people. There is also still little information about how best to design accessible touch screen interfaces for tablets and other large touch screens.

Methods

First, the authors conducted a gesture elicitation study in which blind and sighted participants invented gestures for performing common computing tasks on a touch screen-based tablet PC.

For each command, the participant was asked to invent two different gestures that could initiate it, thinking aloud while doing so. They then described each gesture verbally to the experimenter and demonstrated it three times on the tablet PC’s touch screen. Finally, the experimenter prompted them to rate each gesture using rating scales.

Second, the authors conducted a gesture performance study in which both blind and sighted participants repeatedly performed a set of standard gestures on a touch screen. After a participant had practiced a gesture, they performed it three times and rated it using a variation of the easiness scale.

Results

Logistic regression showed that blind participants rated the gestures they created significantly better on the good-match question. There was no significant difference between the two groups on the easiness question. Stroke counts were higher for blind participants than for sighted ones. Blind participants were more likely to use the screen's edges and corners in their gestures and to invent multi-touch gestures. Sighted participants invented more symbolic gestures, while blind participants invented more abstract and metaphorical gestures. An on-screen QWERTY keyboard was the preferred method for text input.

There was no significant effect of blindness on easiness, but a gesture's category affected its rating differently for blind and sighted participants. Blind participants tended to create significantly larger and wider gestures, with greater variation and at lower speed, than sighted participants. Gestures from sighted participants were recognized more reliably and had higher angular acceleration.
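The performance differences above (size, speed, stroke count) come from analyzing logged touch traces. As a rough illustration, here is a minimal sketch of how such per-gesture metrics could be computed from timestamped touch points; the gesture representation and function name are my own assumptions, not the authors' code.

```python
import math

def gesture_metrics(strokes):
    """Compute simple metrics for one gesture.

    strokes: list of strokes, each a list of (x, y, t) tuples
    (pixel coordinates and a timestamp in seconds) -- an assumed format.
    """
    points = [p for stroke in strokes for p in stroke]
    xs = [x for x, _, _ in points]
    ys = [y for _, y, _ in points]
    # Bounding-box size: the study found blind participants' gestures
    # were larger and wider than sighted participants'.
    width = max(xs) - min(xs)
    height = max(ys) - min(ys)
    # Path length and duration, summed over strokes, give average speed.
    path = 0.0
    duration = 0.0
    for stroke in strokes:
        for (x0, y0, _), (x1, y1, _) in zip(stroke, stroke[1:]):
            path += math.hypot(x1 - x0, y1 - y0)
        duration += stroke[-1][2] - stroke[0][2]
    speed = path / duration if duration > 0 else 0.0
    return {"strokes": len(strokes), "width": width,
            "height": height, "speed": speed}
```

Comparing the distributions of these metrics between the blind and sighted groups is the kind of analysis that underlies the reported size and speed differences.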

Discussion
The research was really fascinating and informative to me. I can see many applications of this work in mobile touch-based devices. We have long assumed that things work for blind people much as they do for sighted people, but this paper asks us to revise those assumptions and provides guidelines for designs that better include blind users in the mainstream. I would be excited to see these guidelines implemented on future phones and to see how the results hold up across a wider range of users.
