Paper Reading #28: Experimental Analysis of Touch-Screen Gesture Designs in Mobile Environments

Reference
Authors and Affiliations:
Andrew Bragdon, Brown University, Providence, RI, USA
Eugene Nelson, Brown University, Providence, RI, USA
Yang Li, Google Research, Mountain View, CA, USA
Ken Hinckley, Microsoft Research, Redmond, WA, USA
Presentation: CHI 2011, May 7–12, 2011, Vancouver, BC, Canada.
Summary
Hypothesis
The paper hypothesizes that under various levels of environmental demands on attention, gestures can offer significant performance gains and reduced attentional load, while performing as well as soft buttons when the user's attention is focused on the phone. Furthermore, they propose that the speed and accuracy of bezel gestures will not be significantly affected by the environment, and that some gestures could be articulated eyes-free, with one hand.
Contents
The authors examine four factors that play a crucial role in interaction with smartphones: moding technique (hard button, soft button, and bezel based), gesture type (mark based and free form), the user's motor activity (sitting and walking), and the distraction level of the environment (no distraction, moderate awareness, and an attention-saturating task).
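To make the design concrete, here is a minimal Python sketch (not the authors' code) that enumerates the fully crossed condition space implied by these factors. The labels are paraphrased from this summary, and the actual study did not cross every factor with every other (soft buttons, for instance, have no gesture type).

```python
from itertools import product

# Hedged sketch: enumerate the condition space implied by the four
# factors above. Labels are paraphrased, not the authors' identifiers,
# and the real design was not fully factorial (soft buttons have no
# gesture type).
moding_techniques = ["hard_button", "soft_button", "bezel"]
gesture_types = ["mark_based", "free_form_path"]
motor_activities = ["sitting", "walking"]
distraction_levels = ["none", "moderate_awareness", "attention_saturating"]

conditions = list(product(moding_techniques, gesture_types,
                          motor_activities, distraction_levels))
print(len(conditions))  # 3 * 2 * 2 * 3 = 36 fully crossed combinations
```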
The authors were motivated by observations such as these: a physical button that produces a mechanical click does not require people to look at the screen to make sure they pressed it, and is less cumbersome as well; likewise, the tactile feedback of touching the bezel confirms to users that they are contacting the bezel without having to look at the screen, and the gesture can be performed with a single hand.
Methods
15 participants with diverse backgrounds and levels of computer expertise were recruited from the general population of Brown University. They were introduced to the study: they read a description of the first command-invocation technique and were shown a 40-second demonstration video. After a short live demonstration, users completed a series of tasks covering each of the four factors above, using the technique in the first environment: 2 training blocks followed by 6 measured blocks, with each block containing each of the 12 commands once.
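The per-environment trial structure can be sketched as follows. This is an illustrative reconstruction, not the authors' experiment software; the command names are placeholders, and the randomized order within a block is an assumption rather than a stated procedural detail.

```python
import random

# Illustrative reconstruction of the trial schedule described above:
# 2 training blocks followed by 6 measured blocks, each block presenting
# all 12 commands exactly once.
COMMANDS = [f"command_{i}" for i in range(1, 13)]  # placeholder names

def build_blocks(n_training=2, n_measured=6, seed=0):
    rng = random.Random(seed)
    blocks = []
    for i in range(n_training + n_measured):
        trials = COMMANDS[:]   # each command appears once per block
        rng.shuffle(trials)    # assumed randomization within a block
        blocks.append({"measured": i >= n_training, "trials": trials})
    return blocks

schedule = build_blocks()
assert sum(b["measured"] for b in schedule) == 6  # 6 measured blocks
```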
Results
There was no significant time difference between the hard and soft buttons. Bezel marks and soft buttons performed similarly in the direct condition, and under the various distraction types bezel marks significantly outperformed soft buttons in every case. The biggest effect for soft buttons came from not being able to look directly at the phone at all times, whereas bezel marks were unaffected by not looking directly at the phone.
Hard buttons and bezel marks had the highest accuracy, followed closely by soft buttons. The path-based gesture techniques were consistently the least accurate in all cases. There was no significant difference in glances between any of the gestural techniques.
For indirect interaction, no one preferred soft buttons; 7 of 15 users chose bezel marks, 7 chose hard-button marks, and 1 chose bezel paths. For direct interaction, a majority (8 of 15 users) chose soft buttons as the preferred technique for performance; 4 chose hard-button marks and 3 chose bezel marks.
Discussion
Soft buttons have been the major form of interaction on touch-based devices; the iPhone provides the minimal number of physical buttons possible, and Apple is even trying to get rid of the ones it has. In light of that, this research points in a somewhat different direction. It might work out, but we cannot really say so based on 15 participants. Furthermore, there cannot be a definitive set of bezel gestures, given that phones have different form factors and there is no standard smartphone design. I don't see this being implemented any time soon, but maybe Google could try it out on their phones to see how it works.