Paper Reading #3: Pen + Touch = New Tools
Reference
Authors: Ken Hinckley, Koji Yatani, Michel Pahud, Nicole Coddington, Jenny Rodenhouse, Andy Wilson, Hrvoje Benko, and Bill Buxton
Affiliation: Microsoft Research, One Microsoft Way, Redmond, WA 98052
Presentation: UIST '10, October 3–6, 2010, New York, New York, USA.
Hypothesis
There have been significant advances in technologies that use pen and touch interfaces as a means of interaction. This paper presents a novel way to integrate both modalities on a single interface.
Methods
To inform the design of the system, the authors first ran a design study with 8 participants to observe how people work with physical tools and pieces of paper. They then summarized the interaction properties shared by pen and touch and derived design considerations for the pen + touch interface. The prototype used a Microsoft Surface with a custom infrared LED pen that is activated on contact via a tip switch; the software was written in C# with WPF and the Microsoft Surface SDK. Basic tasks such as zooming, manipulating and selecting objects, and grouping items into stacks were performed to test the system.
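The core division of labor the paper proposes is that the pen writes, touch manipulates, and pen + touch together yields new tools. A minimal sketch of that routing logic is below (in Python rather than the authors' C#/WPF; the function and mode names are hypothetical illustrations, not the paper's actual implementation):

```python
# Hypothetical sketch of the pen + touch division of labor:
# pen alone writes, touch alone manipulates, and simultaneous
# pen + touch contact invokes a combined tool.

def route_input(pen_down: bool, touch_down: bool) -> str:
    """Map the current contact state to an interaction mode."""
    if pen_down and touch_down:
        # e.g. holding an object with a finger while stroking with
        # the pen triggers a contextual tool.
        return "combined-tool"
    if pen_down:
        return "ink"          # pen alone writes or draws
    if touch_down:
        return "manipulate"   # touch alone zooms, pans, selects, stacks
    return "idle"
```

For example, `route_input(True, True)` returns `"combined-tool"`, matching the hold-with-touch-then-gesture-with-pen pattern described in the results.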
Results
The results showed that, while writing, secondary tasks such as creating new objects, flipping back and forth between pages, and navigating to the extended workspace all help users retrieve, compare, and create task-relevant information. The approach of combining pen and touch interaction was appealing to users, and they quickly formed habits around the general pattern (hold with touch + gesture with pen).
Discussion
Companies like Microsoft, Palm, and Nokia have been heavily involved in research on pen and touch interfaces for devices like tablets and phones since the '90s, and pen interfaces have since seen radical changes in implementation. But it was not until the iPhone arrived in 2007 that people saw a viable touch interface integrated deeply into the user experience. Given the results in this paper, the combination of touch and pen shows good promise to me, but only if it can be integrated deeply with people's needs. The idea of writing with one hand using the pen while flipping pages with the other hand using touch seems exciting to me. If we can bring augmented reality and a 3D interface to the same screen, we could probably devise an interface that provides an experience no less natural than working in a paper notebook. The goal should not be to cram everything into a single product and end up good at nothing; but if a product can truly provide a natural interface, letting people work with the same comfort as with paper and pen, I feel it can be successful.