Wednesday, September 21, 2011

Paper Reading #10: Sensing Foot Gestures from the Pocket

References
Jeremy Scott, David Dearman, Koji Yatani, and Khai N. Truong. "Sensing Foot Gestures from the Pocket." UIST '10: Proceedings of the 23rd Annual ACM Symposium on User Interface Software and Technology. ACM, New York, NY, USA, 2010.

Author Bios
Jeremy Scott received his Bachelor's in Computer Engineering from the University of Toronto and is working toward a Ph.D. in Computer Science at MIT. He was an undergraduate researcher at U of T and is now a research assistant at MIT.

David Dearman is a Ph.D. student at the University of Toronto. His research interests lie at the intersection of HCI, ubiquitous computing, and mobile computing.

Koji Yatani is a Ph.D. student at the University of Toronto. His research revolves around HCI and ubiquitous computing, with an emphasis on hardware and sensing technologies.

Khai N. Truong is a professor at the University of Toronto. His main interests lie in HCI and ubiquitous computing, with a focus on enhancing the usability and usefulness of mobile technology.

Summary

  • Hypothesis - A mobile device placed in the pocket of a person's pants can recognize simple foot gestures using its built-in accelerometer.



  • Method - Two experiments were conducted. The first used 16 right-footed participants (8 male, 8 female), who were asked to rotate their feet to specific target angles using four different kinds of foot movement.
    The researchers recorded the accuracy achieved at each angle as well as the time it took a participant to confidently position their foot. Based on these results, they then built a system for the iPhone that uses the built-in accelerometer to sense foot gestures: a user double taps their foot (the taps averaging about 330 milliseconds apart) and then executes a gesture by rotating the foot into one of six angle ranges, three clockwise and three counterclockwise from the natural foot position. (A rough sketch of this two-stage pipeline appears after the Discussion below.) This second experiment used 6 right-footed participants, 4 male and 2 female.



  • Results - The system performed well overall: it could distinguish about 10 different foot gestures with roughly 86% accuracy. From this, the authors concluded that foot gestures sensed from a pocketed device are a feasible way to augment the mobile user experience.



  • Discussion
    I found this article really interesting. I have a particular interest in gestures that don't rely on visual feedback or physical manipulation of a device, and this work seems like a good step in that direction. I'm not sure foot gestures can be read accurately enough to serve as a viable replacement for other kinds of gestures, but it is definitely a good stepping stone.
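
To make the Method above concrete, here is a minimal Python sketch of the two-stage pipeline: detect a double tap as two accelerometer spikes close together in time, then classify the subsequent foot rotation into one of six angle ranges. The function names, thresholds, and range boundaries are my own illustrative assumptions; the actual system has to infer the gesture from accelerometer features rather than reading an angle directly.

    # Hypothetical sketch -- names, thresholds, and range boundaries are
    # assumptions for illustration, not values from the paper.

    TAP_THRESHOLD = 2.0      # accel magnitude (g) above this counts as a tap (assumed)
    DOUBLE_TAP_WINDOW = 0.5  # max seconds between taps (paper reports ~330 ms on average)

    # Six angle ranges from the natural foot position: three counterclockwise
    # (negative degrees) and three clockwise. Boundaries are illustrative.
    ANGLE_RANGES = [
        ("ccw-large",  -90, -60),
        ("ccw-medium", -60, -30),
        ("ccw-small",  -30,   0),
        ("cw-small",     0,  30),
        ("cw-medium",   30,  60),
        ("cw-large",    60,  90),
    ]

    def detect_double_tap(samples):
        """samples: list of (timestamp_s, accel_magnitude_g) pairs.
        True if two spikes above TAP_THRESHOLD fall within DOUBLE_TAP_WINDOW."""
        tap_times = [t for t, mag in samples if mag > TAP_THRESHOLD]
        return any(t2 - t1 <= DOUBLE_TAP_WINDOW
                   for t1, t2 in zip(tap_times, tap_times[1:]))

    def classify_rotation(angle_deg):
        """Map a foot-rotation angle (negative = counterclockwise) to a gesture
        label, or None if it falls outside all six ranges."""
        for label, lo, hi in ANGLE_RANGES:
            if lo <= angle_deg < hi:
                return label
        return None

    # Example: spikes at t=0.10 s and t=0.43 s (~330 ms apart) arm the system,
    # then a 42-degree clockwise rotation is classified as "cw-medium".
    samples = [(0.00, 1.0), (0.10, 2.5), (0.25, 1.1), (0.43, 2.7), (0.60, 1.0)]
    if detect_double_tap(samples):
        print(classify_rotation(42.0))

Note that in the actual system the phone sits in a pants pocket, so the rotation is never observed directly; it has to be inferred from leg motion picked up by the accelerometer, which is what makes the reported 86% accuracy notable.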
