The INIT Lab has applied for and received a gift of several high-resolution tablets from Wacom, Inc. This equipment will help us examine additional features of children’s touch and gesture interaction using the specialized sensors available in this technology. Stay tuned for the results of this exciting new research!
Thank you, Wacom!
The ACM International Conference on Multimodal Interaction was recently held in Istanbul, Turkey. My co-author Radu-Daniel Vatavu presented the poster for our paper entitled “Gesture Heatmaps: Understanding Gesture Performance with Colorful Visualizations,” which you can check out here. I was sorry not to be able to attend this year, but perhaps next year (it will be in Seattle, WA).
I’m excited to announce I’ve been invited to be a keynote speaker at the upcoming University of Iowa Obermann Center Working Symposium entitled “Designing the Digital Future: A Human-Centered Approach to Informatics.” I’ll be giving a version of my research talk, “Understanding, Designing, and Developing Natural User Interfaces for Children.” I’m honored to be part of an amazing slate of HCI speakers, including Mary-Beth Rosson, Ron Wakkary, Celine Latulipe, Tammy Clegg, and Lisa Nathan. It will be much colder in Iowa City compared to Florida this time of year, but I’m sure our good company and rich conversations will keep us warm! Thank you, Juan-Pablo Hourcade, for the invitation.
I have received a gift of equipment from Intel’s Software Academic Program to support my class on Natural User Interfaces. Students in the class will use the tablets to design and develop project interfaces that feature more natural touch and gesture interaction. With the class being so large, this equipment gift will ensure we have enough devices to go around!
Thank you, Intel!
My colleagues, Radu-Daniel Vatavu and Jacob O. Wobbrock, and I have had another paper accepted for publication! This paper continues our efforts to understand patterns and inconsistencies in how people make touchscreen gestures. This time, we introduced a way to use heatmap-style visualizations to examine articulation patterns in gesture datasets, and our paper “Gesture Heatmaps: Understanding Gesture Performance with Colorful Visualizations” was accepted to the ACM International Conference on Multimodal Interaction, to be held in Istanbul, Turkey, in November 2014. Here is the abstract:
We introduce gesture heatmaps, a novel gesture analysis technique that employs color maps to visualize the variation of local features along the gesture path. Beyond current gesture analysis practices that characterize gesture articulations with single-value descriptors, e.g., size, path length, or speed, gesture heatmaps are able to show with colorful visualizations how the value of any such descriptors vary along the gesture path. We evaluate gesture heatmaps on three public datasets comprising 15,840 gesture samples of 70 gesture types from 45 participants, on which we demonstrate heatmaps’ capabilities to (1) explain causes for recognition errors, (2) characterize users’ gesture articulation patterns under various conditions, e.g., finger versus pen gestures, and (3) help understand users’ subjective perceptions of gesture commands, such as why some gestures are perceived easier to execute than others. We also introduce chromatic confusion matrices that employ gesture heatmaps to extend the expressiveness of standard confusion matrices to better understand gesture classification performance. We believe that gesture heatmaps will prove useful to researchers and practitioners doing gesture analysis, and consequently, they will inform the design of better gesture sets and development of more accurate recognizers.
Check out the camera-ready version of our paper here. Our paper will be presented as a poster at the conference, and I’ll post the PDF when available.
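For readers curious about the core idea, here is a rough illustrative sketch of coloring a gesture path by a local descriptor, in this case instantaneous speed. This is my own minimal approximation for intuition only: the function names, the blue-to-red color ramp, and the (x, y, t) point representation are assumptions of this sketch, not the paper’s actual implementation.

```python
# Sketch of the heatmap idea: color each point of a gesture path by a
# local descriptor (here, instantaneous speed). All names and details
# are illustrative assumptions, not taken from the paper.
import math

def local_speeds(points):
    """points: list of (x, y, t) tuples; returns one speed per point."""
    speeds = [0.0]
    for (x0, y0, t0), (x1, y1, t1) in zip(points, points[1:]):
        dist = math.hypot(x1 - x0, y1 - y0)
        dt = max(t1 - t0, 1e-9)  # guard against zero time deltas
        speeds.append(dist / dt)
    if len(speeds) > 1:
        speeds[0] = speeds[1]  # copy first segment's speed to the start point
    return speeds

def to_colors(values):
    """Map values to RGB triples: low -> blue (0,0,255), high -> red (255,0,0)."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0
    return [(int(255 * (v - lo) / span), 0, int(255 * (hi - v) / span))
            for v in values]

# Example: a gesture that accelerates along its path.
gesture = [(0, 0, 0.0), (1, 0, 1.0), (3, 0, 1.5), (7, 0, 1.75)]
colors = to_colors(local_speeds(gesture))
print(colors[0], colors[-1])  # slowest point is pure blue, fastest pure red
```

Rendering each point of the stroke in its mapped color would then show at a glance where along the path the gesture was articulated slowly versus quickly.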
In a previous post, we announced our Journal of Personal and Ubiquitous Computing article “Designing Smarter Touch-based Interfaces for Educational Contexts” was accepted for publication. We’re pleased that the definitive version is finally available online here. Please cite this version wherever applicable.
Recently, a new colleague, Katie Stofer, and I visited Hatfield Marine Science Center, a research site affiliated with Oregon State University in Newport, OR, to conduct an observational study of how visitors to a public science center use gestures to interact with tech-enabled exhibits. In our case, we looked at exhibits running on a touch table and a touch wall already in use at the center. We wrote a guest blog post about our visit for the Free-Choice Learning Laboratory’s website. The Free-Choice Learning Laboratory focuses on how people learn in informal settings, typically when the learning is at their own pace and by their own choice. I’m particularly interested in how gestural interactions on touchscreens both hinder and afford learning. Read the blog post here.