In a previous post, we announced our International Journal of Child-Computer Interaction article “Children (and Adults) Benefit From Visual Feedback during Gesture Interaction on Mobile Touchscreen Devices” was accepted for publication. Quite a long while later, we’re pleased that the definitive version is finally available online here. Please cite this version wherever applicable.
The INIT Lab has applied for and received a gift of several high-resolution tablets from Wacom, Inc. These devices will help us examine more features of children’s touch and gesture interaction using the specialized sensors this technology provides. Stay tuned for the results of this exciting new research!
Thank you, Wacom!
The ACM International Conference on Multimodal Interaction was recently held in Istanbul, Turkey. My co-author Radu-Daniel Vatavu presented the poster for our paper entitled “Gesture Heatmaps: Understanding Gesture Performance with Colorful Visualizations,” which you can check out here. I was sorry not to be able to attend this year, but perhaps I will next year, when the conference will be held in Seattle, WA.
My colleagues, Radu-Daniel Vatavu and Jacob O. Wobbrock, and I have had another paper accepted for publication! This paper continues our efforts to understand patterns and inconsistencies in how people make touchscreen gestures. This time, we introduced a way to use heatmap-style visualizations to examine articulation patterns in gesture datasets, and our paper “Gesture Heatmaps: Understanding Gesture Performance with Colorful Visualizations” was accepted to the ACM International Conference on Multimodal Interaction, to be held in Istanbul, Turkey, in November 2014. Here is the abstract:
We introduce gesture heatmaps, a novel gesture analysis technique that employs color maps to visualize the variation of local features along the gesture path. Beyond current gesture analysis practices that characterize gesture articulations with single-value descriptors, e.g., size, path length, or speed, gesture heatmaps are able to show with colorful visualizations how the values of such descriptors vary along the gesture path. We evaluate gesture heatmaps on three public datasets comprising 15,840 gesture samples of 70 gesture types from 45 participants, on which we demonstrate heatmaps’ capabilities to (1) explain causes for recognition errors, (2) characterize users’ gesture articulation patterns under various conditions, e.g., finger versus pen gestures, and (3) help understand users’ subjective perceptions of gesture commands, such as why some gestures are perceived as easier to execute than others. We also introduce chromatic confusion matrices that employ gesture heatmaps to extend the expressiveness of standard confusion matrices to better understand gesture classification performance. We believe that gesture heatmaps will prove useful to researchers and practitioners doing gesture analysis, and consequently, they will inform the design of better gesture sets and development of more accurate recognizers.
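To give a flavor of the core idea in the abstract, here is a minimal sketch of computing a local descriptor (speed) along a sampled gesture path and normalizing it to [0, 1] so each segment could be mapped to a color. This is an illustrative toy, not the paper’s implementation; the function names and sample data are made up for this example.

```python
import math

def local_speeds(points, timestamps):
    """Per-segment speed along a sampled 2D gesture path."""
    speeds = []
    for i in range(1, len(points)):
        (x0, y0), (x1, y1) = points[i - 1], points[i]
        dt = timestamps[i] - timestamps[i - 1]
        dist = math.hypot(x1 - x0, y1 - y0)
        speeds.append(dist / dt if dt > 0 else 0.0)
    return speeds

def normalize(values):
    """Rescale descriptor values to [0, 1], the 'heat' for a color map."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0] * len(values)
    return [(v - lo) / (hi - lo) for v in values]

# A slow-start, fast-finish stroke: later segments get 'hotter' values.
pts = [(0, 0), (1, 0), (3, 0), (6, 0)]
ts = [0.0, 0.1, 0.2, 0.3]
heat = normalize(local_speeds(pts, ts))
```

Feeding `heat` into any color scale (e.g., blue-to-red) would paint the stroke so that fast portions stand out, which is the kind of along-the-path visualization the paper explores with real gesture datasets.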
Check out the camera-ready version of our paper here. Our paper will be presented as a poster at the conference, and I’ll post the PDF when it becomes available.
In a previous post, we announced our Personal and Ubiquitous Computing article “Designing Smarter Touch-based Interfaces for Educational Contexts” was accepted for publication. We’re pleased that the definitive version is finally available online here. Please cite this version wherever applicable.