Tag Archives: gesture interaction

MTAGIC IJCCI article now available online.

In a previous post, we announced that our International Journal of Child-Computer Interaction article “Children (and Adults) Benefit From Visual Feedback during Gesture Interaction on Mobile Touchscreen Devices” had been accepted for publication. Quite a while later, we’re pleased that the definitive version is finally available online here. Please cite this version wherever applicable.

Filed under Publication

INIT Lab receives equipment gift from Wacom Inc.

The INIT Lab has applied for and received a gift of several high-resolution tablets from Wacom, Inc. This equipment will help us examine more features of children’s touch and gesture interaction using the specialized sensors available in these devices. Stay tuned for the results of this exciting new research!

Thank you, Wacom!

Filed under Funding

ICMI 2014 poster now available.

The ACM International Conference on Multimodal Interaction (ICMI 2014) was recently held in Istanbul, Turkey. My co-author Radu-Daniel Vatavu presented the poster for our paper entitled “Gesture Heatmaps: Understanding Gesture Performance with Colorful Visualizations,” which you can check out here. I was sorry not to be able to attend this year, but perhaps next year (it will be in Seattle, WA).

Filed under Talk / Presentation

Paper on visualizing touchscreen gestures with heatmaps accepted to ICMI 2014!

My colleagues, Radu-Daniel Vatavu and Jacob O. Wobbrock, and I have had another paper accepted for publication! This paper continues our efforts to understand patterns and inconsistencies in how people make touchscreen gestures. This time, we introduced a way to use heatmap-style visualizations to examine articulation patterns in gesture datasets, and our paper “Gesture Heatmaps: Understanding Gesture Performance with Colorful Visualizations” was accepted to the ACM International Conference on Multimodal Interaction, to be held in Istanbul, Turkey, in November 2014. Here is the abstract:

We introduce gesture heatmaps, a novel gesture analysis technique that employs color maps to visualize the variation of local features along the gesture path. Beyond current gesture analysis practices that characterize gesture articulations with single-value descriptors, e.g., size, path length, or speed, gesture heatmaps are able to show with colorful visualizations how the value of any such descriptor varies along the gesture path. We evaluate gesture heatmaps on three public datasets comprising 15,840 gesture samples of 70 gesture types from 45 participants, on which we demonstrate heatmaps’ capabilities to (1) explain causes for recognition errors, (2) characterize users’ gesture articulation patterns under various conditions, e.g., finger versus pen gestures, and (3) help understand users’ subjective perceptions of gesture commands, such as why some gestures are perceived as easier to execute than others. We also introduce chromatic confusion matrices that employ gesture heatmaps to extend the expressiveness of standard confusion matrices to better understand gesture classification performance. We believe that gesture heatmaps will prove useful to researchers and practitioners doing gesture analysis, and consequently, they will inform the design of better gesture sets and the development of more accurate recognizers.
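
To make the core idea concrete, here is a minimal sketch (not the paper’s implementation) of computing one local descriptor, speed in this case, at each point of a gesture path and mapping it to a color. The Point record, the synthetic gesture, and the blue-to-red color ramp are illustrative assumptions.

```java
import java.awt.Color;
import java.util.List;

public class GestureHeatmapSketch {

    /** One touch sample: screen position in pixels plus a timestamp in milliseconds. */
    record Point(double x, double y, long t) { }

    /** Local speed at each sample: distance from the previous sample over elapsed time. */
    static double[] localSpeeds(List<Point> path) {
        double[] speeds = new double[path.size()];
        for (int i = 1; i < path.size(); i++) {
            Point a = path.get(i - 1);
            Point b = path.get(i);
            double dist = Math.hypot(b.x() - a.x(), b.y() - a.y());
            double dt = Math.max(1, b.t() - a.t());    // avoid division by zero
            speeds[i] = dist / dt;
        }
        if (speeds.length > 1) {
            speeds[0] = speeds[1];                     // the first sample has no predecessor
        }
        return speeds;
    }

    /** Maps a value normalized to [0, 1] onto a blue (low) to red (high) color ramp. */
    static Color heatColor(double normalized) {
        float hue = (float) ((1.0 - normalized) * 240.0 / 360.0);  // 240° = blue, 0° = red
        return Color.getHSBColor(hue, 1.0f, 1.0f);
    }

    public static void main(String[] args) {
        // A tiny synthetic gesture that starts slowly and speeds up toward the end.
        List<Point> path = List.of(
                new Point(0, 0, 0), new Point(5, 1, 60), new Point(12, 4, 100),
                new Point(30, 12, 130), new Point(62, 28, 150));

        double[] speeds = localSpeeds(path);
        double max = 0;
        for (double s : speeds) {
            max = Math.max(max, s);
        }

        // Print one color per sample; a real heatmap would draw the path segments in these colors.
        for (int i = 0; i < path.size(); i++) {
            Color c = heatColor(max > 0 ? speeds[i] / max : 0);
            System.out.printf("sample %d: speed=%.3f px/ms  color=#%02X%02X%02X%n",
                    i, speeds[i], c.getRed(), c.getGreen(), c.getBlue());
        }
    }
}
```

The same mapping works for any per-point descriptor (e.g., curvature or pressure); drawing the path segments in the resulting colors is what makes variation along the gesture visible.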

Check out the camera-ready version of our paper here. Our paper will be presented as a poster at the conference, and I’ll post the PDF when it’s available.

Filed under Publication

JPUC article now available online.

In a previous post, we announced that our Journal of Personal and Ubiquitous Computing article “Designing Smarter Touch-based Interfaces for Educational Contexts” had been accepted for publication. We’re pleased that the definitive version is finally available online here. Please cite this version wherever applicable.

Filed under Publication

Guest blog post on recent visit to Hatfield Marine Science Center.

Recently, my new colleague Katie Stofer and I visited the Hatfield Marine Science Center in Newport, OR, a research site affiliated with Oregon State University, to conduct an observational study of how visitors to a public science center use gestures to interact with tech-enabled exhibits. In our case, we looked at exhibits running on a touch table and a touch wall already in use at the center. We wrote a guest blog post about our visit for the Free-Choice Learning Laboratory’s website. The Free-Choice Learning Laboratory focuses on how people learn in informal settings, typically when the learning is at their own pace and by their own choice. I’m particularly interested in how gestural interactions on touchscreens can both hinder and afford learning. Read the blog post here.

Filed under Publication

MTAGIC project releases UI Design Guidelines app!

The MTAGIC Project, which is studying differences in how children and adults interact with touchscreen devices, has released a new open-source app to help developers implement the design recommendations from our research papers. In our studies of children and adults using mobile touchscreen devices, we found that children have more difficulty than adults in successfully acquiring touch targets and making consistent gestures. We developed recommendations for designing touchscreen interfaces to increase children’s success, and the new Android app demonstrates how to integrate those recommendations into your own apps. Check out screenshots, a video demo, and the app’s source code here.
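
Since one of our findings is that children have more trouble acquiring touch targets, here is a hedged sketch of a common Android pattern for enlarging a view’s effective touch target without changing its visual layout. It is not taken from the released MTAGIC app; the TouchTargetHelper class and the 16 dp expansion in the usage note below are illustrative assumptions.

```java
import android.content.res.Resources;
import android.graphics.Rect;
import android.util.TypedValue;
import android.view.TouchDelegate;
import android.view.View;

/**
 * Illustrative helper (not part of the MTAGIC app): expands a view's touchable
 * area beyond its visible bounds using Android's TouchDelegate, one common way
 * to give young users a larger effective touch target without changing layout.
 */
public final class TouchTargetHelper {

    private TouchTargetHelper() { }

    /** Expands the touchable bounds of {@code target} by {@code extraDp} on every side. */
    public static void expandTouchTarget(final View target, final float extraDp) {
        final View parent = (View) target.getParent();
        // Post so that layout has finished and getHitRect() returns the real bounds.
        parent.post(new Runnable() {
            @Override
            public void run() {
                Rect bounds = new Rect();
                target.getHitRect(bounds);
                int extraPx = (int) TypedValue.applyDimension(
                        TypedValue.COMPLEX_UNIT_DIP, extraDp,
                        Resources.getSystem().getDisplayMetrics());
                bounds.inset(-extraPx, -extraPx);  // a negative inset grows the rectangle
                // Note: a parent view can hold only one TouchDelegate at a time.
                parent.setTouchDelegate(new TouchDelegate(bounds, target));
            }
        });
    }
}
```

For example, calling TouchTargetHelper.expandTouchTarget(smallButton, 16f) from an Activity’s onCreate() gives that button an extra 16 dp of touchable area on every side.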

If you use this app’s code or design recommendations in your own apps or in your research, we want to hear about it! Drop us a line or post a comment here! Of course, citations to the design recommendations we make in our papers are always welcome as well.

Filed under Software / Data