Category Archives: Publication
IJCCI article now available online.
In a previous post, we announced that our International Journal of Child-Computer Interaction article “Children (and Adults) Benefit From Visual Feedback during Gesture Interaction on Mobile Touchscreen Devices” had been accepted for publication. Quite a long while later, we’re pleased that the definitive version is finally available online here. Please cite this version wherever applicable.
Two workshop papers accepted at IDC 2015!
My lab has had two papers accepted to workshops at the upcoming ACM Interaction Design and Children conference, to be held later this month in Boston, MA. The first paper, for the Innovations in Interaction Design and Learning workshop, reports what we’ve learned so far on the MTAGIC project about using HCI principles to design effective educational technology. See the paper here.
The second, for the Every Child a Coder? Research Challenges for a 5-18 Programming Curriculum workshop, is first-authored by my Ph.D. student Jeremiah Blanchard and lays out his vision for open research challenges in transitioning students from blocks-based to “real” programming languages. His co-advisor, my colleague Christina Gardner-McCune, is also an author. You can read the paper here.
We’re looking forward to the conference!
INTERACT 2015 paper accepted on detecting child users from touch input.
My colleagues Radu-Daniel Vatavu and Quincy Brown and I have combined our efforts exploring touch interaction for children in a paper that has been accepted to the INTERACT 2015 conference! The paper, titled “Child or Adult? Inferring Smartphone Users’ Age Group from Touch Measurements Alone,” presents the results of our experiments to classify whether a user is a young child (ages 3 to 6) or an adult from properties of their touch input alone. Radu used his dataset of 3- to 6-year-olds, supplemented with our MTAGIC dataset. The abstract is as follows:
We present a technique that classifies users’ age group, i.e., child or adult, from touch coordinates captured on touch-screen devices. Our technique delivered 86.5% accuracy (user-independent) on a dataset of 119 participants (89 children ages 3 to 6) when classifying each touch event one at a time and up to 99% accuracy when using a window of 7+ consecutive touches. Our results establish that it is possible to reliably classify a smartphone user on the fly as a child or an adult with high accuracy using only basic data about their touches, and will inform new, automatically adaptive interfaces for touch-screen devices.
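The jump from 86.5% per-touch accuracy to 99% over a window of 7+ touches comes from aggregating individual predictions. As a rough illustration only (not the paper’s actual classifier or features), here is a minimal sketch of that idea: a hypothetical per-touch decision, with a majority vote over a sliding window of recent touches. The `touch_area` feature and its threshold are invented for this example.

```python
from collections import Counter

def classify_touch(features):
    # Hypothetical per-touch classifier: a toy threshold on a made-up
    # "touch_area" feature, standing in for the paper's real model.
    return "child" if features["touch_area"] > 0.5 else "adult"

def classify_window(touches, window=7):
    """Majority vote over the last `window` per-touch predictions."""
    votes = Counter(classify_touch(t) for t in touches[-window:])
    return votes.most_common(1)[0][0]

# Seven consecutive touches, most of which look child-like here.
touches = [{"touch_area": a} for a in (0.7, 0.6, 0.4, 0.8, 0.9, 0.55, 0.61)]
print(classify_window(touches))
```

Even a noisy per-touch classifier becomes much more reliable this way, since a few misclassified touches are outvoted by the rest of the window.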
You can download the camera-ready version of the paper here. Radu will be presenting our work at INTERACT, which will be held in Bamberg, Germany, in September. I’ll post the talk when available!
Paper on visualizing touchscreen gestures with heatmaps accepted to ICMI 2014!
My colleagues Radu-Daniel Vatavu and Jacob O. Wobbrock and I have had another paper accepted for publication! This paper continues our efforts to understand patterns and inconsistencies in how people make touchscreen gestures. This time, we introduced a way to use heatmap-style visualizations to examine articulation patterns in gesture datasets, and our paper “Gesture Heatmaps: Understanding Gesture Performance with Colorful Visualizations” was accepted to the ACM International Conference on Multimodal Interaction, to be held in Istanbul, Turkey, in November 2014. Here is the abstract:
We introduce gesture heatmaps, a novel gesture analysis technique that employs color maps to visualize the variation of local features along the gesture path. Beyond current gesture analysis practices that characterize gesture articulations with single-value descriptors, e.g., size, path length, or speed, gesture heatmaps are able to show with colorful visualizations how the values of any such descriptor vary along the gesture path. We evaluate gesture heatmaps on three public datasets comprising 15,840 gesture samples of 70 gesture types from 45 participants, on which we demonstrate heatmaps’ capabilities to (1) explain causes for recognition errors, (2) characterize users’ gesture articulation patterns under various conditions, e.g., finger versus pen gestures, and (3) help understand users’ subjective perceptions of gesture commands, such as why some gestures are perceived as easier to execute than others. We also introduce chromatic confusion matrices that employ gesture heatmaps to extend the expressiveness of standard confusion matrices to better understand gesture classification performance. We believe that gesture heatmaps will prove useful to researchers and practitioners doing gesture analysis, and consequently, they will inform the design of better gesture sets and development of more accurate recognizers.
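The core idea, a local feature mapped to a color ramp along the gesture path, can be sketched in a few lines. This is my own simplified illustration, not the paper’s implementation: it computes per-segment speed over a toy `(x, y, t)` path and maps each value onto a blue (slow) to red (fast) ramp; the color scheme and feature choice are assumptions for the example.

```python
import math

def local_speeds(points):
    """Per-segment speed along a gesture path.
    points: list of (x, y, t) tuples; speed = distance / time delta."""
    speeds = []
    for (x0, y0, t0), (x1, y1, t1) in zip(points, points[1:]):
        dist = math.hypot(x1 - x0, y1 - y0)
        speeds.append(dist / max(t1 - t0, 1e-9))
    return speeds

def speed_to_color(speed, lo, hi):
    """Map a speed onto a blue (slow) -> red (fast) RGB ramp."""
    u = 0.0 if hi == lo else (speed - lo) / (hi - lo)
    return (int(255 * u), 0, int(255 * (1 - u)))

# A toy gesture that starts slowly and speeds up.
path = [(0, 0, 0.0), (1, 0, 1.0), (3, 0, 1.5), (7, 0, 1.8)]
speeds = local_speeds(path)
lo, hi = min(speeds), max(speeds)
colors = [speed_to_color(s, lo, hi) for s in speeds]
print(colors[0], colors[-1])  # slowest segment maps to blue, fastest to red
```

Drawing each path segment in its computed color would yield a heatmap-style view of where a gesture is articulated quickly or slowly; the paper extends this idea to other local descriptors and to whole datasets.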
Check out the camera-ready version of our paper here. Our paper will be presented as a poster at the conference, and I’ll post the PDF when available.
JPUC article now available online.
In a previous post, we announced that our Journal of Personal and Ubiquitous Computing article “Designing Smarter Touch-based Interfaces for Educational Contexts” had been accepted for publication. We’re pleased that the definitive version is finally available online here. Please cite this version wherever applicable.