Tag Archives: touch interaction

INTERACT 2015 paper accepted on detecting child users from touch input.

My colleagues Radu-Daniel Vatavu and Quincy Brown and I have combined our efforts on touch interaction for children in a paper that has been accepted to the INTERACT 2015 conference! The paper, titled “Child or Adult? Inferring Smartphone Users’ Age Group from Touch Measurements Alone,” presents the results of our experiments classifying whether a user is a young child (ages 3 to 6) or an adult based on properties of their touch input alone. Radu used his dataset of 3- to 6-year-olds, supplemented with our MTAGIC dataset. The abstract is as follows:

We present a technique that classifies users’ age group, i.e., child or adult, from touch coordinates captured on touch-screen devices. Our technique delivered 86.5% accuracy (user-independent) on a dataset of 119 participants (89 children ages 3 to 6) when classifying each touch event one at a time and up to 99% accuracy when using a window of 7+ consecutive touches. Our results establish that it is possible to reliably classify a smartphone user on the fly as a child or an adult with high accuracy using only basic data about their touches, and will inform new, automatically adaptive interfaces for touch-screen devices.
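For readers who want a concrete sense of the windowing idea in the abstract, here is a minimal Kotlin sketch. It assumes some per-touch classifier already exists (the looksLikeChildTouch rule below is a hypothetical stand-in, not the model from the paper) and aggregates its decisions over a window of consecutive touches by majority vote; the paper's actual features and aggregation method are described in the paper itself.

```kotlin
// A per-touch observation: only basic measurements, as in the abstract.
// Which features beyond coordinates the published classifier uses is an
// assumption here; treat this data class as a stand-in.
data class TouchEvent(val x: Float, val y: Float, val touchSize: Float)

// Hypothetical per-touch classifier for illustration only -- not the trained
// model from the paper. Returns true if this single touch looks child-like.
fun looksLikeChildTouch(t: TouchEvent): Boolean {
    // Illustrative rule: larger, less precise touches vote "child".
    return t.touchSize > 0.25f
}

// Aggregate per-touch decisions over a window of consecutive touches by
// majority vote -- the windowing idea behind the "7+ consecutive touches"
// result, though not necessarily the paper's exact aggregation scheme.
fun classifyWindow(window: List<TouchEvent>): String {
    require(window.isNotEmpty()) { "Need at least one touch" }
    val childVotes = window.count { looksLikeChildTouch(it) }
    return if (childVotes * 2 > window.size) "child" else "adult"
}

fun main() {
    // Simulated touch stream; in an app these would come from MotionEvents.
    val touches = listOf(
        TouchEvent(120f, 340f, 0.31f),
        TouchEvent(128f, 355f, 0.29f),
        TouchEvent(119f, 348f, 0.33f),
        TouchEvent(402f, 610f, 0.18f),
        TouchEvent(131f, 352f, 0.30f),
        TouchEvent(125f, 344f, 0.28f),
        TouchEvent(122f, 350f, 0.32f),
    )
    // Classify using a window of 7 consecutive touches.
    println(classifyWindow(touches.takeLast(7)))  // prints "child"
}
```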

You can download the camera-ready version of the paper here. Radu will be presenting our work at INTERACT, which will be held in Bamberg, Germany, in September. I’ll post the talk when available!

Filed under Publication

JPUC article now available online.

In a previous post, we announced that our Journal of Personal and Ubiquitous Computing article “Designing Smarter Touch-based Interfaces for Educational Contexts” had been accepted for publication. We’re pleased that the definitive version is finally available online here. Please cite this version wherever applicable.

Filed under Publication

Guest blog post on recent visit to Hatfield Marine Science Center.

Recently, my new colleague Katie Stofer and I visited the Hatfield Marine Science Center, a research site in Newport, OR, affiliated with Oregon State University, to conduct an observational study of how visitors to a public science center use gestures to interact with tech-enabled exhibits. In our case, we looked at exhibits running on a touch table and a touch wall already in use at the center. We wrote a guest blog post about our visit for the Free-Choice Learning Laboratory’s website. The Free-Choice Learning Laboratory focuses on how people learn in informal settings, typically at their own pace and by their own choice. I’m particularly interested in how gestural interactions on touchscreens both hinder and afford learning. Read the blog post here.

Filed under Publication

MTAGIC project releases UI Design Guidelines app!

The MTAGIC Project, which studies differences in how children and adults interact with touchscreen devices, has released a new open-source app to help developers implement the design recommendations from our research papers. In our studies of children and adults using mobile touchscreen devices, we found that children have more difficulty than adults in successfully acquiring touch targets and making consistent gestures. We developed recommendations for designing touchscreen interfaces that increase children’s success, and the new Android app demonstrates how to integrate those recommendations into your own apps. Check out screenshots, a video demo, and the app’s source code here.
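To give a flavor of the kind of thing the guidelines address, here is a hedged Kotlin/Android sketch of one common way to make a small control easier for young children to hit: expanding its touchable area with a TouchDelegate. This is an illustrative technique with an illustrative padding value, not necessarily how the MTAGIC app implements its recommendations; see the app’s source and our papers for the actual guidance and target sizes.

```kotlin
import android.graphics.Rect
import android.view.TouchDelegate
import android.view.View

// Expand a view's touchable area beyond its visible bounds -- one common
// Android technique for making small targets easier for children to acquire.
// The 48 dp of extra padding is an illustrative value, not a number taken
// from the MTAGIC guidelines; consult the papers for recommended sizes.
fun expandTouchTarget(target: View, extraDp: Int = 48) {
    val parent = target.parent as? View ?: return
    parent.post {
        val extraPx = (extraDp * target.resources.displayMetrics.density).toInt()
        val hitRect = Rect()
        target.getHitRect(hitRect)           // bounds relative to the parent
        hitRect.inset(-extraPx, -extraPx)    // grow the rect on all sides
        parent.touchDelegate = TouchDelegate(hitRect, target)
    }
}
```

Note that a parent view holds only one TouchDelegate at a time, so apply this to at most one child per parent (or compose delegates yourself).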

If you build on this app in your own apps or use it in your research, we want to hear about it! Drop us a line or post a comment here! Of course, citations to the design recommendations we make in our papers are always welcome as well.

Filed under Software / Data

Short paper to appear at IDC 2014 on user-defined gestures for children!

More work with my University of Maryland collaborators, including Assistant Professor Leah Findlater, has been accepted for publication! Look for our short paper “Understanding Child-Defined Gestures and Children’s Mental Models for Touchscreen Tabletop Interaction” at the upcoming Interaction Design and Children (IDC) 2014 conference. We extended prior work by Jacob O. Wobbrock and colleagues from CHI 2009 on eliciting touchscreen tabletop gestures directly from users themselves; in our case, we asked children to define the gestures and compared them to similar gestures defined by adults. Here is the abstract:

Creating a pre-defined set of touchscreen gestures that caters to all users and age groups is difficult. To inform the design of intuitive and easy to use gestures specifically for children, we adapted a user-defined gesture study by Wobbrock et al. [12] that had been designed for adults. We then compared gestures created on an interactive tabletop by 12 children and 14 adults. Our study indicates that previous touchscreen experience strongly influences the gestures created by both groups; that adults and children create similar gestures; and that the adaptations we made allowed us to successfully elicit user-defined gestures from both children and adults. These findings will aid designers in better supporting touchscreen gestures for children, and provide a basis for further user-defined gesture studies with children.

You can see the camera-ready version of the paper here. The conference will be held in Aarhus, Denmark (LEGO’s home country!). Unfortunately, I won’t be attending, but first author (and graduating Master’s student) Karen Rust will present the paper at the conference. Look for her in the short paper madness session and the poster session!

Filed under Publication