Tag Archives: child-computer interaction

JPUC article now available online.

In a previous post, we announced that our Journal of Personal and Ubiquitous Computing article “Designing Smarter Touch-based Interfaces for Educational Contexts” had been accepted for publication. We’re pleased that the definitive version is finally available online here. Please cite this version wherever applicable.


Filed under Publication

IDC 2014 poster now available.

Master’s student Karen Rust recently presented our poster “Understanding Child-Defined Gestures and Children’s Mental Models for Touchscreen Tabletop Interaction” at the Interaction Design and Children (IDC) conference in Aarhus, Denmark. Check out the poster here. We’re looking forward to IDC 2015 in Boston, MA!


Filed under Talk / Presentation

MTAGIC project releases UI Design Guidelines app!

The MTAGIC Project, which studies differences in how children and adults interact with touchscreen devices, has released a new open-source Android app to help developers implement the design recommendations from our research papers. In our studies of children and adults using mobile touchscreen devices, we found that children have more difficulty than adults successfully acquiring touch targets and making consistent gestures. We developed recommendations for designing touchscreen interfaces to increase children’s success, and the app demonstrates how to integrate those recommendations into your own apps. Check out screenshots, a video demo, and the source code for the app here.
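To give a flavor of one recommendation of this kind (tolerating near-misses around small targets, since children acquire touch targets less reliably), here is a minimal language-neutral sketch. It is not code from the released app; the function name and the slop value are ours, for illustration only:

```python
# Hypothetical sketch of one child-friendly design idea: accept touches
# that land slightly outside a small target by expanding its active area.
def hit_test(touch_x, touch_y, target, slop=0.0):
    """Return True if the touch falls inside the target rectangle
    (left, top, right, bottom), expanded on all sides by `slop` pixels."""
    left, top, right, bottom = target
    return (left - slop <= touch_x <= right + slop and
            top - slop <= touch_y <= bottom + slop)

# A near-miss just outside a 48x48 px button is rejected by a strict
# hit test but accepted once the active area is expanded.
button = (100, 100, 148, 148)
print(hit_test(150, 120, button))            # strict: miss
print(hit_test(150, 120, button, slop=12))   # expanded: hit
```

On Android specifically, the same effect can be achieved with platform mechanisms for enlarging a view’s touchable area; the sketch above just shows the underlying geometry.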

If you use this app’s code in your own apps, or use it in your research, we want to hear about it! Drop us a line or post a comment here! Of course, citations to the design recommendations in our papers are always welcome as well.


Filed under Software / Data

Short paper to appear at IDC 2014 on user-defined gestures for children!

More work with my University of Maryland collaborators, including assistant professor Leah Findlater, has been accepted for publication! Look for our short paper “Understanding Child-Defined Gestures and Children’s Mental Models for Touchscreen Tabletop Interaction” at the upcoming Interaction Design and Children (IDC) 2014 conference. We extended prior work by Jacob O. Wobbrock and colleagues in a CHI 2009 paper on eliciting touchscreen tabletop gestures directly from users themselves; in our case, we asked children to define the gestures and compared them to gestures defined by adults. Here is the abstract:

Creating a pre-defined set of touchscreen gestures that caters to all users and age groups is difficult. To inform the design of intuitive and easy to use gestures specifically for children, we adapted a user-defined gesture study by Wobbrock et al. [12] that had been designed for adults. We then compared gestures created on an interactive tabletop by 12 children and 14 adults. Our study indicates that previous touchscreen experience strongly influences the gestures created by both groups; that adults and children create similar gestures; and that the adaptations we made allowed us to successfully elicit user-defined gestures from both children and adults. These findings will aid designers in better supporting touchscreen gestures for children, and provide a basis for further user-defined gesture studies with children.
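In elicitation studies of this kind, agreement among participants’ proposals is typically summarized with an agreement score in the style of Wobbrock et al.: for each referent, sum the squared proportions of participants who proposed the same gesture. A minimal sketch, assuming that standard formulation; the gesture labels below are invented for illustration:

```python
from collections import Counter

def agreement(proposals):
    """Agreement score for one referent, in the style of Wobbrock et al.:
    sum over groups of identical proposals of (group size / total)^2.
    Ranges from 1/len(proposals) (all different) to 1.0 (all identical)."""
    total = len(proposals)
    counts = Counter(proposals)
    return sum((c / total) ** 2 for c in counts.values())

# e.g., 12 participants proposing a gesture for a "delete" referent
# (labels are illustrative, not from the study's data):
proposals = ["swipe-off"] * 6 + ["tap-x"] * 4 + ["scribble"] * 2
print(round(agreement(proposals), 3))  # → 0.389
```

Higher scores indicate that a referent has a more obvious “consensus” gesture, which helps compare how consistently children versus adults propose gestures.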

You can see the camera-ready version of the paper here. The conference will be held in Aarhus, Denmark (home of LEGO!). Unfortunately, I won’t be attending, but first author (and graduating Master’s student) Karen Rust will present the paper at the conference. Look for her in the short paper madness session, and the poster session!


Filed under Publication

MTAGIC project paper accepted to IJCCI!

We are pleased to announce that a new paper on the MTAGIC project has been accepted to the International Journal of Child-Computer Interaction! The paper, entitled “Children (and Adults) Benefit From Visual Feedback during Gesture Interaction on Mobile Touchscreen Devices,” extends our IDC 2013 paper on visual feedback and gesture interaction for children and adults. The journal version examines more gesture features and additional recognizers to uncover the effects of the presence or absence of visual feedback during gesture interaction. Here is the abstract:

Surface gesture interaction styles used on mobile touchscreen devices often depend on the platform and application. Some applications show a visual trace of gesture input being made by the user, whereas others do not. Little work has been done examining the usability of visual feedback for surface gestures, especially for children. In this paper, we extend our previous work on an empirical study conducted with children, teens, and adults to explore characteristics of gesture interaction with and without visual feedback. We analyze 9 simple and 7 complex gesture features to determine whether differences exist between users of different age groups when completing surface gestures with and without visual feedback. We find that the gestures generated diverge significantly in ways that make them difficult to interpret by some recognizers. For example, users tend to make gestures with fewer strokes in the absence of visual feedback, and tend to make shorter, more compact gestures using straighter lines in the presence of visual feedback. In addition, users prefer to see visual feedback. Based on these findings, we present design recommendations for surface gesture interfaces for children, teens, and adults on mobile touchscreen devices. We recommend providing visual feedback, especially for children, wherever possible.
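The paper’s specific 9 simple and 7 complex features aren’t enumerated in this post, but to illustrate the kind of simple stroke features such analyses rely on, here is a sketch of two common ones: path length, and a straightness ratio that drops as a stroke wanders (the feature names and definitions here are ours, for illustration):

```python
import math

def path_length(points):
    """Total distance traveled along a stroke given as (x, y) points."""
    return sum(math.dist(p, q) for p, q in zip(points, points[1:]))

def straightness(points):
    """Ratio of endpoint distance to path length: 1.0 for a perfectly
    straight stroke, lower for a more curved or wandering stroke."""
    length = path_length(points)
    return math.dist(points[0], points[-1]) / length if length else 1.0

straight = [(0, 0), (5, 0), (10, 0)]
wavy = [(0, 0), (5, 5), (10, 0)]
print(straightness(straight))  # 1.0
print(straightness(wavy))      # ~0.707
```

Features like these, computed per gesture, are what make it possible to quantify claims such as “users make shorter, more compact gestures using straighter lines in the presence of visual feedback.”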

When this article is officially published, I’ll add a link, but until then, you can check out the preprint version.


Filed under Publication