I am thrilled to announce that I have been awarded an NSF CAREER (short for: Faculty Early Career Development Program) grant in the IIS Division, entitled “Natural User Interfaces for Children.” This grant will fund my lab’s research over the next five years and form the foundation of my long-term research agenda on natural interactions for children, spanning touchscreen, whole-body, and multimodal interaction. Keep an eye on the INIT Lab website for updates!
In a previous post, we announced that our International Journal of Child-Computer Interaction article “Children (and Adults) Benefit From Visual Feedback during Gesture Interaction on Mobile Touchscreen Devices” had been accepted for publication. Quite a long while later, we’re pleased that the definitive version is finally available online here. Please cite this version wherever applicable.
My colleagues Radu-Daniel Vatavu and Quincy Brown and I have combined our efforts exploring touch interaction for children in a paper that has been accepted to the INTERACT 2015 conference! The paper, titled “Child or Adult? Inferring Smartphone Users’ Age Group from Touch Measurements Alone,” presents the results of our experiments classifying whether a user is a young child (ages 3 to 6) or an adult from properties of their touch input alone. Radu used his dataset of 3-to-6-year-olds and supplemented it with our MTAGIC dataset. The abstract is as follows:
We present a technique that classifies users’ age group, i.e., child or adult, from touch coordinates captured on touch-screen devices. Our technique delivered 86.5% accuracy (user-independent) on a dataset of 119 participants (89 children ages 3 to 6) when classifying each touch event one at a time and up to 99% accuracy when using a window of 7+ consecutive touches. Our results establish that it is possible to reliably classify a smartphone user on the fly as a child or an adult with high accuracy using only basic data about their touches, and will inform new, automatically adaptive interfaces for touch-screen devices.
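The jump from 86.5% per-touch accuracy to up to 99% over a window of 7+ consecutive touches comes from aggregating several noisy per-touch decisions into one. As a minimal sketch of that windowed-decision idea (not the paper's actual pipeline — the per-touch classifier and the `classify_user` helper here are hypothetical stand-ins), a majority vote over a sliding window of per-touch labels looks like this:

```python
from collections import deque

def classify_user(touch_predictions, window_size=7):
    """Aggregate per-touch 'child'/'adult' labels by majority vote over a
    sliding window, so one misclassified touch is outvoted by its neighbors.

    `touch_predictions` is an iterable of labels produced by some
    per-touch classifier (hypothetical here); returns the running
    decision after each new touch.
    """
    window = deque(maxlen=window_size)  # keeps only the last N touches
    decisions = []
    for label in touch_predictions:
        window.append(label)
        # Majority vote over the touches currently in the window
        child_votes = sum(1 for vote in window if vote == "child")
        decisions.append("child" if child_votes * 2 > len(window) else "adult")
    return decisions
```

The point of the window is robustness: even a per-touch classifier that errs on individual touch events converges on the right answer once a handful of consecutive touches are pooled, which is what makes on-the-fly adaptation plausible.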
You can download the camera-ready version of the paper here. Radu will be presenting our work at INTERACT, which will be held in Bamberg, Germany, in September. I’ll post the talk when available!
The INIT Lab applied for and received a gift of several high-resolution tablets from Wacom, Inc. This equipment will help us examine more features of children’s touch and gesture interaction using the specialized sensors this technology provides. Stay tuned for the results of this exciting new research!
Thank you, Wacom!
I’m excited to announce I’ve been invited to be a keynote speaker at the upcoming University of Iowa Obermann Center Working Symposium entitled “Designing the Digital Future: A Human-Centered Approach to Informatics.” I’ll be giving a version of my research talk, “Understanding, Designing, and Developing Natural User Interfaces for Children.” I’m honored to be part of an amazing slate of HCI speakers, including Mary-Beth Rosson, Ron Wakkary, Celine Latulipe, Tammy Clegg, and Lisa Nathan. It will be much colder in Iowa City compared to Florida this time of year, but I’m sure our good company and rich conversations will keep us warm! Thank you, Juan-Pablo Hourcade, for the invitation.
In a previous post, we announced that our Personal and Ubiquitous Computing article “Designing Smarter Touch-based Interfaces for Educational Contexts” had been accepted for publication. We’re pleased that the definitive version is finally available online here. Please cite this version wherever applicable.
Master’s student Karen Rust recently presented our poster “Understanding Child-Defined Gestures and Children’s Mental Models for Touchscreen Tabletop Interaction” at the Interaction Design for Children (IDC) conference, in Aarhus, Denmark. Check out the poster here. We’re looking forward to IDC 2015 in Boston, MA!