If you are in Paris at CHI 2013 this week, come see me present our ‘Best Paper Award’ winner, “Analyzing User-Generated YouTube Videos to Understand Touchscreen Use by People with Motor Impairments”! The talk will be on Tuesday, April 30th, from 11:20am to 11:40am in the Impairment and Rehabilitation session in room 242AB. Watch our video preview (narrated by my talented co-author Leah Findlater!) for a sneak peek:
Over the past year on the MTAGIC project, we’ve been investigating differences in how children and adults make gestures and touch targets on mobile touchscreen devices. We designed our study tasks to reflect existing apps on the market today, and we recently examined our data to understand the impact of visual feedback on gesture interaction for kids. Our paper on this topic, “Examining the Need for Visual Feedback during Gesture Interaction on Mobile Touchscreen Devices for Kids,” has been accepted to the Interaction Design and Children (IDC) conference! Read the abstract:
Surface gesture interaction styles used on modern mobile touchscreen devices are often dependent on the platform and application. Some applications show a visual trace of gesture input as it is made by the user, whereas others do not. Little work has been done examining the usability of visual feedback for surface gestures, especially for children. In this paper, we present results from an empirical study conducted with children, teens, and adults to explore characteristics of gesture interaction with and without visual feedback. We find that the gestures generated with and without visual feedback by users of different ages diverge significantly in ways that make them difficult to interpret. In addition, users prefer to see visual feedback. Based on these findings, we present several design recommendations for new surface gesture interfaces for children, teens, and adults on mobile touchscreen devices. In general, we recommend providing visual feedback, especially for children, wherever possible.
As usual, you can find the camera-ready version of this paper here. See you in New York City this June!
In next-generation classrooms and educational environments, interactive technologies such as surface computing, natural gesture interfaces, and mobile devices will enable new means of motivating and engaging students in active learning. Our foundational studies provide a corpus of over 10,000 touch interactions and nearly 7,000 gestures collected from nearly 70 adults and children ages 7 and up, which can help us understand the characteristics of children’s interactions in these modalities and how they differ from those of adults. Based on these data, we identify key design and implementation challenges of supporting children’s touch and gesture interactions, and we suggest ways to address them. For example, we find that children have more trouble successfully acquiring onscreen targets and having their gestures recognized than adults do, especially the youngest age group (7 to 10 years old). The contributions of this work provide a foundation for touch-based interactive educational apps that increase student success.
I’ll add a post when this special issue is officially published. For now, if you’re interested, you can check out the camera-ready version.
I am an associate professor at the University of Florida in the Computer & Information Science & Engineering department. I work on understanding, designing, and developing natural user interactions such as pen, touch, gesture, and mixed reality, especially for children and families, with applications in human-AI interaction, learning, and health.