The ACM International Conference on Multimodal Interaction was recently held in Istanbul, Turkey. My co-author Radu-Daniel Vatavu presented the poster for our paper entitled “Gesture Heatmaps: Understanding Gesture Performance with Colorful Visualizations,” which you can check out here. I was sorry not to be able to attend this year, but perhaps next year (it will be in Seattle, WA).
I’m excited to announce that I’ve been invited to be a keynote speaker at the upcoming University of Iowa Obermann Center Working Symposium entitled “Designing the Digital Future: A Human-Centered Approach to Informatics.” I’ll be giving a version of my research talk, “Understanding, Designing, and Developing Natural User Interfaces for Children.” I’m honored to be part of an amazing slate of HCI speakers, including Mary-Beth Rosson, Ron Wakkary, Celine Latulipe, Tammy Clegg, and Lisa Nathan. It will be much colder in Iowa City than in Florida this time of year, but I’m sure our good company and rich conversations will keep us warm! Thank you, Juan-Pablo Hourcade, for the invitation.
Master’s student Karen Rust recently presented our poster “Understanding Child-Defined Gestures and Children’s Mental Models for Touchscreen Tabletop Interaction” at the Interaction Design for Children (IDC) conference in Aarhus, Denmark. Check out the poster here. We’re looking forward to IDC 2015 in Boston, MA!
Last week at the ICMI 2013 conference in Sydney, Australia, I presented work done in collaboration with my co-authors Radu-Daniel Vatavu and Jacob O. Wobbrock on new ways of understanding how users make stroke gestures (for example, with a stylus or finger on touchscreen devices). Our paper introduced 12 “Relative Accuracy Measures for Stroke Gestures.” The paper has details on the measures themselves and how they are derived; the talk focuses on what these types of measures can be used for and how they can help us design and build better gesture interactions. For those interested, my presentation slides are available here.
We have also released an open-source toolkit called “GREAT” (Gesture RElative Accuracy Toolkit), which you can use to compute the measures on your own dataset. Download it here!
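To give a feel for what a relative accuracy measure looks like, here is a rough, self-contained sketch of a shape-error-style measure: both the gesture and a reference stroke are resampled to the same number of equidistant points, and the mean point-to-point distance is reported. This is only an illustration of the general idea, not the GREAT toolkit’s actual API or the exact definitions from our paper; the function and parameter names below are mine.

```python
import math

def resample(points, n=64):
    """Resample a stroke (list of (x, y) tuples) to n equidistant points."""
    # Total path length of the stroke.
    total = sum(math.dist(points[i - 1], points[i]) for i in range(1, len(points)))
    interval = total / (n - 1)
    resampled = [points[0]]
    accumulated = 0.0
    pts = list(points)
    i = 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if d > 0 and accumulated + d >= interval:
            # Interpolate a new point at the target arc-length interval.
            t = (interval - accumulated) / d
            qx = pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0])
            qy = pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1])
            resampled.append((qx, qy))
            pts.insert(i, (qx, qy))  # continue walking from the new point
            accumulated = 0.0
        else:
            accumulated += d
        i += 1
    while len(resampled) < n:  # guard against floating-point shortfall
        resampled.append(points[-1])
    return resampled[:n]

def shape_error(gesture, reference, n=64):
    """Mean point-to-point distance between a gesture and a reference
    stroke, after both are resampled to n points."""
    g = resample(gesture, n)
    r = resample(reference, n)
    return sum(math.dist(p, q) for p, q in zip(g, r)) / n
```

A gesture traced exactly on top of the reference yields an error near zero, while a stroke offset from it yields roughly the offset distance; the actual measures in the paper are defined relative to each user’s own articulation, which is what makes them “relative” accuracy measures.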
This past week the Interaction Design for Children (IDC) conference was held in New York City! It was a great conference showcasing lots of cutting-edge research and design work being done in the area of children interacting with technology, ranging from exercise games to educational applications to technologies for kids with special needs. There were also great demos of some of the exciting projects.
I presented a paper on the MTAGIC project’s findings related to the impact of visual feedback on gesture interaction for kids. For those interested, check out the slides. UMBC PhD student Germaine Irwin presented the MTAGIC project’s poster on the use of gamification elements to encourage children to stay focused during empirical studies. She did a great job on the madness talk (only 15 seconds!) and on discussing the poster with interested attendees.
The CHI 2013 conference was two weeks ago, and I presented a paper on work I did with a University of Maryland colleague, Leah Findlater, called “Analyzing User-Generated YouTube Videos to Understand Touchscreen Use by People with Motor Impairments.” We looked at YouTube as a source of data on users with physical disabilities telling their own stories about how they use touchscreen devices like tablets and smartphones in their daily lives. We also received a ‘Best Paper Award’ for this work! If you’re interested, you can find my presentation slides here.
In November, I gave an overview of my research at Rutgers University’s School of Communication and Information during their LIS (Library & Information Science) department Brown Bag seminar. All of their seminars are video-recorded and uploaded to YouTube, so I am grateful to them for allowing me to share the talk with you here. It’s called “Understanding, Designing, and Developing Natural User Interactions for Children” and outlines my research arc in this area, beginning with my dissertation work and continuing through my current work on the MTAGIC project and the $-family of gesture recognizers. You can watch the full video (about an hour) below, or check it out at Rutgers CommInfo’s YouTube channel.