I am thrilled to announce that I have recently been awarded an NSF CAREER (short for: Faculty Early Career Development Program) grant in the IIS Division entitled “Natural User Interfaces for Children.” This grant will fund my lab’s research on natural interaction for children over the next 5 years and will form the foundation of my long-term research agenda, spanning touchscreen interaction, whole-body interaction, and multimodal interaction. Keep an eye on the INIT Lab website for updates!
I’m excited to announce that I’ve been invited to be a keynote speaker at the upcoming University of Iowa Obermann Center Working Symposium entitled “Designing the Digital Future: A Human-Centered Approach to Informatics.” I’ll be giving a version of my research talk, “Understanding, Designing, and Developing Natural User Interfaces for Children.” I’m honored to be part of an amazing slate of HCI speakers, including Mary-Beth Rosson, Ron Wakkary, Celine Latulipe, Tammy Clegg, and Lisa Nathan. It will be much colder in Iowa City than in Florida this time of year, but I’m sure good company and rich conversations will keep us warm! Thank you, Juan-Pablo Hourcade, for the invitation.
I have received a gift of equipment from Intel’s Software Academic Program to support my class on Natural User Interfaces. Students will use the tablets to design and develop more natural interfaces that use touch and gesture interaction for their class projects. With the class being so large, this equipment gift will ensure we have enough devices to go around!
Thank you, Intel!
In November, I gave an overview of my research at Rutgers University’s School of Communication and Information during their Library & Information Science (LIS) department Brown Bag seminar. All of their seminars are video-recorded and uploaded to YouTube, so I am grateful to them for allowing me to share the talk with you here. It’s called “Understanding, Designing, and Developing Natural User Interactions for Children” and outlines my research arc in this area, beginning with my dissertation work and continuing through my current work on the MTAGIC project and the $-family of gesture recognizers. You can watch the full video (about an hour) below, or check it out at Rutgers CommInfo’s YouTube channel.