AccessComputing minigrant accepted!

My colleagues at UMBC and Landmark College and I recently had an AccessComputing minigrant accepted for funding to run a “Participatory Design Workshop for Accessible Apps and Games” at Landmark! Landmark College is a small two-year college in Vermont that serves students with learning and cognitive disabilities. AccessComputing is an NSF-funded program, administered by researchers at the University of Washington, that works to increase the participation of people with disabilities in computing careers. The workshop will be a one-day event held in the coming months and will introduce Landmark students to some of the basic principles of human-computer interaction. It centers on participatory design, showing Landmark students how HCI takes user needs and characteristics into account when designing technology, including for diverse user populations such as their own. Students from UMBC will also participate, leading the participatory design sessions to gather user feedback on mobile apps and games they are designing as part of a current course project.

We are really excited that we were funded and are looking forward to the workshop!

Filed under Funding

Paper accepted to ICMI 2011 workshop!

We just had a paper accepted to the ICMI 2011 workshop on “Inferring Cognitive and Emotional States from Multimodal Measures (MMCogEmS)”! The paper is called “Gesture Dynamics: Features Sensitive to Task Difficulty and Correlated with Physiological Sensors” and reports partial results from our recent study in the Multimodal Stress Detection project. The study included many modalities of input, but this paper focuses on the gesture modality. Here is the abstract:

This paper presents preliminary results regarding which features of pen-based gesture input are sensitive to cognitive stress when manipulated via changes in task difficulty. We conducted a laboratory study in which participants performed a vigilance-oriented continuous attention and visual search task. Responses to the search stimuli were entered via pen gestures (e.g., drawing a letter corresponding to the stimulus). Task difficulty was increased during predefined intervals. Participants’ input behaviors were logged, allowing for analysis of gesture input patterns for features sensitive to changes in task difficulty. We also collected physiological sensor readings (e.g., skin temperature, pulse rate, and respiration rate). Input behavior features such as gesture size and pen pressure were not affected by task difficulty, but gesture duration and length were affected. Task difficulty also affected physiological sensors, notably pulse rate. Results indicate that both gesture dynamics and physiological sensors can be used to detect changes in difficulty-induced stress.

Here’s the camera-ready version of the paper.
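For a concrete sense of what “gesture dynamics” features look like, here is a minimal sketch of the kind of feature extraction involved, written in Python. The sample structure and names here are illustrative assumptions for this post, not the paper’s actual logging format or analysis code:

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class PenSample:
    """One logged pen event (hypothetical format, not the study's logger)."""
    x: float          # screen x-coordinate
    y: float          # screen y-coordinate
    pressure: float   # normalized pen pressure, 0..1
    t: float          # timestamp in seconds

def gesture_features(samples: list[PenSample]) -> dict[str, float]:
    """Compute simple gesture-dynamics features for one pen stroke.

    Assumes a non-empty list of samples sorted by timestamp.
    """
    # Duration: elapsed time from pen-down to pen-up.
    duration = samples[-1].t - samples[0].t
    # Length: total path length, summed over consecutive samples.
    length = sum(
        hypot(b.x - a.x, b.y - a.y)
        for a, b in zip(samples, samples[1:])
    )
    # Size: area of the stroke's bounding box.
    xs = [s.x for s in samples]
    ys = [s.y for s in samples]
    size = (max(xs) - min(xs)) * (max(ys) - min(ys))
    # Pressure: mean pen pressure over the stroke.
    mean_pressure = sum(s.pressure for s in samples) / len(samples)
    return {
        "duration": duration,
        "length": length,
        "size": size,
        "mean_pressure": mean_pressure,
    }
```

Features like these, computed per gesture, can then be compared across task-difficulty conditions or correlated with physiological measures.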

Filed under Publication