Multimodal Stress Detection

Project Summary

Prior work in affective computing, human-computer interaction, and related fields has found that physiological measures such as pulse rate, galvanic skin response, and skin temperature are correlated with changes in cognitive stress. However, sensing such measures during end-user interactions requires highly intrusive sensors worn on the body. Even eye-tracking approaches, which can be integrated into the monitor and do not require the user to wear anything, involve potentially costly hardware that may not yet scale to all devices. We envision interactions in which the system uses features of the user's own input behavior, ideally in multiple modalities, to unobtrusively detect cognitive states. The end goal is to make intelligent and appropriate interventions to assist the user or to adapt the interaction to suit the current cognitive state. Identifying what these interventions or adaptations might be requires further study, but we draw on work in Dynamic Accessibility for inspiration.

Prior work has investigated a variety of modalities, including speech, typing, pen gestures, and more, usually in isolation. We conducted a study in which we induced cognitive stress via changes in task difficulty at regular intervals. Users entered responses to the task in a variety of modalities; the data collected in this study therefore enable cross-modality comparisons. We are currently analyzing these data to identify the most useful features in each modality for a multimodal system that detects changes in cognitive load or stress. We draw on work in affective computing, accessible computing, multimodal interfaces and interaction, machine learning, and human-computer interaction.

Project Status and Findings

This project began in January 2011 and is currently underway. Major findings include:

  1. Input behavior features such as gesture size and pen pressure were not affected by task difficulty, but gesture duration and length were. Task difficulty also affected physiological measures, notably pulse rate. [Anthony et al., MMCogEmS'2011]
  2. (coming soon) [Anthony et al, TR 2011]

Current project status: We are continuing to analyze the data collected during summer 2011, both in modalities other than gesture and at finer timestamp granularities, to increase the sensitivity with which we can identify cognitive stress.
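To make the gesture features in finding 1 concrete, the sketch below computes stroke-level duration, path length, bounding-box size, and mean pen pressure from a list of timestamped pen samples. This is an illustrative assumption about the feature definitions, not the study's actual analysis pipeline; the `Point` format and function names are hypothetical.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class Point:
    x: float         # screen coordinates
    y: float
    t: float         # timestamp in seconds
    pressure: float  # normalized pen pressure, 0..1

def gesture_features(stroke):
    """Compute stroke-level features: duration, path length,
    bounding-box size (diagonal), and mean pen pressure.
    `stroke` is a time-ordered list of Point samples."""
    duration = stroke[-1].t - stroke[0].t
    # Path length: sum of distances between consecutive samples.
    length = sum(hypot(b.x - a.x, b.y - a.y)
                 for a, b in zip(stroke, stroke[1:]))
    width = max(p.x for p in stroke) - min(p.x for p in stroke)
    height = max(p.y for p in stroke) - min(p.y for p in stroke)
    size = hypot(width, height)  # bounding-box diagonal
    mean_pressure = sum(p.pressure for p in stroke) / len(stroke)
    return {"duration": duration, "length": length,
            "size": size, "mean_pressure": mean_pressure}
```

Features like these could be aggregated per task block, or over the finer-grained time windows mentioned above, before being compared against task difficulty or physiological measures.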


Publications and Papers:

  • Anthony, L., Carrington, P., Chu, P., Kidd, C., Lai, J., and Sears, A. 2011. Gesture dynamics: features sensitive to task difficulty and correlated with physiological sensors. Proceedings of the ICMI 2011 Workshop on Inferring Cognitive and Emotional States from Multimodal Measures (ICMI’2011 MMCogEmS), Alicante, Spain, 17 Nov 2011. [pdf]
  • Anthony, L., Carrington, P., Chu, P., Kidd, C., Lai, J., and Sears, A. 2011. Detecting Events of Interest with Physiological Sensors in a Real-World Email Search Task. Technical Report UMBC-IS-TR-007, 10 Oct 2011. [coming soon]


Presentations:

  • 11/2011 — Paper presentation at ICMI'2011 MMCogEmS workshop. [pdf]



Project Team:

  • Dr. Lisa Anthony (Co-PI, contact person)
  • Dr. Andrew Sears (Co-PI)
  • Patrick Carrington (PhD candidate, formerly undergraduate research assistant)


  • Peng Chu (PhD candidate)
  • Christopher Kidd (MS student)
  • Jianwei Lai (PhD candidate)

Selected Bibliography

Inferring cognitive states from behavioral measures:

  • Mentis, H.M. and Gay, G.K. 2002. Using TouchPad pressure to detect negative affect. Proc. ICMI 2002, ACM Press, 406-410.
  • Mota, S. and Picard, R.W. 2003. Automated Posture Analysis for Detecting Learner’s Interest Level. Proc. CVPRW 2003, IEEE Press, 49-55.
  • Ruiz, N., Taib, R., Shi, Y., Choi, E. and Chen, F. 2007. Using pen input features as indices of cognitive load. Proc. ICMI 2007, ACM Press, 315.
  • Ruiz, N., Feng, Q.Q., Taib, R., Handke, T. and Chen, F. 2010. Cognitive skills learning: pen input patterns in computer-based athlete training. Proc. ICMI-MLMI 2010, ACM Press, Article 41, 4 pgs.
  • Schuller, B., Rigoll, G. and Lang, M. 2004. Emotion recognition in the manual interaction with graphical user interfaces. Proc. ICME 2004, IEEE Press, 1215-1218.
  • Vizer, L.M., Zhou, L. and Sears, A. 2009. Automated stress detection using keystroke and linguistic features: An exploratory study. International Journal of Human-Computer Studies 67, 10 (Oct. 2009), 870-886.
  • Yin, B. and Chen, F. 2007. Towards Automatic Cognitive Load Measurement from Speech Analysis. In Human-Computer Interaction: Interaction Design and Usability, J.A. Jacko, Ed. Springer Berlin Heidelberg. 1011-1020.

Inferring cognitive states from physiological sensors:

  • Ikehara, C.S. and Crosby, M.E. 2005. Assessing Cognitive Load with Physiological Sensors. Proc. HICSS 2005, IEEE Press, 295a.
  • Shastri, D., Merla, A., Tsiamyrtzis, P. and Pavlidis, I. 2009. Imaging Facial Signs of Neurophysiological Responses. IEEE Transactions on Biomedical Engineering 56, 2 (Feb. 2009), 477-484.
  • Strauss, M., Reynolds, C., Hughes, S., Park, K., McDarby, G. and Picard, R.W. 2005. The HandWave Bluetooth Skin Conductance Sensor. In Affective Computing and Intelligent Interaction, J. Tao, T. Tan, and R.W. Picard, Eds. Springer Berlin Heidelberg. 699-706.

Cognitive load self-reports:

  • Hart, S. and Staveland, L. 1988. Development of NASA-TLX (Task Load Index): Results of Empirical and Theoretical Research. In Human Mental Workload, P. A. Hancock and N. Meshkati, Eds. North Holland Press.

Vigilance tasks:

  • Auburn, T.C., Jones, D.M. and Chapman, A.J. 1987. Arousal and the Bakan vigilance task: The effects of noise intensity and the presence of others. Current Psychology 6, 3 (Sep. 1987), 196-206.

last revised 04/25/2012
