Monthly Archives: April 2012

Journal article on my dissertation work accepted to IJHCS!

My dissertation co-advisors, Ken Koedinger and Jie Yang, and I have had a paper accepted to the International Journal of Human-Computer Studies on my thesis work incorporating handwriting recognition into intelligent tutoring systems for algebra learning. Although I no longer work directly in this area (I’ve shifted somewhat away from learning sciences research and now focus more on interaction and design), it is great to finally publish my dissertation work to a wider audience. As the paper’s title, “A Paradigm for a Handwriting-Based Intelligent Tutor,” suggests, it focuses on the interaction paradigm we developed to let students solve math problems with natural handwriting input while limiting the impact of the system’s recognition errors on their learning experience. Here is the abstract:

This paper presents the interaction design and a demonstration of technical feasibility for intelligent tutoring systems that can accept handwriting input from students. Handwriting and pen input offer several affordances for students that traditional typing-based interactions do not. To illustrate these affordances, we present evidence, from tutoring mathematics, that the ability to enter problem solutions via pen input enables students to record algebraic equations more quickly, more smoothly (fewer errors), and with increased transfer to non-computer-based tasks. Furthermore, our evidence shows that students tend to like pen input for these types of problems more than typing. However, a clear downside to introducing handwriting input into intelligent tutors is that the recognition of such input is not reliable. In our work, we have found that handwriting input is more likely to be useful and reliable when context is considered, for example, the context of the problem being solved. As touch screens and tablet computers become increasingly affordable and commonplace, pen input is increasingly available to students in classrooms. We present an intelligent tutoring system for algebra equation solving via pen-based input that is able to use context to decrease recognition errors by 18% and reduce recognition error recovery interactions to one out of every five problems. We applied user-centered design principles to reduce the negative impact of recognition errors in the following ways: (1) though students handwrite their problem-solving process, they type their final answer to reduce ambiguity for tutoring purposes, and (2) in the small number of cases in which the system must involve the student in recognition error recovery, the interaction focuses on identifying the student’s problem-solving error to keep the emphasis on tutoring. Many potential recognition errors can thus be ignored and distracting interactions are avoided.
This work can inform the design of future systems for students using pen and sketch input for math or other topics by motivating the use of context and pragmatics to decrease the impact of recognition errors and put user focus on the task at hand.
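To make the abstract’s idea of “using the context of the problem being solved” concrete, here is a toy sketch of re-ranking a recognizer’s candidate interpretations of a handwritten equation step. All names (`Hypothesis`, `solve_linear`, `rerank`) and the deliberately tiny equation format are my own illustration, not the paper’s actual method: the idea is simply that candidates algebraically consistent with the previous step get a confidence boost.

```python
from dataclasses import dataclass

@dataclass
class Hypothesis:
    text: str     # a candidate recognition of a handwritten equation step
    score: float  # the recognizer's confidence in [0, 1]

def solve_linear(eq):
    """Toy solver for single-variable linear equations written in Python
    syntax, e.g. "2*x + 4 = 10". Samples each side at x=0 and x=1 to
    recover its linear coefficients, then solves a*x + b = c*x + d."""
    lhs, rhs = eq.split("=")
    ev = lambda side, x: eval(side, {"x": x})  # toy parsing only
    la, lb = ev(lhs, 1) - ev(lhs, 0), ev(lhs, 0)
    ra, rb = ev(rhs, 1) - ev(rhs, 0), ev(rhs, 0)
    return (rb - lb) / (la - ra)  # raises ZeroDivisionError if unsolvable

def rerank(prev_step, hypotheses, bonus=0.3):
    """Prefer candidates that preserve the solution of the previous step:
    a crude stand-in for using problem context to suppress misrecognitions."""
    target = solve_linear(prev_step)
    def adjusted(h):
        try:
            consistent = abs(solve_linear(h.text) - target) < 1e-9
        except (ZeroDivisionError, SyntaxError, NameError, ValueError):
            consistent = False
        return h.score + (bonus if consistent else 0.0)
    return max(hypotheses, key=adjusted)
```

For example, if the previous line was `2*x + 4 = 10`, a lower-confidence candidate `2*x = 6` would outrank a higher-confidence but inconsistent `2*x = 0`, since only the former preserves the solution x = 3.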

A link to the final version will be posted when it is available.


Paper accepted to Graphics Interface 2012 as a Note!

My colleague Jacob O. Wobbrock and I have had a paper accepted to the Graphics Interface 2012 conference! We extend our $N multistroke recognizer to use the closed-form matching method of Yang Li’s Protractor, speeding up the matching process significantly. The paper is titled “$N-Protractor: A Fast and Accurate Multistroke Recognizer,” and here is the abstract:

Prior work introduced $N, a simple multistroke gesture recognizer based on template matching, intended to be easy to port to new platforms for rapid prototyping, and derived from the unistroke $1 recognizer. $N uses an iterative search method to find the optimal angular alignment between two gesture templates, like $1 before it. Since then, Protractor has been introduced, a unistroke pen and finger gesture recognition algorithm also based on template matching and $1, but using a closed-form template-matching method instead of an iterative search method, considerably improving recognition speed over $1. This paper presents work to streamline $N with Protractor by using Protractor’s closed-form matching approach, and demonstrates that similar speed benefits occur for multistroke gestures from datasets from multiple domains. We find that the Protractor enhancements are over 91% faster than the original $N, and negligibly less accurate (<0.2%). We also discuss the impact that the number of templates, input speed, and input method (e.g., pen vs. finger) have on recognition accuracy, and examine the most confusable gestures.

Check out the camera ready version here. For the pseudocode of this method, see the $N project website.
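The closed-form matching step that replaces $N’s iterative angular search can be sketched in the style of Protractor’s optimal cosine distance. This is a simplified illustration, not the published pseudocode: function names are mine, and it assumes each gesture has already been resampled to a fixed point count and translated so its centroid is at the origin.

```python
import math

def vectorize(points):
    """Flatten centroid-centered, resampled (x, y) points into a unit
    vector (x1, y1, x2, y2, ...) so gestures compare by cosine similarity."""
    vec = [coord for (x, y) in points for coord in (x, y)]
    norm = math.sqrt(sum(v * v for v in vec))
    return [v / norm for v in vec]

def optimal_cosine_distance(v1, v2):
    """Closed-form best angular alignment between two vectorized gestures.
    Rotating v1 by angle t gives similarity a*cos(t) + b*sin(t), which is
    maximized at t = atan2(b, a) -- no iterative search needed."""
    a = sum(v1[i] * v2[i] + v1[i + 1] * v2[i + 1] for i in range(0, len(v1), 2))
    b = sum(v1[i] * v2[i + 1] - v1[i + 1] * v2[i] for i in range(0, len(v1), 2))
    angle = math.atan2(b, a)
    similarity = a * math.cos(angle) + b * math.sin(angle)
    return math.acos(max(-1.0, min(1.0, similarity)))  # clamp for float error
```

A quick sanity check on why this is closed-form: comparing a gesture against a rotated copy of itself recovers the rotation angle exactly and yields a distance of (nearly) zero, whereas $1 and the original $N would converge on that alignment through repeated golden-section search steps.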
