User testing is employed across almost every industry, most often through surveys given to users after they have experienced a product. However, memory is fallible; feedback captured in the moment, as users experience a product, yields much more reliable information. But how can you capture it without creating a distraction?
Elsy Meis, who graduated with a BS in Creative Technology and Design this spring, proposes an approach in a paper she will present later this month at the Human Computer Interaction International (HCI International) conference, held online June 26 to July 1. In the paper, titled “HCI Strategies for Informing the Design of a Teacher Dashboard: How Might Real-Time Situational Data Determine the Potential for Technological Support in the Classroom?,” she discusses user testing of a teacher dashboard developed by a team from the Institute of Cognitive Sciences.
Teacher dashboards give teachers real-time information about student learning. In this case, the system gauges student learning by using natural language processing to monitor conversations among small groups of students working on specific problems. Based on the content of those conversations, the system provides the teacher with a real-time, group-by-group assessment of student learning.
However, given the existing stresses of the classroom environment, the user experience for such technology must be finely tuned for teachers to adopt it. In her paper, Meis critiques the current retrospective user-survey approach to testing, pointing out the weakness of relying on memory. Instead, she proposes gathering data during the lesson itself, using a single red button on an iPad:
“When they press the button, they are answering yes to the question: ‘Are you feeling overwhelmed?’” says Meis. “The testing sessions are recorded from multiple angles, so we have all the context, and it’s not hard to figure out why they were feeling stressed.”
“Until now, the research group has interviewed teachers after class ended to determine how they felt at different points during the lesson, or interviewed them about hypothetical situations. With this approach, we get more information without having to ask them,” says Meis.
The HCI International selection committee for late-breaking work clearly agreed, accepting her paper to the conference. “The paper deals with a very specific application of this idea,” says Meis, “but the hope is that it will be seen as applying to a wide range of user-testing scenarios.”
This research was supported by the NSF National AI Institute for Student-AI Teaming (iSAT) (DRL ).
Publication
Elsy Meis, Samuel Pugh, Rachel Dickler, Mike Tissenbaum and Leanne Hirshfield, 2022. “HCI Strategies for Informing the Design of a Teacher Dashboard: How Might Real-Time Situational Data Determine the Potential for Technological Support in the Classroom?” In Proceedings of HCI International 2022 (held virtually, June 26 to July 1, 2022).