Robot hand touching human hand (SWNS / Ruhr University Bochum)

Empathic machines that predict users' feelings could be on the horizon

JYVÄSKYLÄ, Finland — Imagine if your computer could sense your emotions as you worked — feeling your joy at completing a task, your boredom during repetitive data entry, or your frustration when an error message pops up for the tenth time. This might sound like science fiction, but researchers are bringing this vision closer to reality by developing advanced computational models that can predict and simulate human emotions during computer interactions.

At the forefront of this effort is a team of Finnish scientists who have created a model that integrates two key frameworks: appraisal theory and reinforcement learning. Appraisal theory suggests that our emotions arise from our cognitive evaluations of events. For example, we might feel happy if we appraise an event as conducive to achieving our goals. Reinforcement learning, on the other hand, is a framework for understanding how rewards and punishments shape behavior over time.

The researchers combined these two approaches into a unified computational model that can predict emotional responses as a person interacts with a computer to complete tasks. The model essentially puts itself in the user's shoes, simulating the series of actions, outcomes, and cognitive appraisals that ultimately give rise to emotions like happiness, boredom, or frustration.
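The study's exact equations aren't reproduced here, but the basic idea can be sketched in a few lines of code. The following is a minimal, hypothetical illustration, assuming a single running reward estimate and hand-tuned appraisal rules, of how a reinforcement-learning signal (the gap between expected and actual outcomes) might be appraised into happiness, frustration, and boredom scores. The class name, constants, and update rules are assumptions for illustration, not the researchers' implementation.

```python
# A minimal, illustrative sketch (NOT the authors' published model): a single
# running reward estimate stands in for the reinforcement-learning part, and
# hand-tuned appraisal rules map its prediction errors onto emotions.
from dataclasses import dataclass, field


@dataclass
class AppraisalEmotionModel:
    learning_rate: float = 0.2
    expected_reward: float = 0.0      # RL part: what the user has come to expect
    happiness: float = 0.0
    boredom: float = 0.0
    frustration: float = 0.0
    history: list = field(default_factory=list)

    def step(self, reward: float) -> dict:
        # Reinforcement learning: compare the outcome with what was expected.
        prediction_error = reward - self.expected_reward
        self.expected_reward += self.learning_rate * prediction_error

        # Appraisal: positive surprises are goal-conducive (happiness), negative
        # surprises are goal-obstructive (frustration), and a long run of
        # unsurprising outcomes reads as monotony (boredom).
        self.happiness = max(0.0, 0.8 * self.happiness + 0.5 * max(prediction_error, 0.0))
        self.frustration = max(0.0, 0.8 * self.frustration - 0.5 * min(prediction_error, 0.0))
        self.boredom = max(0.0, 0.9 * self.boredom + (0.1 if abs(prediction_error) < 0.05 else -0.05))

        snapshot = {
            "happiness": round(self.happiness, 3),
            "boredom": round(self.boredom, 3),
            "frustration": round(self.frustration, 3),
        }
        self.history.append(snapshot)
        return snapshot


if __name__ == "__main__":
    model = AppraisalEmotionModel()
    # Mostly successful interactions (reward 1) with a single error (reward -1).
    for reward in [1.0, 1.0, 1.0, -1.0, 1.0, 1.0]:
        print(model.step(reward))
```

In this toy setup, positive surprises raise happiness, negative surprises raise frustration, and long stretches without surprises accumulate boredom, which captures the spirit of putting an appraisal layer on top of a learning agent.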

“Humans naturally interpret and react to each other's emotions, a capability that machines fundamentally lack,” explains Jussi Jokinen, an associate professor of cognitive science at the University of Jyväskylä, in a media release. “This discrepancy can make interactions with computers frustrating, especially if the machine remains oblivious to the user’s emotional state.”

Stressed, upset millennial sitting at work computer (© WavebreakMediaMicro - stock.adobe.com)

To test their model, the researchers designed a series of interactive computer tasks meant to evoke specific emotions. In the “happiness” task, users answered a series of questions and received positive feedback for correct responses. The “boredom” task involved a drawn-out series of dull, repetitive questions. In the “frustration” task, the system was intentionally programmed to display error messages and ultimately fail, regardless of the user's answers.
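To make that setup concrete, here is one hedged way the three conditions could be encoded as feedback schedules and scored with a crude emotion accumulator. The schedules, weights, and the emotion_trajectory helper are hypothetical stand-ins, not the study's actual task materials.

```python
# Illustrative sketch only: each condition becomes a sequence of feedback events
# (+1 = positive feedback, 0 = dull/no feedback, -1 = error message), and a
# naive accumulator tracks happiness, boredom, and frustration over the run.
CONDITIONS = {
    "happiness":   [+1, +1, +1, +1, +1, +1, +1, +1],
    "boredom":     [0, 0, 0, 0, 0, 0, 0, 0],
    "frustration": [-1, -1, -1, -1, -1, -1, -1, -1],
}


def emotion_trajectory(feedback):
    """Accumulate crude emotion scores over a sequence of feedback events."""
    happiness = boredom = frustration = 0.0
    trajectory = []
    for event in feedback:
        happiness = 0.8 * happiness + (0.5 if event > 0 else 0.0)
        frustration = 0.8 * frustration + (0.5 if event < 0 else 0.0)
        boredom = 0.9 * boredom + (0.2 if event == 0 else 0.0)
        trajectory.append((round(happiness, 2), round(boredom, 2), round(frustration, 2)))
    return trajectory


for name, feedback in CONDITIONS.items():
    print(name, emotion_trajectory(feedback)[-1])
```

In this toy version, the frustration score climbs with each simulated error, which mirrors the kind of steadily rising trajectory described next.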

As study participants worked through these tasks, the emotional reactions predicted by the model closely matched the emotions reported by the users themselves. For instance, in the “happiness” condition, both the model and the human participants reported high levels of happiness and low levels of boredom or frustration. The model was even able to capture more nuanced emotional trajectories, such as a steady increase in frustration over the course of the error-ridden task.

The researchers believe their emotion-predicting model could pave the way for a new generation of emotionally intelligent computers that can tailor their behavior to the user's psychological state. An effective system might offer a stressed user soothing words of encouragement, liven up a boring task with humor or gamification, or provide empathetic assistance when frustration mounts. By creating interactions that are more emotionally attuned, designers could boost user engagement, productivity, and overall well-being.

Of course, the model is still a work in progress and will need to be extended to capture a wider range of emotions across more complex, real-world computer interactions. The researchers also emphasize the importance of gathering more diverse training data to ensure the model can accurately predict emotions for users of all backgrounds.

There are also ethical considerations. An emotion-sensing computer could be a powerful tool for empathy and assistance, but it could also be seen as invasive or manipulative if not implemented thoughtfully and transparently. Clear opt-in procedures and data protection will be crucial.

Despite these challenges, the potential benefits are significant. From more engaging educational software to emotionally supportive virtual assistants, affective computing could reshape the way we relate to and work with machines. By bridging the emotional gap between humans and computers, this technology promises to make our increasingly digital world a bit more human.

As the researchers continue to refine their model, a future where our devices don't just see our clicks and keystrokes but also perceive our joys, fears, and frustrations inches closer to reality. In this brave new world of emotionally fluent machines, we may find that computers are not only tools for productivity but also partners in our psychological journeys.

EdNews Editor-in-Chief Steve Fink contributed to this report.

About EdNews Staff

EdNews sets out to find new research that speaks to mass audiences — without all the scientific jargon. The stories we publish are digestible, summarized versions of research that are intended to inform the reader as well as stir civil, educated debate. EdNews Staff articles are AI-assisted, but always thoroughly reviewed and edited by an EdNews staff member. Read our AI Policy for more information.
