
Teaching robots how to feel

Kuldar Taveter is helping computers understand humans better. Photo credit: private collection

Computer scientist and Tartu University professor Kuldar Taveter is converting feelings into numbers.

Emotions play a big part in our lives. Most humans can quickly pick up on whether someone is pleased or angry. We can share joy with a friend or a family member. But for computers, it’s easier to explain how to build a space rocket than to read feelings.

Software developers and architects don’t usually design feelings into a new system. When building a product, they rarely consider this major difference between humans and robots.

Kuldar Taveter, a computer scientist and a professor at Tartu University, wants to change that. He is creating a model that would help robots understand feelings better.

Emotions are cultural artifacts

A psychology professor and one of the most cited scientists in her field, Lisa Feldman Barrett has studied emotions and concluded in her famous book “How Emotions Are Made: The Secret Life of the Brain” that our brains construct emotions from relationships, interactions, and cultural context. Essentially, emotions are cultural artifacts. They don’t just pop up out of nowhere – they have roots, reasons, and goals. Our brain collects information and translates it into feelings. We apply the label “anger,” for example, to different patterns of change in the body. Barrett has explained that no facial expression is reliably associated with anger, even in the same person.

Taveter, who once helped build Estonia’s e-governance system, was fascinated by these emerging theories about emotions and wanted to explore how they could be applied in computer science. He wondered: “How could computers measure and understand feelings?”

“Surprisingly, researching this topic in Europe is rare,” Taveter said.

In an article Taveter published with his colleagues in the “Journal of Systems and Software,” the authors concluded that software engineers “fail to give fair consideration to the emotional needs of users when designing systems” and pointed out that emotional needs have not been successfully addressed in the software engineering field.

In their view, emotions should not be disregarded as dysfunctional elements. On the contrary, they should be treated as first-class citizens!

Their study was the first one that used the theory of constructed emotion to “elicit and integrate emotional requirements at the early stage of the software development process,” as the authors wrote.

Emotions are closely linked to human values and have to be embedded in the software (and hardware), Taveter concluded. He based his research on Schwartz’s theory of basic values. These basic human values must be considered in a working product’s requirements, design, and architecture. But because values are culture-based, there is no one-size-fits-all web product. Instead, each culture’s specific value system should be considered: in one culture, users may seek more self-direction, and in another, more conformity or stimulation.
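To make this concrete, here is a minimal sketch, not taken from Taveter’s work, of how culture-specific priorities over Schwartz’s basic values could be recorded and used to rank candidate features. The value weights, culture labels, and feature names are invented for illustration.

```python
# Illustrative sketch: candidate features are tagged with the Schwartz basic
# values they serve, and each target culture has its own weighting of those
# values. All weights and features below are invented examples.

CULTURE_VALUE_WEIGHTS = {
    "culture_A": {"self_direction": 0.9, "stimulation": 0.7, "conformity": 0.2},
    "culture_B": {"self_direction": 0.3, "stimulation": 0.4, "conformity": 0.9},
}

# candidate feature -> basic values it primarily supports
FEATURE_VALUES = {
    "customisable dashboard":  ["self_direction"],
    "gamified challenges":     ["stimulation"],
    "guided default workflow": ["conformity"],
}

def rank_features(culture: str) -> list[tuple[str, float]]:
    """Rank candidate features by how well they match a culture's value priorities."""
    weights = CULTURE_VALUE_WEIGHTS[culture]
    scored = [
        (feature, sum(weights.get(value, 0.0) for value in values))
        for feature, values in FEATURE_VALUES.items()
    ]
    return sorted(scored, key=lambda item: item[1], reverse=True)

if __name__ == "__main__":
    print(rank_features("culture_A"))  # customisable dashboard ranks first
    print(rank_features("culture_B"))  # guided default workflow ranks first
```

In practice, such weightings would come from empirical studies of the target user group rather than hard-coded numbers.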

How do you “translate” emotions for computers? This can be done by applying lexicons of words and expressions related to various emotions, where each word and expression has been placed on a negative–positive valence scale and a high–low arousal scale. Such lexicons, with the related valence and arousal values, are available in different languages, including Estonian. Every emotion thus gets a numeric value based on its place on these scales. Taveter’s research group has used the Estonian version of the largest such lexicon to teach a dedicated “emotional” chatbot to interpret human emotions and reflect on them. This helps people understand their emotional states and, if needed, seek psychological therapy. The results of the experiments with the “emotional” chatbot conducted in Taveter’s research group are reported in this research paper.
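As a rough illustration of the lexicon-based approach described above, and not Taveter’s actual tool, the following sketch averages the valence and arousal values of the recognised words in a message. The lexicon entries and the -1..1 scale are invented for the example.

```python
# Minimal sketch of lexicon-based emotion scoring, assuming a word list
# annotated with valence (negative..positive) and arousal (calm..excited)
# on a -1..1 scale. The entries below are invented examples, not taken
# from the Estonian lexicon used by Taveter's group.

# word -> (valence, arousal)
LEXICON = {
    "happy":   (0.8, 0.5),
    "angry":   (-0.7, 0.8),
    "calm":    (0.4, -0.6),
    "worried": (-0.5, 0.4),
}

def score_message(text: str) -> tuple[float, float] | None:
    """Average the valence and arousal of the words found in the lexicon."""
    hits = [LEXICON[word] for word in text.lower().split() if word in LEXICON]
    if not hits:
        return None  # no emotional words recognised
    valence = sum(v for v, _ in hits) / len(hits)
    arousal = sum(a for _, a in hits) / len(hits)
    return valence, arousal

if __name__ == "__main__":
    print(score_message("I am happy but a bit worried"))
    # -> (0.15, 0.45): mildly positive valence, moderate arousal
```

A real system would use the full Estonian lexicon, handle inflected word forms, and map the resulting valence–arousal point to an emotion category before reflecting it back to the user.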

“We want to create a systematic, thorough software tool based on our methods and systematic approach,” Taveter explained. “That being said, I don’t think every bank web solution should be very emotional. But there are many platforms where these theories could be applied.”

Ultimately, his message is clear: we should think about emotions when designing software. The new knowledge is increasingly helping us put emotions into numbers so that scientists in other fields can apply it in their own work.

It’s not just about making robots more human. Understanding how our emotions are constructed could also bring us closer to understanding robots.

Could robots ever understand feelings as they arise in our cultural context? Photo credit: Pixabay.

Written by: Marian Männi

This article was funded by the European Regional Development Fund through the Estonian Research Council.
