Can robots understand our feelings?

David Konopnicki Photo: PR

David Konopnicki, head of emotion recognition at IBM's Haifa research center, tells "Globes" about the use of robots in customer service.

The Belgian police recently published an unusual statement. The police website warned Facebook users against using the Reactions buttons recently introduced by the world's largest social network. The warning stated that reacting with an emotion (such as 'angry' or 'sad') might compromise the user's privacy.

Such a claim is far from unfounded. Facebook's algorithm attempts to collect any available piece of information in order to present users with customized ads. A person whose Reactions suggest that he is in a good mood might be shown more ads, which is more profitable for Facebook.

Facebook is probably the last place where you can hope to find privacy, but this surprising statement by the Belgian police should provoke thought about the motives of companies and governments. After all, many entities would be eager to know what we feel, and that information could be worth a lot of money.

David Konopnicki is responsible for emotion recognition at IBM's Haifa research laboratories. He told "Globes" that this issue has been on the table for quite a while. "The first scientific article in the field," he says, "was written 20 years ago, but over the years, most emotion recognition applications have remained limited to the lab. Why? Because identifying emotions is complicated. In the past, if you wanted to identify emotions, you had to connect people to complicated, heavy machinery, which you could not do in 'real life.'"


How did you start examining the field of emotion analysis?

Konopnicki: "Several years ago, we ran a brainstorming session, trying to figure out which information about people we have and have not yet used. Today, you can tell where people are and what they are interested in. After a few days, we reached the conclusion that there is one very important piece of information about people that we did not yet utilize, which is their emotions. When computers interact with people, we want them to do so in a way that simulates having emotions."

How is this actually manifested?

"In the past few years, with the development of mobile phones and wearable computers, people are no longer interacting with a computer using only a mouse and a keyboard. For example, people have been interacting with computers using a camera, which enables an examination of their facial expressions and the development of emotion recognition applications. The computer could, for example, know when to thank a person and when to apologize.

"We are currently most interested in applications in customer service. We are talking about a situation in which people are talking to an automatic agent and are aware that it is a robot. The person could say, for example, 'I had a problem with the car' and the computer would know to respond with 'First of all, I am sorry that this happened to you.' The computer knows that if a person had a car breakdown, he would feel certain emotions. If the computer notices that the person is confused, it can provide more detailed instructions or transfer him to a human representative.

"We are also interested in remote learning applications. Today, more and more people are learning in front of a computer. When you are in front of a computer, it can identify what you are having difficulty with, what bores you, and what your level of motivation is. Just as a good teacher knows how to challenge his student, spark interest and so on, we are interested in developing a computer that can identify the learner's difficulties in the same manner and use various programs or pedagogical techniques to improve teaching."

When the computer analyzes my emotions, does it use only text or does it use expressions as well? For example, I could say that I am angry, but in an amused tone.

"We are working on several levels. The first one is the textual level, which we use when a person sends an email or contacts a virtual service representative on a website. We examine your word choice, with each word carrying a different emotional weight. For example, if you contact an insurance company using chat and write that you have had an accident, 'accident' would be a word with a certain emotional weight, which the computer learns to identify. In addition, we look at syntax and punctuation, as well as emoticons. If a person talks to the computer on the phone, we analyze the voice and then examine the text, in addition to the tone and how fast the person is talking.
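The textual level he describes can be illustrated with a toy lexicon-based scorer, in which each word or emoticon carries a weight toward one or more emotions. All words, weights and emotion labels below are invented for illustration; they are not IBM's actual lexicon or method.

```python
# Hypothetical lexicon: token -> {emotion: weight}. Punctuation and
# emoticons carry weight too, as the interview notes.
LEXICON = {
    "accident": {"sadness": 1.0, "fear": 0.5},
    "thanks":   {"happiness": 1.0},
    "broken":   {"anger": 0.5, "sadness": 0.5},
    ":(":       {"sadness": 0.5},
}

def score_text(text: str) -> dict:
    """Sum the emotional weights of every known token in the text."""
    scores = {}
    for token in text.lower().split():
        for emotion, weight in LEXICON.get(token, {}).items():
            scores[emotion] = scores.get(emotion, 0.0) + weight
    return scores

print(score_text("I had an accident :("))
# → {'sadness': 1.5, 'fear': 0.5}
```

A production system would of course use a far larger lexicon and account for negation, syntax and context, but the principle of accumulating per-word emotional weights is the same.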

"As for speech, this is largely culture-dependent: there are cultures in which people talk very loudly and get agitated quickly, while in other cultures a person may be agitated but you will not hear it in his voice. The third level assisting us with emotion recognition is facial expressions, and this is universal. In all cultures, people smile when they are satisfied and turn the corners of their mouths down when sad. It is not something you learn; it is innate.

"In applications where it is possible, we also analyze facial expressions. Wearables are also relevant here: there are watches on the market measuring blood pressure and heart rate, and this will also help us in the future."

Are emotions limited to happiness or sadness, or can the system identify complex nuances?

"There are emotion analysis models that have been developed over decades. A renowned researcher named Paul Ekman defined six basic emotions: anger, fear, sadness, happiness, surprise and disgust. What is unique about these emotions is that they are physiological and shared by humans and animals alike. They can be identified by heart rate, facial expressions and more.

"In the case of customer service, it is important to identify anger, disappointment or happiness, for example. We are interested in differentiating between anger and disappointment, although both are negative feelings. We take recordings of real conversations with a human service representative and say 'this is an example of an angry or disappointed person.' The computer learns from these examples how an angry person sounds, and so on."
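The learn-from-labeled-examples step can be sketched as a minimal supervised classifier: count which words appear in transcripts labeled "angry" versus "disappointed", then label a new transcript by which class it shares the most words with. The transcripts and labels below are invented; real systems train on acoustic features as well as text, and on far more data.

```python
from collections import Counter, defaultdict

def train(examples):
    """examples: list of (transcript, label) pairs -> word counts per label."""
    counts = defaultdict(Counter)
    for text, label in examples:
        counts[label].update(text.lower().split())
    return counts

def classify(counts, text):
    """Pick the label whose training vocabulary overlaps the new text most."""
    words = text.lower().split()
    return max(counts, key=lambda label: sum(counts[label][w] for w in words))

model = train([
    ("this is outrageous i demand a refund now", "angry"),
    ("i expected better from you this is a letdown", "disappointed"),
])
print(classify(model, "i demand a refund"))  # → angry
```

This is the bag-of-words version of the idea; the same train-on-labeled-recordings loop applies to tone, loudness and speech rate.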

Will the computer offer different service to angry and disappointed clients?

"Yes, exactly. We want not only to identify emotions but to know what to do when a certain emotion appears. Most companies have some kind of policy about what to do in this or that case. The computer knows, for example, that if an important customer is angry, prices should go down by 10%. We offer techniques: lowering prices, apologizing, or transferring the client to a human representative."
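The policy layer he describes is essentially a rulebook mapping a detected emotion and customer attributes to an action. The 10% discount rule follows the example given in the interview; the other rules and all function names are hypothetical.

```python
def choose_action(emotion: str, important_customer: bool) -> str:
    """Map a detected emotion and customer tier to a company policy action."""
    if emotion == "angry" and important_customer:
        return "offer 10% discount"           # the rule from the interview
    if emotion == "angry":
        return "apologize and transfer to a human representative"
    if emotion == "disappointed":
        return "apologize"
    return "continue automated handling"

print(choose_action("angry", important_customer=True))
# → offer 10% discount
```

In practice such policies would live in configurable business-rules systems rather than hard-coded branches, but the structure is the same: emotion recognition produces a label, and policy turns the label into an action.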

Will your initiative kill the entire field of customer service? Will there be a robot smart enough to respond to any scenario?

"I do not think so. There is a certain stage at which, if the problem is complicated enough, we will need a human. At the same time, look at what is happening on the internet. In the past, if you wanted information from a company, you had to call and ask. Nowadays, all companies have some sort of website. The site usually has a frequently asked questions section, where you can find answers to about 80% of all common questions. I believe that we will be able to take care of a larger percentage of problems automatically, with gradually fewer problems requiring human intervention."

Ethical aspects

Could this spill over into workplaces? Will employers monitor employees' emotions?

"All these issues have complicated ethical aspects. Today, certain firms can already monitor, for example, where you are surfing on the web and whether or not you are focusing on your work. What protects workers in this case is the law, rather than technology.

"There are currently systems that check whether the person operating heavy machinery is tired. If a person is tired, he is prohibited from operating the machine. We know that car manufacturers are also interested in this. They could, for example, know if a driver is agitated. Will the car not start in this case? I do not know, but since there will be more autonomous cars, such a car could take control from an agitated driver."

"People are getting accustomed, they no longer notice that this a robot"

David Konopnicki: "You can take healthcare, for example. We know that emotions affect diseases and that some illnesses also have an emotional effect. We know what a person eats, how much he exercises and how his emotions change, and this can be analyzed.

"In the field of security, security entities worldwide dream of knowing whether a person is lying without connecting him to a complicated device, as is done now. No one can do it based on facial imaging alone, but whoever manages to do this will become very rich."

How will this trend affect what remains of our privacy?

"We are already being photographed in all kinds of places without our knowledge and have no idea what is being done with these photographs. We work more with customer service, in the field of retailing. The principle is that you are talking to a robot. I have no problem telling a client, 'You are talking to a robot and it is analyzing your emotions.'"

I, as a customer, would feel very uncomfortable with this.

"You see people talking to Siri (Apple's personal assistant) and with Microsoft's Cortana and they are getting used to it. They no longer notice that this is a robot."

When you ask about the weather or request driving directions, this is not critical, but if I were to consult a doctor, I would prefer having a real doctor on the other side, rather than a program that can analyze my physical parameters.

"I am not saying that a robot will replace a doctor, but I do think that a robot will in some cases replace a technical support person. I also think that such systems can support doctors. In a remote interaction, it could be useful for the doctor to know that the person he is talking with is in distress. The doctor might find it difficult to recognize this, and a computer might do it more easily."

Published by Globes [online], Israel business news - - on July 10, 2016

© Copyright of Globes Publisher Itonut (1983) Ltd. 2016
