13 January 2018 | General
Posted by aisolab

Next level of Artificial Intelligence: machines understand emotions

Machine learning keeps advancing into new dimensions. Artificial Intelligence can now do something that previously seemed reserved for humans: it understands emotions.

 

Artificial intelligence: a model of the human brain

Modern AI and machine learning with neural networks have a natural model: the human brain. The brain is the most effective tool we know for solving problems. Yet one critical aspect of our intelligence was missing from previous AI programs: empathy and emotional intelligence. These abilities let people grasp feelings and make intuitive decisions “straight from the gut”. Until now, intelligent software has been able to understand speech, respond to it and act independently according to a given data template, that is, to act intelligently in the everyday sense of the word. But it does not feel anything.

Now developers have moved a step closer to incorporating emotions into machine intelligence. Engineers have devised methods that let a computer recognise human feelings from physiological reactions and facial features. The pioneers of AI programs, Google, Microsoft and other giants, are very interested in this. They would like to integrate this aspect of AI into their existing solutions, or to build computer-aided sentiment analysis that helps machines interpret human feelings correctly and act accordingly. These can be machines of all kinds, even construction machinery.

How does machine learning about emotions work?

Data that communicates a person's emotional state to a machine can be transmitted in many different ways; a short code sketch after the list shows how one of these signals can be analysed. The signals include:

  • tone of voice
  • speech patterns
  • use of certain expressions and phrases
  • facial expressions
  • physiological signals such as pulse, heart rate and body temperature
  • gestures
  • body language
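
To make the text-based signals concrete, here is a minimal sketch of how the words and phrases someone uses can be mapped to a sentiment score. It assumes the open-source Hugging Face transformers library and its default pre-trained sentiment model; this is an illustrative choice, not a tool named in this article.

    # Minimal sketch: inferring sentiment from the words a person uses.
    # Assumes the Hugging Face "transformers" library; the default pre-trained
    # sentiment model is downloaded on first use (illustrative choice only).
    from transformers import pipeline

    # General-purpose sentiment classifier: returns a POSITIVE/NEGATIVE label
    # and a confidence score for each input text.
    classifier = pipeline("sentiment-analysis")

    utterances = [
        "I absolutely love how easy this machine is to operate!",
        "Great, the excavator broke down again.",  # sarcasm: words alone can mislead
    ]

    for text in utterances:
        result = classifier(text)[0]
        print(f"{text!r} -> {result['label']} ({result['score']:.2f})")

As the second utterance shows, analysing only the words misses tone and facial cues, which is exactly the gap described in the next paragraph.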

Not every machine can measure physiological signals, because that requires dedicated sensors. All the other signals, however, can be picked up with a camera and a microphone. Speech and facial expressions in particular carry many non-verbal cues, and these are very meaningful. Research results suggest that 55% of the message in a conversation is hidden in smiles, facial expressions and body signals such as a shrug of the shoulders, 38% in the tone of voice and only 7% in the literal meaning of the words. Previous software solutions for speech analysis therefore neglect most of the message; they identify only the words themselves. A smartphone with speech recognition, for example, currently cannot tell whether a spoken sentence ends in an exclamation mark or a question mark.

But companies working with Artificial Intelligence are learning fast. Some of them want to assess the emotional impact of advertising spots, which becomes possible when the laptop camera is switched on while the viewer watches an advertising video. It should not take much more research time until a computer really “empathises” with us. Experts already point out that an ethical discussion could then arise: does the computer have feelings, and does it have rights?
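
The camera-based approach mentioned above can be sketched in a few lines as well. The example below reads frames from a laptop webcam and estimates the dominant facial emotion per frame; the open-source fer package and OpenCV are assumptions made for illustration, not the tools used by the companies in question.

    # Minimal sketch: estimating a viewer's emotions from webcam frames while
    # an advertising video plays. The "fer" facial expression package and
    # OpenCV are illustrative assumptions, not tools named in the article.
    import cv2
    from fer import FER

    detector = FER()                  # pre-trained facial expression model
    capture = cv2.VideoCapture(0)     # default laptop webcam

    try:
        for _ in range(100):          # sample a limited number of frames
            ok, frame = capture.read()
            if not ok:
                break
            faces = detector.detect_emotions(frame)   # one entry per detected face
            if faces:
                emotions = faces[0]["emotions"]       # e.g. {"happy": 0.93, ...}
                dominant = max(emotions, key=emotions.get)
                print(f"dominant emotion: {dominant} ({emotions[dominant]:.2f})")
    finally:
        capture.release()

Aggregating such per-frame estimates over the length of a spot would give an advertiser a rough emotional profile of the viewer's reaction.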
