09 March 2020
Artificial intelligence has learned to recognize human emotions

Voice bots can already recognize and generate speech, hold a dialogue with a person, and even write poems. But this is not enough: they need to read the user’s mood in order to offer the right services, assess their emotional state, and help if necessary.

Research shows that people are far more comfortable communicating with a bot capable of sympathy. Giving algorithms a high level of emotional intelligence has therefore become a priority for many AI companies; after all, it is businesses that are most interested in bots with a developed EQ.

The market for emotion recognition systems is estimated at $21.6 billion in 2020 and is expected to double by 2024. Gartner analysts are confident that by 2022 every tenth device will have an “emotional diagnosis” function.

Titans of the IT industry have already achieved tangible success in “humanizing” bots. Facebook plans to equip its Portal video-calling device with an emotion recognition function, and Amazon has already released the Halo fitness bracelet, which analyzes the user’s voice and identifies their emotions.

However, companies will not limit themselves to voice technology alone. Robots will learn to recognize people’s moods from gestures, facial expressions, gait, and text messages on social networks. For example, developers at Life ASAPA have created a program that analyzes a person’s gait and infers their mood.

The most serious obstacle to developing EQ in machines is the complexity of the data. Emotions are subjective and extremely difficult to interpret, so training bots becomes a long process of assembling relevant data sets. Even humans do not always identify emotions correctly, so high precision cannot yet be expected from artificial intelligence in this regard.

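To illustrate why assembling such data sets is slow, here is a minimal, hypothetical sketch of one common step: merging the subjective labels of several human annotators into a single training label by majority vote and discarding clips where the annotators cannot agree. The clip names and labels are invented for illustration.

```python
# Hypothetical sketch: aggregating subjective emotion labels from
# several annotators into one training label by majority vote.
from collections import Counter

# Invented example: three annotators label each audio clip.
annotations = {
    "clip_001": ["positive", "positive", "neutral"],
    "clip_002": ["negative", "neutral", "negative"],
    "clip_003": ["positive", "negative", "neutral"],  # no agreement
}

for clip, labels in annotations.items():
    label, votes = Counter(labels).most_common(1)[0]
    if votes > len(labels) // 2:   # require a strict majority
        print(f"{clip}: keep, label = {label}")
    else:                          # annotators disagree
        print(f"{clip}: discard and re-annotate")
```

Even this toy example loses a third of its clips to disagreement, which is one reason building large, reliable emotion corpora takes so long.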
Modern neural networks have already learned to recognize three basic types of emotion (positive, negative, and neutral) from a speaker’s voice with 95% accuracy. The algorithm works correctly with any speech except whispers.

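As a rough illustration of how such a three-class voice classifier could be assembled, here is a minimal sketch that summarizes each audio clip as averaged MFCC features and fits a simple model. The file names, the MFCC features, and the logistic-regression classifier are assumptions for illustration; the article does not describe the actual system.

```python
# Hypothetical sketch: three-class speech emotion recognition
# (positive / negative / neutral) from averaged MFCC features.
import numpy as np
import librosa
from sklearn.linear_model import LogisticRegression

def extract_features(path: str) -> np.ndarray:
    """Load a clip and summarize it as mean MFCC coefficients."""
    signal, sr = librosa.load(path, sr=16000)   # resample to 16 kHz
    mfcc = librosa.feature.mfcc(y=signal, sr=sr, n_mfcc=13)
    return mfcc.mean(axis=1)                    # one 13-dim vector per clip

# Invented training data: (audio file, emotion class) pairs.
train = [
    ("happy_01.wav", "positive"),
    ("angry_01.wav", "negative"),
    ("calm_01.wav", "neutral"),
]

X = np.array([extract_features(path) for path, _ in train])
y = [label for _, label in train]

model = LogisticRegression(max_iter=1000).fit(X, y)
print(model.predict([extract_features("unknown.wav")]))
```

A production system would train a deep network on thousands of labeled clips; whispered speech carries little pitch or energy information, which may be why it remains the hard case.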
It is predicted that by 2024 half of online advertising will rely on emotion recognition. Emotional technologies allow brands to respond more precisely to customer requests. For example, Disney already uses its streaming platform to determine from viewers’ facial expressions whether they like the content, and financial conglomerate Ping An claims to have reduced loan losses by 60% thanks to new EQ algorithms.
