The last twenty years have made science far more open. It is one of the great breakthroughs of the Internet era and of human development in general. In particular, I would like you to think of fields such as psychiatry, neuroscience, the less popularized areas of psychology, and the technical disciplines.

Firstly, they are in fashion now (as you might have noticed); secondly, they are interconnected. For many years these fields remained hidden from lay eyes, until the recent times of the World Wide Web.

However, the Internet era has not only given us a chance to glance into those scientific fields; it has also sped up their development and their interaction. The reason for the latter lies in our endless desire to create intelligent machines capable of feeling and empathizing, a desire that arose with the first computers and has grown impressively since the dot-com bubble.

But have you ever stopped to think about the kind of intelligence you expect from these machines? To answer that, let's first look at the kinds of intelligence found in humans.

Emotional Intelligence

Nowadays we can roughly divide human intelligence into social, emotional, and cognitive parts (the last being the most popularized, long regarded as the most important "general intelligence" and for a long time represented by IQ alone).

You can also come across other classifications that distinguish more types depending on the particular area of human development. They are useful in specific cases, but not for us right now. In the rest of the article, we will rely on the classification with only the three types mentioned above.

The youngest of the three is emotional intelligence (EI). The first mentions of EI appeared only in 1990, when John Mayer and Peter Salovey proposed a model of EI that included the abilities to express, control, and evaluate emotions and to use them in other cognitive activities.

Before that, scientists used to attribute all the manifestations of what we now call EI to social intelligence. This is understandable, since our social interactions are driven by the emotional side, though not fully described by it. All the types are tightly interconnected and always work together. That is why evaluating intelligence is so hard: we lack the tools to measure all of them as a whole.


Back in 1990, a high-school student stabbed his physics teacher, later explaining that he had wanted to dispute a grade that had not met his expectations. Other students and teachers described the boy as a smart one who did great in academic subjects; he was aiming to enter Harvard University. When he received a "B", he decided to show how unfair it was. He took a butcher knife and went to his teacher.

He also noted later that he had been about to commit suicide because of that grade. The judge found him not guilty by reason of a psychotic condition and released him from all charges. He was then transferred to another private school and graduated with a GPA of 4.6, later earning a BS in Biology and an MS in Medicine; for more than 10 years he has been working as an internist in a medical practice that has received a lot of negative feedback from patients.

What do you think about this guy's IQ? It might be quite high, but how would you rate his EQ (if you could)? Differently, right? Would that difference affect how intelligent you imagine him to be? I bet it would.

There have been a number of attempts to trace how a high IQ influences career achievement. After studying the career paths of hundreds of former students, researchers concluded that the highest academic results rarely lead to the most successful careers; those who become top managers often have an IQ no higher than 100.

The reason is that being at the top implies much more than being able to succeed in science, for example. Here emotional and social intelligence come in: they determine adaptability, self-awareness, the ability to control and analyze emotions and thus to understand human behavior, and probably thousands of other traits, all of which make people either successful or not.


As interest in EI began to grow, the study of the particular brain structures attributed to the emotional side of our brain received a fresh impulse. Scientists are now actively studying EI from the standpoint of anatomy and physiology, and they still find it difficult even with innovative research tools.

The problem is the distributed structure of the so-called emotional brain. Of course, every medical student begins their journey into brain anatomy by getting to know a small area inside the cerebrum called the limbic system, with the thalamus, hypothalamus, amygdala, and hippocampus as its main structures.

It is considered the core of the emotional brain. But the tricky point is that all the functions and responses our brain is in charge of are performed by different regions acting together. Each complex system responsible for a particular type of function is therefore represented by distributed structures working in synergy. For some brain functions (such as vision and hearing) the arrangement is more straightforward and therefore more familiar to scientists.

However, emotional intelligence, and how it influences our physical body, still holds a lot of unknown unknowns that humanity is eager to reduce. That is why EI is now even considered a crucial component of medical education.


Now let's move on to our machines and their intelligence. Given the importance of EI in the overall picture of intelligence we have observed so far, can the intelligence machines possess be considered true intelligence? Probably not, until we bring an emotional component to AI. From here on we will look closely at the approaches to achieving this.

So, why do we need our computers to acquire emotions?

  1. To become self-aware;
  2. To be able to criticize;
  3. To be able to understand human intentions;
  4. To grasp the relativistic nature of the human mind.

EI can serve as a natural connecting wire between the inanimate world of machines and their animate surroundings. Since we do not have this connection now, everything we feed our machines (to make them learn from it) has to be translated into the language of numbers. With EI, however, we would be able to show machines the features of the real world without having to adapt everything first.


There is a handful of APIs for emotion recognition from the following types of data: appearance (images, video, or real-time streams) and language (speech and text). Their work is mostly based on signal processing, machine learning, computer vision, natural language processing, and deep linguistic analysis, combined in whatever proportions fit the task best.

Let's look at the most interesting and convenient-to-use APIs.

Areas of application include:

  • creating smart, natural conversational interfaces for applications;
  • providing better customer service and support;
  • creating amazing VR experiences;
  • gathering and evaluating customer feedback more easily;
  • optimizing tools for testing, research, and analysis.

I. EMOTION RECOGNITION FROM APPEARANCE

Facial emotion recognition systems are built on top of image and face recognition systems: they detect special facial points (both deformable and non-deformable, based on Paul Ekman's concept of "action units") and analyze their relative positions.

To handle this task, it is common to use deep learning techniques (a branch of machine learning based on multiple processing layers combined into complex artificial neural network (ANN) architectures), particularly convolutional and recurrent neural networks.
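
To make the idea a bit more concrete, here is a minimal sketch of such a convolutional classifier in Python with TensorFlow/Keras. The input size, the seven emotion classes, and the random training data are assumptions made purely for illustration; a real system would be trained on a large labeled dataset of face crops and usually sits behind one of the APIs listed below.

```python
import numpy as np
import tensorflow as tf

# Assumed setup: 48x48 grayscale face crops and 7 emotion classes
# (angry, disgust, fear, happy, sad, surprise, neutral).
NUM_CLASSES = 7
INPUT_SHAPE = (48, 48, 1)

# A small convolutional network: stacked conv/pooling layers extract
# local facial features, a dense head maps them to emotion scores.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=INPUT_SHAPE),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Placeholder random data stands in for a real labeled face dataset.
x_train = np.random.rand(100, 48, 48, 1).astype("float32")
y_train = np.random.randint(0, NUM_CLASSES, size=(100,))
model.fit(x_train, y_train, epochs=1, batch_size=32)

# Predicted class index for a new (here, random) face crop.
probs = model.predict(np.random.rand(1, 48, 48, 1).astype("float32"))
print("predicted emotion class:", int(np.argmax(probs)))
```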


Affectiva’s facial emotion recognition

Examples worth trying:

  1. Affectiva (check this article to find out more about Affectiva).
  2. Microsoft Cognitive Services.
  3. Visage Technologies.
  4. nViso.
  5. Eyeris.
  6. Noldus.
  7. SkyBiometry.
  8. Face++.
  9. Kairos.
  10. SightCorp.
  11. iMotions.
  12. CrowdEmotion.
  13. MoodMe.
  14. Betaface.
  15. Emotion Recognition Task by Cambridge Cognition.
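
Most of these services are consumed as REST APIs: you send an image and receive emotion scores back as JSON. The snippet below is only a sketch of that general pattern using the requests library; the endpoint URL, headers, and response fields are hypothetical placeholders rather than the real contract of any vendor in the list, so consult the vendor's documentation before use.

```python
import requests

# Hypothetical endpoint and key; substitute the real values from the
# documentation of whichever service you choose from the list above.
API_URL = "https://api.example-emotion-service.com/v1/recognize"
API_KEY = "YOUR_API_KEY"

with open("face.jpg", "rb") as f:
    image_bytes = f.read()

response = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {API_KEY}",
             "Content-Type": "application/octet-stream"},
    data=image_bytes,
    timeout=10,
)
response.raise_for_status()

# Assumed response shape: a list of faces, each with per-emotion scores.
for face in response.json().get("faces", []):
    scores = face["emotions"]            # e.g. {"happiness": 0.92, ...}
    top = max(scores, key=scores.get)
    print(f"dominant emotion: {top} ({scores[top]:.2f})")
```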


II. EMOTION RECOGNITION FROM NATURAL LANGUAGE

To detect emotions and recognize speech patterns, we rely on NLP, computational linguistics, and deep linguistic analysis. On the machine learning side, typical approaches include Bayesian networks, SVMs, k-means clustering, hidden Markov models (HMMs), and ANNs. Even though achievements in this field are in great demand, progress is still difficult.

1) In speech:

  1. Good Vibrations.
  2. Vokaturi.
  3. Web Empath.
  4. EmoVoice.


An application created with Web Empath
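
For a rough idea of what happens inside such speech-based recognizers, here is a small sketch that turns audio clips into MFCC feature vectors with librosa and fits an SVM from scikit-learn on them. The file names, labels, and choice of classifier are assumptions for illustration only; the services listed above hide these steps behind a single call.

```python
import numpy as np
import librosa
from sklearn.svm import SVC

def mfcc_features(path, n_mfcc=13):
    """Load an audio file and summarize it as a fixed-size MFCC vector."""
    signal, sr = librosa.load(path, sr=16000)
    mfcc = librosa.feature.mfcc(y=signal, sr=sr, n_mfcc=n_mfcc)
    # Average over time so every clip yields a vector of the same length.
    return mfcc.mean(axis=1)

# Hypothetical labeled clips; a real corpus would have hundreds per emotion.
clips = [("clip_happy_01.wav", "happy"),
         ("clip_angry_01.wav", "angry"),
         ("clip_sad_01.wav", "sad")]

X = np.array([mfcc_features(path) for path, _ in clips])
y = [label for _, label in clips]

# A simple SVM over the MFCC summaries; production systems typically add
# richer features (pitch, energy, spectral statistics) and larger models.
clf = SVC(kernel="rbf")
clf.fit(X, y)

print(clf.predict([mfcc_features("clip_unknown.wav")]))
```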

2) In text:

  1. MoodPatrol.
  2. Watson by IBM.
  3. TextProcessing.
  4. Bitext.
  5. Synesketch.
  6. Toneapi.
  7. Receptiviti.
  8. Repustate.
  9. Alchemy by IBM.


Visual rendering of poetry by Synesketch
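
And as a rough illustration of the classical machine-learning approaches mentioned above (rather than of any particular API in the list), here is a tiny text emotion classifier that combines a TF-IDF representation with a linear SVM from scikit-learn. The sample sentences and labels are made up for demonstration.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Toy training data; a real system would use thousands of labeled examples.
texts = [
    "I am so thrilled about this, best day ever!",
    "This is wonderful news, thank you so much.",
    "I can't stop crying, everything went wrong.",
    "I feel completely hopeless and alone.",
    "How dare you do this to me, this is outrageous!",
    "I'm furious about the way we were treated.",
]
labels = ["joy", "joy", "sadness", "sadness", "anger", "anger"]

# TF-IDF turns each sentence into a weighted bag-of-words vector;
# the linear SVM then separates the emotion classes in that space.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
model.fit(texts, labels)

print(model.predict(["I'm delighted with the result",
                     "This makes me really angry"]))
```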

We have seen plenty of examples, each of which can be put to work right now and make our lives a bit easier today, not in some distant future. All of this leads to the conclusion that attempts to build emotional intelligence are driven not only by the ambition to create human-like artificial intelligence (even though that still makes sense from a research standpoint), but rather by the aim of optimizing and improving human-machine and human-human interactions in our everyday lives.

Obviously, we have not gone very far along this tricky road yet; however, "smart machines that know how you feel" are no longer waiting somewhere in the future, they are right here. Let's set aside our innate skepticism and face them.

RECOMMENDED READING:

WHAT THE MOST INTELLIGENT CHATBOTS LOOK LIKE

The main problem chatbot users face today is expectations that go far beyond what a given chatbot is really meant to do.

PROS AND CONS OF HUMAN-LIKE TECHNOLOGIES

Imagine you are entering your own home. At that very moment, the lights gradually turn on in every room you walk into, and the temperature adjusts to your physical needs.

FROM COMPLEXITY OF THE CHATBOTS TO THEIR APPLICATION

It is getting more and more evident that our times are about adapting to the rapid flow of technological progress. Things are getting more complicated on one side while freeing us from difficulties on another.

Growth Hacker and Sales Hacker, MVP builder, loves to run technology companies.