Every passing day brings news of another breakthrough in Artificial Intelligence. We read about the humanoid robot Sophia, Google's AI defeating the world's best players at Go, and much more. It is apparent that AI and machine learning together will be able to take over many tasks that follow rules and logic. Chatbots are already responding to humans, and VPAs (Virtual Personal Assistants) can handle complex travel bookings. What is still missing, however, is that these chatbots and VPAs do not yet understand human emotions. Emotion AI is about bringing emotion, or empathy, to artificial intelligence so that human-machine interaction becomes more human-like.
What is Emotion AI?
Developers and AI researchers are advancing artificial intelligence not only to create systems that can think and act, but also to detect and react to human emotions. This type of AI is called Emotion AI (or Emotional AI).
When humans communicate, they use several non-verbal cues such as voice tone, pitch, gestures, body language, and facial expressions. As we all know, the same sentence expressed differently can convey a very different message. Today's VPAs use Natural Language Processing to understand and process commands. If a VPA could also pick up the emotion behind a command, it could tailor its response to that emotion. This is exactly what various researchers and developers are now attempting.
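To make the idea concrete, here is a minimal sketch of emotion-conditioned responses in a VPA. The classifier here is a hypothetical stand-in (a toy keyword lookup); a real system would infer the emotion from voice tone or facial expression rather than text keywords.

```python
def detect_emotion(text: str) -> str:
    """Hypothetical emotion classifier: a toy keyword lookup.
    A real VPA would use acoustic or visual signals instead."""
    cues = {"ugh": "frustrated", "great": "happy", "hmm": "uncertain"}
    for cue, emotion in cues.items():
        if cue in text.lower():
            return emotion
    return "neutral"

def respond(command: str) -> str:
    """Choose a reply style based on the detected emotion."""
    replies = {
        "frustrated": "Sorry about that. Let me fix it right away.",
        "happy": "Glad to hear it! Booking your trip now.",
        "uncertain": "No rush. Would you like to see more options first?",
        "neutral": "Sure, booking your trip now.",
    }
    return replies[detect_emotion(command)]

print(respond("Ugh, the flight search failed again"))
# The same booking request gets an apologetic reply when the
# user sounds frustrated, and a neutral one otherwise.
```

The point is not the toy classifier but the branch: the same intent ("book a trip") yields a different reply depending on the detected emotional state.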
Affectiva appears to be a leader in this field and already offers an SDK and API to detect various emotions from a real-time camera feed, videos, or images. In fact, their API can also estimate the age range and ethnicity of the subject, which could further personalize the overall communication between bots and humans. Affectiva helped Giphy build a search engine that finds GIFs by emotion, by automatically tagging them.
Apart from Affectiva, there are other players as well, e.g. Sanbot, Pepper, and Nviso. The technique typically involves using visual sensors to capture various metrics, then using emotion-detection software and AI to take a certain action. For example, the game Nevermind can detect your responses and emotions as you play and adjust its difficulty accordingly!
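The capture-then-react loop can be sketched in a few lines. The per-frame emotion scores below are simulated inputs; in a real game like Nevermind they would come from an emotion SDK analyzing camera or biofeedback data. The threshold values and the 0.1 step size are illustrative assumptions, not the game's actual tuning.

```python
def adjust_difficulty(difficulty: float, fear_score: float) -> float:
    """Ease off when the player shows stress, ramp up when calm.
    fear_score is in [0, 1]; difficulty stays clamped to [0, 1]."""
    if fear_score > 0.7:      # player visibly stressed: ease off
        difficulty -= 0.1
    elif fear_score < 0.3:    # player calm: raise the challenge
        difficulty += 0.1
    return max(0.0, min(1.0, difficulty))

# Simulated stream of per-frame fear scores from an emotion detector
difficulty = 0.5
for fear in [0.9, 0.8, 0.2, 0.1]:
    difficulty = adjust_difficulty(difficulty, fear)
print(round(difficulty, 1))  # back to 0.5 after two stressed, two calm frames
```

The same pattern (sensor metric in, behavioral adjustment out) generalizes beyond games to any emotion-aware interface.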
I found this quote very apt:

We are approaching an era when artificial intelligence uses humans as organic robots to realize its goals.