While great advances are being made in the analytical capabilities of computer systems, impressive progress is also being made in making computers more emotionally intelligent. This field is known as Affective Computing, defined as the study and development of systems and devices that can recognize, interpret, process, and simulate human emotions (or affects).
These developments are being driven by a need for more natural human-computer interactions, but there are also many examples where affective computing technology is augmenting our own abilities and enabling us to become more emotionally intelligent.
We decided to take a closer look at the field of affective computing and pick out the leading companies. Profiles of all the companies are shown below, and you can find more companies by searching "affective computing" on VentureRadar.
Humanyze’s social sensing and analytics platform, developed at MIT, enables companies to quantify social interactions that were previously unmeasurable. This information can be leveraged to enhance teamwork and employee engagement, improve processes, and plan for growth. Humanyze has created a sensor-laden badge that transmits data on speech, activity, and stress patterns. Microphones and proximity sensors help employers understand what high-performing teams are doing differently compared to less effective ones.
Drive.ai is a Silicon Valley start-up founded by former lab mates from Stanford University’s Artificial Intelligence Lab. The company is creating deep-learning AI software for autonomous vehicles. Drive.ai wants its autonomous vehicles not only to replicate the driving part of the human driving experience, but also its communicative aspect. The system will include a roof-mounted exterior communication device, which will use written cues, as well as more language-independent signs such as emoji, to communicate the vehicle’s intent to those around it.
Beyond Verbal’s patented technology analyzes emotions from a speaker’s voice in real-time, as they speak. Their API can be integrated into a number of apps and devices. The technology does not analyze the context or content of conversations, nor does it record a speaker’s statements; instead, it detects many different signs in a speaker’s voice that indicate, for example, that they are anxious, well-rested, agreeable, or angry. Applications include identifying a speaker’s complex emotional state in settings such as call centers, and matching people on dating sites based on emotional state. The company is also expanding its technology to detect diseases, such as heart disease, from the user’s voice.
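To make the idea of content-free vocal analysis concrete, here is a minimal sketch of the kind of acoustic features such systems build on. Everything here is illustrative: the feature choices and thresholds are toy assumptions, and none of this reflects Beyond Verbal’s actual API or models.

```python
# Toy acoustic-feature extractor: per-frame energy and zero-crossing rate are
# crude proxies for vocal arousal, independent of what is actually said.
import numpy as np

def frame_signal(signal, frame_len=400, hop=200):
    """Split a 1-D waveform into overlapping frames."""
    n = 1 + max(0, (len(signal) - frame_len) // hop)
    return np.stack([signal[i * hop : i * hop + frame_len] for i in range(n)])

def prosodic_features(signal):
    """Return per-frame RMS energy and zero-crossing rate for a waveform."""
    frames = frame_signal(signal)
    rms = np.sqrt((frames ** 2).mean(axis=1))                     # loudness proxy
    zcr = (np.diff(np.sign(frames), axis=1) != 0).mean(axis=1)    # pitch proxy
    return rms, zcr

# Synthetic comparison: a loud, higher-pitched tone vs. a quiet, low one
t = np.linspace(0, 1, 8000)
calm = 0.1 * np.sin(2 * np.pi * 120 * t)       # quiet, low-pitched
excited = 0.9 * np.sin(2 * np.pi * 300 * t)    # loud, higher-pitched
```

On this synthetic pair, both proxies come out higher for the "excited" signal, which is the sort of content-free cue a real voice-analytics pipeline would feed into a trained classifier.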
Kairos is a Human Analytics Platform for Developers, allowing users to capture data that measures people’s feelings and interactions. The company’s APIs and SDKs make it easy to integrate face analysis into any mobile or web application, helping to understand how people feel as they interact with content, products, and the real world. Applications include: Advertising; Healthcare; Time & Attendance; Online Education; Cars & Automotive.
NuraLogix has developed patent-pending technology for detecting hidden emotions. The company’s Transdermal Optical Imaging™ (TOI™) technique utilizes a conventional video camera to extract facial blood flow information from the human face. Applying advanced machine learning algorithms and neuroscience, the company is able to use this information to model and detect hidden/invisible human emotions regardless of the presence or absence of facial expressions. Application areas include: Marketing; Security (deception detection); Medicine; and AI (Artificial Intelligence). NuraLogix was founded by Professor Kang Lee of the University of Toronto’s Ontario Institute for Studies in Education (OISE).
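The general idea behind recovering physiology from ordinary video can be sketched with a minimal remote-photoplethysmography (rPPG) example: subtle frame-to-frame colour changes in a face region carry a pulse signal. This is a generic illustration of the principle only; NuraLogix’s actual TOI™ pipeline is proprietary and far more sophisticated.

```python
# Minimal rPPG sketch: average the green channel over a face region per frame,
# then find the dominant frequency in the plausible heart-rate band.
import numpy as np

def pulse_from_video(frames, fps):
    """frames: (T, H, W, 3) float video of a face region.
    Returns the dominant pulse frequency in beats per minute."""
    green = frames[..., 1].reshape(len(frames), -1).mean(axis=1)  # mean green per frame
    green = green - green.mean()                                  # remove DC component
    spectrum = np.abs(np.fft.rfft(green))
    freqs = np.fft.rfftfreq(len(green), d=1.0 / fps) * 60         # in beats per minute
    band = (freqs >= 40) & (freqs <= 180)                         # plausible heart rates
    return freqs[band][np.argmax(spectrum[band])]

# Synthetic test: a face region whose green channel pulses at 1.2 Hz (72 bpm)
fps, T = 30, 300
t = np.arange(T) / fps
frames = np.full((T, 8, 8, 3), 120.0)
frames[..., 1] += 2.0 * np.sin(2 * np.pi * 1.2 * t)[:, None, None]
print(round(pulse_from_video(frames, fps)))  # 72 bpm
```

Real systems must additionally handle motion, lighting changes, and skin-region segmentation, which is where the machine learning comes in.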
When two people interact, speech carries only part of the message; the rest is delivered via body language, poses, and gestures. gestigon’s middleware provides this body-language awareness to any device, enabling it to interpret explicit gestures (such as finger pointing) as well as anticipate the user’s needs by understanding implicit human behavior (such as scratching your head). Human-centric interfaces enable fundamentally new user experiences that can make communicating with technology intuitive, easy, and fun. gestigon has a special focus on embedded systems, such as smartphones, tablets, other mobile devices, automotive, and medtech.
iMotions’ software is designed to provide the most comprehensive, easy-to-use, and scalable biometric research platform on the market. It helps clients conduct human behavior research in the areas of Psychology, Neuroscience, Human Factors Engineering, Education, Health, Business and Human Computer Interaction. The iMotions software integrates biosensors and synchronizes eye tracking, facial expression analysis, EEG, GSR, EMG, ECG and Surveys in one unified software platform. The platform, which is targeted at Market, Academic, Usability and Gaming research, is used worldwide by leading universities such as Harvard, Yale and Stanford as well as corporations such as P&G, S&P and Nestle.
Affectiva’s solutions provide insights into consumers’ emotional engagement with anything from digital content to brands, advertising, movie trailers, and TV programs. Spun out of MIT Media Lab, Affectiva also enables developers to add emotion sensing and analytics technology to their own apps and digital experiences. The company’s emotion data repository consists of more than 3.9 million faces analyzed from over 75 countries, amounting to over 40 billion emotion data points. This data fuels the training and testing of its classifiers.
Receptiviti’s natural language analytics tools help organizations gain an understanding of their people and audiences. Receptiviti provides technical users, developers and data scientists with an API that enables them to integrate NLP-based analysis of psychology, personality, thinking style, authenticity and more. Receptiviti enables bot makers and AI technologists to use these insights to guide actions, communication styles and build stronger relationships and user dependencies. Receptiviti enables AI platforms with emotional intelligence by analyzing natural language, tweets, email, IM, chat and voice.
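Language-based psychology analysis of this kind often starts from closed-vocabulary word counting: score a text by the share of its words that fall into psychologically meaningful categories. The sketch below is a toy version of that general approach; the category word lists are invented for illustration and are not Receptiviti’s actual lexicon or API.

```python
# Toy word-category profiler: score a text by the fraction of its words
# belonging to each (invented, illustrative) psychological category.
import re
from collections import Counter

CATEGORIES = {
    "positive_emotion": {"happy", "great", "love", "excited", "good"},
    "negative_emotion": {"sad", "angry", "hate", "worried", "bad"},
    "first_person": {"i", "me", "my", "mine", "myself"},
}

def profile(text):
    """Return the share of words falling in each category (0.0 to 1.0)."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter()
    for w in words:
        for cat, vocab in CATEGORIES.items():
            if w in vocab:
                counts[cat] += 1
    total = len(words) or 1
    return {cat: counts[cat] / total for cat in CATEGORIES}

scores = profile("I love my job, I am so happy and excited")
# first-person and positive-emotion rates dominate for this sentence
```

Production systems layer much larger validated lexicons and statistical models on top, but the category-rate representation is the common starting point.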
Koko / USA / Founded 2015 / Emotional support as a service
Koko provides emotional support as a service for any product, including chatbots, voice assistants, and online communities. Koko uses its crowdsourced data – a massive repository of “human kindness” – to unlock new forms of artificial, emotional intelligence. The company uses the output of its peer-to-peer system to give machines the ability to provide nuanced, empathetic support.
Emoshape develops a microchip that enables an emotional response in AI, robots, and consumer electronic devices. The company says the most innovative aspect of the Emoshape microcontroller is its real-time appraisal computation and Emotional Profile Graph (EPG) functionality, allowing an AI or robot to experience 64 trillion distinct emotional states. The technology has applications in self-driving cars, personal robotics, sentient virtual reality, affective toys, IoT, pervasive computing, and other major consumer electronic devices. Applications include human-machine interaction, emotional speech synthesis, emotional awareness, machine emotional intimacy, AI personalities, machine learning, affective computing, medicine, advertising, and gaming.
Empatica designs and develops “the world’s smallest and most accurate” wearable device for medical research into human behavior in daily life. The company’s E4 wristband is a wearable wireless device designed for continuous, real-time data acquisition in daily life; through its set of five sensors it monitors signals including autonomic nervous system disruption and heart rate variability. The company’s Embrace watch monitors physiological stress, arousal, sleep and physical activity. Over 135 leading hospitals, universities and companies use Empatica’s wearables, including Boston Children’s Hospital, Stanford, MIT, Yale, NASA, Microsoft Research and Intel.
CrowdEmotion enables smart devices to capture and index engagement, emotions, and body language. The company blends academic thinking in emotion recognition with its cloud-based sentiment software to unlock human understanding. MeMo is a two-way video format that responds to body language to serve highly personalised content, provide emotional discovery, and understand engagement. The company’s CloudEmotion API allows users to capture, quantify, and interpret emotion data. Users can track facial expressions, listen for arousal and stress, or work with CrowdEmotion to experiment with new biometrics.
Feel is a wearable wristband that leverages proprietary algorithms to recognize and track human emotions throughout the day. At the same time, the mobile application provides actionable recommendations based on advanced psychological techniques to help users develop positive emotional habits and achieve wellbeing. Feel consists of three parts: the wristband, the emotion recognition algorithms, and the mobile application.
nViso provides scalable, robust, and accurate artificial intelligence solutions to measure consumers’ instantaneous emotional reactions in online and retail environments. Using award-winning artificial intelligence and proprietary deep-learning 3D Facial Imaging technology compatible with ordinary webcams, nViso uncovers the “why and how” of customer behaviour in real time, letting brands make smarter business decisions. The company provides real-time, actionable information for Market Research, Brands, Creative Agencies and R&D Product Development.