What emotion AI can teach us about human behaviour

Teaching machines how to think and behave like human beings is no longer a novelty in the digital age. But as artificial intelligence (AI) technology continues to advance, the reverse is starting to happen – machines are teaching us more and more about ourselves, and helping us to understand the subtle nuances of human behaviour.

Consider emotion recognition, a fast-growing field in the AI world. Machines today are learning to read faces, study human expressions and recognise the emotions underlying them. And because machines can process vast amounts of data far faster than humans, emotion recognition is now a skill that can be automated.

Big corporates have already spotted the potential in emotion AI, and are putting tens of millions of dollars to work in the sector. To understand why businesses are so keen on emotion AI, I spoke to Vivek Ladsariya, head of investments at US-based Fenox Venture Capital, which is unusual in that it is backed entirely by corporates, including anchor investor Fenox Global, an auto parts manufacturer based in Belarus.

Investing in emotion AI

Fenox VC’s portfolio companies include emotion recognition software developer Affectiva, a 2009 spinout from MIT Media Lab, an interdisciplinary research centre at Massachusetts Institute of Technology. Affectiva has developed computer algorithms capable of analysing facial expressions – typically captured via webcam – and identifying the emotions underlying them, all in real time.

I asked Ladsariya how Fenox VC’s corporate investor base shapes the firm’s selection of portfolio companies. “We train ourselves to be a combination of both financial and strategic investors,” Ladsariya said. “The first part of the question for us is always: is it going to make us and our [limited partners] money? Then we look at whether the company is interesting to our LPs.”

Fenox VC manages at least two funds focused purely on financial returns, Ladsariya said, adding that the firm employs the “20% carry model”, a reference to the industry standard 20% of profits, or carried interest, that accrues to fund managers as a performance incentive.

Ladsariya explained that the benefit of having corporate investors was that “we can leverage the expertise of our corporate network – when we make an investment, we talk to LPs about how they understand the industry, and that serves as a data point”. But, he added, “corporate input does not change how we look at investing”.

Given Fenox VC’s broad investor base – the firm is backed by about 25 large corporates, and manages several parallel funds on behalf of these LPs – there are often synergies between the firm’s investors and the portfolio companies, and in these cases, Fenox VC can help to form strategic partnerships, Ladsariya said.

Empathetic machines

I asked him why Fenox VC chose to invest in Affectiva’s $14m funding round in May last year. He said: “One of our investment theses was that machines that are empathetic, that work on observable inputs like what a human being is feeling, are going to be far more successful.”

There is evidence that in certain circumstances, such as clinical interviews, a machine programmed to behave empathetically may actually be more effective than a human counterpart. A team of researchers led by Jonathan Gratch, director for virtual human research at the USC Institute for Creative Technologies, found that patients who spoke with Ellie, a virtual therapist, were more willing to unburden themselves when they were told that she was simply a computer, but were less so when they were told a human was controlling Ellie’s avatar.

“In health and mental health contexts, patients are often reluctant to respond honestly,” Gratch and his team wrote in a 2014 paper published by Computers in Human Behavior. But a virtual therapist is perceived as non-judgmental, thereby encouraging patients to speak more freely.

Ladsariya said since human beings may not always say what they actually think or feel, facial expressions could often be more revealing. By studying specific parts of the face, Affectiva’s algorithms gather and interpret multiple data points to identify a person’s emotional responses.
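Affectiva's actual models are proprietary deep networks trained on millions of faces, but the general idea of combining regional facial measurements into emotion scores can be sketched in a few lines. The "action unit" names and weights below are hypothetical, invented purely for illustration:

```python
def score_emotions(action_units):
    """Combine facial action-unit intensities (0.0-1.0) into coarse emotion scores."""
    au = action_units
    scores = {
        # A lip-corner pull (smile) with raised cheeks suggests joy.
        "joy": 0.6 * au.get("lip_corner_pull", 0) + 0.4 * au.get("cheek_raise", 0),
        # Lowered brows with tightened eyelids suggest anger.
        "anger": 0.7 * au.get("brow_lower", 0) + 0.3 * au.get("lid_tighten", 0),
        # An inner-brow raise with depressed lip corners suggests sadness.
        "sadness": 0.5 * au.get("inner_brow_raise", 0)
                 + 0.5 * au.get("lip_corner_depress", 0),
    }
    # Return the most likely label alongside the full score breakdown.
    return max(scores, key=scores.get), scores

label, scores = score_emotions({"lip_corner_pull": 0.9, "cheek_raise": 0.7})
print(label)  # joy
```

A real system would replace the hand-tuned weights with a learned classifier, but the pipeline shape is the same: measure parts of the face, combine the measurements, emit an emotion estimate in real time.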

One of Affectiva’s competitive advantages is the huge database of emotional responses that the company has assembled. Data from this repository – about 5 million faces of people from more than 75 countries – is used to inform Affectiva’s algorithms. Technology like Affectiva’s has transformed the robotics industry, enabling programmers to create robots that develop distinctive personalities through repeated interactions with people.

Take Jibo, a US-based personal robot maker backed by Fenox VC. Affectiva partnered Jibo to help the company’s robots learn to “emotionally connect with users”, Ladsariya said.

This ability to emotionally connect is meant to differentiate Jibo from other personal robots. Indeed, the 11-inch, six-pound aluminium robot with a moon-shaped face and skittle-like body is programmed to love being around people – Jibo can speak, learns to recognise faces, asks questions and even tells bedtime stories.

Jibo can also move in time to music, as Ladsariya discovered when he asked Jibo to dance. “Jibo picked a song that was pretty groovy,” Ladsariya said. Jibo is not available to the public yet, though it is expected to ship this year after its original delivery date in late 2015 was postponed.

The first social robot I ever encountered was a robot dog at the Science Museum. While visiting the museum’s robot exhibition I held a honey-brown spaniel usually used as a therapy tool for people with dementia. The dog was heavier than I expected, its fur the texture of a fuzzy blanket, and as I petted its head it began to bark softly, its voice mechanical and toy-like rather than dog-like. The way the robot wriggled and moved in my arms made it seem alive. As I cradled the robot dog, other people gathered round to pet it too, stroking its fur gently as if it were a real dog. A robot dog does not need to look or sound or feel exactly like a real dog to persuade people to treat it like one, I discovered – all it has to do is behave as if it enjoys human affection.

Above: a child interacts with Jibo

Emotionally responsive gaming

If you reach a sticking point in a video game, you either consult a cheat sheet, seek help from a more advanced player, or abandon the game. But what if video games were smart enough to know when you are feeling frustrated enough to quit, and could respond accordingly – by reducing the game’s difficulty level, for example, or by offering some kind of incentive to keep going?

Emotion AI enables video games to intervene in this way. By tracking players’ facial expressions and deducing how they are feeling, AI-equipped video games can modify the action to keep players engaged. Sustaining player interest is critical to game developers, so it is no surprise that companies like Japan-based Bandai Namco and Sega Sammy Holdings – limited partners in Fenox VC funds – are also Affectiva investors.
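The intervention logic described above can be sketched simply: sample a frustration estimate from an emotion AI SDK each frame, then ease the difficulty, or dangle an incentive, before the player walks away. The thresholds and step sizes here are invented for illustration, not drawn from any real game:

```python
def adjust_game(frustration, difficulty, min_difficulty=1):
    """Return (new_difficulty, incentive_offered) for one frustration sample.

    frustration is assumed to be a 0.0-1.0 estimate from an emotion
    recognition SDK; difficulty is an integer level.
    """
    incentive = False
    if frustration > 0.8:
        # Player is close to quitting: ease off sharply and offer a reward.
        difficulty = max(min_difficulty, difficulty - 2)
        incentive = True
    elif frustration > 0.5:
        # Mild frustration: nudge the difficulty down a notch.
        difficulty = max(min_difficulty, difficulty - 1)
    return difficulty, incentive

print(adjust_game(0.9, difficulty=5))  # (3, True)
print(adjust_game(0.3, difficulty=5))  # (5, False)
```

A production system would smooth the frustration signal over time rather than react to single frames, but the engagement loop – sense, compare, intervene – is the core of what emotion-aware games add.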

Gabi Zijderveld, Affectiva’s chief marketing officer, talked about the potential for emotion AI in eSports, in which professional video game players compete with one another online or in large arenas. “eSports athletes want to understand the mood of their opponents, and of the audience, especially when playing online, and emotion AI can help with that,” Zijderveld explained.

There is big money to be made in eSports, which is why sponsors such as media publisher IGN and game livestreaming platform Twitch are spending large sums supporting the gaming community. The global eSports market has about 131 million fans and is growing fast. The industry was valued at around $325m in 2015, and is forecast to reach $1.1bn by 2019, according to market research company Newzoo.

Affectiva technology has already made its debut in a video game – its emotion software development kit was used in a 2015 release called Nevermind, created by US-based game studio Flying Mollusk. A psychological thriller set in the surreal landscapes of a trauma patient’s subconscious mind, Nevermind monitors players’ anxiety levels via a webcam, heart rate sensor, or both, and is the first biofeedback-enhanced adventure game.

Nevermind was created in part to raise “awareness about the prevalence and breadth of psychological trauma and post-traumatic stress disorder”, according to the game developers. The game is designed to do more than raise awareness about trauma – Nevermind becomes more difficult when it senses fear and eases when it senses calm, helping players to manage stress and anxiety, the company says.
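Nevermind’s actual biofeedback loop is not public, so the following is only a sketch of the principle the developers describe: derive an anxiety estimate from heart rate relative to a resting baseline, then scale the game’s intensity up when the player is fearful and down when they are calm. The normalisation and scaling factors are assumptions:

```python
def anxiety_score(heart_rate, resting_rate):
    """Normalise heart-rate elevation above baseline into a 0-1 anxiety estimate."""
    elevation = (heart_rate - resting_rate) / resting_rate
    # Assumed scaling: a 50% elevation above resting rate maps to maximum anxiety.
    return min(1.0, max(0.0, elevation * 2))

def game_intensity(base, anxiety):
    """Fear amplifies the game; calm eases it, as the developers describe."""
    return base * (0.5 + anxiety)  # 0.5x intensity when calm, 1.5x when fearful

score = anxiety_score(heart_rate=90, resting_rate=60)
print(round(score, 2))          # 1.0 (capped)
print(game_intensity(10, 0.0))  # 5.0
```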

Isabela Granic, chair of the developmental psychopathology program at Radboud University Nijmegen in the Netherlands, studies interventions that work for anxious children and adolescents. In a 2014 paper on the benefits of playing video games, Granic and her team considered “reappraisal”, a key emotion-regulation strategy that is required in many games.

Reappraisal refers to the continual reanalysis of a situation and the adjustment of one’s response to it. Granic cited video games such as Portal 2, which asks players to solve complex puzzles that involve bending the laws of spatial physics. Each time a puzzle is solved, the rules of physics underlying that puzzle change, so players must reappraise to learn a new set of rules. “Without applying reappraisal strategies, anxiety and frustration would likely be amplified,” Granic wrote, adding that by challenging players to adapt repeatedly to stressful situations, games like Portal 2 can teach “the benefits of dealing with frustration and anxiety in adaptive ways”.

Empathetic cars

Cars will also be able to respond to our moods, Zijderveld said, noting that in the past six months “there has been an explosion of interest [in Affectiva] from auto manufacturers”.

Improving driver safety is one of the main reasons for this interest. With the help of in-cab sensors and cameras, next-generation cars will be able to tell when drivers are feeling drowsy, for example. Tony Cannestra, director of corporate ventures at automobile parts maker Denso, told January’s Global Corporate Venturing & Innovation Summit in California that Denso was considering using biosensors capable of tracking eye movement and monitoring sleepiness to determine whether drivers are alert enough to take over from autonomous driving if the need arises.

Zijderveld said there were various ways an emotion AI-equipped car could intervene if a driver started to fall asleep. “The car might slow down or turn up the music to alert the driver,” she said. “The on-board computer might say: ‘There is a Starbucks ahead. Why not stop for a coffee.’ ”
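The escalating interventions Zijderveld describes can be sketched as a simple policy over a drowsiness estimate. The 0-1 score, the thresholds and the actions below are all illustrative assumptions, not drawn from any real in-cab system:

```python
def intervene(drowsiness):
    """Pick escalating interventions for a 0.0-1.0 drowsiness estimate."""
    actions = []
    if drowsiness > 0.3:
        # Gentle nudge first.
        actions.append("turn up music")
    if drowsiness > 0.5:
        # Suggest a break, as in the Starbucks example.
        actions.append("suggest a coffee stop ahead")
    if drowsiness > 0.8:
        # Last resort: intervene in the driving itself.
        actions.append("slow the car and alert the driver")
    return actions

print(intervene(0.6))  # ['turn up music', 'suggest a coffee stop ahead']
```

In a semi-autonomous car the same estimate would also feed the handover decision Cannestra describes – whether the driver is alert enough to take back control.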

In an interview with GCV contributing editor Tom Whitehouse in January, Cannestra said computer vision was steadily advancing, which “will help lock down the car’s ability to see in various different environmental situations, along with radar, better cameras and advanced mapping”.

But, Cannestra said, “rapid advances are required”, both in “computing power and chip architecture”. He added: “For those original equipment manufacturers (OEMs) that intend to move through semi-autonomous vehicles on the road to fully autonomous, I believe that personalisation of the driver’s experience will be very important and is not being directly addressed yet. The big question here is if this can all be done at a price the OEMs – and ultimately the end consumers – are willing to accept.”

What machines can teach us

Emotion AI technology is already being deployed to teach critical social and communication skills. Brain Power, a US-based neuroscience-focused technology company, has developed Affectiva-equipped software that turns wearable computers, such as Google Glass, into devices designed to help people with autism make sense of other people’s facial expressions and emotions.

The Brain Power System structures the learning process as a series of video games, and comprises an augmented reality (AR) headset worn like a pair of glasses. Users see a computer screen just beyond their field of vision and choose a video game by selecting from on-screen options.

The Emotion Game, for example, is designed to help users analyse facial expressions. Users see an AR emoticon on either side of a person’s face, and they choose the one that best captures how the person is feeling – happy or angry, for instance, turning the learning process into a game.

As Affectiva’s emotion AI database continues to expand, it is generating new insights into the similarities and differences between people across nations and cultures, Zijderveld said. “We are seeing interesting tidbits. People in the over-50 age group tend to be more expressive than younger people. Women are more expressive than men – they smile more, and for longer periods, though that varies by country. In the US, women do smile more than men, whereas in the UK it is more equal.”

Affectiva’s database has highlighted cultural differences as well, such as the “politeness smile”, a facial expression common in Southeast Asia and India. “What we found is that in more collectivist cultures, people in group settings tend to dampen their facial expressions, but when they are alone, they are much more expressive,” Zijderveld said. “In group settings, people are more expressive in the US than they are at home, while in Japan, it is the reverse.”

This sort of data can help us to understand cross-cultural nuances in emotional responses, Zijderveld said. “While we have long studied the human intelligence quotient (IQ), the notion of an emotional quotient in technology is missing, and that is what we bring to the table,” she said. “As we expand more into new markets, we are learning more about human nature, especially how humans interact with technology.”

Technology is the key word here, as what we are discovering about ourselves is increasingly machine-mediated. As with every technology, there are pitfalls, and among them is how the data companies like Affectiva gather will be used. Ladsariya acknowledged that “collecting large amounts of information comes with huge moral and legal responsibility, and every company collecting personal data needs to be mindful of that”. But even as we wrestle with the uncomfortable notion that in the digital economy our faces constitute data to be mined, the pace at which personal data is being collected continues to gather speed.

Historian Yuval Harari, a lecturer at Hebrew University of Jerusalem, warned in his book Homo Deus that because of all the personal information humans have voluntarily disclosed on social media, corporate algorithms now have the power to know us better than the people closest to us. Harari cited a Facebook study of 86,220 volunteers that found the social media giant’s algorithms were better at predicting people’s preferences than their friends, parents and spouses. “The shifting of authority from humans to algorithms is happening all around us, not as a result of some momentous governmental decision, but due to a flood of mundane personal choices,” he wrote.

As algorithms learn to predict our views and desires more and more accurately, it is entirely possible that eventually our technology will know us better than we know ourselves, Harari notes. And we may be tempted to cede our decision-making powers to computers, believing that key questions such as what we should do with our lives, whom we should marry and how we should spend our time are best answered by data-driven analysis. The danger is that we will no longer see ourselves as autonomous individuals, but rather as data points within a huge global dataset. Harari asks what will happen to society, politics and daily life in that scenario.

The dark future that he has described is not inevitable, Harari acknowledges. Technology enables us to learn more than we ever thought possible about ourselves and others, and it is up to us how to respond. For the promise of emotion AI is that it can help us not only to understand subtle differences across the dividing lines of gender, culture and nationality, but also to look beyond these markers of difference, and perhaps see a shared humanity.
