Thanks to AI, Cars May Soon Be Able to Sense Your Emotions

September 9, 2020 - 7-minute read

Pretend you’re on your daily work commute. Stuck in stop-and-go traffic on your local highway, you start getting a little nervous: you have a big presentation at work today, and it could be the difference between a promotion and another year in the same role. As you ponder this make-or-break moment, another driver cuts you off while switching into your lane.

By this point, your emotions have come to a full boil; you slam your hand on the horn and shout a few expletives that are inaudible to anyone but yourself. Or so you think. Aware of your inner turmoil, your car’s onboard computer goes into “soothing” mode: calming music begins playing through the audio system as the air conditioner diffuses a relaxing lavender scent into the cabin.

Does this scenario sound weird? You’re not alone. But it may not be hypothetical for much longer. Thanks to the rapid development of artificial intelligence (AI), cars may soon come equipped with emotion-reading technology. And because the tech would tap into the car’s sensors, cameras, and microphone, this new possibility is raising quite a few data privacy concerns.

Giving AI Emotional Intelligence

Just as machine learning models can be trained to differentiate between pictures of cats, dogs, and humans, they can also learn to distinguish between tones of voice and facial expressions. In fact, there’s a whole research discipline dedicated to creating intelligent systems that can identify and respond to human emotions: affective computing.

In affective computing, AI is trained on large sets of labeled data, such as “tears = sad,” “smile = happy,” “laugh = amused,” and so on. This teaches it what various emotions look and sound like. If that’s not creepy enough, the most sophisticated of these systems can read the micro-expressions that flash across our faces before we even have a chance to control them.
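To make that concrete, here is a minimal sketch of the supervised-learning recipe in Python. It uses scikit-learn and a fabricated dataset of facial-measurement features; real systems like Affectiva’s train deep networks on millions of labeled video frames, but the core idea of learning a mapping from examples to emotion labels is the same.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Fabricated data: 300 faces reduced to 4 numeric features
    # (stand-ins for mouth curvature, brow position, eye openness, jaw tension).
    X = rng.normal(size=(300, 4))

    # Invented labeling rule so the model has something to learn:
    # a raised "mouth curvature" means happy, a furrowed "brow" means angry.
    y = np.where(X[:, 0] > 0.5, "happy",
                 np.where(X[:, 1] > 0.5, "angry", "neutral"))

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=0)

    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X_train, y_train)
    print("held-out accuracy:", clf.score(X_test, y_test))

    # A new face, reduced to the same 4 features, gets an emotion guess:
    print("prediction:", clf.predict([[1.2, -0.3, 0.1, 0.4]]))

The toy rule buried in this data stands in for the far messier patterns a production model has to learn from raw video.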

Boston-based developer Affectiva, a spinoff of the MIT Media Lab, focuses on affective computing. Its algorithms are trained on 9.5 million videos of people’s faces as they participate in activities such as conversing or reacting to stimuli. Those videos amount to approximately 5 billion facial frames, or roughly 500 frames (about 17 seconds of footage at 30 frames per second) per video. With this immense amount of data, Affectiva claims that its software can even account for cultural and gender differences in emotional expression.

Why Emotion-Reading AI Is Coming to the Auto Industry

Affectiva and numerous other companies in the affective computing space, such as Eyeris, Xperi, and Cerence, all have plans to partner with auto industry titans to install emotion-reading AI in new car models. Recent regulations in Europe and bills introduced in the US Senate are helping to normalize the concept of “driver monitoring” by focusing on the safety benefits these systems offer. For example, some systems can warn a driver who is falling asleep at the wheel or staring at a phone screen.
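Those safety checks are often less exotic than full emotion recognition. One widely published technique for drowsiness detection computes an “eye aspect ratio” (EAR) from six landmarks around each eye and raises an alert when the eyes stay closed across many consecutive frames. Here is a hedged sketch with simulated landmarks; the threshold and frame counts are illustrative, not taken from any shipping system.

    import math

    def eye_aspect_ratio(p1, p2, p3, p4, p5, p6):
        # Six (x, y) landmarks around one eye: p2/p6 and p3/p5 are the
        # vertical pairs, p1/p4 the horizontal pair. The ratio falls
        # toward 0 as the eye closes.
        dist = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
        return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))

    EAR_THRESHOLD = 0.21     # illustrative cutoff: below this, eye counts as closed
    CLOSED_FRAME_LIMIT = 48  # ~1.6 seconds of closed eyes at 30 fps

    # Simulated landmark stream: 10 frames of an open eye, then 60 closed.
    open_eye = [(0, 3), (2, 5), (4, 5), (6, 3), (4, 1), (2, 1)]
    closed_eye = [(0, 3), (2, 3.4), (4, 3.4), (6, 3), (4, 2.6), (2, 2.6)]
    frames = [open_eye] * 10 + [closed_eye] * 60

    closed_frames = 0
    for i, landmarks in enumerate(frames):
        if eye_aspect_ratio(*landmarks) < EAR_THRESHOLD:
            closed_frames += 1
            if closed_frames >= CLOSED_FRAME_LIMIT:
                print(f"frame {i}: drowsiness alert")  # a real car would chime here
                closed_frames = 0
        else:
            closed_frames = 0

Notice that nothing here involves emotion at all; it’s pure geometry on the driver’s face.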

But most people wouldn’t consider distraction or drowsiness to be emotions. So why are they being lumped in with emotion-reading AI? And maybe it’s just me, but does it sound like things are getting a little… Big Brother-esque?

After all, emotions are among the most private things about us. We’re the only ones who truly know what we’re feeling, and we’ve evolved the ability to hide or disguise our emotions when necessary, a useful skill in relationships, work, negotiations, and many other situations.

Putting AI systems into our cars that can recognize and collect data about our emotions goes far beyond the stated goal of preventing accidents caused by distraction or drowsiness.

Privacy Invasions Abound

Remember those European regulations we mentioned earlier? They’ll help ensure that driver data is only used to make commutes safer. We can’t say the same about the comparable US bills: right now, car companies face essentially no enforceable laws stopping them from using this sensitive data as they see fit.

On its site, Affectiva lists a number of use cases for AI-fueled occupant monitoring in cars: fine-tuning environmental conditions such as air conditioning, suggesting alternate routes, personalizing content recommendations, and designing virtual assistants to be more emotion-aware, to name a few.
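Mechanically, most of those use cases boil down to a lookup from a detected state to a canned response. The sketch below is purely hypothetical (the labels, actions, and confidence threshold are invented, and none of it reflects Affectiva’s actual software), but it shows how thin the logic layered on top of the emotion classifier can be.

    # Hypothetical rule table mapping a detected occupant state to cabin
    # responses, loosely in the spirit of the use cases listed above.
    CABIN_RESPONSES = {
        "angry":   ["play calming playlist", "diffuse lavender scent"],
        "drowsy":  ["sound alert chime", "lower cabin temperature"],
        "anxious": ["suggest less congested route", "dim ambient lighting"],
        "happy":   ["keep current settings"],
    }

    def respond_to_state(state: str, confidence: float, threshold: float = 0.7):
        # Act only on high-confidence detections to limit false positives.
        if confidence < threshold:
            return []  # too uncertain: better to do nothing than to annoy
        return CABIN_RESPONSES.get(state, [])

    # Example: the classifier reports "angry" at 85% confidence.
    for action in respond_to_state("angry", 0.85):
        print("cabin:", action)

The hard part is everything upstream of this table: detecting the state reliably, and deciding who gets to see the data it produces.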

But our phones already handle some of these use cases. And many of the others can be addressed by simply reaching out and pressing a button or turning a dial. Despite these obvious, practical rebuttals, emotion AI is not going away anytime soon, no matter how unnecessary or unsettling it seems. And cars are just one of the first frontiers this new technology is entering.

For instance, Affectiva also makes software for the advertising industry. Companies in this arena can watch users through their laptop cameras (with their consent, of course) and record their reactions as they view ads. This lets the companies gauge which ads are working and how likely viewers are to buy something. Education, fraud monitoring, call centers, and mental health applications are all adopting emotion-reading AI as well.

What do you think of AI that can read your emotions? Do you think it has a place in our cars? Or is it just another way to invade our privacy and collect data? As always, please let us know your thoughts in the comments below!
