We might reasonably expect about as much emotion today as on any other day. Some of it is good, some bad, some euphoric, some painful. One rapidly maturing technology might help us detect all of these emotions, and it is as exciting as it is uncomfortable: Emotion Sensing.

Emotion Sensing, also known as Emotion Recognition, Affective Computing, or Sentiment Analysis, is where tech companies and brands see a new frontier for delivering better outcomes to prospects in real time. More and more of the companies we speak to both love and fear this tech. They see their industries becoming less focused on brand-driven customer experiences and want to pivot to tech-driven solutions. Driven largely by Amazon, this shift promises to integrate data-heavy solutions that deliver consumer exchanges with high rates of customer satisfaction. This excites the robot inside me, because change shouldn’t be feared; change should be capitalized on and driven forward. That is how we evolve as a society.

The human in me is maybe a little anxious about all this change.

So what is real today and what is possible tomorrow?

Humans are only about 75% accurate when reading someone else’s emotions, and if you’re a Taurus it’s even worse. Computers are already at about 85% accuracy (dating apps take note). This gap is expected to widen as algorithms, sensors, and processing power continue to improve. 

Emteq’s solution, FaceTeq, is a platform technology that uses novel sensor modalities to detect the minute electrical changes that occur when facial muscles contract. With each facial expression a characteristic wave of electrical activity washes over the skin, and this can be detected non-invasively and without the need for cameras.
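
For the technically curious, here is roughly what that kind of pipeline could look like. This is a minimal Python sketch, not Emteq's actual method: the channel count, window size, and calibration templates below are all invented for illustration.

```python
import numpy as np

WINDOW = 256  # samples per analysis window (an assumption)

def rms_features(emg_window: np.ndarray) -> np.ndarray:
    """Root-mean-square amplitude per channel; a common, simple EMG feature."""
    return np.sqrt(np.mean(emg_window ** 2, axis=0))

# Hypothetical per-expression RMS templates, as might be captured during a
# calibration step where the user holds each expression for a moment.
TEMPLATES = {
    "neutral": np.array([0.05, 0.04, 0.05, 0.04]),
    "smile":   np.array([0.40, 0.35, 0.08, 0.06]),
    "frown":   np.array([0.07, 0.06, 0.45, 0.38]),
}

def classify(emg_window: np.ndarray) -> str:
    """Nearest-centroid match of the window's features against the templates."""
    feats = rms_features(emg_window)
    return min(TEMPLATES, key=lambda name: np.linalg.norm(feats - TEMPLATES[name]))

# Example: a synthetic 4-channel window with strong activity on channels 0-1.
window = np.random.normal(0, 0.05, size=(WINDOW, 4))
window[:, :2] += np.random.normal(0, 0.4, size=(WINDOW, 2))
print(classify(window))  # expected: "smile"
```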

The iPhone X already has face-detecting cameras that are well positioned to recognize user emotion. Apple is not talking about this as a use case, but analysts are, and it looks like a good bet that future versions of iOS will implement it.

Affectiva is a Boston company that produces an emotion-sensing plugin for Unity, which we have been experimenting with. Beyond their vision-based system, Affectiva has other very interesting projects underway, including an experimental wearable emotion sensor that helps autistic children and their caregivers monitor the children's emotional state. The hope is to develop a system that can provide real-time emotional feedback to users.

At least one company is offering “emotion as a service”: cloud-based, real-time emotion sensing that analyzes audio (including low-bandwidth telephone calls) to reveal the emotional state of the speaker.
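
A call to such a service might look something like the following. To be clear, the endpoint, parameters, and response shape here are hypothetical stand-ins, not any specific vendor's real API.

```python
import requests

# Hypothetical "emotion as a service" endpoint, invented for illustration.
API_URL = "https://api.example-emotion.com/v1/analyze"

with open("call_recording.wav", "rb") as f:
    resp = requests.post(
        API_URL,
        headers={"Authorization": "Bearer YOUR_API_KEY"},
        files={"audio": f},
        data={"sample_rate": 8000},  # low-bandwidth telephone audio
    )
resp.raise_for_status()

# An imagined response: scores for a few emotional dimensions, e.g.
# {"arousal": 0.72, "valence": -0.15, "dominant_emotion": "frustrated"}
print(resp.json())
```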

MindMaze has revealed its Mask prototype, which requires the user to wear a clip on the ear and, according to one account, “conductive gel” on the skin. Samsung has also announced a development nicknamed FaceSense, although details are still limited.

Emotion sensing is already being integrated into the concept cars of tomorrow: applications in autonomous vehicles will detect drivers’ attention states, promising smoother hands-off control. Perhaps emotions could even be expressed in ways other than a middle finger, or cursing at that fella who refuses to use his signal light. One can hope.

At SMITH LABS we are particularly interested in a future of emotion sensing in which everyone wears advanced AR glasses, with visual and depth cameras that can sense the world around them in great detail. Researchers at Oculus are developing stretch sensors in the headset's foam to read facial expressions, and others are working on electrical sensors in the headset to determine expressions.

Emotion-sensing wearables have evolved to track electrodermal activity, gathering information through a tiny electric charge zapped across our super-conductive sweaty skin.
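
Here is a minimal sketch of how that works in practice: a slow “tonic” baseline is separated from fast “phasic” spikes, and the spikes (skin conductance responses) are counted as a rough arousal measure. The sampling rate, smoothing window, and threshold below are assumptions, not any particular wearable's pipeline.

```python
import numpy as np

FS = 4  # Hz; EDA is a slow signal, so low sample rates are common

def count_scrs(eda_microsiemens: np.ndarray, threshold: float = 0.05) -> int:
    """Count skin conductance responses: sudden rises in conductance that
    often accompany emotional arousal."""
    # Tonic (slow-moving) level via a 10-second moving average;
    # whatever is left over is the phasic (fast) component.
    kernel = np.ones(FS * 10) / (FS * 10)
    tonic = np.convolve(eda_microsiemens, kernel, mode="same")
    phasic = eda_microsiemens - tonic
    # An SCR is flagged each time the phasic component crosses the
    # threshold in the upward direction.
    above = phasic > threshold
    return int(np.sum(above[1:] & ~above[:-1]))

# Example: a flat baseline with two synthetic arousal spikes.
eda = 2.0 + 0.01 * np.random.randn(60 * FS)
eda[80:100] += 0.3   # spike 1
eda[180:200] += 0.3  # spike 2
print(count_scrs(eda))  # expected: 2
```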

It’s possible that in the near future you will be able to gauge the emotional state of everyone you meet, with much higher accuracy than we have ever collectively experienced.

Consider how such technology will change the way we interact. Imagine, for example, how something like a job interview would change if both parties were able to sense each other's emotions with the help of a wearable device. Learning how to more closely control your emotions, or at least their expression, could become a critical ability for people to develop, especially those whose jobs involve a lot of negotiation. 

Maybe the beauty industry will find a new category of prospects when we see the rise of "tactical botox": the voluntary, temporary paralysis of some facial muscles, not for cosmetic effect, but to make emotions harder to read for the machines that are watching.

Education holds some of the most interesting potential applications. A few years back I wrote about the rise of online education. Two of its current failures are holding student attention and the lack of one-to-one feedback: online educators have not been able to react immediately to confusion, boredom, or apathy. Crowdsourcing the emotions of your audience as they watch videos might be a clever way to course-correct ineffective content, both online and off.
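
As a toy example of what that course correction could look like, the sketch below buckets hypothetical per-viewer confusion scores by video timestamp and flags the segments worth re-editing. The event format and the 0.5 threshold are made up for illustration.

```python
from collections import defaultdict

# (timestamp_seconds, viewer_id, confusion_score in [0, 1])
events = [
    (12, "a", 0.1), (12, "b", 0.2),
    (95, "a", 0.8), (95, "b", 0.7), (95, "c", 0.9),
    (140, "c", 0.3),
]

def confusing_segments(events, bucket=30, threshold=0.5):
    """Average confusion per `bucket`-second segment; flag the hot spots."""
    buckets = defaultdict(list)
    for ts, _viewer, score in events:
        buckets[ts // bucket].append(score)
    return sorted(
        (b * bucket, sum(s) / len(s))
        for b, s in buckets.items()
        if sum(s) / len(s) > threshold
    )

# The 90-120s segment averages 0.8 confusion: a candidate for a re-edit.
print(confusing_segments(events))  # [(90, 0.8...)]
```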

Like it or not, as we move toward a tech-driven society tied to our devices, we will get better at knowing how any audience feels. It might not be the Minority Report level of sentient emotional data collection we all may secretly fear, but the tech will enable that Hallmark card to tell you how many points you scored with the one who received it.

 Valentine SMITH

 

Tags: Experience Design, Future Commerce, Technology, Emotion Sensing, Emotion Recognition, Affective Computing, Sentiment Analysis, Valentine's Day