So much information about us can be digitally quantified now: what we read, what we buy, and even our vitals can be tracked on our phones to give us a better idea of our long-term health. But how comfortable would you be with a machine tracking, analyzing and even responding to your emotions?
Emotionally Quantified Selves
That’s what proponents of what is called affective computing want to do: to bridge the cognitive-emotional gap between computers and humans, to develop systems that would be able to interpret, adapt and respond to the emotional states of their human users. But is this something we want in our machines?
For many, the machine learning paradigms developing around speech and handwriting recognition make sense. After all, being able to speak with your computer and have it understand and interact with you on that level is useful. But evidence from the latest neuroscience is revealing that emotions play a large role in reasoning and cognitive control, leading affective computing experts to assert that emotional reasoning is vital for true machine intelligence.
This position is now being borne out by evolving research in computer science, artificial intelligence, psychology and neuroscience, leading to a flourishing of the affective computing field. Startups like Affectiva, Emotient, Realeyes and Sension are developing applications that can analyze and differentiate the emotional content behind facial expressions: the furrow in a brow, whether a curled lip represents a smile or a grimace.
Technologies like Affectiva’s Affdex track emotional “classifiers” using a series of points assigned to certain parts of the face, which are analyzed in real time by an algorithm and then compared to the world’s largest database of previously analyzed faces and emotions, in order to identify what you might be feeling at the moment. Affectiva’s facial coding software is lightning fast and surprisingly accurate, taking only a fraction of a second to evaluate micro-expressions on the face. Watch Affectiva’s chief science officer Rana El-Kaliouby demonstrate the company’s software in this TED talk:
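To make the pipeline described above concrete, here is a deliberately simplified sketch of the idea: reduce tracked facial points to a small feature vector, then compare that vector against a database of previously labeled expressions to find the closest match. Everything here is invented for illustration; Affdex’s actual features, models and database are proprietary and far more sophisticated.

```python
import math

# Toy "database" of previously analyzed expressions. Each entry maps a
# hypothetical feature vector (brow furrow, lip-corner pull, lip press,
# each on a 0-1 scale) to an emotion label.
EXPRESSION_DB = [
    ((0.1, 0.9, 0.1), "smile"),
    ((0.8, 0.2, 0.7), "grimace"),
    ((0.9, 0.1, 0.2), "concern"),
]

def classify(features):
    """Return the label of the nearest stored expression vector."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(EXPRESSION_DB, key=lambda entry: dist(entry[0], features))[1]

# A strong lip-corner pull with little brow furrow reads as a smile...
print(classify((0.15, 0.85, 0.2)))  # smile
# ...while the same curled lip plus a furrowed brow and pressed lips
# reads as a grimace.
print(classify((0.7, 0.3, 0.8)))  # grimace
```

A nearest-neighbor lookup like this captures the “compare to a database of analyzed faces” step in miniature; production systems would instead train statistical classifiers on millions of labeled examples.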
Enabled by Big Data, Neuromarketing is on the Rise
Affective computing’s renaissance is no doubt being facilitated by the emergence of big data and its role in driving deeper machine learning, as we’ve seen in examples like Google’s Deep Dream software for artificial neural networks. Affective computing researchers are using the enormous, crowdsourced data sets of vocal, gestural, facial and physiological responses (heart rate, galvanic skin response) now available to them to push the technology further toward more natural interactions between humans and machines. Imagine if our computers could express empathy — it would change our relationships with machines. This push toward emotionally intelligent machines is also being helped along by improved sensors that are now becoming commonplace on handheld devices, and maturing distributed platforms, which allow the technology to scale up in an unprecedented way.
Yet it’s also being helped by a shifting perspective on the importance of emotion analytics, as emotional, qualitative “soft” data is increasingly seen as just as valuable as quantitative “hard” data. Advertisers in the new field of neuromarketing are especially interested in using soft data, such as emotions and other physiological changes in test subjects, to fine-tune market research and to gauge the effectiveness of commercial campaigns across different cultural and age demographics. By tracking and analyzing detailed emotional and physical responses that could be monetized in some way, they hope to avoid the human bias of traditional ad-testing techniques.
Affective computing could have medical or educational applications too: autistic individuals could potentially benefit from a wearable that can measure and monitor their levels of anxiety and stress, and alert parents or caretakers before a meltdown happens.
But perhaps the greatest unknown is how affective computing could also play an integral role in a ubiquitous Internet of Things, where new stacks of various technologies are so embedded into our daily lives that they become unobtrusive and intuitive to use. As part of our everyday experiences, emotionally intelligent devices would gauge our feelings and react accordingly. For instance, the coffee maker could automatically start if our energy levels and focus dipped, or the kitchen table could display recipes for energizing foods.
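The scenario above amounts to simple rules triggered by an estimated emotional state. A minimal sketch, assuming hypothetical `energy` and `focus` scores on a 0-1 scale arriving from some affect-sensing device:

```python
def kitchen_actions(energy, focus):
    """Map an estimated emotional/energy state to device actions.

    Thresholds and action names are invented for illustration.
    """
    actions = []
    if energy < 0.3:
        actions.append("start coffee maker")
    if focus < 0.4:
        actions.append("display energizing recipes")
    return actions

print(kitchen_actions(energy=0.2, focus=0.5))  # ['start coffee maker']
print(kitchen_actions(energy=0.9, focus=0.9))  # []
```

The hard part, of course, is not the rules but reliably inferring the emotional state that drives them.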
The New “Emotion Economy”
Observers are predicting an emerging “emotion economy,” where emotions would be just another data set to be collected, evaluated and exploited for various purposes. Phones, watches or other wearables equipped with these algorithms could offer up mood-targeted advertising or other personalized virtual experiences, or create more immersive gaming environments, as Microsoft’s Xbox One attempts to do with its Time of Flight technology, tracking players’ eye movements and physiological condition.
Yet despite the potential for positive applications in numerous fields, some will argue that these tools may be used to further privatize our consciousness, as companies and their neuromarketers gain unnerving insight into our most private emotions and thoughts, and use this data to manipulate us into buying something we don’t need or watching yet another needless commercial. Prompted by privacy concerns, some lawmakers want to require that companies give consumers the choice to opt out, and that they warn users when a game console’s camera is watching them and when personal data is being collected. In the end, much depends on how that data is used, yet the question remains: do we really want our phone, couch or television to know how we really feel?