The idea of technology detecting and responding to human emotions has fascinated people for decades. Remember when mood rings came out in the 1970s? They were the latest technology, and fashion statement, for identifying people's emotions. A coworker or a spouse could supposedly tell how you were feeling with a glance at your jewelry and respond accordingly. The jewelry was extremely popular and was marketed as a way of understanding yourself, understanding others, and learning to gain voluntary control of automatic emotions.
Rapid advances in technology over the last 10 years have produced computers that can detect patterns and make predictions. Computers can now react to ever-changing environments, interact with animals, and exhibit human-like responses to common events. To date, the primary focus in computer science has been on improving a computer's ability to learn from data and make rational decisions in areas such as financial trading or healthcare. Yet while ever smarter, faster, and more advanced computers continue to be built, we have yet to develop artificial intelligence that can reliably recognize our emotions or feelings. Closing that gap is the goal of a discipline known as affective computing.
A likely first response is, "So what? Why should I care whether my computer can read my feelings? After all, it's just a machine." Before answering, let us peek into the history of affective computing and what has been accomplished to date. The discipline can be traced to a 1995 paper by Rosalind Picard, whose research centered on measuring emotions. The question, in essence: could a device be created that, unlike the mood ring's crystal, genuinely measures and tracks feelings and emotions?
After years of research and several iterations, Professor Picard's team created a mobile sensor, worn on the wrist, that could detect changes in the emotional state of the person wearing it. The team had essentially created a mood bracelet. The device tracked when the subject experienced fear, excitement, anger, bliss, loneliness, and other emotions. Beyond providing nice-to-know data points, the ability to measure and track emotions proved to be a major breakthrough. While the machines could not influence an individual's emotions, they could identify and record various emotional states across a wide range of situations. To answer the "so what?" question above: the data revealed patterns and triggers for a variety of emotions, giving test subjects the opportunity for greater self-awareness and self-control.
Additionally, and perhaps more importantly, the new device allowed researchers to observe and track the emotional states of individuals who were unable to communicate them because of developmental or physical challenges. The discovery immediately had an enormous impact on a variety of fields.
Educators, therapists, coaches, and instructors who were previously left to guess the feelings of their students now had a much clearer understanding of what they faced. With the device's insight, lesson plans, therapies, and other programs could be customized to meet individual needs, dramatically increasing their effectiveness. Much like today's activity trackers (Apple Watch, Garmin, Fitbit), a new wave of portable devices is being designed to provide guidance on which activities to pursue or avoid based on the user's emotional state. For example, an individual who is angry and tired may choose to avoid dealing with a difficult coworker altogether, or alter their approach to keep the situation from escalating. Practitioners working with individuals unable to communicate their feelings and emotions can now interact based on live feedback and adjust their approach in real time.
While we have examined the more sociological side of affective computing, there are also practical business applications.
Affectiva, the organization co-founded by Professor Picard, is working closely with corporations to create devices that help with design, safety, research, marketing, and communication functions. Working with auto manufacturers, the Affectiva team is developing sensors that detect when a driver becomes drowsy or distracted and intervene to avoid dangerous situations. In the field of e-therapy, doctors and therapists will soon be able to detect the emotions of their remote patients and form a more accurate evaluation of their physiological and emotional state. Affectiva also conducts studies on the effects of stress, which could benefit a variety of settings, from ensuring employees are not overworked to calming nervous patients in healthcare.
Technological advances have greatly altered the way we live today. From self-driving cars and computers that converse with humans to smart devices that operate themselves and machines that replicate bodily functions, humans and machines have become deeply interwoven. Most, if not all, of this interaction is based on logic: the ability to respond to past actions and predict future ones.
Many feel the last frontier lies in a computer's ability to recognize and even generate feelings. While the ability to create and replicate human emotions remains a distant possibility, the achievements of the past 20 years show the gap continues to narrow. Until then, a better understanding of our physical and emotional states will allow us to recognize and change our actions and behaviors and improve our quality of life.
This article has been adapted from a chapter of Trenegy's book, Jar(gone).
Trenegy is a non-traditional consulting firm dedicated to helping companies clarify the latest business jargon, putting it into useful terms and solutions that actually benefit your company. Find out more: info@trenegy.com.