“Hey Alexa,” “Hello, Google!” We’re becoming accustomed to talking to AI machines. And they are learning to talk in our lingo too, as demonstrated by Google earlier this year.
Now, AI could learn to read our emotions as well. Amazon’s Alexa team is reportedly working on a new device that can understand human emotions by relying on subtext and tone of voice rather than the words themselves. It is expected to be wrist-worn and voice-activated.
The goal of this technology is not to help us understand our emotions. It is to use our state of mind and how we are feeling to “recommend products and otherwise tailor responses.” Still, given its functionality, it could help us figure out how we are feeling when we are confused. Taking on the role of a psychologist, perhaps?
The tech is still at a very early planning stage, and there is a good chance it will never reliably determine emotions. Given that humans, who have dealt with emotions since the dawn of our species, still have trouble reading them, it is unclear how well this technology would work. Moreover, we are complex beings with a multitude of emotions that go beyond just the sad, angry, happy, bored, stressed, fearful, and disgusted states that Amazon has filed a patent for.
Read more about the possibility of this emotion reading wearable tech here.