It can answer most of your questions, but can it show enough emotion to fool you?
Siri and Alexa can answer most questions you have, from how the traffic is on your morning commute to what the score of last night’s match was. As chatbots, smart speakers, and artificial intelligence become a bigger part of our lives, programmers are teaching them more about us so they can better relate to human emotion. Interacting with intelligent machines can make us feel like we’re talking to another person. But can AI ever truly be your friend?
Filmmakers have explored the idea. In “Her,” the main character falls in love with an operating system, and the love is reciprocated. In “Ex Machina,” an android not only passes the Turing test with flying colors but also gains the trust of a human in order to exploit it for strategic advantage. Of course, those superintelligent computers aren’t actually capable of those very human traits; they’re played by human actors, Scarlett Johansson and Alicia Vikander.
They tap into a very human desire, however. We’re social animals. We crave communication and connection. We talk to our pets as if they understand every word we’re saying and will respond. We name our cars. We anthropomorphize everything. Check the paragraph above: the title character is an operating system, yet the movie is called “Her,” and the operating system chooses the name Samantha. The android in “Ex Machina” goes by Ava and is assigned female pronouns.
“People often name their cars or treat their Roomba like it is a pet, even referring to the vacuum as a ‘him’ or ‘he,’” James Mourey, an assistant professor of marketing at DePaul University who studies the relationships between people and products, told Futurity. “What we find is that these anthropomorphic products can fulfill social assurance needs in the way that genuine, interpersonal interaction often does. But there are limits.”
Remind people that they’re interacting with an object, and the illusion goes away. But programmers are working hard to make you forget you’re interacting with an object. For starters, you activate artificial intelligence devices such as Google Assistant and Amazon’s Alexa by speaking to them, not typing.
Now, they are being taught to pick up on the emotions in people’s voices and respond in kind. Amazon has even obtained a patent for technology that will allow Alexa to recognize physical, emotional, and behavioral states. If you sound hoarse, Alexa might recommend medicine. Tell Alexa you’re depressed, and she’ll offer a kind word and then recommend hotlines you can call.
Though we might appreciate advice from artificial intelligence, it can be unsettling when the machines appear to be trying too hard. A study published in Cyberpsychology, Behavior, and Social Networking found that while helpful suggestions from artificial intelligence are welcome, people tend to react negatively to computers expressing empathy. That’s because people ascribe genuine emotion only to other people, and the idea of a computer emoting throws them off.
As the study puts it, “Believers in artificial emotion might see an expression of empathy or sympathy as a true manifestation of machine emotion, which could be disturbing, as suggested by UVM; disbelievers might treat the very same expression as nothing more than a trick rather than an indicator of machine emotion.”
UVM refers to the uncanny valley of mind, the observation that the more humanlike a machine acts, the creepier it seems to people. Ask Alexa to marry you, and she’ll say something to keep things light, such as, “Let’s just be friends.” According to the uncanny valley of mind, if Alexa responded with a gushing yes and began peppering you with questions about where to live and what to name the kids, you’d most likely freak out and drop the idea.
There may be ways out of the valley, however. If technology advances to the point where the human brain can’t pick up on cues that the robot someone’s interacting with is a little off, that person might believe the robot is another person. Say an android moves with the fluidity of a human, speaks with a human voice, and can engage in free-flowing conversation expressing emotion in vocal inflection and with physical cues. You might be fooled into believing it’s a person because nothing is setting off alarms in your brain.
If the conversation is stimulating enough and the android “ages” a little bit every day, you might just have yourself a friend for life.