For those who haven’t yet heard of Siri, “she” is the virtual assistant programmed by Apple into the new iPhone 4s. She recognizes many words and requests (though often imperfectly) and is, therefore, able to send emails, put reminders into “her” owner’s schedule, and generate GPS directions.
Just tell her, “Siri, I want pizza,” and Siri says, in a female voice, “I’m checking your current location . . . I found 13 pizza restaurants. Eight of them are fairly close to you.” She then lists the restaurants on the iPhone screen so you can choose one. In a polite tone, she will apologize if she doesn’t understand your voice: “OK, I give up . . . could you try it again?”
Siri is even funny. Tell her you love her, and she replies, “All you need is love. And your iPhone.” Or, “You are the wind beneath my wings.”
Funny, right? Well, not really—not when you stop to consider that you have just been coaxed to interact with a virtual entity. Perhaps without thinking about it, you have tacitly agreed to refer to a computer program by a proper name, to accept that the computer program has a gender, to laugh at “her” quips, and to rely on her to guide you to places to eat or to remind you when to call home.
Now, many people—including some psychiatrists—would say that this is all entirely harmless. But I believe that personifying machines and interacting with them as quasi-beings actually dumbs down our interpersonal skills and encourages us to treat other people like machines. Ultimately, it diminishes our ability to empathize with one another, because we’ve been chatting up a non-existent person and can get used to considering real people as essentially non-existent, too.
To the extent that people become “attached” to Siri and “rely” on Siri and think Siri is “funny,” they are just a tiny, tiny bit less likely to value a friend’s responsiveness or a colleague’s help, or even to appreciate the nuances in tone of voice that real humans use to convey emotion and communicate with one another.
I have laughed at Siri. I have gotten angry with Siri and called her names. I have told my kids, when Siri helped me get to a frozen yogurt joint I couldn’t find, “Siri is amazing.” And, just now, in this very paragraph, I didn’t hesitate to refer to a computer program by name and use the pronoun “her.” Because there isn’t any other way to speak of this interactive program. Its existence requires that we treat it as if it were almost alive. And that means that people who are actually alive and give us directions or answer our questions or joke with us are cross-contaminated with the technoviral quality of machines. They are “Sirized,” which means they are downsized in their humanity.
To the small extent that we say we “love” Siri or use “her” name or rely on her to get us out of a jam (even if it is just being lost), we cut ourselves free from the interpersonal tethers that bind us, one to another, and which act as insulation against acting toward one another in dehumanizing ways.
So, the next time you see a group of kids beat another kid and post the video on YouTube, or marvel at how someone could ruin her life by humiliating herself on Facebook, or find yourself at a loss to remember the name of a restaurant or how to drive to that park you’ve been to a half-dozen times, you can thank the likes of Siri.
That’s why I told Siri just now, “Siri, I hate you.” She seemed irritated. She said, “Noted.” But she’ll still give me directions and send my emails. So, it doesn’t really matter that I told her that. That’s the point.
Scream into the void enough, and your words and emotions will eventually be no better than a machine’s.
Dr. Ablow is the author of "Inside the Mind of Casey Anthony." He is a psychiatrist and member of the Fox News Medical A-Team. Dr. Ablow can be reached at email@example.com. His team of Life Coaches can be reached at firstname.lastname@example.org.