There's No One Attitude Toward Robots
People aren't going to be any more consistent in the way they treat robots than they are in the way they treat people
Consider the collector at a bridge tollbooth. He’s a human being, as real as you are, his mind teeming with hopes and fears. After he clocks out, will he go home to a loving family or a lonely room-for-one? Will he look back in old age on a life worth living, or will he feel aggrieved and regretful? What is it like to be him? There are so many questions you could ask.
Do you care about any of them?
Of course not.
To you, the toll-taker is a means to an end, a thing in a skin that stands in the way of your getting back to doing 65 in a 55-mph zone. He is there to follow a few simple algorithms. Take your money, give you change, send you on your way.
This is not a condemnation. He doesn’t care about you either. To him, you are a car. Process, and move on.
Of course, among the algorithms you both enact are some that lightly disguise the mechanical nature of your encounter. You say “hello,” and so does the toll-taker. He says “have a nice day” and you say “thanks, you too.” Most of the time people follow these simple scripts. That’s certainly a major reason that automated systems have replaced about half the tollbooths in the United States.
Years ago the psychological anthropologist Lawrence Hirschfeld gave me the example of the toll-taker to illustrate how often human beings treat one another as simple script-following devices. In many philosophies, behaving this way is a grave ethical violation. Immanuel Kant, for example, famously held that no person should ever be treated merely as a means to an end. A human being is always an end in herself.
But in our lived lives, almost all of us treat people as means to an end, some of the time. We follow scripts — not just with the toll-taker or the call-center worker or the passenger in seat 21A, but even with our near and dear ones. You lead a child through the routine for getting into bed, ticking off the steps. During sex with your partner, you — with their consent — use them for your pleasure.
This is obvious enough when you consider how people treat other people. But it’s often forgotten when we contemplate how people treat, or will treat, or should treat, robots.
Many people (me included!) like to point out that people often treat robots as if the machines were people. We cite examples from the world (people name their Roombas; soldiers have held funerals for fallen robots) and from the lab, where many, many experiments in Human-Robot Interaction have shown this effect. In one study, for example, kids were upset that a robot was being put in a closet seemingly against its will. In another, people hated being told to hurt a cute robot.
These great, revealing experiments show that people anthropomorphize robots easily, in certain situations at least. But you should not let them nudge you into believing that all people will behave consistently toward robots all the time. Because people don’t even anthropomorphize people all of the time.
In our routines for treating people we don’t care about, there are, of course, available exits — moments when we can step out of the usual script and begin to get to know one another as human beings. (Maybe you commute through this toll booth every weekday, and one day you ask about the kids in the photo taped on the toll-taker’s wall. Or you need a recommendation for the best restaurant or hair salon or doctor in town.) But you can’t assume people will take those opportunities, or that they won’t. Our interest in one another’s humanity waxes and wanes, depending on circumstances.
A famous example of this is an experiment conducted among seminary students at Princeton in 1970. The students were told they had to get across campus to another building to give a talk. On the way, each passed a person slumped in an alley (actually an actor, but, as far as they knew, a genuinely distressed human being). Would the student help? Well, it depended. Some of them had been told they had plenty of time to get across campus. 63% of them helped. But others had been told they were already late and had to hurry. Of those, only 10% stopped. And, by the way, these were seminary students who were on their way to speak about the parable of the Good Samaritan.
So I think we should stop saying “people anthropomorphize robots” as if us humans will be consistent in the way we treat the machines. It makes more sense to think of robot anthropomorphism as contested ground, in which people’s feelings and behavior are unsettled. Sure, people will love and cherish robots (especially robots designed to be loved and cherished) some of the time. But probably not all of it. This is especially true since unkindness to robots (unlike cruelty to animals, children or other humans) actually has political advocates, whose arguments are quite rational.
Robot mascots, telepresence robots, javelin retrievers, exoskeletons, and more. It's no surprise that robot-friendly Japan would use the Olympics this week to show what the latest machines can do. Joann Muller at Axios has a nice rundown of the devices on show at the Games.
Bipedal milestone. Speaking of sports, congrats to Agility Robotics' Cassie, a bipedal robot that recently ran a 5K course. Its time was 53 minutes, which is slightly slower than the average for women between the ages of 65 and 99 but is excellent for a robot, because bipedalism is hard. In fact, it may be the best time ever for a robot, since it appears to be the first time a bipedal machine has run a 5K at all.