The Unbearable Indifference of Others
People hate the feeling that others don't care about them. Even if those 'others' are machines
This is a rare cross-post, appearing in both my newsletters. Robot-facing readers: I also write Who Cares?, a newsletter about why people don't care about things it seems they ought to care about, like climate change or war or the hungry kids down the block. Readers of Who Cares?: I also write Robots for the Rest of Us, all about human-robot relations.
These newsletters are separate because I didn't see much overlap between the two topics. But there is at least one: People's feelings about being treated with indifference. These turn out to be the same even when the indifference is a machine's.
Most people are exquisitely sensitive to signs that others don't care about them. That's true in intimate relationships, where we want love ("don't you care how I feel?"), but also in interactions with strangers, where we want politeness ("they didn't even apologize for the delay! They don't care about their customers!"). I didn't expect it to be true of people's interactions with machines, though. Yet that's the take-away from some recent experiments in Israel, which found people getting upset when they thought robots didn't care about them.
This We Know: Robots Don’t Have Feelings. Also, Robots Have Feelings
Go ahead, say it: Of course robots don't care about people. Robots are machines. Care — that sense of responsibility, anxiety, solicitude — requires feelings, and the ability to sense them. Robots have circuits and programming, not minds; they sense temperature or humidity or movement, not a nagging obligation to call a neighbor.
All of that is true. But humans nonetheless force robots to relate to us in non-machine, human-ish ways. This is obvious about robots programmed to sense and respond to human actions and facial expressions. That's the case, for instance, with PARO, the cute robot seal invented by Takanori Shibata that responds to petting and nuzzling. And the more recent, somewhat similar Lovot (that's me meeting one in a Tokyo shopping center).
But robots are also social beings even when they are not designed to be, because so many people treat anything that moves as if it's part of our world of thinking, feeling beings. Before robots, people treated machines like cars and airplanes as if they were loyal friends or treacherous foes. Before machines, as Kate Darling has pointed out in her book The New Breed, humans had similar feelings about animals. Even a robot that doesn't "know" you're there can trigger the emotions, or at least the rote behaviors, that a person would. (I often react to GPS directions with "got it" and "yup, switching lanes" and "thanks." My family finds this hilarious. But I haven't stopped.)
People complained when a robot said ‘Hello’ — because they felt obligated to stop and answer
I'm far from alone on that one. Last month at the annual ACM/IEEE International Conference on Human-Robot Interaction I heard Reid Simmons, professor of Robotics and Computer Science at Carnegie Mellon University, describing decades of experiences with the robot receptionist in the lobby of the university's School of Computer Science. One finding: People in a hurry don't like it when a robot receptionist says hello. They can't help slowing down to say "hello" back. (Simmons said the "roboceptionist" team responded by programming the robot to measure people's pace. It has learned not to say hi to people who are moving fast.)
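Simmons didn't show the actual code, but the logic is easy to picture. Here's a minimal sketch in Python, assuming a person-tracker that reports timestamped positions; the threshold, the function names and the interface are all my invention, not the roboceptionist's real implementation.

```python
import math

# A minimal sketch of the pace-gated greeting Simmons described. The tracker
# interface, the speed threshold, and every name here are assumptions; the
# actual roboceptionist code was not shown at the talk.

HURRYING_THRESHOLD = 1.2  # meters/second; a guess at what counts as "moving fast"

def pace(p1, t1, p2, t2):
    """Average speed between two (x, y) position fixes, in m/s."""
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1]) / max(t2 - t1, 1e-6)

def maybe_greet(track):
    """track: list of ((x, y), timestamp) fixes for one approaching visitor."""
    if len(track) < 2:
        return  # not enough fixes to estimate pace; stay quiet
    (p1, t1), (p2, t2) = track[-2], track[-1]
    if pace(p1, t1, p2, t2) <= HURRYING_THRESHOLD:
        print("Hello!")  # slow enough that a reply won't cost them much
    # anyone moving faster gets no greeting, and so owes no reply

# Example: 3 meters covered in 1 second reads as hurrying, so no greeting.
maybe_greet([((0.0, 0.0), 0.0), ((3.0, 0.0), 1.0)])
```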
There is a kind of double vision here, a knowing two things at once (the robot is a being, alive and full of feeling like me, and the robot is a machine). It's not just, as is often said, "the willing suspension of disbelief." That phrase implies a control over these reactions that people do not have.
So, for example, Hadas Erel, who leads research on social human-robot interaction at the Media Innovation Lab at the Interdisciplinary Center in Herzliya, Israel, has run experiments with her colleagues in which people felt distressed after two little robots seemed to leave them out of a game. (I wrote about them here.) But feelings come and go. (It's a common complaint these days that clever psychological experiments find effects that don't last or have any practical consequences.) Erel and her colleagues wanted to know if feeling ignored by robots had a lasting effect on people's behavior, one that would persist even after the robot encounter was over.
It did. Most of the people in the experiment didn't realize it, but they acted differently after they’d felt ignored by the machines, compared to people who hadn’t felt shunned.
In the study, 32 women and 20 men played a ball-tossing game with two small robots. Some of these people got their fair share of tosses (13 of the 39, or one third). Others got only four of the 39: roughly one toss in ten. They had to spend most of the game watching the robots "play" with each other. After this, each participant went into another room to meet with an experimenter, fill out a couple of questionnaires and be interviewed about their experience and feelings. The next day, each of the participants received an email asking them to return an accompanying questionnaire within three days.
Feeling ignored by robots caused people to cozy up to the next human they met
One consequence of feeling that those around you won't give you the time of day is a certain eagerness to please the next people you meet, plus a desire, often subconscious, to get close to them. That was how Erel and her colleagues tested for the effects of seeming robot indifference on the people in the experiment.
First, the researchers simply measured how people interacted with the experimenter. Out of 26 people who'd had their fair share of ball-tosses with the robots, 16 took a chair opposite the researcher — a posture that, on the videos Erel showed at a recent talk at the HRI meeting, looked businesslike. That was quite different from the people who'd felt shunned by the robots (who explained their distress vividly, saying things like "they didn't want to play with me, it hurts" and "they looked at me and chose not to toss to me, they were snobs"). Out of 26 of them, 21 placed their chair right next to the researcher's. (On the video, they look confiding, intimate — like they're about to incline their head and whisper into the other person's ear.)
Then, in the interview that followed, each person who'd played with the robots was asked what he or she wanted to do next. And 17 of these 26 people described activities with family, friends or lovers. In contrast, of the people who hadn't been shut out of the game with the robots, only 7 out of 26 chose that sort of people-centered activity.
People who felt ignored by robots were way more likely to comply when asked to do a chore the next day
Finally, another measure of the game's impact was that email the next day. The experiment being over, and life being what it is, the former participants could ignore the message and skip yet another questionnaire. Two-thirds of those who had felt included in the ball game with the robots did exactly that. But the people who had felt largely ignored by the machines were much more likely to try to stay in the researchers' good graces: two-thirds of them filled out their questionnaires.
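The paper's own statistics aren't quoted above, but the reported counts are enough for a back-of-the-envelope check that the group differences aren't just noise. A sketch in Python, assuming scipy is available; the 2x2 tables are rebuilt from the numbers in this post, and the choice of Fisher's exact test is mine, not necessarily the authors'.

```python
from scipy.stats import fisher_exact

# Quick sanity check of the counts reported in this post (26 participants
# per group). These 2x2 tables and the use of Fisher's exact test are my
# own reconstruction, not the analysis Erel and colleagues actually ran.
# Rows: [included group, excluded group]; columns: [did it, didn't].
measures = {
    # Sat right next to the experimenter: 26 - 16 = 10 included did; 21 excluded did.
    "sat close to experimenter": [[10, 16], [21, 5]],
    # Named a people-centered next activity: 7 included vs. 17 excluded.
    "chose a social activity": [[7, 19], [17, 9]],
    # Returned the follow-up questionnaire: the post says "two-thirds", which
    # isn't a whole number of 26, so these counts (9 vs. 17) are rounded guesses.
    "returned the questionnaire": [[9, 17], [17, 9]],
}

for name, table in measures.items():
    oddsratio, pvalue = fisher_exact(table)
    print(f"{name}: odds ratio {oddsratio:.2f}, p = {pvalue:.4f}")
```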
I've used words like "fair share" and "shunned" to save space in this account, but of course the robots weren't deciding to include or exclude the humans. They were, as robots always do, following instructions. And their instructions had no social content. These aren't Lovots or Paros designed to respond to people. They're just little mechanisms doing their non-human thing. The sense that they were being uncaring and unkind — "they were snobs!" — was entirely in the humans' perceptions.
Yet for all that they were changed by such perceptions, people didn't realize that playing with robots had affected them, Erel and colleagues write. The researchers know because they asked: almost everyone who'd gone through the experiment said the experience with the machines had nothing to do with how they were acting or with their plans.
Most of the attention on human-influencing robots, and AI, is focused on machines deliberately designed to touch our feelings, with their cute gestures, big eyes and friendly talk. But, Erel says, even robots whose work doesn't involve humans at all can end up offending our touchy species. All they need do is give people the impression that they're being ignored — or, if you prefer, the correct impression that machines are indifferent to them. Both statements are true and both could be a problem. It's possible that future robots that are simply doing their jobs — created by engineers who thought only about how to do the work — will trigger humans' alarm at situations where they feel uncared for.