Does a Robot Need to Have a Gender?
About as much as a fish needs a bicycle. But that's not going to stop people.
There's a scene in the TV series Mad Men 1 that finds a central character, Betty Draper, alone at home and leaning into her washing machine. She finds its vibrations really hit the spot, as it were, and presses on to orgasm. So you could say she used a machine to help her get off. You could not say, though, that she "had sex" with the machine. The first phrase describes a $36 billion-per-year global industry, but the second, I think, describes something impossible. A robot is a thing that can't consent, desire or feel. It's not alive, so a person can't "have sex" with it.
Nonetheless, robots destined for sexual uses are touted as sexual partners --- as if they were alive. A few weeks ago, for example, the AI chatbot company Tidio surveyed 1,200 people. It did not ask them whether they would use robots as sex toys; instead, it asked whether they would "have sex with humanoid robots." (About half of the men and a third of the women polled said yes.) "Sex with robots" has become a subject of conferences, books and an opposition movement, as if we all know what "sex with robots" means.
I don't think anyone knows what it means, except the usual: People have a mad desire to make robots human-like (in our imaginations) even when they aren’t in reality. Getting people to pay attention to these questions practically requires playing to, and therefore perpetuating, fantasies. Love and Sex with Robots is an arresting book title. Enhanced Masturbation via Algorithm, Actuators and Advances in Materials Technology wouldn't have sold nearly so well.
Still, we media types have to persist, because we all ought to think more about the ways a robot's relationship to people could be different from a washing machine's. Robots are doing more and more tasks that require them to relate to humans in ways that don't read "machine" --- talking, walking, delivering medications or sandwiches, sterilizing hospital rooms, reminding us not to litter, telling us it's time to take our pills and so on. That makes it more and more important to understand the human traits we'll be seeing in these sociable new devices. That means both the traits designers will intentionally build into the devices and the traits people will see in robots for the same reason they see the face of Christ in their morning toast.
This post is about one of those traits: gender. In what instances would you need or want to see a robot on the gender continuum, so that the pronouns that feel right for it include "he," "she," "they" and so on, rather than "it"? As I learned at this discussion at RO-MAN, the annual IEEE International Conference on Robot and Human Interactive Communication, held virtually last summer (and thanks to the organizers for letting me listen in), this is a question with no obvious single answer.
Makers of sex-related robots can't avoid the issue, since people want those machines to have a gender vibe. That doesn't always mean a binary one, of course. At Abyss Creations (makers of hyper-realistic sex dolls that are starting to become robots), you can order your unit's genitalia as typically female, as typically male, as a combination of both, or in a sort of plug-and-play setup where you take one set out and replace it with another.
Nor does a yearning for robot gender require an imitation of a human body. This is an iteration of Gabriel 2052, a "robot boyfriend" built by the artist Fei Liu to critique and expand on conventional "sex robot" tech. It looks like an assemblage of hobby-shop robot parts (which it pretty much is, and that's part of the point). But Liu refers to Gabriel 2052 as "he," and even polymorphous Abyss customers aren't ordering robots with no gender signals at all.
Maybe, though, sexual uses of robots are a special case, prompting gender thoughts that people don't need or want to have about a robot that translates at the hospital, or mows the lawn, or tells you to put on a mask in the store. Wouldn't it be simpler if robots in those sorts of roles just presented as "it"? After all, in this study (pdf) by De'Aira Bryant, Jason Borenstein and Ayanna Howard at Georgia Tech, people's trust in a robot's abilities wasn't much affected by their perceptions of its supposed gender. That's why the authors think robot makers shouldn't be too eager to design gender cues into their machines.
That argument, though, requires some faith that robots will be used as designers intended. Not everyone is convinced that such faith is warranted. Robert Sparrow, a professor of philosophy at Monash University who has done much interesting thinking and research on human-robot relations, notes that whatever gender (or lack of gender) designers lodge in a robot, the robot's users will still engage in gender-ing. That is, users will interpret the robot's features (voice, shape, activities and function) as cues to assign it a gender. Its machine-ness won't stop them, he says here. "If you take the same robot and you give it a shovel, people say that it's male; you give it a mop, and they say that it's female."
I heard Sparrow make a similar point in an overview of gender issues in robotics at the RO-MAN workshop. A number of other speakers, presenting their research on human-robot interaction, agreed. "Gender isn't just designed; it emerges in interaction," Astrid Weiss, a human-robot interaction researcher at the Technische Universität in Vienna, told the group in her talk. "If we ask people to assign gender to a robot, they will do so."
So gendering robots is an activity people probably engage in whether they want to or not. It may even happen outside their awareness.
After all, some languages, like French, grammatically assign a gender to everything, so the car, she is in the garage, and the robot, he is in the kitchen. Other languages have a neutral gender that sometimes takes precedence over actual gender in grammar (as Mark Twain famously complained, German speakers say the maiden, it has gone to the opera, and the turnip, she is in the kitchen).
Other tongues, like English, assign gender only to living things whose gender the speaker knows; everything else is "it." That leaves English speakers aware, at some level, that they have choices to make. For instance, if I catch a fish, I'll likely say "I'm going to let him go." But I can choose the more accurate "I can't tell if this is a male or a female, but I am going to let it go." Grammar lets me choose which pronoun, but it won't let me choose none. Other languages, of course, don't mark gender at all.
Does any of this make a difference in how and whether people perceive a robot's gender? Some studies, involving non-robotic objects, hint that the answer could be yes. In one, for example, Spanish speakers asked to describe a bridge (masculine gender in their language) called it strong, big, towering, and sturdy. German speakers, in whose grammar "bridge" is feminine, called it elegant, fragile, peaceful, and slender. (I am not aware of equivalent work on language and *robot* perception. If you are, please leave a comment or drop me a line.)
In other words, robot makers may think that designing robot gender is a matter of deciding what gender traits (if any) to build into a machine. But in real life people will assign their robots a gender based on factors the designer can't control --- perhaps the language the users speak, certainly the culture they were raised in, their own feelings about robots, and whether the robot is mopping or shoveling (to mention only a few possibilities). Seen in this light, robot design becomes as much about foreseeing human behavior as it is about planning the robot's.
Moreover, gender-ing is an activity, so it will vary depending on the situation people find themselves in. The "female" robot they saw at breakfast with a mop may feel like a "male" robot when they see it in the afternoon with a shovel.
But let's just suppose our hypothetical gender-aware robot designers have a handle on culture, psychology and the situations in which the robot will work, so that they can predict how people will see gender in their machine. What should they do with that knowledge?
Should they use gender expectations to make the robot as good as it can be at a task (even if the gender conventions are unjust or even repugnant)? Or should they make a robot that pushes back against gender conventions --- promoting enlightenment and progress on gender inequality at the cost of making the robot less effective at its assigned task?
For example, if your robot is supposed to rescue disaster victims, do you give it a deep "male" voice because many people find such a voice more authoritative? Or do you give it a "female" voice in order to push against stereotypes? Are there situations where a researcher might want to leverage a stereotype about gender (say, that females are more caring and compassionate) to help their robot do its job (for example, teaching young children)? In this paper, for instance, Tom Williams and Ryan Blake Jackson of the Colorado School of Mines imagine a dilemma for designers. Suppose they were to give Siri, Alexa and other digital assistants stereotypically male voices. That would help fight the gendered expectation that "deferential assistant" is a female role. And it would reduce the sexist abuse found in roughly a third of all conversations humans have with AI assistants.2 On the other hand, this move would decrease the number of female voices users experienced, and thus might help perpetuate the bias that technological work is a male space.
Such questions don't have --- ahem --- binary answers. It's not unreasonable to suppose that some gender expectations can be used carefully to make robots more effective at their tasks, while others can and should be pushed against.
One thing, though, seems clear to me. The right question here probably isn't "should robots be designed with gender cues?" or "is it OK for people to call the Roomba 'him'?" Better, instead, to accept that robots and gender issues are entangled. Then we can ask, for each particular machine and task, the question Ericka Johnson, Professor of Gender and Society at Linköping University in Sweden, proposed at the RO-MAN session: "What are the gender traits associated with a robot doing? What work is robot gender doing here?"
This and That
"Gentle robotics"
Hands-down my favorite new (to me) use of robots this month is Project CETI (the "Cetacean Translation Initiative"), a consortium of cryptographers, linguists, AI experts, technologists, marine biologists and roboticists working to decipher the communications of sperm whales. The goal is to understand what the whales are saying and eventually communicate with them in their "language" (which will be the test of whether the project's "translations" are correct).
The robotics component involves both underwater drones and "on-whale" drones (that's the "gentle robotics" part, I think; it involves suction cups). These devices listen to whale sounds and monitor the animals' movements, building the database of whale "words" that cryptographic and language-processing AI needs in order to construct a map of what the sounds mean. Christoph Droesser has an excellent account of the project here, at Hakai magazine.
Historic?
Robot dogs with guns seem to have reversed Marx's famous dictum about "great world-historic facts." Marx said they appear first as tragedy, then as farce.
The first robot quadruped with a gun that I know of was (intentional) farce: a paintball rifle mounted on a Spot Mini robot from Boston Dynamics --- created by the art group MSCHF to make a point. But last month a real gun mounted on a quadruped robot was displayed at the Association of the United States Army's 2021 annual conference.
And this is mom next to our robot bellhop...
The research firm Global Data has polled people in tourism-related companies and found that nearly a third expect their firm to invest in robotics in the next year, largely thanks to Covid-19 and the fears it provokes. Robots in airports and hotels used to be entertaining "gimmicks," said Ralph Hollister, a travel and tourism analyst at Global Data. Now they're seen as important for keeping the industry afloat.
1. Season 1, Episode 11
2. My source here is Verena Rieser, a professor of computer science at Heriot-Watt University, Edinburgh, who told the RO-MAN meeting that some 30 percent of AI assistant conversations involve abuse. Sadly, that problem can't be solved by making assistants sound male. It's true that in dealing with a "male-voiced" AI, people don't throw sexist abuse at it. They throw homophobic abuse at it instead, Rieser said.