Robot Eldercare Is Not Going According to Plan
James Wright's "Robots Won't Save Japan" Explains Why
Consider a nursing home, where elderly residents get different levels and types of care. You might see it as a sad but necessary place, and hope it's run with kindness. But an economist will see it as a dead end. That's because workers who tend to the residents earn low wages and don't learn new skills or move up to better-paying industries. And the unpaid hours that family members devote to their aged relatives are hours not spent working for wages or starting businesses.
Now, imagine this place after it has delegated some care work to robots – like ElliQ or Robear or Retriever Pro or Aeo or Stevie or Paro. Teams of technologists designed those devices for eldercare. Then the machines needed to be trialled, purchased, installed and maintained. Someone had to train staff to use them. All this work will continue for years. That creates jobs, of the sort politicians and voters like: high-tech, well-paying, skill-imparting.
This sort of nursing home is no dead end. You might even call it a gold mine.
A lot of good people are working on robots for eldercare with the best of intentions. But let’s be clear on the reason they have jobs: People expect to get rich giving robots a role in the care of older people — from nursing homes to assisted living centers to Mom’s apartment where she’s still hanging on, with some help.
One market research firm estimates that the worldwide market for "eldercare assistive" robots, already more than $2 billion, will be worth more than $7 billion in less than a decade. The related category of "healthcare companion robot" is said to be growing even more quickly.
As James Wright points out in his trim, insightful book Robots Won’t Save Japan, governments and corporations invite us to take this transition with a resigned shrug, "as a natural inevitability." But people in wealthy, aging nations — Japan, where he did the dissertation work that became this book, but also elsewhere in Asia, Europe and North America — should understand that nothing about robot care is so well-settled, or even well-understood.
Among a number of Japanese words to which the book introduced me, one of the most useful is genba (現場). That means, Wright says, "the place where reality happens." That place — far from the abstractions of policy papers and business plans and design specs — is where Wright, an anthropologist, spent his time.
In a Japanese care home for elderly people who needed daily help, Wright watched closely as the residents and workers tried out three different robots promoted by the Japanese government's Robot Care Project. He also spent time in the dark, quiet hallways of the National Institute of Advanced Industrial Science and Technology, where eldercare robots were designed and promoted. In a lot of media about both eldercare and robotics, there's a tendency toward fantasy. Wright's careful picture of the genba of care is a valuable antidote.
Why Governments Yearn for Robot Care
The case for robot care is simple: Older people are an ever-larger share of the population all over the world. The number who need help, and the scope of the help they need, are growing faster than the supply of people able and willing to give this care (either as paid workers or as unpaid "informal caregivers"). Governments don't want to seem to encourage more immigration of the sort of young, inexpensive workers who already do a lot of care work in many wealthy nations. So no one knows how the needs of all these seniors will be met, nor how the already immense costs will be paid.
Robot helpers could help address the problem by being more available, abundant, reliable and patient than people, while costing less. So governments are supporting eldercare robot research and trying out its products. For example, as Wright writes, "in 2019, the UK government announced an investment of £34 million ($48 million) in robots for adult social care, stating that they could 'revolutionize' the care system." Around the same time the European Union was promoting its €85 million ($103 million) Robotics for Ageing Well program. Last year New York state's government bought 800 of the chatty, sociable ElliQ robots to give to seniors living alone.
Trying out any robot for the elderly is an experiment in giving human work to a machine. And that doesn't mean just counting pills or lifting trays. Robots are already being used to manage, motivate, monitor, amuse, console and comfort older people. ElliQ, for example, is billed as "a friendly presence in your daily life," constantly "surprising you with jokes and suggestions." So, behind the obvious question in any robot test — can we automate this work? — is the big one: Should we automate this work? Thinking about robots for these tasks isn't just thinking about how care gets done. It's thinking about what care really is.
It’s important, then, to see how robots actually do in the genba. That is what Wright set out to learn: in 2016, he went to the nursing home to observe trials of three different robots.
One, called Hug, was an industrial robot that had been modified to lift a person, swing them around, and deposit them onto a bed, wheelchair or toilet. (Back trouble from lifting is a common complaint among nursing home workers in Japan, Wright writes.)
The second machine was Pepper, a commercial robot with a head, arms and the ability to respond to things people say and do in front of its camera "eyes." The care home's version of Pepper was running software for eldercare, but it had begun its career as a greeter and directions-giver in restaurants and shopping malls.
The third machine was Paro (short for "pāsonaru robotto," or "personal robot"), a two-foot-long robotic version of a white-furred baby seal.
Paro responds with blinks, cute movements and sounds when people stroke it. Of robots made to relate to people emotionally, it was one of the first to leave the lab and be sold commercially (a Paro costs about $5,000).
The Reality of Robot Care
A lot of elderly people were frightened of Hug, at least at first. But a bigger psychic obstacle, Wright found, was the robot's industrial aura. Compared to a human, it was slow (it took about a minute and a half to move a person from one spot to another). Hug wasn't usable in the home's communal bath, but even if it had been, residents would have gotten awfully cold being moved this way. (Contrary to stereotype, robots are often slower than human beings — they get installed because they're more consistent and reliable than people, not faster.)
Also, the robot was heavy and had to be set up. Hurried employees often found they didn't want to spare the time to deal with it. Despite the aches and pains, they went on lifting people by themselves.
Their preference wasn't just about efficiency. The action of one person lifting another had an intimacy and individuality that mattered to residents, and even more to workers. They cherished "skinship" (Wright's translation of their word for "creating kinship through touch"). Workers created it by "hugging, patting, rubbing, tickling, nuzzling, and massaging residents, occasionally putting their arms around those they were feeding at mealtimes, giving little touches as they passed by, and generally sharing a great deal of bodily contact," Wright explains.
Skinship and a sense of attentive presence were missing when a frail old person put her arms and legs around Hug and got slowly, safely hauled upward for transit. In Wright's survey, most workers thought Hug would spare them back strain, but only 15 percent thought residents would use it with "peace of mind." One worker worried that the residents might feel "like luggage."
Pepper, on the other hand, had been programmed to lift spirits, not bodies. It was supposed to be able to read emotions from facial expressions, and to understand spoken language. And it had a program to lead a daily recreation routine.
Part of the robot's appeal to management was that it would transmit elite expertise to this setting where exercise would otherwise depend on the taste and skill of whatever worker was doing the job that day. (The center's director said Pepper's apps, designed by fitness pros, would give it a "higher level of qualification than any of my staff.") When it "led" an exercise session, Pepper would call out instructions and a human staff member, standing next to the robot, would model them for the residents. Residents seemed to enjoy the robot-led sessions well enough — if the human staffer also attended. Otherwise, they weren't that drawn to it.
Paro, the robot seal, wasn't supposed to require that much staff involvement. It was invented to be like a hospital dog, but without the worries about feeding, pooping, allergies and so on. The hope was that the robot would soothe and occupy residents, especially those with dementia. That would make them more comfortable, and so reduce the number of times they acted out or pestered staff with loopy requests. That, in turn, would give workers more time for their jobs and make residents feel better, reducing the amount of psychoactive medication they took.
People liked Paro. Residents requested it and workers would imitate its mewing sounds as they communicated with residents. Everyone ignored the manual, which said residents should get time with Paro in planned, well-defined sessions. Instead, residents got Paro on request, when it was available.
Maybe that was part of the reason the use of Paro got a little weird. One resident got very attached to the robot — so much that she stopped talking to people and was pained (to the point of not eating) whenever she was separated from the machine. Workers were ambivalent: They thought the robot was stimulating to the old woman; but they also worried that it was a portal to a personal fantasy world, which separated her from other people and maybe made her dementia worse. In this way, Wright notes, "Paro created new problems of the same sort it was expected to solve."
In short, none of these robot trials was a success. The other two robots, Wright says, also created work instead of reducing it. The staff soon came to treat the robots as fragile, attention-demanding beings that required careful handling — in other words, they spoke about the machines the same way they talked about the elderly residents.
You might object that we're in the early days of robots' exit from factories into daily life, and that the machines will get better. After all, Wright was observing trial runs of six weeks each that took place seven years ago. Maybe things are different in 2023?
Pepper robots aren't much used any more (their manufacturer, SoftBank, stopped making them in 2020). Hug seems to still be in production, but it’s far from standard equipment. Even the photogenic Paro hasn't sold that many, with 5,000 in use around the world. Meanwhile, there are a lot of new robots available in 2023 that weren't available a few years ago, including the ones I mentioned at the top of this post.
Yet an essay Wright published a couple of months ago indicates that robot eldercare hasn’t advanced much in Japan since he did his research. A 2019 survey of 9,000 eldercare facilities there found only 10 percent were using a care robot, and a 2021 poll of home care providers found only 2 percent had any experience using a robot in their work. The rest of the world is no different. New York state’s ElliQ experiment reaches 800 people over age 75 (who have to have WiFi and be "comfortable with tech equipment," the director of the state's aging services office said). The state has more than a million people aged 75 or older.
Still, a Great Leap Forward for robots could be imminent. Such a leap has taken place with AI and language in the past 12 months. That improvement in machines’ ability to communicate like a human will likely make robots sound way less robotic.
How much does that matter? It depends. If robots aren’t being accepted because they need technical improvements, then it’s only a matter of time before robot eldercare spreads everywhere. However, as Wright lucidly explains, the problem may be elsewhere. It may lie in a contradiction that technology can never eliminate.
A contradiction technology can’t bridge
That contradiction is this: The robot idea is to define a task and make a machine to do it the same way every time. If you’re designing a robot, your definition of the task, then, is an abstract version — it’s all the aspects of the work that are shared by every single instance of it. (To lift an elderly person, you position them like this to avoid straining their frail bodies.)
But a person who cares for another person works in exactly the opposite way — she does the task differently every time, because the nature of the chore will depend on whom she does it for, and that person's state of mind, and her own. (To lift Mrs Takagawa, remember she’s nervous about her left arm and gets cold very easily, and she’s in a good mood today, so call her “granny.”)
Over and over, care workers at the home told Wright how important and valuable this kind of attention is. It was, literally, momentary: It existed only in a particular moment between particular people, and that moment was never going to be repeated.
This work required a lot of the caregivers. Not just physical effort but also the emotional labor of responding in just the right way — the way that suited the situation, the person receiving care, and the person giving it. "Skinship" was part of that. So was teasing, joking, play-acting and even fooling residents for their own good.
The value of Yoyū
To explain how to do this work well, the workers often referred to another Japanese concept I was glad to learn: yoyū (余裕). It connotes surplus, composure, a margin of error, flexibility, scope, a bit more rope than absolutely needed — in short, what you have beyond the just-enough. Yoyū is the opposite of having not enough time, not enough control, not enough repose to act fully human. In other words, it’s the opposite of the harried, frantic state of mind of every bad job in the digitalizing 21st century.
Wright calls yoyū "the surplus of time-space outside of the logic governing Japanese productivist capitalism and commodified care." If you like your job, I would be willing to bet it offers you at least some yoyū. And for the workers who spoke with Wright, yoyū was essential to care.
I know it may sound strange to speak of this breathing room, this space where your best self can flow, in the context of a facility for eldercare. After all, as Wright documents, such a place runs on a strict schedule, to make sure residents are fed, toileted, bathed and kept safe and healthy. It's a place of chatter and phones and ringing alarms, to say nothing of residents asking for things (sometimes, especially if they live with dementia, the same senseless things, over and over).
But everyone who has held a job knows there is a difference between official rules that pin you down with stress and official rules observed in a way that makes space for yoyū. And Wright makes the case that good care requires this second approach to work.
The problem with robots is that, in theory at least, they know only the first kind — the kind where everything is explicit, replicable, uniform and there are no exceptions.
This is not just a technological fact; it is also an economic one: Governments and businesses buy robots to get work done in a predictable, rule-based, measurable form. That this work replaces human variety and unpredictability is part of the point. (An exec at a construction-robot firm once said to me, "Right now, every operator is an artist. No two do the work in the same way." It took me a moment to realize he meant this was a bad thing, because it made work unpredictable.)
I don't mean here to claim robot care isn't worth exploring, or that we ought to return to the good old days. Those days weren't so good. Care from humans can be bad. It can be nonexistent, too. In the days before Social Security, for example, elderly people feared and fretted about how to get their relatives to care for them (often they made shifty promises about inheritances, as the historian Hendrik Hartog has shown). And the burden of this care work has always fallen (as it still does) unfairly on women.
And there really are a lot of people getting older in societies where caregivers are, in the absence of policies to support them, getting scarcer. To show, as Wright has done, that robot care is not a magic solution is not the same thing as showing that there is no problem.
What this book does show, though, is that the problem of care is not what we have been told. That is, it is not a problem of lacking human bodies to stick into the "care worker" slot, which will be solved by putting machines in there instead. Rather, it's a problem of societies that are organized to value only what can be cut up into measurable, sellable units.
Beware plans that render care invisible
That outlook, as Wright points out, renders care invisible. How do you package and sell a particular man's talent for mimicking people in a way they enjoy? Or a woman's feel for how to lift a particular frail old body? You don't measure such things. So they're not seen and not discussed — these are hours of loving labor that capitalism counts on but doesn't count in.
Could robots be part of a better world, a world that didn't run like this? I think it might be possible. That would be a world in which robots aren't opposed to human idiosyncrasy and yoyū, but instead have become a part of it.
The concept of "robot," as Wright says, is remarkably fuzzy. "Robot" always seems to mean a machine at the very edge of plausibility, doing surprising things. Start to use that "robot" every day, though, and it loses this connotation.
Your dishwasher, coffee machine, thermostat, alarm clock and fitness tracker — devices you've set up to read their environment and respond in some way — are robots, by most reasonable definitions. But they aren't robots — weird surprises from the future. They're just handy tools. I can imagine a future in which "care robots" don't exist, but of course Dad has a gizmo that reminds him to call his friends and keeps track of medications, and another thingy that intelligently helps him get around the house. He and his caregiver (who is an actual person) use the devices in whatever way makes them comfortable.
What such a future would signify is that robots had been folded into the business of human, humane care — that they enhance and encourage yoyū for everyone. As I mentioned, a lot of idealistic people are working on robots for care, and I think many of them have this vision in mind.
If this future doesn't come to pass, it won't be technology's fault. Blame instead that economist mindset, pushing to standardize, quantify and measure everything that people do. Wright's book shows how caregivers, in the place where reality happens, resist that mindset and the policies it pushes. As we're asked more and more often to adapt to robots in our lives, those workers are an example we should follow.
A note about this post: I write two Substack blogs. This one is about robot-human relations; the other is about people’s anxieties about care, caregiving, psychic numbness and the like. Sometimes (more often than I expected) the two subjects overlap. When that happens I post in both blogs.