Automation's Hidden Menace
Robots and AI don't replace highly skilled humans. But they block those people from training the next generation. Matt Beane's "The Skill Code" explains.
The other day at my annual eye checkup, I began with a young ophthalmologist, who measured the fluid pressure in my eyes, did some other checks, and had everything ready for my veteran doctor when he walked in.
As they talked over the data, the veteran doctor said, "Hmm, let me take his pressure again."
"See? It's actually a little lower than you wrote," he told his colleague. "The thing is, when you lift his eyelid a little for the machine, you're adding a very slight pressure from your finger. That can make the reading a very slight bit higher."
Alerted by Matt Beane's superb and (ahem) eye-opening The Skill Code, I knew I was witnessing something ancient and important: The transfer of skill from a master to a person learning the trade.
Not the explicit knowledge laid out in textbooks (and codified in algorithms). Rather, a feel for the craft, often below conscious awareness, which makes people good at what they do. Surgeons, bankers, salespeople, warehouse workers, musicians, cafeteria cooks and all the rest of us have gotten our skills from people who knew what they were doing, and folded us into their labors. It has always been so.
But now, Beane reports, the rapid uptake of AI and robots is breaking this essential aspect of civilization.
How the "Skill Code" Works
"This special working bond between experts and learners has been the bedrock of humanity’s transfer of skills and ingenuity for millennia," Beane writes in this lively and richly informative book. After all, "experts can’t do what they do without help. Novices want to help, and to learn. So they build a collaborative bond that’s also the engine for building skill."
This knowledge-transfer system, which Beane calls “the skill code,” is essential to keep society running. (You really don't want your eye doctor or auto mechanic or first-grade teacher taught by videos and textbooks alone.) It's also what makes for a happy life. “The only thing in life that’s really worth having is good skill,” as Jerry Seinfeld once said. Because it can't be taken away, because it affirms your worth, because it's a pleasure to do something well, as he put it, “good skill is the greatest possession.” (And, of course, skill is the key to good pay. When you can't be replaced by any random worker, you command higher wages.)
It's Not Just a Problem of 'De-skilling'
We've known for years that robots and other automation can lead to "de-skilling" — where, for example, the well-trained bricklayer, deciding how exactly to place each brick to make a herringbone pattern on the walkway, is replaced by a minimum-wage dude who just feeds bricks into a "road printer." But Beane is here to tell us that the threat is even worse than decades of wage loss and despair.
The essence of the problem is this: When an expert can get better assistance from a robot than she could from a human apprentice, she has a lot of incentive to go with the machine. It's faster, more reliable — and in professions like surgery, actually safer (for every apprentice doctor in an operating room, Beane notes, "patients spent 25 percent longer under anesthesia"). But giving apprentice work to machines leaves the human learners no chance to get their hands dirty. The senior surgeon guides the robot, and the nurses and residents "sit there in the dark and fall asleep," as one nurse told Beane.
Result: The path of skill learning is broken. "Use AI to analyze social behavior on the web, and doctoral students lose the chance to learn stats and coding," Beane writes. "Use a robot for surgery, and surgical trainees end up watching, not learning. Use a robot to transport materials, and workers that handle those materials lose the chance to interact with and learn from the recipient."
Synthesizing the work of other scholars with his own "hundreds of hours observing, interviewing, and often working alongside" people in many walks of life, he has found the same disruptive effect of AI and/or robots in 31 occupations. In "higher education, online labor platforms, chip design, journalism, data science, criminal justice, neonatal intensive care, public education, music composition, robotics, open innovation, aerospace engineering, ridesharing, long-haul trucking, bomb disposal, drone piloting, food service, secondary fulfillment, radiology, construction, wealth management, retail, automotive engineering, call center operations," he writes, "there’s a different intelligent technology at work, yet the same bonds are being severed."
What are newer workers to do? The typical person unhappily accepts the new normal. But every once in a while, someone is unusually capable and determined. That person will see that she has to cobble together her learning on her own, without the official support that existed before automation. She'll become what Beane calls a "shadow learner": Someone who skirts or breaks the rules to get the experience she needs. (More on that in a minute.)
How Skill Transfer Works — and How AI and Automation Break It
How exactly do AI and robots ruin the practices with which experts have taught novices since prehistory? An Assistant Professor in the Technology Management Program at the University of California, Santa Barbara, Beane has been studying the skill-transfer process for years. The book anatomizes the process in detail, in order to explain precisely how AI and robots are breaking it.
Challenge
First, he writes, successful learning offers challenge: Learners get work that's a little beyond their comfort zone, forcing them to try new skills and get better at familiar ones. Such tasks are stressful in the good sense — not easy or dull, the sort of thing that makes you sit up straight and give it your all. But not so far out of your league that you'll flop.
Complexity
Second, the "skill code" offers complexity. This is Beane's term for being focused on specific tasks, but also being aware of how they relate to one another, and the wider world. For example, he mentions Sita, a shift supervisor in a warehouse that fulfills those subscriptions people get for regular deliveries of fishing lures, coffee samplers and the like (my examples, not the book's). She rose from entry-level box-packer to her position not just by doing each of her chores well, but also by watching everything that took place before and after her tasks, and elsewhere in the building. She learned how all the pieces in the place fit together. She gets the gestalt of her work.
So, when the workers in this area go for break, she stays behind, scanning again, looking for little signs she’s learned to look out for: the smell of dog treats means packaging is getting torn, scuff marks on labeled boxes mean the labeler has vibrated itself out of alignment again, jumbled bins mean someone new is in replen [“replenishment” — the job of keeping bins full].
Connection
Third, Beane writes, skill transfer involves "connection."
As every parent noticed during the pandemic, learning is a social activity as well as a cognitive one. Beane quotes studies that find that people learn better at challenging tasks when they're working with experienced supervisors than when they are struggling on their own. Without community, it doesn't go so well.
In the workplaces Beane describes, learners trust their mentors to guide them, and the mentors trust their pupils to put in the effort and do good work. Mentors help the newbies manage frustration, reminding them that the struggle will be worth it, or by pointing out how much they're extending themselves in a good way. Around that bond of trust, other emotions flourish: Mutual respect, shared pride, a sense of community.
That's the motivation to struggle and grow, to come back after a bad day and dream of the good ones to come, when people you admire will admit you to their circle. That circle is immensely important. If you've ever wondered why your favorite news source gives a lot of attention to the Pulitzers and other journalistic awards you don't care about, it's because there is a part of us journalists that writes (or speaks or videos or charts) for others in the trade. I mean, we want you to like our stuff. But five words of praise from a respected colleague is worth five pages of gush from the rest of you. Isn't that true in your work too?
I've known Beane for a few years as a top-flight source, because he backs up his claims with vast amounts of painstakingly gathered data. Having spent more than a decade anatomizing the skill code, he's authoritatively able to tell us this: AI and robots are messing up all aspects of it.
Challenge goes out the window when the robot does the work that assistants once were assigned. As for complexity, workers whose jobs have been simplified and surveilled with cameras, keystroke monitors, step counters, etc., are no longer able to wander beyond their guardrails and take a moment to see the bigger picture. And without interaction with experienced mentors, newbies will never experience real connection. From surgery to banking to restaurants, our robot and AI deployments are "hollowing out" skill, Beane says.
When Learning is Criminalized, Only Criminals Will Learn
So how are people supposed to learn when the automated workplace is pushing them not to?
"By bending, breaking and rewriting the rules," Beane writes. Seeing that only "transgressive ways to learn new skills" are going to work, they transgress. They are "deviants gaining mastery outside the bright lights of convention and its rules."
These are the "shadow learners." They're surgical residents who get themselves into a less prestigious hospital — because its surgeons, knowing less about robots, are less likely to hog them. They're the warehouse worker who rushes to try to fix every little mechanical glitch when it pops up — even though he's supposed to stay at his post. Or the banking executive who takes subordinates away from their official duties so they master new tech — and then tell him about it. (Supervisors (mis)using their subordinates to learn new things are engaged in what Beane calls "reverse apprenticeship." )
Anyone subjected to a memo from HR about some pointless regulation will sympathize with shadow learners. And most organizations recognize that some rules are more important than others, and that those who break rules can create value (something I wrote about here five years ago, though the mag has churlishly removed my byline). In fact, Beane recounts, Hewlett Packard years ago created an award for "Meritorious Defiance." (The first went to the guy who ignored his boss's orders and did research that led to flat-screen displays.)
Still, "shadow learning" poses risks for the people who do it and the workplaces they do it in. And ever-more-intrusive surveillance tech is likely to drive shadow learning deeper underground, Beane writes. "It’s liable to get 'darker'—less ethical and more hidden." So "more and more critical skill development will be happening in areas of social life previously reserved for `capital-D’ deviants, criminals, and ne’er-do-wells."
Given its subject, this book will get you thinking about how you learned your trade, and who taught you. It will get you thinking about others' apprenticeships in a new light, too.
Beane is temperamentally an optimist and, by training, a searcher for solutions. So he doesn't think we're screwed. The book offers plenty of ideas (and some examples) about how we should change robot and AI deployment so that the new machines enhance the skill-transfer process, or at least stop blocking it.
I'm temperamentally a pessimist and, by journalistic training, a skeptic about statements that things can be fixed. So I found Beane’s deeply researched account of the problem more convincing than his prescriptions for saving the ancient art of mentoring. But I'll give those prescriptions their due, below. My journalistic training also included the admonition to be fair. You don’t run off with a guy’s thesis to make it sound like your own.
And this is, Beane insists, "not a doom-and-gloom book." (One more mention of training: my editor told me, when I was writing my book, that it couldn't be all doom-and-gloom, because readers hate that. Give them some hope, she said. I wonder if his did the same?)
How to Save Skill-Learning
Here, then, are some of Beane's ideas for saving the skill-learning process from the Standard Operating Procedures of AI and robot deployment.
Don’t Deploy Tech Without Thinking About Its Effect on Skill
First, he says, when AI and robots are being deployed, their effect on skill-learning should be taken into account. Ways to save the “skill code” can be found, if you look. Sometimes, he suggests, it doesn't take much tweaking to turn an AI from a thing that replaces into a thing that teaches. It might be as simple as requiring supervisors to keep giving novices challenging tasks, rather than sidelining them for maximum efficiency today. (After all, if efficiency today means not enough skilled people on the payroll in a decade, that's a Pyrrhic victory.)
By understanding the importance of skill transfer, organizations can adopt technology in a way that balances efficiency with the long-term need to create experts. For example, where rules and conventions push against skill-learning, they can be changed. This may mean stepping back from trying to quantify, specify and measure everything in the workplace. Formal training with explicit instructions is much easier to track, but hands-on experience is much better for passing on skills, Beane writes.
One example of a successful switch to robotics Beane cites is the military's adoption of the PackBot bomb-disposal robot about 25 years ago. Soldiers learning to use a PackBot do so with an experienced user at their side. The newer soldier moves the robot via a controller, and the expert observes and guides him. Instead of being a mere onlooker to the sight of an expert wielding a robot, the learner has a hands-on experience. It was the old way, when the expert was alone in the field, rushing to defuse a bomb, that left novices out. PackBot operating is, Beane writes, "one of the only jobs that I have come across where the use of robots didn’t block skill development but enhanced it."
Befriend Those ‘Shadow Learners’
Second, Beane says, those sneaky "shadow learners" need to be understood better. "One approach I’ve seen work is to cultivate an organizational culture of psychological safety: one where people are more willing to share their shadow learning because they trust they won’t pay a price for it," he writes. Another possibility: Bring in neutral observers who can learn from the deviants but won't narc on them.
Change the Rules to Foster Mentoring Instead of Ignoring It
Third, he recommends changing workplace rules to encourage skill transfer rather than ignoring or discouraging it. A company might, for instance, decree that it's important to make sure that novice and expert level people spend time together. (And make that goal part of managers' performance reviews.) A hospital might track how much time new physicians get at the robot console, and lean on senior doctors to get those numbers up.
These kinds of changes will depend on a new and unfamiliar approach to AI and robots — one that balances desire for efficiency and convenience with the need to keep skill alive, to keep society from screwing itself. I hope that Beane is right, and that today's culture can indeed be fixed. If we can save ourselves, this startling and richly informed book will deserve much of the credit.