Robots at War
Russia makes threatening moves after Ukraine deploys a new combat robot. Also, Tesla fans seem to be gearing up for political combat over Autopilot.
A Tank-Killing Robot Vexes Russian Forces
With Russia apparently preparing to attack Ukraine in the coming weeks, there are signs that the conflict might not be as one-sided as it looks. Yes, the Russians have more tanks, artillery, fighter jets and other materiel. But the Ukrainians have a robot that has proven especially effective at destroying tanks.
In this fine-grained piece a few days ago, The New York Times' Andrew E. Kramer recounts how Ukrainian forces deployed --- for the first time in combat --- a Bayraktar TB2 drone. The Bayraktar, made by the Turkish firm Baykar Defense, wiped out a howitzer that Russian-backed separatists were using for "shoot and scoot" --- fire, then move quickly, before the defenders' guns can target it. The drone, piloted from the ground and likely armed with laser-guided bombs, found the enemy quickly enough to foil that strategy.
The Russian response to their clients' loss of one howitzer was intense: Their military scrambled jets and began moving tanks up to the border with Ukraine.
They didn't say why. The official Russian line is that Ukraine and the West are making it all up (Ukrainian and U.S. intelligence say they most definitely are not). Still, the Turkish robot has already shown itself to be a formidable new weapon, the sort that gets called a "game-changer" in combat operations. It has proven to be quite lethal specifically against Russian-made equipment. Fighting the Syrian Army a few years ago, Turkish forces came to call the drone the "Pantsir killer," for its record of destroying Russian-made Pantsir air-defense systems.
No wonder the Ukrainians, as Joseph Trevithick reports in this thorough rundown, have already bought a dozen Bayraktars and have plans to get more. Trevithick says they're also planning with Baykar to build maintenance and training facilities in Ukraine itself, and maybe even to make the robots on their own territory.
Of course the Russian military and its clients have robots of their own, and they've often taken advantage of the fact that their high-tech equipment is more advanced than the Ukrainians' (for example, using drones to jam Ukrainian communications and radar). But the Bayraktar could give Ukrainian forces a new advantage in at least some types of combat in this long-running conflict.
The drone has had a "game-changing" effect elsewhere. About a year ago, for example, Azerbaijan and Armenia fell from occasional clashes back into outright war over Nagorno-Karabakh, a territory in Azerbaijan with a large Armenian population. When the nations first warred over the region from 1988 to 1994, the Armenians won. But this time, they got a rude surprise. Their artillery and missile installations, and their tanks, were fat, easy targets for drones --- some Israeli, some Bayraktars.
In the Azeri-Armenian conflict, heavy, expensive tanks were pitted against light, cheap drones, and the drones won. (A Bayraktar reportedly costs about $5 million, which is cheaper than a missile-defense installation or a high-end tank, and a lot cheaper than a fighter jet.) It should not be surprising, then, that the Bayraktar is selling well. Niger just ordered some. So did Kyrgyzstan. So, according to The Guardian, did the government of Ethiopia, which is fighting an increasingly successful Tigray insurrection.
In its conflict with Ukraine, Russia has a lot more of everything to throw into the fray --- technology, troops, tanks, airplanes and so on. When it showed off its latest fighter jet in Dubai a few days ago, its promotional video included scenes of the plane shooting down a Bayraktar. But it's not unreasonable to think that Ukraine's adoption of these drones might have its adversary scrambling for new tactics.
Update, March 3: According to this report, Bayraktars have in fact helped the Ukrainians hold back Russian forces. And today the Ukrainian defense minister, Oleksii Reznikov, told the world Ukraine just deployed more Bayraktars, as James Hamblin reports here.
Did Tesla Fans Freak Out About a Researcher?
Though Hollywood prods us to fear a robot uprising, we’re all more endangered right now by machines that do what we expect them to. AI and robots have lulled drivers, pilots, missile crews and others into over-trusting the machinery, to the point where the humans couldn’t jump in effectively in a crisis.
When a device handles 98 percent of a task very well, human beings face two problems: First, we tend to assume that it's more capable than it really is, and, second, we have a hard time paying enough attention to cope when we land in the 2 percent of cases where the sh*t hits the fan.
"When they think automation is in charge, people do get complacent," Missy Cummings told me last year, when I interviewed her for a magazine piece I was writing. Cummings is a former Navy fighter pilot who later became a researcher on AI, robotics and how humans interact with those technologies. I'd called her because I'd read about a study in her Humans and Autonomy Laboratory at Duke. The experiment had people walk around a controlled streetscape so researchers could see how they'd behave in response to autonomous vehicles. When people in the experiment thought they were dealing with a robot car, they were more careless. They felt free, for instance, to step in front of the supposedly self-driving van --- because they apparently assumed that the machine couldn’t make a mistake, and thus would stop.
Over-trusting robots in this way is dangerous, especially when it’s combined with an inescapable human weakness: People are very, very bad at paying attention to a situation in which nothing is happening. (There's a reason "it's like watching paint dry" is a phrase used to describe an unpleasant experience.) Imagine, then, a system that does routine and easy tasks very well, causing most of your interactions with it to require almost zero attention. If and when it gets overwhelmed and you need to take over, you are at serious risk of not knowing what the hell to do in time.
Or, to put it more concretely, as Cummings did in our conversation, when we turned to Tesla’s Autopilot AI: "Men and Teslas are very dangerous. Men typically trust the technology far more than they should."
Driving is where robot hype --- which tells you the technology is better than it is --- actually gets dangerous. Drivers with impressive automation features in their cars ought to be aware of what the tech cannot do. Instead, they're exposed to ads, news and rhetoric that stress all the great things the machine can do. Cummings has been warning about this for years, especially with regard to Tesla's claims for autonomy and, more generally, the company's assurances that fully self-driving cars are moments away. (For example, after Tesla's Elon Musk, at an "Autonomy Day" event in 2019, predicted Tesla robotaxis would be deployable in 2020, Cummings tweeted: "My lab has been running controlled experiments on Tesla Autopilot & I can say with certainty that they are not even close to being ready. My student on this project should get hazardous duty pay.")
It is a good thing Cummings and other researchers are refuting hype and exploring how complacency and boredom interact with robots and AIs. And, I'd argue, it's a great thing for public safety that last month Cummings was appointed a "senior advisor for safety" at the National Highway Traffic Safety Administration.
Musk doesn't think so. About Cummings' appointment, he tweeted that he thinks she's biased against Tesla. More alarmingly, a lot of Tesla fans (owners? investors? cosplayers?) jumped in. One tweeted "If they try and take Autopilot away from us we will riot so hard January 6 will look like a day at Disneyland," which was maybe a joke (the tweeter said it was). Thousands signed a petition (which has since been taken down) at Change.org against the appointment.
You might be forgiven for thinking that the politics of robots would be kind of dispassionate --- that when it comes to policies about automobile AIs or elder-assistant devices or collaborative robots at work, people would make choices based on facts and figures. After all, these are just plastic, metal and algorithms we're talking about, right?
But the hullabaloo about this appointment is a harbinger. Passionate, irrational flag-waving identity politics is going to be part of robot debates. Instead of race or class or religion, the fighting clans will be self-driving car lovers versus haters, or sex-robot enthusiasts versus anti-sexbot crusaders, and so on. Having written a book about how people can make an intense, combative identity out of *anything* they have in common, I am not surprised. You shouldn't be either.
Instead, consider the nitty-gritty of the political situation. As David Zipper explains here at Slate, Tesla fans are worked up because they know that the regulatory cavalry is coming for their fantasies of robot magic. They don't like it, and they could turn (or be turned by skillful operatives) into a potent force against sensible regulation.