What to Make of "Robot Police Dogs"
Fact and fantasy mix in the politics of law enforcement's use of Spot
Boston Dynamics’ Spot, rebranded in NYPD blue as “Digidog.”
Encounters with artificial intelligence are seldom consensual. Am I willing to click on photos of traffic lights so an algorithm can make sure I'm human? Will I say "yes" to the bank blocking my credit card because an algorithm decides I'm acting fishy? I've never been asked. (Please don't reply that I signed user "agreements"; no one reads those.) In these encounters, a few people have dropped new tech on the rest of us. And we cope with their choices.
Older technologies went through this phase as well (weavers didn't get to vote on adopting mechanical looms, either). But things like factories and cars have now been around a while. Norms, regulations and standards have developed to structure people's relations with the machines. So, for example, your local factory can't install whatever equipment the owner fancies, willy-nilly. Laws govern worker safety and equipment maintenance on the job. And the harm you can inflict with a car is hemmed in by laws about licensing, insurance and the rules of the road.
New, AI-based tech doesn't have so many guide-rails. (You can't just hop in a car and take off, but you can jump into a social network with a click.) In the absence of civic infrastructure for AI and related technologies, many of us feel — rightly — that we're on our own.
Even fewer consensus guide-rails exist for robots. When automated mall-patrollers, assembly-line exoskeletons or autonomous floor-moppers pop up one day, people can't call upon a store of law, convention and experience to know how to live with them. More importantly, they don't have obvious political channels to register their reactions. There’s no Robot Complaint Board, no Robot Safety hearing every third Wednesday at the town hall. The robot-safety agencies that do exist around the world are rooted in the factory setting where most of today's robots work. They're just starting to take on the prospect of robots in homes, schools, hospitals and other human places.
Eventually, I guess, we’ll develop a shared understanding of how people engage with these machines — laws, conventions, expectations and day-by-day experiences that together will make up the civics of robotics in democratic nations. But we’re now still in the stage of messy conversations — confused, ambivalent, contradictory — as robots pop up where they haven't been before.
Case in point: The recent adoption of an unfamiliar, exotic, very roboty robot by a few law enforcement agencies in the U.S.
Those agencies — the New York City Police Department, the Massachusetts State Police, and the Honolulu Police Department — recently and quietly acquired Spot, a glamorous four-legged walking robot made by Boston Dynamics. And in all three places, some people were surprised and kind of freaked out.
That freakout is instructive. The part of it that made sense, and the other part that was off base, can both suggest how a future civics of robots should be structured.
Let’s begin with New York City. Late last month Daniel Valls of Freedomnews.tv took this video of New York police in the Bronx, walking with a Spot robot. Someone on the video can be heard saying "that thing is creepy" as the device steps methodically along the snowy street, its feet tap-tapping and its mechanisms whirring. A few seconds later what sounds like the same voice says, "how do we get one of those?" Score that guy as "ambivalent." And let’s say Lesson 1 for robot civics is that the constituencies will not be just “pro” and “anti.” There will be people with mixed feelings.
Spot is made by Boston Dynamics, a 30-year-old company whose mission is to build automation for places that most other robots can’t handle — dirt tracks, crawlspaces, stairs and the like. That’s why their robots walk on legs instead of rolling around on wheels. The walking gives Boston Dynamics robots a spooky ability to mimic animal movements (when their human controller directs — they are remote-controlled like a flying drone). This is part of the reason Black Mirror creator Charlie Brooker was inspired by a different Boston Dynamics robot to write "Metalhead.” That's the episode in which Spot-like robots have gotten their own "brains," a lot of armament, and a directive to kill every living thing they find (none of which real Spots have).
This episode has shaped some of the politics of Spot, so we could say Lesson 2 of robot civics is this: Fiction matters. If people can easily imagine a path from the real robot to the one they fear, then the fantasy robot will influence their opinions. That happened in New York.
Spot wasn’t built with any particular use in mind. It’s designed to carry whatever tools its user finds useful (video cameras, radiation monitors, air samplers, thermal scanners to name just a few examples).
"It’s a platform," Michael Perry, the company’s Vice President of Business Development, told me when we spoke last summer. "We assumed our customers would teach us what environments the robot would be successful in." When Spot went out in the world (first in lease arrangements in 2019, then available for outright purchase for $74,500 in 2020), early adopters used Spots to check on things at a kiwi farm, monitor a nuclear power plant, and inspect construction sites.
Then the Covid-19 pandemic hit, and governments and organizations put their Spots to work delivering take-out food, greeting hospital patients, and dancing at baseball games for a team owned by SoftBank, which at the time also owned Boston Dynamics. Boston Dynamics’ attitude toward this could, I think, be reasonably described as it’s not what we had in mind, but, hey, you do you. In Singapore, one Spot was outfitted with a camera and loudspeaker so that a city official could drive it around Bishan-Ang Mo Kio Park, observing people and playing prerecorded "let's keep Singapore healthy!" reminders. In Boston, Brigham and Women's Hospital used a Spot fitted with an iPad to greet arrivals. A team of researchers at MIT developed instruments that could be mounted on Spot that would allow doctors to measure blood oxygen saturation (further reducing the amount of human contact patients and health-care workers had to risk).
Most of the 18,000 U.S. law-enforcement departments are small. They can’t afford Spot. Some have the much simpler and more familiar bomb-disposal robots (though most have to borrow even those when they need them). But a few have more resources, and among those the three I mentioned decided to test Spot for police work.
In New York, cops have been using Spot for at least six months, in settings where a human officer might be endangered. For instance, last October they sent the robot into a house where a man had barricaded himself after shooting someone, as Tina Moore and Amanda Woods reported in the New York Post. (Cops captured the guy. It’s not clear what role the NYPD's Spot played in the arrest.)
Back then Spot was still painted in Boston Dynamics’ construction-site yellow, suggesting the trial was in its early days. Some weeks later, Spot was used in a hostage situation in Queens, where at one point it delivered food to the suspects and the five people they were holding hostage.
Then came last month’s Bronx video, taken as cops were bringing in a Spot unit to help search for two men who had kidnapped and tortured two other men in a nearby apartment. This time, the machine was painted NYPD blue and labeled "Digidog," suggesting the department has committed to Spot. From the POV of a cop on the job, this is not hard to understand. If you were asked to go look in a dark, unfamiliar house for some desperate armed criminals, would you choose (a) to go in yourself and take your chances or (b) to send in a camera drone first?
Unlike earlier Spot appearances in New York, this one went locally viral, and not in a good way.
Several news stories quoted the "creepy" line, and The Guardian headlined its piece "A dystopian robot dog now patrols New York City," which managed to be wrong on all counts. (Spot isn’t patrolling anywhere in New York, and its "brain" is there to keep it balanced and coordinate its four legs, not to bite you, follow you home, nudge you for a walk or make any other doglike choice. So, despite its wink-wink name, it's not a dog — it's a drone that carries instruments to places where wheels and propellers wouldn't work. That doesn't sound very dystopian to me.)
Politicians and activists weighed in too. Congresswoman Alexandria Ocasio-Cortez, in whose district the Bronx incident took place, tweeted about "robotic surveillance ground drones." Speaking to Maria Cramer and Christine Hauser for this New York Times story, a policy analyst at the American Civil Liberties Union said adding robot capacities to police power raised concerns about bias, privacy, surveillance and the possibility that bad actors could hack the machines.
That echoed criticisms from the ACLU in Boston in 2019, after Massachusetts State Police tested Spots in their bomb squad, according to this report from radio station WBUR.
In Honolulu, the Police Department paid for a Spot with federal funds designated for “necessary expenditures incurred due to the public health emergency." That appears to have avoided triggering civil-liberties anxieties. But it left the department open to criticism on fiscal grounds. Christina Jedra’s account at Honolulu Civil Beat quotes citizens, politicians and even a cop complaining that purchasing Spot wasn’t a good use of public money — especially as the justification for the robot is that it would be used for telemedicine. (Unless the consultations were with patients who live at the top of trackless jungle mountains, Spot is probably tech overkill for greeting patients and taking measurements. A simple remote-controlled wheeled device costs a great deal less.)
From these reactions, I think we can draw another civics point. Call it Lesson 3: Quietly dropping a robot into people's lives, as if it were just the latest model iPad, isn't the best strategy when the machine works for an agency that can ruin people's lives, or end them. When the robot’s appearance takes the average person by surprise, there’s a lot of room for misinterpretation and exaggeration.
Take, for instance, the idea that Spot is a "surveillance drone."
This robot's top speed is that of a typical walking human. It runs for about 90 minutes before its battery hits "empty." The latest model can recharge itself when low, but to do that it needs to return to a special charging station. And Spot is neither especially quiet nor inconspicuous. I have driven a Spot myself (thanks to the people at Formant, who kindly offered me a hands-on demonstration of their control system for autonomous robots). As I walked the robot through San Francisco, my POV video feed was filled with people stopping, staring and photographing.
So, to review, Spot is about as inconspicuous as a Macy's parade balloon. You can get away from it by upping your walking pace from "normal" to "running a little late." You can outlast its battery by hunkering down and watching a movie, unless it's a recharging model, in which case you can disable it by bashing in its charging station. And if you imagine that a police department will make up for those limitations by flooding the streets with many Spots, remember two facts. First, each one costs $74,500 (not counting the camera or mic or any other equipment you want it to carry). Second, each active Spot requires an officer to control it. Most departments have neither the money nor the personnel for that.
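To see why outrunning or outlasting Spot is so easy, here's a back-of-the-envelope calculation. The ~90-minute battery figure is from above; the walking-pace top speed (roughly 3 mph) is my assumption:

```python
# Rough ceiling on Spot's range as a supposed "surveillance drone."
# TOP_SPEED_MPH (~typical human walking pace) is an assumption;
# the ~90-minute battery life figure comes from the text above.
TOP_SPEED_MPH = 3.0
BATTERY_HOURS = 1.5

max_range_miles = TOP_SPEED_MPH * BATTERY_HOURS
print(f"Max range on one charge: {max_range_miles} miles")  # 4.5 miles
```

And that 4.5-mile ceiling assumes the robot moves flat-out the whole time, which no operator would do; real patrol range would be considerably shorter.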
New York City's police department is required by a recent law to report to the public on its high-tech spy tools. As far as I know, it has not included Spot in these documents. I think this is because Spot simply isn't a surveillance device.
Why does it matter that one Spot-inspired fear was fantastical? Imagining Spot as a spy drone led politicians to complain about the wrong problem. Ocasio-Cortez, for example, tweeted:
Spot wasn't being tested on that community, or any other. It was being tested in a house with a hostage crisis happening in it. Such a situation is more likely to occur in a low-income Bronx neighborhood than on Manhattan's super-rich Park Avenue, for reasons rooted in structural and systemic racism. But that doesn't mean the neighborhood was targeted.
I don't mean to suggest that critics should confine themselves to the specs of today’s model of any robot. It's reasonable to worry about the future powers and uses of a device. But worries about what it will do should be realistically based on what it does do. Maybe that should be Lesson 4: Make future projections based on current realities. By that criterion, I'd argue that it isn't reasonable to worry about Spot-surveillance.
On the other hand, it is appropriate to ask if the robot could be used to kill people.
Boston Dynamics’ policy is that Spot should never be used to harm or intimidate human beings, Perry told me. And in 2019 the robot could only be leased, so the company could yank it back in case of abuse. In 2020, though, Spot went on sale, making it hard to see how the company could prevent all abusive uses.
In fact, Spot has already been outfitted with a gun — by the art and marketing firm MSCHF, which earlier this month attached a paintball rifle to the robot. That was for "Spot's Rampage," an online event where anyone on the web could use the robot to blast away at whatever target they chose.
Boston Dynamics was not happy:
But the mischief-makers at MSCHF have a point. Their project wasn’t about the robot-makers’ intentions. It was about what Spot can be made to do, at least once, right now.
Off-label uses of simpler robots by police have already occurred. In November 2014, Albuquerque police used a robot to launch “chemical munitions" into a hotel room where a suspect had barricaded himself. (He gave himself up.) In June 2018, police in Dixmont, Maine, attached explosives to a bomb-squad robot and detonated it near a gunman they were trying to flush out. He came out firing after the explosion, and was wounded and arrested. And on July 7, 2016, Dallas police attached about a pound of C-4 explosive to a bomb-squad robot, steered it up to a wall near an active shooting suspect, and detonated the charge. In the explosion, the suspect, Micah Xavier Johnson, was killed.
The NYPD says it isn’t using Spot robots outside of hostage and hazardous-materials situations.
For the moment, your conviction that you’re safe from armed Spot robots requires trust that police departments won't use them for abusive ends.
That, in 2021, is a big ask. Robots are supposed to enhance and augment human capacities. If the humans they serve have bad intentions, robots will help them realize those intentions more quickly and more forcefully. For this reason, a New York City Council member, Ben Kallos, recently introduced a bill that would explicitly ban police robots from ever being armed (did he mention that Black Mirror episode? Reader, he did). If passed, his proposal would add another layer of civic protection to the no-armed-robot statement of the NYPD and the Boston Dynamics policy. No higher-up could propose arming robots, and no cop on the street could jerry-rig a gun to a robot, without knowingly violating the law.
Is that enough? I don’t know. I know some robot-makers think it definitely is not. They believe American policing is so inherently racist and oppressive that no department should get robots of any kind until the whole institution is reformed. You can read their "No Justice No Robots" manifesto here. Other roboticists think that’s misguided, on grounds that robot powers (for example to communicate with armed suspects or to map a room where they are hiding) can make cops less dangerous to civilians. You can read more about that controversy in a piece I wrote about the issues here at the New York Times.
A broader issue of fairness and equity involves government priorities. The NYPD spent at least $75,000 on each Spot (and probably more, as the platform has to be equipped with devices like mics, speakers and cameras). As I said, if I were facing a dark room with an armed man in it, I'd be happy to have Spot. But New York, like all American cities, is in a deep fiscal crisis post-pandemic, and a lot of other needs, which come up much more often than armed confrontations, are going unmet. And is a device that is going to show off police power in poorer neighborhoods (see above point about crime and racism) the best use of government money? As Ocasio-Cortez put it:
Lesson 5 for robot civics might then be this: Always ask what government is giving up because it acquired a robot. It's odd how rarely this question comes up in robot news. Perhaps because robots are still exotic and exceptional, people don't think about them in terms of the big numbers in which government deals. And in the trial phases of new robots, this actually makes sense. (In the New York Police Department’s $6 billion budget, the cost of even 10 Spots is barely a rounding error.) But if the point of a trial is to consider widespread deployment, then the cost starts to add up. And even when the amounts are relatively small, robots — because they’re new and exotic — invariably make a statement about where government’s attention and priorities are.
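The "rounding error" claim is easy to check with arithmetic. The per-unit price and the rough budget figure come from this article; the fleet size of 10 is hypothetical:

```python
# Back-of-the-envelope: what would a small Spot fleet cost relative
# to the NYPD's budget? Price and budget figures are from the article;
# the fleet size is a hypothetical.
SPOT_BASE_PRICE = 74_500         # list price per unit, USD (2020)
UNITS = 10                       # hypothetical small fleet
NYPD_BUDGET = 6_000_000_000      # approximate annual budget, USD

fleet_cost = SPOT_BASE_PRICE * UNITS
share = fleet_cost / NYPD_BUDGET
print(f"Fleet cost: ${fleet_cost:,}")    # Fleet cost: $745,000
print(f"Share of budget: {share:.4%}")   # Share of budget: 0.0124%
```

About a hundredth of one percent — which is the rounding-error point, but also shows how quickly a move from 10 trial units to citywide deployment would change the math.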
So, to sum up, “robot police dogs” are not coming to your street any time soon. But these first fights over early deployments can serve as a preview of the robot politics to come.
This and That
Warehouse robots: While we’re talking Boston Dynamics, the company has a new robot out this week: Stretch, an arm-and-gripper for unloading trucks and moving freight around a warehouse.
Robot cheetahs: They’re a step closer, now that there’s a vast new database available that maps how real cheetahs move.
As Jack Clark explains here, researchers from South Africa, Japan and Switzerland have made "a large-scale annotated dataset, consisting of ~120,000 frames of multi-camera-view high speed video footage of cheetahs sprinting, as well as 7588 hand-annotated images. Each annotated image is annotated with 20 key points on the cheetah (e.g, the location of the tip of the cheetah's tail, its eyes, knees, spine, shoulders, etc). Combined, the dataset should make it easier for researchers to train models that can predict, capture, or simulate cheetah motion." The paper (from which the cheetah-map above is taken) is here.
Tele-operated forklifts: French logistics company Geodis is trying out tele-operated forklifts, to reduce the number of workers inside warehouses, open the job of operator to people who live far from the worksite, and to attract workers who might not be able to work in a warehouse but can nonetheless work the machine. Kirsten Korosec reports.