San Francisco made history in 2019 when its Board of Supervisors voted to ban city agencies including the police department from using face recognition. About two dozen other US cities have since followed suit. But on Tuesday San Francisco voters appeared to turn against the idea of restricting police technology, backing a ballot proposition that will make it easier for city police to deploy drones and other surveillance tools.
Proposition E passed with 60 percent of the vote and was backed by San Francisco Mayor London Breed. It gives the San Francisco Police Department new freedom to install public security cameras and deploy drones without oversight from the city’s Police Commission or Board of Supervisors. It also loosens a requirement that the SFPD get clearance from the Board of Supervisors before adopting new surveillance technology, allowing the department to seek approval at any point within the first year of use.
Matt Cagle, a senior staff attorney with ACLU of Northern California, says those changes leave the existing ban on face recognition in place but loosen other important protections. “We’re concerned that Proposition E will result in people in San Francisco being subject to unproven and dangerous technology,” he says. “This is a cynical attempt by powerful interests to exploit fears about crime and shift more power to the police.”
Mayor Breed and other supporters cast the proposition as a response to concerns about crime in San Francisco. While crime rates have generally decreased, the city has seen a recent spike in fentanyl overdose deaths, and its downtown commercial districts are still grappling with office and retail vacancies left by the pandemic. The proposition also had backing from tech-industry-aligned groups such as the campaign group GrowSF, which did not reply to a request for comment.
“By endorsing the work of our police force, increasing our use of technology, and getting officers away from their desks and onto our streets, we will carry on our aim to make San Francisco a safer city,” Mayor Breed said in a statement after the proposition passed. She pointed to city figures showing that 2023 saw the lowest crime rates in ten years, excluding a temporary dip in 2020 during the pandemic, and that property and violent crime rates have fallen further so far in 2024.
Proposition E also provides police with greater flexibility to engage in car chases and reduces their paperwork duties, including in situations where officers use force.
Caitlin Seeley George, the managing director and campaign director for Fight for the Future, a nonprofit that has long opposed face recognition, called the proposition “a setback to the intensely fought reforms that San Francisco has been promoting in the past few years to control surveillance.”
“By expanding police use of surveillance technology, while simultaneously reducing oversight and transparency, it undermines people’s rights and will create scenarios where people are at greater risk of harm,” George says.
Although Cagle of the ACLU shares her concern that San Franciscans will be made less safe, he says the city should retain its reputation for catalyzing a nationwide pushback against surveillance. San Francisco’s 2019 face recognition ban was followed by roughly two dozen other cities, many of which also added new oversight mechanisms for police surveillance.
“What San Francisco started by passing that ban and oversight legislation is so much bigger than the city,” Cagle says. “It normalized rejecting the idea that surveillance systems will be rolled out simply because they exist.”
The San Francisco mayor’s office hasn’t said which types of drones, surveillance cameras, or body-worn cameras police might use under the new rules. Anshel Sag, a principal analyst at the tech research firm Moor Insights & Strategy, notes that almost all newer drones on the market have some form of face recognition technology built in. Some of Insta360’s action cameras include it, he says, as do drones made by DJI, the world’s largest commercial drone maker. “DJI’s cameras use it to track a person and stabilize the video capture,” he says.
In some cases, the customer may be able to toggle off tracking options. And, Sag adds, the video-capture technology may be more coarse and not specifically track a face. But this isn’t always clear to users of the technology, he says, “because the object-tracking algorithms operate like a black box.”
Saira Hussain, a senior staff attorney at the Electronic Frontier Foundation, notes that San Francisco’s existing ban on face recognition allows the police department to possess devices with the technology built in, so long as it is a manufacturer-installed capability. (San Francisco’s Board of Supervisors had to update the law to make iPhones, which use face recognition technology to unlock, legal.) The law stipulates that such devices not be acquired for the purpose of using face recognition in policing.
Of greater concern to the EFF, Hussain says, is that Proposition E allows a degree of secrecy around surveillance technologies trialed by the SFPD, which can go undisclosed for as long as a year. “It’s about making sure the police stick to the contours of the law,” she says.
Additional reporting by Amanda Hoover