
City Hall was in damage control mode. It was early 2019 and I’d gotten word that managers were worried they’d opened a backdoor to a rather intense surveillance effort. They’d started hosting community forums in anticipation of the backlash.
Part of me couldn’t believe what I’d heard. The City Council had rolled out thousands of cameras as part of an environmental project and didn’t ask a single question about how the streetlight technology would benefit police or impact people’s civil liberties?
Turns out, that’s exactly what happened. Officials acknowledged that they’d made mistakes upfront but said they’d installed guardrails to protect people’s privacy and mitigate the potential for abuse. But in the process, they’d allowed the Police Department to write its own rules for accessing the devices.
I wrote up what I’d found and moved on professionally, assuming that the attention would fade on this issue, as it does on most issues in San Diego.
But in 2020, a group of activists and community members made damn sure that nobody forgot how the streetlights program came together, because buried in that history were lessons for the entire region. The Trust SD Coalition lobbied hard and found sympathetic members of the City Council willing to champion their cause within the power structure.
By year’s end, officials had approved the first draft of an ordinance laying out ground rules for the acquisition and use of surveillance gear. The City Council is also on its way to establishing a privacy advisory board.
A good amount of my time in 2020 was spent investigating this and other surveillance efforts. We’ve seen an explosion of interest in smart technologies, as companies make the case to cash-strapped public agencies that one gadget or another can improve their operations. Qualcomm has plans to accelerate the trend.
The boosters of smart cities — who often have something to sell — view the problems of society as merely technical in nature. They offer new tools in the name of efficiency and convenience that seem impartial and nonideological on the surface. But there are value sets embedded in any form of technology, and officials ignore this reality at the risk of exacerbating rather than flattening inequities.
As Ben Green notes in his excellent book “The Smart Enough City,” “Technology can be a valuable tool to promote social change, but a technology-driven approach to social progress is doomed from the outset to provide limited benefits or beget unintended negative consequences.”
Some of the proposals I’ve seen around San Diego seem benign. Carlsbad, for instance, is in the process of modernizing its IT infrastructure and looking at physical sensors buried in the road that can count cars without capturing any video. Still, the city surveyed residents online and found that 80 percent of respondents were either “very” or “somewhat” worried about the privacy implications of this digital transformation.
What worries me are the more invasive types of technologies and the nonchalant, patronizing way in which officials talk about these things. A defense contractor wanted to fly a military-grade drone over San Diego this year and the city’s Office of Homeland Security — citing the mayor and police chief — suggested removing any references to public safety from the press release. Before the SkyGuardian was rerouted to the desert, General Atomics had planned to use its technologies to detect speeders on the freeway.
What else have we not been told about? San Diego is, after all, on the border and home to various federal agencies that operate with considerably less oversight.
But there are lessons in the projects we already know about. Here are a few of my takeaways from the last year.
Smart Tech Bends Toward Law Enforcement
It’s not just military-grade drones. The smaller, commercial ones are becoming more popular, and Chula Vista has emerged as a national leader. Police in the largest South Bay city are using drones — sophisticated enough to automatically avoid obstacles — to respond to 911 calls and broadcast messages to the homeless.
In San Diego, the streetlight system became exclusively a crime-fighting tool after the City Council defunded it. The city’s Sustainability Department tried to shut the whole thing down but the private Florida-based company that owns the underlying tech offered to keep it going, free of charge, for cops. The CEO wrote in an email: “Clearly these are unprecedented times for law enforcement.”
There’s evidence to suggest that San Diego officials had considered tapping into the camera network in real time, but the conversation ended as public pushback began to mount. As far as I can tell, no one at SDPD is sitting in front of a wall of computer screens, flipping back and forth between feeds.
The streetlights are reactive tools of law enforcement. Footage deletes itself after five days. Upon request, the city releases a public access log with basic information about who went looking for evidence and why.
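To make those guardrails concrete, here is a minimal sketch of what a five-day purge and an access log might look like in code. It is purely illustrative: the city hasn’t published its implementation, and the file paths, field names and helper functions below are my own assumptions.

```python
import os
from datetime import datetime, timedelta

# Illustrative only: an assumed retention window and storage layout,
# not the city's actual system.
RETENTION = timedelta(days=5)
FOOTAGE_DIR = "/var/streetlight/footage"    # hypothetical path
ACCESS_LOG = "/var/streetlight/access.log"  # hypothetical path

def purge_expired(now=None):
    """Delete any clip older than the retention window."""
    now = now or datetime.now()
    for name in os.listdir(FOOTAGE_DIR):
        path = os.path.join(FOOTAGE_DIR, name)
        modified = datetime.fromtimestamp(os.path.getmtime(path))
        if now - modified > RETENTION:
            os.remove(path)

def log_access(officer_id, case_number, reason):
    """Record who went looking for evidence, and why."""
    with open(ACCESS_LOG, "a") as log:
        log.write(f"{datetime.now().isoformat()}\t{officer_id}\t"
                  f"{case_number}\t{reason}\n")
```

The point of the sketch is how thin the guardrails are: the retention window and what gets logged are policy choices expressed in a few lines, and changing them takes nothing more than an edit.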
By and large, investigators have used the network to prosecute people for violent crimes. But there are also examples of the devices being used to investigate not-so-violent crimes, which demonstrates another important point: how we use technology evolves. And in the era of smart devices, that evolution can arrive with a simple software update.
It’s not hard to imagine a day when an artificial intelligence could monitor all the footage and predict behaviors. This isn’t science fiction. The military is already developing this type of technology overseas, and it’s bound to come home.
It’s reasonable to question what happens to our civil liberties when a digital dragnet is established and a computer sets the criteria. The threat rises when one device or database is hooked up with another: more data points of someone’s life to stitch together and more opportunities for abuse.
Palantir sells software that can analyze criminal records, gang member databases, license plates, social media archives and jailhouse telephone records to calculate the likelihood someone will commit a violent crime in the future. We know that police agencies in San Diego use the software, but we don’t know how. I’ve asked for records and been stonewalled in the name of public safety.
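We can still sketch the general shape of this kind of scoring. Everything in the example below is invented for illustration: the data sources, the weights and the cutoff say nothing about how Palantir’s product or any local agency actually works.

```python
# Hypothetical illustration of fusing linked databases into one score.
# The sources, weights and threshold are invented for this example;
# they do not describe Palantir's software or any agency's setup.
WEIGHTS = {
    "prior_arrests": 0.4,
    "gang_database_hits": 0.3,
    "plate_sightings_near_incidents": 0.2,
    "flagged_social_posts": 0.1,
}

def risk_score(person: dict) -> float:
    """Combine counts from linked databases into a weighted score."""
    return sum(w * person.get(field, 0) for field, w in WEIGHTS.items())

subject = {
    "prior_arrests": 2,
    "gang_database_hits": 1,  # a single, possibly stale, entry
    "plate_sightings_near_incidents": 5,
    "flagged_social_posts": 3,
}

if risk_score(subject) > 2.0:  # an arbitrary cutoff a computer enforces
    print("flagged for follow-up")
```

Notice where the judgment calls live: in the weights, the cutoff and the quality of every linked database. One stale gang-database entry feeds straight into the number, and nobody outside the system can audit it.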
The backers of this technology also like to argue that surveillance can hold everyone accountable. Streetlight cameras have cleared people of wrongdoing, they point out. But the benefit seems to move largely in one direction. Even if I wanted to, I couldn’t tap into the cameras to prove an officer committed misconduct.
People made the same argument about body-worn cameras years ago. Yet as my colleague Lisa Halverstadt reported in July, the Metropolitan Transit System purged footage of officers citing a homeless man before his attorney could obtain it for use in his defense.
Smart Tech Can Be Expensive
Flattery is one of the oldest tricks in the salesperson’s arsenal. The subtext of many smart tech pitches is that you’d be a moron not to buy this. But just as often you’ll hear someone say: You can’t afford not to do this. It’ll literally pay for itself.
Both San Diego and the Port of San Diego heard this when officials were in the process of rolling out their streetlight sensors. By dimming lights from afar, the agencies could recoup their costs through energy savings, the pitch went.
It didn’t work out that way. NBC 7 reported in February that the costs of the project were ballooning. Officials didn’t have the expertise they needed, so they wound up hiring a data scientist in-house. (Some of the devices were purchased with federal anti-poverty dollars.)
The Port also discovered that the data wasn’t terribly useful because it wasn’t terribly sophisticated. They could get the same, if not better, data by using old-fashioned methods of counting people and cars.
The Port wound up dismantling its project after a year-and-a-half trial run. I asked Job Nelson, the Port’s chief policy strategist, what it would take to revive the streetlight system and he mentioned costs — they needed to come down. But he also reinforced the need to get buy-in from visitors and tenants.
“The public needs to get comfortable with what you’re doing with the data,” he said. “I know I heard from people on the outside saying to us, ‘Well, the private sector uses this stuff all the time.’ My response back was, ‘We’re not the private sector. We’re the public sector. And the public expects something different of us.’”
The Inevitable Eye Roll
I regularly hear people say, “Well, if you’ve got nothing to hide, what’s the problem?” The Union-Tribune published several letters to this effect over the summer, making the case that more surveillance was a good thing.
These people miss the point. Everyone has something to hide. Yes, everyone. There are things about you that others have no business knowing — like your sexual preferences or your political affiliations or the last time you went to a doctor and why. Wanting to carve out a little space in this cruel and competitive world of ours does not make you a criminal. It makes you human.
That’s where the conversation gets really interesting, though. I’ll often ask people for their Facebook password or, hell, the key to their front door. The response is always the same. They scoff.
Then comes another popular refrain: “The laws of the land protect my stuff from unreasonable search, but there’s no expectation of privacy in public.” That may be the case, even though privacy is enshrined as an inalienable right in the California Constitution.
But again, this argument misses the point: the unblinking eye of a camera effectively turns everyone into a suspect. The constant monitoring of common space is corrosive on its own terms because it breeds paranoia and gives our infrastructure the veneer of a fortress. The 18th-century philosopher Jeremy Bentham spoke of the panopticon, a prison with a large central tower where guards could see without being seen themselves. The inhabitants couldn’t say for sure if they were or weren’t being watched, so they assumed they were at all times. What Bentham imagined was a form of social control.
Consider the threat to civil liberties on a less abstract level: What if a police officer who had access to the cameras wanted to track his ex-wife? What if a politician wanted to unmask the source of a leak and used the cameras to follow a journalist?
I’m not opposed to police using cameras to catch murderers. It’s a question of risk and what we’re all willing to give up to feel a little safer at a time of historically low crime rates.