For more than a year, residents in some of San Diego’s most heavily policed neighborhoods have been telling City Hall that officers engage in racial profiling.
Maria Morales, 38, is one of those residents. She said she has no doubt that what she and her boyfriend experienced a while back at a trolley stop was racial profiling.
“We have tattoos and we’re colored people. And there was a white couple. (The officers) totally bypassed the white couple and came up to us,” Morales said. “I already knew what to expect.”
Morales said the officers asked whether she was on parole.
“Just because I have tattoos and because I have long, dark hair and colored skin and a certain look, you know nothing about me. I’ve never been to prison in my life,” Morales said. “For you to automatically assume that, it’s a shaming feeling.”
Trolleys are usually patrolled by a combination of MTS officers, sheriff’s deputies and officers from a handful of departments from the region, including SDPD – but the sentiment is a pervasive one: Residents in certain neighborhoods believe they’re targeted by law enforcement officers because of their race.
Morales’ story and hundreds of similar anecdotes haven’t convinced elected officials or police brass that racial profiling is a reality in the San Diego Police Department. When she first took the helm of the SDPD, Chief Shelley Zimmerman was careful to acknowledge that community members believed they were profiled without saying whether profiling actually happened.
Since 2001, department leaders have said the data they have on race and policing is inconclusive. And they’re not alone.
At the public’s urging, law enforcement agencies across the nation now have an unprecedented amount of information on their officers – cell phone videos, body camera footage and traffic stop cards. But most haven’t drawn any wide-ranging conclusions with that data. The public dialogue on racial profiling still plays out like a debate, and often circles around the idea that the real problem is mutual misunderstandings about culture and police practices.
Criminologist Joshua Chanin said the departments are right to take their data with a grain of salt.
Earlier this year, Councilwoman Marti Emerald tapped Chanin and other researchers at San Diego State University to take a look at the San Diego Police Department’s data to gauge whether people of color are being pulled over for traffic stops disproportionately.
The quick-and-dirty method used by most departments, advocacy groups and media outlets has been to compare the rate at which certain racial groups are stopped with Census data. For example, if the share of black drivers stopped by police is higher than the share of black residents, many allege racial profiling.
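That Census benchmark can be sketched in a few lines. The stop counts and population shares below are made up purely for illustration; they are not San Diego's figures.

```python
# A minimal sketch of the "quick-and-dirty" benchmark: compare each
# group's share of traffic stops to its share of the Census population.
# All numbers here are hypothetical, for illustration only.

stops = {"white": 5200, "black": 1400, "hispanic": 2900, "other": 500}
census_share = {"white": 0.45, "black": 0.06, "hispanic": 0.30, "other": 0.19}

total_stops = sum(stops.values())
for group, count in stops.items():
    stop_share = count / total_stops
    ratio = stop_share / census_share[group]
    print(f"{group:8s} stop share {stop_share:.2%}  "
          f"census share {census_share[group]:.2%}  ratio {ratio:.2f}")
```

A ratio well above 1.0 is often read as evidence of profiling. But, as the researchers point out below, Census residents are a poor stand-in for who is actually on the road.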
“But this isn’t the most accurate assessment of those people that are driving,” Chanin said. “There’s a distinct difference between the driving population and the residency.”
Researchers call this “the denominator issue,” and Chanin will spend the next several months trying to get around it.
Animation by Jorge Contreras, KPBS
Researchers need a reliable baseline to find disparities, and Census population figures don’t provide that. Not everyone in a given area is old enough to drive or has a car. And San Diego’s proximity to the border and its tourism draw mean the driving population is constantly in flux.
So Chanin plans to use something called “the veil of darkness.”
“What this technique does,” Chanin said, “is it uses natural changes in light to isolate the effect of race on the likelihood that a driver will be stopped.”
Animation by Jorge Contreras, KPBS
The Veil of Darkness assumes two things. First, that it’s more or less the same people driving on a given street between 5:30 and 9 p.m. They’re coming home from a 9-to-5 or heading out for the night shift. Second, it assumes that an officer can better observe a driver’s skin color when the sun is up.
So researchers can compare traffic stop data from 5:30 to 9 p.m. in July, when it’s light out, with the same timeframe in January, when it’s dark. If a larger share of the drivers stopped during those hours in summer are people of color, that disparity suggests race is at play.
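The core of that comparison can be sketched in a few lines. The stop records below are hypothetical; a real analysis would use thousands of stops and a statistical test rather than a raw difference.

```python
# Veil-of-darkness sketch: within the same clock-time window (5:30-9 p.m.),
# compare the share of stopped drivers who are people of color when it is
# light out (July) versus dark (January). Records are hypothetical.

# Each record: (month, driver_is_person_of_color)
stops = [
    ("jul", True), ("jul", False), ("jul", True), ("jul", True), ("jul", False),
    ("jan", True), ("jan", False), ("jan", False), ("jan", False), ("jan", True),
]

def poc_share(records, month):
    """Share of stops in the given month involving a driver of color."""
    subset = [poc for m, poc in records if m == month]
    return sum(subset) / len(subset)

daylight_share = poc_share(stops, "jul")   # stops made while it's light
darkness_share = poc_share(stops, "jan")   # same hours, after dark
print(f"daylight: {daylight_share:.0%}, darkness: {darkness_share:.0%}")
```

If the daylight share is meaningfully higher than the darkness share, when officers can and cannot see the driver, that gap is the signal the technique looks for.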
The method isn’t perfect.
“The extent to which a driver’s physical characteristics predict the likelihood of being stopped or searched is one of a host of variables that may shape or may influence a police officer’s decision,” Chanin said. He’ll be able to control for the area’s crime rate and police presence in his statistical model, but whether the driver was making furtive movements or otherwise tipping off the officer can’t be accounted for.
Statistician Greg Ridgeway created the Veil of Darkness with his colleague, Jeffrey Grogger, and said it’s the best option out there. They came up with it after stumbling on an article in which an officer said it was impossible for him to target certain racial groups because he couldn’t see skin color while patrolling at night.
The strategy has since been used in five jurisdictions: Oakland, Cincinnati, Minneapolis, Syracuse and the state of Connecticut.
“It really gets at simply the direct question, ‘Does the ability to see the driver in advance influence which drivers the officer is going to stop?’ And that really gets a lot closer to the key question of racial profiling,” Ridgeway said.
To paint a fuller picture, Chanin’s group will also study whether people of color are more likely to get a ticket or be searched after they’re pulled over. And he’ll include interviews with community members and police officers.
If the final picture is one of systemic racial profiling, Emerald said the city will act.
“It’s a big if,” she said. “Maybe the study will show some improvements are needed. If so, I think our police chief is ready, willing and able to do something about it.”
To date, most researchers have not found systemic racial profiling using the Veil of Darkness. But Ridgeway said that doesn’t mean there shouldn’t be a call to action when the study comes out.
“You can get past the question of whether the department is race-neutral to get to, ‘Well, let’s solve these individual cases of individual officers and individual incidents and try to minimize the risk of those,’” Ridgeway said.
In Cincinnati, the result was special software that would more accurately scan the department for problem officers. Most systems only track complaints and disciplinary problems. And officers have slipped through the cracks in San Diego.
Ridgeway’s system customizes a benchmark for each officer and looks for abnormal enforcement patterns.
“It’s basically comparing that officer to peers that would be exposed to the same kinds of suspicious characters, the same kinds of interactions,” Ridgeway said.
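One simple way to picture a peer benchmark, much cruder than Ridgeway's actual system, is to compare each officer's enforcement pattern to officers working the same division and shift and flag large deviations. Everything below, including the officer IDs, shares and threshold, is hypothetical.

```python
# Toy peer benchmark: flag officers whose share of stops involving drivers
# of color sits far above peers on the same division and shift. This is a
# simplified illustration, not Ridgeway's method; all data is hypothetical.

from statistics import mean, stdev

# officer ID -> (division, shift, share of that officer's stops involving
# drivers of color)
records = {
    "A12": ("central", "night", 0.42),
    "B07": ("central", "night", 0.45),
    "C33": ("central", "night", 0.44),
    "D91": ("central", "night", 0.71),  # outlier relative to peers
}

shares = [share for _, _, share in records.values()]
mu, sigma = mean(shares), stdev(shares)

# Flag anyone more than 1.2 standard deviations above the peer mean;
# a real system would use far more data and a principled threshold.
flagged = [oid for oid, (_, _, s) in records.items() if (s - mu) / sigma > 1.2]
print("flagged for review:", flagged)
```

The point of peer comparison is the same as the veil of darkness: find a fair baseline, so an officer patrolling a high-crime beat isn't flagged simply for making more stops than an officer elsewhere.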
Back at the trolley stop, Morales said she wishes her voice were proof enough for city leaders. But she said she welcomes the study if it means greater accountability for officers.
“I think we’re just kind of desperate,” Morales said. “Just give us something that will support what we’re already saying.”
The study results will come in three phases. The first is expected in late October.