
Monday, Dec. 17, 2007 | University of California, San Diego’s Preuss School and the local housing market have a lot in common. In the early part of the decade, the reputation of the much-acclaimed charter school rode a wave of media hype, and Preuss’ potential seemed limitless. In recent months, a grade-tampering scandal has forced casual observers of the local education establishment to take a closer look, and they haven’t liked much of what they’ve found.

But the truth is, a lot of what we know today about the Preuss School — with the exception of the apparent grade fraud — we could’ve found out years ago, simply by looking at UCSD’s own analyses of student performance at the charter school. The problem is that few people have bothered to look.

And contrary to the stories Preuss boosters have consistently fed to the local media in recent years, those numbers have painted a much more nuanced picture of the school’s academic achievements. The numbers also tell us where to look to truly gauge the numerous positive effects the school has had on the lucky students who have been chosen to attend it.

Lies, Damned Lies, and Statistics

Since the Preuss School opened in 1999, many people have been enamored of its mission to provide the most rigorous academic preparation for the kids most at risk of failure in the public school system. Among those captivated by its apparent success have been the staff and editorial board of The San Diego Union-Tribune, which has dutifully reported on the honors and glowing reviews of the school. The problem is that most of those honors and reviews were based on deceptive, if not entirely meaningless, statistics.

Now, with the tide turning and blood in the water, the paper has seized on equally meaningless statistics to expose apparent problems at the school.

Consider Thursday’s story, printed on the front page of the paper’s local section. The story focuses on a number buried in a recent audit of the school showing that only 26 percent of Preuss students pass the national Advanced Placement exams, tests that allow students in rigorous academic classes to earn college credit for their high school work. That number, the paper reported, lags behind the San Diego Unified School District, where 45 percent of students taking the AP tests actually pass them.

The trouble with those numbers lies in the qualifier “students taking the AP tests”: the figures tell us very little. Unlike the school district, Preuss makes all of its students take AP classes and strongly encourages all of them to take the tests. In the district, a much smaller share of mostly high-achieving students enrolls in AP classes and chooses to sit for the exams. With a bigger proportion of students at Preuss taking the tests, we should indeed expect the average score to be lower — a point a Preuss board member makes in the U-T story, though one that gets lost behind the dramatic headline and lead.

If that last paragraph lost you, just think of it this way: If Preuss leaders simply told the lowest-performing half of students not to take the AP exams, they could double the school’s passage rate without improving the academic performance of a single student. You read that right: The passage rate is as much a measure of who takes the test as a gauge of how well they do.
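Here is the arithmetic as a minimal sketch, using made-up numbers rather than actual Preuss data: the same 100 students, with the same results, produce two very different passage rates depending only on who sits for the exam.

# Hypothetical numbers for illustration only, not Preuss data.
# Suppose 26 of 100 students would pass the AP exam if everyone took it.
scores = [1] * 26 + [0] * 74          # 1 = pass, 0 = fail

# Policy A: every student takes the test (roughly the Preuss approach).
rate_everyone = sum(scores) / len(scores)

# Policy B: only the top-scoring half sits for the test (closer to a district
# where mostly high-achieving students opt in).
top_half = sorted(scores, reverse=True)[:50]
rate_top_half = sum(top_half) / len(top_half)

print(f"Everyone tested:      {rate_everyone:.0%}")   # 26%
print(f"Only top half tested: {rate_top_half:.0%}")   # 52%

Same students, same knowledge, double the passage rate.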

With this in mind, consider the news from late May, when Newsweek magazine named Preuss the No. 9 high school in the nation. The U-T hyped that news, too. Take a closer look at the Newsweek methodology, however, and you’ll find that it, too, is based on the percentage of students at the school who took the AP test. Except this time, having more students take the test gets you a higher rank in the Newsweek index.

This point is important, so I’ll repeat it yet again: Having more students take the AP tests gets your school a higher score on indices like those released by Newsweek and U.S. News & World Report (more U-T hype), but a lower overall passage rate.

So, you might wonder, what about the results of the state’s standardized tests, which show that the Preuss School outperforms most of its peers? Those numbers would indeed be important, if the student body at Preuss represented the student body of the average California public school. But it doesn’t: To get into Preuss, students must fill out a lengthy application, collect letters of recommendation, and otherwise show that they can succeed. So the statewide test scores may simply tell us that better students choose to go to Preuss in the first place, not that the school does a better job of teaching them.

The Beauty of Randomness

In an ideal world, to get a really good measure of Preuss’ success, we would take a group of students, randomly assign some of them to attend the Preuss School, and send the rest to other schools. The random assignment would ensure that the two groups are statistically comparable (“probabilistically equivalent”) to begin with, and after some time, we could compare the kids who attended Preuss to their counterparts who did not.
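To make “probabilistically equivalent” concrete, here is a rough sketch with simulated numbers (not real applicant data, and not the actual UCSD analysis): split a pool of applicants at random, and the two halves start out looking nearly identical on whatever was measured beforehand.

# Simulated data for illustration only; not the real lottery or UCSD's analysis.
import random
from statistics import mean

random.seed(0)

# Pretend each applicant arrives with a baseline measure, say a prior test score.
applicants = [random.gauss(650, 80) for _ in range(1000)]

# A lottery: shuffle the pool and split it in half at random.
random.shuffle(applicants)
winners, losers = applicants[:500], applicants[500:]

print(f"Lottery winners' baseline average: {mean(winners):.1f}")
print(f"Lottery losers' baseline average:  {mean(losers):.1f}")
# The two averages come out nearly the same, so any gap that opens up later can
# be credited to the schools the students attended, not to who applied.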

Fortunately, we have just such an experiment: When too many students apply in a single year, Preuss uses a random lottery to decide who gets the chance to attend. And in June 2004, UCSD released a report on how the students who lost the lottery compared to those who won.

That report, which was posted on the university’s website, appeared to correct many of the myths that the press had spread about the Preuss School. For one thing, it found that Preuss students received essentially the same grades, and did almost identically well on the state standardized tests, as those who lost the lottery and went elsewhere. The biggest difference, the report suggested, was not academic achievement but the fact that Preuss students were completing far more college-prep classes and more of them were going on to college. Yet the local newspaper never published the findings.

More than six months later, in January 2005, UCSD’s student paper got hold of the report and wrote a story titled “Study: Preuss School students do not outperform some peers.” It wasn’t until three months later that the U-T wrote a similar story, citing a “recent report” that “raises questions about whether Preuss students would succeed with or without the school’s influence.”

In November 2005, the university released a follow-up report on its treasured charter school, with many of the same findings. Again, no one seemed to care, so the following March, Preuss School leaders decided to hold a press conference to publicize the research, inviting me (at the time I was a reporter at the student newspaper The Guardian), the U-T and voiceofsandiego.org.

Yet, at that press conference, the media did not seem very interested. The U-T reporter largely ignored the report, instead focusing her questions on what the Preuss leaders thought was the secret to the school’s success. Neither Voice nor the U-T wrote a story about the report or the press conference.

The Bottom Line

What the university’s own research tells us is that the Preuss School’s most important work — and this is indeed important — has been in helping its students complete the classes they need to qualify for admission to a University of California or California State University campus. It also requires all of them to apply to college, and helps them submit winning applications.

What it doesn’t appear to do is make students any more prepared for standardized tests, or help them get better grades.

That, folks, is the real story, and it has been for the past three years. The lower AP passage rates, the Newsweek honors and the grade irregularities may be newsworthy, but they are mere footnotes.

But don’t blame the media, which has always had a hard time with statistics. Just think of Brian Fantana, the man in the field from the San Diego movie classic “Anchorman.” In the movie, the astute journalist gets conned into buying a cologne called Sex Panther (“it’s made with bits of real panther, so you know it’s good”).

“They’ve done studies, you know. Sixty percent of the time, it works every time,” he says in one of the best lines in the movie.

Hopefully, our real San Diego journalists can do better.

Vladimir Kogan is a doctoral student at University of California, San Diego’s department of political science and a voiceofsandiego.org contributor. You can contact him directly with your thoughts, ideas, personal stories or tips at vladimir.kogan@voiceofsandiego.org or send a letter to the editor.
