My readers are so smart. We’ve already been talking to school district officials about why San Diego’s scores on a national exam grew only slightly while state test scores jumped. Now you’ve chimed in with your thoughts:
• Sue Moore, a homeschooling mom who has a doctorate in educational leadership, argues the biggest problem with making sense of test scores, especially the state tests, is “the massaging of raw scores.” Let me explain: What I and other reporters usually call state scores is actually a score derived from multiple tests that students take in California schools.
Moore says the scoring is such a nightmare to understand, and the way the data are reported has changed so often, that it’s difficult to draw valid comparisons from one year to the next.
• Attorney Tyler Cramer, who sat last year on an advisory group on the national tests, says the national tests are the gold standard and that we should take them, and their less rosy results, more seriously than state tests.
But neither set of tests accounts for student mobility, Cramer notes. They test different groups of students from year to year, and in areas where kids move in and out frequently, it isn’t fair to chalk up those kids’ scores to a school system they just joined. Cramer writes:
In other words, there is no way to tell whether the district’s flat [national] scores were a product of [the] district’s ineffectiveness. In fact, the district’s education program and its impact on accelerating student learning rates could have been the very best there is and, at the same time, such effects would not be reflected in [national] scores …
The example I love in this area is a high school whose basketball teams have been mediocre for years. One year, however, six outstanding students who are 6’4″ or taller and have played A-level club basketball their entire lives go out for the team. The team wins the state championship, of course, but can you say it was the product of the school’s improved coaching or other inputs?
— EMILY ALPERT