Monday, June 8, 2009 | Regarding the Chula Vista sales tax poll, there is something wrong with a survey that shows a ballot measure winning with 69% support only to have it lose less than four months later with 33% of the vote.
The state’s imposition of a sales tax increase happened after the poll was conducted, and that obviously caused some voters to oppose the measure. But the 180-degree flip is incredible, literally. In 21 years of polling I’ve never seen a turn like that in so short a time span.
One reason: there never was 69% support for the measure among the real electorate.
Part of the problem with this poll is that 43% of the respondents should not have been surveyed to begin with. That’s the percentage of voters who were included in the sample but who were unlikely to vote in a low-turnout election. (See page 19 of the poll document; page 8 of the crosstabs.) If the polling firm had surveyed only the hard-core voters, support would have been 63%. Those aren’t my numbers, they’re in the crosstabs. 63% is still a long way off the mark, but it’s more reasonable.
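To see how much those unlikely voters inflated the topline, here is a back-of-the-envelope sketch in Python using only the figures cited above: a 69% overall result, 63% support among likely voters, and a sample that was 43% unlikely voters. The implied support rate among the unlikely voters is derived from those three numbers, not taken from the crosstabs, so treat it purely as an illustration.

```python
# Figures cited in the post
overall_support = 0.69   # topline result among all respondents
likely_support = 0.63    # support among "hard core" likely voters (crosstabs)
unlikely_share = 0.43    # fraction of the sample unlikely to vote
likely_share = 1.0 - unlikely_share

# The topline is a weighted blend of the two groups:
#   overall = likely_share * likely_support + unlikely_share * x
# Solving for x gives the implied support among the unlikely voters.
implied_unlikely_support = (
    overall_support - likely_share * likely_support
) / unlikely_share

print(f"Implied support among unlikely voters: {implied_unlikely_support:.0%}")
```

In other words, the respondents who probably wouldn’t show up on election day had to be running roughly 77% in favor to pull a 63% likely-voter figure up to the 69% the poll reported, which is exactly why a likely-voter screen matters in a low-turnout election.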
Second problem: respondents were primed to give a pro-tax-measure response. Immediately before the ballot measure question, respondents heard eight questions that got them thinking about “maintaining police, fire and emergency services” and “preserving youth afterschool programs,” among other things. Overwhelming numbers of respondents attach a high priority to those programs, so by that point in the survey respondents were somewhat locked into the rationale for supporting the ballot measure. See page four of the document and note that these prioritization questions were randomized, meaning their order differed from respondent to respondent. Note further, however, that the very next question was not randomized. That question was: “On a scale of 1 to 7, where one means the situation is extremely bad and seven means the situation is extremely good, how would you rate the financial situation for Chula Vista city government?” 53% answered below average. So the pollster set it up so that, after contemplating the programs and services they could lose, every respondent was forced to focus on the dire fiscal situation at city hall. The hammer then falls in the form of the ballot measure “test” question. Wanting to appear rational, respondents were more likely to give a “yes” answer.
Third problem: the survey gives the opponents’ side extremely short shrift. Only two questions (placed after the initial ballot measure test question and mixed in with three other pro-tax questions) present the opponents’ side. Those were overwhelmed by the 27(!) pro-tax arguments or programs tested in the survey. In the trade we call this an “unbalanced design.” I’m getting way into the weeds here, but there is a growing body of evidence showing that respondents who hear an unbalanced poll are likely to hang up if they are on the “wrong” side. That could have happened in this poll and further skewed the numbers in a pro-tax direction. Returning to the central point, when the poll subsequently tested the measure a second and a third time, the lack of opponent arguments clearly tilted the field of play in favor of pro-tax sentiment. The pollsters did include a strong anti-tax message, but it came only at the end, after 60%+ of the respondents had already committed to a pro-tax position. Therefore, that question would have done little to gauge the true amount of opposition.
Those are the problems with the sampling methodology and what’s in the questionnaire. But what they left out was just as important: the potential state sales tax increase was not included. The Governor had been talking about a sales tax increase since August of 2008. He proposed a 1.5% increase in November and proposed it again on January 1, 2009. When the questionnaire was being drafted, it was entirely plausible that a statewide sales tax would go into effect before Chula Vista voters weighed in on their tax measure. The effect a state sales tax increase would have on the electorate would have been measurable had the question been asked. Similarly, using consumer confidence questions, the effect of a declining economy could also have been factored in. This would have addressed Jim Bartell’s point. And Jim’s other point about tracking surveys is correct, although if the tracker had been conducted using the same methods as the initial benchmark, it still would have been off the mark.
Finally, the quick-turnaround excuse is weak. This is politics: things move fast, and pollsters should be able to meet client deadlines or have the guts to tell their client to wait until the questionnaire is right. Anything less is a disservice to the client and the taxpayers. After all, they paid the $19,800 for this poll, which directly led to a bad and costly decision to put this measure on the ballot.