Tuesday, November 09, 2004

Polling Accuracy - Does RCP Have The Answers?

Hugh Hewitt has cited Real Clear Politics' post, "Which Pollster Was The Most Accurate?" as a terrific resource which will be valuable for years to come. That was certainly my opinion when I first read it too.

But Rick Brady from the blog Stones Cry Out begs to differ, and he has some excellent reasons why.

Yesterday, Rick explained the polling concepts of Confidence Level and Confidence Interval, using these concepts to clear up some common misconceptions about what poll numbers actually mean, and faulting RCP for missing their significance. After a bit more analysis and observation, he concluded:

Statistically, the polls were generally right on. We expect too much from polling organizations. Although the Battleground/Terrance and Pew Research polls nailed the spread, I'm sure that privately, those pollsters are surprised. They know their results have a margin of error and any pollster would be stupid to say that a sample of 1,200 people could predict the exact outcome of a national election involving more than 100,000,000 voters. After all, Zogby "nailed it" in 1996, right?

The part that grabbed me about this was the statement "we expect too much from polling organizations." Absolutely true. We treat pollsters like prophets, and they're nothing of the kind.
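Rick's point about a 1,200-person sample can be made concrete with the standard margin-of-error formula. Here's a minimal sketch in Python, assuming simple random sampling and a 95% confidence level (actual polls use weighting and other adjustments, so this is an approximation, not any pollster's real methodology):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a simple random sample of size n.

    Uses the worst case p = 0.5, which maximizes the standard error,
    and z = 1.96, the critical value for 95% confidence.
    """
    return z * math.sqrt(p * (1 - p) / n)

# A national poll of roughly 1,200 respondents:
moe = margin_of_error(1200)
print(f"Margin of error: +/-{moe * 100:.1f} points")
```

So even a textbook-perfect 1,200-person sample carries roughly a ±2.8-point margin of error on each candidate's share, which is exactly why "nailing" the exact spread is luck rather than precision.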

Rick continued today with a further analysis of RCP's post, going into state by state numbers, polling dates, and the RCP rankings of polling organizations. I strongly encourage anyone swayed by the RCP analysis to read Rick's entire post. But here are a couple of especially salient bits:

#1 Mason-Dixon: A final Minnesota poll showing a one-point Bush win is the only blemish on Mason-Dixon's otherwise perfect scorecard this year.

Blemish? The Mason-Dixon poll had a 4% margin of error and predicted Bush would win by 1%. What the Mason-Dixon poll said (and it said absolutely NOTHING else) was that the organization was 95% certain that the actual election result could be anything from Bush by 5% to Kerry by 3%. Kerry won by 3%, therefore the survey was 100% accurate per its methodology. To call this a "blemish" is, well, revealing.
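The arithmetic here is easy to check yourself: take the poll's point estimate of the spread (Bush +1) and its stated margin of error (4 points), and see whether the actual result falls inside the implied interval. A hypothetical sketch, following Rick's simplification of applying the margin of error directly to the spread:

```python
def poll_interval(spread, moe):
    """Interval implied by a poll's point estimate of the spread.

    spread: predicted margin in points (positive = candidate A ahead).
    moe: the poll's stated margin of error, in points.
    """
    return (spread - moe, spread + moe)

def within_interval(actual, spread, moe):
    """Was the actual result consistent with the poll?"""
    lo, hi = poll_interval(spread, moe)
    return lo <= actual <= hi

# Mason-Dixon Minnesota poll: Bush +1 with a 4-point margin of error.
# Actual result: Kerry won by 3, i.e. Bush -3.
print(poll_interval(1, 4))        # (-3, 5): anything from Kerry +3 to Bush +5
print(within_interval(-3, 1, 4))  # True: the result fell inside the interval
```

By this standard the Minnesota poll performed exactly as advertised, which is Rick's point: the "blemish" only exists if you read the point estimate as a prediction rather than the center of an interval.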

Pay special attention to Rick's careful use of language here. "What the Mason-Dixon poll said (and it said absolutely NOTHING else)..." Here we're getting back to that pollster-as-prophet problem again. A poll might accurately nail the exact result of an election, but that's just good luck. It's not designed to do that, and no pollster claims to be able to do so. We need to do a better job reading what a poll is actually saying, not just what we want it to tell us.

Here's another interesting bit:

#4 Research 2000: The big miss came in Florida, where Research 2000's final poll called for a one-point Kerry win.

This poll was taken from 10/18 to 10/21, a full two weeks prior to the election. This wasn't a "big miss" at all. If I ran the Research 2000 organization, I would be furious with this characterization.

See how misleading some of these poll rankings can be? For all we know, if the election had been held on 10/22, the result would have fallen within that poll's margin of error. No matter how strongly we wanted to know who would get the most votes on 11/2, that poll wasn't designed to tell us that.

Speaking as someone who was initially swayed by RCP's analysis, I think Rick has provided a service even more valuable. Let Hugh clip and save the RCP piece for years to come. I'm hanging onto Rick's posts on the topic instead.

3 Comments:

Blogger ricardo said...

Thanks, Doug, for the kind link. May I encourage you to send a trackback ping? I don't think my post is making much of a dent. This stuff is not "sexy" like the Specter debate, but it's the kind of thing that makes liberal academics go crazy and gives them more reason to hate us.

For example, there was a fascinating CBS article today trashing bloggers, "Blogging As Typing, Not Journalism." Yeah, I know it's CBS, but take this point about the exit polling fiasco: "These polls occur in the realm of statistics and probability. They require PhD-style expertise to understand. The people who analyze them for news organizations, like the legendary Warren Mitofsky and Martin Plissner at CBS News, have trade associations like doctors do to certify their work.

When you, the humble reporter, are writing a story based on the polls, you need one of these gurus standing over your shoulder interpreting what they mean, or you almost certainly will screw it up."

Well, actually, you don't need a PhD-level understanding of stats, but you could use some advanced coursework in survey research methods (which I do have). My lower-division statistics classes taught me what a Confidence Level and a Confidence Interval were and what they meant for any sample data.

I didn't touch the Drudge story because: 1) I was busy doing GOTV like mad in 3 precincts; and 2) I knew it was crap. Again, why do you think the major networks didn't run with the early exit polling data? It was based on a partial sample that had obvious flaws. Anyway, I posted on this here (see middle of the long post).

I will see Hugh tomorrow at a live feed broadcast from the University of San Diego. Perhaps he'll let me have an earful. Look, I'm no expert on stats, so maybe I'm missing something. If that's the case, the RCP guys are REALLY lame at explaining themselves.

8:01 PM  
Blogger Doug said...

I'd love to send a trackback, Rick. Just as soon as I get around to installing Haloscan, which at my current rate should occur sometime between this weekend and my four-year-old's first year of college (well... maybe the first year of his postgraduate work anyway).

In any case, thanks for the excellent posts.

8:26 PM  
Blogger ricardo said...

You don't need Haloscan. In fact, I don't know how to send a trackback ping via Haloscan. I use the Wizbang standalone pinger. http://www.aylwardfamily.com/content/tbping.asp

All you do is click on my TB link and it will display the following:

Trackback URL for this entry:
http://haloscan.com/tb/rjbrady/109997837615889688

Copy that trackback URL into the Wizbang pinger, then copy in your permalink, insert your blog name and the title of the post, add a short excerpt, and send the ping!

11:28 PM  
