- This topic has 19 replies.
November 8, 2012 at 3:21 am #605510
Looking at the polling outfits, we can now see who was accurate. Here’s the top tier (there were some ties, hence the repeated numbers)…
1. Daily Kos/SEIU/PPP*
5. Purple Strategies
Now, the guys at the bottom…
15. FOX News
15. Washington Times/JZ Analytics
15. Newsmax/JZ Analytics
15. American Research Group
15. Gravis Marketing
23. Democracy Corps
Why do you think there’s such a concentration of inaccuracy from the pollsters that are considered to be right-leaning? Remember, these rankings are based on actual facts and numbers the day after the election. They are not spin. What do you think?
November 8, 2012 at 3:40 am #776838
I don’t know. I don’t know their methodology. Do you? Tell me if you do.
November 8, 2012 at 3:56 am #776839
I saw a post somewhere indicating that Gallup dramatically oversampled white folks: 78% in most of their surveys while on Election Day whites were about 72% of the vote. That alone is probably enough to throw Gallup off. The same explanation might hold for at least some of the other firms (though probably not Rasmussen).
November 8, 2012 at 4:03 am #776840
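The arithmetic behind that oversampling claim is easy to sketch. In the toy example below, only the 78%/72% white shares come from the post above; the 40/60 and 80/20 candidate splits are invented for illustration. A six-point oversample of whites moves the topline by roughly 2.4 points, easily enough to miscall a close race:

```python
# Hypothetical illustration of demographic weighting, NOT real Gallup data.
# Assumed splits: whites 40/60 Obama/Romney, nonwhites 80/20.
OBAMA_SHARE = {"white": 0.40, "nonwhite": 0.80}

def topline(white_frac):
    """Obama's share of a sample with the given white fraction."""
    return (white_frac * OBAMA_SHARE["white"]
            + (1 - white_frac) * OBAMA_SHARE["nonwhite"])

raw = topline(0.78)       # sample at 78% white, as the post describes
weighted = topline(0.72)  # reweighted to the actual ~72% white electorate

print(f"raw topline:      {raw:.3f}")       # 0.488
print(f"weighted topline: {weighted:.3f}")  # 0.512
print(f"shift: {(weighted - raw) * 100:+.1f} points")  # +2.4 points
```

This is what post-stratification weighting does in practice: if the demographic mix of your sample is off and the groups vote differently, the unweighted topline is biased by the size of the mix error times the gap between the groups.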
“…Do you? Tell me if you do.”
No I don’t, hence the question. I just think it’s curious that it’s those pollsters, seemingly as a group, at the bottom. I wonder what the stats on this were in the last couple of elections? I might google that up when I have some time.
November 8, 2012 at 4:36 am #776841
Something I heard from Matt Barreto of the WA poll (I think) was that NO national polls asked any questions in Spanish, thereby overlooking a large and, in NV and CO, quite decisive part of the electorate. So all the polls simply weren’t tracking the correct extent to which Latinos would swing Obama (80 percent average). This is a really big deal.
November 8, 2012 at 4:43 am #776842
Rasmussen at the bottom?!?
I don’t get it!
Somebody around here—the name escapes me—quoted or referenced Ras on an almost continual basis!
Mike
November 8, 2012 at 3:27 pm #776843
what does it matter what their methodology was?
whatever it was.. they got it wrong.
when so many of your sources get it wrong
it’s time to question their shared assumptions
the error isn’t in the equations
November 8, 2012 at 3:40 pm #776844
could not understanding this graphic have been their epic fail?
the USA.. a map adjusted for population
November 8, 2012 at 6:55 pm #776845
There are several problems with opinion/voter polling in general, and these transcend the polling “accuracy” rate for any one election.
The foremost of these problems is the question of whether such polling is scientific. Pollsters claim that it is, naturally, but how scientific can any system be when what it purports to measure is the hazy probability that a human being (or even a bunch of them) will actually do what they say they are going to do?
In the months leading up to the 2008 election, there was much ado over those White Americans who, when polled, supposedly said they couldn’t bring themselves to vote for a Black man, no matter what. And yet, many of these putative racists must have voted for Obama after all.
November 8, 2012 at 7:29 pm #776846
The next problem is how you define polling “accuracy.”
Is accuracy determined by the ability to predict the results for one election? Or is it determined by how many elections the pollster called correctly over time?
Is it determined by whether the pollster called it right six months before the election? Six weeks? Or six days?
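Those definitions really do give different rankings, which a few invented numbers make concrete. Here each (made-up) pollster is scored by the mean absolute error of its final predicted margin:

```python
# Invented pollsters and margins, purely to show that "accuracy" depends on
# the scoring window. Margins are candidate A minus candidate B, in points.
POLLS = {
    "Pollster X": {"2008": 7.0, "2012": 1.5},
    "Pollster Y": {"2008": 5.5, "2012": 3.5},
}
ACTUAL = {"2008": 7.3, "2012": 3.9}

def mean_abs_error(predictions):
    """Average absolute miss across every election the firm polled."""
    errors = [abs(predictions[year] - ACTUAL[year]) for year in predictions]
    return sum(errors) / len(errors)

for name, preds in POLLS.items():
    print(name, round(mean_abs_error(preds), 2))
# Pollster X is closer in 2008 (off by 0.3 vs 1.8), Y is closer in 2012
# (off by 0.4 vs 2.4), and Y wins the two-election average (1.1 vs 1.35).
```

So the single-election winner flips between elections while the multi-election average picks Y: which firm is "most accurate" depends entirely on which definition you choose.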
To answer dobro’s query (“Why do you think there’s such a concentration of inaccuracy from the pollsters that are considered to be right-leaning?”):
If we accept dobro’s premise that there are left-leaning and right-leaning pollsters, then the question itself points to the answer. If we allow that there can be a pollster who is “right-leaning” or “left-leaning” or this-leaning or that-leaning, it supports my earlier point that voter polls are inherently unscientific and that they are subject not just to the vagaries of voter behavior but to bias on the part of the pollsters themselves, who may phrase polling questions or report the responses according to those biases.
But let me put it plainly . . . the reason right-leaning pollsters were less accurate this time was that they wanted their guy to win, and the numbers they reported reflected that want. But in fact their guy lost, so in retrospect, they look kind of foolish.
You could test my assumptions here (if you had the gumption) by going back and reviewing the pollsters’ predictions according to their perceived degree of bias. So then, if what I’m saying is true, the pollsters who are most strongly right-leaning would have been the most wrong in terms of their predicted support for Romney.
But wait! Don’t assume that means that the rightmost-leaning pollsters are the only ones who “got it really wrong,” because if you also went back and identified the left-leaning pollsters, I suspect you’d find that they had also gotten it very wrong in terms of their predicted level of support for Obama. And accordingly, those who were most left-leaning were the ones who got it most wrong.
To test this thesis further, you could go back to an earlier election that the Republicans won, where the same pattern should be apparent. Take GW Bush’s second win in 2004, for example. I suspect that if you checked you’d find that, as a whole, the most left-leaning pollsters were the furthest off on that one, because they were the ones who most wanted their guy Kerry to win, in spite of the fact that he didn’t.
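For what it’s worth, the review proposed here would take only a few lines of code. Everything below is invented (firm names, leans, and errors); signed error means predicted Romney margin minus actual margin, so a positive group mean says that group leaned Romney and a negative mean says it leaned Obama:

```python
from collections import defaultdict

# Hypothetical data for the proposed test -- none of these firms or numbers
# are real. err = predicted Romney margin minus actual Romney margin.
RESULTS = [
    ("Firm A", "right",  +3.5),
    ("Firm B", "right",  +2.0),
    ("Firm C", "center", -0.5),
    ("Firm D", "left",   -1.5),
    ("Firm E", "left",   -2.5),
]

by_lean = defaultdict(list)
for name, lean, err in RESULTS:
    by_lean[lean].append(err)

for lean, errs in by_lean.items():
    mean = sum(errs) / len(errs)
    print(f"{lean:>6}: mean signed error {mean:+.2f}")
# right +2.75, center -0.50, left -2.00: in this invented data, the sign of
# the group's average miss tracks its lean, which is what the thesis predicts.
```

The key methodological point is to use signed rather than absolute error: absolute error only measures how wrong a firm was, while the sign of the mean shows which direction it was wrong in.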
How’s that? Clear as mud?
November 8, 2012 at 7:31 pm #776847
well, now…what’s to keep someone from lying in a poll..? I’m sure it’s done. Probably often…so how scientific can that possibly be?
November 8, 2012 at 7:36 pm #776848
A: Polls can’t be scientific, but it’s not because people lie.
Most people who poll one way and vote another aren’t lying to the pollster. They’re just changing their minds.
November 8, 2012 at 7:47 pm #776849
Don’t repeat the pundits who were wrong and grasping at excuses made of straw; ask a pollster who got it right.
November 8, 2012 at 7:57 pm #776850
- The Signal and the Noise: Why So Many Predictions Fail-but Some Don’t
Ken
November 9, 2012 at 4:07 pm #776851
from the promotion for Nate Silver’s book linked by Ken
“Most predictions fail, often at great cost to society, because most of us have a poor understanding of probability and uncertainty. Both experts and laypeople mistake more confident predictions for more accurate ones. But overconfidence is often the reason for failure. If our appreciation of uncertainty improves, our predictions can get better too. This is the “prediction paradox”: The more humility we have about our ability to make predictions, the more successful we can be in planning for the future.”
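One standard way to make “mistaking more confident predictions for more accurate ones” concrete is the Brier score: the mean squared error of probability forecasts, where lower is better. The forecasters and outcomes below are invented; both make the same directional calls, including one shared miss, and differ only in how much confidence they claim:

```python
# Sketch of the "prediction paradox" using the Brier score (mean squared
# error of probability forecasts; 0 is perfect). All numbers are invented.
OUTCOMES = [1, 1, 0, 1, 0]  # 1 = the event happened

# Both forecasters call the same direction on every event and share one
# miss (the third event); only their stated confidence differs.
OVERCONFIDENT = [0.95, 0.95, 0.95, 0.95, 0.05]
HUMBLE        = [0.70, 0.70, 0.70, 0.70, 0.30]

def brier(forecasts, outcomes):
    """Mean squared error of the forecast probabilities."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(outcomes)

print(f"overconfident: {brier(OVERCONFIDENT, OUTCOMES):.4f}")  # 0.1825
print(f"humble:        {brier(HUMBLE, OUTCOMES):.4f}")         # 0.1700
```

The single shared miss costs the confident forecaster 0.9025 against 0.49 for the hedged one, more than the confident forecaster gains back on the four correct calls. That asymmetry is why well-calibrated humility scores better than bravado.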
I would add that when you tailor your questions to get the responses you want from a poll.. you please your customer but get inaccurate results.
November 9, 2012 at 4:35 pm #776852
Even the tone of the pollster’s voice or the expression on his face can have an effect on the way a person answers his questions. It’s called the “Observer-Expectancy” effect, and there’s a whole sub-discipline devoted to this phenomenon in the world of academic research.
Academic researchers take great pains to minimize the expectancy effect in their studies, but pollsters are under no such constraints. Sure, you might hear a poll being described by some media outlet as “scientific” or “accurate to within +/- # of points,” but that’s really just puffery. Most polling methods in use today wouldn’t make it past a high-school psych teacher.
November 9, 2012 at 4:59 pm #776853
I think they all drank the same Kool-Aid I did. Which is to say they thought the voter turnout would more resemble the 2010 midterms than the 2008 Presidential. More R’s, more white, more male, older. As a result they rebalanced their polls to reflect that assumption.
The long and short of it is they thought the economy would trump everything else (99%, war on women, immigration, etc).
Boy, were they wrong.
November 9, 2012 at 5:11 pm #776854
or maybe.. for those on the bottom end of the pay scale.. the economy did trump everything else.
those at the bottom of the pay scale don’t see the results of the trickle down theory in theoretical terms..
the failure of trickle down has determined their reality
November 9, 2012 at 10:08 pm #776855
Could also be that as we grow the number of people who are dependent on the government, those people become more likely to vote for the party that will continue or expand those programs.
Why on earth would I vote against free stuff (healthcare, extended unemployment, food stamps, or whatever)?