Polling Accuracy

    #605510

    dobro
    Participant

    Looking at the polling outfits, we can now see who was accurate. Here’s the top tier (there were some ties, hence the repeated numbers)…

    1. PPP

    1. Daily Kos/SEIU/PPP*

    3. YouGov*

    4. Ipsos/Reuters*

    5. Purple Strategies

    6. NBC/WSJ

    6. CBS/NYT

    6. YouGov/Economist

    9. UPI/CVOTER

    10. IBD/TIPP

    Now, the guys at the bottom…

    15. Politico/GWU/Battleground

    15. FOX News

    15. Washington Times/JZ Analytics

    15. Newsmax/JZ Analytics

    15. American Research Group

    15. Gravis Marketing

    23. Democracy Corps

    24. Rasmussen

    24. Gallup

    Why do you think there’s such a concentration of inaccuracy from the pollsters that are considered to be right-leaning? Remember, these rankings are based on actual facts and numbers the day after the election. They are not spin. What do you think?

    #776838

    HMC Rich
    Participant

    I don’t know. I don’t know their methodology. Do you? Tell me if you do.

    #776839

    socamr
    Participant

    I saw a post somewhere indicating that Gallup dramatically oversampled white folks: 78% in most of their surveys while on Election Day whites were about 72% of the vote. That alone is probably enough to throw Gallup off. The same explanation might hold for at least some of the other firms (though probably not Rasmussen).
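    For a rough sense of how much that oversample matters, here's a back-of-the-envelope sketch in Python. The white/non-white vote splits are my own assumptions (loosely 2012 exit-poll-like), not Gallup's actual internals:

        # How a white oversample shifts the topline margin.
        # The vote splits below are assumptions (roughly 2012
        # exit-poll-like), not Gallup's actual numbers.

        WHITE_OBAMA_MARGIN = -20    # assume whites broke ~59-39 for Romney
        NONWHITE_OBAMA_MARGIN = 62  # assume non-whites broke ~80-18 for Obama

        def topline_margin(white_share):
            """Obama-minus-Romney margin implied by a given white share."""
            return (white_share * WHITE_OBAMA_MARGIN
                    + (1 - white_share) * NONWHITE_OBAMA_MARGIN)

        # actual electorate vs. the alleged Gallup sample
        for share in (0.72, 0.78):
            print(f"white share {share:.0%}: Obama margin {topline_margin(share):+.1f}")

        # white share 72%: Obama margin +3.0
        # white share 78%: Obama margin -2.0

    Under those assumptions, a six-point oversample of whites flips a roughly three-point Obama lead into a two-point deficit, which is enough on its own to explain a miss of Gallup's size.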

    #776840

    dobro
    Participant

    “…Do you? Tell me if you do.”

    No I don’t, hence the question. I just think it’s curious that it’s those pollsters, seemingly as a group, at the bottom. I wonder what the stats on this were in the last couple of elections? I might google that up when I have some time.

    #776841

    kgdlg
    Participant

    Something I heard from Matt Barreto of the WA poll (I think) was that NO national polls asked any questions in Spanish, thereby overlooking a large part of the electorate that proved quite decisive in NV and CO. So all the polls simply weren’t tracking the extent to which Latinos would swing to Obama (about 80 percent on average). This is a really big deal.

    #776842

    miws
    Participant

    Rasmussen at the bottom?!?

    I don’t get it!

    Somebody around here—the name escapes me—quoted or referenced Ras on an almost continual basis!

    Mike

    #776843

    JoB
    Participant

    HMCRich

    what does it matter what their methodology was?

    whatever it was.. they got it wrong.

    when so many of your sources get it wrong

    it’s time to question their shared assumptions

    the error isn’t in the equations

    #776844

    JoB
    Participant

    HMCRich

    could not understanding this graphic have been their epic fail?

    https://www.facebook.com/photo.php?fbid=4843456407048&set=a.1106161177003.18185.1315995464&type=1&theater

    the USA.. a map adjusted for population

    #776845

    DBP
    Member

    There are several problems with opinion/voter polling in general, and these transcend the polling “accuracy” rate for any one election.

    The foremost of these problems is the question of whether such polling is scientific. Pollsters claim that it is, naturally, but how scientific can any system be when what it purports to measure is the hazy probability that a human being (or even a bunch of them) will actually do what they say they are going to do?

    In the months leading up to the 2008 election, there was much ado over those White Americans who, when polled, supposedly said they couldn’t bring themselves to vote for a Black man, no matter what. And yet Obama won handily, so many of those putative racists must have voted for him after all.

    #776846

    DBP
    Member

    The next problem is how you define polling “accuracy.”

    Is accuracy determined by the ability to predict the results for one election? Or is it determined by how many elections the pollster called correctly over time?

    Is it determined by whether the pollster called it right six months before the election? Six weeks? Or six days?

    ****************************************************************************************

    To answer dobro’s query. (“Why do you think there’s such a concentration of inaccuracy from the pollsters that are considered to be right-leaning?”)

    If we accept dobro’s premise that there are left-leaning and right-leaning pollsters, then the question itself points to the answer. Allowing that a pollster can be “right-leaning” or “left-leaning” or this-leaning or that-leaning supports my earlier point: voter polls are inherently unscientific, subject not just to the vagaries of voter behavior but to bias on the part of the pollsters themselves, who may phrase polling questions or report the responses according to those biases.

    But let me put it plainly . . . right-leaning pollsters were less accurate this time because they wanted their guy to win, and the numbers they reported reflected that want. But in fact their guy lost, so in retrospect, they look kind of foolish.

    You could test my assumptions here (if you had the gumption) by going back and reviewing the pollsters’ predictions according to their perceived degree of bias. So then, if what I’m saying is true, the pollsters who are most strongly right-leaning would have been the most wrong in terms of their predicted support for Romney.

    But wait! Don’t assume that means the rightmost-leaning pollsters are the only ones who “got it really wrong,” because if you also went back and identified the left-leaning pollsters, I suspect you’d find that they had also gotten it very wrong in terms of their predicted level of support for Obama. And accordingly, those who were most left-leaning were the ones who got it most wrong.

    To test this thesis further, you could go back to an election the Republicans won, where the reverse pattern should be apparent. Take GW Bush’s second win in 2004, for example. I suspect that if you checked, you’d find that, as a whole, the most left-leaning pollsters were the furthest off on that one, because they were the ones who most wanted their guy Kerry to win, in spite of the fact that he didn’t.

    How’s that? Clear as mud?
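    Concretely, and with completely invented numbers (nobody here has compiled actual lean ratings), that check could look something like this in Python:

        # Sketch of the proposed test: score each pollster's perceived
        # lean (negative = left, positive = right) and its signed miss
        # on the final Obama-minus-Romney margin (predicted minus actual).
        # Every figure below is a placeholder, just to show the shape.

        polls = {
            "LeftLean A":  (-2, +3.5),
            "LeftLean B":  (-1, +1.5),
            "Neutral C":   ( 0, +0.5),
            "RightLean D": (+1, -2.0),
            "RightLean E": (+2, -4.5),
        }

        def pearson(xs, ys):
            """Plain Pearson correlation; no outside libraries needed."""
            n = len(xs)
            mx, my = sum(xs) / n, sum(ys) / n
            cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            sx = sum((x - mx) ** 2 for x in xs) ** 0.5
            sy = sum((y - my) ** 2 for y in ys) ** 0.5
            return cov / (sx * sy)

        leans, errors = zip(*polls.values())
        print(f"lean vs. signed error: r = {pearson(leans, errors):+.2f}")

    If the thesis holds, r comes out strongly negative: the further right a pollster leans, the more it understated Obama’s margin, and the further left, the more it overstated it.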

    #776847

    JanS
    Participant

    well, now…what’s to keep someone from lying in a poll..? I’m sure it’s done. Probably often…so how scientific can that possibly be?

    #776848

    DBP
    Member

    A: Polls can’t be scientific, but it’s not because people lie.

    Most people who poll one way and vote another aren’t lying to the pollster. They’re just changing their minds.

    #776849

    Ken
    Participant

    Don’t repeat the pundits who were wrong and grasping at excuses made of straw; ask a pollster who got it right.

      The Signal and the Noise: Why So Many Predictions Fail-but Some Don’t

    http://www.amazon.com/The-Signal-Noise-Predictions-Fail-but/dp/159420411X/ref=tmm_hrd_title_0?ie=UTF8&qid=1352403842&sr=8-1

    #776851

    JoB
    Participant

    from the promotion for Nate Silver’s book linked by Ken

    “Most predictions fail, often at great cost to society, because most of us have a poor understanding of probability and uncertainty. Both experts and laypeople mistake more confident predictions for more accurate ones. But overconfidence is often the reason for failure. If our appreciation of uncertainty improves, our predictions can get better too. This is the ‘prediction paradox’: The more humility we have about our ability to make predictions, the more successful we can be in planning for the future.”

    I would add that when you tailor your questions to get the responses you want from a poll.. you please your customer but get inaccurate results.
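    Silver’s point about confidence versus accuracy can be made concrete with a scoring rule. Here’s a toy Python sketch (the forecasts and outcomes are invented) using the Brier score, which punishes a confident miss far more than a hedged one:

        # Brier score: mean squared gap between stated probability and
        # what actually happened. Lower is better. All data invented.

        def brier(forecasts, outcomes):
            return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

        outcomes = [1, 1, 0, 1, 0]  # 1 = the event happened

        overconfident = [0.95, 0.95, 0.95, 0.95, 0.95]  # always "sure"
        humble        = [0.70, 0.70, 0.40, 0.70, 0.40]  # hedged estimates

        print(f"overconfident: {brier(overconfident, outcomes):.3f}")  # 0.363
        print(f"humble:        {brier(humble, outcomes):.3f}")         # 0.118

    The humble forecaster scores about three times better on the same events: exactly the “prediction paradox” from the blurb.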

    #776852

    DBP
    Member

    Even the tone of the pollster’s voice or the expression on his face can have an effect on the way a person answers his questions. It’s called the “observer-expectancy effect,” and there’s a whole sub-discipline devoted to this phenomenon in the world of academic research.

    http://en.wikipedia.org/wiki/Observer-expectancy_effect

    Academic researchers take great pains to minimize the expectancy effect in their studies, but pollsters are under no such constraints. Sure, you might hear a poll being described by some media outlet as “scientific” or “accurate to within +/- # of points,” but that’s really just puffery. Most polling methods in use today wouldn’t make it past a high-school psych teacher.

    #776853

    Smitty
    Participant

    I think they all drank the same Kool-Aid I did. Which is to say, they thought voter turnout would more resemble the 2010 midterms than the 2008 presidential: more R’s, more white, more male, older. As a result they rebalanced their polls to reflect that assumption.

    The long and short of it is they thought the economy would trump everything else (99%, war on women, immigration, etc.).

    Boy, were they wrong.
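    For what it’s worth, you can see how much that rebalancing alone moves a topline with a toy Python sketch. The party-ID vote splits and turnout mixes below are rough assumptions (2008-ish vs. 2010-ish), not any pollster’s actual weights:

        # Same raw answers, two turnout models.
        # All shares below are rough assumptions, not real weights.

        OBAMA_SHARE = {"D": 0.90, "R": 0.07, "I": 0.45}  # assumed vote by party ID

        turnout_models = {
            "2008-like (D+7)":  {"D": 0.39, "R": 0.32, "I": 0.29},
            "2010-like (even)": {"D": 0.35, "R": 0.35, "I": 0.30},
        }

        for name, mix in turnout_models.items():
            topline = sum(mix[p] * OBAMA_SHARE[p] for p in mix)
            print(f"{name}: Obama {topline:.1%}")

        # 2008-like (D+7):  Obama 50.4%
        # 2010-like (even): Obama 47.5%

    Identical responses, roughly three points apart, purely from the turnout assumption. A pollster who weighted to a 2010-style electorate was baking the miss in before a single interview.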

    #776854

    JoB
    Participant

    Smitty..

    or maybe.. for those on the bottom end of the pay scale.. the economy did trump everything else.

    those at the bottom of the pay scale don’t see the results of the trickle down theory in theoretical terms..

    the failure of trickle down has determined their reality

    #776855

    Smitty
    Participant

    JoB,

    Could be.

    Could also be that as the number of people who are dependent on the government grows, the more likely they are to vote for the party that will continue or expand those programs.

    Why on earth would I vote against free stuff (healthcare, extended unemployment, food stamps, or whatever)?

    #776856

    JanS
    Participant

    David, dear…I have lied in a poll on purpose…it had nothing to do with changing my mind…a confession :D
