
Is Political Polling Meaningful Anymore?

Trying to make sense of a crazy world, with limited success mostly
jez
Posts: 470
Joined: Mon Feb 22, 2021 10:20 am
Location: Midwestish
Occupation: Thread Killer
Verified: ✅ Medicated

Is Political Polling Meaningful Anymore?

#26

Post by jez »

I just got home from voting. Beautiful sunrise, by the way. There was an Emerson exit poll table set up outside. I escaped the polling person because she was still setting up the table.
“What is better? To be born good, or to overcome your evil nature through great effort?”

~Paarthurnax
Slarti the White
Posts: 442
Joined: Mon Feb 22, 2021 1:23 pm
Location: Michigan
Verified: Badges... we don't need no stinkin' badges

Is Political Polling Meaningful Anymore?

#27

Post by Slarti the White »

A few observations after election day.

You can't find the election forecasts on 538 or RCP anymore. Once the election started, they disappeared -- move along folks, nothing to see here...

On 538's election livestream (just before 1:30am):
Nate Silver wrote:The New York Times needle projects that Fetterman is eventually going to win by around 4 points. So while it’s not a catastrophically bad polling error if your survey had Oz winning by 1 or 2 points, as a lot of GOP-leaning firms did, you are winding up with a 5- or 6-point miss, which is basically in “Hillary in Wisconsin” territory.
And what, pray tell, was your prediction, Nate? How did 538 do tonight? Why don't you have a metric to measure the predictive accuracy of pollsters and aggregators? Surely developing such a thing is not beyond you? If it isn't, then the only reason you haven't is that you have no desire to hold yourself accountable for anything you've said, nor any desire to know how the effectiveness of your methodology has changed over the years.

One thing is for sure: no red wave happened. In the Senate at present, the Democrats have picked up Pennsylvania, look to have a good shot at holding Arizona (though late votes are likely to narrow their lead), are up big early in Nevada (no idea what that means or how long it will take until this race can be called), and have a small lead in Georgia where the question is can Warnock get enough votes to get to 50% and avoid a runoff. What are the odds that the Republicans take two of these races and flip the Senate? If I were a gambling man -- which all good Bayesians are, according to Nate's book The Signal and the Noise -- I would put my money on the Democrats and, given the final odds offered by 538 (somewhere around 55/45 in favor of the Republicans taking the Senate, if I recall correctly), I would bet big.
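(To spell the bet out: if you think the Democrats are actually better than even money to hold the Senate while 538 prices them at roughly 45%, the expected value of backing them is clearly positive. A toy calculation, with my own probability made up purely for illustration:)

# Toy expected-value calculation for backing the Democrats at 538's implied odds.
# Both probabilities below are illustrative assumptions, not anybody's official numbers.
implied_prob = 0.45        # 538's rough implied chance of the Democrats holding the Senate
my_prob = 0.65             # what I think the chance looks like watching the returns

decimal_odds = 1 / implied_prob              # fair payout if 45% were the true price (~2.22)
expected_value = my_prob * decimal_odds - 1  # profit per $1 staked
print(round(expected_value, 2))              # ~0.44, i.e. about +44 cents on the dollar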

So while we cannot be sure about control of either the House or the Senate, I'm ready to make my one and only call of the night:

Pollsters, aggregators, and pundits have lost in a landslide.
keith
Posts: 3706
Joined: Mon Feb 22, 2021 10:23 pm
Location: The Swamp in Victorian Oz
Occupation: Retired Computer Systems Analyst Project Manager Super Coder
Verified: ✅lunatic

Is Political Polling Meaningful Anymore?

#28

Post by keith »

:yeahthat:

Exhibit A: Arizona Governor
Has everybody heard about the bird?
Slarti the White
Posts: 442
Joined: Mon Feb 22, 2021 1:23 pm
Location: Michigan
Verified: Badges... we don't need no stinkin' badges

Is Political Polling Meaningful Anymore?

#29

Post by Slarti the White »

As I said above, I have an idea for a new polling methodology called “Pull Polling”. I made a thread to discuss it here:
Slarti the White wrote: Fri Nov 11, 2022 8:07 pm
Pull Polling


Traditional polling instruments, “horse-race” polls in particular, have significant limitations in determining preferences and consensus amongst the groups being polled. The lack of any information regarding the certainty or importance of a choice makes it impossible to compare or correlate different polls, the limited number of choices offered (and of choices that can be selected) can hide significant sentiment groups, and the bias inherent in the selection of polling questions means that there will always be some degree of “push” in traditional polls. In addition, lacking this information, the data is very sparse and provides no insight into why the poll subjects answer the way they do. These shortcomings are accepted because there is currently no alternative available.

Pull polling is an attempt to provide an option which doesn’t have these shortcomings. Instead of the polls “pushing” subjects into restricted categories and treating all answers in the same “box” as having the same weight, it allows the person being polled to “pull” the poll to record what is important to them…
Moar at the linked thread.

This is a work in progress (or at least starting to progress, I hope), but I believe that it may become a useful tool in making politicians more responsive to the public they are supposed to serve. Too, also, I don’t think it is as susceptible to the problems we’ve discussed in this thread. In any case, I am looking for any kind of feedback as I start the project of developing this new methodology. Comments, suggestions, thoughts, criticism, and non-sequiturs are all welcome.

:towel:
Liz
Posts: 96
Joined: Tue Feb 23, 2021 5:52 pm

Is Political Polling Meaningful Anymore?

#30

Post by Liz »

Push Pull Polling. It seems the purpose of the Push to get the Pull is to publish results that show a candidate ahead whether or not they really are... to get $'s from businesses (hoping to curry favor) by backing the frontrunner/likely winner?

As for individual me, it could be I'm wired differently...
If I see polls showing my choice as a frontrunner, I'm unlikely to donate... but I'm more likely to when my side is behind.
MN-Skeptic
Posts: 3000
Joined: Mon Feb 22, 2021 1:03 pm
Location: Twin Cities

Is Political Polling Meaningful Anymore?

#31

Post by MN-Skeptic »

Twice now I've responded to a Rasmussen poll. When I see the Call ID, I figure I need to answer and bring their polls back, as much as I can, to reality. But, honest to goodness, the questions are just... :shock: .

I wish I had recorded the call so I could remember the questions better. Of course, I always hate the "Do you think the country is headed in the right direction?" question. I answered Yes, of course, because they would re-frame a No as a reflection on Biden. But, based on what the courts have been doing and what the idiotic governors are doing, there's a lot going wrong in the country.

"If President Biden is re-elected, he would be 86 years old at the end of his second term. Is 86 years old too old to be President?" Hell no, it's not too old. For Biden. Now ask me about Trump.

"Is Biden making his own decisions or are advisors acting behind the scenes to make those decisions?" You have got to be kidding me. :roll:
AndyinPA
Posts: 9857
Joined: Mon Feb 22, 2021 10:42 am
Location: Pittsburgh
Verified:

Is Political Polling Meaningful Anymore?

#32

Post by AndyinPA »

I do YouGov surveys, and they ask the same type of questions, where there's really no answer that lets you say which side of the issue you're coming from. I also have answered questions where my answer changes because Biden is "in charge." I agree that we are not headed in the right direction, but I don't blame Biden.
"Choose your leaders with wisdom and forethought. To be led by a coward is to be controlled by all that the coward fears… To be led by a liar is to ask to be told lies." -Octavia E. Butler
Tiredretiredlawyer
Posts: 7541
Joined: Tue Feb 23, 2021 10:07 pm
Location: Rescue Pets Land
Occupation: 21st Century Suffragist
Verified: ✅🐴🐎🦄🌻5000 posts and counting

Is Political Polling Meaningful Anymore?

#33

Post by Tiredretiredlawyer »

https://www.cambridge.org/core/journals ... F54F383CF3
Do Survey Questions Spread Conspiracy Beliefs?
Published online by Cambridge University Press: 15 February 2023


Abstract
Conspiracy theories and misinformation have become increasingly prominent in politics, and these beliefs have pernicious effects on political behavior. A prominent line of research suggests that these beliefs are promoted by repeated exposure. Yet, as scholars have rushed to understand these beliefs, they have exposed countless respondents to conspiratorial claims, raising the question of whether researchers are contributing to their spread. We investigate this possibility using a pre-registered within-subjects experiment embedded in a panel survey. The results suggest that exposure to a standard conspiracy question causes a significant increase in the likelihood of endorsing that conspiracy a week later. However, this exposure effect does not occur with a question format that offers an alternative, non-conspiratorial explanation for the target event. Thus, we recommend that researchers reduce the likelihood of spreading conspiracy beliefs by adopting a question format that asks respondents to choose between alternative explanations for an event.

[Formulas, graphs, numbers, >,<,= are in this part of the abstract. Not copied.]

CONCLUSION

In recent years, researchers have raced to understand the pernicious effects of conspiracy beliefs. In the process, countless respondents have been exposed to a variety of questions about conspiracy theories, rumors, and falsehoods. Consistent with the illusory truth effect, we find that mere exposure to conspiracy questions increases conspiracy belief and that this effect lasted at least 1 week. However, this effect only obtained when respondents were exposed to the agree–disagree format, not the explicit choice format. Consistent with the panel conditioning literature, the evidence suggests that respondents learn from the content that is offered to them in the survey. Respondents exposed to the agree–disagree format could only learn one thing – the conspiratorial claim offered to them. And some of them did. Respondents who were instead exposed to the explicit choice format could have learned either the conspiratorial claim or the conventional explanation for the event. These respondents became less likely to say they were unsure and more likely to adopt the conventional explanation but were not more likely to adopt the conspiracy.

Of course, it is reasonable to wonder whether the observed effect sizes are substantively meaningful. We think so. Our estimates suggest that a single exposure to the agree–disagree format increases conspiracy belief by about 3.2 percentage points 1 week after exposure. While this may not seem large, consider the potential consequences for a standard survey (N = 1,000) that contains five conspiracy questions. If our effect size generalizes, an increase of 3.2 percentage points implies that this study would create about 160 new conspiracy beliefs. If the conspiratorial claim involves a topic like vaccination that may have important downstream effects on respondent behavior, these are not trivial effects.

These findings suggest that researchers should consider the potential ethical implications of inadvertently spreading conspiracy beliefs. In many cases, the potential risks might be minimal, but this may not always be the case, such as in the case of vaccines. Fortunately, our research suggests that researchers can avoid this risk by adopting the explicit choice question format. Of course, more research is needed on the validity of alternative measures, but the choice format appears to have multiple advantages (Clifford, Kim, and Sullivan Reference Clifford, Kim and Sullivan2020). Alternatively, a researcher might debrief respondents about the nature of the conspiratorial claims. However, conventional debriefing is not always completely effective (e.g., Greenspan and Loftus Reference Greenspan and Loftus2022), and it may be time-consuming to debrief on multiple conspiracies. Nonetheless, researchers ought to take the ethical considerations of conspiracy research seriously.
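For what it's worth, the 160 figure in that last paragraph is just straight arithmetic from the paper's own numbers -- a quick back-of-the-envelope check, nothing more:

# Back-of-the-envelope check of the paper's "about 160 new conspiracy beliefs" figure.
respondents = 1000   # the "standard survey" size cited in the conclusion
questions = 5        # conspiracy questions per survey, per their example
effect = 0.032       # +3.2 percentage points of belief per exposure

new_beliefs = respondents * questions * effect
print(round(new_beliefs))   # 160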
"Mickey Mouse and I grew up together." - Ruthie Tompson, Disney animation checker and scene planner and one of the first women to become a member of the International Photographers Union in 1952.
bob
Posts: 5386
Joined: Mon Mar 01, 2021 12:07 am

Is Political Polling Meaningful Anymore?

#34

Post by bob »

Slarti the White wrote: Wed Nov 09, 2022 1:49 am And what, pray tell, was your prediction Nate? How did 538 do tonight? Why don't you have a metric to measure the predictive accuracy of pollsters and aggregators?
538: The Polls Were Historically Accurate In 2022.

538: The Polls Got It Right In 2022. Here Are The Pollsters To Trust.

538: How Our Pollster Ratings Work.
Slarti the White
Posts: 442
Joined: Mon Feb 22, 2021 1:23 pm
Location: Michigan
Verified: Badges... we don't need no stinkin' badges

Is Political Polling Meaningful Anymore?

#35

Post by Slarti the White »

bob wrote: Fri Mar 10, 2023 5:32 pm
Slarti the White wrote: Wed Nov 09, 2022 1:49 am And what, pray tell, was your prediction Nate? How did 538 do tonight? Why don't you have a metric to measure the predictive accuracy of pollsters and aggregators?
538: The Polls Were Historically Accurate In 2022.

538: The Polls Got It Right In 2022. Here Are The Pollsters To Trust.

538: How Our Pollster Ratings Work.
Thanks Bob! While it took Nate a while, he did eventually report on the accuracy of the polls. On the other hand, I'm not so impressed by what he said... (which I will detail in its own post).
bob
Posts: 5386
Joined: Mon Mar 01, 2021 12:07 am

Is Political Polling Meaningful Anymore?

#36

Post by bob »

Slarti the White wrote: Sat Mar 11, 2023 3:04 pmThanks Bob! While it took Nate a while, he did eventually report on the accuracy of the polls.
"For the record," the first two articles are by the other Nate (Rakich,* not Silver). The third was written by committee, but acknowledges it is based on Silver's model.

* 538's senior election analyst.
Tiredretiredlawyer
Posts: 7541
Joined: Tue Feb 23, 2021 10:07 pm
Location: Rescue Pets Land
Occupation: 21st Century Suffragist
Verified: ✅🐴🐎🦄🌻5000 posts and counting

Is Political Polling Meaningful Anymore?

#37

Post by Tiredretiredlawyer »

So, do I need to read the articles or just the last one? Are the headlines correct, metrically? I'm so confuzzled!!! :confuzzled: :think: :cry:
"Mickey Mouse and I grew up together." - Ruthie Tompson, Disney animation checker and scene planner and one of the first women to become a member of the International Photographers Union in 1952.
Slarti the White
Posts: 442
Joined: Mon Feb 22, 2021 1:23 pm
Location: Michigan
Verified: Badges... we don't need no stinkin' badges

Is Political Polling Meaningful Anymore?

#38

Post by Slarti the White »

bob wrote: Sat Mar 11, 2023 3:58 pm
Slarti the White wrote: Sat Mar 11, 2023 3:04 pmThanks Bob! While it took Nate a while, he did eventually report on the accuracy of the polls.
"For the record," the first two articles are by the other Nate (Rakich,* not Silver). The third was written by committee, but acknowledges it is based on Silver's model.

* 538's senior election analyst.
I noticed that, but since Nate (Silver) is the founder and head of 538, I think it is reasonable to attribute the website's official position on such matters to him.
Tiredretiredlawyer wrote: Sun Mar 12, 2023 11:59 am So, do I need to read the articles or just the last one. Are the headlines correct metrically? I'm so confuzzled!!! :confuzzled: :think: :cry:
Just hold on for a bit and I'll analyze the stats so you don't have to. If you are going to read any of the articles, however, I would suggest the first one -- but be cautioned: as the old line goes (often attributed to Mark Twain, among others), "there are lies, damned lies, and statistics".

Here there be statistics.
Tiredretiredlawyer
Posts: 7541
Joined: Tue Feb 23, 2021 10:07 pm
Location: Rescue Pets Land
Occupation: 21st Century Suffragist
Verified: ✅🐴🐎🦄🌻5000 posts and counting

Is Political Polling Meaningful Anymore?

#39

Post by Tiredretiredlawyer »

:bighug:
"Mickey Mouse and I grew up together." - Ruthie Tompson, Disney animation checker and scene planner and one of the first women to become a member of the International Photographers Union in 1952.
Slarti the White
Posts: 442
Joined: Mon Feb 22, 2021 1:23 pm
Location: Michigan
Verified: Badges... we don't need no stinkin' badges

Is Political Polling Meaningful Anymore?

#40

Post by Slarti the White »

According to 538's Nathaniel Rakich, The Polls Were Historically Accurate in 2022. Let's unpack that claim in the light of the question asked by this thread.
538 wrote:Let’s give a big round of applause to the pollsters. Measuring public opinion is, in many ways, harder than ever — and yet, the polling industry just had one of its most successful election cycles in U.S. history. Despite a loud chorus of naysayers claiming that the polls were either underestimating Democratic support or biased yet again against Republicans, the polls were more accurate in 2022 than in any cycle since at least 1998, with almost no bias toward either party.

Okay, what does this mean? It is a reference to the top line table of "Weighted-average error of polls in the final 21 days* before presidential primary and presidential, Senate, House and gubernatorial general elections since 1998". This shows:

Year        Senate   House   Gov.   Combined
2021-22       4.8     4.0     5.1      4.8
All years     5.4     6.1     5.4      6.0
Which, of course, justifies the title, but is it meaningful? I would suggest that the answer to this is no. One way of looking at this is to ask what the value of a low weighted-average error in polling is. How does it help us? How does it make the polling more useful? What does it tell us about how well polling guided decision making in this cycle and how it can better guide decisions in later cycles? If you can find answers to any of these questions in the table excerpt above (or even the full table at 538)... well, it doesn't matter because you can't. It is a value-free metric which statistically justifies the claim of the article (and, thereby, the usefulness of polling in general and aggregators like 538 in particular). Not that there's anything wrong with this, but I have a much higher bar for whether or not something is intrinsically worthwhile.

Which, of course, raises the question of what uses we would like polling to have. Personally, I think that meaningful polling would allow one to identify where to put scarce resources in elections, identify trends, and, in the end, accurately predict the results. The metric given tells us nothing at all about the first two points and very little about the third. Instead, it is a single statistic bereft of any context that might give it meaning. Truly, 538 could do much, much better at analyzing the data in their possession if they wished.
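As an aside, for anyone wondering what a "weighted-average error" even is, here's a toy version in Python -- the per-poll errors and weights below are invented, and 538's actual weighting scheme is the one described in their methodology article:

# Toy illustration of a weighted-average polling error.
# The per-poll errors and weights here are made up for illustration only.
polls = [
    # (absolute error in points, weight)
    (3.0, 1.0),
    (6.5, 0.5),
    (1.5, 2.0),
]

weighted_avg_error = sum(err * w for err, w in polls) / sum(w for _, w in polls)
print(round(weighted_avg_error, 2))   # 2.64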


538 wrote:Of course, some pollsters were more accurate than others. And today, we’ve updated the FiveThirtyEight pollster ratings to account for each pollster’s performance in the 2022 cycle. Our ratings are letter grades that we assign to each pollster based on historical accuracy and transparency. (You can read exactly how we calculate pollster ratings here.) They’re one of many tools you should use when deciding how much stock to place in a poll.

Again, this is all valid on its face, but a letter grade (and a partisan lean) tell us little about how we can use any pollster's data in a meaningful way. The article goes on to say:

538 wrote:Before we reveal the best- and worst-rated pollsters, let’s start with our regular review of polling accuracy overall. We analyzed virtually all polls conducted in the final 21 days before every presidential, U.S. Senate, U.S. House and gubernatorial general election, and every presidential primary, since 1998, using three lenses — error, “calls” and statistical bias — to conclude that 2022 was a banner year for polling.

Three lenses... wow. The problem here is that they are throwing everything into one pot and mixing it up -- in other words, they are treating all races equally and averaging across them. Again, this will get you statistics that you can write articles about, but it doesn't tell you anything about how helpful the polls were.

538 wrote:In our opinion, the best way to measure a poll’s accuracy is to look at its absolute error — i.e., the difference between a poll’s margin and the actual margin of the election (between the top two finishers in the election, not the poll). For example, if a poll gave the Democratic candidate a lead of 2 percentage points, but the Republican won the election by 1 point, that poll had a 3-point error.

The "best way" to measure a poll's accuracy? I will give them that it is a way to measure a poll's accuracy, but I can probably come up with a half dozen better ways to quantify both accuracy and usefulness. First and foremost, not all elections are created equally -- it's more important to make an accurate prediction for a close race than a blowout and the significance of errors in polling (viewed from the "lens" of the value of polling) depends on not just the difference between the poll's margin and the election's margin, but things like the qualities of the margins (big, small, which candidate is winning, etc.). Not to mention (and this article certainly doesn't mention it) the dynamics of the polling -- how did the polls change throughout the election season. Obviously we only have one actual reference point -- the election results -- but looking at how the polling changed and what that did or should have suggested regarding uses of the polls is important to understanding (and therefore being able to guide interventions into) campaigns and elections as a whole.

538 wrote:Interestingly, the weighted-average errors of Senate and gubernatorial races were only slightly lower than usual last year.

Maybe it's just me, but I don't really find that interesting.

538 wrote:Polling of the 2021-22 cycle mostly owes its success to a low error in House races. This past cycle was the first time since the 1999-2000 cycle that House polls were more accurate than Senate and gubernatorial polls.

But this isn’t as impressive as it sounds. The “House polls” group includes district-level polls of individual House races and national generic-congressional-ballot polls. And something we noticed early on in 2022 was that pollsters were conducting more generic-ballot polls and fewer district-level polls. Overall, since 1998, 21 percent of the House polls in our pollster-ratings database have been generic-ballot polls — but in 2021-22, 46 percent were. That’s higher than in any other election cycle.

So polling this cycle was "historically accurate" because pollsters were doing easy polling instead of hard polling. So we should be impressed why? Frankly, I look at this whole article (and I've already bent the 4 paragraph rule a bit, so I'm going to stop here with my response to it) as an example of the prime use of data analysis being to allow data wonks to write articles about polling that will get clicks. Now, there's nothing wrong with that -- certainly there is a clear moral difference between this, which is mostly harmless (see what I did there? :towel: ) and what FOX News does -- but I can't help but think about how much more could be done with the data, tools, and talent at 538 (and elsewhere). The article goes on to get into the details about their metrics and what they say -- you can read it on your own if you want to -- and while there's nothing wrong with what they say, it strikes me as a solid argument that there isn't a whole lot of meat on the bones of polling these days.
Okay, now that I've had my rant, I need to talk about how I think polling could be useful and how I would evaluate the results. First off, if we're using polling to guide the deployment of money, time, and resources we need to acknowledge that all races are not created equally nor are all errors of the same significance. So what does this mean? Basically, who cares if MTG wins her district by 15% or 20%? There are a great number of races in which the polling -- and its error -- are simply not important. This includes any so-called "safe" seat. Yes, participants (and other parties) in these elections are going to commission polls and they should be published, but you didn't need a poll to figure out that Nancy Pelosi was going to be re-elected.

On the other hand, if polling was so "historically accurate" in this cycle, led by the polling for the House, why didn't anybody realize that Lauren Boebert's seat was in play? If polling had told us that this race was too close to call, the Democratic party could have poured resources into Colorado and picked up this seat. That would have been a huge feather in the cap of the polling industry and ample justification for their existence. Analyzing the data to determine whether opportunities to pick up or defend seats could have been identified (or whether apparent opportunities were not real) seems like a no-brainer to me. To do this, you would need to look at polling over time in all the races that were either forecast to be close or ended up close, or both. Maybe you could even identify the effects of interventions and determine how to affect the dynamics of a race most efficiently and effectively. You know, actually use the scientific method to improve political campaigning?

Even if you just want to look at the final polling and the election results, like 538 did, there are still much better ways to assemble, display, and analyze the data. For instance, consider a kite graph where the x-axis is the final polling margin and the y-axis is the election margin. Each race is a point (the points can be colored and/or sized to indicate other variables such as the partisan lean of the pollsters or the money spent by each side on the race) and how the points cluster on the graph would tell you far more than 1,000 of the types of tables in this article. In this sort of picture, the distance from the line y=x would be the error, anything in the first and third quadrants would be a correct prediction, and anything near the origin would be a significant and accurate polling result (a race that was predicted to be close and turned out to be close).
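Something like this rough matplotlib sketch is what I have in mind. The data points are made up purely for illustration; in practice you would feed in 538's final polling averages and the certified results:

# Rough sketch of the "kite graph" described above, with invented data.
import matplotlib.pyplot as plt

# (final polling margin, actual election margin), D minus R, in points
races = [
    (4, 5), (-2, 3), (1, -1), (-8, -10), (12, 9),
    (0, 2), (-5, -3), (3, 1), (-1, -6), (7, 11),
]
poll_margin = [p for p, _ in races]
vote_margin = [v for _, v in races]

fig, ax = plt.subplots(figsize=(6, 6))
ax.scatter(poll_margin, vote_margin)

# Points on the y = x line were polled perfectly; distance from it is the error.
lim = 15
ax.plot([-lim, lim], [-lim, lim], linestyle="--", linewidth=1)

# Quadrant guides: the first and third quadrants are races the polls called correctly,
# and anything near the origin is a close race that was polled as close.
ax.axhline(0, linewidth=0.5)
ax.axvline(0, linewidth=0.5)

ax.set_xlim(-lim, lim)
ax.set_ylim(-lim, lim)
ax.set_xlabel("Final polling margin (D - R, points)")
ax.set_ylabel("Election margin (D - R, points)")
ax.set_title("Poll margin vs. election margin")
plt.show()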

Anyway, I hope you get the idea. With a modicum of ingenuity and the data that a site like 538 has available, much more sophisticated visuals, metrics, and analyses can be done -- which would, at the very least, help us understand how useful and effective polling really is. But I really doubt that people in the polling industry would like what they saw.
Luke
Posts: 5588
Joined: Mon Feb 22, 2021 1:21 pm
Location: @orly_licious With Pete Buttigieg and the other "open and defiant homosexuals" --Bryan Fischer AFA

Is Political Polling Meaningful Anymore?

#41

Post by Luke »

Wow Slarti, three cheers. Thank you. I posted a tweet about this inviting folks to take a look or join the discussion. I also sent it to a nationally known political handicapper (not mentioning names but you'd know them) who is taking a look and alerting his team.

I agree with the comments above: what messed things up in 2022 was the "flood the zone" strategy of terrible polling from Rasmussen and other F-rated pollsters. Their mischief did real damage to the national parties and candidates. Tim Ryan should have gotten more support in OH. Ron Johnson should have been challenged harder because that race was much closer than the averages were showing, etc. On the other hand, there were a lot of contrarian takes that the red wave wasn't going to happen (we focused on that in our "Democrats Are Going To Win Again" topic).

Slarti is on target with the Boebert race, that would have been a great demonstration.

In my opinion, as a barometer and general look at momentum (or not), polling is still helpful. While efforts are being made to reach more people in different ways, that's taking time. Hope many pollsters and aggregators see this topic. It's a great discussion, thank you Slarti and everyone :bighug:


Lt Root Beer of the Mighty 699th. Fogbow 💙s titular Mama June in Fogbow's Favourite Show™ Mama June: From Not To Hot! Fogbow's Theme Song™ Edith Massey's "I Got The Evidence!" https://www.youtube.com/watch?v=C5jDHZd0JAg
Mr brolin
Posts: 403
Joined: Mon Feb 22, 2021 5:59 pm
Occupation: Chief Blame Officer
Verified: ✅ as vaguely humanoid

Is Political Polling Meaningful Anymore?

#42

Post by Mr brolin »

As a wholly a-mathematical tech geek, my concern with polling and aggregators such as 538 is that they have become less "Here is useful information, make of it what you will and use it to help make informed decisions" and more "Here is our derived opinion on how stuff probably looks, or in our opinion should look; you should do THIS!"

People be lazy, people be happy to defer to an "expert" (particularly if it confirms their bias), people be happy to shift the blame: "Well, the polls said......!"

We are going down the whole Asimovian "psycho-history" rabbit hole now with some of this self-masturbatory polling nonsense. Let's just rename Nate Silver Hari Seldon....... :biggrin:
MN-Skeptic
Posts: 3000
Joined: Mon Feb 22, 2021 1:03 pm
Location: Twin Cities

Is Political Polling Meaningful Anymore?

#43

Post by MN-Skeptic »

I'm on Rasmussen's call list, and I got a polling call from them last night. One of the first questions had to do with whether the country is going in the right direction or the wrong direction. WTF? People reading the poll are going to read into my reply whatever spin works for them. I have no idea on how to answer the question. I think I said we're heading in the right direction based on Biden being President. But I could just as easily have answered that America's going in the wrong direction based on Trump and the Supreme Court. Without follow-up questions, it's just a stupid, uninformative question. :roll:
AndyinPA
Posts: 9857
Joined: Mon Feb 22, 2021 10:42 am
Location: Pittsburgh
Verified:

Is Political Polling Meaningful Anymore?

#44

Post by AndyinPA »

I do YouGov polls, and get frustrated over the same question all the time.
"Choose your leaders with wisdom and forethought. To be led by a coward is to be controlled by all that the coward fears… To be led by a liar is to ask to be told lies." -Octavia E. Butler
Estiveo
Posts: 2302
Joined: Mon Feb 22, 2021 9:50 am
Location: Inland valley, Central Coast, CA
Verified:

Is Political Polling Meaningful Anymore?

#45

Post by Estiveo »

AndyinPA wrote: Tue Jun 27, 2023 10:55 am I do YouGov polls, and get frustrated over the same question all the time.
Me three.
raison de arizona
Posts: 17656
Joined: Mon Feb 22, 2021 10:21 am
Location: Nothing, Arizona
Occupation: bit twiddler
Verified: ✔️ he/him/his

Is Political Polling Meaningful Anymore?

#46

Post by raison de arizona »

Nate Silver is a real dunderhead. Seems like he always has something to say, and it's usually stupid.
Nate Silver @NateSilver538 wrote: Very bad to do a Spanish Inquisition with pollsters based on their political orientation. I love my ex-colleagues (this is coming from a new guy they hired) but if this is their practice, hope ABC will stop use of 538 brand so it isn't associated with me.
https://washingtonexaminer.com/news/was ... bannon-fox
Noble Prize in Sarcasm @rewegreatyet wrote: This is the polling company that openly states the 2020 election was stolen, and also thinks Kari Lake won in Arizona…right?
“Remember, democracy never lasts long. It soon wastes, exhausts, and murders itself. There never was a democracy yet that did not commit suicide.” —John Adams
pipistrelle
Posts: 6693
Joined: Mon Feb 22, 2021 11:27 am

Is Political Polling Meaningful Anymore?

#47

Post by pipistrelle »

I dunno what he's trying to say but I've never understood how he's a genius, like I've never understood how Musk is a genius.
Suranis
Posts: 5830
Joined: Mon Feb 22, 2021 5:25 pm

Is Political Polling Meaningful Anymore?

#48

Post by Suranis »

He's a Maths Genius. No question. That does not mean that he is able to evaluate stuff outside of Maths.

I mean, I can kind of understand where he is coming from. He sees data as data and thinks that every input and every piece of information must be included or else the model is flawed. So the idea of freezing out one particular source of info would be horrifying to him. So he would probably fight to the death to include Rasmussen etc.

The problem is that the only thing that can really judge Rasmussen is how it compares to actual election results, as that's the only time polls are tested against real life. And because Rasmussen always corrects itself a week or so before the election, on PAPER Rasmussen looks Republican-leaning but otherwise fairly accurate.

But that kind of context is outside the expertise of a Math Quant. They just look at raw maths and data and try to make models to interpret that raw data.

That's why simply inventing fantasy poll firms screwed him in the last election. He knew down to his very core that he HAD to include that data or else his model would lack data and produce flawed results, but the fact is they were actively putting bad data into the system. They hit upon a flaw in the way a Math Quant thinks, and somehow he got the blame for it. Possibly due to residual anger over 2016.
Hic sunt dracones