Poll Pain: Don’t let polling get you down

By Matt Madden and the Marketing Lab

Good, unbiased polling actually works rather well, despite sensationalized headlines to the contrary. There are some "errors" for which we can rightly critique pollsters: their techniques affect measurement error, coverage error, and nonresponse error, all of which can bias results and hurt polling accuracy. Sampling error, on the other hand, shouldn't really be considered "error" at all. It's just the result of numbers bouncing around every time you collect data from a different random sample of people, and it will always be a part of polling no matter what techniques are used.
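
If you want to see what sampling error looks like, here's a minimal simulation with made-up numbers: a population where exactly 52% support a candidate, polled 1,000 people at a time. Each "poll" lands a little differently even though nothing about the population changed.

```python
import random

TRUE_SUPPORT = 0.52  # hypothetical "real" share of supporters
SAMPLE_SIZE = 1_000  # respondents per poll

def run_poll() -> float:
    """Simulate one poll: ask SAMPLE_SIZE random people yes/no."""
    yes = sum(random.random() < TRUE_SUPPORT for _ in range(SAMPLE_SIZE))
    return yes / SAMPLE_SIZE

# Five "identical" polls, five different answers -- that's sampling error.
for i in range(5):
    print(f"Poll {i + 1}: {run_poll():.1%}")
```

No methodology fix makes that bounce disappear; a bigger sample only shrinks it.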

Here's the catch, though: few people understand ANY of the terms above.

That's not to say everyone needs to become a statistician, but it is important to realize that without understanding the complexities and uncertainty of polling, you'll likely feel misled by the polls.

 

Polling is hard. The top polling experts in the nation suspected that a big problem with the 2016 presidential polls was that pollsters didn't weight for participants' education levels. People with college degrees respond to surveys at higher rates than people without them, so in 2020, pollsters weighted participants by education. Did it work? Who knows. The 2020 polls seemed to miss by about as much as the 2016 polls did, but maybe it could have been worse.
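
To make the education-weighting idea concrete, here's a deliberately simplified sketch (all numbers are made up, and real pollsters weight on many variables at once): if college graduates are 40% of the electorate but 60% of your respondents, you count each group by its share of the electorate instead of its share of the sample.

```python
# Hypothetical poll results: support for Candidate A, by education.
sample = {
    "college":    {"respondents": 600, "support_a": 0.55},
    "no_college": {"respondents": 400, "support_a": 0.40},
}

# Hypothetical electorate shares (in practice, drawn from census data).
population_share = {"college": 0.40, "no_college": 0.60}

total = sum(g["respondents"] for g in sample.values())

# Unweighted estimate: college grads are overrepresented.
raw = sum(g["respondents"] / total * g["support_a"] for g in sample.values())

# Weighted estimate: each group counts by its electorate share.
weighted = sum(population_share[k] * g["support_a"] for k, g in sample.items())

print(f"Unweighted: {raw:.1%}")       # 49.0%
print(f"Weighted:   {weighted:.1%}")  # 46.0%
```

A three-point swing from one weighting decision, which is exactly why the choice of what to weight on matters so much.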

The point is, most people should stop looking at polls as sources of news and instead treat them as a form of entertainment. Regular people don't read the footnotes of poll results to check the margin of error, and even if they did, understanding its implications is something even trained pollsters get wrong. Especially for something as expansive as a national poll, drawing any real conclusion is a tremendous challenge.
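
For the curious, the margin of error in those footnotes usually comes from a standard formula: at 95% confidence, it's roughly 1.96 * sqrt(p * (1 - p) / n) for a simple random sample. A quick sketch:

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """95% margin of error for a proportion from a simple random sample."""
    return z * math.sqrt(p * (1 - p) / n)

# A typical 1,000-person national poll showing a 52/48 split:
print(f"+/- {margin_of_error(p=0.52, n=1_000):.1%}")  # about +/- 3.1 points
```

And note what that number covers: sampling error only. The coverage, nonresponse, and measurement errors above come on top of it.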

In short: regular people should stop worrying about polls. And people in the industry should stop sharing useless poll results, because all they're doing is needlessly harming polling's reputation.

A Better Way to Use Polls

The thing is, we could use polls for so much more. As a forecasting tool, polls aren't that great: they set some expectations, and then we still have to wait for the votes to be counted to find out whether those expectations were met or dashed. But polls are one area where we can actually seek to understand the reasons people vote the way they do. Polling should be a source of insight and empathy. Polls are rarely used that way, and it's a missed opportunity. I hope more news outlets and political analysts learn from their mistakes and start emphasizing this use of polls, rather than settling for mediocre (and mostly meaningless) forecasts.

What About Polling For Marketing Research?

As we mentioned, polling can be a great source of insight and empathy. If you want to understand why consumers make the choices they do, and what they think about your product or service, polling can be very useful (not definitive, but useful). Again, polling has error inherently built into it, so making decisions based on polling is a calculated risk.

Potential Polling Errors to Consider In Your Marketing Research

Non-response or participation bias: when the people who take part in a poll disproportionately share certain traits, so they no longer represent the population.

Sorry Alf, non-response bias got your hopes up.

Some have theorized that Trump supporters are less willing to take part in presidential polls, meaning Biden supporters made up a disproportionate share of poll participants. Take a spin further back in time and we can see how impactful this issue can be. In the 1936 U.S. presidential election, The Literary Digest mailed out 10 million questionnaires, of which 2.4 million were returned. Based on these, it predicted that Republican Alf Landon would win with 370 of 531 electoral votes; he got eight. Research published in 1976 and 1988 concluded that non-response bias was the primary source of this error: the type of people willing to fill out and return a survey happened to skew Republican.

Participation in general is a related problem today, now that we treat pollsters like telemarketers. According to the Pew Research Center, as fewer people pick up the phone for unknown callers, survey response rates have fallen from 36% in 1997 to 6% in 2018.

For your business, you must consider the type of people who are willing to take part in your poll. There are ways researchers try to randomly sample participants, but participation bias has to be factored in.
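
To see how participation bias can skew a result, here's a toy simulation with hypothetical rates: your customer base is split 50/50 between fans and critics of your product, but fans are twice as likely to answer the survey.

```python
import random

# Hypothetical response rates: fans answer 30% of the time, critics 15%.
RESPONSE_RATE = {"fan": 0.30, "critic": 0.15}

# True population: exactly half fans, half critics.
population = ["fan"] * 50_000 + ["critic"] * 50_000

respondents = [p for p in population if random.random() < RESPONSE_RATE[p]]
fans = sum(1 for r in respondents if r == "fan")

# True satisfaction is 50%, but the raw survey reads about 67%.
print(f"Raw survey satisfaction: {fans / len(respondents):.1%}")
```

The raw number overstates satisfaction not because anyone lied, but because of who showed up.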

Measurement error: when pollsters get inaccurate responses because of poor question wording, questionnaire design, or interviewing technique.

For marketers, if your poll is bad, your data is bad. Good marketing researchers will bake the probability of these errors into the final results, but careful crafting of your wording is key.


Tools For Those Who Want to Understand Polls Better

If you can't help yourself, and you are a news and political junkie, feel free to keep following polls. I would advise you to heed the wisdom of British statistician George E. P. Box, who said, "All models are wrong, but some are useful." Read and follow sites like Nate Silver's FiveThirtyEight, an excellent resource on polling (it correctly predicted 49 of 50 states in 2008, all 50 states in 2012, and 48 of 50 in 2020, and it gave the 2016 winner a better chance than almost any other forecaster out there). Learn a bit about statistics and sources of statistical error. And mostly, don't let the polls get you down. They are just fine, when used properly.

Sources: 

https://www.wsj.com/articles/what-went-wrong-with-the-polls-this-year-11604536409

https://ropercenter.cornell.edu/polling-and-public-opinion/polling-fundamentals

https://www.newyorker.com/news/q-and-a/nate-cohn-explains-what-the-polls-got-wrong