Election cycles boost unscientific ‘research’ and polling data
With every election cycle, we see more and more research and polling data offered up by the news outlets. The year 2016 could be a record-breaker in that regard.
And it's not only news outlet polls: news channels like Fox News now air "focus groups" as program content, along with "audience dials" that track reactions in real time to candidates' sound bites.
Of course, there's also the news show's plea to "vote" for your position on a particular issue via social media, even as the host/anchor reminds us that "the results are unscientific."
Really? No kidding.
Our collective obsession with research, from rankings of "the friendliest city in America" to "the most affordable college" to "the most trustworthy candidate," may well eclipse baseball as the national pastime. Or is it football now? Let me check the stats on that …
What does all of this mean? For the news outlets, it means two things: Brand building and ratings content.
It's akin to the space race … Who can rush a poll to publication before the other guy and get it quoted by the competition, because it's news, right?
And now we have "research as theater," pitting opposing groups against each other, tightly packed on stadium seating, facilitated by a non-professional (the show's host) and calling it a "focus group."
To paraphrase Senator Bentsen's famous retort to Dan Quayle: "I know focus groups; focus groups have been a friend of mine. You, sir, are no focus group."
I can only imagine what real facilitators and qualified researchers think when they see Sean Hannity putting one of these on and calling it a focus group. Can't print that here, I'd imagine.
What's the poor consumer of news and opinion to do with this steadily increasing barrage of research-as-news?
As a marketing guy, let me help sort out this whole topic of research with the following four basic observations. Because trust me, it's only going to get worse.
1. Research is a valuable decision-making tool.
Every major brand relies on research to build, monitor and evaluate marketing strategy. This includes in-depth secondary research, which can range from collecting, organizing and analyzing data found on the Internet, like reports, articles, trends, etc., to buying category research off the shelf.
We can also include analysis of the brand's own sales or share data as part of this research area.
It can also include primary research: focus groups, mall intercepts, online or phone surveys, personal interviews, taste tests and so on.
The objective is to get inside the head of your target audience (potential buyer or voter) in order to forecast needs and outcomes.
2. Research can be incredibly flawed.
I'm reminded of the expression "garbage in, garbage out." The reliability of the "learnings" from research are directly tied to the quality of the ingredients used to create the research study.
For example, how current is the list of consumers/voters used for the survey? What is the makeup of the list? Is the list large enough to be statistically meaningful? How was the survey administered? How were the questions phrased?
There are a million variables that can skew the results, which is why "you should never attempt this at home, we (researchers) are professionals."
3. Even when done right, research can still get it wrong.
Think of it: If research were foolproof, Dewey would have defeated Truman, we'd be driving Edsels and drinking New Coke. Jeb Bush would be the presumptive Republican nominee and Great Britain would still be in the European Union.
By all reports, these situations were researched well, but the results were, as we now know, a bit surprising. Why? The simple answer is human nature, and it can affect the anticipated outcome in one of two ways.
On the one hand, the client or candidate will simply refuse to heed the data. We caution clients all the time to accept the results, good or bad. And sometimes, that's just too hard to swallow. Sometimes your baby is ugly.
On the other hand, people, being people, are just unpredictable. There's this little thing called emotion, an equal partner to the rational side of decision making, that tends to get in the way.
When a politician says "the only poll I believe is the one on Election Day," they're actually partly right.
4. What to look for in the political polling that comes your way.
It's unfortunate but nevertheless true: Media outlets have become increasingly biased, either left or right. And as even the most neophyte among us become polling connoisseurs, we naturally tend to give more credence to the poll conducted by our preferred news source.
So I won't bother to recommend one source or another. Remember that thing about emotion?
Do, however, look at three things that should always accompany a report on polling data: 1. sample size (the larger the better), 2. margin of error and 3. sample composition.
Margin of error is overlooked by most people; we all want a win-or-lose score. But in most cases, when the gap between the two sides is smaller than the plus-or-minus figure, it's really a dead heat.
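For the numbers-minded, here's a minimal back-of-the-envelope sketch in Python, assuming a simple random sample (real polls apply weighting and design effects, so their published margins run a bit wider). It shows how sample size translates into that plus-or-minus figure, and why a narrow gap inside it is effectively a tie.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a simple random sample.

    n: sample size, p: observed proportion (0.5 is the worst case),
    z: z-score for the confidence level (1.96 is roughly 95%).
    Treat this as a rough baseline, not what a pollster would report.
    """
    return z * math.sqrt(p * (1 - p) / n)

# A typical national poll of about 1,000 respondents:
moe = margin_of_error(1000)
print(f"margin of error: plus or minus {moe * 100:.1f} points")  # ~3.1 points

# So a 48% to 46% "lead" sits inside that band: statistically, a dead heat.
```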
Sample composition (the balance of respondents from one political persuasion or cause versus the opposing side) can create a definite skew in the results.
This information is usually harder to come by, but news outlets like to quote it when questioning the credibility of a competitor's poll.
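To make that skew concrete, here's a small hypothetical sketch; the party splits and candidate preferences below are invented for illustration, not drawn from any real poll.

```python
# How over-sampling one party shifts the topline number,
# even if not a single voter changes their mind.
def topline(share_dem, share_rep, share_ind,
            dem_for_a=0.90, rep_for_a=0.10, ind_for_a=0.50):
    """Candidate A's overall support, given the sample's party mix."""
    return (share_dem * dem_for_a +
            share_rep * rep_for_a +
            share_ind * ind_for_a)

balanced = topline(0.35, 0.35, 0.30)   # 35% D / 35% R / 30% independent
skewed   = topline(0.40, 0.30, 0.30)   # five more points of Democrats

print(f"balanced sample: A at {balanced:.0%}")  # 50%
print(f"skewed sample:   A at {skewed:.0%}")    # 54%
```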