Can we trust public opinion polls that are so frequently presented in the media? Political polls seem to be the most widely broadcast, along with polls about current controversial topics such as healthcare reform and immigration. It’s great that these polls keep the public aware of market research on a very basic level, and even create a vague understanding of “sampling error.”

But poorly crafted surveys abound, and when the credibility of polls is questioned as often as results are presented, the perception of market research suffers. The industry should hold pollsters to a higher standard in survey design, and the media to a higher standard in reporting results. The challenge is to show that research, when done correctly, is credible and can accurately describe public opinion.

Polling on healthcare reform legislation prior to its passage was a prime example of how much media coverage, question wording, respondent demographics, existing opinions and topic familiarity matter to results. Many pollsters assume respondents have a high level of knowledge about poll topics, but what is assumed to be common knowledge often is not. Many elements of the healthcare bill were discussed only at a cursory level (and sometimes incorrectly) in the media, and as a result, there was widespread misunderstanding about what the bill would actually do.

Death Panels
All of these factors came into play when people were surveyed about the pending bill; they reacted based on the latest misinformation being advanced by various media outlets. Sarah Palin’s oft-cited claim that “death panels” would decide who was worthy of care showed how misinformation becomes truth for people who have already formed an opinion, especially after overblown media coverage.

An August 2009 Pew Research Center poll found that 86 percent of respondents had heard of the “death panel” controversy; of those, 30 percent believed it was true and 50 percent said it was false. However, nearly half of Republican respondents (47 percent) believed the healthcare legislation would actually create “death panels.” And an August 2009 NBC News poll asked whether the healthcare bill “will allow the government to make decisions about when to stop providing medical care to the elderly”; 45 percent said it was likely to happen.

“Death panel” was an unfortunate and inaccurate term for a proposed provision to fund voluntary end-of-life counseling, which was ultimately removed from the bill.

Arizona’s Immigration Law
The issue of illegal immigration is at the forefront of media coverage after Arizona passed a controversial state law allowing police to ask suspected illegal immigrants for citizenship documents. You have undoubtedly heard that Arizona and national polls show a majority supporting the new stand on illegal immigration. As with healthcare reform, existing attitudes toward the subject influence how people respond to survey questions about it, but that context won’t be mentioned alongside poll results. People who think illegal immigration needs immediate, stringent legislation are more likely to support the governor of Arizona, while Hispanics and others who believe the law invites civil rights violations generally oppose it.

And like healthcare reform, the Arizona immigration law has controversial provisions that are misunderstood by the public and reported by the media with little detail. For example, much media attention has been given to the notion that Arizona police could stop anyone they suspect of being an illegal alien and demand citizenship documents. But the law states that an officer engaged in a “lawful stop, detention or arrest” must, when “practicable,” ask about a person’s legal status when “reasonable suspicion exists” that the person is in the U.S. illegally, as reported by azcentral.com.

Obviously, the scope of the law is unclear, and many fear that it will ultimately be interpreted and enforced subjectively by police, which makes the wording of poll questions even more important in this case.

Yet results of polls about the new law are being reported with the unspoken (and probably mistaken) assumption that respondents are well versed in the language of the bill. If a poll asks respondents whether they support the new law without presenting any of its actual language or a definition of what it does, responses can only be based on pre-existing opinions and media influences. For example:

  • A Pew Research Center poll about the law, conducted May 6-9, 2010, asked for opinions about “requiring people to produce documents verifying legal status.” Results show 73 percent approve and 23 percent disapprove, with 4 percent who don’t know.
  • But not surprisingly, support differs by political party affiliation: 86 percent of Republicans, 73 percent of Independents and 65 percent of Democrats approve.

If you compare these results to another poll that includes more specific language about the law, the results differ, with fewer in favor. Example: A Rasmussen Reports poll conducted April 27, 2010 asked, “Do you favor or oppose legislation that authorizes local police to stop and verify the immigration status of anyone they suspect of being an illegal immigrant?” Results show 55 percent in favor, 36 percent opposed and 9 percent not sure.

Neither poll question includes verbiage about police requesting documents only during a lawful stop, which would likely result in yet another level of support.
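
The size of that gap also cannot be blamed on sampling error alone. As a rough, hypothetical check, assume each poll used a simple random sample of about 1,000 respondents, a typical size for national polls (the actual sample sizes are not cited here). Under that assumption, each result carries a 95 percent margin of error of roughly three percentage points, far smaller than the 18-point difference between 73 percent and 55 percent, which points back to question wording and respondents’ prior opinions as the likelier explanation. A minimal sketch of the calculation, in Python:

    import math

    def margin_of_error(p, n, z=1.96):
        # 95 percent margin of error for a proportion p from a simple random sample of size n
        return z * math.sqrt(p * (1 - p) / n)

    # Hypothetical sample sizes: roughly 1,000 respondents each, a typical size for
    # national polls; the actual Pew and Rasmussen sample sizes are not given in this article.
    n = 1000
    pew_approve = 0.73        # Pew result: approve requiring documents verifying legal status
    rasmussen_favor = 0.55    # Rasmussen result: favor letting police stop and verify status

    moe_pew = margin_of_error(pew_approve, n)        # about +/- 2.8 points
    moe_ras = margin_of_error(rasmussen_favor, n)    # about +/- 3.1 points
    gap = (pew_approve - rasmussen_favor) * 100      # 18 points

    print(f"Pew: 73% +/- {moe_pew * 100:.1f} pts; "
          f"Rasmussen: 55% +/- {moe_ras * 100:.1f} pts; gap: {gap:.0f} pts")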

Call to Action
But in reporting poll results, media outlets typically give only the proportion of people “supporting or opposing” the law, without detail on question wording, assumptions about topic familiarity or respondent demographics. This oversimplification undermines market research: conflicting results become inevitable, and the media then question the legitimacy of the polls themselves.

As researchers, we should seek to make the public more aware of the limitations of poorly crafted opinion polls. It’s up to us to guide the discussion about the utility and interpretation of research results and to show how credible our findings can be. And we should encourage media outlets, through partnerships or campaigns, to include qualifying information when they report poll results. This would help distinguish between seemingly conflicting results on the same topic, and it might also drive a movement toward better questionnaire design across the industry. The result would be more credibility for the market research industry, and perhaps “response bias” and “confidence level” could join “sampling error” in the American lexicon.