Polling is seductive. (Polling, not pollsters.)
You ask questions, get answers and turn those answers into numbers. Some people put decimal points next to those numbers to make you think they’re dealing with physics.
Everyone just assumes that the answers are meaningful.
Often, they aren’t.
During the run up to last year’s Alabama special Senate election, JMC Analytics put this question to voters: “Given the allegations that have come out about Roy Moore’s alleged sexual misconduct against four underage women, are you more or less likely to support him as a result of these allegations?”
Peering out from under the somewhat mangled syntax is a question format many pollsters use. “Would you be more or less likely to vote for Candidate X if Y were true about him/her?”
Twenty-nine percent of Alabamans said the allegations of pedophilia increased their chances of voting for Moore.
Does anybody really believe that 29 percent of Alabamans were saying to themselves something akin to, “Well, there was only a 40 percent chance I was going to vote for him, but now that I know he might be a pedophile, the chances of my voting for him jumped up to 70 percent”?
Certainly not. But that’s what the question asked, and in essence that’s how it’s typically interpreted.
Since most of those claiming pedophilia was a positive were white, evangelical Christians, I assume what they really meant was not “I’m voting for him because of the allegations,” but rather “I’m going to use this question to express my strong support for Moore by saying I’m more likely to vote for him.”
In short, responses to the question are not telling us what we think they are, based on the plain meaning of the question.
Fox News used a similar format to ask North Dakotans a different question: “If [Sen.] Heidi Heitkamp [(D-N.D.)] votes against Brett Kavanaugh’s nomination to the Supreme Court, would that make you more likely or less likely to vote for her, or would it not make a difference to your vote for Senate?”
Eighteen percent of likely voters said they’d be more likely to vote for her, and 25 percent said they’d be less likely to cast their ballots for the incumbent.
What does it mean, though, that 44 percent of Republicans said “less likely” when 85 percent were already supporting Heitkamp’s opponent and some 80 percent of them were “certain” to do so?
Were they 100 percent behind Heitkamp’s opponent before the Kavanaugh vote, but predicting they would be 150 percent likely to vote for him if she opposed Kavanaugh?
Meanwhile, 40 percent of Democrats said they’d be more likely to support Heitkamp if she opposed Kavanaugh, even though 92 percent already supported her and over 90 percent were certain to do so.
Again, it’s pretty certain that:
a. respondents were, as we sometimes advise our clients, answering the question they wanted to answer, not the question we asked, and
b. we should not interpret such questions, as pollsters often do, as gauges of how powerful an argument might be as a positive or a negative.
Two Yale political scientists put this question format to a rigorous test, pitting responses to several questions of this type against the actual attitude change induced by the information in a randomized controlled experiment.
They found respondents not only exaggerate the degree of change generated by the information, but also sometimes get the direction wrong — saying some information will make them less likely to vote for a candidate when experiments reveal it makes people more likely to do so, and vice versa.
So, if a pollster hands you a questionnaire asking a bunch of questions about whether respondents would be more or less likely to vote for someone who did X, ask them politely to start over or risk being seduced and misled by questions that appear to mean something but don’t.
Future columns will describe other question formats that don’t mean what they purport to mean.
Mellman is president of The Mellman Group and has helped elect 30 U.S. senators, 12 governors and dozens of House members. Mellman served as pollster to Senate Democratic leaders for over 20 years and as president of the American Association of Political Consultants.