Pollsters question influence of polls on elections

OTTAWA — What if there had been no public opinion polls published during last spring’s federal election campaign?

Would the NDP’s orange wave have swept across the country if no one realized there was a wave to catch? Would support for the Liberals have collapsed so utterly? Would the Conservatives have captured their long-sought majority?

Such questions are not simply idle speculation.

Given the controversy swirling around the accuracy and methodological adequacy of polls, the professionalism of pollsters and the polling literacy of journalists who slavishly report them, questions about the influence of polls on election results are arguably key to the health of Canada’s democracy.

The issue arose at a conference last week, where eight pollsters from the country’s most prominent public opinion research firms discussed the lessons they’ve gleaned from last May’s election.

“I do think this was an interesting election in that it’s hard to deny that polls themselves had a major impact on the outcome of the election,” said Derek Leebosh of Environics.

He said a CROP poll published shortly after the televised leaders’ debate and showing the NDP had leapt into a commanding lead in Quebec was like “a depth charge” exploding in the campaign.

“Imagine if there had been no polling in the election campaign at all. Nobody would have known that this phenomenon in Quebec was happening, and the orange crush would never, may not, have spread into the rest of the country. Even within Quebec, people might not have thought the NDP was a realistic option.”

Ekos Research’s Frank Graves countered there’s “strong evidence” the outcome wouldn’t have changed much had there been no polls.

Indeed, he argued there was no post-debate NDP surge, that his surveys showed NDP support “proceeding on a pretty placid, straight (upward) line” throughout the campaign.

Regardless, Graves maintained voters are “not that dumb”; even without polls they would’ve noticed the explosion of NDP lawn signs and anecdotal evidence of New Democrat popularity.

In any event, he said post-election polling found the vast majority of Canadians maintained they weren’t influenced by the polls and, among those who were influenced, there was no clear pattern favouring one party over another.

Environics’ Kevin Neuman was doubtful.

“People may say that (polls) don’t influence, but it would influence the media and how the media cover the story and frame the story,” he said, adding that the CROP poll “may have completely changed the media coverage.”

While they disagreed about the impact of polls, there was consensus among pollsters at the conference that media coverage of them is often sorely wanting.

Journalists, they agreed, fixate on the bald horse-race numbers, disregard issue-based surveys, misinterpret margins of error and make no distinction between proven and unproven polling methods.

The discussion was in many ways a polite echo of the rocket launched a couple of weeks ago by Darrell Bricker and John Wright, top honchos at Canada’s largest polling company, Ipsos Reid.

The duo penned an “open letter” to journalists covering the Ontario provincial election campaign, warning them that “some marginal pollsters” are counting on media ignorance and competitiveness to peddle “inferior” polls.

They accused some unidentified pollsters of becoming “hucksters selling methodological snake oil” and some media outlets of publishing questionable polls simply because the polls support the outlets’ editorial positions.

“All of this MUST stop,” the duo wrote. “We are distorting our democracy, confusing voters and destroying what should be a source of truth in election campaigns — the unbiased, truly scientific public opinion poll.”

Graves pointed out that pollsters used to have stable relationships with specific media outlets that paid well for quality surveys. Today, he said, it’s become a “sort of auction to the bottom,” with pollsters giving their research to the media for free as a publicity tool. As a result, the quality of polls and the level of “methodological fluency” among journalists reporting on them have plunged.

It’s not that pollsters are knowingly peddling “crappy” polls, he insisted. It’s just that they can no longer afford to do surveys with the “level of depth and rigour that we’d really like to” because “polling budgets today are a fraction of what they used to be.” And journalists don’t seem to notice the difference.

“The understanding of basic issues — like what is a margin of error, or how do you create a good sample — is dramatically lower than it was back when they had really smart (media) guys ... who knew as much as the pollster.”