Janna Zittrer

Digging a Bigger Poll


A lot of people think polls bury the issues and reduce elections to horse races. So what did Canadian media do for the 2006 general election? They stepped up the number and frequency of polls.

“You have to look at polls not as a neutral, transparent measuring instrument,” says Bob Hanke, peering through his red-framed glasses. “They’re shaped by journalists who think that knowing who’s ahead and who’s going to win – the whole emphasis on prediction – is more important than actually describing what you or I think or feel.”

It’s less than a week before January 23, Election Day, and Hanke is sitting in a Second Cup near his home office in Toronto. The assistant professor in York University’s Communication Studies program is also co-founder of CAMERA, the Committee on Alternative Media Experimentation, Research and Analysis, a collective dedicated to democratizing the media by debunking the techniques they commonly use. Conventional public polling is a prime target, he argues, because it pre-empts the normal process of debate, deliberation and reflection. At this point in the campaign, he says, “We should still be looking at competing ideological viewpoints and seeing which make sense according to our interests in health, education and the environment.”

credit: http://ca.geocities.com/theresbob/

The debate over the media’s use of polls during campaigns is ongoing and heated. One camp, in which Hanke has pitched his tent, derides the media’s use of polls, claiming it distorts reality and influences outcomes. Proponents, however, consider polling a valuable tool that informs the public. The argument goes back and forth, and even with no end in sight, Canadian media outlets commissioned and covered polls more ambitiously than ever this time around.

As Alan Bass, chair of journalism at Thompson Rivers University in Kamloops, B.C., says, “It’s ironic, but at a time when the reliability of public opinion polls is being questioned as never before, polls have dominated coverage of this campaign to an unprecedented extent.”

Public polling emerged in Canada in the 1940s, but the media did not consider the results worthy of publication until the ’60s, and it was only in the ’80s that polls began to form the basis of election coverage. Criticism soon followed as the flaws became known: the chosen sample, the way questions are worded, the range of possible answers, the sequence in which questions are asked and the length of the survey are just some of the factors critics cite to cast doubt on public polling.

In addition to questioning their accuracy, critics charge that polls influence electoral outcomes by creating a “bandwagon effect” (in which people decide to vote for a party because they believe that party is going to win), an “underdog effect” (in which people cast a sympathetic vote because they believe a party might lose), or a strategic-voting effect (in which people decide to vote for one party in order to keep another out of office). The fact that some studies disprove these effects while others support them only adds to the never-ending debate.

The 2006 general election was marked by methodological innovations and polling samples of unprecedented size. CanWest Global Communications Corp., for one, teamed up with Ipsos-Reid and used online polling based on the same parameters as telephone polling. “It was our most ambitious polling program ever,” says Darrell Bricker, president of Ipsos-Reid public affairs in North America. From a panel of 100,000 Canadians who answered a 60-question survey about their demographic backgrounds, Ipsos-Reid selected 12,000 people it believed formed a representative sample of voting Canadians based on factors like language background, age and regional distribution. Ipsos then sent them electronic surveys over the course of the campaign.
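
The approach Bricker describes is essentially quota or stratified sampling: drawing panellists so the panel’s mix of traits mirrors the voting population. A minimal sketch of the idea in Python (the quotas, field names and single-variable balancing here are illustrative assumptions, not Ipsos-Reid’s actual methodology):

```python
import random

# Hypothetical regional quotas for illustration only; a real panel is
# balanced on several variables (language, age, region) at once.
REGION_QUOTAS = {"Atlantic": 0.07, "Quebec": 0.24, "Ontario": 0.38,
                 "Prairies": 0.17, "BC": 0.14}

def select_panel(respondents, panel_size=12_000):
    """Draw a panel whose regional mix matches the target quotas.

    `respondents` is a list of dicts with a "region" key -- e.g. the
    100,000 people who answered the demographic screening survey.
    """
    panel = []
    for region, share in REGION_QUOTAS.items():
        pool = [r for r in respondents if r["region"] == region]
        needed = round(panel_size * share)
        panel.extend(random.sample(pool, min(needed, len(pool))))
    return panel
```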

Online polling allowed Global National to broadcast results within minutes, says Jason Keel, the program’s broadcast producer, and to measure how voter opinions changed not only on a daily basis, but even during the leaders’ debates. Keel says he was particularly impressed by the commitment of the poll’s respondents to not only watch the whole debate, but to answer questions both before and immediately after it. “Those Canadians obviously found that polling is something they want to participate in,” he says. “The Internet is changing polling because it makes it more interactive and easy.”

The Globe and Mail and CTV, together with Strategic Counsel, also conducted “the most ambitious polling in its history,” says Globe managing editor Colin MacKenzie. By using a “rolling tracking” method that involved polling 500 people a day and combining each three days’ worth of sampling into a total of 1,500, they were able to raise sample sizes while polling continuously throughout the campaign. “Rolling gives you a sense of continuity,” says MacKenzie. “Had we done it last time we would have detected an uptick,” he says, referring to the grossly off-beam prediction of a Conservative victory in 2004. The Globe, like most media outlets, polled through the final weekend this election, “just to not get caught by last-minute changes.”
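
Mechanically, the “rolling tracking” MacKenzie describes is a moving window: each night’s interviews are added, the oldest night’s are dropped, and the three most recent nights are reported as one sample. A rough sketch of that window logic (the function and data shapes are assumptions for illustration, not Strategic Counsel’s actual system):

```python
from collections import deque

def rolling_track(daily_samples, window=3):
    """Yield (day, combined_sample) pairs, pooling the most recent
    `window` days of interviews -- e.g. 3 x 500 responses = 1,500."""
    recent = deque(maxlen=window)  # the oldest day drops off automatically
    for day, interviews in enumerate(daily_samples, start=1):
        recent.append(interviews)
        if len(recent) == window:
            combined = [resp for batch in recent for resp in batch]
            yield day, combined  # days 1-3, then 2-4, and so on
```

Called with a list of nightly batches of roughly 500 responses each, this produces a fresh 1,500-person sample every day after the third.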

Even the CBC, which vowed not to cover election polls unless they showed a dramatic shift in public opinion, bent its own rule somewhat by summarizing the week’s major poll results every Sunday night during the campaign.

Despite the refinements, though, not all problems have been eliminated. Increased sample sizes may decrease margins of error, but rolling tracking generates problems of its own. A company may be able to produce a larger sample size by grouping three days’ worth of polling, but the results from days one and two can be out of date by day three because of events in the campaign. And even if more polls are conducted more quickly, they may still be the kinds of polls critics condemn.
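
The arithmetic behind both halves of that trade-off is straightforward. Treating a poll as a simple random sample (which online panels and rolling designs only approximate), the 95 per cent margin of error shrinks as the sample grows:

```python
from math import sqrt

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion measured in a
    simple random sample of size n (worst case at p = 0.5)."""
    return z * sqrt(p * (1 - p) / n)

print(f"n=500:   +/- {margin_of_error(500):.1%}")    # roughly 4.4 points
print(f"n=1,500: +/- {margin_of_error(1500):.1%}")   # roughly 2.5 points
```

Pooling three nights of 500 interviews nearly halves the margin of error, but only if opinion has not shifted over those three nights.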

Hanke is deeply opposed to “horse-race coverage,” in which the media focus on leadership and strategy to predict election outcomes. He says he’s not opposed to polling entirely, but wants the media to focus more on policy-preference polls and less on leader- or party-preference polls. “It would show a whole other side of public opinion,” he says, using Kyoto and Conservative Party of Canada leader Stephen Harper as an example. “If you asked people, ‘Do you support Kyoto?’ probably you’d find 95 per cent of people say ‘Yes.’ But look, they’re about to elect a prime minister who wants to scrap Kyoto.” On that particular issue, he says, voters’ approval or disapproval of a leader bears no connection to the policies they actually prefer.

But “the horse race is the key component of the election,” counters MacKenzie. Elections are all about who’s going to win, he says, which is why he has faith in survey research.

And Keel believes the media are justified in covering polls. “Polling actually is reflected in the results when you look at the polls closest to the election,” he says. “It’s important to know how Canadians feel because without that, you’re really just guessing.”

One journalist says the media’s insatiable appetite for polling data is not entirely editorially driven – it’s also a bottom-line imperative. “Polls are cheap and easy headlines,” says Antonia Zerbisias, media critic for The Toronto Star. “With them come dramatic headlines, colourful graphics and reaction.” The media also use polls to differentiate themselves from the competition, says Zerbisias, with each news organization striving for its own unique, poll-based story.

Whether polls are informative or coercive, Canadians will simply have to get used to the media’s infatuation with the kind of up-to-the-minute horse-race coverage mega-polling inevitably produces. This time, at least, the polls came close to predicting the actual results.
