Poll Position

November 26, four days before the 1998 Quebec election: A smug Jacques Parizeau, arms folded across his ample girth, looks out from the front page of the Ottawa Citizen under the headline “Parizeau: Take the ‘Booty’ and Run.” Under the photo caption is the first half of the story titled “Separatists Poised for Landslide.” It’s the kind of article you’d expect to find during an election campaign: lots of numbers, who’s up, who’s down, plus a prediction.

But there is something unexpected here, too, right under the empty chair to Parizeau’s left: a small, understated sidebar box, split in two. One half is labelled “Today’s Poll” and the other half declares “Yesterday’s Polls.” The latter features polls from such papers as the Montreal Gazette, Le Soleil and the National Post. Newspapers rarely group poll results. But Harvey Schachter, the former editor of The Kingston Whig-Standard, would like to see them make the poll box a habit. The public, he argues convincingly, would be better served.

Schachter has been a journalist for longer than I’ve been alive. He has taken a specialized course in polling at Williams College in Massachusetts, and has run workshops on how to understand polls for the editorial staffers at The Toronto Star, the Ottawa Citizen and The Globe and Mail. He’s all for the box approach. “Poll coverage can be extreme during elections, and you can cover polls without them being on page one every day.” A chart or box lets readers compare: they can examine the latest polls to spot trends or to discover whether any of the numbers are out of line, and the results are put in context. He believes it’s the only way rogue polls can be spotted.

Regular boxes would be useful for reporters and editors as well. “Unfortunately, most reporters don’t know what a newsworthy poll is,” says Schachter. “You have to look for changes, but you won’t know until the next poll is done if the results were from a rogue poll or if they were accurate. There’s pressure to use a poll because the media get two dramatic stories: the rocket and the decline.”

Pollsters all vie for the dead-on accurate poll, the one in which final election numbers match the prediction. In a competitive industry, the coveted bragging rights go to the firm whose poll numbers mirror reality. In polling, though, a number really represents a range of numbers. As Schachter says: “The best polls wouldn’t necessarily be the closest to the actual result because there’s the margin of error. It’s a delusionary concept that a poll is a single number.”

The polls the Citizen included in its “Yesterday’s Polls” box had a variety of predicted outcomes. SOM polled for the Montreal Gazette and Le Soleil, COMPAS for the National Post and Angus Reid for (among others) CBC Newsworld and Radio-Canada. In the end, COMPAS’s results were a duplicate of the election results, with the Parti Québécois taking 45 percent of the vote, the Liberals 44 percent and the Action Démocratique 11 percent. However, given a sample size of approximately 1,000 people, a common number for pollsters, the margin of error was about plus or minus three percent. So 44 percent could really be any number between 41 and 47, and 45 could be between 42 and 48.
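For readers who want to check the arithmetic, here is a minimal sketch of the textbook margin-of-error calculation for a simple random sample, assuming the roughly 1,000 respondents and the conventional 95 percent confidence level; the party shares are the COMPAS figures cited above, and this standard formula is an illustration, not necessarily the exact method any of these firms used:

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for a proportion p
    estimated from a simple random sample of n respondents
    (z = 1.96 corresponds to 95% confidence)."""
    return z * math.sqrt(p * (1 - p) / n)

n = 1000  # roughly the sample size these pollsters used

# The worst case (an even 50-50 split) gives the familiar
# "plus or minus about three percent" for samples of this size.
print(f"Worst-case margin: +/- {margin_of_error(0.5, n) * 100:.1f} points")

# Ranges implied for the COMPAS figures cited in the article
for party, share in [("PQ", 0.45), ("Liberals", 0.44), ("ADQ", 0.11)]:
    moe = margin_of_error(share, n) * 100
    pct = share * 100
    print(f"{party}: {pct:.0f}% -> roughly {pct - moe:.0f}% to {pct + moe:.0f}%")
```

Run with these numbers, the worst-case margin comes out to about 3.1 points, and the 44 and 45 percent figures translate into the 41-to-47 and 42-to-48 ranges mentioned above; shares farther from 50 percent carry slightly smaller margins.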

By contrast, Angus Reid had the Liberals at 41 percent of the decided vote and the PQ at 46 percent. Though the numbers weren’t dead on, Angus Reid wasn’t exactly kilometres off target. Once the margin of error was put into play, the Liberal support ranged from a high of 44 percent to a low of 38 percent, while the PQ’s numbers ranged from 49 percent to 43 percent. The actual outcome of the election fell within those ranges, and so the Angus Reid poll was accurate too. It just wasn’t remarkable. Bringing up the rear was SOM, which had the PQ at 43 percent and the Liberals at an unusually low 30 percent. Even with the margin of error, the numbers were off.

The folks at COMPAS credit their “leaner” question with giving them such “precise” results. Having a good leaner question is like having the killer app: everyone wants it. Leaner questions help pollsters distribute the undecided voters. This gives them a clearer indication of where the support for the various parties lies. In a posting to the Canadian Association of Journalists e-mail list, Conrad Winn, COMPAS president, explained part of his method. “Asked of voters claiming to be undecided, COMPAS’ more coercive ‘leaner’ question reduces dramatically the number of respondents claiming to be undecided. As you may know, in Quebec, avowedly partisan voters are disproportionately Péquiste, while ostensible ‘leaners’ are Liberals.”

The question is “a way of reducing the number of undecideds in a poll by giving them fewer options. I’m sorry. I can’t go into any more detail than that,” Winn said. It’s his belief that a “bias of fear in our culture” is one of the things that causes voters to give the dreaded “undecided” response. People may be afraid they don’t know enough about a candidate or a party platform, so they hide their perceived ignorance by telling the interviewer they have no opinion. If a survey is crafted to significantly reduce the opportunity for playing the undecided card, more respondents have to make a choice. So what was Winn’s knockout leaner question? “I can’t tell you exactly what I did, for proprietary reasons,” he politely but firmly replied.

In 1991, the Royal Commission on Electoral Reform and Party Financing highlighted a need for information comparison: “When two or more pollsters are seeking essentially the same information, yet produce different results, doubts naturally arise about their methodology.” The commission recommended that papers include more methodological information to better serve readers. From what I’ve seen, the dailies have done this. There’s more information on sample sizes, dates of surveys and margins of error. A chart box would round out their efforts to provide more detailed information. “Polls should be on the editorial page in a box,” says Schachter. “You can cover polls without going to five leaders and have one say it’s great and have four say it’s wrong.”
