This Post Worth Reading 19 Times Out of 20
With B.C. heading to the polls in one month (vote yes!) and federal politics in a state of uncertainty, potentially leading to an election sometime this year, I thought this would be a good time to add another dropdown box, this time with Canadian polling firms in it (if I missed any important ones, drop me a line).
As with many topics, it seems, I have mixed feelings about polls. On the one hand, I'm a numbers person with a bit of a stats background, so I enjoy digging into poll results to see what can be made of them. On the other hand, I find that polls feed into the 'horserace' mentality in which all anyone cares about is whose numbers are good and whose are bad, rather than the actual issues. Furthermore, because the general population often seems to be a little lacking in its ability to interpret numbers, statistics tends to be an area where unscrupulous people will try to manipulate the numbers to suit their point. And in no area of statistics is this more true than with polls.
The most common way to manipulate a poll is through the structuring of the questions. The pathetic B.C. referendum on native issues was a classic example of how to write a question to get the answer you want to hear. If this fails, the second line of defence is to interpret the results in a way which isn't consistent with the answers received. My recent post on my STV blog is a good example, where the 'lead' headline from a poll gave the opposite result from the poll results themselves.
Another problem with polls is that people try to read meaning into small movements which could well be just random fluctuations. This is especially true when the numbers are broken down into regions or demographic groups, which typically have much higher levels of uncertainty than the national totals.
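To put some rough numbers on that, here's a quick sketch of how the standard '19 times out of 20' margin of error balloons once you slice a national sample into regions. The figures are illustrative, not from any real poll:

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error (the '19 times out of 20' in poll fine print)
    for a proportion p estimated from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

p = 0.35            # a party polling at 35% (made-up number)
national_n = 1000   # a typical national sample
regional_n = 150    # roughly what's left once you slice out one region

print(f"National (n={national_n}): +/- {margin_of_error(p, national_n):.1%}")
print(f"Regional (n={regional_n}): +/- {margin_of_error(p, regional_n):.1%}")
# National (n=1000): +/- 3.0%
# Regional (n=150): +/- 7.6%
```

So a two-point regional 'surge' that earns a breathless headline is comfortably inside the noise.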
There is legislation outlining disclosure requirements for polls, but as this article by Claire Durand, a professor at the Université de Montréal, shows, compliance with these rules is inconsistent at best. Of course, I'm not sure how much good the rules do, since few people seem to pay attention to the poll fine print anyway.
Even if the pollsters are doing their job honourably and competently (as many are) and the media is interpreting the results clearly and objectively (this happens every now and then), there are still issues with polls. One of the biggest is the continuing rise in refusal rates. Veteran pollster Angus Reid refers to high refusal rates as polling's 'dirty secret' in this worthwhile article on polling in the Tyee. Besides increasing costs and putting downward pressure on sample sizes, this potentially biases the results if refusal rates differ between parties' supporters. Reid figures the solution is for the media to spend more on polling (shocking!) to make sure they get things right, and notes that in the era before polls became prevalent, people still devoted a lot of effort to 'horserace' politics and attempts to predict the outcome.
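To see how differential refusal rates could skew things, here's a toy simulation. The parties, support levels, and response rates are all made up purely for illustration:

```python
import random

random.seed(1)

# Hypothetical electorate: 40% Party A, 40% Party B, 20% other.
true_support = {"A": 0.40, "B": 0.40, "Other": 0.20}
# Assume (purely for illustration) that Party B supporters hang up
# on pollsters twice as often as Party A supporters do.
response_rate = {"A": 0.30, "B": 0.15, "Other": 0.25}

parties = list(true_support)
responses = []
for _ in range(100_000):  # 100,000 call attempts
    party = random.choices(parties,
                           weights=[true_support[p] for p in parties])[0]
    if random.random() < response_rate[party]:
        responses.append(party)  # only responders get counted

for party in parties:
    observed = responses.count(party) / len(responses)
    print(f"{party}: true support {true_support[party]:.0%}, "
          f"poll reports {observed:.0%}")
# A: true support 40%, poll reports ~52%  (inflated)
# B: true support 40%, poll reports ~26%  (deflated)
```

Nobody lied and the sample was huge, yet the poll is off by a dozen points simply because one side answered the phone more often.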
So far, there's no word on the possible rise of a new generation of pop-culture-referencing smart-asses who are happy to talk to pollsters and just make up false answers as they go along (count me in with this group), and how that could skew the results.
It's been a rough year for the pollsters. They completely failed to see the Liberal minority government coming, and down in the U.S., the massive discrepancy between the exit poll[1] results and the vote count still has not been resolved half a year later.[2]
Looking to the future, it seems like times will only get tougher for the pollsters. On the one hand, our growing demand for up-to-date information will drive more and more polling; on the other, our growing reluctance to talk (honestly) to pollsters will make it harder for them to get accurate results. People are already skeptical about relying too heavily on poll results or trusting media interpretations of them, and this isn't likely to help. Darren's negative reaction to polling (which proved justified when the federal election results came in) may become increasingly common. Polls make for good amusement in the punditocracy, but their usefulness in the real world is pretty limited. If nothing else, I recommend taking all poll results with a grain (or two) of salt.
----------------
If you're interested in polling and recent Canadian polls, here are some links:
HillWatch's polling station, which has a good historical log of political polls.
Maple Leaf Web has a really good FAQ on the topic.
CRIC has a great archive of polls organized by polling company.
Michigan State University's Canadian Studies Centre has a dated (but still useful for the companies that are still around) list of polling companies with descriptions of each one.
----------------------
Polling companies tend to pair up with media outlets. The typical pattern is:
Environics = CBC
Ipsos-Reid = Globe and Mail / CTV
SES Research = Toronto Star / Sun Media
COMPAS = National Post / CanWest
There are certainly exceptions, though, and the pairings can change as contracts and relationships end.
----------------------
[1] Exit polls are polls conducted as people leave the voting station. Because they record how people actually voted (if they are honest), as opposed to just how they intend to vote, exit polls are generally much more accurate than pre-election polls.
[2] The firm responsible for the exit polling released a report on what went wrong, which said: "Our investigation of the differences between the exit poll estimates and the actual vote count point to one primary reason: in a number of precincts a higher than average Within Precinct Error most likely due to Kerry voters participating in the exit polls at a higher rate than Bush voters."
But they didn't really do any analysis to back that up, and as this report shows, their explanation doesn't really hold water: "Edison/Mitofsky did not come close to justifying this position, however, even though they have access to the raw, unadjusted, precinct-specific data set. The data that Edison/Mitofsky did offer in their report show how implausible this theory is."
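For anyone wondering what 'Within Precinct Error' actually measures: it's just the gap between a precinct's exit-poll margin and its official vote margin. Here's a minimal sketch of the arithmetic with invented precinct numbers, using the sign convention where a negative value means the exit poll overstated Kerry relative to the count:

```python
# Within Precinct Error (WPE): official margin minus exit-poll margin,
# computed precinct by precinct. All numbers below are invented purely
# to show the calculation.
precincts = [
    # (exit-poll Kerry share, exit-poll Bush share,
    #  official Kerry share, official Bush share)
    (0.54, 0.45, 0.51, 0.48),
    (0.48, 0.51, 0.44, 0.55),
    (0.60, 0.39, 0.57, 0.42),
]

wpes = []
for poll_k, poll_b, vote_k, vote_b in precincts:
    poll_margin = poll_k - poll_b
    vote_margin = vote_k - vote_b
    wpes.append(vote_margin - poll_margin)

mean_wpe = sum(wpes) / len(wpes)
# A consistently negative mean WPE means the exit polls leaned toward
# Kerry relative to the official count, across precincts.
print(f"Mean WPE: {mean_wpe:+.3f}")  # Mean WPE: -0.067
```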
Labels: one of the better ones, polls
4 Comments:
For more polling news, you could also check out the news page at the Centre for Public Opinion and Democracy, part of Angus Reid Consultants. (Full disclosure: my wife, Alexandra Samuel, heads up their Dialogue Networks public engagement practice. But I'd been a CPOD regular for quite a while before that...)
By Rob Cottingham, at 11:30 AM
Thanks for the link, Rob. I tried putting in Canada as the keyword, and it seems like their archive is pretty thorough.
By Declan, at 1:41 PM
While I agree with most of the post, I think blaming the pollsters for the difference between exit polls and results in the 2004 US election is a poor example.
Most of the evidence I've seen suggests that the difference in that case was probably due to fraud, in which case the polls may have gotten it righter than the actual election.
By Anonymous, at 3:44 PM
rufus - you make a good point (that so far there has been no plausible explanation for why the exit polls were wrong so it seems harsh to blame them) and I probably should have worded that paragraph differently. I think if we had seen that exit poll / result discrepancy in another part of the world (say, the Ukraine?), the legitimacy of the election would have been called into question.
Still, given the long history of democracy in the U.S. and the intense level of oversight over the electoral process there, the idea that the U.S. election could be rigged to that extent seems implausible.
Ergo, it must have been the polls which were wrong, and thus the reputation of polling is tarnished and it is part of a bad year for polls.
By Declan, at 10:40 PM