How to lie with (other people's) statistics
One of the reasons Fox News and its bedmates are able to pass themselves off as news organizations so effectively is the Emperor's New Clothes effect. If somebody calls out an authority figure or declares that the official version of something ought to be in doubt, the journalist's job is to pass out the pitchforks and torches -- not to ask whether the numbers might say exactly what they appear to say. It's sort of like that first cry of "-gate" over some publisher's favorite scandal-in-the-making: Your watchdog reputation is at some risk if you point out that it's bogus, even when it is.
Hence, even though in this case it's an out-and-out lie, "another phony poll" doesn't just resonate around the echo chamber; it puts a bug in the grownup world's ear too. It looks like what journalism is supposed to be doing. Polls are suspect anyway (being based on data, which is inferior to gut feelings and news sense), and catching a fake one is a triumph of the common man over the spinmeisters -- right, The American Thinker?
This week brought news of Obamacare securing 7.1 million enrollees and the expected jubilation from the White House. ... The veracity of these numbers is questionable, as many have pointed out. But predictably, the media is doing its best to paint a rosy picture of the success of Obamacare, sharing all of the “good news” of the enrollment numbers.
ABC News and the Washington Post piled on with a poll of their own, demonstrating how public opinion has shifted in a favorable direction coincident with the President’s announcement of the enrollment numbers. “Public support for the Affordable Care Act narrowly notched a new high in the latest ABC News/Washington Post poll” they gleefully report. But a closer look at the poll suggests otherwise.
That point marks the jump over at The Fox Nation, and for a large part of the audience, the job is already done. Too bad. The real fun is just beginning:
Langer Research Associates, a professional research and polling firm, conducted the poll. They pledge to, “Ask only questions that in our professional judgment will produce meaningful, unbiased results.” So what did they ask that produced a new high in support for Obamacare?
Their survey question asked, “Overall, do you support or oppose the federal law making changes to the health care system?” What specific federal law? No mention of the Affordable Care Act or Obamacare. And what changes specifically? Is the question about the current federal law that makes those changes (Obamacare) or is it about a desire for using the federal law to make further changes? Words mean things. How these questions are phrased leads to different interpretations and answers.
That's true. Question design is a potential source of error in any poll, and it's always a fair issue to raise. (The infamous Roper Holocaust poll is a good introduction.) And after a little ranting, our expert gets around to his point:
So what does this poll really tell us? That half the country wants changes to the status quo in the health care system. Not that they support Obamacare in its current state or any of the 19 changes made in the law since passage.
Suppose the poll asked, “In general, do you support, oppose or neither support nor oppose the health care reforms that were passed by Congress in March of 2010?” This is a bit more specific and references Obamacare, not by name, but at least by the legislation passed by Congress in 2010. This poll, taken a week before the ABCNews/WashPost poll, by GfK Public Affairs & Corporate Communications, showed only 26 percent support for Obamacare. Just a small difference in wording and support is cut in half from one poll to another.
Just a guess -- when he says "by name," do you figure he means "Obamacare" or "Patient Protection and Affordable Care Act"? Anyway, his point is clear: The libruls are using dishonest methodological tricks to sway the public! And that's where he's lying. "The federal law making changes to the health care system" is the wording Langer has been using for the past three years on this question; if you want to call the 49% result a nonsignificant increase from the same poll's findings in January and December (both 46%), or a significant increase from November (40%), go ahead. As public opinion polling goes, that's one of the safer comparisons you can make.
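For the arithmetic-minded: those significance claims are easy to check with a two-proportion z-test. A minimal sketch, assuming samples of about 1,000 per wave (typical for ABC/Post polls; the actual n isn't given here, and a real analysis would also account for design effects):

```python
import math

def two_prop_z(p1, p2, n1=1000, n2=1000):
    """Pooled two-proportion z statistic for comparing two poll waves."""
    p = (p1 * n1 + p2 * n2) / (n1 + n2)          # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# 49% now vs. 46% in January/December: z is below 1.96,
# so the bump is not significant at 95% confidence.
print(round(two_prop_z(0.49, 0.46), 2))  # → 1.34

# 49% now vs. 40% in November: well past 1.96, so significant.
print(round(two_prop_z(0.49, 0.40), 2))  # → 4.05
```

Same question, same pollster, same mode -- which is exactly why this is one of the safer comparisons available.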
Comparing the Langer poll to the GfK poll for the AP, on the other hand, is pretty stupid -- you'll notice the "neither support nor oppose" option in the GfK question, which is soaking up 30% of the responses. That could explain part of why there's less support for the ACA in the GfK poll. It could also explain why there's less opposition (43% vs. 48%) and less "strong" opposition (31% vs. 36%). Neither of those results would be statistically significant at 95% confidence in genuinely comparable polls, but both support the idea that question wording makes a real -- but by no means conclusively partisan -- difference.
Polls are going to differ in details. CBS, for example, calls it "the health care law that was enacted in 2010" but doesn't give a breakdown on the proportion of landlines and cell phones in its sample; here in the broad sunny uplands of 2014, the latter is a much bigger methodological concern than the former. Fox has been asking about "the new national health care law that was passed by Congress and signed into law by President Obama in 2010" for at least two years; I wouldn't be surprised if there's some influence on the result just from saying "Obama," but you're still safe in concluding that approval on that question is pretty much where it was two years ago after a slight decline in the second half of 2013.
And given the general level of scorn in which Fox's journalistic efforts ought to be held, it's worth pointing out that Fox's polling is reliable because, in general, it's consistent about doing stuff right. When you let the political staff write the questions -- who would win a game of chess, Obama or Putin? -- you get what you pay for, but if you want a nice, steady reading on a monovalent approve-or-disapprove question, bet on the folks who tell you exactly what they're asking and how they got their sample.
Being a dipstick, though, The American Thinker doesn't qualify for the professional respect that Fox has earned:
And the results can be spun accordingly. The Washington Post headline about their own poll crows, “Democrats’ support for Obamacare surges.” Others claim, “Nearly half of Americans support Obamacare.” And some see 49 percent as a majority, “New poll: More Americans now support 'Obamacare' than oppose it.” Yet an honest analysis based on the wording of the question suggests anything but such support.
In order: It'd be nice if headline writers banned "surge" from their vocabularies, but an 11-point change from January is pretty impressive. (The change in support from November among self-identified conservatives -- 17% to 36% -- is more striking, but subgroup confidence intervals are a different problem altogether.) It's hard to see why "nearly half" is an exaggeration of 49% (or of the 47% in the NPR poll that was in the field March 19-23); if it was any more than 49%, it'd be -- um -- "at least half." No, saying one proportion is larger than another is not the same as saying that one of them is larger than 50%. Those are some ways you could provide an "honest analysis based on the wording of the question"; the author's inability to cope with them suggests both weaselhood and cluelessness.
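On why subgroup confidence intervals are "a different problem": the margin of error grows as the slice of the sample shrinks. A quick sketch, assuming a full sample of about 1,000 and a conservative subgroup of about 300 (both assumptions -- the post doesn't report subgroup sizes):

```python
import math

def moe(p, n, z=1.96):
    """95% margin of error (in proportion units) for a simple random sample."""
    return z * math.sqrt(p * (1 - p) / n)

# Full sample of ~1,000 at p = 0.49: roughly +/- 3 points.
print(round(100 * moe(0.49, 1000), 1))  # → 3.1

# Subgroup of ~300 self-identified conservatives at p = 0.36:
# the interval nearly doubles.
print(round(100 * moe(0.36, 300), 1))   # → 5.4
```

Even with that wider interval, a 17-to-36 swing would be hard to dismiss; the caution is about precision, not direction.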
We ought to be suspicious about polls and the people who write about them. (As demonstrated in this example, any random stranger can lie his head off about survey results as long as he has a pliant media outlet or two -- say, The American Thinker and The Fox Nation -- that will reproduce his natterings without question.) More to the point, we shouldn't be surprised when public opinion changes or when sample values differ from population values. That's why we do survey research, kids. But when some clown decides to rouse the masses by making things up about the evidence, it's perfectly all right for journalism -- and journalists -- to push back a little bit. In this case, the poll's right, and the critic is a liar. Journalists who are afraid to say that should probably find another line of work.