The Chief came across an opinion piece this week titled, “Never trust a statistical ranking,” by a writer named Richard Gerst. The article was one of hundreds of similar items that appear every week in this writer’s email inbox. Most such items merit a cursory look and a quick trip to the trash.
It was the subtitle that caught this writer’s attention. “Statistical science and methods,” it stated, “are so misused that the results are nothing more than a statistical fairy tale.”
Gerst goes on to take aim at the annual Fraser Institute rankings of schools. The Alberta government, he states, is considering prohibiting publication of the Provincial Achievement Test (PAT) results upon which the rankings are based.
The writer opines, “That’s a bad idea… transparency demands the tolerance of, in my opinion, statistical nonsense, even as it parades itself as objective analysis of school performance.”
We all know that most educators have a serious hate on for the Fraser Institute rankings, and for good reason. That’s because the results of one standardized test — in our case, B.C.’s Foundation Skills Assessment (FSA) — are but one of many ways to measure student learning and achievement. But the Fraser Institute uses the results to issue a ranking of schools, purportedly to help parents and educators gauge the performance of their children’s schools. What it often winds up doing, though — for reasons directly related to the obvious differences between public and private schools — is persuading parents that private ones are somehow “better.”
This writer is in full agreement with Mr. Gerst: While it would be foolhardy to try to suppress the publication of the Fraser rankings or other listings of FSA results, the rankings are, as Gerst opines, “nothing more than statistical fairy tales.”
The same thinking can be applied to other, similar attempts to answer complex questions through the use of statistical rankings. Take, for example, the fourth annual Money Sense magazine “Best Places to Live” ranking of the 190 Canadian communities with 10,000 or more residents. On its surface, the methodology used to compile the rankings appears pretty sound. Communities’ livability is determined through a series of statistics in categories such as affordable housing, discretionary income, new-car purchases, crime rates, doctors per capita and weather. And the list of categories appears to be pretty comprehensive.
However, it’s what’s not on the list that raises questions. There’s no category for “access to outdoor recreation,” or “access to a major urban centre and its amenities.” Squamish, of course, has both of those in spades — and yet, this year, it ranked 182nd out of the 190 communities in the Money Sense rankings. Yes, we made the bottom 10, if you can believe that.
This writer can’t. That doesn’t completely invalidate the rankings, of course. It just means that, when you’re looking at anything that purports to rank things through the use of statistics, you’ve got to take it not just with a grain, but perhaps a huge chunk, of salt.
— David Burke