I got an email from a reader who wanted to make a point he thought I might be interested in — a common occurrence, and always welcome. He included a reference to a scientific paper to support his point — even more welcome, but quite unusual. Most people are just not that diligent.
Unfortunately for his credibility, I actually checked that reference — imagine! — and I quickly realized that the paper did not actually support his point. And this wasn’t ambiguous. It wasn’t really a matter of interpretation or opinion. It was way off: he had read much more into the paper than the researchers had ever intended anyone to get from it. The authors had clearly defined the limits of what could be interpreted from their evidence.
Unfortunately, even professionals can be that careless with their citations. The phenomenon of the bogus citation is almost a tradition in both real science and (especially) in science reporting. Full-blown irrelevancy in citations is surprisingly common, and milder forms of bogosity are absolutely epidemic. Real live scientists — not the better ones, of course, but scientists nevertheless — often “pad” their books and articles with barely relevant citations. They probably pulled the same crap with their term papers as undergrads!
This kind of stuff gives science a bad name. But it’s not that science is bad — it’s just bad science. This is why nothing can ever be “known” until several lines of research have all converged on the same point.
So, a word to the wise: citations are not inherently trustworthy.
The right way to refer to scientific evidence is to link to a relevant, good-quality study — no glaring flaws, like “statistical significance” being passed off as “importance” — published in a respectable peer-reviewed scientific journal. There are quite a few wrong ways.
The clean miss: cite something that is topically relevant but which just doesn’t actually support your point. It looks like a perfectly good footnote, but where’s the beef?
The sneaky reach: make just a bit too much of good and relevant evidence … without even realizing it yourself, probably.
The big reach: make waaaaay too much of otherwise good evidence.
The curve ball: reference to perfectly good science that has little or nothing to do with the point.
The bluff (A.K.A. “the name drop”): citation selected at random from a famous scientific journal like The New England Journal of Medicine … because no one actually checks references, do they?
The ego trip: cite your own work … which in turn cites only your own work … and so on …
The slum cite: referencing research that perfectly supports your point, but is published by hacks and quacks in a crap journal no one’s ever heard of or ever will again.
The uncheckable: citing a chapter in a book no one can or would ever want to actually read, because it has a title like Gaussian-Verdian Analysis of Phototropobloggospheric Keynsian Infidelivitalismness … and it’s been out of print for decades and it’s in Russian.
The backfire: science that actually undermines your point, instead of supporting it. (It sounds hard to believe, but this is actually common.)
The fake: case studies and other tarted-up anecdotal evidence are a nice way to cite lite.
The really low road: just make stuff up!
Geek-artist Randall Munroe made a highly relevant comment about one of his xkcd comics:
I just read a pop-science book by a respected author. One chapter, and much of the thesis, was based around wildly inaccurate data which traced back to … Wikipedia. To encourage people to be on their toes, I'm not going to say what book or author.
Here’s a rich example of a “clean miss” double-whammy, from a generally good article about controversy over the safety and effectiveness of powerful narcotic drugs. The authors generally attack the credibility of the American Pain Foundation’s position, and in this passage they accuse the APF of supporting a point with an irrelevant reference — a “clean miss.” But the accusation is based on a citation that isn’t actually there — a clean miss of their own! (Not one that I could find in a good 20 minutes of poring over the document, anyway. Maybe it’s there, but after 20 minutes of looking I was beginning to question the sanity of the time investment.)
Another guide, written for journalists and supported by Alpharma Pharmaceuticals, likewise is reassuring. It notes in at least five places that the risk of opioid addiction is low, and it references a 1996 article in Scientific American, saying fewer than 1 percent of children treated with opioids become addicted.
But the cited article does not include this statistic or deal with addiction in children.
Actually, after a careful search, I can find no such Scientific American article cited in the APF document at all. So the APF’s point does not appear to be properly supported, but then again neither is the accusation.
I was making some corrections to my insomnia tutorial when I curiously clicked on an advertisement for SleepTracks.com, where a personable blogger named “Yan” is hawking an insomnia cure: “brain entrainment” by listening to “isochronic tones,” allegedly superior to the more common “binaural beats” method.
Yan goes to considerable lengths to portray his product as scientifically valid, advanced and modern, and he actually had me going for a while. He tells readers that he’s done “a lot of research.” To my amazement, he even cited some scientific papers. I had been so lulled by his pleasant writing tone that I almost didn’t check ‘em.
“Here are a few sleep-related scientific papers you can reference,” Yan writes, and then he supplies these three references:
Notice anything odd there? They lack dates. Hmm. I wonder why? Could it be because they’re from the stone age?
Those papers are from 1967, 1981, and 1982. In case you’ve lost track of time, 1982 was 31 years ago — not exactly “recent” research, particularly when you consider how far neuroscience has come in the last twenty years. Now, old research isn’t necessarily useless, but Yan was bragging about how his insomnia treatment method is based on modern science. And the only three references he can come up with pre-date the internet by a decade? One of them pre-dates me.
Clearly, these are references intended to make him look good. Yan didn’t actually think anyone would look them up. Yan was wrong. I looked them up. And their age isn’t the worst of it.
The 1981 study had negative results. (The Backfire!) The biofeedback methods studied — which aren’t even the same thing Yan is selling, just conceptually related — didn’t actually work: “No feedback group showed improved sleep significantly.” Gosh, Yan, thanks for doing that research! I sure am glad to know that your product is based on thirty-year-old research showing that a loosely related treatment method completely flopped!
The 1982 study? This one actually had positive results, but again studying something only sorta related. And the sample size? Sixteen patients — a microscopically small study, good for proving nothing.
The 1967 study? Not even a test of a therapy: just basic research. Fine if you’re interested in what researchers thought about brain waves and insomnia before ABBA. Fine if you want to include numerous other references. But as one of three references intended to support the efficacy of your product? Is this a joke?
So Yan gets the Bogus Citation Prize of the Year: not only are his citations barely relevant and ancient, but they are obviously cited without dates, deliberately, to keep them from looking as silly as they are.
Here in the Salamander’s domain, there are no bogus citations. (Certainly none I know about!)
Citations here are harvested and masterfully hand-crafted from only the finest pure organic heirloom artisan fair-trade sources. If I had a TV ad for this, there’d be oak barrels and a kindly old Italian gentleman wearing a leather apron and holding up sparkling citations in the dappled sunshine before uploading them to ye olde FTP server.
But how are you to believe this?
To actually trust citations without checking them yourself, you have to really trust the author. But you can only learn to trust an author by actually checking their references quite a few times first. Even after trust is established, you should probably still check the occasional reference — for your own edification, if nothing else. And of course you should always check references when the truth actually matters to you. That goes without saying, right?
This is why I have gone to considerable technological lengths on SaveYourself.ca not just to cite my sources for every key point, but also to provide user-friendly links to the original material … which makes them easy to check! This is a basic principle of responsible publishing of health care information online.
If you’re not going to leverage technology to facilitate reference-checking, why even bother?