Libraries and Learning

Peering Into Peer Review

May 24, 2017

A recent hoax has gotten a lot of attention, including here at Inside Higher Ed. To demonstrate that the field of gender studies is prone to accepting utter nonsense as scholarship, two wags got an article published in a peer reviewed journal and then pointed and laughed.

Well, it’s a bit more complicated than that. First their article was rejected by the journal to which they submitted it, and then (because the publisher knows how desperate academics are to publish, and knows how to make money) the manuscript was efficiently “cascaded” – sent to a less discerning journal that would publish it for a fee. (Both journals are owned by Taylor & Francis; this practice of cascading rejected articles to branded open access mega-journals that publish large volumes of papers is common among giant academic publishers as they seek ways to make open access publishing part of their business model.) The moment it was published, the hoax was revealed. It has since been removed from the Taylor & Francis site, though the authors have kept a copy online.

It’s another Sokal moment, proving that gender studies and much of social science are foolish, postmodernism is balderdash, and academic publishing is too full of jargon, theory, and lax standards. Man-hating feminists and climate change got thrown into the parody, too, to demonstrate the dominance of “morally fashionable nonsense,” which I guess is a way of saying “political correctness” without all the political baggage.

Pointing and laughing has a long history in academic publishing. In a blog post, Ketan Joshi (cited in the IHE article) names several “gotcha” moments in the recent history of peer review: a nonsense paper accepted by a scam conference outfit – of which there are many, to judge by the contents of my inbox; the rather more embarrassing inclusion, in databases compiled by Springer and IEEE, of abstracts of 120 computer-generated gibberish papers reportedly given at conferences that are probably the kind landing in my inbox; John Bohannon’s deliberately nonsensical article, written in non-standard English, accepted by numerous scam operations (and rejected by genuine open access journals); and a delightful article titled “Get Me Off Your Fucking Mailing List,” the text of which consisted of the title repeated for several pages, published for a fee.

What can we conclude from these kinds of hoaxes? There are plenty of people out there who are happy to take your money. The only folks who were chastened by these hoaxes are the two legitimate publishers – Springer and IEEE – who got sloppy as they included fake conference abstracts in their subscription databases. The rest don’t have reputations at stake. They’re scammers. Of course they will accept nonsense papers, so long as you pay them. They might have a bridge to sell you as well, or a fortune awaiting you in a Nigerian bank. The lesson isn’t that peer review is broken or that open access is a fraud. It’s simply that you need to think before you send your research off to a publisher you’ve never heard of – or before you assume all lines in every CV represent genuine peer-reviewed research.

Joshi also mentions a peer review failure on a completely different scale: Andrew Wakefield’s Lancet article claiming, on the basis of a sample size of twelve and a dubious methodology, that vaccines cause autism – a finding that was not only sloppy but deliberately fraudulent. That fraud continues: thanks to that article and the author’s grandstanding, the Somali community in my state is suffering from a measles outbreak. Wakefield’s discredited theory has led to the deaths of children. So far as I know, nobody has died because of an article published by a scam faux-journal website.

The recent hoax took inspiration from Alan Sokal, who in 1996 had a parody article taken seriously by the editors of Social Text. Sokal wasn’t tilting at gender studies; he was annoyed with postmodernism and cultural studies for challenging something he believed was obvious. “There is a real world; its properties are not merely social constructions; facts and evidence do matter,” he wrote when explaining his motivation. “What sane person would contend otherwise?” It was an attempt, like the recent hoax, to delegitimize somebody else’s discipline through ridicule. The peer review process itself, however, wasn’t on trial. At the time, Social Text was not peer-reviewed but rather operated in the “little magazine” tradition, publishing political essays and social critique without a referee system.

Anyone who has participated in peer review knows it isn’t a foolproof guarantee of quality. In 1982, two pre-tenure scholars concocted an experiment: they resubmitted a dozen articles in manuscript form to the same psychology journals that had previously published them, substituting fictitious authors and institutions for the prestigious authors and affiliations of the originals. Most were rejected, with reviewers citing methodological flaws, even though nothing else about the manuscripts had changed. It’s not a perfect system, and various improvements have been proposed from time to time. But there’s another factor that I haven’t seen mentioned much in discussions of this latest hoax.

Apart from the problem of scammy pseudo-journals, the supposed lack of rigor in other people’s disciplines, or the potential for the peer review process to be biased, another issue seems to be in play here: the demand for scholars and scientists to be insanely productive, with productivity typically measured in publications. Though there are variations in how those demands are spelled out across disciplines, all of them seem affected, and the pressure creates the conditions for scams, the dubious recycling of rejected articles for profit, and intellectual exhaustion.

It’s hard to produce good research under these conditions. It’s hard to find enough reviewers who can read papers thoroughly when so many manuscripts are being submitted. It’s hard to keep up with what’s new or wade through all the chaff when it’s time to review the literature. It’s hard to resist the urge to salami-slice research into as many publishable units as possible when you could instead be spending that time on designing experiments or archival research or simply on thinking. What do we get for all this illusory productivity? More truth? Greater knowledge? I doubt it.

I’m not sure how to put a stop to this unhelpful acceleration of publishing demands, but the idea that publications are the measure of the worth of academics is a hoax that has serious consequences.


License

Babel Fish Bouillabaisse II Copyright © 2019 by Barbara Fister is licensed under a Creative Commons Attribution 4.0 International License, except where otherwise noted.