2 January 2014

How journals like Nature, Cell and Science are damaging science

The following great article appeared in The Guardian on Monday 9 December 2013, and contains a number of accurate and important points. Many people in academia would agree with it, but few would dare say so openly. I reproduce it below verbatim; the emphasis is mine. Note, in particular, how the author speaks about the concept of "impact factor" (a gimmick!), and how he urges scientists (including administrators) to seek the truth rather than fall victim to a fake-bonus culture. Indeed, one's achievements should NOT be measured as a function of impact factor and other silly criteria (like the h-index), but as a function of one's research and scholarship. True, this would mean extra work for the heads of university divisions, for the committees deciding on promotions, and for the funding agencies. But, hey, learning something about the work of the person you judge ain't that bad, right?


The incentives offered by top journals distort science, just as big bonuses distort banking

[Image: litter in the street. Caption: The journal Science has recently retracted a high-profile paper reporting links between littering and violence. Photograph: Alamy/Janine Wiedel]

I am a scientist. Mine is a professional world that achieves great things for humanity. But it is disfigured by inappropriate incentives. The prevailing structures of personal reputation and career advancement mean the biggest rewards often follow the flashiest work, not the best. Those of us who follow these incentives are being entirely rational – I have followed them myself – but we do not always best serve our profession's interests, let alone those of humanity and society.

We all know what distorting incentives have done to finance and banking. The incentives my colleagues face are not huge bonuses, but the professional rewards that accompany publication in prestigious journals – chiefly Nature, Cell and Science.

These luxury journals are supposed to be the epitome of quality, publishing only the best research. Because funding and appointment panels often use place of publication as a proxy for quality of science, appearing in these titles often leads to grants and professorships. But the big journals' reputations are only partly warranted. While they publish many outstanding papers, they do not publish only outstanding papers. Neither are they the only publishers of outstanding research.

These journals aggressively curate their brands, in ways more conducive to selling subscriptions than to stimulating the most important research. Like fashion designers who create limited-edition handbags or suits, they know scarcity stokes demand, so they artificially restrict the number of papers they accept. The exclusive brands are then marketed with a gimmick called "impact factor" – a score for each journal, measuring the number of times its papers are cited by subsequent research. Better papers, the theory goes, are cited more often, so better journals boast higher scores. Yet it is a deeply flawed measure, pursuing which has become an end in itself – and is as damaging to science as the bonus culture is to banking.

It is common, and encouraged by many journals, for research to be judged by the impact factor of the journal that publishes it. But as a journal's score is an average, it says little about the quality of any individual piece of research. What is more, citation is sometimes, but not always, linked to quality. A paper can become highly cited because it is good science – or because it is eye-catching, provocative or wrong. Luxury-journal editors know this, so they accept papers that will make waves because they explore sexy subjects or make challenging claims. This influences the science that scientists do. It builds bubbles in fashionable fields where researchers can make the bold claims these journals want, while discouraging other important work, such as replication studies.

In extreme cases, the lure of the luxury journal can encourage the cutting of corners, and contribute to the escalating number of papers that are retracted as flawed or fraudulent. Science alone has recently retracted high-profile papers reporting cloned human embryos, links between littering and violence, and the genetic profiles of centenarians. Perhaps worse, it has not retracted claims that a microbe is able to use arsenic in its DNA instead of phosphorus, despite overwhelming scientific criticism.

There is a better way, through the new breed of open-access journals that are free for anybody to read, and have no expensive subscriptions to promote. Born on the web, they can accept all papers that meet quality standards, with no artificial caps. Many are edited by working scientists, who can assess the worth of papers without regard for citations. As I know from my editorship of eLife, an open-access journal funded by the Wellcome Trust, the Howard Hughes Medical Institute and the Max Planck Society, they are publishing world-class science every week.

Funders and universities, too, have a role to play. They must tell the committees that decide on grants and positions not to judge papers by where they are published. It is the quality of the science, not the journal's brand, that matters. Most importantly of all, we scientists need to take action. Like many successful researchers, I have published in the big brands, including the papers that won me the Nobel prize for medicine, which I will be honoured to collect tomorrow. But no longer. I have now committed my lab to avoiding luxury journals, and I encourage others to do likewise.

Just as Wall Street needs to break the hold of the bonus culture, which drives risk-taking that is rational for individuals but damaging to the financial system, so science must break the tyranny of the luxury journals. The result will be better research that better serves science and society.
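A footnote on the arithmetic behind the "impact factor" discussed above. The score is just the mean number of recent citations per paper, and means are easily dominated by a few outliers. The Python sketch below uses invented citation counts (purely hypothetical, chosen for illustration) to show a journal whose score looks impressive while its typical paper is barely cited:

from statistics import mean, median

# Citations received this year by each paper a hypothetical journal
# published over the previous two years (invented numbers).
citations = [0, 1, 1, 2, 2, 3, 4, 5, 120, 250]

impact_factor = mean(citations)    # the journal-level score: 38.8
typical_paper = median(citations)  # what a typical paper gets: 2.5

print(impact_factor, typical_paper)

Two blockbuster papers produce a score of 38.8, yet half of this journal's papers were cited at most twice. That skew is exactly why a journal's score says so little about any individual paper in it.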





T H E B O T T O M L I N E

What measure theory is about

It's about counting, but when things get too large.
Put otherwise, it's about addition of positive numbers, but when these numbers are far too many.
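One standard way to make this precise (textbook material, not spelled out in the aphorism itself): a measure μ assigns a nonnegative size to sets and is required to add up over countably many disjoint pieces,

μ(A_1 ∪ A_2 ∪ A_3 ∪ ...) = μ(A_1) + μ(A_2) + μ(A_3) + ...   (the A_n pairwise disjoint),

which is literally the addition of (possibly infinitely many) positive numbers.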

The principle of dynamic programming

max_{x,y} [f(x) + g(x,y)] = max_x [f(x) + max_y g(x,y)]
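A quick sanity check of this identity in Python, with toy functions f and g of my own choosing (a hypothetical illustration, not part of the note itself):

# Verify max_{x,y} [f(x) + g(x,y)] = max_x [f(x) + max_y g(x,y)]
# by brute force on a small finite grid.
def f(x):
    return -(x - 2) ** 2

def g(x, y):
    return x * y - y ** 2

X = range(-5, 6)
Y = range(-5, 6)

lhs = max(f(x) + g(x, y) for x in X for y in Y)       # joint maximisation
rhs = max(f(x) + max(g(x, y) for y in Y) for x in X)  # nested maximisation

assert lhs == rhs
print(lhs, rhs)

The identity holds because f(x) does not depend on y, so for each fixed x the inner variable y can be optimised first; dynamic programming exploits exactly this decomposition to replace one big search with a sequence of smaller ones.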

The bottom line

Nuestras horas son minutos cuando esperamos saber y siglos cuando sabemos lo que se puede aprender.
(Our hours are minutes when we wait to learn and centuries when we know what is to be learnt.) --Antonio Machado

Αγεωμέτρητος μηδείς εισίτω.
(Those who do not know geometry may not enter.) --Plato

Sapere Aude! Habe Muth, dich deines eigenen Verstandes zu bedienen!
(Dare to know! Have courage to use your own reason!) --Kant