It’s Un-Nature-al

Ask anybody with a vague interest in scientific research, and they’ve probably heard of journals like Nature and Science. These journals claim to publish some of the most cutting-edge research in the natural sciences, from genes involved in human cancer to evidence that the axis of the Moon’s rotation has shifted. And those are just from the last week in Nature.

These journals are home to some of the most highly cited papers, meaning papers whose findings are referenced in other papers, a rough indicator of research that has had a large impact on its field. The quest to publish in these so-called “high-impact” journals has come to dominate many scientists’ careers.

For some, this leads to a conundrum. Applications for faculty positions, post-doctoral jobs, and tenure reviews often privilege articles published in such high-impact journals over all others. It’s not uncommon to hear that if you don’t have a Nature or Science article as an early career researcher, your chances of landing a long-term academic position are slim-to-none. This might be a bit too much like scaremongering, but in a job market saturated with PhD graduates it can ring far too true.

And so, is it really a surprise that many scientists will push to get their research into these prestigious journals? Yet this push for “high impact” has started to come under fire.

Could it be that this obsession with prestige, and the name of the journal, is actually feeding into a culture of quantity over quality? Where the drive to publish in a high-profile journal is leading to the publication of results that haven’t been sufficiently vetted? And perhaps where the preconceptions of what constitutes “cutting edge research” are systematically biased in favour of certain fields of research?

It’s not uncommon now to hear about another high-profile paper that has been retracted or amended after further investigation (following publication, of course) has poked holes in the story. The best-known recent case involved the retraction of two papers in Nature which claimed to have induced stem-cell creation via application of mechanical stress. But this is an unusual case: it’s still rare for papers to be retracted over alleged outright fabrication of results.

No, perhaps more worrying is the trend towards publishing flashy, new science in high-profile journals, and away from publishing studies which attempt to replicate new findings. It might seem natural that papers investigating new results should receive higher “status.” But it’s increasingly apparent that many results, particularly in biology, cannot be replicated. And yet, with few journals willing to publish replication studies, there’s little incentive for scientists to spend time and resources double-checking another lab’s results.

[Image: reproducibility. Credit: David Perkins / Nature]

Maybe if the academic establishment placed less importance on ostensibly “high-powered” research, a notion that in and of itself seems increasingly outdated, we might start to see a shift towards high-quality research that is less dependent on the current scientific trends. I’m of the opinion that the reliance on the name of the journal as a measure of quality is past its time. Why not focus on the number of citations of the paper itself, rather than those of the journal it’s in, if some sort of quantification of a researcher’s impact is needed? Moves have already been made in this direction, such as with ResearchGate’s Score metric and Google Scholar’s impact metrics.


Scientific journals originated decades, if not centuries, before the Internet upended the means of distributing research findings. Being published in a smaller, less renowned journal no longer prevents your research from reaching its desired audience. A quick search on Google or Web of Science, and papers from journals ranging from Nature to the Canadian Journal of Zoology can reach anyone anywhere in the world. And now the number of times a paper is cited can be tracked automatically, making it easier than ever to directly measure a paper’s impact.

Let’s stop relying on outdated means of judging scholarly output. Enough with the blind acceptance of “high-impact journals” as the be-all and end-all of a researcher’s career. Why not let our research speak for itself?


Featured Image from Amy | Flickr


3 thoughts on “It’s Un-Nature-al”

  1. Nice post. I can only agree. Judge papers by their own merit. Period.

    However, it pains me to still hear the myth that if you don’t have a Nature or Science article as an early career researcher, your chances of landing a long-term academic position are slim-to-none. Although publishing in such journals will not hurt your career prospects, your chances are much better than slim-to-none if you happen to have published good solid papers, have a coherent narrative about your career, have a rational and exciting plan for the future, and can communicate your work and prospects to your colleagues.

    If you don’t believe me then you don’t have to look very far. I was hired as a Senior Group Leader without Nature and Science papers. And I wasn’t even exactly early career. And I know several other examples.

    So let’s stop propagating the defeatist myth and let’s look forward towards building a new reward culture. You, the next generation, can make it happen!

    More on this at http://kamounlab.tumblr.com/post/121748816600/what-are-world-class-science-outputs


    1. Thanks for the feedback! That’s a really good point. I think that sometimes, despite lots of researchers like yourself pointing out the obvious flaws in the Nature/Science mythology (namely that lots of researchers have succeeded outside of their umbrella), it’s still easy for students and younger scientists to feel as if that success is the exception, when it is actually quite common.

      I’m not sure where this starts. I know that even early in undergrad I was aware of the prestige often associated with those journals. Hopefully the push for open access publishing will also help reframe priorities when publishing papers. Definitely food for thought!


  2. There are many other examples. I can immediately think of 4 Group Leaders at The Sainsbury Lab who were hired without having published CNS papers. I think these would rate as prestigious appointments. So to think it is a prerequisite to publish in CNS journals to have a great job is incorrect.

    As you pointed out, the goal should be to publish good, solid papers that stand the test of time. I honestly believe that this will make the difference over the course of a career.

    Thanks again for raising the topic. I encourage you and everyone else to have a more positive outlook. Things are changing fast in scientific publishing. We can all contribute to developing a new reward culture. Get your institute to sign DORA. Convince your supervisor to post your paper on bioRxiv. Inform your colleagues about the rapidly weakening relationship between the impact factor and papers’ citations. Etc.

