Reliance on journal rankings is undermining academic integrity

Careers can depend on publishing in higher-quartile journals, but the statistics are too easily gamed, says Jakša Cvitanić

May 21, 2021

What is your hunch about the likely quality of Journal X? It publishes 24 regular issues a year, with more than 400 articles in each one. In addition, it publishes hundreds of “special issues” annually. All this adds up to thousands of papers per year. You have to pay when submitting, but the acceptance rate is at least two in three, and the typical wait between submission and acceptance is only a couple of weeks.

Quality can’t be high on Journal X’s list of priorities, right?

Yet Journal X is ranked in the second quartile (Q2) in one subdiscipline, according to the widely used rankings produced by Clarivate Analytics’ Journal Citation Reports (JCR), and the first quartile (Q1) in another.

This true example matters because in many less-developed scientific systems, important things depend on publishing in Q1 or Q2 journals. These include whether a student can defend a PhD thesis, whether a junior lecturer is promoted or gets a salary increase, and who is allocated the most research funds.

For people set on academic advancement, then, Journal X is the perfect outlet. All you have to do is submit a passably decent piece of work and pay a fee. Not that the authors necessarily know this. Many young researchers prefer to think they got their paper in because they produced great research. But even if they are not consciously taking shortcuts, Journal X’s clientele are failing to acquire a sense of research quality and ethics – and of what it really means to revise work to make it better. And this is a big concern given that some of them will one day be in positions of power.

Not that the current generation of academic leaders are much better. Such shortcuts to success are only possible because administrators in certain schools and countries do not have the means or the will to judge the quality of the faculty they employ by a thorough review of their research, but instead rely on simplistic measures such as journal rankings. And even if you are an established senior researcher who laments that situation, you might be tempted to advise your student or a junior colleague to submit to Journal X instead of to a journal with a lower acceptance rate and a longer wait for publication.

Why are journal rankings such poor proxies for quality? The JCR’s impact factors are computed as the average number of citations received by each paper in the journal over a given period of time. This is easily manipulated. Publishers and editors ask authors of accepted papers to cite additional papers published in the same journal. They also artificially boost the number of papers published per issue to increase the opportunities for such citation farming, while authors cut one paper into several or collaborate in larger groups to increase their output of mutually citing publications.
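
To see how easily that average moves, consider the standard two-year calculation in schematic form; the figures below are purely illustrative and are not Journal X’s actual numbers.

```latex
% Standard two-year impact factor for year Y (illustrative figures only)
\[
\mathrm{IF}_{Y} \;=\;
\frac{\text{citations received in } Y \text{ by items published in } Y-1 \text{ and } Y-2}
     {\text{citable items published in } Y-1 \text{ and } Y-2}
\]
% Example: a journal that published 8,000 papers over two years and engineered
% 16,000 citations to them would report
\[
\mathrm{IF} \;=\; \frac{16\,000}{8\,000} \;=\; 2.0
\]
% Every coerced self-citation raises the numerator at essentially no cost to the publisher.
```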

A colleague at an eastern European university who serves on the editorial board of a legitimately high-ranking journal has more than once been invited by a publisher to an all-expenses-paid conference, where he was offered free publication in the publisher’s journals on the understanding that his own journal would become part of the citation cartel and return the favour. The journal of another colleague was hijacked by a publisher in Asia: a journal with the same name (but whose website was invisible in Europe and the US) started inviting authors to pay to submit to it, in the evident hope of trading on the impact factor of my colleague’s journal.

Journals have been suspended from the JCR for excessive self-citation, but, as far as I know, “citation cartels”, whereby networks of journals all agree to cite each other’s papers, have not been penalised.

Another established ranking, the SCImago Journal Rank (SJR), does a slightly better job of recognising genuine quality by weighting citations according to the quality of the journals in which they appear. Journal X’s SJR score puts it in the second or third quartile, depending on the subfield. But since the citing journals’ quality is itself measured by the same method, the potential for distortion by undeservedly high rankings does not go away completely.
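
A simplified sketch of how such prestige-weighted scores behave makes that circularity visible. The recursion below is illustrative only and is not SJR’s exact formula; it merely captures the PageRank-style idea that a journal’s score depends on the scores of the journals citing it.

```latex
% Illustrative prestige recursion (not SJR's exact formula): journal j's score s_j
% depends on the scores of the journals i that cite it, normalised by how many
% citations those journals hand out in total.
\[
s_j \;=\; (1-d) \;+\; d \sum_{i \in \mathrm{citing}(j)} \frac{c_{ij}}{C_i}\, s_i
\]
% c_{ij}: citations from journal i to journal j;  C_i: total citations given out by i;
% d: a damping constant. If a cartel of journals trades citations, each member's s_i
% rises, which inflates the scores it passes on to the others -- the distortion is
% self-reinforcing rather than self-correcting.
```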

New ways of computing impact factors are needed that take account of the opinion of leading researchers in the field. Such figures are routinely asked to evaluate the quality of individuals’ work; it should be natural for them to do the same for journals.

Judgements of individuals’ quality should also bear more weight in the criteria for advancement and funding imposed by academic and national bureaucracies. These criteria should be revisited annually, with input sought from practising researchers. I am confident that this would herald a move away from counting how many papers the individual in question has published in highly ranked journals.

Let us stop providing incentives for publishing “quick and dirty”. It corrupts the integrity of academic administrators and researchers alike. Instead, let us think of incentives to motivate better long-term research impact, better teaching and better service to society.

Jakša Cvitanić is Richard N. Merkin professor of mathematical finance at the California Institute of Technology (Caltech).

Reader's comments (2)

Cancel or devalue the journal impact factors, and it would make such a positive impact on the academic community.
Sadly, not much new here. Humans are adept at optimising any system to get the maximum reward so I cannot see any prospect of a change at present. As was said when I was a student many years ago, "you find out what they want and give it to them". The messages coming from management and the system at large are often the wrong ones and so encourage these behaviours, which are not that new really but now more corrosive because of the speed of online publication.
