On Jeffrey Beall's Scholarly Open Access blog I recently got involved in a short discussion on steadily rising citation metrics. He discussed a citation metric that he considers suspicious, and one of the reasons he gave was that it always seems to be going up. It is quite possible that the metric in question is manipulated - or not; I would not know and have no opinion either way - but it is important to realize that some degree of inflation is unsurprising and not necessarily, on its own, an indication that something is off.
The most highly regarded of them all, the Impact Factors (hereafter IF) from Thomson Reuters' Journal Citation Reports, show the same trend. Yes, there are some losers and a lot of stagnation, especially at the lower end of the spectrum, probably because the editor of a very small journal is only willing to invest so much time into it, and if you compare across only two years or so you will get a lot of noise. But if you take a number of decent, mid- to high-level journals from my field and compare their current IFs with what they had a few years back, you will see slight increases very nearly across the board.
Here are a few semi-randomly chosen journals from my field and their change over five years - that is, I looked up the 2007 and 2012 IFs for a few well-regarded journals whose names immediately came to mind:
Am. J. Bot. +0.074; Aust. J. Bot. +0.217; Aust. Syst. Bot. +0.488; Bot. J. Linn. Soc. +1.514; Flora +0.559; Folia Geobot. +0.432; Syst. Bot. -0.345; Taxon +0.258; Trends Plant Sci. +2.813
The only one bucking the trend here is Systematic Botany, which does seem to have moved from publishing a lot of phylogenies to a lot of alpha taxonomy lately, and the latter unfortunately and unjustly does not bring in many citations. Conversely, the Botanical Journal went up a lot since it stopped accepting purely alpha-taxonomic papers, and it published the latest Angiosperm Phylogeny Group update, a landmark paper that was guaranteed to attract a ton of citations. Trends also looks like it got a big increase, but relatively speaking it is not that much; as a review journal it was always by far the highest on this short list.
So how is that possible? Why do the citation metrics appear to consistently increase over the years? Is there some manipulation going on?
To some degree that is of course possible. Faced with a situation in which they are increasingly evaluated based on how often they are cited, authors may game the system by citing themselves and by forming citation cartels. Peer reviewers and editors may try to game the system by requiring an author to cite them even where it would not make a lot of sense. And faced with a situation in which the prestige of their journal is increasingly based on its IF, editors may try to game the system by requiring authors to cite other recent papers from the same journal. I am sure some of that happens because people aren't stupid, and I recently had a peer reviewer quite openly trash one of my manuscripts because I did not cite him enough. Still, I have not seen a lot of that kind of behaviour.
There is a much simpler explanation. Even if nobody does anything of substance, the average IF will go up "automatically" as long as the number of journals having an IF increases. The thing is this: the IF is not calculated from the citations a journal gets from anywhere on the planet, but only from those it gets in other publications that are also indexed in Thomson Reuters' Web of Science. Originally comparatively few journals were indexed, but now that everybody considers having an IF to be important, more journals attempt to get themselves indexed every year. The Plant Sciences section of the Thomson Reuters Journal Citation Reports, for example, comprised 152 journals in 2007 but had 197 journals in 2012. And that means that citations that previously would have been invisible to the IF suddenly become visible to it.
To self-plagiarize my comment from the aforementioned discussion: assume every paper in the Journal of Scholarship (JoS) always, every year, gets cited five times in the first two years after publication, and one of those citations is always in a paper published in the journal Scholarly Letters (SL). In 2007, SL was not yet indexed, but all the other journals citing JoS articles were, so JoS got an IF (2007) of 4. Then in, say, 2010 the editors of SL decided to get themselves indexed to increase the prestige of their journal, and thus now, in 2013, the citations of JoS articles in SL are counted by the people who calculate impact factors. Et voilà, JoS now has an IF (2012) of 5, although nothing whatsoever about its quality or, crucially, about its impact has changed; it gets a higher score merely thanks to a decision made by the editors of SL.
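For concreteness, here is a minimal sketch of that toy arithmetic in Python. The journal names and citation counts come straight from the example above; the per-paper simplification works because every JoS paper is assumed to be cited identically, so averaging over papers changes nothing.

```python
# Toy model of the indexing effect described above.
# Assumption (from the example): every JoS paper gets five citations
# per year in its first two years, one of which is always in SL.

def impact_factor(citations_per_paper, indexed_journals):
    """Simplified IF: citations per paper, counting a citation only
    if the citing journal is in the indexed set."""
    return sum(count for journal, count in citations_per_paper.items()
               if journal in indexed_journals)

citations = {"other indexed journals": 4, "Scholarly Letters": 1}

# 2007: SL is not yet in the Web of Science, so its citation is invisible.
print(impact_factor(citations, {"other indexed journals"}))  # -> 4

# 2012: SL got itself indexed in 2010; JoS itself has not changed at all.
print(impact_factor(citations, {"other indexed journals",
                                "Scholarly Letters"}))        # -> 5
```

The same score jump would appear for every journal that SL regularly cites, which is why the average IF drifts upward as the index grows.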
It is things like these that should caution us against taking citation metrics too seriously.