For those of you who care about such things, Nature Chemistry now has an Impact Factor (IF). It is 17.9 (or, if you really want us to quote it to three decimal places, it’s 17.927). We realise that the IF is far from a perfect metric; really, we do. And we also appreciate that there is a range of opinions out there amongst our audience on the matter — some of you love the IF, some of you hate it. For those of you who aren’t about to click away in a rage and aren’t already sick of IFs, I’m going to drill a little more deeply into our number and what it might mean, if anything.
The 2010 IFs (the ones just released in 2011) are calculated for any given journal by counting the citations it received in 2010 to the articles it published in 2008 and 2009. This total number of citations is then divided by the total number of ‘citable items’ published in 2008 and 2009, and the number you get is the IF. So, it really boils down to the average number of times papers published in 2008 and 2009 were cited in 2010 — if that has any meaning! If a journal has a 2010 IF of 7.5, it means that — on average — each paper from 2008 and 2009 was cited 7.5 times in 2010.
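The arithmetic above can be sketched in a couple of lines of Python — the numbers here are made up purely for illustration, not drawn from any real journal:

```python
def impact_factor(citations, citable_items):
    """IF for year Y = (citations received in Y to items published in
    Y-1 and Y-2) / (citable items published in Y-1 and Y-2)."""
    return citations / citable_items

# Hypothetical journal: 1,500 citations in 2010 to 200 citable
# items published across 2008 and 2009.
print(impact_factor(1500, 200))  # 7.5
```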
Averaging brings with it some problems. Each paper might have been cited 7.5 times on average, but in reality, some papers will have been cited lots more than that and others might not have been cited at all. One (or a few) very highly cited papers can have a huge effect on the IF of a journal (see the wonderful example of Acta Cryst. A from last year!).
Another bone of contention for some is the use of ‘citable items’. Not everything that a journal publishes counts as a ‘citable item’ — typically only research articles and reviews do. For example, in Nature Chemistry, ‘front-half’ material such as Editorials, Book Reviews, Commentaries and News & Views articles are not counted as citable items and so do not add to the denominator in the IF calculation. Any citations that they do receive, however, are counted in the numerator. Note that this is not a special exception made for Nature Chemistry or Nature journals in general; this type of citable-item categorization is made for all journals.
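The asymmetry described above — front-half citations count in the numerator but front-half items never count in the denominator — is what nudges the IF upwards. A minimal sketch, with entirely hypothetical numbers:

```python
def impact_factor(citations_to_citable, citations_to_front_half, citable_items):
    """Citations to non-citable 'front-half' pieces are added to the
    numerator, but those pieces never enlarge the denominator."""
    return (citations_to_citable + citations_to_front_half) / citable_items

# Hypothetical journal: 1,100 citations to 80 citable items,
# plus 100 citations to Editorials, News & Views, etc.
with_front_half = impact_factor(1100, 100, 80)     # 15.0
without_front_half = impact_factor(1100, 0, 80)    # 13.75
print(with_front_half, without_front_half)
```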
Also, the sources of the exact numbers used to calculate the IFs are not easy to find. Sure, you can get a total of citations from Web of Science for an individual item or for a given year’s worth of content in a journal, but that does not usually include all of the citations used in the IF calculation. See this post at The Scholarly Kitchen for more details on how impact factors are calculated. Bearing this in mind, however, let’s take a closer look at Nature Chemistry content from 2009.
Nature Chemistry published 82 citable items in 2009: 17 review-type articles (Reviews & Perspectives) and 65 primary research papers. Of course we also published many other pieces of content that are not counted as citable items (as described above). Based on our rough calculations, it appears that just under 10% of our total 2010 citations to this 2009 content were to these non-counted items, just over 27% of the citations were to the review-type articles and just over 63% were to the research articles.
And if you’re interested, the top five 2009 papers based on 2010 citations are (subscription required to access all of the articles apart from the third on the list, which was in our first ever issue that is currently available for free):
REVIEW: Nanostructured functional materials prepared by atom transfer radical polymerization (77 citations in 2010)
REVIEW: Towards the computational design of solid catalysts (52)
There don’t appear to be any obvious trends in terms of articles in one particular subject area (physical, organic, inorganic, bio, analytical) consistently receiving more citations than any other. And there seems to be no correlation between how much an article is cited and how many page views it has received. We do note that what could be considered to be more traditional ‘organic’ papers typically get more page views than other sub-disciplines, but this is not reflected by an increased citation rate.
A few quick numbers for you about the citations our 2009 content received in 2010: of the 82 citable items, the most cited paper received 77 citations and the least cited paper received 1 citation. The other 80 articles fell somewhere in the middle. The top 40 of these had a citation count of 11 or more and the lower 40 each had somewhere from 2 to 10 citations — hence the median number of cites was 10.5. The mean, on the other hand, is 14.5 if you’re interested. And now compare those numbers to the actual IF, which is 17.9.
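The gap between the median (10.5), the mean (14.5) and the headline IF (17.9) is exactly what you’d expect from a right-skewed distribution. The toy data below are invented, not our actual citation counts, but they show the same effect — a single highly cited paper drags the mean well above the median:

```python
from statistics import mean, median

# Invented citation counts for ten papers; one blockbuster at the top.
cites = [77, 30, 25, 12, 11, 10, 8, 5, 2, 1]

print(median(cites))  # 10.5 — half the papers sit at or below this
print(mean(cites))    # 18.1 — pulled up by the one 77-citation paper
```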
So, what does this tell us? I’m not really sure. Other than perhaps IF calculations are a strange and mysterious thing.
Stuart Cantrill (Chief Editor, Nature Chemistry)