December 1, 2016, by khyx2lyn
The true percentage of scientific articles that never get cited
Every so often I see a claim circulating on social media that ‘90% of scientific articles will never be cited’. While I do not agree with the notion that a publication’s value is solely determined by the number of citations it receives, such statements can be damaging to science, because they suggest to the public that most research is useless, with the underlying implication that many researchers are therefore wasting public money.
Although the statement has been traced back to an overeager editor of a non-scientific journal, it is actually fairly easy to examine its validity with the website Scimago Journal and Country Rank (SJR). This excellent website provides citation information, categorized by country and subject area, based on data from Elsevier’s Scopus. Because SJR organizes the information by country or subject area, a country has to be selected to obtain the numbers needed to examine the claim. For the first analysis, I have selected the United States, but using other countries yields similar outcomes.
When reading the statement, it is unclear how ‘never’ and ‘articles’ are defined. ‘Never’ is a very long time. Although SJR provides citation information for the period between 1996 and 2015, I have taken articles published in 2005 as the reference point for this analysis: it seems unlikely that many articles that have not been cited at all in 10 years will suddenly start being cited.
Besides defining ‘never’, it is similarly important to define ‘articles’. Some scientific documents, such as editorials, errata and lists of reviewers, are not intended to be cited and should therefore not be included in the analysis. SJR helpfully distinguishes between citable and non-citable documents. The website informs us that, in 2005, researchers from the United States published 443,188 citable and 42,704 non-citable documents (i.e. 8.79% of all documents were non-citable).
Furthermore, the website informs us that, of the documents published in 2005, 379,366 had been cited and 106,526 remained uncited after 10 years. However, the number of uncited documents includes the non-citable documents. Excluding those, only 63,822 of the documents published in 2005 remained uncited after 10 years. Dividing the number of uncited documents (63,822) by the number of citable documents (443,188) shows that only 14.4% had not been cited.
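For readers who want to check the arithmetic, here is a minimal Python sketch using only the SJR figures quoted above:

```python
# Reproducing the arithmetic with the SJR figures for documents
# published in 2005 by researchers from the United States.
citable = 443_188        # citable documents (articles, reviews, etc.)
non_citable = 42_704     # non-citable documents (editorials, errata, ...)
uncited_total = 106_526  # all documents still uncited after 10 years

# The uncited total includes non-citable documents, so subtract them first.
uncited_citable = uncited_total - non_citable              # 63,822
share_non_citable = non_citable / (citable + non_citable)  # share of all documents
share_uncited = uncited_citable / citable                  # share of citable documents

print(f"Non-citable share of all documents: {share_non_citable:.2%}")  # 8.79%
print(f"Uncited share of citable documents: {share_uncited:.1%}")      # 14.4%
```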
The analysis shows that the statement that ‘90% of articles will never be cited’ is simply not true. According to information taken from Scimago Journal and Country Rank, only 14.4% of citable documents (written in 2005 by researchers from the United States and published in journals indexed by Scopus) had not been cited after 10 years.
This analysis can be conducted for every other country. Most countries have rates similarly low to that of the United States (UK: 9.0%; Germany: 19.8%; France: 18.0%; Canada: 11.6%; Italy: 14.1%; India: 17.3%; Spain: 14.0%). Some countries have higher proportions (China: 31.9%; Japan: 23.2%), but none comes anywhere near the rate mentioned in the claim. Taken together, these 10 countries account for about 69% of all documents published in 2005, and only 17.7% of their citable articles had not been cited after 10 years. The pooled figure weights each country by its output, which is why it differs slightly from a simple average of the ten percentages (see the sketch below).
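To illustrate the weighting, here is a small sketch of the pooled calculation. The counts below are hypothetical placeholders for two made-up countries, purely to show the method, not the actual SJR figures:

```python
# Pooling uncited rates across countries by weighting with citable output.
# The counts are hypothetical, for illustration only.
counts = {
    "Country A": {"citable": 400_000, "uncited": 56_000},  # 14.0% uncited
    "Country B": {"citable": 100_000, "uncited": 32_000},  # 32.0% uncited
}

total_citable = sum(c["citable"] for c in counts.values())
total_uncited = sum(c["uncited"] for c in counts.values())

pooled_rate = total_uncited / total_citable  # weighted by each country's output
simple_mean = sum(c["uncited"] / c["citable"] for c in counts.values()) / len(counts)

print(f"Pooled rate: {pooled_rate:.1%}")  # 17.6% (dominated by the larger country)
print(f"Simple mean: {simple_mean:.1%}")  # 23.0% (treats both countries equally)
```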
It is important to note that these analyses only include articles published in journals that are indexed by Scopus. Publications in journals that are not indexed by Scopus are probably less likely to be cited.
While it is fair to assess the impact of research, unfounded statements implying that many academics do not conduct valuable research can damage science. Politicians may be swayed by public opinion to decrease research funding, and they may feel comfortable ignoring the opinions of experts when making policy decisions. It is therefore important that myths such as the claim that most articles are never cited are dispelled.