Abstract
This opinion paper takes aim at a recent error by Clarivate Analytics, which sent out an email congratulating academics on becoming members of academia’s most cited elite, the Highly Cited Researchers (HCRs). However, that email was sent to an undisclosed number of non-HCRs, who shortly afterwards received a bulk apology that tried to downplay the importance of the error while praising the true HCRs. When contacted, Clarivate Analytics senior management declined to indicate how many academics had been contacted and erroneously awarded HCR status. We believe that this regrettable blunder, together with the company’s opacity, reinforces the corporate attitude toward the journal impact factor (JIF) and what it represents, namely a marketing tool falsely used to equate citations with quality, worth, or influence. The continued commercialization of metrics such as the JIF is at the heart of their use to assess the “quality” of a researcher, their work, or a journal, and contributes greatly to turning scientific activity into a futile endeavor.
Clarivate Analytics completed its transition out of Thomson Reuters’ Intellectual Property & Science business, including the journal impact factor (JIF), after that business was purchased by Onex and Baring Private Equity Asia in July 2016 (Marketwatch 2016). The transition was fairly smooth, and the JIF is still computed according to the concept formulated by Eugene Garfield and Irving Sher 60 years ago (Garfield 2006). Nowadays, the JIF is often erroneously used as a proxy for the quality of research or of a researcher, leading to a movement (DORA; http://www.ascb.org/dora/) that has been pushing for it to be scrapped from academia, not only because this metric is heavily gamed, but also because it is a non-academic barometer that can easily be abused (Callaway 2016). Over decades of use, the JIF has become too entrenched in the cultural fabric of academia, at a global scale, to be ignored completely, although we believe that it could soon become obsolete. To give a somewhat false notion of moving away from this metric, variations such as the Source Normalized Impact per Paper (SNIP) have simply been devised. However, since the SNIP is essentially a JIF-based metric, it inherits almost all of the misuses of the JIF (Larivière et al. 2016). As long as academics, including authors and editors, continue to treat the JIF as a prime indicator of quality, which it is not (Favaloro 2008), and as long as senior authorities in academic bodies continue to reward their academics based on JIF scores (per author, per article, or per journal), it will be difficult to envision a cure for this endemic obsession, sometimes referred to as IF mania or Impactitis (Casadevall and Fang 2015).
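For reference, the metric at issue remains Garfield and Sher’s simple two-year ratio (the standard definition, sketched here for clarity):

```latex
\[
\mathrm{JIF}_{Y} =
  \frac{\text{citations received in year } Y
        \text{ to items published in years } Y-1 \text{ and } Y-2}
       {\text{citable items published in years } Y-1 \text{ and } Y-2}
\]
```

Because this is a mean over a highly skewed citation distribution, a handful of very highly cited papers can dominate a journal’s score.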
The argument can be made that a highly cited paper, i.e., one that carries a high paper-based metric such as the SNIP, has proved useful to a wide swathe of researchers, as measured by the number of times it has been cited over a period of time. In that sense, a high number of citations for a paper can be equated with an indication of its usefulness, but does not necessarily reflect its intrinsic quality, or that of the journal in which it was published. For example, it is now well established that journals ranked highly on the JIF scale also tend to have high retraction rates (Woolston 2014). Beyond the argument as to whether the JIF reflects quality, importance or usefulness, its abuse may be the greatest problem, because it has become institutionalized. For example, in China, Iran, Indonesia, Brazil, Russia (see country- and region-wide examples in several papers in Teixeira da Silva 2013a; e.g., p. 38, 46, 57, 61, 66) and Mexico, the JIF of the journals in which scientists publish their results is blatantly used in remuneration schemes that financially reward academics based on these JIF scores. A sample of such policies may be found in the scheme implemented in 1984 in Mexico, known as the Sistema Nacional de Investigadores (SNI), which currently supports a staff of ca. 25,000 researchers. The economic support for these individuals is on a scale from 1 to 5, and is based essentially on their portfolio of publications. As set out in the documents of the SNI, the metrics used during the periodic evaluations of researchers are the JIF and the number of citations received, omitting self-citations and citations in theses (CONACyT 2016). Although this economic scheme has certainly prevented a brain drain and boosted Mexican scientific production over the past 30 years, it has also been the crucible for deviant behaviors, the most visible being the steady growth of guest authorship (Gómez Nashiki et al. 2014).
So, whatever arguments exist in favor of a possible relation between the JIF and quality, any potential usefulness of the JIF is erased by its global abuse in academia, and the metric seems to survive only because of its use in large institutional schemes.
When the JIF was still owned by Thomson Reuters, a rewards system was put into place to offer some form of recognition to the most highly cited researchers, referred to as Highly Cited Researchers (HCRs). Each year, a few months after the new JIF has been published, the lists of HCRs are released, leading to great jubilation among the scientists, laboratories and institutes whose staff appear among this most-cited elite. In 2016, Clarivate Analytics was to celebrate its first HCR announcement, following in the tracks of its predecessor. Unfortunately, something went grossly wrong during this process. Clarivate Analytics sent a congratulatory email, with a link to the digital badge and certificate that accompanies the HCR announcement, to an undisclosed number of scientists (Appendix 1). The problem is that several (perhaps the majority) of these were in fact not HCRs. What followed was a fairly chaotic scene on Twitter (some samples in Fig. 1). Some scientists mocked the situation, while others sounded totally baffled, and delighted, at having been selected as an HCR. Within hours, Clarivate Analytics contacted those who had been erroneously rewarded to indicate that a mistake had occurred (Oransky 2016) (Appendix 2), while taking the opportunity to praise the true HCRs.
On November 21, 2016, the first author contacted Heidi Siegel, the Director of External Relations at Clarivate Analytics, asking her to comment on and explain the error, and to indicate the precise number of researchers who had, for a few hours, been erroneously awarded the HCR badge. The response was received within 24 hours: “Regarding information about a precise number of scientists who were sent the wrong highly cited researcher email last week, there were a number of people who received the letter in error. However, the number we should focus on are the 3265 Highly Cited Researchers (HCRs) for 2016 who are to be celebrated. Highly Cited Researchers derive from papers that are defined as those in the top 1% by citations for their field and publication year in the Web of Science. As leaders in the field of bibliometrics we appreciate the effort required to reach this achievement and celebrate those who have done so this year.”
This statement is of some concern because it cements the notion that Clarivate Analytics is trying to downplay the seriousness of the technical error, while continuing to promote the now old-fashioned (since the launch of Elsevier’s CiteScore) and academically useless JIF as a marketing and gaming tool in academia.
In essence, there has been no change in the JIF-gaming mentality in the transition from Thomson Reuters to Onex and Baring Private Equity Asia. The lack of transparency and accountability pointed out in detail in 2013 (Teixeira da Silva 2013b) thus continues. The integrity of the published JIF will continue to be questioned by many researchers as long as the database used to calculate it is not made available, or, even worse, is manipulated in a way that suits the interests of Clarivate Analytics customers (Rossner et al. 2007). It is also unclear why Clarivate Analytics considers that the number of researchers who were erroneously contacted and congratulated for being HCRs in 2016 should be kept secret. With such opaque practices, how can the community trust the JIF-related figures calculated by this company?
There is a common feature between HCRs and JIF-related metrics: both have highly skewed distributions, and the 3265 HCRs identified by Clarivate Analytics for 2016 obviously represent considerably less than 1% of the global scholarly community. Although we agree with Clarivate Analytics that the “Highly Cited Researchers 2016 represents some of world’s most influential scientific minds” and therefore that most of those HCRs set a good example to follow, we remain convinced that the delivery of badges ultimately amounts to a futile exercise, with no true academic value, based on unverifiable data. Moreover, Clarivate Analytics seems to have missed the target by confusing the prestige (deserved or not) of researchers with the actual weight of their research, which is increasingly carried out by large collaborative teams consisting essentially of non-HCRs. By focusing on the wrong target, or not on the full complement of targets, Clarivate Analytics is perpetuating the prevalence of vanity in science, and the HCR badges are reminiscent of the optical illusion created by the American illustrator Charles Allan Gilbert (Kearl 2015), illustrating the sentence found in the Latin Vulgate: Vanitas vanitatum, omnia vanitas (“vanity of vanities; all is vanity”).
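The interaction between a skewed citation distribution and a “top 1%” cutoff can be illustrated with a minimal sketch. The citation counts below are synthetic (a heavy-tailed Pareto draw), and Clarivate’s actual selection, which is field- and year-normalized via Essential Science Indicators, is considerably more involved; this is only an illustration of the percentile mechanics.

```python
import random

# Synthetic, heavy-tailed citation counts for one hypothetical field/year.
# Real citation data are similarly skewed: most papers sit near the bottom,
# a tiny elite accumulates most of the citations.
random.seed(0)
citations = [int(random.paretovariate(1.1)) for _ in range(100_000)]

# Top-1% cutoff: the citation count at the 99th percentile.
ranked = sorted(citations)
threshold = ranked[int(0.99 * len(ranked))]

# Papers at or above the cutoff play the role of "Highly Cited Papers".
highly_cited = [c for c in citations if c >= threshold]
share = len(highly_cited) / len(citations)
print(f"threshold: {threshold} citations; share flagged: {share:.2%}")
```

Because the distribution is heavy-tailed, the threshold lands far above the median paper; the same skewness is what makes any mean-based summary, such as the JIF, unrepresentative of a typical paper or researcher.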
References
Callaway, E. (2016). Beat it, impact factor! Publishing elite turns against controversial metric. Nature, 535, 210–211. doi:10.1038/nature.2016.20224.
Casadevall, A., & Fang, F. C. (2015). Impacted science: Impact is not importance. mBio, 6(5), e01593-15. doi:10.1128/mBio.01593-15.
CONACyT. (2016). Área II: Biología y Química Criterios Internos de Evaluación. http://www.conacyt.gob.mx/index.php/el-conacyt/sistema-nacional-de-investigadores/otros/marco-legal-sni/criterios-sni/828-criteriosespecificosareaii/file. Accessed: 6 Jan 2017.
Favaloro, E. J. (2008). Measuring the quality of journals and journal articles: The impact factor tells but a portion of the story. Seminars in Thrombosis and Hemostasis, 34(1), 7–25. doi:10.1055/s-2008-1066030.
Garfield, E. (2006). The history and meaning of the journal impact factor. Journal of the American Medical Association, 295(1), 90–93. doi:10.1001/jama.295.1.90.
Gómez Nashiki, A., Jiménez García, S. A., & Moreles Vázquez, J. (2014). Publicar en revistas científicas, recomendaciones de investigadores de ciencias sociales y humanidades. Rev Mex Invest Educativa, 19(60), 155–185. (in Spanish with English abstract).
Kearl, M. C. (2015). The proliferation of skulls in popular culture: A case study of how the traditional symbol of mortality was rendered meaningless. Mortality, 20(1), 1–18. doi:10.1080/13576275.2014.961004.
Larivière, V., Kiermer, V., MacCallum, C. J., McNutt, M., Patterson, M., Pulverer, B., et al. (2016). A simple proposal for the publication of journal citation distributions. BioRxiv. doi:10.1101/062109.
Marketwatch. (2016). Thomson Reuters announces definitive agreement to sell its Intellectual Property & Science business to Onex and Baring Asia for $3.55 billion. http://www.marketwatch.com/story/thomson-reuters-announces-definitive-agreement-to-sell-its-intellectual-property-science-business-to-onex-and-baring-asia-for-355-billion-2016-07-11?siteid=nbsh. Accessed 6 Jan 2017.
Oransky, I. (2016). Sorry, researchers: That Thomson Reuters “highly cited” designation you received is probably wrong. http://retractionwatch.com/2016/11/21/sorry-researchers-that-thomson-reuters-highly-cited-designation-you-received-is-probably-wrong/. Accessed 6 Jan 2017.
Rossner, M., Van Epps, H., & Hill, E. (2007). Show me the data. Journal of Cell Biology, 179(6), 1091–1092. doi:10.1083/jcb.200711140.
Teixeira da Silva, J. A. (2013a). Issues in publishing and science. Asian and Australasian Journal of Plant Science and Biotechnology, 7(Special issue 1), 126 pp.
Teixeira da Silva, J. A. (2013b). The Thomson Reuters Impact Factor: critical questions that scientists should be asking. Asian and Australasian Journal of Plant Science and Biotechnology, 7(Special issue 1), 81–83.
Woolston, C. (2014). High retraction rates raise eyebrows. Nature, 513(7518), 283. doi:10.1038/513283f.
Acknowledgements
The authors thank the 12 scientists for their explicit permission to publish their Tweets shown in Fig. 1.
Ethics declarations
Conflicts of interest
The authors declare no conflicts of interest. The second author received a congratulatory email from Thomson Reuters IP and Science, indicating that he had been named a 2016 Highly Cited Researcher. The second author was subsequently sent an email indicating that the original mail had been sent in error. The second author is currently a member of the Sistema Nacional de Investigadores (SNI) in Mexico.
Appendices
Appendix 1
An anonymized verbatim sample of the erroneous congratulatory email sent to a non-HCR. According to a December 24, 2016 report published by Bloomberg, the signatory, Mr. Vin Caraher, served as “President of Intellectual Property & Science at Thomson Reuters Corporation since 2015 until October 12, 2016.”
“From: Thomson Reuters IP and Science [email redacted]
Sent: Friday, November 18, 2016, 08:06
To: [email redacted]
Subject: Congratulations Highly Cited Researcher!
Dear [recipient name redacted], I would like to extend congratulations on being named a 2016 Highly Cited Researcher and to announce the availability of the official 2016 list. You were selected as a Highly Cited Researcher because your work has been identified as being among the most valuable and significant in the field. Very few researchers earn this distinction—writing the greatest number of reports, officially designated by Essential Science Indicators as Highly Cited Papers. In addition, these reports rank among the top 1% most cited works for their subject field and year of publication, earning them the mark of exceptional impact. Now that you have achieved this designation, you will always retain your Highly Cited Researcher status. Share your recognition!
-
Add this badge to your website, LinkedIn profile and email signature.
-
Request a physical copy of a personalized letter and certificate for display. Requests can be made through the end of December.
-
Join the conversation on social media using the hashtag #HighlyCited.
I applaud your contributions to the advancement of scientific discovery and innovation and wish you continued success.
Best regards,
Vin Caraher”
Appendix 2
An anonymized verbatim sample of the apology sent to the non-HCR who had erroneously received the congratulatory email in Appendix 1. Kindly contrast the signatory of this email with that of the email displayed in Appendix 1. Also notice that addressing the recipient as “Dear Researcher” indicates that a bulk email was sent to all non-HCRs who had been congratulated by the email in Appendix 1.
“From: Thomson Reuters IP and Science [email redacted]
Sent: Friday, November 18, 2016, 11:48
To: [email redacted]
Subject: 2016 Highly Cited Researcher status update
Dear Researcher, We recently sent you an email about being named a Highly Cited Researcher. This was sent in error. Please accept our sincere apologies. We’ve identified the error in our system that caused this and were able to resolve it quickly, ensuring it won’t be repeated. Highly Cited Researchers derive from papers that are defined as those in the top 1% by citations for their field and publication year in the Web of Science. As leaders in the field of bibliometrics we appreciate the effort required to reach this achievement and celebrate those who have done so this year. Sincerely, Clarivate Analytics”
Cite this article
Teixeira da Silva, J.A., Bernès, S. Clarivate Analytics: Continued Omnia vanitas Impact Factor Culture. Sci Eng Ethics 24, 291–297 (2018). https://doi.org/10.1007/s11948-017-9873-7