On ‘Best Practice Rules’ of Publishing and Their Erosion—A Cause for Concern

We have recently been alerted to the fact that an article published in Minerva had a double in the journal Scientometrics, where it had been published by the same author just a few weeks before. A check of text overlap produced a disturbing result: 29% of the text was identical, and two tables and four figures contained the same numbers even though the format differed slightly. It is a case of ‘self-plagiarism’. Many of us, if not all, have done this, myself included, with one important exception: text which has already appeared in another context must be cited so that editors and reviewers are aware of the multiple use of text. No one can expect them to find such cases on their own. Unless someone sees these duplications by accident, they go unnoticed and inflate the communication process. But scholars, editors and reviewers need to know, because journal space is scarce and so is the time researchers spend looking for results relevant to their own work, not to mention the time reviewers are asked to spend on providing constructive criticism to authors. Unnecessary and hidden multiplication of text and results abuses both.

To protect these scarce resources, covert duplication has become a matter of concern for journals and research councils. While so far little is known about duplicate publication in the social sciences and the humanities, other disciplines have already reacted. von Elm et al. (2004) proposed a decision tree for the identification of duplicate publications, based on a study in anesthesia and analgesia (the current case would fit pattern 1A of this decision tree): “Duplicate publication is the publication of an article that overlaps substantially with an article published elsewhere. This practice may be acceptable in particular situations. However, authors must acknowledge the main article overtly by using a cross-reference. Covert duplicate publication has been widely disapproved. This practice is wasteful of the time and resources of editors, peer reviewers, and readers, and it is misleading because undue weight is given to observations that are being reported repeatedly…. Finally, covert duplicate publication is dishonest; it undermines the integrity of science” (von Elm et al. 2004, p. 974). Moreover, “… to produce an article that overlaps substantially with an already published article without adequate cross-referencing is misconduct” (op. cit., p. 979).

The most detailed guidelines for cases of duplicate publication and multiple submission can be found in Sect. 8.2.4 (“Allegations of Misconduct”) of the Publication Services and Products Board Operations Manual of the IEEE. There is a specific chapter (8.2.4.F) with “Guidelines for editorial reuse of previously published material, and adjudicating inappropriate reuse of previous work or the failure to inform editors of previous publications or multiple submissions” (IEEE Publication Services and Products Board Operations Manual 2009, p. 74ff.). The board admits that “it is common in… publishing for material to be presented at various stages of its evolution. … this can take the form of publishing early ideas in a workshop, more developed work in a conference and fully developed contributions as journal… papers. The IEEE recognizes the importance of this evolutionary publication process as a significant means of scientific communication and fully supports this publishing paradigm”. However, “the IEEE requires that this evolutionary process be fully referenced” (op. cit. p. 74). The manual also contains recommendations for cases of multiple submission: “… authors should only submit original work that has neither appeared elsewhere for publication, nor which is under review for another refereed publication. Multiple submission is defined as a given manuscript being concurrently under active consideration by two or more publications” (op. cit. p. 75). Chapter 8.2.9 (“Referencing Guidelines”) of the same document gives details on authors’ obligations: “A manuscript submitted for publication to IEEE should be original work submitted to a single IEEE publication. The manuscript should not have been published previously and should not be concurrently under consideration for publication elsewhere. … When an author reuses text, charts, photographs, or other graphics from his/her own previously published material, the author shall: 1. Clearly indicate all reused material and provide a full reference to the original publication of the material and 2. If the previously published or submitted material is used as a basis for a new submission, clearly indicate how the new submission differs from the previously published work(s)” (op. cit. p. 79).

Finally, the German Research Foundation (DFG) has published ‘recommendations on good scientific practice’ which, in similar form, are probably shared by research councils everywhere. They state: “Publications intended to report new scientific findings shall… • give correct and complete references to previous work by the authors and by others (citations), • repeat previously published findings only inasmuch as it is necessary for understanding the context, and in a clearly identified form” (Deutsche Forschungsgemeinschaft 1998, p. 62). Clearly, both rules were violated in this particular case, as were the rules cited above.

I discussed the matter with colleagues and also—anonymously—with the chair of a national committee on scientific integrity. Reactions varied widely. One colleague played down the case and pointed out that ‘everyone does it’. The chairperson of the said committee was much less nonchalant and recommended withdrawal of the article. I side with the latter. The rules are clear, simple and unambiguous. They should be obvious to everyone working in academia. However, two questions emerge: (1) Why is the rule being violated? (2) Why do reactions to its violation vary so much? The answer to the first question is fairly straightforward. The pressure on young researchers to publish articles in international peer-reviewed journals borders on the absurd. This pressure is exerted by administrations in science ministries, universities and research councils whose members neither read any of the articles (they rely on others, who do not read them either but count them instead) nor give any thought to the repercussions of their policies on the communication process in science, on the researchers and on the journals.

The answer to the second question is harder to come by. Everyone, at some stage of their academic career, becomes aware of a general and fundamental change in the conditions of academic work, in the atmosphere, in relationships with colleagues, and so on. People adapt in different ways to such changes, and when survival becomes more difficult, the cleverness to succeed is valued more highly. Cleverness is contrasted with naïveté. Playing by the rules is considered naïve. Bluffing, even if it verges on lying, is looked upon as a legitimate strategy. With reference to the article in question: is it correct to list a reputed research organization as one’s institutional affiliation even if one was merely a visiting fellow there several years ago and never received a cent from its budget? No! It is playing clever, trying to impress the uninformed department chair or the evaluating committee.

If this becomes widespread practice in academia, if the clever bluffers are admired for their successes and those who continue to play by the rules are pitied for their naïveté, academia will soon be in serious trouble. Science rests on trust, part of which is supported by motivations that favor devotion to knowledge production over the collection of bonuses. The uniqueness of this institution lies in the combination of trust and competition. As the editor of this journal, I am old and conservative enough to defend this model, together with the policy that Minerva publishes only original articles. I urge all our readers and contributors to do the same and to resist the pressures described—if for no other reason than that you can never be sure there is not someone somewhere who beats you at the game of bluffing.