Introduction

Evidence-based health care depends on quality evidence, yet little consideration is given to how “the evidence” is produced. Scientific circles have recently witnessed grave examples of scientific misconduct, which nevertheless appear to have had little impact in dentistry, signifying either a lack of awareness or possibly a certain tolerance. However, a recent survey (Martinson et al. 2005) and a meta-analysis (Fanelli 2009) suggest that the few cases that do surface represent only the tip of a large iceberg. If, on average, 2 % of scientists admit to having falsified data at least once and up to 34 % admit to other questionable research practices, the actual frequencies of misconduct could be much higher (Fanelli 2009). In addition, misconduct appears to be more widespread in clinical, pharmacological and medical research than in other fields, although the reasons for this are unclear (Fanelli 2009; Luther 2010).

This paper therefore highlights:

  • Reasons for scientific misconduct

  • Indian scenario

  • Steps that can be taken to reduce scientific misconduct.

What is scientific misconduct?

Scientific misconduct is not new. Despite serious efforts, there is no universally accepted definition of scientific (research) misconduct (Scott-Lichter and the Editorial Policy Committee 2009).

The U.S. National Science Foundation (NSF)Footnote 1 defines three types of research misconduct: fabrication, falsification and plagiarism. These “big three” of research misconduct are the most widely addressed. Fabricating data involves creating a new record of data or results; informed consent forms and patient diaries are the most commonly fabricated documents. Falsifying data means altering existing records, that is, the deliberate omission or distortion of undesired data or results (Gupta 2013). Plagiarism is defined by the United States Office of Research Integrity (ORI) as “both the theft or misappropriation of intellectual property and the substantial unattributed textual copying of another’s work” (United States Department of Health and Human Services 2008). Interest in plagiarism as an academic matter has grown immensely in recent years. Whether the fault is a missed citation in a non-integral part of the paper, self-plagiarism, or wholesale duplication of passages from others’ work, the negative consequences are not merely a distortion of the evidence base towards plagiarised ideas but also an erosion of the public’s confidence in the products of scientific research. Unlike fabrication and falsification, plagiarism has direct “victims”: the individuals whose work went unattributed, and who should be involved in the review of new work in their field (Rathod 2010). Increasingly, however, a multitude of “questionable research practices” probably does more damage to science than the “big three” (Smith and Koehlmoos 2013; Dhingra and Mishra 2014). These include selective reporting; suppression of negative findings; hiding conflicts of interest; redundant publication; violations of ethical standards in human or animal studies, especially issues of informed consent with vulnerable populations; “gift authorship” (listing authors on papers who have done little or nothing); “ghost authorship” (non-inclusion of individuals who played a valuable part in the work and qualified for authorship); “duplication” (publication of the same paper, with little or no change in its content, in different journals); and “salami” publishing, where authors slice up their research, carving multiple papers from a single study with the sole aim of having multiple publications.

How common is scientific misconduct?

There can be no definitive answer on how commonly research misconduct occurs, as its precise definition is still debated. Minor forms of misconduct are common, but reliable information on the prevalence of serious misconduct is scarce (Smith 2006). A survey (Martinson et al. 2005) of 3,247 US National Institutes of Health (NIH)-funded scientists reported that, in response to pressure from a funding source, 15.5 % had changed the design, methods or results of a study, and 10 % had withheld details of methods or results in papers or proposals (Luther 2010). A direct survey of NIH-funded scientists estimated that, as an absolute minimum, they observed 2,325 incidents of misconduct per year (Titus et al. 2008). A recent systematic review shows frighteningly high levels of misconduct in high-income countries: nearly 2 % of scientists had themselves fabricated or falsified data, and one-third admitted to questionable research practices, including selective reporting (e.g., “dropping data points based on a gut feeling”) and altering an experiment or its results “in response to pressures from a funding source”. When asked about other researchers, those surveyed believed that as many as 14 % of their colleagues had falsified or fabricated data and nearly three-quarters were guilty of questionable research practices (Smith and Koehlmoos 2013).

Among low- and middle-income countries, the volume of research has increased most dramatically in China. A 2006 article in Science described the country as a “scientific Wild West, where an unprecedented number of researchers stand accused of cheating—from fudging resumes to fabricating data—to gain fame or plum positions”. The National Science Foundation of China investigated 542 allegations of misconduct and found positive evidence in 60 cases. The main problems were plagiarism (34 %), data falsification (40 %), and data fabrication or theft (34 %). Unfortunately, it is difficult to prove or disprove misconduct (Smith and Koehlmoos 2013).

Surprisingly, almost no data are available for dentistry. One survey undertaken by the American Association for Dental Research reported that, of the 76 (out of 98) program chairs/Association officers who responded, 30 % had observed falsification of data and 54 % had observed plagiarism at least once (Bebeau and Davis 1996).

Reasons for scientific misconduct

An attitude of tolerance appears to be one of the major reasons for scientific misconduct (Luther 2008). Authors, funders, and institutions are all caught up in the pressure to publish in increasingly competitive research and educational environments (Luther 2010). Scientific misconduct may creep in when a researcher’s primary goal is not scientific discovery but publication. An unrealistic demand for perfect results and thorough understanding in research papers has delayed scientific communication and has also forced the creation of artifacts or even fakery. Poor selection processes also reward most quickly and generously those scientists who are best at producing high-profile publications rather than those who make truly ground-breaking discoveries; worse, the outcome of this selection often exalts cheaters over true pioneers (Liu 2006). The incentives for those who cheat are enormous: duplication, salami slicing and gift authorship reap great rewards. It is therefore natural that, to attain higher academic positions, people sometimes take the shorter route and commit serious misconduct. For example, the Hirsch index (h-index) is now used to measure the productivity and impact of a scientist, and many universities and organizations consider this index when allotting research funds or appointing a person to a chair. The total number of publications influences the number of citations, hence the incentive to cheat (Kekre 2012).
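
Because the h-index ties rewards so directly to publication and citation counts, it is worth seeing how simply the metric is computed. The sketch below is only an illustrative calculation in Python; the function name and the sample citation counts are invented for the example.

```python
def h_index(citations):
    """Return the h-index: the largest h such that the author has
    h papers each cited at least h times."""
    ranked = sorted(citations, reverse=True)  # most-cited papers first
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # this paper still counts towards h
        else:
            break
    return h

# Two hypothetical authors with the same total of 30 citations:
print(h_index([10, 8, 5, 4, 3]))  # 4
print(h_index([25, 3, 1, 1, 0]))  # 2 -- many thinly cited papers add little
```

The comparison shows why the metric pressures researchers to accumulate both more papers and more citations per paper, which is precisely the incentive structure that duplication, salami slicing and gift authorship exploit.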

Furthermore, data and images are easily manipulated by electronic methods (Luther 2010). Sloppy behaviour and laziness sometimes spill over into fraud (Claxton 2005). Misconduct becomes easier for scientists because the system operates on trust. Scientists also happen to be victims of their own rhetoric: they have fooled themselves into believing that science is a wholly objective enterprise, untarnished by the usual human subjectivity and imperfections (Smith 2006). There are elements of an ‘infallibility’ or ‘I know I’m right really’ complex (Luther 2008). Innocent ignorance, such as backdating a subject’s signature on a consent form, disposing of source documents after accurate transcription, or even creating source documents from case record forms, can also lead to misconduct. The amount of oversight of the study, the existence of explicit versus implicit rules, the penalties and rewards attached to such rules, the extent of training imparted, the regulations involved and insufficient mentoring are other related factors (Gupta 2013).

Why research misconduct matters

Research misconduct today is rather like child abuse: we did not recognise it earlier, and now we see a great deal of it. It challenges public trust in medical research and health workers. It can lead to wrong, ineffective or harmful molecules being brought to market, thus delivering ineffective or harmful treatment to patients (Gupta 2013). Misconduct dilutes existing research, corrupting the scientific record and leading to false conclusions. Repeating fraudulent research and investigating such records both impose massive costs on sponsors. Fraudulent clinical research adversely impacts the core of good clinical practice, namely the rights, safety and well-being of research participants. What matters most, however, is that the majority of countries have no good system for either treating or preventing it.

Indian scenario

In India, such unethical practices are thought to be rampant and all pervasive. Many researchers, in India as well as elsewhere in Asia, whose first language is not English indulge in unintended inappropriate paraphrasing. The concept of ownership of ideas and words varies from culture to culture: what is plagiarism in a western context may be a sign of deep respect in another (Kekre 2012). Fabrication and falsification of data in India are “rare and more sophisticated”, comments Balaram, editor of Current Science.Footnote 2 In India, most cases of misconduct involve plagiarism; the retracted papers are often first published in low-profile journals that are not indexed and hardly ever cited or read, where authors try to “fly under the radar”. The lack of institutional transparency, and scientific institutes’ unwillingness to investigate scientific misconduct, make their findings public, and take action, adds fuel to the fire (see footnote 2). Some cases, particularly those involving influential scientists, are not raised for fear of reprisals. India has no specific law pertaining to scientific misconduct. Universities, sponsors or institutions are entrusted with the responsibility of investigating and acting against such instances, and must then report them to the Drug Controller General of India, the central body responsible for approving clinical trials in India (Gupta 2013). A proper legal definition and a statutory mechanism to deal with such cases are also lacking, and India’s problems go beyond defining scientific misconduct: what to do after a complaint comes up is even more difficult to know. “There are many fields of science where India may not have sufficient scientists of caliber who would arrive at a sound judgment,” remarks E. D. Jemmis, a chemist at the Central University in Hyderabad. Others are cynical about whether even reputed scientists can be impartial. “Many have their own cliques whom they will help and protect,” remarks one scientist.Footnote 3

Moreover, an influential scientist committing a really serious offence does not get even a rap on the knuckles, while a junior researcher committing a minor breach is likely to be punished. “Regrettably, there has always been a tendency to award harsh punishments to those who may least deserve it; students and post-doctoral fellows can easily be removed from their positions when there is a whiff of trouble,” commented P. Balaram in an editorial in Current Science. “Senior scientists, on the other hand, are protected by institutional armour, powerful colleagues and the general reluctance to wage a prolonged battle to establish facts” (see footnote 3).

Supervisors or sectional heads in national laboratories of the CSIR, DRDO, ICMR, ICAR, etc., are routinely awarded gift authorship on research papers. Honours and fake degrees are available on the internet and by mail. Directors/Vice-Chancellors are increasingly appointed on the basis of political understanding or pliability. Bribes decide appointments and transfers of faculty in state-controlled institutions. Bribes and kick-backs are creeping into national funding agencies and departments. Questionable claims of achievements are being made by government departments through lavish advertisements.Footnote 4

“Business with knowledge” was highlighted by a popular national magazine when it listed a number of Vice-Chancellors as “Chancellors of Vice”. Many heads of institutions are not even prepared to acknowledge the existence of such a problem, let alone take any action (see footnote 4). A paper by Prof. T. A. Abinandanan finds that misconduct rates in India rose from 10 per 100,000 papers in 1991–2000 to 44 per 100,000 papers in 2001–2010. Of the 103,434 papers published from India in the 2001–2010 period, 70 were retracted; of these, 45 were attributed to some form of plagiarism (23 to text plagiarism, 18 to self-plagiarism, and 3 to data plagiarism). An analysis posted on Nature’s blog (by Richard Van Noorden and Bob O’Hara) found that India had the highest fraud rate in the world, at 18 papers per 100,000.Footnote 5

Plagiarism and other academically unethical practices are quite common even in the nation’s prestigious and elite institutes such as the IITs and Central Universities. By the end of 2010, three Indian Institutes of Technology (IIT Kanpur, IIT Delhi and IIT Kharagpur) had become embroiled in controversy over alleged scientific misconduct and unethical practices.Footnote 6 Some Indian universities are reported to have awarded PhDs for plagiarized dissertations, thanks to a cosy nexus between the guide, the university and the examiners. It is even worse in medical colleges, where most so-called theses/dissertations submitted for MD/MS/DM/MCh degrees are little more than a rehash of published material collected from several sources. With examiners generally uninterested in going beyond the title page, the conduct, recording and reporting of research in most medical colleges (including some renowned ones) is abysmal (Satyanarayana 2010). Prof. Srivastava recommended that the approved, final copy of a student’s PhD thesis should list the names of the examiners so that this becomes public knowledge (see footnote 4).

In a pan-India audit survey by Lady Hardinge Medical College and Maulana Azad Medical College, 91 % of respondents had some knowledge of publication ethics, but only 29 % believed it was adequate. This lack of knowledge may well be the foundation for future publication misconduct. Gift authorship was reported by 65 %, 56 % reported data alteration, 53 % observed plagiarism, and 33.5 % observed ghost authorship. A majority of respondents reported witnessing publication misconduct, revealing how common this problem is among Indian biomedical researchers (Dhingra and Mishra 2014).

In the absence of a statutory body to investigate academic misconduct, the Society for Scientific Values (SSV) was set up in 1986. The society has no legal or administrative powers but enjoys high moral credibility, and it has taken up cases from time to time where values intrinsic to science have been compromised. The infamous case of Prof. Rajput, the Vice-Chancellor of Kumaon University, is a case in point: the SSV and numerous others made every effort to move the President of India and the state and central governments, but nothing moved until three American scientists wrote to the President demanding immediate action (see footnote 4). The Himalayan geology scandal at Punjab University, Chandigarh, was widely covered in the journal Nature. The international embarrassment caused by the embroilment of Indians such as V. J. Gupta, Ram B. Singh and R. K. Chandra, and the absence of substantial corrective measures, is not exactly encouraging (Satyanarayana 2010).

A plagiarism charge against a senior scientist of the Centre for DNA Fingerprinting and Diagnostics rocked that premier research institution. The scientist, suspended after prima facie evidence was found, was strangely reinstated even while the investigation was still under way.Footnote 7 Such revocation of suspension without review is not permitted under Government of India service rules; more damaging, the scientist moved back to the same post without even replying to the charge memo. Such incidents raise doubts about whether more such actions are being hidden. The case of misrepresentation of data in two JBC publications by Dr. Kundu and his students received very wide coverage in the media; the second JBC paper was withdrawn by the journal.Footnote 8 The case of Raghunath Mashelkar, one of India’s most decorated scientists, is especially informative of the public and personal ramifications of plagiarism: publicity around the matter preceded Dr Mashelkar’s request to withdraw his report and his resignation from the committee (Rathod 2010). A paper by a senior academic at the All India Institute of Medical Sciences (AIIMS), New Delhi, was withdrawn after the editors found several overlaps, including figures, with another review published in 2001 by a UK-based professor. Seven professors of AIIMS, including a former Director, were charged with publishing the same article in two different journals. A professor at S. V. University, Tirupati, published almost 70 plagiarized papers in prestigious journals. In another case, a sustained national furor forced the resignation of a Vice-Chancellor against whom charges of plagiarism were proven. A former Director-General of the CSIR and President of the Indian National Science Academy figured in two allegations of plagiarism. In a shocking disclosure, 10 of the 18 students who copied their way to admission to a US business school in 2010 were Indians. The website of the SSV, New Delhi, lists many more cases (Satyanarayana 2010). On average, the SSV investigates around 200 new complaints of plagiarism and corruption against scientists every month.Footnote 9

The dental scenario is also not very reassuring. India accounts for approximately one-third of the dental schools worldwide, with about 310 dental colleges, and around 25,000 dentists graduate in India annually. Over the last decade there has been a sudden, uncontrolled mushrooming of colleges, and it goes without saying that any seemingly uncontrolled growth, called malignancy in science, should be eyed with suspicion. This uncontrolled mushrooming has produced a number of health care professionals of questionable quality and doubtful integrity. Most dental universities and schools do not have an ethics committee. Ethical requirements such as ethics committee clearance and informed consent are taken for granted: they are cited in the manuscript or research paper without the actual clearance or consent having been obtained (Singh and Purohit 2011).

Health care journals have likewise proliferated rapidly in the country. Many open-access journals e-mail researchers offering to publish their papers within 2 weeks for a publication charge. Although described as peer reviewed, many of these articles are published without any review process (see footnote 9; Singh and Purohit 2011). These journals have the sole aim of taking advantage of the dental and medical councils’ rules on publications for promotion. An amendment by the Medical Council of India in 2009 introduced a compulsory minimum number of publications as a criterion for early academic promotion, and there are several teachers in medical colleges who have fulfilled all criteria for promotion except publications (D’Souza 2010). Young researchers fall into this trap, as quick and easy publication to earn a job, a promotion or research funding can be tempting.

Journals with an International Standard Serial Number (ISSN) are, moreover, legal entities and cannot simply be shut down on the basis of a few suspicions. However, universities can act by discontinuing or improving substandard university-owned journals if regulatory bodies such as the University Grants Commission (UGC) in India issue such advice (see footnote 9). Prof. Chopra says ministries and funding agencies should keep their own blacklists and insist on scientists publishing only in above-board journals as a condition of funding, which again raises the issue of defining ‘above board’ and ‘substandard’ (see footnote 9).

There are several such instances in Indian science, yet there is no channel to investigate scientific misconduct and no agency to conduct an investigation impartially. Most of the time the institute itself constitutes the committee (from its own department!), which gives a clean report, and the whistle-blower suffers very badly. The ICMR has brought out guidelines on authorship for both intramural and extramural research, though they are not strictly enforced. The SSV has therefore resolutely waged a lonely battle since 1986 to clean Indian science of its known ills (Satyanarayana 2010), and has from time to time organised meetings with specific purposes. Indian scientists have called for an office of research integrity to detect, investigate and punish proven scientific misconduct. The office would be part of a national policy on academic ethics catering for the country’s rapidly expanding scientific community and output; institutes should appoint ‘ethics officers’, and undergraduate studies should include mandatory ethics modules (see footnote 2). Jesani recently pointed out that even after 30 years of ethics committees (ECs), empirical and factual knowledge about how they should function is lacking in the country: information on how ECs actually function, the problems and dilemmas they face, and shared experience is not available in the public domain (Bhan et al. 2010).

Nevertheless, some steps have been taken in the right direction and deserve special mention. A course in self-awareness by Prof. D. Menon was introduced at the Indian Institute of Technology Madras to help students understand and explore self-awareness with a foundation in “traditional Indian wisdom and modern approaches, and thereby learn to find inspiration, take responsibility for one’s inner life, live with integrity and contribute creatively towards the well-being of all.” A course and textbook for a foundation course in “Human values and professional ethics” have also recently been introduced at numerous technical universities in India (see footnote 9). Many universities (e.g., the Maharashtra University of Health Sciences) have made research training compulsory for postgraduate students.

Where is the path ahead?

No regulatory body can hope to catch all research misconduct; the primary check must be at the institutional level. Patient care is threatened. New systems and, above all, education, awareness and enforcement are required to prevent misconduct (Luther 2010). Luther (2010) suggested some steps that can be taken to help prevent misconduct, for example:

“Digital forensics”: Covert image tampering can be detected by new techniques (Farid 2009). A cloning tool that cuts and pastes sections of an image to obliterate unwanted details is commonly used in tampering. Mathematical algorithms can detect pixel blocks that share similar spatial offsets, which may be invisible to the naked eye. Analyses can also distinguish variations in the distribution of pixel blocks, for example from a random to a more structured layout (Luther 2010).
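
As a rough illustration of the block-matching idea described above, the sketch below hashes small pixel blocks and flags pairs of identical blocks that recur at a consistent spatial offset, a telltale sign of cloning. It is a minimal, hypothetical example (the function name, sampling stride and threshold are arbitrary choices), not the actual forensic method of Farid (2009).

```python
import numpy as np
from collections import defaultdict

def find_cloned_regions(gray, block=8, min_matches=10):
    """Naive copy-move detection: identical blocks that repeat with a
    consistent spatial offset suggest a cloned image region."""
    h, w = gray.shape
    seen = defaultdict(list)                   # block contents -> positions
    for y in range(0, h - block + 1, 4):
        for x in range(0, w - block + 1, 4):
            seen[gray[y:y + block, x:x + block].tobytes()].append((y, x))

    offsets = defaultdict(int)                 # (dy, dx) -> matching block pairs
    for positions in seen.values():
        for (y1, x1) in positions:
            for (y2, x2) in positions:
                if (y2 - y1, x2 - x1) > (0, 0):  # count each pair once
                    offsets[(y2 - y1, x2 - x1)] += 1

    # Offsets shared by many block pairs are suspicious.
    return {off: n for off, n in offsets.items() if n >= min_matches}

# Hypothetical usage: clone a patch inside a synthetic image, then detect it.
img = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
img[32:56, 32:56] = img[0:24, 0:24]            # simulate cloning a 24x24 region
print(find_cloned_regions(img))                # expect a strong count at offset (32, 32)
```

Real forensic tools have to cope with compression and retouching, so they match blocks on robust features rather than exact pixel values, but the underlying idea of clustering block pairs by their offset is the same.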

More steps to tackle misconduct

  • If journals use contributor statements defining what constitutes “authorship” (the roles, responsibilities, and level of contribution that must be achieved to qualify as an “author”), the problem of token authorship can be curbed (Luther 2010).

  • Co-authors have a major role: they may detect fraudulent data that peer review cannot, and they can help ensure that study methods are presented honestly (Luther 2010).

  • To reduce problems associated with authors publishing the same or similar data in different papers, or “salami slicing” (cutting bigger works into smaller ones), editors could ask to see all of the authors’ recent or related papers, both published and under consideration (Luther 2010).

  • Journals should use software to help detect plagiarism (Luther 2010); a sketch of how such screening works is shown after this list.

  • Journals could introduce image manipulation limits to which authors must confirm they have adhered (Luther 2010).

  • Heightened awareness of other ‘game-playing’ (e.g., manipulating clinically relevant differences in sample size calculations to make research easier to perform) should be encouraged (Luther 2010).

  • Institutions should ensure that they have processes which are transparent, unbiased, and allow for open investigation (Luther 2010).

  • Professional societies should guide their members, and those members should alert journal editors to fraud if they see it (Luther 2010).
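
As a rough sketch of what the plagiarism-screening software mentioned above does, the example below scores the overlap of word n-grams between a submission and a previously published passage using Jaccard similarity. Production services compare manuscripts against large databases of published literature; the function names, the threshold and the sample texts here are purely illustrative assumptions.

```python
def ngrams(text, n=5):
    """Set of overlapping, lower-cased word n-grams in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(doc_a, doc_b, n=5):
    """Jaccard similarity of the two documents' n-gram sets (0 to 1)."""
    a, b = ngrams(doc_a, n), ngrams(doc_b, n)
    return len(a & b) / len(a | b) if (a or b) else 0.0

# Hypothetical screening step: flag heavy reuse of 5-word phrases.
submission = "the deliberate omission or distortion of undesired data or results can mislead readers"
published = "falsification is the deliberate omission or distortion of undesired data or results"
if similarity(submission, published) > 0.3:    # arbitrary editorial threshold
    print("High textual overlap - manual review recommended")
```

A score above the chosen threshold does not prove plagiarism; it only tells an editor which pairs of texts merit a human comparison.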

Titus et al. (2008) recommended six strategies to champion integrity:

  1. Implement zero tolerance: To create a zero-tolerance culture, an institution should specify and implement the requirement that all suspected misconduct be reported and that all allegations be thoroughly and fairly investigated (Titus et al. 2008).

  2. Protect whistle-blowers: Measures to protect whistle-blowers should be created and disseminated carefully (Titus et al. 2008).

  3. Coach the mentors: Mentors can play a major role in establishing and maintaining research rules and minimizing opportunities to commit research misconduct. An institutional effort to build better mentors is an important vehicle for upholding research integrity (Titus et al. 2008).

  4. Simplify how to report: A reporting system that clearly identifies the individuals to whom allegations should be brought should be established, along with clear policies, procedures and guidelines on misconduct and responsible conduct (Titus et al. 2008).

  5. Exploit alternative mechanisms: Universities need continuing mechanisms to review and evaluate their research and training environments; one such means is the auditing of research records. Such review mechanisms can reduce deficient record keeping, improper protection of human or animal subjects, and questionable research behavior (Titus et al. 2008).

  6. Model ethical behavior: The behavior of powerful role models is imitated. Institutions successfully stop cheating when they have leaders who communicate what is acceptable behavior, encourage staff and faculty members to follow policies, focus on ways to develop and promote ethical behavior, develop fair and appropriate procedures for handling misconduct cases, and provide clearly communicated deterrents (Titus et al. 2008).

Smith (2008) suggested robust peer review and training as key defences for research. Robust peer review, by reviewers with considerable expertise and knowledge in the area of research they are reviewing, remains one of the most effective means of assessing research for publication (Smith 2008). Researchers should also receive adequate training in research integrity: formal training programs for graduate students and young researchers should be adopted, and even some experienced researchers may benefit from such training (Smith 2008). Faggion Jr. (2011) suggested that publishing the original (“raw”) data used by authors in preparing a manuscript is an effective means of deterring or reducing scientific misconduct, since the raw data would allow interested readers and other research groups to reproduce or verify the analyses used in an article (Faggion 2011). The trial protocol should also be registered in a public clinical trial registry so that deviations between the registered methodology and the final published paper can be monitored (Faggion 2011).

One method could be the creation of a Retraction Index, indicating the number of retractions a journal has for every 1,000 papers published, thus playing the role of a watchdog (see footnote 9). Marcus and Oransky suggested creating a Transparency Index, scoring how well a journal controls its manuscript review process, including how it conducts peer review, whether supporting data are also reviewed, its use of plagiarism-detection software, and a number of other measures (see footnote 9). Science Exchange and the journal PLOS ONE suggested the Reproducibility Initiative, which allows researchers to submit their studies for replication by other labs for a fee; a “successfully reproduced study will win a certificate of reproducibility” (see footnote 9).
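
The proposed Retraction Index amounts to a simple rate calculation; the sketch below is an illustrative implementation of that idea, with an invented function name and invented journal figures.

```python
def retraction_index(retractions, papers_published):
    """Retractions per 1,000 papers published (the proposed Retraction Index)."""
    if papers_published <= 0:
        raise ValueError("journal has published no papers")
    return 1000 * retractions / papers_published

# Invented figures for two hypothetical journals:
print(retraction_index(4, 8000))   # 0.5 retractions per 1,000 papers
print(retraction_index(3, 1500))   # 2.0 -- a higher rate despite fewer retractions
```

Normalising by output is what lets the index act as a watchdog: a small journal with a handful of retractions can score far worse than a large journal with more retractions in absolute terms.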

FangFootnote 10 recommends better funding as a means to prevent misconduct. Science today is inadequately supported, resulting in heightened competition for limited dollars. Adequate resources would not only reduce incentives for misconduct but also improve the lives of all scientists, allowing them to spend more of their time searching for answers to research questions rather than for funds (see footnote 10).

RadhakrishnanFootnote 11 proposed four approaches to reduce misconduct:

  1. Funding for all ages: Divide funds into three groups according to career stage, so that researchers compete for funds against others with similar experience levels (see footnote 11).

  2. Third-party data verification: Involving an independent agency to verify data during the preliminary stages of a project could help generate stronger manuscripts, grant applications and clinical trials while minimizing the occurrence of research misconduct (see footnote 11).

  3. Strong postdoctoral forums: Stronger institutional postdoc associations can increase overall awareness and provide additional support within the institution (see footnote 11).

  4. Objective manuscript review: Revealing the names of reviewers, or blinding the authors’ names, can increase objectivity in scientific publishing and encourage constructive criticism, thereby increasing the quality of work and reducing research misconduct (see footnote 11).

Scientific misconduct affects authors, reviewers and editors, but the worst sufferer is the patient. The consequences of misconduct are the same, and no less serious, whether it is committed intentionally or through ignorance (Singh and Purohit 2011). Ultimately, however, journals alone cannot stamp out scientific misconduct; these issues are everyone’s responsibility. Everyone should know the professional etiquette of good and bad practice in evidence-based clinical research, so that we may all see bad practice for what it is (Luther 2010). Individuals and institutions, not government agencies, are the guardians of research (Titus et al. 2008). The credibility of research in the public’s eyes demands that it be conducted to the highest standards of integrity (Smith 2008).