This is with reference to the endemic problem of research misconduct even in notable universities, and is in response to Shubhada Nagarkar’s ‘Research paper mills’: A factory outlet for dubious research [1]. Despite such articles routinely appearing in prestigious journals, the problem seems unfixable because of the structure of the modern university, with its fixation on research metrics rather than, say, social mobility, student life and exposure, or the mitigation of social disharmony, which were part of the original mandate of the idea of the university. Instead, the everyday life of the university today seems built around the annual cycle of rankings and the rise of a primarily bureaucratic leadership fixated on numbers alone. This almost necessitates the entrenchment of overwhelmingly one-sided research metrics. Thus, one constantly hears stories of purchased co-authorship, of invitations to “lend one’s name” for a nominal fee or even for free, since, given the pressure to publish, shared authorship from a different country greatly increases a paper’s value. The system rewards citations and international collaborations, so nothing stops a bad actor from lending her name to a foreign paper she had no role in, in return for being allowed to name a foreign co-author in her own work. As there is no reliable way of proving who did what in a multi-authored paper, the gratuitous appearance of foreign affiliations is used to list the paper as an important international collaboration. It is thus in the interest of all parties to “play the game”. To add insult to injury, the labour of the single-author paper is particularly looked down upon. So now, with numerous authors spread across several countries, it is almost impossible to apportion blame or responsibility appropriately.
It is astonishing that software programmes can generate “papers” based on keywords [2] and that these actually get published; one has to wonder at the quality of review. But if reviews often depend on software systems to screen for plagiarism, then it is a battle of software against software, of bullet and shield, and the human is simply marginal or too busy. None of this is surprising when, in many university contexts, honest researchers are stunned to read the names of supposedly highly published and highly cited scholars on their university honours list. Many openly boast of gaming the system, knowing not only that there will be no proof but, more importantly, that the university supports such activity. If the researcher is caught, as rarely happens, the university will wash its hands of her. She may be a scapegoat, but this is a risk both parties are willing to take. And the price of all this is very high: it is health science papers and research that are the most abused [3].
The blind faith in these metrics needs to be curbed by a more empowered human presence and committees, and by the interrogation of “star researchers”. At least some of these are known within their own departments to encourage possible misconduct. Yet few researchers have the time or inclination to spend the many hours needed to expose a colleague. Further, exposing a colleague (at her own or another university) today requires considerable IT skill, because the algorithms that drive the platforms of the billion-dollar publishing industry are kept proprietary and opaque. There is also no consensus on what the penalty ought to be; if it is a high-performing researcher, the university is happy to forgive her with a slap on the wrist. Faith in the system is difficult to sustain when everything from appointment to promotion is determined algorithmically. Indian universities, under financial pressure as well as the pressure of rankings, are more likely to depend on these opaque metrics (and look the other way until an individual is publicly exposed) than universities in wealthier countries, which still appoint integrity officers. The latter understand that ill-understood yet overused metrics, such as the much-touted h-index, have only poor predictive power and cannot serve serious research with longer gestational periods [4]. It is salutary to remember that the ultimate cost of hurried research is borne by students, who face distracted professors unable to develop the India-relevant research that is the aim of the National Education Policy [5].
Author: Nikhil Govind (nikhilgovind@hotmail.com), Professor, Manipal Centre for Humanities, Manipal Academy of Higher Education, Manipal 576 104, Karnataka, INDIA.
To cite: Govind N. Is research misconduct becoming unstoppable? Indian J Med Ethics. Published online first on February 15, 2025. DOI: 10.20529/IJME.2025.012
Manuscript Editor: Sandhya Srinivasan
Copyright and license
©Indian Journal of Medical Ethics 2025: Open Access and distributed under the Creative Commons license (CC BY-NC-ND 4.0), which permits only noncommercial and non-modified sharing in any medium, provided the original author(s) and source are credited.