Indian Journal of Medical Ethics

COMMENTARY


Weak campaign against predatory journals: structural issues

Alain Braillon, Constant Vinatier, Florian Naudet

Published online first on August 7, 2025. DOI:10.20529/IJME.2025.061

Abstract

The International Committee of Medical Journal Editors (ICMJE) is an influential group of general medical journal editors, some from the five leading publishing houses that control more than 50% of the market, together with representatives of selected related organisations. Working together to improve the quality of medical science and its reporting, they have rightly acknowledged the threat posed by predatory journals. However, we argue that the ICMJE has overlooked deeper structural issues, particularly the publishing industry’s own economic interests, thus hindering efforts to address this problem. It has also overlooked numerous initiatives aimed at systemic change, eg, the 2012 Declaration on Research Assessment, Transparency and Openness Promotion, and the Coalition for Advancing Research Assessment. Notably, models such as Diamond Open Access, which have the potential to address the issue of predatory journals while also challenging traditional publishers’ interests, are not mentioned. By not addressing these concerns, the ICMJE risks making its crusade against predatory journals look like cartelisation.

Keywords: scientific integrity, processing charges, peer-review, open access, cartelisation, for-profit


The International Committee of Medical Journal Editors (ICMJE), a small group of general medical journal editors and representatives of selected related organisations working together to improve the quality of medical science and its reporting, significantly advanced the field of clinical trials in 2005 by mandating prospective registration of clinical trials [1]. However, it demonstrated a less ambitious stance in 2017 when, under pressure, it required only data-sharing statements instead of enforcing actual data sharing, which the ICMJE itself had proposed as an “ethical obligation” in 2016 [2]. In its latest statement, published in a dozen journals [3], the ICMJE rightly acknowledged the danger of predatory journals, describing them as wolves in sheep’s clothing. However, it failed to consider that legitimate entities can sometimes be harmful and may facilitate the activities of such predators. By overlooking this diagnosis, we are afraid, the ICMJE misses critical elements of the solution.

What’s in a name? (Romeo and Juliet, Act 2, Scene 2, the “balcony scene”)

Laine et al used the term “predatory” to target “entities (that) misrepresent themselves as scholarly journals for financial gain despite not meeting scholarly publishing standards” [3]. This definition aligns with the 2019 consensus definition: “predatory journals prioritize self-interest, often financial, over scholarship” [4]. Such journals provide false information about their identity (eg, fake impact factors, misrepresented editorial boards), deviate from best editorial and publication practices, lack transparency in their operations (eg, editorial decisions, fees, peer review processes), and aggressively solicit authors.

Certainly, the ICMJE should be commended for taking this issue seriously; those who are eager to gain from someone else’s weakness are a concern. But it is a dark side of our societies that the accumulation of wealth is frequently prioritised over other interests. Let us first examine how so-called “legitimate” journals fare against the various aspects of this definition.

Regarding prioritisation of self-interest over scholarship, the academic publishing industry has a large financial turnover. Its worldwide sales amount to more than USD 19 billion, positioning it between the music industry and the film industry [5]. More than 50% of the market is controlled by five large publishing houses: Elsevier, Wiley-Blackwell, Taylor & Francis, Springer Nature and Sage. Elsevier is the largest (approximately 16% of the total market and more than 3,000 academic journals); its profit margin approaches 40%, higher than those of Microsoft, Google and Coca-Cola, whereas profit for traditional newspapers tends to be in the 10–15% range [5]. Comparison of financial statistics on investment websites such as investing.com is indicative of this profitability. Certainly, newspapers are costly: wages for journalists, editors and graphic artists, plus expenses for fact-checking, printing and distribution, are paid for through sales and advertising. In academic journals, by contrast: a) the production of genuine content is paid for by research funds; b) the editorial board is almost always unpaid, with, at most, a symbolic payment for the editor-in-chief; and c) the control of product quality is done through “peer review”, which is unpaid voluntary work. With new technologies, dissemination is no longer a significant expense. Too few journals offer reviewers a year’s online subscription, as the BMJ does.

It cannot be a surprise, then, that new actors exploit the loopholes and flaws of the system, with good or bad intentions. Simply put, the commercialisation of academic publishing has existed for a long time. Sometimes, the government or sponsors fund all stages of research production, but must then pay again for the results to be published Open Access through article processing charges (APCs) and/or to have access to them. For example, in 2020, the French National Institute of Health (Inserm) paid €2,741,287 to provide researchers at the institution with access to major medical journals, while also paying an additional two-thirds of that amount (€1,833,205) for APCs [6]. Three of the five largest publishers were among the top recipients of APCs (Springer Nature, Elsevier, and Wiley-Blackwell) [7]. The two other top recipients (Frontiers and MDPI) are severely criticised for their editorial policy of relying heavily on “special issues” with guest editors and reduced turnaround times, resulting in substantial growth in the number of articles published annually [8].

The concern about predatory journals highlights the question: Why does the scientific publishing industry, despite its significant value, face such challenges in maintaining basic quality standards and allow such issues to arise? Companies in high-risk sectors like aviation maintain strict quality controls because a single failure can be catastrophic. A single editorial mistake can be equally catastrophic, as when the Lancet published Wakefield’s fraudulent paper linking the MMR (measles, mumps, and rubella) vaccine to autism, causing immense damage to public health [9]. Yet there is simply no governing body that ensures quality and legitimacy in publishing. Organisations such as the World Association of Medical Editors (WAME), the Committee on Publication Ethics (COPE), the ICMJE and the Council of Science Editors (CSE) provide guidance but do not fund, develop or implement quality-assurance programmes or tools, such as those developed by Cabanac and Labbé [10], to improve the system. In our opinion, medical journals must fund and support such efforts. Even when faced with clear post-publication evidence of gross errors, legitimate journals often fail to correct their records [11]. Implementing such initiatives could help avoid the lengthy and difficult process of retracting papers with integrity concerns, as often encountered by Bik [12].

Lack of transparency affects more than just predatory journals. When Bucci et al flagged potentially duplicated data in the Sputnik V Covid-19 vaccine phase 1/2 publication and requested individual patient data [13], their request was denied. The Lancet refused to comment on its data policy for clinical trials and stated it would “continue to follow the situation closely” [14]. When the Lancet made this declaration, was it referring to its subsequent publication of the Sputnik V phase III trial, a study that would again be marred by inconsistencies and excessive homogeneity in the results [15, 16]? While aviation has procedures for dealing with severe failures, medical journals handle these issues through editors, often keeping crucial information opaque. To what extent do editors, such as the editor of Science, advocate for addressing questions of integrity publicly [17]?

Of course, predatory journals are unethical and misrepresent their credentials. However, it can be argued that more reputable entities use flawed metrics like impact factors to attract scholars. Despite numerous calls for its abandonment, such as the 2012 Declaration on Research Assessment (DORA), drawn up at the annual meeting of the American Society for Cell Biology [18], the impact factor is all too often treated as a proxy for quality and used as a marketing tool by journal editors and publishers.

Are authors always “prey”?

The ICMJE asked, “what can we do to protect their (predatory journals’) prey?” [3]. Claiming that “in some situations, authors under pressure to publish may knowingly choose to publish in suspect journals to build a long list of publications to support academic promotion” is, at best, an ostrich policy [3]. It is commonly believed that predatory publishing predominantly affects researchers in low- and middle-income countries, who are pressured by an inequitable publication system with prohibitive fees. However, a thorough review of nearly 2,000 biomedical articles from suspected predatory journals revealed that over half of the corresponding authors were from high- and upper-middle-income countries [19]. In these articles, the US National Institutes of Health (NIH) was identified as the most frequent funder, and even prestigious institutions, such as Harvard University and the University of Texas, had articles published in these journals [19]. This is not without consequences. Some predatory journals have entered biomedical databases through public funding, despite not being officially indexed in Medline. One example is OMICS articles accessible via PubMed through PubMed Central (PMC), facilitated by the NIH’s open access policy, which mandates that researchers submit final peer-reviewed manuscripts to PMC upon acceptance [20]. How could a group of authors, each with at least one doctoral degree and led by a senior researcher, be fooled by pseudo-journals with awkward-sounding titles, large editorial boards composed of relatively unknown individuals, and high APCs? Predatory journals generally have distinct traits that set them apart from legitimate journals [21].

Can’t the ICMJE call a spade a spade? The victim is often the culprit, and their institutions or sponsors are accomplices. The seemingly conscious inertia of the latter is a very powerful force and reveals an active resistance to change. Institutions bear responsibility, as they often pay APCs; instead, they should refuse to fund questionable journals. Additionally, several institutions, like French hospitals, are financially incentivised to publish, leading them to pay fees even to dubious journals because of the potential return on investment [22]. Alternative frameworks for assessing research, like the Hong Kong Principles, aim to reward research quality over quantity, but their adoption by institutions is slow [23]. Medical scientists are still assessed mostly on productivity metrics, which can lead to publishing in questionable journals. Even dubious journals manage to get an impact factor, reinforcing their fit with the current criteria of research evaluation. For instance, with more than 9,500 papers published in 2020 and 17,000 in 2022, the International Journal of Environmental Research and Public Health had an impact factor of 4.6 before being removed from Web of Science by Clarivate on February 13, 2023, for publishing content outside its scope. Determining the line of demarcation can be challenging, particularly when committee members responsible for evaluation also have extensive curricula vitae, including, sometimes, publications in the very same problematic journals [24]. Additionally, there are significant instances where researchers assume control of apparently legitimate journals through their positions on editorial boards in order to publish their own research or that of their close associates. An example of this is the control of Professor Didier Raoult and his team over the International Journal of Antimicrobial Agents [24]. This case is not unique and underscores a broader concern with nepotistic behaviour in biomedical journals [25].

What can be done?

While systemic changes are undoubtedly necessary, the ICMJE’s position paper did not recommend extensive concrete actions to address the issue. Its proposals are primarily limited to alerting authors to the existence of predatory journals. Additionally, there is some self-victimisation when the ICMJE suggests that legitimate journals and publishers may face “unfounded accusations of improper behavior” [3], because predatory journals that profit off the Open Access model “make some academics and their institutions wary of legitimate open access, author-pays journals”. But could the firefighter himself be the arsonist? The ICMJE should have reflected on whether “reputable” publishers could have been responsible for strategies that nurtured the present state of affairs, eg, i) requiring article processing charges and subscription rates that are hardly affordable; ii) cascading rejected pieces to their ever-growing number of sister journals; iii) publishing sponsored issues (for instance, when Nature published an Outlook series on psychedelic medicine sponsored by Atai Life Sciences, a biotechnology company) [26]. Even aggressive email solicitation of authors is no longer limited to predatory entities. Springer Nature has been known to approach authors of papers in its journals with AI-generated “Media Kits” to summarise and promote their research, claiming its AI tool will “maximize the impact” of their work and that the $49 package will return “high-quality” outputs for marketing and communication purposes [27].

Alternatively, the ICMJE could have suggested measures for reputable scientific journals to uphold their responsibility as guardians of the integrity of the scientific publication process. Ironically, journals sometimes appear prone to focus more on correcting typographical errors or missing punctuation in pre-proof papers than on issues of integrity and potential misconduct. Is form ultimately more important than substance? Why is COPE toothless when dealing with the inertia of journals in publishing corrections regarding undisclosed conflicts of interest? Why does the journal Science remain a solitary beacon, with a dedicated channel for concerns about scientific integrity at science_data@aaas.org? Why do top-ranked academic journals pose serious barriers to post-publication peer review [28]? Indeed, few “reputable” journals nurture debate, the cornerstone of science, as well as the BMJ does, with its convenient Rapid Response section online and space for it in the print issue. Is it acceptable that the Annals of Internal Medicine requires, before accepting a Rapid Response, that “the reader is a subscriber/member, the reader is accessing the article via an institutional subscription, or the person purchased pay per view access” [29]? Is it acceptable that the Lancet refused to publish a 171-word Letter to the Editor from one of us (AB) questioning why the term “overdiagnosis” was absent from the 55-page review (with 494 references) of the Lancet Breast Cancer Commission, while the word “screening” appeared 55 times [30]? And this, when about 1 in 7 screen-detected cancers among biennially screened women aged 50 to 74 years is overdiagnosed, a figure that rises to about 1 in 3 among women aged 70 to 74 years [31, 32].

The ICMJE must also suggest further steps to improve quality control of medical journals. It could implement a certification system for journals adhering to the ICMJE guidelines, rather than maintaining a list of “journals stating that they follow the ICMJE Recommendations” on its website without verifying the completeness or accuracy of this list. The ICMJE hosts on its website many possibly predatory journals [33, 34], potentially aiding their deceptive claims. Instead, the ICMJE has put the burden of verification on authors, suggesting that they use a checklist of features from WAME to identify trusted journals and publishers or rely on the “Think. Check. Submit.” initiative. Furthermore, when the ICMJE claimed that the creation of a list of “predatory” journals is “infeasible” [3], it ignored Jeffrey Beall’s initiative and newer ones [35]. We believe that an assessment of journals could readily be achieved. A grading similar to Nutri-Score1, which evaluates food products, could be used, with an option for appeal whereby a journal is given a chance to defend itself before it is listed (a possible rubric is sketched below). The ranking should not use terms like “legitimate” or “predatory”, but rather “fraudulent”, “bad quality”, “poor quality”, “fair quality” and “good quality”. Sting operations could be used to provide simple and reliable evidence [36].
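
To make the proposal concrete, the following minimal sketch (in Python) shows how such a Nutri-Score-like rubric might work; every criterion, weight and threshold here is a hypothetical assumption chosen for illustration, not an ICMJE, WAME or Nutri-Score specification.

    # Hypothetical sketch only: criteria, weights and grade thresholds are invented
    # for illustration; they are not an ICMJE, WAME or Nutri-Score standard.
    CRITERIA = {
        "verifiable_editorial_board": 2,   # listed members confirm their role
        "transparent_fees": 2,             # APCs and waivers publicly stated
        "documented_peer_review": 3,       # review dates/reports can be audited
        "accurate_indexing_claims": 2,     # impact factor and indexing claims check out
        "corrections_policy_enforced": 1,  # errata and retractions actually issued
    }

    GRADES = [(9, "good quality"), (7, "fair quality"), (5, "poor quality"),
              (3, "bad quality"), (0, "fraudulent")]

    def grade_journal(audit):
        """Map an audit (criterion -> passed?) to a Nutri-Score-like label."""
        score = sum(weight for name, weight in CRITERIA.items() if audit.get(name))
        for threshold, label in GRADES:
            if score >= threshold:
                return label
        return "fraudulent"

    # Example: opaque fees and an unverifiable board, but auditable peer review.
    print(grade_journal({"documented_peer_review": True,
                         "corrections_policy_enforced": True}))  # "bad quality"

An appeal, as suggested above, would amount to re-running the audit once a journal supplies the missing evidence, and sting operations [36] could feed the peer-review criterion.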

Comprehensive initiatives already exist to monitor journal policies, such as the Transparency and Openness Promotion (TOP) factor [37], a metric that evaluates the steps a journal takes to implement open science practices (https://topfactor.org). It can help researchers find an ethical and transparent journal. Too few medical journals follow the example of the Journal of Clinical Epidemiology, which championed the TOP factor. Despite the global enthusiasm for open science, this metric may highlight the unwillingness of many journals to embed open science solutions in their policies for a more transparent scientific process. Building an observatory of such practices is possible [38], and would be even easier if journals adopted common standards and metadata. At the article level, there is some consensus on core open science practices that can serve as key indicators to monitor journal performance in implementing policy [39]; a simple way of aggregating such indicators is sketched below. The ICMJE could have called for standardisation and showcased examples such as the PLOS Open Science Indicators [40]. Careful implementation of those indicators, however, needs to take into account Goodhart’s and Campbell’s laws: a) “As soon as an indicator becomes an objective, it ceases to be a good objective” [41]; and b) any indicator is subject to manipulation, to faking the numbers, or to working only to improve the measure. The ICMJE should support meta-research efforts that include robust evaluation to enhance the quality of published literature.
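
As a minimal sketch of the standardised, article-level monitoring alluded to above, the snippet below aggregates hypothetical open science flags into per-indicator compliance rates for a journal; the indicator names and records are assumptions for illustration and do not reproduce the actual TOP factor or PLOS Open Science Indicators methodology.

    from statistics import mean

    # Hypothetical illustration: indicator names and records are invented; this is
    # not the actual TOP factor or PLOS Open Science Indicators computation.
    INDICATORS = ("data_shared", "code_shared", "protocol_registered")

    def journal_openness(articles):
        """Share of a journal's articles meeting each open science indicator."""
        return {ind: mean(1.0 if article.get(ind) else 0.0 for article in articles)
                for ind in INDICATORS}

    sample = [
        {"data_shared": True, "code_shared": False, "protocol_registered": True},
        {"data_shared": False, "code_shared": False, "protocol_registered": False},
    ]
    print(journal_openness(sample))
    # {'data_shared': 0.5, 'code_shared': 0.0, 'protocol_registered': 0.5}

Publishing such article-level data openly, rather than only a headline score, would also temper the Goodhart/Campbell risk that journals optimise the number instead of the practice.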

Beyond quantitative indicators, the ICMJE appropriately highlighted, in alignment with DORA, that “academic promotion committees should consider not only the quantity but also the quality of publications and the journals in which they appear”. However, this call has been left blowing in the wind in the absence of concrete action towards change. Ten years after DORA, in 2022, a bottom-up initiative, the Coalition for Advancing Research Assessment (CoARA) [42], confirmed a craving for change: 700 signatories, including research organisations, funders, assessment authorities and professional societies, committed to improving research practices, with quality and reproducibility as core preoccupations. But change is slow, and the first results are expected only in 2027. CoARA is not even mentioned in the ICMJE statement. Furthermore, the ICMJE’s current practice of listing journals that self-declare adherence to the ICMJE Recommendations is exploited by predatory journals. On February 1, 2020, there were 4,892 journals listed on the ICMJE website, of which nearly 40% were on Jeffrey Beall’s list of alleged predatory journals. The number of journals listed had increased to 8,808 by January 8, 2025, an 80% increase over 2020. Granting an ICMJE label when there is no funding for quality control and certification systems is naïve, at best [43].

What’s more, the ICMJE made no mention of the most effective measure against predators: hitting them in the wallet. The ICMJE statement did not mention the diamond open access model, such as Open Research Europe, set up by the European Commission as a funding body and operated via F1000 (part of the publisher Taylor & Francis), or non-profit initiatives like Peer Community In [44], which provides free peer review, recommendation, and publication of open access scientific articles. We are afraid that traditional journals and publishers could oppose such new initiatives, as they long did with preprints on the model of the Ingelfinger Rule [45, 46], designed to prevent authors from engaging in duplicate publication. By prohibiting preprints, however, that rule also served to maintain the journals’ exclusivity over publications and their revenue stream. We are afraid that a for-profit industry is more concerned with shaping policies to its own economic advantage than with improving quality, and will shy away from any drastic measure against predatory journals that would also undermine a legitimate publisher’s business model.

Without taking these issues into account, we fear that the ICMJE’s crusade against predatory journals could look like cartelisation. Several ICMJE members are indeed editors from the leading publishing houses. The most important thing we, as users of the system, can do is to be aware of these realities and treat publishing houses, journals and academic articles with healthy skepticism [47]. Having said this, we all remain trapped in the system that we criticise.

1 Note: Nutri-Score summarises the overall nutritional value of food products, assigning a rating letter from A (best) to E (worst), with associated colours from green to red, based on an algorithm that accounts for energy value, ingredients that should be limited in the diet (ie, saturated fatty acids, sugar and salt) and beneficial ingredients (ie, vegetables).


Authors: Alain Braillon (corresponding author — braillon.alain@gmail.com, https://orcid.org/0000-0001-5735-9530), Independent Researcher; Ex-chief, Alcohol Treatment Unit, University Hospital of Amiens, FRANCE; Constant Vinatier (constant.vinatier@etudiant.univ-rennes.fr, https://orcid.org/0000-0002-6899-1838) PhD Student, Univ Rennes, CHU Rennes, Inserm, EHESP, IRSET (Institut de Recherche en Santé, Environnement et Travail), UMR_S 1085, Rennes, FRANCE; Florian Naudet (floriannaudet@gmail.com, https://orcid.org/0000-0003-3760-3801), Professor of Therapeutics, Department of Psychiatry, University Hospital of Rennes, Rennes, FRANCE.

Conflict of Interest: FN received funding from the French National Research Agency, the French ministry of health and the French ministry of research. He is a work package leader in the OSIRIS project (Open Science to Increase Reproducibility in Science). FN is also work package leader for the doctoral network MSCA-DN SHARE-CTD (HORIZON-MSCA-2022-DN-01 101120360), funded by the EU.

To cite: Braillon A, Vinatier C, Naudet F. Weak campaign against predatory journals: structural issues. Indian J Med Ethics. Published online first on August 7, 2025. DOI: 10.20529/IJME.2025.061

Submission received: February 15, 2025

Submission accepted: April 7, 2025

Manuscript Editor: Sanjay A Pai

Copyright and license

©Indian Journal of Medical Ethics 2025: Open Access and Distributed under the Creative Commons license (CC BY-NC-ND 4.0), which permits only noncommercial and non-modified sharing in any medium, provided the original author(s) and source are credited.


References

  1. De Angelis C, Drazen JM, Frizelle FA, Haug C, Hoey J, Horton R, et al. Clinical Trial Registration: A Statement from the International Committee of Medical Journal Editors. N Engl J Med. 2004;351(12):1250–1. https://doi.org/10.1056/NEJMe048225
  2. Naudet F, Siebert M, Pellen C, Gaba J, Axfors C, Cristea I, et al. Medical journal requirements for clinical trial data sharing: Ripe for improvement. PLoS Med. 2021;18(10):e1003844. https://doi.org/10.1371/journal.pmed.1003844
  3. Laine C, Babski D, Bachelet VC, Bärnighausen TW, Baethge C, Bibbins-Domingo K, et al. Predatory journals: what can we do to protect their prey? Tunis Med. 2025 Jan 5 [Cited 2025 Feb 27];103(1):1-3. Available from: https://latunisiemedicale.com/index.php/tunismed/article/view/5666
  4. Grudniewicz A, Moher D, Cobey KD, Bryson GL, Cukier S, Allen K, et al. Predatory journals: no definition, no defence. Nature. 2019;576(7786):210-212. https://doi.org/10.1038/d41586-019-03759-y
  5. Hagve M. The money behind academic publishing. Tidsskr Nor Laegeforen. 2020;140(11). https://doi.org/10.4045/tidsskr.20.0118
  6. Anonymous. Financement de l’open access [Funding open access]. Inserm pro. 2022 Feb 8 [Cited 2025 Feb 27]. French. Available from: https://pro.inserm.fr/science-ouverte/financement-de-lopen-access
  7. Open Knowledge Foundation Germany. OpenAPC. [Cited 2025 Feb 27]. Available from: https://treemaps.openapc.net/apcdata/openapc/publisher
  8. Hanson MA, Barreiro PG, Crosetto P, Brockington D. The strain on scientific publishing. Quant Sci Stud. 2024;5 (4):823–843. https://doi.org/10.1162/qss_a_00327
  9. Godlee F, Smith J, Marcovitch H. Wakefield’s article linking MMR vaccine and autism was fraudulent. BMJ. 2011 Jan 5;342:c7452. https://doi.org/10.1136/bmj.c7452
  10. Cabanac G, Labbé C. Prevalence of nonsensical algorithmically generated papers in the scientific literature. J Assoc Inf Sci Technol. 2021;72(12):1461–1476
  11. Besançon L, Bik E, Heathers J, Meyerowitz-Katz G. Correction of scientific literature: Too little, too late! PLoS Biol. 2022 Mar 3;20(3):e3001572. https://doi.org/10.1371/journal.pbio.3001572
  12. Retraction Watch. Cancer paper retracted 11 years after reported plagiarism. 2024 Jul 31 [Cited 2025 Feb 27]. Available from: https://retractionwatch.com/2024/07/31/cancer-paper-retracted-11-years-after-reported-plagiarism
  13. Bucci E, Andreev K, Björkman A, Calogero RA, Carafoli E, Carninci P, et al. Safety and efficacy of the Russian COVID-19 vaccine: more information needed. Lancet. 2020;396(10256):e53. https://doi.org/10.1016/S0140-6736(20)31960-7
  14. Abbott A. Researchers highlight ‘questionable’ data in Russian coronavirus vaccine trial results. Nature. 2020 Sep;585(7826):493. https://doi.org/10.1038/d41586-020-02619-4
  15. Bucci EM, Berkhof J, Gillibert A, Gopalakrishna G, Calogero RA, Bouter LM, et al. Data discrepancies and substandard reporting of interim data of Sputnik V phase 3 trial. Lancet. 2021;397(10288):1881-1883. https://doi.org/10.1016/S0140-6736(21)00899-0
  16. Sheldrick KA, Meyerowitz-Katz G, Tucker-Kellogg G. Plausibility of Claimed Covid-19 Vaccine Efficacies by Age: A Simulation Study. Am J Ther. 2022;29(5):e495-e499. https://doi.org/10.1097/MJT.0000000000001528
  17. Thorp HH, Phelan M. Breaking the silence. Science. 2025;387(6735):701. https://doi.org/10.1126/science.adw5838
  18. The Declaration on Research Assessment (DORA). What is DORA? [Cited 2025 Feb 27]. Available from: https://sfdora.org
  19. Moher D, Shamseer L, Cobey KD, Lalu MM, Galipeau J, Avey MT, et al. Stop this waste of people, animals and money. Nature. 2017;549(7670):23-25. https://doi.org/10.1038/549023a
  20. Manca A, Cugusi L, Cortegiani A, Ingoglia G, Moher D, Deriu F. Predatory journals enter biomedical databases through public funding. BMJ. 2020;371:m4265. https://doi.org/10.1136/bmj.m4265
  21. Shamseer L, Moher D, Maduekwe O, Turner L, Barbour V, Burch R, et al. Potential predatory and legitimate biomedical journals: can you tell the difference? A cross-sectional comparison. BMC Med. 2017;15(1):28. https://doi.org/10.1186/s12916-017-0785-9
  22. Ministry of Health, France. B02: Dotation socle de financement des activités de recherche, d’enseignement et d’innovation [Base funding allocation for research, teaching and innovation activities]. [Cited 2025 Feb 27]. Available from: https://sante.gouv.fr/IMG/pdf/fiche_mig_b02_dotation_socle_merri.pdf
  23. Moher D, Bouter L, Kleinert S, Glasziou P, Sham MH, Barbour V, Coriat AM, et al. The Hong Kong Principles for assessing researchers: Fostering research integrity. PLoS Biol. 2020;18(7):e3000737. https://doi.org/10.1371/journal.pbio.3000737
  24. Locher C, Moher D, Cristea IA, Naudet F. Publication by association: how the COVID-19 pandemic has shown relationships between authors and editorial board members in the field of infectious diseases. BMJ Evid Based Med. 2022;27(3):133-136. https://doi.org/10.1136/bmjebm-2021-111670
  25. Scanff A, Naudet F, Cristea IA, Moher D, Bishop DVM, Locher C. A survey of biomedical journals to detect editorial bias and nepotistic behavior [published correction appears in PLoS Biol. 2022;20(1):e3001525. https://doi.org/10.1371/journal.pbio.3001525]. PLoS Biol. 2021;19(11):e3001133. https://doi.org/10.1371/journal.pbio.3001133
  26. Naudet F, Fried EI, Cosgrove L, Turner E, Braillon A, Cristea IA. Psychedelic drugs: more emphasis on safety issues. Nature. 2022;611(7936):449. https://doi.org/10.1038/d41586-022-03680-x
  27. Harrison Dupré M. The Publisher of the Journal “Nature” Is Emailing Authors of Scientific Papers, Offering to Sell Them AI Summaries of Their Own Work. Futurism. 2025 Jan 9[Cited 2025 Feb 27]. Available from: https://futurism.com/springer-nature-ai-media-kit
  28. Hardwicke TE, Thibault RT, Kosie JE, Tzavella L, Bendixen T, Handcock SA, et al. Post-publication critique at top-ranked journals across scientific disciplines: a cross-sectional assessment of policies and practice. R Soc Open Sci. 2022;9(8):220139. https://doi.org/10.1098/rsos.220139
  29. Annals of Internal Medicine. Information for authors. 27 June 2024[Cited 2025 Feb 27]. Available from: https://www.acpjournals.org/pb-assets/pdf/AnnalsAuthorInfo-1719507466067.pdf
  30. Coles CE, Earl H, Anderson BO, et al. The Lancet Breast Cancer Commission. Lancet. 2024;403(10439):1895-1950. https://doi.org/10.1016/S0140-6736(24)00747-5
  31. Ryser MD, Lange J, Inoue LYT, O’Meara ES, Gard C, Miglioretti DL, et al. Estimation of Breast Cancer Overdiagnosis in a U.S. Breast Screening Cohort. Ann Intern Med. 2022;175(4):471-478. https://doi.org/10.7326/M21-3577
  32. Richman IB, Long JB, Soulos PR, Wang SY, Gross CP. Estimating Breast Cancer Overdiagnosis After Screening Mammography Among Older Women in the United States. Ann Intern Med. 2023;176(9):1172-1180. https://doi.org/10.7326/M23-0133
  33. Dal-Ré R, Marušić A. Potential predatory journals are colonizing the ICMJE recommendations list of followers. Neth J Med. 2019;77(2):92-96.
  34. Siebert M, Gaba JF, Caquelin L, Gouraud H, Dupuy A, Moher D et al. Data-sharing recommendations in biomedical journals and randomised controlled trials: an audit of journals following the ICMJE recommendations. BMJ Open. 2020;10(5):e038887. https://doi.org/10.1136/bmjopen-2020-038887
  35. Predatory Journals. [Cited 2025 Feb 27]. Available from: https://predatoryjournals.org/
  36. Bohannon J. Who’s afraid of peer review? Science. 2013;342:60-65. https://doi.org/10.1126/science.2013.342.6154.342_60
  37. Grant S, Corker KS, Mellor DT, Stewart SLK, Cashin AG, Lagisz M, et al. TOP 2025: An Update to the Transparency and Openness Promotion Guidelines. https://doi.org/10.31222/osf.io/nmfs6_v2
  38. Serghiou S, Contopoulos-Ioannidis DG, Boyack KW, Riedel N, Wallach JD, Ioannidis JPA. Assessment of transparency indicators across the biomedical literature: How open is open? PLoS Biol. 2021;19(3):e3001107. https://doi.org/10.1371/journal.pbio.3001107
  39. Cobey KD, Haustein S, Brehaut J, Dirnagl U, Franzen DL, Hemkens LG, et al. Community consensus on core open science practices to monitor in biomedicine. PLoS Biol. 2023;21(1):e3001949. https://doi.org/10.1371/journal.pbio.3001949
  40. Open Science Indicators. Six years of Open Science Indicators data. PLOS. 2024 Mar 28[Cited 2025 Feb 27]. Available from: https://theplosblog.plos.org/2024/03/six-years-of-open-science-indicators-data/
  41. Clear J. Atomic Habits. NY: Random House Business; 2018. p 203.
  42. The Coalition for Advancing Research Assessment. [Cited 2025 Feb 27]. Available from: https://coara.eu
  43. Siebert M, Bourgeois FT, Naudet F. ICMJE Should Create a Certification System to Identify Predatory Journals. JAMA. 2025 Jul 1;334(1):87-88. https://doi.org/10.1001/jama.2025.3661
  44. Peer Community In. [Cited 2025 Feb 27]. Available from: https://peercommunityin.org/
  45. Relman AS. The Ingelfinger Rule. N Engl J Med. 1981;305(14):824-826. https://doi.org/10.1056/NEJM198110013051408
  46. Anonymous. Definition of “sole contribution”. N Engl J Med. 1969;281(12):676-677. https://doi.org/10.1056/NEJM196909182811208
  47. Oransky I, Marcus A, Abritis A. How bibliometrics and school rankings reward unreliable science. BMJ. 2023 Aug 17; 382:1887. https://doi.org/10.1136/bmj.p1887