PREDATORY JOURNAL PAPERS ARE PERMEATING AND POLLUTING THE SCHOLARLY SYSTEM VIA CITATIONS. HOW SERIOUS IS THE PROBLEM? WHAT CAN BE DONE?
In their blog post (https://scholarlykitchen.sspnet.org/2019/09/25/fighting-citation-pollution/), Lisa Janicke Hinchliffe and Michael Clarke highlight the issue of citation pollution, stressing that fraudulent journals are penetrating the scholarly ecosystem. They note that “separating the methodologically sound from the fraudulent would be a Herculean challenge” and that tracking fraudulent publications may be almost impossible, given the exponentially increasing number of these publishers and journals (see Huffman 2017).
We agree with this position, but suggest that the problem needs finer-grained articulation, because doing so underlines its full extent. Citation pollution may also mean that:
- Citations to predatory content cannot be removed from the academic literature. This has a long-term negative impact on the reliability of the academic knowledge base.
- Citation databases cannot sanitize their records or remove cited predatory papers from the reference lists of indexed papers.
- The presence of predatory journal literature in the reference lists of respectable journals increases the visibility of predatory content and lends credibility to low-quality papers and journals.
- By citing predatory journals, authors invalidate their own research.
- Citations to predatory journals inflate citation metrics and the Google Scholar h-index of predatory journal authors.
We agree with Robert’s (2016) view that distortion of the published record may lead to future misdirection in research, and ask: how many bogus journal papers and fake articles are currently incorporated into review articles, systematic literature reviews and legitimate meta-analyses?
Several citation studies have investigated citations to predatory journals. Key findings from these studies offer much-needed support for the extended position we set out above:
- Nwagwu & Ojemeni (2015) conducted a Google Scholar and Web of Science citation study of 32 biomedical open access journals published by two Nigerian predatory publishers, “Academic Journals” and “International Research Journals”. The 32 journals received a total of 12 596 citations in Google Scholar, an average of 2.25 per paper and 394 citations per journal. The top ten countries citing the predatory journals were India (18.13%), China (16.21%), Pakistan (13.09%), Iran (10.63%), Malaysia (9.96%), South Africa (7.83%), USA (7.72%), Nigeria (6.52%), Saudi Arabia (5.37%) and Turkey (4.82%). Two of the 20 Nigerian predatory journals had 1195 citations in 54 Web of Science journals.
- Frandsen (2017) analysed the profile of scholars who cite predatory journals, based on citations to 124 predatory journals in Scopus. The data were collected in March 2017 from a random selection of half of the journals on Beall’s standalone list. In total, 4250 authors made 1295 citations to the 124 journals. Citing authors were analysed by geographic location, publications and citations. The top countries of origin of the citers were India (33.4%), Malaysia (15.1%), USA (4.8%), Iran (3.3%), Pakistan (2.6%), Egypt (2.6%), China (2.5%), Poland (2.2%), Brazil (2.2%) and Nigeria (1.9%). Most citing authors were inexperienced researchers, primarily from South Asia, Southeast Asia, the Middle East and Africa, and to a lesser extent from the rest of the world. Citations to these journals were generally low.
- Alrawadieh (2018) described a study of predatory journals in the discipline of tourism and hospitality. Beall’s list of predatory journals was used to select 13 tourism and hospitality predatory journals, in which 612 articles by 1267 authors had been published. A citation analysis using Google Scholar assessed the performance of these articles: the journals received a total of 362 citations between 2010 and 2017, and 59% of the articles had never been cited.
- Bagues, Sylos-Labini, & Zinovyeva (2019) conducted a study of 46 000 researchers in Italy and found that 5% of them had published in journals on Beall’s blacklist. The researchers then collected from Google Scholar the number of citations these journals received. Most predatory articles received few citations: 23% of articles were never cited, and when self-citations are excluded, the share of never-cited articles rises to one third. The 10 most cited articles in the sample received at least 20 citations each, including one article with 399 citations.
- Ross-White, Godfrey, Sears, & Wilson (2019) explored the degree to which articles published by a major predatory publisher, OMICS, are cited in systematic literature reviews in the health and biomedical sciences. A list of 459 journals with health and biomedical science titles was compiled, and each journal title was searched individually in Google Scholar to determine whether its articles were cited in systematic literature reviews; the citations were downloaded into EndNote citation software. Results indicated that 62 of the journal titles had published a total of 120 articles cited by at least one systematic review, with a total of 157 systematic reviews citing articles from one of these predatory journals.
- Schira & Hurst (2019) examined citations of predatory journals by university students in 19 courses, analysing 2359 citations in 245 student bibliographies. Of the 1485 journal articles cited, five were from journals on Beall’s list: 0.34% of all journal articles and 2.4% of open access articles came from predatory publishers.
- Oermann et al. (2019) investigated citations to seven predatory nursing journals in Scopus in October 2018 and found 814 citations of predatory journal articles. A median of two articles per journal cited one of the seven predatory journals. In total, 141 Scopus articles cited predatory journals, of which 85 were review articles. Most authors of the citing articles were from the USA (30.9%), Australia (12.4%) and Sweden (8.3%); most authors of the predatory journal articles themselves were from the USA (32.8%), Sweden (12.1%), Australia (8.1%) and Canada (6.6%).
Fidelior’s strength is that it provides an integrated platform of reputable journal lists and questionable journal lists that are available in the public domain. It facilitates searches on specific journal titles, as well as automatic scanning of bibliographies and reference lists for possible matches of journal titles in these lists. Fidelior reports refer to orange and green lists instead of black and white lists. An orange flag marks journal names that appear in predatory lists and warns authors and researchers that there could be a problem with the specific journal. However, not all titles on the orange lists are actually predatory; for example, DOAJ may remove a title when the journal is no longer open access, and although the title may be a reputable journal, it will then appear on the “orange” list of removed journals. The Fidelior user receives an integrated report with all (or most of) the data available in the public domain on a specific journal, allowing authors and researchers to make more informed decisions about the journal’s standing.
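The kind of list matching described above can be illustrated with a minimal sketch. This is not Fidelior’s actual implementation: the function names, the title normalisation rules and the flag wording are all assumptions made for illustration only.

```python
# Illustrative sketch only (NOT Fidelior's implementation): check journal
# titles cited in a reference list against curated "orange" (questionable)
# and "green" (reputable) lists. Normalisation rules are assumptions.

def normalise(title: str) -> str:
    """Lower-case a title and strip punctuation so near-identical titles match."""
    cleaned = "".join(c for c in title.lower() if c.isalnum() or c.isspace())
    return " ".join(cleaned.split())

def flag_references(cited_titles, orange_list, green_list):
    """Return a report mapping each cited journal title to a status flag."""
    orange = {normalise(t) for t in orange_list}
    green = {normalise(t) for t in green_list}
    report = {}
    for title in cited_titles:
        key = normalise(title)
        if key in orange:
            # Orange warns rather than condemns: the title may have been
            # removed from DOAJ for reasons other than predatory behaviour.
            report[title] = "orange: appears on a questionable-journal list; verify before citing"
        elif key in green:
            report[title] = "green: appears on a reputable-journal list"
        else:
            report[title] = "unlisted: no match in either list"
    return report
```

A caller would pass the journal titles extracted from a bibliography together with the two curated lists, and present the per-title flags to the author; the orange flag deliberately signals “verify” rather than “reject”, mirroring the caution in the paragraph above that not every orange-listed title is predatory.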