One of the challenges often raised with regard to publishing in open access journals is the perceived low quality of some of these journals and the belief that they are not peer reviewed. Open access (OA) is an access and business model, and does not by nature indicate whether a publication is peer-reviewed or not. The Directory of Open Access Journals (DOAJ) is a comprehensive resource that indexes over 9,500 peer-reviewed, scholarly OA journals, as reported by publishers and reviewed by DOAJ.
A recent sting operation by journalist John Bohannon for Science magazine tested just what kind of peer review 304 open access journals actually conduct by submitting a spoof article to each. Bohannon used a phony name and affiliation and wrote a purposely flawed article with errors that should have been easily caught by any editor or reviewer doing their job. Of the 255 decisions the author received by press time, 157 were acceptances and 98 were rejections. This experiment clearly raises red flags about some OA publishers and the level of peer review they are conducting. Since the targeted journals use a “gold” OA model, whereby a publication charge is paid by the author, there is also a legitimate concern about authors paying for a service they are not actually receiving.
Over the last five years, as OA publishing has become more accepted and pervasive, the number of OA publishers and publications has ballooned. Many OA journals have built a good reputation through their editorial boards and the quality of their output, and are indexed in PubMed and Web of Science. Meanwhile, others raise doubts among scrupulous scholars, who are bombarded with emails urging them to submit articles or serve on editorial boards.
As with any subscription journal, it is important for scholars to investigate unfamiliar OA journals and to use their best judgement when deciding where to submit their articles. The UCSF Library recommends following the guidelines outlined here, including a review of the caliber of the journal and whether it has been included in trusted sources of information.
The Science sting has put some important concerns to the test, but its results probably aren’t surprising to anyone who is familiar with the quality publications in their field. Several OA publishers that are popular with UCSF researchers, namely BioMed Central, Frontiers, Hindawi, Libertas Academica, and PLOS, were among the 98 that rejected the paper. Bohannon’s article has been widely criticized for singling out gold OA journals and for not using a control group of subscription journals. Without comparable results from subscription journals or from OA journals that do not charge a fee, it is difficult to draw conclusions about the practices of gold OA journals relative to other publication models.
One likely positive result of Bohannon’s experiment is tighter standards for vetting OA publications. Both the DOAJ and OASPA have made statements to this effect in response to the sting, and in fact the DOAJ had already drafted revised inclusion criteria. And scholars will undoubtedly continue to apply scrutiny when selecting which journals to publish in.