Bibliometrics, Open access, Scopus
OACA – the open access citation advantage
July 22, 2016

If you post your paper online somewhere outside a paywall, you join the majority of scholars, and recent research suggests your paper will gain a citation advantage over papers available only through a journal. A particular focus of research on open access publishing has been the hypothesis that open access papers will be more highly cited. In this post I will examine some studies that have attempted to confirm or quantify this citation advantage.

Interest in knowing whether there is an open access citation advantage (OACA) is so high that two annotated bibliographies are maintained: one by Ben Wagner, a librarian at the University at Buffalo (2014 version here), and a second by a European project (up to date as of early 2015). Judging by these two bibliographies, upward of 70 papers have been published on the OACA.

Wagner’s overview of the literature leads him to conclude that the OACA is well established; that is, “open access articles definitely have a citation advantage over generally equivalent toll access articles.” Nevertheless, one problem with many studies is that they are small, restricted to a single specialty or set of journals. Most studies in this area examine thousands or perhaps tens of thousands of papers. A definitive answer requires examining a broad swath of papers from across all areas of research, but this is difficult because determining whether a paper is available open access is not easy: the Internet must be searched for every paper.
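As an aside, the sketch below gives a rough sense of what such a check involves in practice. The studies discussed next built their own web crawlers; this sketch instead queries the Unpaywall API, a later service used here purely for illustration, and the contact email and sample DOI are placeholders.

```python
# Illustrative sketch only: ask the Unpaywall API whether a paper, identified
# by DOI, has a freely readable copy somewhere on the web. The studies
# discussed in this post crawled the web themselves; Unpaywall is a later
# service used here just to show the shape of the task. EMAIL and the sample
# DOI list are placeholders.
import requests

UNPAYWALL_URL = "https://api.unpaywall.org/v2/{doi}"
EMAIL = "you@example.org"  # Unpaywall asks callers to identify themselves


def is_open_access(doi: str) -> bool:
    """Return True if Unpaywall reports an open access copy of the paper."""
    resp = requests.get(UNPAYWALL_URL.format(doi=doi),
                        params={"email": EMAIL}, timeout=30)
    resp.raise_for_status()
    return bool(resp.json().get("is_oa"))


if __name__ == "__main__":
    sample_dois = ["10.1126/science.1154562"]  # Evans & Reimer (2009), cited below
    for doi in sample_dois:
        status = "open access" if is_open_access(doi) else "paywalled"
        print(doi, "->", status)
```

Multiply one such lookup by a million papers, many of them without any tidy lookup service behind them, and the difficulty the studies below faced becomes clear.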

The annotated bibliographies suggest two possible contenders for the biggest study of the OACA. Evans & Reimer (2009) processed a sample of the Web of Science database containing 12 million records, which sounds large, but their OACA analysis was based only on the journals within that sample that had switched to open access between 1998 and 2005. The number of OA journals analyzed was no doubt small; however, that figure was not reported. Hajjem et al. (2005) assessed 1 million papers, crawling the Internet to see whether each could be found open access. They found 123,000 open access papers and based their comparison of citation rates on this set.

Superseding both is a study of the OACA performed in 2014 by Science-Metrix for the European Commission. This study comes closest to being definitive because of its scale. Science-Metrix crawled the Internet looking for open access versions of one and a half million Scopus-indexed papers. They found that more than 700,000 of the papers were available open access, that is, available somewhere outside a paywall. They estimated that as of April 2014, more than 50% of the scientific papers published between 2007 and 2012 could be downloaded for free on the Internet, and the percentage was even higher for papers published in 2013 and 2014.

Science-Metrix then retrieved 245,571 OA papers published between 2009 and 2011 and conducted an OACA analysis on this set. Overall, the OA papers had a 26% citation advantage compared to the full set of 512,443 papers, and papers that were not OA had a 24% disadvantage (i.e., they had a lower citation rate than the full set). The advantage varied by type of OA: green (author posting) had a 53% advantage, while gold (journal OA) had a 39% disadvantage. The advantage also varied by field, being highest in some humanities fields and lowest in medical fields. According to Wagner this study provides “the strongest evidence to date of an OACA.”
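To make percentages like these concrete, here is a toy version of the arithmetic behind a figure such as a 26% advantage, under the simplifying assumption that the (dis)advantage is a subset’s mean citation rate relative to the mean for the full set; Science-Metrix actually worked with field-normalized citation scores, and the numbers below are invented.

```python
# Toy illustration of how a citation (dis)advantage figure can be computed:
# the average citations of a subset relative to the average of the full set.
# Science-Metrix used field-normalized citation scores; the raw averages here
# are a simplification, and the citation counts are made up.
def relative_advantage(subset_citations, all_citations):
    """Percentage by which the subset's mean citation rate exceeds the full set's."""
    subset_mean = sum(subset_citations) / len(subset_citations)
    overall_mean = sum(all_citations) / len(all_citations)
    return 100 * (subset_mean / overall_mean - 1)


# Made-up citation counts for a full set of six papers, three of them OA.
oa = [10, 8, 12]
non_oa = [6, 7, 5]
full_set = oa + non_oa

print(f"OA advantage:     {relative_advantage(oa, full_set):+.0f}%")      # above the overall mean
print(f"non-OA advantage: {relative_advantage(non_oa, full_set):+.0f}%")  # below the overall mean
```

Under this definition the advantage of OA papers and the disadvantage of non-OA papers sit on either side of the overall mean, which is roughly the pattern of the 26% and 24% figures above.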

The size of the study reduces, but does not eliminate, the force of two traditional objections to the OACA. The first is that authors are motivated to post their better papers, so selection bias could account for the open access citation advantage. The second is that author posting makes a paper available earlier, so citations to the published version begin accruing sooner because readers were aware of the work before it was formally published; this makes a green OA paper appear more highly cited when citations are counted during the first few years after publication. The Science-Metrix study was not designed to investigate these factors.

Keeping these caveats in mind, the results suggest that scholars who want to maximize their impact should post their papers somewhere. Most journals allow posting of the submitted or refereed version (though not the edited and laid-out version) in an institutional repository or a similar venue. It is less clear that one should submit to an open access journal, or pay to make a paper open access in a subscription journal, since both count as gold open access, which appeared to carry a disadvantage in this study.

 

Archambault, E., Amyot, D., Deschamps, P., Nicol, A., Provencher, F., Rebout, L., & Roberge, G. (2014). Proportion of open access papers published in peer-reviewed journals at the European and world levels—1996–2013. Montréal, Canada: Prepared for the European Commission. Retrieved from http://science-metrix.com/files/science-metrix/publications/d_1.8_sm_ec_dg-rtd_proportion_oa_1996-2013_v11p.pdf

Evans, J. A., & Reimer, J. (2009). Open access and global participation in science. Science, 323(5917), 1025. doi:10.1126/science.1154562

Hajjem, C., Gingras, Y., Brody, T., Carr, L., & Harnad, S. (2005). Open access to research increases citation impact (Technical Report). Retrieved from http://eprints.soton.ac.uk/261687/

 


About the author

Diana Hicks

Professor Diana Hicks specializes in science and technology policy as well as in the innovative use of large databases of patents and papers to address questions of broad interest at the intersection of science and technology. Her recent work focuses on the challenges of bibliometric analysis in the social sciences and humanities and on developing a broad understanding of national performance-based research funding systems and their consequences. Professor Hicks’s work has appeared in such journals as Policy Sciences, Social Studies of Science, Nature, Research Policy, Science and Public Policy, Research Evaluation, Research Technology Management, R&D Management, Scientometrics, Revue Economique Industrielle, Science Technology and Human Values, Industrial and Corporate Change, and Japan Journal for Science, Technology and Society. She was also lead author of the Leiden Manifesto (2015, see http://www.leidenmanifesto.org/), which presented 10 principles for guiding research evaluation and has been translated into 11 languages. Hicks is a Professor in the Georgia Tech School of Public Policy, and previously chaired the School between 2003 and 2013. Prior to this, Professor Hicks was the Senior Policy Analyst at CHI Research, Inc. She was also on the faculty of SPRU, University of Sussex (UK), for almost 10 years, taught at the Haas School of Business at the University of California, Berkeley, and worked at the National Institute of Science and Technology Policy (NISTEP) in Tokyo. Visit https://works.bepress.com/diana_hicks/ to view Professor Hicks’s publications.

Related items

Data mining: Open access policies and outcomes

A short history of research impact

Interdisciplinarity: the mutual adjustment of concept and indicator
