We concluded our stellar year of educational webinars by hosting a discussion about research metrics and open access. Joining us were Rod Cookson from IWA Publishing, Domenic Rosati from scite, and Catriona McCallum from Hindawi Publishing. Our Head of Publisher Relations, Romy Beard, moderated the webinar and got everyone thinking about how metrics can help us understand the impact of open access and the research process in general. As you can imagine, there are a lot of nuances in the stories that metrics can tell!
As ever, let’s get started with the groundwork so we are all on the same page. Research metrics are usually tracked and collected at the journal and article level. The most common metrics relate to usage, citations, and the length of time it takes to publish an article in a journal. However, there is a huge range of metrics that publishers can collect, interpret, and share. In terms of assessing the value of open access, the main challenge is deciding which metrics to track and compare.
For IWA, metrics enable them to assess a journal’s appeal to researchers, understand the global impact of the research, and gauge journal performance, which is more operational. Rod acknowledged that it is important to consider the group you are communicating with, so that the metrics you share are relevant. The needs of libraries, for example, cross over between these areas. To help them manage publishing agreements, they will want journal performance statistics such as the number of downloads. A library will also want to know the global impact of articles so they can share that with their institution. Publishers, in comparison, are invested in all three aspects: appeal, impact, and performance.
Catriona talked about the Plan S Journal Comparison Service (JCS), which is designed to provide transparent data on publishing services and fees. Collating this data means Hindawi can publish journal reports for most of their journals. As a result, researchers are better informed up front about acceptance rates, the length of time from submission to decision, and the cost of publishing open access. Joining up information in this way also yields more insightful metrics. One initiative Hindawi have worked on is linking articles to an author’s ORCID, which makes articles more discoverable and works well because all of Hindawi’s articles are open access.
Domenic Rosati from scite shared a wealth of statistics to help us investigate this crucial question! Scite looks at how authors mention or reference texts, and to get the full picture they track both open and closed access articles. Closed access articles, according to scite’s metrics, are cited more, but that gap is closing considerably. Beyond this, Domenic shared a number of illuminating facts about open access and citations.
Even though closed access papers receive more citations in aggregate, if you write an open access paper, it is more likely to be cited multiple times within someone else’s paper than a closed access paper would be. This suggests that open access material is being engaged with more than the equivalent closed access material. Open access articles are also more likely to cite other open access papers.
The panel speculated on the reasons for this. One could be greater editorial freedom. With an open access article, a researcher can engage with the full text rather than just a title and an abstract. Open access papers could also be cited more simply because researchers have access to the journals they want to read. Notably, open access started to receive more supporting citations a few years before it started receiving more citations in general. Open access also receives more critical citations than closed access, which again suggests the material is being engaged with more deeply. This all looks good for open access in the research cycle!
Metrics can play an important role in showing how open access research has an impact beyond the confines of well-funded countries and institutions. When IWA converted 10 journals to open access through Subscribe to Open (S2O), article downloads more than doubled. Before the conversion, 54% of citations came from the developing world; after S2O was rolled out, citations from the developing world increased by around 10%. Open access is particularly important for IWA because their mission is to share cutting-edge research that solves real-world problems related to water. For IWA, these metrics make a clear case that open access does a great job of fulfilling this commitment.
You’ve probably heard of the impact factor, and you may even have been part of conversations about how useful it is! All the panelists agreed it is a controversial metric. So, what is up with the impact factor? And why does it get so many people talking? It measures the average number of citations received by articles in a journal, but tells us little about how the research is being used to generate further knowledge. For example, a paper could be cited many times by others claiming that it shows incomplete data, but the impact factor would not show this.
A further problem with the impact factor is the importance placed on this one metric when the research process is much more than just publishing. The reputation of researchers is often tied to the impact factor, as well as the financial gain of publishers and institutions. An article with a high impact factor can be used to drive funding to an institution and a journal with a high impact factor can attract more articles. This in turn creates a divide that is unfair, especially if it is based on one value. So, what are the alternatives?
Catriona mentioned Open Science, an initiative which encourages researchers to make all elements of the research cycle openly available to others. One key focus of Open Science is to campaign for responsible metrics. Because Open Science takes in the whole research process, the aim is for the research community to use a range of metrics to define what successful research looks like. Rather than focusing on just publishing, the question should be: what is the service that publishers are trying to provide? The hope is that by looking at the research cycle in full, publishers and researchers will move away from the impact factor.
All the panelists agreed that another significant step is to promote the reporting of metrics within context. This means that researchers, publishers, and institutions can make a more holistic assessment of an article and the impact it has within its field. Avoiding creating situations where metrics are gamed is also important, as is measuring equity within research and publishing. The discussion highlighted that we could use a wider range of metrics to make open access publishing more equitable and tell a richer story about the research cycle!
We are busy planning more webinars for next year, where we will continue to delve into all the hot topics surrounding open access. Feel free to reach out to Romy Beard with topics for discussion at future webinars, or to sign up as a speaker at one of our webinars in 2023. Thank you for participating, and we very much look forward to seeing you in the new year!