

Why price transparency in research publishing is a positive step


What is the Journal Comparison Service? And why will more transparency benefit all stakeholders in research publishing?


In 2019, Hindawi took part in the price transparency framework pilot run by Information Power on behalf of cOAlition S. Three years later, the coalition's new Journal Comparison Service (JCS) is up and running. Hindawi is proud to be one of the publishers that has contributed data to this service. Taking part has helped us focus on the rigour of our own reporting system and has given researchers greater choice when selecting a journal, by giving more visibility to our services in our new and publicly available journal reports.

Only a few publishers took part in the pilot and the framework remains untested. It's not yet clear how useful the JCS will be to the institutions that might want to access the service and use the data, or how the JCS will increase transparency about costs as well as pricing across the publishing industry more generally. In part, this is because some see it as providing an overly simplistic view of publishing. Compartmentalising publishing services into seven or eight categories (see page 20 of the JCS guidance for publishers) inevitably flattens the many different and often overlapping services that publishers provide. In addition, because the price breakdown expresses each service only as a percentage of a journal's APC, the real costs remain invisible. There are also pragmatic reasons that make it very difficult for some publishers to collect data consistently, especially those with large portfolios that operate on multiple platforms or have journal-specific workflows. Finally, fully open-access publishers that don't have an APC business model can't take part, even if they want to be more transparent.

However, we believe the upsides are large. Hindawi has more than 200 journals in our portfolio, and the following outlines a few of the ways that we, and we hope those who contribute to and access our journals, are benefiting. Our focus is on the Information Power framework for the JCS and on the 'Journal Quality' information specifically (columns P-Z in the template spreadsheet). This covers data on the journal workflow, especially peer review (such as timings and the number of reviewers involved). We know that there is a long way to go to make all publishing services transparent, but we are learning from our participation in the JCS and will continue to explore ways to improve transparency.

Getting our publishing data in order

Hindawi's journals currently operate on a single submission and peer review platform, Phenom, which we built ourselves and continue to develop (the source code is openly available). In principle, one platform should make it easier for us to provide data on our workflows. It's also standard practice for publishers to collect data on the turnaround times and rejection/acceptance rates of articles. However, actually standardizing, coding, and retrieving the data required turned out to be harder than we anticipated. cOAlition S provided a definition of each rate and checkpoint they wanted publishers to include, but some variables were still open to interpretation. For example, take the rate of rejection before peer review: does this include articles that are submitted and then withdrawn? Does the clock start ticking on peer review when reviewers are invited, or when they accept? In our reporting we've excluded data on withdrawn articles and counted time in peer review from when the first reviewer is invited, but other publishers may have implemented this differently. Even with consistent definitions, you need the software and reporting systems in place to accurately extract data for thousands of articles.
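To make those definitional choices concrete, here is a minimal sketch (in Python, using hypothetical field names, not our production reporting code) of how such rules might be encoded once so they are applied consistently across thousands of articles. It assumes withdrawn articles are excluded from the pre-review rejection rate and that time in peer review is counted from the date the first reviewer is invited.

```python
from dataclasses import dataclass
from datetime import date
from typing import List, Optional


@dataclass
class Article:
    # Hypothetical fields for illustration only
    submitted: date
    withdrawn: bool
    rejected_before_review: bool
    first_reviewer_invited: Optional[date]
    decision: Optional[date]


def desk_rejection_rate(articles: List[Article]) -> float:
    """Share of submissions rejected before peer review.
    Withdrawn articles are excluded, matching the choice described above."""
    considered = [a for a in articles if not a.withdrawn]
    if not considered:
        return 0.0
    rejected = sum(a.rejected_before_review for a in considered)
    return rejected / len(considered)


def median_days_in_review(articles: List[Article]) -> Optional[float]:
    """Median time in peer review, counted from when the first reviewer
    is invited (not from when a reviewer accepts the invitation)."""
    durations = sorted(
        (a.decision - a.first_reviewer_invited).days
        for a in articles
        if not a.withdrawn and a.first_reviewer_invited and a.decision
    )
    if not durations:
        return None
    mid = len(durations) // 2
    if len(durations) % 2:
        return float(durations[mid])
    return (durations[mid - 1] + durations[mid]) / 2
```

A publisher that started the peer review clock at reviewer acceptance, or included withdrawn articles in the denominator, would report noticeably different numbers from the same underlying workflow, which is why these choices need to be stated alongside the figures.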

We had already started being more rigorous about how we defined, standardized, and analyzed our data as part of our own reporting system, and this, combined with the reporting requirements of the coalition, has produced spinoff benefits for how we measure, evaluate, and compare the integrity, performance, and efficiency of our journals. We are also now collecting types of data at points in the workflow that we might not otherwise have focused on, such as the effect of capturing ROR IDs earlier in our workflows. This is work in progress, but having external standards as well as internal incentives helps us focus on the needs of our core stakeholders.

Building new public-facing Journal Reports

Although authors aren't the primary audience of the JCS, getting our data house in better order has made it possible to report data publicly, as we have in our new Journal Reports (see an example). If the publishing industry and researchers are to move away from relying on journal rank or the impact factor as a proxy for quality – with all the detrimental consequences this has had on the system – then we need to raise awareness of, and showcase, other publishing services and features that can inform more reliable choices for authors, their institutions, and funders. This includes data on the integrity, quality, and speed of our processes, but also on how we are making articles more accessible and discoverable. In future releases of the Journal Reports, we aim to provide more information about our services, especially in relation to open science and research integrity.

Driving innovation in scholarly publishing

Innovation for us is about changing the way publishing is done for the better. Making our services more visible to institutions via the JCS and on our websites is a first step towards raising awareness of the range and value of the different services we provide and enabling others to evaluate them independently. Innovation is also about being open to change, taking risks, and experimenting to find out what works and what doesn't. As we develop and create new services for open science, we need to do so in an evidence-informed way so that our services meet the needs of the wider research community and science as a whole.

How we and others then evaluate the services we provide for our authors is part of the rationale behind the JCS. One of the coalition's aims is to change perspectives on what it is about journals and publishers that matters for scholarly communication. Other projects being developed around journal transparency support such change (e.g. the Journal Observatory Project), and there are many discussions about how journal services might be disaggregated (e.g. the notion of an 'extended journal'). Taking part in the JCS is in many ways a small contribution, but it signals that we want to work with funders, libraries, and others to align our services in a world where Open Access and Open Science are rightly driving the agenda. The challenge for us is to embrace this opportunity and innovate in a way that is scalable, cost-effective, and inclusive – where we can reduce the burden on authors so that anyone, anywhere in the world, has the same opportunity to contribute to open science.

Creating trust

An underlying theme in all of this is building trust and being accountable. Publishers haven't had to report such granular data before because their services have traditionally been taken on trust, with reputation and journal rank used as proxies for quality. And yet there are increasing, and valid, concerns from researchers, institutions, funders, governments, and the public about the trustworthiness of both science and science publishing. There is a lot of work we can do to help cement trust. Being more rigorous about the data we collect, making it transparent, and using it to improve publishing services and outputs are just the first steps.


This blog post is distributed under the Creative Commons Attribution License (CC-BY). Illustration adapted from Adobe Stock by Samuel Jennings.
