On Tuesday, January 24, we held our first hands-on, interactive workshop for publishers, focused on workflows. The online workshop was designed for people working at publishing companies, and we were pleased to welcome participants in a variety of roles, including management, sales operations, publishing technologies, open access, and funder relations.
From our day-to-day conversations with publishers, we know how complex the end-to-end publication workflow is. From submission through to publication, the complexities compound because processes like payment of Open Access (OA) or other charges rely on steps that happen earlier in the workflow, such as at the submission stage.
That's why we arranged this workshop to look at the overall publishing workflow and discuss where individual steps best fit in.
At ChronosHub, we like to be on top of things and believe in capturing information as early as possible in the workflow to avoid problems later. This certainly goes for institutional ID matching, agreement eligibility checking, funder compliance, and a clear presentation of publishing choices, including any associated charges, even if the final publishing decision is not made until later.
Given that some elements can change during the peer review process, all of these need to be confirmed again after acceptance. This is why we recommend having the publishing choice and related license signature happen after acceptance. If those elements are registered too early, without allowing for changes during peer review, they will have to be redone manually, generating an unnecessary volume of change requests.
While we recommend providing clear information about publishing choice, license, and fees during submission and peer review, we encourage leaving the final decision and license signature until after acceptance.
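To make this capture-early, confirm-late sequencing concrete, here is a minimal sketch in Python. All names here (`Submission`, `capture_at_submission`, and so on) are hypothetical illustrations for this post, not part of any specific submission platform or of ChronosHub's product.

```python
from dataclasses import dataclass, field

# Hypothetical sketch: capture eligibility data at submission,
# but defer the binding choice and license until after acceptance.

@dataclass
class Submission:
    author_institution_id: str              # matched institutional ID
    funder_ids: list[str]
    eligible_agreements: list[str] = field(default_factory=list)
    publishing_choice: str | None = None    # finalized only post-acceptance
    license_signed: bool = False

def capture_at_submission(sub: Submission, agreements: dict[str, set[str]]) -> None:
    """Record eligibility early so the author sees choices and charges upfront."""
    sub.eligible_agreements = [
        name for name, members in agreements.items()
        if sub.author_institution_id in members
    ]

def confirm_after_acceptance(sub: Submission, agreements: dict[str, set[str]]) -> None:
    """Re-run the same check with fresh data, since affiliations and funding
    may have changed during peer review, then finalize choice and license."""
    capture_at_submission(sub, agreements)
    sub.publishing_choice = "open_access" if sub.eligible_agreements else "subscription"
    sub.license_signed = True
```

The point of the split is that the early step is informational and repeatable, while the late step is binding and happens exactly once.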
To help understand the workflows better, the workshop was a mix of exercises and presentations. During a mapping exercise, we asked participants to think about the sequence in which different steps in the publishing process happen, the stage at which they occur, and whether they are manual or automated. We invited each publisher to talk through their current workflow and highlight issues and problem areas.
We had already identified 14 generic steps that we see in publishers' workflows. Most of these steps are essential for OA management, including the application of read & publish or transformative agreements, while some apply even to closed-access articles.
This highlighted just how complex the workflows really are, and the mapping exercise helped us get a holistic view.
As we learned more about the workflows, we discussed the considerations between manual and automatic processes. Automated steps do not require human intervention and are driven by technology like a stand-alone tool or a platform. They can include automatically generated emails, directions on the next steps for an author to take on a platform, or approvals based on business rules.
Manual steps involve an element of human intervention, for example, reading an email response from an author that contains information that needs to be checked and potentially processed through other systems.
We discovered that some steps involve a mixture of automated and manual processes. In an ideal world, we believe that all processes can be automated, provided that the underlying data is clean and correctly maintained. An automated, rule-based process is always more accurate than any manual work.
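As an illustration of what such a rule-based step can look like, here is a small sketch of an automated approval. The rules, thresholds, and data shape are invented for the example; in practice they would come from the terms of each agreement in the publisher's own systems.

```python
# Hypothetical sketch of an automated, rule-based approval step.
# `agreement` is assumed to carry 'members', 'covered_types',
# 'quota_used' and 'quota_total' keys (an invented shape for this sketch).

def approve_under_agreement(institution_id: str, article_type: str,
                            agreement: dict) -> tuple[bool, str]:
    """Apply business rules and return a decision plus an audit reason."""
    if institution_id not in agreement["members"]:
        return False, "institution not covered by agreement"
    if article_type not in agreement["covered_types"]:
        return False, f"article type '{article_type}' not covered"
    if agreement["quota_used"] >= agreement["quota_total"]:
        return False, "agreement quota exhausted"
    return True, "approved under agreement rules"
```

Because every decision returns a reason alongside the outcome, the process is not only consistent but also auditable, which is where rule-based automation outperforms manual handling.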
A lack of automation becomes a problem when business needs, such as sales agreements, do not match business processes, for example, when institutions or customers are promised something that you can only deliver manually as a business.
We know that a shift to automation is hard to put into practice given the number of legacy systems publishers work with, and we understand that most of them include some element of manual intervention. The key is to ensure that manual intervention steps can be easily identified and acted upon, and that they can be measured through robust, automated reporting.
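One way to make manual touchpoints measurable is to log every workflow event with a manual/automated flag and report on where human intervention concentrates, roughly as in this sketch (the event shape and step names are assumptions for illustration):

```python
from collections import Counter

# Hypothetical sketch: tag each workflow event as manual or automated,
# then report which steps absorb the most human intervention.

events = [
    {"step": "eligibility_check", "manual": False},
    {"step": "license_signature", "manual": False},
    {"step": "author_email_review", "manual": True},
    {"step": "author_email_review", "manual": True},
    {"step": "payment_reconciliation", "manual": True},
]

manual_counts = Counter(e["step"] for e in events if e["manual"])
for step, count in manual_counts.most_common():
    print(f"{step}: {count} manual interventions")
```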
While we managed to unpack a lot in this 90-minute workshop, we couldn't cover everything. Because we received very positive feedback from participants, we are planning a new workshop to get into more of the details we couldn't cover this time around.
Please get in touch if you'd like to be added to the list to learn more about this workshop or our upcoming webinars. If there's anything you'd like to see raised in a workshop, please let us know, and we'll take your suggestions into account.