Three great stories that demonstrate why you should make analytics a cornerstone of your publishing business
At a recent technology workshop we asked attendees to outline what they thought Artificial Intelligence and decision support could do for them. The first response was that these technologies could be used to "make long meetings shorter". Far from being flippant, this desire to "make meetings shorter" shows the continuing relevance of the old management adage that "if you can't measure it, you can't manage it". It is a self-evident truth that understanding what to measure, and how to measure it, is critical to the success of any business or organization. Without measurement, how do we know whether our strategy has worked? Without data or evidence, how can we even formulate a strategy in the first place? How do we measure the success of our tactics? Relevant data and data analysis can inspire a business to change, help it identify new strategic opportunities and challenges, and better understand and serve its customers.
In recognition of the nomination of Vizors, our market-leading analytics platform, for a 2018 SIIA Codie award, this post offers three great stories that show why you should make analytics a cornerstone of your publishing business.
1. A publisher who spotted where to make efficiencies in the workflow
Smart businesses gather and analyze data to drive efficiency. Using data to understand which publication strategies are working and which are not is key to driving efficiency in the editorial workflow, particularly in how editorial resources are deployed. This is the story of a publisher of a major biomedical science journal who was concerned that papers in its Brief Communications section did not serve the journal and its readership as effectively as full-length research articles.
The publisher's instinct was to discontinue the section and stop accepting these shorter papers. Doing so would free up editorial and reviewer time to focus on research that was more highly cited. Vizors data confirmed that Brief Communications papers were cited less frequently than other papers, which in turn indicated that editorial resources might be better deployed in publishing more full-length papers.
Vizors data helped the publisher reach the decision to discontinue the Brief Communications section, allowing editorial resources to be focused on full-length papers. Furthermore, average citations per article increased: the journal published fewer papers, and those it did publish were cited more on average.
2. A publisher who spotted and validated a new revenue opportunity
Sometimes measuring what you don’t do is just as important as measuring what you do: new opportunities can be spotted in unlikely places. A key attribute of Vizors is the insight it provides into the articles editors decided not to publish. Who did publish them? And how were they subsequently cited?
This is the story of a major medical publisher who was rejecting a large number of articles from their highly selective, established, flagship journal. There was a sense that many of these papers went on to be published, and highly cited, in competing journals.
The publisher wanted to assess the performance of these rejected articles to see whether there was a business case for a new journal. Vizors provided a breakdown of how many rejected articles had been published elsewhere, which journals they were published in, and the number of citations they achieved.
The publisher found that the highest-cited papers were published in a small number of competing journals. By then comparing journal aims and scope and analyzing average citations, the publisher was able to make a business case for a new journal. The journal was launched with a scope that allowed it to capture a significant proportion of papers that might otherwise have been rejected from the flagship title and published elsewhere.
3. A publisher who made better-informed decisions
Data-driven decision-making is core to the success of the world's leading businesses, many of which, such as Netflix and Amazon, have invested heavily in complex, algorithmically driven intelligent systems. Such complexity lies beyond the reach of many publishers. But there are numerous ways in which data-driven decision-making can be enabled in publishing businesses of any scale.
This is the story of a major scientific publisher who had a policy of securing at least two peer reviews for every article they published. In cases where reviewer opinions were not in agreement, a third reviewer was required by the journal’s publication policy.
The publisher noticed that it was difficult to recruit peer reviewers for some of the articles they published, and wondered whether these were simply poorer or less interesting articles. In some cases, nine or more prospective reviewers declined to review a paper before two reviews were secured, representing a considerable investment of editorial time. To make an informed decision about potential changes to the journal's review policy, the publisher wanted to understand whether there was a difference in citations between their average papers and those that required additional effort to secure reviews.
The analysis made clear that the journal's review policies needed no changes: these 'challenging' articles performed better, in many cases, than those that went through a typical review process. Evaluating their contribution to the research record over time reinforced the case for persevering with future papers that require additional editorial attention.
By way of conclusion, here's a final story. We recently had a help request from one of our small but successful society publishers. Their Journal Impact Factor (JIF) had dropped 50% in one year! Being good Americans, their response to this unusual and sudden fall was, of course, to ask "WTF?!"
We ran our Hot Article Tracker with the publication and citation date windows set to reproduce the previous year's JIF, so that it would show which of the most highly cited articles were dropping out of the JIF calculation in the current year. This quickly helped the publisher identify three articles that had accounted for a little more than one-third of the journal's JIF the previous year. When those articles aged out of the JIF window in the current year, the JIF instantly fell by 35%.
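For readers who want to see the arithmetic, here is a minimal sketch, using purely illustrative numbers rather than this publisher's actual data, of how a handful of highly cited articles can dominate a JIF and cause a sharp drop when they age out of the two-year citation window.

```python
# Illustrative sketch (not Vizors code). The JIF for year Y is the number
# of citations received in Y to items published in the previous two years,
# divided by the number of citable items published in those two years.

def jif(citations_per_article, citable_items):
    """Compute a Journal Impact Factor from per-article citation counts
    and the total number of citable items in the two-year window."""
    return sum(citations_per_article) / citable_items

# Hypothetical journal: 100 citable items. Three "hot" articles draw 35
# of the 100 citations counted toward the JIF; the rest are cited modestly.
hot_articles = [15, 12, 8]   # citations to the three hot articles
other_articles = [1] * 65    # modest citations spread across the rest
citable_items = 100

with_hot = jif(hot_articles + other_articles, citable_items)   # 1.00
without_hot = jif(other_articles, citable_items)               # 0.65

print(with_hot, without_hot)  # the JIF falls 35% once the hot papers age out
```

With numbers like these, just three papers account for 35% of the JIF, so the following year's figure drops by roughly a third even though nothing about the journal's editorial quality has changed.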