Attending the Frankfurt Book Fair recently – where Johannes Gutenberg used to go to sell his books – it occurred to me that one of the most challenging conceptual leaps our industry has to make when it comes to digital is a changed perception of time.
More particularly, I was thinking about how the time gap between publishing your works to readers and seeing how they receive them – do they browse, do they buy, do they download, do they share? – has become vanishingly small.
Analytics are now practically real-time, and many publishers are competing for mindspace with companies like Amazon, which will test variant page layouts to see which one sells best and make instant changes based on the data.
And yet many in our industry still view proper analytics as an afterthought – ‘something we’ll look at if there’s any spare budget’ – and prefer to focus solely on discovery. Surely, this is madness.
Stats get more ‘touchy-feely’
One of the major reasons for going to trade fairs like Frankfurt is the chance to look your buyers in the eye and see what they really think of what you’re offering. All of those subtle and subliminal cues we pick up at a fair – body language, zeitgeist, gossip, mood music, whatever you want to call it – though difficult to quantify when the Financial Director queries the expense, nevertheless keep us coming back year after year. We need them to make a fully informed judgment of which way the market is going.
Historically, the statistical information we get from sales reports and so on has been much drier by contrast, and harder to draw inferences about market behaviour from. Digital has changed this radically.
While it is difficult to imagine face-to-face events like Frankfurt ever not being valuable, we now have extraordinarily rich data resources to guide us in the vital activity of seeing exactly what is going on with our customers.
Tools such as Google Analytics, supplemented by social search tools, offer an embarrassment of riches – and they offer it almost instantaneously. All of the previously mysterious customer behaviours, both before and after sale, that take place during the interaction between publisher content and customer need are now rendered transparent. The major difficulty now lies in what you choose to pay attention to, how you interpret the information you gather, and what you decide to do as a result.
Much richer information is available, and in a short enough time frame to be immediately actionable. You can develop incrementally, test and pilot much more rapidly, and so limit the risk of innovation and new product development.
The worst strategy, it seems to me, would be to ignore these riches that are on offer simply because making use of them lies outside a traditional workflow. You can be sure your competitors will not do the same.
A website is a website
Part of the reason for the failure to capitalize on better analytics, it seems to me, is the inward-looking attitude that tends to prevail in publishing.
A publisher platform is, fundamentally, a website. This might seem like stating the blindingly obvious; however, there is a tendency in our industry to see it as a particular and specialised entity in a category of its own.
True, publisher platforms hold a great deal of content – but then, so does YouTube or Facebook. True, that content is (generally) of a particularly valuable and exalted kind. But to the robots that crawl and index the web, and control mass discoverability, these honorifics are close to meaningless. All they see is structure and relevance.
True, the users of these websites – librarians, researchers, students – are a particular and perhaps atypical set of human beings. But when they are not researching books and journals, they are using the web to do their shopping, seeing exactly how bad the weather is going to be next week, and buying tickets to holiday destinations, just the same as the rest of us.
My point is that publisher sites don’t exist in a separate, roped-off part of the web devoted solely to scholarly enquiry – or even within a particular purlieu known as the information industry. And if we act as if they do, there is a danger that publisher platforms become divorced from the mainstream of development and innovation online. With commercial consequences for everyone involved.
One particular symptom of this disconnect that I see is failure to make the right kind of conceptual link between discovery and analytics.
Stats, metrics and impact
One truly unique feature of our industry is that metrics are of vital interest to several parts of the scholarly value chain – not just to marketing. Academic careers, as well as journal sales, have historically depended on the citation data embodied in ‘impact factor’; and funders are now increasingly seeking wider measures of impact. Librarians have increasingly drawn on usage data to make purchasing decisions, and are beginning to be more data-driven in their strategic decision-making about the future of their institutional libraries.
Metrics (and altmetrics) are therefore a hot topic of debate. However, it is possible that this wholly necessary focus on a set of crucial issues particular to our industry diverts attention from the more commercial uses of analytics by publishers, where there is a far greater degree of commonality with other areas of business, and opportunities to draw on a larger pool of best practice.
I quite often see aspects of our commercial environment cited as evidence of publishing’s uniqueness – e.g. our buyers are not the same people as our end users – which in reality are rather generic features of almost any business-to-business (B2B) environment. The people who procure tanks on behalf of the military aren’t the people who drive them. The people who book corporate travel aren’t the ones you see standing around in baggage reclaim. And so on.
The marketing issues that arise from such a situation are not unique to publishing either. So why shouldn’t we look at how other sectors use digital to solve similar problems to our own?
And when we do so, what we find is that B2B digital marketing has undergone a revolution in recent years.
The revolution in content marketing
Online has become an increasingly dominant factor in B2B marketing in the last decade, with upwards of 90% of sales commencing with a web search. SEO has been an important part of the mix as a result. Frequent changes to the Google algorithm have led to the familiar cat-and-mouse game with SEO agencies; however, there are signs that SEO is finally losing this contest, as Google incorporates more semantic technology and foregrounds relevance (rather than authority) as the most important weighting factor in how it ranks web pages.
‘Quality content’ is now the mantra of a whole new practice known as content marketing – using content to market products and services online. The skillsets of marketing departments are changing to reflect this new emphasis on content generation and curation. Marketing is becoming more data-driven – and more automated, with tools and platforms such as Hubspot.