In which disciplines are we most successful? What opportunities do we have?
Background
- The publisher of a leading, multi-disciplinary science journal wanted to evaluate the representation and impact of articles within and across the many fields the journal covers. Because citation behavior differs by discipline, the publisher routinely examines citation averages within specific fields.
- The journal is highly selective and is often authors' first choice. The publisher had an idea of where its rejected papers ended up being published, but wanted to confirm those assumptions about the top journals across disciplines.
- The publisher wanted to know which disciplines within the journal were the strongest in terms of number of papers and total citations.
Solution
Analysis with Impact Vizor
- For each field, the publisher used the Rejected Article Tracker to evaluate the proportion of published papers (blue) and rejected papers (red). Among the fields with the most published papers, some show an even higher percentage of rejected articles (blue box), which indicates a robust pipeline of research and a strong reputation in that field.
- Journal editors seek the most significant research, but because citation averages vary across fields, each field must be evaluated against its own baseline. The green bars indicate the proportion of published papers in each discipline that were cited above the journal’s average citation level; in several fields (blue box), more than 50% of papers exceed the journal’s citation average. (A simple sketch of these per-field calculations follows this list.)
- In only two disciplines (red box) were no published articles cited above the journal’s average (no green bar). In these specialized fields, fewer than 15% of the rejected papers outperformed the journal’s citation average (red bar) or the field’s citation average (blue bar).
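The proportions described above reduce to simple per-field tallies. As a rough illustration only, the sketch below computes them with pandas from a hypothetical export of submission records; the column names (field, status, citations) and the sample data are assumptions, and the snippet does not call any Impact Vizor or Rejected Article Tracker API.

```python
import pandas as pd

# Hypothetical export of submission records: one row per manuscript,
# with the discipline, the editorial outcome, and the citation count.
records = pd.DataFrame({
    "field":     ["Ecology", "Ecology", "Ecology", "Ecology",
                  "Genetics", "Genetics", "Genetics",
                  "Physics", "Physics"],
    "status":    ["published", "published", "rejected", "rejected",
                  "published", "published", "rejected",
                  "published", "rejected"],
    "citations": [40, 12, 30, 5, 3, 55, 21, 8, 2],
})

# Journal-wide citation average, taken over published papers only.
journal_avg = records.loc[records["status"] == "published", "citations"].mean()

rows = []
for field, g in records.groupby("field"):
    pub = g.loc[g["status"] == "published", "citations"]
    rej = g.loc[g["status"] == "rejected", "citations"]
    rows.append({
        "field": field,
        # Share of the field's submissions that were published vs. rejected.
        "pct_published": len(pub) / len(g),
        "pct_rejected": len(rej) / len(g),
        # Share of the field's published papers cited above the journal average
        # (the "green bar" proportion).
        "pct_pub_above_journal_avg": (pub > journal_avg).mean(),
        # Share of the field's rejected papers that outperformed the journal
        # average and the field's own average among its published papers.
        "pct_rej_above_journal_avg": (rej > journal_avg).mean(),
        "pct_rej_above_field_avg": (rej > pub.mean()).mean(),
    })

summary = pd.DataFrame(rows).set_index("field")
print(summary.round(2))
```

The same per-field table can then be sorted by paper count or total citations to see where the journal is strongest, as described in the Background.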
Conclusions
- Across disciplines, the journal’s editors selected highly cited papers within each field, and in many disciplines the published papers outperformed both the journal-wide and the field-level citation averages.
- There is evidence to support the publisher’s decision to launch a new subject-specific journal: the volume of papers on the topic was at capacity for the multi-disciplinary journal, and the submission pipeline remained strong. Highly cited papers rejected by the multi-disciplinary journal may be in scope for the new, subject-specific journal.
- Further analysis confirmed the publisher’s expectation that most of the rejected, top-cited papers were published in other top-tier journals.
This information was presented during the panel session “The Many Pragmatic Uses of Impact Vizor” at the 2016 Fall HighWire Publisher’s Meeting.