Interview & Opinion

Alternative science metrics

The so-called “donut” by Altmetric

Who publishes most? Which article is cited where, and how often? For many years, everything in science has revolved around so-called impact factors, which determine who is successful and who is not. In the field of impact measurement, altmetrics are non-traditional metrics proposed as an alternative to traditional citation-based metrics: they evaluate the attention a publication receives in online media – including platforms such as Facebook and Twitter, but also other sources such as policy documents.

Altmetrics assess a variety of citation and usage data of scientific contributions. The aim is to produce a multi-faceted image of the public perception and relevance of research results. Altmetrics are intended as a supplement to, not a replacement for, other impact measurements.

Where do altmetrics data come from?

The Austrian Science Fund FWF receives data from the UK company Altmetric, currently one of the most important altmetrics providers. The company tracks and collates 19 types of online data sources for mentions or uses of specified publications. Depending on the type, these may comprise just one source (e.g. Twitter or Wikipedia) or many individual sources, such as the approximately 9,000 blogs analysed by the company. The overall result of this search for one specific publication is presented in the shape of a “donut” (see image on the left), whose colours illustrate the heterogeneity of the sources of attention for a given publication.
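As a minimal illustration of how such a donut breakdown comes about, the sketch below aggregates per-source mention counts for one publication into attention shares. The source names and counts are invented for illustration; they are not Altmetric's actual data, source list or scoring method.

```python
from collections import Counter

# Hypothetical mention counts for one publication, grouped by source type.
# (Invented numbers -- Altmetric tracks 19 source types in reality.)
mentions = Counter({
    "twitter": 42,
    "news": 3,
    "blogs": 5,
    "policy_documents": 1,
    "wikipedia": 2,
})

# The donut visualisation shows each source type's share of total attention.
total = sum(mentions.values())
shares = {source: count / total for source, count in mentions.items()}

for source, share in sorted(shares.items(), key=lambda kv: -kv[1]):
    print(f"{source:>18}: {share:6.1%}")
```

Even in this toy example, one source (Twitter) dominates the total – the same pattern the article criticises below when discussing the aggregated score.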

The data sources used by Altmetric are comprehensive and diverse, but not uncontroversial: there is no empirical or conceptual justification for the selection of sources, and despite the wealth of data, many politically relevant sources at national or regional level are not included. The main focus is on English-language sources, although several German-language and Austrian daily or weekly newspapers are included, such as the online versions of Der Spiegel, Der Standard and Die Presse.

What is – and what isn’t – the relevance of altmetrics?

The debate about the information value of altmetrics is ongoing. They clearly do provide a broader picture of the impact of research output, one that goes beyond the confines of the science system and is not equivalent to purely scientific success. This type of impact is referred to as “societal impact”, although it is not yet entirely clear what exactly is being measured here. Mentions of publications in patents, textbooks, medical-clinical guidelines or policy documents are considered a good way of capturing the societal impact of research. Mentions or reviews on Twitter or Facebook, on the other hand, are not necessarily an indicator of the relevance of research output; they mainly demonstrate the interest and curiosity of a specific readership.

Against this backdrop, the finely differentiated visualisation of the results by Altmetric does seem useful, but the aggregation of the data into a single total value is questionable. The magnitude of this value is hard to interpret: what does a value of 1,348, as shown in the picture above, really mean? In addition to an explanation of the complicated computational basis, one would need to know whether 1,348 is an average or perhaps an outstanding value, which would presuppose a standardisation of such data. The difficulty of substantive interpretation is exacerbated by deficiencies in the empirical material: the total score seems dominated by Twitter data, whose usefulness is dubious, and certain indicators (such as download or read rates) are easy to manipulate. The Altmetric company is aware of these weaknesses and openly discusses them. A final judgement seems premature, since altmetrics are still very much at the beginning of their development.
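One conceivable form of the standardisation called for here is a percentile rank against a reference set of publications. The sketch below assumes such a reference distribution of total scores is available; the function name and all numbers are invented for illustration and do not reflect any real Altmetric data.

```python
from bisect import bisect_left

def percentile_rank(score: float, reference_scores: list[float]) -> float:
    """Fraction of reference publications with a score below `score`."""
    ordered = sorted(reference_scores)
    return bisect_left(ordered, score) / len(ordered)

# Invented reference distribution of total attention scores.
reference = [0, 1, 1, 2, 3, 5, 8, 12, 20, 55, 130, 1348, 2400]

# A raw score of 1,348 only becomes interpretable relative to a baseline.
print(f"{percentile_rank(1348, reference):.0%}")
```

Against this (invented) baseline, 1,348 would sit high in the distribution – precisely the kind of context a bare total score cannot convey on its own.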

So, what do altmetrics tell us? Fundamentally, they demonstrate the attention paid to research output by various groups in society. Moreover, certain indicators have the potential to demonstrate “societal impact”, while other, quite prominent, sources of altmetrics such as Twitter or Facebook seem inappropriate for this purpose. And what can altmetrics obviously not do? Measure the quality of research.

Why is the FWF testing altmetrics?

The legitimacy of a research funding organisation rests, inter alia, on the scientific quality and the scientific impact of the funded research, which can be captured by analysing citations in scientific journals; such analyses are already conducted by the FWF. Societal relevance, by contrast, is hard to capture, yet demonstrating it instead of merely postulating it is becoming more and more important. Hence, the FWF is testing altmetrics to find out whether a direct impact of basic research beyond the confines of the science system can be demonstrated, and whether and in what way alternative indicators provide useful additional input for assessing the relevance of basic research. While altmetrics do have weaknesses, they also show potential; their inherent diversity makes testing them worthwhile.

What the FWF is not going to use altmetrics for!

The testing of altmetrics relates to funded research in its totality. The data will not be used to assess individual scholars and will have no impact on future funding decisions.

Personal details

Ralph Reimann is a member of the FWF’s scientific team in the “Strategy – Policy, Evaluation, Analysis” Department with a focus on data analyses. More information at:

More information

FWF Altmetrics results

This contribution is based on the following publications:

Bornmann, L. (2014). Validity of altmetrics data for measuring societal impact: A study using data from Altmetric and F1000Prime. Journal of Informetrics, 8: 935. [doi: 10.1016/j.joi.2014.09.007]
Bornmann, L. & Haunschild, R. (2017). Does evaluative scientometrics lose its main focus on scientific quality by the new orientation towards societal impact? Scientometrics, 110: 937. [doi: 10.1007/s11192-016-2200-2]
Bornmann, L., Haunschild, R. & Marx, W. (2016). Policy documents as sources for measuring societal impact: how often is climate change research mentioned in policy-related documents? Scientometrics, 109: 1477. [doi: 10.1007/s11192-016-2115-y]
Erdt, M., Nagarajan, A., et al. (2016). Altmetrics: an analysis of the state-of-the-art in measuring research impact on social media. Scientometrics, 109: 1117. [doi: 10.1007/s11192-016-2077-0]
Patthi, B., Prasad, M., et al. (2017). Altmetrics – A Collated Adjunct Beyond Citations for Scholarly Impact: A Systematic Review. Journal of Clinical and Diagnostic Research, 11(6): ZE16–ZE20. [doi: 10.7860/JCDR/2017/26153.10078]
Piwowar, H.A. (2013). Altmetrics: Value all research products. Nature, 493: 159. [doi: 10.1038/493159a]
Priem, J., Piwowar, H.A. & Hemminger, B.M. (2012). Altmetrics in the Wild: Using social media to explore scholarly impact. [arXiv: 1203.4745v1]
Robinson-Garcia N., Costas, R., et al. (2017). The unbearable emptiness of tweeting—About journal articles. PLoS ONE 12(8): e0183551. [doi: 10.1371/journal.pone.0183551]
Robinson-Garcia, N., Torres-Salinas, D., et al. (2014). New data, new possibilities: exploring the inside of Altmetric.com. El Prof Inf, 23: 359. [doi: 10.3145/epi.2014.jul.03]
Wardle, D. (2016). Why Altmetric scores should never be used to measure the merit of scientific publications (or ‘how to tweet your way to honour and glory’). Ideas in Ecology and Evolution, 9: 1. [doi: 10.4033/iee.2016.9.1.e]
Wilsdon, J., Allen, L., et al. (2015). The Metric Tide: Report of the Independent Review of the Role of Metrics in Research Assessment and Management. [doi: 10.13140/RG.2.1.4929.1363]
