
Measuring Research Impact: Altmetrics

Use bibliometrics to measure the impact of your publications

Contact Us

For help or advice with any Measuring Research Impact / Bibliometrics query, please email

Guide to Altmetrics


Altmetrics is an emerging movement to use a wider range of metrics in addition to traditional citations. These can include discussion (e.g. Twitter), saving (e.g. Mendeley, Delicious) and sometimes viewing or recommending an item.

Altmetrics can give a fuller picture of impact, but until they are better understood they should be used with caution.

We have a lot more information on our guide to altmetrics.

What are altmetrics?

altmetrics = alternative metrics

Altmetrics are alternative metrics used to measure the impact of research.

The term altmetrics was first proposed in a tweet by Jason Priem in 2010, and further detailed in a manifesto.

The term is not clearly defined and can be used to mean:

  • Impact measured based on online activity, mined or gathered from online tools and social media, for example:
      • tweets, mentions, shares or links,
      • downloads, clicks or views,
      • saves, bookmarks, favourites, likes or upvotes,
      • reviews, comments, ratings, or recommendations,
      • adaptations or derivative works, and
      • readers, subscribers, watchers, or followers.

  • Metrics for alternative research outputs, for example citations to datasets.

  • Other alternative ways of measuring research impact.

Altmetrics can be used as an alternative to, or in addition to, traditional metrics such as citation counts and impact factors.

Image: Altmetric bookmarklet result for the article: Piwowar, H. (2013). Altmetrics: value all research products. Nature, 493(7431), 159. Screenshot taken 4 November 2014.

Speed Altmetrics can accumulate more quickly than traditional metrics such as citations.
Range Altmetrics can be gathered for many types of research output, not just scholarly articles.
Granularity Altmetrics can provide metrics at the article level, rather than the journal level (as with journal impact factors).
Detail Altmetrics can give a fuller picture of research impact using many indicators, not just citations.
Non-academic Altmetrics can measure impact outside the academic world, where people may use but not cite research.
Sharing If researchers get credit for a wider range of research outputs, such as data, it could motivate further sharing.
Standards There is a lack of standards for altmetrics.
Unregulated Altmetrics could be manipulated or gamed.
Reliability Altmetrics may indicate popularity with the general public rather than research quality.
Time There is no single widely used rating or score, and altmetrics can be time-consuming to gather.
Difficulty Altmetrics can be difficult to collect; for example, bloggers or tweeters may not use unique identifiers for articles.
Overload There are many different metrics and providers to choose from, and it can be hard to determine which are relevant.
Acceptance Many funders and institutions use traditional metrics to measure research impact.
Context Use of online tools may differ by discipline, geographic region, and over time, making altmetrics difficult to interpret.

An article by Linda Aiken in 2014 is a good example of these features.

Items with a DOI have an Altmetric 'badge' if statistics are available. This works better for items published after July 2011; it is likely to underestimate activity for older items.
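Because these statistics are keyed by DOI, they can also be looked up programmatically. The sketch below (in Python) assumes Altmetric's free public details API at `api.altmetric.com/v1/doi/<doi>`; the response field names shown (such as `cited_by_tweeters_count`) are illustrative, so check the Altmetric API documentation before relying on them.

```python
# Sketch: looking up attention data for one DOI via the Altmetric
# public details API. Field names are illustrative assumptions;
# consult the API documentation for the authoritative list.
import json
import urllib.error
import urllib.request

API_BASE = "https://api.altmetric.com/v1/doi/"


def altmetric_url(doi: str) -> str:
    """Build the details request URL for a given DOI."""
    return API_BASE + doi


def summarise(record: dict) -> dict:
    """Extract a few headline figures from an API response dict."""
    return {
        "score": record.get("score", 0),
        "tweeters": record.get("cited_by_tweeters_count", 0),
        "news_outlets": record.get("cited_by_msm_count", 0),
    }


def fetch_summary(doi: str) -> dict:
    """Fetch and summarise the Altmetric record for one DOI.

    Returns an empty dict when the API has no data for the DOI
    (the service responds with HTTP 404 in that case).
    """
    try:
        with urllib.request.urlopen(altmetric_url(doi)) as resp:
            return summarise(json.load(resp))
    except urllib.error.HTTPError as err:
        if err.code == 404:  # no attention data recorded for this DOI
            return {}
        raise
```

As with the badges, older items are likely to be under-represented, so absent or low numbers do not necessarily mean an item attracted no attention.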

There are also download statistics for any item with available files. These cover only the version in ePrints Soton, not other digital versions (e.g. on publishers' websites or in other repositories). They can be particularly useful where items are mostly accessed via ePrints Soton (e.g. theses, working papers and datasets).

Altmetrics in Pure

Items with a DOI have Altmetric and PlumX 'badges' on the Pure Portal if statistics are available. You can also see these on Pure Portal profiles, for example for people and research units. These metrics are likely to underestimate activity for older items.

It is also possible to report on the (numeric) PlumX metrics in Pure.

Altmetrics is a movement to provide information about the attention a research output receives on social media, in tabloids, on TV and radio, on Wikipedia, in public policy, and in other influential public spaces. See our Altmetrics page for more details.

Examples include Altmetric (integrated into ePrints Soton) and PlumX (integrated with Scopus and DelphiS).

Like citation metrics, altmetrics should only be used as an imprecise indicator of how much attention an article is receiving relative to articles of a similar age; while excellent scores may indicate excellent engagement, the opposite is not necessarily true.

PlumX Metrics

Also like citation metrics, they cannot tell the difference between negative and positive mentions - negative media attention can also lead to a high score (e.g. the paper 'Attractiveness of women with rectovaginal endometriosis' scores highly in both Altmetric and PlumX).