Hello, can you hear me?

We live in a world of competing voices, each struggling for attention amongst the hubbub. Now, more than ever, it is easy to express oneself, yet this very ease can also make it more difficult to be heard. Academics cannot escape this new paradigm either. In the past they would contribute journal articles, write books and present papers at conferences; now they are increasingly expected to write blogs, tweet and maintain a more visible online presence. But are they being heard?

This week in our DITA class we were exploring the world of altmetrics: alternative metrics by which the impact of academic journal articles can be measured. Traditionally in academia, the 'success' of an article was measured by the number of citations it received; this still remains a valid and important measure. Nevertheless, there has been a move in recent years towards identifying and measuring the broader societal impact of academic work. These twin complementary approaches can hopefully provide a clearer picture of the impact an article has. I believe this to be a very positive step, because for academic research to be truly meaningful it needs to be disseminated and read as broadly as possible, rather than remaining largely irrelevant and read only by, and of interest to, fellow academics.

Advances in technology, particularly the development of social media and the APIs which permit us to engage with the data generated there, make the generation of these altmetrics possible. So how can we assess the societal impact of an article? To be traceable by altmetrics, a document needs a Digital Object Identifier (DOI): a unique string of characters which links to machine-readable metadata about the document. Rather than counting citations in other articles, altmetrics counts the number of mentions an article receives on social media, in blogs and in news reports, along with page views. A score is then produced for the article based upon the level of attention it has received and the quality of that attention: a low score indicates that an article has made little impact, whereas a higher score indicates a larger one.
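Mechanically, this kind of scoring amounts to a weighted count of mentions per source. Here is a rough sketch in Python; the weights and source names below are entirely made up for illustration, since Altmetric's real weightings and data sources are proprietary:

```python
# Hypothetical per-source weights: a news story is assumed to signal
# more attention than a single tweet or Facebook post.
MENTION_WEIGHTS = {
    "news": 8.0,
    "blogs": 5.0,
    "twitter": 1.0,
    "facebook": 0.25,
}

def toy_attention_score(mentions: dict) -> float:
    """Combine per-source mention counts into one attention score."""
    return sum(MENTION_WEIGHTS.get(source, 0.0) * count
               for source, count in mentions.items())

# e.g. an article picked up by 2 news stories and 40 tweets:
print(toy_attention_score({"news": 2, "twitter": 40}))  # -> 56.0
```

The point of the sketch is simply that breadth and type of attention, not just raw volume, drive the score; sources the scheme does not recognise contribute nothing.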

With a score of 8,298, the highest-rated article on altmetric.com is a scientific abstract examining an aspect of the ecological damage caused by the Fukushima nuclear disaster in 2011. The main driver of its very high score was the fact that a link to it was tweeted 16,229 times by 10,015 tweeters. This in itself resulted in the article being referenced in two news reports, by the International Business Times and Chemistry Views. This shows us the entwined and cannibalistic nature of social media: success begets success, and the two news pieces (which were commenting on the success of the article) then combined to push the article's score up further!

Each altmetric score is represented graphically as a multicoloured ring, with each colour representing a separate source where the article was mentioned (e.g. red for a news source, dark blue for Facebook, light blue for Twitter). The more multicoloured the ring, therefore, the more broadly across sources the article has been mentioned; conversely, a ring of just one colour means the article has been mentioned in only one source.

[Image: Altmetric badge ring, served from altmetric-badges.a.ssl.fastly.net] This is the ring for the article with the highest altmetric score, mentioned previously.

Currently altmetrics is best set up to measure the societal impact of scientific articles, so as a historian I was curious to see how History articles fared under this system. I made two separate but related searches into an area I am interested in and have taught: the struggle for civil rights and racial equality in the United States in the 1950s and 60s. For both searches I kept the parameters exactly the same so that a fair comparison could be made between the results. I first searched for articles with the keywords "civil rights" from Journal Subject 'History and Archaeology', mentioned at any time, on any app. I then repeated the search with the keywords "black power". These two searches were then saved in My Workspaces.

My two saved workspaces (each can be exported to Excel):

1. All mentioned articles from journal subject HISTORY AND ARCHAEOLOGY with keywords "black power", with at least one mention on Twitter, Google+, news, LinkedIn, blogs, Pinterest, video, Facebook, Reddit, F1000, research highlights, peer review, Weibo or policy sources.
2. All mentioned articles from journal subject HISTORY AND ARCHAEOLOGY with keywords "civil rights", with at least one mention on the same sources.

I was curious to see where the current focus of scholarship lies in this field. Traditionally the overwhelming majority of research has focussed on the non-violent Civil Rights Movement, yet in 2006, with the publication of The Black Power Movement, there was a slight reorientation towards examining the significance of Black Power within the broader struggle for racial equality.

Both searches returned very limited results, with 'Civil Rights' still proving a more popular topic than 'Black Power'. There were 18 results for Civil Rights, although 3 had to be dismissed for lack of relevance (they focussed on gay civil rights and the environmental civil rights movements), and 5 results for Black Power. Results could be viewed in the interface either as standard or tiled (both of which use the altmetrics ring graphic), or as condensed, which was my preferred view, showing the results clearly in tabular form. Additionally, results can be exported to Excel as CSV files, where filters can be applied to let the user interrogate the data more closely.
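That filtering step need not be done by hand in Excel; a CSV export can just as easily be scripted. A minimal sketch in Python, using an invented two-column file (the real export has more columns and different headers than shown here):

```python
import csv
import io

# Stand-in for an exported results file; titles and scores are invented.
exported = io.StringIO(
    "Title,Score\n"
    "Article A,13\n"
    "Article B,11\n"
    "Article C,1\n"
)

reader = csv.DictReader(exported)
# Keep only articles whose altmetric score reached double figures.
double_figures = [row["Title"] for row in reader if int(row["Score"]) >= 10]
print(double_figures)  # -> ['Article A', 'Article B']
```

The same pattern works on the real export by swapping the `io.StringIO` stand-in for `open("results.csv")` and adjusting the column names to match the file.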

Only two of the articles from the Civil Rights search reached double figures (11 and 13), whilst the highest score for any Black Power article was only 2. This would appear to suggest that both these areas are neglected in scholarship at present, and that the research which is being published has very little resonance on social media.

Altmetrics is a welcome tool; I do, however, have some caveats. Firstly, the results it throws up are largely quantitative: they tell us how widely an article has been mentioned, but not whether its reception was positive. Theoretically, an article which has been very negatively received could nevertheless be given a very high altmetrics score, solely because thousands of people on Twitter said 'check out this article, it really sucks!'. Likewise, altmetrics gives us no indication of the quality of an article.

Furthermore, I have to question the accuracy of the search results. Today I carried out exactly the same searches, just 5 days after my initial ones. The Black Power results were identical, but a further 7 Civil Rights articles were found. Clearly these articles have not been published in the last 5 days, so why did they not appear in the original search?

These are still the very early days of altmetrics, and with time and further development it will hopefully prove as useful to the social sciences as it currently is to the scientific community.


4 Responses to Hello, can you hear me?

  1. Alison Pope says:

    Is the time period facet for activity rather than publications? Can you see new mentions for the 7 additional Civil Rights articles since you originally ran the search?


  2. Pingback: An alternative to traditional metrics | Digital Information Technology and Architecture in the 21st Century

  3. Pingback: Screwing around | Steve Mishkin: For what it's worth
