Tuesday, 25 March 2014

The Alphabet Soup of Measuring Scientific Output

When I returned to Australian science after twelve years in the USA, one of the first things I encountered was the use of metrics to try to assess a researcher's output. I was used to writing grants in the US system, where I would simply list my number of publications. Over drinks with Mark Schembri and Steve Djordjevic at a pub in Adelaide in 2007, I was introduced to the concept of an H-index, which I had never heard of before.

After indoctrination into the mysteries of the H-index, I learnt that this metric tries to measure the impact of a scientist's publications by factoring in how often they are cited by other scientific publications. So an H-index of 25 (about typical for a tenured professor) means that you have published 25 papers that have each received at least 25 citations. Depending on how you define citations, your H-index may vary considerably. For instance, according to Thomson ISI Web of Science, my H-index is 83, but according to Google Scholar, my H-index is 94.
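For the computationally minded, the definition above is easy to turn into code. Here's a quick sketch (my own illustration, not anything the citation databases publish) of how an H-index falls out of a list of per-paper citation counts:

```python
def h_index(citations):
    """H-index: the largest number h such that h of your papers
    have each received at least h citations."""
    # Rank papers from most to least cited, then find the largest
    # rank r where the r-th paper still has at least r citations.
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Five papers with these citation counts give an H-index of 4:
# four papers have at least 4 citations each, but not five with 5.
print(h_index([10, 8, 5, 4, 3]))  # 4
```

This also makes the database discrepancy obvious: feed in Web of Science's citation counts and you get one H-index; feed in Google Scholar's more generous counts and you get another.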

H-indices have a number of limitations. For example, they are a cumulative measure, so a young scientist will virtually always have a much lower H-index than an older scientist; citation rates vary a lot between different scientific fields; and review articles tend to be much more highly cited than original research papers, so write lots of reviews if you want a better H-index.

I've subsequently encountered a whole alphabet soup of ever more obscure metrics, which I don't really use. There is an M-index, which is your H-index divided by the number of years since you published your first paper. There's an i10-index, which is the number of your publications that have received at least 10 citations, and many others.
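In the same illustrative spirit as above, these two are even simpler to compute (again, my own sketch of the definitions, not any database's code):

```python
def m_index(h, career_years):
    """M-index: H-index divided by years since first publication,
    a rough correction for career length."""
    return h / career_years

def i10_index(citations):
    """i10-index: the number of papers with at least 10 citations."""
    return sum(1 for c in citations if c >= 10)

# A researcher with an H-index of 30 who published their first
# paper 15 years ago has an M-index of 2.0.
print(m_index(30, 15))            # 2.0
print(i10_index([25, 12, 10, 9, 3]))  # 3
```

The M-index at least addresses the career-length problem noted above, since two scientists with the same H-index but different career lengths get different M-indices.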

This latest grant round, Karl Hassan in my group introduced me to an entirely new metric for scientific impact: Altmetric. Altmetric tracks mentions of your scientific publications in news media, on Twitter, on blogs, etc., so it gives some indication of broader impact. It's pretty cool, though I'm not sure how much direct use I will make of it. But let's do a quick analysis.

IMHO, our two most important papers published in the last year were:

Hassan, K. A., Jackson, S. M., Penesyan, A., Patching, S. G., Tetu, S. G., Eijkelkamp, B. A., Brown, M. H., Henderson, P. J., and Paulsen, I. T. (2013) Transcriptomic and biochemical analyses identify a family of chlorhexidine efflux proteins. Proceedings of the National Academy of Sciences, USA 110: 20254-20259.

Tetu, S. G., Breakwell, K., Elbourne, L. D. H., Holmes, A. J., Gillings, M. R., and Paulsen, I. T. (2013) Life in the Dark: Metagenomic evidence that a microbial slime community is driven by inorganic nitrogen metabolism. ISME Journal 7: 1227-1236.

What does Altmetric think?
Hassan et al.: 7 news stories, 2 blog posts (curiously, not including my blog), 15 tweets, and a recommendation on Faculty of 1000, which puts it in the top 2% of all papers in terms of impact.

Tetu et al.: 1 blog post (also not including my blog), 6 tweets, 2 Facebook posts, and 1 mention on Reddit, which puts it in the top 6% of all papers in terms of impact (curiously, I know this paper was featured in many news articles, yet none of those are included).

After quickly checking all our other 2013 publications, Altmetric and I agree: those are our two highest-impact papers of the year. Based on this small sample, then, Altmetric seems to do an effective job. It also provides a very nice way of summarising who is blogging or tweeting about your work.
