Tuesday, August 31, 2010

Bibliometrics

I just spent most of the day at a Scopus user event put on by Elsevier, the company that owns the abstracting and indexing database. The major focus of the event was a set of tools Scopus makes available for measuring the impact of authors and journals through citations and impact factors. Bibliometrics is a delicate science, in the sense that it's very easy to take the numbers these tools produce and assign more significance to them than they actually hold. In a culture of accountability it's easy to compare numbers and much harder to compare qualitative measures, so there's always a need to be cautious that you fully understand just what those numbers represent.
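To make concrete what kind of number these tools report (this particular example wasn't part of the event), here is a minimal sketch of one common author-level metric, the h-index, computed from a hypothetical list of per-paper citation counts:

```python
def h_index(citations):
    """Largest h such that at least h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(ranked, start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

# Made-up citation counts for one author's papers
papers = [25, 8, 5, 3, 3, 1, 0]
print(h_index(papers))  # -> 3: three papers each have at least 3 citations
```

The sketch also illustrates the caution above: two authors with very different bodies of work can end up with the same h-index, which is exactly why the number needs interpretation rather than being taken at face value.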

We've seen some of this in the initial user group of our own citations data collection. People are very reluctant to have a number associated with their work, for fear that the number will become a way of judging their performance. However, one of the major outcomes of the event, for me anyway, was a feeling that we're on the right track with our database. Not only are we doing similar work to several other academic and government organizations, but we're also working on a much smaller scale than many of them, which means we can curate our collection of citations in a way that isn't possible for them. I will never be comfortable saying that we've captured every single citation of our work, but I do think we have good enough data to draw some conclusions about which kinds of our work are getting the widest dissemination, and presumably having the most impact (an assumption that underlies all bibliometric tools). That is what we set out to do.
