Research can give economies a competitive edge (DTI 1999). This is partly why the UK Research Councils fund around £3 billion of research projects annually. There is, however, much competition among researchers for access to funding; after all, only a finite amount of money is available. By defining impact in terms of a set of desirable criteria, funding bodies have attempted to understand what the ‘best’ research is, so that funding can be targeted. Impact metrics allow an element of this assessment to be based on (easily comparable) numerical measures.
NESTA’s 2010 report on creating value across boundaries offers a critique of quantitative measures: “Because it can seem foolish to set targets for creativity, curiosity, collegiality or serendipity, policy makers focus on what can be measured”, and goes on to say that “Recognition of new insights and intellectual capacity requires a portfolio of benchmarking and descriptive approaches instead of existing metrics” (Blackwell et al. 2010). The appetite for focusing only on what can be measured is fed by an abundance of apparently compelling data.
These data come in the form of indexed citations, and other figures derived from the index. Although certainly valuable, it is well documented that citation index data (and values derived from them, such as the H-index and Journal Impact Factor) are difficult to interpret meaningfully (Bornmann & Daniel 2007; Bar-Ilan 2008; Adam 2002). Some studies have shown that metrics such as the H-index are relatively simple to manipulate (Arnold & Fowler 2010), and that having articles published in a journal with a favourable impact factor does not necessarily mean they are the most “valuable” contributions (Lowy 1997). The relatively new field of altmetrics is the yin to traditional metrics’ yang: proponents argue that we need to go beyond traditional notions of metrics and consider exactly what impact is being made, and how (Priem et al. 2010). Altmetrics go far beyond citation analysis, considering the relative value of data drawn from blogs, social media content, and download counts, to name a few examples.
Although altmetrics do, by implication, have social elements, they have no direct mechanism or intention to evaluate the impact of research in terms of CoP. Altmetrics is an extremely new, and therefore evolving, field. So far altmetrics are primarily concerned with measuring the signatures of social activity, whether these are found in blogs, social bookmarking, download/view statistics, or elsewhere.
In this paper the argument is based on an abstraction of CoP theory, so that it can be related to the traditions, meanings and practices of scholarly publishing. The resulting line of questioning supports the core modus operandi of traditional metrics (tradmetrics) and altmetrics as tools for measuring impact. However, the research questions that conclude this paper aim to find out whether CoP theory may be a useful element in evaluating impact more holistically, and whether additional impact can be recorded, and perhaps even cultivated, by properly appreciating CoP theory when evaluating impact.
In terms of structure, the paper begins with a discussion of the meaning of impact in the context the paper intends. This is followed by a short introduction to how CoP theory has been interpreted and abstracted for the purposes of this paper. Traditional metrics and altmetrics are then discussed to explore their respective strengths and weaknesses; this includes some reflection on which aspects of social learning (as described by CoP theory) they do and do not address. Finally, the concluding remarks provide some additional context for the research questions proposed.