As more information is disseminated electronically, researchers will come to interact with that information on the open web in a variety of ways and through many different platforms and media types. Altmetrics measure how often a journal article is downloaded, shared, commented on, and cited across social media outlets, and can provide a meaningful indicator of an article's impact among different user populations.
Jason Priem and Bradley Hemminger, founders of the study of altmetrics, have compiled a list of sources from which data can be collected and relayed back to scholars as meaningful impact data. These sources fall into seven categories:
Culling usage data from various social sites has several advantages. First, sites with open APIs (application programming interfaces) can be queried immediately for up-to-date usage statistics, an advantage over traditional citations, which accrue slowly. Second, a growing number of commercial platforms such as ImpactStory, Altmetric.com, and Plum Analytics allow scholars to track usage of their works across research blogs, journals, and user populations. This granularity of data presents a broader perspective on overall impact. It is important to note that altmetrics are not meant to replace traditional citation counts; they are best used in conjunction with citations for an overall picture of scholarly impact.
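As a concrete illustration of querying an open API, the sketch below builds a request against Altmetric's public v1 endpoint (as described in Altmetric's API documentation); the example DOI is illustrative, and the specific response fields shown are assumptions about a typical payload:

```python
import json
import urllib.request

# Base endpoint for Altmetric's free, rate-limited v1 API.
ALTMETRIC_API = "https://api.altmetric.com/v1/doi/"

def altmetric_url(doi: str) -> str:
    """Build the Altmetric v1 API URL for a given DOI."""
    return ALTMETRIC_API + doi

def fetch_altmetrics(doi: str) -> dict:
    """Fetch attention data for one article as a dict.

    Altmetric returns 404 for DOIs it has never seen, which
    urllib surfaces as an HTTPError.
    """
    with urllib.request.urlopen(altmetric_url(doi)) as resp:
        return json.load(resp)

# Usage (requires network access; DOI is illustrative):
#     data = fetch_altmetrics("10.1038/nature12373")
#     print(data["score"], data.get("cited_by_tweeters_count", 0))
```

Because the data comes back as structured JSON, a script like this can be re-run at any time for current numbers, which is the immediacy advantage noted above.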
Limitations of Traditional Measures
Professor Pádraig Cunningham of the UCD School of Computer Science and Informatics outlines how the h-index is calculated and why it is not a useful metric for early career researchers:
Although article-level metrics (ALMs) are not completely new, and ALMs are not strictly limited to articles, SPARC recently provided a definition of the concept:
"Article-Level Metrics (ALMs) are a new approach to quantifying the reach and impact of published research. Historically, impact has been measured at the journal level. A journal’s average number of citations to recent articles (i.e., its impact factor) has for years served as a proxy for that publication’s importance. Articles published in highly-cited journals were viewed as impactful by association. As electronic dissemination of scholarly content surpassed print, it became easier to disaggregate an individual article’s impact from the publication in which it appeared. It also became possible to track different markers of an article’s reach, beyond just citations. ALMs seek to incorporate new data sources (sometimes referred to as “altmetrics”) along with traditional measures to present a richer picture of how an individual article is being discussed, shared, and used."
— Greg Tannenbaum, "Article-Level Metrics: A SPARC Primer" (2013).
Public Library of Science (PLOS) offers ALM Reports allowing you to collect metrics for PLOS articles and quickly visualize results.
How to Improve Altmetric Scores
To improve your altmetric scores you need to create an online presence and share information about your work and your research outputs online. There are many ways to do this such as:
Tools to Measure Your Scholarly Work:
Additional Metric Tools and Embedding Instructions:
Tenure & Promotion Guidelines
Some faculty are still unfamiliar with altmetrics, so do your homework before deciding whether or not to include altmetrics in your dossier. Ask colleagues in your department who have recently gone up for promotion and tenure (P&T), as well as your department chair, mentor, or anyone else familiar with the P&T process in your department and institution.
If you do choose to use altmetrics in your dossier, keep in mind that it's best to be selective with the metrics you plan to include. It's much more effective to include metrics that showcase the types of impact you're looking to document, rather than taking a "kitchen sink" approach (which might overwhelm your reviewers with numbers).
Promotion & tenure preparation guidelines rarely include instructions on how to use impact metrics. Or, when they do, the guidelines usually only address citation metrics or, worse, recommend using journal impact factors.
These instructions often also lack guidance on how to make the metrics meaningful. For example, what does it mean if a tenure candidate reports 5 citations for a paper published in 2013? Whether that is a good or bad number usually depends on the average citation rate in the candidate's field, and also on the year the paper was published (older papers tend to have more citations simply by virtue of having been around longer).
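One simple way to give such a number context is to normalize it by the paper's age and compare against a field average. The sketch below shows the arithmetic; the field average used is purely hypothetical, since real baselines vary widely by discipline:

```python
def citations_per_year(citations: int, pub_year: int, current_year: int) -> float:
    """Normalize a raw citation count by the paper's age in years."""
    age = max(current_year - pub_year, 1)  # avoid dividing by zero for brand-new papers
    return citations / age

# The example from the text: 5 citations for a 2013 paper, evaluated in 2016.
rate = citations_per_year(5, 2013, 2016)  # about 1.67 citations per year

# Compare against a hypothetical field average (citations per paper per year).
field_average = 2.0
verdict = "above" if rate > field_average else "at or below"
```

Even this crude normalization makes the dossier number easier to interpret than a bare count, which is the kind of guidance P&T instructions typically omit.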
There's an obvious need for clear instructions on how to use impact metrics in tenure & promotion dossiers. And there's also a need for guidelines to help dossier reviewers make sense of the numbers.
A small but growing number of universities include altmetrics in their tenure & promotion preparation guidelines. These include the University of Colorado Denver Medical School (PDF; page 84) and IUPUI (see: "The Guidelines for Preparing and Reviewing Promotion and Tenure Dossiers").
If you're interested in updating your university promotion & tenure guidelines to better document the use and interpretation of impact metrics, contact your faculty senate (or similar organization) to learn more about how that might work on your campus. You might also get in touch with your Vice Provost for Faculty & Academic Affairs (or similar campus office that oversees the writing of such guidelines). Some examples include:
Resumes & CVs
Some examples include:
"Altmetric" vs. Altmetrics
Confusingly, there is also a company named Altmetric, which collects and provides altmetrics for journals and articles. Many large publishers have contracts with Altmetric, so its trademark "donut" visualization can be found in many places. Altmetric badges can also appear on any deposited journal article that has a DOI.
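For readers who want to add such a badge to their own page, Altmetric's published embed instructions boil down to loading one script and adding a placeholder element. The sketch below generates that markup as strings; the script URL and data attributes follow Altmetric's embed documentation, but check their current instructions before relying on them:

```python
# Script tag that loads Altmetric's badge renderer (per their embed docs).
EMBED_SCRIPT = (
    '<script type="text/javascript" '
    'src="https://d1bxh8uas1mnw7.cloudfront.net/assets/embed.js"></script>'
)

def badge_div(doi: str, badge_type: str = "donut") -> str:
    """Return the placeholder <div> the embed script replaces with a badge."""
    return (
        f'<div class="altmetric-embed" '
        f'data-badge-type="{badge_type}" data-doi="{doi}"></div>'
    )

# Example: markup for a badge on a hypothetical article page.
page_snippet = EMBED_SCRIPT + "\n" + badge_div("10.1000/example.doi")
```

The script scans the page for elements with the `altmetric-embed` class and draws the donut for each `data-doi` it finds, so multiple badges only need the script loaded once.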