
Academic Publishing

An introduction to adaptations in the academic publishing model, including open access publishing and archiving, author rights, and tools and strategies for measuring the impact of research.

As more information is disseminated electronically, researchers interact with it on the open web in a variety of ways and through many different platforms and media types. Altmetrics measures how often a journal article is downloaded, shared, commented on, and cited in social media outlets, and can provide a meaningful indicator of an article's impact among different user populations.

Jason Priem and Bradley Hemminger, early proponents of the study of altmetrics, have compiled a list of sources from which data can be collected and relayed back to scholars as meaningful impact data. These sources fall into seven categories:

  1. Social bookmarking sites (Delicious, CiteULike, Connotea)
  2. Reference managers (Zotero, Mendeley)
  3. Recommendation sites (Digg, Reddit, FriendFeed, Faculty of 1000)
  4. Publisher-hosted comment spaces (PLoS, British Medical Journal)
  5. Microblogging (Twitter)
  6. Blogs (WordPress, Blogger, Researchblogger)
  7. Social networks (Facebook, Nature Networks, Orkut)

Culling usage data from various social sites has several advantages. First, sites that offer open APIs (application programming interfaces) can be queried immediately for up-to-date usage statistics – an advantage over traditional citations, which accrue slowly. Second, a growing number of commercial platforms such as ImpactStory, Altmetric.com, and Plum Analytics allow scholars to track usage of their works across research blogs, journals, and user populations. This allows a granularity of data and presents a broader perspective on overall impact. It is important to note that altmetrics is not meant to replace traditional citation metrics, but is best used in conjunction with them for an overall picture of scholarly impact.
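As an illustration of the open-API point above, the sketch below pulls a per-article citation count from Crossref's freely accessible REST API. The endpoint and field name reflect Crossref's public works API as I understand it; the sample record (DOI and count) is purely illustrative, and the function names are my own.

```python
import json
from urllib.request import urlopen

# Crossref's public works API (no key required); the
# "is-referenced-by-count" field holds the citation count.
CROSSREF_WORKS = "https://api.crossref.org/works/{doi}"

def extract_citation_count(record: dict) -> int:
    """Pull the citation count out of a Crossref works record."""
    return record.get("message", {}).get("is-referenced-by-count", 0)

def fetch_citation_count(doi: str) -> int:
    """Query Crossref for a DOI and return its citation count."""
    with urlopen(CROSSREF_WORKS.format(doi=doi)) as resp:
        return extract_citation_count(json.load(resp))

# Offline example showing the shape of a Crossref response
# (the DOI and count here are made up, not real data):
sample = {"message": {"DOI": "10.1234/example.doi",
                      "is-referenced-by-count": 57}}
print(extract_citation_count(sample))  # 57
```

Commercial altmetrics platforms aggregate many such sources behind one dashboard, but the same open-API principle underlies them.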

Limitations of Traditional Measures

Professor Pádraig Cunningham of the UCD School of Computer Science and Informatics outlines how the h-index is calculated and why it is not a usable metric for early career researchers.
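For reference, the h-index itself is simple to compute: a researcher has index h if h of their papers have at least h citations each. A minimal sketch in Python (function and variable names are my own):

```python
def h_index(citations: list[int]) -> int:
    """Largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(ranked, start=1):
        if count >= rank:
            h = rank  # at least `rank` papers have >= `rank` citations
        else:
            break
    return h

# A well-cited early-career researcher with only three papers is
# capped at h = 3, no matter how heavily those papers are cited:
print(h_index([120, 85, 40]))     # 3
print(h_index([10, 8, 5, 4, 3]))  # 4
```

The h-index can never exceed the number of publications, which is exactly why it disadvantages early career researchers with short publication lists.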

Although article-level metrics (ALMs) are not completely new, and not all such metrics are strictly article-level, SPARC recently provided a definition of the concept:

"Article-Level Metrics (ALMs) are a new approach to quantifying the reach and impact of published research. Historically, impact has been measured at the journal level. A journal’s average number of citations to recent articles (i.e., its impact factor) has for years served as a proxy for that publication’s importance. Articles published in highly-cited journals were viewed as impactful by association. As electronic dissemination of scholarly content surpassed print, it became easier to disaggregate an individual article’s impact from the publication in which it appeared. It also became possible to track different markers of an article’s reach, beyond just citations. ALMs seek to incorporate new data sources (sometimes referred to as “altmetrics”) along with traditional measures to present a richer picture of how an individual article is being discussed, shared, and used."

— Greg Tannenbaum, "Article-Level Metrics: A SPARC Primer" (2013).

The Public Library of Science (PLOS) offers ALM Reports, which allow you to collect metrics for PLOS articles and quickly visualize the results.


How to Improve Altmetric Scores

To improve your altmetric scores you need to create an online presence and share information about your work and your research outputs online. There are many ways to do this such as:

  • Blog about your articles or work and ask others to write blog posts about your work.
  • Be active on Twitter and tweet links to your articles and other work.
  • Use social networks for researchers. Create a profile and add your publication list to academic social networking sites, such as Academia.edu, ResearchGate and Mendeley.
  • Register for researcher identifiers such as an ORCID iD or ResearcherID, and keep your list of publications up to date.
  • Make all your research outputs available online, including data, code, videos and presentations using content hosting tools such as figshare, Dryad, YouTube, Vimeo, SlideShare, SourceForge or GitHub.
  • Deposit your work in an institutional or subject repository.

See also Researcher Profiles, Identifiers and Social Networks: Maximise your Impact.

Tools to Measure Your Scholarly Work

Additional Metric Tools and Embedding Instructions

Grants

Funding agencies like the NSF are increasingly asking researchers to document the "broader impacts" of their work. Altmetrics are a good way to do that, as they can help you find and explain how your research is being used by other researchers and the public. Some examples include:

Tenure & Promotion Guidelines

Some faculty are still unfamiliar with altmetrics, so do your homework before deciding whether or not to include altmetrics in your dossier. Ask colleagues in your department who have recently gone up for promotion and tenure (P&T), as well as your department chair, mentor, or anyone else familiar with the P&T process at your institution.

If you do choose to use altmetrics in your dossier, keep in mind that it's best to be selective with the metrics you plan to include. It's much more effective to include metrics that showcase the types of impact you're looking to document, rather than taking a "kitchen sink" approach (which might overwhelm your reviewers with numbers).

Promotion & tenure preparation guidelines rarely include instructions on how to use impact metrics. Or, when they do, the guidelines usually only address citation metrics or, worse, recommend using journal impact factors.

These instructions often also lack guidance on how to make the metrics meaningful. For example, what does it mean if a tenure candidate reports receiving 5 citations for a paper published in 2013? Whether that is a good or bad number depends on the average citation rate in the candidate's field, and on the year the paper was published (older papers tend to have more citations simply by virtue of having been around longer).
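The comparison described above can be made concrete with a simple field-normalized score, a common scientometric device: divide a paper's citations by the average citations of same-field, same-year papers, so that values above 1.0 indicate above-average impact. This is an illustrative sketch, not a method from the guide, and the function name is my own.

```python
def field_normalized_score(citations: int, field_year_avg: float) -> float:
    """Citations relative to the average for papers in the same field
    published in the same year; 1.0 means exactly average."""
    if field_year_avg <= 0:
        raise ValueError("field average must be positive")
    return citations / field_year_avg

# Five citations looks weak next to a field averaging 20 per paper,
# but strong next to one averaging 2:
print(field_normalized_score(5, 20.0))  # 0.25
print(field_normalized_score(5, 2.0))   # 2.5
```

A dossier that reports normalized figures like these, with the baseline stated, gives reviewers far more to work with than a raw count.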

There's an obvious need for clear instructions on how to use impact metrics in tenure & promotion dossiers. And there's also a need for guidelines to help dossier reviewers make sense of the numbers.

A small but growing number of universities include altmetrics in their tenure & promotion preparation guidelines. These include the University of Colorado Denver Medical School (PDF; page 84) and IUPUI (see: "The Guidelines for Preparing and Reviewing Promotion and Tenure Dossiers").

If you're interested in updating your university promotion & tenure guidelines to better document the use and interpretation of impact metrics, contact your faculty senate (or similar organization) to learn more about how that might work on your campus. You might also get in touch with your Vice Provost for Faculty & Academic Affairs (or similar campus office that oversees the writing of such guidelines). Some examples include:

Resumes & CVs

Some examples include:

Attribution

Content from this guide has been adapted from the following Impactstory LibGuides on altmetrics and is published here under a CC-BY license:


The Florida State University Libraries

© Florida State University Libraries | 116 Honors Way | Tallahassee, FL 32306 | (850) 644-2706