Article metrics measure the usage of a scholarly work (a peer-reviewed journal article is one example), components of a work such as figures, or non-article works such as software or slides, along with their subsequent application or use.
A traditional article metric is the citation: a reference to a work noted in the scholarly literature, which allows in-context understanding of the citing author's purpose and motivation. See the Citations tab for more information.
With the advent of sophisticated digital applications, publishers and vendors developed other types of article metrics based on usage of the work in its digital format, such as the number of times a work is read, viewed or downloaded. These are also referred to as altmetrics or alternative metrics.
The Public Library of Science (PLoS), the first publisher to offer quantitative usage counts (in 2009), provides perhaps the most highly developed publisher platform for reporting numbers of reads, views and downloads.
Other altmetrics capture, in near real time, how a work is shared, disseminated further, or commented upon across various social media platforms. Works can take formats other than the traditional journal article, such as figures, slides, datasets, software, and unpublished works. These metrics are generated by a variety of audiences, including non-academic ones, and are considered representative of the level of "public or social engagement" with a work.
Non-citation metrics are useful supplements that authors can use to quantify the influence or impact of their works, extending beyond the traditional peer-reviewed journal article to other scholarly outputs such as datasets, software, slides, figures, and unpublished works such as a policy brief. Some metrics, such as online views, comments or recommendations, serve as early-stage indicators of how and by whom a work is being shared, used, commented on, and disseminated further. Who is reading the new work? Who is tweeting about the new work? Where are they tweeting from? Is the work being commented on in a blog posting? By whom? A scientist, a policy-maker or a layperson? Are users bookmarking the work in Mendeley? Is the work the topic of an article in the press? Is a user viewing the slides in Slideshare? Is a user viewing the figure in Figshare?
The idea behind non-citation metrics is to gauge the nascent influence or attention a work is garnering on various online platforms. They provide evidence of a work's outreach, complement traditional citations, and allow authors to highlight multiple examples of scholarly output.
Use the updated Metrics Module in Scopus
Article metrics in Scopus allow users to evaluate both citation impact and levels of community engagement (digital footprints) around an article. Scopus has two types of article metrics available:
Traditional Citation Metrics:
Number of times a work has been cited.
The Field-Weighted Citation Impact (FWCI) is the ratio of the article's citations to the average number of citations received by all similar articles over a three-year window. Each discipline makes an equal contribution to the metric, which normalizes for differences in citation behavior across fields. An FWCI greater than 1.00 means the article is cited more than expected according to the average.
Citation benchmarking shows how the citations received by an article compare with the average for similar articles. The 99th percentile is high, indicating an article in the top 1% globally. The benchmark takes into account the year of publication, document type, and the disciplines associated with the article's source.
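The two calculations above can be illustrated with a short sketch. All numbers here are hypothetical, and the percentile computation is a simplified illustration of benchmarking against a peer set, not Scopus's exact method:

```python
# Field-Weighted Citation Impact (FWCI): the article's citation count divided
# by the average citations received by similar articles (same publication
# year, document type, and discipline) over a three-year window.
# All citation counts below are hypothetical.

article_citations = 12
similar_article_citations = [3, 0, 7, 15, 10, 2, 4, 9]  # peer articles

expected = sum(similar_article_citations) / len(similar_article_citations)
fwci = article_citations / expected
print(f"FWCI = {fwci:.2f}")  # > 1.00 means more cited than expected

# Simplified citation benchmarking: the share of peer articles this
# article outperforms, expressed as a percentile.
below = sum(c < article_citations for c in similar_article_citations)
percentile = 100 * below / len(similar_article_citations)
print(f"Percentile: {percentile:.1f}")
```

In this hypothetical example the article's FWCI is 1.92 (cited almost twice as often as expected for its peer group) and it outperforms 87.5% of the peer set.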
Altmetrics from Plum Analytics:
Usage indicates active use of the work, such as clicks, downloads, views, holdings, etc.
Captures indicate users who want to return to the work, such as Mendeley readers, bookmarks, etc.
Mentions indicate users who are actively engaging with the work, such as in news articles or blog posts.
Social Media measures “buzz” and attention such as tweets, Facebook likes, etc. that reference the work.
Citations are based on traditional citation counts from Scopus but also include citations to some clinical guidelines and patents.
The full Metrics module is available from the document details sidebar on the Scopus record page. The Metrics sidebar highlights the minimal number of meaningful metrics a researcher needs to evaluate both citation impact and levels of community engagement. Click "View All Metrics" to open the full Metrics module.
If you have works in PubMed Central, you can obtain usage statistics on how frequently your works are being accessed.
See "NIHMS Users: Do You Know How Often Your Paper is Being Accessed Via PMC? Here's How to Find Out" for guidance on obtaining usage statistics.