Measuring impact

This page explains the most common metrics used to measure journal-level impact in scholarly publishing.

Journal-level metrics

As a signatory of the San Francisco Declaration on Research Assessment (DORA), we believe it is important that researchers and research outputs are evaluated fairly and on their own merits, rather than on the basis of the journal in which the research is published. To help provide authors and readers with a richer, more nuanced understanding of journal performance, we promote a range of metrics on our website and in individual journals’ promotional materials.


None of these metrics offers a perfect measure of a journal’s impact, and we do not recommend using them to assess individual articles or authors. We believe that more accurate and appropriate evaluation of research outputs helps to maximise the impact of research and benefits our authors and communities. All metrics discussed below apply at the journal level, not at the article or individual author level, and should not be used as a proxy for the ‘prestige’ of individual authors or their articles.


Metrics are produced by several external companies, some of which are publisher-owned, and should be considered in the context of how they are calculated. Some may be valid for comparisons between similar titles; others are specific to a single title’s data and should only be compared longitudinally for that journal, not across titles.


The information below is only an introduction. Bibliometrics is a research field in its own right, with its own academic journals. To explore the rich literature examining the relevance and limitations of the various metrics available, we recommend the Metrics Toolkit, HEFCE's Independent Review of Metrics and the University Ranking Watch blog as starting points.


Metrics provided by Clarivate Analytics

Clarivate publishes a range of metrics in its annual Journal Citation Reports, which typically appear in the middle of the calendar year. Clarivate takes its data from its Web of Science database.


1. Journal Impact Factor (JIF)

The JIF purports to provide an ‘average citation score per article’ for a journal, dividing the number of citations (to all material published by the journal) by the number of ‘citable’ articles (which, for this purpose, exclude book reviews and editorial material). Two versions of the JIF are published:

a. Two-year JIF. The number of citations garnered by the journal in year Y to all articles published in years Y-1 and Y-2, divided by the number of citable articles published in years Y-1 and Y-2.

b. Five-year JIF. The number of citations garnered by the journal in year Y to all articles published in years Y-1 to Y-5, divided by the number of citable articles published in years Y-1 to Y-5.
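
As a minimal sketch of this arithmetic (not Clarivate's implementation; the citation and article counts below are invented), the two-year JIF for year Y could be computed as follows:

  # Sketch of the two-year JIF arithmetic with invented counts.
  # Official JIF values are calculated by Clarivate from Web of Science data.

  citations_in_year_Y = {      # citations received in year Y to material
      "Y-1": 310,              # published in year Y-1
      "Y-2": 275,              # published in year Y-2
  }
  citable_items = {            # 'citable' articles only (book reviews and
      "Y-1": 120,              # editorial material are excluded)
      "Y-2": 105,
  }

  two_year_jif = sum(citations_in_year_Y.values()) / sum(citable_items.values())
  print(f"Two-year JIF: {two_year_jif:.2f}")  # (310 + 275) / (120 + 105) = 2.60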

Clarivate ranks journals in several subject categories based on the JIF in each release of the Journal Citation Reports. However, there are significant limitations to the usefulness of comparing journals based on JIFs.

  • Every journal publishes a different number of articles each year, across a differing range of article types, even if their aims and scopes are similar.
  • Journals publishing relatively small numbers of articles in a year may be subject to significant swings in the resulting JIF, especially the two-year JIF.

A more useful application of the JIF is to compare the ‘performance’ of a single journal longitudinally, over successive years.

The advantages and limitations of the Impact Factor as a measure of research impact have been widely discussed. For more information and an introduction to this controversial topic, see Impact Factors: Use and Abuse by Mayur Amin, and The Agony and the Ecstasy – The History and Meaning of the Journal Impact Factor by Michael Mabe and Eugene Garfield.

2. Journal Citation Indicator (JCI)

In 2021 Clarivate released a new metric – the Journal Citation Indicator (JCI). More information on this metric can be found on Clarivate’s blog.

The JCI is a field-normalised metric, representing the average citation impact for papers published in the prior three-year period. Several factors are considered in the normalisation process, including subject/field, publication type, and year of publication, which are intended to make it more reasonable to use the JCI to compare journals across disciplines.

A JCI value of 1.0 means that, across the journal, published papers received a number of citations equal to the average citation count in that journal’s category. However, because citation counts are not evenly distributed (most papers receive a small number of citations, while a few gain far more than the average), most journals will not have a JCI value above 1.0. A value of 1.5 means that the journal’s papers have received 50% more citations than the average for its subject category.

JCI calculations include a wider span of material and citation years than the JIF. For example, the 2020 JCI includes citations made in 2017-2020 to articles published in 2017-2019. Clarivate also gives a JCI score to a broader range of journals than those receiving a JIF: all journals in Clarivate’s Emerging Sources Citation Index (ESCI) and Arts & Humanities Citation Index (AHCI), along with its SCIE and SSCI indexes, receive a score.

Unlike the JIF, which counts all citations, the JCI counts only citations to research and review material; citations to editorial material are not taken into account.
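
The sketch below illustrates the general idea of field normalisation only; Clarivate's actual JCI calculation uses category, document-type and publication-year baselines drawn from Web of Science, and the figures here are invented.

  # Toy field-normalised score: each paper's citations divided by the average
  # citation count of comparable papers (same field, document type and year),
  # then averaged across the journal. All figures are invented.

  papers = [
      {"citations": 12, "category_average": 8.0},
      {"citations": 3,  "category_average": 8.0},
      {"citations": 20, "category_average": 8.0},
  ]

  normalised = [p["citations"] / p["category_average"] for p in papers]
  journal_score = sum(normalised) / len(normalised)
  print(f"Field-normalised score: {journal_score:.2f}")  # 1.00 = category average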

3. Eigenfactor

The Eigenfactor Score counts the number of times articles published by a journal in the previous five years have been cited in a given year, similar to the 5-year JIF. Using citation data from a five-year window smooths out sudden changes to some extent. Like the JIF, Eigenfactor Scores are only published for journals in Clarivate’s SCIE and SSCI indexes.

The major differences between the Eigenfactor Score and the JIF are that citations are weighted according to the relative value of the citing journal (similar to Google’s PageRank) – highly cited journals are assigned a higher weight than poorly cited journals – and that self-citations (to the journal by the journal) are omitted.

An alternative way of considering what the Eigenfactor Score represents is to view it as measuring the ‘popularity’ of a journal within all the citations counted in the Web of Science within a given year. All other things being equal, if one journal publishes twice as many articles as another, that journal will have an Eigenfactor Score twice as big as that of the smaller journal.

Another major difference from the JIF is that the scores are normalised so that, across all journals in the full database, the Scores sum to 100. As a result, as Clarivate adds more journals to the database, the Eigenfactor Scores of individual journals would tend to decrease over time even if all other aspects remained unchanged.
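
The sketch below illustrates only the two properties described above (weighting by the citing journal and normalising so the scores sum to 100); the real Eigenfactor algorithm iterates PageRank-style over the whole citation network, and the figures are invented.

  # Toy version of two Eigenfactor properties: citations arrive pre-weighted by
  # the 'influence' of the citing journal, journal self-citations are excluded,
  # and scores are scaled so they sum to 100 across the whole database.

  weighted_citations = {"Journal A": 40.0, "Journal B": 25.0, "Journal C": 10.0}

  total = sum(weighted_citations.values())
  eigenfactor_like = {j: 100 * v / total for j, v in weighted_citations.items()}
  print(eigenfactor_like)  # shares sum to 100; adding journals shrinks each share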

4. Article Influence Score

The Article Influence Score is linked to the Eigenfactor Score, and is likewise only published for journals in Clarivate’s SCIE and SSCI indexes. From the description provided by Clarivate: the Article Influence Score determines the average influence of a journal’s articles over the first five years after publication.

It is calculated by dividing a journal’s Eigenfactor Score by the number of articles in the journal, normalised as a fraction of all articles in all publications. This measure is roughly analogous to the 5-year JIF in that it is a ratio of a journal’s citation influence to the size of the journal’s article contribution over a period of five years. The mean Article Influence Score is 1.00. A score greater than 1.00 indicates that articles in the journal tend to have above-average influence. A score less than 1.00 indicates that articles in the journal tend to have below-average influence.

As with the Eigenfactor Score, the Article Influence Score uses all the articles and journals in the Web of Science database for comparative analysis.
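
As a sketch of the relationship described above (invented figures; the 0.01 factor reflects the Eigenfactor Scores summing to 100, which is what makes the database-wide mean Article Influence Score come out at 1.00):

  # Article Influence Score sketch with invented figures.

  eigenfactor_score = 0.08          # this journal's Eigenfactor Score
  journal_articles_5yr = 600        # articles it published over five years
  all_articles_5yr = 1_200_000      # articles across the whole database

  article_fraction = journal_articles_5yr / all_articles_5yr
  article_influence = 0.01 * eigenfactor_score / article_fraction
  print(f"Article Influence Score: {article_influence:.2f}")  # 1.60, above the 1.00 mean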


Metrics provided by Scopus

Scopus typically releases a new annual dataset in the middle of the calendar year, including a variety of metrics. Monthly cumulative CiteScores are also published for each journal, tracking how citations are accumulated during the year.

1. CiteScore

CiteScore is very similar to the JIF, in that it is calculated as an average number of citations per article. The CiteScore counts the citations received over a four-year period to articles, reviews, conference papers, book chapters and data papers published in that same four-year period, and divides this by the number of publications during that time. The key differences from the JIF are as follows:

  • The calculation covers a 4-year period, rather than 2 or 5.
  • Citations from all years are included, not just the current year.
  • All content types are considered citable items.
  • Source material is not limited to journal articles.

The full CiteScore calculation can be found here, and the full dataset is publicly available here. Scopus includes a larger number of journals in its CiteScore dataset than Clarivate.
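
A minimal sketch of the CiteScore arithmetic, using invented numbers (official values are computed by Scopus from its own data):

  # CiteScore-style arithmetic: citations received during a four-year window to
  # documents published in that same window, divided by the number of documents
  # published in the window. Numbers are invented.

  citations_in_window = 1_840   # citations in years Y-3 to Y to docs from Y-3 to Y
  documents_in_window = 460     # articles, reviews, conference papers, book
                                # chapters and data papers published in Y-3 to Y

  citescore = citations_in_window / documents_in_window
  print(f"CiteScore: {citescore:.1f}")  # 1840 / 460 = 4.0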

2. SCImago Journal Rank (SJR)

The SJR expresses the average number of weighted citations received in the selected year by the documents published in the selected journal in the three previous years. A weighted citation gives more value to citations from prestigious journals than to those from a title with a smaller citation network. More information can be found at https://www.scimagojr.com/SCImagoJournalRank.pdf.
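
The prestige weights themselves come from an iterative, PageRank-like calculation over the Scopus citation network; the sketch below shows only the final averaging step, with invented weights and counts.

  # Toy SJR-style averaging: citations from more 'prestigious' journals carry
  # larger weights. Weights and counts are invented; the real SJR derives its
  # weights iteratively from the whole Scopus citation network.

  incoming = [
      {"citations": 50,  "prestige_weight": 1.8},   # from a highly cited title
      {"citations": 120, "prestige_weight": 0.6},   # from a smaller citation network
  ]
  docs_previous_3_years = 300

  weighted = sum(c["citations"] * c["prestige_weight"] for c in incoming)
  print(f"Weighted citations per document: {weighted / docs_previous_3_years:.2f}")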

3. Source Normalised Impact per Paper (SNIP)

This metric is calculated using a method developed by the CWTS in Leiden. The calculation is complex but can be summarised as ‘the ratio of a source’s average citation count per paper and the citation potential of its subject field’. SNIP measures a source’s contextual citation impact by weighting citations based on the total number of citations in a subject field: it measures the average number of weighted citations to papers published in the previous three years. The weighting is based on the number of citations within a field; if there are fewer total citations in a research field, then each citation is worth more in that field. SNIP can therefore be used to compare journals across different subject areas.
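
A sketch of that ratio with invented figures (in practice the citation potential of a field is itself derived by CWTS from the database):

  # SNIP-style ratio with invented figures: a journal's raw citations per paper
  # divided by the citation potential of its subject field, so that fields where
  # citations are scarce are not penalised.

  raw_impact_per_paper = 2.4        # average citations to papers from the prior 3 years
  field_citation_potential = 1.6    # typical citation density of the subject field

  print(f"SNIP-style score: {raw_impact_per_paper / field_citation_potential:.2f}")  # 1.50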


Metrics provided by Google Scholar

Google Scholar uses several metrics, updated annually, to accompany its indexing of journal articles and scholarly publications. Find out more about Google Scholar Metrics here.

1. h-index

The h-index of a publication is the largest number h such that at least h articles in that publication were cited at least h times each. For example, a publication with five articles cited by 17, 9, 6, 2 and 1 other articles has an h-index of three. This is because three of the articles published – the ones cited 17, 9 and 6 times – were cited at least three times. This metric is difficult to compare across subject fields, because citation levels differ so much from field to field.
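
The h-index is straightforward to compute from a list of citation counts; the short function below reproduces the worked example above.

  def h_index(citation_counts):
      """Largest h such that at least h articles have at least h citations each."""
      h = 0
      for rank, cites in enumerate(sorted(citation_counts, reverse=True), start=1):
          if cites >= rank:
              h = rank
          else:
              break
      return h

  print(h_index([17, 9, 6, 2, 1]))  # 3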

2. h-core

The h-core of a publication is the set of its h most-cited articles – the articles on which the h-index is based. For example, the publication above has an h-core of three articles: those cited 17, 9 and 6 times.

3. h-median

The h-median of a publication is the median of the citation counts in its h-core. For example, the h-median of the publication above is nine. The h-median is a measure of the distribution of citations to the h-core articles.
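
Continuing the same example, the h-core and h-median follow directly from the sorted citation counts (this reuses the h_index function sketched above):

  import statistics

  def h_core(citation_counts):
      # The h most-cited articles' citation counts, where h is the h-index.
      counts = sorted(citation_counts, reverse=True)
      return counts[: h_index(citation_counts)]

  core = h_core([17, 9, 6, 2, 1])
  print(core, statistics.median(core))  # [17, 9, 6] 9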

4. h5-index, h5-core, and h5-median

Finally, the h5-index, h5-core and h5-median of a publication are, respectively, the h-index, h-core and h-median of the set of articles published by that publication in the last five complete calendar years.


Metrics provided by Altmetric

Altmetric evaluates impact by counting mentions on social media, blogs and mainstream news sites. Altmetric essentially provides an article-level metric, but it is worth listing here because publishers increasingly report how many mentions a journal has attracted overall on Altmetric, and compare those figures year on year.

1. Altmetric Attention Score (AAS)

The Altmetric Attention Score (AAS) assigns a score to individual articles based on the number of mentions each article receives from a variety of online sources. These include news articles, blogs, Twitter, Facebook, Sina Weibo, Wikipedia, policy documents (per source), Q&A, F1000, Publons, PubPeer, YouTube, Reddit, Pinterest, LinkedIn, and Open Syllabus.

The AAS also takes into consideration the ‘quality’ of the source (for example, a news article receives a higher score than a Facebook post) and who it is that mentions the paper (the weighting takes into account whether the author of a mention regularly posts about scholarly articles). It should be noted that the AAS is largely limited to outputs published from 2011 onwards and does not differentiate between positive and negative attention.
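
The per-source weights are set by Altmetric and adjusted by the modifiers described above; the sketch below uses purely illustrative weights to show the shape of the calculation, not Altmetric's actual values.

  # Shape of a weighted attention score: mention counts multiplied by per-source
  # weights and summed. The weights here are purely illustrative, and the real
  # AAS also applies modifiers based on who posted each mention.

  mentions = {"news": 2, "blogs": 1, "twitter": 40, "facebook": 5}
  illustrative_weights = {"news": 8, "blogs": 5, "twitter": 1, "facebook": 0.25}

  score = sum(mentions[s] * illustrative_weights[s] for s in mentions)
  print(f"Illustrative attention score: {score:g}")  # 2*8 + 1*5 + 40*1 + 5*0.25 = 62.25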

The Altmetric donut displays the numerical AAS in the centre, surrounded by colours that reflect the mix of sources mentioning the article – blues for Twitter, Facebook and other social media; yellow for blogs; red for mainstream media sources; and so on.