Understanding the impact of journals and scholars is essential in academic research. Many platforms, such as Scopus and Web of Science, assign ranks to journals mainly based on citation metrics. This page provides an overview of the important metrics that help assess the impact of an article, journal, or scholar. Metrics marked as author impact metrics are primarily meant to measure the impact of scholars/authors.

Metrics from Scopus

CiteScore
CiteScore is a metric developed by Elsevier and calculated using data from the Scopus database. This rating is an annual value released each June that measures the citation impact of a title, such as a journal, book series, or conference proceedings. The primary purpose of CiteScore is to provide an indicator of a journal’s impact and quality within the academic community.
CiteScore values are calculated by taking the total citation counts of peer-reviewed documents (such as articles, reviews, conference papers, data papers, and book chapters) published over a four-year period (e.g., 2016-2019) and dividing that number by the total number of those documents published in the same four years.
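The CiteScore arithmetic described above can be sketched in a few lines of Python; the citation and document counts below are hypothetical, not taken from Scopus:

```python
def citescore(total_citations: int, total_documents: int) -> float:
    """Citations over a four-year window divided by documents published in that window."""
    return total_citations / total_documents

# Hypothetical example: 8,000 citations to 2,000 documents published in 2016-2019
print(citescore(8000, 2000))  # -> 4.0
```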

By clicking the sources menu(external link) on the top left-hand side of the page, you can access the CiteScore and other related metrics. You can search for the specific publication using the 'title' option from the dropdown menu on the top right-hand side of the page.

CiteScore rank and trend provides a view of the source's rank and percentile for each subject category to which it belongs. CiteScore Tracker continues to update CiteScore monthly from the June release until the following annual CiteScore calculation. To obtain the Scopus rank and trend, click on the journal title and view its page.

On the Scopus results page, click on the 'Analyse Results' tab, which appears above the results list on the top left-hand side, to access citation metrics and other related information.

SNIP
SNIP (Source Normalised Impact per Paper) is a journal-level metric that measures contextual citation impact by weighting citations based on the total number of citations in a subject field. Essentially, it normalises differences in citation behaviours across various research fields.

The h-index - An author impact metric
The h-index in Scopus reflects both the productivity and the citation impact of a researcher's work. The h-index signifies the number of papers (h) that have each received at least h citations. An h-index of 10 indicates that a researcher has at least 10 papers, each cited 10 or more times. To locate the h-index on Scopus, click on the 'Authors' tab (the second tab above the search box) and search for the scholar. To disambiguate authors with similar names, you can search with an ORCID iD, which uniquely identifies individual researchers.
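As a rough illustration (not Scopus's own code), the h-index can be computed from a list of per-paper citation counts; the numbers below are made up:

```python
def h_index(citations: list[int]) -> int:
    """Largest h such that h papers have at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # this paper still has at least `rank` citations
        else:
            break     # counts are sorted, so no later paper can qualify
    return h

print(h_index([25, 12, 10, 10, 3, 2]))  # -> 4
```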

SCImago Journal Rank and h-index(external link)

The SCImago Journal Rank is based on the bibliographic data available in the Scopus database. Apart from the sources section of Scopus, this ranking of publications is also available on a free website. SJR (SCImago Journal Rank) aims to capture the effect of a journal's subject field, quality, and reputation on citations. It calculates the prestige of a journal by considering the value of the sources that cite it, rather than counting all citations equally.

It also provides rankings by journal country of origin and numerous visual representations of journal impact data. The SCImago rankings also list the h-index of each journal based on the Scopus data. If you check the individual journal data, the h-index is given below the SJR.

Metrics from Web of Science

Impact Factor
Impact Factor is a scientometric index calculated by Clarivate's Web of Science platform. If a journal received 100 citations in 2024 to articles published in 2022 and 2023, and published 50 articles and reviews during 2022 and 2023, its Impact Factor would be 2 (100/50 = 2). The formula is as follows:
Impact Factor of a journal for year Y = (Citations in year Y to items published in years Y-1 and Y-2) / (Number of items published in years Y-1 and Y-2)
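The same arithmetic can be sketched in Python; the example uses the 2024 figures above, with a hypothetical split of 30 and 20 citable items across the two prior years:

```python
def impact_factor(citations_in_y: int, pubs_y_minus_1: int, pubs_y_minus_2: int) -> float:
    """Citations in year Y to items from years Y-1 and Y-2, divided by those items."""
    return citations_in_y / (pubs_y_minus_1 + pubs_y_minus_2)

# 100 citations in 2024 to 50 items published in 2022 and 2023
print(impact_factor(100, 30, 20))  # -> 2.0
```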

The NMIT library does not have access to the Web of Science platform. However, one can still access the impact factor of a journal through other platforms, such as journal websites. Most leading publishers provide the impact factor of their journals in the About section of the journal website. Usually, the impact factor is listed in the journal metrics section.

H index-Author impact metric(external link)
By creating a free researcher profile on the Web of Science platform, you can look up the h-indexes of scholars. Use the 'create and manage your profile' button on this page(external link) to access h-index data based on Web of Science records. On your personal researcher profile page, you can search for a researcher. Although the premium features are not accessible, you can view scholars' h-indexes and their articles as listed by WOS.

Web of Science Master Journals List(external link)
One of the criteria for evaluating a journal's quality is its inclusion in the Web of Science Master Journal List. At the same time, given that the inclusion criteria for the list are strict, the absence of a journal from it should not be regarded as a definitive indicator of poor quality.

Metrics from Google Scholar(external link)

Google Scholar categorises top publications by language and field, e.g., business, engineering, medicine, and social sciences. Under the English language, you can browse rankings in different subject areas, and within each subject area you can drill down into subfields. Coverage includes journals, conference proceedings, and other scholarly sources indexed by Google Scholar. To access the journal metrics page, first open the Google Scholar homepage(external link). Then, click the drop-down menu in the top left corner of the page (located next to the 'My profile' tab) to find the metrics link. On the metrics page(external link), you can view top publications by category and subcategory and see metrics such as the h5-index and h5-median.

h5-index
A publication has an h5-index of *h* if *h* of its articles published in the last 5 years have at least *h* citations each. For example, a journal with an h5-index of 30 means 30 of its articles published in the last 5 years have at least 30 citations each.

h5-median
The median number of citations for the articles that make up the h5-index. If a journal's h5-index is 20, the h5-median is the median citation count of those 20 articles.
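A minimal sketch of how the h5-index and h5-median relate, using hypothetical citation counts for a journal's articles from the last five years:

```python
import statistics

def h5_stats(citations: list[int]) -> tuple[int, float]:
    """Return (h5-index, h5-median) from per-article citation counts."""
    counts = sorted(citations, reverse=True)
    h5 = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h5 = rank
    core = counts[:h5]  # the h5 articles that make up the index
    return h5, statistics.median(core)

print(h5_stats([9, 7, 5, 4, 1]))  # -> (4, 6.0)
```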

H-index of authors
To find an author's h-index on Google Scholar, search for the author's name. If a profile exists, it will be displayed at the top of the search results. Click on the author's profile, and their h-index will be visible on the right-hand side under "citation indices". In the general search results page, the names of authors with Google Scholar profiles are displayed as clickable links under the article title.

Along with the h-index, Google Scholar also lists the i10-index, which indicates the number of scholarly publications by an author that have been cited at least 10 times. This index complements the h-index, providing a quick way to assess how many of a researcher's publications have achieved a reasonable level of impact within the scholarly community.
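The i10-index is a simple count, sketched here with made-up citation numbers:

```python
def i10_index(citations: list[int]) -> int:
    """Number of publications cited at least 10 times."""
    return sum(1 for cites in citations if cites >= 10)

print(i10_index([25, 12, 10, 9, 3]))  # -> 3
```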

Eigenfactor(external link)

Eigenfactor is a metric used to assess the influence and prestige of scholarly journals, developed by Jevin West and Carl Bergstrom at the University of Washington. Like the Scimago Journal Ranking, it aims to provide a more nuanced measure of a journal's overall contribution to the scientific community by considering factors such as citation patterns and the prestige of citing journals. 

Eigenfactor considers a five-year citation window and gives more weight to citations from highly influential journals. 

Article Influence Score
The Article Influence Score is derived from the Eigenfactor score and measures the average influence of a journal's individual articles. It measures how often articles are cited in the five years following publication. The Article Influence Score is calculated by dividing the Eigenfactor Score by the normalised fraction of articles. An Article Influence Score greater than 1 indicates that, on average, the journal's articles have a greater influence than the average article.

Altmetrics

Altmetrics use alternative approaches to measure the impact of a publication. While traditional metrics primarily focus on citation counts in other scholarly publications, such as academic journals, altmetrics track the impact of scholarly publications on social media platforms, blog posts, research networking sites, news articles, and policy documents. It also tracks evidence of user activity, such as page views, page downloads and exports to referencing tools, on publishers’ and indexing websites.

Altmetric Attention Score
The Altmetric Attention Score is provided by Altmetric, a Digital Science company; Digital Science also operates the research discovery platform Dimensions. The Altmetric Attention Score is a weighted score that reflects online attention from various sources. In addition to the score, Altmetric also provides a visual representation called the Altmetric Badge, which uses a donut chart to show the distribution of attention across different sources.

As NMIT does not subscribe to Dimensions, the full breakdown of the Altmetric Attention Score is not available to us. However, the main elements of the AAS are available in the free version of Dimensions. On the search results page, articles with an AAS display an Altmetric badge below the summary information for each result; click on this badge to get detailed information. Alternatively, click on an article's title to open the article details page, where the Dimensions badge and Altmetric donut appear on the right-hand side; clicking these logos opens the details.

PlumX metrics
PlumX Metrics on Scopus tracks five key areas (usage, captures, mentions, social media, and citations), offering a broader view of research impact beyond traditional citations. The PlumX platform itself is subscription-based, but its metrics are also surfaced through other platforms such as Scopus and Digital Commons; the NMIT library subscribes to Scopus only.

To find PlumX metrics on Scopus, first search for the document (article, conference paper, etc.) within Scopus. Then, click on the document title to view its details. On the document details page, click on the ‘impact’ tab (the second tab after the ‘document’ tab). On the right-hand side of the page, you can see the PlumX metrics link below the tabs for Scopus metrics and SciVal topics.

Subject and region-specific publication rankings

Although the following lists provide subject- and region-specific quality rankings of journals, it is always advisable to use Scopus, Web of Science, and Scimago listings to obtain journal rankings that have global acceptance.

Australian Business Deans Council journal quality list(external link)
The ABDC is the apex body of Australian university business schools; therefore, this list focuses on business-related publications. You can download the complete list from their website(external link) or search for an individual journal using their search interface.

Australian Political Studies Association list
This(external link) list primarily comprises journals from the fields of political science, policy studies, and international relations.

ERIH PLUS Journal list(external link)
This list, maintained by the Norwegian Directorate for Higher Education and Skills, covers journals in the humanities and social sciences.

Financial Times top 50 Journals(external link)
This list comprises the 50 journals utilised by the Financial Times in compiling the FT Research Rank, which is featured in the Global MBA, EMBA, and Online MBA course rankings.

Research.com Journal list(external link)
You need to scroll down to the middle of the page to see the best-journal rankings. Within each broad subject, you can select a narrower research area.

Lists of Predatory Journals 

A predatory journal list is a compilation of journals and publishers identified as engaging in deceptive or unethical academic publishing practices, often prioritising profit over academic integrity.

Although some journals from leading open-access publishers, such as Frontiers and MDPI, appear on these lists, not all journals from these publishers should be considered predatory or of low quality. One needs to consider other objective parameters, such as citation metrics, to evaluate their quality.

Beall's list(external link)
This is a list of potentially predatory journals, originally compiled by librarian Jeffrey Beall. This website also features a set of journal evaluation tools.

Predatory Journals(external link)
This portal features a list of predatory publishers, as well as a list of journals.
