Evolving Benchmarks – Clarivate’s Highly Cited Researcher Methodology Updates in 2025
— by River Mi
HKU’s Performance in Highly Cited Researchers 2025
HKU achieved a new record with 54 scholars named in Clarivate’s “Highly Cited Researchers 2025” list (HCR list), accounting for 38% of Hong Kong’s Highly Cited Researchers. In 2025, Clarivate recognized 6,868 individuals worldwide, among whom 145 had their primary affiliation in Hong Kong. Behind the figures, the methodology and selection process for Highly Cited Researchers has also evolved to better capture research impact in the contemporary world. In this blog, we examine the criteria for HCR, this year’s updates, and how the exercise responds to responsible research assessment.
How Highly Cited Researchers Are Selected
Clarivate publishes its methodology for selecting Highly Cited Researchers on its official website. The process consists of two phases:
Phase I: identify researchers whose number of Highly Cited Papers and total citation count in their respective field, as measured in InCites, both exceed field-specific thresholds.
Phase II: scrutinize the scholars on the resulting list for research integrity, the significance and breadth of their impact, and other considerations such as the review-to-article ratio.
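The Phase I filter can be sketched in a few lines of code. Note that the real field thresholds are confidential, so the numbers below are purely hypothetical, and the function name is ours:

```python
# Sketch of Clarivate's Phase I filter. The actual per-field thresholds are
# confidential; the values here are invented for illustration only.

FIELD_THRESHOLDS = {  # hypothetical (paper threshold, citation threshold)
    "Chemistry": (14, 3500),
    "Clinical Medicine": (20, 5000),
}

def passes_phase_one(field: str, hcp_count: int, citations: int) -> bool:
    """Return True if a researcher clears both thresholds in their field."""
    paper_t, citation_t = FIELD_THRESHOLDS[field]
    return hcp_count >= paper_t and citations >= citation_t

# Example: 16 Highly Cited Papers and 4,000 citations in Chemistry
print(passes_phase_one("Chemistry", 16, 4000))  # True
```

Passing this quantitative filter only makes someone a candidate; Phase II’s qualitative scrutiny decides whether they appear on the final list.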
Methodology and What’s New in 2025
According to Clarivate’s information release and webinar, the major changes this year addressed research-integrity concerns and applied the existing selection criteria more broadly and consistently.
1. Re-inclusion of Mathematics
Clarivate excluded Mathematics from the exercise in 2023 because the field has a relatively low average rate of publication and citation and is more vulnerable to citation manipulation (Podlubny, 2025). In 2025, Clarivate re-introduced Mathematics with additional layers of scrutiny intended to mitigate the field’s prevalent citation-cartel problem, including discarding papers authored by previously excluded researchers with integrity concerns (Pendlebury, 2025).
2. Interdisciplinary Research
Clarivate introduced the Cross-Field category in 2018 to identify scholars who contribute at the intersection of diverse scientific domains (Clarivate & Institute for Scientific Information, 2025). Of the 22 fields, Cross-Field had the highest number of Highly Cited Researcher awards in 2025.
To identify top-performing researchers in the Cross-Field category, Clarivate normalizes a researcher’s output against the thresholds of each field in which they publish: each Highly Cited Paper counts as a fraction of the paper threshold for its field, and these fractions are summed. If the normalized paper count exceeds 1, the researcher passes the paper-count threshold. Total citations are normalized against the field citation thresholds in the same way. A researcher must exceed both normalized thresholds (normalized count > 1) to be selected as a Cross-Field candidate.
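The two conditions can be sketched in equation form; the notation here is ours rather than Clarivate’s:

```latex
N_{\text{papers}} = \sum_{f} \frac{p_f}{T_f^{\text{papers}}} > 1
\qquad \text{and} \qquad
N_{\text{cites}} = \sum_{f} \frac{c_f}{T_f^{\text{cites}}} > 1
```

where \(p_f\) and \(c_f\) are the researcher’s Highly Cited Papers and citations in field \(f\), and \(T_f^{\text{papers}}\) and \(T_f^{\text{cites}}\) are the (confidential) thresholds for that field. A researcher who falls short in every single field can still qualify if their fractional contributions across fields sum to more than 1.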
3. Transparency
Ideally, any indicator used, qualitative or quantitative, should be publicly available with accessible data and algorithms, so that anyone can verify the results by replicating the process. Clarivate does not publish the dataset snapshot or the paper-count and total-citation threshold parameters for becoming a Highly Cited Researcher, explaining that this is an attempt to prevent gaming. Among the major rankings, the data and code behind Stanford University’s list of the “World’s Top 2%” scientists are the only ones shared publicly: https://elsevier.digitalcommonsdata.com/datasets/btchxktzyw/8.
4. Integrity and Considerations on Retractions
Integrity checks are part of Clarivate’s HCR selection process. Since 2022, Clarivate has screened all publications for evidence of misconduct using data from Retraction Watch, and has checked for hyper-authorship, excessive self-citation, unusual patterns of collaborative group citation activity, and anomalous levels of citations from co-authors. In 2025, Clarivate introduced a new exclusion criterion that rules out papers authored by researchers flagged for potential research misconduct. This addresses growing concern over the easy fabrication of text, images, and data accelerated by Generative AI.
Beyond the List: Strengths, Limitations and Implications
Overall, Clarivate’s Highly Cited Researchers list plays an iconic role in accrediting scholars who excel in their respective fields. Its efforts to emphasize integrity and multidisciplinary influence are relevant attempts to keep the list representative.
However, the selection process remains only partially transparent and does not allow replication. Clarivate explains that field-specific paper thresholds are kept confidential to prevent gaming, but it is debatable whether the number of Highly Cited Papers could really be gamed so easily at an author’s will.
Rather than a permanent honour, being a Highly Cited Researcher should be viewed as recognition of sustained academic performance over a given period of time (Filchenko, 2025). When the methodology relies largely on citation counts, the risk is that overreliance on metrics encourages gaming and can distort research priorities and integrity.
Research impact is multi-faceted. Beyond academic impact, research can travel beyond academia and bring benefits for the economy, society, the environment, and culture. Stakeholders in research are encouraged to recognize these different aspects of research impact and to consider how evaluation tools and incentive systems can contribute to responsible research.
References
Clarivate. (2025, November 19). Highly Cited Researchers analysis 2025. Retrieved January 12, 2026, from https://clarivate.com/highly-cited-researchers/analysis/
Clarivate & Institute for Scientific Information. (2025, October). Highly cited researchers: Evaluation & selection. Institute for Scientific Information. Retrieved from https://clarivate.com/wp-content/uploads/dlm_uploads/2025/10/HCR-Evaluation-and-Selection-November-2025.pdf
Filchenko, D. (2025, October 6). Highly Cited Researchers 2025: Integrity drives selection. Retrieved from https://clarivate.com/academia-government/blog/highly-cited-researchers-2025-integrity-drives-selection/
Podlubny, I. (2025, November 20). Detection of the papermilling behavior. Retrieved January 12, 2026, from https://arxiv.org/html/2405.19872v4
The University of Hong Kong. (2025, November 12). HKU Achieves Record with 54 Academics Named Clarivate’s Highly Cited Researchers 2025. Retrieved January 12, 2026, from https://www.hku.hk/press/news_detail_28739.html
Declaration of Generative AI use
I acknowledge the use of Generative AI tools in writing this post. I used:
- Perplexity and Gemini to cross-check and verify online information sources.
- Gemini, Claude.ai, and ChatGPT to refine wordings in the visualization.
- Copilot to check grammar.
- MidJourney to generate the feature image.
I declare that I reviewed and edited the contents as needed and take full responsibility for the contents of this post, and that the information provided is complete and accurate.
