
Responsible Use of Bibliometrics Statement

October 20, 2023

The Toronto Metropolitan University Libraries is committed to a Scholarly, Research and Creative (SRC) environment imbued with the values of equity, diversity, inclusion, and access (EDIA). The responsible use of bibliometric data is an important part of SRC efforts to support EDIA. While Toronto Metropolitan University Libraries subscribes to bibliometric tools such as SciVal and InCites, it recommends that the criteria used to evaluate research performance be explicit and well understood, in alignment with its SRC values.

Bibliometrics is the quantitative analysis of journal articles, books, and other publications, which can be used to understand elements of research performance and impact. However, bibliometric data should be used carefully and requires an understanding of the strengths and limitations of the indicators being measured. For example, with regard to research assessment, NSERC recommends that “The quality and impact of contributions to research should be assessed directly, where possible. Surrogate measures of quality and impact, such as the prestige of a publication venue or citation-based metrics (e.g., journal impact factor or h-index) must not be used as they introduce bias in the merit review process.”1

Responsible use of research indicators means that the evaluation of research performance relies on a holistic view that includes both qualitative evaluation and quantitative indicators. For example: 

  • Bibliometric indicators should be used to support, not replace, qualitative expert assessment by peers.
  • Journal Impact Factors should not be used as a measure of the quality of individual research articles.
  • When assessing research performance, the different aspects of research, the field of study (including whether the field is emerging or involves EDIA research), and the variations across disciplines should all be considered.
  • When evaluating research performance, all relevant research outputs (such as creative works, inventions, monographs, datasets, software, and community research, as well as publications) and other types of contributions should be considered.

General principles that underlie the responsible use of metrics also recognize the limitations of bibliometric indicators. We recommend referring to the principles of the Leiden Manifesto and DORA before undertaking any quantitative bibliometric assessment or analysis. However, we acknowledge that these principles focus on STEM disciplines and do not speak to the challenges encountered by all disciplines.

It is important to note that the bibliometric indicators captured by existing research metrics tools will not be appropriate for disciplines where knowledge mobilization takes place primarily outside of journal and conference literature. These indicators may also be inappropriate for evaluating research in niche, emerging, and EDIA fields of study.

The following considerations are essential for the responsible use of bibliometric tools:2

  • Bibliometric indicators should be used only for their specific purposes, and in some cases with caution. For instance, Journal Impact Factors evaluate only journals, not the quality of the papers in those journals. Likewise, a highly cited journal article may be cited because it is based on questionable research, not credible research.
  • Use at least two indicators in any metric-based assessment
  • Using discipline filters may not give a full picture of multidisciplinary research 
  • Citation-based indicators are time-dependent and should only be used to compare articles of a similar age
  • Citation-based indicators are discipline-specific and should not be used to compare articles of different disciplines
  • Any indicator calculated as a simple average is susceptible to being skewed by outliers. Check for outliers in your dataset and consider whether they might skew the indicators you are planning to use
  • Pre-set indicators may not, in every case, match your analysis needs. If there is not a suitable indicator for your need, consider exporting data for analysis in a different program 
  • Avoid selecting indicators on the basis that they may fit the conclusion you are expecting to draw
  • Avoid false precision. Research metrics tools quote many indicators to two decimal places, which can give a false sense of accuracy. Very small differences between groups over time are unlikely to be meaningful, so consider rounding up when interpreting them
  • Articles need at least two years to accumulate enough citations for analysis.
  • Bibliometric indicators can be manipulated by practices like self-citation and courtesy citation, and there is known under-citing of female and racialized researchers’ scholarship.3
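The outlier caution above can be made concrete with a short sketch. The citation counts here are hypothetical, not drawn from any metrics tool: a single highly cited paper inflates the mean citations-per-paper, while the median barely registers it.

```python
# Hypothetical example: how one outlier skews an average-based indicator.
# These citation counts are invented for illustration only.
citations = [3, 5, 4, 6, 2, 4, 250]  # one highly cited outlier paper

# Mean (simple average) is pulled far upward by the outlier.
mean = sum(citations) / len(citations)

# Median is far less sensitive to the outlier.
ordered = sorted(citations)
mid = len(ordered) // 2
if len(ordered) % 2 == 1:
    median = ordered[mid]
else:
    median = (ordered[mid - 1] + ordered[mid]) / 2

print(f"mean citations per paper:   {mean:.1f}")  # 39.1
print(f"median citations per paper: {median}")    # 4
```

A reader comparing two research groups on mean citations alone could rank the group with one viral paper far above a consistently cited group; checking the median (or inspecting the distribution) reveals the skew.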

For more guidance regarding the responsible use of bibliometrics, please refer to the Bibliometrics LibGuide (under development) or contact your Subject Librarian.

This statement has been created by Toronto Metropolitan University Libraries and is licensed under a Creative Commons Attribution 4.0 International License. You are free to adapt or adopt it with attribution.

Further Reading and Resources 

Example statements

Statement of Responsible Metrics
Cambridge: Guidance on the Responsible Use of Metrics in Research Assessment | Research Strategy Office (cam.ac.uk)
Dublin: Responsible use of Research Metrics | UCD Research & Innovation
NOAA: Bibliometrics – LibGuides at National Oceanic and Atmospheric Administration (noaa.gov)

Citational Justice resources

Making Feminist Points
The Rise of Citational Justice: How Scholars are Making References Fairer
Braving Citational Justice Within Human-Computer Interaction
Citing Black Women

Open Infrastructure

Helping you Invest in the Open Technology that Research Relies on

Other

The Use of Bibliometrics in the Social Sciences and Humanities (science-metrix.com)
Guidelines on the assessment of contributions to research, training and mentoring
University of York LibGuide: “The University takes the position that ‘there are very few true “metrics” of research performance – they are more accurately “indicators”, e.g. citations are an indicator not a measure of research esteem.’ This guide uses the term ‘indicator’ throughout.”


  1. NSERC. (December 1, 2022). Guidelines on the assessment of contributions to research, training and mentoring. https://www.nserc-crsng.gc.ca/NSERC-CRSNG/Policies-Politiques/assessment_of_contributions-evaluation_des_contributions_eng.asp
  2. Adapted from Rowlands, Ian; LIS-Bibliometrics Committee; Gadd, Elizabeth (2020): Using SciVal responsibly: a guide to interpretation and good practice. Loughborough University. Educational resource. https://hdl.handle.net/2134/11812044.v1 and Gray, A.; Price, R. (2020): Using InCites responsibly. https://doi.org/10.25561/75946
  3. Kwon, D. (2022): The rise of citational justice: how scholars are making references fairer. Nature 603, 568–571. https://doi.org/10.1038/d41586-022-00793-1