
Bibliometrics & Responsible Use of Research Metrics

Responsible Use of Research Metrics


Although metrics are widely used in research evaluation, it is vitally important to note that they reflect only a limited snapshot of the overall impact of your research, and if used carelessly or irresponsibly they can be gamed both in your favour and to your detriment. A common misconception is that, because bibliometrics are quantitative measures, they are reliable, objective and trustworthy; this is not necessarily the case. Moreover, even setting those limitations aside, bibliometrics reflect only academic impact and do not capture societal impact. It is therefore crucial to track and showcase the impact of your research beyond bibliometrics, and to develop your impact story in a way that is contextualised, representative of your contributions, and consistent with the responsible use of research metrics.

Coalition for Advancing Research Assessment (CoARA)


The CoARA Agreement on Reforming Research Assessment (2022) was drawn up with the aim of reforming research assessment and maximising the quality and impact of research. The agreement recognises the diverse outputs, activities and practices of research, researchers and research institutions/organisations. CoARA is a significant movement in the reform of research assessment, marking a step towards more inclusive and appropriate research assessment based primarily on qualitative assessment, supported by the responsible and appropriate use of bibliometric indicators. Technological University Dublin is a signatory of this agreement.

Funding Agencies and the Responsible Use of Research Metrics

Many funding agencies are placing greater importance on responsible research evaluation and are setting rules around how and when you can use bibliometrics in funding applications; some have signed the San Francisco Declaration on Research Assessment (DORA, 2012). DORA signatories include funders such as Science Foundation Ireland and the HRB. It is vital to be aware of any restrictions a funder may have on using bibliometrics in certain parts of an application: using metrics where the funder has stated they are not allowed can render the entire application ineligible for review. Funding applications often require a narrative CV; for more information, please visit the Narrative CV LibGuide.

Tips to Increase Your Impact Responsibly

  • Maintain accurate publication records & author profiles. Make sure your papers are correctly attributed to you (e.g. in databases such as Scopus).
  • Make your work open access, either by publishing OA or by archiving it in a trusted research repository; OA papers are typically cited roughly 50% more than papers behind a paywall.
  • Build international collaboration, especially with highly cited authors; collaborating internationally creates network effects that typically amplify promotion and increase citations.
  • Think about the journals you publish in and their visibility and readership, to ensure you reach the audience most likely to cite your work.
  • Promote your research via conferences, videos, social media, ResearchGate, etc.
  • Optimise your titles and keywords for search engines so your papers rank higher for relevance in search results and other researchers are more likely to find, and potentially cite, them.
  • Share the data from your research and link your paper to the dataset; papers with accompanying data can attract more reuse and citations. "As open as possible, as closed as necessary."

How to Use Metrics Responsibly

  • Maintain accurate publication records & author profiles
  • State the data source you used (& time period)
  • Always compare like with like
  • Focus on article level metrics
  • Don’t use journal metrics (e.g. Journal Impact Factor) to demonstrate the impact of an article/researcher
  • Use more than one metric if possible/relevant and provide context and qualitative information
  • Focus on senior/lead author papers and demonstrating quality over quantity
  • Adhere to the funder’s guidelines where specified

The Leiden Manifesto for research metrics

The Leiden Manifesto for research metrics outlines "ten principles to guide research evaluation". It was presented as a guide to combat the misuse of bibliometrics when evaluating and assessing research literature.

The ten principles of the Leiden Manifesto are as follows:

  1. Quantitative evaluation should support qualitative, expert assessment.
  2. Measure performance against the research missions of the institution, group, or researcher.
  3. Protect excellence in locally relevant research. 
  4. Keep data collection and analytical processes open, transparent, and simple.
  5. Allow those evaluated to verify data and analysis.
  6. Account for variation by field in publication and citation practices.
  7. Base assessment of individual researchers on a qualitative judgement of their portfolio.
  8. Avoid misplaced concreteness and false precision. 
  9. Recognize the systemic effects of assessment and indicators.
  10. Scrutinize indicators regularly and update them.

This work is licensed under CC BY-NC-SA 4.0