Policy & Compliance

The University of Salford is committed to excellent research with impact, conducted to the highest standards of research integrity. To achieve this, and to honour our commitment to our research participants, our research community must operate to the highest standards of research ethics and integrity.

Annual Statement on Compliance with the Concordat to Support Research Integrity

As part of our commitment to the Concordat to Support Research Integrity, we publish an annual statement on our activities to support research governance and integrity at the University. The report for the 2019-20 academic year is now available.

Nagoya Protocol

“The Nagoya Protocol on Access to Genetic Resources and the Fair and Equitable Sharing of Benefits Arising from their Utilization to the Convention on Biological Diversity is an international agreement which aims at sharing the benefits arising from the use of genetic resources in a fair and equitable way” (https://www.cbd.int/abs/, August 2020).

Each country has rights over its genetic resources (such as animals, plants and organisms) and the traditional knowledge associated with them. The Nagoya Protocol was designed to ensure the equitable sharing of these genetic resources and their associated traditional knowledge (aTK) and the benefits that arise from their use.

Established in 2010, the Nagoya Protocol puts the access and benefit-sharing (ABS) principles of the Convention on Biological Diversity into a legally binding framework.

From 21st October 2014, anyone wishing to access genetic resources and/or the traditional knowledge associated with them must comply with the EU regulation.* The regulations in the UK apply to any company, organisation or individual conducting research and development on genetic resources and/or aTK where:

  • The genetic material and/or aTK was accessed on or after 12 October 2014, and
  • It was accessed from a country that is party to the Nagoya Protocol and has ABS legislation in place

It does not apply to:

  • Human genetic resources
  • Genetic resources for which access and benefit-sharing is governed by specialised international instruments (such as the International Treaty on Plant Genetic Resources for Food and Agriculture)

The Protocol does not apply to activities that took place before the regulation came into force.

For the most up-to-date information on the Nagoya Protocol and on complying with it, the Convention on Biological Diversity and UK Government websites should be the first points of reference.

* On 15th November 2018, the Department for Environment, Food & Rural Affairs published a statement confirming that ‘regulations in the UK which implement the Nagoya Protocol […] will continue to be operable after the UK leaves the EU’.

DORA

The University is a signatory to the San Francisco Declaration on Research Assessment (DORA). You can read this declaration below, or at https://sfdora.org/read/.

Read the DORA Declaration

There is a pressing need to improve the ways in which the output of scientific research is evaluated by funding agencies, academic institutions, and other parties. To address this issue, a group of editors and publishers of scholarly journals met during the Annual Meeting of the American Society for Cell Biology (ASCB) in San Francisco, CA, on December 16, 2012. The group developed a set of recommendations, referred to as the San Francisco Declaration on Research Assessment. We invite interested parties across all scientific disciplines to indicate their support by adding their names to this Declaration.

The outputs from scientific research are many and varied, including: research articles reporting new knowledge, data, reagents, and software; intellectual property; and highly trained young scientists. Funding agencies, institutions that employ scientists, and scientists themselves, all have a desire, and need, to assess the quality and impact of scientific outputs. It is thus imperative that scientific output is measured accurately and evaluated wisely.

The Journal Impact Factor is frequently used as the primary parameter with which to compare the scientific output of individuals and institutions. The Journal Impact Factor, as calculated by Thomson Reuters*, was originally created as a tool to help librarians identify journals to purchase, not as a measure of the scientific quality of research in an article. With that in mind, it is critical to understand that the Journal Impact Factor has a number of well-documented deficiencies as a tool for research assessment. These limitations include: A) citation distributions within journals are highly skewed [1–3]; B) the properties of the Journal Impact Factor are field-specific: it is a composite of multiple, highly diverse article types, including primary research papers and reviews [1, 4]; C) Journal Impact Factors can be manipulated (or “gamed”) by editorial policy [5]; and D) data used to calculate the Journal Impact Factors are neither transparent nor openly available to the public [4, 6, 7].

Below we make a number of recommendations for improving the way in which the quality of research output is evaluated. Outputs other than research articles will grow in importance in assessing research effectiveness in the future, but the peer-reviewed research paper will remain a central research output that informs research assessment. Our recommendations therefore focus primarily on practices relating to research articles published in peer-reviewed journals but can and should be extended by recognizing additional products, such as datasets, as important research outputs. These recommendations are aimed at funding agencies, academic institutions, journals, organizations that supply metrics, and individual researchers.
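As background to the deficiencies listed above, the Journal Impact Factor for a given year y is conventionally calculated as a two-year citation average:

\[
\mathrm{JIF}_y = \frac{C_y(y-1) + C_y(y-2)}{N_{y-1} + N_{y-2}}
\]

where \(C_y(k)\) is the number of citations received in year y by items the journal published in year k, and \(N_k\) is the number of citable items (typically research articles and reviews) the journal published in year k. This construction is why the problems above arise: a handful of highly cited papers can dominate the numerator (skew), and the mix of article types counted in the denominator varies across journals and fields.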

A number of themes run through these recommendations:

  • the need to eliminate the use of journal-based metrics, such as Journal Impact Factors, in funding, appointment, and promotion considerations;
  • the need to assess research on its own merits rather than on the basis of the journal in which the research is published; and
  • the need to capitalize on the opportunities provided by online publication (such as relaxing unnecessary limits on the number of words, figures, and references in articles, and exploring new indicators of significance and impact).

We recognize that many funding agencies, institutions, publishers, and researchers are already encouraging improved practices in research assessment. Such steps are beginning to increase the momentum toward more sophisticated and meaningful approaches to research evaluation that can now be built upon and adopted by all of the key constituencies involved.

The signatories of the San Francisco Declaration on Research Assessment support the adoption of the following practices in research assessment.

General Recommendation

1. Do not use journal-based metrics, such as Journal Impact Factors, as a surrogate measure of the quality of individual research articles, to assess an individual scientist’s contributions, or in hiring, promotion, or funding decisions.

For funding agencies

2. Be explicit about the criteria used in evaluating the scientific productivity of grant applicants and clearly highlight, especially for early-stage investigators, that the scientific content of a paper is much more important than publication metrics or the identity of the journal in which it was published.

3. For the purposes of research assessment, consider the value and impact of all research outputs (including datasets and software) in addition to research publications, and consider a broad range of impact measures including qualitative indicators of research impact, such as influence on policy and practice.

For institutions

4. Be explicit about the criteria used to reach hiring, tenure, and promotion decisions, clearly highlighting, especially for early-stage investigators, that the scientific content of a paper is much more important than publication metrics or the identity of the journal in which it was published.

5. For the purposes of research assessment, consider the value and impact of all research outputs (including datasets and software) in addition to research publications, and consider a broad range of impact measures including qualitative indicators of research impact, such as influence on policy and practice.

For publishers

6. Greatly reduce emphasis on the journal impact factor as a promotional tool, ideally by ceasing to promote the impact factor or by presenting the metric in the context of a variety of journal-based metrics (e.g., 5-year impact factor, EigenFactor [8], SCImago [9], h-index, editorial and publication times, etc.) that provide a richer view of journal performance.

7. Make available a range of article-level metrics to encourage a shift toward assessment based on the scientific content of an article rather than publication metrics of the journal in which it was published.

8. Encourage responsible authorship practices and the provision of information about the specific contributions of each author.

9. Whether a journal is open-access or subscription-based, remove all reuse limitations on reference lists in research articles and make them available under the Creative Commons Public Domain Dedication [10].

10. Remove or reduce the constraints on the number of references in research articles, and, where appropriate, mandate the citation of primary literature in favor of reviews in order to give credit to the group(s) who first reported a finding.

For organizations that supply metrics

11. Be open and transparent by providing data and methods used to calculate all metrics.

12. Provide the data under a licence that allows unrestricted reuse, and provide computational access to data, where possible.

13. Be clear that inappropriate manipulation of metrics will not be tolerated; be explicit about what constitutes inappropriate manipulation and what measures will be taken to combat this.

14. Account for the variation in article types (e.g., reviews versus research articles), and in different subject areas when metrics are used, aggregated, or compared.

For researchers

15. When involved in committees making decisions about funding, hiring, tenure, or promotion, make assessments based on scientific content rather than publication metrics.

16. Wherever appropriate, cite primary literature in which observations are first reported rather than reviews in order to give credit where credit is due.

17. Use a range of article metrics and indicators on personal/supporting statements, as evidence of the impact of individual published articles and other research outputs [11].

18. Challenge research assessment practices that rely inappropriately on Journal Impact Factors and promote and teach best practice that focuses on the value and influence of specific research outputs.

References

1. Adler, R., Ewing, J., and Taylor, P. (2008). Citation statistics. A report from the International Mathematical Union.
2. Seglen, P.O. (1997). Why the impact factor of journals should not be used for evaluating research. BMJ 314, 498–502.
3. Editorial (2005). Not so deep impact. Nature 435, 1003–1004.
4. Vanclay, J.K. (2012). Impact Factor: Outdated artefact or stepping-stone to journal certification? Scientometrics 92, 211–238.
5. The PLoS Medicine Editors (2006). The impact factor game. PLoS Med 3(6): e291. doi:10.1371/journal.pmed.0030291.
6. Rossner, M., Van Epps, H., and Hill, E. (2007). Show me the data. J. Cell Biol. 179, 1091–1092.
7. Rossner, M., Van Epps, H., and Hill, E. (2008). Irreproducible results: A response to Thomson Scientific. J. Cell Biol. 180, 254–255.
8. http://www.eigenfactor.org/
9. http://www.scimagojr.com/
10. http://opencitations.wordpress.com/2013/01/03/open-letter-to-publishers
11. http://altmetrics.org/tools/
*The Journal Impact Factor is now published by Clarivate Analytics.

Research Excellence Framework (REF)

The next Research Excellence Framework submission will be made in March 2021. You can find out about our preparations for this exercise on the University’s internal REF site at www.salford.ac.uk/ref (requires network login off-campus or on a mobile device), where you will find information on all our internal processes and policies.

Below you will find a link to the University’s REF Privacy Notices, which provide information on how and why we collect, use and store data as part of our REF2021 preparations and submission.