This is technically not an infringement of the policy and you should be able to publish elsewhere.
Articles published in journals perceived to have greater reach and rigour may have greater potential to be read, understood and used. If a Research Group is developing a recommended list of journals to publish in, it is advised that they use multiple metrics to corroborate the findings, alongside expert judgement. There should be no consequences for authors who choose to publish elsewhere if that journal is the most suitable venue for their output.
You should publish in the most appropriate venue for your research. Ideally, you should publish in journals that meet benchmarking criteria such as: ‘have they published similar research recently?’, ‘do my colleagues have access?’ and ‘is the journal well indexed in internet search engines?’
You should check the aims and scope and the instructions for authors before submitting to a journal. There are other criteria unrelated to journal prestige that you may also need to consider, such as open access policies and the cost or speed of publishing. Field-normalised journal metrics can be used to help make a decision but other factors need to be taken into account.
Maybe. Such indicators typically look at people or papers in the top 1% of their discipline, a very narrow threshold that few will meet and that highly prolific authors are more likely to meet. No doubt a prolific, highly cited author is impactful, but there are many different types of impact and ways of achieving it. Furthermore, citation, publication and authorship cultures differ across geographies and disciplines. Therefore, being on a highly cited list may indicate impact, but does not guarantee it; nor does absence or a low ranking indicate a lack of impact.
No, the H-index is a flawed indicator of the quality and quantity of an individual’s research. The H-index is the largest number (h) such that an author has published h papers that have each been cited at least h times. This indicator is therefore strongly influenced by discipline, publication volume, career length and effective career length. This makes it very difficult to use responsibly and consistently, especially when assessing Early Career Researchers or individuals with protected characteristics. The H-index has been severely criticised by some funders, including UKRI.
Here is an extreme example which demonstrates the problems with the H-index: an individual could have been cited 1,000 times but will only have an H-index of 2 if they have only published twice, while an author with 1,000 published articles will have an H-index of 3 if they have 997 uncited papers and 3 papers each cited 3 times. In this case, an H-index of 3 is not an improvement on 2. There are some variations of the H-index that try to improve on it, but it is not clear whether any can be used responsibly. We recommend avoiding the H-index, but if it is required, please contact the Library for support.
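The arithmetic in this extreme example can be checked with a short sketch of the H-index calculation (the function name `h_index` is illustrative, not a reference to any particular bibliometrics tool):

```python
def h_index(citations):
    """Return the H-index: the largest h such that the author has
    h papers each cited at least h times."""
    h = 0
    # Rank papers from most to least cited; h grows while the paper
    # at rank i still has at least i citations.
    for i, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= i:
            h = i
        else:
            break
    return h

# Author A: two papers cited 500 times each (1,000 citations total).
print(h_index([500, 500]))          # 2

# Author B: 997 uncited papers plus 3 papers each cited 3 times.
print(h_index([3, 3, 3] + [0] * 997))  # 3
```

As the output shows, Author B "outscores" Author A despite attracting only 9 citations to Author A's 1,000, which is exactly the distortion the example describes.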
Without metrics we could not substantiate many of our claims as individuals or as an institution. Careful consideration needs to be given to why you are using a metric. Many metrics were developed responsibly to answer a specific question. We should aim to use research metrics as little as possible and only as much as necessary, after considering appropriateness, transparency, reproducibility and equality.
Read the job description, competency framework, or promotion guidance website carefully to understand what information can be included in support.
If an assessor asks you to provide information about the impact or quality of your work, it is advised that you provide written context alongside quantitative evidence or other qualitative supporting evidence. If, on balance, you wish to supply metrics, you should provide contextual information to support the findings. If the chosen metric is inappropriate, or the assessor cannot apply it to everyone they are assessing, it will not be taken into consideration, in line with the Responsible Research Metrics Policy. If a metric is being used, assessors are asked to abide by the Responsible Research Metrics Policy and to consider protected characteristics when analysing the metrics. Please see our Responsible Research Metrics Policy for more information.
If you wish to supply metrics in support of your application, there is helpful information in the ‘Responsible Metrics Matrix’ in the Tools section of the Bibliometrics pages.
No, you cannot. High-quality outputs as judged by peer review may inform a probation meeting, but the impact factor is only appropriate for assessing the quality of a journal, not individual articles or a person. Journal metrics such as SNIP and SJR are better for assessing a journal but are equally inappropriate for assessing individual articles or a person. Those metrics may be used to help inform an author’s decision to publish in a journal, on balance with the aims and scope, editorial board, other works published, readership and rigour of peer review, but they should not be used in the assessment of individual outputs.
It is highly recommended that assessors ignore any reference an individual makes to journal impact factors or journal rankings when assessing the quality of a piece of research. An individual should not be penalised for including journal-based metrics, but if they fail to explain why the research is important, impactful or rigorous, journal-based metrics should not be used as a substitute for this missing information; to do so would go against the Responsible Research Metrics Policy. The H-index is also commonly supplied by individuals within some disciplines. This index is biased and lacks precision, and while some biases are diminished when comparing like for like, it can still have a negative impact on individuals with protected characteristics.
If metrics will be used during an assessment, a suite of metrics capable of corroborating each other's results should be used alongside expert judgement. Please ensure that the use of metrics is reassessed for each job role or promotion review cycle.