The REF: how was it for you?

Research heads and other university staff on the burdens of submitting to the inaugural research excellence framework

February 27, 2014

The only thing we could liken the experience of preparing impact case studies to would be writing up a PhD thesis with 20 supervisors

In early December a short press release from the Higher Education Funding Council for England quietly announced that all 155 higher education institutions that had registered an intention to submit to the 2014 research excellence framework had done so inside the 29 November deadline.

Yet this straightforward statistic belies an administrative task that – despite Hefce’s best intentions – appears to have been significantly more onerous and possibly more bloody than ever before.

When, in 2006, Gordon Brown, who was chancellor at the time, abruptly announced that the research assessment exercise – the multimillion-pound process then used to assess the quality of university research – was to be scrapped, his intention was to replace it with a metrics-driven approach that would be vastly cheaper to administer.

However, none of the university heads of research who have spoken to Times Higher Education about the RAE’s eventual successor, the REF, report that the revised process was any less labour-intensive than the last RAE in 2008.

One says it was at least as onerous, while the others say it was more so. One estimates the required workload to have been three or four times greater.

No one working in higher education will be surprised that top of the list of reasons for this is the REF’s requirement for research impact case studies. This aims to allow assessment panels to examine the social, economic and cultural benefits of research beyond academia and was introduced partly as a concession to government in return for abandoning the metrics-driven approach favoured by Brown (which a feasibility study carried out in 2009 had deemed unworkable).

As this requirement was entirely unprecedented in previous research assessments, institutions report having had to do a lot of work to make sure that it was understood and that case studies were written and supported as required. Most heads of research who have spoken to THE cite this as the most difficult part of preparing their submissions. One is Barbara Pittam, director of academic services at the specialist London-based Institute of Cancer Research, which, despite submitting research to the REF in only two areas, both biomedical, is among the top 30 recipients of quality-related research funding (from an annual £1.6 billion, currently distributed on the basis of the results of the 2008 RAE).

To give an example of the work required, she says that her institute submitted a total of 86 pages of information on impact and research environment to the REF, compared with just 34 pages on “environment” and “esteem” measures in the 2008 RAE (the latter measure was subsequently dropped and does not feature in the 2014 REF).

The major difficulty in writing impact case studies, says Pittam, was acquiring the necessary evidence of research impact between 1 January 2008 and 31 July 2013 because much of it was information that institutions did not own or record for other purposes (such as the effect of research on public policy).

Another problem was working out which space-saving technical terms could safely be used in the case studies given the frequent requirement to satisfy three audiences at once (case studies will be examined by a range of assessors): “the peer review panel who need to confirm that the underpinning research is at least 2* quality, the scientists and clinicians who must agree it had patient impact, and the commercial sector who need to agree it had commercial impact”.

“The only thing we could liken the experience to would be writing up a PhD thesis with 20 supervisors,” she says.

In her view, the workload involved in developing case studies would not be justified if the rationale for it were simply the distribution of funding between higher education institutions; however, she believes that it is justified by the need “to be able to make the case to the Treasury for continued investment in the sector, particularly in the light of a very challenging spending review expected in 2015”.

Pittam wonders, though, whether this effort might be better confined to a separate exercise in order to avoid the likelihood that including it in the REF might push researchers towards more applied research, which, she says, “may not be the right answer for higher education institutions and the UK in the longer term”.

Now that the dust has settled, and the higher education sector awaits the results, what do staff at Hefce say in response to such concerns?

On the issue of workload, Graeme Rosenberg, REF manager at Hefce, says the REF was “streamlined” as much as possible, such as in “making the criteria consistent across the panels and aligning data requirements with data already reported to the Higher Education Statistics Agency”. But he recognises that “in other areas…additional workload was involved”.

Not only did the REF have the new impact requirement; there were also more detailed assessments of special circumstances for staff. While these aimed to make it easier for certain staff members – such as those who have been ill or have had periods of maternity leave – to submit fewer outputs than the standard four without being penalised, Pittam describes the handling of these cases as “very onerous”.

“It required new policies and processes, and our rough estimate is that it took an aggregated effort of 12 weeks of work,” she says.

A recent statistical release by Hefce reveals that nearly 30 per cent of academics submitted to the REF qualified for reduced outputs. About 2 per cent had “complex” circumstances that needed to be considered individually.

The head of research at another institution, who asks not to be identified, argues that the level of detail on special circumstances required by Hefce’s Equality and Diversity Panel, which had to rubber-stamp institutional decisions on non-standard exemptions, was excessive: “Submitting such sensitive [personal] information in the REF submission was worrying for individuals and a risk. For any future REF we suggest that the detail should remain within the institution but be available for audit.”

Rosenberg admits that the rules on special circumstances are more elaborate than in the RAE, but says they were developed in consultation with the sector and have been “widely welcomed as a significant improvement in supporting equality and diversity”. He insists that “sensitive” information, which institutions were advised to anonymise, was not made available to the REF’s 36 assessment panels.

For universities, the stakes regarding submission were especially high this time around. In 2011, following instructions contained in the government’s grant letter to Hefce, the funding council announced that it would stop allocating quality-related research funding in England on the basis of 2* research: QR income is now determined solely on the basis of 3* and 4* research (work judged to be “world-leading” or “internationally excellent” in the 2008 RAE), and this principle is not expected to be reversed. Given this, some observers expected universities to submit only outputs they deemed to be 3* and 4*, potentially leading to the exclusion of an unprecedentedly large number of academics.

In the event, however, institutions as a whole do not appear to have been significantly more selective about the staff they decided to enter in the 2014 REF than they were in 2008. According to Hefce, 28.1 per cent of the 185,585 academics in post in 2012-13 were submitted to the REF, compared with 30 per cent of the 174,945 staff in post in 2007-08 who were submitted to the 2008 RAE. But the responses from heads of research suggest that some institutions have indeed been more selective. Several say that one reason for the increased workload in the REF was the need to establish robust mechanisms for assessing the quality of research produced by staff.

Typically, heads of research say that their institutions did this by using a combination of internal and external peer review. One says that their institution used only internal assessment, but with “high-level monitoring/calibration”.

Michael Davies, pro vice-chancellor for research at the University of Sussex, says his institution used bibliometrics to back up its peer review “where appropriate”, as some of the REF panels will do. Because REF panels in biomedicine will be among those to refer to bibliometric measures, Pittam says the Institute of Cancer Research also examined “the number of citations for each paper [produced by a member of staff] versus the expected number for field and year”.

All stand by the robustness of their methodologies. One comments that the process “is not perfect but it is no more imperfect than any other peer-review process”.

But this is a view with which some rank-and-file academics beg to differ. With inclusion in the RAE/REF generally seen as an important step in a research-active academic career, one would expect universities’ decisions to be challenged by some of those excluded – but in October last year THE reported that Derek Sayer, professor of history at Lancaster University, had launched a “surreal” appeal against his inclusion in Lancaster’s submission. His department’s REF selection process, he claimed, had relied primarily on peer review by one external professor who lacked expertise in several areas.

He noted that Lancaster “requires external examiners for PhDs to be ‘an experienced member of another university qualified…to assess the thesis within its own field’ and also requires all undergraduate work to be both second-marked internally and open to inspection by an external examiner”.

“Why are those whose livelihood depends on their research – and its reputation for quality – not given at least equivalent consideration as the students they teach?” Sayer asked at the time. His appeal was turned down, and a Lancaster spokeswoman said that the university was confident that it was “making well-informed judgements as part of a careful decision-making process, which includes internal and external peer review”.

Some respondents to the recent THE Best University Workplace Survey 2014 – a survey of more than 4,500 higher education staff from 150 institutions across the UK – also worry about the robustness of their universities’ selection methods. One despairs of “the lack of fairness and equity that the REF blackmail culture has developed”. An academic in business and law says that staff in his unit were assessed purely on the basis of the Association of Business Schools’ journal ranking. Another respondent says the criteria are based “entirely” on journal impact factors and citations: “Quality of work and esteem with one’s own research community are totally ignored, and research fields which, for a variety of reasons, have lower citations and grants are being marginalised.”

All the heads of research surveyed by THE insist that there will be no negative career consequences for those excluded from the REF. Phil Hannaford, vice-principal for research and knowledge exchange at the University of Aberdeen, says that REF submission forms “just one part of the picture” on the basis of which academics’ contributions to his institution are measured.

“If someone was not submitted (for whatever reason) we look to see how best we might help that person be submitted next time (if appropriate),” he says – although he adds that the inevitable “loss of professional pride” if someone is not submitted “needs to be managed appropriately”. (In 2007, one academic excluded from the 2008 RAE formed a support group for other academics left out in the cold.)

But although the “no consequences” message has been repeated up and down the country, in the lead-up to the submission deadline THE was alerted to several cases in which academics believed it to have been undermined by subsequent words and actions (see ‘Penalty warnings: institutions where a REF red card might be a sending-off offence’ box, at the end of this article). Some respondents to the THE Best University Workplace Survey also speak of non-submitted staff being “bullied” or “intimidated” into administrative or teaching-only positions.

Many of the heads of research who spoke to THE admit to being concerned that controversy around selection decisions would damage morale – but most insist that their own approach has mitigated that risk. Hannaford says that his institution has sent out a “clear message that this is about the university trying to maximise resources and prestige so that we can provide an excellent research environment for everyone”. Another institution says that it has been “actively working with staff to ensure all processes used were transparent” and has emphasised that “staff contribute to the research environment in many ways, including important contributions to impact”. A third says that its REF preparation process has “undoubtedly provided a focus for increased output by some staff and this has improved their morale”.

Paul Hogg, vice-principal for research and enterprise at Royal Holloway, University of London, says that his institution’s “realistic and practical” staff “understand the process and by and large have accepted the judgements made”.

But while some respondents to the THE Best University Workplace Survey echo such sentiments, it is clear that the management of the REF process has caused a great deal of angst across the sector. Some complain about a lack of transparency. One academic describes the management of the REF process at her Russell Group institution as “poorly handled and alienating”. An academic at a former 1994 Group institution says: “The REF was handled unbelievably badly, partly owing to the personality of the then director of research who believed only in emails and not actually talking to anyone. It has certainly damaged my morale. Management has simply not wanted to talk about it – it’s all been top down and ‘shut up!’”

Many academics also worry about the long-term effect of the REF’s existence on departmental culture, strategy and staffing. Gripes include obsession with REF league tables, the downgrading of teaching, the distortion of research priorities and the corrosion of job security, all contributing to the creation of a “climate of fear”.

Overall, 35 per cent of academics responding to the THE Best University Workplace Survey agree with the statement “my university’s response to the REF has had a negative impact on my work” – although only fractionally fewer disagree.

Another common grumble is universities’ habit of poaching research “stars” from other institutions ahead of the census date. One academic at a pre-1992 university describes his institution’s “new focus on getting rid of excellent teaching staff and bringing in research staff for the REF” as “short-sighted and disrupting”.

Last-minute hiring sprees are often condemned by critics as one of several ways in which universities attempt to “game” the REF. Rumours also abound that certain institutions fared unduly well in 2008 league tables by submitting only their very top researchers – judging that the reputational gain of doing so would offset the loss of QR income. However, Hesa has now pledged to publish the proportion of eligible staff submitted by each institution in each unit of assessment – making it possible for league table compilers to weight their rankings accordingly – and this may have made the tactic less attractive this time around.

All the heads of research who spoke to THE insist that their REF returns paint an accurate picture of the quality of research at their institution – although one, who asks not to be named, laments that “a lot of worthwhile research that will never be considered 3* or above” had to be left out.

“There is a risk that research of value to, let’s say, ‘less fashionable’ parts of UK industry might be reduced by a focus on 3* and 4* outputs. But governments will always kowtow to the ‘big boys’, such as the pharmaceutical industry and the universities that support them,” he adds.

A new “game-playing” gripe has emerged this time around from the decision to index the number of impact case studies required from each department to the number of academics it submitted: in other words, the more staff submitted, the more case studies required.

Some institutions allegedly turned that formula around and determined the number of academics they submitted according to the number of high-quality case studies they could muster. Indeed, at least one institution – the University of Leicester – admits that it envisaged doing so.

None of the heads of research in THE’s survey admit to such tactics in more than a small number of units. One says that such an approach “would have been deeply unfair to colleagues”. But there is some anxiety that other institutions may have used it more widely. Pittam fears that if this is the case, “the snapshot picture of UK research, and the funding to underpin this, will be skewed towards the applied disciplines”.

The chief advantage of the REF cited by heads of research is that it allows them to benchmark their departments’ and institutions’ research. One laments that Hefce lacks “the courage to say at high level there are other proxy indicators that could provide the same distribution”. But most remain sceptical about the ability of metrics to replace peer review: concerns about such an approach include excessive counts for faddish research, variations in citation rates across and within subjects, and the difficulty of applying bibliometrics to the humanities and to recently published papers. The last problem “implies that, even in the sciences, citations could only be applied to slightly older papers, so there would be a two-tier system of evaluation that might be difficult to justify”, one head says.

Reflecting the personal flak some managers take for their decisions about which staff to submit, one research head suggests that Hefce and its panels should be obliged to run the same gauntlet: “University managers have to make decisions about inclusion of individual staff and stand by their decisions with those individuals. Hefce should be required to publish each individual grading decision [for the work of each academic submitted]”.

But Sussex’s Davies suggests that as the REF is aimed at assessing research quality at institutional level, it might be better to move beyond the submission of individuals in?future exercises and instead assess a unit of assessment’s best research regardless of which of its academics authored it.

Hefce’s Rosenberg expects the cost of the 2014 REF to be “broadly similar” to the £47 million officially estimated to have been the cost of the 2008 exercise to English institutions alone, which he points out is less than 1 per cent of the funding driven by the results.

But one head of research begs to differ, insisting that an “honest assessment” would put the cost at nearer to £200 million.

“The amount of valuable research that this could have funded is immense [and the results are] unlikely to tell us anything significant that we don’t know already,” he claims.

Whether this is true will be revealed in December, when the results are published. But he is surely right to say that his idea for improving the REF – that it be scrapped altogether – is “not going to happen”.

Penalty warnings: institutions where a REF red card might be a sending-off offence

Last summer Times Higher Education reported on fears within the University of Leicester that the institution was reneging on a pledge that there would be no negative career consequences for academics whose work was not submitted to the REF.

A memo from Mark Thompson, its senior pro vice-chancellor, noted that non-submission was “clearly an important performance indicator” and announced that the positions of all eligible staff who were not submitted would be reviewed.

Those without extenuating circumstances could either apply for a teaching-only position or commit to meeting certain research performance targets within a year. Failure to do so would normally result in “dismissal on the grounds of unsatisfactory performance”.

Extenuating circumstances would include a department’s submission being “constrained” by the limited number of impact case studies it intended to submit. A Leicester spokesman denied that this amounted to game-playing, noting that “all universities will seek to optimise their outcomes”.

Meanwhile, in September last year, a memo from Niall Piercy, the deputy dean for operations at Swansea University’s School of Management, announced that its academics would typically be moved into teaching-only roles if they did not have four papers deemed to be of at least 3* quality in the institution’s internal “mini-REF” exercise. The plans were dropped a few weeks later, but academics continued to complain that teaching allocations announced on the back of the mini-REF remained largely in place.

And in October, a survey by the University and College Union indicated that more than 10 per cent of academics at eight UK universities – including Leicester – believed themselves to have been told that failure to meet their institution’s REF expectations would lead to redundancy.

Across the sector, however, only 4 per cent of the nearly 7,500 respondents to the UCU survey reported having received such a message. Yet 10 per cent had been told to expect denial of promotion as a consequence of non-submission, 4 per cent to expect transfer to inferior terms and conditions, and 12 per cent to expect to be moved to teaching-focused contracts.

Only 35 per cent of respondents agreed that their institution’s selection procedures were transparent and 6 per cent said selections had been made without any input from peer review.

Reader's comments (2)

HOW TO MINIMIZE THE REF BURDEN WHILE MAXIMIZING UK RESEARCH IMPACT

Harnad, S., Carr, L., Brody, T. & Oppenheim, C. (2003) Mandated online RAE CVs Linked to University Eprint Archives: Improving the UK Research Assessment Exercise whilst making it cheaper and easier. Ariadne 35. http://www.ariadne.ac.uk/issue35/harnad

Harnad, S. (2006) Online, Continuous, Metrics-Based Research Assessment. Technical Report, ECS, University of Southampton. http://eprints.ecs.soton.ac.uk/12130/

Harnad, S. (2008) Validating Research Performance Metrics Against Peer Rankings. Ethics in Science and Environmental Politics 8 (11), The Use And Misuse Of Bibliometric Indices In Evaluating Scholarly Performance. doi:10.3354/esep00088. http://eprints.ecs.soton.ac.uk/15619/

Brody, T., Carr, L., Harnad, S. and Swan, A. (2007) Time to Convert to Metrics. Research Fortnight pp. 17-18. http://eprints.ecs.soton.ac.uk/14329/

Harnad, S. (2009) Open Access Scientometrics and the UK Research Assessment Exercise. Scientometrics 79 (1). http://eprints.ecs.soton.ac.uk/17142/

Harnad, S. (2008) Self-Archiving, Metrics and Mandates. Science Editor 31(2) 57-59. http://www.councilscienceeditors.org/members/secureDocument.cfm?docID=1916

Harnad, S., Carr, L. and Gingras, Y. (2008) Maximizing Research Progress Through Open Access Mandates and Metrics. Liinc em Revista 4(2). http://eprints.ecs.soton.ac.uk/16617/

Harnad, S. (2009) Multiple metrics required to measure research performance. Nature (Correspondence) 457 (785) (12 February 2009). http://www.nature.com/nature/journal/v457/n7231/full/457785a.html

Gargouri, Y., Hajjem, C., Lariviere, V., Gingras, Y., Brody, T., Carr, L. and Harnad, S. (2010) Self-Selected or Mandated, Open Access Increases Citation Impact for Higher Quality Research. PLOS ONE 5 (10) e13636. http://eprints.ecs.soton.ac.uk/18493/

Harnad, Stevan (2013) Harnad Comments on HEFCE/REF Open Access Mandate Proposal. Open access and submissions to the REF post-2014. http://eprints.soton.ac.uk/349893/
CRAVEN ARMS: “Stevan. It would be more helpful if you could just summarise your point for those of us with increasingly busy lives? Thanks.” For those with increasingly busy lives: Institutions, funders, assessors: Mandate immediate deposit of all publications in authors' institutional repository. It will maximize their usage and impact. Assessors: Harvest both publications and metrics for all articles and authors (citation counts, download counts, and other digital metrics of usage and impact) to help in assessment.
