There is, it seems, no rest between Research Excellence Frameworks.
Barely 72 hours after the release of the REF 2021 results, the first email landed. Sent on behalf of an anonymous university working group “set up to look specifically at data capture for the next REF cycle”, it linked me to an Excel spreadsheet. This contained 27 columns, each with a detailed question about research collaborations, talks and lectures, public engagement, media appearances, contributions to the discipline, PhD training – the usual jazz – over the past 18 months. To be filled in and returned “if possible” within three weeks.
I mention this not to criticise or poke fun at my own university. Tens of thousands of academic researchers across the UK could share a similar story. And there is, of course, a managerial logic to such efforts. As a former “impact lead” for my faculty, I know the importance of strategies, plans and support structures. And as someone who studies research policy, I applaud efforts to improve the patchy data and limited understanding we have of so many aspects of research cultures and impacts.
Yet how much of this has anything to do with “the next REF”, and how much reflects broader dynamics: the remorseless proliferation of process designed – in theory – to support and strengthen the research enterprise, while quietly choking it to death? It is, after all, convenient to pin blame on an externally imposed process, as a diversion from diagnosing deeper causes: whether in policy and the long-term underfunding of research; in the priorities of our institutional leaders; or in ourselves. Not for the first time in the aftermath of a research assessment exercise, I find myself reflecting on the wisdom of the observation that “auditors are not aliens: they are a version of ourselves”.
You now have to be over 60 to remember life before the REF or its predecessors. The management of UK university research has shaped the exercise – and been shaped by it. This makes reform difficult.
Yet to the credit of Research England, while “the next REF” may already have taken form in faculty spreadsheets, it is in reality more open to overhaul than at any point in recent years. Steady growth in public R&D spending through to 2025, closer integration between quality-related (QR) and other funding through UK Research and Innovation, and a fresh impetus to improve research cultures are all aligning to support a radical rethink.
Enter the Future Research Assessment Programme (FRAP), with its concurrent streams of evaluation and analysis intended to inform the design of any future exercise. As part of this, I’ve been asked – with Stephen Curry and Elizabeth Gadd – to carry out a short review.
My hope is that when the strands of the FRAP are woven together in the autumn, at least four things will result.
First, the REF’s aims will be clarified. Its stated purposes have steadily expanded beyond the original three, and there’s now talk of more. Until we rationalise these – and work out which belong in the REF and which should be addressed as part of broader university management – we can’t have a sensible debate about costs, benefits and burdens.
Second, we should sling the lingo. When even the chief executive of UKRI acknowledges that “no one knows what research excellence means”, we need to ditch the phrase – along with other terms, such as “world-leading”. They contribute to an illusion of objectivity around an exercise that is, of course, subjective and negotiated.
We may as well classify research “red”, “orange”, “purple” and “pink”. What the REF process tells us is not how much “pink” exists in the UK or where all the “pink” can be found – let alone what share of global “pink” is in the UK. No, it tells us that a group of experts have agreed at a unit of assessment level – under the umbrella of REF rules – what looks like “pink” in the submissions they have to assess. There’s nothing wrong with this, but let’s strip away the terminological nonsense and hubris.
Third, we need to move up the level of assessment within institutions. The Stern reforms went some way towards loosening the link between individuals and outputs, but next time around, we need to rethink whether 30-plus units of assessment, based around largely traditional disciplines, is the best approach – particularly when so much policy and funding is pushing for greater interdisciplinarity.
Finally, as others have argued, there is a strong case to start shifting the balance of the REF away from retrospective audit (or summative assessment) and towards prospective culture change (or formative assessment). This needs to be approached carefully to avoid unleashing its own new industry of spreadsheets and army of “culture managers”. But, with thought, the next framework could become a powerful way of incentivising and rewarding the improvements in research culture that so many of us want to see.
James Wilsdon is Digital Science professor of research policy at the University of Sheffield and director of the Research on Research Institute (RoRI).