A new discussion paper sets out to determine what works and what doesn’t when researchers seek to make an impact on policy.
The Alliance for Useful Evidence is part of the innovation charity Nesta and now consists of an international “open-access network of 2,500 individuals from across government, universities, charities, businesses and local authorities”.
It is committed, said its head Jonathan Breckon, to ensuring that the supply of research evidence – emanating largely from universities – is “timely, trustworthy and useful”. Yet it is equally interested in implementation and, two years ago, commissioned the EPPI-Centre at University College London to look at “what works in terms of research uptake”.
The findings have now been distilled into a paper, by Mr Breckon and his colleague Jane Dodson, titled “Using Evidence: What Works?”
Different researchers, they explain in the introduction, have tried to secure impact through methods such as high-level policy seminars or mentoring programmes. It is widely believed to be “a good thing to make researchers work side-by-side with decision-makers”. But is there any evidence for these and many other methods?
To find out, the researchers “identified 150 possible interventions from the research literature” (including such little-known techniques as “redteaming” and “dogfooding”) and carried out “systematic reviews” of the studies devoted to them.
They also looked at more general evidence from disciplines such as learning theory, psychology and information design into what kinds of intervention tend to make an impact on the behaviour of others.
They soon came across a number of surprises. They could find very little evidence for the much-touted value of “coproduction” or “co-operative inquiry” designed to “bring researchers closer to their audiences”. Far more robustly supported, according to Mr Breckon, were the advantages of researchers learning to think like marketers and home in on what their audience really wants.
To reach clinical nurses, for example, the report encourages researchers to present their work in a way “that’s going to make a difference at the hospital bedside – in language and formats that mean something to nurses. It might involve tailored messaging: adapting your language so that it makes sense to nurses, not just policy wonks”, while “avoiding blanket-wide distribution of newsletters, adverts or Tweets”.
Similarly, it is crucial to learn how to “frame” policy suggestions. There is evidence, notes the report, that “international development workers prefer avoiding losses, rather than acquiring gain”. Those who hope to influence them therefore need to present their proposals as “a way to prevent bad things”.
More generally, “Using Evidence” recommends the avoidance of “dry numbers” and the use of “narratives and metaphors as a powerful way to get your message across”. Even simpler, remember to “capture your audience mid-morning or after lunch, when some office workers are active on social networks”.
Today, argues Mr Breckon, many organisations have “a tremendous interest in learning more about evidence. Many coppers, for example, are very interested in results, in what works in catching criminals.”
Universities can meet them halfway by providing training sessions. The research indicates that these tend to be far more effective when they take place on site and are “embedded in their day-to-day work”.