Australia’s “rewriting” of its research assessment exercise is an “amazing opportunity” to reward desirable practices that have long been overlooked, Times Higher Education’s World Academic Summit has heard.
Emma Johnston, deputy vice-chancellor of research at the University of Sydney, said national evaluation was at a “very interesting experimental stage” following the federal government’s decision to retire the Excellence in Research for Australia (ERA) and Engagement and Impact (EI) assessments.
She said the process to replace ERA – which had been “gameable” and “very time-consuming for institutions” – was in its early stages and “a bit of a black box”. But she was “optimistic” about the outcome.
“When you look at…the sophistication that’s arising through new metrics-based approaches, we have opportunities to reward teamwork,” Professor Johnston told the summit.
“Is someone publishing always by themselves as the first author, or are they clearly playing a team game? Are they contributing to multiple different projects? Are they interdisciplinary in their research? Are they reading widely and citing a broad range of diverse literature from multiple countries? We can reward these things. We can assess them through metrics-based approaches,” she said.
While Canberra has accepted a review recommendation to scrap ERA and EI, education minister Jason Clare has asked the Universities Accord panel to propose a new research assessment model in its final report in December.
“Research remains a key element of all Australian universities, and reform of the performance measurement and management of university research is critical to ensure that the future contribution of universities is driven effectively,” the government’s response to the review explains.
Professor Johnston said that whatever model was chosen, it should have a “multiplicity of approaches” to evaluation. “As soon as you start evaluating, people start moving towards that goal,” she said. “You inevitably shape activity, culture and behaviour. If you only have a single goal, everyone’s going to focus on that at the expense of the others.”
The new model should also be able to “evolve”, she added. “We have to critique it, because the longer a metric stays around, the more people are able to game it – and the more it drives potentially perverse behaviours as a consequence.”
Professor Johnston said the new assessment should also harness new artificial intelligence and machine learning capabilities. She said her “on-the-side research project” was analysing a 10-year sample of student evaluations to test whether they could be used in the university’s academic excellence framework.
The team’s “sentiment analysis” had detected student biases against female academics in STEM disciplines and against almost all academics from non-English-speaking backgrounds. Students’ gender and nationality also affected their appraisals. “The students are choosing, for example, to assess female academics on some aspects of education and male academics on other aspects of education.
“It is this wonderful power of machine learning and artificial intelligence that’s enabling us to get at it. It means that we don’t have to throw the baby out with the bathwater. We want our students to be evaluating our teaching, informing us and allowing us to progress and improve, so we don’t want to throw out those evaluations. But we also want to eliminate or minimise the potential for bias within those systems.”