Preliminary measures

After an earlier plan was scrapped by an incoming Labor government, Australia is finally set to launch its scheme to assess the quality of all academic departments. Zoë Corbyn reports

April 22, 2010

It is an anxious time for Australian academics. Between June and August, their universities will be making submissions to the country's first full Excellence in Research for Australia (ERA) initiative - the equivalent of the UK's research assessment exercise.

Although no funding is being allocated on the basis of the results of this inaugural round, that does not mean the results do not matter. An awful lot is riding on them, not least prestige and bragging rights. When the results are made public some time in 2011, they will for the first time give institutions an indication of how their research excellence compares with that of others across 22 broad fields and 157 subfields of research.

"It is seen as a really big reputational thing," says Linda Butler, head of the research evaluation and policy project at the Australian National University (ANU). As one of Australia's foremost metrics experts, she has been part of a consortium advising the Australian Research Council (ARC) on various aspects of developing the ERA.

"In Australia, we haven't had anything like an RAE. Nobody can say 'we have the best chemistry in Australia' or 'we have the best astronomy'. With tools such as university world rankings, ANU might have bragging rights for being the best university overall, but you can't then break it down. In the past universities have made claims - we have probably got about 10 'best' departments for chemistry - but there has never been an objective basis."

But it is more than just reputations that are at stake. "If an overseas student wants to do organic chemistry, where is the best place to go?" Butler asks. The results of the ERA could influence their choices. "There is a lot of money in international students for Australian universities, and I think this is seen as one method of attracting them."

The ERA is the brainchild of Kevin Rudd's Labor government, which developed it after scrapping the previously proposed Research Quality Framework (RQF). The move was a political response to concerns over the RQF's planned inclusion of an assessment of the economic and social impact of research. The RQF would also have replaced Australia's metrics-based system with one based on "burdensome" peer review.

The ERA comes with some ambitious goals. In addition to delivering a "national stocktake" of discipline-level research strengths and weaknesses, it is supposed to provide assurance that excellent research is indeed being conducted in Australia's institutions, to allow international as well as national comparisons, and to identify emerging research areas and opportunities for further development.

And although it is still only a policy tool at this stage, the expectation is that if outcomes are robust enough it will help determine how research funding is distributed, a job that currently falls to the Research Quantum (see box). "There are indications that it may not be long before money is riding on it," Butler says.

The ERA and the RAE share the same aim of identifying quality research, but the two are poles apart in their approach.

Unlike the UK's RAE, the ERA is based on number-crunching a host of different metrics, and it makes no use of traditional peer review except in the humanities and social sciences.

For each scientific discipline in a university, scores will be compiled from metric information such as publication counts, citation counts (the number of times an academic's work is cited by peers) and the ranking of the journals in which papers are published, along with data on grant income, commercial income and esteem measures. Expert panels advise on how best to combine the data to reveal quality, rather than undertaking a peer-review exercise.

In assessing quality in non-scientific disciplines, peer review is the backbone, but panels will be provided with some quantitative measures such as income and journal rankings.

"It is a basket of metrics," Butler says, who also emphasises that, unlike in the UK's RAE, all researchers and all outputs will be counted.

The most controversial element of the ERA is its decision to rank publication outlets based on their perceived quality. All journals have been placed in one of four categories: A*, A, B and C. Journals ranked A* are regarded as publishing the top 5 per cent of work in their discipline, A journals the next 15 per cent, B the next 30, and C the bottom 50. And all ERA panels - including those in humanities and social sciences - are expected to use information on the proportion of a university's output that falls into each category.
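The category thresholds themselves are simple enough to state in code. The sketch below, written in Python purely for illustration, maps a journal's within-discipline quality percentile to the four ERA bands; the percentile input is an assumption for the sake of the example, since the real rankings were assigned through consultation rather than calculation.

```python
# Minimal sketch of the ERA's four-band journal classification, assuming a
# journal's quality percentile within its discipline is already known (the
# actual rankings were produced by consultation, not by a formula).

def era_band(percentile: float) -> str:
    """Map a within-discipline percentile (0 = best, 100 = worst) to an ERA band."""
    if not 0 <= percentile <= 100:
        raise ValueError("percentile must be between 0 and 100")
    if percentile <= 5:      # top 5 per cent of work in the discipline
        return "A*"
    if percentile <= 20:     # next 15 per cent
        return "A"
    if percentile <= 50:     # next 30 per cent
        return "B"
    return "C"               # bottom 50 per cent

# Example: a journal judged to sit at the 12th percentile of its field
print(era_band(12))  # -> "A"
```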

There are concerns that the consultation with academics and professional bodies to determine journal rankings in each field has produced some odd results. Some claim that journals that do not publish significant volumes of Australian research seem to rank rather too low, while ones in which prominent local researchers have published widely tend to come out surprisingly high.

In philosophy, for example, the leading and well-established journal in the field of environmental ethics, Environmental Ethics, is placed in category B, which is the same rating given to Parrhesia: A Journal of Critical Philosophy, a relatively new journal with a strong Australian focus, first published in 2006, edited partly by students and featuring a high proportion of student work.

"The criteria the ARC issued originally as indicators of journal quality just do not correspond to the final rankings," argues Andrew Brennan, pro vice-chancellor (graduate research) of La Trobe University in Melbourne. "The trouble with a flawed ranking when published by a government body amid claims of wide consultation and expert verification is that it gets a status that is very hard to change."

And there are some "marvels" created by the use of percentages to divide journals into categories. Fields of research with relatively few journals can have only a tiny number of A* outlets. In other disciplines with many journals, A* publications can proliferate. The research field "curriculum and pedagogy" has 12 A* journals, while zoology has but four.

But there are also more fundamental concerns. Jeff Malpas, a professor of philosophy at the University of Tasmania, helped organise a letter of complaint against the ERA journal rankings in 2008, which was signed by a number of leading philosophers.

"The ERA is a joke," Malpas says. "It is anti-innovation and the results - which are generated almost mechanically - are designed to increase control over academics rather than trusting us. Peer review has its problems, but it is the only system that works."

His major concern is that the journal-ranking system inhibits unorthodox research by steering academics towards more mainstream work, which is more likely to be published in a high-ranking journal.

Malpas believes this makes academics less likely to pursue radical work, or work with great potential but in an early stage of development, which is more likely to appear in lower-ranked journals. He has similar concerns about the fate of interdisciplinary work.

Although Butler acknowledges the arguments, she does not believe that the system is as controversial as some make out.

"Many see it as an improvement over straight publication counts (which are used in the current Research Quantum formula), where there is no discrimination between a publication in Nature, the Canberra Journal of Frostbite Studies or the Tasmanian Journal of Footrot Research."

She notes that expert panels are involved in scrutinising all the metrics. "Metrics are being used in the way they should."

QUANTUM MECHANICS: How Australia Funds Research

Under Australia's "dual-support system", research project funding is provided via the Australian Research Council (ARC) and the National Health and Medical Research Council, while block federal funding for research is provided by the Department of Education, Employment and Workplace Relations (DEEWR).

The block funding (including money for research infrastructure and training) is distributed via a formula known as the Research Quantum, introduced in the early 1990s and still used today.

The three elements used for determining a university's Research Quantum funding are:

- the number of publications made by its academics (peer-reviewed journal articles, conference papers and book chapters all receive one point, while books earn five points irrespective of quality)

- the number of competitive grants it wins

- the number of students enrolled on higher degrees.
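For illustration, the publication element of this formula can be sketched in a few lines of Python. The per-item point values follow the description above; how publications are then combined with grant and student numbers is not specified here, so any weights for those elements would be hypothetical and are left out of the sketch.

```python
# Illustrative sketch of the publication-count element of the Research Quantum.
# Point values follow the description above: journal articles, conference papers
# and book chapters each earn one point; books earn five, irrespective of quality.
# How this total is combined with grant and student numbers is not specified here.

PUBLICATION_POINTS = {
    "journal_article": 1,    # peer-reviewed journal article
    "conference_paper": 1,
    "book_chapter": 1,
    "book": 5,               # irrespective of quality
}

def publication_points(outputs: dict[str, int]) -> int:
    """Total publication points for a university's counted outputs."""
    return sum(PUBLICATION_POINTS[kind] * count for kind, count in outputs.items())

# Example: 120 journal articles, 40 conference papers, 10 chapters and 6 books
example = {"journal_article": 120, "conference_paper": 40, "book_chapter": 10, "book": 6}
print(publication_points(example))  # -> 200
```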

The Excellence in Research for Australia project is being overseen by the ARC rather than the DEEWR, under the remit of Kim Carr, the innovation minister.
