
In the media: Ranking on income ‘a dud measure’

April 28, 2015

The Australian, 29 Apr 2015

By Andrew Trounson

A proposal to measure research engagement by ranking universities on income from end users risks being meaningless unless backed up by peer review assessment, the Innovative Research Universities network has warned.

At the centre of what is potentially a high-stakes fight over how to motivate universities to better engage with end users such as industry is the question of whether a simple metric can accurately capture the impact of research. Some claim impact can only be assessed by comparing case studies.

Britain has adopted a case study approach.

Industry and science minister Ian Macfarlane has made clear he wants a greater focus in the sector on producing commercial returns on research.

Presently, the incentives for researchers are driven by producing academic outputs such as journal articles that are assessed by the Excellence in Research for Australia framework.

Last week the Academy of Technological Sciences and Engineering released a report saying that its proposed Research Engagement for Australia metric would be an easy measure of engagement, and that early testing had shown it could highlight differences between de-identified universities that diverged from the ERA results. It argues that, if adopted, the metric would encourage universities to better reward researchers who engage with industry.

“If you start measuring something it will modify behaviour,” ATSE vice-president Peter Gray said. Professor Gray said income was a good proxy for engagement because end users would be discerning investors and could be expected to spend money only where they thought there would be sufficient impact.

But IRU executive director Conor King was unimpressed.

“ATSE has released a set of numbers. There is nothing to say it means anything,” he said.

Mr King criticised ATSE for proposing a proxy for engagement without first establishing whether the end user money a university was attracting was delivering benefits.

To determine that would require a qualitative assessment of how the money was being used, perhaps using case studies, and setting a benchmark of good practice or world standard against which to measure universities rather than a simple ranking. Mr King noted that ERA took this approach. He warned that simply adopting the ATSE metric “will risk discouraging some people and falsely rewarding others with a dud measure”.

The Group of Eight was cautious, warning that if used it would need to be part of a broader framework of assessment.

“It is a step in the right direction. But you wouldn’t want this alone to be the response to driving engagement with industry. It would need to be part of a broader response,” Go8 CEO Vicki Thomson said.

The Australian Technology Network said it was working on a broader set of metrics. “Although this report is a significant first step there is more to be done to actively and comprehensively capture engagement between universities and industry,” ATN executive director Renee Hindmarsh said.

Education Minister Christopher Pyne said ATSE’s approach had “the potential to increase the return on public investment in science, technology, engineering and maths research as well as research in humanities and social sciences”.

The sense that ATSE’s approach is gaining momentum is reinforced by the Queensland and South Australian governments agreeing to fund a stage two development of the measure.

A case study approach was previously championed by the ATN, which conducted a pilot exercise with the Go8. But there were concerns that case studies would be too expensive. Australian Research Council head Aidan Byrne has been wary, preferring a metric approach.

Professor Gray said ATSE was not opposed to trying to measure impact qualitatively through case studies, but warned that the cost was likely to be prohibitive at a time of budget cuts.

Source: The Australian
