The Group of Eight (Go8) welcomes the opportunity to provide feedback to the ARC Engagement and Impact (EI) 2018 draft submission guidelines, following on previous written Go8 input into the development of the measure.
The views expressed in this letter have been informed by consultation across the Go8 members. The Go8 institutions may make individual and more detailed responses to the draft guidelines with a particular focus on some of the operational and technical issues raised by the guidelines.
At a high level, the Go8 response to the guidelines is focused on:
- Assessment – ensuring there is broad consultation on the panel selection, assessment process and calibration of the rating categories;
- Impact case studies – clarity on the relationship between impact and the path to impact, and on the use of Socio-Economic Objective (SEO) codes rather than Field of Research (FoR) codes; and
- Timeframes – ensuring that a robust result is achievable in the given timelines.
This letter also includes more detailed responses to matters raised by the draft guidelines and considerations for future iterations of the EI assessment process.
Assessment

The ratings produced by the EI exercise will be one of the major ways in which the Government and the public evaluate the impact of our research. For this reason, it is critical that there be consultation on the composition and selection of evaluation panels, the process and methodology for considering submissions, and how the final ratings are determined through this process, i.e. rating calibration.
This is not fully described in Section 1.5 of the guidelines. Specifically, the Go8 believes that there should be more detail and consultation regarding:
- Panel member qualifications (e.g. desired attributes) and selection. We note the potential challenges in finding appropriate and informed research end-users for the panels and urge that further attention be given to ensuring academic experts are appropriately balanced with end-users.
- The rating scale of low, medium and high. While the EI Framework document lists these ratings, the final 2018 documentation should include a description of each rating, including its thresholds, and the proposed process by which panels determine the ratings they allocate. Guidance could, for instance, be given to panels on how to gauge reach and significance, including in the context of how much time has passed since the original research, and how these factors feed into their allocated ratings.
Impact case studies
After much discussion leading up to the Go8-ATN Excellence in Innovation for Australia (EIA) impact case study trial in 2012, it was decided that Socio-Economic Objective (SEO) codes were the best way to categorise case studies, as they most usefully identified where the impact was manifest in the real world.
While the impact case studies in the EI draft guidelines do allow SEO codes to be attached, these are secondary to the Field of Research (FoR) code attribution, which determines the basis of the case study.
This speaks to the tension between the approach to impact – university processes delivering impact, which often operate in disciplinary (that is, FoR-based) units – and the impact itself, which, as an outcome, is often best classified by SEO code.
Care also needs to be taken in assessments of approach to impact and of impact itself, which receive separate ratings in the EI draft guidelines. Highly rated approaches to impact are expected to be pathways that currently exist in universities, while highly rated impact – potentially taking 15 or more years to materialise – may have arisen out of impact pathways that are no longer active.
As approach to impact and impact are both included in the one case study, and are required to be linked within it, this assessment needs to be carefully fleshed out and considered in detail, with due consideration given to socio-economic objectives in the assessment of impact.
Timeframes

The timeframes for the EI and ERA submissions are tight – tighter than expected and, for ERA, tighter than the ERA 2015 timeframes. While universities will make best endeavours to meet the timeframes, they would benefit from the period being extended. The addition of a month between the release of the final EI guidelines and the EI submission should not overly compromise the ARC's ability to cross-analyse the inputs from both ERA and EI.
While we acknowledge the cross-over of ERA data inputs to EI, the effort needed to identify a shortlist of narratives, and then to produce many of them – including the engagement indicator explanatory statement, which was not trialled in the pilot – is very resource intensive. This is particularly the case for research-intensive universities, which might expect to reach the EI threshold in most two-digit FoRs and which are receiving no additional resourcing to undertake this exercise.
In some institutions, including Go8 universities, the personnel involved in the ERA and EI submissions are the same. Staff therefore face the challenge of preparing EI inputs while their institutions are still involved in ERA submissions, given there is no interval between ERA processes closing and EI submissions beginning on 16 May 2018.
The short timeframes for EI and ERA make this overlap of duties more challenging.
As an example of where the proposed timeframes may cause difficulties, the five-day turnaround period for data integrity checking (18–22 June) seems ambitious, even taking into consideration the re-use of ERA data (with its in-built data integrity checking) for the EI implementation. The ARC may wish to extend that turnaround period so that sufficient time is provided for this step in relation to new data inputs.
Institutional submissions will no doubt identify other instances such as these that should be given consideration.
More detailed considerations
The draft guidelines introduce, via the definition of research end-user, the significant exclusion of publicly funded research agencies and medical research institutes. We question whether a complete picture of university research engagement and impact can exclude such vital relationships. We note that such an approach is inconsistent:
- with the Department of Education and Training’s definition under new Higher Degree by Research reporting arrangements, which does not explicitly exclude publicly funded research organisations (PFROs) or medical research institutes (MRIs);
- with the explicit inclusion of government in the draft EI research end-user definition; government encompasses a range of organisations with which universities may have research dealings akin to those with PFROs, for example the Bureau of Meteorology and Geoscience Australia.
We note further that it may be challenging to determine whether an institution is an affiliate or controlled entity of an overseas higher education provider.
At a minimum, the Go8 seeks that the guidelines explain the exclusion as research end-users of PFROs, MRIs, and the affiliates, controlled entities and subsidiaries of higher education providers.
Go8 agrees with the proposed streamlined set of engagement indicators, and the associated narrative elements.
However, there is a risk of overlooking key engagements by too rigidly defining those Category 1 grant schemes relevant to EI. As an example, the Go8 proposes that the specified Category 1 schemes include the entire ARC Linkage Program, not just Linkage Projects.
We seek clarification in the final guidelines regarding how the ARC identified the specified Category 1 grant schemes (which constitute around a third of Category 1 schemes). For example, it is unclear what the rationale is for excluding schemes that would clearly attract engagement including cash support from end-users.
Use of EI information
Go8 notes the breadth of intended uses by the ARC of data submitted by institutions. In the broader spirit of better communicating the impacts of research, the Go8 seeks that the ARC work with institutions where specific case study marketing or showcasing activity is intended.
Future development

Go8 proposes that future development of the EI take into consideration how PFROs and MRIs can be included in the definition of research end-users, acknowledging the diverse functions some of these bodies (such as CSIRO) have beyond academic research.
Go8 notes that the 15-year window for measuring associated research may be too short to provide meaningful information on basic research whose impacts are being realised today, and advocates that development of the EI consider a longer period for subsequent rounds.
Regarding the additional quantitative data that institutions may provide to support their engagement narratives, Go8 agrees with the ARC's intention to examine the feasibility of including these in subsequent rounds. The Go8 strongly advocates that the ARC consult comprehensively on any new indicators well in advance of circulating draft processes for the subsequent EI round. We note our concerns regarding the use as future indicators of some of the 'additional quantitative' data listed in Appendix F – such as patents, in-kind support and co-funding of research outputs – without clear definitions or parameters.
In future development, consideration could be given to the possibility of incorporating some of the metrics used to incentivise Knowledge Exchange in the United Kingdom under the Higher Education Innovation Funding (HEIF) system. These include indicators on:
- Contract research
- Collaborative research
- Continuing professional development and continuing education
- Regeneration and development programmes
- Facilities and equipment services
- Intellectual property (including sale of shares)
I look forward to continued collaboration on the development of the EI measure. Please do not hesitate to contact me to discuss any of the above at email@example.com or 02 6175 0700.
This includes input in June 2016 in response to the Discussion Paper, and input following the pilot by letters to Professor Thomas on 7 July 2017 and 25 August 2017.