September 20, 2023
Thank you for the invitation to appear before this important inquiry – and I emphasise important because the rapid acceleration of generative AI technology is considered by some to be as significant as the industrial revolution. Time will tell.
I will make a short opening statement and I am of course happy to take questions.
Go8 universities are responsible for 70 per cent of the sector’s research – with six members in the world’s top 50 universities and all eight ranked in the top 100. Each year we collectively spend $7.7 billion on research.
In our submission to this inquiry, we outlined what Go8 universities are already doing in this space to ensure the responsible and ethical use of these new technologies and to equip our students, staff and researchers with the knowledge and skills to lead in a generative AI‐enabled world.
There’s both excitement, and in some quarters apprehension, about what this all means for our society, our future and what it means to be human!
Generative AI is a relatively new frontier, but we should not forget that AI itself has been around for a long time – as has our research in this space – and it is that research which underpins the rapid evolution of AI into generative AI.
What is concerning, given this is such an important subject, is that Australia is lagging behind competitor nations when it comes to investment in AI and, indeed, research more broadly.
- the European Union has a target to invest €20 billion per year by 2030 as set in the EC Communication “Artificial Intelligence for Europe”
- the UK has a £1.8 billion program which adds to the significant private investment
- China has committed to becoming a world leader in AI by 2030
- and India ranks fifth among countries for AI investment received by startups offering AI‑based products and services.
While the Australian Government committed $100 million in the last budget to support businesses to integrate quantum and AI technologies into their operations, we have a lot of ground to make up.
As we navigate this new generative AI frontier it will be important for the Government to invest in research and development in AI and related technologies – to ensure Australia can develop its own generative AI resources.
There’s no question generative AI will impact our higher education sector significantly in the future and we are preparing for that eventuality.
It poses both opportunities and risks: risks such as unethical behaviour and cheating, threats to privacy and intellectual property, the perpetuation of bias, and concerns about equity.
But we cannot ignore the opportunities for teaching – allowing students to develop critical thinking and evaluative judgement skills by generating content from these platforms and critiquing the output – and the opportunities for researchers to use these tools in their work, whether to develop code, help design surveys or test methodologies.
We need to balance these opportunities with the need for academic and research integrity.
A study by University of Melbourne and Deakin researchers earlier this year on how generative AI was being used by students and academics produced some interesting results.
Some students felt it should be banned, others acknowledged its potential to advance learning. Some students were keen to embrace it, but many others were reluctant to engage.
Almost half of the students who responded to a survey as part of the study said they had not yet tried generative AI and fewer than one in ten had used it to generate content.
The majority of academics HAD tried generative AI and were able to identify many positive aspects to its use.
Our view is that attempting to prohibit the use of generative AI in higher education is both impractical and undesirable.
Go8 universities are leaning in to, not walking away from, the question of how we harness generative AI.
We place a premium on applying a principles- and ethics-based approach to the development of appropriate policies and strategies for the use of generative AI by our students, researchers and staff.
To that end, and to supplement what our individual universities are doing, we have developed a set of overarching principles, benchmarked against Australia's AI Ethics Principles, to guide our approach to generative AI tools across our universities.
In developing these principles, we also worked with our international partners in the UK – the Russell Group.
These principles, which I am happy to provide to the Committee, are underpinned by that other AI – Academic Integrity.
I’d just like to conclude with a couple of tangible examples of how generative AI is being used proactively in our universities.
At the University of Sydney, first year medical students were asked to compose a question about contemporary medical challenges and ask ChatGPT to write an essay on it. Students were asked to read and edit the response, track their changes and submit a final draft for marking.
This was a test of judgment, curiosity and creativity rather than simply collating information. We are in the business of critical thinking – not just memorising and reproducing information – which ChatGPT can do for us.
At the University of Melbourne, graduate software engineering students are using a ChatGPT‑enabled process to undertake peer code review, while Master of Teaching (Visual Arts and Design) students are using a combination of generative AI tools, such as ChatGPT and DeepDream, to learn about AI – for creative and speculative writing exercises as well as for developing designed objects and image-based works.
The challenge is not so much the tool itself – but how we build it into our teaching, learning and research.
I’ll conclude by saying that Go8 universities, no matter the new and emerging technologies that arise, remain committed to delivering quality education and quality research. Australia’s ability to detect risks and leverage opportunities hinges on our research capability. The only handbrake on that is investment in research and development.