Many studies are emerging on student use of Generative Artificial Intelligence technologies (GenAI). Our project is unique because we are:
- Talking to students about their everyday uses and perceptions.
- Reaching out to a broad group of students from different subjects, disciplines and levels of study.
- Collaborating across institutions.
- Taking a relational view, in which students’ beliefs, understandings, backgrounds and learning conditions form part of how we understand AI in use.
Let’s take a closer look at each of these.
Talking to Students
Our approach to understanding student uses of, and perspectives on, GenAI means giving students a safe space to talk about their everyday uses, thoughts, beliefs and worries.
We believe it’s important to talk to students about their diverse motivations, ideas, and – importantly – the tensions in their minds about how, when and why to use AI in their learning. Our study has already shown that students don’t make choices about GenAI lightly. They take the risks and benefits seriously, and struggle with the challenge of weighing these against each other.
Our Breadth of Focus
Another thing that makes us unique is our broad focus. Many current studies look at nursing students, or law students, or some other subset of students distinguished by course, enrolment method, or the task in question. There is a lot of value in these projects (indeed, some of us within the research team are running discipline-specific projects). However, our study extends to all students within four Australian universities: online and in-person, postgraduate and undergraduate, young and old, across subjects and disciplines. The results we are finding are all the richer for this diversity. If you are a student over 18 and currently enrolled in one of our universities, you can take part in our study (we will be sending out links to our survey between August and September 2024).
Our Collaborative Foundation
Our project is intentionally built on a broad collaborative foundation, bringing together expertise from four Australian universities. This collaboration allows us to draw on a wide knowledge base, including experts who have been studying AI since well before 2022 (when ChatGPT was released), enriching our research with diverse insights.
By working across institutions, we are also leveraging an innovative, shared funding model to foster evidence-informed understandings of student experiences with AI. This collective effort enables us to pool resources, share methodologies, and engage in meaningful dialogue, ensuring a comprehensive analysis of how AI is impacting higher education.
Our Relational View
We take our definition of AI from Bearman and Ajjawi (2023, p. 4):
“We define AI according to a relational epistemology, where, in the context of a particular interaction, a computational artefact provides a judgement about an optimal course of action and that this judgement cannot be traced.”
This means that any AI technology (e.g. ChatGPT, Claude, Research Rabbit) is always part of a bigger sociotechnical ensemble – a combination of multiple technologies engaged with in a social context (see Bijker, 1993). Rather than seeing a student’s use of an application in isolation, we see it as always entangled with other artefacts, behaviours, social arrangements and meanings (Johnson and Verdicchio, 2017).
Co-labs: Talking with students
We have also focused on co-labs as a process of partnership between students and other stakeholders, such as academics and industry representatives, who share different but equally important perspectives. Through co-labs, we co-create understanding as well as resources.