Listen to Dr. Chris DeLuca, Professor of Educational Assessment and Associate Dean, School of Graduate Studies, talk about AI in the classroom: ways to leverage it to enhance student learning, why we shouldn't be afraid of it, and how to get started using AI.

Find out more about Chris DeLuca.

Song: Talking about innovation in teaching and education, Popular Podagogy. Discussions that are topical and sometimes philosophical, Popular Podagogy. Popular Podagogy.

CC: Hi there. Thanks for joining us and welcome to another episode of Popular Podagogy, where we try to bring big ideas in teaching and education to life. I'm your host, Chris Carlton, and this podcast is being brought to you by the Faculty of Education at Queen's University. Welcome to the podcast. In this episode, I am excited to be speaking with Dr. Chris DeLuca, who is an Associate Dean in the School of Graduate Studies at Queen's University and a Professor of Educational Assessment at the Faculty of Education. Chris leads the Classroom Assessment Research Team and is a Director of the Queen's Assessment and Evaluation Group. Our exciting and topical discussion for this podcast is Ways to Leverage AI to Enhance Student Learning. Chris's research examines the complex intersection of curriculum, pedagogy, and assessment as operating within the current context of school accountability and standards-based education. His work largely focuses on supporting teachers in negotiating these critical areas of practice to enhance student learning experiences. Chris's research has been published in national and international journals and has been recognized through several awards. Two recent publications are the focus of our conversation today: Leveraging AI to Enhance Learning and Forward Thinking Assessment in the Era of Artificial Intelligence. Chris, welcome to our podcast.

CD: It's great to be with you, Chris. Thanks for having me. I am very excited about this conversation. I've just started to take a look a little bit more into AI, and it's such a fascinating area and something that I think has so much potential.

CC: Chris, artificial intelligence or AI, as it's referred to, seems to be in the news constantly and with mixed and sometimes contradictory opinions regarding its application. Before we even get into our conversation about AI and education, can you just please give us a definition of what AI is?

CD: For sure, Chris. So AI, artificial intelligence, is a term we typically use when we're talking about large language models or machine learning. It's where large volumes of text, think about books, essays, or conversations, are used to construct responses to prompts that we give the machines. Most of us are familiar with ChatGPT, for instance, where we go into ChatGPT and give it a prompt. For example, write me a paragraph, or write me a cover letter for a job I might be applying for. And ChatGPT mines the large amounts of data that are available on the internet to write you a draft letter. And from there, we can take that letter and use it in various ways.

CC: So it's essentially a very elaborate mining tool?

CD: Exactly. Whereas we would otherwise go onto the internet and try to search out different templates or information ourselves, now the computer is doing it for us. It's able to look at different patterns, different algorithms, and try to find responses to the kinds of questions we are asking. And as it does that, it gets better and better at the tasks we ask it to do.

CC: Okay, so it sounds innocent enough, but in the news, I read that the United Kingdom held the world's first major AI safety summit. And at the summit, political and tech leaders discussed possible responses to this society-changing technology and focused on growing fears about the implications of the so-called frontier AI. So in your opinion, Chris, is AI something that we should be afraid of?

CD: I think with any change, there are always cautions and risks associated with it, and if we think about it within education, the same is true. As we use AI, we are putting information into systems, and it's being used in ways that maybe we can't control. So we always want to be mindful: what are we feeding into AI models, what are we feeding into applications like ChatGPT? Who is accessing that? And more importantly, as teachers and students ourselves, how are we using the outputs of AI? And what kind of integrity does that hold for students, their learning, and, importantly for what we're talking about today, the assessment of learning?

CC: So do you think, Chris, that the fear is coming from that fear of misrepresentation, or of our data being used for something other than the purpose we input it for?

CD: I think specifically in education, the fear is oftentimes about academic integrity. What we're most concerned about is that when students are using large language models and artificial intelligence, we are not able to assess what they know versus what the computer knows. And so we need to find new and novel ways of assessing student learning that allow them to use AI but also allow them to express their own understandings of the content and the curriculum we're teaching. And there are a variety of strategies that we can engage in as educators to do that work.

CC: Fantastic. I'm looking forward to that. I was fascinated by your publications. In one of them, you refer to AI as the elephant in the classroom, and I love that description. You state that the current debate around the presence of AI technologies, such as ChatGPT, which you just mentioned, must quickly shift from one of concerns about assessment integrity to one about how we use technologies in our classrooms to enable our students to demonstrate more complex and valued learning outcomes. And in this respect, AI provides the necessary impetus to spur more forward-thinking assessment practices and policies within both the provincial and national education systems. So what do you mean by forward-thinking with AI?

CD: Right. Great question, Chris. So I'm squarely within the camp that AI serves as a productive disruptor within our classrooms, that when we harness AI, we can think differently about how we engage students in their learning and how we think about assessment in the classrooms. And too often, assessment in the classroom has looked a particular way, oftentimes a paper and pencil test. And we have trouble sometimes moving away from traditional assessment and testing practices. And I think AI is giving us a kick in the pants to actually think differently about assessment. And I don't think that's a bad thing, but I think what we do need to do is come together and say, well, what are some strategies that maybe would allow us to move assessment in productive directions? And so in the paper you mentioned, we start to unpack that a little bit. And so it does lean on things like the more formative assessment strategies, but it also leans on the more authentic and community-rooted assessment strategies. So it fundamentally drives us to think about assessments that invite students to make connections, that invite students to make extensions between what they're learning in classrooms and what that knowledge looks like in the real world and in societies and communities.

CC: I love that phrase, productive disruptor. That to me says it all. And you're preaching to the choir when you talk about moving from traditional assessment to authentic assessment and productive assessment. I think that is such an important thing for teachers to look at. And sometimes it's a hard shift to make unless we have, like you said, the disruptor there to force us into doing something. So how do we educate our students using AI?

CD: Right. So I think one of the ways we do that is to be explicit in our classrooms about where and how we use AI, in learning, in teaching, and in assessment. This can take the form of, when we are teaching and talking about our learning goals, being explicit about how AI can be useful in achieving those goals. So if the task is to write an essay, we can talk to students about using ChatGPT to help structure the outline of that essay, and that might be an appropriate use of ChatGPT in this learning context. So where is AI sanctioned, and where is it permissible to use? And also talk about where it's maybe less appropriate to use. Where might it be a breach of academic integrity to use AI? Where does the teacher really want to see the students' individual or collective thinking on something rather than them leaning on the technology? Being explicit about where and how to use AI already begins to shed light on what it is and how it can be productively leveraged within classrooms.

CC: And that point of being explicit about where and how is such an important thing, I feel, as a teacher. Students use it. They see it. They're playing around with it every single day. So it makes sense to let them experiment in a controlled way and, again, manage the use and the abuse of such a program. In one of your other articles, entitled How Can Teachers Integrate AI Within Schools, you list a five-step strategy for teachers to meaningfully integrate AI into the classroom, which you say begins with helping students understand the limitations of AI technology. I've just been playing around, but I've quickly found there are many limitations of AI technology. So can you walk us through these five critical steps, please?

CD: So first off, I think we do need to think about the limitations of AI, and we need to make those known to our students. AI can't respond to all prompts or create assignment responses for everything we ask students to do in classrooms. For instance, it can't engage in group work. It can't make community connections. It can't make personal connections to your own life experience. It can make suggestions toward those, but it can't actually make those meaningful connections. So there are limitations to what any computer application can do. There are ways that computer applications can support the student in some of those tasks, but they certainly can't do them in a complete way. So some of the strategies we suggest are designing the assessment and the assessment criteria with AI in mind. What this means is, again, being explicit about where and how AI works within the assignment task. And this follows right through to, let's say, the rubric for how we're going to measure student learning and grade student performance at the end: being explicit as to how AI is operating at the various levels of performance. So if a student is performing at a level one or two, they used AI in this kind of way, versus if they're performing at levels three or four, they're using AI in more sophisticated ways. And so actually mapping out what AI proficiency looks like within an assignment. Another strategy is engaging students in feedback cycles so that when they are using AI outputs, they aren't using them just in their raw form, but we're inviting students to unpack the AI outputs and begin to build on them using their own insights, as well as the insights of their peers and insights from instructors and teachers.
So engaging in feedback allows them to take the AI output, make it more their own, and build in some of those connections they're making from their class learning. Another strategy is to engage in more performance-based tasks. There's oftentimes a need for written tasks, which is where the AI applications come into play, and there are a number of AI applications that engage in artistic forms of production as well. But inviting students to represent their knowledge not only in written form but in other forms as well recognizes that notion of triangulated evidence. So we invite students to say, yes, you can write it, but now show me how you know this concept by presenting it, by reflecting it visually, or by communicating it outward to another audience. One of the most important strategies that I think AI is encouraging and inviting us to think more deliberately about is moving learning into communities. When we do this, we are asking: how do I actually design the assessment task so that it frames learning as an active engagement with community members? So asking the question, if I were to make this knowledge authentic, what would it look like as an assessment task? Not only does that mean the AI can't solely do the task, though we might lean on AI for parts of it, but it requires the students to take that knowledge and mobilize it within communities. And it allows for relationship building and a whole host of other skills development as well. Then the final strategy is to engage in evaluative conversations with your students, where they actually share with you and describe how their learning was improved and shifted as a result of the AI application. This is particularly important because if we look at the world of work going forward, we're only going to increasingly see AI as a key component of how we all do our work. And so part of it is AI proficiency and AI competency.
And so some systems of education are making a deliberate move to build this into their curriculum. And here, what we're calling for is actually having students develop the capacity to be explicit and intentional about their use of AI.

CC: And I've got so many questions, Chris. I've been writing like crazy. I actually read your article and thought I understood it all, but in conversation there are so many very neat things coming up. If you don't mind, can we jump to the idea of AI proficiency and competency? It's totally opposite to what I've seen happen in some school boards, where fear has just shut down the use of AI. And you have so eloquently said, we need that evaluative conversation and to look at how AI actually assisted, improved, or shifted their learning, which I think is so important, because they're going to be exposed to AI for a long time. It's not going away. So how do we utilize it? How do we get better at using it as a resource tool that will help us be better students, better professionals, or whatever we're doing in life?

CD: Yeah, I mean, the elephant in the room is that AI is here and it's here to stay. And if it's here to stay not only in our classrooms but in our workplaces, the progressive response is: how do we prepare students for a world of work that integrates AI? So you look at systems of education, like, for example, Singapore, where they've intentionally developed a curriculum around AI competency for their students, and it's now being embedded within how teachers need to teach around this new development in the world, versus exactly what you pointed out, which is some systems that have tried to ward off AI as a development within technology. And that actually runs counter to preparing students for what they are likely going to need to do when they leave school. And just from personal experience, whenever I've witnessed it being said, we shouldn't use that because there are areas of concern, that's exactly what piques our students' interest in saying, I want to use that. I want to see what the buzz is about. So again, not the right approach.

CC: From my point of view, not the right approach. I also love the point you made about, when they're using AI and it's producing resources, the unpacking of that AI output to engage in feedback and make it their own. So getting away from that plagiarism aspect and saying, this is another resource, this is another opportunity for you to find out information about whatever topic you put into ChatGPT, and then make it your own. Present it in a way that makes sense to you, which also goes back to moving learning into the community, I think, as well.

CD: Right, exactly. So it takes the concern, or the caution, or the fear, as you mentioned at the beginning of the podcast, around academic integrity, and it says, how do we address that head-on? We use a series of strategies that almost makes a breach of academic integrity impossible, because we are engaging in explicit conversations, we're making the assessment a meta-conversation about how we've used AI, and we're moving assessment into the community in ways that AI can only help us so far with. And so if we actually change assessment in these significant ways, AI is useful in learning, but it's not the only thing that becomes the assessment. And so that's the challenge: are we prepared to move assessment in this direction? It is a fundamental shift in how many teachers and many contexts of learning do assessment. And that's why there's an uncomfortableness around AI sometimes, right? And around any new developments in education, because it requires change. This too requires change. Are we ready for that change?

CC: And I guess whether we're ready or not, it's here, and it's going to become more prevalent in everything we do in life. So how can we not embrace change and go forward, recognize the limitations of AI, but go for what you talked about, that competency and proficiency in using AI, which I think is just amazing.

CD: And, you know, on a smaller scale, AI operates in many ways in our lives. Sometimes we don't even realize it, in the way that, you know, we write emails and they get automatically checked for us, right? And we don't stop students from using AI in those ways. We stop students sometimes from using AI explicitly, like write a response to a prompt that I'm giving you, or write your essay using ChatGPT. But in more subtle ways, or in ways that are less deliberately using AI, we don't have policies, because it is so profuse. It is so commonplace.
And so, as you say, it is here. And if we're going to try to prohibit it, I think we're going to have some trouble with that.

CC: I agree. There are so many different ways this conversation can go, Chris, and I think we're going to have to have another one, because one of the areas I want to dig deeper into, and I've used it with an assessment in the course I'm teaching right now, is ways to also leverage AI to enhance teacher productivity. I really feel there's such an advantage to looking at this as a resource, as a teacher resource. And as you know, there are a lot of AI programs that are teacher-focused, that are developing and becoming more and more prevalent. I know that some boards are now even using AI for report card generation and things like that. So it's happening. The move is there. So we need to prepare ourselves. But this brings us right to the next question, which is a tradition with our podcast. New teachers would love just a few tidbits: if you had a few ideas to get teachers thinking about AI in their own classroom, what would they be? What would be the first steps?

CD: Yeah, so I think the initial step, if you're starting down the AI journey, is to try it out yourself. You know, play around with ChatGPT, play around with some of the other AI applications, and just get a sense of what they produce, as well as, as you said at the beginning of the podcast, their limitations. Because until you have an intuitive sense of that, it's very hard to think about the assessments you're creating for your students and where you need to shift them and how far you need to take them. So it's important for teachers to actually try this out for themselves. Take one of your assessments, put it into ChatGPT, and see what comes back to you, so that you have a live test case of what this kind of software does with the kinds of tasks you are asking your students to do. Then take some of your daily tasks as an educator and see how ChatGPT or other AI applications help you with those tasks, so that you start to see the possibilities this kind of software enables. And I think once you begin to map that out and feel out that terrain a little bit, you start to see it productively, not just as something we need to be fearful of, but as something that actually enables our work. But then you also see, here's how I can more flexibly change my assessments to accommodate this development in education. I think it's all about trying it out. We ask our own students, after we've engaged them, to explore a concept. So it's no different here. Most of us are engaged in this and are curious about it. So we need time to just explore in a safe way and then see how we could use it in our classrooms.

CC: Which I think is an excellent suggestion. Chris, thank you so much for sharing your time and ideas with us today. The topic around AI is not going away and will continue to be discussed in many different summits and conferences for a long time. And I'm sure you're going to be writing more about it as well. Our listeners will definitely want to look further into all of the resources you've provided for us to include on our podcast site. So, Chris, again, just thank you so much for taking the time to discuss this important topic with us.

CD: Pleasure to be here with you today, Chris. Thanks so much.

CC: That does it for another episode of Popular Podagogy. Again, thank you to our amazing guest, Dr. Chris DeLuca. Josh, as always, where can our listeners subscribe to make sure they don't miss any of our Popular Podagogy podcasts?

JV: Yeah, you can find this podcast on Apple Podcasts, Google Podcasts, Spotify, the Faculty of Education website, and pretty much any other place you get your podcasts.

CC: Please don't forget to check our Queen's Faculty of Education website and search for Popular Podagogy for additional resources and information on this important topic. Well, that's it from myself, Chris Carlton, and our incredibly talented and resourceful podcast team of Josh Vine and Erin York. Stay healthy, stay safe, and stay connected, and we will see you next time for another exciting episode of Popular Podagogy.