In this episode of Popular Podagogy, host Chris Carlton sits down with educator Heidi Siwak for a thought-provoking dive into what students really think about artificial intelligence. Drawing on insights from over 5,000 Ontario students from Grades 6-12, the conversation explores everything from AI as a tool for inclusion and creativity to concerns about trust, surveillance, and bias. The big takeaway: students aren’t just passive users of tech. They want guidance, ethical guardrails, and a real say in how AI shapes their learning and future.
Resources
Heidi is an award-winning Ontario educator passionate about fostering student voice and agency and equipping students with the skills they need to shape the future of AI. As a Teacher-Coach at I-Think, Heidi has guided over 5,000 Ontario students and their teachers through I-Think’s Artificial Intelligence Challenge and leads I-Think’s AI Readiness workshops for educators and parents, helping schools and families navigate the AI era. Heidi’s work as an educator has been featured in articles, books, and podcasts. She is an experienced conference presenter and TEDx speaker. Heidi has an MEd in Education Leadership and Policy and 25+ years of experience, including K-8 teaching at the Hamilton-Wentworth District School Board and work at the Ontario Ministry of Education. Heidi is an alumna of the Mila - Quebec Artificial Intelligence Institute's Summer School on Responsible AI and Human Rights, 2025.
Transcript
Theme Song
Talking about innovation in teaching and education. Popular Podagogy. Discussions that are topical and sometimes philosophical. Popular Podagogy. Popular Podagogy.
Chris Carlton
Hi there. Thanks for joining us and welcome to another episode of Popular Podagogy, where we try to bring big ideas in teaching and education to life. I'm your host, Chris Carlton, and this podcast is being brought to you by the Faculty of Education at Queen's University. Welcome to our podcast. In this episode, I'm excited to be speaking with Heidi Siwak, who is an Ontario educator passionate about fostering student voice and agency and equipping students with the skills they need to shape the future. I am so behind that, and that is a great passion. Heidi has her Master of Education in Education Leadership and Policy, with 25-plus years of K-8 teaching at the Hamilton-Wentworth District School Board and several years at the Ontario Ministry of Education. As a teacher-coach at I-Think, Heidi has guided over 5,000 Ontario students and their teachers through I-Think's Artificial Intelligence Challenge and leads I-Think's AI Readiness workshops for educators and parents, helping schools and families navigate the AI era. Today, we are going to be talking about the Student Voice Report: what students are saying about artificial intelligence. The report is informed by over 5,000 students across Ontario who participated in the I-Think AI Challenge Kit. Heidi, welcome to our podcast.
Heidi Siwak
Thank you so much, Chris. It's an absolute pleasure to be here. And anytime I can support teacher candidates and a faculty of education, that's a good day.
Chris Carlton
That is a good day for us that you've joined us as well. Heidi, I read in your Student Voice Insight report that studies show rising hopelessness and anxiety among youth who often feel powerless to shape their world. Research also shows that fostering agency in young people is crucial for building collective capacity to face global challenges. One of your phrases is action is the antidote to anxiety. You state that when we give students real, meaningful opportunities to imagine an AI-driven future, we are preparing them with confidence. Your AI challenge demonstrated that youth care deeply about the world, and have the confidence to build a hopeful future using AI. So in your study, one of your recommendations is that all students and teachers should learn what is AI. So my first question to you is, what is AI and how do you define it?
Heidi Siwak
The big question everyone's asking. There are a couple of things I think about when I'm asked to define it. AI, first of all, is any technology that is capable of solving complex problems that we normally would think only humans or animals could solve. And when we think about AI, I think it's really important to demystify it a little bit. It can feel very mysterious to people, and threatening: it's artificial intelligence. Really, artificial intelligence is just math, linear algebra and calculus, finding and learning about patterns in really big data sets. There's a bit of physics involved in there as well. And it's important to remember, as we think about AI, that humans create AI. We've lived a long time without artificial intelligence. Artificial intelligence cannot exist without us, because we are the ones creating the data: numbers, words, sounds that get turned into numbers that enter the system. In the system, the algorithms learn about the patterns that are in there and then can generate things from them. The other important thing to remember as we think about AI is that it's been here a long time, since the 1950s. The first chatbot, ELIZA, was developed in the 1960s. AI is going to continue to evolve. Right now, all of us are struggling with large language models. I just heard last week that Yann LeCun, who used to be the chief AI scientist at Meta, has now left Meta. He's already abandoned large language models, and he's working on a new form of artificial intelligence, because he doesn't think large language models will be able to overcome some of their limitations. So AI is something that is going to change, and what we call AI now, we may not call AI in the future.
Chris Carlton
I know. And I love that you talk about demystifying the mysterious AI, because it is one of those relative unknowns for most people. And when you said AI is just math, patterns, and data sets, that made so much sense to me. So thank you so much for clarifying that. And I did not realize that it's been around since the 1950s, with the first chatbot in the 1960s. So much information here. I really enjoyed and found very informative your AI Student Voice Insight report, the slide presentation in particular and your findings. And I love the term student voice, because that to me is the most powerful thing: we're hearing it in their own voices. Can you talk about the I-Think report? What is it and who is involved in it?
Heidi Siwak
Yes, absolutely. The AI report comes from what we have noticed and observed in students' work and in the recommendations and things that they create through the AI Challenge. In the AI Challenge, students pursue a question: how might we use AI to enrich the lives and possibilities of every student in our school? That's a big question. For many of them, it's the first time they've ever had to think about the learning needs of their peers and what the actual challenges in the system are. As they think about that question, we ask them to keep three things in mind. First, deeply human: fundamentally, AI has to be good for humans, and what does that mean? Second, critical thinking: what do students need to know and understand about artificial intelligence in order to be able to think critically about it? And third, enhancing learning: can it actually enhance learning? In what ways? How might it augment us? Those are the big themes within the kits. Then the students go about learning about artificial intelligence. We immerse them in a number of activities, we bring in expert speakers, and they get some hands-on experience with AI. Then they use I-Think's problem-solving methodology to explore the tensions and what they've discovered, and come up with their own ideas, recommendations, and imagined possibilities for AI. So the report really is a synthesis of the themes we have seen in the thinking of students who've gone through the AI Challenge.
Chris Carlton
And I really like the idea of asking how we can use it to enrich the lives of others: not so much how can we use it for ourselves, but how do we make it better for other people? And the kit themes: deeply human, critical thinking, and then enhancing learning, which is such a big topic right now. How can we use AI to enhance learning? I'm excited to hear more about it. Could you briefly walk us through each of the themes brought out by the students, maybe sharing a couple of possibilities and concerns they had, and whether your group had any recommendations regarding the themes? I know there are five of them, but could we start with theme five, which is rebuilding trust? I think that's a big one.
Heidi Siwak
Absolutely. That was a huge one, and so interesting. It came out in the last year of the challenge, and it was a recognition from students that AI has ruptured relationships between teachers, students, peers, and parents in the classroom. Students want to have good relationships with their teachers, and they recognize that something has to be done to repair those relationships. What they imagine, or what their recommendation really comes down to, is that there has been a lack of guidance. They are proposing that schools hire staff and develop courses focused on critical thinking and ethical use of AI as a way to repair teacher-student relationships. It really surprised us that students want guardrails. They want consistency. There's lots of variation right now in how AI is allowed to be used. They want boundaries, because they want to be responsible, ethical users, and in order to do that, they need some guidance. What also surprised us, as we explored the idea of trust, is that they thought about their parents, and recognized that it's not just students and teachers going along on this AI journey. Parents are involved as well. Often there can be tensions in parents' relationships with the school and their children, and difficulties understanding what's happening in the school system. So students are imagining ways to use AI to personalize communication with their parents, for example, so their parents can stay up to date with what is happening in the school. But they're also thinking about what kinds of learning their parents need to undergo in order to have trust in a system that's moving ahead with AI. They're all very concerned about over-reliance on AI and its impact on their thinking and creativity. So trust for them also has to do with: can I trust these tools, and how should they be designed so I can trust them? And they're concerned as well about being required to use AI when they don't necessarily want to.
So there are many elements that they see to how we build trust in the system in order for them to be able to use AI in a way that builds relationships and community in the school.
Chris Carlton
And I love that phrase, builds relationships, because you talked at the beginning about how AI has ruptured relationships between student and teacher, student and parent, parent and teacher, and it's so evident in the education system. And the fact that they would like guardrails and guidance is such a powerful point in the study. I think that's amazing. There are four other themes. I'm going to leave it up to you, Heidi. Which one do you want to talk about next?
Heidi Siwak
Let's talk about data, because I think that is a starting point for understanding artificial intelligence. We need to understand what data is, and we spend a lot of time in the challenge helping students understand it. Something they might learn, for example, is that the data used to train large language models comes predominantly from the global north, from North America, and is predominantly in English. The global majority, most of the planet, is not represented in that training data. And for some of those cultures and countries, even if they want to be represented, they can't, because they're not generating enough data. Artificial intelligence requires large amounts of data, so they cannot participate, which really biases the information being generated by AI. We're getting a very particular worldview from what is generated. So we spend a lot of time helping students understand data, question data, and ask about the data sets: who created the data set? When was it created? What's in it? Are there biases in it? We're teaching students to ask questions about data. An example of how that was used in a really interesting way comes from L. Drive Public School in the TDSB. Those students were doing the challenge in parallel with an inquiry into the Sustainable Development Goals. The students had recognized that there was a problem in their school around just sorting garbage: students didn't always know what went in the garbage, what went in recycling, and what went into compost. So they used a tool called Teachable Machine, and they worked on creating their own data set, gathering pictures to train the AI to recognize garbage, recycling, or compost. From that process, they really learned deeply how difficult it is to put a data set together and to think about the patterns in the data sets they are using. They became good critical thinkers about data.
They're now working on creating an app that students can use to take a picture of something they want to discard, and the app will help them decide where to put it. So exciting possibilities. Data is both an opportunity and a risk for students. The things they're imagining include the school board creating tools that leverage data in ways that keep their privacy protected but actually use data to improve their learning. Surprisingly, students are open to that. They're curious about how different their learning experience could be if more of their data were used. They see data as a game changer: it will fuel innovation in medicine, in space, in problem solving across disciplines. They are excited that data can be used to really personalize their learning and customize the work that they do. It may be a way for them to get faster feedback and guidance on what to do next. But they're also very concerned. They recognize the biases in the system and the ethics around data. They know that how AI is being created, and the impact that it's having, is very concerning. And I would say right now, as students think about data and how it's being used, AI currently falls short of their expectations, and they expect the AI that they're going to be using in their learning to improve.
Chris Carlton
I love that they feel it falls short of their expectations. That's incredible. And so is the fact that you're teaching them to question and understand the data, building that curiosity about where it actually comes from and the biases that are obviously in there from the data it was trained on. Phenomenal. That leads us right into theme two, which is a tool for inclusion and well-being, if it's used thoughtfully. Can you touch on that?
Heidi Siwak
Absolutely. This one really surprised me, because we've heard so much about the negative impact of artificial intelligence on the well-being of users right now. There are lots of stories. Because this process is empathy-based, one of the things that students had to do was really think about their peers and who's not being served by the education system. They see it all. The things that we see as educators, they saw as students. And they began to imagine ways of using AI to improve opportunities for students. One of the things they mentioned, for example, was that students in school systems are often excluded, or often opt themselves out of things ahead of time. They already discount themselves from being able to participate in something. So they're imagining AI being used to actually dismantle barriers within the school system by personalizing learning and by making school systems easier to navigate for students. They imagine things like having an AI companion or a tool that would understand the talents particular students have and might connect them to opportunities in school or out of school. They imagine AI being used beyond the school's boundaries to help them connect to things that aren't available within their school. They also saw AI as being able to provide judgment-free feedback. When a peer or a teacher is giving feedback, there's a human judgment, a perception, attached to it. A lot of them felt that when it's just AI giving me feedback, it's more neutral, and it doesn't feel like I'm being judged. So really, they see possibilities for AI to include people and actually to support well-being. What they don't want, though, is for AI to replace human relationships. They want to ensure that if we're going to be using this, there's some element of morality, however they define that, and that critical thinking and creativity are still fostered. They don't want to lose human connection.
They've already lost it once, through the pandemic and the move to online learning. And they want well-being and inclusion to be fostered. Their concerns are about making sure we're using AI in ways that are academically honest, that support cognitive development, and that let their ability to express their personal ideas still matter. The last piece connected to well-being, I would say, was the environmental impacts. They're very concerned about that. They want to make sure that if they are using AI, they have the choice to use it and the choice to refuse it, and that the AI developed for them is developed in a way that supports all the beings on our planet, not just humans. An idea that came out of it was informed consent. This year was the first time I saw that idea come up. They want to be informed about how their data is being used, how the tools are developed, and the environmental impact of particular tools, so that they can have agency and voice in making decisions about how they use AI in ways that support their own well-being and that of their peers.
Chris Carlton
So Heidi, I'm writing as fast as I can down here, all these phrases that ring so true with me. I hear a lot of UDL in here, Universal Design for Learning: the empathy, the use of AI to help dismantle barriers, student voice, student choice. I love your phrase, the choice to use or refuse, and then adding that agency and voice. So all the things that teachers love to include in their programs can be used to talk about AI, and to show the ways we can use it to be more empathy-based. I think that's amazing. The last point you made, about not replacing human connections, flows into our theme three as well: more efficient, not less human. Can we talk on that a bit?
Heidi Siwak
Yeah, absolutely. So students see AI as a powerful support tool, but they don't want it to be a replacement for teachers. They want to use it in a way that fosters the humanness of everyone. One of the things they're thinking about is that AI offers so many possibilities for what kinds of information and learning we can access. They're imagining AI being used to make their own learning, and the things they want to learn, more relevant and meaningful: using the tools to reshape things so they can understand information more readily, get feedback, and get personalized explanations. They see that sometimes students are blocked at school, perhaps because they don't speak the language of the classroom, or they're on IEPs, they have learning disabilities, and they get stuck at certain points. And they see AI very much as a tool that can allow students to continue doing whatever it is they need to get done in their learning, so they become more efficient. Interestingly, in this theme of more efficient, not less human, they thought deeply about their teachers. They noticed all the ways in which teachers are stressed: they have a lot of work to do, they have to work at night, and they can't always meet the needs of all the students in their classroom. That is actually stressful and painful for some of them, because they want to be supporting all students. So students are really thinking about how AI can be used not just to make learning more efficient for students, but for their teachers as well, to make the process of teaching more efficient, so that teachers are able to focus more on the things that really matter, which are the human relationships that exist in support of learning.
Chris Carlton
First, I'm glad it's not a replacement for teachers. That's always one of those fears that flies around. But it goes back to your point a couple of themes ago, that your program is empathy-based, and that shows over and over again in the fact that they're thinking about the teachers, their effectiveness, and the process of teaching, and making it more efficient so that teachers can concentrate on what we love to do, which is spending time on individual students. Just incredible.
Theme 4 is the desire for a learning partner, not surveillance. Can you explain that a bit to us?
Heidi Siwak
Yes. So in the challenge, we spend a lot of time sitting in tensions, exploring those tensions, and considering the impact of these AI tools on different people. Students are very excited by the idea of AI as a learning partner that supports everyone in the classroom, that doesn't replace humans, and that's used as a guide. For that to work well, there has to be transparency around the use of AI and how the AI is being used. And they're deeply concerned that if AI is being used, they may not be aware of judgments being made about them, or where their data is going, or that the AI begins to control their learning rather than students remaining in charge of their own learning. So they're sitting between two ideas: there is clearly some analysis of the work that they're doing, but they don't want it to turn into a tool of surveillance that controls their learning. Interestingly, something that came up through the challenge that was concerning to me is the difference between younger and older students. Older students are much more aware of the risks of tracking capabilities within artificial intelligence that could be used to monitor their behavior and perhaps cause them harm. Younger students, and this could be because of their developmental stage, when they talk about AI, many of them talk about it in a parental or authoritarian role. They're offloading responsibility and decision-making, or surrendering, I would say, to the AI. That's how they're imagining it: the AI can report on me if my desk is messy; the AI can report to my parents if I didn't do my work, or if I was off topic in the classroom. And they weren't necessarily describing that as a negative thing. But clearly they're not understanding the significance of that level of surveillance.
And it's very concerning that, as we bring these tools into our classrooms and students become accustomed to them, we might raise a whole generation of children who simply accept the idea of surveillance because it's been embedded so deeply into their learning. So I think a lot more research needs to take place on younger children in relationship to these tools, which can support their learning but also surveil their learning.
Chris Carlton
This takes me right back to one of the first things you said: demystifying the mysteriousness of AI. And you're doing that through education and understanding, letting students understand what AI is all about and letting them explore and be curious about it. I think it's just an incredible program. The AI Challenge program talks a lot about empowering young people, and you've expressed that throughout this conversation. Educators understand that young people are our future, offering fresh perspectives and creativity to tackle complex problems and challenges. You highlight that by empowering them, we invest in a brighter future for all of us. They challenge the status quo, pushing us to create better systems for our communities and beyond. My final question is: what would you recommend as a first step for teachers to help their students prepare for an AI-driven future?
Heidi Siwak
The burning question. I mean, I could suggest things like learning what AI is, the things that we've talked about, but I would begin with students' own identities. I would start with imagining: what can AI actually do? And start with silly things, right? Artificial intelligence and potato chips: is there a connection? Artificial intelligence and socks. List the silliest things possible and see if there is a connection to AI. Because if you search for those, you'll start to see that, oh my goodness, I had no idea that socks had artificial intelligence sensors embedded in them, for people who are bedridden or people who have diabetes, to detect problems that are starting to happen and get them to medical treatment. So start to see where AI is present everywhere in the world. Then I would ask students to think about their own interests and passions, music, sports, art, reading, hiking, nature, fish, dinosaurs, whatever, and put their areas of interest together with artificial intelligence as a search term, and begin to see how artificial intelligence is shaping, connected to, and being built within the things they are interested in. And the last thing, as you begin to think about artificial intelligence in the classroom: I keep coming back to Marshall McLuhan, the medium is the message, right? When new technologies come along, they have the capacity to reshape us. And do we want that reshaping? So when students use an AI tool in the classroom, and they've used it to complete some writing or to brainstorm, take some time to analyze the experience. Did they like it? What was gained by doing it this way? What was lost? Did they feel they were actually thinking? They can start to interrogate the tools in relation to how they are affecting them, so that they can begin to develop voice, agency, and thoughtfulness about the impact these tools are having on them,
but also the vocabulary to describe that impact, and then to be able to make choices: to continue using a tool because they like it, or to discontinue it because they're not liking how it's impacting their identity and how they exist in the world.
Chris Carlton
So Heidi, as a science teacher, you're speaking my language: the idea of building curiosity and discovery, letting them investigate their own interests and passions and how AI is shaping the topics they're interested in, and then analyzing that experience further. And I love your phrase, investigate the tool: find out what it's all about, and then develop the program from there. Heidi, this has been such an exciting conversation for me, and it's hit home on so many levels for me as a teacher as well. I really appreciate you taking the time out of your busy schedule, and I look forward to further conversations with you. Thank you so much for being here today.
Heidi Siwak
Thank you so much, Chris. I love talking about artificial intelligence, so it's been an absolute pleasure to come and share some of the work that we are doing at I-Think and what we're learning from students.
Chris Carlton
That does it for another episode of Popular Podagogy. Again, thank you to our amazing guest educator, Heidi Siwak. I hope that you take the time to visit our podcast website and view some of the additional information she has made available there. Josh, as always, where can our listeners subscribe to make sure they don't miss any of our Popular Podagogy podcasts?
Josh Vine
If you like what you hear, please subscribe to us on Apple Podcasts, Spotify, the CFRC website, the Faculty of Education website, and pretty much any place you get your podcasts.
Chris Carlton
Please don't forget to check out our Queen's Faculty of Education website and search for Popular Podagogy for additional resources. That's it from myself, Chris Carlton, and our incredibly talented and resourceful podcast team of Josh Vine and Erin York. Stay healthy, stay safe, and stay connected, and we will see you next time for another episode of Popular Podagogy.