18. Shaping the Future Classroom with Mike McGuckin
Season 2, Episode 7 of Kinwise Conversations · Hit play or read the transcript
-
Intro: Welcome to Kinwise Conversations. Today we're talking with Mike McGuckin, a computer science teacher, AI Trailblazer, and on-the-ground innovator from Winston-Salem/Forsyth County Schools. As the only North Carolina educator in the inaugural AI Trailblazer Fellowship, Mike quickly went from teaching himself a new curriculum to guiding teachers across the state. So how does a busy high school teacher with three preps and three kids aged two and under actually use AI to make his job more manageable and his teaching more effective? We'll explore his playbook for using AI as an "idea bouncer" to reclaim time for students, his approach to navigating the ethics of AI in the classroom, and what it really means to be a future-ready educator in a world of constant change. Let's dive in.
Lydia Kumar: Hi, Mike, I'm so glad to have you on the podcast today. It's the middle of the school day for you, or towards the end, so you are really on the ground with students and with fellow educators, navigating a lot of things, one of them being AI. I'm so excited to just take a minute for you to talk about your journey in education and artificial intelligence, and what brought you to this work, and what's motivated and interested you.
Mike McGuckin: Yeah, so I got into teaching in 2018 at Thomasville High School. I spent my first five years there as a swim coach, ended that chapter at Thomasville as an assistant athletic director, and then moved to Winston-Salem/Forsyth County Schools in 2023, right around the time my firstborn son arrived. Since then, I've been able to get into computer science, which is something I'm passionate about. Within that first year, I really took off in computer science and artificial intelligence education and have committed to that work ever since, not just at the local level, but at the state level too.
Lydia Kumar: That's amazing, Mike. I'm curious about computer science because you mentioned that. Do you have a background in computer science, or is that just something that you've been interested in throughout your life?
Mike McGuckin: Yeah, so when I went to Old Dominion University, I originally started out as a computer science major. I could never quite get the math to click for me, so I switched my major to Sport Management. When I was a senior doing my internship, I ended up working with high school students. So after I graduated college, I went to Macon, Georgia, where I did group sales for a year, and then left to teach at Thomasville High School. I wanted to get back into high school; I like high school education, I like athletics, and I had a background in computer science and a passion for it: teaching students about programming, design, and really all sorts of things related to computer science. Once I came to Winston-Salem/Forsyth County Schools and saw that I was teaching a computer science class, and after learning the new curriculum in a semester, I really enjoyed it and made little tweaks to make it more engaging and more fun. My students have enjoyed learning computer science, and some have told me they've gone on to take computer science courses in college because they had that exposure here at Glenn High School. Winston-Salem/Forsyth County Schools is pushing forward in that area, especially as the field expands rapidly with the use of artificial intelligence. People need programmers; people need all sorts of technical talent to build things for them.
Lydia Kumar: It's funny, because earlier this week, or maybe last week, I was reading a report about how students' earning potential increases when they have access to even one computer science class. It's a class that isn't necessarily required for every student but is associated with really positive outcomes. That must be a cool area of work to be involved with.
Mike McGuckin: It is, and in the state of North Carolina, I know that starting next school year, not this current one, state law will require that the graduating class of 2030 earn at least one computer science credit. Now, it may not just be a computer science principles class; it can also be a technology engineering design class that teaches computer science inside one of its units. That's one of the new classes I'm teaching this year, and once I saw the first unit was on computer science, I got really excited, because that's something I'm passionate about and know a lot about. That graduation requirement is coming to full fruition, and this current school year is the last year that school districts can use a waiver, unless the General Assembly decides to drop the requirement.
Lydia Kumar: That's really cool that it's going to be able to expand and we're going to have more students having access to courses like the ones that you're teaching. I want to ask you about your AI involvement. I know you were one of the AI Trailblazers; you did that fellowship with AIEDU. Do you want to share a little bit about what that was like, why you decided to do it, and what that experience was like for you?
Mike McGuckin: So, I decided to do it after seeing a post in the AP Computer Science Principles group on Facebook. I thought, well, I'm passionate about artificial intelligence. At the time, in the spring of '24, I had just finished my second master's degree in sport management and a graduate certificate in project management. I started using artificial intelligence in that second master's program, only as a guideline and to get ideas. For all the journal articles I had to read, it was easy to download and upload them to ChatGPT for a summary so I knew what I was talking about, while also trying to raise a child and do lots of other things. Once I got accepted into the fellowship, I realized I was one of 25 teachers in the nation, and the only one from North Carolina. Just being involved was substantial in itself, and it really skyrocketed me into lots of different roles inside AIEDU and even inside the state of North Carolina. I have teachers messaging me over email and on different platforms asking about artificial intelligence. I've really had a passion for it as I've watched it grow over the past couple of years into part of our daily life. It's fun to teach, and looking back, that Trailblazer fellowship was a pivotal moment: representing the state and being one of 25 teachers in the nation to be part of it is a huge honor.
Lydia Kumar: That's really amazing. And it's interesting because North Carolina has really been at the forefront of AI in education. We were the fourth state that had our AI guidelines published. And it's cool to meet somebody like you, who's a teacher who's been able to move some of that implementation forward and be a part of a national cohort that's thinking about what AI looks like in classrooms. I'm curious about what it looks like in your class now that you're using AI and teaching with AI. If I came into your classroom, what would I learn, or how would you be integrating the technology?
Mike McGuckin: So, AI, as we know, changes every minute of every day, so it's just constantly keeping up with the change. We offer an artificial intelligence intro course that the state created. We teach out of that course, which was originally developed by the North Carolina School of Science and Mathematics and has been slowly updated; I've been helping update it to a point. We start with the ethics portion of AI. I teach my students that self-driving cars, for example, have to decide whether to harm the people in the car or the people on the road. These are hard ethical questions I have them answer, and I tell them this is what a programmer has to think about. We go into a little bit of algorithms, bias, and data, and right now we're doing Scratch programming, just to get a little programming involved. We also cover sensors, and my favorite unit is chatbot creativity. I use a tool on Magic School to let my students create their own chatbots to demonstrate that ability. I'll also show them the unrestricted side of it in ChatGPT, where I've created my own chatbots for everyday resources; I have one that helps my AP classes sharpen their skills and push toward higher scores on their AP exams. At the end, for their final project, they have to design their own AI resource. They don't do any of the programming; they're just designing it. It's a big paper where they have to research and find different resources. One student looked at healthcare and designed, on paper, an AI tool that would make unbiased medical diagnoses. Those are just some of the things we do in the class to elevate it. I know that project is being used across a couple of different school districts in North Carolina, but it's also a learning curve, because there are teachers who have never taught this class before, and they're like, "What do I do? How do I teach this? What are we doing?" I have to sit down and think about what those problems are. There are some things I'm not 100% sure on, but I try to do the research and prepare for my students.
Lydia Kumar: Yeah, that makes sense. The field of AI has been around for a long time, but with machine learning and deep learning it's becoming complex. There's a lot of math and a lot of complexity in how the technology works. So you're trying to figure out the most important aspects students need to understand, what you need to understand to teach it well, and then also support the different educators around the state you're working alongside. That's a big job.
Mike McGuckin: We have a Moodle, and I will meet with anyone as needed to walk through the class and work with them. There's only one other teacher in my district who teaches it, so I work with him regularly. He'll reach out occasionally to ask questions, but when we get into some of the complex math, that's where I draw a line. I'm not comfortable teaching my students something I don't understand. That's one of the things I tell other teachers: if I'm not comfortable teaching it, I'll show students the math behind it, but I won't teach the math, because I don't understand it.
Lydia Kumar: I was talking last week with Dr. Brené Brown, who works at Duke, and she has developed a course on AI concepts for people who hate math. I'll link it in this episode. It was fun to talk to her because, as she says, "There's so much math in artificial intelligence," and sometimes we forget that. A lot of people don't have the comfort or the time to learn that deep level of math, but you still need to understand the concepts to make the technology work well. The final project you're having students do is really about solving a real-world problem and having them think about how this new technology can help save lives or improve people's experiences. If you understand the technology, then you can at least conceptualize that and talk intelligently with other people about what's possible.
Mike McGuckin: And remaining unbiased. That's one of the hardest things. So we watch videos where a computer may be biased toward a specific group of people, and I tell them, "You have to think about everyone when you do this." It's a hard thing to teach them, and it's hard to visualize how some computers may not be able to recognize some issues. There are political, societal, and economic issues that a computer does not understand yet, but you have to be able to not be biased towards one or the other. You have to be in the middle, and that's usually hard when you have to develop something.
Lydia Kumar: What do you think is the most important thing for students in your class to understand about artificial intelligence?
Mike McGuckin: I think the hardest part to understand is not to use it for every task. Unfortunately, we saw last week that a student ended their life with the use of artificial intelligence. That's something we discussed in my class: artificial intelligence is great, but it can be wrong, and it's not meant to replace the people who are with you. So I encourage very safe use of it. Students are not supposed to be using artificial intelligence, but I know that they do. We talk about the dangers of artificial intelligence and the whole phenomenon of people going there for answers instead of doing the research. When I lead PDs, I tell teachers, "You know what your students write, and you can easily tell if something wasn't written by them." And I tell them that when I use artificial intelligence, I use it as an idea bouncer. I will ask questions, research the topic a little more, and then go in and solve problems. When I was having an issue with a programming task, I used artificial intelligence not just to debug my code, but to help me understand why something was wrong. Students may just be looking for the answer, and the answer could be wrong, because it's not 100% accurate. My AP class has the ability from College Board to use it to help with programming. I tell them they can use it, but they need to be able to explain what it is and what it's doing. That can be discouraging for some of them, because they may just want to submit something generated by artificial intelligence. I tell them, "Get your program, get everything you can, and then go back and look further: what could make this better, or what was a different solution?" Use different ways to explore rather than leaning on it too heavily. That's one of the things I tell my students: yes, it's great, it's here, it's fast, it's getting better, but it's not 100% reliable, so don't use it for every task.
Lydia Kumar: Yeah, I think it's so important for educators to feel comfortable having these conversations with students, because some uses of artificial intelligence are really useful, some are really dangerous, and some just replace things that are essential for someone's development and for understanding their own perspective. So it's great that there are teachers like you who can have these conversations so students can make informed choices. You build your understanding of life throughout your whole life, and when you're in high school or middle school or even elementary school, you're just at the beginning of building that understanding.
Mike McGuckin: Yeah. Even at the college level, if they move on, they can't rely on artificial intelligence to do their job for them. That's one of the hard parts. So I tell my students: you've got to be careful. Just like on the internet, you need to be careful about what you look at and what you download, because there are people out there who may try to do things that are not okay. It's a safety thing, and I tell all my students that. But we're seeing more and more of those things happen throughout the nation and the world, and at some point we have to act to make sure our students are using technology in the best way possible.
Lydia Kumar: Absolutely. I want to flip from the student perspective to using it as an educator, using it to support other educators. I'm curious about for you, how AI has helped you. You mentioned this a little bit with AP, but how do you use it as a teacher? How do you recommend other teachers use it? And then maybe we can talk a little bit about the PD that you've led.
Mike McGuckin: Yeah, so as I said before, I use it mainly as an idea generator. When I got access to a new curriculum this year for my newest class, I got it the week school started; I think I got it on a Wednesday, and school started on a Monday. So I had no time to review what I wanted to do. I used artificial intelligence to lay out some projects and tasks we could do, even with specific dates, like "we are not in school on these days, this is this break," and it gave me an idea of where I wanted to be and a pacing guide. That's how I use it. When I taught artificial intelligence for the first time last year, the curriculum I got was kind of bone dry. So I thought, "Okay, what can I do to make this curriculum a little better?" I used artificial intelligence to help me create presentations, then went back and made changes to them. When we talk about ethics and things like that, it makes it easier. Or if I want to talk about the history of computer science, I may ask where we are now compared to 50 years ago. I've only been in education for seven years, so it's been a wild ride to see artificial intelligence come out and to watch the way I use it evolve. I use it for AP courses, but I also use it for everyday tasks. I may put in a lesson plan or a project: I want to do a mousetrap car in my class, so I'll put in a mousetrap car project and give it a timeline, and it will tell me when I need to get it done and what parts need to be done on what day, and I can even get a rubric. That helps take some of the weight off me so I can focus on other things, like helping my students understand concepts, or spend less time on lesson planning and be more engaged with my students and help other people overall.
Lydia Kumar: Yeah, it's a time trade-off. Preparing to teach an entirely new course is an amazing amount of work; if you've never taught, you may not realize how much it takes to go from being handed a new curriculum to actually delivering it to your students in an engaging way. You can't just take it and teach it as is. You have to really think about your students and your delivery method. So being able to make other choices about where to spend your time, because you didn't have to spend as much of it making presentations or handling the administrative aspects of the job, is a really positive way to use it.
Mike McGuckin: Yeah. It's really… I use it for a lot of different teaching aspects, and I tell my students that I'm open with them that, "Hey, this activity was generated by artificial intelligence, so we're going to see how it goes." And we try to see if it works. And if it does, great. If it doesn't, well, I know what changes I need to make, and I can tell the artificial intelligence what changes I want to make to that assignment, and it'll make those changes, and we'll try it again.
Lydia Kumar: Yeah, I think that's great. And you can do that fairly quickly and easily. The other thing that I think you're highlighting is the importance of really thinking critically about outputs. You're not just saying, "Oh, artificial intelligence gave this to me, it's going to be good." You're thinking about what you're getting. I had an interesting conversation a few weeks ago with a nonprofit I was working with, and they talked about how they're seeing some of their employees sort of put their intelligence on a shelf and use AI more prominently, disregarding their own expertise because AI's outputs look so good. And I think that's really tempting, no matter what field you're in or if you're a student, to just be like, "Wow, this looks great, it's got to be right." But just because it looks good doesn't mean it is what you want to use.
Mike McGuckin: Yeah. And that's the hard part about teaching artificial intelligence and really just pushing to use it, is that lots of things that I do and play with may not be what I want. So when I was messing with code and programming, there were some parts that I did not understand. So I'm having artificial intelligence help me explain what I'm doing or what the issue is and go from there. Now, I was lucky enough, I grew up with technology and computers, so as things advanced, I was able to get right there at the forefront. I remember when the first smartphone came out, and I was able to witness that evolution as I was growing up.
Lydia Kumar: The internet has changed so much in the past 20 years. And I mean, if you've been teaching the last seven years, you've seen some huge changes as schools became much more digital. COVID forced the digital transformation in schools. There was this immense pressure to make sure every student had access to some sort of device. And so, you know, there was that huge push, and now, a couple of years later, we had generative AI, and kids have access to chatbots and devices. And I think that leads to some really complex challenges in the education space because if we were all paper and pencil like some classrooms were 10 years ago, it would feel a little bit different than it does today.
Mike McGuckin: Yeah. It's crazy to think of how much has changed in just a short amount of time. Like we had COVID-19, and then right after we got out of COVID-19, artificial intelligence started coming out, and generative AI. And now, I think ChatGPT is on GPT-5. They're moving at a rapid pace and they're keeping up with industry leaders, and it's just been a wild ride to see all that move forward.
Lydia Kumar: As you work with other educators, what has that experience been like? How have you been able to support new-to-AI educators or people who are more hesitant? Do you see a lot of that?
Mike McGuckin: I see a lot of hesitation. Just like whenever something comes out and it's new, people aren't really up for trying it. I'm the person who, if something's new, really wants to try it out. As someone who wears hearing aids, I try to get the newest and best hearing aids on the market. That's just something I've always done; it lets me test the newest technology. So for someone like me who's all about technology, we use the latest of the latest. I have educators who say, "We don't like it." But when I show them ways they can use it, some of them do like it, and some decide that maybe they could use it for a few things. I try to tell them, "Look, we're all busy as educators. We all have our own lives. I have three kids, two and under." So any time I can use artificial intelligence to make my life a little easier, that's what I'm going to do. I tell them that, and they'll start to use it and come ask me questions about it. They're hesitant, but they're also okay with exploring. And I tell them our students are using it. So you want to come up with assignments where a student who just looks up the answer on ChatGPT will get it wrong; that's an easy way for you to figure it out.
Lydia Kumar: Yeah. It's so valuable to talk with educators and help them develop at least some familiarity, because, like you said, students are using AI whether you are or not as an educator. Your ability to understand a little about how the technology works, and to use it yourself, is beneficial and can help you use it with your students more effectively. It's like when computers came out: you can't really use computers well with your class if you can't use a computer yourself. You have to be able to navigate it as an individual so you can lead the people in your classroom.
Mike McGuckin: Yeah. It's an ongoing battle. Teachers want to learn more about it, and they want to discourage cheating, and that's the hard part. It's what I tell every teacher: "You know what your students write. You know how they write." So if you're used to them writing a paragraph with a couple of errors, and the next thing you see is a three-paragraph paper with no errors in it, you can ask some questions. They ask me about the detection tools, and I say, "Well, what is AI and what isn't?" Something that is not AI-generated can still get flagged as AI, so the detectors may not be 100% right. I used Grammarly when I was in college, and it was one of the things I was encouraged to use. It would trip the AI detection, but it was helping me with the grammar and making everything right; the alternative was doing it on paper and pencil and sitting and waiting a couple of days to get my work back. I'm also trying to go less paper-and-pencil and more on the computer, and to encourage students that if you have an issue and you're trying to find something, you do the research, you make the effort. We talked about the digital divide in my classroom today. Students are working on a project in my computer science class about the digital divide, net neutrality, and internet censorship. So we're talking about real things that are happening and how some people may not have internet access. I tell them, "All those Chromebooks you got are a product of COVID-19. They were trying to get you internet access when some of you may not have had it, but they couldn't pick and choose. They had to give it to everyone."
Lydia Kumar: Right, right. And you're teaching in a high school, so all of your students lived through that. I guess most of them were in middle school at the time.
Mike McGuckin: It would have been middle school and maybe some in elementary school during this time. If they were a freshman, it was five years ago when we were all remote. So it's been about five years since that happened. So we're talking elementary school and middle school students.
Lydia Kumar: Right. And most of those students had access to this technology then, and they still do. I was talking with a friend who is an educator in Virginia, and she was telling me how, after COVID, a lot of courses stayed digital on Google Classroom. So that has also been a change: everything can be submitted digitally, and now we have this technology that students can use to do their assignments for them. Both of those things happened, not simultaneously, but close together. So it's complicated. And the advice you gave about knowing your students, having conversations with them, and being aware, I think that's important to think through.
Mike McGuckin: Yeah. As an educator, you have to know how your students write and what they do in order to really catch cheating. I used the AI detection we get for plagiarism review, and it flagged a student whom I had watched submit his project; it was because he used Grammarly to help with the grammar. I told him, "Technically you're not supposed to use it because of district policy, but you're working on your writing. You're not using it to solve the problem or do your assignment for you. You're using it to help with your grammar, spell check, and things like that." So it's a two-way battle with some of these tools. Like we talked about, you have kids who are using it to boost themselves, but you also have the other side, where they say, "I'm going to get this done and get an A because my teacher doesn't care how I do this assignment."
Lydia Kumar: We're in such a messy moment right now where artificial intelligence is super useful for learning and also useful for not learning. And so you have to be able to work with students and to set up assignments and to think about how do you make it feel useful and meaningful to use this technology in a way that helps you to learn, because it is a great learning tool.
Mike McGuckin: It's a great learning tool. It's fun to play with, and I show people different prompts you can ask and the different results you get. As a joke, I've responded to some of my friends' text messages using AI just to mess with them. They thought something was odd about the responses they kept getting back, and they knew exactly what it was.
Lydia Kumar: That's really funny. Okay, I want to ask this question because I think it's interesting to be a teacher working in a school, and we know principals and school leaders have so much power and influence in a school environment. And so if you have a principal who's really passionate about AI and dedicated, it's going to make a really big difference in that school's readiness and use of AI. I'm curious from your perspective, if you had a school leader or principal who is curious if AI is worth their time right now, what would you say to them or what would you encourage them to think about?
Mike McGuckin: I'd say it's worth it. Play around with it, explore it. When I lead PDs, that's one of the best ways I've found to teach: play around, see what you can get, and then go back and try to get something else. I've always enjoyed that kind of thing. So when principals ask me about artificial intelligence and computer science, I have to put on my thinking cap, think about exactly what they're looking for, and help them get to the answer they're after. I think it's useful for day-to-day activities. I know Magic School has a newsletter option where you can use AI to generate a newsletter for you. I haven't done it yet, but I've thought about using it for parent contact, sending newsletters about what we're doing and learning in class; with the time I have, it's a hard battle to fit in. I think AI would help leaders get through different tasks quicker, but to use it to the full advantage they would want, they have to sit down and just play with it. You don't know what anything can do until you sit down, play with it, and create things that make it work.
Lydia Kumar: Yeah, taking the time to understand the technology that you have access to and then to make choices accordingly. You've mentioned Magic School AI. What AI tools would you recommend to other educators?
Mike McGuckin: So it depends on what they do. I actually pay for ChatGPT at the professional level; I pay, I think, $20 or $25 a month. I use ChatGPT most often because it seems that as soon as ChatGPT does something, Magic School and all the others release their versions too, so I feel ChatGPT is the leader in this industry. Now, for Magic School, we're about to do a project in a couple of weeks where I use their chatbot activity, so my students create a chatbot. That's one really cool thing they get full access to. They have to write a report and show me their chatbot interactions, and they find that really fun and cool. It's one of the activities I plan to keep in this course as long as I can. I've used some other tools too. I use Diffit to help some students with literacy, and there's a tool out there for almost everything. At one point I used CodeGrade a good bit to help grade assignments, and I thought what they were doing was really cool. There are so many tools out there; it's about finding one, playing with it, and seeing if it fits you. I paid for ChatGPT because I was using it a good bit, and once I got the AI course to teach, I thought, "Okay, we're going to need everything we can get." I may not understand some things, but ChatGPT has been able to help me out. I'll also use any resource out there; I've tested and played around with Microsoft Copilot as well. Different tools do different things, and it's just finding the one that works for you.
Lydia Kumar: Right. Even, you know, these foundation models like ChatGPT or Copilot are trained on different datasets. So you can get different outputs based on how the training data looks. So those are things to consider too. Has there been a moment for you as you started using AI with your students or in preparing for class where you're like, "Oh, this was the best use ever"? And have you had any missteps where you're like, "That did not work that well"? The high and the low of working with AI over the last couple of years for you.
Mike McGuckin: Yeah, so the high, especially in a large class of 20-plus kids, is when I have some of them sending me emails or messages about their code. When it gets to a point where I'm just looking through lines and lines of code and I get a headache trying to find the small errors, that's where I find the AI high points, because I know it's really good at detecting them. It tells me the line, tells me how to fix it, and I'll tell the student, "This is where your error is," but I won't tell them the solution. I'll see the solution, but I'll tell them, "This is where your error is," and I'll verify that that's where that error is. And then I'll see if they send back that either it's fixed or there's another error, and I'll look at their code and go back to the AI model and check. The low part would be that some of the lessons I've used AI for have been a little harder. It doesn't hit the topic, so I have to be very specific about what exactly I'm looking for. It's just playing around with it, and sometimes it is time-consuming to get what you want out of it. So when I try to get a presentation out of ChatGPT, I have to be very specific. It'll ask me if I want images and things included, which is newer than what it used to do. I'll say yes, but then sometimes it won't do it, and other times it will. So that part gets a little bit of a low point, but it allows me to work through that process. The low point is just that it takes more time to figure out, with my already busy schedule.
Lydia Kumar: Yeah. And I think that low point also highlights the importance of your expertise as an educator to bring into the space. Like, you have to know what this lesson should be like, what's going to resonate with my students, what do I need to include. I think there is such an importance of having that human lens and that perspective, even though it takes time and can be frustrating. You know what it needs to be, and so it may take some back and forth, but you're able to figure out what's best for your kids.
Mike McGuckin: Yeah. It's a fun process. It's a learning process for sure. I even tell my students that I believe Microsoft is reopening Three Mile Island up in PA to power its AI. So we talked about that a little bit, and they asked, "Well, what is Three Mile Island?" I tell them it's a nuclear power plant that was shut down, and now they're using it to power their data centers. We talked about how much data they use, and my students were shocked by that. And then when you research how much energy data centers use, it makes perfect sense.
Lydia Kumar: Do you and your students have a lot of conversations about some of the ethical components of AI? What do those conversations look like?
Mike McGuckin: They get hard. They just get real hard, and I tell my students, "These are things that we have to think about." We may be able to make that decision as a human quickly, but then telling a computer to do something else is harder because you don't know what the scenario is. You're hoping that you have every scenario.
Lydia Kumar: Yeah, I think it's the energy usage, deciding when to use it and when not to, what the upsides and downsides are, what's okay to put into AI and what's not. All of those aspects are so important to think through. And it sounds like as you're having conversations in your class, those things come up, like, how do you use this technology in a way that's not harmful, or at least does as little harm as possible? And it is confusing when you have technology that operates in the kind of black-box way that a lot of these chatbots do.
Mike McGuckin: It's just like the example I gave earlier with the student who unfortunately took their own life because they were talking to AI. And that was one of the things; it's not meant for that. And students need to talk to other people, not just rely on AI. And that's something that is really hard to push forward. And I tell my students, "Yeah, you can use it in a way that fits for you, but be careful using it." And that's some of the safety things that we have to teach students, like how to be safe on the computer.
Lydia Kumar: Absolutely. Yeah. Those are really, really important things to think about. I have two questions left. First, I'm curious about what an empowered, future-ready educator looks like from your perspective.
Mike McGuckin: Oh man. An empowered, future-ready educator, in my perspective, is ready to accept any new technology that comes out. You're using those tools to plan your lessons. You're using those tools to really improve your performance as an educator. You're using those tools to learn more about what's happening. I think that's where I would go with that. It's harder if you're new in the field, because then you need all the help you can get. The first few years of me being a brand-new teacher were hard, and that's something that I did not have a lot of help with. So I really push, even with new teachers, when they're struggling to get lesson plans or get something together, I tell them, "Hey, why don't you use an AI tool to help you come up with a lesson?" or even create an AI lesson. Or I tell them, "Hey, AIEDU has free resources that are useful, and they have bell ringers. So if you want to put a bell ringer on, use that, and then you're not planning a bell ringer anymore. You have a bell ringer from this company that I highly recommend, and you can talk about computer science in your room." I have some teachers that have used it in their classes here at Glenn High School, and I know some other people in the state of North Carolina, outside my district, have used those tools as well. So it's really about pushing future-ready educators to use new technology to the best of their ability.
Lydia Kumar: Thank you so much for sharing that. And you know, for anybody listening, it's just that ability to continue to learn. I saw someone, I can't remember their name, talk about how the most important skill of the future is the ability to learn, and learning new technology is one of the things that we have to be able to do. My last question for you, Mike, is about an idea or a question that is sitting with you right now. I always think of this as the question that keeps you up at night, the thing that you're really spending a lot of time thinking about right now.
Mike McGuckin: When my kids wake up at night, that's usually one of the questions I have. So I think one of the crazier questions is, where are we going with this? Like, how's education going to look in a couple more years? So we're using artificial intelligence; it's here. You can't deny it. It is physically here. So what is the next step? What is the next step in education? When I was going for my first Master's, one of the things I mentioned was eventually everything could be self-paced learning or it could be virtual learning. And that was before COVID-19, and then COVID-19 hit and we all had to go virtual. So I was right at one point. There are different schools that offer virtual learning options, and that's one of the things I think is trending. So one of the things I ask myself is, what is that going to look like in a couple of years? Like, what is education itself going to look like in just a few years? Not even 10 or 20 years down the road. What's it going to look like? Are we still going to have classrooms? Are we going to be meeting virtually again? No one knows. And that's one of the questions that keeps me up at night because I not only worry about myself because I enjoy teaching, but I also worry about my son and my kids because they are going to need some sort of schooling, and I don't know what that looks like in a couple of years.
Lydia Kumar: Do you have a hope for when your first kid goes to school? Do you have a hope for what kindergarten will look like for him?
Mike McGuckin: I hope that he gets into STEM. He seems really interested in STEM. Now he's only two, so who knows what he'll be interested in. But he really seems to like technology and to play around with it. And then I'm curious about my twins and how they're going to react. My vision is that they enjoy learning. My wife is a former educator, and we both really push education. We want education to be at their forefront, and we're figuring out what we can do to best support our children. What's that first year of kindergarten going to look like? What's it going to look like beyond that? That's a decision we unfortunately have to make in a couple of years: how we're going to do this. It's crazy to think about. It's only 2025, and they're going to be going to school in a couple more years.
Lydia Kumar: It'll be interesting to see what that looks like, and I hope that whatever it is, the decisions that the state, schools, and educators make are in the best interest of students. I'm hopeful that we'll move in a direction that's positive and leads to more students learning more effectively. And I know we've talked about a lot; there's just a lot of change happening right now, so it's constant.
Mike McGuckin: It's constant. It won't stop.
Outro: That's a wrap on our conversation with Mike McGuckin, a teacher who's truly shaping the future of education from inside the classroom. Three quick takeaways from our conversation. First, use AI as an idea bouncer, not a brain replacement. Mike's approach is to offload administrative tasks, like drafting lesson plans or creating rubrics, so he can invest his time engaging directly with students. Second, your best AI detector is knowing your students. The most effective way to address cheating is to understand your students' unique voices and to have open, honest conversations about when, why, and how to use AI. Finally, embrace the learning curve. For teachers and leaders who are hesitant, Mike's advice is simple: you have to get your hands on the tools. Experimenting with platforms like ChatGPT or Magic School AI is the first step toward building the confidence to lead.
To continue the conversation, you can connect with Mike on LinkedIn and explore the resources we mentioned, including the free AI literacy snapshots from AIEDU. All links are in the show notes.
And if your school or district is ready to move from conversation to action, Kinwise runs everything from a 30-day teacher AI pilot to a one-day AI leadership lab that helps district teams draft board-ready guidelines. Details and bookings are at kinwise.org. Finally, if you found value in this podcast, the best way to support the show is to subscribe, leave a quick review, or share this episode with a friend. It makes a huge difference. Until next time, stay curious, stay grounded, and stay Kinwise.
-
Mike McGuckin on LinkedIn Connect with Mike professionally to follow his work at the intersection of computer science, classroom teaching, and AI leadership in K-12 education. https://www.linkedin.com/in/mike-mcguckin/
AI Concepts for People Who Hate Math Explore the YouTube series from Duke's Dr. Brinnae Bent that Lydia discussed, which breaks down the complex mathematical concepts behind AI in an accessible way.
AIEDU AI Literacy Snapshots Download the free AI literacy snapshots and other classroom-ready resources from AIEDU to start conversations with your own students about the ethics and impact of artificial intelligence.
AIEDU Trailblazer Fellowship Learn more about the fellowship program that helped launch Mike's journey as a leader in AI education and discover opportunities for educators to get training, resources, and a supportive peer network.
-
1. The "Idea Bouncer" for Rapid Curriculum Planning
Use Case: You've just been assigned a new course and received the curriculum guide only days before school starts. Use AI to quickly generate a project-based learning unit that aligns with the standards while being engaging for students.
Prompt:
"Act as an instructional coach for a high school teacher who is teaching a Technology, Engineering, and Design class for the first time. I need to create a 5-day project-based learning unit on building a mousetrap car.
Your task is to generate a complete plan that includes:
A student-facing project overview with a clear objective.
A day-by-day timeline (Day 1: Intro & Design, Day 2: Build, Day 3: Test & Refine, Day 4: Competition, Day 5: Reflection).
A simple, 4-point rubric to assess the final car based on distance traveled, design creativity, use of materials, and a written reflection.
One bell-ringer question for each day to kick off the class."
2. The AI-Powered Coding Tutor
Use Case: A student is struggling with their code and has sent you an email for help. Use AI to quickly identify the error and generate a guided hint that helps the student solve the problem themselves, rather than just giving them the answer.
Prompt:
"Act as a computer science teaching assistant. A student has submitted the following Python code, which is supposed to ask for a user's name and then print a personalized greeting. It's not working, and the student can't figure out why.
Student's Code (Python):
name = input("What is your name? ")
print("Hello, " + name)
print("Welcome to our program)
Your task is to:
Identify the specific error in the code.
Do NOT provide the corrected code.
Write a brief, friendly response to the student that points them to the exact line with the error and gives them a hint about what to look for (e.g., 'Take a close look at line 3. Programming languages are very picky about making sure everything that opens also closes. See if you can spot what's missing.')."
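For the teacher's own check before sending the hint: the bug in the sample is the missing closing quotation mark on line 3, which makes Python raise a SyntaxError before the program even runs. Below is a minimal corrected sketch for reference only (the prompt deliberately withholds it from the student); the hardcoded name "Ada" is a stand-in for the input() call so the sketch runs non-interactively.

```python
# Corrected version of the student's program, for teacher reference only.
# The original bug: the string on line 3 lacked its closing quote, so
# Python reports a SyntaxError ("unterminated string literal") there.
name = "Ada"  # stands in for input("What is your name? ") in this sketch
greeting = "Hello, " + name
print(greeting)                  # -> Hello, Ada
print("Welcome to our program")  # closing quote restored
```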
3. The Ethical Dilemma Generator for Classroom Discussion
Use Case: You want to move beyond the technical aspects of AI and engage your students in a deep conversation about its ethical implications. Use AI to create a realistic scenario that forces critical thinking.
Prompt:
"Act as a curriculum developer for a high school AI ethics class. Generate a short, compelling scenario to spark a classroom debate on algorithmic bias.
The scenario should involve an AI model used by a city to distribute a limited number of summer job opportunities to high school students. The AI is designed to prioritize students it deems 'most likely to succeed.' After the first year, it's revealed that students from wealthier neighborhoods were recommended at a much higher rate.
Your task is to write the scenario and then provide three open-ended discussion questions for the class, such as:
What factors or data might have caused the AI to produce this biased outcome?
Who is responsible for this unfair outcome: the programmers, the city officials who used the tool, or someone else?
If you were hired to fix this system, what steps would you take to make it more equitable?"
4. The Parent Communication Assistant
Use Case: You want to keep parents informed about what's happening in your class but lack the time to write a detailed newsletter from scratch. Use AI to transform your brief notes into a polished, professional communication.
Prompt:
"Act as my classroom administrative assistant. I need to send a monthly newsletter to the parents of my Introduction to AI students. Please take my bullet points below and transform them into a friendly, engaging, and professional newsletter.
My Notes:
Last Month: We finished the unit on AI ethics and discussed self-driving cars. Students did great on their debates.
This Month: We're starting our 'Chatbot Creativity' unit. Students will use Magic School AI to build their own chatbot based on a historical figure. It's a fun project.
Final Project Sneak Peek: Their final project will be to design an AI tool to solve a real-world problem. More details to come.
Reminder: Parent-teacher conferences are next month.
The newsletter should have a clear subject line, a warm opening, organize my notes into clear sections, and have a professional closing."
5. The Project Ideation Partner
Use Case: You want to design a creative final project for your class that allows students to apply their knowledge in a tangible way, but you need some fresh ideas to get started. Use AI as a brainstorming partner to generate diverse and interesting project options.
Prompt:
"Act as a creative curriculum designer. My high school Computer Science 1 class has just completed units on basic programming concepts (variables, loops, conditionals) and an introduction to AI.
For their final project, I want them to apply these skills to a real-world context. Generate three distinct project ideas that a high school student could reasonably complete.
For each idea, provide:
A catchy project title.
A one-paragraph description of the project.
A list of the key skills they would demonstrate (e.g., 'Using conditional logic,' 'Designing a user interface,' 'Considering AI bias')."