7: Closing the Gap: Tanzania Brown on AI Equity and Student Affirmation

Episode 7 of Kinwise Conversations · Hit play or read the transcript

  • Lydia Kumar: Today I'm honored to be joined by Tanzania Brown, who goes by Taz, an educator, instructional leader, and curriculum designer whose work is deeply rooted in social justice and student affirmation. Drawing from her experiences in vastly different educational settings, Taz is dedicated to shaping learning systems that are culturally responsive and elevate the strengths of all students, especially those who are neurodivergent or on the margins. We'll explore how she's thinking about AI as a tool to foster metacognition, the practical challenges of using AI in the classroom, and her vision for a more equitable and proactive approach to technology in our schools.

    Taz, thank you so much for being here on the show with me today.

    Tanzania Brown: Of course.

    Lydia Kumar: I am really excited to hear about your perspectives on AI. I know you've been thinking a lot about metacognition and some other things, but before we dive into artificial intelligence, I want to hear about your story and what anyone who's listening needs to know about you and your personal or professional experiences that shape how you're thinking about this new technology.

    Tanzania Brown: Yeah. Well, thanks for having me, Lydia. I'm so excited to just be here to talk about AI, how it's used in the classroom, and also how it can be used just from a bird's-eye view perspective, like with curriculum design. But just some things to know about me: I am obviously a Black woman from South Carolina, and I think that really just kind of contextualizes why I'm so interested and deeply invested in education. South Carolina doesn't have the best education system, especially Charleston, where I'm from. So a lot of my formative years were spent in New England private schools, for high school and college, and a part of it was because my parents just felt that I wouldn't have gotten the quality education that they wanted for me, which is like small class sizes, you know, creative experiences, just kind of rich experiences that they thought I needed for my curiosity and my interests.

    So when I was 14, they sent me to boarding school just so I could get that, the quality education that I deserved. And because of that, I really just kind of felt out of place. Of course, I wanted to be at home a lot, and I became a teacher and moved to Baltimore because I really wanted to kind of give those experiences to students in public schools. Like, I don't necessarily think that it's fair or just to not have access to the quality education that I did. So a part of why I joined Urban Teachers and got a Master's in education, in reading intervention, is really to provide those kinds of experiences to low-income students. So that is how I got to 2019, why I became a teacher in Baltimore City. I specifically chose Baltimore, too, because I know just how deeply political the city is. I'm a proud—I was a proud member of the Baltimore Teachers Union. I see education as deeply political, and a lot of my work is around centering justice and just really the belief that students should be affirmed in schools and just not erased.

    And a lot of my work is helping students on the margins, and they might not necessarily be in the margins of classrooms 'cause they're usually high-achieving students, but I'm thinking about Black girls. Like, I worked at a school for Black girls for five years. I'm thinking about students with ADHD or any kind of other neurodivergence. I'm thinking about students who maybe struggle to acquire skills quickly. So a lot of my work is around students with IEPs who are neurodivergent, or who even just kind of lack the self-esteem to ask questions and to speak up in these like large public school classroom settings. And I did that work as a teacher for five years, and then in the past year, I've kind of branched out, and now I'm working with a math tutoring company with Baltimore City Public Schools. And my work is really centered around pairing teachers and tutors. And so I coach the tutors and help them with math intervention in Algebra I classrooms. And that has, you know, just helped me still kind of focus on those students who struggle with skill acquisition, you know, who are maybe a little bit neurodivergent, while also supporting teachers in ways that I needed support, like managing their classrooms.

    And so it really just kind of gives me a broader perspective. And I get to really, like, see teaching as a science. You know, I'm like an expert in teaching, which I think sometimes our profession isn't respected in that way. So I get to really just kind of shift perspectives on what it means to be a teacher, what it means to support teachers, and really give students kind of personalized learning. So that's one of the things I've been doing in the past year. And then second, I'm really trying to build my own company called Ethereal Minds. So I'm working on a platform right now. I think it's gonna be a website, but currently, we are a curriculum development platform. We also offer one-on-one tutoring with students who are neurodivergent and provide learning materials and interactive learning tools, whether that's an affirmations coloring book or just an interactive learning notebook that helps students figure out how to take notes, because that's a skill that a lot of students need. So I think my overall goal for all the things that I've been doing is really about how I can shape learning systems to respond to students' realities, but also shaping curriculums and learning experiences that elevate students' strengths and, you know, of course, develop their areas of growth as well. But I really try to affirm the things that students already know and are already good at so that the harder things aren't as challenging.

    Lydia Kumar: You feel so generative to me, Taz. You're just overflowing with ideas and energy. Some things that I heard you say that are really interesting: it sounds like educationally you really grew up in two totally different worlds, which seems like it informs your perspective as you're thinking about access and affirmation, and even the expertise that you're bringing into whatever space you're in. I thought that was a really interesting part of your story. Um, yeah, so I wanted to name that. I don't know if I have a question.

    Tanzania Brown: There's so much there, so much there. Yeah. I mean, I think you're totally, totally right in that my perspective definitely, you know, just guides the choices that I'm making even in my own career and how I wanna use AI. I think for me, yeah, like it kind of has to be culturally responsive, because I've been in both environments. I think what it means to be American in a lot of ways means to just value education. Like, what does that value really mean when we don't have access to the quality education that we deserve? And it's not because teachers aren't, you know, invested or dedicated; it's not because families and students aren't. But I think it's just a lot about the misuse of some of the wonderful tools that we have, and the under-resourced schools and the growing class sizes. So there are so many obstacles that we're facing, but one thing I'm hopeful about, though also cautious, is that I think AI can kind of act as a Band-Aid for some of the real problems that I see in terms of access, in terms of personalized learning, and the gaps in learning experiences that I see.

    Lydia Kumar: Well, let's move into this AI piece, because I'm interested in a lot of how your story connects with AI. So the first thing I'm curious about is, how did you realize AI was important and something that we should pay attention to? And then maybe how your perspective informs where you see opportunities, and maybe potential gaps, when it comes to AI and education.

    Tanzania Brown: I'll just say that I'm someone who tells stories just to kind of illuminate some of the things that I've seen. My answer's twofold. One is that, as, you know, an instructional lead, as a teaching coach, I've seen teachers using AI a lot. Like you mentioned, these are the people who have their boots on the ground, who are just trying to figure out how to use AI to make their daily lessons easier. So for instance, I've been seeing teachers use AI to grade things, right? If you have 35 students (even when I was a teacher, I had 35 students in one class; my entire roster might have been 120 students), it's like, I need a tool to help me grade some of their essays. I was also an English teacher. So the cool thing about AI is me and other teachers would develop a rubric, feed it to our AI agent, and then, based on our rubric, have ChatGPT actually just read some of these essays and grade them. So our work was around, like, okay, does the grade actually match up? Um, so we did some of that, and I saw a lot of teachers using AI to develop their daily lessons, develop slides, to have AI differentiate worksheets and materials. So a lot of that work, they just were able to kind of hand off. Of course, you still have to train your ChatGPT on what to change and what to develop. Mm-hmm. But that training was sometimes even easier, 'cause it's like, okay, teachers are experts in differentiation and scaffolds, so they're able to just tell their AI assistant what to look for. So that was a kind of training that me and the people at my specific school did just to make our lives a little easier with our caseloads.

    On the other hand, my students were misusing AI, but also using AI in wonderful ways, too. Of course, as an English teacher, a lot of the essays I would get, even the creative writing stories, were written by AI. And so a lot of the work, especially in the last two years, was around me having conversations with my students, but also with parents, around like, this is plagiarized, you know, you're using AI to write this. Um, and even as teachers, we found that, okay, we are teaching our students how to use AI with Grammarly, with LanguageTool, like there are some accepted AI-powered writing tools that we allow in the classroom, but we just weren't realistic about how students were really using it. So from there, we just became really defensive and cautious of AI. So oftentimes we would just be like, "No AI in the classroom. Like, you have to do this yourself." And then as I transitioned out of teaching and was able to take this more distant perspective, and I got to coach teachers and really kind of see what students were doing, I saw that they were using AI in just like really creative and wonderful ways.

    Um, one of my students took his math study guide, 'cause now I do math tutoring. So he took his math study guide for a unit and made it into a podcast. And so he would listen to it on his way to and from school to prepare for this test. Um, and you know, he didn't necessarily get an A on the test because of this, but I think just him using ChatGPT to fit his learning needs was something that unlocked so much for me, where I was like, okay, in the same way that teachers are using this to make their lives easier, students are maybe misusing AI as well, but also using it to make their learning lives and experiences easier. And so after that, it really kind of shifted for me what AI could do. And because of that, it made me just way more hopeful and allowed me to start thinking about professional developments and even lessons and curriculums, of how I could leverage AI. So it's really kind of shifted me a lot.

    Lydia Kumar: It makes sense. There's research that has come out that says AI, you know, shuts your brain off, that you don't learn. But I think if you're replacing your thinking with AI, obviously you're not going to learn. Right? But if, like the student you were talking about, you're using it as a way to customize your learning, or differentiate it, or help you learn the way that you learn best, it can really increase your learning. And so it's really just, you know, two sides of a coin, and it depends on which side you're willing to look at. Yeah. When you think about yourself, you know, training tutors or developing curriculum, how are you leading others or setting up systems so students can use AI in a way that pushes their thinking or helps them to learn more?

    Tanzania Brown: Yeah, that's a great question. I think for me, especially because I'm in the realm of curriculum design, a lot of my work with tutors and teachers is about the kinds of assignments they're developing, right? So to increase critical thinking and metacognition specifically, some of the things we've been working on, especially for math teachers, are error analysis assignments and questions.

    Okay. We realized, like, we did a poll and realized that among my school's Geometry and Algebra II students, which is about 200 students, 75% of them were using AI to just answer questions. So we give them a worksheet, and 75% of them were just plugging the questions into AI and just copying and pasting the response. Most of the time, as we know, AI, you know, gets better as you feed it. So a lot of the time the answers were wrong. And so a lot of them were just so shocked when they saw that, you know, their grades weren't necessarily aligning with what they thought were right answers. So a lot of what teachers and I are working on is just doing a lot of error analysis. So showing them an AI prompt, like, okay, let's feed it into AI, and then, based on your skill acquisition, can you fix these answers? Can you fix these responses? We're doing that not only in math; I've been working with some English department heads to do this work too, of taking the answers and solutions from AI and having students break those down. So a lot of it's around backwards learning, if you will. It's like, here's the answer. We don't care about the answer anymore, since everyone has access to it. What we care about is the process of how you're getting there. Um, and I think we're really trying to focus on that. Like, how can we really highlight the process, mm-hmm, more than just reaching a solution, even in, you know, math classes and English classes and stuff.

    Lydia Kumar: Yeah, that makes a lot of sense. 'Cause it's like, you know, the AI tool may be able to give everyone the right answer soon. And focusing on that process feels really important in math. Other than this kind of backwards processing, are you seeing any, I don't know, assignments that use AI at the beginning? Like, you're giving a student an assignment and you're saying, use AI in this way. Um, I don't know. Are you thinking about, have you developed anything like that yet?

    Tanzania Brown: Not yet. We found that it's more difficult in math than it is.

    Lydia Kumar: That's what I was wondering.

    Tanzania Brown: In, in like literature, only because we still have to align to some of these like best practices around, like, what it means to, you know, teach. And so we're really struggling with that. Like, I think that's one of the hardest things: how can we use AI more in the beginning, ChatGPT more specifically, especially for math. But one thing that we've been looking at, and we've been using a lot, is Jo Boaler's work at Stanford. I don't know if you've heard of her, just around the importance of reasoning and iterative problem-solving. Um, so a lot of her work is around why problem-solving is more important than rote memorization. And so in math, we've just been thinking about how we could have students develop math problems and develop questions based on the things that they already know. But honestly, I think one of the things that we're really running into is that we find that, just across the country, students' skill level in math is just much lower than we expected. And this is across the board; this isn't really based on economic status or location. Students' math knowledge and fluency is just lower than we expect. And so sometimes, even when you are using these tools, it still requires a baseline level of knowledge, and we're finding that students maybe don't necessarily have that, to really engage with AI in the way that we want them to, especially high schoolers, 'cause we're working with 15-to-17-year-old students. So that's a huge problem that we've been facing right now: how we can use AI while also still explicitly teaching foundational skills and knowledge.

    Lydia Kumar: And just, I loved what you said about having students think about how they develop problems from what they know. Maybe the solution isn't as important anymore. And with AI in general, you're going to use it more effectively if you have some baseline knowledge. Like, you need to have the language to be able to get the output that you want. And so if students don't have that language, I can totally see how that becomes a really big challenge in terms of getting useful outputs.

    Tanzania Brown: Yeah, and we found that even, like, my push now for schools, especially as a site director: I'm gonna be working with this one school in Baltimore City, and a lot of my professional developments will be around this. But for me, I think before we even consider curriculum design, we have to just get teachers trained on AI. Mm-hmm. Like I said, so many of us are just against it because we just don't know what to do with it. Right. Like, we see it as students plagiarizing; we are still kind of against using AI in the classroom because we're worried about students misusing the tool. But I think that we first have to kind of get teachers on board and really train teachers on all the things that AI could do for them. 'Cause then, like you said, I think they are gonna be the ones who figure out the best kinds of assignments and who are able to really kind of solve some of the issues of the best ways that we can use AI. I do think literature teachers and history teachers, and just humanities in general, already kind of have a leg up, because they're doing things with AI like helping with brainstorming of essays and assignments, mm-hmm, establishing feedback loops for your AI agents, or even just creating assignments based on the books that they've read.

    Lydia Kumar: Teaching multiple ways of thinking with AI, metacognition: the humanities seem to have a more natural alignment there than, let's say, a mathematics class.

    Tanzania Brown: But I think that's why we just kind of have to get teachers trained more, because they're the creative ones.

    Lydia Kumar: Absolutely. They're the ones who will figure out how it can actually help. And teachers are smart, and they understand how to teach. So I guess, from what you're seeing on the ground, you work with so many different educators, what are you seeing as the low-hanging fruit when it comes to teacher training? What do teachers need? What are the gaps?

    Tanzania Brown: Yeah. I mean, I think that this perspective shift is the first thing, right? I think the perspective right now is, AI is making my students dumber. In the most layman of terms, that's what everyone thinks. Not even just teachers, it's everyone, you know, it's a fear. Everyone's saying the same thing, everyone just thinks we're all gonna be stupid after AI. Um, so I think we need that perspective shift, to seeing AI as a tool that can deepen student thinking and learning. If we can shift teachers to that perspective, I think that will allow teachers to just bring different things to the table and think about it. Even when I just think about testing, so much of testing uses AI already, and a lot of teachers didn't realize it. When I did my first professional development on AI use in the classroom, one of the first things I had to kind of tell teachers is that, like, you already use this and you're already working with it and you don't even realize it. For instance, when we get test scores for students, um, like NWEA, their testing automatically generates skills that students need to work on and skills that students are successful at. And then they generate lesson plans for teachers, mm-hmm, that meet those needs, and they even generate a script. So all teachers are doing is just taking this lesson plan, taking the script, and reading it verbatim to teach this kid the skill. And so a lot of what I was doing was being like, this is AI, this is artificial intelligence, you're already using it. Um, but how can you use it more often? A lot of pushback that I've been seeing from teachers is that, especially within urban settings, in public education in general, we get so many tools to use all the time. Um, even this year alone, especially as someone who's kind of now running these professional developments, I think I did 25 PDs, each on a different tool that teachers could use.

    Lydia Kumar: It's wild. Yeah, the education AI tool space is exploding, and it's totally overwhelming. Like, there's so much, how do you choose? What are your goals? Some of them do the same thing. Are they really useful? Just the energy it takes to evaluate these new tools can be exhausting.

    Tanzania Brown: Yes, I think the teacher burnout is real there. So my low-hanging fruit would be: choose like one AI tool for whatever skill, and it should be a skill that, you know, spans across subjects, and then really work on it for a year. Like, take time to develop it, evaluate it, make it work for you or not. But there are so many options that I think the teacher burnout of it all is really overwhelming.

    Lydia Kumar: And if you were gonna choose a tool, what would you choose, or what is your tool of choice?

    Tanzania Brown: Yeah. Honestly, I would use ChatGPT, and it's because it's what students are using. It's also the most iterative, in that even when I log on, it's like, "Hello, Taz." It knows what I do. Like, you can just feed your information to it once, and it already knows. When I was using it as a teacher, I told it, "You are a teacher for eighth-grade girls, you're teaching these three books. Here are the rubrics you've already made." So after a while, it just got so easy to use. After a month or two, it was like, okay, this is literally my assistant and it's helping me.

    Lydia Kumar: That memory can be very useful if you are, if you're trying to develop something iterative over time. It's, it's really powerful.

    Tanzania Brown: And I really enjoy the feedback loops, where I'm like, can you rank this in comparison to this other rubric I had? Or, in comparison to other curriculums that are like mine, that are just as culturally responsive, including these books, how does it compare to other designers' works and materials that are out there? So it offers such amazing feedback, um, and it allows me to think in a new way. So I would totally push for teachers and departments to develop these things together. But I think the more obvious thing is that teachers just need time to do this. Like, we did this over the summer, where we got paid to do it and, you know, we had six weeks to really think about it. But it's like teachers have a week of professional development before they have a hundred or so brand new faces in front of them. So how can districts and school leaders give teachers maybe a week or two weeks, and pay them, to really kind of explore this?

    Lydia Kumar: Right. Because I really don't think ChatGPT is a hard tool to learn how to use if you have time. And I think almost taking things that people are already doing and saying, do this with ChatGPT, over time, I think that could help too. Like, how can you use ChatGPT to help you set up your classroom for the year? This is the theme you want. Here are the texts you teach. Here are some quotes you can put on your wall. How do we integrate it in a way that makes it easier?

    Tanzania Brown: And I think also, just from working in the public education sector, there's obviously a huge technology gap that we just haven't spoken about yet. Sometimes I do feel irresponsible talking about AI in this way, because my teachers will look at me and be like, "Taz, we have a hundred students and we literally have 15 working computers. What are you even talking about right now?" So when we talk about things that are even harder, I'm thinking about student digital literacy, access to computers, and even Wi-Fi, which is huge.

    Lydia Kumar: Yeah. Like, do you have access to the tools that you need to even be able to do that? And with your experience of being in Charleston schools and then in these private schools, I'm sure your private, mm-hmm, boarding schools were incredibly well-resourced, just in access to stuff. Um, I had an interview a few weeks ago with Farra Berro, who developed the AI standards for NCDPI, and she talked a lot about equity being one of the things that's top of mind for her: students in well-resourced districts, or students from families who are thinking like this, are going to learn the technology. And students who don't have a computer in their house, or where there's 15 computers to a hundred students, it's gonna be much harder for them to pick up these tools. And we know it's gonna change the landscape of work. Nobody knows exactly how, but these are powerful tools that are gonna impact how we work and how we learn. There's some urgency, I think, in figuring this out. And it's like, how do you manage the 700 things that teachers have to do and the access to the physical resources you need to make it happen? It's complicated.

    Tanzania Brown: Yeah, totally. And when I even think about equity, this is just one piece of it. And then also, when I think about ethics, we know that AI, as generative as it is, really just, you know, reflects our current culture. Um, so I think about AI bias a lot, 'cause I think, as with anything, it just reflects dominant societal norms, assumptions, values. So when we are training these ChatGPT bots, it's really important that they know who our students are, they know who our teachers are. Otherwise, I think it'll really just kind of reiterate some of those norms and biases, mm-hmm, that already exist on the internet and that already, you know, kind of misplace our students or push them to the margins, like our neurodivergent students, our students who are low income. Um, and I really think it's important for us to train these bots and to question them and contextualize them, and really to make sure that they're iterative, so that they're not just repeating some of the biases that already exist. So I'm sure that there are so many things around ethical design that I wanna dive deeper into.

    Lydia Kumar: Right. Like, how is ChatGPT trained? What is it trained on? And then, are there things that you can do? Let's say you had ChatGPT access for your school or for your district. Are there ways that a district could set up that specific iteration of ChatGPT to be more culturally responsive? Right? I don't know. I don't know the answer to that yet, but it does feel important, because we don't exactly know how it's trained or what it's pulling from. And I think that's something to think about. Have you seen ChatGPT's responses concern you in terms of bias, or do you have a story about a moment like that when you were working?

    Tanzania Brown: Yeah, I mean, in math, I think it happened so much. So a lot of what teachers and I have been working on, especially tutors. Like, they have two or three students, and because our tutors aren't necessarily trained as teachers, they aren't as equipped to develop things like differentiation or scaffolds, even in these smaller, you know, two-or-three-student teaching environments. I've been trying to train them to use ChatGPT to input student information, and based on that student information, like their test scores, their grades, their interests, their hobbies, whatever, the bot can really push out differentiation strategies that might work for them, scaffolds that might work for them, even some scripts about how to redirect them when things are getting hard. Um, but sometimes, especially in the scripts, you don't necessarily see, like, restorative language or restorative words. For instance, one of my tutors put in, "Okay, my student often says, 'I can't do this because I'm stupid. Um, I just don't know my multiplication tables. Like, I just don't wanna do this.'" And the ChatGPT response to that was, "Well, you don't know it." And um, it was just like,

    Lydia Kumar: Ah, so bad. Cringey. Yeah.

    Tanzania Brown: It was like, "Oh my God." It was like, "Yeah, you're right. You don't know it. Let's work on your multiplication tables." And so of course the ChatGPT was responding to the student saying they don't know the specific skill. But when you look at it, like, yeah, you can say that, but you're not gonna teach an 11th grader multiplication tables. That would just be inappropriate. So more than just the language, it was around teaching practices: it was trying to tell this tutor to shift to multiplication tables, which is just not a strong teaching practice, to teach such foundational skills to a student that's in Algebra II. It's not.

    Lydia Kumar: Right. It doesn't, it doesn't make sense.

    Tanzania Brown: Mm-hmm. And so it really does tell tutors all the time, your students really don't know adding and subtracting, or really are struggling with fractions, you should focus on this. But those are such foundational-level skills, and best teaching practices tell us that we need to incorporate those skills into grade-level content, right? It's like, even if you struggle with these foundational skills, if you're in Algebra II, you still kind of have to learn Algebra II. Whereas AI is just telling our tutors to stick with the foundational skills, right?

    Lydia Kumar: Um, and that's hard if you are trying to train your tutors and you're trying to help them use AI effectively, but then you're seeing the AI give them advice that isn't good, directing them to do something and say something in a way that probably wouldn't land well with an 11th grader.

    Tanzania Brown: And so that's why, for me, it's really important to think a lot about how we retrain, train, and just think critically about what AI is producing, because I think more teachers are gonna start using AI. Right. Especially new teachers. There's this podcast I just listened to, um, "Smart Talks with IBM," the episode "How AI Assistants Can Transform Education." Mm-hmm. It runs with "Revisionist History"; it's Malcolm Gladwell. Most of it is about how someone from Stanford is developing AI assistants that mimic students, so that new teachers in teacher training programs can practice teaching on these stereotypical, archetypal kinds of students. Yeah. And of course there are ethical issues with that. But one of the things I've been finding is that the AI isn't really trained in best practices yet, right? So it's not able to tell a new teacher, "Oh, you shouldn't do this, 'cause this isn't necessarily a best practice in teaching." So what I found is that before I can really teach my tutors how to use AI, I have to give them foundational, best-practice knowledge. Because otherwise, just like students, my tutors and even teachers are just gonna accept AI as fact and knowledge rather than as a thinking partner.

    Lydia Kumar: Right. If you don't have the foundational knowledge, AI is just gonna feed you, and you're gonna say, "Okay, this is right." You can't evaluate it. You have to have some baseline expertise or knowledge to be able to say, "That's not right," or, even if it's half right, "That doesn't feel right to me," or "That's not grounded in sound practice." Maybe it would be good to learn your multiplication tables, but that's not the best use of our time right now. Making those judgment calls requires an understanding of how people learn, the fact that you need to progress over time, what's developmentally appropriate, and what's the biggest bang for your buck in, you know, the 20-minute tutoring session you have, or however long it is. Mm-hmm. So those are really important things to be learning about and thinking about.

    Tanzania Brown: Yeah, it is. 'Cause here's my thing: I do think ultimately AI is kind of just a Band-Aid, at least the way that I'm thinking about it now and the way that I'm trying to incorporate it into classrooms and curricula. It's really just a Band-Aid for these bigger issues, whether it's around class sizes, whether it's about intervention, or even just helping teachers manage their caseloads. Those are the problems that I am using AI to solve right now: to personalize students' learning in large spaces.

    Lydia Kumar: How would you want district leaders or any, anyone leading a school or an education nonprofit to think about implementing AI tools with students or teachers and what should they be prioritizing?

    Tanzania Brown: Yeah, I mean, I think for professional development, I know it's usually a week, but there should be a lot of teacher training around how to use AI. I like to think about it as how to use, how to detect, and how to adapt AI. Those are the three buckets that I think instructional leaders and districts really need to focus on. How do we teach teachers, and then students, how to use it and how to detect it? Because I think what's also missing is that some teachers, if I've been teaching for 25 years, I don't necessarily know how to identify that my student is using AI. So: how to detect it, and then how to adapt it to make it work for you. Mm-hmm. Whether it's in daily lessons, like you said, using AI to set up your classroom, or ChatGPT to make grading easier. Adapting it, detecting it, and using it, generally, I think would be the three buckets they should really focus on.

    Lydia Kumar: Yeah, I've thought about that too. For these older teachers who have so much expertise and so much knowledge, this is hugely disruptive, so how do you help them enter into this new era? I think it's harder than it is for an early-career teacher. For late-career teachers I think it's more difficult, but the potential is also so great, because they have so much content knowledge, so much knowledge about how to teach. So I think there's a lot of potential, but it is really hard to change the way you teach after so long, and in such a potentially dramatic way. This technology is super disruptive.

    Tanzania Brown: It is. And like you were saying earlier about someone developing AI standards for North Carolina, mm-hmm, I just think AI literacy and digital literacy standards need to exist, whether federal or state-based. So much of what it means to be a teacher is about following standards. Yes. And if you identify the things that students need to know, or even teachers need to know, about how to use AI, that just gives teachers better guidance. All teachers, or most teachers, know how to use a standard to create a lesson plan, to do X, Y, and Z. So coming up with these AI literacy or digital literacy standards is gonna be really important. In the same way that we have math standards and reading standards, I think all states should have digital or AI standards at this point. And that includes things like metacognition and critical thinking. I think one thing AI can do really well is get students to ask more questions, and ask those critical questions, like, "Hmm, is this what I'm really trying to say?" Or "Do I need to tone-shift it?" Or "Do I really understand it when my teacher gives me written notes, or do I need to make it into a visual or a podcast?" So in some ways it allows students to do so much reflection on what they're trying to say, on what they're trying to convey, or even just how they wanna learn. But I think so much of that needs to be a part of that AI literacy that we're talking about.

    Lydia Kumar: Absolutely. Those are really good points. Once the standards exist, districts can actually draw on them to do PD and to teach their teachers how to use the technology in a way that works for students. And I think it takes some of the pressure off. If you're out there without any guidelines or guardrails, just doing stuff, that feels overwhelming too. It's like, how much is too much? I teach kindergarten; do they need to know about AI? That can feel overwhelming. So having some standards and knowing what's appropriate for each age and stage is really important, to let people breathe, you know? Like, okay, I know this well enough. I know something, at least.

    Tanzania Brown: How have you seen districts responding to it? So you work with districts mostly?

    Lydia Kumar: Well, you know, I did in the past. We did change management work for schools, implementing staffing structures. And now what I'm trying to pivot to is AI work, to help districts set these AI standards and guidelines that we're talking about. Because I think training people on AI is sort of a low lift, but the harder, more complex opportunity is for districts or organizations to think about: what is our vision for AI? How do we wanna use it? How do we maintain data privacy? And then think about how you wanna train your people. It's sort of like when we were talking about the AI standards: once a state has AI standards, then you can build up from there. But if it's just kind of a free-for-all, I think you're gonna get a lot of variety in quality of use. You're gonna have ethical concerns; you're gonna have data breaches that you don't feel good about. So I'm really interested in how we set a strong foundation, and then, once the foundation is there, thinking about what training needs to happen. But what I see happening is that people are way more open and ready for training than they are for those deep guideline-setting conversations. A district might be more interested in a PD session about using AI to set up your classroom. The deeper, harder work that I'm interested in is how you set the guidelines to really allow this to be used safely, ethically, and effectively down the road. Yeah.

    Tanzania Brown: And I kind of wish district leaders were asking the questions you're asking, doing that work, like, during COVID, you know what I mean? Right. It feels so reactive now. You're like, okay, we're already using it; the car's already moving. I don't know that we can go backwards. Yeah. So developing it in real time is gonna be the hardest part.

    Lydia Kumar: But I think there is some importance in thinking about how you want to use something, because the technology is shifting so much. If you have a baseline, then I think you can build up from there. So those are things that I'm thinking about. And there are tons of people just on the ground using it. Do you pull back? Do you push ahead? But I mean, the equity piece matters too, for teacher experience and for student experience. If only some people have access to the knowledge or the skill, then I think that has impacts on students down the road.

    Tanzania Brown: Yeah. These are amazing questions.

    Lydia Kumar: My last question that I love to end with, yeah, is just like, what's the question or tension or idea around AI that you are, you can't stop thinking about?

    Tanzania Brown: Hmm. I mean, for me it's really about student voice and agency when it comes to all of this, like, teacher and student voice and agency. I think so often in school districts, things just happen to teachers, right? Like, you know, it's a mandate, you have to do this, or it's a professional development, you have to do this. But what I think is so rich about AI, because it's so new and because so many people are figuring it out, is that I hope it's an opportunity for teachers and students to kind of guide the conversations, mm-hmm, to answer some of those questions around how they should use it or what quality use looks like. Something that I'm really struggling with is that, especially in the public education sector, I don't want teachers and students to just be left behind with this. And I can totally see that happening just because of those perspectives that I've been talking about, the resistance to artificial intelligence. Mm-hmm. I don't want this to create further gaps in agency and voice, in teacher shortage, and in student connection to school. It's something that weighs on me, 'cause, like you said, I see this happening in the workforce. I see this changing so much in corporations, and education is often the last thing to be impacted and affected, which then means that teachers, and of course students, are impacted in ways we won't know about until, like, 10 or 15 years from now. And I just don't want it to exacerbate gaps that we already have. Mm-hmm. Instead, I really do think it can close some of these gaps, but how, when, and what does it take? Yeah, yeah. What does it take? What do we need to do? I've even just been looking for, like, classes that I can take.
    I'm, you know, I'm like, where are the leading thought partners in education that are really thinking about these things so that we're not left behind? And not necessarily even left behind, 'cause I do believe that students and teachers make things happen for themselves all the time. But I want us to just be proactive, I think, is the better wording.

    Lydia Kumar: That was such an insightful conversation with Tanzania Brown. A huge thank you to her for sharing her powerful perspective on the intersection of AI, equity, and education. Her stories, from students using AI to create personalized study tools to the real-world challenges of AI bias, highlight the urgent need for teachers to be trained, valued, and given the time to lead this transformation.

    Continuing our exploration of what it means to lead thoughtfully in the age of AI, we're going from the classroom to the creative industry. Join me next time for a discussion with John JK Carnegie, a professional who uniquely bridges the analytical and the artistic. By day, he's a digital marketer with over a decade of experience, and by night, he's a photographer capturing human stories. We'll explore how he uses AI as an efficiency engine and a creative companion, freeing him up to focus on the human expertise that truly makes a difference. To dive deeper into today's topics with Tanzania, I put everything for you in one place. Just head over to the resource page for the episode at KINWISE.org/podcast. There you'll find the full transcript, more about Tanzania and her curriculum work, and a list of resources and prompts inspired by our conversation.

    For the school and district leaders listening, if you're looking for AI professional development that truly lasts, I invite you to learn more about the KINWISE Educator PD pilot program. We partner with districts to select a topic that's meaningful for your teachers, and together we build a community of practice that continues to support them long after our work together is done. You can learn more about my approach at kinwise.org/pilot.

    Finally, if you found value in this conversation, the best way to support the show is to subscribe, leave a quick review, or share with a friend. It makes a huge difference. Until next time, stay curious, stay grounded, and stay Kinwise.

  • I hope you enjoyed hearing Tanzania Brown’s insights on accelerating learning through high-impact, relationship-driven tutoring. If you’re inspired by her equity-first approach and want to champion stronger math & literacy outcomes in your own setting, here are a few easy ways to connect:

    • Saga Education Official Website – Explore their evidence-based tutoring model, impact stats, and partnership options for districts nationwide: saga.org

    • Connect on LinkedIn – Follow Tanzania’s leadership journey and reflections on curriculum design, program management, and instructional coaching: linkedin.com

    • Follow Saga Education on Instagram – Get behind-the-scenes stories, tutor spotlights, and real-time results from classrooms across the country: instagram.com

  • Putting Inspiration into Action: ChatGPT Prompts Inspired by Tanzania Brown

    Listened to the episode with Taz and feeling inspired? Here are some practical prompts you can use with ChatGPT to start integrating her ideas on AI, equity, and student affirmation into your own teaching and curriculum design.

    Pro-Tips for Better Results:

    • Set the Scene: Before your main prompt, tell ChatGPT who you are and what you're trying to achieve. For example, "I am a 9th-grade English teacher in a diverse public school..."

    • Use Your Voice: For brainstorming, try using the voice feature in the ChatGPT mobile app. It can feel more natural and help you generate ideas more freely.

    • Upload Documents: For tasks like grading or creating differentiated materials, upload your existing rubrics, lesson plans, or de-identified student work samples to give ChatGPT specific context.

    • Iterate and Refine: The first response is just the beginning. Talk to ChatGPT like a thinking partner. Use follow-up prompts like, "That's a good start, but can you make it more appropriate for students with ADHD?" or "How can we make this more culturally responsive to Black students?"

    1. For Personalizing Student Learning & Metacognition

    These prompts are designed to help you use AI to create customized learning experiences that encourage students to think about how they learn best, just like the student who turned his study guide into a podcast.

    Prompt for a Personalized Study Tool:

    "I want to help my students create personalized study tools using AI. My student is studying for a test on [insert topic, e.g., 'the causes of the American Revolution']. They learn best through [insert learning preference, e.g., 'auditory learning,' 'visuals,' 'kinesthetic activity']. Act as an instructional coach and generate 3-5 creative ideas for how this student could use an AI tool like ChatGPT to create a personalized study guide that fits their learning style. For each idea, provide a sample prompt the student could use."

    Prompt for Metacognitive Reflection:

    "Act as a 10th-grade geometry teacher. I want to help my students reflect on their learning process. Create a short, student-friendly script that I can use to introduce the idea of using AI as a 'thinking partner.' The script should encourage them to ask AI questions like:

    • 'How can I rephrase this concept in a simpler way?'

    • 'Can you turn my notes into a set of flashcards?'

    • 'What are some different ways to approach this type of problem?' The goal is to empower students to take ownership of their learning by using AI to discover what works best for them."

    2. For Designing Culturally Responsive & Affirming Curriculum

    Use these prompts to ensure your curriculum, like Taz's, is rooted in social justice and affirms the identities and strengths of all students.

    Prompt for Curriculum Brainstorming:

    "I am a [your grade level and subject, e.g., '7th-grade history'] teacher. My students are primarily [describe your student population, e.g., 'Black and Latinx from low-income backgrounds']. I am planning a unit on [insert topic, e.g., 'the Industrial Revolution']. Act as a curriculum designer with a deep understanding of culturally responsive pedagogy and social justice. Brainstorm 5 ways I can teach this topic that:

    1. Centers the experiences and contributions of people of color.

    2. Connects the historical content to the students' present-day realities.

    3. Affirms their identities and empowers them to see themselves as agents of change. For each idea, suggest a key question we could explore as a class."

    Prompt for Differentiating Materials with an Equity Lens:

    "Attached is my lesson plan and a worksheet on [insert topic]. I need to differentiate this for a student who has an IEP for [e.g., 'a reading disability'] and struggles with self-esteem. Act as an expert in both special education and student affirmation. Review the attached materials and provide specific, actionable suggestions for how to scaffold this assignment. The suggestions should not just simplify the task, but also elevate the student's strengths and build their confidence. Focus on restorative language and framing."

    3. For Creating Engaging Math Activities (That Go Beyond the Right Answer)

    Inspired by Taz's work with error analysis and problem-solving, these prompts focus on the process of mathematical thinking.

    Prompt for an Error Analysis Assignment:

    "Act as an Algebra 1 teacher. I want to create an 'error analysis' assignment. Below is a math problem. First, solve it step-by-step. Then, create a second version of the solution that contains a common conceptual error that a student might make.

    Problem: [Insert math problem here]

    Now, create a worksheet for my students. The worksheet should present the incorrect solution and ask students to:

    1. Identify the error in the thinking.

    2. Explain why it's an error.

    3. Provide the correct step-by-step solution."

    Prompt for Generating Real-World Math Problems:

    "I want my students to see how math applies to their lives. My students are interested in [insert student interests, e.g., 'fashion design, video games, and social media']. Act as a creative math curriculum designer. Generate 3-5 word problems for a [insert math level, e.g., 'Geometry'] class that connect to these interests. Ensure the problems are complex enough to require critical thinking and can't be solved with a simple copy-paste into an AI tool."

    4. For Teacher Training & Professional Development

    If you're a school leader or instructional coach, use these prompts to guide your staff in thinking critically and proactively about AI.

    Prompt for a Professional Development Session:

    "I am an instructional leader planning a 60-minute professional development session for teachers who are hesitant or resistant to using AI in the classroom. Based on the insights of Tanzania Brown, my goal is to shift their perspective from seeing AI as a threat to viewing it as a tool for deepening student learning and saving them time.

    Create an outline for this session that includes:

    1. An engaging opening activity to surface their current beliefs and fears about AI.

    2. Key talking points that address common concerns like plagiarism and equity.

    3. A hands-on activity where teachers use ChatGPT for a low-stakes task, like generating a lesson hook or creating a grading rubric.

    4. A concluding reflection question that encourages them to identify one small way they could experiment with AI in the coming week."

    Prompt for Developing School-Wide AI Guidelines:

    "Act as a school district consultant specializing in educational technology and equity. Our school needs to develop clear, ethical, and practical guidelines for the use of AI by both teachers and students. Based on the conversation with Tanzania Brown, generate a list of key questions our leadership team should discuss to create these guidelines. The questions should be organized into the following three categories as she suggested:

    • How to Use AI: (e.g., What are the approved AI tools for our school? How can we leverage AI to support our school's mission?)

    • How to Detect AI: (e.g., What is our school's policy on AI-generated work? How will we teach students about plagiarism and academic integrity in the age of AI?)

    • How to Adapt to AI: (e.g., How will we need to change our assignment design and assessment methods? How will we ensure equitable access to these tools?)"
