25. Danelle Brostrom on Leading AI: Privacy, Humanity, and Progress in Schools
Episode 24 of Kinwise Conversations · Hit play or read the transcript
The Strategic Shift in K-12 Leadership and AI
In this essential episode, we host Danelle Brostrom, a leading K-12 Educational Technology Coach for Traverse City Area Public Schools, one of ISTE's 20 to Watch for 2023, and a 2025 EDSAFE AI Alliance Women in AI Fellow. Danelle brings a critical, human-centered lens to the rapid adoption of Artificial Intelligence in educational environments. This conversation is essential listening for superintendents and mission-driven executives grappling with the technological inflection point currently underway.
Danelle emphasizes that current institutional challenges are not merely technical; they are deeply ethical and operational. We discuss the critical need to learn from the mistakes of the social media era to avoid losing our "humanity" in the transition to AI. Core to the discussion is the pervasive and often invisible embedding of AI into nearly every EdTech product, creating a massive, underestimated risk to student data privacy. Leaders must build new organizational muscles—including vetting privacy policies, fostering internal curiosity, and elevating student voices—to ensure that AI adoption leads to equitable access, not future liability.
Key Takeaways for K-12 Leaders
The Privacy Blind Spot: The greatest risk is underestimating how many AI-embedded tools teachers are using, making robust vetting of vendor privacy policies a necessity, not an option.
The Social Media Warning: Leaders must proactively address the rise of AI companions to preserve human-to-human connection, learning from the societal mistakes made during the initial social media boom.
Librarians as Fact-Checkers: School librarians and media specialists are the ideal organizational experts to lead information literacy and critical thinking initiatives against deepfakes and misinformation.
Equity as AI's Core Value: AI offers scalable solutions for inclusion, such as instant translation and text adaptation, which can immediately remove barriers for multilingual and neurodivergent learners.
Lead with Curiosity: The most effective organizational change begins with creating a safe space for staff to "wrestle" with AI's ethical questions, encouraging progress, not perfection.
Setting the Stage: From Classroom to EdTech Advocacy
Lydia Kumar: To begin with, I always ask people to tell a little bit about their story and help any listener become oriented to who you are and what has kind of led you and I into this conversation together. So, your background, your interest in AI, the things that bring you to the work that you do.
Danelle Brostrom: My name is Danelle Brostrom. I am in EdTech in Traverse City, Michigan. I have always been interested in technology. When I first started teaching—I taught for maybe 12 years—my teaching partner noticed I had an excitement for technology, and I eventually found a group of teachers talking about tech in the classroom and how it could help make an impact. That's when I really felt the power of technology. After teaching and doing a principal internship, I went into the technology department in this EdTech role. I was able to help teachers and affect more students and just have a greater impact. More recently, I've become involved with AI because I see the train coming, and it's here, and I want to be thinking about this in a more thoughtful way. I think about how, when our society embraced social media, we made a lot of mistakes with our kids. I want to make sure that that doesn't happen with AI. I want to make sure that we don't lose our humanity with all of this technology.
Lydia Kumar: If we're both just using AI as a means to communicate with each other, are we really communicating with each other anymore?
Danelle Brostrom: Yeah, and I worry about that with our kids. They've got AI at their fingertips, and they are using it whether we think they are or not, so we have to talk to them about it. I see a big opportunity to have those deeper discussions about AI in the classroom.
The Organizational Risk of Consumer AI and Companions
Lydia Kumar: When you think about the comparison between social media and where we are now, what lessons or warnings are top of mind?
Danelle Brostrom: I think the rise of AI companions. With social media, we found kids sitting next to each other on a couch, each on their own phones, and we lost a little bit of that human-to-human piece. Now I see the AI companions that are coming; I think three out of four teens have used them. We've seen stories about grown-ups who have found someone they love, and it's actually an AI chatbot. We need to make sure we don't make those same mistakes. I want to make sure that we are keeping that humanity and keeping people at the forefront.
Lydia Kumar: It feels a little bit dystopian or strange to think about. And so much of what students learn is modeled by adults. How do we model healthy behaviors in schools so that they can connect effectively?
Danelle Brostrom: Our country is very divided right now, and we don't need anything that makes those divisions worse. We need things that help bring us together, which is putting down the phones and just looking at people in the eyes and talking to them.
Lydia Kumar: Was it AI companions, or something else, that made you think, "Okay, this is something we need to pay attention to and get serious about"?
Danelle Brostrom: I think it was when the images and the video started getting really good. I could see that the images and the video were only going to get better, and that's the part that makes me nervous. We've got to start talking about this because I think it's hard for me as a grown-up to be able to tell when an image or a video is doctored or is not true. If we're not helping students learn those critical thinking skills, then our society is not headed for something great.
The Essential Role of Librarians in Information Literacy
Lydia Kumar: In your role, what are some practical things you've been able to do to help teachers or students think critically about these things?
Danelle Brostrom: We've been working a lot with our library staff. Librarians know how to read and how to look for misinformation. They've been working with our students on how to open up a second window and do a reverse image search, and "Can you find this information somewhere else?" I think having that connection between tech and libraries has been huge for our region.
Lydia Kumar: Librarians really do feel perfectly positioned to do that.
Danelle Brostrom: 100%. Their role is really to help all of us navigate this world of misinformation and information literacy. That's a much more powerful role than people often realize.
Organizational Risk Management: Data Privacy and Vendor Vetting
Lydia Kumar: What's one risk that you see that districts may be underestimating right now?
Danelle Brostrom: I think that AI is literally everywhere. It is in every product now. Every single vendor out there has some new way that AI is going to interact with their tools. Teachers are just grabbing things that are going to help their kids. But if you don't have a robust tech department, or somebody who's looking at those privacy policies and thinking hard about data privacy, that's a real gap; you've got to have somebody who's constantly talking about that stuff. A lot of districts are underestimating the number of technology tools that teachers are using.
Lydia Kumar: All of the data that teachers are putting into the technology is then used to train a model. If you don't understand the privacy policy, who knows what they're doing with that data?
Danelle Brostrom: Sometimes you don't know to check, and that's why you need somebody like me who reads privacy policies for fun. A lot of the time, we can negotiate with vendors, too.
Practical Red Flags in Privacy Policies
Lydia Kumar: Have you read anything in a privacy policy that was concerning?
Danelle Brostrom: Sometimes it's something as simple as, this specific tool doesn't protect data for students under 13 or under 16. That's an instant no-brainer. But often, it's something simple like, "We are going to use your data, and your students' data to train our models," or, "If we get sold, we're going to just give that data to another company." We have paid for the upgraded version of Gemini in our district because it does keep our students' data safer. It's in that little protected bucket. There are sometimes things that you can do. It comes at a cost, but if it's worth it, that can happen.
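Danelle's red flags translate into concrete language to watch for. Purely as an illustrative sketch (not TCAPS's actual process, and no substitute for reading the full policy or negotiating with the vendor), here is a toy first-pass scan in Python; the categories, phrases, and function names are assumptions chosen for demonstration.

```python
# Toy first-pass scan of a privacy policy for the kinds of red flags Danelle mentions.
# A match is a cue to read that section of the policy closely, not a verdict.
RED_FLAGS = {
    "trains on student data": ["train our models", "improve our models", "model training"],
    "data transferred if company is sold": ["merger", "acquisition", "successor entity"],
    "weak under-13/16 protections": ["under 13", "under the age of 13", "children under 16"],
    "data shared with third parties": ["third-party partners", "affiliates", "advertising partners"],
}

def scan_policy(policy_text: str) -> dict[str, list[str]]:
    """Return the red-flag categories whose phrases appear in the policy text."""
    text = policy_text.lower()
    return {
        category: [p for p in phrases if p in text]
        for category, phrases in RED_FLAGS.items()
        if any(p in text for p in phrases)
    }

if __name__ == "__main__":
    sample = "We may use de-identified content to train our models and share data with affiliates."
    for category, hits in scan_policy(sample).items():
        print(f"Review closely: {category} (matched: {', '.join(hits)})")
```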
Designing Professional Learning for Change
Lydia Kumar: What's worked well in terms of teacher engagement and impact for AI professional learning? Have you noticed resistance, and how do you navigate that?
Danelle Brostrom: Resistance, yes; excitement, yes. All of the above. At first, our conversations were similar to everyone else's: teachers were worried that kids were going to use AI to cheat. We have an amazing curriculum team, and they work very closely with our technology department. We did some really big, large-scale trainings. We worked with teachers, parapros, administrative assistants, administrators—pretty much anyone who wanted to come learn more about AI. We did it in a really low-stakes, just-come-and-play way. Let's use it to create a menu for your weekly meals at home. Let's use it to look at your resume.
Danelle Brostrom: We just gave people the time and space to wrestle with all this because it's messy work. It's not always black and white. You have to talk about, "Is AI okay to use, and for what?" "Is it cheating if you have it write an outline for you?" All of this, it's not black and white, and our staff really appreciated that we gave them time and space to just wrestle with this stuff, so that way they could own the work a little more. We really had to show: "It's here, your kids are using it. I just want you to be able to recognize it when you see it." To be able to do that, you have to know how it works, and you have to play with it. I think that mindset really gave us a lot of buy-in.
Embracing AI for Equitable Access and Inclusion
Lydia Kumar: Can you provide some examples of how, if AI is handled well, that could expand access and inclusion in learning?
Danelle Brostrom: For sure. We have students who are new to this country. We are using AI where kids can take a picture of the curriculum, and it automatically translates it into their home language. Or we're using AI where they can sit with a friend and have a conversation, and the AI is doing instant translation so that way they can make those connections with their peers.
Danelle Brostrom: A lot of teachers are using AI to change reading levels. You can change the reading level of any text to make it higher or lower; that's great for a dyslexic student. We are using it to help break down large projects for students who have ADHD. I think it's just that differentiation piece, because that job as a teacher, when all of your kids need something different, is very overwhelming. You can use AI to really differentiate and meet the needs of all your students with a lot less time.
Danelle Brostrom: Our high school teachers have just started using Google Meet—they turn on automatic captions translated into the students' home language. It's again removing that barrier for those kids. They're able to access the same content, and it's simple. The fact that it has that automatic translation is a game changer for our kids.
The Core Mindset for Institutional Change
Lydia Kumar: What mindset would you encourage people to embrace about this?
Danelle Brostrom: I would embrace a mindset of curiosity, and just listen to your staff. Being thoughtful and curious and listening will get you very, very far in this work. Encourage progress, not perfection, and give people a safe place to make mistakes. You're not expected to know it all, but you are expected to lead with grace and compassion and humanity.
Lydia Kumar: Is there an idea or question about AI that's sitting with you right now, something that sparks hope, concern, or curiosity?
Danelle Brostrom: Today I'm in the hopeful camp. I'm thinking a lot about how education has to change. With AI, we're actually seeing a lot of these changes happen. I think we're starting to listen to students. We booked one of our featured speakers for our conference this year, and she's a freshman. Her name is Mackenzie Gilkeson, and she has dyslexia. She talks about how she uses AI, and how it's helped her. Those kinds of things give me goosebumps and make me think: "This could be the thing that helps kids and levels the playing field." The fact that I keep seeing her name, and I keep seeing places giving her a platform and a chance to speak... maybe we're at a point where AI can help us make real change for kids, if we use it right.
Connect and Resources
Connect with Danelle on LinkedIn
Podcast: TCAPS Loop (EdTech Loop Podcast Network)
Organization: Traverse City Area Public Schools
Fellowship: 2025 EDSAFE AI Alliance – Women in AI Fellow
Recognition: ISTE 20 to Watch (2023) • MACUL Board Member
Prompts Inspired by Danelle
1. “Start Small” Prompt: Personal Productivity Warm-Up
“I’m a K-12 educator with limited time. Create a weekly meal plan and grocery list that’s healthy, affordable, and fast to prep. Include a few leftovers for lunches.”
Why it works: It mirrors Danelle’s low-stakes PD approach—build comfort with AI through personal life use before tackling professional tasks.
2. “Interpretation over Automation” Prompt: Student Critical Thinking Exercise
“Provide two short news stories about technology: one credible and one containing subtle misinformation. Then guide students through questions that help them determine which story is more credible and why.”
Why it works: Turns AI into a tool for teaching media literacy and discernment instead of outsourcing thought.
3. “Data-Smart Classroom” Prompt: Privacy Scenario Review
“Act as a district technology coach. List 5 questions teachers should ask before using a new AI tool in the classroom to ensure student data privacy.”
Why it works: Helps teachers understand what to look for in privacy policies and creates conversation around responsible tool use.
4. “Equity in Action” Prompt: Inclusive Lesson Design
“Rewrite this reading passage [paste your text] at three different reading levels and translate it into Spanish. Include comprehension questions for each level.”
Why it works: Demonstrates how AI can immediately increase accessibility and inclusion—just like Danelle’s examples of translation and differentiation. (See the sketch after this list for one hypothetical way to run this prompt at scale.)
5. “Leadership Mindset” Prompt: Reflective Dialogue Starter
“As a school leader, I want to open a faculty meeting by inviting curiosity about AI. Write three reflection questions that help my team think about AI’s impact on learning and connection.”
Why it works: Promotes curiosity and shared ownership—the very leadership posture Danelle emphasized.
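Prompt 4 also lends itself to reuse across many passages. Below is a minimal, hypothetical sketch of how a coach might run it programmatically with the OpenAI Python SDK; the model name, reading levels, and target language are placeholder assumptions, and any real use would first need to clear the kind of privacy vetting Danelle describes.

```python
# A minimal sketch (not a district workflow): batch-running the "Equity in Action"
# prompt with the OpenAI Python SDK. Assumes the `openai` package is installed
# and OPENAI_API_KEY is set in the environment.
from openai import OpenAI

client = OpenAI()

def level_and_translate(passage: str,
                        levels=("grade 3", "grade 6", "grade 9"),  # placeholder levels
                        language: str = "Spanish") -> str:
    """Ask the model to rewrite a passage at several reading levels,
    translate each version, and add comprehension questions."""
    prompt = (
        f"Rewrite the following passage at these reading levels: {', '.join(levels)}. "
        f"Then translate each version into {language} and include two "
        f"comprehension questions per level.\n\nPassage:\n{passage}"
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name; use your district-approved model
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(level_and_translate("Photosynthesis is how plants turn sunlight into food."))
```

The point of parameterizing the model is Danelle's point about vetting: the same sketch works against whichever vendor and plan your district has actually reviewed and contracted for.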
About the Guest
Danelle Brostrom is a K-12 Educational Technology Coach at Traverse City Area Public Schools, a 2025 EDSAFE AI Alliance Women in AI Fellow, ISTE 20 to Watch (2023), MACUL Board Member, and co-host of the TCAPS Loop podcast. A Google Certified Trainer/Educator and former classroom teacher, she designs practical, human-centered PD and partners with librarians to build information-literacy muscles across schools. Danelle is a fierce advocate for digital citizenship, student data privacy, and equitable access, leveraging AI for real-time translation, leveled texts, and executive-function supports. She holds an M.A. in Educational Technology from Michigan State University and a B.A. from Alma College.

