13. Beyond the Bot: John Sharon on Protecting Human Connection in Schools

Season 2, Episode 3 of Kinwise Conversations · Hit play or read the transcript

  • Intro

    Welcome to Kinwise Conversations, where we explore what it means to integrate AI into our work and lives with care, clarity, and creativity. I'm your host, Lydia Kumar.

    Today, we're taking a step back from the rapid pace of AI to ask a more fundamental question: how can timeless values guide our relationship with modern technology? I'm honored to be joined by John Sharon of the Carolina Friends School, a veteran educator with over three decades of experience. John brings a deeply reflective, values-based perspective to the conversation. We'll explore how Quaker testimonies like simplicity, integrity, and environmental stewardship create a powerful lens for evaluating AI's role in education, the ethical questions his community is grappling with, and why he believes in keeping one foot on the brake and one on the accelerator.

    Transcript

    Lydia Kumar: Thank you so much for being on the podcast, John. I'm really excited to hear your perspective on AI in terms of who you are as a person and the work you do. I know you work at the Carolina Friends School, which is a Quaker school, so I want to start by giving you space to talk about your journey in education, personally or professionally, and anything you think is important for listeners to understand about you for our conversation.

    John Sharon: Thanks, Lydia. It's great to talk to you. I'm going to date myself, but this is my 34th year in education. When I started as a fourth-grade teacher at an all-boys school in Washington, DC, I was still using a mimeograph machine to copy tests and quizzes. I start with that because I've been doing this a long time and have seen a lot of evolution in education. Early in my career, I got into school administration, probably too early, but that gave me a window into bigger trends in education and let me see which innovations came in and actually stuck. I distinctly remember when laptops were being aggressively pushed into schools by Microsoft. It seemed like a monetization venture for Microsoft rather than something that was good for education. Part of my job as assistant head for teaching and learning at Carolina Friends School is to keep an eye on educational trends. I oversee curriculum and learning for pre-K through 12th grade, and it's a fascinating job. It's been incredibly helpful to see the innovations that have come along over the years through the lens of a 30-plus-year career.

    Lydia Kumar: That's fascinating. I'm curious, because you talked about computers coming in and how that felt, and now there's a huge push for AI in education from corporations, the government, and other organizations. How does that land for you? Does it feel like it's what's best for education, or is there a way for it to be what's best?

    John Sharon: It's a great question. What I feel happening actually reminds me of the Microsoft push for laptops in schools. It feels like the monetization of a technology that might be good for the users, but isn't necessarily so; we haven't had enough time to see how good it is for them. My question is always: if nobody was making money off of AI, would there be such a push for it in education? I think the answer is no, but it's something I'm careful to pay attention to. I'm not a Luddite and I'm not afraid of it, but I do think caution is really important, because I don't think we know the full impact it's going to have on learning. The only thing that's constant is fluidity, so it's going to change, and the question is how we're going to be able to respond.

    Lydia Kumar: Carolina Friends School is a Quaker school, and Quaker values might be helpful for listeners to understand. Can you talk about the perspective or lens that the Carolina Friends School holds, and then maybe we can think about looking at this new technology through that lens?

    John Sharon: Happy to. There are a few Quaker testimonies that are relevant to this conversation. Quakers believe that truth is continually revealed. We're wrestling with what that means when you're dealing with a technology that spits out hallucinations and sometimes is deliberately intended to deceive. Another tenet of Quakerism is simplicity, and we're asking to what extent AI is oversimplifying things, because simplicity is actually incredibly complex. There's another testimony that says there's that of God in everyone, so everyone is deserving of respect. AI feels very impersonal, so we're grappling with how to reconcile this notion with technology that's not directly human-produced. Another thing that doesn't get a lot of airplay is the Quaker belief in environmental stewardship. AI uses a ridiculous amount of water to function, and I haven't seen many other people talking about the ethical implications of its long-term impact on the environment.

    Lydia Kumar: That's a fascinating and different perspective. It's interesting to step back and make choices about technology through the lens of values that are woven through your community. I know you're part of a consortium of Quaker schools that are thinking about AI. Are these the conversations you're having? What are your conversations with other Quaker schools like right now?

    John Sharon: The group is just in its genesis stage, and my observation is that there's a range of comfort with AI that reflects school culture more than anything else. There's an equal amount of pressure on the accelerator and the brake. The car's moving, but it's not moving super fast, which fits my sense of how Quakers should be approaching AI: we shouldn't put the brakes on fully, but we also shouldn't floor the accelerator quite yet. A school that recently joined the conversation is piloting the use of AI to onboard new faculty. In the instructions they used to develop the tool, they were very intentional about making sure the answers were fully aligned with the school's mission and Quaker values, with directions like, "make sure your answers are warm and friendly and soft." The idea is that a new faculty member could go into this tool and ask a question, and the response would be totally in line with the school's mission and culture. It could be a really helpful tool, but I also worry about what happens to human relationships when new faculty can just click a button to get a question answered. Where does human interdependence come into the equation? I'm not saying it will go away, but it has the potential to change the dynamic, and I'm interested to see how that goes.

    Lydia Kumar: It's interesting because it feels like you have this alien intelligence that you can go to for answers rather than having slower conversations with a person. The intelligence is trying to mirror the values of the school, but there's a potential for blurriness or over-reliance that could take away from the weak ties you create in a school. You might never meet the person in HR because you don't have a reason to.

    John Sharon: Right. A friend of mine, an educational consultant and former head of school, recently developed an AI tool that replicates his advice. You don't have to talk to him; you can just talk to the tool. I thought that was fascinating, and I'm interested to see how it affects the human interaction between him and his clients. The AI version of me would probably always be nice and always have the right answer. This idea of scaling my God-given identity by creating a digital clone of myself seems strange.

    Lydia Kumar: It's a big change. I was talking to someone who works at one of the two-hour learning schools, a private school model that does two hours of AI instruction in the morning. The rest of the day, students pursue clubs, entrepreneurship, critical thinking, and public speaking. The interesting thing about that conversation was the idea that more AI doesn't mean fewer adults. They want to have more adults and more time for relationships. This is one of the most AI-forward learning institutions in the country, but there's a huge focus on the human side and relationships.

    John Sharon: I'm encouraged to hear that, because what's getting lost in the conversation is the role that human interdependence plays in teaching and learning. The reality is that teaching is interactive and interdependent between adult and child. Here at my school, we intentionally flatten hierarchies, and there's no title distinction between teacher and student: teachers are simply called staff, and students call them by their first names, because we are all in the process of learning together. I think learning is symbiotic between the teacher and the learner, and with AI tools you can take that symbiosis out of it. You can learn information from AI, but beyond information, are you learning about yourself, about learning, and about the process of learning from a machine?

    Lydia Kumar: How are you seeing your staff or students using AI right now? Is there a lot of that going on in the school? What does it look like?

    John Sharon: We've taken an intentionally ground-up approach to AI. We've given teachers leeway to use AI as they see fit, and sometimes that's been great and sometimes not so great. A few years ago, we had a situation where a teacher was having AI write his end-of-year comments for him. That was problematic because it meant putting student information into ChatGPT, which raised privacy concerns. Our end-of-year comments are very extensive because we don't have grades, and parents and guardians believe they're being written by human beings who know their children, which was not the case here. We now allow teachers who struggle with writing to do a little AI drafting as long as they personalize it afterward. Otherwise, teachers are using AI as a thought partner and for brainstorming lesson plans. This is where I'm most excited about it: teaching students how to be critical users of AI. My vision is that we become a school that intentionally teaches critical use of AI, so students learn to use it thoughtfully, constructively, and cautiously. They're going to see people using AI in all kinds of ways, and we want to teach them to use it for the good.

    Lydia Kumar: I think that's so important. I've seen some emerging research about the impacts of AI tools on learning. In one study, students perceived that they had learned more than they actually had when working with a chatbot, because they saw writing down the answers as actual learning. It's important to teach people to think critically, because interacting with the technology can change how we view ourselves and our learning.

    John Sharon: Right, and that's a great story. It reminds me of research from a Learning & the Brain conference I went to 20 years ago about students who got perfect SAT scores. The students who did well on those tests were able to put themselves into the shoes of the person writing the test; there was an empathy factor that other students didn't have. That was an incredibly human capacity they were exhibiting. We all need that same empathy for whoever is selecting the data our AI is trained on.

    Lydia Kumar: You've led schools through different technological changes: dot-com, cell phones, social media, and now AI. What are some nuggets of wisdom you have from the past that you're holding onto, or mistakes you want to avoid with this new innovation?

    John Sharon: The reason I encourage schools to develop nimble AI policies is that with other technologies, we found you needed to be willing to change them from year to year. I've also been thinking about AI as being a little like the discovery of a new animal in the jungle. The villagers don't know if it's friendly, if it's going to eat them, or whether it should be tamed. They have to be very careful about getting to know this new animal and letting the animal get to know them. I think we're still in that phase with AI, carefully measuring its impact. We continue to grapple with whether this is going to be for our good or for our harm, and we should keep grappling with it for the foreseeable future. Social media showed us that more than one thing can be true: there were a lot of pros and a lot of harm.

    Lydia Kumar: What's an idea or a question about AI that you are just continually grappling with right now?

    John Sharon: I think what I'm grappling with is always looking at AI as a tool for learning and not as a replacement for learning itself. I want educators to be curious about it and to be willing to ask hard questions, especially as it continues to evolve and grow in usage. I also want us to pay careful attention to both the environmental impact of AI and the privacy impact of AI. For example, when teachers have AI grade papers, that data is not secured or private, and student content is suddenly out there. So I think we have to continue to move forward, but we have to keep our foot on the brake and see AI as a tool for learning, not the learning itself.

    Outro

    I enjoyed that thoughtful and grounding conversation with John Sharon. Thank you for sharing your wisdom with us, John. I was particularly struck by his call to view AI not as the learning itself, but as a tool for learning. His use of the Quaker belief that truth is continually revealed frames the critical, ongoing process of discerning AI's role in our schools and our lives.

    To dive deeper into today's topics with John, I've put everything for you in one place. Just head over to the resource page for this episode at kinwise.org/podcast. There, you'll find the full transcript, more about the Carolina Friends School, and other resources inspired by our conversation.

    For the leaders and teams listening, if these insights have you thinking about how to build a real AI strategy for your own work, I invite you to learn more about the Kinwise Pilot Program. We partner with organizations to create practical, human-centered professional development and policies that empower your team to use these new tools with confidence and care. You can learn more at kinwise.org/pilot.

    Finally, if you found value in this conversation, the best way to support the show is to subscribe, leave a quick review, or share this episode with a friend. It makes a huge difference.

    Until next time, stay curious, stay grounded, and stay kinwise.

  • Connect & Resources

    1. Policy with Purpose
      "Help me design an AI policy for [your school / nonprofit / organization] that reflects the values of [insert key values, e.g., simplicity, integrity, environmental stewardship]. Present it as a two-page document with sections for guiding principles, acceptable uses, and ongoing review."

    2. Student Voices on AI
      "Create a set of 10 discussion questions for [student age group, e.g., high school / middle school] to critically evaluate AI’s role in learning. Include prompts about [insert focus areas, e.g., privacy, empathy, environmental impact, human relationships], and format them in a printable handout."

    3. Mission-Driven Onboarding
      "Draft onboarding materials for new staff at [your school / organization] that explain our AI tools in a warm, mission-aligned tone. Write it as a one-page welcome letter plus a short FAQ that reinforces our values of [insert values]."

    4. AI’s Environmental Ledger
      "Compare the environmental footprint of AI with that of other common technologies used in [your context, e.g., education / nonprofit operations]. Present the findings in a simple table and add three actionable steps to reduce AI’s impact."

    5. A Year of Thoughtful AI Use
      "Design a year-long curriculum module for [your target audience, e.g., 9th–12th grade students / adult learners] that teaches them to use AI as a tool for learning, not a replacement for learning. Outline monthly topics, key activities, and a capstone project, with an emphasis on [insert priorities, e.g., critical thinking, ethical use, community values]."
