
I spent a few days last week in Phoenix, surrounded by hundreds of years of collective teaching experience at the NEA Retired Conference. There is a specific kind of hum in a room full of retired educators. I went there to talk about AI, but I left thinking much more about the humans who have spent their lives standing in the gap for students.
It is easy for the world to think of retirement as a quiet sideline. We often treat our veteran teachers like a favorite old textbook that is just a few editions out of date. But as I listened to the forty faces in my session, I realized the stakes of our technological future are incredibly high for them. One woman shared her profound frustration over AI making medical decisions for seniors. She had received an automatic decline for a procedure, and the notice simply stated at the bottom that AI made the decision. When she called the hospital, they told her they trust the AI and there is no appeals process. Her anger was palpable, and rightfully so. It is a stark reminder of the danger we face when we allow cold algorithms to replace human care and judgment.
Contrast that with a conversation I had the very next day. As I was getting ready to head home, another attendee stopped me. She had taken the AI skills we practiced in my session and used them to draft a three-minute persuasive speech for her local school board. She already knew exactly what she needed to say, but she spent twenty minutes iterating, iterating, iterating until the AI helped her polish her words into a professional plea. She told me it would have taken her hours to write it on her own. Her gratitude for this tool was just as profound as the other woman’s anger.
These two conversations perfectly capture the tightrope we are currently walking with this technology. On one side, we face the pitfall of unchecked, automated bureaucracy where human appeals are silenced. On the other side, we have the power to amplify the voices of the very people who have spent decades advocating for fairness. This perfectly aligned with what I had been reading in Ethan Mollick’s book Co-Intelligence: Living and Working with AI. Working alongside these machines should make us better and more effective, not render us obsolete.
There was so much talk about standing up and being heard. People spoke with a lot of heat about resisting unfair systems and advocating for every human being, regardless of the color of their skin. Seeing that fire still burning in educators who have technically left the classroom reminded me that advocacy does not have a retirement date. We are the ones who must demand that the human stays in the loop forever.
It is a strange thing to be a bridge between thirty years of traditional teaching and the high-speed world of AI. Sometimes it feels like I am mourning the simplicity of a chalkboard while simultaneously trying to explain neural networks. But if my time in Phoenix taught me anything, it is that our veterans are not on the sidelines. They are holding the map.
I am back home now, logging onto my computer for another week of Google Meets with my students. My suitcase is unpacked, but my heart is heavy with the weight and the beauty of this responsibility. We have a lot of work to do to protect the humanity of our world, and I am so grateful I am not standing in that gap alone.

If your district or organization is navigating these exact conversations and needs a partner in keeping the human at the center of AI integration, I would love to help. You can book an exploratory meeting with me here.