The Intersection of Humanity and AI

Navigating Technology Use in Classrooms

There was a time in my teaching career when I believed, truly, deeply, that the right tech tool could change everything.

If I could just find the latest, shiniest, most innovative tool, my students would grow in unprecedented ways. Learning would explode. Engagement would soar. The future would arrive neatly packaged inside an app.

And to be fair… that belief didn’t come from a bad place.

It came from hope.

Like many educators, I saw technology as a way to widen my students’ worlds, especially for students whose lives and opportunities were often limited by circumstances outside of school. Technology felt like a bridge to possibility.

But looking back, I can also admit this (with a little humor and a lot of humility): I spent a lot of mental energy chasing tools instead of focusing on the learning.

A shift I didn’t see coming

In the 2015–2016 school year, I tried something new, at least new to me.

Every Friday in my Composition class became Genius Hour.

No mandated prompts.
No identical projects.
No single “right” answer.

Students chose a topic they cared about, researched it deeply, and presented their learning. Some projects were polished. Some were messy. All of them were theirs.

And it was amazing.

What struck me most wasn't the final presentations; it was the thinking. The questioning. The ownership. Students weren't asking, "Is this enough?" They were asking, "What's next?"

Enter the shiny new tool (of course)

Around that same time, a new tool appeared: Google Hangouts.

And I was thrilled.

In my mind, this was it. This was the tool that would take Genius Hour to the next level. Students could meet virtually with experts. Authors. Scientists. Professionals. People outside our classroom walls.

I desperately wanted my district to open Google Hangouts to students.

They didn’t.

At the time, very little was known about privacy, safety, and appropriate guardrails. So students weren’t allowed to use it independently, even though I could use it in the classroom.

Did Genius Hour fall apart because of that?

No.

Did learning stop?

Also no.

What did happen was something important: I realized that while the tool mattered, it wasn’t the most important part.

The curiosity was.
The research skills were.
The critical thinking was.
The human connection, whether virtual or not, was.

What I’ve learned since then

I still believe deeply that technology can widen students’ minds and give them opportunities beyond the classroom. That belief hasn’t gone away.

What has changed is my understanding of ethical technology use.

The more I’ve learned about data privacy, bias, surveillance, and unintended consequences, the more careful I’ve become about what we place in front of students.

And here’s the key lesson that connects my past to our present moment with AI:

AI tools change. AI literacy doesn’t.

Why tool-first thinking doesn’t hold up

When schools focus primarily on tools:

  • Instruction becomes reactive
  • Teachers feel perpetually behind
  • Policies age poorly
  • Students learn how to work around systems instead of understanding them

This was true in 2015.

It’s even more true now.

What AI literacy actually means

AI literacy isn’t about mastering a specific platform.

It’s about understanding:

  • What AI is and what it is not
  • How large language models generate responses
  • Why AI can sound confident and still be wrong
  • Where bias comes from
  • Why human judgment must stay in the loop

When students understand these foundations, they can adapt, no matter what tool shows up next.

Tools as examples, not the curriculum

AI tools still have a place in classrooms, but as supports, not endpoints.

Just like Google Hangouts once helped me imagine broader connections, today’s AI tools can help students explore:

  • How prompts shape outcomes
  • Why specificity matters
  • How bias appears in outputs
  • Why verification is non-negotiable

If a tool disappears tomorrow, those lessons remain.

AI Behind the Tool (what really matters)

Most generative AI tools share common truths:

  • They are trained on massive datasets that may include bias
  • They predict patterns, not truth
  • They do not understand context like humans do
  • They require ethical guardrails and intentional use

Teaching these realities prepares students for any AI system they encounter, now or in the future.

The bigger goal

The goal isn’t to chase the next shiny thing.

The goal is to teach students how to think, question, and decide responsibly in a world where technology will always keep evolving.

AI tools will keep changing.

Our responsibility is to make sure learning doesn't.

If your school or district is beginning to navigate AI literacy, policy, or responsible classroom use, I'm opening a limited number of conversations this spring. You can learn more or schedule a conversation here.