If you’re a school leader, you probably have teachers like Heather Gauck in your district. At least … I hope you do.

(And if you’re a teacher, you might be like her — or work down the hall from someone like her.)

She’s the author of today’s guest post (below).

And she is just the kind of teacher we need in education right now. As AI integrates into schools and classrooms in bigger ways, we need teachers who are willing to innovate and try, evaluate and adjust … and then, most importantly, share what works and what doesn’t.

Her guest post below — “Stepping into Discomfort: Why Educators Don’t Need to be AI Experts” — comes from her own experiences. It centers on this crucial question …

“Can we embrace AI as a tool for deeper learning?”

Even if we think that AI has potential in schools, it’s a question we should all seriously consider and wrestle with. We can’t quickly and easily default to “yes” or “no.”

(PS: Would you like to write a guest post for AI for Admins? Hit reply and let’s talk about it!)

Heather’s experiences, which she shares today, feel so familiar to my own.

I wrote about them in my new book, AI Literacy in Any Class, which released last week!

📢 KINDLE NOW AVAILABLE: It’s FINALLY available in Kindle ebook now ($9.97 USD) … a significantly lower price than the paperback ($26.95 USD)!

More details and discussion from the book are coming soon. But for now, let’s get to this week’s newsletter content!

In this week’s newsletter:

  • 📢 NEW RELEASE: AI Literacy in Any Class

  • 📚 New AI resources this week

  • 🗳 Poll: Modeling tech use with students

  • 🌱 Stepping into Discomfort: Why Educators Don’t Need to be AI Experts

📢 NEW RELEASE: AI Literacy in Any Class

Everyone’s talking about AI literacy these days. The politicians, the tech CEOs, the influencers … they’re all saying: “We need to teach students x, y, and z about artificial intelligence.”

But how does it actually fit in a classroom when you have a curriculum to teach and content to cover?

AI Literacy in Any Class provides teachers with a framework to incorporate tiny AI literacy shifts in what they’re already teaching — in ways that strengthen how they teach.

“Matt has created the best case for easy acts of incorporating AI literacy in whatever you teach!” — Dustin Rimmey

Interested in running a book study with teachers?

  • Email [email protected] to get a significant bulk discount on orders of 10+ (with free shipping!).

  • FREE book study materials are coming!

📚 New AI resources this week

1️⃣ The AI-Resistant Classroom Is a Myth (via eSchool News) — Argues that leaders should stop trying to “AI-proof” classrooms and instead redesign assessments that assume AI is present and still measure real thinking.

2️⃣ Schools Are Teaching AI — and Making a Massive Mistake (via Washington Post) — Makes the case that schools should focus less on tool use and more on helping students develop agency, judgment, and understanding of how AI works.

3️⃣ Real-Time Data Shows Exactly How Students Use AI on School Devices (via Education Week) — New data from 1.2M student interactions reveals how AI is actually used (and misused), with clear implications for district policy and monitoring.

What would you like to read in AI for Admins?

What’s a topic you’d like to see covered here? Hit REPLY to this email and let me know.

Have you done anything you’d like to share with the AI for Admins community? Hit REPLY and let me know.

Would you like to write a guest post to support and equip AI for Admins readers? Hit REPLY and let me know.

🗳 Poll: Modeling tech use with students

Instructions:

  1. Please vote on this week’s poll. It just takes a click!

  2. Optional: Explain your vote / provide context / add details in a comment afterward.

  3. Optional: Include your name in your comment so I can credit you if I use your response. (I’ll try to pull names from email addresses. If you don’t want me to do that, please say so.)


GUEST POST

🌱 Stepping into Discomfort: Why Educators Don’t Need to be AI Experts

Guest post by special education teacher Heather Gauck

I first attempted to integrate technology into my classroom more than a decade ago.

My carefully planned lessons often crashed and burned as I growled at my computer while my students watched.

In those moments, my students didn’t just see a teacher; they saw a learner. The productive and, let’s face it, uncomfortable struggle was happening right in front of them.

With the recent news that Google and ISTE+ASCD will partner to train six million educators in AI, it’s worth remembering that teachers bring the most essential perspective on what it actually takes to integrate AI thoughtfully into teaching and learning.

The urgency is clear: RAND Corp. found that the number of districts training teachers on AI more than doubled in a single year, with three-fourths of districts expected to follow suit by the end of last year.

I leaned into technology because it gave a voice to my often voiceless special education students. It allowed them to demonstrate their understanding through multiple means of expression and provided greater access and equity.

Take, for instance, Anna … For a full year, I could not hear her talk; she mumbled and was afraid to speak. Anna had selective mutism, but with the help of an iPad app, she learned to modulate her voice by watching the animal shapes on the screen open their mouths. There were hiccups along the way, as the app crashed or would not work, but it was Anna that year who taught my new students the importance of using volume when making our vocabulary videos.

For me, and for them, that time was a mix of excitement and frustration. Technology failures became opportunities for my students to watch me develop Plan A, then B, C, and sometimes even D, until a solution emerged. 

Those early failures led to an unexpected lesson: I was modeling persistence, problem-solving, humility, and lifelong learning. I learned that teachers do not need to be experts to be effective. My discomfort was not a sign of failure. 

AI is no different. It offers the same opportunity for shared learning, even as it raises legitimate concerns about ethics, cheating, bias, and misuse. Many educators feel pressure to become experts in a rapidly evolving AI landscape. In some educational spaces, that discomfort has already led to restrictions or calls for banning AI tools, as districts and states struggle to draft policies about how, when, and whether these technologies should be allowed in classrooms.

Yet some of my most impactful teaching moments this year have come from AI-related missteps. When one of my students cursed, MagicSchool AI flagged the content as inappropriate, and that became a conversation. I brought the student into my room, and we discussed responsible use of language, digital citizenship, and accountability. I watched his eyes fill up with tears. “I just sometimes get really angry,” he said. Rather than punishing the behavior, I told him it was okay to feel angry, but we needed to find better ways to express it. In this moment, AI became a powerful learning opportunity.

So the question becomes: can we embrace AI as a tool for deeper learning? Banning AI removes the opportunity to teach ethics, build critical thinking skills, and prepare students for the real-world tools they will inevitably encounter. If we ban AI outright, we lose those moments. We send the message that mistakes are something to hide rather than learn from. We also miss a chance to teach students how to navigate powerful tools responsibly in a world where AI is not going away.

I do not need to be an AI expert. When I designed chatbot rooms in MagicSchool AI that were too wordy and overwhelming, my students’ visible frustration became valuable feedback. It pushed me to refine the room so it truly served their learning. There is tremendous power in co-learning alongside students, power that builds trust, transparency, and an authentic classroom culture.

When teachers step into discomfort, when we admit we are still figuring things out, we model curiosity and ethical decision-making. We show students that learning doesn’t end with a degree or a job title. We normalize asking questions, setting boundaries, and revising our thinking as new information emerges.

As I reflect on my early years integrating technology with my special education students, I find myself once again learning, struggling, and growing through AI integration. I remind myself that discomfort leads to growth, and that modeling how to learn may be the most important lesson we teach.

Years ago, my students didn’t need me to be perfect with technology. They needed me to be willing to try. The same is true now with AI.

If we choose to embrace AI thoughtfully, acknowledging both its promise and its pitfalls, we give students something far more valuable than answers. We give them a living example of how to learn, adapt, and act responsibly in an ever-changing world.

And that may be the most important lesson of all.

Heather Gauck is a special education teacher in Grand Rapids Public Schools, Michigan, and a Teach Plus Leading Edge Fellow.  

I hope you enjoy these resources — and I hope they support you in your work!

Please always feel free to share what’s working for you — or how we can improve this community.

Matt Miller
Host, AI for Admins
Educator, Author, Speaker, Podcaster
[email protected]
