
🤖 Decision time: Evaluating and selecting AI tools

How to make smart, responsible decisions

The competition in the AI edtech space is fierce.

Now, I have to say that there are some genuinely great people working for edtech companies, developing products with educator voices in mind and working toward the greater good.

But there are sharks in the water, too. 🦈

It’s hard enough to evaluate and choose tech tools. It’s even harder when there’s a whole new class of technology — artificial intelligence — to manage.

And it’s even harder than that when there’s fierce competition for your business — and sometimes, people will do anything to make a sale.

In today’s AI for Admins, we’re looking at some steps you can take to evaluate and choose AI tools in a smart, responsible way.

Plus, I’d love to hear from you if you’ve been on this journey. Hit reply and tell me your best tips, advice, and lessons learned. I’d love to share some of them in next week’s newsletter!

In this week’s newsletter:

  • 😆 WeWillWrite gamifies writing — and creates low-stakes practice

  • 🗣️ Your voice: Teacher AI literacy

  • 🗳 This week’s poll: Evaluating and choosing AI tools

  • ✅ Decision time: Evaluating and selecting AI tools

  • 📚 New AI resources this week

😆 WeWillWrite gamifies writing — and creates low-stakes practice

This message is sponsored by WeWillWrite

Does it ever feel like we’ve been practicing writing all wrong?

Drudgery. High stakes. Anxiety.

What if, instead, we made writing fun? Social? Gamified? Low-stakes?

WeWillWrite (wewillwrite.com) is the gamified social writing app that students will beg for.

  • Set up a writing challenge (and share with students)

  • Students complete fun writing prompts

  • They’re sorted into teams and vote

  • Votes earn points that push teams up the leaderboard

  • The teacher can pull up student writing and provide feedback

The best part? The free version offers a LOT (and the premium version unlocks LOTS of potential).

🗣️ Your voice: Teacher AI literacy

In last week’s poll, I asked … What is most important in developing teacher AI literacy?

Poll results:

🟩🟩🟩🟩🟩🟩 Basic, fundamental understandings about AI (30)
🟨🟨⬜️⬜️⬜️⬜️ When to use AI in classwork (or not) (12)
🟨🟨🟨⬜️⬜️⬜️ Ethical implications of AI in school and the real world (18)
🟨⬜️⬜️⬜️⬜️⬜️ How it likely will change the world (work, personal, etc.) (7)
⬜️⬜️⬜️⬜️⬜️⬜️ Other ... (4)

Here’s what you shared with me …

  • It is important that we connect current learning to the real world in order to build relevancy for students. Students need to understand the ethical implications of AI in school and the real world so that they develop usable real world skills around AI. Whether we teach these ethics in the classroom or not, students are going to use AI; however, where will they learn about the ethics of AI outside of the classroom? — I. McDougall

  • Teachers already shoulder tremendous responsibilities. Many view AI as yet another burden added to their workload. In reality, AI represents something revolutionary: a tool that actually lightens their load. With a basic understanding of AI's capabilities, teachers can recognize its true value as an assistant rather than an obligation—potentially sparking curiosity to explore its possibilities further. — A. Schak

  • Understanding the differences in different tools is important to me. When I hear a teacher say "don't use an AI to do this for you" something cringes in my head. I think my personal philosophy about AI has changed the more I use it for different tasks, and I also understand the limitations of what I ask it to do. — Lindsey Bolin

  • I think teachers will continue to be resistant to developing AI literacy in themselves as well as students if they don't fully grasp the ethical implications around AI. If they only see AI as a threat to academic integrity, they will avoid the conversation and live in a "gotcha" mindset when it comes to AI as opposed to talking to their students about the potential impact on relationships, creative work, and even warfare. — Galina Bell

🗳 This week’s poll: Evaluating and choosing AI tools

What do you value the most when it comes to evaluating and choosing AI tools? (I know that several of them may be important … but for the sake of a poll, please choose one — and tell us why!)

Instructions:

  1. Please vote on this week’s poll. It just takes a click!

  2. Optional: Explain your vote / provide context / add details in a comment afterward.

  3. Optional: Include your name in your comment so I can credit you if I use your response. (I’ll try to credit you from the info in your email address unless you tell me not to.)

What is most important when evaluating and choosing AI tools?


Decision time: Evaluating and selecting AI tools

AI image generated by ChatGPT’s DALL-E

You’ve seen all of the buzz on social media and online.

You may have been to a conference and walked the vendor hall, hearing all of the pitches at vibrant, meticulously planned booths.

There are LOTS of AI-powered apps these days.

They all promise to “revolutionize education” … or some sort of buzzy statement like that. And they ALL really, really want your business.

EdTech — especially in the AI era — is big business.

Think of it this way: There are WAY more companies and products out there than all of the world’s schools can support with their classroom use.

Not everyone is going to thrive. Not everyone will survive.

We can’t pick everything. We have to be smart. But how?

I’ve collected 10 suggestions for evaluating and selecting AI tools. 

(NOTE: I haven’t led a tool-selection process myself, so I’ll draw from my own experiences as well as suggestions from other educators. I also added dozens of state education AI guidance documents to NotebookLM and pulled general conclusions from them — plus specific suggestions and conclusions, which I cite directly.)

(INPUT: If you have experience or advice you’re willing to share, please hit reply — and I’ll feature your responses in a future newsletter.)

1. Gather all of your options — but don’t overgather.

At this point, there are probably thousands and thousands of edtech tools that are powered by AI in some way.

You need a list. Then you need a short list.

And (I believe) you don’t need to consider all of the tools. Be specific. What are the needs you’re trying to address with particular tools?

Personally, the tools that grab my attention are the ones teachers say they love and actually use in class. I also look for school and district success stories (not just company marketing hype).

2. Remember that you have power and you have options.

In the edtech world, there’s a lot of FOMO (fear of missing out).

Companies are watching each other. If a competitor launches a new feature, companies commonly replicate that feature just so their potential customers don’t leave for a product with that feature. (Consequently, it leads to feature bloat — when products do too many things instead of focusing on what makes them great.)

What does this mean for you? As a decision maker, you have power. And options. If you aren’t thrilled with something — the contract being offered to you, the company’s practices, the training offered — you can walk. There’s probably another company that can offer similar options.

It’s like buying a car. Lots of folks buy on emotion — they loved the test drive and want to keep that car forever. Then they don’t do their homework and explore other options (and pricing) … and they miss out on something better.

3. Establish a process for selection (but be flexible).

The boss might have a selection process in their mind. But if everyone involved can’t see that process in the boss’s mind — and isn’t aligned on how it should look — you’re going to have problems.

Some things to consider when establishing that process:

  • Determine what your needs are — what you want the tool to accomplish.

  • Determine where you want to go as an organization — and how the tool can help.

  • Determine where you are now — existing practices, programs, tools, etc.

  • Determine who you want to (need to!) listen to — teachers, leaders, other districts, parents, community, etc.

Consider a variety of evaluation criteria, like …

  • Evidence (research, studies, etc. … general education research AND AI-specific)

  • Data privacy and security (federal, state, and local privacy laws … why and how the tool collects data and whether you can opt out)

  • Equity and ethics (does it promote fair and balanced usage, address potential bias, consider accessibility?)

⭐ A great resource: ISTE’s Teacher Ready Evaluation Tool. It asks you lots of questions about user interface, learning design, digital pedagogy, and other criteria.

4. Connect and align with guidance documents.

To know how to proceed, you have to know where you want to go.

(A GPS map app needs a destination before it can give directions!)

Arizona’s state AI guidance document suggests establishing a “north star”: “Envision the possibilities to transform not only teaching, learning, and leadership, but also our profession, and unquestionably, our future workforce.”

The U.S. Department of Education’s Office of EdTech, in its AI guidance document, suggested reviewing and aligning with core documents like:

  • mission statement

  • organizational strategy documents

  • portrait of a graduate

Ohio's state AI guidance document lists five steps to take to develop policy for AI in education — which really helps align to core values. (Check it out in detail on page 10 of the linked document.)

  • Step 1: Take stock of the current landscape by understanding available AI technologies, existing relevant policies, and internal resources.

  • Step 2: Identify the high-level values, goals, and priorities that are already driving initiatives within the school or district, independent of AI.

  • Step 3: Derive principles for the responsible adoption and use of AI by considering the technology landscape and the identified core values and objectives.

  • Step 4: Articulate actionable, evidence-based policies governing the adoption, use, and retirement of AI based on the derived principles and an inventory of assessable evidence.

  • Step 5: Put the policies into practice by addressing practical conditions, providing necessary training, and implementing monitoring and feedback mechanisms.

5. Do a risk analysis.

When you work with a technology — artificial intelligence — that doesn’t perform the same every time and is unpredictable by nature, you’re going to introduce risk.

There’s the old business maxim — if you can’t measure it, you can’t manage it.

A risk analysis helps identify potential harms and negative consequences involved with using AI (or any technology, for that matter) in an educational setting. Some things to consider in a risk analysis:

  • privacy and data security — are student and organization data being protected?

  • bias and fairness — how likely is information generated by the systems to be fair and just?

  • accuracy and misinformation — how likely is it to be inaccurate or to spread misinformation?

  • academic integrity and plagiarism — are there safeguards in place to promote responsible student use and mitigate overreliance?

  • transparency and explainability — can the company explain how it works — and where it gets its information?

  • legal and policy compliance — is the company willing to comply with applicable laws and policy?

6. Do a pilot program before going all in.

Pilot programs put the technology in the hands of real educators to see how (and if!) the tools will be used in a classroom setting.

Sure, it may be impossible to get a complete understanding of how a teaching corps will use a new tool before implementing it.

But if a sample of teachers is representative — aligning with the majority of educators — there’s a good chance it can at least give insights.

Include your all-stars — your ambitious go-getters. But also pull in your cautious tech users (and even your tech avoiders). All of these teachers make up your teaching corps — and all of their voices should be heard.

7. Stress test any product.

Pressure, as they say, will bust pipes — and also create diamonds.

You have to know if a product’s pipes will withstand the pressure of all sorts of uses.

It may seem counterintuitive, but try all of the tricks your students might pull — any of the ill-advised, inappropriate ways of using the product. Then see how it performs.

I’m going to stay super general on this, but I was sent a transcript from a student-facing AI tool — one you’ve probably heard of (and no, you can’t text me to get the name of it). The conversation started on a health class-related topic, and the user was able to guide the AI chatbot into discussing very explicit sexual topics without raising flags.

Stress test failed.

If you have students that you trust, you might even invite them to stress test a product. I consulted with a district in northern Indiana that was a very early tester of Khan Academy’s product, Khanmigo. Their Khan Academy rep asked them to stress test the product. The process was helpful to the company — and to the district.

8. Get feedback from a variety of perspectives.

Talk to EVERYONE you can to gather feedback, ideas, concerns … any data points you can. That includes …

  • teachers (who will use the product directly)

  • students (who also will use the product directly)

  • tech integration specialists (who will advise teachers and students)

  • district tech admin (who will manage the product from the back end)

  • parents (to gauge their opinions on AI in general and the specific product … and not just parents who are your buddies!)

  • representatives from the product/company (but take all of their claims with a grain of salt)

  • representatives from other products/companies (they might dish on their competitors)

  • other school leaders (who have or haven’t used the product)

  • people who hate AI (in education or not)

  • people who love AI (in education or not)

  • trusted voices online (who you’ve followed for a while and respect)

  • other stakeholders (anyone who might lend a voice or perspective that can help you make a good decision)

9. Gather data points at different times.

The pilot program and the initial look at a product? Those will help you get a first take about an AI tool.

But I’d look at this like I’d look at buying a new dishwasher.

  • Will I love it after I install it?

  • Will I love it in three months?

  • Will I love it in a year?

  • Will I still be using it in five years?

Your teacher feedback — especially from the pilot program — will be more valuable the longer you let the teachers use the product.

This also goes for other testers — your tech integration specialists, your curriculum folks, your IT admin … everyone.

  • After a semester, what are they using the product for?

  • Are they using it at all?

  • What features do they use the most — and what haven’t they touched?

  • What is it good at — and terrible at?

  • Does it integrate well with other tools and technologies?

10. Prepare for AI professional development.

I’ve been at schools where tech rollouts involved little more than “OK, these interactive displays have been installed in your rooms … you’re all set!”

Introducing a new AI tool (let alone AI tools plural) into the teaching ecosystem involves a variety of factors …

  • how it will be used to support instruction by teachers

  • how it SHOULD NOT be used to support instruction

  • implications for overreliance and overuse

  • cautions and concerns about using AI

  • preparing students for the future workforce

  • developing and promoting AI literacy

This isn’t just AI tool training.

This is professional development about AI — AI literacy, AI ethics, AI responsible use.

It’s a life skill. And it’s one that our students AND our teachers need.

Choosing a new AI tool is, in a way, like a marriage. You don’t just say “I do.” You pave the way for a long-term, responsible relationship — for success.

A successful marriage is predicated on more than a dress and a caterer.

It’s a commitment.

I don’t want to overstate here, but if you want to support educators and, in turn, your students, you can’t just reduce this to AI tool training.

What you need is AI literacy — and a vision of how to prepare students for the future that they WILL indeed face in their lifetime.

What else could be added to this list?

Do you have practical experience with evaluating and choosing AI tools?

Please hit “reply” and let me know. I’d love to share it in an upcoming newsletter!

📚 New AI resources this week

1️⃣ Schools relying on digital surveillance find security still takes a human touch (via AP News): In light of school shootings, schools are finding that AI surveillance isn’t a panacea.

2️⃣ Most Americans don’t trust AI — or the people in charge of it (via The Verge): Two new studies show that the public is anxious about AI.

3️⃣ Gen Z is still anxiously using AI: Poll (via Axios): Here’s how young Americans say AI makes them feel.

I hope you enjoy these resources — and I hope they support you in your work!

Please always feel free to share what’s working for you — or how we can improve this community.

Matt Miller
Host, AI for Admins
Educator, Author, Speaker, Podcaster
[email protected]