
🗣 What students want adults to know about AI

Plus: students using AI to introduce themselves to class???

I found two pieces of news that I’ve been DYING to share with you this week …

The first is a report sharing teen and young adult views on generative AI. I wanted to share the findings — and what students think we (the adults) need to know about AI.

The second is a professor's discovery that students were using ChatGPT to write introductions of themselves to the class (!) on a discussion board. I wanted to share my mixed emotions about it and some issues this whole situation underlines in education.

They’re both below!

Also, I’d love to hear your response to this …

The report below shares teen/young adult views on generative AI and what they think WE should know.

What do you want students to know about generative AI? What would you tell them?

(Be as brief or as lengthy as you’d like.)

PS: Remember, we have an AI for Admins community, and you have free access! You can register, access it, and interact with others here.

In this week’s newsletter:

  • 🤖 What students want adults to know about AI

  • 🗣 DISCUSSION BOARD: What students should know

  • 👋 Students using ChatGPT to introduce themselves in class

  • 📚 New AI resources this week: School guidance, AI copilots/agents, “get rid of that tech”

🤖 What students want adults to know about AI

Image created with Ideogram.ai

Teens and young adults are using generative AI. And we should listen to what they have to say about it.

A report was published recently — “Teen and Young Adult Perspectives on Generative AI” — by Hopelab, Common Sense, and Harvard GSE’s Center for Digital Thriving.

They collected data from 1,274 US-based teens and young adults (ages 14-22) in late 2023.

Some key findings:

  • Half (51%) have used generative AI at some point in their lives.

  • Only 4% consider themselves daily users of generative AI.

  • Most common uses include getting information (53%) and brainstorming (51%).

  • Black teens are more likely than their white peers to use generative AI to get information (72% vs 41%), brainstorm ideas (68% vs 42%), and get help with schoolwork (62% vs 40%).

Their perceptions of AI:

Teens and young adults are all over the map about generative AI.

Here’s their perception of the impact generative AI will have on their lives in the next 10 years:

  • 16% view it as mostly positive

  • 41% view it as both positive and negative

  • 19% view it as mostly negative

  • 8% view it as neither positive nor negative

  • 16% don’t know or don’t have an opinion

What adults should know:

They shared lots of anecdotal evidence about generative AI. Much like their general perceptions above, teens and young adults had a wide variety of opinions.

Some viewed AI as “very dangerous” with risks and concerns like cheating, porn, bullying, privacy concerns, and addiction.

Others saw it as safe (“AI helps, not hurts” one said) with benefits like learning, creativity, and “to talk.”

“[Generative AI] is a tool that can be used for both beneficial and malicious purposes.”

— Latinx teen boy

While some are using it, others aren’t using generative AI and want adults to know that “we are not sure how to use AI,” “we are not really good at using it yet,” and “I don’t know too much about it.”

Teens and young adults did address the cheating issue, saying …

  • “A lot of my friends use AI to cheat on assignments.” — white LGBTQ+ teen boy

  • “It does homework.” — Black teen boy

  • “Some [teens], not all, can use it for getting test answers.” — white teen boy

But not all are using it to gain unfair advantages on classwork. In fact, some see it as a tool to improve their learning:

  • “We use it to start papers and get a structure to a paper started. Just edit out the details and nuances to put my personal touches on it.” — Asian teen boy

  • “We use artificial intelligence to help us with our homework.” — Asian teen boy

  • “You can make a special learning plan just for you. [AI] can also help with vocabulary.” — white teen boy

In the report, they also discuss using generative AI for creativity and fun; for companionship and comfort; and to answer questions.

They also report that it can be used for bullying and lying, including to parents. They see a role for adults in supporting young people’s uses of generative AI.

🗣 DISCUSSION BOARD: What students should know

Today's email is about what students want adults to know about AI. Let's flip that around.

This week’s question: What do you want STUDENTS to know about AI? What would you tell them?

👋 Students using ChatGPT to introduce themselves in class

Megan Fritts, an assistant professor at the University of Arkansas at Little Rock, gave her students an assignment.

“Briefly introduce yourself and say what you’re hoping to get out of this class.”

Here’s how it went …

Screenshot from Megan Fritts @freganmitts on Twitter/X

In an article in Business Insider, she expanded: “They all owned up to it, to their credit. But it was just really surprising to me that — what was supposed to be a kind of freebie in terms of assignments — even that they felt compelled to generate with an LLM.”

I have so, so many thoughts about all of this …

The first is outrage — that students wouldn’t even introduce themselves on their own.

But I think it’s fair to ask: how necessary is this assignment?

“What you’re hoping to get out of this class.” What if a student is really just taking it because they HAVE to, to fulfill a requirement? Could they actually say that?

Here’s another issue: she called it “a kind of freebie in terms of assignments.” We have a student motivation issue here. That framing implies that points and grades are what give learning its value … and that students will do anything we ask for points.

Here’s one of my big takeaways: We’re going to have to be very intentional about what we assign — why we assign it — how we expect students to do it.

We’re going to have to get really honest with ourselves about how students might use AI to do work — whether we want them to or not.

  • Might we do shorter assignments — and more of them — so they’re less likely to be copy/pasted into ChatGPT?

  • Might we encourage more turn-and-talks? Small group discussion? With a quick “do now” reflection on it?

  • Might we scaffold our writing more so students feel more able to do the work — increasing their self-efficacy?

Or maybe we want to stand our ground that these more “ChatGPT-able” assignments are what’s best for students. And we stick with them.

How this is like self-checkout at Walmart

But if we’re going to do that, we’ll need to do a cost/benefit analysis on the risks and rewards of this approach.

Compare it to Walmart’s decision to allow customers to scan their own products at checkout …

Benefits: paying less in payroll to cashiers, better efficiency, happier customers (who prefer scanning their own stuff)

Risks: shoplifting, loss of revenue

Walmart decided that the risks (loss of money from shoplifting) were outweighed by the benefits (efficiency, payroll, happier customers).

They didn’t say that there were no risks.

They were willing to assume the risks.

They said that assuming the risks was worth it because the benefits outweighed them.

Let’s take a similar approach to modifying assignments in light of the existence of ChatGPT …

The risks:

  • The risk if we stand our ground: They might do it with AI.

  • The risk if we modify the assignment into something less demanding: They might not grow as much as learners.

The benefits:

  • The benefit if we stand our ground: They get the rewards they always have from this type of assignment.

  • The benefit if we modify it into something less demanding: They feel like they CAN do the assignment — and will, because it’s doable.

Am I saying that we should throw out any challenging work for students and make everything quicker and easier because they might use ChatGPT?

No.

But I AM saying … instead of saying “no” to the negatives and digging our heels in, let’s look at the big picture. Let’s look at the risks AND the benefits together in the same picture and decide what’s best.

We are really weighing lots of pros and cons when we make decisions like this. We just have to be neutral and look at ALL of the pros and cons. And take a cold, hard look at reality — what really, truly might happen — when we decide.

This AI world is the wild west. There are no easy answers. It’s going to be messy.

Let’s go into it with an open mind — and a desire to do what’s best for students — and be willing to adjust along the way.

📚 New AI resources this week

1️⃣ States are Crafting AI Guidance for Schools, but Have More to Do (via EdWeek): Nearly half of states — 23 total — have released some form of AI guidance.

2️⃣ Every White-Collar Role Will Have an AI Copilot. Then an AI Agent (via Andreessen Horowitz): An interesting look at the possible realities of the future workforce.

3️⃣ “Let’s Get Rid of That Classroom Technology” (via Digital Learning Podcast): This is a refrain we’re hearing more and more from teachers and schools. But is it shortsighted? In this episode, Matt and Holly discuss the pros and cons of dialing back the technology.

I hope you enjoy these resources — and I hope they support you in your work!

Please always feel free to share what’s working for you — or how we can improve this community.

Matt Miller
Host, AI for Admins
Educator, Author, Speaker, Podcaster
[email protected]