Mar 03, 2026
This commentary is by Ellen Parent, a high school English teacher who lives in Berlin.

Let’s say you wake up one morning and decide it’s finally time to get ripped. Being absolutely chiseled will make it easy to lift heavy objects, help you feel strong and competent and — let’s be honest — get you ready for swimsuit season.

So you head to the gym. Gyms are great places to get strong. They have instruments specially designed to build your muscles. They have trainers available to guide your progress toward washboard abs. Working out can be tedious and challenging, for sure, but all those reps and sets get you ready to tackle the world with strength, agility and confidence.

Now, let’s say your trainer shows you a new piece of equipment: a robot that will assist you with the more tedious or difficult parts of your workout. Not sure how to curl a dumbbell? That’s okay — the robot can curl it for you. Getting tired on the elliptical? Don’t worry — the robot can move your legs for you. Want to deadlift 500 pounds like you saw that guy do on TikTok? The robot will support your arms so you can do it on day one.

So you use the robot workout assistant. And exercising is so easy! Unfortunately, no matter how much you train, your muscles don’t seem to get bigger. You can’t lift heavy objects at home. You’re not ready for swimsuit season. You just aren’t very ripped.

As a high school English teacher, I know what builds students’ mental muscles. The more we do with our brains, the more capacity we have. On the other hand, the more cognitive tasks we offload, the more we limit our own ability to do those things.

This is especially true with developing brains. Kids need lots of practice and challenge to maximize their cognitive growth. That’s why everything I’ve learned about artificial intelligence (AI) over the last three years points me to an absurdly obvious conclusion: using generative AI in school will harm our students.

Consider the writing process.
To be honest, when you’re writing an essay for school, the product doesn’t matter very much. But every part of creating that product, from outlining to drafting to revising, is essential. It trains our brains to do the important things: idea generation, problem-solving, critical thinking, logical thinking, attention to detail, persistence and sustained attention.

Take away any part of the writing process, and we take away the opportunity for our minds to grow and gain power. So when a new tool claims it can help students by generating ideas for them, doing their research, drafting their essays and fixing their mistakes, I see what it’s really doing: measurably removing opportunities for growth and learning. Cheating our students out of their education.

The emerging research backs me up. A new study by the Brookings Institution’s Center for Universal Education examines the effects of AI use in schools. While it points to some theoretical benefits, it also describes what an NPR summary of the report characterizes as a “doom loop of AI dependence,” which could cause cognitive decline and mental atrophy in students. Students who use generative AI, the NPR article continues, “are already seeing declines in content knowledge, critical thinking and even creativity.”

Another study, by MIT, found that some participants who used ChatGPT to write essays showed lower neural engagement and weaker recall of their own work than those who wrote unaided — findings that raise concerns that heavy AI assistance could reduce deep cognitive engagement during writing.

It’s sometimes argued that students should use AI in school because AI is transforming our society, and they need to keep up. I agree, but with major caveats. To use AI effectively, you need skills you can only develop through the kind of practice AI seeks to replace. Yes, kids will need to learn how to use it.
But we don’t put a scalpel in the hands of a kindergartener just because they want to one day become a surgeon. AI literacy needs to be carefully planned, incremental and built on a firm foundation of real-world learning.

Additionally, asking students to decide on their own when to use AI and when to avoid it is unfair. I’ve seen experts suggest that students use AI for some parts of the writing process but not others. But AI is programmed to do the work for us, and it offers that temptation in the form of persistent, cheerful suggestions. Asking students to resist this temptation, when the tool is designed to get us to use it more, is unrealistic and counterproductive. At the very least, students need rigorous education and preparation before being asked to use these tools.

As with all huge, earth-shakingly complicated topics, there are many nuances I haven’t captured here. The conversation around AI should (and does) look a little different in the worlds of special education and English language learning. And sure, there may be helpful ways for teachers to use AI in their own workflows. I even see a glimmer of potential in some AI systems currently being developed as tutors — systems designed to ask students questions and prompt their learning rather than provide answers.

But we owe it to our kids to lead with caution. The potential benefits are murky, but the harm is crystal clear.

Read the story on VTDigger here: Ellen Parent: Keep AI out of schools.