25 Comments
Jack Watson

Cracking post, Terry. I especially think you raise a great point about it being a useful tool for an established writer, because it's a little hack that helps us break through mental blocks. This can make such a difference to output and motivation. However, I still (considering myself an established writer) enjoy the struggle and feel more satisfaction when I get through the barriers alone. Definitely a tough balance to strike, but it's still such early days for AI that there's no harm in chancing our arms and seeing what works.

Terry Underwood

Don't get me wrong. When I'm using AI I'm not offloading. If anything, my brain is working harder. As Malcolm says, garbage in, garbage out. Automating management email updates for Walmart is one thing. I don't use AI for that.

Matt Renwick

Hi Terry, I appreciate this post, especially when you note that "The key lies in recognizing it as a tool that amplifies existing abilities rather than replacing foundational development." I think about other interventions and resources we provide in schools. They can serve to create more equitable learning conditions, or cultivate learned helplessness. It's all in how we model and discuss technologies such as A.I. with students and faculty.

Terry Underwood

Yes. Everything we do in schools in one way or another supports student agency or restricts it. AI is no exception. We need guys like you because you have the good sense to monitor and act when things get too far out of whack and kids are getting hurt.

Stephen Fitzpatrick

I agree with almost all of this, Terry. I suspect that over the past two years, curious writers who have not fallen into the "hate AI" camp outright have experimented in all sorts of ways with their writing process. I maintain that, for teachers, we have many different types of writing tasks that require different skills and content. I have no problem using AI to draft an assignment that needs to include very specific information in an organized list - think questions, reading lists, instructions, etc. AI is very, very good at these. Anything primarily designed to simply convey information in a structured and logical way I see as a fundamentally different task than something more discursive, thoughtful, and conceptual. And I strongly agree with you that AI has actually helped my writing considerably, though it would be fun to explore why I think that is in more depth. But the drafting piece is a slippery slope for almost all students. So, so easy but fraught. What did you think of the recent NY Mag piece about cheating?

Terry Underwood

I haven't seen the piece, Steve. To tell you the truth, I find cheating uninteresting as a topic and have read about as much on it as my stomach can handle; it's an iatrogenic disease that is easily explained. Suggestion: instead of focusing on the tasks you want students to do (you've got that wired), try a gradual release of responsibility vis-à-vis self-regulation and hold students accountable for self-regulation. Every objection I hear from you (and this goes back to previous discussions) has to do with your fear that students can't self-regulate. You see it all the time. Why can't they? They have so little experience. Why don't they have experience? Because their teachers concluded long ago that they can't self-regulate. They've had no mentoring. That's the deep transformation, the relationship between teacher and students re: who's in charge of my brain. Self-regulation in first grade is far different from self-regulation in fifth grade, and so on. School curriculum doesn't even have a place for self-regulation. It's assumed that kids can't do it.

Stephen Fitzpatrick

I would love it if this were the reality. Our school, like many others, is intensely focused on product. The irony is we actually have excellent teacher/student relationships. The trust is not there because the kids need the grades to get into the schools and the products get them the grades. I can't swim upstream against that culture on my own. Unfortunately, as long as the cheating narrative is the primary one out there - and it is - I'm afraid I'm just not in the right place to engage in the level of experimentation I would like, and I fear we will keep butting our heads against the wall with this one-dimensional conversation. I wish I could get more of our teachers to read you and others doing this kind of thoughtful work.

Terry Underwood

That's the wrong tree, Steve. And you can swim against the stream on your own. I did. And how do you know so many kids are cheating? The field can't even define the term. Lots of written puffery and anecdotes—nothing real. Truth be told, it makes me barf. Most high school teachers so concerned about cheating have very little idea what their students are doing. That's why TurnItIn exists. I question your assumptions big time. The incentive scheme is backwards. Teachers throw the dice, set the rules, and corral the kids, and then act surprised when they "cheat." Many of the eloquent professorial types are so high on themselves and what they know that they forget it doesn't matter in the child's world. It's coercion by ego. We need more teaching and less preaching.

Stephen Fitzpatrick

Terry - just to be clear: we've had over 100 cases of academic dishonesty. There is nothing emotional about that claim. It's real, it's constant, and it's precisely because teachers AREN'T having conversations. I appreciate your experience, but this is definitely our reality. I'm reporting facts.

Terry Underwood

But you are doing more than that. You are generalizing beyond your data. That's the point. It's not just you. We have an epidemic of overgeneralization that is distorting the public debate. I feel compelled to point it out when I see it. I'm working with a raw data set from a survey of teachers in a high school who do NOT see such an increase. Those teachers report far less concern with "cheating" than you are representing here. An alternative explanation is that teachers at the two schools define "cheating" differently. It's such a loaded, vague charge. It sucks the oxygen out of the room. There is zero scholarly treatment of the issue. It's all bluster and anecdote. I don't doubt your "100 cases." I can't examine the claim because I have no definition of "cheating" at your school, and cheating is not a simple gerund like swimming or chewing. Until there is a pause in this rhetoric we have a serious problem facing the institution. I've stopped reading the ranting and raving. It is personal and situated and therefore useless in conceptualizing a big picture. Your reality, my reality—they are microscopic fragments of the big picture. Teachers don't acknowledge any limitations to their generalizations in these media-driven "debates."

Stephen Fitzpatrick

I would dispute that I am generalizing beyond the data. The uptick in unauthorized student AI use over the past 9 months is well documented. I am plugged into a significant community of people in education - students, teachers, parents - and I can assure you, at least here on the East Coast, I am not overgeneralizing. But my larger point, and the reason I don't think you can afford to ignore stories like the NY Mag article, is that this IS the narrative being driven and pushed by the mainstream media, whether it's true or fair or overhyped. It's the narrative the larger public sees when they hear about AI in schools and the way in which they talk about it. What I am advocating for is a counter-narrative that makes its way into legacy media sources. I've seen firsthand how people who advocate for AI integration or AI fluency or AI experimentation get treated, and it's a hard sell. You've been in enough faculty meetings to know that the one person who references the "cheating" article dominates the conversation, and those who want to discuss the issue with more purpose and nuance are sidelined or silenced. Throw in the fact that most administrators either don't understand or don't use the technology and you do not have a recipe for thoughtful discussion. I loved your post about cheating and I would love for my colleagues to read it, though I'm not sure they would agree. I do realize ours is one school, but I have talked to enough people from different places to know it is not atypical. I would be willing to bet your school is more the outlier but, again, as you suggest, it's mostly irrelevant. I don't see a pause in the rhetoric anytime soon until we get a Terry Underwood post in the NY Times / WaPo / WSJ Op-Ed! How do you get the voices on Substack who are more thoughtful about this issue in front of the people who need to read it? That's why I championed the New Yorker piece - unfortunately, visibility matters.

Terry Underwood

Steve, you are offering data from your school along with assurances that schools on the East Coast are telling similar tales. Having been trained in educational research and having taught researchers at the doctoral level, I'm not in the habit of accepting assertions without the receipts. You wouldn't get past peer review in a mid-level journal with such evidence. I'm not suggesting you are misrepresenting your experience; I am suggesting it isn't compelling evidence. You lose credibility among academics by arguing such points as "reality" and "fact." I'm also not willing to waste my time trying to influence those who are happy with their idiosyncratic impressions. I'm serious as a heart attack when it comes to relying on credible data collected and analyzed reliably and with construct validity. Cheating in this rhetoric has no construct validity. My goal is to support future children as long as the good Lord lets me. To do this I must remember why I came to the dance: to contribute knowledge to literacy pedagogy. Are your teachers unwilling to deal with research?

Stephen Fitzpatrick

Annette Vee's recent post provides all the "research" I need to confirm what I already know through my own eyes and ears.

https://aiandhowweteach.substack.com/p/how-are-students-using-ai?utm_source=post-email-title&publication_id=3346862&post_id=162547230&utm_campaign=email-post-title&isFreemail=true&r=eha4u&triedRedirect=true&utm_medium=email

It's abundantly clear that massive numbers of students are using AI. Of course the "cheating" question is in the eye of the beholder, but my guess is most teachers would see much of what students are doing with AI as falling into the "cheating" bucket. You and I are obviously on the same side - we want to see more teachers engage with this technology more productively and less reactively. My main stance is that I wish it were happening more and that there were more institutional support (and yes, this is a particular individual frustration of mine) to do so, and I am just not seeing it at scale yet. I am more impatient than you. I have a 5th grader and want to see a little more thoughtfulness around the question.

Terry Underwood

Steve, please read more carefully. Your bias is showing more than your assertion of evidence. I really can't say any more about the importance of construct validity in surveys. Cheating has no construct validity. It's like asking "Have you ever fallen in love?" Note this hedge from Annette: "Studies vary widely on their measures of student AI use, but it's highly prevalent and increasing dramatically. And we're probably not seeing the extent of it." You could show me survey results until the cows come home. I need to see the items in the survey and how the construct the survey is measuring is defined. Dare I suggest you read the post I just put up?

Stephen Fitzpatrick

With due respect, spend a month in our school (and many others like it) and you’ll spot the problems. Culture is not easy to change, especially when the powers that be think it’s working. Fortunately, I can swim against the stream, but it’s a lonely place. But kids are cheating. Big time. And that’s what’s driving the conversation despite my efforts to change the focus. I can’t even get us to have a conversation at the faculty / department level, let alone offer ways for teachers to engage with students. Sorry, but that’s the reality. I wish it wasn’t and maybe it will get better but I am beyond frustrated.

Terry Underwood

I've taught enough to know how the banking concept of learning works, and I have worked in research. Statements like "Cheating is happening big time and that's the reality" are emotional, Steve. That's your reality. Suggest that teachers talk with their students. Let that become the reality. You can't single-handedly change culture. But all of this unsubstantiated and theoretically thin talk about cheating is an echo chamber. That's why it seems so real. Btw, I am co-teaching a course for high school seniors as we speak. I've been in high schools. Try not to rest your argument on a specious assumption about my experience. Frustration is one thing. Take a deep breath and examine your assumptions. Things aren't as binary as many teachers seem to think.

Ryan Bromley

Thanks for sharing, Terry. This is an issue that I think about a lot. In teaching my students to write with AI, I'm always looking for the fulcrum where student voice, knowledge, and process strengthen ability and individual voice rather than diminish it. I work them through a three-part process:

1. I have the students wireframe an outline with AI. This is exploratory and messy. 'What do I really want to say about this? What are all the arguments and critical issues? What is my focus/scope? What is my central thesis or question?' AI structures this into an outline, maximising for flow, cohesion, arc, brevity, etc.

2. Each element of the outline is then broken into questions and writing prompts and assigned a word count. More questions tend to be better, and the AI is asked to pose only one at a time. Students respond to the outline questions as in an interview - extracting existing knowledge rather than filling in gaps. They shouldn't worry if they go over the word count; it's only an estimate to achieve a target length, and it can always be edited later. This process maintains student voice and agency in the thinking/drafting process.

3. The AI is instructed not to change anything in the canvas until approved. It is told to be reductive, not additive; to correct errors but preserve student writing; and not to change meaning or intention. It should evaluate for clarity, brevity, repetition, etc. The AI is asked to add comments in the canvas, then go through them one at a time. Students should make the corrections themselves wherever possible. They want the AI to take what's there and make it shine, but not to change it into its own voice. Prompting is important here.

I've found this process to be an effective balance between student writing and AI facilitation. Vigilance is required, as AI always wants to do more than it's asked to do. It will change phrasing, word choice, and tone in subtle ways that matter. I teach my students to protect their writing against AI overreach. By having them assess and edit section by section, protecting but improving, students strengthen their cognitive processing.
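
If anyone wants to try something like step 3 outside the chat canvas, here is a minimal sketch using the OpenAI Python client. The model name, the prompt wording, and the comment_on_draft helper are illustrative stand-ins of my own, not the exact setup I use with students; the point is simply to encode the "reductive, not additive" constraint up front.

```python
# Illustrative sketch only: model name, prompt wording, and helper are stand-ins,
# not the exact canvas setup described above.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

EDITOR_RULES = (
    "You are a reductive editor, not a co-writer. "
    "Do not add new sentences or ideas. "
    "Correct errors but preserve the student's wording, meaning, and intention. "
    "Evaluate for clarity, brevity, and repetition. "
    "Return numbered comments only; do not rewrite the text."
)

def comment_on_draft(student_text: str) -> str:
    """Ask the model for comments on a draft section without letting it rewrite."""
    response = client.chat.completions.create(
        model="gpt-4o",  # assumption; any chat model could be substituted
        messages=[
            {"role": "system", "content": EDITOR_RULES},
            {"role": "user", "content": student_text},
        ],
    )
    return response.choices[0].message.content

# Students paste one section at a time, read the comments, and make the
# corrections themselves, e.g.:
# print(comment_on_draft("My draft paragraph goes here..."))
```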

I thought I'd share my process, as it may be interesting for you. It seems we're playing the same game in slightly different ways.

Stephen Fitzpatrick

This seems like a good strategy for older, more advanced writers. I agree that part of my problem when trying to have students use AI for either feedback or structure in their writing process is that it always wants to do more, as you also observe. I have dozens of prompts I share with students - almost all explicitly state not to rewrite, but invariably it will offer alternate suggestions and revisions. This is not fatal, but the students I teach - 10th, 11th, and even HS seniors - tend to see these suggestions as gospel and have not fully developed the ability to impose their own view. All of this said, I have had some successes utilizing AI to help with feedback on structural and evidentiary problems within a student research paper. It's a lot of trial and error trying to figure out how to leverage the things AI is really good at (outlining would be one) and teaching students how to navigate the aspects of AI output that they need to challenge. To ask most teachers to take valuable class time to experiment, especially when most are not especially well versed in the models, is practically a non-starter.

Terry Underwood

Steve, I’d ask teachers who can't share metacognition with students to start by talking with their students. They’re not going to go from zero to sixty in five seconds. Ryan is a perfect example of a teacher who is teaching self-regulation, not tasks. Re: your students who haven’t developed enough to know their own mind—that’s the fault of schooling. They’ve been micromanaged and graded so long they have a hard time breaking out of it when their teachers won’t let them. Teachers need to reach the point where they understand their students well enough before they even start to experiment with AI.

Ryan Bromley

I agree. The overreach is an issue. So is the fact that most teachers don't take the time to experiment. Another issue is the pace of change; models update and change in unexpected ways, so what worked six weeks ago may not work the same way tomorrow. For example, the sycophantic nature of 4o creates confirmation bias that my students (also high school) accept and even enjoy.

Much of the conversation about AI has been about IF it should be applied in the classroom. This is a pointless conversation for me (with a few exceptions). Instead, the HOW is very interesting. Figuring out the how is exciting because it drives us closer to understanding the skill of leveraging human potential while nurturing it. This is transformational for education, and the arena has shifted from research departments to applied work in classrooms. It's an exciting time to be alive.

Stephen Fitzpatrick

You and I and others seem to be the outliers - most of my colleagues (and most educators I know - this is my 30th year teaching) are still very much in the IF camp, and many are simply in denial. I write a bit about the models and gave an AI PD day presentation to our faculty where I tried to explain that AI was way more than just ChatGPT. It's not just the models but the other tools available - as a humanities teacher, I haven't had much experience with coding, but using Claude and agentic AIs like Manus, I've been able to build websites with online teaching modules, create entire interactive textbooks from scratch, and just push the envelope with what's possible. I also teach an independent research class, and the conversation around "cheating" and research strikes me as absurd. Deep Research models and platforms like Elicit are simply going to be the norm. To suggest kids should not learn how to use these makes no sense to me. Anyway, I'm glad someone else is experimenting!

Ryan Bromley

That's great to hear. I couldn't agree more. The whole 'cheating' conversation is a lazy way of pointing the finger at students instead of at schools and teachers. Very soon, teachers who don't know how to effectively utilize AI for accelerated learning are going to be relics of a past system.

I've created a new course for next year that places AI at the centre of the classroom. If AI can leverage ability by 10/100/1000x, then the challenges we've been assigning students need to grow by the same measure. The aim of the course is to see how far students can reach in collaboration with AI. Students apply by proposing a challenge they would like to try to achieve (e.g., write a finished novel in a school year, learn a language, complete an engineering project or a coding project, etc.). I guide them through the process and provide Critical AI Skills workshops. I've developed a really cool automated assessment tool that provides feedback loops. Can't wait to get started.

In my EdTech lab, students use AI to support their student-directed start-up. They're also working on building their own LLM on a local server.

All exciting stuff. I can't imagine ever teaching a standardised curriculum again!

Terry Underwood

Incredibly cool, Ryan. I’d LOVE to be a kid in your class!
