May 14, 2025
I Thought ChatGPT Was Killing My Students' Skills. It's Killing Something More Important Than That.
This essay was adapted from Phil Christman's newsletter, the Tourist.
Before 2023, my teaching year used to follow a predictable emotional arc. In September, I was always excited, not only about meeting a new crop of first-year writing students but even about the prep work. My lesson-planning sessions would take longer than intended and yet leave me feeling energized. I'd look forward to conference week—the one-on-one meetings I try to hold with every student, every term, at least once—and even to the first stack of papers. In October, predictably, I'd crash a little bit, but by late November, I'd be seeing evidence that even some of my least enthusiastic students were beginning to take in everything I'd been trying to tell them. By the time classes ended, I'd be loving everything about my job again, especially in the years when kids would stay behind on the last day to shake my hand and say thank you, or write me a note. The second semester would go roughly the same way. The exhaustion would hit a little earlier, which made the recovery all the sweeter.
The funny thing about this cycle is that I would forget, every time, that it was a cycle. In October, in March, I would genuinely believe that I had never had a group of students who had let me down like this before, and that I had never let myself and a group of students down to this extent before. The crash was new each time. Oh, sure, I thought, a year ago I kind of felt this way, but this time I have solid reasons—last year's solid reasons having already evaporated from my memory. The intensity of teaching brings a certain amnesia with it, like marathoning and—I am told—childbirth. I only know I go through this cycle because my wife watches me go through it every year, and reminds me. She remembers last year's solid reasons even if I don't.
Since the 2022–23 school year, when ChatGPT hit the scene (built first on GPT-3.5, then on GPT-4), this cycle has had a new component. About a week or so after the end-of-semester Good Feelings Era, I read the latest big journalistic exposé about the ubiquity of college-level Chat-GP-Cheat and start wondering whether anyone learned anything. As I end yet another semester, I have my pick of such articles, whether it's this ambivalent longer view from the New Yorker or this rather sensational on-the-ground exposé from New York magazine. The latter article begins by introducing us to a student named Lee (not his real name):
Lee was born in South Korea and grew up outside Atlanta, where his parents run a college-prep consulting business. … When he started at Columbia as a sophomore this past September, he didn't worry much about academics or his GPA. 'Most assignments in college are not relevant,' he told me. 'They're hackable by AI, and I just had no interest in doing them.' While other new students fretted over the university's rigorous core curriculum … Lee used AI to breeze through with minimal effort. When I asked him why he had gone through so much trouble to get to an Ivy League university only to off-load all of the learning to a robot, he said, 'It's the best place to meet your co-founder and your wife.'
'The best place to meet your co-founder and your wife'! Only well-off people expect life to be this predictable; everybody else knows better. In fact, there are, if we have eyes to see them, many reasons in this early paragraph not to panic, not to feel that anything has fundamentally changed. Lee's parents, we're told, run a test-prep company, which means that he's part of a large, existing system that already treats education as a series of robotic steps even as it pretends to value students and learning. Well, any longtime writing instructor knows that there's no real way to stop a determined rich kid from cheating their way through a writing class. If nothing else, they can always afford to pay someone to write a paper for them—and even if you think you've attuned your paper requirements so thoroughly to your assigned readings and class discussions that a bought paper will fail your rubric, they can probably always pay someone enough to fake that. For ye have the rich always with you. Lee is almost charmingly brazen in his lack of integrity, almost innocent in his seeming ignorance of the possibility of having it. After he gets hauled into Columbia's honor court because he built a business helping other kids cheat their way through remote interviews, his story concludes thus: 'Lee thought it absurd that Columbia, which had a partnership with ChatGPT's parent company, OpenAI, would punish him for innovating with AI.' There's hope for Lee yet. Though maybe not for Columbia University, governed as it is by people who aren't even capable of this insight.
Lee is a problem, but a problem of a sort that I'm familiar with. It's a student like Wendy who makes me panic:
I asked Wendy if I could read the paper she turned in, and when I opened the document, I was surprised to see the topic: critical pedagogy, the philosophy of education pioneered by Paulo Freire. The philosophy examines the influence of social and political forces on learning and classroom dynamics. Her opening line: 'To what extent is schooling hindering students' cognitive ability to think critically?' Later, I asked Wendy if she recognized the irony in using AI to write not just a paper on critical pedagogy but one that argues learning is what 'makes us truly human.' She wasn't sure what to make of the question.
What most worries me about this anecdote—which is perfect in its thematically fractal quality, with the first sentence of Wendy's ersatz essay embodying the intellectual decline that 'her' essay ostensibly describes—is that I'd be reasonably happy if a first-year student turned in something like this. It doesn't have that ChatGPT stiffness that I've learned to look for, and unlike a lot of (fake and real) essays that I always end up tearing apart in the comments, it immediately zeroes in on a single point, rather than messing about with the three-pronged '[Writer] does [X] by doing [thing, thing, and thing]' format that Advanced Placement classes have cursed us with, and that I spend weeks deprogramming my students out of. I would maybe cut 'cognitive' out of the sentence, but it's otherwise unobjectionable. If this is what cheating now looks like, I not only don't know how I'm supposed to tell if my students are cheating—I don't even know how I can be sure they wrote the thank-yous that mean so much to me. ChatGPT, in giving my students an alternative to skill-building, hurts their ability to learn, but more than that, it kills the trust that any teaching relationship depends on.
Or perhaps it simply reminds us that that trust has always been a precious, much-abused thing. If I feel that my job now requires me to make judgments that are basically impossible—to tell an orderly, slightly stiff, reasonably good paper arrived at through hours of frustration from one arrived at through a minute's prompting and half an hour of light editing, for example—the job of my students has always been likewise impossible. There I am, demanding that they practice the extreme vulnerability of young adults learning in public, asking them to commit themselves to the study of things such as reading and writing that I consider to be living processes, open-ended and unmasterable. And there the surrounding society is—their justifiably anxious and perhaps indebted parents, who want them to be successful and happy; the corporate donors and partners that prestigious schools openly court and who want them to be productive and docile employees. What they want are people who have mastered various discrete and somewhat mechanized sets of skills. All of us insist on the life-and-death importance of a thing called 'education' while not remotely agreeing on what that thing is. And then there are the demands of their own big, half-formed, restless selves to consider too. What should we expect, but that they should take every shortcut in their doomed efforts to placate everyone? We asked them to work hard, but forgot to give them a consistent answer as to why. No wonder they cheat—they must already feel so cheated.