Why I Don't Care if Students Use GPT

They can go ahead and use it to cheat on their essays. It won't do them much good.

An Experiment

Here's a recent experience I had with GPT-3.5: I was editing a paper by one of my graduate students, and right from the start, the abstract was very difficult to understand. It started with a slow introductory sentence (a pet peeve of mine) and was riddled with repetition and passive, vague language. By the time I read through it, I wasn't even sure what the main point of the paper was.

So I asked GPT-3.5 to improve the writing.


import openai

# Ask the model to improve the abstract by prepending an instruction.
prompt = 'How would you improve the following academic paper abstract?\n\n'
abstract = '...'  # <imagine some text here>

response = openai.Completion.create(
    model='text-davinci-003',
    prompt=prompt + abstract,
    temperature=0,      # deterministic output
    max_tokens=350)

print(response['choices'][0]['text'])

I am posting the full code above to show how easy GPT-3.5 is to use, and how low the barrier already is for any person or company to offer services based on this technology, assuming, of course, that OpenAI continues to share its model through an API.
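
To make that point concrete, here is a minimal sketch of what an “abstract-improving” web service could look like, wrapping the same call in Flask; the endpoint name and payload shape are my own invention, not any existing product:

import openai
from flask import Flask, request, jsonify

app = Flask(__name__)

# Hypothetical endpoint: POST a JSON body like {"abstract": "..."} and get
# back the model's rewrite. Assumes OPENAI_API_KEY is set in the environment.
@app.route('/improve-abstract', methods=['POST'])
def improve_abstract():
    abstract = request.json['abstract']
    response = openai.Completion.create(
        model='text-davinci-003',
        prompt='How would you improve the following academic paper abstract?\n\n' + abstract,
        temperature=0,
        max_tokens=350)
    return jsonify({'improved': response['choices'][0]['text']})

if __name__ == '__main__':
    app.run()

A couple dozen lines, and you are in business.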

How Did It Do?

The resulting abstract was surprisingly clear and accurate. It retained all the crucial information from the original text, but started with a strong summary and then clearly laid out the argument of the paper. And that's not all. Reading GPT's version of our abstract, I finally understood what the main point of the student's paper was.

I was impressed. It was as if GPT crystallized the student's ideas that were buried in the badly written text, and carved them out of the raw marble with amazing rhetorical clarity.

Good news everyone, right?

[Image: Good News Everyone!]

Not so fast.

Bans, Regulation, and Embrace

Anecdotes like this one, which are flooding blog posts, news articles, and social media accounts, have sparked a swift reaction from pundits to policymakers. The main sentiment seems to agree with the recent Atlantic headline “The College Essay Is Dead”, meaning we can never give out writing assignments to students again. Others have taken the opposite side and imagined a brave new world where everyone can write brilliantly.

Policy reactions in the education space were quick to follow, and can be divided into three categories: outright bans, partial regulation, and, in contrast, wholehearted embrace of the technology.

Reportedly, the New York City education department is one of several organizations that blocked access to the technology in schools, to prevent students from cheating on their writing assignments.

Others, like my own college at Cornell University, have forgone outright bans and instead guide professors to add a note to their syllabus that “submitting work created by ChatGPT, or copied from a bot or a website, as your own work violates Academic Integrity.”

The Cornell approach is good in the sense that it shifts the responsibility to the user of the technology, in this case the student, while setting policy boundaries on the organization's part. That said, it is also problematic because it sets up the use of GPT as a kind of probabilistic gamble for the student. The vagueness of “submitting work [...] as your own” invites boundary questions: Can I use the technology just a little without outright submitting its output? How much text can I take from GPT without getting caught? What use case could I defend in an academic integrity hearing? And so on.

In response to these bans and warnings, a whole movement of opinion makers and instructors counters that the GPT ship has sailed. We can't prevent students from using it, and therefore we should not only accept this fact but embrace it. Some instructors are designing assignments in which students are not only allowed but required to use ChatGPT, and many argue that its potential to teach outweighs its potential downsides.

There is even a trending hype around “prompt engineering” for ChatGPT, as if there were some sophisticated theory and practice behind trial-and-error chatting with an AI bot. Call me back when they open a “Prompt Engineering” department at a major university.
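
For the record, here is roughly what that practice amounts to in code, a toy sketch using the same legacy API as above; the prompt variants are made up for illustration:

import openai

abstract = '...'  # <imagine some text here>

# "Prompt engineering": try a few phrasings and eyeball the results.
variants = [
    'Improve the following abstract:',
    'Rewrite the following abstract so it is clear and concise:',
    'You are a veteran journal editor. Improve this abstract:',
]

for prompt in variants:
    response = openai.Completion.create(
        model='text-davinci-003',
        prompt=prompt + '\n\n' + abstract,
        temperature=0,
        max_tokens=350)
    print(prompt)
    print(response['choices'][0]['text'])
    print('---')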

New Proposal: Shoulder-Shrugging

Amidst this flurry of banning, regulating, and embracing, not many are proposing the most appropriate reaction: shoulder-shrugging along with a good dose of mentorship for students.

First of all, I agree with those who claim that bans and regulations are not effective. There is no point in starting an arms race between students, professors, cheating-detectors, cheating-detector-circumventors, digital watermarking, watermark-erasers, and so forth. It is perhaps a good jobs program and can be a boost for the economy, but I see it as a huge waste of energy on everyone's part.

But that doesn't mean I am excited about students using it for writing assignments.

Instead, I propose a radical approach: Teachers shouldn't worry very much about algorithms that can write, and instead spend a significant amount of time explaining to students why they do not actually want to use GPT to write their essay for them.

One reason is that there are many ethical issues with large language models.

But there is also WestWorld.

[Image: WestWorld narrative start]

The WestWorld Metaphor

Since WestWorld made a resurgence as an HBO series, I have been telling my students that one of the most important things they need to understand in college is that they are guests in a WestWorld theme park. It will change the way they study.

For those who have not seen the film or series, here is the basic premise: Guests pay a lot of money to spend a week in a huge artificially constructed world populated by hyperrealistic robot “hosts”. Gradually, guests are exposed to a number of carefully crafted challenges, called “narratives”, which they have to conquer with lots of hard work and cunning, not to mention a good amount of sexual and physical violence. Once their week is over, the robot hosts do a movie-set-style scene reset and freeze at their narrative starting points, awaiting the next round of visitors. These, in turn, are put on the same quests, with the same challenges, and the same reanimated robot hosts.

You can see where this is going.

Students spend their years in school under a false perception that we professors ask them for pieces of work, which they provide to us. At the day-to-day micro level, it seems that instructors are the customers and students the service providers. We (the faculty) give students specifications for things we want and then they work hard to solve problems we have posed for them in the best way possible. It is as if we are renovating our kitchen and our students are cabinet makers.

This causes a host of confusing situations. First of all, if students are doing all the work, why are they paying us for it? Second, if we are the clients of student homework, why do they get to complain about us not being happy with the results? And so on.

All of this is less confusing once students realize that they are the clients of this strange reverse interaction. In fact, they are guests in an academic version of WestWorld.

Every semester, I spend part of my last lecture letting students in on the secret that we professors are nothing more than very realistic-looking robots who pretend to need them to solve problems we secretly already have the solution for. We pose questions and ask for their best work, but in reality we already have the answers and have done the work ourselves. And the whole thing is constructed in semester-long narrative arcs that do a complete scene reset once the quests are completed, and a new train full of guests arrives at the exact same narrative point where the last one was a year ago.

What Students Really Want

WestWorld sells the enactment of violent fantasies. Given that, for the most part, we do not allow students to play out their violent fantasies on the faculty, what is the WestWorld experience that students pay for, with their money, their effort, and their time?

The answer is perhaps obvious: learning. Amidst the daily discussion about assignments, projects, grades, and credits, students can easily forget that the only useful outcome of all the hard work that goes into a semester (theirs and ours) is knowledge gained by students.

At WestWorld, guests really feel like they are solving life-or-death problems. In reality, they are merely collecting high-end vacation experiences. Similarly, and this is hard to remember when you are stuck on a problem set: You are not trying to solve a differential equation; you are trying to learn.

This gap is even more painful when it comes to creative projects and writing assignments. The elaborate student design project that I, as a professor, have been obsessively dissecting, critiquing, and guiding for weeks, is not important to me at all. It doesn't matter to me or, in fact, to anyone else. I don't care if it works or not, how accurate it is, or how clever the solution was, even though that's all we seem to talk about during the semester. Once the result is perfect, it can be thrown in the trash. The only thing that matters is the difference in student knowledge before and after having done the project. It is a somewhat paradoxical situation, because both instructors and students need to care deeply about something that is essentially worthless.

[Image: Destruction of a Sand Mandala]

The Return of Craft Writing

Which brings me back to GPT and other AI systems that write for you.

After reading my introductory story, one might conclude that we should embrace the support that GPT-3.5 gave the authors of the badly written abstract. In this view, the ideas were already in the student's head; they just had a technical problem of translating those ideas into readable text. “It's like a calculator” is a common refrain I hear about ChatGPT. As if the idea is what matters, and writing it down is just a necessary evil or technical chore that needs to be done by someone or somecode.

But anyone who writes for a living knows that in many ways writing is thinking. The process of translating vague ideas into a coherent text helps structure ideas and make connections. The time spent editing and re-editing weeds out important ideas from marginal ones. The effort to address an imaginary reader, to clarify things to them, helps eliminate unnecessary style decisions. Finding your own voice helps you understand yourself and your contribution to the world better.

Letting an AI system do this work for you means giving up all of that. It's like sending a robot to do your WestWorld vacation for you, and just sharing the photos it took on your Instagram feed. Behaving in this way is not about cheating; it is about missing the whole point. If you care about having clear ideas and becoming better at what you do, you want to be the one writing.

That's why I think the appearance of hyper-realistic text generation not only does not spell the death of the essay, but may actually usher in a renaissance of appreciation for good, slow writing. The existence of computer-generated writing can take the focus off the end result and re-center students on the fact that writing is a craft whose value lies in the process, not just a means to an outcome. Students will finally realize that nobody needs their homework essays but them.

It reminds me of the appearance of perfect mechanical image reproduction by means of photography in the 19th century. This invention did not spell the end of painting, but rather the start of an explosion of painting genres. I recently speculated that, similarly, AI-generated art that can mimic any existing visual genre might bring on a renaissance of detailed realistic hand painting. Perhaps the next generation of artists will rediscover the usefulness of learning to capture light with brush strokes. GPT and its ilk could similarly bring on a return of craft writing. Writing without any computational help, perhaps even without spell checkers and grammar suggestion engines, might make a comeback as a tool for sharpening one's thinking.

But I Just Want to Get a Good Job

Cynics will accuse me of naïveté. What students really want, they say, is to get good grades, be done, pad their CVs, have nice transcripts, and get the best jobs they can. Agreed, some students will do just that. I realize everyone is trying to optimize their own objective function, and I have nothing against anyone's priorities.

Let them use GPT. It doesn't bother me.

Our role as educators is to remind our students of WestWorld; to ask them to step out of the client-supplier narrative; to remind them that they are doing this only for themselves, and that even the best job in the world won't make them feel accomplished or give them the same satisfaction as having clear thoughts, ideas, and opinions. At the end of the day, you still have to go to sleep with your own thoughts, ideas, and opinions. Unless, of course, you fall asleep in the Metaverse.

[Image: Sleeping in the Metaverse]