
ChatGPT is going to change education, not destroy it

The narrative around cheating students doesn’t tell the whole story. Meet the teachers who think generative AI could actually make learning better.

A ballpoint-pen doodle of the OpenAI logo sits at the center of a page from a lined school notebook, surrounded by other doodles.
Selman Design

The response from schools and universities was swift and decisive.

Just days after OpenAI dropped ChatGPT in late November 2022, the chatbot was widely denounced as a free essay-writing, test-taking tool that made it laughably easy to cheat on assignments.

Los Angeles Unified, the second-largest school district in the US, immediately blocked access to OpenAI’s website from its schools’ network. Others soon joined. By January, school districts across the English-speaking world had started banning the software, from Washington, New York, Alabama, and Virginia in the United States to Queensland and New South Wales in Australia.

Several leading universities in the UK, including Imperial College London and the University of Cambridge, issued statements that warned students against using ChatGPT to cheat. 

“While the tool may be able to provide quick and easy answers to questions, it does not build critical-thinking and problem-solving skills, which are essential for academic and lifelong success,” Jenna Lyle, a spokeswoman for the New York City Department of Education, told the Washington Post in early January.

This initial panic from the education sector was understandable. ChatGPT, available to the public via a web app, can answer questions and generate slick, well-structured blocks of text several thousand words long on almost any topic it is asked about, from string theory to Shakespeare. Each essay it produces is unique, even when it is given the same prompt again, and its authorship is (practically) impossible to spot. It looked as if ChatGPT would undermine the way we test what students have learned, a cornerstone of education.

But three months on, the outlook is a lot less bleak. I spoke to a number of teachers and other educators who are now reevaluating what chatbots like ChatGPT mean for how we teach our kids. Far from being just a dream machine for cheaters, many teachers now believe, ChatGPT could actually help make education better.

Advanced chatbots could be used as powerful classroom aids that make lessons more interactive, teach students media literacy, generate personalized lesson plans, save teachers time on admin, and more.

Educational-tech companies including Duolingo and Quizlet, which makes digital flash cards and practice assessments used by half of all high school students in the US, have already integrated OpenAI’s chatbot into their apps. And OpenAI has worked with educators to put together a fact sheet about ChatGPT’s potential impact in schools. The company says it also consulted educators when it developed a free tool to spot text written by a chatbot (though its accuracy is limited). 

“We believe that educational policy experts should decide what works best for their districts and schools when it comes to the use of new technology,” says Niko Felix, a spokesperson for OpenAI. “We are engaging with educators across the country to inform them of ChatGPT’s capabilities. This is an important conversation to have so that they are aware of the potential benefits and misuse of AI, and so they understand how they might apply it to their classrooms.”

But it will take time and resources for educators to innovate in this way. Many are too overworked, under-resourced, and beholden to strict performance metrics to take advantage of any opportunities that chatbots may present. 

It is far too soon to say what the lasting impact of ChatGPT will be—it hasn’t even been around for a full semester. What’s certain is that essay-writing chatbots are here to stay. And they will only get better at standing in for a student on deadline—more accurate and harder to detect. Banning them is futile, possibly even counterproductive. “We need to be asking what we need to do to prepare young people—learners—for a future world that’s not that far in the future,” says Richard Culatta, CEO of the International Society for Technology in Education (ISTE), a nonprofit that advocates for the use of technology in teaching.

Tech’s ability to revolutionize schools has been overhyped in the past, and it’s easy to get caught up in the excitement around ChatGPT’s transformative potential. But this feels bigger: AI will be in the classroom one way or another. It’s vital that we get it right. 

From ABC to GPT

Much of the early hype around ChatGPT was based on how good it is at test taking. In fact, this was a key point OpenAI touted when it rolled out GPT-4, the latest version of the large language model that powers the chatbot, in March. It could pass the bar exam! It scored a 1410 on the SAT! It aced the AP tests for biology, art history, environmental science, macroeconomics, psychology, US history, and more. Whew!

It’s little wonder that some school districts totally freaked out.

Yet in hindsight, the immediate calls to ban ChatGPT in schools were a dumb reaction to some very smart software. “People panicked,” says Jessica Stansbury, director of teaching and learning excellence at the University of Baltimore. “We had the wrong conversations instead of thinking, ‘Okay, it’s here. How can we use it?’”

“It was a storm in a teacup,” says David Smith, a professor of bioscience education at Sheffield Hallam University in the UK. Far from using the chatbot to cheat, Smith says, many of his students hadn’t even heard of the technology until he mentioned it to them: “When I started asking my students about it, they were like, ‘Sorry, what?’”

Even so, teachers are right to see the technology as a game changer. Large language models, the technology behind OpenAI’s ChatGPT and its latest upgrade, GPT-4, as well as Google’s Bard and Microsoft’s Bing Chat, are set to have a massive impact on the world. The technology is already being rolled out into consumer and business software. If nothing else, many teachers now recognize that they have an obligation to teach their students about how this new technology works and what it can make possible. “They don’t want it to be vilified,” says Smith. “They want to be taught how to use it.”

Change can be hard. “There’s still some fear,” says Stansbury. “But we do our students a disservice if we get stuck on that fear.”

Stansbury has helped organize workshops at her university to allow faculty and other teaching staff to share their experiences and voice their concerns. She says that some of her colleagues turned up worried about cheating, others about losing their jobs. But talking it out helped. “I think some of the fear that faculty had was because of the media,” she says. “It’s not because of the students.”

In fact, a US survey of 1,002 K–12 teachers and 1,000 students between the ages of 12 and 17, commissioned by the Walton Family Foundation in February, found that more than half the teachers had used ChatGPT—10% of them reported using it every day—but only a third of the students had. Nearly all those who had used it (88% of teachers and 79% of students) said it had a positive impact.

A majority of teachers and students surveyed also agreed with this statement: “ChatGPT is just another example of why we can’t keep doing things the old way for schools in the modern world.”

Helen Crompton, an associate professor of instructional technology at Old Dominion University in Norfolk, Virginia, hopes that chatbots like ChatGPT will make school better.

Many educators think that schools are stuck in a groove, says Crompton, who was a K–12 teacher for 16 years before becoming a researcher. In a system with too much focus on grading and not enough on learning, ChatGPT is forcing a debate that is overdue. “We’ve long wanted to transform education,” she says. “We’ve been talking about it for years.”

Take cheating. In Crompton’s view, if ChatGPT makes it easy to cheat on an assignment, teachers should throw out the assignment rather than ban the chatbot.

We need to change how we assess learning, says Culatta: “Did ChatGPT kill assessments? They were probably already dead, and they’ve been in zombie mode for a long time. What ChatGPT did was call us out on that.”

Critical thinking

Emily Donahoe, a writing tutor and educational developer at the University of Mississippi, has noticed classroom discussions starting to change in the months since ChatGPT’s release. Although she first started to talk to her undergraduate students about the technology out of a sense of duty, she now thinks that ChatGPT could help teachers shift away from an excessive focus on final results. Getting a class to engage with AI and think critically about what it generates could make teaching feel more human, she says, “rather than asking students to write and perform like robots.”

This idea isn’t new. Generations of teachers have subscribed to a framework known as Bloom’s taxonomy, introduced by the educational psychologist Benjamin Bloom in the 1950s, in which basic knowledge of facts is just the bedrock on which other forms of learning, such as analysis and evaluation, sit. Teachers like Donahoe and Crompton think that chatbots could help teach those other skills. 

In the past, Donahoe would set her students writing assignments in which they had to make an argument for something—and grade them on the text they turned in. This semester, she asked her students to use ChatGPT to generate an argument and then had them annotate it according to how effective they thought the argument was for a specific audience. Then they turned in a rewrite based on their criticism.

Breaking down the assignment in this way also helps students focus on specific skills without getting sidetracked. Donahoe found, for example, that using ChatGPT to generate a first draft helped some students stop worrying about the blank page and instead focus on the critical phase of the assignment. “It can help you move beyond particular pain points when those pain points aren’t necessarily part of the learning goals of the assignment,” she says.

Smith, the bioscience professor, is also experimenting with ChatGPT assignments. The hand-wringing around it reminds him of the anxiety many teachers experienced a couple of years ago during the pandemic. With students stuck at home, teachers had to find ways to set assignments where solutions were not too easy to Google. But what he found was that Googling—knowing what to ask for and what to make of the results—was itself a skill worth teaching.

Smith thinks chatbots could be the same way. If his undergraduate students want to use ChatGPT in their written assignments, he will assess the prompt as well as—or even rather than—the essay itself. “Knowing the words to use in a prompt and then understanding the output that comes back is important,” he says. “We need to teach how to do that.”

The new education

These changing attitudes reflect a wider shift in the role that teachers play, says Stansbury. Information that was once dispensed in the classroom is now everywhere: first online, then in chatbots. What educators must now do is show students not only how to find it, but what information to trust and what not to, and how to tell the difference. “Teachers are no longer gatekeepers of information, but facilitators,” she says.

In fact, teachers are finding opportunities in the misinformation and bias that large language models often produce. These shortcomings can kick off productive discussions, says Crompton: “The fact that it’s not perfect is great.”

Teachers are asking students to use ChatGPT to generate text on a topic and then getting them to point out the flaws. In one example that a colleague of Stansbury’s shared at her workshop, students used the bot to generate an essay about the history of the printing press. When its US-centric response included no information about the origins of print in Europe or China, the teacher used that as the starting point for a conversation about bias. “It’s a great way to focus on media literacy,” says Stansbury.

Crompton is working on a study of ways that chatbots can improve teaching. She runs off a list of potential applications she’s excited about, from generating test questions to summarizing information for students with different reading levels to helping with time-consuming administrative tasks such as drafting emails to colleagues and parents.

One of her favorite uses of the technology is to bring more interactivity into the classroom. Teaching methods that get students to be creative, to role-play, or to think critically lead to a deeper kind of learning than rote memorization, she says. ChatGPT can play the role of a debate opponent and generate counterarguments to a student’s positions, for example. By exposing students to an endless supply of opposing viewpoints, chatbots could help them look for weak points in their own thinking. 

Crompton also notes that if English is not a student’s first language, chatbots can be a big help in drafting text or paraphrasing existing documents, doing a lot to level the playing field. Chatbots can also serve students who have specific learning needs. Ask ChatGPT to explain Newton’s laws of motion to a student who learns better with images rather than words, for example, and it will generate an explanation that features balls rolling on a table.

Made-to-measure learning

All students can benefit from personalized teaching materials, says Culatta, because everybody has different learning preferences. Teachers might prepare a few different versions of their teaching materials to cover a range of students’ needs. Culatta thinks that chatbots could generate personalized material for 50 or 100 students and make bespoke tutors the norm. “I think in five years the idea of a tool that gives us information that was written for somebody else is going to feel really strange,” he says.

Some ed-tech companies are already doing this. In March, Quizlet updated its app with a feature called Q-Chat, built using ChatGPT, that tailors material to each user’s needs. The app adjusts the difficulty of the questions according to how well students know the material they’re studying and how they prefer to learn. “Q-Chat provides our students with an experience similar to a one-on-one tutor,” says Quizlet’s CEO, Lex Bayer.

In fact, some educators think future textbooks could be bundled with chatbots trained on their contents. Students would have a conversation with the bot about the book’s contents as well as (or instead of) reading it. The chatbot could generate personalized quizzes to coach students on topics they understand less well.

A chalkboard where the “T” of G-P-T has been erased and replaced with an “A.”
Selman Design

Not all these approaches will be instantly successful, of course. Donahoe and her students came up with guidelines for using ChatGPT together, but “it may be that we get to the end of this class and I think this absolutely did not work,” she says. “This is still an ongoing experiment.”

She has also found that students need considerable support to make sure ChatGPT promotes learning rather than getting in the way of it. Some students find it harder to move beyond the tool’s output and make it their own, she says: “It needs to be a jumping-off point rather than a crutch.”

And, of course, some students will still use ChatGPT to cheat. In fact, it makes it easier than ever. With a deadline looming, who wouldn’t be tempted to get that assignment written at the push of a button? “It equalizes cheating for everyone,” says Crompton. “You don’t have to pay. You don’t have to hack into a school computer.”

Some types of assignments will be harder hit than others, too. ChatGPT is really good at summarizing information. When that is the goal of an assignment, cheating is a legitimate concern, says Donahoe: “It would be virtually indistinguishable from an A answer in that context. It is something we should take seriously.”

None of the educators I spoke to have a fix for that. And not all other fears will be easily allayed. (Donahoe recalls a recent workshop at her university in which faculty were asked what they were planning to do differently after learning about ChatGPT. One faculty member responded: “I think I’ll retire.”)

Nor are teachers as worried as initial reports suggested. Cheating is not a new problem: schools have survived calculators, Google, Wikipedia, essays-for-pay websites, and more.

For now, teachers have been thrown into a radical new experiment. They need support to figure it out—perhaps even government support in the form of money, training, and regulation. But this is not the end of education. It’s a new beginning.

“We have to withhold some of our quick judgment,” says Culatta. “That’s not helpful right now. We need to get comfortable kicking the tires on this thing.”
