
Worried about ChatGPT and cheating? Here are 4 things teachers should know


Close-up of hands on a laptop computer. (Allison Shelley/The Verbatim Agency for EDUimages)

In his university teaching days, Mark Schneider watched as his students’ research sources moved from the library to Wikipedia to Google. With greater access to online information, cheating and plagiarism became easier. So Schneider, who taught at the State University of New York at Stony Brook for 30 years, crafted essay prompts in ways that he hoped would deter copy-paste responses. Even then, he once received a student essay with a bill from a paper-writing company stapled to the back.

Teachers probably spend more time than they’d like trying to thwart students who find creative ways to cheat. And many educators are alarmed that ChatGPT, a new and widely available artificial intelligence (AI) model developed by OpenAI, offers yet another way for students to sidestep assignments. ChatGPT is a large language model that uses machine learning to produce convincingly human-like writing. Because users can input prompts or questions into ChatGPT and get paragraphs of text, it has become a popular way for students to complete essays and research papers.

Some schools have already banned ChatGPT for students. At the same time, some educators are exploring ways to harness the tool for learning. To help educators understand how artificial intelligence might fit into a classroom environment, Schneider, who is now the director of the Institute of Education Sciences (IES), an independent research arm of the U.S. Department of Education, compares it to the invention of the calculator. “For years there was a question about whether or not students should have calculators when they do a math assessment,” he said. “And this happens all over the place: Some new technology comes [and] it’s overwhelming.” 

Eventually educators decided to permit calculators and make test questions more complex instead of constantly having to monitor students’ behavior. Similarly, with ChatGPT, Schneider urges educators to ask themselves, “What do you need to do with this incredibly powerful tool so that it is used in the furtherance of education rather than as a cheat sheet?” In a conversation with MindShift, he addressed teachers’ ChatGPT worries and offered insights on how to ensure students continue to have meaningful learning experiences.

Using ChatGPT to cheat isn’t foolproof

ChatGPT produces essays that are grammatically correct and free of spelling errors in a matter of seconds; however, its information isn’t always factual. ChatGPT provides answers that draw from webpages that may be biased, outdated or incorrect. Schneider described ChatGPT’s output as “semi-reliable.” It has been shown to produce plausible references that are inaccurate and supply convincing answers that are not rooted in science.


“So when people get lazy and [say], ‘Hey, write this thing for me,’ and then take it and use it, there could be errors in it,” said Schneider. This makes it a valuable tool for generating ideas and writing rough drafts, but a risky option for final assignments. Students who decide to use ChatGPT will likely need to double-check that the information it provides is correct, either by knowing the information in the first place or by confirming it with other dependable sources.

ChatGPT can support teachers, not replace them

For some educators, ChatGPT also raises alarm that the widespread adoption of AI could lead to job losses, particularly in areas such as tutoring and teaching languages. Schneider said that’s unlikely. “I can't imagine a school system that has no teachers in it,” he said. Numerous studies show a correlation between strong student-teacher connections and increased student involvement, attendance and academic performance.

As AI tools become more widely used to support teaching and learning, teachers’ roles may change. “Teachers are going to have to evolve and figure out how to harness the power of this tool to improve instruction,” said Schneider. For example, the AI Institute for Transforming Education for Children with Speech and Language Processing Challenges, which was awarded $20 million in funding from IES and the National Science Foundation, is exploring how ChatGPT can support speech pathologists. According to a recent survey by the American Speech-Language-Hearing Association, the median number of students served by one speech pathologist is 48. “There are simply not enough pathologists in schools,” said Schneider. ChatGPT has the potential to help speech pathologists complete paperwork, which takes up almost six hours each week, and build personalized treatment plans for students with cognitive disabilities, such as dyslexia.

“We need to rethink what we can do to free up teachers to do the work that they are really good at and how to help them individualize their interventions and provide instruction and support,” said Schneider.

When you use ChatGPT, your data is not secure

ChatGPT is convincing because it references a massive amount of data and identifies patterns to generate text that seems like it is written by a human. It can even mimic the writing style and tone of the person who uses it. “The more data they have, the better the model,” said Schneider, referring to ChatGPT’s ability to generate responses. “And there's tons of data floating around.” 

The information that users put into ChatGPT to make it generate a response – also known as the input – can take the form of a question, a statement or even a partial text that the user wants ChatGPT to complete. But when students use ChatGPT, they may be putting their data at risk.

According to OpenAI’s privacy policy, inputs – including ones with personal information, such as names, addresses, phone numbers or other sensitive content – may be reviewed and shared with third parties. Also, there is the ever-present risk that if ChatGPT is hacked, a bad actor could access users’ data.

Schneider acknowledged that if ChatGPT will be used to support teaching and learning, privacy is a major concern. “We are developing much better methods for preserving privacy than we have in the past,” he said. “We have to remember it's a bit of a cost analysis. Using all this data has many benefits. It also has some risks. We have to balance those.” He added that ChatGPT is similar to wearing an Apple Watch or talking to an Amazon Alexa, because those tools also rely on data from users.  

Banning ChatGPT isn’t a long-term solution

Because students can input original prompts into ChatGPT and get unique answers, it raises the question: Is using ChatGPT plagiarism? And how much does AI-generated text need to be edited until it is considered a student’s own work? Rather than answer these questions, some schools, including districts in Los Angeles, New York City and Seattle, have opted to ban use of ChatGPT outright.


Schneider conceded that it makes sense for schools and teachers to hold ChatGPT at bay for the rest of the school year so they can take the summer to figure out how to use it next year. For example, ChatGPT can be used to help students outline essays before they write a rough draft longhand. Other teachers have used ChatGPT to suggest classroom activities or generate test questions. But trying to ban it completely won’t work, and it’s an innovation in education that teachers will eventually have to face, Schneider said. “Just like they had to face calculators and computers and laptops and iPhones.”
