Artificial intelligence raises plagiarism concerns

Some students are using artificial intelligence tools like ChatGPT to complete their assignments. (Pixabay photo)

Sam Gauntt, Managing Editor

Students who use artificial intelligence tools to write their papers or answer exam questions are guilty of plagiarism, AACC faculty members said in January. 

According to Wayne Kobylinski, the academic chair of the English department, faculty flagged at least two students they believe used AI text generators for assignments on the first day of the semester.

“Yes, it took one day this semester before I heard of an instance,” Kobylinski said. 

The professors were tipped off when they saw discussion posts by two different students that were “virtually identical,” Kobylinski added. 

AI text generators, such as ChatGPT or Copy.ai, can answer prompts given to them—even writing essays or completing assignments. 

“The use of AI text generators, like ChatGPT, without attribution is no different than any other kind of plagiarism,” Kobylinski said. “Presenting somebody else’s words and ideas as your own is considered academic dishonesty. So those are the policies, those are the guidelines, that’s the way it’s going to be treated. So tread lightly.”

Common penalties for plagiarism include a zero on the assignment, a required visit to the assistant dean’s office or, after multiple academic violations, expulsion.

English professor Suzanne Spoor said she initially worried about students using AI tools in her class, but added that most students still want to do their own work.

“Most of my students really want to find their own voices, learn, explore their own ideas,” Spoor said. “And so I don’t need to panic, because they’re the same students, right. They’re still going to want to do all of those things.”

Photography professor Christiana Caro, who worked at Google from 2016-18 on AI camera technology, said her impression of ChatGPT “is that it’s really good, which is interesting because it’s super problematic. … Because the pursuit of knowledge is meant to teach us how to do a thing, not how to get a thing to do a thing for us.”

Caro, who also teaches photography at Johns Hopkins University, added: “But at the same time, it’s like if the thing is so good, and it writes the paper, you could, like, ostensibly learn from that as well by looking at the way that the AI translates, kind of, an idea or a synthesis of ideas.”

First-year transfer studies student Gabriel Henstrand said the use of AI for college assignments “flat out … is just academic dishonesty.”

“I’m sure there are ways to go about, I guess, using the AI to help you, or help someone write a paper,” Henstrand said. “But at the end of the day … I think it should be the student’s own research that’s conducted in their own writing. It’s really the only way that, like, someone can actually turn in something that’s, you know, genuine.”

Second-year transfer studies student John Finn agreed. 

“I mean, I think it is just cheating to some extent,” Finn said. “If it’s not writing the actual paper, like, you can get ideas and things from it, and that’s kind of fine. But you do have to do your own work after that.”

Spoor said she ran assignments through ChatGPT to see what the program could generate. 

“I said, ‘Compare these two poems,’ and I tried to pick obscure poets that almost nobody writes about,” Spoor said. “And they did a good job with topic sentences. … But … none of the quotes were from the actual poems, for example, but it just acted like it did. … I think it’s really scary for misinformation.”

Henstrand said he does not “see it getting to a point where it’s like more students are using AI than not.”

Spoor said professors could potentially use AI as a teaching aid in the classroom.

“I’m curious, maybe it is good to let the ChatGPT write an essay so you can see the good topic sentences,” Spoor said. “I’m kind of open to that because I’ve seen students struggle. And then if they had topic sentences, they could write a better essay, you know what I mean? So at the moment, I’d rather have them not use the tool, unless we’re all aware that they’re using it.”