
Photo courtesy of Dan Nataf
In a survey of more than 300 students, 74% said they believe they are not violating AACC's academic integrity policies.
AACC students who admitted in a survey that they use artificial intelligence apps to help them with their schoolwork said they are “very confident” that they are not violating the college’s academic integrity policies.
In a March survey of more than 300 students by AACC’s Center for the Study of Local Issues, students said their most frequent uses of AI are to brainstorm ideas and to check their grammar and spelling.
Political science professor Dan Nataf, the center’s director, said a surprising finding of the survey is that 33% of the students said they have never used AI tools, like ChatGPT, to help them with homework. More than 50% said they find AI helpful with schoolwork, while 12% said it is not useful.
“My take is that we’re still in an early spot in students’ appreciation for what AI can and can’t do or shouldn’t do,” Nataf said. “They’re kind of groping, trying to figure out how much they should use it without either on the one hand, violating policies about academic integrity, especially plagiarism, but on the other hand, using it … so that they can bounce ideas off of it.”
Still, in an optional comments section, some students said using the tool in that way could cheat them out of their education.
“I believe that the whole point of higher education places such as AACC are there to help an individual to learn how to think on their own as well as teaching them the tools to document those thoughts,” one student wrote in the anonymous comments section. “By introducing AI into assignments for higher education institutions, I believe that is defeating their purpose.”
Another student agreed, noting, “At this level of education, AI seems to undermine the intellectual and creative process.”
Other students in the poll said they ask AI for help understanding concepts and textbook readings, and to check their work or create outlines for essays and presentations.
This article has been updated.