Although Daily acknowledges that this technological growth incites new concerns in the world of academia, she doesn’t find it to be a realm entirely unexplored. “I think we’ve been in a version of this territory for a while already,” Daily says. “Students who commit plagiarism often borrow material from a ‘somewhere’—a website, for example, that doesn’t have clear authorial attribution. I suspect the definition of plagiarism will expand to include things that bots produce.”
Eventually, Daily believes, a student who uses text from ChatGPT will be seen as no different from one who copies and pastes chunks of text from Wikipedia without attribution.
Students’ views on ChatGPT are another issue entirely. There are those, like Cobbs, who can’t imagine putting their name on anything bot-generated, but there are others who see it as just another tool, like spellcheck or even a calculator. For Brown University sophomore Jacob Gelman, ChatGPT exists merely as a convenient research assistant and nothing more.
“Calling the use of ChatGPT to pull reliable sources from the internet ‘cheating’ is absurd. It’s like saying using the internet to conduct research is unethical,” Gelman says. “To me, ChatGPT is the research equivalent of [typing assistant] Grammarly. I use it out of practicality and that’s really all.” Cobbs expressed a similar sentiment, comparing the AI bot to “an online encyclopedia.”
But while students like Gelman use the bot to speed up research, others take advantage of the high-capacity prompt input feature to generate completed works for submission. It might seem obvious what qualifies as cheating here, but different schools across the country offer contrasting takes.
According to Carlee Warfield, chair of Bryn Mawr College’s Student Honor Board, the school considers any use of these AI platforms plagiarism. The tool’s popularization simply calls for greater focus on evaluating the intent behind students’ violations. Warfield explains that students who turn in essays entirely produced by AI are categorically different from those who borrow from online tools without knowledge of standard citations. Because the ChatGPT phenomenon is still so new, students’ confusion about the ethics of using it is understandable. It’s also unclear what policies, at any school, will remain in place once the dust settles.
In the midst of fundamental change in both the academic and technological spheres, universities are forced to reconsider their definitions of academic integrity to reasonably reflect the circumstances of society. The only problem is that society never stands still.
“Villanova’s current academic integrity code will be updated to include language that prohibits the use of these tools to generate text that then students represent as text they generated independently,” Daily explained. “But I think it’s an evolving thing. And what it can do and what we will then need in order to keep an eye on will also be kind of a moving target.”
In addition to increasingly complex questions about whether ChatGPT is a research tool or a plagiarism engine, there’s also the possibility that it can be used for learning. In other educational settings, teachers see it as a way to teach students about the shortcomings of AI. Some instructors are already modifying how they teach by giving students assignments bots couldn’t complete, like those that require personal details or anecdotes. There’s also the matter of detecting AI use in students’ work, which is a burgeoning cottage industry all its own.
Ultimately, Daily says, schools may need rules that reflect a range of variables.
“My guess is that there will be the development of some broad blanket policies that essentially say, unless you have permission from a professor to use AI tools, using them will be considered a violation of the academic integrity code,” Daily says. “That then gives faculty broad latitude to use it in their teaching or in their assignments, as long as they are stipulating explicitly that they are allowing it.”
As for ChatGPT, the program agrees. “Advances in fields such as artificial intelligence are expected to drive significant innovation in the coming years,” it says, when asked how schools can combat academic dishonesty today. “Schools should constantly review and update their academic honor codes as technology evolves to ensure they are addressing the current ways in which technology is being used in academic settings.”
But, a bot would say that.