Baylor’s ChatGPT policy results in divided opinions among faculty, students

Student works on her laptop in Moody Memorial Library. Abby Roper | Photographer

By Sarah Gallaher | Staff Writer

Baylor’s Office of Academic Integrity offers resources to professors to combat the use of ChatGPT in the classroom. The list of resources was made available to faculty on April 4, five months after OpenAI released ChatGPT and sparked conversations about the ethics of using it in the classroom.

ChatGPT, as a new writing tool, does not fit the traditional definition of plagiarism, which means universities have had to expand their definitions of cheating to cover AI-generated writing. Baylor’s Honor Code, issued by the Office of Academic Integrity, prohibits the use of AI writing tools under Sec. III, which bars students from presenting the work of others, including AI-generated writing, as their own.

Dr. Maura Jortner, senior lecturer in the English department, said she has firsthand experience with students violating the Honor Code in her courses. With her first book, “102 Days of Lying About Lauren,” published in June 2023, Jortner said she values the art of writing and hopes students refrain from using ChatGPT as a shortcut.

“I completely agree with Baylor’s policy, because if you’re cheating, you should get turned in to the Honor Council,” Jortner said.

Despite the resources given to faculty to detect cheating, including AI detection programs like turnitin.com, some professors choose instead to adapt their coursework to prevent it. In an article for The New York Times, journalist Kalley Huang interviewed more than 30 university faculty members and students about such changes in the classroom.

“Some professors are redesigning their courses entirely, making changes that include more oral exams, group work and handwritten assessments in lieu of typed ones,” Huang wrote.

However, some professors and students say ChatGPT is a useful tool in the classroom. Matthew Brammer, lecturer in the department of journalism, new media and public relations, said he has a more positive view of AI tools and encourages students to learn more about the new technology.

“I want students to learn AI,” Brammer said. “That’s very important to me, because it’s an amazing tool, and it’s transformative in our society in the way we communicate and gather and get information. I’m not scared of it; I embrace it.”

However, Brammer said he does not support the use of AI in all circumstances, particularly when students present artificially generated work as their own. He said he implemented a two-strike policy, which offers a warning to students the first time they use AI to complete an assignment and resorts to disciplinary action if a second offense occurs.

“As far as ethics go, anytime you represent someone else’s work — whether it’s copied or artificially generated — as your own work and claim it as such, that’s plagiarism,” Brammer said. “If you just use a tool and say, ‘This is mine,’ well, you’re just cheating yourself, and that’s not right.”

Opinions among students are just as divided. Fort Worth junior Giana Grace said AI can be used to plagiarize, but it can also be a helpful tool for students. She added that professors shouldn’t assume all students are using AI to cheat, or that they are using AI writing programs at all.

“I don’t like the thought of [redesigning courses and] … making coursework more difficult than it normally would have been,” Grace said. “I feel like that would be at the disadvantage of the students, especially the ones that aren’t using AI or ChatGPT.”

Still, some students do use such programs to plagiarize, and Grace acknowledged that prohibiting AI might be a necessary step in preventing plagiarism.

“I also understand professors wanting to cut down the use of AI in the classroom and plagiarism and cheating,” Grace said. “That is completely understandable. I just wish there was a middle ground area that could help people not cheat but also not become a disadvantage for students.”

Whether or not professors implement new teaching styles, most will still rely on AI detection programs to identify plagiarism. Brammer offered a different perspective, combining innovative teaching methods with the new technology.

“I think that as professors, it forces us to be very creative,” Brammer said. “It forces us to really engage our students, and that’s one of the cool things about Baylor, and that is that we’ve got all these amazing, brilliant professors. We will adapt. Some are nervous. Some are scared. But the sky is not falling. The world is not coming to an end yet. And this is just another tool like the typewriter or the personal computer or the smartphone or the calculator. There have been all these transformative things in history, and this is just one of those things. And we, as an educational body, have to engage it, to understand it and to teach it.”