Meet ChatGPT: How AI will affect academia

An image-generating AI created this image of its perception of Baylor bears upon request. Josh McSwain | Roundup

By Caitlyn Meisner | Copy Editor

A new artificial intelligence tool poses a challenge to Baylor professors and students as it revolutionizes the art of long-form writing.

ChatGPT has been sounding alarms in academia since its release in November 2022.

This type of technology is relatively new: ChatGPT is the latest in a line of large language models, software that can produce human-like text, created by OpenAI. ChatGPT is meant to “interact in a conversational way,” as stated on the company’s website.
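
For readers curious what that conversational interaction looks like in practice, here is a minimal, hypothetical sketch using OpenAI’s public API; it is not something the faculty quoted in this story demonstrated. It assumes the openai Python package (version 1.x) and an API key stored in the OPENAI_API_KEY environment variable, and the model name and prompts are placeholders.

    # Minimal sketch of a conversational exchange with a large language model.
    # Assumes: `pip install openai` (v1.x) and OPENAI_API_KEY set in the environment.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY automatically

    # A "conversation" is just a list of messages the model sees all at once.
    messages = [
        {"role": "user", "content": "Explain the Turing test in two sentences."}
    ]

    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # placeholder model name
        messages=messages,
    )

    reply = response.choices[0].message.content
    print(reply)

    # Follow-up questions work by appending to the same message list,
    # which is what gives the exchange its conversational feel.
    messages.append({"role": "assistant", "content": reply})
    messages.append({"role": "user", "content": "Who proposed it?"})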

GPT stands for generative pre-trained transformer; the model continuously “learns” from user feedback and search algorithms, Dr. Richard Sneed, lecturer in the philosophy department, said through email.

“Its input comes from human texts, conversations and the like, so [it] can not only respond but also ask follow-up questions,” Sneed said. “What this means is that it is much harder to detect whether or not it is a bot.”

Dr. Robert Reed, lecturer in the philosophy department, said ChatGPT was able to pass the Turing test, meaning a human reader could not tell the text it produced was written by AI. He said he was impressed with the software’s capabilities, having spent three years working with AI and now teaching a course on the ethics of this type of technology.

“The goal of this type of AI is to produce text that a human reading it cannot tell that it was written by a computer,” Reed said. “[OpenAI] has released several versions. I remember playing at work with one of them in 2020, and it was pretty comical. You could tell it was written by a computer.”

ChatGPT is an upgrade from its sibling model, InstructGPT, which was created to follow instructions in a prompt and provide a detailed answer. InstructGPT was known to give inaccurate and inappropriate responses.

Reed said ChatGPT was trained to provide more sophisticated responses. He said it was trained on many large data sets through reinforcement and supervised machine learning, mainly through a punishment-and-reward system similar to the way a child learns.

“If you had a small child playing with a cat, he’s going to play too rough and the cat is going to scratch him — that’s a punishment. Or, the kid’s going to play gently and the cat’s going to purr — that’s a reward,” Reed said. “The kid is making these exploratory movements, then either being rewarded or punished, and over time, the kid learns how to handle a cat properly. Computers can learn the same way.”
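To make the analogy concrete, here is a tiny, hypothetical Python sketch of that reward-and-punishment loop. It is not how ChatGPT itself is trained, only an illustration of reinforcement learning in miniature: an agent tries two actions, gets scratched or purred at, and gradually learns to prefer the gentle one.

    import random

    # Toy reinforcement-learning sketch of Reed's cat analogy (illustrative only).
    # Two possible actions and their rewards: rough play earns a scratch (-1),
    # gentle play earns a purr (+1).
    actions = ["play_rough", "play_gentle"]
    rewards = {"play_rough": -1.0, "play_gentle": 1.0}

    # The agent's current estimate of how good each action is.
    value = {a: 0.0 for a in actions}
    learning_rate = 0.1
    epsilon = 0.2  # how often the agent explores instead of picking its favorite

    for step in range(200):
        # Explore occasionally; otherwise exploit the best-known action.
        if random.random() < epsilon:
            action = random.choice(actions)
        else:
            action = max(actions, key=lambda a: value[a])

        reward = rewards[action]  # the cat scratches or purrs
        # Nudge the estimate for that action toward the observed reward.
        value[action] += learning_rate * (reward - value[action])

    print(value)  # after enough steps, "play_gentle" has the higher value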

Reed and Sneed said they can’t be entirely sure students aren’t already using the software to do homework, but they said the AI has many limitations.

“It is not perfect. Humans have writing quirks, idiosyncrasies of spelling, word order and the like, [so] it seems that ChatGPT autocorrects these,” Sneed said. “As a professor, I can get a sense of a student’s writing style from in-class work; a paper that is bot-generated will be too ‘clean’ and probably more sophisticated, elevated in tone [and] wordy.”

Reed said there is a detector out there because “anything that’s produced by AI, another AI can detect.”

Dr. Tomas Cerny, assistant professor in the computer science department, said AI is rapidly changing.

“I believe that even if all the students [in the same class] will ask similar questions, it might still produce different answers,” Cerny said. “As with any invention, there are very good things coming out of it, but there are consequences.”

While much of the conversation about the impact of the new chatbot has centered on academia and universities, Reed said it will likely have a larger impact on other fields, like the arts.

“Professors who instruct students who are going into the creative workforce, they might have to advise them that the space might be changing rapidly,” Reed said.

Cerny said AI is developing so rapidly that many things we cannot yet conceive of will eventually be on the market.

“It’s just a matter of time,” Cerny said. “It’s not a question of if, it’s a question of when.”