Chatting about ChatGPT
Students and professors explore OpenAI’s new language model ChatGPT and its potential to change the standards of education
Srilakshmi Medarametla | Features Writer
ChatGPT has irrevocably changed the landscape of artificial intelligence capabilities. This generative A.I. application developed by OpenAI can provide written content in response to any text-based query. For the University community, it affects students and professors alike as questions arise regarding how students may choose to utilize the software and whether it will affect the integrity of their work. Opposing attitudes toward the software have arisen on both sides of the classroom, ranging from excitement to unease.
Computer Science Engineering Prof. Aaron Bloomfield has already acknowledged the tremendous capability now available at students’ fingertips and how it could affect academic integrity.
“There certainly is a desire amongst students at every level to focus more on grades and this is something that I think is going to enable students to cheat more,” Bloomfield said. “I think a lot of college assignments and college courses are going to have to adapt, and I think that’s going to be challenging on many levels because the standard college essay is now writable with generative A.I.”
Despite the existence of online resources like Chegg and forums like Reddit and Stack Overflow, Bloomfield pointed out that cheating via ChatGPT would not only be harder to detect but could also deepen issues of inequity if the service were completely gated behind a paywall. In fact, since ChatGPT's release last November, OpenAI has taken a step in this direction, announcing Feb. 1 that it will offer a new subscription plan for $20 a month.
While this technology has incredible potential, there are some drawbacks. Because ChatGPT does not cite its sources, it is difficult for users to fact-check its responses and know whether they are accurate.
Assoc. Public Policy Prof. Andrew Pennock said that some professors and students are unaware of this lack of citations and fallibility, a risk that applies especially to University students who plan to employ the A.I. for their assignments.
“It’s dangerous if you don’t know that it’s nonsense,” Pennock said. “Or if you’re in an area where you don’t know what nonsense is, then it will give you nonsense and you’ll have no filter to be able to check it.”
As ChatGPT becomes more prevalent, however, some professors are beginning to adapt their teaching methods to the existence of this new technology. Pennock recently led a forum where at least 100 professors discussed how generative A.I. can and will impact academia.
“There are the professors who, for good reasons… understand that training people to think without a computer is important, and so they’ll go old school,” Pennock said. “But then, there are also professors who will embrace the change and say, ‘This is a new reality. How can I teach and help students learn in this new reality?’”
Pennock and Bloomfield both place themselves in the latter category. While Pennock broadly permits his students to use ChatGPT to assist with their assignments, he encourages them to think through each issue for themselves first and then discuss their findings with the group.
Bloomfield hopes to promote a better understanding of the A.I.'s imperfections by incorporating it into his Introduction to Cybersecurity course. He plans to have his students ask the software an ethical question and then discuss where ChatGPT's response is incorrect or incomplete.
“I think there are ways to design assignments that avoid just being able to use ChatGPT to cheat and I don’t know if my way is going to work or completely flop, but I think this is something that collegiate instructors are very interested in,” Bloomfield said.
Amid the criticism, many across the University still see endless opportunities for advancement with such technology. Third-year Engineering student Saahith Janapati noted that a student in any specialization could utilize the software and realize its potential.
“I think that some of the hype is warranted because if you want to deploy these systems in the real world, the interface of language is useful for everyone to interact with — you don’t have to have any specialized domain knowledge about how this model works,” Janapati said.
Although concern for academic integrity is prevalent across the University, Janapati envisions an optimistic future in which the software could promote education.
“You can imagine that in the future there’s gonna be a tutoring version of ChatGPT where it not only knows the topic that you’re learning about, but it can actually customize itself to you and how you learn,” Janapati said.
Whether optimistic or wary about the software, students and professors agree that generative A.I., and ChatGPT specifically, has great potential as a resource at the University and must be handled carefully.
Pennock likens the progress of generative A.I. to advances that have irrevocably shaped our past, noting that the full extent to which the changes on the horizon may affect our society, economy and lives is not yet known.
“This is a once-in-a-generation change to how students learn and how professors create learning environments — it’s a big deal,” Pennock said.