SU can better serve its students by embracing AI
This past spring, most of our syllabi and course expectations featured a new component: a warning against using artificial intelligence to complete assignments or classwork. Faculty and staff have had to address the use of AI in the classroom, and their approaches vary widely.
AI, while still unfamiliar to many, is part of our new reality. There needs to be more discussion about how to use it properly without treating it as a threat to academic integrity. Ignoring it or banning it outright is unrealistic, since many college students already use AI regularly.
Shelby Smith, a pre-med junior studying neuroscience and psychology, uses AI to help her understand complex topics in her classes.
“I’ve used it to help me study difficult class content such as asking AI to explain difficult material so that it is easier to understand,” Smith said.
Evan Krukin, a junior double majoring in television, radio and film and information management and technology, uses AI to help create citations. He uses it the same way people use online citation tools, but with faster results.
“In my WRT 209: Critical Research and Writing class I took last spring, we explored using ChatGPT to help us find scholarly sources,” Krukin said. “With varying degrees of success, we were able to find articles and narrow our search quickly. We still had to assess the credibility of the sources, read them and then see if they could be of use. The actual work was the same, but it saved us time in our preliminary searches.”
Just as we’ve used the internet for years, AI can be that extra step to make new, complex material more digestible. These types of uses uphold academic integrity but still assist students in their work.
The School of Information Studies has gone so far as to make AI a permanent fixture in its curriculum, including an entire course built around the new technology. There, the use of AI is encouraged rather than warned against. Maddy de Vera, a junior in the iSchool, said her professor created a tool called Pybot, built on ChatGPT, that helps students write lines of code when they're struggling.
“My professors usually let me use AI as long as I explain how I used it,” de Vera said.
In the S.I. Newhouse School of Public Communications, some professors prohibit the use of AI entirely, especially when the class is centered on students producing their own writing. While it’s understandable that ChatGPT shouldn’t do the writing for us, AI can be beneficial to students in majors and classes outside of Newhouse.
Not all professors can incorporate AI as extensively, but more need to teach students how to use AI tools properly while still upholding academic integrity. Students should be producing their own work, but just as colleges had to adjust to using the internet, adapting to AI is necessary.
SU needs to recognize that if we don't learn to use every tool at our disposal, we'll be left behind in the workplace. Mandated AI training for staff and students would make policies more consistent across the university, preventing one school's students from gaining an advantage over another's.
It's also important that students understand the ethical issues that come with AI. Before asking ChatGPT for assignment answers, consider the importance of owning what you create and crediting the work of others. In college and in the professional world, relying on work produced by tools like ChatGPT can undermine your value as a student and an employee.
We don't yet know the full extent of AI's capabilities, and we should remain cautious for ethical reasons. It's unknown territory, but we can start exploring it in ways that don't compromise the integrity of our work.
Emilie (Lily) Newman is a junior Political Science and Magazine, News and Digital Journalism major. Her column appears biweekly. She can be reached at emnewman@syr.edu.