This story contains anonymous sources. USF Encounter only uses anonymous sources when it feels it is necessary to protect the identity, safety or livelihood of a source.
Correction: The printed version of this story incorrectly stated that the USF Educational Standards Committee added a new AI policy independent of the existing Integrity policy. The committee added AI to the existing Integrity policy.
Technology is constantly reshaping the world, and its effects on education are often felt as an aftershock. With the advent of AI, however, education has moved to the forefront of technological innovation, with varying results. Generative artificial intelligence, or AI, has been a subject of much debate in the relatively brief time it has been available.
On the positive end, AI can accommodate students’ varied learning needs and lessen the burden of routine tasks on instructors, which in theory frees them up to engage with their students in more productive and meaningful ways.
One of the most popular generative AI applications, ChatGPT, has been a game-changer in many facets of business, creativity and school. The program has been publicly available since November 2022 and has carved out a sizable home in the world of education.
ChatGPT and other AI tools are quickly making a difference in lives here at USF, with one senior, who asked to remain anonymous, saying it has changed their academic behavior.
“I just use it for 'gen eds' or classes where I’m never going to use any of it after the class is over,” they said.
As the university catches up to this new technology, it’s critical for instructors and students alike to understand what we’re dealing with and how best to use it to our advantage. Inside Higher Ed examined AI usage rates in 2023, comparing the spring and fall semesters. Among students, usage grew from 27% in the spring to 49% in the fall. Among instructors, only 7% were using AI in the spring, but that number more than tripled to 22% in the fall. Of these users, 75% said they have no plans to stop and believe AI will continue to be a useful tool in professional work environments.
Elizabeth McDermott, dean of the College of Arts and Sciences, sheds some light on the official university stance on AI.
“The Educational Standards Committee has added artificial intelligence to the USF Academic Integrity policy,” she said. “Per the new syllabus statement at USF, a student should get ‘express consent’ from their instructor before using AI on any of their assignments.”
This sets the scene for how USF intends to adapt to AI, but, as the anonymous senior tells it, that is not how things are unfolding.
When asked how they get around the school’s policy, they said, “It’s easy so far, none of the teachers seem to be that good at catching it, and if I think they’re getting suspicious I’ll just change some stuff in the answer I get from ChatGPT.”
As the university responds to this new tool, McDermott says leadership is poised to meet the challenge.
“Our faculty are constantly reflecting about what they're teaching and why. Should they change a textbook, create a 'flipped' classroom, or work with their department to update their curriculum? These are questions that require stepping back from the classroom and taking feedback into account,” she said.
Part of this feedback comes from the anonymous senior, who doesn’t mince words when asked if AI is taking anything away from their learning.
“I don’t think I’m losing anything by using it. I still think I’m getting as much out of my classes as I ever have. Most of the stuff [professors] have us do is bulls**t so I’m glad [AI] came out while I’m still in college.”
When asked what they use AI for the most with their coursework, they did not hesitate.
“Discussion boards are pointless. Nobody wants to do them and they don’t do anything for me. It’s just busy work that’s easy to use AI for and get them over with.”
AI is touted as an important tool for aiding learning, and this is not lost on the anonymous senior, who often uses it for that purpose.
“I still will do most of my major course work myself, and if I do use AI, it’s mostly to try and understand something instead of having it do it for me. It’s just like using the internet in that way, at least in my opinion.”
Part of the reason students have been able to get around the rules so easily, The New York Times found, is that AI detectors are extremely unreliable and easily manipulated. The common prediction is that these detectors will be refined and improved in the near future, but that is far from guaranteed, and it is unclear how people will respond to those adjustments.
This creates a need, at USF and beyond, for leadership to keep looking for ways to improve the classroom environment, a task McDermott says the university is prepared to take on.
“There has been much discussion about academic integrity, writing as learning and assessment of student outcomes. My impression is that we need to keep having these discussions and listen to experts across the disciplines about how AI relates to ethical awareness, liberal learning and research.”
For more help with AI, instructors can access the AI Education Project, which is geared toward making educators aware of the capabilities and uses of generative artificial intelligence in their curriculum.