When ASU professor Andrew Maynard asked one of his classes last fall what they thought about using ChatGPT, the feedback, he said, came down to two main responses.
The first response came from a student who argued that they were the ones who wanted to learn, not the machine. Why would they make the machine do the work for them?
The second response came from a student who raised another point: they were paying for this degree. Why would they not do the work for something they invested so much time and money in?
Even though ChatGPT, a language model developed by OpenAI, raises some security, privacy and ethical concerns, the technology continues to advance rapidly.
ASU recently announced its partnership with OpenAI, and starting this month, faculty can pitch ways for the University to use ChatGPT Enterprise, with students able to access it later.
"ASU recognizes that augmented and artificial intelligence systems are here to stay, and we are optimistic about their ability to become incredible tools that help students to learn, learn more quickly, and understand subjects more thoroughly," ASU President Michael Crow said in a statement sent out by the university.
READ MORE: ASU announces first partnership between OpenAI and a university
Despite the reality that AI is "here to stay," students don't need to be given special access to this new technology. ChatGPT can advance faculty research, but the point of handing students such an unbridled resource is unclear, as is whether students can use it responsibly and consistently.
"The question is, can we learn how to use it responsibly, fast enough?" Maynard, a professor at the School for the Future of Innovation in Society, said. "Of course, there are very irresponsible uses. If you just go back to the idea of if you've got an assignment and you only got 10 minutes to complete it, you ask ChatGPT to do it, that's probably not the most responsible use."
The apparent downside of this technology is the significant prospect of cheating. On a deeper level, ChatGPT is accelerating and changing constantly. It's uncertain if the University will be able to keep up with the evolving software enough that students can truly see its benefits.
Maynard designed an online prompt engineering course for students to develop the skills needed to use AI effectively. The course was available during the summer and fall of 2023.
Now, in 2024, Maynard won’t be teaching that course again.
"The main thing is the technology is moving so fast," Maynard said. "The course is already out of date. If we were to keep it relevant, we'd have to reinvent the course every time we taught it."
Another glaring concern with student access to ChatGPT is the lack of clear ethical guidelines. Right now, AI use is left to the discretion of each individual professor. It's hard to draw the line between right and wrong in a situation this complex, with countless scenarios that could play out.
But setting hard rules might not be the answer either.
"I'm slightly concerned that if we're not careful, we'll start putting down hard and fast rules," said Maynard, who is also on the Faculty Ethics Committee on AI Technology. "The reason that worries me is if you do that, you stop allowing people to be creative and innovative. You run the risk of preventing uses, which could be beneficial."
READ MORE: Charting ASU's future: Navigating OpenAI's first partnership in higher education
ChatGPT does have benefits. Knowing how to harness and get the most out of the technology could make students more marketable after graduation. According to Maynard, one possible idea submitted by faculty is the design of personal tutors to help students in courses.
When the partnership was announced, ASU Chief Information Officer Lev Gonick said ChatGPT Enterprise would also be used for student improvement in Freshman Composition, ASU's largest course.
For English professor Kara Childress, the tool has been helpful in her classes for brainstorming and the early stages of assignments.
"If you asked me even two years ago, I would say I loathe the idea of AI except for maybe spellcheck and Grammarly, which are still AI, just not in the terms that we think it is now," Childress said. "However, starting last semester, I did a pilot program where I had students use it for brainstorming or a little bit of peer review, and it was enlightening to see what kinds of ideas came from that."
Even if ChatGPT can help students brainstorm, researching ideas for a school assignment is something students can succeed at on their own. Childress pointed out that with this new technology, we're developing new skills but diminishing others. Students are learning how to become AI literate instead of practicing brainstorming and researching independently.
And even though AI is prevalent in our society, not all students feel obligated to use it. Tools like Grammarly can be a great resource for students to check their spelling and make sure their work flows. However, this type of AI is different from ChatGPT, which can do research and generate ideas for a student, possibly hindering that student's creativity and originality.
"From even last semester, I noticed a lot of students, even if they were allowed to, they didn't (use AI) because it was too much trouble to put in the directions and see what's available there," Childress said. "I don't think it is hindering anyone if you don't use AI, at least in English class."
While these benefits are noteworthy, the University also leaves student involvement with the technology in murky waters, even in the early stages of the partnership. Why don’t students have access to submit proposals for the University's ChatGPT use at the same time as faculty? What are the guidelines for submitting their ideas?
It also remains a murky topic for faculty, as many professors across fields express uncertainty and a lack of knowledge about the initiative.
Maynard makes a good point: "If the world didn't have ChatGPT, the world will be fine."
It could be monumental for research, but students don't need ChatGPT to succeed. As technology continues to change, students should avoid abusing its power.
Edited by Walker Smith, Sadie Buggle and Caera Learmonth.
Reach the reporter at katrinamic03@gmail.com and follow @kat_m67 on X.
Editor's note: The opinions presented in this column are the author's and do not imply any endorsement from The State Press or its editors.
Want to join the conversation? Send an email to editor.statepress@gmail.com. Keep letters under 500 words, and be sure to include your university affiliation. Anonymity will not be granted.
Like The State Press on Facebook and follow @statepress on X.