Artificial intelligence has had a huge impact on Yale classrooms.
One academic year after the launch of ChatGPT, Yale administrators and instructors have changed their guidelines and teaching styles to accommodate the new technology. Although many remain concerned about AI-enabled plagiarism, some professors are embracing the technology's use in the classroom.
English professor Ben Glaser first became interested in applying AI to the humanities after taking a summer course for faculty on natural language processing in 2022. He recalled thinking that AI would “transform the writing environment.”
“I said, 'I'm going to teach a writing course so you can think about how to write better, which is what I do all the time,'” Glaser said. “'We're going to look at these tools that might be a hindrance or might be helpful.'”
In the fall of 2023, Glaser taught an introductory English seminar titled “How to Write Essays with AI,” where he and his students discussed how artificial intelligence can be applied to writing. They also investigated the relationship between authorship, creativity, and AI.
His ultimate goal, he said, is to help students become better writers.
“It's easy to say, 'Oh, AI will never be creative,'” Glaser said. “And I said, 'Can't we do that? Let's look at these differences.'”
Students in the course also read AI-generated stories, analyzed the differences between poems written by humans and AI, and practiced planning essays using AI. For their final project, they investigated how different industries use AI.
Shortly after Glaser designed and proposed the course, and about a year before he first taught it, OpenAI launched ChatGPT, a popular generative AI program, in November 2022. According to Glaser, the chatbot's release made students and teachers even more focused on how AI transforms learning.
Poorvu Center responds
For the Poorvu Center, the University's teaching and learning support center, the release of ChatGPT was the impetus for developing academic AI guidelines, Alfred Guy said. As deputy director of the Poorvu Center and director of its writing and tutoring programs, Guy has been instrumental in conducting workshops on AI and facilitating instructor education programs on the technology.
Guy learned of ChatGPT's impending release through Facebook and immediately felt that the Poorvu Center should get involved. After the chatbot's release, the Poorvu Center published its first guidelines for the use of AI in Yale classrooms in January 2023.
“The first thing we said was, 'These tools are powerful, so people will use them,'” Guy said. “Everything that happens after this has to be thought about in terms of how people use these tools.”
As AI software like ChatGPT has become more mainstream, Guy noted that Yale instructors have generally not reacted with panic or fear. Still, they have wondered how AI would affect them and whether they needed to take specific actions in their classrooms. Guy said the Poorvu Center's curriculum guidelines, currently overseen by a five-person committee, aim to answer these questions.
The current guidelines include suggestions for how instructors can approach AI in their syllabi, encourage students to cite AI correctly and note precautions for using AI technology. In addition to linking to numerous articles and webinars on AI, the guidelines encourage instructors to try AI tools for themselves and share their feedback with the center.
Guy said the Poorvu Center's approach to AI has changed over the past year. Rather than striking an alarmist tone, the center now encourages hands-on exploration of AI technologies.
“Our tone and demeanor have shifted very slightly towards AI,” Guy said. “Even when it comes to specific advice for teachers, we say, 'You should really be proactive.'”
Glaser also noted that Yale's open approach to teaching and learning with AI does not necessarily reflect the approach of other universities. He said he thinks AI is less controversial at Yale than at other universities because of widespread institutional support and awareness of the technology.
“Once we get out of the Yale bubble, the writing environment will be very different, and the AI tools will work differently,” Glaser said.
AI in the classroom
As part of one class project, Glaser asked his students to revise the Poorvu Center's proposed guidelines for using AI.
Jared Wyetzner ’27 is a member of Glaser’s class and previously co-founded Myndful-AI, a machine learning chatbot that provides mental health resources to high school students.
Wyetzner said everyone in the class generally agreed that AI tools have a role in a Yale education.
“Our goal was to be able to bring AI into the classroom,” he said. “There are certain methods that should be used to facilitate work.”
For Wyetzner, AI is best understood as a tool comparable to a “calculator” for writing.
“You learn how to do calculations, addition and multiplication all on your own, and then eventually you start using a calculator and that becomes the norm,” he said. “How can we leverage AI tools while learning from text?”
One Yale course quickly added AI to the classroom: CPSC 100, an introductory programming class taught jointly with Harvard's CS50 course.
According to Ozan Erat, a Yale computer science professor who helps organize CPSC 100, the course used two different AI technologies. The first, a chatbot called the Duck Debugger, allowed students to ask questions about the course and helped them debug code. The second, the CS50 Duck Bot, was integrated into the online forum Ed Discussion to answer student questions.
Thanks to the Duck Debugger, attendance at CS50's office hours decreased by about 30 percent, Erat said. He called this a positive development, because students with simple questions can ask the Duck Debugger at home, while students with deeper questions can receive more attention during office hours.
Erat said he was initially concerned about academic dishonesty, but by the end of the semester there had been no “excessive cheating.” He noted that the percentage of students referred to the university's Executive Committee had not changed dramatically, and said that CS50 instructors will continue to use the Duck Debugger and Duck Bot as part of the course.
Plagiarism concerns
The News spoke with Mick Hunter, chair of Yale's Executive Committee, about students' use of AI to commit plagiarism and academic misconduct. He said the committee began seeing AI-related cases shortly after ChatGPT launched. In response, the university added a section on the use of AI to its academic integrity guidelines.
In the earliest cases, the use of AI was “clumsy,” Hunter said, such as when a student created a false citation for a paper, but incidents now are “less blatant.”
“While there are still students who break the rules or use AI in unauthorized ways, it seems that students are learning to cover their tracks or use AI more responsibly,” Hunter added.
However, Hunter estimated that the Executive Committee has received only about 10 AI-related cases since November 2022, making them a “minority” of academic misconduct cases.
Thanks to his work at the Poorvu Center, Guy considers himself “one step short of an expert” when it comes to pre-AI plagiarism.
While Guy acknowledged that AI could contribute to plagiarism in writing, he expressed optimism about the various ways instructors can help limit it.
Guy cited multiple studies showing that plagiarism rates decrease when instructors assign low-stakes written responses to coursework, set interim deadlines for large assignments throughout the semester and encourage conversations in which students explain their ideas.
He suggested that instances of plagiarism could also be reduced by modifying assignments to include requirements that exceed the capabilities of language-generating AI models.
Glaser and his students also found that using chatbots for writing presents its own challenges. Because of the chatbots' capacity for errors and “idiosyncratic” responses, students need a kind of “literacy” to interpret AI output and screen for mistakes.
“During the class, we quickly realized that to get anything good out of it, we actually needed a fair amount of interaction,” he said.
By the time students give ChatGPT appropriate prompts, assess the quality of its responses and incorporate them into their writing, let alone cite them, the process can be more effort than it is worth, Glaser noted.
“By the end of that process, you don't have to worry about plagiarism anymore,” Glaser said. “What I'm wondering is, 'Was it actually efficient? Was it helpful? Did it make you a better writer?' I think the answer could also be 'yes.'”
Yale University established the Poorvu Center in 2014.