At UCLA, Professors See 'Exciting Opportunities' in AI Writing Tools

Generative AI is tech’s latest buzzword, with developers building programs that can do anything from writing an academic essay about guitars and elevators to generating photorealistic paintings of majestic cats.

ChatGPT, a platform built by OpenAI, the company behind DALL-E 2 and GPT-3, is the latest of these tools to go viral. But it can go far beyond writing a version of the Declaration of Independence in the style of Jar Jar Binks. It can write full essays on almost any subject a college kid could desire, creating another layer of complex technology that humanities professors now have to consider when they teach and dole out assignments.


While ChatGPT does have some limitations (it can only write up to 650 words per prompt), some students have taken to Reddit to discuss workarounds for the word limit and ways the tool could help them pass their classes. Ironically, another student even used the AI to write an apology email to his professor for using AI to write his emails.

One student wrote, “As finals are hitting, I’ve written 6 papers for people and made a great chunk of change. Same day turnaround, any size paper with perfect grammar and in depth writing plagiarism free is a pretty lucrative way to advertise oneself to a bunch of cracked out stressed college students.”

But despite the tool’s internet virality among desperate college students, UCLA professors told dot.LA that they aren’t worried about ChatGPT’s capabilities. Rather than viewing the technology as something they have to shield students from using, they see it as another potential tool in their arsenal and something they can implement in their classrooms.


“My sense of ChatGPT is that it's actually a really exciting opportunity to reconsider what it is that we do when we write things like essays,” said Danny Snelson, assistant professor of English at UCLA. “Rather than raising questions of academic integrity, this should have us asking questions about what kinds of assignments we give our students.”

Snelson tried ChatGPT out for himself, prompting the platform to write an essay about “the literary merit of video games that cites three key scholars in the field.” True to form, ChatGPT instantly churned out an essay that answered the prompt accurately and synthesized the arguments of three scholars in a compelling way. But Snelson could spot flaws in its work: the writing style was repetitive, and the scholars the AI chose were not diverse.

“I probably will give my students the assignment on the first day of class to write a ChatGPT essay about a topic they know nothing about,” Snelson said. “Then have them discuss the essays that ChatGPT has written for them and what the limits of their arguments are.”

Christine Holten, director of Writing Programs and the UCLA Undergraduate Writing Center, said that she and other instructors are currently having similar talks about how to integrate these tools in a responsible way.


“One way is to allow students to use them,” she said. “Build them into the course, and allow reflection about the bounds of their use, what their limitations are, what are their advantages? How does it change their composing?”

Along with dissecting the platform’s limitations, Snelson also sees using ChatGPT as a tool to propel students’ writing even further. For example, one of the hardest parts about writing an essay is the first line. Having an AI write it for you can be a great starting point to push past the “blank page dilemma,” he said.

And while ChatGPT can write a passable essay on almost any subject, Snelson said students still need to have an understanding of the subjects they’re writing about. “Having a live conversation about Chaucer in the classroom, a student is not going to be helped by an AI,” he said.

“In the real world, you have access to information, you have access to writing tools,” Snelson added. “Why should (academics) disavow or disallow those kinds of tools?”

At the same time, Holten said she recognizes that ChatGPT “raises the stakes” by circumventing the tools academics have relied on to detect plagiarism. But students turning in papers that aren’t their own isn’t new: essay mills have existed for a long time, and Instagram is filled with pages that will sell students an academic paper.

“We have to do our part by trying to craft assignments carefully and making sure that we're not assigning these open-ended prompts of the sort that could be bought from paper mills,” she said.

It helps, too, that OpenAI may already be working on a solution. Scott Aaronson, who works on the theoretical foundations of AI safety at OpenAI, said in a blog post that he’s developing a tool for “statistically watermarking the outputs of a text model like GPT” that embeds an “otherwise unnoticeable secret signal in its choices of words” to prevent things like academic plagiarism, mass generation of propaganda or impersonating someone’s writing style to incriminate them, though it's unclear how far away this development is.

“We want it to be much harder to take a GPT output and pass it off as if it came from a human,” Aaronson wrote.
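Aaronson’s post gives the flavor of the approach: nudge the model’s word choices with a secret, keyed pseudorandom function so the bias is invisible to readers but detectable by anyone holding the key. Below is a minimal, hypothetical sketch of that general idea in Python; the key, scoring rule and function names are invented for illustration and are not OpenAI’s implementation.

```python
# Toy illustration of keyed statistical watermarking for generated text.
# A sketch of the general idea only; the key, scoring rule and detection
# heuristic are hypothetical stand-ins, not OpenAI's actual scheme.
import hashlib

SECRET_KEY = "hypothetical-key-held-by-the-provider"

def keyed_score(context: str, token: str) -> float:
    """Pseudorandom score in (0, 1) derived from the secret key,
    the recent context, and a candidate next token."""
    digest = hashlib.sha256(f"{SECRET_KEY}|{context}|{token}".encode()).digest()
    return (int.from_bytes(digest[:8], "big") + 1) / (2**64 + 2)

def pick_watermarked_token(context: str, probs: dict[str, float]) -> str:
    """Given the model's probabilities for candidate tokens, choose the one
    maximizing score ** (1 / p). The text still reads normally, but the
    chosen tokens tend to carry high keyed scores."""
    return max(probs, key=lambda t: keyed_score(context, t) ** (1.0 / probs[t]))

def watermark_evidence(tokens: list[str]) -> float:
    """Average keyed score of the tokens actually used. Human-written text
    hovers near 0.5; watermarked text scores noticeably higher."""
    scores = [keyed_score(" ".join(tokens[:i]), tok) for i, tok in enumerate(tokens)]
    return sum(scores) / len(scores)
```

A detector that holds the key can recompute the scores and flag text whose average is statistically too high to be coincidence; without the key, the output looks like ordinary model text.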

All of which explains why, despite claims that the death knell is sounding for high-school English and the student essay, Holten thinks, ultimately, “The availability of ChatGPT is not likely to change very much.”