Brave New World

With the rise of AI, disinformation, deep fakes, and other threats to our democracy, how do we teach students to recognize what’s true and authentic?

New technologies like ChatGPT are changing the game and the educational realm—but, instead of shunning them, Endicott College professors are teaching students how to leverage these powerful tools for good, and how to be more media literate in the process. Photo by David Le

When given a few vague prompts, ChatGPT spits out this generic, if grammatically flawless, Convocation speech to be delivered at Endicott: 

“As you stand on this beautiful campus overlooking the sweeping views of the Atlantic, you are not just in Beverly, Massachusetts. You’re at the starting line of one of the most transformative journeys of your life. Endicott is not just a place; it’s an experience, a challenge, a community, and from today—it’s your home.” 

It might read fine, but who actually composed the speech? Has it appeared elsewhere? And, most importantly, could these bland words ever resonate with Endicott first-year students the way that President DiSalvo’s Convocation speech—infused with humor and wisdom—does each September? 

These are just some of the hot-button ethical questions confronting higher education at the moment. As a result, some American colleges have banned students from using ChatGPT, reasoning that its output borders on plagiarism and doesn’t reflect a student’s original thinking or writing. 

“We are all frontline users bringing different backgrounds to AI. It’s a black box and any use ... will require critical reflection.”—Sam Alexander, Professor of English

But instead of running away from generative language models, a forward-thinking committee at Endicott decided that for the 2023–24 academic year, each faculty member has the agency to determine how much students may use ChatGPT and other generative AI tools for classwork and exams. 

There’s even a new related course this fall, Writing with AI, an honors seminar co-taught by Associate Professor of Computer Science Henry Feild and Professor of English Sam Alexander, wherein students are challenged to push the limits of generative language models with a deep dive into writing with ChatGPT. 

Neither professor has a preconceived notion of how AI should be incorporated into academic writing. “We are learning together—in real time—with our students,” said Feild, a proponent of exploring new technology. 

Alexander, who teaches courses on literature and writing at Endicott, agreed. 


“We’re always learning from our students in some sense. For example, I’m often surprised by something new that a student sees in a poem I’ve been teaching for years. But in this case, we are all frontline users bringing different backgrounds to AI. It’s a black box and any use of AI will require critical reflection.” 

The course draws students from multiple majors. “Together, we are studying the history of writing and how generative AI turns the tables, how AI works on a technical level, whether it’s possible for AI to aid with the creative writing process, and the social implications of generative AI,” Alexander explained. 

The semester will culminate with students writing a research paper with the help of AI tools, a practical creative exercise that Feild will facilitate. “The question we have and which we are going to be exploring is whether AI can think for users or not,” he said. 

Distinguished Professor of Broadcast and Digital Journalism Lara Salahi has been monitoring the pulse of journalism for decades. “AI is nothing new,” she insisted. 

What is new, she continued, is AI’s prominence in pop culture and the additional opportunities for its interference in the American political system. 

That prominence raises significant concerns about AI’s increasing ability to generate deep fake visuals, produce racially biased images or text, and propagate dangerously inaccurate news. 

With some of these issues in mind, Salahi created a course called Politics and the Press that she consciously teaches to coincide with each presidential election cycle. The use of AI to do good or to create storms of misinformation—such as fake news stories, doctored images and videos, and even the creation of nonexistent people—is something she and her students will be watching for as the 2024 election takes shape. 

Associate Professor of Criminal Justice and Security Studies Ashlie Perry said that with the rise of misinformation and racial bias in American media, she’s also placed an added emphasis on giving her students the tools they need to avoid being victims of misinformation when they fly the Nest. 


She cited recent research on Gen Z’s consumption of media, which concluded that more than half of Gen Z teens and young adults get their news from social media feeds or digital news platforms. She’s intent on teaching students how to sift through information critically and fact-check what they read online—particularly in their feeds.

 “We look at research on how media outlets use bias to slowly manipulate or create misinformation,” said Perry of her Political Psychology class. “Of course, this is also intertwined with creating fake news. As citizens, we like to think we are rational actors, but actually, we are heavily led by bias and are attracted to media outlets that confirm that bias.” 

She argues in her Intro to Security Studies course that while it’s not possible to ever eliminate that bias, it is possible to become more conscious of it. 

“People are going to be using AI to make life easier and more efficient,” said Perry, “but we have to teach students to use it in an appropriate manner.” 

Now, Endicott is at the forefront of doing just that.