As AI seemingly seeps into every area of consumers’ lives, especially media, it’s now vital to find proper ways to engage with it.
AI-generated media content can reinforce stereotypes or surface only the most-clicked content about marginalized communities. However, experts say a few tools can benefit consumers and broaden their news consumption.
In the journalism sphere, much of the conversation is centered on generative AI—the type of AI that creates new content after studying existing data. Its counterpart, predictive AI, is more often used in the medical and business fields.
Dr. Michael Spikes, a lecturer and director for Northwestern Medill’s Teach for Chicago Journalism Program, said generative AI in journalism is often used simply to create content. These creations contain no original reporting; they regurgitate information similar to what the models have consumed.
There are also entire websites—sometimes called “pink slime” news sites—which can look like regular news sites but are filled with content often skewed in favor of one party or person running for office. This can also happen in the newsletter space, where local reporting is scaled back and replaced with AI-generated content.

“I think, with all technology, there’s a little bit of good and a little bit of the not so good,” Spikes said.
In terms of how generative AI could be used to promote accurate information, Spikes said algorithmic sources could potentially point readers to information they didn’t know about before. An algorithm can use demographics like location or interests to help find content with angles that interest that specific reader.
Also, he said AI tools could help reporters with large-scale investigations by synthesizing large amounts of data and finding trends. As with all AI content, that data still needs to be independently verified.
For minority communities who are often left behind by mainstream media coverage, Spikes said algorithms can potentially point people towards content that they haven’t seen before. However, some of this content may not have the nuance and distinction needed to educate people properly.
“Nowadays, because we are confronted as consumers with so much content, I think we’re losing the ability to make distinctions [about] what makes journalism,” he said.
People should focus on media literacy and the skills needed to decipher what’s real journalism, Spikes said. Communities of all demographics need to educate themselves on how to find authentic sources with real reporting, and not just take content at face value.
“People are being left behind even more because they’re not getting that kind of education—the platforms are not helping them do that,” he said. “They’re just feeding them tons of content.”
These media literacy skills can and should also be applied when accessing new information via algorithms. For example, Spikes said YouTube pages about Chicago news will often have a “South Side” tab filled mostly with violence or other stereotypes. He said what audiences seek out often shapes algorithms, so people need to intentionally seek out diverse news.
AI keeps growing—seemingly every search engine or app now boasts an AI feature, often requiring a manual opt-out. Spikes said he thinks it will keep being subtly introduced into people’s media ecosystems, and as it does, he wants people to have places to go for content they know they can trust.
“I would encourage people to…develop and cultivate trusted sources of information, especially information you’re going to use to take action,” he said. “People should look for organizations that have sets of practices and ethics.”
Spikes said readers can vet an information source by determining if a publication has an editing chain—by looking for a masthead, for example—signifying that multiple individuals approved the content. It’s also important to follow organizations that make it clear to their audiences when they use AI tools, and to look for ones that use AI not to create content but for tasks like data analysis.
Consumers can use these tools to identify what they’re missing, Spikes said, such as by using AI to generate questions about gaps in their news consumption.
“I think generative AI is good at looking at large bundles of content and saying, ‘Here are themes that I see,’” he said. “I don’t think that can happen in the process of writing the content. But I think overall, doing an analysis of it, that can help publishers.”
This story is part of the Digital Equity Local Voices Fellowship lab through News is Out. The lab initiative is made possible with support from Comcast NBCUniversal.
