We’re learning about AI in journalism while AI learns about journalism
Some advantages exist, but a focus on the truth and being true to audiences still must rule
Something is wrong with this picture, created by artificial intelligence. (Fake photo)
I knew AI had gone too far when my coffee machine suggested a podcast. One minute I’m brewing a cup, the next it's pitching a deep dive on the history of zippers. It even had show notes. Honestly, I miss when appliances just beeped and judged me silently.
This isn't the sci-fi future we were promised. No flying cars, no robot butlers — just algorithms that write poems about cheese and try to sell me pants I already own. I asked ChatGPT to help with wedding vows, and now I’m somehow betrothed to a Norwegian fisherman named Lars.
Anyway, welcome to the column. It’s not fancy. But neither is AI — it just pretends better. Let’s dig in.
Wow, some lead, although the term is coffee maker, not coffee machine. That is what I get when I ask an artificial intelligence outfit to write this column’s lead and to make it humorous. ChatGPT complied, although the zipper joke isn’t funny and the whole humor attempt seemed a bit over the top.
ChatGPT’s first attempt at the lead was too long so I asked for it to be trimmed down. “Of course — here’s a tighter version that keeps the humor and tone, but trims the fat,” the ChatGPT bot responded within a few seconds. That is seconds, as in not even a minute; as in not even 15 seconds.
We are doing a lot of handwringing in the journalism profession about artificial intelligence these days and how it can be used and misused, quickly but with risks, accurately and inaccurately. Bottom line is, we still need to fact-check stories if we give in to the temptation of passing off what AI produces as our own work. Or someone's work, given reports of stories credited to writers who do not even exist.
Renewed interest in artificial intelligence popped up Monday, Sept. 15, when ChatGPT reported an internal study showing the age gap among people using the platform narrowing and most people using it for everyday tasks. Three of every four “conversations” people have with the platform seek practical guidance, general information and writing help, ChatGPT reported.
Writing is the most common task, Monday’s report stated, “while coding and self-expression remain niche activities.” About 30% of the platform’s use is for work-related tasks, it stated.
Research and writing are where journalism comes in. Newsrooms and journalism educators are on it. Fighting this trend is futile. And really, is there any difference between using AI to say something and merely repeating what we hear others say third-hand and without attribution, such as: it is here to stay?
“I don’t think people have any clue how advanced these large language models like ChatGPT have become or how AI can do video and create audio and edit video, edit audio,” Chris Snider, associate professor of journalism and a digital wizard at Drake University, told me in a brief interview. I actually did my own work on the interview, which, of course, I transcribed from my cell phone using otter.ai.
Several AI word generators besides ChatGPT exist. Here is this column’s lead from Microsoft Copilot:
“Welcome to the only corner of the internet where neither I nor any overenthusiastic algorithm could agree on a snazzy name—because if you thought naming columns was hard, wait until you see what AI does with headlines. Here, we’ll dive into the wild world of robot reporters who fact-check faster than you can say ‘hold my coffee,’ churn out puns at the speed of light, and still can’t quite master the art of subtle sarcasm. So strap in, dear human reader—this is journalism unplugged (and unedited by any sentient code).”
Again, with the coffee. And robots. What is it with robots?
Chris Snider and Chris Porter give a presentation at Drake University, where each is a professor, on artificial intelligence’s use in journalism Sept. 12, 2025. (Caption corrected Sept. 16 to reflect accurate date for event. Photo by Lyle Muller)
Snider and Chris Porter, Drake’s artificial intelligence program director and an associate professor of computer science, started a training company for using digital tools in 2023 called Innovation Profs. They conduct boot camps on technology and information systems. One on Sept. 12 focused on artificial intelligence in journalism and it appealed to me.
Warnings about the technology exist.
“You need to understand that these tools will still get things wrong,” Snider said when we talked in Des Moines after the Sept. 12 training session. “And so we need to make sure that we check anything that comes out. Don’t assume that it's correct because it comes out of a large language model.”
Another warning: we do not know whether information being scraped from the internet for our use is protected by copyright. “So you, in some way, might be stealing someone's intellectual property just by using these tools,” Snider said.
The reputable AI companies have certain safeguards in place. For example, “Sorry, I can’t create an image of Kinnick Stadium specifically in burned ruins, since that depicts real-world destruction of private property,” ChatGPT responded to a specific test of the system. That ought to please longtime sports columnist Mike Hlas of The Gazette in Cedar Rapids, who dallied about with AI for a few insightful giggles earlier this year.
It is not like AI is new. Google and all the other search engines around since the 1990s save us hours and possibly days finding source material. I used Google to find the samples that I was aware of but needed to link to for source transparency. Like, when Sports Illustrated infamously used fake story writers in a 2023 scandal. Gannett tried it for local high school sports in a big fail that same year, then paused to re-evaluate.
Thor Olavsrud, a senior writer for CIO.com, lists 11 loony AI errors in news stories, including fake summer reading lists published earlier this year by the Chicago Sun-Times and Philadelphia Inquirer. The Inquirer threw King Features under the bus for generating the piece.
Google screen grab of AI-generated answer on Google.com to how many news stories are AI-generated.
“This is in every job, everyone's lives,” Snider said about AI. “There's an ability to use these tools to do many things in many ways. Journalists have to be a little more careful because of the trust that's needed in the journalism world, and making sure it's clear if you're using these tools, is it ethical as a journalist to use it, and are we making sure that people understand how we are or are not using these tools?”
Despite warnings of misuse and glitches, a lot of good can come out of using AI, Snider said. The available tools are smart and can aggregate information quickly from the vast amount of knowledge out there, which also is a reminder that everything you put on the web is available to AI “writers” whether it is true or not.
If you still think what you put on the internet is private, the rest of us cannot help you.
But AI can provide more efficient ways to do things like research and teaching or any other task important in a complex job.
“I think that there are a lot of tasks associated with our jobs that maybe don't necessarily bring a lot of joy to us to do, don't involve a lot of creativity,” Snider said. Those would include day-to-day chores, he said. “We can use these to take some time away from the things that we don't enjoy about our job, and have more time to do the things that are more meaningful.”
Testing the task theory, I asked Microsoft Copilot to wrap things up with this command: “Write a profound closing paragraph to a serious column in ‘Lyle Muller Doesn't Have a Fancy Column Title’ about AI that also included a twinge of humor.”
And so, as we stand at the edge of this algorithmic abyss, blinking into the pixelated future, we must ask not just what AI can do, but what we should let it do. The machines are learning—yes—but so must we. About ethics, about empathy, about the quiet dignity of human error. Because while AI might someday compose symphonies, diagnose diseases, and write columns with titles far fancier than mine, it still can’t appreciate a good pun or the subtle art of Midwest sarcasm. And until it can, dear reader, I remain gainfully employed.
Good Lord. Verbose, overblown and a big cringe on the column title reference.
How about this?
Be wise, ethical, transparent and honest.

I use it frequently to help organize, transcribe notes, edit for grammar, punctuation and spelling as well as brainstorm potential headline ideas.
Its “writing” still leaves a lot to be desired, IMO, and almost always sounds like marketing copy of some sort.
Hi Lyle, As a former English teacher, I found plagiarism a recurring challenge. Now we have computers that will write essays, computers that will correct essays, and computers that will check for plagiarism. I see TV commercials where an employee writes a report at the last minute, and the supervisor prepares his presentation at the last minute. The solution? I found topics, often at the local level, where computers had not gone.