Last year, I interviewed a mother whose daughter had a genetic condition so rare there are fewer than 300 known cases in the world.
She didn't speak like someone giving a quote — she spoke like a neighbor. Like someone you might see at one of the Gulf Coast's plentiful social events or the grocery store. Though strong, her voice trembled ever so slightly when she described the endless doctor visits, the sleepless nights, the moments of hope. She smiled when she said her daughter loved to dance.
I didn't write that story because it would get clicks.
I wrote it because it mattered — to her, to her family and to this community. That's what human journalism does. It sees the people behind the headlines. It listens. It connects. It reminds us we're not alone.
That kind of storytelling — real, local, personal — is built on trust and truth. But in today's world, where content spreads faster than context, it's getting harder to tell what's real and what's been manufactured for clicks or laughs.
For example, on Jan. 16, 2017, YouTuber Jack Douglass — known to thousands as Jacksfilms — posted a video titled "FAKE FACTS: Let's Fool the World!"
It was part of his popular "YIAY" (Yesterday I Asked You) series, known for crowdsourced comedy steeped in satire. The concept was deceptively simple: invite fans to submit completely false, meme-worthy "facts," then share them online to see who might fall for the bait.
The videos proved popular, and Douglass continued them as part of the YIAY series, eventually adding "Bamboozled" episodes in which fans shared the responses they received after posting fake-fact memes.
One such fake fact claimed, "2 out of 3 Alabamians cannot name more than 5 other U.S. states... and 10% cannot spell 'Alabama.'" Responses poured in. One person, taken in by the hoax, commented, "I just read that." Another: "At least they are good at college football." And another: "This is the world we live in unfortunately."
That video, while comedic and clearly labeled satire, reflected a troubling truth: people believe what they want to believe.
Almost a decade later, fake "facts" are everywhere — on Facebook, in group chats, on X threads, in YouTube Shorts. They cover celebrity deaths, politics, science, health — you name it.
And with the rise of artificial intelligence, things are only getting murkier.
As a journalist and editor in Baldwin County — the state's largest by land area, bigger than Rhode Island and even some countries, such as Monaco, Liechtenstein and Luxembourg — I believe the job of the press isn't just to inform. It's to serve. To care. To understand. To live among the people we cover.
AI might produce copy in milliseconds. But can it sit across the table from Crawford McWilliams, whose daughter, Shreve, lives with the rare CTNNB1 syndrome, and talk about the challenges and victories of advocating for awareness, as I did last year?
Can it walk the banks of Little Lagoon with Aidan Holdsworth, a local high schooler working to restore a living shoreline by hand-planting black needle rush grass to fight erosion?
Can it capture the emotion of Tyler Burkett, who returned to the Gulf Shores locker room to prep for his senior football season just months after undergoing chemotherapy for Hodgkin lymphoma?
Or the grief of Amy Childress, who lost her son Garrison to suicide after years of struggling with depression and anxiety?
AI doesn't feel. It doesn't cry or cheer. It doesn't root for a small-town football team. It doesn't sit in a church pew during a funeral or take notes at a high school graduation.
Art, journalism and storytelling are more than the transmission of information. They are reflections of our lived experience, shaped by emotion, memory and meaning. A news article written by someone who has stood beside you and cheered for you at a state championship or cried over the loss of a neighbor isn't just different from AI — it's essential. Human-made work captures the nuance of our communities. It's not just about who, what, when and where — it's about why. And only a human can ask that with empathy. To strip that away in favor of efficiency isn't just a technical change. It's a loss of soul.
Yes, AI is helpful. I use it for spellcheck, Google Maps, search engines and more. These are tools — but they must not become the tool.
Because as AI creeps further into our lives — from art generation to journalism to law and even medicine — it's learning from us. All of us. That includes trolls, bad data, fake news and misinformation, outdated studies and yes, even those satirical memes meant to prove a point.
And then it gets really dangerous.
A recent article published on Forbes.com by Jack Kelly, titled "The Jobs That Will Fall First As AI Takes Over the Workplace," lays out a grim outlook. It cites projections from McKinsey, PwC and the World Economic Forum that say 30% of U.S. jobs could be automated by 2030. That number could rise to 60% by 2050. Goldman Sachs estimates 300 million jobs could be lost worldwide.
Customer service, paralegal work, journalism, bookkeeping and design are among the first expected to be affected. Even empathy-driven fields like therapy and teaching aren't entirely safe.
Kelly quotes Ray Dalio of Bridgewater, who says workers need to prepare for a "great deleveraging" as AI outpaces job creation. Larry Fink of BlackRock already sees AI reshaping legal and finance roles. JPMorgan CEO Jamie Dimon estimates 20% of analytical jobs at his bank could vanish by 2030.
"Graphic design, copywriting, and basic journalism face disruption from tools like DALL-E and GPT-derived platforms, which produce content at scale," Kelly writes in the article. "A 2024 Pew Research Center report notes 30% of media jobs could be automated by 2035."
Even with research aimed at making AI more human-like, it will never be human. At best, it can only simulate humanity. So, I ask you, in a world already full of hate and disconnection, do we really want a future without humanity?
At Gulf Coast Media, our newsroom may be small, but it is passionate. Executive Editor Kayla Green, Photojournalist and Chief Digital Officer Micah Green, reporters Ruth Mayor and Colin James, myself and even Publisher Vince Johnson — we don't just report stories. We live them. We do our best to cover Alabama's biggest county, and some days, we still feel like we can't cover enough (OK, we know we can't cover enough — Google Maps may say the drive from Bay Minette to Gulf Shores is a little over one hour, but we know the truth: sometimes it is more like two).
We don't get it perfect. We have typos. Sometimes we get things wrong. But we're human. Accountable.
AI might produce slick copy or mimic a news anchor's cadence. It might spit out art that looks beautiful. But it will never replace the handmade ceramics of Tom Jones, the jewelry of Bill Wismar or the brushwork of Loran Chavez. It won't write poetry like Gulf Shores' Jessica Jones or tell stories like local historian Harriet Outlaw. It might write a catchy song — but not like Mike Turner of Fairhope can.
And I certainly don't want my news written by AI. You shouldn't either — not just because I would be out of a job (though that's certainly part of it), but because reporting isn't just about relaying information. It's about trust, curiosity and context — things you can't download, code or fake. Look to AL.com, the Mullet Wrapper or, better yet, Gulf Coast Media. You'll find people who don't just write the news. They live it.
Even in gaming, the warning rings true. In "Subnautica," an open-world underwater survival game released by Unknown Worlds Entertainment in 2018, a character named Medical Officer Danby says:
"I'm not really a doctor. I know that's what my ID says, but I never have been. Cheated the medical exams. What does a doctor these days need to know about manually resetting bones? When was the last time a top surgeon actually cut someone open? That's what the robots are for.
"Doctors these days read diagnoses off of computer readouts. For that, I'm perfectly qualified.
"But what good is it when I'm not connected to the main network? I'm bleeding. I've got glowing green pustules growing on my hands. I run a self-scan and it tells me I've got skin irritation. The only thing I studied in medical school was how to lie convincingly. What the hell do I know about treating an alien disease?
"I think I'm actually going to die down here."
He knew the truth too late: no amount of simulated knowledge can match the value of human experience.
As we mark another World Press Freedom Day, and as AI moves deeper into our lives and our jobs, remember this: Freedom of expression means nothing if it isn't backed by a voice with a heartbeat.
Facts need verification. Stories need soul.
So yes, let's use AI as a tool. Let's adapt. Let's prepare. But let's never let go of our humanity.
Because when it comes to your news, your community, your truth — you deserve more than a machine.