The Ghost in the Machine: AI’s Cold Touch in Publishing
By Jeffrey Mangus, CEO, Mangus Media Group
As someone who’s ghostwritten for executives, celebrities, and thought leaders for over ten years, I know the heart of a great book lies in its human vulnerability—the raw, spirited emotions, the personal struggles, the triumphs that make readers feel seen.
But lately, AI has burst into the publishing scene like a roaring freight train, and it has… dare I say it, made me nervous.
I am not afraid of losing my role or being replaced, but I am uneasy about what it’s doing to the art and craft of writing.
Writing is pure, free, and individualized. It’s personal, a way of turning ourselves inside out, hanging it all out there, and being brave enough to tell the world what we feel and what we see. It’s our words and our perspective. It hurts the heart, and that’s the beautiful part of it all.
In walks artificial intelligence, and suddenly that purity becomes dark, tainted, less worthy, and, sadly, less valuable.
I fear the damage to writing and to maintaining the honesty and integrity of our words and efforts.
With AI, there’s a ghost in the machine, simulating a soul but lacking the genuine human spirit.
I want to explore the ethics of AI-assisted ghostwriting, the moral disgust it often evokes, and why publishing risks losing its core essence without that human touch.
The recent numbers don’t lie: AI is infiltrating publishing at breakneck speed.
In 2025, a whopping 71.7% of content marketers are using AI for writing tasks, from drafting to editing. In newsrooms, 87% of leaders report being “fully or somewhat transformed” by generative AI.
Ghostwriting, traditionally a human art of channeling someone else’s voice, is evolving, too. AI tools now assist in organizing timelines, suggesting structures, and even generating complete memoir chapters.
But for me, here’s where the moral disgust kicks in at full throttle.
Psychological studies show that when readers learn emotional content was AI-generated, they feel deceived and cheated, leading to reduced loyalty, fewer book sales, and weaker word-of-mouth marketing.
It’s that gut reaction—disgust at the perceived fakery.
I’ve felt it when reviewing AI drafts; they mimic emotion but miss the nuanced pain of lived experience.
This disgust ties into broader ethics questions: Who owns the words? Is it authentic if a machine “feels” for you?
Research from 2024 highlights how AI-labeled responses make people feel less heard, even if the AI detects emotions better than humans.
Authors also feel guilty—using AI feels like cheating, short-circuiting the vulnerability that builds real connections.
Writing is a craft, and it deserves respect as a fundamentally human endeavor.
Yet some writers seem satisfied to let AI do the grunt work and then infuse the human element at the end.
A 2025 forecast predicts a backlash against AI’s weaknesses, such as flat narratives, which helps explain why ethics-focused journals are buzzing about the issue.
Consider the controversy around “Psychic Witch Magick” by Mat Auryn and “Dark Goddesses” by Rebecca Silva. Popular in the spiritual nonfiction niche, these books scored alarmingly high on AI detection tools—97% and up to 99% in samples.
Readers flooded forums with testimonials of disappointment and frustration: “It felt robotic, lacking the emotional depth I crave in witchy reads,” one said on Reddit.
Before AI suspicions, the books sold well, but after the reveal, sales dipped as moral disgust spread.
The authors denied heavy AI use, insisting that human ghostwriters were behind the books, but the damage was done. This mirrors a broader trend:
Amazon is flooded with AI-generated knockoffs that corrupt bestseller lists.
On the other hand, a human-led rewrite of similar content—think vulnerability-infused drafts—restored reader joy.
Testimonials from literary communities highlight this: “Human stories make me nostalgic for that authentic craftsmanship; AI just leaves me cold,” one author group member shared.
This leads to AI’s cold touch: Publishing thrives on emotional exposure, but AI simulates without embodying it, resulting in flatter narratives.
Studies comparing AI and human writing show AI outputs are more positive in tone but lack the distinctive word choice and sentence variety that convey real vulnerability.
Human drafts evoke joy through connection—think of the nostalgia of traditional storytelling in classics like Tara Westover's “Educated,” where raw pain resonates.
AI, however, frustrates with its robotic predictability.
Before-and-after comparisons drive this home: an AI first draft of a memoir chapter might spit out a generic “overcoming adversity” arc in minutes, but a human revision adds the deeper emotional layers, like survivor’s guilt, the pain of loss, the sting that hooks readers.
In one experiment, AI-generated beach reads were pitted against human ones; the AI was faster, but the human writing won on engagement.
Success stories in human-led publishing abound. Take “Atomic Habits” by James Clear—pure human insight, no AI shortcuts, selling millions through relatable vulnerability.
Readers testify: “It felt like James was speaking directly to my struggles.”
AI can’t replicate that soul. A 2025 review of emotional AI in content creation found that human-AI hybrids work best when humans lead, avoiding the disgust that full automation evokes.
As we navigate 2025’s AI revolution, with tools like Ghostwriter AI promising quick books but delivering hollow ones, I urge transparency.
Label AI use, blend it ethically, and preserve human vulnerability. At Mangus Media Group, we use AI for research but keep the heart human. Publishing’s soul depends on it—let’s not let the machine’s ghost haunt our stories.
Speak to our team and unleash the human power of great writing for your book.