Tag: fantasy-fiction

  • AI is eating publishing from both ends (and we’re all pretending it’s fine)

    Here’s the situation, folks: we need to talk about the AI elephant in the room, because it’s not just sitting ominously in the corner anymore—it’s rearranging the furniture, helping itself to your food, and leaving its damn prompts in published novels like dirty socks on your pillow.

    The war against the robots is here. We all thought we’d be facing down a 6’4” killing machine with an Austrian accent. Turns out our greatest foe is an overly friendly jumped-up chatbot that will happily provide a recipe for strawberry cheesecake.

    I’ve been watching this unfold for two years now, and I’m here to tell you something that might make you uncomfortable: the AI disruption of fantasy self-publishing isn’t coming. It’s here. And it’s eating our industry from both ends.

    On one end, you’ve got the democratization dream—AI tools making professional-quality covers, editorial assistance, and even decent prose accessible to anyone with a ChatGPT subscription. On the other end, you’ve got the slop tsunami—an endless flood of algorithmic garbage that’s drowning out real authors and turning Amazon into a digital slush pile.

    As always, YA is the canary in the mineshaft here. One 2023 survey by indie author Caitlyn Lynch found that 81% of the bestselling YA romance ebooks she sampled on Amazon appeared to be entirely AI-generated.

    Fantasy is next. And somehow, we’re all just… pretending this is fine.

    First, the good news (yes, there is some)

    Let’s start with what’s actually working, because I’m not here to be a digital Luddite screaming at artificially rendered clouds.

    AI is legitimately solving real problems for self-published authors. Remember when getting a professional fantasy cover meant dropping $800-2000 and waiting weeks for revisions? Now you can generate something genuinely gorgeous with Midjourney for the cost of a decent lunch. I’ve seen indie authors create covers that would make traditional publishers weep with envy, all because they finally have access to tools that don’t require a degree from Art Center.

    Fantasy covers have always been about evoking atmosphere—dragons, castles, mysterious figures in cloaks, the whole nine yards. AI is actually brilliant at this because it excels at combining familiar visual elements in emotionally resonant ways. That dream-like, slightly surreal quality that makes Midjourney perfect for fantasy? That’s not a bug, it’s a feature.

    Across the process, the economics are transformative. Professional editing used to run $500-3000. Cover design was another $300-800. Formatting, marketing copy, audiobook narration—it all added up to thousands before you’d sold a single copy. Now Claude can help you tighten your prose, Midjourney can handle your cover, and AI narration is getting good enough that audiobook production costs are dropping through the floor.

    For progression fantasy and LitRPG authors especially, AI is proving invaluable for managing complex world-building. All you need to do is ask ChatGPT to keep track of your magic system’s rules across seven books, or to generate consistent stat progressions for your characters. These are exactly the kinds of systematic, logical tasks that AI handles brilliantly.

    I know authors pulling in six figures who use AI extensively as a creative partner. Not to create their voice, but to amplify it. They’re writing faster, publishing more consistently, and building audiences that traditional publishers would murder for.

    A long time ago writers had no choice but to write entire novels on typewriters. Then word processors came along. Then laptops. In recent years great books have been written almost entirely on iPhones. Any tool that makes life easier for proper authors and gets genuinely great writing into our hands is a net benefit. It isn’t cheating. It’s evolution.

    And the bad news (it’s badder than the good news is good)

    But here’s where we hit the wall, and it’s a big one.

    The slop is real, and it’s getting worse. Amazon has had to cap self-published uploads at three books per day. Three books per day. Think about that. Think about the kind of person who is perfectly happy generating an entire book with ChatGPT, then self-publishing it on Amazon more than three times a day. These people can be nothing other than absolute charlatans. And apparently there are enough of them that Amazon needed to put limits in place to stop them.

    The Kindle store lists over 1,000 books that cite ChatGPT as a co-author. And those are just the ones whose authors chose to come clean.

    We’ve seen books where authors literally left their prompts in the published text. Lena McDonald’s “Darkhollow Academy: Year 2” included this charming passage: “I’ve rewritten the passage to align more with J. Bree’s style, which features more tension, gritty undertones, and raw emotional subtext beneath the supernatural elements.”

    It’s not just embarrassing for Lena McDonald; it’s insulting to her readers.

    When KC Crowne got caught with 171 titles containing AI artifacts like “Thought for 13 seconds,” it wasn’t just one bad actor. It was a glimpse into an entire ecosystem of people treating our genre like a content farm. They’re not writers. They’re not storytellers. They’re digital sharecroppers harvesting Kindle Unlimited page reads with algorithmic efficiency.

    This matters because fantasy readers are sophisticated. They notice when your magic system doesn’t make sense across chapters. They catch inconsistent character voices. They can smell derivative plots from orbit. 

    The tragedy is that readers won’t discover these failings until after they’ve bought the book. And that initial sale is the only thing these crooks care about. AI slop doesn’t just betray readers—it actively trains them to be more suspicious of self-published work in general. Which takes a brilliant, positive trend and forces it hard in the opposite direction. Nice work, Sam Altman!

    Hey! Here’s 250 words on how I’m destroying your industry

    The really scary thing is that AI isn’t just changing how books get made. It’s changing what books are.

    When major publishers like Tor start using AI-generated assets “by accident” (twice now, for Christopher Paolini’s Fractal Noise and RuNyx’s Gothikana), when Sarah J. Maas gets AI-generated wolf heads on her covers through Adobe Stock, we’re witnessing in real time the normalization of artificial creativity in an industry built on human imagination.

    The economic pressures are only getting worse. Median author income hit $20,000 in 2022, with only half coming from book sales. When you’re competing against someone who can pump out three “novels” a day using ChatGPT, how do you price your work? How do you justify spending six months crafting a story when the bot next to you is optimizing for engagement metrics and SEO keywords?

    We’re teaching readers to accept lower standards. Every AI-generated book that gets decent reviews because it hit the right tropes and kept the plot moving is quietly lowering the bar for what constitutes acceptable storytelling. When readers start expecting books to feel like algorithmic modules of content—predictable, optimized, safe—what happens to the weird, wonderful, genuinely surprising stories that make this genre worth reading?

    Remember what I mentioned earlier about ChatGPT being perfect for tracking progression fantasy and LitRPG systems? It also makes these genres particularly vulnerable to abuse. When AI can competently handle stat progressions and skill trees, what’s left that feels uniquely human?

    So how do we be more Sarah Connor?

    AI tools are here to stay, and many of them genuinely make books better. But if we want to stop the whole sector going up like the Hindenburg (and as fast), we need to get smart about this, like yesterday.

    First, let’s stop pretending AI assistance and generative AI are the same thing. Using Claude to polish your dialogue is fundamentally different from asking it to write your novel. One refines your voice and saves you some grunt work; the other creates a voice for you (the same one it’s created for thousands of others) and demands none of the qualities it takes to become a great writer. The distinction is crucial.

    Second, we need better detection and curation. Amazon’s three-book-per-day limit is a start, but platforms need to get serious about filtering non-carbon-based garbage. If Spotify can identify AI-generated music, Amazon can flag AI-generated novels. The technology exists—it’s just a question of which approach will make Mr Bezos the most money. Which brings us to point three.

    As a community and a cash cow, we need to reach a point where the avalanche of AI-generated dirge stops making financial sense to anyone involved. Starting with the teenagers using it as a summer side-hustle, up through the publishing houses whose bottom line keeps them up at night, and all the way to the top of the Amazon shareholder food chain. That means identifying it, and buying a lot less of it. Seeking out authors with real online presences, consistent publishing histories, and actual human responses to reviews is a good start. Supporting writers who engage with their communities like humans, not engagement-optimization bots, will be a great help too.

    Most importantly, we need to elevate what AI can’t do. Genuine emotional intelligence. Cultural specificity. Personal experience. The kind of messy, human insight that turns a decent story into something that changes how you see the world.

    AI can write you a perfectly serviceable hero’s journey. It can’t write you Jorg Ancrath’s particular brand of charming sociopathy, or Harry Dresden’s relationship with technology, or the specific way N.K. Jemisin uses geological metaphors to explore systemic oppression.

    Most of us become avid readers because at some point in our lives a certain text moved us deeply. In all its wrinkles and texture and left-turns, it felt like it was written just for us. That’s something humans – and only humans – have been doing for thousands of years. 

    Those uniquely human elements are our competitive advantage. But we must celebrate and champion them if we want to keep them. If you’re an early-career writer, you have to trust that it’s your unique perspective on the world that will cut through the noise and build an organic readership. Forget metrics. Forget, if you can, money and sales (for now). Focus on making your work uniquely yours. People will find it, and treat it like it’s uniquely theirs.

    AI can be a tool that makes better writers more productive, or it can be a weapon that does tremendous damage to our creative economies. The difference isn’t in the technology—it’s in how we choose to use it.

    We’re at a crossroads, friends. We can embrace AI as a creative partner while maintaining human standards for storytelling. Or we can slide into a Matrix-like purgatory and spend our sad little lives sucking down pre-digested gloop.

    I know which future I want. It’s not the gross gloop one.

    What’s your take, fellow degenerate loyalists? Are you using AI tools in your creative process? How do we separate the genuinely helpful from the soul-crushing slop? Let me know in the comments—but please, for the love of Sanderson, make sure it’s actually you writing them.