Adventures in Writing: AI
I must admit I’ve had some experience with generative AI writing. I know I should be ashamed of this, but I’m not. Here’s what I learned about this tool or terror.
When ChatGPT was new, I was intrigued. I just had to try it out. I spent a couple of days feeding it some of my favorite topics and styles to see what it could do with them. What it produced was largely crummy.
For example, I asked it to write a haiku on the subject of sex. What it produced was: “Whispers in moonlight/Bodies weave an intimate/Haiku of passion.” Not exactly T.S. Eliot.
I asked it to write other kinds of poems. In almost every one, ChatGPT included the word “poetry” or the name of the style: “The poetry of passion’s sweet romance” or “O sonnet of the flesh,” “Each line a brushstroke in the poet’s light,” “In every sonnet, life finds sweet peace.”
It also relied heavily on metaphors, many different ones in the same piece. For example, in a poem on writing, it offered “With verses woven like a tapestry. The writer’s heart, an open book” and “how the words, like melodies, entwine, In stanzas, whispers of a silent song.” Now that’s just bad.
It mixed metaphors atrociously: “The enigmatic tapestry…kaleidoscopic hues… orchestrated by unseen forces, paint the canvas of existence… a symphony of discordant notes, each mood a chapter in a cosmic novel.” All those in one paragraph. Another particularly egregious one was “etched into the fabric of one’s existence.”
It also got facts wrong. Recently, hoping that it had improved, I asked it to write a country song about horses in the style of Willie Nelson. “He’s a big ol’ bay with a coat dark as night” was one of the lyrics. (Bay horses are reddish brown.)
Later on, I had a chance to give a workout to a different AI program that had been tasked with writing ten chapters of a novel. My job was to clean the result up into human language. It was a lot of work.
AI couldn’t keep track of the characters, for one. It called a little boy Nicky, Billy, and Jamie and another character Henderson and Nelson in different chapters. It chose weird words (“thrummed”) and repeated phrases (“weight of the world”) in nearly every chapter. It forgot that a pivotal scene was supposed to take place in an abandoned warehouse, setting it instead in the woods. Once I had to have it produce a whole new chapter because the first one was so far off. Plot, characters, dialogue, narrative, backstory, continuity — all would be rejected by any competent editor.
AI nonfiction can be just as bad. It tends to be simply wrong about names, dates, and locations, for example, and flaws like that seriously mislead readers. AI nonfiction has also been known to get things like herbal remedies drastically and dangerously wrong.
Once I toyed with an AI image generator. It did reasonably well with a human girl (though it couldn’t produce medium-length auburn hair), but it couldn’t make an alien that met my criteria. Every attempt looked like one of those sad-eyed children in the old paintings.
I understand why special effects professionals hold AI in horror. Entire departments are being canned and replaced by larger, more sophisticated sorts of AI than the Tinkertoy ones I played with. But movies have been using CGI for years, so AI is the next logical extension. I’m not saying that the SpFX companies are right to abandon the people who have done so well for them over the years. Human imagination controlling the tools of creation will — or should — always have a place in the equation.
But generative AI has a long way to go before it can produce prose or poetry that can substitute for human works. I understand that publishers are being assailed with AI-produced novels, but I can’t imagine they can’t tell the difference. Readers of self-published novels, though, should watch out for AI books and avoid them.
Or, if you prefer, avoid all AI writing. I understand why you would.
(Note: Aside from the brief quotes, no AI was used to write this post.)