I remember clearly the first moment that a computer program surprised me with its poetry. A few hundred lines of software I’d written in Java and had called SEER were not impressive for any technical elegance, nor for anything like true artificial intelligence. But my program had written a coherent English sentence relevant to my chosen subject, which happened to be snakebite, and had done so with an oracular grace that I found uncanny. After instructing SEER to summarize a dozen articles I’d found online about rattlesnake strikes, including stories, poems, and medical reports, I blinked at the results. With a venomous bite you have been given a great gift appeared in chapter one of the novel that I co-wrote with that software, italicized to show that my aging MacBook had authored those words. With dozens of other computer-generated passages, SEER accounted for about 10 percent of my unpublished first novel, Love Song of Zero and One. As literature the book was a disaster, but it was the most instructive and enjoyable failure I’ve ever experienced.
What still haunts me is that I searched diligently through the text I’d asked SEER to summarize for anything like the above phrase. But I couldn’t find any paradox or sage pronouncement there that the machine had regurgitated, no evidence that instead of writing originally, my software had just replicated thoughtlessly a pre-existing literary notion that it had failed to digest and re-compose. In fact, the word gift didn’t appear at all in the source material I’d fed to SEER, the Sentient Electronically Engineered Recounter. My use of the word “Sentient” in the acronym had been a joke, of course. This was just a laptop with a database of words and a Markov model for text generation.
I should take a moment to recount, in what geeks call natural language, the events that led me in that winter of 2011 to take the inspired and desperate step of sharing authorship with a machine. Fifteen years before, I had joined the technology workforce just out of UCLA clutching my bachelor’s degree in Creative Writing. That was around the time that the Internet became the World Wide Web. I soon took a job testing computer games because it paid better than working in a deli, and it required only half-assed joystick skills along with an ability to describe precisely in writing any software defects I found.
The interview for the job asked candidates to submit written instructions that would enable an alien life form, one that had never before seen a telephone, to make a phone call. At that point I should have realized I was entering a work world that was less oriented toward humans than ever before.
But I learned to test software, then learned to manage the testing process, and then taught myself to program computers when testing software proved a career dead end. By night, inspired by Virginia Woolf, William Faulkner, Gabriel García Márquez, and Salman Rushdie, I wrote fiction with the hope of producing something worthy of my loftiest goals. And I survived as a humanoid in the digital economy, spending most of my time with computer scientists and engineers and MBAs, who tended to mock me gently for my impractical education.
Some of my friends who had obtained MFAs published more than I did, and most became teachers of writing or supported their novelistic aspirations as journalists or waiters. They also faced continuing money worries, an ache that I’d largely left behind with my transition into tech. So I persisted in writing fiction after hours, and software in Perl and SQL and Python during the workday, achieving literary milestones by placing short stories in respected journals. But like most writers, I found myself without the prizes and spectacularly lucrative book contract that would let me quit my day job and write full time.
To my surprise, I found programming quite manageable as a trade after the initial hurdles. Handling variables and loops required no more math than high school algebra, and learning the syntax of a foreign tongue was something I’d enjoyed when studying Italian and French. Within a year, I’d doubled my income. The condescension with which my colleagues had treated me was gone. I wasn’t the smartest or the most technical person on the team, but I was a software developer, and I eventually wrote programs to do many things. One of the most interesting challenges I’d seen others in my department take on was writing software to read and author news reports.
In the technology departments of Wall Street where I worked for a decade, geeks competed to create ever more efficient and accurate natural language processors. These read the financial results contained in press releases the moment they hit the newswire, and decided programmatically whether to buy or sell that company’s stock within microseconds. I thought this was interesting, but not the most intriguing use of that software.
I had been researching a biographical novel I was writing about Georg Cantor, the great German mathematician whose set theory created much of our current understanding of infinity. I’d imagined his story interwoven with that of a present-day technologist obsessed with grasping infinity through a software program, but the narrative quickly crumbled under the weight of all the math that I grasped only slenderly, and that I found hard to render as eloquently as David Foster Wallace had in his nonfiction book about Cantor called Everything and More.
Soon Love Song of Zero and One became the tale of a contemporary couple’s breakup under the pressures of the digital age. Sara, the wife in Love Song, was a professor of poetry married to a male technologist who tried to understand William Blake by using his company’s news parsing software to summarize those poems in plain English. For a week or so I tried writing the computer’s output myself, pretending that I was a machine interpreting Blake’s poetry. Frustrated with my results, I assigned myself the challenge of creating the parsing software on my own time. But for that I had to learn intermediate level Java programming.
I should note that my interest in combining technology and literature goes back to my undergraduate days, when I wrote a poem that rendered the opening of my favorite play as instructions in AppleSoft BASIC, a language I’d learned in sixth grade computer camp.
10 ENTER LEAR
20 PRINT “OUR DARKER PURPOSE”
60 IF CORDELIA < GONERIL THEN 70
70 IF CORDELIA < REGAN THEN 80
80 KINGDOOM = LAND/2
90 GONERLAND = KINGDOOM
100 REGANLAND = KINGDOOM
The above lines approximate BASIC’s syntax, and the algorithm for daughter-comparison is flawed, but it gets the point across. My girlfriend at the time, a poet, adored my typo KINGDOOM, and thought that line summed up the play in one command.
In a sense, all of SEER’s most interesting passages and successful parsings were mistakes like KINGDOOM, a substitution without thought of one unit of language for another that still retained the patina of meaning. For SEER I used Markov chains as implemented by Dr. Daniel Howe’s freeware. Markov chains are probability-based models that predict the next item in a sequence from the current one, akin to how Google autocompletes your search terms with those that other searchers have used.
And that was the whole magic. In its final version, SEER read text, and with some basic grammar and a dictionary of words, used randomness and probability to guess at an accurate summary of its input.
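The mechanics behind that magic are simpler than the oracular output suggests. A first-order, word-level Markov chain can be sketched in a few lines of Python. This is an illustration of the general technique, not SEER’s actual Java code, and the function names and sample text here are my own invention:

```python
import random
from collections import defaultdict

def build_chain(text):
    # Map each word to the list of words that follow it in the source text.
    # Repeated followers stay in the list, so common transitions are
    # proportionally more likely to be chosen later.
    words = text.split()
    chain = defaultdict(list)
    for current, following in zip(words, words[1:]):
        chain[current].append(following)
    return chain

def generate(chain, start, length=8, seed=None):
    # Walk the chain: from each word, pick a random recorded follower.
    rng = random.Random(seed)
    word = start
    output = [word]
    for _ in range(length - 1):
        followers = chain.get(word)
        if not followers:
            break  # dead end: the last word never appeared mid-text
        word = rng.choice(followers)
        output.append(word)
    return " ".join(output)

# A toy corpus standing in for the snakebite articles.
source = ("with a venomous bite you have been given a gift "
          "a bite from a rattlesnake is a medical emergency")
chain = build_chain(source)
print(generate(chain, "a", seed=1))
```

Every generated sentence is stitched from transitions that actually occur in the source, yet the particular path through them is random, which is why no two of SEER’s summaries were ever alike, and why a phrase can sound original while containing nothing the input didn’t supply.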
But it wasn’t until after my hard month of work learning Java and understanding Dr. Howe’s freeware that I could give SEER its first assignment. The character of the husband in my novel, a technologist from Texas, returns to his home state, where a rattlesnake bites him. So I fed my program the top 10 articles from Google’s search results on snakebite. With oracular brevity, SEER’s response was With a venomous bite you have been given a great gift.
When I completed Love Song of Zero and One, it included whole paragraphs of SEER’s output, most of it insecurely mounted on the drama-starved story I’d created to carry it. One paragraph was SEER’s summaries of The Collected Works of William Blake. No two of my programmatic summaries of Blake or of any other input were alike, which wasn’t a sign of my code’s effectiveness but of its reliance on randomness.
I submitted the completed novel to five agents, all of whom turned it down. The most gracious of them said he enjoyed the experiment and fully expected to regret his rejection of the book. He should not regret it, just as I don’t regret authoring it. What did it teach me? That technology still doesn’t understand us. Alan Turing’s famous test for machine intelligence proposes that a computer that speaks sensibly in response to questions, offering the appearance of intelligence, cannot be called anything but intelligent.
I disagree. What SEER and Love Song taught me is that mere replicas of thought and creativity are easy to create with probability machines, even replicas that sound eerily wise. What is hard to create, in humans and in machines, is real thought and genuine creativity.
I continue writing and submitting fiction with increasing success, while composing software for money. And I occasionally tinker with computer-generated literature, which has a long history going back to the 1970s and before, as noted in J.M. Coetzee’s book of essays Doubling the Point, and elsewhere. Certainly it’s possible that companies like DeepMind, which Google purchased and where Elon Musk is a director, are moving us quickly toward a Singularity event, in which software becomes self-aware.
But as participating in any undergraduate creative writing class makes obvious, the gap between simple self-awareness and the literary intelligence necessary to compose a worthwhile novel will always be vast.