What if I told you there almost wasn’t a raven in Edgar Allan Poe’s “The Raven”? What if I told you that, instead of having his nameless narrator drive himself mad beneath the shadow of a grim and stately raven of the saintly days of yore, Poe almost went with a parrot?
I agree with your derisive scoff. But the truth is this ridiculous hypothetical isn’t quite as ridiculous or hypothetical as you might think. It’s absolutely true — for a split second, Poe was going to write “The Parrot.”
The reason this seems so instinctually wrong has a great deal to do with our collective idea of Poe. While even people uninterested in literature were probably forced to read a couple of his short stories or poems in school, that alone can’t account for his iconic status in pop culture. To give just one example, his face appears on countless t-shirts, which are usually jet-black and adorned with dead-eyed ravens, chalk-white skulls (sometimes his own, poking through his flesh), and other equally chipper images, along with the occasional quote about insanity or despair. Not that such merchandise needs to add much to get across a macabre vibe — with his sunken eyes, bulging forehead, and perpetual grimace he apparently thought counted as a smile, Poe’s face alone conveys the darkness of the world with which we associate him.
The problem is that this idea of Poe marketed to people through shirts and mugs and so much more is an unfair caricature of a profound and multifaceted artist. I’ll admit that there are more than enough heartbroken men sleeping alongside dead lovers in crypts and mass murders at masque balls — and that’s without going into the weird stories — to justify seeing Poe strictly as a horror writer. But he wrote far more than simply horror. For instance, those who are familiar with more than just his most famous works likely know about his character C. Auguste Dupin, a coldly logical detective so similar to Sherlock Holmes it’s easy to forget that Holmes was influenced by Dupin, not the other way around. As much as he had a permanent impact on horror, Poe was just as important in the development of detective fiction.
But what truly makes Poe so unique among authors is the mathematical philosophy underpinning his work, and there is no better way to appreciate the strange synthesis between art and science Poe achieved than by examining his essay, “The Philosophy of Composition.” This essay offers invaluable insight into how Poe created “The Raven,” and it extends hope to any of us who have ever picked up a pen and tried to translate the hurricane of nameless emotions within us into words so that we might better understand ourselves.
Poe’s “The Philosophy of Composition” could almost have been titled, “The Anti-Poetic Manifesto.” That’s because before he explains precisely how “The Raven” came to be, he spends the first few paragraphs launching a savage attack against the idea of poets he believes most people possess. Specifically, he loathes the idea that poets are some kind of elevated species, far more insightful and wise than the rest of the slobbering masses. But he doesn’t blame us for this misleading impression; he blames writers. Early on, Poe claims, “Most writers — poets in especial — prefer having it understood that they compose by a species of fine frenzy — and ecstatic intuition — and would positively shudder at letting the public take a peep behind the scenes.” This charge brings to mind a letter John Keats wrote to John Taylor on February 27, 1818, in which he said, “…if Poetry comes not as naturally as the leaves to a tree it had better not come at all.” Keats might not have claimed to be inspired by a “fine frenzy” or “ecstatic intuition,” but there is still a sense that great poetry either bursts out of us perfectly polished from the start or…not, with no in-between. Poe, in contrast, not only disagrees, but believes great poetry can only exist by working through that in-between. Put another way, Poe does not treat poetry as a gift someone must be born with to possess at all, but a craft that can be honed through practice. And if it really is a craft, well, then why couldn’t any of us write “The Raven”?
You might be derisively scoffing for the second time, but why not? If nothing else, “The Philosophy of Composition” argues forcefully and repeatedly that good writing is the result of good choices. The key is to know what questions to ask, something Poe teaches us through an examination of every choice he made to produce “The Raven,” going so far as to say that his essay will “render it manifest that no one point in its composition is referable either to accident or intuition — that the work proceeded, step by step, to its completion with the precision and rigid consequence of a mathematical problem.”
Poe isn’t kidding. He goes into such meticulous detail that it would be impossible to discuss every choice. To give a taste of the essay, however, it’s worth examining how some of the most famous elements of “The Raven” came to be.
To begin with, how did he come up with the subject of the poem? First, Poe considered “Beauty…the sole legitimate province of the poem,” or, more specifically, “the contemplation of the beautiful.” Poe also believed “Melancholy is…the most legitimate of all the poetical tones.” Put these two ideas together, and Poe concluded that “the death, then, of a beautiful woman is…the most poetical topic in the world — and equally is it beyond doubt that the lips best suited for such a topic are those of a bereaved lover.”
Moving on to more mechanical elements, how did Poe come up with the haunting “Nevermore”? Well, first he decided using a refrain at all would be a good idea because so many great artists have used it before, and, “The universality of its employment sufficed to assure me of its intrinsic value, and spared me the necessity of submitting it to analysis.” Then, he decided the refrain should be brief and determined the “character” of the word by noting that “o” is “the most sonorous vowel, with r as the most producible consonant.”
But how to naturally insert the refrain? This question puzzles Poe at first. After all, if he was going to have a dialogue with two characters, it would be hard to imagine how one could always appropriately respond with the same word to the other’s, presumably varied, questions. Unless, that is, one person in this dialogue was not a person, but an animal. This leads to my favorite line in the essay, where Poe explains “very naturally, a parrot, in the first instance, suggested itself, but was superseded forthwith by a Raven, as equally capable of speech, and infinitely more in keeping with the intended tone.” So while Poe only considered using a parrot for a moment, the fact he considered it at all once again demonstrates the open, dispassionate, and logical approach he used (plus it makes for a fun story).
“The Philosophy of Composition” is worth reflecting on for three reasons, regardless of whether you are a Poe expert or neophyte. First, it thoroughly traces the writing process. There have been plenty of critical essays on the writing process at least as far back as Aristotle’s Poetics, where he argues for the three unities (time, place, and action), contrasts the strengths and weaknesses of epics versus tragedies, etc. But Aristotle was critiquing the works of others, primarily Homer and Sophocles. Here, Poe is critiquing Poe with the objectivity of a scientist studying a specimen under a microscope.
The essay also shatters the image of Poe peddled by popular culture. As soon as you step outside his most famous stories and poems, you will see Poe’s intimidatingly vast knowledge of all sorts of subjects, including Greek, Latin, mythology, philosophy, and science, with references to Apollo, Charles Babbage, Seneca, and Francis Bacon, to name only a few. You will also see how frequently mathematics is invoked, whether in stories as disparate as “The Purloined Letter” or “Ligeia” or in his other technical works, such as “The Rationale of Verse,” where he declares that, “[Verse] is exceedingly simple; one tenth of it, possibly, may be called ethical; nine tenths, however, appertain to mathematics.”
But the third, and most important, reason this essay should be read more is the way it democratizes writing. It’s easy to fall into the misconception Poe tries so hard to dispel: that poetry is the result of a “fine frenzy.” I certainly find it hard to believe that the eeriness of the line, “And its eyes have all the seeming of a demon’s that is dreaming,” or the mysterious beauty of the opening lines of “Annabel Lee” — “It was many and many a year ago, in a kingdom by the sea” — are the result of logic. Yet for Poe they were precisely that: the results of deciding on the right answers after asking the right questions.
“The Philosophy of Composition” proves you don’t need to wait for, let alone be born possessing, poetic inspiration to write well. And that is an inspiring idea.
Image Credit: Flickr/Kevin Dooley.
In her April review of Thomas Kunkel’s Man in Profile: Joseph Mitchell of The New Yorker, Janet Malcolm discussed Mitchell’s beloved, beautiful stories — “cryptic and ambiguous and incantatory and disconnected and extravagant and oracular and apocalyptic” — and his inclination toward invention — toward “radical departures from factuality.” “Mitchell’s genre,” she wrote, “is some kind of hybrid, as yet to be named.”
Almost three-quarters of a century after The New Yorker published Mitchell’s first story, Mitchell’s genre, some kind of hybrid, still has yet to be named. Or, rather, perhaps, Mitchell-esque genres have yet to be given a name that fits. Literary Journalism? New Journalism? Literary Nonfiction, Narrative Nonfiction, Immersion, Creative Nonfiction, Faction? Fiction?
Some time ago, I received the suggestion that I conflate two characters in a manuscript I hoped would someday be a book that, in the tradition of, say, George Orwell’s Homage to Catalonia, combined field reporting, memoir, essay, and history. Some kind of hybrid.
Conflating my characters — two children, in this case — would create a single and more compelling protagonist, I was told. Reflexively, I etched in quotes. “My” characters? But I also felt a different urge: It was true. Inventing one composite kid from two could make the story stronger. Certainly it would make writing the story easier for me, and I wanted that too. But what did it matter what I wanted? I come in part from cheating stock — thieves, adulterers, at least two murderers, as far as I know. I was curious: Could I be a cheater, or, more precisely, a compositor, too?
According to Dan Ariely, absolutely. We all are cheats and liars, his research suggests, and, for writers of some kind of hybrid, this matters, perhaps a great deal. Professor of psychology and behavioral economics at Duke University, Ariely has spent more than a decade studying why humans don’t tell the truth. Last month, with collaborator Yael Melamede, he released a documentary film, (Dis)Honesty: The Truth about Lies. In 2012, he published a book, The (Honest) Truth about Dishonesty: How We Lie to Everyone — Especially Ourselves, including a back-cover blurb by A. J. Jacobs, author of the immersion memoir The Year of Living Biblically: “…those who claim not to tell lies are liars.”
Humans, evolved to find advantage at the lowest possible cost, possess “a deeply ingrained propensity to lie to ourselves and to others,” Ariely reports. We are dishonest to serve the self — the ego, Latin and Greek for “I,” distinct from the world and others. We are dishonest to serve our fears — of inadequacy, of rejection, of difference, obscurity, going broke, oblivion, death. We are dishonest to serve our desires — for meaning, love, power, fame, a single compelling protagonist. In Joseph Mitchell’s case, perhaps, for art.
Sometimes, he says, we’re dishonest so we can think of ourselves as good and honest people. As Marcel Proust once also observed, “It is not only by dint of lying to others, but also of lying to ourselves, that we cease to notice that we are lying.” To further complicate matters, says Ariely, “The more creative we are, the more we are able to come up with good stories that help us justify our selfish interests.” This, it seems to me, is both the good news and the bad.
Orwell for one believed that “all writers are vain, selfish, and lazy.” He himself was sometimes a cheat, and also a coward, at least as far as we know. I’ve read Homage to Catalonia several times over now, and an essay he wrote about writing books. “Writing a book is a horrible, exhausting struggle,” he observed, “like a long bout of some painful illness. One would never undertake such a thing if one were not driven on by some demon whom one can neither resist nor understand.”
The Polish author Ryszard Kapuściński, complicated, enigmatic, alternately choleric and charming, comes to mind. “One Kapuściński is worth more than a thousand whimpering and fantasizing scribblers,” said Salman Rushdie of his friend. Said John Updike, Kapuściński wrote “with a magical elegance that…achieves poetry and aphorism.” Yet, in Imperium, a personal account of communist Russia — the camps, the purges — Kapuściński reports nothing of his onetime collaboration with the communist party in his homeland, Poland. When the director of Iranian studies at Stanford met with Artur Domoslawski, Kapuściński’s biographer and friend, he said to Domoslawski, “You can open Shah of Shahs at any page, point to a passage, and I will tell you what’s wrong or inaccurate.” And then he proceeded to do so.
Kapuściński once yelled at a friend who asked about his books’ omissions and fabrications. “You don’t understand a thing! I’m not writing so the details add up — the point is the essence of the matter!” On this point, Kapuściński was right. In literature, the essence of the matter, rather than the adding up of details, is the point. But if, in the end, readers perceive that “essence” comes at the expense of the worlds of possibility found between writer and reader — and if readers never pick up another of your books — what’s the point at all?
Fabricators like Mitchell and Kapuściński may now be the exception, as Charles McGrath wrote in his New Yorker review of Man in Profile: Joseph Mitchell of The New Yorker, “because now…it’s harder to get away with,” but their stories, Ariely-esque in the telling, are enduring, and enduringly stirring.
Consider anthropologist Wendy “Wednesday” Martin, for instance. Her new book, The Primates of Park Avenue, one of two books she’s written that she describes as “very research-intensive blendings of memoir and social science,” earned this New York Post headline in early June: “Upper East-Side Housewife’s Tell-All Book Is Full of Lies.”
After The Post published its revelations of Martin’s practices — compression, conflation, fabrication — Cary Goldstein, vice president and executive director of publicity at Simon & Schuster, told The New York Times, “It is a common narrative technique in memoirs for some names, identifying characteristics and chronologies to be adjusted or disguised, and that is the case with Primates of Park Avenue.”
Ariely argues that the human inclination toward deception, highly evolved and driven by dread or desire, has a slow corrosive effect on society. Think subprime mortgage crisis. MRI studies show that the more we lie, the less the region of the brain associated with guilt responds, suggesting the act of lying desensitizes us to the shame of lying. There are those who argue dishonesty has a slow, corrosive effect on nonfiction. Think Jim Fingal, the one-time Harper’s fact-checker who took John D’Agata and his fictionalized essay “About a Mountain” to task in the book they co-wrote, called The Lifespan of a Fact. Near the end of the book, Fingal and D’Agata come to verbal fisticuffs. Fingal writes, exasperated, “I mean, the whole point of all these shit storms over the last ten years…isn’t that the reading public doesn’t understand that writers sometimes ‘use their imaginations.’ It’s about people searching for some sort of Truth…and then being devastated when they find out that the thing they were inspired by turned out to be deliberately falsified…for seemingly self-aggrandizing purposes.”
Or maybe don’t think of Jim Fingal, however spot-on his words. Maybe we ditch Jim Fingal, who, it was revealed in post-publication coverage, partially reinvented his correspondence with D’Agata for their nonfiction book. “Contrary to the impression created by the promotional material, and the way it has subsequently been characterized in reviews,” wrote Craig Silverman on Poynter.org, “…The Lifespan of a Fact isn’t, you know, factual. D’Agata never called Fingal a dickhead, to cite but one example.”
In journalism, where truth is an explicit part of the deal between writer and reader, shit storms are understandable and necessary, as real harm is often a consequence.
Ironically, during filming, Ariely and Melamede couldn’t find any journalists who’d talk to them about deception in their field.
Yet in hybrid genres where rules are less clearly defined, the consequences of unreliability are also often felt with great intensity. Even in memoir, recently described by Daphne Merkin as an “elasticized form for truths and untruths,” outrage and pain seem to register when a writer is perceived to betray the trust.
It is curious to note the research that suggests the emotional experience of social pain, and betrayal specifically, lights up the same regions of the brain as physical pain. Humans remember social pain more acutely and for a longer duration than physical pain. Neurologically, the experience of being cast away appears to mirror that of being burned. Like, with fire. I confess to feeling something akin to this when I learned the cat in Annie Dillard’s Pilgrim at Tinker Creek didn’t exist. It was a metaphorical cat. Jesus Christ. Annie Dillard. I thought she was perfect.
The truth is, many readers want to believe they know who the author is. Readers have for millennia. In Tiger Writing: Art, Culture, and the Interdependent Self, novelist Gish Jen reminds us that the independent self — “the self unhitched from the collective” — has been “making things up [since] even before the words ‘fiction’ and ‘poetry’ were coined.” In ancient Rome and Greece, writers who fabricated were eyed with suspicion, “not only because they could make the untrue seem true, but because they tended to be highly individualistic, with interests that might or might not be yours.” This has not changed. Many readers still eye with suspicion writers who fabricate. Which, if we’re really being honest with ourselves (which, as Ariely notes, is harder than it might first appear), is quite a lot of writers. Hell, many readers eye with suspicion writers who don’t fabricate. As author Robin Hemley observes in A Field Guide for Immersion Writing, “Whether you’re putting yourself in harm’s way emotionally, psychologically, or physically, it’s almost a guarantee that you’re going to get pummeled in one way or another.”
Orwell called for “discipline.” In Homage, he copped, “…beware of my partisanship, my mistakes of fact and the distortion inevitably caused by my having seen only one corner of events.”
In her essay for the anthology Blurring the Boundaries: Explorations on the Fringes of Nonfiction, Naomi Kimbell advises, “[T]he first and most important gesture a writer can make to the reader is letting him or her in on the joke.” And yet, in the words of author Lee Gutkind, founder of Creative Nonfiction magazine, there are no “creative nonfiction police” handcuffing those who don’t, nor should there be.
Gutkind, identified on his website with a quote from Vanity Fair as “the Godfather behind creative nonfiction,” advises writers to rely upon conscience. Yes, fact-checking is critical, he writes in You Can’t Make This Stuff Up: The Complete Guide to Writing Creative Nonfiction, but so too “following the old-fashioned golden rule by treating your characters and their stories with as much respect as you would want them to treat you.” “Conscience,” he writes, “a reminder and an invisible arbiter over us all.”
And, yet, as Ariely’s research and the nonfiction world’s regular shit storms reveal, relying on conscience is slippery business.
Ariely recommends concrete approaches designed to address the conflict of interest between self and other — a signed legal contract, if you’re a trader at J.P. Morgan Chase & Company, for instance. If you’re a writer of some kind of hybrid that blends fact and invention, an author’s note, disclaimer, afterword, use of the conditional mood, caveat, or limitless other artful and crafty techniques. It’s reasonable to assume this is why Simon & Schuster announced soon after The Primates of Park Avenue shit storm that they would add “a clarifying note” to the e-book and subsequent print editions. At the back of the Kindle edition I consulted in mid-June, in addition to an introduction that describes the book as “an academic experiment,” is now an author’s note:
This work is a memoir. It reflects my experiences over a period of several years. Some names and identifying details have been changed, and some individuals portrayed are composites. For narrative purposes and to mask the identities of certain individuals, the timeline of certain events has been altered or compressed.
“Every time we lie, we dilute the trust,” Ariely said when we corresponded. Ariely can’t prove this with empirical evidence, but he believes it to be true. When Ariely was in high school in Israel, a magnesium battlefield flare exploded at his feet, and he was trapped in a chemical fire. He spent three years in a hospital. “The experience of pain has led me to beauty,” he later wrote in an essay. Also, “as I am not very concerned with my personal ‘small problems,’ I…can’t get too excited about the ‘small problems’ others are experiencing.”
All the same, Ariely took time to respond to my questions: Given everything we know, why do nonfiction writers continue to make stuff up and not tell readers? Given everything we know, why do readers continue to feel betrayed and outraged when nonfiction writers do this?
“I don’t think this is planned,” Ariely said in a recording he made because typing can be hard. “I think people start writing something that is based in reality, and then the boundary for what is acceptable and what is not acceptable is not very clear.” He said, “It’s very human.”
So, too, the timeless desire for a good story well told. Writes Malcolm of Mitchell, “We should respect his inhibiting reverence for literary transcendence and be grateful for the work that got past his censor.”
While Mitchell’s genre remains nameless, there is a name for the startling discovery of the truth of it: Anagnorisis, Aristotle wrote in his Poetics, the moment of change from ignorance to knowledge. I was wrong. It is here, he believed, that we find “the finest recognitions.”
This essay was adapted and updated from a longer article about constructing nonfiction personae, originally published in the pedagogy department of Assay: A Journal of Nonfiction Studies.
Image Credit: Pexels/Sergey Zhumaev.
Count weather among the forces that I move through life without fully understanding. On a recent frigid Saturday, a sharp chill hunted my joints through worn thermals and cheap gloves. Forget grocery shopping, not with the stroller, not in this cold. My 16-month-old son and I retreated to the library. More than basmati rice or cauliflower, we needed open space, the familiar thick carpet where he could squat and squeal freely. We needed the warm light of enormous lampshades embossed with ants, birds, and humpback whales. We needed more books.
Maybe that should read “wanted.” My son hadn’t tired of Good Dog, Carl or My Friends. He’d started requesting Tickle, Tickle by name. His mother invoked Knuffle Bunny while he handed her laundry, and Brush Your Teeth, Please had helped me transform a grim chore into something like dessert. (Grape-flavored toothpaste deserves some credit here). For weeks, maybe months, books had reliably engaged him, exciting or calming him depending on the title, the time of day, and what could be called his nascent taste. A rotation of Goodnight Moon, Good Night, Gorilla, and Goodnight, Goodnight, Construction Site guided him to sleep most nights. Our afternoons sometimes focused on books of a certain theme: bunnies, oceans, dogs, primates. Reading was becoming a force that shaped his world, but unlike a polar vortex or a hurricane, it felt like a force within parental control. Three weeks later, our borrowed books due, I am less sure. I can control the presence or absence of books. I can curate his library. I can open the book and ask him “Where is the bird?” After that, mystery follows. Once the words leave my mouth for his ears and brain, they enter a universe where nothing is fixed.
This particular trip to the library marked the end of an eventful week in his early literacy. At bedtime the previous Sunday, he had restacked his books to recover a strategically buried title, Trains Go. My son sat in his typical way, circling once and bracing with both hands before thudding down. (“I learned to sit from my friend the dog…”) He laughed, looking like a tiny John Belushi. In this moment, I saw a flash of what I’ve heard other parents call the Boy-Boy, a term I understand but dislike. He raised the book, which we had read dozens of times, above his head.
In that moment I thought about another force in the world, the force behind the very idea of a Boy-Boy: I thought about gender. My son leaned forward, arms raised and extended. A smile strained his cheeks, seven tiny teeth bared. His whole body twisted into an absurd, miniature shrine to the ultimate Boy-Boy book, which he loves with an intensity that surprises me, though I bought it for him. How little control I have over what gender information he learns! Male adults he has barely met teach him “High five!” and call him “Buddy.” His favorite YouTube videos feature John Lee Hooker wanly performing “Boom, Boom, Boom,” and Ray Charles teaching The Blues Brothers to “Shake a Tail Feather.” The two-year-old boy at the Houston Children’s Museum shows him that cars go “rrrrrr,” and by that evening, my son imitates him precisely. At a playground’s toy house, a trio of four-year-old girls cold-shoulders him; at daycare, where his favorite place to play is the toy kitchen, his three tias teach him that la cocina is for boys, too. This thrills his mother and me, even if we still haven’t been able to stop calling him, affectionately, “Mister, Mister.” That night, my son held Trains Go over his head and I thought, Gender is weather: a force you can’t do much to change, whose shifts over time are unpredictable and not entirely benign.
The horizontal outline of Trains Go settled in my hands, and the contours of its worn cardboard pages dragged me out of abstraction. “Junebug, we read that already,” I said. He grunted, then opened it in his own lap. “Wooo woooo wooooooo,” he said, reading words not on that page, but the next one. I shook my head, astonished. The next storytime, he opened The House in the Night, flapping his lips and saying “bah-bah, baah-baaah,” as he must think I do. He learns, as toddlers do, through mimicry. On the shelf above Harold and the Purple Crayon and the one below What to Expect: The Second Year rests my neglected copy of Poetics, in which Aristotle says: “Imitation is natural to man from childhood, one of his advantages over the lower animals being this, that he is the most imitative creature in the world, and learns at first by imitation.” Reading this, after he went to bed that night, I wondered even more intensely: What does he learn about gender from me? Does he learn the next claim in Poetics, that “it is also natural for all to delight in works of imitation”? All feels impossible, because I know I cannot delight in the countless imitations of violence and chauvinism and privilege I witness in too many fathers and sons and brothers and husbands. So if I can’t delight, can he? Of course he can. I start to worry that I should be more vigilant in my observations. What exactly does he imitate? What does he ignore?
I am probably more typically masculine now, as a father and husband, than I have been since age seven, when I left behind construction trucks and dinosaurs for a Cabbage Patch doll in a Boston Red Sox uniform. I named the doll Ellis Burks, after the actual center fielder, and I thought I treated him with the tenderness of any of the Girl-Girls in the playhouse. I referred to him formally, using only his complete name. Ellis Burks, drink your milk. Ellis Burks, go to sleep. Ellis Burks, why are you crying? Ellis Burks, let’s play catch.
Later in elementary school, after I memorized the presidents in order using flashcards my father had bought me, my cousins and I filled summers with games of “White House” on piles of seaside driftwood. “White House” cribbed its structure from your typical “House” played by aspiring princesses everywhere. I adapted it, persuading my younger siblings and cousins to assume the identities of actual figures: Aaron Burr, my shifty vice president, James Madison, my trusted secretary of state. My cousin Kate, of course, was Mrs. Jefferson. As a parent, my younger self’s lack of imagination for the female roles troubles me. Nerd that I was, fidelity to fact did not keep me from asking Kate’s older sister to play Queen Victoria (not yet crowned) or Catherine the Great (dead). No: a hole in my own concept of the world, a lack of schema for powerful women in that world, is what kept my sister from playing Sacajawea, reporting back with Lewis and Clark, her twin boy cousins. I hadn’t learned to expect to see women in the White House as anything other than first ladies, despite having read through countless books providing stories of men to populate a constructed world. I’d read these books, of course, at the place I had taken my son on that freezing Saturday — my local library.
Leaving the library, I pushed my son north on St. Nicholas Avenue. My voice described what we passed — “There’s a streetlamp, and a WALK signal. This street is 1-7-9, which is one more than 1-7-8 and one less than 1-8-0. Look, there’s a yellow light, it means slow down. That car has its hazards on, those lights are for parking illegally.” My mind sorted through the way each incident in his young life left a trace of information, a tiny dot. Even in its flawed form, “White House” was only possible because I’d connected dots between texts. A few days before he read Trains Go to me, I had noticed my son apparently doing the same thing. Sorting his library before bedtime, he double-fisted Sandra Boynton books, opening The Going to Bed Book, then Opposites. Like most Boynton books, these two shared an ensemble of cartoonish animals: a moose, a hippopotamus, a rabbit, a cat. He opened both books and compared them, turning a page in one, then looking at the other, turning a page in that one, then looking at the first book. This moment returned to me on the cold walk home, and I did a quick census. Only the hippopotamus had a clearly stated gender. This is a problem, I thought.
It could only be fixed by camel.
“Fixed by Camel” was a phrase my father, a one-time carpenter, had used after reassembling my son’s crib in our new apartment. At Christmas, to clarify, he gave my son Jacquelyn Reinach’s Sweet Pickles book Fixed by Camel, a gift as much for me — and for him — as it was for my son. With the book in my hands, I immediately remembered Clever Camel and her brick-orange, long-sleeved coverall jumpsuit. How she outsmarts Kidding Kangaroo when he strands her on a roof, traps her under a manhole cover, and defaces the park benches she paints. How she captures him in a jungle gym snare, lured by an irresistible sign reading “DANGER DO NOT RING DOORBELL UNDER ANY CIRCUMSTANCES.” How she also leaves a glass of milk and two peanut butter sandwiches, which she tells him he should eat while he waits for her sidewalk to dry. I always loved that Camel — calm, solutions-oriented, quick-witted, and compassionate — was female. The writer’s choice to explicitly gender her felt radical to me, risky, rare. It added depth and surprise, two qualities Aimee Bender calls universal pleasures in a close read of her infant twins’ favorite book, Goodnight Moon.
After our trip to the library, I sat down with my son and Fixed by Camel. As with A is for Activist and A Rule is to Break: The Child’s Guide to Anarchy, as with Ganesha’s Sweet Tooth and Press Here, the first read felt more for my pleasure than for his. Clever Camel might have been a revelation for me as a child, a fixed point from which I could explore the idea — gender — that I still don’t understand. But my guidepost can’t be my son’s. He’ll find his own, in time, and I can hope that he does that because I help him ask better questions, no matter what those questions, or their answers, might be. This thought chased my disappointment with his reaction to the book. There’s nothing broken, nothing to be fixed. Aristotle didn’t have the last word on imitation; the modernists and phenomenologists saw to that. “Of course, works of art imitate the objects they represent, but their end is certainly not to represent them,” Jacques Lacan once wrote. “In offering the imitation of an object, they make something different out of that object. Thus they only pretend to imitate.” Whether or not my son’s imitations of men and what they should love are pretend, they are his attempts to be in the world and to love it. Every time he surprises me with that sort of expression, I need to be grateful beyond words. Imposing my version of gender, my preference for skepticism and nonconformity, isn’t any more appropriate or healthy than forcing him into army fatigue onesies or calling him “Daddy’s Big Guy.” Yes, I hope to expose him to stated versions of all kinds of gender, not only your Gymboree binary, not only Boynton’s sexless bestiary. Yes, I plan to fight the tendency to make the default gender male. But most of all, I think of the gift my father gave to his son when he gave Fixed by Camel to mine. The book transmitted a spark, bequeathing to both my son and me an encounter with surprise and depth. It’s that electricity I want to conduct.
My father’s easy transfer of precise kindness is something I may well spend my life learning to imitate.
Image Credit: Pixabay.
This post was produced in partnership with Bloom, a literary site that features authors whose first books were published when they were 40 or older. Click here to read about “Post-40 Bloomers,” a monthly feature at The Millions.
Years ago, a friend of mine complained about the lack of intellectual stimulation at his day job. He gave as an example a coworker who spent her breaks reading — insert scorn here — a mystery novel. “Whose mystery is it?” I remember asking. “If it’s by, say, P.D. James, then your coworker is probably pretty smart.” A few weeks later, my friend called me back. “The book was P.D. James, and she is really smart.”
Phyllis Dorothy James White, the daughter of a tax inspector, has more than a dozen honorary doctorates and fellowships from institutions as eminent as Oxford and Cambridge. At 15 or 16, she won her high school’s prize for a story she describes as “low in credibility but high on drama and atmosphere” (she no longer has a copy). Her father was “not well off and not disposed to educate girls,” so James left school at 16 to work in a tax office, only returning to formal education decades later for night classes in hospital administration. Around that time, her mother was committed to a mental hospital, leaving James to care for her younger siblings. In a 1995 interview with The Paris Review, she said, “I would have loved to have gone to university, but I don’t think I would necessarily have been a better writer, indeed perhaps the reverse.” Regardless, her books abound with highly educated, often-influential characters who spout references to classic British literature and debate fine points of moral theology; and her most enduring creation, the fictional detective Adam Dalgliesh, is both a commander at New Scotland Yard and a famous poet. (He also shares a surname with one of James’s English teachers at Cambridge High School for Girls.)
In the decades before she published her first novel, Cover Her Face, at 42, James married an army doctor, survived World War II, struggled to raise two daughters when her husband returned from the war incapacitated by a mental illness that may have been schizophrenia, and worked full time to support her family. In the decades since, she has written 16 detective novels (including a sequel to Pride and Prejudice in which one of the characters is murdered), a disquisition on the history of detective fiction, a memoir, and the dystopian morality tale The Children of Men. She was honored with the title of baroness by Queen Elizabeth and has subsequently sat on the Conservative benches of the House of Lords. Despite her success as a novelist, she kept her administrative job until she retired at 60, a decision she attributes to growing up during the Depression, when secure civil service jobs were coveted. As recently as last October, she claimed to have cracked an unsolved 1931 murder.
James intended 2008’s The Private Patient to be the last Adam Dalgliesh novel, confessing to USA Today in 2010 that, at 90 years old, “I felt I wasn’t quite sure whether I could begin a new Dalgliesh…I hate the thought of not completing it.” In an April 2013 interview with the London Evening Standard, she revealed that she hoped to bring Dalgliesh back for a 15th book, in which she plans for Dalgliesh (whom James describes as “a reverent agnostic”), like his creator, to confront the certainty of his own death. But by December 2013, she again expressed doubts about her ability to finish: “[b]ut I have no time, no time at all and I do not think [the book] is going to get written, and I am just having to face that…but I hope it may get written.” Undoubtedly, so do her fans.
James resists what she sees as an artificial divide between literary and genre fiction. She takes pains, however, to draw distinctions between the detective fiction she writes and other species of crime novels — particularly ones in which the reader knows the murderer’s identity but the characters don’t, or in which, for the sake of suspense, the author deliberately withholds from the reader facts that are known to the characters. By contrast, a true detective novel makes a compact with the reader, as James explains in Talking about Detective Fiction:
What we can expect is a central mysterious crime, usually murder; a closed circle of suspects, each with motive, means and opportunity for the crime; a detective, either amateur or professional, who comes in like an avenging deity to solve it; and, by the end of the book, a solution which the reader should be able to arrive at by logical deduction from clues inserted in the novel with deceptive cunning but essential fairness.
A well-crafted detective novel fulfills the same essentials of “a perfect tragedy” as outlined in Aristotle’s Poetics: it begins with catastrophe; calls forth fear, pity, and finally recognition; and invokes catharsis — a purging of emotion and restoration of order. As James herself pointed out in a 2009 interview for NPR’s Morning Edition, “The theory is that the mystery flourishes best in times of acute anxiety.” Any reader who picks up a murder mystery does so with absolute assurance that the murderer will be caught and brought to justice by the end, which partly accounts for the genre’s popularity. Within this structure, however, writers can (and do) weave social and political commentary, unresolved and conflicted relationships, philosophical and ethical questions, and often unsettling portraits of lives before and after a murder.
Another of the detective novel’s pleasures is that it asks the reader to solve the central mystery along with the detectives. The author unveils facts as the characters experience them, while simultaneously manipulating these revelations in ways that create deception and misdirection; but both the solution and the process of discovering it are supremely logical. Says James: “The detective can know nothing which the reader isn’t also told… It would be a very, very bad detective story at the end if the reader felt, ‘Who could possibly have guessed that?’” In the end, the detective’s (and the reader’s) triumph reaffirms the power of humanity over forces of darkness and sometimes terror. As James puts it, the mystery is “solved not by good luck or divine intervention…It’s solved by a human being. By human courage and human intelligence and human perseverance. In a sense, the detective story is a small celebration of reason and order in our very disorderly world.”
James admits that she abhors disorder: “In a long life, I have never taken a drug or got drunk, and I say that not as a matter of pride: it’s because the idea of being out of control is appalling to me. I think that when one writes detective stories one is imposing order, and a form of imperfect but human justice, on chaos.” Yet her flirtation with disorder, the terror that an ordinary person could transform into a killer or a killer’s victim, shares more with the complexity of literary fiction than it does with the tidy plots and flat characterization in, say, Agatha Christie mysteries. She has no interest in mass murders or psychopaths, she explained in a 2010 Telegraph interview: “They don’t interest me as much from a crime writing point of view because they kill without recognisable (sic) motives. What is fascinating is when you have an educated, law-abiding person who steps over a line.”
In Time to Be in Earnest, James recounts a day when she was feeding her baby daughters butter that her husband, Dr. Ernest Connor Bantry White, had sent from where he was stationed in India.
[It was while] I was feeding fingers of toast into Jane’s buttery mouth that I heard…the news of the dropping of the atomic bomb…I knew that the dropping of the bomb would almost certainly bring Connor home earlier and probably safely. But it was still, for me, a moment of horror and, looking almost aghast at my two happy, buttery daughters…I knew that for all of us the world had changed for ever.
Sadly, her husband’s return from the war brought James neither peace nor safety. Suffering from psychosis, Connor was in and out of mental hospitals and sometimes violent. In 1986, pressed by an interviewer, she said, “He did have highs and lows. It was terrifying and terribly disruptive. It’s not a part of my life that’s very happy for me, and I don’t think about it often.” He died at 44 in what some articles have suggested may have been a suicide, but James has rarely discussed him and has never disclosed the details of his death. “One suffers with the patient and for oneself,” she writes in Time to Be in Earnest. “Another human being who was once a beloved companion can become not only a stranger, but occasionally a malevolent stranger.”
She never considered divorce even when it became clear that Connor would never be well again, and she has turned down more than one marriage proposal. Asked in 1986 why she had never remarried, she replied, “Connor was a very exceptional man — one of the few men I’ve met who really believed in the equality of women…I certainly miss my husband as much now as when he first died…We did have a mutual understanding. But I’m not sure one can find that very easily in another man.” Her love for her late husband was just as strong in a 2010 interview: “If I had met someone I wanted to spend the rest of my life with, I would have. I had men friends and I like men generally but I never met the right one again.” Her husband, troubled as he was, was James’s great love.
Arguably, James’s other great love is her fictional hero, Adam Dalgliesh, who embodies good looks, poetic sensibility, uncompromising ethics, compassion, wisdom, and, most of all, dispassionate intellect. When we first meet him in James’s first novel, Cover Her Face (1962), he has lost his wife in childbirth more than a decade earlier and has remained resolutely single ever since. Cordelia Gray, the detective in the only two of James’s novels to feature a female protagonist, encounters Dalgliesh during the investigation in An Unsuitable Job for a Woman. In the next novel, A Taste for Death, Gray and Dalgliesh have been seen dining together, but James is too coy to discuss the nature of their relationship: “I’m afraid that as far as Adam’s sex life is concerned, what he does in private with a consenting adult is no affair of mine.”
Inevitably, interviewers ask about the relationship between James and her leading man. In a 1977 interview for The New York Times, James said of Dalgliesh, “I wanted a very sensitive, essentially lonely and withdrawn person”; by 1996, interviewed by People, she said, “I think in some ways he may be the masculine equivalent of me…He’s not a self-portrait, but he does have qualities I admire. He’s intelligent, he’s literary. I admire his sensitivities and certainly his courage and his self-sufficiency, but he may be too self-sufficient. I think there’s a splinter of ice in his heart.” Over the years, Dalgliesh’s character has softened somewhat — he falls in love over the course of the most recent four novels and finally remarries at the end of The Private Patient.
As a character, Dalgliesh can sometimes be too exemplary to be convincing. For example, though he contracts SARS in The Lighthouse and ruminates on the possibility of never seeing his soon-to-be fiancée again, before he collapses he completes an examination of a body, gives exhaustive instructions to his subordinates, and, later, in his delirium, solves a key piece of the mystery and insists on holding staff meetings from his sickbed. At such moments, Dalgliesh’s personal life seems superfluous. In giving him one, James gestures toward continuity of character that readers can follow from novel to novel, but she has made him so dependably unimpeachable that his inner conflicts have far less resonance than those of his subordinates and the minor characters he encounters — many of whom are so sharply and compassionately observed that her disciplined main plots contain a kaleidoscope of tiny and evocative stories.
In The Lighthouse, a servant grieves when she learns that a victim died wearing something she had sewed, tormented by the idea that the clothing had caused the death; Kate Miskin, Dalgliesh’s second-in-command, is acutely aware of the servant’s desolation but struggles to express compassion; her subordinate briefly recalls feeling as a child like his existence intruded on his parents’ intimacy: “Coming quietly and unexpectedly into a room where they were alone, he would see the cloud of disappointment quickly change to smiles of welcome — but not quickly enough.” In the last Dalgliesh novel, The Private Patient, the murder victim has waited 34 years to have plastic surgery for the scar left when her drunken father slashed her cheek with a bottle and her mother let the wound get seriously infected rather than reveal the abuse. Questioned by the doctor about her motives for removing the scar, she replies, “Because I no longer have need of it.”
For Dalgliesh as well as James, solving the murder is paramount, and James’s novels are notable for the virtuosity with which their plots are executed. James’s 2005 novel, The Lighthouse, for instance, begins with Dalgliesh being summoned to his superior’s office: a murder has been committed on an isolated island where the Prime Minister would like to hold a private meeting in a few months. The island has no cell phone service, allows only VIPs personally referred by previous guests to visit, and can be accessed only by private boat. More than 100 pages elapse — in which each of the characters is painstakingly introduced — before the body appears, swinging dramatically from the island’s lighthouse railing as gulls (and a servant) shriek in the background. James narrows her gaze to the disquieting image of “the neck mottled and stretched like the neck of a bald turkey, the head, grotesquely large, dropped to one side, the hands, palms outward, as if in a parody of benediction.” The body belongs to a famous novelist, publicly celebrated for his brilliant writing, disliked on the island for his difficult personality, and privately loathed for spectacular cruelties that emerge during the investigation. A handful of suspects stand at the foot of the lighthouse and look up at the body; the permanent staff sequester themselves in a dining room to speculate about the current batch of visiting VIPs; Dalgliesh and team promptly swoop in by helicopter to investigate. Suddenly, the game is on.
It’s at this point that James’s novels make their bid for a place among the “Queens of Crime” (Dorothy L. Sayers, Agatha Christie, Ngaio Marsh, and Margery Allingham), whose books she devoured as a teenager. Her plots combine Christie’s audacious cleverness, Marsh’s evocation of situation and setting, Sayers’s acute sense of morality, and Allingham’s sensitivity to character. The resulting detective novels represent the best qualities of the genre: they are absorbing, intellectually challenging, emotionally satisfying, and artfully constructed. The process of unraveling the mystery demands the reader’s attention and patience as the investigators work through the evidence, and yet the solutions that emerge seem simultaneously surprising and inevitable. No matter how chilling, the murderers are sympathetically drawn; and the supposed innocents differ morally from the guilty only in that they happen not to have committed murder. In James’s hands, murder is simply another gesture that arises from character motivation. James writes in Time to Be in Earnest:
As a writer I find that the most credible motive and, perhaps, the one for which the reader can feel some sympathy, is the murderer’s wish to advantage, protect or avenge someone he or she greatly loves. But should the reader feel sympathy for the murderer? Perhaps sympathy is too strong a word; but I think there should be empathy and understanding.
This empathy and understanding distinguishes James from other giants of detective fiction. In Talking about Detective Fiction, James, in explaining the detective story’s “Golden Age,” writes with admiration for Agatha Christie’s “imaginative duplicity,” but notes, “Both the trickery and the final solution are invariably more ingenious than believable…there is no grief, no loss, an absence of outrage.” For James, conscience cannot be separated from compassion, and her work never delivers justice without also reaching into those dark places where the human spirit falls short.
James was born on August 3, 1920 — a few weeks before women’s suffrage would go into effect in the U.S.; and in England, women (including her role model Dorothy L. Sayers) would not be allowed to enroll at the University of Oxford for a few more months.
In Time to Be in Earnest, James writes, “I can recall a sentence from the Cambridge High School prospectus which, after pointing out that girls could enter the sixth form and be prepared for teacher training college or could take a secretarial course, added: ‘The school thus prepares either for a career or for the ordinary pursuits of womanhood.’” She recounts applying for promotions: “I accepted that I would have to be not only better qualified than the male candidates, but considerably better qualified. This can hardly be regarded as equality of opportunity.” When her debut novel Cover Her Face was published in 1962, all but one reviewer assumed it was written by a man. James denies any intention to conceal her sex, writing that she decided P.D. “was enigmatic and would look best on the book spine.”
Even in her memoir, James refuses to dwell too long on her own memories. She writes:
I see no need to write about these things. They are over and must be accepted, made sense of and forgiven, afforded no more than their proper place in a long life in which I have always known that happiness is a gift, not a right…Like dangerous and unpredictable beasts they lie curled in the pit of the subconscious. This seems a merciful dispensation; I have no intention of lying on a psychiatrist’s couch in an attempt to hear their waking growls.
One could easily imagine James pursuing a different literary path driven by autobiographical impulses, mining her parents’ uneasy marriage, her mother’s institutionalization, her husband’s disintegration into mental illness, her struggles to support a family and gain professional standing, her triumphant career as a writer.
One could also easily picture James as a child in a difficult family, discovering the writers who became influences — Jane Austen, Graham Greene, Evelyn Waugh, Anthony Trollope, George Eliot — and reading simply for the pleasure they gave her. The challenges that have informed James’s writing could easily have kept her from writing at all. The mystery writers she read in adolescence and most admires as influences — Dorothy L. Sayers, Margery Allingham, Ngaio Marsh, and Josephine Tey — could have been devoured and forgotten. Instead, P.D. James has taken her place among them.
Image Credit: Flickr/Chris Boland
While plugging away at a novel in some dark corner or café table, a writer can only hope and trust that following the energy of obsession will lead the narrative to a vital destination. And yet when the novel’s timing and focus coincide with significant events on the nation’s stage, there’s also some serendipity involved. And so it goes with Nicholas Mennuti, whose fiction has long been preoccupied with the new forms of Internet surveillance. Mennuti’s story “Connected,” which was published in Agni in 2008, told the story of an intelligence agent who becomes obsessed with the subject he’s monitoring. This summer Mennuti’s first novel, Weaponized, written in collaboration with screenwriter David Guggenheim, proved all too prescient in its imagining of government surveillance systems already in place. In the novel, Kyle West, a government contractor and genius programmer, hides out in Cambodia after he’s outed as the mastermind who devised the U.S.’s secret surveillance program software. At the time of Weaponized’s publication, Edward Snowden had just released classified documents revealing the extent of PRISM and other U.S. Internet surveillance programs and was still holed up at the Moscow airport seeking asylum. Chance or not, it seems that in this case Mennuti had his finger on the pulse of the techno-zeitgeist.
Weaponized, too, has its own pulse and inner rhythm: this thriller pulled me in and kept me turning pages with its quick pacing and intelligence. Mennuti and I corresponded during the summer months about the Snowden scandal and just what makes him wary regarding Internet surveillance and personal privacy, the burdens and pleasures of writing genre (vs. literary) fiction, and what he’s learned from masters like John le Carré and Graham Greene, as well as the perpetual issues of exile, identity, public versus private, plotting, and pacing that inform his writing.
The Millions: Surveillance and, specifically, surveillance states figure prominently in your fiction. In your story “Connected” an intelligence agent becomes obsessed with the woman he’s monitoring, and now in your novel, Weaponized, the programmer Kyle West has created advanced surveillance software that the U.S. government uses to spy on its citizens. Can you talk more about your interest in surveillance and how (as we’ve recently discovered) events in your novel parallel the government’s existing surveillance programs, specifically with regard to Edward Snowden and the US PRISM program?
Nick Mennuti: I think my ongoing relationship to surveillance themes stems from two factors. First, there’s my love of ’70s and early ’80s Hollywood paranoid thrillers like The Conversation, Three Days of the Condor, All the President’s Men, Blow Out, Prince of the City, The Parallax View. And some of the ’90s versions like Kathryn Bigelow’s Strange Days too.
But, and this is probably the bigger one, as writers, we really are essentially voyeurs, so commenting on surveillance culture allows me to sort of auto-critique my own psychopathology. I’m a voyeur who is writing about other voyeurs. Clearly, I find this far too comfortable, because I’ve mined the hell out of it for fiction, as you’ve noted.
I used it in “Connected” and now again in Weaponized, but I did switch around their relationships to the surveillance state, per se. In “Connected” my protagonist is inside the system. In Weaponized, he’s running from the system he helped create. Kyle’s scenario is slightly more ironic and certainly less tragic.
Regarding PRISM and now XKeyscore and all the other code words that Snowden has revealed — it didn’t take Cassandra to see that one coming. Our government never stops using a program if it’s working for them, no matter what the outrage. All they do is take it further underground and go off the books.
That’s what I posited in Weaponized after the whole Bush uproar: that we would further privatize wiretapping and rely less on the NSA, which has partially happened (Snowden worked for Booz Allen after all). What’s going to happen next is further automation and less reliance on humans to monitor all of this, and that’s what Weaponized was kind of getting at. Eventually, the machines will watch us. Because machines can’t defect to Russia.
I don’t think I have an overriding interest in Internet security/surveillance; I just think it’s the new telephone. It’s the new bug on your car’s dashboard. If you write tech thrillers, you have to use the tools of the trade, and the Internet was a gift to all of us.
And of course, I’m going back into surveillance again. My new book is all about East Berlin in the ’80s. So I’m pre-Internet this time.
In fact, I’m beginning to worry that surveillance for me is what psychiatry and discipline were for Foucault. The themes that dog your career no matter how hard you run. Although I’m not sure how hard I’m running — or that Foucault did, for that matter — it’s just that this societal shift got under my skin in a way I can’t shake.
TM: You have a mixed background of training as a screenwriter, as a fiction writer with high literary aspirations, and now as a thriller novelist — who just sold the film rights. Would you talk more about the difference between writing genre fiction and literary fiction as well as the commonalities that you perceive? I recall you once said your genre fiction has a literary sheen. Please divulge.
NM: Did I say my genre fiction has a literary sheen? I can sure be pretentious! And thanks for exposing The Millions world to it.
TM: Is that really pretentious? I didn’t think so. I thought you were attempting to describe your authorial intentions. Honestly, I would prefer to read genre fiction with literary ambitions than genre fiction with a demotic sheen.
NM: I’m kidding. I actually think genre fiction could use a little more pretentiousness at times. I think the difference between genre fiction and literary fiction varies for a lot of people, but I’ll try and give you my own definition.
Genre fiction frequently relies on well-mined archetypes both on a character and narrative level. Even the best genre fiction is mining the same tropes. Guys like John le Carré and Graham Greene, even at their most trenchant and deconstructive, retain certain genre tropes. One man against the system. Usually a romance with someone from the other side. And narrative momentum really is the essential thing in genre.
In literary fiction the archetypes still exist, but you have more to choose from and they’re far more malleable. Plus, people are willing to tolerate more experimentation and purplish prose in literary fiction (and I mean “purplish” in the best way possible). Genre fiction is meant to be very accessible and sort of shorn of linguistic personality. Even at its best, the prose is supposed to be very workmanlike. The author is supposed to disappear in service of the story.
And that’s where I sometimes get into trouble and where I think my “literary sheen” comes into play. I have trouble staying the hell out of my work. I’m all over the place. I’m not deliberately trying to alienate the reader, but I think both my genre and my literary fiction aren’t designed to lack personality. And this is way, way more of a problem in genre fiction.
TM: Your shelves are filled with books of serious philosophy, European history, literary fiction — for example, the contents of just one stack include Aristotle’s Poetics, Bataille’s Visions of Excess, Highsmith’s Ripley novels, Graham Greene’s The Ministry of Fear, Fritz Lang’s Interviews, Littell’s The Kindly Ones, Genet’s Querelle, Blanchot’s The Space of Literature, Martin Amis’s House of Meetings, and Peter Gay’s Freud: A Life for Our Times. Even in this fast-paced thriller you’ve dropped in ideas from Cioran and Freud, among others. And you’re probably even better versed in film than you are in literature. Could you talk about the books and films that have most influenced your own writing, and specifically what you channeled for Weaponized?
NM: I’m kind of a sponge. I just look for what fascinates or inspires me, and there’s no plan. There’s no strategy to my reading or my watching. That said, once I decide what I’m going to work on, my reading and watching become very focused. They have to — otherwise I’d never start working. I’d just keep going through the shelves.
Cioran and Freud both spent significant time as exiles, so that was important to me for Kyle in Weaponized. I wanted to draw upon the literature of the exile. So you end up with Highsmith, Conrad, Greene, Hemingway, Duras, and that ilk. They’re all part of the literary DNA of Weaponized.
For movies, I really studied Michael Mann and Hitchcock very closely. Mann in particular because he knows how to use landscapes and architecture to express psychological states — and that was key for me in Weaponized. I wanted you to experience the state of being an exile in Cambodia in a subjective way. Mann and Hitchcock really are the kings of subjective cinema and, to be more specific, subjective thriller cinema, which obviously Weaponized owes a great deal to.
When I’m not in depth on a particular project my reading is all over the place, but it’s usually going to contain some liberal dose of J. G. Ballard, Thomas Pynchon, Robert Stone, Michel Houellebecq, Martin Amis, Alan Glynn, William Gibson, Thomas Mann. And then I do like cultural theory, science, and politics a great deal. I’m also an avid consumer of what Ballard called “invisible literature” (internal documents from corporations and PR firms, filled with awe-inspiring statistics and data-analysis) and the Internet is really a great place for that. I spend far too much time every day searching out esoteric ramblings and numbers online.
TM: I’m interested in the idea of identity and the disintegration of the public / private dichotomy, with regard to your novel. In Weaponized, it’s evoked with the surveillance program that Kyle West creates. However, despite having developed the U.S. government’s surveillance software, West shuns revealing personal information about himself. He hates appearing on the television news and having information about his personal life disclosed publicly. His foil, Julian Robinson, seems quite the opposite, charming and comfortable in his own skin despite his chameleon-like ability to transform his identity. And of course, the two end up trading identities. Could you talk more about this paradox in relation to the book, and what identity means now, and what it will become?
NM: Identity is a huge concern for me because it really is the common human denominator. No matter what advances we make in technology, it’s going to stay the same. Humans crave recognition. And we construct ourselves through the eyes of others. We become who we are by how people see us, recognize us. So no matter how diffuse we get socially — we’re still going to crave this. What I think we’re seeing now is people negotiating the new boundaries of recognition, and some really kamikaze it.
I sometimes think the amount of social media is really just a sign of how desperate for interpersonal recognition we really are. And it’s become a sort of cultural blackmail. If you want recognition, you have to use it. I’m not even sure if exhibitionism as a psychological trait really exists anymore. I think we’re all exhibitionists now whether we want to be or not. I mean people are posting amateur porn and their deepest thoughts and anxieties on a 24/7 basis with no let-up. But it’s not exhibitionism, really, and it’s not over-sharing. It’s just trying to construct a self in the new digital era. It’s a constant process of negotiation and whatever application allows you to best construct a self survives. It really is like Darwinism in that sense. Poor MySpace.
I think one of the reasons literature has been having so much trouble is that maybe we need new words. We have new numbers to express the new world. We have quantum numbers. No one can understand them except a few, but we have those numbers. But do we have those words yet? Can we get them? And maybe the new literature will have to be a formal creation instead of a lexical one. I don’t know.
Robinson is uniquely qualified to live in a quantified world because his identity is fluid. He’s able to adapt to whatever the form requires. Kyle has adapted technologically — well, he’s done better than that — but he’s still living in the 20th century in regard to identity, to the sovereign self. He’s fixed. He’s rooted. And of course that’s probably how he ended up on the run in the first place.
To be honest, I’m way more Kyle than Robinson in those terms. I’m not ready to surrender my identity to a bunch of pixels.
TM: You’ve said elsewhere that with Kyle West you inadvertently created a pseudo-doppelganger for Edward Snowden, as both are seeking to escape the consequences of their actions with regard to their government’s surveillance system. The greatest difference is the offense — Snowden merely leaked the information about the secret surveillance programs to the public while West created the software that made this kind of government surveillance possible. It seems that you were rather prescient in this respect, so I’ll ask, what do you think is most important to keep in mind both with regard to the Snowden case and the government’s ability and willingness to eavesdrop on every part of our lives?
NM: The first thing to remember is that the government has never, ever respected your privacy. At least not since the post-WWI Red Scare and the Communist threat in America. They’ve been opening your mail for years. They’ve been wire-tapping without warrants for years. The only difference is that it’s easier now.
The Internet was created as a doomsday device for Continuity of Government. It has always had a military function, just like the highway system in the U.S. The Internet may seem like the Wild West, but I don’t think it’s ever been as free as people would like to think.
I think we’re going to see more and more Snowdens and Mannings because the government classifies SO MUCH these days. You’re a leaker if you divulge classified information — well, what do you do when most things are classified? Manning just did a data dump on WikiLeaks. Half that stuff should never have been classified. Some of it deserved classification, but a lot didn’t. So I think the culture of leaking has been facilitated by a government that’s more and more reliant on classifying everything in its path to punish people.
I chose to make Kyle the victim of a leak as opposed to the leaker because as a writer I found that more provocative. How do you get people to feel for that guy? As you’re well aware, I love ambiguity in my characters and to me Kyle is more ambiguous than Snowden. I can make a case for Snowden. It’s harder to make one for Kyle and that excites me as a writer.
TM: The question of a disregard for ethics is central to the narrative, too. Kyle West and Julian Robinson and Andrei Protosevitch are all powerful men who do what they do because they can. Protosevitch says that he’s ruthless because no one has ever stopped him. The same applies to Robinson and West — they readily remove themselves from the ethical equation and personal responsibility in the ways they pursue power. West is perhaps more self-aware and conscious of an ethical imperative than Robinson — it seems to be both his downfall and his redemption. Do you see a need for a stronger ethics of accountability? Or is it a lost cause at this point?
NM: I certainly see myself — as clichéd as this sounds — as an ethical writer. I hope that ethical accountability isn’t a lost cause, although it depends on what day of the week you ask me.
In my opinion, the problem is that the people who need to be held accountable — we know who they are, thank you — are not. And that is a habitual matter of process. So if the people we expect to be held accountable escape responsibility, then what are all of us regular folks supposed to think? And I’m not just talking about our government. I’m talking about religion. I’m talking about finance, the law, all of it. All of the purported moral barometers have proven to be naked underneath and we can’t find a reason to be good — for lack of a better word.
At the same time, we’ve never been more self-entitled, and I don’t blame us entirely. We’ve had it drummed into our heads that we have to be happy. We are all but commanded to be happy. And we are largely not. And I think people began to assume that leading an unethical life may provide a shortcut to at least short-term fulfillment. It’s working for our leaders. It’s a vicious cycle. We’re desperate to be happy and no one is giving us any guidance as to how to behave.
I agree though. Kyle, at bottom, is an ethical human being. Flawed, but trying. And I choose to think that’s how most of us really are.
TM: I thought I’d introduce something Evgeny Morozov wrote into the equation. He argues: “NSA surveillance, Big Brother, Prism: all of this is important stuff. But it’s as important to focus on the bigger picture — and in that bigger picture, what must be subjected to scrutiny is information consumerism itself — and not just the parts of the military-industrial complex responsible for surveillance. As long as we have no good explanation as to why a piece of data shouldn’t be on the market, we should forget about protecting it from the NSA, for, even with tighter regulation, intelligence agencies would simply buy—on the open market — what today they secretly get from programs like Prism.” Morozov also states that ethics is removed from the equation — that market forces have replaced morality.
What is your response to this, with regard to the previous question on ethics, and also his argument that information consumerism is a greater danger than government surveillance programs?
NM: I both agree and disagree with his diagnosis. You cannot separate the military-industrial complex, surveillance, and capitalism. Because the thing is that surveillance is the newest notch on the military-industrial complex. The twilight is coming on traditional means of warfare. Unless something utterly catastrophic happens, I just can’t see another ground invasion occurring.
The military-industrial complex is tremendously adaptable. They can see the writing on the wall and they know that Systems Intelligence is going to replace traditional Human Intelligence and also standard means of warfare. It’s why the CIA has transformed itself from an intelligence gathering organization to an international hit squad. They are getting with the program. Systems Intelligence finds the target and the CIA whacks them.
That said, I think Morozov has a point in that information overload and market forces have rendered us basically morally agnostic.
Ironically, both our culture and the NSA — and this is something I tried to get at in Weaponized — have the same problem. We have lost our moral compasses because of filtering issues. We have too much de-contextualized data and no way to process and filter it all. You end up with everything being weighted equally. Which produces an utter vacuum.
So, although I agree with Morozov philosophically, I disagree with his separation of government surveillance from capitalism. And I think he knows you can’t, which is why he decided to put the burden on us. You can argue that point; you can’t really argue the other one.
TM: Plot, pacing, and structure are central to your fiction as well as to your conception of what makes good story. You could probably teach a semester-long class on this, but let’s say you have to compress the lesson down to a two minute craft talk on how these elements function within a story and how to approach them as a writer — what would you advise?
NM: I think those factors are what make a good genre story and also what make a certain “type” of literary fiction story work. And if you can write like Amis or Nabokov you can get away with certain things that we lesser mortals cannot. We lesser mortals need to follow some rules.
My first rule is just to outline. Outline. And then do a little more outlining. I know some people feel that it kills the spontaneity, but I find it frees me up. If I know the story when I start in, I feel like I can really concentrate on the characters, the prose, the mood. If I know where I’m going, I can really play on the margins. And I actually love the margins more than the meat sometimes.
Plot, structure, and pacing are all interrelated. Plot is clearly the first thing and needs to be differentiated from story. Story is — what is this thing about? Plot is — how am I going to get there, to tell this story in individual beats. Structure is — the most effective means to do so, to maximize drama. And pacing ties into structure. You just need to make sure there’s room to breathe between the big story beats.
Really most writing just comes down to one thing: What is the most effective way to dole out the pertinent information for this particular scene or moment?
I can be obsessive about construction, but I secretly think it’s because I don’t like to rewrite. I like to rewrite prose and dialogue — but I HATE having to rebuild a story from the ground-up once I’m halfway through the book.
I’m not offering any of those tips as a panacea. I’ve just seen too many writers get 2/3 of the way into a book and realize that the plot is not working. And you lose a year trying to save what you love, while reshaping the whole thing. That’s my nightmare. And I’ve been there.
For six days in the fall of 1996, I was an excellent tight end for the Warriors of William H. Hall High School in West Hartford, Connecticut. I ran the post route and the flag route and once in practice nearly caught a very long pass. I was only a second-stringer for the freshman team, but I had the underdog’s irrepressible optimism: here comes JV, Varsity, a scholarship to Ohio State, the NFL draft, the first celebration in the end zone at the Meadowlands while thousands upon thousands cheered.
It never quite panned out. There was an inauspicious 76 on a geometry test: I had been too busy studying quarterback signals to learn the defining characteristics of an isosceles triangle. This is a woeful mishap for the son of a mathematics teacher. The day before a game against either Windsor Locks or Enfield, I was pulled by my father from the team. Later, I participated in the far less demanding sport of volleyball, my infrequent spikes resounding in a gymnasium that had never known much glory.
That’s all just to say that I wanted very badly to fall in love with Friday Night Lights, the football drama that recently concluded a five-season run on NBC. I was primed for its cavalcade of disappointments, because I had known those disappointments myself.
In addition, both my wife and I came of age in that golden age of the artistic television drama. We are both in our thirties, and remember when TV was impossibly crude (Married…with Children), low-brow (Walker, Texas Ranger), and utterly untroubled by reality (Saved by the Bell).
With the advent of NYPD Blue in 1993, that started to change. TV, all of a sudden, could be serious and real. You didn’t need Don Johnson anymore, and you didn’t need a laugh track. And with The Sopranos and later The Wire, even with Sex and the City and Curb Your Enthusiasm, TV could be something even greater than that. “Television had always been a pleasure, a mass entertainment…But in the aughts, the best TV-makers displayed the entitlement of the artist,” wrote Emily Nussbaum in a 2009 New York magazine article entitled “When TV Became Art.”
And we had arrived with it. Freshly minted graduates of liberal arts institutions, we were primed to treat the new TV drama like an object worthy of our catholic, overripe intellects. We could do a Derridean reading of Breaking Bad. We could watch Mad Men with Foucault.
For many people, Friday Night Lights, which first appeared in 2006, represents the pinnacle of the new TV drama. It is less polished than Mad Men and less dour than The Wire, and somehow more relatable than both, as far as its numberless fans are concerned.
I am not one of those fans, despite having watched all five seasons. In fact, my distaste for Friday Night Lights only increased as the seasons went on, so that I was taken with launching lengthy diatribes at the television. I am fortunate to still be married.
Now, there is still plenty of bad television around, and I am content to render Dancing With the Stars unto those who want to watch it. But Friday Night Lights has somehow become a cause célèbre among the sort of crowd that would much rather spend its Sunday afternoons brunching in Brooklyn than watching a Houston Texans game. They have elevated the show to high art, with appreciations of resident hunk Tim Riggins in the same Paris Review where Norman Mailer once roamed and, on ever-so-sober NPR, “A Late-Blooming Love Letter to NBC’s ‘Friday Night Lights.’”
“Heartbreakingly good,” says Entertainment Weekly; “an exquisite bit of anthropology,” opines the New York Times. Bullshit, I say to all of them. Friday Night Lights is bad television. And if it is art, then it is art that is purposefully misleading, which is art of the worst kind.
Forget the amateurish acting, which vacillates between maudlin enthusiasm and shrill discord. Forget, too, the recycled plotlines that always have the hometown fans of Dillon pinning their hopes on fourth and long. Something is truly rotten in the state of Texas.
It begins with the whole “clear eyes, full hearts, can’t lose” mantra, which coach Eric Taylor, the show’s protagonist, delivers with all the growling gusto of Churchill before the Battle of Britain. Now, every sports team – and every sports show – is entitled to its inspirational bromides. But on Friday Night Lights, “clear eyes, full hearts” is elevated to a central tenet to which the characters subscribe as if it were religious truth.
There’s nothing wrong with optimism, not even with optimism that crosses over into delusion – that’s the kernel of nearly every Raymond Carver story. It’s that unmoored optimism we invoke when we call something “Ahabic” or “Quixotic.” But in a Carver story, the careful use of irony allows the reader to make an independent judgment of the characters. Each one of Carver’s down-and-outers thinks his break is right around the corner, even though the narrator subtly broadcasts to us that it isn’t. This is the situational irony that Aristotle found in Oedipus – the arrogant king is looking for the transgressor who has cursed Thebes, unaware that it is himself.
Mad Men has its Oedipus in Don Draper, an outwardly successful man living a life as transparent as tissue paper. Baltimore is the Oedipus of The Wire, a sick city that nobody is capable of healing. In watching Don sink deeper into alcoholism and drift farther from his family, in witnessing the failure of every institution in “Body More” except for the drug trade, we feel pity and fear – the two emotions that, for Aristotle, give great art its pathos. More than two thousand years after he wrote the Poetics, all is as it should be.
But Friday Night Lights has no Oedipus of its own, no fallen king – and it has no irony, either. Nobody here is ever in danger of really losing. Characters do not so much overcome their troubles as they are saved from them providentially – every pass in FNL is a Hail Mary caught by a diving, flailing wide receiver for a last-second, game-winning touchdown. As such, all that overcoming is superficial and rushed.
Tyra Collette, a rebel with no interest in her studies, suddenly becomes inspired and crams for the SAT. Presto, she’s into the University of Texas’s flagship Austin campus. Matt Saracen, a middling athlete if there ever was one (and I should know), becomes a Manning brother overnight and wins the state championship. His friend Landry Clarke walks onto the Varsity squad of a championship team, though he appears to have minimal knowledge of and enthusiasm for football. More troublingly, he kills his girlfriend’s assailant, but they get over the body-dumping in the span of a couple of episodes. Because what’s the law when love is on your side?
Then there’s queen bee Lyla Garrity, who leaves paralyzed quarterback Jason Street for the aforementioned Riggins. Then she leaves Riggins for Jesus and ends up having a dalliance with a youth leader at her megachurch. Then she comes back to Riggins. Then she leaves Riggins and goes to Vanderbilt.
I don’t dislike Lyla nearly as much as I dislike what Friday Night Lights creator Peter Berg and his writers did to her – or failed to do with her, rather. Is she tortured like Anna Karenina? Is she yearning for freedom like Emma Bovary? She can’t just smile through every scene in her cheerleading outfit. It can’t always be all-good, all the time. If it could be, I would have long ago moved to East Texas.
The Season 2 case of Santiago is especially infuriating. He is a young criminal with apparently boundless athletic potential, and Buddy Garrity takes him into his own home so that he can qualify to play for the Dillon Panthers. He does, but just as he starts to excel on the field, and just as his old criminal friends start to intrude on his new life, he is gone from the show without even the most perfunctory explanation. This isn’t Stalinist Russia; you don’t just disappear a character like that.
And the treatment of race is just absurd. Is this not the same Texas where James Byrd was killed in 1998 by three white men who dragged him behind their truck until his head came off? Apparently not, since every social event is a Rainbow Coalition of well-dressed, happy families. There is no color line, no class divide, only the love of football.
This robs Friday Night Lights of any pathos and makes it instead an unwitting champion of the bathetic – the fall from the sublime to the ridiculous that Alexander Pope lampooned as “the art of sinking” in poetry. You can be sure that if Oedipus were on Friday Night Lights, he would soothe the pain of his sin by joining the football team. His mother Jocasta would cheer from the stands, and he would wear a patch on his jersey with his dead father’s image.
I don’t care if art is realistic, but I want it to be true. This is what Aristotle demanded in the Poetics and it is what we should demand today, whether from our novelists or our television producers.
To be realistic, art has only to have fidelity to material reality, which is easy enough and not that important anyway. Beowulf and The Odyssey are not real, but that doesn’t diminish them in the slightest. It doesn’t diminish Harry Potter, either.
Truth is much harder. What Keats said about beauty and truth hasn’t changed in the nearly two hundred years since he wrote “Ode on a Grecian Urn” – the two are still one and the same.
This is where Friday Night Lights fails – there is nothing true about it. It ignores hard battles in favor of superficial ones. I know enough about the world, and you surely do as well, to know that Vince Howard’s mother could not turn, in the span of two episodes, from a drug addict to a spry middle-aged mother. It would be pretty to think so, as Hemingway once wrote, but all experiential evidence is against it. This kind of ease with fate may be uplifting in the space of forty-five minutes, but it makes for a hollow show. It’s not that I want Matt Saracen to fail; I just want him to struggle the way real people do, the way that Oedipus struggled against his fate. That will make his victory more meaningful in the end.
There is one great scene in Friday Night Lights. Julie Taylor, the coach’s daughter, does not want to return to college in the middle of Season 5 because she has had a disastrous affair with a teaching assistant. Her father is furious and insists that she go back to school and face the consequences of her romance, but when he tries to drag her out of the house, she resists in a paroxysm of tears. The scene is unexpected but inevitable, as Aristotle said great drama should be. It is real, it is true, and you don’t know where it’s heading. The show needed more of that – much, much more.
What bothered me most, though, was Tim Riggins’s hair. It is always unfairly perfect, a surfer’s locks falling over his face. It is perfect when he is playing football, it is perfect when he is drinking beer in the afternoon, it is perfect when he drops out of college, it is perfect when he goes to jail, and it is perfect when he schemes to buy an enormous plot of land without, seemingly, enough in his bank account to pay for a round of drinks.
My wife told me to stop screaming at the television, but I couldn’t. Nobody has hair that perfect. It isn’t real, it isn’t true, and it certainly isn’t art. You don’t need Aristotle to tell you that.
“They seem to have things under control,” I said.
“Whoever’s in charge out there.”
“Who’s in charge?”
Despite having closely followed the disastrous events in the Gulf for over a month with something akin to self-flagellatory devotion, growing increasingly angry and disillusioned with each failed attempt to contain the stricken oil well, I recently booked a South Caribbean cruise for my honeymoon in January. It was only after the plans had been finalized that I realized how little the oil spill had actually affected me: I operated under the assumption that someone—the government, BP, someone—would have the “situation” resolved, cleaned up, and concluded before it could intrude on my vacation. I had blithely researched and planned the cruise, never considering that the worst manmade environmental disaster in our nation’s history might have real repercussions for me. This naïve self-assurance gave me pause and, like many avid readers, I turned to what literature might teach me about such hubris.
Don DeLillo’s 1985 novel White Noise narrates the events of a manmade disaster so eerily similar to the Gulf oil spill in some of its details that it has an aura of prognostication. The novel is narrated by Jack Gladney, a professor of Hitler studies at College-on-the-Hill in Blacksmith, a quiet town somewhere in the U.S. Jack is an incredibly sympathetic character. Contrary to what we might predict for the professor who founded an academic discipline devoted to studying the most heinous figure in modern history, Jack is a good husband and father, kind to his coworkers, and generally affable. Even his idiosyncrasies are endearing: he wears dark-tinted sunglasses on campus, changes his professional name to J. A. K. Gladney, and gains weight to bulk out his frame, each pose an attempt to acquire the gravitas expected of him by students and fellow professors. The careful cultivation of his public persona is matched by his need to provide answers for his family, to be a source of knowledge and assurance to his adolescent son, and to appear to have control over events outside his field of expertise.
When an accident in a nearby train yard spills 35,000 gallons of “Nyodene Derivative” (a fictional, highly toxic byproduct of commercial insecticides), creating an amorphous black cloud quickly named an “airborne toxic event,” Jack assures his family that they will be safe without fleeing home: “These things happen to poor people who live in exposed areas. Society is set up in such a way that it’s the poor and the uneducated who suffer the main impact of natural and man-made disasters. People in low-lying areas get the floods, people in shanties get the hurricanes and tornados. I’m a college professor. Did you ever see a college professor rowing a boat down his own street in one of those TV floods?” Even as the air currents threaten to send the toxic cloud toward his neighborhood, Jack insists that alarm would be out of step with his professional position, saying “I don’t see myself fleeing an airborne toxic event.”
Jack’s self-assurance can be maintained only through an illusion of control. He assumes that the weather, government, and his socio-economic status will all contrive to protect him from the threatening black cloud. But this illusion is wrested from him after he learns that his two-minute exposure to the toxin will likely jeopardize his health, though it will be fifteen years before the symptoms begin to manifest. Now “scheduled to die,” Jack finds his fear of death encroaching upon his ability to see himself among the living. Confiding to a fellow professor, he speaks of the trap he finds himself in: “It’s almost as though our fear is what brings it on. If we could learn not to be afraid, we could live forever.” Caught between the living and the dead, Jack is driven by fear and uncertainty in everything he does after the exposure.
The victims of the Gulf oil spill are now trapped in the same epistemic gap in which Jack finds himself. Possibly the most confounding aspect of the disaster is that after two months there is still no certainty as to the extent of the damage. It is not merely a problem of tracking the massive, miles-long invisible plumes of oil that are suspected to be floating below the surface. A more essential problem is that the government and BP have been unable to determine how much oil is leaking from the well. There are only best and worst case scenarios separated by tens of thousands of barrels per day (as of this writing, it was estimated that between 12,600 and 40,000 barrels per day were bleeding into the Gulf before the riser was cut, and between 35,000 and 60,000 barrels per day afterwards).
Being unable to fathom such quantities, we are in a situation similar to Jack’s: things are bad, danger is lurking, but we don’t know its full extent. Like Jack’s, our exposure has been consummate, and fatal for the health and economic stability of many, but the final tally is not yet in.
Much of the novel’s pathos derives from Jack’s attempts to regain control of his life while living in the gap—living with the uncertainty of certain death. First, he alters his routine and begins to obsessively see his doctor and search for a miracle cure for his fear of death, a drug called Dylar. In the end, he violently steals the drug, consciously plotting his movements, the effort to superimpose order on his actions altering his narrative voice from the avuncular professor to the conniving criminal. The reversal of Jack’s fortunes is classically tragic, resulting from his flawed self-assurance. He both fears and longs for a conclusion to the uncertainty, desiring the resolution inevitable at the conclusion of any plot. It is as if he had read Aristotle’s Poetics and now awaits the catharsis available at the ending.
Keeping in mind E.M. Forster’s comments in Aspects of the Novel on the difference between “plot” in drama and the modern novel—the latter of which gives much greater emphasis to character development and action which derives organically from that development— Aristotle’s well-known emphasis on the unity and parts of a plot reveals what we as readers seek in narrative. Turning on either (though ideally both) a recognition on the part of a character or a reversal of his fortunes, the best plots are those which elicit sympathy and pity for the characters, resulting in catharsis for the audience. But the emotional payoff can come only at the conclusion, the result of both identifying with the characters and realizing that though you could be in their situation, you are not.
DeLillo not only masterfully plots White Noise; his characters also speak eloquently of “plots.” Lecturing to his class, Jack opines that “All plots tend to move deathward. This is the nature of plots. Political plots, terrorist plots, lovers’ plots, narrative plots, plots that are part of children’s games. We edge nearer death every time we plot. It is like a contract that all must sign, the plotters as well as those who are the targets of the plot.” In other words, plotting is a way of reaching an end, the conclusion, and resolving whatever degree of mystery is left in a narrative or life. A plot gives structure to messy and meaningless facts by tying them together but in so doing, requires that the telling be curtailed, sometimes prematurely (for instance, the litigation and environmental cleanup from the oil spill will undoubtedly be with us for years to come, but the “narrative” of events that our culture will construct—in the media and in court—will likely provide an ending that doesn’t account for these lingering signs of the spill).
Aware that death is growing inside him, Jack has essentially short-circuited his life’s “plot.” There is no mystery left. Asked if he would like to know the exact date of his death, he says “Absolutely not. It’s bad enough to fear the unknown. Faced with the unknown, we can pretend it isn’t there. Exact dates would drive many to suicide, if only to beat the system.”
As writers and readers, we are bound to what Forster called the “tyranny of the plot.” Obligated to tie up loose ends, the writer must often sacrifice true characterization, curtailing the organic development of his characters (often with a “contrived” death or marriage, though obvious exceptions are the modernist ambiguous ending and the postmodern fragmented narrative). Forster questions the necessity of false endings: “Why is there not a convention which allows a novelist to stop as soon as he feels muddled or bored? Alas, he has to round things off, and usually the characters go dead while he is at work, and our final impression of them is through deadness.” Why must all things move “plotwards”? How can the “deadness” of the characters (both creatively and in the plot) be accounted for? It is as if writers are compelled to sacrifice their characters to the reader’s need for catharsis and redemption, found in the resolution of the plot. This, I believe, is the answer given by Aristotle. We need endings to reassert our own humanity and to find life even in death.
In this way, there is something life affirming in even the greatest disasters. But only after they have ended: only after the tale of survival has been concluded and can be retold, filling in the gaps in a way that brings logic to bear on the messiness of life, creating a narrative that allows those not directly affected (the “audience” of the disaster) to live with fear by rehearsing disaster through its displacement. As stated by one of the characters in the novel, “The more we rehearse disaster, the safer we’ll be from the real thing.”
But we live in the gap, in that middle section of the novel where nothing is resolved and everything is at stake. Rereading White Noise, I recognized that plotting and planning are just ways in which I try to project order onto chaos. This is where fiction departs most drastically from life. In reading fiction, we must learn to willingly suspend disbelief. But the beauty of living in the middle is the ability to will ourselves to believe that in these moments of suspension there is opportunity for human action.
A common stereotype about art and mental health is that writing is catharsis – a way of healing old wounds, defeating childhood demons, processing the past and putting it in its place. My thesis here is the opposite – that writing is a process of degrading one’s emotional state.
I’ve been working the last two years on my first novel, which has certain elements of autobiography to it, and in that time I can’t help but notice a certain decline in the indicators of good mental health. It’s not that I’ve turned into the tortured artist type, appearing in public looking like ten miles of bad example, nor was I a paragon of mental health to begin with. But compared to the years before I started the book, I have become moodier and more withdrawn, and most of the time I’m thinking about the book instead of connecting to the people around me.
About the same time I started to wonder where this extended case of the blues was coming from, two different people – a friend and my therapist – remarked that it must be good to be “working through” my childhood issues with this writing project. “Ha!” I thought. “If I want to work something out, I’ll write it down in my journal.” The common assumption that writing is cathartic just didn’t have the ring of truth when I tried to apply it to my situation.
I think I’ve figured out why. First, let’s take for granted that novelists have to get inside the minds of their characters in order to make their work feel as honest and real as possible – to imagine themselves thinking and feeling as their characters would. That involves a lot of sitting around wondering what it would be like to be disappointed with humanity, unable to take a joke, or otherwise afflicted by the modernist tradition.
I can usually get in that mental space, kind of like I can always afford a plane ticket to visit home – a stretch, but not impossible. When I do have trouble, one technique I use is getting up out of my writing chair to make faces in the mirror or act out a scene. One time my wife came home and found me trying to squeeze underneath our bed so I could picture what one of my characters would be seeing when he goes to ground. “Oh good, you’re here,” I said when I recognized her shoes. “Now stand there and scream at me like a speed freak.” (She’s very supportive, but she has expressed an interest in knowing when the book will be done.)
It’s all just pretend, of course. In a way I’m turning around my therapist’s advice to “fake it till you make it.” He means that it helps to pretend you feel better until you are better. I’m no expert, but as I understand it, that advice is a pretty good shorthand description of how cognitive therapy works. If you can train yourself to think differently about troubling events, then you’ll start feeling differently about them. Habitually reminding yourself that a given trauma wasn’t the end of the world eventually takes the vinegar out of it. I happen to agree and in fact have benefited from that kind of therapy. “Fake it till you make it” has been a great way to write new data on the old hard drive.
In other words, while it’s true that creating interesting characters is just pretending, the theory of cognitive therapy is that pretending really matters. But the problem with faking my characters until I’ve made them real is that interesting characters aren’t on a path to healing. They are responding to stress in very human and understandable – and usually not very healthy – ways. Who wants to read a story about a bunch of people who have already worked through all their issues? Just perfectly self-actualized protagonists and antagonists battling to see who is stronger? That’s like reading an arm-wrestling contest.
The best characters are so far from “making it” that they are their own worst enemies, carrying inside themselves the seeds of their own destruction. (Cf. Aristotle’s Poetics.) So getting in the head of a tragic hero, or any kind of believable character in a conflict, is like deliberately decompensating – etching bad data over good, imprinting the cognitive processes with all the self-destructive habits that your therapist taught you to break. It’s faking it until you make a mess big enough to warrant a novel. In that way, writers become their own anti-therapists, using their powers for evil.
Now, if my wife reads this, she’s going to worry about me more than she already does, so I hasten to offer this antithesis: novels have resolutions, which means characters get the opportunity to “work through” their issues at the end. And it’s not going to be a very dramatic resolution unless, along with the seeds of their destruction, the characters are also carrying the seeds of their salvation.
After all, the tragic hero is heroic in some way, and not every drama ends in tragedy anyway. If characters are going to have a fighting chance against their demons – internal or external – writers have to save as much good data to the character’s hard drive (and their own?) as bad. If so, perhaps the novelist can reach the same catharsis that Aristotle says the audience should enjoy in the end.
[Image credit: Tommaso Meli]