Letter from the Collapse


“The crisis consists precisely in the fact that the old is dying and the new cannot be born; in this interregnum a great variety of monsters are born.” —Antonio Gramsci, Prison Notebooks (1930)

“Twenty thousand years of this, seven more to go… The quiet comprehending of the ending of it all.” —Bo Burnham, Inside (2021)

In our corner of Northern Virginia, we were fortunate to never see the dead birds. Yet throughout the Mid-Atlantic—a cardinal on the pebbly beaches of Delmarva or a sparrow on the Jersey Shore, a finch like an omen in front of Independence Hall or a bluebird as a threat on the steps of the Capitol—the creatures started to die by the thousands. With little sense of this new plague, experts recommended the removal of bird feeders. And so I dutifully took down the tall model where I examined mourning doves over morning coffee and listened to woodpeckers on the birches, watched the hawks who flew above, and the sleek, elegant crows speaking in their own impenetrable tongue. The Allegheny Front, an environmental show on Pittsburgh’s WYEP, posted a photograph of an afflicted robin found in Erie, Penn. Laid out in a cardboard box decorated with spruce leaves, the otherwise pristine creature looked like it was sleeping, the only sign of its illness the thick crust on its sealed eyes. An effect not unlike the wisps of cotton that escape from underneath the lids of taxidermied birds. “The phenomenon has since spread through 10 states,” writes Andy Kubis at The Allegheny Front, “including West Virginia, Ohio, Maryland and Delaware, and in 61 of 67 Pennsylvania counties.” Observers noted neurological symptoms, birds unable to fly or crashing into the ground; the dead animals, found framed by the brittle, yellow grass of sweltering June, with the characteristic discharge from eyes and beaks.

Ornithologists proffered hypotheses, noting that the avian pandemic accompanied the cicada Brood X. Those creatures we couldn’t avoid seeing, skeletal eldritch horrors bursting from the earth and their own bodies: red-eyed grotesqueries whose incessant droning permeated the humid air for weeks, who dropped from branches and through car windows like something out of a horror film. Between the dead birds and the cicadas, the summer had a Pharaonic gleam, intimations of Exodus. A surreal poetry to these chthonic beings, the existential crisis of their lives spent hibernating for 17 years, only to emerge and then die. “Happy the Cicadas live,” wrote Charles Darwin in Descent of Man, and Selection in Relation to Sex, though quoting Xenarchus. Our dog took to biting them in half and pushing them between the slots of our deck’s wooden planks, casting them back to hell. By the time they disappeared, without even bothering to say goodbye, I’ll confess that we missed them. But in their brittle, green bodies there was an answer to the bird pandemic, for it seemed that people had attempted to poison the cicadas, and after ingesting their pesticide-corrupted corpses the birds were killed instead. The “sense of cosmic significance is mostly unique to the human relationship with birds,” writes Boria Sax in Avian Illuminations: A Cultural History of Birds, but not apparently to those squicked out by some bugs, the same people who undoubtedly water their lawn during a drought, or who buy the last 10 chickens during the coming food shortages. Trillions of cicadas emerged; to avoid them was an impossibility, and you only had to bear them for a short while, yet people too impatient to wait and unable to reason that there is no eliminating something of that magnitude decided that they knew better. Is there a more perfect encapsulation of the American mindset in these dwindling days?

I’d be amazed if you couldn’t sense it—the coming end of things. A woman sits by her grandmother in a St. Louis, Mo., ICU, the older woman about to be intubated because Covid has destroyed her lungs, though until a day before she insisted that the disease wasn’t real. In Kenosha, Wisc., a young man discovers that even after murdering two men a jury will say that homicide is justified, as long as it’s against those whose politics the judge doesn’t like. Similar young men take note. Somebody’s estranged father drives to Dallas, where he waits outside of Dealey Plaza alongside hundreds of others, expecting the emergence of JFK Jr., who he believes is coming to crown the man who lost the last presidential election. Somewhere in a Menlo Park recording studio, a dead-eyed programmer with a haircut that he thinks makes him look like Caesar Augustus stares unblinkingly into a camera and announces that his Internet services will be subsumed under one meta-platform, trying to convince an exhausted, anxious, and depressed public of the piquant joys of virtual sunshine and virtual wind. At an Atlanta supermarket, a cashier making minimum wage politely asks a customer to wear a mask per the store’s policy; the customer leaves and returns with a gun, shooting her. She later dies. The rural mail carrier who has driven down the winding, unnamed roads of a northwestern Oregon hamlet for over three decades notes to herself how the explosion of annoying insects on her windshield was entirely absent this summer. A trucker who lives in Ohio blows an air brake line, and when trying to get a replacement finds that it’s on backorder indefinitely. Walking across Boston Common this October, two men holding hands and heading toward the duck boats realize that they’re both sweating under their matching pea coats. It’s 83 degrees. On the first day of July, my family huddles in our basement; a tornado has formed in the District of Columbia, and is rapidly moving across the National Mall.

Everyone’s favorite Slovenian Marxist Slavoj Zizek snottily gurgled it a decade ago, writing in Living in the End Times that the “global capitalist system is approaching an apocalyptic zero-point,” and identifying four horsemen in the form of environmental collapse, biogenetics, systemic contradictions, and “explosive growth of social divisions and exclusions.” Not everyone claims to see the gathering storm, however, especially those who are most responsible, though if they do, they’re silent about it in their New Zealand compounds. Degenerate, chipper faux-optimism is a grift during our epoch of dusk; Jeff Bezos expecting us to clap when he shoots Captain Kirk into space; Elon Musk mouth-breathing about cryptocurrency and terraforming the rusty soil of Mars, as if we haven’t already heated one planet too much; Peter Thiel promising us that there will be a digital heaven where all of the billionaires can download their consciousness unshackled from the material world, and we can serve alongside them as Egyptian slaves entombed with their masters, clicking on PayPal and Amazon and Facebook for a silicon eternity. Such promises are the opposite of hope; they’re only grinning assurances of dystopia instead of apocalypse. Besides, such things are chimerical; ask not for whom the Antarctic ice shelf collapses, or for whom the ocean acidifies, or for whom the temperature rises by 3 degrees Celsius, it does all these things for Bezos, Musk, and Thiel as much as you and me. Ours is the age of Covid and QAnon, supply chain breakdown and surveillance capitalism, food shortages and armed militias, climate change and bio-collapse. We’re merely in a milquetoast interregnum as we wait for monsters to be born in a year, in three. If poets and prophets have traditionally been our Cassandras, then on some level everybody knows that a rough beast is slouching towards Bethlehem right now, though despite that one sees perilously little grace, kindness, and empathy. Even those whose insanity leads them to believe whatever conspiracy theory happens to give them scant meaning intuit that the insects are disappearing, the waters are rising, and the absence of 700,000 lives means that something is askew.

“The world sinks into ruin,” wrote St. Jerome in 413, some six decades and change before the final sack of Rome that marks the Western empire’s fall. “The renowned city, the capital of the Roman Empire, is swallowed up in one tremendous fire,” he noted of the Visigoth Alaric’s siege. Hard not to imagine that some realized the end was coming, shortages of pungent garum made in Mauretania, a scarcity of Cappadocian lettuce and Pontic fish. In 410, the Emperor Honorius recalled all legions from Britannia to defend the eternal city from the Visigoths who would soon traipse through its burning streets. Envision that horde, ascending the marble steps of the Senate, in furs and horned helmets, brandishing their red standard and crowding through the halls of that once august and solemn space. Can you even countenance it? The Romanized Celts requested from the emperor the return of defensive legions, and in his rescript Honorius “wrote letters to the cities in Britain urging them to be on their [own] guard.” The United States Postal Service will be late in delivering packages, because of supply chain shortages there is no chicken available at the Stop & Shop, the power grid will be down this winter in Texas. You’re on your own. As civil society crumbled, Romans turned to all variety of superstitions and occultisms, cults and conspiracies. As Edward Gibbon noted in The History of the Decline and Fall of the Roman Empire, the “zeal of fanaticism prevailed over the cold and feeble efforts of policy.” Stop the steal! Lock her up! Make America GREAT again! Living on a heating planet filled with dying animals and governed by either the inept or the insane, it’s hard not to feel a bit strange going to work, buying groceries, saving your salary, as if everything were normal. “We live as though we are going to die tomorrow,” wrote Jerome, “yet we build as though we are going to live always,” or, as David Byrne sang, “Why stay in college? Why go to night school?… I ain’t got time for that now.”

Whenever comparisons are made between Rome and America, there’s always somebody who denounces such language as not just histrionic, but clichéd. The latter is certainly fair; ever since the founders obsessed over republican virtue we’ve imagined that the Potomac is the Tiber, and we’ve parsed (arch-royalist) Gibbon’s history for clues about our falling. Copies of Plutarch and Livy were brought to the Continental Congress, and the most popular colonial American play was a turgid script by Joseph Addison about Cato (it would be performed at Valley Forge). The young Republic declared itself to be a “Novus ordo seclorum,” a “New Order of the Ages,” in conspicuous Latin borrowed from Virgil’s fourth Eclogue, while the ratification debates were conducted under pen-names like Caesar, Brutus, and Publius, and John Adams attributed his worldview to Cicero. Roman symbolism was ubiquitous, as in the fasces that would adorn the Senate located on Capitol Hill. When George Washington deigned not to hold a third term, he was compared to the noble dictator Cincinnatus who dropped his sword for a plow, which was enough virtue that by 1840, four decades after the first president’s death, the sculptor Horatio Greenough rendered the general as a muscular Jupiter in a toga. By the final year of the Civil War, the first president was depicted underneath the Capitol dome as a purple-robed Roman god in “The Apotheosis of Washington.” The Lincoln Memorial, the Supreme Court, the Capitol, all of it neo-classical ridiculousness. Gore Vidal recalled in United States: Essays 1952-1992 that his grandfather, Sen. Thomas Gore of Oklahoma, remarked to Franklin Delano Roosevelt about the bloated buildings of Washington that “At least they will make wonderful ruins.”

Vidal, that classical patrician, wrote that “Empires are dangerous possessions… Since I recall pre-imperial Washington, I am a bit of an old Republican in the Ciceronian mode, given to decrying the corruption of the simpler, saner city of my youth.” Hardly a postbellum pose, for critics have feared that the Republic would slide into an Empire before the Constitution’s ink was dry. Naturally there is also fear of collapse, and long has there been foreboding about the decline and fall of the American Empire. On the top floor of the austere New-York Historical Society, there is a pentad of paintings by the unjustly forgotten landscape artist Thomas Cole, a series known as “The Course of Empire.” Rendered between 1833 and 1836, the series reflects Cole’s disturbance at both the vulgarity of Jacksonian Democracy and the brutality of Manifest Destiny. A member of the Hudson River School who reveled in the sheer grandiosity of the nation’s natural spaces, Cole imagines in “The Course of Empire” a fantastical country passing from its primitive state of nature, through an idealized agrarian state, into a decadent imperium, an apocalyptic collapse, and finally desolation. Overlooking each painting is the same mountain peak, roughly the shape of Gibraltar’s rock, the one consistency as Cole’s civilization follows the course of its evolution, a reminder that nature was here before, and despite how we may degrade it, will still be here afterwards. The penultimate landscape, entitled simply “Destruction,” presents the denouement of this fantastic city, a skyline of columned, porticoed, and domed classical buildings in flames, billowing smoke partially obscuring that reliable mountain; vandals flooding the streets, murdering and raping the city’s citizens, pushing them into the mighty river that bisects it. A triumphant monumental statue is now decapitated. With its wide marble buildings and its memorials, Cole’s city resembles nothing so much as Washington D.C., though when he lived the capital was more provincial backwater than the neoclassical stage set it would become. Cole made a note that “the decline of nations is generally more rapid than their rise,” concluding that “Description of this picture is perhaps needless; carnage and destruction are its elements.”

Enthusiasm for such parallels, along with attendant breathless warnings (including the ones that I’m making), has hardly abated. In just the past decade, there have been articles entitled “8 striking parallels between the U.S. and the Roman Empire” by Steven Strauss in 2012 at Salon, Pascal-Emmanuel Gobry’s “America now looks like Rome before the fall of the Republic” from 2016 in The Week, Tim Elliot’s 2020 piece at Politico entitled “America is Eerily Retracing Rome’s Steps to a Fall. Will It Turn Around Before It’s Too Late?,” Vox’s essay from that same year “What America Can Learn from the Fall of the Roman Republic” by Sean Illing, and Cullen Murphy’s succinct “No, Really, Are We Rome?” from The Atlantic of this year. Just to dissuade those who parse such things, Tom Holland wrote “America Is Not Rome. It Just Thinks It Is” for The New York Review of Books in 2019. In an article that reprints Cole’s painting underneath the headline, a pull-quote reads “There is nothing written into the DNA of a superpower that says that it must inevitably decline and fall.” Well, with all due respect, the second law of thermodynamics mandates that everything has to fall apart, but Holland’s point is taken that in a more immediate sense, comparisons of America to Rome tell us little about the latter and everything about the former. But for those who see the comparison as tortured beyond all reasonableness, the truth can be bluntly stated as follows: our current problems aren’t like the fall of Rome because they’re far, far worse. Would that we only faced the collapse of the U.S. government, or authoritarianism, or even civil war, because the rising average temperature per year, the pH of the oceans, and the biodome’s decreasing diversity are things unheard of on the Earth since the Permian-Triassic extinction of more than 250 million years ago, when 70 percent of life on land and almost 95 percent of life in the seas perished.

“It is worse, much worse, than you think,” writes David Wallace-Wells in The Uninhabitable Earth: Life After Warming. Wallace-Wells describes the five previous mass extinctions that shaped evolution, explaining that four of these “involved climate change produced by greenhouse gas.” Before the Permian-Triassic extinction, the land was occupied by the fin-backed Dimetrodon and the hog-shaped Lystrosaurus, the abundant atmospheric oxygen supported massive dragonflies and centipedes, and the oceans were plentiful with mollusks and trilobites. For some still unexplained reason the amount of carbon dioxide rapidly increased, which in turn triggered the release of methane, so that this feedback loop “ended with all but a sliver of life on Earth dead,” as Wallace-Wells writes. “We are currently adding carbon to the atmosphere at a considerably faster rate; by most estimates, at least ten times faster,” he explains. If we didn’t know what caused that warming 250 million years ago, we know what’s doing it now—us. Should the worst-case scenario of the United Nations Intergovernmental Panel on Climate Change come to pass, then in the coming century the exponential increase in warming will result in an ice-free Arctic, obliteration of the coastal cities where two-thirds of humans live (no more Venice and Amsterdam, New York and Miami), the mass destruction of farmland, continual massive wildfires next to which we will look back fondly on the summer of 2021, never-ending hurricanes and tropical storms, heat waves, droughts, desertification, new pandemics, and at worst the acidification of the ocean and the resultant perishing of most things that live beneath the waves. Short of a social or political revolution to reorient the world away from the cannibalistic capitalism which has brought us to this moment, we’ll read Gibbon as halcyon (assuming anyone is around to read).

This summer I threw a little digital life buoy out into the whirlpool of Twitter, another one of those horsemen of dystopia, and asked others what it felt like to be living during what could be the apocalypse. Mostly I discovered that my anxiety is common, but one gentleman reminded me that there were medieval millenarians and Great Awakening Millerites awaiting their messiahs who never came, and that they were all mistaken. That is, if you’ll forgive me, exceedingly stupid. There have been times when I was sure that I was going to die—the shaky prop plane flying low to the ground between Philly and the Lehigh Valley and the erratic driver going 20 miles over the speed limit who almost side-swiped me on a stretch of I-95 in Massachusetts—but just because I survived shouldn’t lead me to conclude that I’m immortal. Armageddon isn’t any different. My critic, though, seems to be in the minority—most people have that sense of foreboding, picking up whatever cries are coming from the Earth, that the summers feel hotter, the animals scarcer, the sky sometimes glazed with an ungodly glow from the redness of western fires. “The piers are pummeled by the waves;/In a lonely field the rain/Lashes an abandoned train,” wrote W.H. Auden in his 1947 poem “The Fall of Rome,” perhaps about his own justified fears regarding nuclear conflagration. I imagine the poet placing his wrinkled, droopy, hang-dog face to the ground and picking up on those frequencies that are today a cacophony, the “Private rites of magic” that now mark the fascists of one of our only two parties, how “an unimportant clerk/Writes I DO NOT LIKE MY WORK” reminding me of the striking heroes who are leaving the degrading and barely remunerated labor of late capitalism, how the “Herds of reindeer move across/Miles and miles of golden moss” in a warm arctic, and my beloved “Little birds with scarlet legs… Eye each flu-infected city.”

From the Greek, “apocalypse” means to “uncover” hidden knowledge, so for those of us anticipating what the future holds, it’s been the apocalypse for a while. What are you to do with this knowledge? Our politics operate on inertia and project onto individuals a responsibility that was always vested in the powerful themselves. Perhaps you should ditch your car, turn off your air conditioning, recycle, give up meat, and begin composting, but do that because those things are good for your soul, not because you’re under any illusions that “Not The End of the World” is a consumer choice. Be neither a defeatist nor, certainly, an accelerationist, for avoiding the boiling of the oceans and the burning of the air must be what we put our shoulders to the door for. “To hope is to give yourself to the future,” writes Rebecca Solnit in Hope in the Dark, “and that commitment to the future is what makes the present inhabitable.” Waiting for transformation like it’s the messiah isn’t preferable to collectively willing that transformation, but I know not what that will look like because I’m not a professional revolutionary. The signs appearing in the windows of McDonald’s and Subway, Starbucks and Chipotle, from workers tired of being mistreated and underpaid are the largest labor rebellion in a generation, the totally organic Great Resignation spoken of everywhere and reported on nowhere—it gives me hope. It gives me hope because that dark faith, the capitalism that has spoiled the planet, isn’t inviolate; a confirmation of Ursula K. Le Guin’s promise that “We live in capitalism. Its power seems inescapable; so did the divine right of kings.” A corollary is the welcome mocking of fools like Bezos, Musk, and Thiel. Just the widespread awareness of our situation is promising, not because I valorize despair, but because maybe if there are a billion little apocalypses it will somehow stave off the big Apocalypse. The whole of the law is treat others as you would wish to be treated and don’t cross a picket line; the rest is all theory. Now, go, and study.

Finally, I’m only a writer, and the most recondite type, an essayist. Could there be any role for something so insular at the end of the world? In The Guardian, novelist Ben Okri recommends “creative existentialism,” which he claims is the “creativity at the end of time.” He argues that every line we enjamb, every phrase we turn, every narrative we further “should be directed to the immediate end of drawing attention to the dire position we are in as a species.” I understand climate change as doing something similar to what Dr. Johnson said the hangman’s noose did for focusing the mind. It’s not words that I’m worried about wasting, but experiences. What’s needed is an aesthetic imperative that we somehow live in each moment as if it’s eternal and also as if it’s our last. Our ethical imperative is similar: to do everything as if it might save the world, even if it’s unlikely that it will. Tending one’s own garden need not be selfish, though if everyone does so, well, that’s something then, right? I’m counting the liturgy of small blessings, noting the cold breeze on a December morning, the crunch of brown and red and orange leaves underfoot, the sound of rain hitting my office window, the laughter of my son and the chirping of those birds at the feeder who delight him. I’ve no strategy save for love. “The world begins at a kitchen table,” writes Poet Laureate Joy Harjo, in a lyric that was introduced to me by a Nick Ripatrazone essay. “No matter what, we must eat to live.” Harjo enumerates all of the quiet domestic beauties of life, how the “gifts of earth are brought and prepared” here, and “children are given instructions on what it means to be human” while sitting at this table, where “we sing with joy, with sorrow. We pray of suffering and/remorse. We give thanks./Perhaps the world will end at the kitchen table, while we are laughing and/crying, eating of the last sweet bite.” That, finally, is the only ethic I know of as the oceans flood and the fires burn, to be aware of our existence at the kitchen table. When the cicadas come back in 17 years, I wonder what the world will be like for them. I hope that there will be bird song.

Image Credit: Wikipedia

All I Really Need to Know I Learned from ‘A Charlie Brown Christmas’


If America’s most beloved blockhead taught us anything, it’s the power of perseverance. Who but Charlie Brown would try (and fail) to kick a football a few hundred times and still dust himself off to try again?

If Charlie Brown and the Peanuts gang taught fans additional lessons, many of them came by way of the iconic 1965 holiday special A Charlie Brown Christmas.  Though early viewings appeared disastrous to CBS executives and sponsors, in its own Christmas miracle, the show’s premiere pulled in 15 million viewers (second only to Bonanza that week) and secured an Emmy, a Peabody, and a place in the hearts of generations of viewers.

As gentle and big-spirited as it is, for modern audiences, it’s also a little boring.  Where are the death-defying action sequences?  The celebrity voice actors?  Or at least a cameo from Santa.  Instead, audiences must settle for the story of a sad boy who fails to direct a Christmas play; selects the lowliest, sprig-sized tree on the lot; and finds himself at odds with a culture that prefers a more commercialized take on the holiday.

Insert a couple of Vince Guaraldi songs and that’s about it.

Though the plot falls short, the philosophy doesn’t. Re-watching it today feels like a masterclass in self-help.  If Charles Schulz and Brené Brown had a love child, they’d name their sage Charlie Brown.

Read on for three Charlie Brown-inspired life lessons that extend well beyond the holiday season.

1. Good Grief, Your Feelings Are Your Own.

Moments after the opening ice-skating sequence, a visibly troubled Charlie Brown laments, “I think there must be something wrong with me, Linus.  Christmas is coming, but I’m not happy.  I don’t feel the way I’m supposed to feel.”  He tries clarifying his feelings: “I might be getting presents and sending Christmas cards and decorating trees and all that, but I’m still not happy.”  Rather than serve as a shoulder to lean on, Linus removes his thumb from his mouth just long enough to reply, “Of all the Charlie Browns in the world, you’re the Charlie Browniest.”  Linus’s emotional invalidation only deepens Charlie Brown’s troubles.  What a blow, and even more painful since it came from a supposed friend.

Linus’s dismissiveness serves as a lesson for the rest of us.  We needn’t understand, or empathize with, someone else’s emotions.  Simply acknowledging them is enough.

2. You Are Not the Disaster You Think You Are.

No sooner does Charlie Brown reveal his sprig-sized Christmas tree as the play’s centerpiece than his actors begin to revolt.  One look at the lowly tree confirms what many of the children already assumed to be true: their play’s director, a perennial loser, is incapable of finding a win.  “Boy are you stupid, Charlie Brown,” grumbles Violet.  “You’re hopeless,” Patty adds.  “You’ve been dumb before…” Lucy says, “But this time, you really did it.”  As the actors storm off the stage, Charlie Brown is left alone with Linus and his humble tree.  “Everything I do turns into a disaster,” he moans.

Not everything.  Were it not for Charlie Brown’s “Everything-I-do-turns-into-a-disaster” outburst, Linus would never have shared his minute-long recitation from the Book of Luke.  And if Linus hadn’t, then Charlie Brown would never have been inspired to bring his “tidings of great joy” in the form of his humble tree.  And if he hadn’t, then Charlie Brown and the gang would’ve never bestowed upon the tree the necessary TLC to transform it from a sprig to a full-bodied fir.

As Charlie Brown demonstrates, the road is long and winding.  And though our decisions can sometimes seem momentarily disastrous, that doesn’t mean that we are.

3. Be a Blockhead.

Following the sprig-sized Christmas tree’s extreme makeover, the Peanuts gang makes peace with Charlie Brown.  “Charlie Brown is a blockhead,” Lucy says, “but he did get a nice tree.”  Except, of course, that he didn’t.  Charlie Brown got a terrible tree, though in doing so, he set into motion the ideal conditions to reveal his true message.  “I won’t let all the commercialism ruin my Christmas,” Charlie Brown previously proclaimed to no one.  “I’ll take this little tree home and decorate it and I’ll show them it really will work in our play.”  And that’s exactly what he did, if not exactly in the way he planned; the outcome remained the same.  In placing principle over popularity, he opted for the real tree rather than the artificial version—despite the haranguing he knew would soon come.

Charlie Brown, we salute you.  And we thank you for your lesson:

Sometimes, the head knows exactly what it wants; other times, we’re better off following our big, old blockheaded hearts.

Diagnosing Billy Pilgrim: On Tom Roston’s ‘The Writer’s Crusade’



1.
A couple of months ago, when I was feeling stuck in a revision of the novel I’ve been working on for too long, I decided to reread Kurt Vonnegut Jr.’s Slaughterhouse-Five. My reasons for rereading were a writer’s reasons. I was having trouble balancing speculative and realistic elements in my novel and I wanted to see how Vonnegut did it. Vonnegut was one of my favorite writers in my teen years, someone I read and re-read, but at some point in my 20s, I stopped reading him. When I picked up Slaughterhouse-Five, my memories of the book were vague. I knew Vonnegut would use the device of time travel to tell the story of his experiences in World War II; he was taken as a prisoner of war and survived the bombing of Dresden, Germany. I also knew, from Charles Shields’s biography of Vonnegut, And So It Goes, that it was a difficult novel for Vonnegut to write, one he approached from many different angles over two decades. And so, pencil in hand, I opened the book in an analytic spirit, hoping to learn a thing or two from a great writer.

If you’ve read the book recently, you may guess what happened: I pretty much dropped the pencil after a couple of pages. Everything about the book surprised me; it was almost as if I’d never encountered it before. I had completely forgotten that the first chapter is told from the point of view of Vonnegut, the writer, and reads like a memoir. It’s all about how hard it is to write an autobiographical novel, and the misgivings Vonnegut has about turning his war experiences into an entertaining narrative. He claims to have written and discarded thousands of pages, and it feels true; he comes off as genuinely anxious and tired in a way that is surprisingly raw. But the thing that really startled me was Vonnegut’s depiction of Billy Pilgrim’s time travels.

For those of you unfamiliar with the structure of Slaughterhouse-Five, I will quote from the book I’m supposed to be reviewing—and will eventually get to, I promise—Tom Roston’s The Writer’s Crusade: Kurt Vonnegut and the Many Lives of Slaughterhouse-Five:

Vonnegut writes the first chapter of Slaughterhouse-Five as if it’s nonfiction, but then the next nine chapters are about a fictional character, Billy Pilgrim, who travels in time and is abducted by aliens from the planet Tralfamadore, and whose war experiences loosely parallel Vonnegut’s, all of which makes it metafiction, meaning it upends the conventional fictional narrative by blurring the line between the author and the story being told.

Pilgrim’s time travel, combined with the metafictional aspects, is what gives Slaughterhouse-Five its extraordinary power. On a storytelling level, the time travel element allows Vonnegut the writer to escape the bonds of linear narrative. I believe he needed to do that for this particular book because he could not bring himself to write a story about a massacre of human life that followed the laws of cause and effect. Instead of building momentum around the question of Pilgrim’s survival, Vonnegut shapes the novel around Pilgrim’s traumatic memory of the bombing of Dresden, inching closer and closer to it until the final chapter, when we get the full picture of what happened to Pilgrim during the war, and why he was never the same afterward.

As a teenager, I took the time travel elements in Slaughterhouse-Five literally and enjoyed them as funny sci-fi elements. Reading it as an adult, Billy’s time traveling immediately struck me as tragic, a symptom of deep trauma. I felt I was in the company of a man so haunted by terrifying memories that he was unable to settle into the present. It’s possible that I’m still taking Vonnegut too literally, reading his characterization of Billy as early reporting on what we now call PTSD. But the parallels are quite eerie.

Here’s Vonnegut, describing Billy’s state of mind, in an early chapter:

Billy is spastic in time, has no control over where he is going next, and the trips aren’t necessarily fun. He is in a constant state of stage fright, he says, because he never knows what part of his life he is going to have to act in next.

And here’s a passage from Bessel van der Kolk’s bestselling study of trauma, The Body Keeps the Score:

Dissociation is the essence of trauma. The overwhelming experience is split off and fragmented, so that the emotions, sounds, images, thoughts, and physical sensations related to trauma take on a life of their own. The sensory fragments of memory intrude into the present, where they are literally relived.

And here’s Roston, again, describing how Vonnegut uses time travel in Slaughterhouse-Five:

By splintering reality, time, memory, and Pilgrim’s identity, Vonnegut aestheticized one of the primary effects of trauma, dissociation, in which there is a disconnection or lack of continuity between one’s thoughts.

One of the most remarkable things about Slaughterhouse-Five is its ending. The war is over, but there are anonymous, brutal deaths right up to the very end. Then, a bird tweets in Billy’s direction and the book ends. There’s no emotional catharsis for Billy, and no feeling of victory for the reader. This was as Vonnegut intended. In a preface to a special edition of Slaughterhouse-Five (included in the Library of America’s collected Vonnegut, Novels & Stories, 1963-1973), Vonnegut rejects the idea that he gained any knowledge from his war experience. In witnessing the firebombing of Dresden he says he “learned only that people become so enraged in war that they will burn great cities to the ground and slay the inhabitants thereof.”

2.
When I finished Slaughterhouse-Five, I found myself wondering if Billy Pilgrim could be understood as having PTSD, and to what extent Vonnegut might have suffered from it. That’s how I happened upon journalist Tom Roston’s new book about Slaughterhouse-Five, one in a series of “books about books” published by Abrams Press. In The Writer’s Crusade, Roston argues that Slaughterhouse-Five was ahead of its time, and that “our views of its central themes—war, trauma, and the delicate act of telling war stories—have finally caught up with Vonnegut’s accomplishment, allowing us to see it, and the author, more clearly.” Roston structures his analysis of Vonnegut’s novel around the question of “whether or not Slaughterhouse-Five can be used as evidence of its author’s undiagnosed PTSD.” Although Roston poses the question sincerely to people who knew Vonnegut, it’s also a useful rhetorical device, and one that leads him down different research paths as he delves into Vonnegut’s notes and early drafts and talks with Vonnegut scholars, trauma experts, psychologists, and veterans who have personal experience with PTSD.

In structuring his book, which is a mixture of literary criticism, biography, and a cultural history of PTSD, Roston borrows from Slaughterhouse-Five, with an opening chapter that reflects on the process of writing and researching The Writer’s Crusade, and his ambitions for it. Roston recounts a reporting lead that he chased for some time, hoping to uncover a secret side of Vonnegut. But it’s hard to break news on a writer whose novels, particularly Slaughterhouse-Five, have been combed over by two generations of critics and hundreds of thousands of readers. Reading Slaughterhouse-Five through the lens of psychological trauma is also not a new angle. Roston notes that as early as 1974, the literary critic Arnold Edelstein described Pilgrim’s time travel as a “neurotic fantasy” that helps him cope with the trauma of war.

No writer wants to be diagnosed through his work, and perhaps the best thing that Roston does in his book is to give context to the question of whether Slaughterhouse-Five is an autobiographical portrait of Vonnegut’s own war trauma. Roston writes in depth about the novel itself and how it came to be written, including the nitty-gritty of Vonnegut’s literary career before he became famous for Slaughterhouse-Five. (One of my favorite details from this section was just how lucrative the short story market used to be; Vonnegut supported his family on short stories, and even bought a house on Cape Cod.) Roston also provides a history of war trauma and how our understanding of it has evolved over the years. Although the negative psychological effects of war have been observed since ancient times, the symptoms of PTSD were not defined until the late 1970s, when it became apparent that many Vietnam veterans were having difficulty adapting to civilian life. In 1980, PTSD was added to the DSM-III, and has now become so well known that people refer to it in casual conversation to describe any number of symptoms in the wake of traumatic events. Roston calls it “the signature mental disorder of our age” and tries to untangle its popular definition from its clinical one. He also brings in the expertise of veteran-writers, such as Tim O’Brien, as well as veterans with an affinity for Vonnegut’s work. He wants to hear how they interpret Slaughterhouse-Five, given their experiences with war and trauma.

It’s with the help of a veteran that Roston finally attempts to diagnose Billy Pilgrim and Vonnegut, using a Veterans Affairs-issued PTSD screener. He brings many voices into the discussion, including Vonnegut’s children, literary critics, psychiatrists, and Vonnegut himself. Billy, being a fictional character, is elusive. Vonnegut, even more so. Those who knew him personally have varying opinions as to the extent of his war trauma and whether it falls under the diagnostic rubric of PTSD. Certainly, Vonnegut could be diagnosed under the loose, popular definition of the term. Vonnegut, for his part, did not regard himself as someone with PTSD, and did not see Billy Pilgrim as an alter ego. In interviews later in life, Vonnegut revealed that Billy was loosely based on a private he knew in the war, who died of malnutrition a few weeks before the war ended because—it seemed to Vonnegut—he had lost the will to live after witnessing so much senseless violence. Nor did Vonnegut conceive of the time travel element as a way of representing the symptoms of PTSD. Instead, he saw it as a comic device to lighten the heavy mood of the book. (So, my adolescent reading wasn’t totally stupid.) Roston doesn’t argue with Vonnegut’s analysis, but he does observe that at least some of Vonnegut’s reluctance to dwell on the past is generational. Vonnegut may have gotten in touch with war buddies and made his share of desperate late-night phone calls—as detailed in the opening chapter of Slaughterhouse-Five—but he wasn’t visiting the VA for help. “To the best of my knowledge,” Roston writes, “Vonnegut never sat in a room with a VA-organized group of veterans to process his feelings.”

Vonnegut also avoided overly autobiographical interpretations of Slaughterhouse-Five because he didn’t want to be pigeon-holed as a writer traumatized by war, or as someone whose impulse to write was related to his war trauma—and anyone who looks at his life and work can see that this isn’t the case. But Slaughterhouse-Five is a special book. To say that it is Vonnegut’s most personal doesn’t seem quite right, in part because I don’t know Vonnegut personally. (If I had to guess, I’d pick Cat’s Cradle as the book closest to his heart.) After re-reading it, and reading Roston’s book, I think it’s actually the novel that has the least to do with Vonnegut. In the strange way of great works of art, it escapes the confines of Vonnegut’s autobiography as well as the PTSD diagnosis. Maybe it even eludes war and instead speaks to a feeling of bewildered pain that is universal to all human beings when confronted with violence. The flights to Tralfamadore feel like a way to get some distance from the psychic mess we’re all in on this planet. Roston concludes as much in his final analysis: “As much as I’ve tried to pull out the threads on Slaughterhouse-Five to determine its relationship to war trauma, a book can never be just one thing.”

The Origins of Raoul Duke


Fifty years ago, Hunter S. Thompson published a two-part story in Rolling Stone that could only be categorized as Gonzo—a one-man literary genre marked by bizarre flights of fancy, hyperbole, depictions of drug abuse, and often violence. Fear and Loathing in Las Vegas introduced a generation of readers to the hilarious and shocking antics of Raoul Duke and Doctor Gonzo: larger-than-life characters based loosely upon Thompson and his friend Oscar Zeta Acosta.

The story, published in book form the following year, catapulted Thompson to fame, but before long he was frustrated by the fact that readers could hardly separate him from his creation. The device he had created to fuse fact and fiction in a wholly original form of literary journalism quickly became an encumbrance. Hordes of young fans viewed Thompson and Duke as one and the same—something that irked the writer, even if he did little to dissuade them. In his writing, he readily presented himself as a cartoonish outlaw and in public he played the role his fans wanted to see, frequently appearing on stage in costume whilst blind drunk and openly imbibing illegal substances.

Even today, the Duke/Thompson image is a popular Halloween costume and dorm-room poster, and on social media thousands of fans proudly boast about their own consumption of hallucinogenic substances whilst quoting Raoul Duke and attributing his words to Thompson. Less interested in his literary innovations than his outrageous actions, most fans seem utterly unaware that Raoul Duke was merely a figment of Thompson’s prodigious imagination. To them, the exploits of Duke and Gonzo were a relatively faithful account of Thompson and Acosta’s own Vegas adventures. But of course these were carefully invented for satiric purposes.

Indeed, it is quite possible to pick apart the real and the imagined in Thompson’s hallucinatory Gonzo prose, and many of his close friends will attest to the marked distinction between Thompson the author and Duke the character. With some effort, one can make that delineation in most of Thompson’s work, yet it is far harder to pin down just where exactly Duke came from and why he was given his unusual name. Thompson, an incorrigible self-mythologizer, was reluctant to explain and, as he did when asked about what really happened in Las Vegas, simply complicated matters further by giving contradictory or deliberately vague answers. In an interview with the Paris Review, he was typically circumspect on the matter, pretending not to remember, but suggesting that “Raoul” came from Fidel Castro’s brother.

Thompson’s letters are often a good insight into the truth behind his various fictions, but they are also littered with attempts at obfuscating the reality of his life. Still, in discussing Raoul Duke with his publisher, he appeared honest about the purpose of his creation. These letters from the ‘60s show that he created Duke in order to do and say more outrageous things—things even the notoriously outspoken writer wouldn’t do. Duke was intended to be a comical, fictional device that would engage a reader throughout an otherwise accurate piece of journalism, allowing Thompson to explore serious and complex issues. However, as illuminating as his letters are, they do not pinpoint where and when Duke was invented or why he was given his odd name.

This confusion is further exacerbated by those who have previously attempted to uncover the truth. According to his biographer, William McKeen, Thompson invented Duke when editing the Command Courier at Eglin Air Force Base in the mid-1950s, but Duke’s name never appeared in any of Thompson’s articles from that era. His former literary executor, Douglas Brinkley, claimed in Fear and Loathing in America that Duke had been created to write about the 1968 Democratic Convention, but this is also incorrect. Whilst Thompson did use Duke in aborted efforts at writing about the tragic events in Chicago, the name appeared two years earlier in Thompson’s 1966 debut, Hell’s Angels.

Although Duke played no real part in this breakthrough book about the infamous motorcycle gang, his name was casually included in a list of outlaws near the end. Readers at the time must have been baffled, as this was the first time Thompson had ever used that name. However, it was not the first time that the words “Raoul Duke” had appeared next to each other in print. The improbable appellation had popped up in a series of articles in December 1965—a month when Thompson was furiously scouring newspapers from across North America in search of material for his book, which he considered a meditation on the media as much as the biker gang. In the midst of his press binge, he more than likely stumbled upon a series of stories about an unassuming businessman from Calgary by the name of Raoul “Duke” Duquette.

It is likely no coincidence that the name Raoul Duke first appeared in print the year before Thompson adopted it. Throughout his career, he frequently latched on to words (savage, doomed, atavistic) and names (Yail Bloor, Claude Fink, Martin Bormann) that he found amusing or otherwise significant. He was a literary magpie, borrowing the phrase “fear and loathing” and the word “gonzo,” as well as titles for several of his unpublished books, including “guts ball” (taken from Ken Kesey). Many of these became his catchphrases, parroted often enough that they became inseparable from Thompson himself. But seldom did he acknowledge their true origins.

It is not hard to imagine that the same happened with Raoul Duke. From a seemingly insignificant reference in Hell’s Angels, Duke became the device around which Gonzo journalism was conceptualized. Over the next five years, he was trialed in short stories, articles, and satirical reviews before being immortalized in the era-defining, genre-bending classic Fear and Loathing in Las Vegas. As with most of his other favorite words, phrases, and names, Thompson refused to disclose the origins of Raoul Duke, but perhaps now we can speculate with some confidence that this countercultural legend began as a humble Canadian shop manager.

Nailing the Walkaway


Years ago, a novel-writing teacher of mine liked to ask her students, “What is the feeling you want to leave the reader with, when they finish this piece?”
The teacher was Jeanne Cavelos, the Director of the Odyssey Writing Workshop. Her question struck me as perfectly normal and legitimate, but not really central. At the time, I was writing passionately, though without a plan. Writing novels was fun, a kind of grand exploration. It was art for art’s sake…art for my sake. I wasn’t thinking about how my work would emotionally land with readers. And shockingly, my novels—wait for it—didn’t emotionally land with readers.
After years of writing novels that readers shrugged off, I decided two things. (1.) I really did want to reach readers. (2.) I wanted to reach them the way great novels had reached me: with grand, indescribable moods. I wanted to evoke the aching, forlorn beauty at the end of The Great Gatsby; the rugged, mystic hope at the close of Cormac McCarthy’s The Road; the awesome compassion for genius on the final pages of Maggie O’Farrell’s Hamnet.
As I pondered how to do this, I came to a few core realizations.
First: great stories deliver great payoffs.
Two of my favorite narrative forms are the parable and the joke. Both are ancient and durable; the basic microorganisms of narrative. They are highly economical systems of set-ups and payoffs. In the parable, the payoff is wisdom. In the joke, it’s laughter. Anyone constructing a joke or a parable arranges a tight, sturdy story to get to the payoff as quickly as possible.
A novel, it occurred to me, is fundamentally no different. Though it’s a larger and more complex organism, capable of delivering more complicated emotions, ultimately, it’s still just an elaborate system of set-ups and payoffs.
Second: the genre determines the type of payoff.
Readers of romance, mystery, thriller, action, and horror expect certain kinds of endings. If you don’t deliver them, beware—it’s like telling a joke no one laughs at.
Knowing your ending = knowing your genre.
Knowing your genre = knowing what your reader craves.
Knowing what your reader craves = the first step in giving it to them.
Third: focusing on the final payoff is helpful in crafting the story’s beginning and middle.
If you’re going to end high, you have to start low. If you’re going to end low, you have to start high. The beginning is reverse engineered from the end. Most character arcs can be boiled down to this: “It’s a story about a character who begins at X and must overcome Y to get to Z.” But in the writing process, Z comes first. Without Z, you don’t know what Y or X must be. Without Z, you don’t have a story.
I recently got back in touch with Jeanne and told her how formative her one basic question had been. She drove her points home.
“Readers pay the most attention at the beginning and end of what they read, and most especially at the end, so what you put there is very important,” she told me. The emotion at the ending “will, in many ways, define the piece and determine what kind of story it is.”
All of this raised a question: did a novel’s final, emotional payoff have a name?
I asked a few writers.
“A reverb? A resonance?” said novelist and memoirist Jenna Blum. “Like that sound the orchestra makes and that you carry with you in your heart as you walk out of the building.”
Hmmm. So maybe…le sentiment après l’orchestre?
“We do need a word for that feeling,” said author Stephen Kiernan. “When I get to the end of certain books, I feel ruined for a while. It’s that feeling of satisfaction, and nostalgia, and fullness. It’s a feeling only a novel can accomplish.”
Then I chatted with author Jonathan Evison, who did have a name for it.
“You have to hit that last right musical note that will sustain,” Evison said, describing how he works on his own novel endings. “When you hit it, you know it, and the book doesn’t ever feel done to me until I hit it. In Hollywood they call it ‘the walkaway.’”
For the past year on my author interview show, The Thoughtful Bro, which airs on A Mighty Blaze, I have closed each of my conversations the same way. I wait until the end of the discussion, when the authors are nice and loose, then I say:
“Imagine you have an ideal reader, someone who is receiving your book in exactly the way you wish it to be received. Now imagine they have just finished the final page. In a word or a phrase, what is the feeling the reader has at that moment?”
Many authors at first seemed stumped by the question. Clearly, being able to articulate the walkaway is not a requirement for producing great books. But when they did get around to their answer, the responses varied wildly. They were funny, poignant, inspiring, and above all, revealing.
“Like child’s pose at the end of yoga,” said Yaa Gyasi, about her novel Transcendent Kingdom.
“Sweaty and happy,” said Mark Leyner, about his latest, The Last Orgy of the Divine Hermit.
“It’s a cozy, warm feeling,” said Jeff VanderMeer, about his young adult romp, A Peculiar Peril. “You close the book in front of the fireplace, after having had a couple of marshmallows and a hot cup of cocoa.”
“It’s kind of a Mister Rogers feeling,” said George Saunders, about his work of non-fiction, A Swim in a Pond in the Rain. “I like you, and I hope you like me, and we just did something together that was kind of fun.”
Other authors hoped their readers felt “remorse,” “companionship,” “euphoria,” “a beautiful sadness,” or even the feeling of “a struck diving board—just humming.”
In the video below, you can see a compilation of writers trying to boil down what they hoped their readers felt when finishing the books they’d written. The question seems to cut to the core, for each author, of their book’s raison d’être.

 
As for me, a sailor navigating the vast oceans of narrative, I now feel confident in my north star. Whether I’m introducing a protagonist on page one, keeping the arc aloft in the muddy middle, or tying things up in the denouement, I’m constantly inching toward that final emotion, that parting shot, the reverb, the resonance, that ultimate task and unique power of the novel…
The walkaway.
Image Credit: Flickr/Robert Couse-Baker

The Kindest Cut: On Rejection


At a party many years ago, I jokingly told another guest that I’d do my best to remember his name after I published a book to great acclaim. I meant to be funny and disarming, but I bungled the delivery. I came off like a jerk. This other guest was a writer, too, and an equally self-serious one. Predictably, he took offense. He pointed his beer bottle at me. I’ll remember you too, he said, when I win my Pulitzer.

I did not shrug off his rejoinder. How could I? He may as well have suggested pistols at dawn. I was young. I was jittery with ambition. I was terrified of failure in all forms. By the time his girlfriend—who’d introduced us—returned with more beers, we were pompously debating who had better odds for a Nobel.

Flash forward to last winter, as I walked to the building where my daughter has choir practice. I wore a thick parka, but I felt my phone vibrate to announce a new email. Once inside the vestibule, I warmed my hands and checked my email to find a message from the agent who’d asked to read my latest novel. “There is really so much to admire in this manuscript,” the email began. I stopped reading and put the phone back in my coat pocket. Then marched onward to find my daughter. Already, I knew: rejection. I knew all the words that would come. I’d heard them all before.

There’s a fairly wide gap between what I expected as a preposterous young man and the writing life as I’ve lived it. I’m old enough to see how disillusionment is the price for adulthood in every vocation, not just writing and the arts. Yet, one facet of my writing life still surprises me with its wicked gleam. Once, I believed that as a writer my most important skill would be knowing how to lay words in a line that’s solid as a cut stone wall. Nope. Turns out the most important skill for me as a writer, the skill I can’t live without, and the one that took the longest to learn, is a skill for failure.

OFFSTAGE VOICE (Indignantly): Your Honor, the defendant is beating around the bush. For the record, can he state plainly that he has not, I repeat, not yet won a literary prize of any kind, or general acclaim, or even, I believe, published a real book, yes? Is that right? Can that be right, after all these years?

Yes, yes, that’s true, all of it. I’ll be specific, then: I graduated from an MFA program over 21 years ago now. I was not idle during those years. But it took 12 years for me to write and revise a story to the point that a journal was willing to publish it. The rewriting wasn’t the problem, although I wrote a lot of bad prose, too; the problem was submitting.

Once upon a time, any query that I sent out involved printing an excerpt or a story and sending the pages in the post. I had a stockpile of Uline envelopes, printer paper reams, spare toner, and stamps, always stamps. I was a regular at the drop box in the post office in Newark, N.J., the tired one near the office where I worked. Eventually, the postal tide carried rejections to me in my own self-addressed envelopes. I was prepared for cold, impersonal rejections. I hated to see them, but they were expected. However, nothing prepared me for the kind ones. For the almost-but-not-quite-there rejections, the letdowns that were almost yeses. The kindest of cuts stung the most.

“I want to stress how close this came,” wrote one editor, about a story I’d submitted. The first agent I ever queried wrote: “I found myself starting and stopping, convinced by your talent without being fully absorbed.” An editor at Grove went so far as to edit 50 pages of my manuscript before deciding, no, no, not for her. I have a photocopy of the edits; a kind gesture, but also a painful one.

As time passed, more and more people I knew began to show up, one here, one there, in The New York Times Book Review. I wrote genuine congratulatory emails to friends and made trips to the outer boroughs for readings in book shops, but let’s be real: I felt awful. It’s not that I wished these people ill. (Except that Pulitzer guy from the party long ago. He was insufferable.) I just wanted some wins, too.

One day, I opened the NYTBR to see a large photo of someone who lived three doors down in college. Her jaunty, snazzy novel was a bestseller. I’d had no idea she was a writer before that day, that moment. The shock was like learning your wife’s high school flame is moving in next door. It doesn’t really matter. Except it totally matters.

After the world mutated into its current digital state, submissions became easier, and rejections came faster. Gmail easily captured them all, and it still allows me ready access should I wish to torture myself with re-readings. “I read this piece with great interest, and I thought the characters were beautifully done,” begins a rejection from an editor at Knopf. “I wanted to let you know,” an editor writes at the end of a rejection notice from Electric Literature, “our readers commented on the story’s smart pacing and evocative details.” Another editor, another journal: “Your piece made it to the top of the general submission pile, but we didn’t take any general submissions this year.”

I have more letters like these. Dozens. Dozens of dozens. I did my best to read each and then move on. Sometimes I complained, but I learned, as the years went on, that quiet endurance was what people expected of anyone foolish enough to harbor artistic ambition. If I complained about rejection at, say, a wedding reception, a garrulous bore with a practical degree and a smug worldview was always at hand and ready to point out that this author or that cult classic got the thumbs down from publishers a few dozen, a score, hundreds, hell, why not a thousand times? This is what you asked for, yeah?

Sometimes, a useful piece of criticism would crop up in a rejection letter. “The writing is a bit flat for our list,” said the editor of a small press, after reading my novel about an interracial marriage. How lovely it felt to get clear criticism! Flat, toneless, lacking affect? Got it. This was an observation that I could address. Briefly, I had the clarity of a writing workshop. I rewrote the entire 85,000-word novel. Doubled down on submissions. But it still didn’t sell. An editor at Random House said of that book: “I read this with great interest, and thought the characters were beautifully done, and the writing was lovely, but ultimately, the stakes just didn’t seem to be high enough.”

The failure of that novel—my third since graduate school—was a turning point for me, a moment when I saw it clearly spelled out that perhaps I just plain wasn’t good at something crucial to fiction. I could write crafty sentences, even create plausible characters, but could I make a reader care?

People talk about fiction sometimes as if the elements are so simple. Make sure your characters want something! Make sure there are stakes! Make sure the prose rings like a wine glass when you tap it! As if these attributes rose out of simple decisions made on a line-by-line basis. Yes, they do. But there’s also something more. There must be. Or else, after years of trying, wouldn’t I already have found the formula that works?

One evening after a reading that made me feel frustrated and jealous, I began writing an essay. I didn’t know that’s what it was. I was just tapping at my phone while riding the train. But then I kept working on it after I got home. The subject: Why was I still writing? What was it that had me still trying, refusing to quit, despite failing for so long?

That meditation on failure turned into my first published essay; ironically, an essay on not succeeding was my first piece of writing to succeed in reaching an audience. I didn’t quit writing fiction per se after that. But I did write another essay. And another. I found a comfortable niche in personal essays. The canvas is smaller. The stakes are clear in the first few lines. And I fail less often when I send essays out.

Of course, I am stubborn, and I don’t always operate in a rational manner. I continue to labor at fiction. Every few years, I am swept up in the idea for a novel. I spend months and months writing doomed texts. But I don’t have the same attitude toward what will happen when I’m finished. The expectation of success is gone, so much so that I sometimes forget the point of writing a novel is to share it.

Sometimes I even see humor in rejections where no humor was intended. “There was a lot I admired about this,” said an agent who read my manuscript about the poet Sappho reappearing in the modern world. “But I couldn’t accept the idea that Sappho was somehow transplanted into the modern day.” Indeed, it would be hard to admire much about a book where you can’t accept the central premise.

Another agent once rejected a novel manuscript, but without stating why. Very well, carry on. Then, six months later, she wrote another email, apologized for the delay, and rejected me a second time. If this doesn’t make you laugh, then don’t write a book.

Not long ago, an agent rejected me and, in a few sentences, she also helped me to see how and where I fail as a fiction writer: “The gracefulness and control of your writing impressed me. [The characters] make an intriguing pair of protagonists around whom the narrative twists and unravels. As much as I admire these aspects, however, I fear the novel may ultimately be too introspective, without quite enough plot development to move the narrative along.”

Am I too introspective? Yes, I am. Do I not write fiction the way most good fiction is written? Perhaps I do not. Does this bother me? A little yes, a little no. But this is who I am. I am the Don of the Also-Rans. Mr. Not Quite Good Enough. I’m a Master of Failure, and although that’s not what I set out to be, it’s something.

As a writer, I have had to learn how to fail, to fail with economy and without anger, to fail in ways that allow me to see where I am not writing well, or where I have sent my words to the wrong place. Once I learned this, everything was easier, or at least everything that had to do with rejection.

For more than a year I have been sending out into the world a new novel, Likeness; it’s starting to look like my most masterful failure yet. “There is no question that you are a very talented writer,” wrote an agent’s assistant after reading it. “This story pulled me along and kept me invested, and it’s like nothing else I’ve read in the best way.” I mean, yes, yes, right? Isn’t that what a novel should do? But, no, she went on. The book just wasn’t a fit. Her boss had no idea where to place it. It was too strange, too weird, too much its own thing.

If this is failing, then I suppose that I should have it no other way.

Image Credit: Flickr/DrewCoffman

Conquering Hell


“You have reason to wonder that you are not already in hell.” —Jonathan Edwards, “Sinners in the Hands of an Angry God” (1741)

“For the ones who had a notion, a notion deep inside/that it ain’t no sin to be glad you’re alive.”—Bruce Springsteen, “Badlands” (1978)

Charging only a quarter, Joseph Dorfeuille allowed the curious to view Hell itself—admission was half-price for children. From 1820 to 1867, not far from the Ohio River amongst the steep hills of the Queen City, the Western Museum of Cincinnati promised in an advertisement: “Come hither, come hither by night or by day, /there’s plenty to look at and little to pay.” Founded by the physician Daniel Drake, known by enthusiasts as the “Ben Franklin of the west,” the institution was modeled after the Wunderkammers, the “Wonder Cabinets” of Europe, displaying shells and rocks, feathers and fossils, pottery shards and arrowheads. Even the ornithologist John James Audubon was on staff. Only two years after its founding, however, the trustees forced Drake to resign. In his place they hired Dorfeuille, who, rather than assembling materials zoological, archeological, and geological, understood that the public was curious about the “occasional error of nature.” In place of Drake’s edifying scientific exhibits, Dorfeuille mounted skeletons that moved by mechanical apparatus, dancing while an organ grinder played. He featured a diorama of wax figurines depicting the local murderer Cowan in the act, while also preserving in formaldehyde the head and heart of Mathias Hoover, a Cincinnati serial killer. And, with particular popularity, the director distributed huffs of nitrous oxide after his “lectures.” But no exhibit—even the laughing gas—was quite as popular as “Dorfeuille’s Hall.”


A recreation of characters and scenes from the 14th-century Italian poet Dante Alighieri’s epic religious allegory The Divine Comedy, as well as from the 17th-century British poet John Milton’s Paradise Lost, the hall was molded in beeswax and mounted by Hiram Powers, who’d eventually become the first celebrated American neo-classical sculptor (Elizabeth Barrett Browning would pen a sonnet in honor of his work). Powers was tasked with illustrating our most grotesque visions—it would be among the most talked-about exhibits before the Civil War. He crafted wax figures of the demon Beelzebub and of the fallen arch-rebel himself, Lucifer. Adept in mechanism and sound effects, Powers made his wax statues shakily move in the darkness while screams emanated. “Visitors…were so intrigued by the realism of the figures that they were constantly touching them for confirmation that they were indeed wax,” writes Andrea Stulman Dennett in Weird and Wonderful: The Dime Museum in America. “To minimize the damage to his sculptures,” she explains, “Dorfeuille had to put an iron grating charged with a mild electrical current.” At the center was the “King of Terrors,” a stock Devil with red horns and pitchfork, smoke swirling about him. Originally Powers played the role, though an automaton took over after Dorfeuille stiffed him on pay and he quit the Western Museum, moving on to a respectable art career in Washington, D.C., and then in Dante’s Florence. By 1839, Dorfeuille had sold off the Western Museum of Cincinnati, but he took his deteriorating waxworks to New York, where they were later immolated in a fire, their owner following them into the underworld a few months after. King Entropy awaits us all.

A skeleton at the exit to Dorfeuille’s Hall held aloft a sign with some doggerel on it: “So far we are equal, but once left, /Our mortal weeds of vital spark bereft, /Asunder, farther than the poles were driven;/Some sunk in deepest Hell, some raised to highest Heaven,” though highest Heaven was never as pruriently fascinating as deepest Hell. Powers didn’t outfit an exhibit with harp-playing, winged angels and shining halos; no wax figurines of the unassailable, the respectable, the decent. Not that it was ever much different, for people have always been more attracted—in both the neurotic’s fear and the sadist’s delight—to the fires of damnation. We recognized the 700th anniversary of Dante’s death this past September, and while I have no reliable numbers, my hunch is that 10-to-1 more people have read “Inferno” than the remaining canticles about Purgatory and Paradise. Hell is more visual; if asked to envision Heaven we could offer gauzy, sepia-toned cliches about clouds and pearly gates, but if we’re being perfectly honest, nothing about it sounds appealing. But Hell. Well, Hell, we can all agree, is interesting. The sulphur and shrieks, bitumen and biting, the light that burns eternally but gives off no glow. It all sounds pretty bad. While our curiosity draws us to the grotesque, we also can’t help but be haunted by Hell, traces of its ash smeared across even the most secular mind.

“Understand, I’m not speaking here only of the sincerely religious,” writes Dinty W. Moore in his irreverent To Hell With It: Of Sin and Sex, Chicken Wings, and Dante’s Entirely Ridiculous, Needlessly Guilt-Inducing Inferno, focusing on his shame-filled Catholic education. “The fable of our flawed souls, the troubling myth of original sin, the looming possibility of eternal damnation clandestinely infects even those of us who think ourselves immune: atheists, agnostics, secularists.” Supposedly only the most zealous live without doubt, but just as such a pose evidences more anxiety than might be assumed, so too is nobody an atheist all of the time. The pious are haunted by an absence and apostates by a presence, but the fear is the same. I don’t believe in a literal Hell, a place of eternal torment where our sins are punished by demons—most of the time. Hell is often on my mind, but unlike Moore I think that occasionally there is intellectual benefit in that which is sulphury (or at least the idea of it). Moore writes about the traumas of “depressive guilt brought on by religious malarkey,” and it would be a cold critic who denied that belief has been used to oppress, persecute, and inculcate shame. Not just a cold critic, but one incapable of parsing objective reality. But even though he’s right, it’s still hard for me to throw the demon out with the hot coals.

Hell’s tragedy is that those who deserve to go there almost never think that they will, while the pious, shame-filled neurotic imagines those flames with scrupulous anxiety. My soul is content if God prepared a place of eternal torment for the Exxon executive who sees climate change as an opportunity to drill for Arctic oil, the banker who gave out high-risk loans with one hand and then foreclosed on a family with the other, the CEO who makes billions getting rich off of slave labor, the racist representative who spreads hate for political gain, the NRA board member unconcerned with the slaughter of the innocent, the A.M.-radio peddlers of hokum and bullshit, the fundamentalist preacher growing rich off his flock’s despair, and the pedophile priest abusing those whom he was entrusted to protect. Here’s an incomplete list of people who don’t belong in Hell—anyone who eats the whole sleeve of Oreos, somebody who keeps on pressing “Still Watching” on the Netflix show that they’re bingeing, the person who shouts “Jesus Fucking Christ” after they stub their toe, anybody who has spied on their neighbor’s home through Zillow, the Facebook humble-bragger talking about a promotion who keeps hitting “Refresh” for the dopamine rush of digital approval, the awkward subway passengers suddenly fascinated by their feet when a loud panhandler boards, and masturbators. That so many of those guilty of the little sins, peccadillos, tics, frailties, and flaws that are universal and make us all gloriously imperfect humans so often feel crippling shame, guilt, and depression is tragic. That my first category of sinners almost never feels those things is even more so.

Before either Hell or Heaven there was the shadow land of indeterminate conclusions. Neither the ancient Greeks nor the ancient Jews much delineated an afterlife. Greek Hades was a grey place, a foggy place, and not particularly pleasant, even if it wasn’t exactly inferno. “Gloomy as night” is how Alexander Pope describes Hades in his 18th-century translation of Homer’s The Odyssey, a realm populated by “Thin airy shoals of visionary ghosts.” Excepting Elysium—where excellence is more honored than virtue—everybody has Hades to anticipate, though Homer is content to consign even the glorious to this depressing place. “I’d rather slave on earth for another man,” the hero Achilles tells Odysseus in Robert Fagles’s contemporary translation, “than rule down here over all the breathless dead.” The ancient Jewish conception of an afterlife was originally not much more hopeful, with the Hebrew Scriptures speaking of Sheol, described in Ecclesiastes as a place of “neither deed nor reckoning, neither knowledge nor wisdom.” As smudgy and unfocused as Sheol was, the Bible sometimes draws a distinction (and sometimes a conflation) between it and a horrific domain known as Gehenna.

Drawing its name from a dusty valley where pagan child sacrifice had once been practiced, and which became a filthy heap where trash was burnt in a frenzy of ash and dirt, Gehenna is ruled over by Baal and dedicated to the punishment of the wicked. Both the Talmud and the New Testament use the word “Gehenna,” an indication of how, in the first centuries of the Common Era, a comprehensive vision of the afterlife emerged in Judaism and Christianity. Among the Sadducees, who composed the Temple elite, the afterlife was largely defined by absence, but the Pharisees (from whom rabbinic Judaism emerged) shared with early Christians an interest in charting the geography of death, and Gehenna was a highlighted location. (It’s worth mentioning that the historical Pharisees bear no similarity to the hatchet job performed on them in the Gospels.) As it was, Jewish Gehenna would be understood slightly differently from Hell, more akin to Purgatory, a place of finite punishment cleansing the soul of its iniquities. Though as the 13th-century Kabbalistic Zohar makes clear, the truly evil are “judged over in this filth… [and] never get released… the fire remains.”

Though both Judaism and Christianity developed a complex vocabulary of Heaven and Hell, it’s not unfair to attribute much of the iconography of perdition to Dante. Writing a generation after Dante’s death, Giovanni Boccaccio predicted that as regards the Florentine poet’s name “the more it is furbished by time, the more brilliant it will ever be.” Boccaccio’s prediction has been perennially proven correct, with the Victorian critic John Ruskin describing Dante as the “central man of all the world,” and T.S. Eliot arguing that “Dante and Shakespeare divide the modern world between them; there is no third.” All of this might seem a tad Eurocentric, a tad chauvinist, and a tad too focused on Christianity—the apotheosis of Dante into a demigod. Concerning prosody—Dante’s ingenious interlocking rhyme scheme known as terza rima or his ability to describe a “place void of all light, /which bellows like the sea in tempest, /when it is combated by warring winds”—he was brilliant. But acknowledging acumen is different from apotheosis—even Voltaire quipped that more people valorize Dante than read him. And yet (there’s always an “And yet”…), there must be a distinction between the claim that Dante says something vital about the human condition and the objective fact that in some ways Dante actually invented the human condition (or a version of it). John Casey argues in After Lives: A Guide to Heaven, Hell, & Purgatory that Dante’s was “without doubt the supreme imagining of the afterlife in all [Western] literature,” while Alberto Manguel in The Traveler, the Tower, and the Worm: The Reader as Metaphor claims that his epic has “acquired a permanent and tangible geography in our imagination.”

“Dante’s story, then, is both a landscape and a map,” writes Manguel. None of the poem’s unfortunates get out, save for one—the author. Midway in the course of life Dante falls into despair, loneliness, alienation, and The Divine Comedy is the record of his descent and escape from those doldrums. Alice K. Turner argues in The History of Hell that Dante was the progenitor of a “durable interior metaphor.” Claiming that the harrowing and ascension that he described can be seen in everything from psychoanalysis to 12-step programs, Turner writes that “this entirely comfortable and pervasive method of modern metaphorical thinking might not exist if Dante had never written.” “Empathy” might not be the first word readers associate with a book sticky with the blood of the damned—”punishment” or even “justice” would figure higher—and yet Dante feels pain for these characters. That aspect of The Divine Comedy is why we’re still talking about it. Turner explains that “Dante was concerned with history, with Florentine politics, with the corruption of the clergy, with the moral position of his contemporaries, and most of all with the state of his own psyche,” while arguing that at a “distance of seven centuries, we can no longer easily appreciate any of these things except the last—Dante is generous with his emotions.” It’s true that for any contemporary reader, concerns with forgotten factions like the Ghibellines and Guelphs, parsing of Thomas Aquinas, or condemnations of this or that obscure pope can seem hermetic. Yet when one peruses a heavily glossed and footnoted copy of The Divine Comedy, it’s his intimate perspective that remains the most human.

Eliot may have claimed that between Dante and Shakespeare there was no third, but that’s the sort of thing that a self-declared “classicist in literature, royalist in politics, and Anglo-Catholic in religion” would say, giving short shrift to that bomb-throwing author of Paradise Lost. Earth can be given over to Shakespeare, but Heaven and Hell belong to Dante and Milton. If The Divine Comedy is the consummate expression of Catholicism, then Milton’s epic is Protestantism’s fullest literary flowering, and yet neither of the two is orthodox. Milton’s depiction of damnation in medias res, after the rebel angels have been expelled from Heaven and the once beautiful Lucifer has been transformed into Satan, revises our understanding of Hell for the first time since Dante. Much remains recognizable in Milton, even if immaculately portrayed, this “dungeon horrible, on all sides round, /As one great furnace, flames; yet from those flames/No light, but rather darkness visible,” a fallen kingdom defined by “sights of woe, /Regions of sorrow, doleful shades, where peace/And rest can never dwell, hope never comes… but torture without end.” Paradise Lost’s beauty belies the darkness of that sunken pit, for Milton’s brilliance has always been that he acknowledges what’s evocative, what’s magnetic, what’s attractive about Hell, for Lucifer in his intransigence and his obstinacy declares that it is “Better to reign in Hell, than to serve in Heav’n.” Dante’s Satan is a monster encased in ice, weeping frozen tears as he forever masticates the bodies of Cassius, Brutus, and Judas. He is bestial, animalistic, and barely sentient. Nobody would admire him; nobody would want to be Dante’s Satan. Milton’s Lucifer, on the other hand, is a revolutionary who gets all the best lines; certainly better than God and Christ. As William Empson had it in his vaguely heretical Milton’s God, the “poem is not good in spite of but especially because of its moral confusions.”

Milton is a “Puritan,” that woolly category of killjoy whom we associate with the Plymouth Pilgrims (though they were technically different), all belt-buckled black hats and shoes, trying to sniff out witches and other people having impure thoughts. As it goes, Milton may have politically been a Puritan, but he was also a unitarian and a materialist, and on the whole a rather cracked Protestant, not least of all because of his Devil’s thinly disguised heroism. Paradise Lost is honest because it exemplifies a principle that we all know, something expressed by Augustine in his Confessions when he recalled filching pears from a neighbor’s orchard as if he were Eve in Eden with her apple, admitting that he wasn’t even hungry but did it because “It was foul and I loved it. I loved my own undoing.” Doing bad things is fun. Puritans ironically seem more apt to admit that clear truism than all of the Panglossian advocates for humanity’s intrinsic good nature, an obvious foolishness. Just try to negotiate a Trader Joe’s parking lot and then tell me that you’re so certain that Original Sin is old-fashioned superstition. Acknowledging sin’s innate attractiveness—our “total depravity,” as John Calvin described it in his 16th-century tome Institutes of the Christian Religion—means that detailed descriptions of Hell can sound perverse. At a pulpit in Enfield, Conn., the minister Jonathan Edwards delivered an infamous sermon in 1741 in which he told the assembled that “Your wickedness makes you as it were heavy as lead, and to tend downwards with great weight and pressure towards hell… if God should let you go, you would immediately sink and swiftly descend and plunge into the bottomless gulf,” for God abhors all of us, and his “wrath… burns like fire; he looks upon you as worthy of nothing… but to be cast into the fire.” According to Edwards, all women and men are “ten thousand times more abominable in [God’s] eyes, than the most hateful venomous serpent is in ours.” Historians note that while Edwards delivered his sermon, congregants rolled around in the church aisles and stood on the pews, moaning and screaming. All of this is a bit kinky, honestly.

Calvinism’s God has always read more like his nemesis, omnipotent enough that He created us, but apparently not so omnipotent that He makes us worthy of salvation. More than a tyrant, He reads as a sadist, but when God is deleted from Calvinism what’s left is only the putrid, jaundiced, rotting corpse of our contemporary world, where nobody knows the value of anything, but only the price of everything. Nothing better expressed the dark American transition to post-Calvinism than our greatest of novels, Herman Melville’s Moby-Dick; or, The Whale, of which the author wrote to his friend Nathaniel Hawthorne in 1851 that “I have written a wicked book, and feel spotless as the lamb.” Immature exegetes perseverate on what the white whale means, what he is a symbol of. God? Satan? America? That’s the Holy Trinity, though only one of them has ever kept his promises (I’ll let you guess who). It’s both more and less complicated than that; the whale is the terrifying, naked abyss stripped bare of all our presuppositions. In short, he is the world as it actually is, where Hell is all around us. In Moby-Dick, Ishmael listens to the former harpooner Father Mapple’s sermon at the Whaleman’s Chapel in New Bedford, Mass. (with its cenotaphs and its massive cetacean bones), and though the sermon is ostensibly on Jonah, it describes a type of Hell. The congregants sing a strange hymn about the “ribs and terrors in the whale/Arched over… a dismal gloom, /While all God’s sun-lit waves rolled by, /And lift me deepening down to doom.” Mapple preaches that despite how sinful the reluctant prophet was, “Jonah does not weep and wail for direct deliverance. He feels that his dreadful punishment is just… And here, shipmates, is true and faithful repentance; not clamorous for pardon, but grateful for punishment.” You’ll go to Hell—or the belly of a whale—and you’ll be happy about it, too.

Not that this rhetoric is limited to Protestantism, for fire and brimstone come just as often from homily as sermon. James Joyce knew that Catholic priests could chill the blood every bit as much as an evangelical bible thumper, with perhaps no more disturbing a vision of Hell ever offered than in A Portrait of the Artist as a Young Man. In that roman à clef, his alter ego Stephen Dedalus attends a religious retreat, and the visiting priest provides visceral description to the adolescents—buffeted by arrogance and hormones—of what awaits them if they give in to temptation, for “Hell is a straight and dark and foul-smelling prison, an abode of demons and lost souls, filled with fire and smoke” where all of the punished are “heaped together in their awful prison… so utterly bound and helpless that… they are not even able to remove from the eye a worm that gnaws it.” Drawing upon the Jansenist Catholicism that migrated from France to Ireland, the priest’s descriptions of Hell are the equal of anything in Edwards. With barely concealed sadism he intones how amongst the damned the “blood seethes and boils in the veins… the heart in the breast glowing and bursting, the bowels a red-hot mass of burning pulp, the tender eyes flaming like molten balls.” Sin is spoken of in sensory terms—the taste of gluttony, the touch of lust, the rest of sloth—and so too does the priest give rich description of Hell’s smell. That pit is permeated by the odor of “some foul and putrid corpse that has lain rotting and decomposing… a jelly-like mass of liquid corruption… giving off dense choking fumes of nauseous loathsome decomposition… a huge and rotting human fungus.” Heaven can remain vague, because none of us will ever agree on what it is we actually want. Pleasure is uncertain, but pain is tangibly, deeply, obviously real, and so Hell is always easier to envision.

After Stephen is convinced to become rigidly austere following that terrifying sermon, he finds himself once again straying. A friend asks if he plans on converting to Protestantism, and Dedalus responds that “I said I had lost the faith… not that I had lost self-respect. What kind of liberation would that be to forsake an absurdity which is logical and coherent and to embrace one which is illogical and incoherent?” Funny in the way that Joyce is, and true as well, though it might only make sense to lapsed Catholics who fumble over the newly translated words of the liturgical preface at Mass. Such Catholic atheism, or Catholic agnosticism, or Catholic whatever-you-want-to-call-it influenced one of the great contemporary depictions of Hell in Stephen Adly Guirgis’s play The Last Days of Judas Iscariot. Guirgis engages an idiom that could be called “bureaucratizing the sacred,” transposing the rigid elements of our society in all of its labyrinthine absurdity onto the transcendent order. Whenever Hell (or Heaven) is depicted as an office, or a waiting room, or a border checkpoint, a prison, a hospital, or a school, that’s bureaucratizing the sacred. In Guirgis’s play, the action takes place in that most arcane of bureaucratic structures, the court system. “In biblical times, Hope was an Oasis in the Desert,” a character says. “In medieval days, a shack free of Plague. Today, Hope is no longer a place for contemplation—litigation being the preferred new order of the day.” The Last Days of Judas Iscariot portrays a trial of its largely silent titular character, held in a courtroom that exists beyond time and space, where expert witnesses include not just Christ and Pontius Pilate, but Sigmund Freud and Mother Teresa as well. True to the mind-bending vagaries of eternity, both God and the Devil exist in a country unimaginably far from us and yet within our very atoms, for as Christ says, “Right now, I am in Fallujah. I am in Darfur. I am on Sixty-third and Park… I’m on Lafayette and Astor waiting to hit you for change so I can get high. I’m taking a walk through the Rose Garden with George Bush. I’m helping Donald Rumsfeld get a good night’s sleep… I was in that cave with Osama, and on that plane with Mohamed Atta… And what I want you to know is that your work has barely begun.” Who does the messiah love? “Every last one.” The vision is expansive and universalist, and as in all great portrayals—including those of Dante and Milton, who most definitely didn’t think you could breach the walls of inferno by drilling into the earth—Hell is entirely a mental place.

“Despair,” Guirgis writes, “is the ultimate development of a pride so great and so stiff-necked that it selects the absolute misery of damnation rather than accepts happiness,” or as Milton famously put it, “The mind is its own place, and in itself/Can make a Heaven of Hell, a Hell of Heaven.” Like all things in the sacred—that vast network of metaphors, allusions, images, allegories, poems, and dreams—the question of “is it real or not?” is entirely nonsensical. Heaven and Hell have always been located in the human mind, along with God and the Devil, but the mind is a vast country, full of strange dreams and unaccounted things. Perdition is an idea that is less than helpful for those who fear (or hope) that there is an actual Hell in the black waters of the Mariana Trench, or in scorched, sunbaked Death Valley, or near a sandy cave next to the Dead Sea. But when we realize that both Hell and Heaven exist in a space beyond up or down, left or right, any of the cardinal directions, and towards a dimension both infinitely far away and nearer than our very hearts, then there just might be wisdom. Such is the astuteness of Dante, who, with startling psychological realism, records the woeful tale of the illicit lovers Paolo Malatesta and Francesca da Rimini, tempted into an adulterous kiss after reading the romance of Lancelot and Guinevere, now caught in a windstorm as they had once tussled in bedsheets. “There is no greater sorrow,” Francesca tells the poet, “Than to be mindful of the happy time/in misery.” Because we often think of sin as simply a matter of broken rules, Dante’s psychological acuity can be obscured. Drawing from Thomas Aquinas, Dante writes that “Pride, Envy, and Avarice are/the three sparks that have set these hearts on fire,” and the interpretative brilliance of the Seven Deadly Sins is that they explain how an excess of otherwise necessary human impulses can pervert us. Reading about Paolo and Francesca, it’s understandable to doubt that they deserve such punishment—Dante does. But in the stomach-dropping, queasy, nauseous, never-ending uncertainty of their lives, the poet conveys a bit of their inner predicament.

“There are those… who, undoubtedly, do not feel personally touched by the scourge of Dante, [or] by the ashen pall of Augustine,” writes Moore, and while I wouldn’t say that I’m so irreverent that I’m willing to lustily eat a rare hamburger on Good Friday without worrying a bit, I will admit that if I make a visit to Five Guys after forgetting that it’s Holy Week I don’t fret too much. It’s a privilege to play with the idea of Hell without fearing it (at least too much), but the concept still has some oomph in it. Returning to an earlier observation, Hell must be the language with which we think about that which divides us, estranges us, alienates us, not the language for when we despair at having eaten a bit too much or slept in on the weekend. If anything, that Puritanical rugged individualism that so defines American culture whether we’ve opted into it or not—You need to be up by 5 a.m.! You have to be a productive worker! You must avoid any real joy except for the curated experience chosen for you by algorithm!—is the true wage of sin. In opposition, I lustily and gluttonously and slothfully advocate for eating a bit too much, laughing a bit too loud, and sleeping a bit too long. Preachers once told us that to be alive was a sin, but that was never it at all. Now we have pundits and TED-talk lecturers forcing us to weigh our souls instead, but there’s no shame in knowing that the road rises to meet us, in feeling the wind at our back and the warmth of the sun or the cool rain upon our faces. Shame and guilt are utilitarian, but they belong on the other side. It’s notable that the idea of Hell developed contemporaneously with that of Heaven; if it weren’t for the former, then what would we ever struggle against? “Without contraries there is no progression,” writes the poet and prophet William Blake in his 1793 The Marriage of Heaven and Hell. “Attraction and repulsion, reason and energy, love and hate are necessary to human existence.” Whether temporary or eternal, finite or infinite, a place of extreme punishment implied another for reward. There’s no Heaven unless there is also Hell, and both places are much closer than might be supposed.

Photo credit: Wikimedia Commons

Drizzly November in My Soul


Because Robert Burton used astrology to forecast the date of his death with exact accuracy—January 25, 1640—even some skeptics in that credulous age suspected that he may have assisted the prediction’s veracity. To accuse anyone of suicide was a slander; for Burton’s contemporaries such a death was an unpardonable offense. A half-century later, the antiquary John Aubrey noted in his 1681 Brief Lives that “’tis whispered that… [Burton] ended his days in that chamber by hanging himself.” There are compelling reasons to think this inaccurate. Burton would not have been buried in consecrated ground had he been a suicide—though, of course, it’s possible that friends may have covered for him. Others point to the historian Anthony Wood, who described Burton as “very merry, facete, and lively,” though seemingly happy people do kill themselves. And finally, there’s the observation that within his massive, beguiling, strange, and beautiful The Anatomy of Melancholy, first printed in 1621, Burton rejected suicide—even while writing with understanding about those who are victims of it. As it is, the circumstances of Burton’s death remain a mystery, as self-destruction frequently is, even as etiology has replaced astrology and psychiatry has supplanted humoral theory.

That such a rumor spread at Christ Church, where Burton had worked for years in the library, compiling his vast study of depression, is not surprising. So identified was Burton with his subject—called “history’s greatest champion of the melancholy cause” by Andrew Solomon in The Noonday Demon: An Atlas of Depression—that his readers simply expected such a death. Within The Anatomy of Melancholy Burton gives an overview of Greek and Roman biothanatos, while still condemning it. And yet Burton empathetically concludes that “In such sort doth the torture and extremity of his misery torment him, that he can take no pleasure in his life, but is in a manner enforced to offer violence unto himself, to be freed from his present insufferable pains.” Burton was also frank about his own suffering. White Kennett would write in his 1728 A Register and Chronicle Ecclesiastical and Civil that “I have heard that nothing at last could make… [Burton] laugh, but going down to the Bridge-foot in Oxford, and hearing the bargemen scold and storm and swear at one another, at which he would set his Hands to his sides and laugh most profusely.” Such a man, it was imagined, was the sort who may have dreamed of wading into that cold water in the years when the rivers of England still froze over, walking out into infinity until he felt nothing. Who is to say? We don’t have a complete record of Burton’s thoughts, especially not in his last moments (we don’t have those things for anybody), but The Anatomy of Melancholy is as comprehensive a record as possible, a palliative for author and reader, an attempt to reason through the darkness together.

“Burton’s book has attracted a dedicated rather than a widespread readership,” writes Mary Ann Lund in Aeon; “its complicated branching structure, its Latin quotations and its note-crammed margins resist easy reading.” Though clearly indebted to the humanism of Erasmus and Montaigne, The Anatomy of Melancholy is one of those books that’s almost post-modern before modernity, like the novel Tristram Shandy (Laurence Sterne shamelessly plagiarized from Burton). The book is encyclopedic but open-ended, erudite but curious, expansive but granular, poignant but funny; never doctrinaire, never judgmental, never orthodox, but gleefully self-referential even while contemplating sadness. Burton combed history, poetry, theology, travelogue, philosophy, and medicine for case studies, across five editions during his lifetime (and a sixth based on posthumous notes), in three separate sections printed as a folio that ballooned to half-a-million words. In the first section he enumerates accounts of melancholia, in the second he offers up cures (from drinking coffee to eating spiced ram’s brain), and in the third he presents taxonomies of insanity, including love madness and religious mania. The contemporary edition from the New York Review of Books Classics series is 1,382 pages long. Within those digressive, branching, labyrinthine passages Burton considers King Louis XI of France’s madness whereby everything had the stink of shit about it, an Italian baker from Ferrara who believed that he’d been transformed into butter, and the therapeutic effects of music on elephants. Lund explains how subsequent editions, rather than cutting verbiage, fully indulged Burton’s favored rhetorical conceit of congeries, whereby words are piled upon words in imitation of the manic mind, a quality that has both endeared the book to its readers and frustrated them. And yet as William H. Gass observes in his introduction to the NYRB Classics edition, “the words themselves are magical; you cannot have too many of them; they are like spices brought back from countries so far away they’re even out of sight of seas; words that roll… words even more exotic, redolent, or chewy.”

Burton’s monumental work, which readers felt free to dip in and out of rather than reading cover-to-cover, easily outsold Shakespeare’s folio, though by the Enlightenment his acclaim had dimmed, the work interpreted as a poorly organized baroque grotesquerie based in outmoded theories. During the optimistic 18th century, The Anatomy of Melancholy had not a single printing. Despite that, it still had readers, including Benjamin Franklin and Dr. Johnson, who told Boswell that it was the “only book that ever took him out of bed two hours sooner than he wished to rise.” Romantics were naturally drawn to it; both Samuel Taylor Coleridge and John Keats had underlined copies, with the latter drawing the plot for his vampiric Lamia from Burton. In the 20th century, the existentialists saw something modern in Burton, with Samuel Beckett a lover of the book. The Canadian physician William Osler, a founder of the Johns Hopkins School of Medicine, thought it the greatest medical text by a layman, and was instrumental both in reviving interest in the book and in the bibliographic tabulation of Burton’s personal library at Oxford. Despite the pedigree of his fans, Burton hasn’t had a wide readership for centuries, as The Anatomy of Melancholy has never been easy: an assemblage of disparate phenomena, a hodgepodge of unusual examples, a commonplace book of arcane quotation and complicated exegesis, none of it structured in a straightforward way, with Burton himself apologizing that “I had not time to lick it into form, as a bear doth her young ones,” though the book’s growing formlessness over the next two decades would belie his protestation.

The Anatomy of Melancholy feels unfinished, just like life; it’s contradictory, just like a person; and it encompasses both wonder and sadness, just like a mind. On its quadricentenary it’s abundantly worthwhile to spend some time with Burton, because though he can’t speak of neurotransmitters, he does speak of the soul; though he can’t diagnose, he can understand; though he can’t prescribe, he can sympathize. Beyond just depression, Burton considers ailments like anxiety, obsessions, delusions, and compulsions; sufferers “conceive all extremes, contrarieties and contradictions, and that in infinite varieties.” To paraphrase Tolstoy, the happy are all the same, but Burton’s depressives are gloriously different. “The Tower of Babel never yielded such confusion of tongues, as this chaos of melancholy doth variety of symptoms.” The second thing that is important to note is that Burton distinguishes everyday emotions—the occasional blues, if you will—from the more punishing kind. He explains that “miseries encompass our life,” that everyone suffers grief, loss, sorrow, pain, and disappointment, and that it would be “ridiculous for any mortal man to look for a perpetual tenor of happiness.” If somebody is suffering from physical pain or a loved one’s death, grief and sadness are rational; for a person facing economic ruin or an uncertain future, anxiety makes sense, but a “melancholic fears without a cause…this torment procures them and all extremity of bitterness.” For those whose humors are balanced, grief is the result of some outside torment; for the melancholic, grief is itself the torment. Furthermore, as Burton makes clear, this disposition is not a moral failing but a disease, and he often makes suggestions for treatments (while just as soon allowing that he could be entirely wrong in his prescriptions). “What can’t be cured must be endured,” Burton notes. In the depressive canon of the late Renaissance, Burton would be joined by Thomas Browne with his similarly digressive, though much shorter, Religio Medici, wherein he writes, “The heart of man is the place the devil dwells in; I sometimes feel hell within myself;” John Donne’s sickbed Devotions Upon Emergent Occasions, where he concludes that “Man, who is the noblest part of the earth, melts so away as if he were a statue, not of earth, but of snow;” and, of course, Shakespeare’s famed soliloquy from Hamlet that wonders if “by a sleep to say we end/The heart-ache, and the thousand natural shocks/That flesh is heir to.” And that’s just relatively High Church Englishmen; with a broader scope you’d include the Catholic Frenchman Blaise Pascal, who in his Pensées defines man as one “equally incapable of seeing the nothingness out of which he was drawn and the infinite in which he is engulfed,” and the 15th-century Florentine Neo-Platonist Marsilio Ficino, who wrote in his 1489 The Book of Life that the condition was “conducive to judgment and wisdom,” entitling one chapter “Why the Melancholic Are Intelligent.” None of them, however, is as all-encompassing as Burton, as honest about his own emotions and as sympathetic to his fellow sufferers. Within his book’s prologue, entitled “Democritus Junior to His Readers,” ironically written under a pseudonym adapted from the pre-Socratic thinker known as the “laughing philosopher,” Burton explains that “I had a heavy heart and an ugly head, a kind of imposture in my head, which I was very desirous to be unladen of,” and so his scribbling would act as therapy.

Across denominations, countries, and continents, the contemplation of a fashionable melancholia was encouraged, with even Burton confessing to sometimes enjoying such abjection as a “most delightsome humor, to be alone, dwell alone, walk alone, meditate, lie in bed whole days, dreaming awake as it were.” Noga Arikha explains in The Public Domain Review that melancholia could be seen as “good if one believed that a capacity for strong passions was the mark of a fine soul that recognized beauty and goodness… the source of sonnets, the harbinger of creativity,” while Darrin M. McMahon notes in Happiness: A History that this is a “phenomenon that would have a long and robust future: the glamorization of intellectual despair.” A direct line can be drawn from the goth teen smoking clove cigarettes in a Midwestern high school parking lot through the Beats in their Manhattan lofts eating hash brownies and masturbating to William Blake through to the Left Bank existentialists parsing meaninglessness in the post-war haze and the Lost Generation writers typing on Remingtons in Parisian cafes back to the Decadents and Symbolists quaffing absinthe and the Romantics dreaming opium reveries until interrupted by the person from Porlock through to Burton, and Browne, and Donne, and Shakespeare, and Pascal and Ficino and every other partisan of depression. As Burton notes, “melancholy men of all others are most witty.”

Melancholy was more than a pose, however, and even if Burton agreed that it could sometimes be romanticized, he never lost sight of its cost. Tabulating the price of anxious scrupulosity, Burton notes that “Our conscience, which is a great ledger book, wherein are written all our offences…grinds our souls with the remembrance of some precedent sins, makes us reflect upon, accuse and condemn ourselves.” In the millennium before Burton there were contrary perspectives concerning melancholy. It was interpreted by theologians as a sin—an indolent sloth called acedia—as opposed to the position of doctors who diagnosed it as an imbalance of elemental substances called humors. One thing that Burton is clear on is that melancholy isn’t simply feeling down. To be melancholy isn’t to be “dull, sad, sour, lumpish, ill disposed, solitary, any way moved or displeased,” Burton writes, and that clarification is still helpful. For those blessed with a brain chemistry that doesn’t incline them towards darkness, depression might seem an issue of will power, something that can be fixed with a multivitamin and a treadmill. Reading Burton is a way to remind oneself—even as he maintained erroneous physiological explanations—that depression isn’t a personal failing. And it’s certainly not a sin. McMahon explains that by “reengaging with the classical tradition to treat excessive sadness and melancholia as an aberration or disease—not just the natural effect of original sin—Renaissance medicine opened the way toward thinking about means to cure it.”

A cure remained more possibility than actuality, though, for the rudiments of mental health care were still mired in superstition. Melancholy was identified with an overabundance of black bile in the spleen, and a deficit of yellow bile, blood, and phlegm, a condition associated with a dry coldness, so that some of that metaphorical import still survives today. Arikha writes in Passions and Tempers: A History of the Humours how the “experiences of joy, pain, anguish, and fear each had their temperature, their match in some sort of stuff in the body whose motion modulated the emotion.” In a broader way, however, there is something to be said for how the humors emphasized embodiment, the way they acknowledged how much of the emotional was in the physical. We now know that melancholy isn’t caused by problems with our humors but with our neurotransmitters—I am not cutely implying that the two are equivalent; accurate science is the only way that pharmacologists have been able to develop the medicine that saves so many of our lives. Yet there is an enduring wisdom in knowing that this is a malady due to something coursing in your veins, whatever you call it. “We change language, habits, laws, customs, manners,” writes Burton, “but not diseases, not the symptoms of folly and madness, they are still the same.”

Depressives have always existed because there have always been those of us who have a serotonin and dopamine deficiency, even if we’re not lacking in yellow bile, blood, and phlegm. How culture interprets mental illness is entirely another thing, though. As Burton’s popularity demonstrates, there was a surprising lack of stigma around melancholy. In an abstract way, during the 17th century this was a reaction to how eternal verities no longer seemed so eternal. Gass explains that “people were lifting their heads from canonical books to look boldly around, and what they saw first were errors, plentiful as leaves. Delight and despair took turns managing their moods.” Even while The Anatomy of Melancholy relied on the Galenic humoral theory that had dominated medicine since the second century, the English physician William Harvey was developing his book Anatomical Account of the Motion of the Heart and Blood, which would dispel the basis for the four bodily humors (though the theory would take two more centuries to die). There were more practical reasons for melancholy as well. On the continent, the Thirty Years’ War started three years before Burton’s book was completed and would end in 1648, eight years after he died. As many as 12 million people perished, a death rate that dwarfed all previous European wars, with one out of five people on the continent dead. As Burton wrote in the early 1620s, his native England was headed towards an inevitable civil war, disunion clear in its political and religious polarization. By that war’s conclusion, 200,000 people were dead, fully 2.5 percent of the population. By comparison, that would be as if 12 million contemporary Americans were killed. Disease could be just as deadly as the New Model Army; over the course of Burton’s life the bubonic plague broke out in 1603, 1625, and 1636, with close to 100,000 deaths. Depression can come from an imbalance within the body, but sometimes insanity is also a sane reaction to an insane world. You still have to bear it, however.

Burton is good-humored, and he may even have been jovial from time to time, but he’s resolutely a partisan of the sighing angels. Not that Burton didn’t advocate for treatment, even while he emphasized his own inexpertness. Solomon explains that Burton recommends “marigold, dandelion, ash, willow, tamarisk, roses, violets, sweet apples, wine, tobacco, syrup of poppy, featherfew, Saint-John’s-wort…and the wearing of a ring made from the right forefoot of an ass.” We are, it should be said, fortunate to have refined our prescriptions. Despite the fact that Americans hunger for painkillers both helpful and deadly, The Anatomy of Melancholy isn’t a particularly American book. If the English malady is glum dampness, then the American affliction is plucky sociopathic optimism. A can-do attitude, pulling-oneself-up-by-the-bootstraps, rugged individualism, grit, determination, cheeriness. We were once inundated by snake-oil salesmen and medicine men; now we have self-help authors and motivational speakers. A nation where everybody can be a winner in seven easy steps and there are keys to a new car under every guest’s seat. “Americans are a ‘positive’ people,” writes Barbara Ehrenreich in Bright-Sided: How Positive Thinking Is Undermining America; this “is our reputation as well as our self-image…we are upbeat, cheerful, optimistic, and shallow.” Some crucial points: optimism is not equivalent to happiness, and if anything, it’s a mask when we lack the latter. That’s not bearing it—that’s deluding ourselves.

We weren’t always like this; we have our counter-melody to faux-positivity, from those dour Puritans to Herman Melville writing of the “damp drizzly November in my soul.” But could anyone imagine Abraham Lincoln being elected today, a man who, as a young lawyer in 1841, would write that “I am now the most miserable man living. If what I feel were equally distributed to the whole human family, there would not be one cheerful face on earth”? Now, with the ascendancy of the all-seeing Smiley Face, we’ve categorized talk like that as weakness, even if we don’t admit what we’re doing. Our 16th president had a melancholic understanding entwined with wisdom, what Roy Porter in Flesh in the Age of Reason: The Modern Foundations of Body and Soul phrased as “Melancholy and spleen, those stigmata of true scholarly dedication.” An ability to see the world as it is. Not just as some cankered, jaundiced, diseased thing, but how in the contrast of highs and lows there is a sense of how ecstatically beautiful this life is, even in its prosaic mundanity. Solomon writes that “I love my depression. I do not love experiencing my depression.” He explains that the “opposite of depression is not happiness but vitality,” and ignorance of this distinction bolsters the cult of positivity. Therapy is honest, unsparing, difficult, and painful. Self-help just tells you what you want to hear. Norman Vincent Peale wrote in Stay Alive All Your Life that the “dynamic and positive attitude is a strong magnetic force which, by its very nature, attracts good results.” This, quite clearly, is unmitigated bullshit. Instead of Dale Carnegie, we need Donne; rather than Eckhart Tolle we could use Browne; let’s replace Tony Robbins with Robert Burton.

Because, dear reader, if you haven’t noticed, we’re not at a happy point in history. America’s cheery cult of optimism is finally folding under the onslaught of the pandemic, political extremism, economic collapse, and the ever-rising mercury. If you’re the sort who’d be chemically glum even in paradise, then you’ve already been to hell, and you might have a bit of extra knowledge folks could benefit from. Stanley Fish explains in Self-Consuming Artifacts: The Experience of Seventeenth-Century Literature how “sober discourse itself is an impossibility given the world,” and that for Burton “nothing—no person, place, object, idea—can maintain its integrity in the context of an all-embracing madness.” Gass is even more blunt on the score: “When the mind enters a madhouse…however sane it was when it went in, and however hard it struggles to remain sane while there, it can only make the ambient madness more monstrous, more absurd, more bizarrely laughable by its efforts to be rational.” Burton speaks to our epoch, for depression is real, and there are legitimate reasons to be depressed. As he writes, melancholy is an “epidemical disease,” now more than ever. Burton’s prescriptions, from tincture of marigold to stewed offal, seem suspect—save for one. With the whole world an asylum, Burton advocates for awareness. There are risks to such hair of the dog, though. “All poets are mad,” Burton writes, the affliction of “excellent Poets, Prophets, &c,” and I suspect, dear reader, that you too may be in that “etcetera.” Depression, along with addiction, is the writer’s disease. Sylvia Plath, James Baldwin, David Foster Wallace, Anne Sexton, Arthur Rimbaud, F. Scott Fitzgerald, Mark Twain, Emily Dickinson, and so on—all wrestled with the noon-day demon. Many of them died because of it, in one way or another. There is no shame here, only sadness that some couldn’t be with us a bit longer, and the genuine, deep, and loving request that you, dear reader, stick around. As for Burton, he was committed to the principle that “I write of melancholy, by being busy to avoid melancholy,” and it often worked. He gives the most poignant expression to that black veil that shrouds the mind, the caul of the soul that afflicts some from time to time. If writers are prone to depression, then Burton’s tome was an attempt to write himself out of it, to “satisfy and please myself, make a Utopia of mine own, a New Atlantis, a poetical commonwealth of mine own.” We’re lucky that he did, because even if it’s not the only thing—even if it’s not always the best of things—there is something sacred in that. No matter how occluded, know that somebody else understands what you’re feeling.

So, blessed are Burton and duloxetine, therapy and sertraline, writing and citalopram, empathy and fluoxetine, compassion and escitalopram. Blessed are those who ask for help, those who are unable to ask for help, those who ask if somebody else needs help. Blessed are those who struggle every day, and blessed are those who feel that they no longer can, and blessed are those who get better. Blessed are those who hold the hands of anyone suffering. Blessed is understanding—and being seen—for what Burton offers you is the observation that he sees you, the reader. “I would help others, out of a fellow-feeling,” he writes. Because Robert Burton often felt worthless, as if he were walking at the bottom of the chill Thames. Sometimes it felt like his skull was filled with water-soaked wool and his eyes pulsated, vision clouded over with gauzy darkness; he knew of listing heaviness and the futility of opening the door, of getting dressed, of leaving the bed, of how often the window of care shrank to a pinpoint of nothingness, so that he could feel no more than that. This strange book studded with classical allusion and scriptural quotation, historical anecdote and metaphysical speculation—who was it for? He wrote for the person who has had the tab for this article open on their computer for days, but has no energy to read; for the woman who hasn’t showered in weeks and the man who can’t bring himself to go outside. “Thou thyself art the subject of my discourse,” Burton writes. His purpose was one thing—to convey that you have never been alone. Not then, not now, not ever.

And the Walls Came Down


Across seven seasons and 178 episodes, Patrick Stewart performed the role of Capt. Jean-Luc Picard of the United Federation of Planets starship USS Enterprise NCC-1701-D with professionalism, wisdom, and humanity. Displaying granitoid stoicism and reserved decorum, Picard was the thinking-person’s captain. The fifth-season episode “Darmok” is an example of how Star Trek: The Next Generation was among the most metaphysically speculative shows ever broadcast. In the beginning, the Enterprise is hailed by the Tamarians, but the Enterprise’s computerized translator can’t comprehend their language. Against his will, Picard is transported alongside the Tamarian captain Dathon—whose head looks like a cross between a pig and a walnut—to the surface of a hostile planet. Confronted by a massive, shimmering, and horned beast, Dathon tosses Picard a dagger and yells “Darmok and Jalad at Tanagra.” After they’ve defeated the monster, though at the price of Dathon being injured, Picard realizes that the enigmatic phrase is a reference, an allusion, a myth. Thus, a clue to their language—Tamarians speak only in allegory. Dathon has invoked two mythic heroes to communicate that he and Picard must fight the beast together. It’s doubtful that Paul Ricoeur was a Trekkie, but Dathon embodies the French philosopher’s claim in The Rule of Metaphor: The Creation of Meaning in Language that “Only a feeling transformed into myth can open and discover the world.” Dathon, however, has sacrificed himself upon the altar of comprehensibility, for he received a fatal blow, and he dies as a pieta in Picard’s arms. As he passes into that undiscovered country, Picard speaks to him in our own mythic-metaphorical tongue: “They became great friends. Gilgamesh and Enkidu at Uruk.”
Tamarian makes explicit our own language’s reliance on the figurative—any language’s use of metaphor, for that matter. “Try as one might, it is impossible to free oneself from figural language,” writes Marjorie Garber in The Use and Abuse of Literature, as “all language is figurative.” Examine my first paragraph, for there are several mythic allusions throughout—the first clause of my fourth sentence reads “In the beginning,” a reference to the 16th-century Bible translator William Tyndale’s rendering of the Hebrew Bereshith in Genesis (later incorporated into the more famous King James Version, as well as the translation of a different koine phrase in John 1:1). My penultimate sentence has two mythopoeic references: one to Dathon having “sacrificed himself upon the altar,” and a mention of the “pieta,” a word that translates to “piety” and often refers to Christ dead in the arms of the Virgin Mary. Such mythic allusions abound in language. For example—as a writer my Achilles’ heel is that I take on Herculean tasks like writing an overview of metaphor, requiring me to stop resting on my laurels and to open up a Pandora’s box, so that I can leave no stone unturned; knock on wood, and don’t treat me like a Cassandra, but let’s hope that I can cut the Gordian knot here. Such Greco-Roman adornments aren’t just mythic allusions, they’re also dead or at least dying metaphors—for who among us ever tells somebody that we’re “between a rock and a hard place” while thinking of the Scylla and Charybdis of Homer’s Odyssey? That’s another way of saying that they’re clichés, but mythic connections abound in less noticeable ways too, though perhaps I’d do well this Wednesday to render unto Odin what belongs to Odin and to spare a thought for the Germanic goddess Frigg when the work week draws to a close.

Star Trek: TNG screenwriters Joe Menosky and Philip LaZebnik make “metaphorical” synonymous with “mythic,” though figurative language draws from more than just classical tales of gods and heroes; it draws from anything that transfers meaning from one realm into another. In my first paragraph, I described Stewart’s stoicism as “granitoid,” though he is not coarse-grained igneous rock; later I deployed a simile (rhetoricians disagree on how different that conceit is from metaphor) when I said that Dathon looked like a cross between a pig and a walnut. I can’t claim that these are great metaphors, but they were my attempt to steer the ship away from the rocky shoals of cliché. “A newly invented metaphor assists thought by evoking a visual image,” writes George Orwell in “Politics and the English Language,” his classic essay of 1946, “while on the other hand a metaphor which is technically ‘dead’… has in effect reverted to being an ordinary word and can generally be used without loss of vividness.” Between the arresting, lyrical, novel metaphor of the adept poet and the humdrum language of the everyday is the “huge dump of worn-out metaphors which have lost all evocative power and are merely used because they save people the trouble of inventing phrases for themselves.” Orwell lists around a dozen clichés, including “ride roughshod over,” “grist to the mill,” “swansong,” and “toe the line.” When I encounter clichés such as these, in other people’s writing but especially in my own prose, they appear to me like a dog turd hidden under the coffee table; their fumes make my eyes water and my stomach churn. Cliché must be ruthlessly expunged by red pen at every opportunity. But those two other categories that Orwell lists are entirely more interesting.

Dead metaphors—not just those on life support but also those decomposing in the ground—merit our getting a shovel and some smelling salts. Tamarian was imagined as structured by metaphor, but the vast majority of words you use every day were originally metaphors. Take the word “understand”; in daily communication we rarely parse its implications, but the word itself is a spatial metaphor. Linguist Guy Deutscher explains in The Unfolding of Language: An Evolutionary Tour of Mankind’s Greatest Invention how “understand” derives from the Middle English understanden, itself from the Anglo-Saxon understandan, and ultimately from the Proto-Germanic understandana. “The verb ‘understand’ itself may be a brittle old skeleton by now,” Deutscher writes, “but its origin is still obvious: under-stand originally must have meant something like ‘step under,’ perhaps rather like the image in the phrase ‘get to the bottom of.'” Such spatial language is common, with Deutscher listing “the metaphors that English speakers use today as synonyms: we talk of grasping the sense, catching the meaning, getting the point, following an explanation, cottoning on to an idea, seeing the difficulty.” The word “comprehend” is itself a metaphor, with an etymology in the Latin word prehendere, which means “to seize.” English has a propensity for these sorts of metaphors, foreign loan words from Anglo-Saxon, Frisian, and Norman; Latin, French, Italian, and Spanish; Abenaki, Wolof, and Urdu, which don’t announce themselves as metaphors precisely because of their foreignness—and yet that rich vein of the figurative runs through everything. Don Paterson gives two examples in his voluminous study The Poem, explaining how a word as common as “tunnel” comes from the “Medieval English tonel, a wide-mouthed net used to trap birds, so its first application to ‘subterranean passage’ will have been metaphorical—and would inevitably have carried the connotation ‘trap’ for a little while.” Similarly, the word “urn” comes from “the Latin urere, to burn, bake,” the word itself holding a connotative poetry of immolation. In both these examples, “the original neologists will have been aware of their relation to earlier terms,” Paterson writes, “but this conscious awareness can be lost very rapidly.” If you’re lucky enough to have a subscription to the Oxford English Dictionary, you can trace the etymology of prosaic words and find their glamorous metaphorical beginnings. Within the seemingly hollow throat of every word resides the voice of metaphor, no matter how faint its call to us may be.

The manner in which metaphors fossilize into everyday words is not unlike the process by which, during the fetid days of the Carboniferous Period, giant spiders and scorpions, salamanders and sharks, scaled trees and seeded plants would sink into some shallow tropical pool and fossilize until dug out of the ground by a coal miner in Wales or West Virginia. Just as miners occasionally find deposits in the shape of a trilobite, so too do some dead metaphors appear as obvious clichés, but the bulk of our language is so dark and hard that it might as well be bituminous. Yet as burning coal releases heat, so too does the glow of meaning come off these rocks that we call words, the once vital energy of metaphor still hidden inside. “However stone dead such metaphors seem,” writes I.A. Richards in The Philosophy of Rhetoric, “we can easily wake them up.” Such evolutionary development is what linguists call “semantic change,” as tangible and concrete words are used as metaphors, transferring intent in a one-way direction towards abstraction. “The mind cannot just manufacture words for abstract concepts out of thin air—all it can do is adapt what is already available. And what’s at hand are simple physical concepts,” writes Deutscher, so that when I say that I’m trying to establish a foundation for my argument, I draw from the concrete metaphors of construction, while it would be odd to appropriate abstraction to describe something tangible. “One thing is certain,” Deutscher says, “nothing can come from nothing.”

Every word is a metaphor; every phrase, no matter how prosaic, is a poem—even if it’s mute. Words don’t correspond to reality; they only correspond to one another. All languages are a tapestry of threads forever floating above the ground. “Metaphor is the transference of a term from one thing to another, whether from genus to species, species to genus, species to species, or by analogy,” as Aristotle defined it in his Poetics, the first book to taxonomize the trope. The word metaphor is itself, rather predictably, a metaphor. The Greek μεταφορά, as Aristotle’s definition literally states, translates as “transference,” with all the connotations of transport, shifting, movement, and travel. As contemporary travelers to Greece still discover, delivery vans have the word metaphora painted on their sides, something that was a delight to Ricoeur. No language can lack metaphor for the same reason that no tongue can lack fiction; it’s not an issue of grammar in the way that some languages lack tense or person, but rather figuration is the domain of rhetoric. Wherever there are words used to designate one thing, the moment somebody uses those terms to refer to something else, we are within the realm of metaphor. This, it must be said, is rather different from encouraging novel metaphors, or enshrining metaphor as a poetic device, or really even noticing that it exists.

During the Middle Ages, there was a literary fascination with metaphor’s gilded siblings—parable and allegory—but the explication of figuration’s significance didn’t move much beyond Aristotle’s original definition. By 1589, the great English rhetorician George Puttenham would still define the term in The Art of English Poesy as involving “an inversion of sense by transport,” and that belief that metaphor gets you somewhere else remains central, metaphorically at least. The contemporary study of metaphor begins with Richards’s invaluable The Philosophy of Rhetoric (1936). Because the subject went largely unexplored over two millennia, Richards complained that the “detailed analysis of metaphors, if we attempt it with such slippery terms as these, sometimes feels like extracting cube-roots in the head.” He had a method of exactness, however, for Richards presents a novel vocabulary, one that—also unsurprisingly—is metaphorical. According to Richards, a metaphor is composed of two parts, the “tenor” and the “vehicle.” The first is whatever it is that’s being described, and the second is that whose attributes are carried over in the description of the former (shades of that Greek delivery van). “For the Lord God is a sun and a shield,” writes the Psalmist, providing us an opportunity to see how Richards’s reasoning works. Belying the fundamentalist fallacy that the Bible is a literal text—though nothing is—it’s clear to most that God is neither a giant ball of ionized hydrogen undergoing nuclear fusion into helium, nor is He defensive armor. What God is, in King David’s metaphor, is the tenor, because He is what is being described. The attributes borrowed from “sun” and “shield” are those connotations of being life-giving, luminescent, warm, as well as defensive and protective. “Sun” and “shield” are even gently contradictory, as blocking out the former with the latter attests; but that’s also a demonstration of how the non-literal can express reality in its glorious paradox. In exactly the same manner, Robert Frost’s central conceit in “The Road Not Taken” uses the vehicle of an arduous and uncertain wooded path on the implied tenor of the narrator’s life. A great richness of poetic metaphor—as separate from cliché—is that it allows for ambiguity of referent, so that meaning is a many-colored thing. What makes a poetic metaphor successful is the delicate interplay between tenor and vehicle. A poet’s task is to set that distance just right, and to somehow surprise the reader while doing so, without befuddling them.

Poets are less the unacknowledged legislators of the world than they are its wizards, because metaphor has “the power to define reality,” as George Lakoff and Mark Johnson write in Metaphors We Live By. Or maybe it’s more accurate to say that those who nurture metaphors are like apiarists, since the figurative is like pollen from a flower and each word a bee, meaning traded in a continual hum. Language—and thus reality—is supersaturated with meaning, everything capable of being transformed into something else. “If all language is metaphorical,” writes David Punter in his short study Metaphor, “then it could also follow that we might want to say that all language is continually involved in a series of acts of translation: translating things which are difficult to apprehend into things we can apprehend.” Just as translation is based in difference, so is all communication. All language is relational, a communion between that-which-is and that-which-isn’t. Because words are always arbitrarily connected to that which they represent, language is intrinsically metaphorical: the tethering of random shapes on a page, or of the vibration of air molecules, to an outside sense; the compelling of someone with a mind different from your own to imagine that which you wish them to. Without metaphor there’s no fiction, and without fiction there’s no metaphor. Without either, there’s no possibility of communication. Metaphor is a bridge that doesn’t exist that you’re still able to cross, for faith in being understood is that which gets you to the other side.

Figurative language encompasses the expansiveness of metaphor; the insularity of metonymy; the granularity of synecdoche; the straightforwardness of simile; the folksiness of parable; the transcendence of allegory. We don’t just read metaphor in literature; humans have always seen it in events, in nature, in the cosmos, in any manner of thinking that sees existence as overdetermined, as meaning permeating that which would otherwise be inert. We see metaphor in the hidden grinning skull of Hans Holbein’s painting The Ambassadors and the melting clocks in Salvador Dalí’s The Persistence of Memory. It’s in F. Scott Fitzgerald’s green light and Herman Melville’s whale. It’s in the Anglo-Saxon form of the “kenning,” where the sky is a “bird-road,” and Homer’s evocation of the Aegean “wine-dark sea”; in saying that space-time can “curve” and that genes are “selfish”; in René Descartes’s description of the body as a machine and William Paley’s claim that the universe is a clock, and the understanding that God can be both Mother and Father. Politics is threaded through with metaphor, narrative, and artifice—the most effective means of getting the masses to listen to you, for both good and evil. Metaphor is both what facilitated the horror of the Rwandan genocide, when Hutu propagandists described the Tutsis as “cockroaches,” and what generates the hopefulness in the socialist call for “bread and roses.” Symbolism undergirds both the eagle in the Seal of the United States of America and that of the Third Reich. That the same animal is used for both only emphasizes how mercurial metaphor happens to be. As Punter explains, “metaphor is never static, and rarely innocent.” Figuration’s precise power and danger come from such slipperiness, as everything is forever morphing and mutating between the metaphorical and the literal.

When Patrick Stewart first came to the United States in 1978, arriving in Los Angeles where he would later find his greatest fame as a starship captain, it was as a 37-year-old member of the Royal Shakespeare Company performing the role of Duke Senior in As You Like It at the Ahmanson Theatre. Traipsing through the enchanted forest of Arden, Stewart appeared opposite Alan Howard performing as Jaques. During the seventh scene of the second act, Howard (most celebrated for voicing Sauron in Peter Jackson’s Lord of the Rings films) uttered the most celebrated extended metaphor in Shakespeare’s literary career of extended metaphors: “All the world’s a stage, / And all the men and women merely players; / They have their exits and their entrances; / And one man in his time plays many parts.” Conventionally understood as referring to the various roles we all perform over the course of our lives, it’s an apt encapsulation of metaphor itself. All of communication is a stage, and every word is merely a player, and one word can play many parts. When it comes to figuration, Jaques’s monologue is right up there with Darmok and Jalad at Tanagra. Because nothing in the English language is as perfect—as immaculate, as blessed, as sacred, as divine—as the word “like.” What it effects is a syntactical transubstantiation, the transformation of one thing into another. If everything—our concepts, our words, our minds—is sealed off behind a wall of unknowability, then metaphor is that which can breach those walls. Whether implied or stated, “like” is a bridge transporting sense across the chasm of difference, providing intimations of how all ideas and things are connected to each other by similarities, no matter how distant. Whether uttered or only implied, “like” is the wispy filament that nets together all things. Perhaps there is naked reality, but we’ll never be able to see it ourselves, always clothing it in finery that is continually put on and taken off. In our life, metaphor is the intimate kiss between difference.

Image Credit: Wikipedia

The Good Art Friend


The current Internet-fueled maelstrom ignited by the article “Who Is the Bad Art Friend?”—about two writers and the putative ownership of a “kidney story”: for one writer it was a lived experience; for the other, something to render in fiction—in all its dizzying permutations, the details of which were further recast in a court case, made me wonder if the corollary, the Good Art Friend, must then also exist.

First, I have to admit I am not sure what an “art friend” is at all. Full disclosure: I am Facebook friends with both protagonists, as well as with the writer of the original piece. I’m also a little unclear how and where the adjective good/bad attaches (to friend? to art?).

Since the definition is up for grabs, I’ve defined the BAF as someone who is on the whole deleterious to your art, but probably good to their own, and may or may not be a friend—if we define a friend as someone who cares for you, shows up for you, and genuinely shares in your joys. The GAF is the good friend who helps you on your journey, often in ways you don’t expect and don’t appreciate until enough time has passed for hindsight.

“For Sale/Kidney Story, Never Authorized” was an insightful newsletter post by my erstwhile creative writing colleague Lincoln Michel. The post made me think about how, no matter how solitary we are as artists, we grab the details of life for our work, including those of the people closest to us; even Emily Dickinson did. For me, my oldest art friend, besides my Royal typewriter, is my hometown best friend, Patti. She is my art friend exemplar, even though, as a VP-CFO of an insurance company, she is not, people might say, an artist, which makes me ask: how do we define art, and is the “artist” solely responsible for—and the beneficiary of—its creation?

Patti and I met, admittedly, in the most incredibly catty way, excusable because we were only 10 or 11: piano recitals. I suffered through years of piano lessons, every minute of which I loathed—the opening bars of “Für Elise” will be forever a trigger—plus the added misery of recitals and competitions, all of which took place in the basement of our public library, where I once took a karate class in hopes it would protect me from racist bullies. In our small town we actually had three or more piano teachers, which meant sitting through interminable rounds of little kids picking out “Chopsticks.” In our cohort, I felt Patti was the most talented, but most of the attention went to a boy pianist (whom I won’t even refer to here, for our nickname for him will make him instantly recognizable) whom we felt received unnecessary and excessive praise from our teachers solely for being the rare dude.

We actually didn’t dislike Boy Pianist on a personal level; we just truly felt the adulation he received from our teachers gave short shrift to Patti’s talents. Patti was also being raised by a single dad, a miner, after her mother was killed in a car accident, and unlike me with my Asian Tiger parents and the other kids, she continued to play of her own accord and because of her talent. This was also my first lesson in how you can bond with a fellow artist by being annoyed at a third artist.

Patti was also the person who constantly pushed me to venture into new experiences, like the time before we had driver’s licenses when we tried biking to the next town, which required a short and terrifying stint on the highway. The sense of risk and being able to sit with uncertainty is essential for any art, and I don’t know if I would have developed it on my own. I also secretly thought I was a very humorous person, but without a sparring partner, how to develop those skills? Patti was and still is one of the quickest and funniest people I know. Imagine my delight as a child finally finding someone who shared my passion for MAD magazine. Not to mention that being the only student of color in our high school made me a magnet for bullies, and often I was too tired, too scared, too full of self-loathing to defend myself, but Patti never seemed to tire of defending me.

When I wrote my first novel, a YA story set in high school, a Patti-esque character figured prominently. It was easy to develop a fully realized character by basically plagiarizing my colorful friend, including her telling off racist bullies. The novel did end up with race as a prominent theme, but much of my motivation came from feeling the experiences of youth slipping away and wanting to trap them in fiction. In various drafts, the protagonist became more and more fictional: an avatar of a better and braver high school self. The racial and intergenerational themes became more prominent, while the Patti character largely remained Patti, with fictional details created or rearranged for plot.
When I pull up, her house is dark; her father doesn’t come home from the mines until late in the evening, so she doesn’t leave the lights on.

I’ve never met Jessie’s mom. One Thanksgiving, long before Jessie and I became friends, an Arkin High student killed her when he came barreling down the wrong side of the street in his pickup–apparently he’d been drinking while watching the football game.

I stare out at the night. I won’t drive drunk tonight–or any night. No way.

Jessie opens the door to the car. “Hi, Ellen,” she says. As she hoists herself into the Blazer, the flowery smell of Sweet Honesty fills the car, followed by the slight trace of cooking smell—fried something.
In homage, I had even left the character’s name as “Patti.” How it changed to “Jessie,” I will explain.

While I was still working on the novel, I pooled all my vacation time from my day job and went to the Bread Loaf Writers’ Conference, where I got to work with the late, wondrous Nancy Willard. One critique she had was that the two characters, Ellen and Patti, were “too alike.” Maybe revise Ellen “up” and Patti “down,” she suggested. I still remember the hand motions she made. Up and down. So I indeed made Ellen even nerdier (and much kinder) than I was in high school, and roughened the Patti character around the edges. However you want to look at it, these changes helped get the book into a publishable state.

When Houghton Mifflin bought the book, I giddily sent Patti the manuscript, excited to see what she’d think about the daredevil BFF character, modeled so closely on her that, not unlike what happened in “For Sale/Kidney Story,” I proudly used her real name.

I assumed she would be over the moon for me and be happy to see a fictional version of our friendship immortalized in print. And I inadvertently proved the truism one of our teachers was fond of: that to assume makes an ASS out of U and ME.

She called to let me know she’d read the manuscript. Then she started yelling at me about how angry she was at what I had done, and then hung up. I, confused and panicked, called back only to get various iterations of the loud hang-up. This was in the time of landlines, when hang-ups were pretty emphatic. Finally, her husband answered the phone, and kindly said Patti didn’t want to talk to me and that it would be better to just stop calling.

Of course I considered not publishing, but I comforted myself that although she had expressed her hurt over my “betrayal,” she never asked me not to publish. Honestly, I don’t know what I would have done if she had.

I did frantically call my editor to have the name changed to “Jessie.” I remember being in extremis to the point that my editor said, “Wow, this makes me wonder how much else in the book is true.”

With fiction, it’s all true, and it’s all a lie. The relevant issue was whether I was being a Bad Art Friend at that moment. It reminds me a little bit of Bob Dylan, who was also from our town. In his early post-Hibbing years, when he was exploring the folk scene, many people would dig out their prized one-of-a-kind folk records for him, only to find the next day that they’d been swiped by Dylan, because he so single-mindedly needed them. That was an unequivocally rotten thing to do, and legally actionable, but now that Dylan is Dylan, no one cries foul; everyone seems glad for their small contribution to American arts and culture. Was that similar to what I was doing? Tearing single-mindedly into my project and hoping for forgiveness later? Would that require my becoming as famous and influential as Dylan as a justification?

I didn’t know, and maybe I still don’t know. All I knew was that I had set myself on a path that I wanted to follow, and did.

But I still missed her. I told her so, in various missives I would put in the mail every few months (I was too terrified to call). They were never reciprocated.

Until one day.

My second book, a middle-grade novel set in junior high and completely Patti-free, had just been published and had gotten some press in the Minneapolis newspapers, including mentions that I was in town. Patti, who had moved there shortly after our high school graduation, called me up without preamble, congratulated me on my new book, told me she had a coupon for a favorite restaurant, Ciatti’s (RIP), and asked would I help her fulfill the buy one, get one?

I was ready to leap into her arms when we met, but she clearly was not intending to resume where we left off. Conflict-avoidant as always, I didn’t push. I ate my meal; we talked about my new book. I remember we laughed, sporadically, perhaps about how “cappuccino” at this restaurant was Sanka with whipped cream on top. The connection was still there.

We tentatively put each other on Christmas card lists. With social media, we friended and accepted the requests. We enjoyed spying on her former piano nemesis this way. Years later, she and another high school friend, Lisa, visited me in New York. Back at the apartment, she noticed my compendia of MADs and asked to borrow one. We still didn’t talk about “it.” The novel had gone out of print for a second time years before, so it seemed we could just not talk about it forever.

Occasionally things go better than you expect—not often, even less often in publishing, but it happens. My novel had a brief second life at HarperCollins, then promptly went out of print again. But maybe 10 years later, an out-of-nowhere BuzzFeed article listed Finding My Voice as one of “15 YA Books From The ’80s And ’90s That Have Stood The Test Of Time,” and set it on its third reanimation, with Soho Press.

This time, I resolved to be a better friend than artist. During a visit to Minneapolis, I asked Patti—making sure to do it while we were driving in a car and at night so I wouldn’t have to look at her—if it was “okay” to republish.

“Oh my God, Mawee,” she said, using my childhood nickname. “Of course it is.”

“But, um, you were kind of mad back then.”

“I was out of my mind then.”

She explained more about what she’d been going through at the time, and she said she felt she had acted inappropriately. I told her that the things she had said to me in anger—”You ripped out huge pieces of my life.” “Is that what you think of me?”—still stand. My bleating “But you were the hero in the book and in my life” was not a good defense on my part. I had built a character on the details of her life I had gleaned as her friend, not as someone doing an interview, which is something I now do routinely as research for my fiction.

Patti was sincere in her permission for this third go-round. Needing to reread it for republication, I was startled at how the novel now read like it had been written by someone else. Obviously, I could easily pick out where I had mined the shared details of our lives, but enough time had passed that I could see the real/actual memories had been transformed beyond recognition—something I think Patti saw before I did. I remember writing that very first draft, being conscious that I was altering the “car accident” narrative to include alcohol, to make a character point that Ellen is aware she would not drink and drive—only to find the lived detail was Patti’s mother having a heart attack in the car, which I had somehow misremembered as a car accident. Thus, this detail in the book, which works in the text to provide characterization, is still “inspired” by, if not “copied” from, a real person’s life—and the most devastating event of that person’s life, at that. Is it okay to use it just for my “art”? I consider, then, the grace she extended to me despite my complete lack of consideration for her feelings when we were 28 and I was working so hard to get published. In late 2020, I casually informed my high school friends via group email about my virtual (COVID-19) launch for Finding My Voice, and I almost cried with joy to see her face in the Zoom panes.

Last week, I did a book club visit to a group of Korean American adults reading YA. All the readers, one by one, mused on how much better their high school lives would have been if they had had a Jessie by their side. They were all amazed and somewhat envious when I explained Jessie was more or less my real BFF. I know that I am lucky this way, now more than ever. Not just to have a viable writing career but to have a lifelong friend.

One important life lesson from these decades of career, ambition, writing, and friendship is that change is real and it’s happening all the time, despite our attempts to deny it. What both Art Friend stories show is that there’s no one way to be an artist, and there’s no one way to be a friend. The who-what-why changes over time, as do the boundaries of what is moral, ethical, allowable. What is appropriation, what is theft, and the big question: Is the artist solely responsible for her art (for praise or opprobrium, including the legal kind)? I think the only remedy is to resist our very human urge to adjudicate sides: Who’s right? Who’s the bad one here? This is a toxic path that can spiral forever, with nothing resolved, feelings continually hurt, nothing generative, and only the lawyers and the third-party chronicler (in Bad Art Friend’s case) profiting. If there’s one thing being Buddhist has taught me, it’s that once you let go of attempting to impute value—win/loss, good/bad—to whatever it is unfolding in front of your face, you can actually be open to what the moment is. And that just being actively kinder, defaulting to the kinder impulse, is spiritually profitable for all. In my defense, my version of our friendship is also mine to tell, and it is my blind spot of not considering others’ feelings while I work that actually allows me to create; in fact, I consider such focus a help, not a hindrance, as far as my writing is concerned. Furthermore, we see that no one actually owns memories, and even these change with time and perspective. For Patti and me, the very same event that was “bad” back then has proved to be “good” today. But the whole time, the only thing that mattered and still matters is our connection, in art and life.

Image Credit: Wikimedia Commons.