Tripping the Late Capitalist Sublime


“Dead shopping malls rise like mountains beyond mountains/And there’s no end in sight/I need the darkness someone please cut the lights.”—Arcade Fire, The Suburbs (2010)


When my wife and I lived just north of Boston, we’d drive past wood-paneled, yellow-painted two-story colonials and Queen Anne Victorians, pastel blue Cape Cods and rustic brown saltbox houses, until the meandering cow path of Lowell Street shunted us onto the Middlesex Turnpike toward the Burlington Mall. I never enjoyed malls when I was young; our closest was the Monroeville Mall where George Romero filmed Dawn of the Dead, and I disliked the creepy uniformity of those spaces, the steep escalators and strange indoor fountains, the shiny linoleum, the piped-in Top 40, the artificially lit interiors. Over time, I defeated my own snobbishness and learned to savor the futuristic slickness of the Apple Store, the faux-exoticism of Anthropologie, the seediness of Spencer Gifts and Hot Topic, the schmaltziness of Yankee Candle. “Not only is the mall a place of material reward,” writes Matthew Newton in Shopping Mall, “it is also a space to meditate on your surroundings,” where wandering “feels almost like slipping off into a dream.” The few things I bought at the Burlington Mall included a pair of swim trunks at Macy’s, my glasses, and maybe bubble tea slurped through one of those unnervingly thick straws. What I did do, however, was stand in the second-floor food court overlooking the turnpike glazed in January snow, the low winter sun of early dusk glowing golden like a squib of yellow butter scraped lightly across browned toast. I see no shame in admitting that I love the mall.
Everyone in literary circles has met the man whose family had homes in Manhattan and upstate New York, in rural New England and in Hilton Head, and somewhere in Europe, but who hates Pottery Barn, Williams-Sonoma, even Starbucks. These types emphasize that the family wealth isn’t theirs but their parents’, and that the bright orange sashimi and red tuna nigiri sitting in an open fridge at Wegman’s, the fake distressed wood of Pier 1, the cutting fragrances of Sephora are only bourgeois affectation for the rest of us. A privilege of the wealthy class tourist is the ability to skip wholesale over life in the middle, even while that middle disappears. Despite not needing the money, these types often romanticize manual labor, seeing in summer gigs as a dishwasher something authentic: the calluses from scrubbing a rough steel-wool pad across pasta-caked plates and the burns from scalding water, the rhythmic mindlessness of loading glasses and bowls into their plastic tray and sending them on a conveyor belt through the industrial washer. Such fantasies are a rejection of the suburban, the bourgeois, the basic. “The assumption that everyone else is like you. That you are the world,” such a man might quote from David Foster Wallace’s The Pale King, “The disease of consumer capitalism. The complacent solipsism.” ($15.49 on Amazon Prime). Privileged enough to have grown up upper middle-class, I’m still close enough to the factory that I see something of the tourist in that pose, and 12 years in inner-city public schools at least kept me honest. I don’t know much about class, but I know that most people who have no choice but the dishwashing rarely have the option of running that steel wool across the bright reds and blues of Le Creuset when they get home. Poverty is a luxury that only the rich can afford. As for me, I’ve always loved Williams-Sonoma.
During the mid-19th century, an economist enthused that capitalism has “created more massive and more colossal productive forces than have all preceding generations,” name-checking the marvels of “steam-navigation, railways, electric telegraphs… canalization of rivers,” while asking “what earlier century had even a presentiment [of] such productive forces?” He was a paragon of bourgeois tastes, an avid reader of the sentimental novels of Honoré de Balzac, a fan of maudlin Romantic music, and a perennial smoker of cheap cigars. Today he’d no doubt enjoy a Pumpkin Spice Latte at the Burlington Mall. His name was Karl Marx, and the selection quoted is from The Communist Manifesto. Marx’s critique is pertinent because he acknowledges what’s seductive about capitalism. Any radical analysis that ignores what’s so great about owning stuff isn’t really a radical analysis at all; any claim that television isn’t actually amazing, or junk food never tastes good, or pop music is anemic is just bohemian posturing. Sing me a song of Chipotle’s burrito bowl, all gristly steak, synthetic cheese, and fatty guac; of the glories of an MTO hoagie ordered from a Wawa screen; of the bruising trauma of the NFL; of the spectral sublimity of Netflix. Marx’s denunciations of capitalism—written with the support of his wealthy friend Friedrich Engels—were trenchant because he didn’t confuse ethics with aesthetics. By contrast, Pete Seeger—who, God bless him, was right about war and labor, and produced some catchy songs as well—couldn’t shake the condescension of an upstate New York childhood being raised by two WASPy Juilliard professors. “Little boxes on the hillside/Little boxes made of ticky tacky/Little boxes, little boxes/Little boxes all the same.” We’re to look down on these middle-class dupes for their spiritually bereft lives, their desire to golf and drink martinis. Seeger—whose family had a rural New Jersey estate and who died with $5 million to his name—saw those tracts of suburban sprawl as deadening. But you know who I bet wouldn’t mind one of those ticky tacky little boxes? Homeless people.

“Modern bourgeois society,” Marx writes, “is like the sorcerer, who is no longer able to control the powers of the nether world whom he has called up by his spells.” Dispute the prescription if we must, but Marx was perceptive in his diagnosis—for all of the material plenty that industry supplied to some, capitalism depends on exploitation, is defined by inequity, and requires alienation. The problem isn’t the ticky tacky houses; the problem is that people in McMansions have convinced those in the little boxes that their enemies are the people in public housing (and that government assistance is nefarious socialism). Engels and Marx used an occult rhetoric of wizards, specters, and hauntings, and it’s apropos, for capitalism itself is a religion. “Under capitalism,” writes Eugene McCarraher in The Enchantments of Mammon: How Capitalism Became the Religion of Modernity, “money occupies the ontological throne from which God has been evicted.” If our religion is capitalism, then our theology is consumerism and our God is the Invisible Hand. Our prayers are “Have It Your Way,” “Think Different,” and “Just Do It;” our avatars are Ronald McDonald, Mr. Peanut, and the Kool-Aid Man; our relics are the Golden Arches, the Mercedes trinity, and the Pepsi Tao. Our liturgy, that’s advertising. It’s produced some great and beautiful art. I would argue that all of it—the television commercials and the print advertisements, the marketing campaigns and the logo designs—constitutes the United States’ artistic patrimony; that our great literature is the jingle, the copy, the billboard, the TV spot. It’s true that capitalism exploits humans—you’ll get no disagreement from me on that. Furthermore, as we peer down on our remaining decades and realize it was industry itself that took us to the Anthropocene’s sweltering apocalypticism, Marx suddenly sounds Panglossian.

Still, I can appreciate Super Bowl ads, I can enjoy TGIFridays, I can prostrate myself before capital’s liturgy even with my impious heart. You need not be Catholic to be moved by Dante, so why can’t three minutes about Budweiser and Clydesdale horses move me? “Endure, and keep yourselves for days of happiness,” wrote Virgil in The Aeneid, all in the service of Caesar Augustus, an authoritarian dictator; Donatello’s bronze “David” is a moving evocation of the body’s perfection produced for the Florentine Medicis, and Dmitri Shostakovich’s tonalities are immaculate, albeit composed for Joseph Stalin. When the tyrants are dead, maybe it’s easier to appreciate beauty, but soon enough the ice caps will drown billions of us, so why not enjoy our equivalent artists and their preferred medium now? James Walter Thompson, who filched the Rock of Gibraltar for the Prudential Insurance Company in 1890; Doyle Dane Bernbach and their lemonish Volkswagen; Ogilvy and Mather with contracts for Schweppes, Guinness, Rolls-Royce, Sears, Dove, and so on. The little narratives constructed by these (mostly) men, tiny portraits and miniature novels, weren’t created just to sell people things, for as Jackson Lears writes in Fables of Abundance: A Cultural History of Advertising in America, “they also signify a certain vision of the good life; they validate a way of being in the world. They focus private fantasy.” Wherever people are hungry they’ll purchase food, wherever they’re thirsty they’ll buy drink, but commercials sell you an entire worldview. Every culture has myths, ordering stories of reality. In Athens, the good life depended on reason; in Jerusalem, it meant committing yourself to faith; on Madison Avenue, it means living for consumption. We don’t have Hesiod’s Theogony or the Torah; our scripture is a 30-second spot. Our myth tells you that you are incomplete, disordered, and unhappy, but that the solution involves the accumulation of things, beautiful things, tasty things, sexy things, amazing things, and that through such commodities you become perfectible, as surely as an ancient Greek making offerings at Delphi ensured his favor with the Olympians, as surely as a medieval penitent paying an indulgence ensured release from Purgatory. Does any of it work? Well, in the immediate sense, paying the indulgence makes you feel better too. But look, the churches are defunct and our faith is dying as our shopping malls are boarded up, our prayers as unanswered as the next shipping delay. Still, as the Sibylline Oracle at the Mall of America says,
The heartbeat of America is open happiness, when a diamond is forever in the happiest place on earth.
Because you’re in good hands, so don’t leave home without it.
We bring good things to life, and go the extra mile. The power of dreams is the relentless pursuit of perfection, good to the last drop.
Eat fresh, expect more, and pay less—anytime, anywhere. Because you’re worth it.
Can you hear me now?

Critique my little cento, but whatever way you arrange it, some version of this lyric will be a more enduring work than anything by T.S. Eliot or Ezra Pound. If you can only feel the sublime in a cathedral, I pity you, because the numinous can be smuggled into these commercial prayers, however empty their promises. Virgil, Donatello, and Shostakovich all exploited emotions, and they were servants of nefarious masters as well, and yet only a fool would think that The Aeneid, “David,” and “Symphony No. 5 in D minor, Op. 47” don’t intimate the shores of eternity, the breath of transcendence. Materialism in its most raw and literal form has little to do with it. “It isn’t the whiskey they choose,” wrote David Ogilvy in Ogilvy on Advertising, “it’s the image.”

Like a wounded gladiator, Pittsburgh Steelers defensive tackle “Mean” Joe Greene limps back to the Three Rivers Stadium locker room after a bruising first two quarters. A tow-headed little boy follows the football player and offers to help his hero, but the famously gruff Greene declines. Then the child offers him his Coke, and again he’s turned down. True to the rule of three, Greene finally accepts the offering of sugar water and downs the Coca-Cola as the boy turns to leave. Before the child can return to the stands, “Mean” Joe says with a smile, “Hey kid, catch!” and throws his jersey to the boy. The Hero’s Journey as envisioned by McCann Erickson in 1979. A 2020 neurological study demonstrated that 90 percent of NFL players have suffered chronic traumatic encephalopathy from injuries sustained on the field. Leo Burnett had similar masculine ambitions when tasked with reorienting Marlboro Cigarettes towards the men’s market in 1954. Across a blasted, rugged, western terrain, all otherworldly plateaus and the burnt ochre sun of dusk, rides a cowboy. The visuals are John Ford, the music is from The Magnificent Seven, and the most iconic of the “Marlboro Men” was Darrell Winfield, who played the role for 20 years after working as an Oklahoma rancher. Marlboro sold a fantasy, that of the homesteader, the bootstrapper, the stern and taciturn settler kept company by his shadow. This isn’t a place—Marlboro Country is everywhere. Two years after the character’s introduction, Marlboro’s profits increased 300 percent, from $5 billion to $20 billion. Five of the men who played the Marlboro Man died from lung cancer.


Calvin Klein’s in-house ad agency borrowed Western accoutrement in a 1981 television ad. Brooke Shields whistles “My Darling Clementine,” lying odalisque in jeans and cowboy boots, wearing a pewter belt buckle and a slightly open red blouse. “You know what comes between me and my Calvins?” Shields asks. “Nothing.” If the point wasn’t already clear, Tom Reichert writes in The Erotic History of Advertising that it was an “unmistakable double entendre when framed with a camera shot that took thirteen seconds to slowly move along the length of her inseam before including her face.” Shields was 15. In a 2021 Vogue interview, she recalled “I was a kid, and where I was, I was naïve.” Three years later, Steve Hayden, Lee Clow, and Brent Thomas would take advantage of the slightly warming Cold War and the ominous connotations of the year “1984” in their famed Ridley Scott-directed Super Bowl spot for Apple Computer’s new Macintosh. A group of androgynous, grey-hued drones shuffles in lockstep into an industrial hangar where they watch an address by an obvious Big Brother stand-in delivered on a massive bluish telescreen. “We have created, for the first time in all history, a garden of pure ideology,” says the speaker, “where each worker may bloom, secure from the pests purveying contradictory thoughts.” But then a solitary rebel emerges, a blonde woman in red running shorts and white tank top who seems to have escaped from an aerobics studio, sprinting through the grimy and steamy hangar, pursued by riot police, and in the last moments of the ad she flings a sledgehammer at Big Brother’s screen, which explodes. Whether this incarnation of ’80s material excess was targeting Soviet communism or IBM is ambiguous, but a voiceover informs us that “On January 24th, Apple Computer will introduce Macintosh. And you’ll see why 1984 won’t be like 1984.” It aired only once, during Super Bowl XVIII. The estate of George Orwell threatened Apple Computer with legal action.

The Macintosh ad illustrates the brilliant vampiric logic of capitalism, for a totalitarian must continually dominate those whom he oppresses, but the capitalist insidiously convinces you that he’s your friend. By outsourcing tyranny to the individual, everything is much more seamless. Capitalism privatizes totalitarianism, which on the whole is much more effective. In a review of the ad that ran in Harper’s for its 30th anniversary, Rebecca Solnit outlines how Silicon Valley has been instrumental in coarsening the discourse, increasing the gap between the wealthiest and everybody else, and, ironically, manufacturing its products in Chinese factories that evoke the dreary setting of the commercial, before concluding that “If you think a crowd of people staring at one screen is bad, wait until you have created a world in which billions of people stare at their own screens while walking, driving, eating in the company of friends—all of them eternally elsewhere.” If resistance took only flinging a hammer at a screen (where’s the sickle?), fighting authoritarianism would be so much easier, but the genius of capitalism is that any rebellion can instantly be integrated into the status quo and used to sell jeans, computers, and beer. Like a virus, capitalism just mutates to overcome the vaccine. Thomas Frank writes in Commodify Your Dissent: Salvos from the New Gilded Age, an anthology coedited with Matt Weiland, that the counterculture’s “frenzied ecstasies have long since become an official aesthetic of consumer society, a monomyth of mass as well as adversarial culture… Corporate America is not an oppressor but a sponsor of fun, provider of lifestyle accoutrements, facilitator of carnival.”

Well, true. Still, I hope that filching the subversive to sell soap has the unintended consequence of injecting resistance into mass culture, that if we can hear the quiet chords of redemption in Virgil and Shostakovich, we can also see rebellion in a Macintosh ad, even if the intent was duplicitous. Few ads are more cynical than McCann Erickson’s 1971 Coca-Cola Hilltop ad, in which dozens of vaguely countercultural-looking young women and men sing a paean to the glories of pop in an Italian field with glassy-eyed Peoples Temple intensity. “I’d like to buy the world a home/And furnish it with love/Grow apple trees and honey bees/And snow white turtle doves,” they sing in perfect harmony. “I’d like to buy the world a Coke/And keep it company/That’s the real thing.” Obviously this millennium of fraternity and fizzy water deserves scorn, and yet dialectically it does contain a kernel of resistance against its own best interests, this evocation of a utopian moment, this depiction of a better world, even if you’ve got to have a Coke at the same time. Media theorist Marshall McLuhan claimed in The Mechanical Bride that “To get inside in order to manipulate, exploit… To keep everybody in the helpless state engendered by prolonged mental rutting is the effect of many ads,” but 25 years later, in 1976, he’d admit in Advertising Age that his subject was the “greatest art form of the twentieth century.” Again, both of these things can be true. Not for nothing did Marx think that capitalism was the most revolutionary movement up until that point, and consumerism does unify people in a type of cracked democracy. Andy Warhol, our greatest theorist of commercial semiotics, wrote in The Philosophy of Andy Warhol that a “Coke is a Coke and no amount of money can get you a better Coke than the one the bum on the corner is drinking. All the Cokes are the same and all the Cokes are good.” It can be true both that capitalism is an exploitative system and that Cokes are good.
Matthew Weiner’s Mad Men—which introduced many of us to the history of advertising—features Hilltop in a crucial scene, with the implication that the show’s alcoholic, philandering anti-hero Don Draper was responsible, inspired to appropriate the hippie aesthetic after a California Esalen retreat. Draper is a Luciferian figure, as beguiling as he is cankered, and despite his worst intentions, sympathetic. What makes him fascinating isn’t that he’s a monster, but that he’s human. Mad Men’s best monologue, or at least its most memorable, comes in the first season’s finale, when Draper gives a presentation to Kodak executives about a campaign for their new slide projector. Loading up happy pictures of his own troubled family, Draper intones that nostalgia is a “twinge in your heart, far more powerful than memory alone. This device isn’t a space ship, it’s a time machine. It goes backwards, forwards. It takes us to a place where we ache to go again. It’s not called a wheel, it’s called a carousel. It lets us travel the way a child travels… to a place where we are loved.” Sometimes Draper is understood as a sociopath, but that’s incorrect—he has a surfeit of empathy. If he didn’t, such a presentation wouldn’t be possible. Part of what fascinates about ad men is that theirs is such a succinct and obvious way for writers to sell out; in the commodification of creativity we see both warning and pride. Draper is the suit who reads Frank O’Hara’s Meditations in an Emergency, a cerebral soul who is an embodiment of the axiom that ad men are the unacknowledged legislators of the world. That’s partially why so many ad men wrote novels and why Madison Avenue became a subject for serious postwar literature, the dejected copywriter recast as an existentialist hero. There’s Frederic Wakeman’s misanthropic The Hucksters and Jack Dillon’s The Advertising Man, but nothing is more associated with this sub-genre than Sloan Wilson’s The Man in the Gray Flannel Suit. Wilson’s protagonist Tom Rath is a Manhattan public relations consultant, overworked and jaded, who says “I’ll write copy telling people to eat more cornflakes and smoke more and more cigarettes and buy more refrigerators and automobiles, until they explode with happiness,” for he is “not a cheat, exactly, not really a liar, just a man who’ll say anything for pay.”

Ad men completely reshaped the mental topography during these years. Louis Menand writes in The Free World: Art and Thought in the Cold War about how the postwar world was dominated by “commercial and entertainment culture: movies and television, newspaper and magazine photographs, advertisements, signage, and labeling and packaging.” This was the silver age of mental coercion (ours is the golden), when Soviet writers like Mikhail Sholokhov and Nikolai Ostrovsky were used to produce official literature that extolled collectivization and the command economy, where in the latter’s novel How the Steel Was Tempered a character could shout “all my life, all my strength were given to the finest cause in all the world—the fight for the Liberation of Mankind!” Capitalist propaganda is far more subtle; instead we have “those Golden Grahams/Graham cracker tasting cereal/That taste is such a treat!” I’ve no clue who wrote that particular jingle, but Madison Avenue has always had an outsize concentration of literary ambition. Who among you knew that F. Scott Fitzgerald, Salman Rushdie, and Don DeLillo all worked as copywriters? Rushdie may have penned The Satanic Verses, but he also wrote “Naughty! But Nice” for Fresh Cream Cakes while working at Ogilvy and Mather; Fitzgerald wrote The Great Gatsby with its description of “such beautiful shirts,” that rain of blue, and green, and yellow that Jay sends down before Daisy, but early in his career as a copywriter his line “We Keep You Clean in Muscatine” was emblazoned on laundry trucks throughout that Iowa city.

There’s no simple correspondence here, no one-to-one symbiosis, but the experience of DeLillo at Ogilvy and Mather must have informed his writing. In DeLillo’s White Noise, Jack Gladney, professor of Hitler Studies at a small liberal arts college, exists, like all of us, in the medium of commercials. Thomas DePietro records the author as saying in Conversations with Don DeLillo that America’s central commandment is “consume or die,” and that’s on display in the novel. Commodity fetishism is the contrition through which the capitalist soul is formed, where one “found new aspects of myself, located a person I’d forgotten existed. Brightness settled around me… I was bigger than these sums. These sums poured off my skin like so much rain. These sums in fact came back to me in the form of existential credit.” In White Noise, Jack mumbles the prayers of our faith—”Mastercard, Visa, American Express.” Good copywriter that DeLillo must have been, White Noise expresses a truth of advertising—all of this purchasing isn’t about stuff, it’s about identity. Before the omnipresence of consumer culture, if you needed to plow—you bought a plow. If you needed to shovel—you bought a shovel. But as the sacrament of Jack’s purchasing demonstrates, the simulacra of reality that is late capitalism asks you to buy (and sell) your soul. White Noise is an example of the advertising turn in literature, where a character’s personality is signaled through the products that they buy. Victorian novels let you understand characters through phrenology, the slope of a brow signaling criminality or the distance between eyes demonstrating intelligence, but in postmodernism it’s the brand of ice cream somebody eats or the type of car they drive.

If you didn’t already know that Patrick Bateman was a sociopathic serial killer in Bret Easton Ellis’s American Psycho, or that he imagines himself to be, then you at least understand that he’s a conceited prick with his “six-button wool and silk suit by Ermenegildo Zegna… cotton shirt with French cuffs by Ike Behar… Ralph Lauren silk tie and leather wing tips by Fratelli Rosetti,” along with all those Eddie Money cassettes. By comparison, a very different personality is conveyed in the brands named by Jeffrey Eugenides in The Marriage Plot, where “Sometimes Madeline made him tea. Instead of going for an herbal infusion from Celestial Seasonings, with a quotation from Lao Tzu on the package, Madeline was a Fortnum & Mason’s drinker, her favorite blend Earl Grey.” The vaguely New Age-y affectations of Celestial Seasonings with its sleepy bear on the box are rejected in favor of the stolid, slightly stodgy, sort-of-fussy Fortnum & Mason’s with the Royal Seal on its packaging, so that Madeline isn’t some hippie, but rather a serious person, an Anglophile even (or at least that’s what she’s trying to convey; clearly she owns both brands). Not even poetry is so otherworldly as to ignore capitalism’s siren; Frederick Seidel has been writing about his luxury Italian motorcycles for decades, of Ducatis “all around, all red, all beautiful,/Ducatis as far as the eye can see,/Each small and perfect as a ladybug,” in lines published in 2019 in The London Review of Books. Clive James provides ingenious readings of modernist poetry’s relationship to advertising in Poetry Notebook: Reflections on the Intensity of Language, noting that “Theoretically [poets] have despised the land of Just Add Hot Water and Serve, but in practice they loved the slogans. Readymade cheap poetry, the scraps of advertising copy, properly mounted.” He enumerates examples from e.e. cummings, Eliot, and Philip Larkin, though he reserves special attention for the poet—and advertising executive—L.E. Sissman, who could write of how “The maître d’/Steers for my table, bringing, in his train,/Honor in Pucci, Guccis, and Sassoon.” Bateman with his business cards and Madeline with her tea, Seidel on a motorcycle and Sissman’s song of Pucci: all of these brand names tell us something.

But of course they do in real life as well; we interpret people’s consumer choices in our day-to-day interactions far more than we do in fiction, and what we look for are signs of ideological affiliation. As our politics become only more tribal, what we eat, what we wear, what we drive all become signifiers, readymade symbols that advertise our identity. Imagine somebody who drives a Ford pickup and enjoys a Coors with his Chick-fil-A, as compared to a woman who owns a Subaru with the radio tuned to NPR on her way to Trader Joe’s. You know exactly who these people are, or at least who they’re supposed to be. Often this has little to do with class in any traditional socio-economic sense, as “lifestyle usurped the more traditional class markers of income, and even education and occupation,” as Lizabeth Cohen explains in A Consumer’s Republic: The Politics of Mass Consumption in Postwar America. Cohen asks you to predict the different sorts of people who would buy a “Cadillac over a Chevrolet, a ranch house instead of a Cape Cod, The New Yorker over True Story magazine,” and you immediately understand her point. It speaks to something deterministic in the American psyche that the type of ice cream we buy predicts who we’ll vote for, though I offer no appraisal on this one way or the other, just the observation. And politics is only one vestige of this; obviously, consumer choices are instrumental in the formation of identity within and across races, genders, sexualities, and religions as well. We shop, therefore we are.
It becomes impossible to imagine anything different, what Mark Fisher describes in Capitalist Realism as the process by which the market “subsumes and consumes all of previous history.” Marxists use the term “late capitalism” as an optimistic shorthand for the moment when the internal contradictions usher in the millennium of socialism. While I think that we’re definitely in capitalism’s end-stage, I’m not quite as sanguine, because I suspect that what the contradictions of the system will generate is nothing. As with anything consumed without respite, you eventually run out, and history is no exception. How will we define ourselves when the final bill comes due, when the eternal credit card is maxed out, especially since we’re incapable of imagining anything other than capitalism? In the aforementioned Dawn of the Dead, the survivors of the zombie apocalypse hole up in the Monroeville Mall, where to get through to the other side of consumerism you must yourself become consumerism. With undead cannibals smashing their gory faces against the automatic doors and marauding through the flat asphalt lot, inside we’d raid Foot Locker and Dick’s, we’d engorge ourselves at Cheesecake Factory and Red Robin, and we’d wait for the zombies to consume us all. “What haunts me is not exactly the absence of literal space so much as a deep craving for metaphorical space,” writes Naomi Klein in No Logo: Taking Aim at the Brand Bullies, “release, escape, some kind of open-ended freedom.” As Klein describes it, advertising and design mark everything in our reality, and we’re so constricted we can’t even imagine what wild open space would look like. For all that consumerism has promised us—comfort, security, identity—it was always the assurance that we could keep on purchasing our freedom that was the biggest illusion. Now the shipments are on back order and the shelves are empty, but for the time being you can still have whatever it is you want delivered right to your front door, never mind that the driver can never stop working. What happens after collapse, when we can no longer define ourselves through products? No clue—the burden of defining some better world falls to those who remain after the rest of us are gone. In the meantime, have a Coke.
Bonus Links: “A Brief Late-Stage Capitalism Reading List,” “When Capitalism and Christianity Collide in Fiction”

Image Credit: Free SVG

Letter from the Collapse


“The crisis consists precisely in the fact that the old is dying and the new cannot be born; in this interregnum a great variety of monsters are born.” —Antonio Gramsci, Prison Notebooks (1930)

“Twenty-thousand years of this, seven more to go… The quiet comprehending of the ending of it all.”—Bo Burnham, Inside (2021)

In our corner of Northern Virginia, we were fortunate to never see the dead birds. Yet throughout the Mid-Atlantic—a cardinal on the pebbly beaches of Delmarva or a sparrow on the Jersey Shore, a finch like an omen in front of Independence Hall or a bluebird as a threat on the steps of the Capitol—the creatures started to die by the thousands. With little sense of this new plague, experts recommended the removal of bird feeders. And so I dutifully took down the tall model where I examined mourning doves over morning coffee, listened to woodpeckers on the birches, and watched the hawks who flew above and the sleek, elegant crows speaking in their own impenetrable tongue. The Allegheny Front, an environmental show on Pittsburgh’s WYEP, posted a photograph of an afflicted robin found in Erie, Penn. Laid out in a cardboard box decorated with spruce leaves, the otherwise pristine creature looked as if she were sleeping, the only sign of her illness the thick crust on her sealed eyes, an effect not unlike the wisps of cotton that escape from underneath the lids of taxidermied birds. “The phenomenon has since spread through 10 states,” writes Andy Kubis at The Allegheny Front, “including West Virginia, Ohio, Maryland and Delaware, and in 61 of 67 Pennsylvania counties.” Observers noted neurological symptoms, birds unable to fly or crashing into the ground; the dead animals were found framed by the brittle, yellow grass of sweltering June, with the characteristic discharge from eyes and beaks.

Ornithologists proffered hypotheses, noting that the avian pandemic accompanied the emergence of cicada Brood X. Those creatures we couldn’t avoid seeing, skeletal eldritch horrors bursting from the earth and their own bodies: red-eyed grotesqueries whose incessant droning permeated the humid air for weeks, who dropped from branches and through car windows like something out of a horror film. Between the dead birds and the cicadas, the summer had a Pharaonic gleam, intimations of Exodus. There was a surreal poetry to these chthonic beings, the existential crisis of their lives spent hibernating for 17 years, only to emerge and then die. “Happy the Cicadas live,” wrote Charles Darwin in Descent of Man, and Selection in Relation to Sex, though quoting Xenarchus. Our dog took to biting them in half and pushing them between the slots of our deck’s wooden planks, casting them back to hell. By the time they disappeared, without even bothering to say goodbye, I’ll confess that we missed them. But in their brittle, green bodies there was an answer to the bird pandemic, for it seemed that people had attempted to poison the cicadas, and after ingesting their pesticide-corrupted corpses the birds were killed instead. The “sense of cosmic significance is mostly unique to the human relationship with birds,” writes Boria Sax in Avian Illuminations: A Cultural History of Birds, but not apparently to those squicked out by some bugs, the same people who undoubtedly water their lawn during a drought, or who buy the last 10 chickens during the coming food shortages. Trillions of cicadas emerged; avoiding them was an impossibility, and you only had to bear them for a short while, yet people unable to reason that there is no eliminating something of that magnitude, and too impatient to wait, decided that they knew better. Is there a more perfect encapsulation of the American mindset in these dwindling days?

I’d be amazed if you couldn’t sense it—the coming end of things. A woman sits by her grandmother in a St. Louis, Mo., ICU, the older woman about to be intubated because Covid has destroyed her lungs, but until a day before she insisted that the disease wasn’t real. In Kenosha, Wisc., a young man discovers that even after murdering two men a jury will say that homicide is justified, as long as it’s against those whose politics the judge doesn’t like. Similar young men take note. Somebody’s estranged father drives to Dallas, where he waits outside of Dealey Plaza alongside hundreds of others, expecting the emergence of JFK Jr., who he believes is coming to crown the man who lost the last presidential election. Somewhere in a Menlo Park recording studio, a dead-eyed programmer with a haircut that he thinks makes him look like Caesar Augustus stares unblinkingly into a camera and announces that his Internet services will be subsumed under one meta-platform, trying to convince an exhausted, anxious, and depressed public of the piquant joys of virtual sunshine and virtual wind. At an Atlanta supermarket, a cashier making minimum wage politely asks a customer to wear a mask per the store’s policy; the customer leaves and returns with a gun, shooting her. She later dies. The rural mail carrier who has driven down the winding, unnamed roads of a northwestern Oregon hamlet for over three decades notes to herself how the explosion of annoying insects on her windshield seemed entirely absent this summer. A trucker who lives in Ohio blows his air brake line, and when trying to get a replacement finds that it’s on backorder indefinitely. Walking across Boston Common this October, two men holding hands and heading toward the duck boats realize that they’re both sweating under their matching pea coats. It’s 83 degrees. On the first day of July, my family huddles in our basement; a tornado has formed in the District of Columbia and is rapidly moving across the National Mall.

Everyone’s favorite Slovenian Marxist Slavoj Žižek snottily gurgled it a decade ago, writing in Living in the End Times that the “global capitalist system is approaching an apocalyptic zero-point,” and identifying four horsemen in the form of environmental collapse, biogenetics, systemic contradictions, and the “explosive growth of social divisions and exclusions.” Not everyone claims to see the gathering storm, however, especially those who are most responsible, though if they do, they’re silent about it in their New Zealand compounds. Degenerate, chipper faux-optimism is a grift during our epoch of dusk: Jeff Bezos expecting us to clap when he shoots Captain Kirk into space; Elon Musk mouth-breathing about cryptocurrency and terraforming the rusty soil of Mars, as if we haven’t already heated one planet too much; Peter Thiel promising us that there will be a digital heaven where all of the billionaires can download their consciousness unshackled from the material world, and we can serve alongside them as Egyptian slaves entombed with their masters, clicking on PayPal and Amazon and Facebook for a silicon eternity. Such promises are the opposite of hope; they’re only grinning assurances of dystopia instead of apocalypse. Besides, such things are chimerical; ask not for whom the Antarctic ice shelf collapses, or for whom the ocean acidifies, or for whom the temperature rises by 3 degrees Celsius; it does all these things for Bezos, Musk, and Thiel as much as for you and me. Ours is the age of Covid and QAnon, supply chain breakdown and surveillance capitalism, food shortages and armed militias, climate change and bio-collapse. We’re merely in a milquetoast interregnum as we wait for monsters to be born in a year, in three. If poets and prophets have traditionally been our Cassandras, then on some level everybody knows that a rough beast is slouching towards Bethlehem right now, though despite that one sees perilously little grace, kindness, and empathy. Even those who believe whatever conspiracy theory happens to give them scant meaning intuit that the insects are disappearing, the waters are rising, and the absence of 700,000 lives means that something is amiss.

“The world sinks into ruin,” wrote St. Jerome in 413, some six decades and change before the date that conventionally marks the Western empire’s fall. “The renowned city, the capital of the Roman Empire, is swallowed up in one tremendous fire,” he noted of the Visigoth Alaric’s siege. Hard not to imagine that some realized the end was coming: shortages of pungent garum made in Mauretania, a scarcity of Cappadocian lettuce and Pontic fish. In 410, the Emperor Honorius recalled all legions from Britannia to defend the eternal city from the Visigoths who would soon traipse through its burning streets. Envision that horde, ascending the marble steps of the Senate, in furs and horned helmets, brandishing their red standard and crowding through the halls of that once august and solemn space. Can you even countenance it? The Romanized Celts requested from the emperor the return of defensive legions, and in his rescript Honorius “wrote letters to the cities in Britain urging them to be on their [own] guard.” The United States Postal Service will be late in delivering packages, because of supply chain shortages there is no chicken available at the Stop & Shop, the power grid will be down this winter in Texas. You’re on your own. As civil society crumbled, Romans turned to all variety of superstitions and occultisms, cults and conspiracies. As Edward Gibbon noted in The History of the Decline and Fall of the Roman Empire, the “zeal of fanaticism prevailed over the cold and feeble efforts of policy.” Stop the steal! Lock her up! Make America GREAT again! Living on a heating planet filled with dying animals and governed by either the inept or the insane, it’s hard not to feel a bit strange going to work, buying groceries, saving your salary, as if everything were normal. “We live as though we are going to die tomorrow,” wrote Jerome, “yet we build as though we are going to live always,” or, as David Byrne sang, “Why stay in college? Why go to night school?… I ain’t got time for that now.”

Whenever comparisons are made between Rome and America, there’s always somebody who denounces such language as not just histrionic, but clichéd. The latter is certainly fair; ever since the founders obsessed over republican virtue we’ve imagined that the Potomac is the Tiber, and we’ve parsed (arch-royalist) Gibbon’s history for clues about our falling. Copies of Plutarch and Livy were brought to the Continental Congress, and the most popular colonial American play was a turgid script by Joseph Addison about Cato (it would be performed at Valley Forge). The young Republic declared itself to be a “Novus ordo seclorum,” a “New Order of the Ages,” in conspicuous Latin adapted from Virgil’s Eclogues, while the essays of the ratification debates were written under pen-names like Caesar, Brutus, and Publius, and John Adams attributed his worldview to Cicero. Roman symbolism was everywhere, as in the fasces that would adorn the Senate located on Capitol Hill. When George Washington declined to hold a third term, he was compared to the noble dictator Cincinnatus who dropped his sword for a plow, which was enough virtue that by 1840, four decades after the first president’s death, the sculptor Horatio Greenough rendered the general as a muscular Jupiter in a toga. By the final year of the Civil War, the first president was depicted underneath the Capitol dome as a purple-robed Roman god in “The Apotheosis of Washington”. The Lincoln Memorial, the Supreme Court, the Capitol, all of it neo-classical ridiculousness. Gore Vidal recalled in United States: Essays 1952-1992 that his grandfather, Sen. Thomas Gore of Oklahoma, remarked to Franklin Delano Roosevelt about the bloated buildings of Washington that “At least they will make wonderful ruins.”

Vidal, that classical patrician, wrote that “Empires are dangerous possessions… Since I recall pre-imperial Washington, I am a bit of an old Republican in the Ciceronian mode, given to decrying the corruption of the simpler, saner city of my youth.” Hardly a postbellum pose, for critics have feared that the Republic would slide into an Empire since before the Constitution’s ink was dry. Naturally there is also fear of collapse, and long has there been foreboding about the decline and fall of the American Empire. On the top floor of the austere New-York Historical Society hangs a pentad of paintings by the unjustly forgotten landscape artist Thomas Cole, a series known as “The Course of Empire.” Rendered between 1833 and 1836, the series reflects Cole’s disturbance at both the vulgarity of Jacksonian Democracy and the brutality of Manifest Destiny. A member of the Hudson River School who reveled in the sheer grandiosity of the nation’s natural spaces, Cole imagines in “The Course of Empire” a fantastical country from its primitive state of nature, through an idealized agrarian state, into a decadent imperium, an apocalyptic collapse, and finally desolation. Overlooking each painting is the same mountain peak, roughly the shape of Gibraltar’s rock, the one consistency as Cole’s civilization follows the course of its evolution, a reminder that nature was here before and, despite how we may degrade it, will still be here afterwards. The penultimate landscape, entitled simply “Destruction,” presents the denouement of this fantastic city, a skyline of columned, porticoed, and domed classical buildings in flames, billowing smoke partially obscuring that reliable mountain; vandals flooding the streets, murdering and raping the city’s citizens, pushing them into the mighty river that bisects it. A triumphant monumental statue is now decapitated. With its wide marble buildings and its memorials, Cole’s city resembles nothing so much as Washington D.C., though when he lived the capital was more provincial backwater than the neoclassical stage set it would become. Cole made a note that “the decline of nations is generally more rapid than their rise,” concluding that “Description of this picture is perhaps needless; carnage and destruction are its elements.”

Enthusiasm for such parallels, along with attendant breathless warnings (including the ones that I’m making), has hardly abated. In just the past decade, there have been articles entitled “8 striking parallels between the U.S. and the Roman Empire” by Steven Strauss in 2012 at Salon, Pascal-Emmanuel Gobry’s “America now looks like Rome before the fall of the Republic” from 2016 in The Week, Tim Elliot’s 2020 piece at Politico entitled “America is Eerily Retracing Rome’s Steps to a Fall. Will It Turn Around Before It’s Too Late?,” Vox’s essay from that same year “What America Can Learn from the Fall of the Roman Republic” by Sean Illing, and Cullen Murphy’s succinct “No, Really, are we Rome?” from The Atlantic of this year. Just to dissuade those who parse such things, Tom Holland wrote “America Is Not Rome. It Just Thinks It Is” for The New York Review of Books in 2019. With an article that reprints Cole’s painting underneath the headline, a pull-quote reads “There is nothing written into the DNA of a superpower that says that it must inevitably decline and fall.” Well, with all due respect, the second law of thermodynamics mandates that everything has to fall apart, but Holland’s point is taken that in a more immediate sense, comparisons of America to Rome tell us little about the latter and everything about the former. But for those who see the comparison as tortured beyond all reasonableness, the truth can be bluntly stated as follows: our current problems aren’t like the fall of Rome because they’re far, far worse. Would that we only faced the collapse of the U.S. government, or authoritarianism, or even civil war, because the rising average yearly temperature, the pH of the oceans, and the biosphere’s decreasing diversity are things unheard of on the Earth since the Permian-Triassic extinction of more than 250 million years ago, when 70 percent of life on land perished and almost 95 percent of life in the seas did.

“It is worse, much worse, than you think,” writes David Wallace-Wells in The Uninhabitable Earth: Life After Warming. Wallace-Wells describes the five previous mass extinctions that shaped evolution, explaining that four of these “involved climate change produced by greenhouse gas.” Before the Permian-Triassic extinction, the land was occupied by the sail-backed dimetrodon and the hog-shaped Lystrosaurus, the abundant atmospheric oxygen supported massive dragonflies and centipedes, and the oceans were plentiful with mollusks and trilobites. For some still unexplained reason the amount of carbon dioxide rapidly increased, which in turn triggered the release of methane, so that this feedback loop “ended with all but a sliver of life on Earth dead,” as Wallace-Wells writes. “We are currently adding carbon to the atmosphere at a considerably faster rate; by most estimates, at least ten times faster,” he explains. If we didn’t know what caused that warming 250 million years ago, we know what’s doing it now—us. Should the worst-case scenario of the United Nations Intergovernmental Panel on Climate Change come to pass, then in the coming century the exponential increase in warming will result in an ice-free Arctic, the obliteration of the coastal cities where two-thirds of humans live (no more Venice and Amsterdam, New York and Miami), the mass destruction of farmland, continual massive wildfires that will make us look back fondly on the summer of 2021, never-ending hurricanes and tropical storms, heat waves, droughts, desertification, new pandemics, and at worst the acidification of the ocean and the resultant perishing of most things that live beneath the waves. Short of a social or political revolution to reorient the world away from the cannibalistic capitalism which has brought us to this moment, we’ll read Gibbon as halcyon (assuming anyone is around to read).

This summer I threw a little digital life buoy out into the whirlpool of Twitter, another one of those horsemen of dystopia, and asked others what it felt like to be living during what could be the apocalypse. Mostly I discovered that my anxiety is common, but one gentleman reminded me that there were Medieval millenarians and Great Awakening Millerites awaiting their messiahs who never came, and that they were all mistaken. That is, if you’ll forgive me, exceedingly stupid. There have been times when I was sure that I was going to die—the shaky prop plane flying low to the ground between Philly and the Lehigh Valley and the erratic driver going 20 miles over the speed limit who almost side-swiped me on a stretch of I-95 in Massachusetts—but just because I survived shouldn’t lead me to conclude that I’m immortal. Armageddon isn’t any different. My critic, though, seems to be in the minority—most people have that sense of foreboding, picking up whatever cries are coming from the Earth: the summers feel hotter, the animals scarcer, the sky sometimes glazed an ungodly glow from the redness of western fires. “The piers are pummeled by the waves;/In a lonely field the rain/Lashes an abandoned train,” wrote W.H. Auden in his 1947 poem “The Fall of Rome,” perhaps about his own justified fears regarding nuclear conflagration. I imagine the poet placing his wrinkled, droopy, hang-dog face to the ground and picking up on those frequencies that are today a cacophony, the “Private rites of magic” that now mark the fascists of one of our only two parties, how “an unimportant clerk/Writes I DO NOT LIKE MY WORK,” reminding me of the striking heroes who are leaving the degrading and barely remunerated labor of late capitalism, how the “Herds of reindeer move across/Miles and miles of golden moss” in a warm arctic, and my beloved “Little birds with scarlet legs… Eye each flu-infected city.”

From the Greek, “apocalypse” means to “uncover” hidden knowledge, so for those of us anticipating what the future holds, it’s been the apocalypse for a while. What are you to do with this knowledge? Our politics operate on inertia and project onto individuals a responsibility that was always vested in the powerful themselves. Perhaps you should ditch your car, turn off your air conditioning, recycle, give up meat, and begin composting, but do that because those things are good for your soul, not because you’re under any illusions that “Not The End of the World” is a consumer choice. Be neither a defeatist nor, certainly, an accelerationist, for avoiding the boiling of the oceans and the burning of the air must be what we put our shoulder to the door for. “To hope is to give yourself to the future,” writes Rebecca Solnit in Hope in the Dark, “and that commitment to the future is what makes the present inhabitable.” Waiting for transformation like it’s the messiah isn’t preferable to collectively willing that transformation, but I know not what that will look like because I’m not a professional revolutionary. The signs appearing in the windows of McDonald’s and Subway, Starbucks and Chipotle, from workers tired of being mistreated and underpaid, are the largest labor rebellion in a generation, the totally organic Great Resignation spoken of everywhere and reported on nowhere—and they give me hope. They give me hope because that dark faith, the capitalism that has spoiled the planet, isn’t inviolate; a confirmation of Ursula K. Le Guin’s promise that “We live in capitalism. Its power seems inescapable; so did the divine right of kings.” A corollary is the welcome mocking of fools like Bezos, Musk, and Thiel. Just the widespread awareness of our situation is promising, not because I valorize despair, but because maybe a billion little apocalypses will somehow stave off the big Apocalypse. The whole of the law is treat others as you would wish to be treated and don’t cross a picket line; the rest is all theory. Now, go, and study.

Finally, I’m only a writer, and the most recondite type, an essayist. Could there be any role for something so insular at the end of the world? In The Guardian, novelist Ben Okri recommends “creative existentialism,” which he claims is the “creativity at the end of time.” He argues that every line we enjamb, every phrase we turn, every narrative we further “should be directed to the immediate end of drawing attention to the dire position we are in as a species.” I understand climate change as doing something similar to what Dr. Johnson said the hangman’s noose did for focusing the mind. It’s not words that I’m worried about wasting, but experiences. What’s needed is an aesthetic imperative that we somehow live in each moment as if it’s eternal and also as if it’s our last. Our ethical imperative is similar: to do everything as if it might save the world, even if it’s unlikely that it will. Tending one’s own garden need not be selfish, though if everyone does so, well, that’s something then, right? I’m counting the liturgy of small blessings, noting the cold breeze on a December morning, the crunch of brown and red and orange leaves under foot, the sound of rain hitting my office window, the laughter of my son and the chirping of those birds at the feeder who delight him. I’ve no strategy save for love. “The world begins at a kitchen table,” writes Poet Laureate Joy Harjo, in a lyric that was introduced to me by a Nick Ripatrazone essay. “No matter what, we must eat to live.” Harjo enumerates all of the quiet domestic beauties of life, how the “gifts of earth are brought and prepared” here, and “children are given instructions on what it means to be human” while sitting at this table, where “we sing with joy, with sorrow. We pray of suffering and/remorse. We give thanks./Perhaps the world will end at the kitchen table, while we are laughing and/crying, eating of the last sweet bite.” That, finally, is the only ethic I know of as the oceans flood and the fires burn, to be aware of our existence at the kitchen table. When the cicadas come back in 17 years, I wonder what the world will be like for them. I hope that there will be bird song.

Image Credit: Wikipedia

A Year in Reading: Ed Simon


Three years ago, the Bundeswehr initiated an unlikely experimental program at the University of Tübingen. “Project Cassandra,” which is exactly the codename you would want a secretive military program to have, was led by an enigmatic professor named Jürgen Wertheimer, which is exactly the name you would want him to have. They developed a program capable of sifting through metadata and applying an algorithm to ascertain where future conflict would occur. They were unusually successful, foreseeing turmoil in Algeria, Kosovo, and Nigeria that political scientists had missed, all the more impressive because Wertheimer is a literature professor. Philip Oltermann explained in The Guardian that the Bundeswehr believes writers possess a “sensory talent” for identifying “social trends, moods and especially conflicts that politicians prefer to remain undiscussed until they break out into the open.” If writers hear subsonic vibrations just below the crust, then by reading an aggregate of them there might be a way to predict the future. “Writers represent reality in such a way that their readers can instantly visualize a world and recognize themselves inside it,” Wertheimer told Oltermann, after the former had traded in tweed for camo.

Well, that’s one alt-ac career path. Ignoring the rumors that the CIA and the NSA have long recruited translators at those dreary annual meetings of the MLA held in frigid Boston or Chicago, there is an enigmatic, furtive allure to Project Cassandra, not to mention a practicality, because Wertheimer’s central conceit is obviously correct. George Orwell predicted telescreens in Nineteen Eighty-Four, and now we willingly give our privacy away in the blackness of our Androids. Aldous Huxley claimed in Brave New World that our future would be anesthetized bliss, and now our dopamine rushes are supplied by pawing at the screens of those same Androids. Margaret Atwood wrote The Handmaid’s Tale, and now Texas. Oltermann mentions John Brunner’s 1968 Hugo Award-winning science fiction novel, Stand on Zanzibar, which envisions the 2010 ascendancy of the Chinese economy and the United States’ response as led by “President Obomi.” I’ve long suspected that literature provides intimations of where we’re headed, and though that wasn’t my purpose when I set out digesting novels this year (my purpose was just to read), by this November I felt like I had been listening to a chorus of Sibyls.

Syllabi remain my operative mode for comprehending reality. Making lists, dividing the year into units, divining some overall theme to things, whether teaching or planning my weekend, is how I exist. Just as an engineer looks at the universe and sees a computer, I examine my own life and see a college class. So, in January, when setting out to decide what I’d read this year, I made a syllabus of sorts, though I wouldn’t know the title of the class until the end of the term. Rather than just perusing my local library stacks, the unvaccinated version of me from last New Year used The Millions’ “Most Anticipated” lists for 2020 and 2021 and compiled a few dozen titles that sounded interesting. I’d inadvertently gathered what Wertheimer would consider a statistical sample set.

Everything in this essay came from that initial list; I don’t include any books that I already reviewed for sites, nor titles I consulted in my writing, nor the hundredth time I flipped through Paradise Lost. By the nature of this list, all of these books were newly published, though presumably most of them were written before the pandemic. Much to my own embarrassment, none of these titles was in translation, and the majority were by Americans with a few Brits thrown in. Blame my parochialism, but 12 months later I feel as if I’ve divined the unforged smithy of our national soul, for each of the novels provided a glimpse of living in the last days of empire, like the parable of the blind men describing an elephant, if this pachyderm were instead our rapidly fraying social contract. Our age is one of pandemic, supply chain breakdowns, economic collapse, and nascent fascism, and our writers have responded by crafting subverted Great American Novels, writing tomes of collapse, be it national, spiritual, or personal. Each book taxonomizes the passing of anything that even remotely looked like it could be described naively as the “American Dream.”

The title of Ayad Akhtar’s Homeland Elegies announces itself as being concerned exactly with the themes that the traditional Great American Novel dabbles in. “Homeland” with its connotations of the vaguely totalitarian federal agency that emerged in the wake of 9/11 and which often targeted Muslims, and “Elegies” with all of the grandiose and mournful implications of recognizing something that has passed. Narratively ambitious and sprawling, Homeland Elegies concerns a narrator named “Ayad Akhtar,” a Pakistani-American raised in Wisconsin and living in New York who bears more than a passing resemblance to the author whose name is on the cover. An acclaimed playwright before he was a novelist, Akhtar is often positioned as the Philip Roth of Islam, a fearless Muslim-American willing to portray his community in all of its complexities without desire to placate or whitewash, such as in his controversial, Pulitzer Prize-winning Disgraced.

Homeland Elegies follows his not-quite-identical roman a clef backwards and forwards from the present day of his professional success (around 2018) to Akhtar’s Midwestern childhood, while dropping in on events like 9/11, the 2008 economic collapse, and the election of Donald Trump as 45th president of these disunited states (indeed, Trump is a character in Homeland Elegies, by connection to the author’s cardiologist father). Hyphenated Americans have historically been slurred as somehow “less than” the nationality that appears on the right side of that dash, but Akhtar is an American prophet who understands that the nation is in free-fall. Neither memoir nor autofiction, Homeland Elegies is best described by its author as a curated social media feed, a place where truth and fiction mingle in that ever-chimerical invention of the self. At the core is the complicated relationship of father and son, and the book is both about immigration and assimilation, but more than that, it’s a condemnation of American materialism, excess, and the illusory promises of the city on a hill. “America had begun as a colony and that a colony it remained,” writes Akhtar, “a place still defined by its plunder, where enrichment was paramount and civil order always an afterthought.”

Andrea Lee imagines a luscious estate among the detritus of past empires in Red Island House, her sumptuous novel published after a 15-year hiatus. Philadelphia-born Lee has spent the bulk of her adult life in Italy, and that worldly cosmopolitanism is evident in these interconnected short stories that chattily explore the family and staff who live in the titular mansion. Centered on a massive rose-hued house in Madagascar overlooking the Indian Ocean, built by a Falstaffian Italian industrialist for his younger African American wife, Red Island House upturns expectations. In her author’s note, Lee writes that this is a “novel about foreigners in Madagascar; its viewpoints and its ‘voice’ are those of an outsider looking in,” and with shades of V.S. Naipaul and Ngũgĩ wa Thiong’o she proves as adept at describing the haunted beauty of Madagascar as she was at describing the Tuscan countryside in earlier works, a master stylist reinventing the post-colonial novel.

Shay Gilliam is an Oakland-born, upper-class Black professor of African-American literature at an Italian university who spends her off season on the windy, mango-grove shores of her husband’s pastoral idyll, a woman for whom Africa was a “near-mythical motherland” who discovers that the complexities of colonialism are often individual and that as a result our identities are always relative. Gilliam’s sometimes boorish husband, a working-class street kid made good, is “dizzied by the infinite possibilities offered by using first world money in a third world country, one of the poorest on earth.” A novel of current breezes and expats in white suits plying local girls with rum, of grilled fish on the beach and tourists on mopeds speeding past unimaginable poverty. Across 10 chapters and two decades, Red Island House shows the cankers in both paradise and marriage. Characters shift in and out, people are introduced only to disappear, and Gilliam’s perceptions always dance about true self-insight, even as it becomes clear that a similar complexion is all that unites her to this island’s inhabitants.

History similarly haunts Danielle Evans’s excellent short story collection The Office of Historical Corrections, which seamlessly moves from humor to poignancy. “Boys Go to Jupiter” details the social media fallout after a coed posts a picture on Instagram of herself in a Confederate flag bikini, a story that says more about so-called “cancel culture” than 100 editorials, while “Why Won’t Women Just Say What They Want” acts as both a parody of pretentious art culture and a meditation on the #MeToo movement and the ways that powerful men still escape culpability.

It’s the titular novella that’s the true standout, however, with “The Office of Historical Corrections” following a mystery investigated by Cassie and Genevieve, two often antagonistic childhood friends turned grad school adversaries turned agents in an invented federal agency named the Office of Public History. To call Evans’s story Kafkaesque is to ignore just how singular her style is, though she has a sense of the absurdity of bureaucracy, and is also aware of how history is defined by ghosts upon ghosts. Evans is also adept at sarcasm, and the title story with its federal agents printing out corrections to inaccurate historical markers is as strange and funny as anything written about the traumas of racism. “Besides the tablecloths, the décor is all old photographs and postcards that they scrounged up from wherever,” Cassie notes of a Midwestern hipster restaurant, “because you know how white people love their history right up until it’s true.” The Office of Historical Corrections is a parable for the era of Black Lives Matter and the rightful pulling down of Confederate statues, of Critical Race Theory hysteria and white grievance, a novella about passing and self-hatred, survival and violence, and how the American story can be funny except when it isn’t.

Christopher Beha’s The Index of Self-Destructive Acts is a doorstopper that like many recent titles (Homeland Elegies, Dario Diofebi’s Paradise, Nevada) is fundamentally a historical novel about the very recent past, in this case the year immediately following the election of Barack Obama. And like those other novels, The Index of Self-Destructive Acts combines a dizzying array of contemporary concerns—in this case punditry, finance, the publishing industry, the collapse of journalism, predictive algorithms, the Iraq War, and baseball—crafting an allegory of our present. Here the allegory concerns Sam Waxworth, a statistical wunderkind in the mold of Nate Silver who correctly predicts every single federal race in 2008, and Frank Doyle, a columnist for a newspaper clearly based on The New York Times who’d once been a Great Society-supporting liberal lion working in the John Lindsay administration but had since transmogrified into a reactionary ogre, scotch pickling him into a George W. Bush, Dick Cheney, and Donald Rumsfeld acolyte.

Sam has been hired to write copy for Interviewer, a publication that used to be The Atlantic but after it was purchased by a tech-bro was turned into Buzzfeed, and the young prodigy is tasked by his editor to interview Frank. The older columnist was a onetime childhood hero of Sam’s because of his baseball writings, but the statistician rejects Frank because he overly romanticizes the game. Baseball is a field of battle between Sam’s sabermetrics and Frank’s poetry, as the young upstart crow from flyover country “tried to attend to the facticity of things,” while his older sparring partner understands that “polls couldn’t capture a mood. For that you needed to look around a bit.” Like all true systems novels, from Charles Dickens’s Bleak House to David Foster Wallace’s Infinite Jest, there is a panoply of characters, namely Frank’s entire immediate family (Sam starts an affair with the columnist’s daughter), and a multitude of themes are explored, none so much as what it means to choose if everything can be reduced to mathematics. When an allegory refuses didacticism for negative capability, that’s when we call it a novel, and the strength of Beha’s endeavor is that it’s not clear who exactly is sympathetic or not in the contest between Sam’s unfeeling, analytical technocracy and Frank’s painfully wrong though still fundamentally emotional perspective on life.

An English sonnet has never been as sublime as the orange sun melting into the horizon over a minor league ballpark, faint chill of desert air rustling through the stands in the seventh inning before the final beer rush, odor of sodium-nitrate-saturated hot dogs and smoky peanuts hanging heavy in the air. If baseball is an undercurrent in The Index of Self-Destructive Acts, then it’s everything in Emily Nemens’s The Cactus League, interconnected short stories that are as charming as Bull Durham and as heartbreaking as Denis Johnson. A former editor of The Paris Review, Nemens follows the path of Bernard Malamud’s The Natural and Roth’s self-aware The Great American Novel, using baseball as the major metaphor of American life. Our national pastime, it has been supposed, brings the poet out in the accountant and the accountant out in the poet, but as any fan knows, the calling of strikes and outs has nothing really to do with a game and everything to do with everything else.

The Cactus League gives kaleidoscopic perspective to the fictitious Los Angeles Lions’ preseason and their star outfielder John Goodyear, who appears to be in the midst of a crackup of sorts. Nemens’s novel is set in greater Scottsdale, the Arizona desert a fading pink and the entire city a massive suburb of itself, all gated communities, preposterous grass lawns, the big box sprawl of Phoenix, and above it all Frank Lloyd Wright’s ethereal Taliesin West. In nine interlocking stories (get it?) Nemens follows a host of characters, from agent Herb Allison to local baseball groupie (and architecture enthusiast) Tamara Rowland to the aging batting coach Michael Taylor to Goodyear himself. The effect is sublimely dizzying as the narrative moves from one character to another, using a collage effect to underscore the consequence of baseball by bringing the players, the coaches, the wives, the reporters, and the fans to bear. Nemens’s chapters are stunningly rendered character portraits of figures who face increasingly dwindling days, like their sport and nation. “Here’s the thing about baseball, and all else: everything changes. Whether it’s the slow creep of glaciers dripping toward the sea, or the steady piling up of cut stones, rock upon rock until the wall reaches the chest high, nothing is still.”

The southwest is mythic in a manner that’s unlike the overdetermined east. Puritan Yankees and Cavalier Southerners are forged into something new in the unforgiving environs of the desert, and in that way, it becomes the most American of places. Paradise, Nev., is an unincorporated town that, enigmatic name aside, most people have never heard of, though it contains some of the most iconic buildings in the United States, a neighborhood better known as the Las Vegas Strip. Paradise, Nevada is the title of the Italian novelist and former professional poker player Dario Diofebi’s massive consideration of that mirage and late capitalist America. The Bellagio, Caesar’s Palace, The Venetian, the Positano—a city of excess and neon, decadence and luck (as well as its opposite). If you don’t recognize the last casino that’s because it’s the invention of Diofebi, an exact replica of the Amalfi Coast built by the reclusive billionaire Al Wiles, who constructs a kingdom of sand and water pipes and fake Adriatic breezes and the smell of Mediterranean lemons, all to impress his wife, a Swiss model who eventually leaves him.

Diofebi uses the 600-some pages of Paradise, Nevada to portray Las Vegas in 2014 and 2015 as a microcosm of America, presenting the interlocking and eventually intersecting stories of Ray, an online poker player who decamps to Sin City to make a living, a man with too much faith in statistics and game theory; Tom, an illegal Italian immigrant who got lucky at the tables and ends up becoming embroiled with a shady vlogger and pickup artist; Mary Ann, the Mississippi-raised former New York model who works as a cocktail waitress; and Lindsay, a Mormon journalist with literary ambitions confronted with whether it’s possible to serve both Mammon and Moroni. Fundamentally a novel not just about class consciousness, but more simply about money—who has it and who doesn’t—Paradise, Nevada gets to the nihilistic core of American consumerism while never losing sight of the fact that all of those neon lights are gorgeous. “It’s a beautiful town to just watch,” says Wiles, “So many stories, so many myths, so many struggles. Stare at it long enough and you’ll… slowly convince yourself that all those stories amount to some kind of meaning.” Diofebi’s attempt at the great Las Vegas novel ends up being the great novel of predatory neoliberalism, though perhaps that’s the same thing.

“The Great Flu had come to America on ships along with spices and sugar,” writes Anna North in Outlawed, “then spread from husband to wife and mother to child and trader to trader by kisses and handshakes, cups of beer shared among friends and strangers, and the coughs and sneezes of men and women who didn’t know how sick.” I can guarantee that North’s Outlawed is the best alternative history feminist Western that you will read this year. A cross between Atwood and Cormac McCarthy, Outlawed imagines a turn-of-the-century Dakota several decades after a mass pandemic, and the survivors’ grandchildren live in a version of America that’s as Medieval as it is Wild West. A syncretic faith worships the baby Jesus since so much is now invested in restoring the population, but women who are unable to conceive (or whose husbands are infertile) are punished as witches. Such is the fate of Ada, who is adopted by the Hole in the Wall Gang, an all-woman outlaw group whose leader is an enigmatic, androgynous, and messianic figure known only as the Kid. All great science fiction should ultimately be judged by the veracity of its world building, and in this regard North’s novel is a triumph, a fully-fledged reality that’s a mirror of our own twilight civilization. As depressing as the plague-ravaged, misogynistic West of Outlawed may be, North’s is no dystopia, for as in the work of Ursula K. Le Guin or Octavia Butler, the novel gestures towards genuine redemptive possibility, even in the ugliness of life.

Convents are emphatically different from outlaw gangs, and yet both exist outside of normal culture. Claire Luchette’s subtle, sad, and beautiful Agatha of Little Neon follows four nuns from the Diocese of Buffalo reassigned to gritty, post-industrial Warwick, R.I., where they’re to administer a halfway house for addicts and ex-convicts. Agatha is the most intellectually independent, though her religious doubts are kept to herself, even as she develops a life independent of Little Neon (the name of the house, given because of its garish green paint job) as a math teacher at a local Catholic high school. In their role as caretakers for these women and men—Tim Gary, who is missing half of his face following cancer surgery; Lawnmower Jill, a drunk and junkie whose nickname is derived from her favored form of transportation—the nuns often fumble in their unworldliness. Multiple themes are explored—secularism and faith, abuse and trauma, addiction and recovery. In a nation where 100,000 people died of opioid overdoses this year, Luchette’s novel sings of American brokenness. Agatha of Little Neon is not a book about affordable redemption; in the tradition of the greatest Catholic novels salvation is not guaranteed nor is it cheap. This is a story about broken lives, and is all the more arresting because of it. More meditation than story, prayer than novel, Luchette’s book is the sort that, in crystalline minimalist prose with nary a comma out of order, evokes midcentury existentialist classics. “We didn’t know much about addiction, about homelessness, but we know how it could look.” Sometimes that’s enough. Sometimes it isn’t. This is the most moving book about grace and what it means to whisper a silent prayer to nobody that I read this year.

“I felt pride, of course, but something more, something better: freedom,” says Opal Jewel in Dawnie Walton’s much lauded and thoroughly brilliant The Final Revival of Opal & Nev. The titular rock star is self-assurance incarnate, a blustery, bluesy genius who emerges ex nihilo (or at least from Detroit). The Final Revival of Opal & Nev is the great rock music novel of the year, if not the decade. Walton explores the fraught dynamics between her invented duo, a folky proto-punk outfit from the early ’70s composed of Neville Charles, a sensitive Englishman enraptured by all things American, and Opal, a young Black singer and songwriter in possession of abundant talent and style. The Final Revival of Opal & Nev follows an upcoming reunion, decades after their falling out, their own solo careers, and the Altamont-style violence that marked their earliest success. Composed of interviews with figures associated with the act, conducted by Sunny Shelton, a music journalist who is the daughter of the band’s studio drummer, with whom Opal had an affair, the novel inevitably drew comparisons to Taylor Jenkins Reid’s Daisy Jones & the Six. I enjoyed both books, but the strength of Walton’s novel is that Opal and Nev are so different from anything in our actual world, like an outfit composed of Nick Drake and Nina Simone, with Patti Smith on backup for good measure. Walton uses this imagined alternative musical history to explore not just the difficulties in creative partnership, but also questions of appropriation, race, and what music says that words can’t. As David Mitchell writes in his similarly brilliant rock novel Utopia Avenue, “If a song plants an idea or a feeling in the mind, it has already changed the world.”

Rock music might be the critic’s approved version of popular culture—all of those Lester Bangs and Greil Marcus essays—but in The Gimmicks, Chris McCormick explores an influential but disdained art form in professional wrestling. To say that The Gimmicks is “about” professional wrestling, the sweaty, campy grappling of pituitary cases wearing ethnically offensive costumes in a bit of scripted drama—the purview of the Iron Sheik and Rowdy Roddy Piper—is misleading. The action of The Gimmicks swirls around wrestling in the same way that The Cactus League is “about” baseball, but McCormick uses Avo Gregoryan, an immigrant from Soviet Armenia who performs under the name of the Browbeater, to explore questions about family, trauma, betrayal, diaspora, political violence, the Turkish genocide of the Armenians, and competitive chess strategy. Evocative of both Michael Chabon and Jonathan Safran Foer, McCormick’s trio of friends and family—Avo; his cousin, zealous Ruben Petrosian; and the woman they both love, bookish Mina Boghossian—are refugees from a collapsing empire. “It’s a marvel how memory works,” says Tony “Angel” Krill, the Browbeater’s pony-tailed manager, after he’s been noirishly recruited to find Avo following the wrestler’s disappearance, “how it holds its shape like smoke in the cold…most of my best forgetting is done on purpose.” Epic in range, McCormick’s novel depicts concrete Kirovakan in the U.S.S.R., the sun-bleached streets of Los Angeles’ Little Armenia, the arrondissements of Paris, the blindingly white homes of the Grecian shore, and the crowded alleyways of Istanbul, not to mention a thousand sad, sweat-filled, crowded, and hot gymnasiums in North Carolina, or Kentucky, or Nevada. Throughout, McCormick asks what it means to be a genuine human being when kayfabe becomes your reality.

Physical power in its undiluted form is also a theme in Rufi Thorpe’s astounding The Knockout Queen. Set amid the chlorinated paradise of suburban Orange County in the mid-aughts, the novel follows high school volleyball star Bunny Lampton, who is blonde, beautiful, and 6’3”, as she forges an unlikely friendship with her next-door neighbor Michael, the narrator of the book, a closeted goth classmate living with his aunt in one of the lower middle-class homes of this neighborhood that’s seen a sprouting of McMansions. Bunny’s father is an alcoholic widower, a charming and deeply corrupt real estate developer who harbors Olympic dreams for his daughter, and is largely tolerant of her friendship with the haunted boy next door, whose mother is in jail for the attempted murder of her husband. The Knockout Queen deftly recreates adolescence during the first decade of this millennium, that era of low-rise jeans and autotune, but more than that it’s a brutal meditation on power in its rawest form, because “it’s different when it’s the woman who’s violent. It strikes people as abnormal. Like, it’s natural for a guy to just ‘lose his temper,’ but if a woman does the same thing, then it’s a sign of something deeper wrong, like psychologically or almost metaphysically.” From the turn of our century until today, Thorpe charts the diverging fortunes of the North Shore Princess and the boy from the other side of the tracks (or fence, as it were), with The Knockout Queen marked by loyalty, dispossession, the ravages of time, and the often-startling brutality of what it means to be a human being with a human body.

In her disquieting The Divines, Ellie Eaton conveys the pain that teenagers inflict on one another. Moving with perfect narrative pacing between the late ’90s United Kingdom and contemporary Chicago and Los Angeles, The Divines is narrated by Josephine, the wealthy daughter of British expats in Hong Kong who once attended the ultra-exclusive girls’ boarding school St. John the Divine in the English countryside. Students at an institution that is far more expensive than it is good, the Divines are known for their hair flip, their cruel pranks, and their abysmal town-gown relationship with the working-class denizens of this depressing hamlet. Much more than a coming of age story—Leonardo DiCaprio and Brad Pitt posters on walls, filched cigarettes, and sweaty school dances—The Divines is about class, trauma, and violence. “Divines could be cruel, conceited, arcane, but we were faithful to the end.” High school can fuck you up, and Josephine still ruminates on her relationship with popular Skipper, her illicit friendship with the townie Lauren, her traumatic infatuation with a maintenance man, and most of all the bullying of a diminutive but shrill classmate who was marked to become a world-class figure skater. Josephine is an unreliable narrator who seems eminently reasonable, a villain lacking self-awareness who befuddles the reader, and Eaton has written a galling account of how trauma mutates until it’s not even recognizable to the past itself.

The Divines isn’t a horror novel, but it has the feel of one—the gothic campus, the insular community, the provincial townies, and the implied murder on the first page. Horror increasingly bleeds into literary fiction. Perhaps it’s this moment, simultaneously apocalyptic and boring, dulled by social media clicks and 24-hour news, the jittery anxiety of now. No contemporary writer is as adept at malignant narrators as Ottessa Moshfegh, whose characters are worthy of Poe or Dostoevsky. Moshfegh’s latest, Death in Her Hands, is a worthy addition to her oeuvre. It is narrated by Vesta Gull, an elderly widow who relocates to a small town that seems like New England and discovers a note in the woods that reads “Her name was Magda. Nobody will ever know who killed her. It wasn’t me. Here is her dead body,” though sans an actual corpse. Vesta becomes obsessed, spinning intricate plots. Death in Her Hands is not quite a murder mystery and not quite gothic, but something far darker. Teddy Wayne also penned a not-quite-horror-novel in his disturbing Apartment, where rather than a cursed manse the story is set in Columbia’s MFA program, haunted by an awkward, obsessive, slightly creepy nameless narrator who finds it natural to “alter our retrospection in subtle ways, to airbrush our unpalatable blemishes here and there.” Apartment explores the poisons of envy and resentment, class and money, and the risks of self-delusion, concluding that “Sometimes the only way to start over in life is to burn down the house.” Julia Fine offers an actual (maybe) supernatural tale in The Upstairs House, where Megan Weiler, an ABD English graduate student and new mother, begins to believe that beloved children’s author Margaret Wise Brown is haunting her Chicago apartment building. “Memory… is a wild and private place to which we only return by accident, as in a dream or song,” reflects Megan in this upsetting story of postpartum depression and scholarly dissatisfaction.

No novel I read this year was quite so viscerally pertinent as Hari Kunzru’s wicked Red Pill. As he did for America’s conflicted history in White Tears, so Kunzru provides a diagnosis of European sicknesses rooted deep within its poisoned blood and soil. Drawing his title from the Internet vernacular that refers to those who’ve been initiated into far-right politics, Red Pill recounts the unhinged experiences of its mild-mannered American narrator, a nameless academic who has stumbled into a year-long fellowship at a German research institute in Wannsee, the Berlin suburb where the Reich outlined the “final solution,” back when the German state pursued a policy far more evil than just reading a lot of books. The narrator is a good liberal wearily watching the ongoing 2016 presidential election from across the Atlantic while ostensibly writing a monograph about the Romantic poet Heinrich von Kleist, though he actually spends his days walking around Wannsee’s pristine environs and becoming obsessed with an ultra-violent American copaganda show called Blue Lives, which reads like a cross between Blue Bloods and The Shield, with script rewrites from Friedrich Nietzsche. The narrator becomes convinced that dangerous alt-right talking points have been encoded into Blue Lives, as he scours message boards and charts the nihilistic references that the show’s creator Anton has written into the scripts. By fortuitous coincidence, the narrator and Anton meet. “Everything he said sounded like a dare,” the narrator writes of Anton, “an outrage that was taken back as soon as it came out of his mouth. I meant it, I didn’t mean it. Sorry, not sorry.” Understanding that trolls can end up being camp guards, that transgression can slide into genocide, the narrator tries to unmask Anton, but there’s only so much he can do in a world laughingly careening towards Armageddon.

American literature is always about America itself, just as English literature is about class or German literature is about death. Though Kunzru is British, there is something integral about our psychic life displayed in Red Pill, a novel about Europe’s past and America’s future. On the evening that I finalized my reading list—including adding Red Pill to my queue—I was largely optimistic. Six weeks before, the presidential election had delivered a result that made me hopeful. The polls in Georgia looked surprisingly good. For a bit of time, after four years of nascent authoritarianism, alt-right provocation, and dystopian machination, there were reasons to be happy. That night, I went to sleep trusting that the moral arc of the universe does tend towards justice. We should always listen to Cassandra, though. When I turned in, it was already early morning on January 6.

More from A Year in Reading 2021

Do you love Year in Reading and the amazing books and arts content that The Millions produces year round? We are asking readers for support to ensure that The Millions can stay vibrant for years to come. Please click here to learn about several simple ways you can support The Millions now.

Don’t miss: A Year in Reading 2020, 2019, 2018, 2017, 2016, 2015, 2014, 2013, 2012, 2011, 2010, 2009, 2008, 2007, 2006, 2005

Conquering Hell

-

“You have reason to wonder that you are not already in hell.” —Jonathan Edwards, “Sinners in the Hands of an Angry God” (1741)

“For the ones who had a notion, a notion deep inside/that it ain’t no sin to be glad you’re alive.”—Bruce Springsteen, “Badlands” (1978)

Charging only a quarter, Joseph Dorfeuille allowed the curious to view Hell itself—admission was half-price for children. Not far from the Ohio River, amongst the steep hills of the Queen City, the Western Museum of Cincinnati operated from 1820 to 1867, promising in an advertisement “Come hither, come hither by night or by day, /there’s plenty to look at and little to pay.” Founded by physician Daniel Drake, known by enthusiasts as the “Ben Franklin of the west,” the institution was modeled after the Wunderkammers, “Wonder Cabinets,” of Europe, displaying shells and rocks, feathers and fossils, pottery shards and arrowheads. Even the ornithologist John James Audubon was on staff. Only two years after its founding, however, the trustees forced Drake to resign. In his place they hired Dorfeuille, who, rather than assembling materials zoological, archeological, and geological, understood that the public was curious about the “occasional error of nature.” In place of Drake’s edifying scientific exhibits, Dorfeuille mounted skeletons that moved by mechanical apparatus, dancing while an organ grinder played. He featured a diorama of wax figurines depicting the local murderer, Cowan, in the act, while also preserving in formaldehyde the head and heart of Mathias Hoover, a Cincinnati serial killer. And with particular popularity, the director distributed huffs of nitrous oxide after his “lectures.” But no exhibit—even the laughing gas—was quite as popular as “Dorfeuille’s Hall.”


Dorfeuille’s Hall was a recreation of characters and scenes from the 14th-century Italian poet Dante Alighieri’s epic religious allegory The Divine Comedy, as well as from the 17th-century British poet John Milton’s Paradise Lost. Molded in beeswax, the hall was mounted by Hiram Powers, who’d eventually become the first celebrated American neo-classical sculptor (Elizabeth Barrett Browning would pen a sonnet in honor of his work). Powers was tasked with illustrating our most grotesque visions—it would be among the most talked-about exhibits before the Civil War. Powers crafted wax figures of the demon Beelzebub and of the fallen arch-rebel himself, Lucifer. Adept with mechanisms and sound effects, Powers made wax statues that would shakily move in the darkness while screams emanated. “Visitors…were so intrigued by the realism of the figures that they were constantly touching them for confirmation that they were indeed wax,” writes Andrea Stulman Dennett in Weird and Wonderful: The Dime Museum in America. “To minimize the damage to his sculptures,” she explains, “Dorfeuille had to put an iron grating charged with a mild electrical current.” At the center was the “King of Terrors,” a stock Devil with red horns and pitchfork, smoke swirling about him. Originally Powers played the role, though an automaton took over after he quit the Western Museum (Dorfeuille had stiffed him on pay) and moved on to a respectable art career in Washington, D.C., and then in Dante’s Florence. By 1839, Dorfeuille sold off the Western Museum of Cincinnati, but he took his deteriorating waxworks to New York, where they were later immolated in a fire, their owner following them into the underworld a few months after. King Entropy awaits us all.

A skeleton at the exit to Dorfeuille’s Hall held aloft a sign with some doggerel on it: “So far we are equal, but once left, /Our mortal weeds of vital spark bereft, /Asunder, farther than the poles were driven;/Some sunk in deepest Hell, some raised to highest Heaven,” though highest Heaven was never as pruriently fascinating as deepest Hell. Powers didn’t outfit an exhibit with harp-playing, winged angels and shining halos; no wax figurines of the unassailable, the respectable, the decent. Not that it was ever much different, for people have always been more attracted—in both the neurotic’s fear and the sadist’s delight—to the fires of damnation. We recognized the 700th anniversary of Dante’s The Divine Comedy in September, and while I have no reliable numbers, my hunch is that 10-to-1 more people have read “Inferno” than the remainder about Purgatory and Paradise. Hell is more visual; if asked to envision Heaven we could offer gauzy, sepia-toned cliches about clouds and pearly gates, but if we’re being perfectly honest nothing about it sounds appealing. But Hell. Well, Hell, we can all agree, is interesting. The sulphur and shrieks, bitumen and biting, the light that burns eternally but gives off no glow. It all sounds pretty bad. While our curiosity draws us to the grotesque, we also can’t help but be haunted by Hell, traces of its ash smeared across even the most secular mind.

“Understand, I’m not speaking here only of the sincerely religious,” writes Dinty W. Moore in his irreverent To Hell With It: Of Sin and Sex, Chicken Wings, and Dante’s Entirely Ridiculous, Needlessly Guilt-Inducing Inferno, focusing on his shame-filled Catholic education. “The fable of our flawed souls, the troubling myth of original sin, the looming possibility of eternal damnation clandestinely infects even those of us who think ourselves immune: atheists, agnostics, secularists.” Supposedly only the most zealous live without doubt, but just as such a pose evidences more anxiety than might be assumed, so too is nobody an atheist all of the time. The pious are haunted by absence and apostates by a presence, but the fear is the same. I don’t believe in a literal Hell, a place of eternal torment where our sins are punished by demons—most of the time. Hell is often on my mind, but unlike Moore I think that occasionally there is intellectual benefit in that which is sulphury (or at least the idea of it). Moore writes about the traumas of “depressive guilt brought on by religious malarkey,” and it would take a cold critic to deny that belief has been used to oppress, persecute, and inculcate shame. Not just a cold critic, but one incapable of parsing objective reality. But even though he’s right, it’s still hard for me to throw the demon out with the hot coals.

Hell’s tragedy is that those who deserve to go there almost never think that they will, while the pious, shame-filled neurotic imagines those flames with scrupulous anxiety. My soul is content if God prepared a place of eternal torment for the Exxon executive who sees climate change as an opportunity to drill for Arctic oil, the banker who gave out high-risk loans with one hand and then foreclosed on a family with the other, the CEO who makes billions getting rich off of slave labor, the racist representative who spreads hate for political gain, the NRA board member unconcerned with the slaughter of the innocent, the A.M. peddlers of hokum and bullshit, the fundamentalist preacher growing rich off his flock’s despair, and the pedophile priest abusing those whom he was entrusted to protect. Here’s an incomplete list of people who don’t belong in Hell—anyone who eats the whole sleeve of Oreos, somebody who keeps on pressing “Still Watching” on the Netflix show that they’re binging, the person who shouts “Jesus Fucking Christ” after they stub their toe, anybody who has spied on their neighbor’s home through Zillow, the Facebook humble-bragger talking about a promotion who keeps hitting “Refresh” for the dopamine rush of digital approval, the awkward subway passengers suddenly fascinated by their feet when a loud panhandler boards, and masturbators. That so many people guilty of the little sins, peccadillos, tics, frailties, and flaws that make us all gloriously imperfect humans feel crippling shame, guilt, and depression over them is tragic. That my first category of sinners almost never feels those things is even more so.

Before either Hell or Heaven there was the shadow land of indeterminate conclusions. Neither the ancient Greeks nor the ancient Jews much delineated an afterlife. Greek Hades was a grey place, a foggy place, and not particularly pleasant, even if it wasn’t exactly inferno. “Gloomy as night” is how Alexander Pope describes Hades in his 18th-century translation of Homer’s The Odyssey, a realm populated by “Thin airy shoals of visionary ghosts.” Excepting Elysium—where excellence is more honored than virtue—everybody has Hades to anticipate, though Homer is content to consign even the glorious to this depressing place. “I’d rather slave on earth for another man,” the hero Achilles tells Odysseus in Robert Fagles’s contemporary translation, “than rule down here over all the breathless dead.” The ancient Jewish conception of an afterlife was originally not much more hopeful, with the Hebrew Scriptures speaking of Sheol, described in Ecclesiastes as a place of “neither deed nor reckoning, neither knowledge nor wisdom.” As smudgy and unfocused as Sheol was, the Bible sometimes distinguishes it from (and sometimes conflates it with) a horrific domain known as Gehenna.

Drawing its name from a dusty valley where pagan child sacrifice had once been practiced, and which became a filthy heap where trash was burnt in a frenzy of ash and dirt, Gehenna is ruled over by Baal and dedicated to the punishment of the wicked. Both the Talmud and the New Testament use the word “Gehenna,” an indication of how, in the first centuries of the Common Era, a comprehensive vision of the afterlife emerged in Judaism and Christianity. Among the Sadducees, who composed the Temple elite, the afterlife was largely defined by absence, but the Pharisees (from whom rabbinic Judaism emerged) shared with early Christians an interest in charting the geography of death, and Gehenna was a highlighted location. (It’s worth mentioning that the historical Pharisees bear no similarity to the hatchet job performed on them in the Gospels.) As it was, Jewish Gehenna would be understood slightly differently from Hell, more akin to Purgatory, a place of finite punishment cleansing the soul of its iniquities. Though as the 13th-century Kabbalistic Zohar makes clear, the truly evil are “judged over in this filth… [and] never get released… the fire remains.”

Though both Judaism and Christianity developed a complex vocabulary of Heaven and Hell, it’s not unfair to attribute much of the iconography of perdition to Dante. Writing a generation after Dante’s death, Giovanni Boccaccio predicted that as regards the Florentine poet’s name “the more it is furbished by time, the more brilliant it will ever be.” Boccaccio’s prediction has been perennially proven correct, with the Victorian critic John Ruskin describing Dante as the “central man of all the world,” and T.S. Eliot arguing that “Dante and Shakespeare divide the modern world between them; there is no third.” All of this might seem a tad Eurocentric, a tad chauvinist, and a tad too focused on Christianity—the apotheosis of Dante into a demigod. Concerning prosody—Dante’s ingenious interlocking rhyme scheme known as terza rima or his ability to describe a “place void of all light, /which bellows like the sea in tempest, /when it is combated by warring winds”—he was brilliant. But acknowledging acumen is different from engaging with the work—even Voltaire quipped that more people valorize Dante than read him. And yet (there’s always an “And yet”…), there must be a distinction between the claim that Dante says something vital about the human condition, and the objective fact that in some ways Dante actually invented the human condition (or a version of it). John Casey argues in After Lives: A Guide to Heaven, Hell, & Purgatory that Dante’s was “without doubt the supreme imagining of the afterlife in all [Western] literature,” while Alberto Manguel in The Traveler, the Tower, and the Worm: The Reader as Metaphor claims that his epic has “acquired a permanent and tangible geography in our imagination.”

“Dante’s story, then, is both a landscape and a map,” writes Manguel. None of the poem’s unfortunates get out, save for one—the author. Midway in the course of life Dante falls into despair, loneliness, and alienation, and The Divine Comedy is the record of his descent and escape from those doldrums. Alice K. Turner argues in The History of Hell that Dante was the progenitor of a “durable interior metaphor.” Claiming that the harrowing and ascension that he described can be seen in everything from psychoanalysis to 12-step programs, Turner writes that “this entirely comfortable and pervasive method of modern metaphorical thinking might not exist if Dante had never written.” “Empathy” might not be the first word readers associate with a book sticky with the blood of the damned—”punishment” or even “justice” would figure higher—and yet Dante feels pain for these characters. That aspect of The Divine Comedy is why we’re still talking about it. Turner explains that “Dante was concerned with history, with Florentine politics, with the corruption of the clergy, with the moral position of his contemporaries, and most of all with the state of his own psyche,” while arguing that at a “distance of seven centuries, we can no longer easily appreciate any of these things except the last—Dante is generous with his emotions.” It’s true that for any contemporary reader, concerns with forgotten factions like the Ghibellines and Guelphs, parsing of Thomas Aquinas, or condemnations of this or that obscure pope can seem hermetic. When perusing a heavily glossed and footnoted copy of The Divine Comedy, it’s his intimate perspective that remains the most human.

Eliot may have claimed that between Dante and Shakespeare there was no third, but that’s the sort of thing that a self-declared “classicist in literature, royalist in politics, and Anglo-Catholic in religion” would say, giving short shrift to that bomb-throwing author of Paradise Lost. Earth can be given over to Shakespeare, but Heaven and Hell belong to Dante and Milton. If The Divine Comedy is the consummate expression of Catholicism, then Milton’s epic is Protestantism’s fullest literary flowering, and yet neither of the two is orthodox. Milton’s depiction of damnation in medias res, after the rebel angels have been expelled from Heaven and the once-beautiful Lucifer has been transformed into Satan, revises our understanding of Hell for the first time since Dante. Much remains recognizable in Milton, even if immaculately portrayed, this “dungeon horrible, on all sides round, /As one great furnace, flames; yet from those flames/No light, but rather darkness visible,” a fallen kingdom defined by “sights of woe, /Regions of sorrow, doleful shades, where peace/And rest can never dwell, hope never comes… but torture without end.” Paradise Lost’s beauty belies the darkness of that sunken pit, for Milton’s brilliance has always been that he acknowledges what’s evocative, what’s magnetic, what’s attractive about Hell, for Lucifer in his intransigence and his obstinacy declares that it is “Better to reign in Hell, than to serve in Heav’n.” Dante’s Satan is a monster encased in ice, weeping frozen tears as he forever masticates the bodies of Cassius, Brutus, and Judas. He is bestial, animalistic, and barely sentient. Nobody would admire him; nobody would want to be Dante’s Satan. Milton’s Lucifer, on the other hand, is a revolutionary who gets all the best lines; certainly, better than God and Christ. As William Empson had it in his vaguely heretical Milton’s God, the “poem is not good in spite of but especially because of its moral confusions.”

Milton is a “Puritan,” that woolly category of killjoy whom we associate with the Plymouth Pilgrims (though they were technically different), all belt-buckled black hats and shoes, trying to sniff out witches and other people having impure thoughts. As it goes, Milton may have politically been a Puritan, but he was also a unitarian and a materialist, and on the whole a rather cracked Protestant, not least of all because of his Devil’s thinly disguised heroism. Paradise Lost is honest because it exemplifies a principle that we all know, something expressed by Augustine in his Confessions when filching pears from a neighboring orchard as if he were Eve in Eden with her apple, admitting that he wasn’t even hungry but he did it because “It was foul and I loved it. I loved my own undoing.” Doing bad things is fun. Puritans ironically seem more apt to admit that clear truism than all of the Panglossian advocates for humanity’s intrinsic good nature, an obvious foolishness. Just try to negotiate a Trader Joe’s parking lot and then tell me that you’re so certain that Original Sin is old-fashioned superstition. Acknowledging sin’s innate attractiveness—our “total depravity” as John Calvin described it in his 16th-century tome Institutes of the Christian Religion—means that detailed descriptions of Hell can sound perverse. At a pulpit in Enfield, Conn., the minister Jonathan Edwards delivered an infamous sermon in 1741 in which he told the assembled that “Your wickedness makes you as it were heavy as lead, and to tend downwards with great weight and pressure towards hell… if God should let you go, you would immediately sink and swiftly descend and plunge into the bottomless gulf,” for God abhors all of us, and his “wrath… burns like fire; he looks upon you as worthy of nothing… but to be cast into the fire.” According to Edwards, all women and men are “ten thousand times more abominable in [God’s] eyes, than the most hateful venomous serpent is in ours.” Historians note that while Edwards delivered his sermon, congregants rolled around in the church aisles and stood on the pews, moaning and screaming. All of this is a bit kinky, honestly.

Calvinism’s God has always read more like his nemesis, omnipotent enough that He created us, but apparently not so omnipotent that He makes us worthy of salvation. More than a tyrant, He reads as a sadist, but when God is deleted from Calvinism what’s left is only the putrid, jaundiced, rotting corpse of our contemporary world, where nobody knows the value of anything, but only the price of everything. Nothing better expressed the dark American transition to post-Calvinism than our greatest of novels, Herman Melville’s Moby-Dick; or, the Whale, of which the author wrote to his friend Nathaniel Hawthorne in 1851 that “I have written a wicked book, and feel spotless as the lamb.” Immature exegetes perseverate on what the white whale means, what he is a symbol of. God? Satan? America? That’s the Holy Trinity, though only one of them has ever kept his promises (I’ll let you guess who). It’s both more and less complicated than that; the whale is the terrifying, naked abyss stripped bare of all our presuppositions. In short, he is the world as it actually is, where Hell is all around us. In Moby-Dick, Ishmael listens to the former harpooner Father Mapple’s sermon at the New Bedford, Mass., Whaling Chapel (with its cenotaphs and its massive cetacean bones), and though the sermon is ostensibly on Jonah, it describes a type of Hell. The congregants sing a strange hymn about the “ribs and terrors in the whale/Arched over… a dismal gloom, /While all God’s sun-lit waves rolled by, /And lift me deepening down to doom.” Mapple preaches that despite how sinful the reluctant prophet was, “Jonah does not weep and wail for direct deliverance. He feels that his dreadful punishment is just… And here, shipmates, is true and faithful repentance; not clamorous for pardon, but grateful for punishment.” You’ll go to Hell—or the belly of a whale—and you’ll be happy about it, too.     

Not that this rhetoric is limited to Protestantism, for fire and brimstone come just as often from homily as sermon. James Joyce knew that Catholic priests could chill the blood every bit as much as an evangelical bible thumper, with perhaps no more disturbing a vision of Hell ever offered than in A Portrait of the Artist as a Young Man. His alter ego Stephen Dedalus attends a religious retreat, and the visiting priest provides visceral description to the adolescents—buffeted by arrogance and hormones—of what awaits them if they give in to temptation, for “Hell is a straight and dark and foul-smelling prison, an abode of demons and lost souls, filled with fire and smoke” where all of the punished are “heaped together in their awful prison… so utterly bound and helpless that… they are not even able to remove from the eye a worm that gnaws it.” Drawing upon the Jansenist Catholicism that migrated from France to Ireland, the priest’s descriptions of Hell are the equal of anything in Edwards. With barely concealed sadism he intones how amongst the damned the “blood seethes and boils in the veins… the heart in the breast glowing and bursting, the bowels a red-hot mass of burning pulp, the tender eyes flaming like molten balls.” If sin is spoken of in sensory terms—the taste of gluttony, the touch of lust, the rest of sloth—then so too does the priest give rich description of Hell’s smell. That pit is permeated by the odor of “some foul and putrid corpse that has lain rotting and decomposing… a jelly-like mass of liquid corruption… giving off dense choking fumes of nauseous loathsome decomposition… a huge and rotting human fungus.” Heaven can remain vague, because none of us will ever agree on what it is we actually want. Pleasure is uncertain, but pain is tangibly, deeply, obviously real, and so Hell is always easier to envision.

After Stephen is convinced to become rigidly austere following that terrifying sermon, he finds himself once again straying. A friend asks if he plans on converting to Protestantism, and Dedalus responds that “I said I had lost the faith… not that I had lost self-respect. What kind of liberation would that be to forsake an absurdity which is logical and coherent and to embrace one which is illogical and incoherent?” Funny in the way that Joyce is, and true as well, though it might only make sense to lapsed Catholics who fumble over the newly translated words of the liturgical preface at Mass. Such Catholic atheism, or Catholic agnosticism, or Catholic whatever-you-want-to-call-it influenced one of the great contemporary depictions of Hell in Stephen Adly Guirgis’s play The Last Days of Judas Iscariot. Guirgis engages an idiom that could be called “bureaucratizing the sacred,” transposing the rigid elements of our society in all of its labyrinthine absurdity onto the transcendent order. Whenever Hell (or Heaven) is depicted as an office, or a waiting room, or a border checkpoint, a prison, a hospital, or a school, that’s bureaucratizing the sacred. In Guirgis’s play, the action takes place in that most arcane of bureaucratic structures, the court system. “In biblical times, Hope was an Oasis in the Desert,” a character says. “In medieval days, a shack free of Plague. Today, Hope is no longer a place for contemplation—litigation being the preferred new order of the day.” The Last Days of Judas Iscariot portrays a trial of its largely silent titular character, held in a courtroom that exists beyond time and space, where expert witnesses include not just Christ and Pontius Pilate, but Sigmund Freud and Mother Teresa as well. True to the mind-bending vagaries of eternity, both God and the Devil exist in a country unimaginably far from us and yet within our very atoms, for as Christ says “Right now, I am in Fallujah. I am in Darfur. I am on Sixty-third and Park… I’m on Lafayette and Astor waiting to hit you for change so I can get high. I’m taking a walk through the Rose Garden with George Bush. I’m helping Donald Rumsfeld get a good night’s sleep… I was in that cave with Osama, and on that plane with Mohamed Atta… And what I want you to know is that your work has barely begun.” Who does the messiah love? “Every last one.” The vision is expansive and universalist, and as in all great portrayals—including Dante’s and Milton’s, whose authors most definitely didn’t think you could breach the walls of inferno by drilling into the earth—Hell here is entirely a mental place.

“Despair,” Guirgis writes, “is the ultimate development of a pride so great and so stiff-necked that it selects the absolute misery of damnation rather than accepts happiness,” or as Milton famously put it, “The mind is its own place, and in itself/Can make a Heaven of Hell, a Hell of Heaven.” Like all things in the sacred—that vast network of metaphors, allusions, images, allegories, poems, and dreams—the question of “is it real or not?” is entirely nonsensical. Heaven and Hell have always been located in the human mind, along with God and the Devil, but the mind is a vast country, full of strange dreams and unaccounted things. Perdition is an idea that is less than helpful for those who fear (or hope) that there is an actual Hell in the black waters of the Mariana Trench, or in scorched, sunbaked Death Valley, or near a sandy cave next to the Dead Sea. But when we realize that both Hell and Heaven exist in a space beyond up or down, left or right, or any of the cardinal directions, in a dimension both infinitely far away and nearer than our very hearts, then there just might be wisdom in the idea. Such is the astuteness of Dante, who, with startling psychological realism, records the woeful tale of illicit lovers Paolo Malatesta and Francesca da Rimini, tempted into an adulterous kiss after reading the romance of Lancelot and Guinevere, now caught in a windstorm as they had once tussled in bedsheets. “There is no greater sorrow,” Francesca tells the poet, “Than to be mindful of the happy time/in misery.” Because we often think of sin as simply a matter of broken rules, psychological acuity can be obscured. Drawing from Thomas Aquinas, Dante writes that “Pride, Envy, and Avarice are/the three sparks that have set these hearts on fire,” and the interpretative brilliance of the Seven Deadly Sins is that they explain how an excess of otherwise necessary human impulses can pervert us. Reading about Paolo and Francesca, it’s understandable to doubt that they deserve such punishment—Dante does. But in the stomach-dropping, queasy, nauseous, never-ending uncertainty of their lives, the poet conveys a bit of their inner predicament.

“There are those… who, undoubtedly, do not feel personally touched by the scourge of Dante, [or] by the ashen pall of Augustine,” writes Moore, and while I wouldn’t say that I’m so irreverent that I’m willing to lustily eat a rare hamburger on Good Friday without worrying a bit, I will admit that if I make a visit to Five Guys after forgetting that it’s Holy Week I don’t fret too much. It’s a privilege to play with the idea of Hell without fearing it (at least too much), but the concept still has some oomph in it. Returning to an earlier observation, Hell must be the language with which we think about that which divides us, estranges us, alienates us, not the language for when we despair at having eaten a bit too much or slept in on the weekend. If anything, that Puritanical rugged individualism that so defines American culture whether we’ve opted into it or not—You need to be up by 5 a.m.! You have to be a productive worker! You must avoid any real joy except for the curated experience chosen for you by algorithm!—is the true wage of sin. In opposition, I lustily and gluttonously and slothfully advocate for eating a bit too much, laughing a bit too loud, and sleeping a bit too long. Preachers once told us that to be alive was a sin, but that was never it at all. Now we have pundits and TED talk lecturers forcing us to weigh our souls instead, but there’s no shame in knowing that the road rises to meet us, in feeling the wind at our back and the warmth of the sun or the cool rain upon our faces. Shame and guilt are utilitarian, but they belong on the other side. It’s notable that the idea of Hell developed contemporaneously with that of Heaven, and if it weren’t for the former, then what would we ever struggle against? “Without contraries there is no progression,” writes the poet and prophet William Blake in his 1793 The Marriage of Heaven and Hell. “Attraction and repulsion, reason and energy, love and hate are necessary to human existence.” Whether temporary or eternal, finite or infinite, a place of extreme punishment implied another for reward. There’s no Heaven unless there is also Hell, and both places are much closer than might be supposed.

Photo credit: Wikimedia Commons

Drizzly November in My Soul

-

Because Robert Burton used astrology to forecast the date of his death with exact accuracy—January 25, 1640—even some skeptics in that credulous age suspected that he may have assisted the prediction’s veracity. To accuse anyone of suicide was a slander; for Burton’s contemporaries such a death was an unpardonable offense. A half-century later, the antiquary John Aubrey noted in his 1681 Brief Lives that “’tis whispered that… [Burton] ended his days in that chamber by hanging himself.” There are compelling reasons to think this inaccurate. Burton would not have been buried in consecrated ground had he been a suicide—though, of course, it’s possible that friends may have covered for him. Others point to the historian Anthony Wood, who described Burton as “very merry, facete, and lively,” though seemingly happy people do kill themselves. And finally, there’s the observation that within his massive, beguiling, strange, and beautiful The Anatomy of Melancholy, first printed in 1621, Burton rejected suicide—even while writing with understanding about those who are victims of it. As it is, the circumstances of Burton’s death remain a mystery, just as self-destruction so often is, even as etiology has replaced astrology, as psychiatry has supplanted humoral theory.

That such a rumor spread at Christ Church, where Burton had worked for years in the library, compiling his vast study of depression, is not surprising. So identified was Burton with his subject—called “history’s greatest champion of the melancholy cause” by Andrew Solomon in The Noonday Demon: An Atlas of Depression—that his readers simply expected such a death. Within The Anatomy of Melancholy Burton gives an overview of Greek and Roman biothanatos, while still condemning it. And yet Burton empathetically concludes that “In such sort doth the torture and extremity of his misery torment him, that he can take no pleasure in his life, but is in a manner enforced to offer violence unto himself, to be freed from his present insufferable pains.” Burton was also frank about his own suffering. White Kennett would write in his 1728 A Register and Chronicle Ecclesiastical and Civil that “I have heard that nothing at last could make… [Burton] laugh, but going down to the Bridge-foot in Oxford, and hearing the bargemen scold and storm and swear at one another, at which he would set his Hands to his sides and laugh most profusely.” Such a man, it was imagined, was the sort who may have dreamed of wading into that cold water in the years when the rivers of England still froze over, walking out into infinity until he felt nothing. Who is to say? We don’t have a complete record of Burton’s thoughts, especially not in his last moments (we don’t have those things for anybody), but The Anatomy of Melancholy is as comprehensive a record as possible, a palliative for author and reader, an attempt to reason through the darkness together.

“Burton’s book has attracted a dedicated rather than a widespread readership,” writes Mary Ann Lund in Aeon, “its complicated branching structure, its Latin quotations and its note-crammed margins resist easy reading.” Though clearly indebted to the humanism of Erasmus and Montaigne, The Anatomy of Melancholy is one of those books that’s almost post-modern before modernity, like the novel Tristram Shandy (Laurence Sterne shamelessly plagiarized from Burton). The book is encyclopedic but open-ended, erudite but curious, expansive but granular, poignant but funny; never doctrinaire, never judgmental, never orthodox, but gleefully self-referential even while contemplating sadness. Burton combed history, poetry, theology, travelogue, philosophy, and medicine for case studies, across five editions during his lifetime (and a sixth based on posthumous notes) in three separate sections printed as a folio that ballooned to half-a-million words. In the first section he enumerates accounts of melancholia, in the second he offers up cures (from drinking coffee to eating spiced ram’s brain), and in the third Burton presents taxonomies of insanity, including love madness and religious mania. The contemporary edition from the New York Review of Books Classics series is 1,382 pages long. Within those digressive, branching, labyrinthine passages Burton considers King Louis XI of France’s madness whereby everything had the stink of shit about it, an Italian baker from Ferrara who believed that he’d been transformed into butter, and the therapeutic effects of music on elephants. Lund explains how subsequent editions, rather than cutting verbiage, fully indulged Burton’s favored rhetorical conceit of congeries, whereby words are piled upon words in imitation of the manic mind, a quality that has both endeared and frustrated his readers. And yet as William H. Gass observes in his introduction to the NYRBC edition, “the words themselves are magical; you cannot have too many of them; they are like spices brought back from countries so far away they’re even out of sight of seas; words that roll… words even more exotic, redolent, or chewy.”

Burton’s monumental work, which readers felt free to dip in and out of rather than read cover-to-cover, easily outsold Shakespeare’s folio, though by the Enlightenment his acclaim had dimmed, the work interpreted as a poorly organized baroque grotesquerie based in outmoded theories. During the optimistic 18th century, The Anatomy of Melancholy had not a single printing. Despite that, it still had readers, including Benjamin Franklin and Dr. Johnson, who told Boswell that it was the “only book that ever took him out of bed two hours sooner than he wished to rise.” Romantics were naturally drawn to it; both Samuel Taylor Coleridge and John Keats had underlined copies, with the latter drawing the plot for his vampiric Lamia from Burton. In the 20th century, the existentialists saw something modern in Burton, with Samuel Beckett a lover of the book. The Canadian physician William Osler, a founder of the Johns Hopkins School of Medicine, thought it the greatest medical text by a layman, and was instrumental both in reviving interest in Burton and in the bibliographic tabulation of his personal library at Oxford. Despite the pedigree of his fans, Burton hasn’t had a wide readership for centuries, as The Anatomy of Melancholy has never been easy. It is an assemblage of disparate phenomena, a hodgepodge of unusual examples, a commonplace book of arcane quotations and complicated exegesis, none of which is structured in a straightforward way, with Burton himself apologizing that “I had not time to lick it into form, as a bear doth her young ones,” though the book only grew more formless over the next two decades, belying his protestation.

The Anatomy of Melancholy feels unfinished, just like life; it’s contradictory, just like a person; and it encompasses both wonder and sadness, just like a mind. On its quadricentenary it’s abundantly worthwhile to spend some time with Burton, because though he can’t speak of neurotransmitters, he does speak of the soul; though he can’t diagnose, he can understand; though he can’t prescribe, he can sympathize. Beyond just depression, Burton considers ailments like anxiety, obsessions, delusions, and compulsions; sufferers “conceive all extremes, contrarieties and contradictions, and that in infinite varieties.” To paraphrase Tolstoy, the happy are all the same, but Burton’s depressives are gloriously different. “The Tower of Babel never yielded such confusion of tongues, as this chaos of melancholy doth variety of symptoms.” The second important thing to note is that Burton distinguishes everyday emotions—the occasional blues, if you will—from the more punishing. He explains that “miseries encompass our life,” that everyone suffers grief, loss, sorrow, pain, and disappointment, and that it would be “ridiculous for any mortal man to look for a perpetual tenor of happiness.” If somebody is suffering from physical pain or a loved one’s death, grief and sadness are rational; for a person facing economic ruin or an uncertain future, anxiety makes sense, but a “melancholic fears without a cause…this torment procures them and all extremity of bitterness.” For those whose humors are balanced, grief is the result of some outside torment; for the melancholic, grief is itself the torment. Furthermore, as Burton makes clear, this disposition is not a moral failing but a disease, and he often makes suggestions for treatments (while just as soon allowing that he could be entirely wrong in his prescriptions). “What can’t be cured must be endured,” Burton notes. In the depressive canon of the late Renaissance, Burton would be joined by Thomas Browne with his similarly digressive, though much shorter, Religio Medici, wherein he writes, “The heart of man is the place the devil dwells in; I sometimes feel hell within myself;” John Donne’s sickbed Devotions Upon Emergent Occasions, where he concludes that “Man, who is the noblest part of the earth, melts so away as if he were a statue, not of earth, but of snow;” and, of course, Shakespeare’s famed soliloquy from Hamlet that wonders if “by a sleep to say we end/The heart-ache, and the thousand natural shocks/That flesh is heir to.” And that’s just relatively High Church Englishmen; with a broader scope you’d include the Catholic Frenchman Blaise Pascal, who in his Pensées defines man as one “equally incapable of seeing the nothingness out of which he was drawn and the infinite in which he is engulfed,” and the 15th-century Florentine Neo-Platonist Marsilio Ficino, who wrote in his 1489 The Book of Life that the condition was “conducive to judgment and wisdom,” entitling one chapter “Why the Melancholic Are Intelligent.” None of them, however, is as all-encompassing as Burton, as honest about his own emotions and as sympathetic to his fellow sufferers. Within his book’s prologue, entitled “Democritus Junior to His Readers,” ironically written under a pseudonym adapted from the pre-Socratic thinker known as the “laughing philosopher,” Burton explains that “I had a heavy heart and an ugly head, a kind of imposthume in my head, which I was very desirous to be unladen of,” and so his scribbling would act as therapy.

Across denominations, countries, and continents, the contemplation of a fashionable melancholia was encouraged, with even Burton confessing to sometimes enjoying such abjection as a “most delightsome humor, to be alone, dwell alone, walk alone, meditate, lie in bed whole days, dreaming awake as it were.” Noga Arikha explains in The Public Domain Review that melancholia could be seen as “good if one believed that a capacity for strong passions was the mark of a fine soul that recognized beauty and goodness… the source of sonnets, the harbinger of creativity,” while Darrin M. McMahon notes in Happiness: A History that this is a “phenomenon that would have a long and robust future: the glamorization of intellectual despair.” A direct line can be drawn from the goth teen smoking clove cigarettes in a Midwestern high school parking lot through the Beats in their Manhattan lofts eating hash brownies and masturbating to William Blake through to the Left Bank existentialists parsing meaninglessness in the post-war haze and the Lost Generation writers typing on Remingtons in Parisian cafes back to the Decadents and Symbolists quaffing absinthe and the Romantics dreaming opium reveries until interrupted by the person from Porlock through to Burton, and Browne, and Donne, and Shakespeare, and Pascal and Ficino and every other partisan of depression. As Burton notes, “melancholy men of all others are most witty.”

It was more than a pose, however: even if Burton agreed that melancholy could sometimes be romanticized, he never lost sight of its cost. Tabulating the price of anxious scrupulosity, Burton notes that “Our conscience, which is a great ledger book, wherein are written all our offences…grinds our souls with the remembrance of some precedent sins, makes us reflect upon, accuse and condemn ourselves.” In the millennium before Burton there were contrary perspectives concerning melancholy. It was interpreted by theologians as a sin—an indolent sloth called acedia—as opposed to the position of doctors who diagnosed it as an imbalance of elemental substances called humors. One thing Burton is clear on is that melancholy isn’t simply feeling down. To be melancholy isn’t to be “dull, sad, sour, lumpish, ill disposed, solitary, any way moved or displeased,” Burton writes, and that clarification is still helpful. For those blessed with a brain chemistry that doesn’t incline them towards darkness, depression might seem an issue of willpower, something that can be fixed with a multivitamin and a treadmill. Reading Burton is a way to remind oneself—even as he maintained erroneous physiological explanations—that depression isn’t a personal failing. And it’s certainly not a sin. McMahon explains that by “reengaging with the classical tradition to treat excessive sadness and melancholia as an aberration or disease—not just the natural effect of original sin—Renaissance medicine opened the way toward thinking about means to cure it.”

That was a possibility more than anything, for the rudiments of mental health were still mired in superstition. Such an emotion was identified with an overabundance of black bile in the spleen, and a deficit of yellow bile, blood, and phlegm, a condition associated with a dry coldness, so that some of that metaphorical import still survives today. Arikha writes in Passions and Tempers: A History of the Humours how the “experiences of joy, pain, anguish, and fear each had their temperature, their match in some sort of stuff in the body whose motion modulated the emotion.” In a broader way, however, there is something to be said for how the humors emphasized embodiment, the way they acknowledged how much of the emotional was in the physical. We now know that melancholy isn’t caused by problems with our humors but with our neurotransmitters—I am not cutely implying that the two are equivalent; accurate science is the only way that pharmacologists have been able to develop the medicine that saves so many of our lives. Yet there is an enduring wisdom in knowing that this is a malady due to something coursing in your veins, whatever you call it. “We change language, habits, laws, customs, manners,” writes Burton, “but not diseases, not the symptoms of folly and madness, they are still the same.”

Depressives have always existed because there have always been those of us who have a serotonin and dopamine deficiency, even if we’re not lacking in yellow bile, blood, and phlegm. How culture interprets mental illness is entirely another thing, though. As Burton’s popularity demonstrates, there was a surprising lack of stigma around melancholy. In an abstract way, during the 17th century this was a reaction to how eternal verities no longer seemed so eternal. Gass explains that “people were lifting their heads from canonical books to look boldly around, and what they saw first were errors, plentiful as leaves. Delight and despair took turns managing their moods.” Even while The Anatomy of Melancholy used the Galenic humoral theory that had dominated medicine since the second century, the English physician William Harvey was developing his book Anatomical Account of the Motion of the Heart and Blood, which would dispel the basis for the four bodily humors (it would take two more centuries to die). There were more practical reasons for melancholy as well. On the continent, the Thirty Years War started three years before Burton’s book was completed and would end in 1648, eight years after he died. As many as 12 million people perished, a death rate that dwarfed all previous European wars, with one out of five people on the continent dead. Writing in the early 1620s, Burton lived in an England headed towards inevitable civil war, disunion clear in its political and religious polarization. By that war’s conclusion, 200,000 people were dead, fully 2.5 percent of the population. By comparison, that would be as if 12 million contemporary Americans were killed. Disease could be just as deadly as the New Model Army; over the course of Burton’s life the bubonic plague broke out in 1603, 1625, and 1636, with close to 100,000 deaths. Depression can come from an imbalance within the body, but sometimes insanity is also a sane reaction to an insane world. You still have to bear it, however.

Burton is good-humored, and he may even have been jovial from time to time, but he’s resolutely a partisan of the sighing angels. Not that Burton didn’t advocate for treatment, even while he emphasized his own inexpertness. Solomon explains that Burton recommends “marigold, dandelion, ash, willow, tamarisk, roses, violets, sweet apples, wine, tobacco, syrup of poppy, featherfew, Saint-John’s-wort…and the wearing of a ring made from the right forefoot of an ass.” We are, it should be said, fortunate to have refined our prescriptions. Despite the fact that Americans hunger for painkillers both helpful and deadly, The Anatomy of Melancholy isn’t a particularly American book. If the English malady is glum dampness, then the American affliction is plucky sociopathic optimism. A can-do attitude, pulling oneself up by the bootstraps, rugged individualism, grit, determination, cheeriness. We were once inundated by snake oil salesmen and medicine men; now we have self-help authors and motivational speakers. A nation where everybody can be a winner in seven easy steps and there are keys to a new car under every guest’s seat. “Americans are a ‘positive’ people,” writes Barbara Ehrenreich in Bright-Sided: How Positive Thinking is Undermining America; this “is our reputation as well as our self-image…we are upbeat, cheerful, optimistic, and shallow.” Some crucial points: optimism is not equivalent to happiness, and if anything, it’s a mask for when we lack the latter. That’s not bearing it—that’s deluding ourselves.

We weren’t always like this; we have our counter-melody to faux-positivity, from those dour Puritans to Herman Melville writing of the “damp, drizzly November in my soul.” But could anyone imagine the election today of Abraham Lincoln, who, as a young lawyer in 1841, wrote that “I am now the most miserable man living. If what I feel were equally distributed to the whole human family, there would not be one cheerful face on earth”? Now, with the ascendancy of the all-seeing Smiley Face, we’ve categorized talk like that as weakness, even if we don’t admit what we’re doing. Our 16th president had a melancholic understanding that grappled with wisdom, what Roy Porter in Flesh in the Age of Reason: The Modern Foundations of Body and Soul phrased as “Melancholy and spleen, those stigmata of true scholarly dedication.” An ability to see the world as it is. Not just as some cankered, jaundiced, diseased thing, but how in the contrast of highs and lows there is a sense of how ecstatically beautiful this life is, even in its prosaic mundanity. Solomon writes that “I love my depression. I do not love experiencing my depression.” He explains that the “opposite of depression is not happiness but vitality,” and ignorance of this distinction bolsters the cult of positivity. Therapy is honest, unsparing, difficult, and painful. Self-help just tells you what you want to hear. Norman Vincent Peale wrote in Stay Alive All Your Life that the “dynamic and positive attitude is a strong magnetic force which, by its very nature, attracts good results.” This, quite clearly, is unmitigated bullshit. Instead of Dale Carnegie, we need Donne; rather than Eckhart Tolle we could use Browne; let’s replace Tony Robbins with Robert Burton.

Because, dear reader, if you haven’t noticed, we’re not at a happy point in history. America’s cheery cult of optimism is finally folding under the onslaught of the pandemic, political extremism, economic collapse, and the ever-rising mercury. If you’re the sort who’d be chemically glum even in paradise, then you’ve already been to hell, and you might have a bit of extra knowledge folks could benefit from. Stanley Fish explains in Self-Consuming Artifacts: The Experience of Seventeenth-Century Literature how “sober discourse itself is an impossibility given the world,” and that for Burton “nothing—no person, place, object, idea—can maintain its integrity in the context of an all-embracing madness.” Gass is even more blunt on the score: “When the mind enters a madhouse…however sane it was when it went in, and however hard it struggles to remain sane while there, it can only make the ambient madness more monstrous, more absurd, more bizarrely laughable by its efforts to be rational.” Burton speaks to our epoch, for depression is real and there are legitimate reasons to be depressed. As he writes, melancholy is an “epidemical disease,” now more than ever. Burton’s prescriptions, from tincture of marigold to stewed offal, seem suspect—save for one. With the whole world an asylum, Burton advocates for awareness. There are risks to such hair-of-the-dog, though. “All poets are mad,” Burton writes, the affliction of “excellent Poets, Prophets, &c,” and I suspect, dear reader, that you too may be in that “etcetera.” Depression, along with addiction, is the writer’s disease. Sylvia Plath, James Baldwin, David Foster Wallace, Anne Sexton, Arthur Rimbaud, F. Scott Fitzgerald, Mark Twain, Emily Dickinson, and so on—all wrestled with the noon-day demon. Many of them died because of it, at least in one way or another. There is no shame here, only sadness that some couldn’t be with us a bit longer, and the genuine, deep, and loving request that you, dear reader, stick around. As for Burton, he was committed to the principle that “I write of melancholy, by being busy to avoid melancholy,” and it often worked. He gives the most poignant expression to that black veil that shrouds the mind, the caul of the soul that afflicts some from time to time. If writers are prone to depression, then Burton’s tome was an attempt to write himself out of it, to “satisfy and please myself, make a Utopia of mine own, a New Atlantis, a poetical commonwealth of mine own.” We’re lucky that he did, because even if it’s not the only thing—even if it’s not always the best of things—there is something sacred in that. No matter how occluded, know that somebody else understands what you’re feeling.

So, blessed are Burton and duloxetine, therapy and sertraline, writing and citalopram, empathy and fluoxetine, compassion and escitalopram. Blessed are those who ask for help, those who are unable to ask for help, those who ask if somebody else needs help. Blessed are those who struggle every day and blessed are those who feel that they no longer can, and blessed are those who get better. Blessed are those who hold the hands of anyone suffering. Blessed is understanding—and being seen—for what Burton offers you is the observation that he sees you, the reader. “I would help others, out of a fellow-feeling,” he writes. Because Robert Burton often felt worthless, as if he were walking at the bottom of the chill Thames. Sometimes it felt like his skull was filled with water-soaked wool and his eyes pulsated, vision clouded over with gauzy darkness; he knew of listing heaviness and the futility of opening the door, of getting dressed, of leaving the bed, of how often the window of care shrank to a pinpoint of nothingness, so that he could feel no more than that. This strange book studded with classical allusion and scriptural quotation, historical anecdote and metaphysical speculation—who was it for? He wrote for the person who has had the tab for this article open on their computer for days, but has no energy to read; for the woman who hasn’t showered in weeks and the man who can’t bring himself to go outside. “Thou thyself art the subject of my discourse,” Burton writes. His purpose was one thing—to convey that you have never been alone. Not then, not now, not ever.

And the Walls Came Down

-

Across seven seasons and 178 episodes, Patrick Stewart performed the role of Capt. Jean-Luc Picard of the United Federation of Planets starship USS Enterprise NCC-1701-D with professionalism, wisdom, and humanity. Displaying granitoid stoicism and reserved decorum, Picard was the thinking-person’s captain. The fifth-season episode “Darmok” is an example of how Star Trek: The Next Generation was among the most metaphysically speculative shows ever broadcast. In the beginning, the Enterprise is hailed by the Tamarians, but its computerized translator can’t comprehend their language. Against his will, Picard is transported alongside the Tamarian captain Dathon—whose head looks like a cross between a pig and a walnut—to the surface of a hostile planet. Confronted by a massive, shimmering, and horned beast, Dathon tosses Picard a dagger and yells “Darmok and Jalad at Tanagra.” After they’ve defeated the monster, though at the price of Dathon being injured, Picard realizes that the enigmatic phrase is a reference, an allusion, a myth. Thus, a clue to their language—Tamarians speak only in allegory. Dathon has invoked two mythic heroes to communicate that he and Picard must fight the beast together. Doubtful though it may be that Paul Ricoeur was a Trekkie, Dathon embodies the French philosopher’s claim in The Rule of Metaphor: The Creation of Meaning in Language that “Only a feeling transformed into myth can open and discover the world.” Dathon, however, has sacrificed himself upon the altar of comprehensibility, for he has received a fatal blow, and he dies as a pieta in Picard’s arms. As he passes into that undiscovered country, Picard speaks to him in our own mythic-metaphorical tongue: “They became great friends. Gilgamesh and Enkidu at Uruk.”
Tamarian makes explicit our own language’s reliance on the figurative—any language’s use of metaphor, for that matter. “Try as one might, it is impossible to free oneself from figural language,” writes Marjorie Garber in The Use and Abuse of Literature, as “all language is figurative.” Examine my first paragraph, for there are several mythic allusions throughout—the first clause of my fourth sentence reads “In the beginning,” a reference to the 16th-century Bible translator William Tyndale’s rendering of the Hebrew Bereshith in Genesis (later incorporated into the more famous King James Version, as well as the translation of a different koine phrase in John 1:1). My penultimate sentence has two mythopoeic references: one to Dathon having “sacrificed himself upon the altar,” and a mention of the “pieta,” a word that translates to “piety” and often refers to Christ dead in the arms of the Virgin Mary. Such mythic allusions abound in language. For example—as a writer my Achilles’ Heel is that I take on Herculean tasks like writing an overview of metaphor, requiring me to stop resting on my laurels and to open up a Pandora’s Box, so that I can leave no stone unturned; knock on wood, and don’t treat me like a Cassandra, but let’s hope that I can cut the Gordian Knot here. Such Greco-Roman adornments aren’t just mythic allusions, they’re also dead or at least dying metaphors—for who among us ever tells somebody that we’re “between a rock and a hard place” while thinking of the Scylla and Charybdis in Homer’s The Odyssey? That’s another way to say that they’re clichés, but mythic connections abound in less noticeable ways too, though perhaps I’d do well this Wednesday to render unto Odin what belongs to Odin and to spare a thought for the Germanic goddess Frigg when the work week draws to a close.

Star Trek: TNG screenwriters Joe Menosky and Philip LaZebnik make “metaphorical” synonymous with “mythic,” though figurative language draws on more than just classical tales of gods and heroes; it draws on anything that transfers meaning from one realm into another. In my first paragraph, I described Stewart’s stoicism as “granitoid,” though he is not coarse-grained igneous rock; later I deployed a simile (rhetoricians disagree on how different that conceit is from metaphor) where I said that Dathon looked like a cross between a pig and a walnut. I can’t claim that these are great metaphors, but they were my attempt to steer the course of the ship away from the rocky shoals of cliché. “A newly invented metaphor assists thought by evoking a visual image,” writes George Orwell in “Politics and the English Language,” his classic essay of 1946, “while on the other hand a metaphor which is technically ‘dead’… has in effect reverted to being an ordinary word and can generally be used without loss of vividness.” Between the arresting, lyrical, novel metaphor of the adept poet and the humdrum language of the everyday is the “huge dump of worn-out metaphors which have lost all evocative power and are merely used because they save people the trouble of inventing phrases for themselves.” Orwell lists around a dozen clichés, including “ride roughshod over,” “grist to the mill,” “swansong,” and “toe the line.” When I encounter clichés such as these, in other people’s writing but especially in my own prose, they appear to me like a dog turd hidden under the coffee table; their fumes make my eyes water and my stomach churn. Cliché must be ruthlessly expunged by red pen at every opportunity. But those two other categories that Orwell lists are entirely more interesting.

Dead metaphors—not just those on life support but also those decomposing in the ground—merit us getting a shovel and some smelling salts. Tamarian was imagined as structured by metaphor, but the vast majority of words you use every day were originally metaphors. Take the word “understand”; in daily communication we rarely parse its implications, but the word itself is a spatial metaphor. Linguist Guy Deutscher explains in The Unfolding of Language: An Evolutionary Tour of Mankind’s Greatest Invention how “understand” derives from the Middle English understanden, itself from the Anglo-Saxon understandan, and ultimately from the Proto-Germanic understandana. “The verb ‘understand’ itself may be a brittle old skeleton by now,” Deutscher writes, “but its origin is still obvious: under-stand originally must have meant something like ‘step under,’ perhaps rather like the image in the phrase ‘get to the bottom of.’” Such spatial language is common, with Deutscher listing “the metaphors that English speakers use today as synonyms: we talk of grasping the sense, catching the meaning, getting the point, following an explanation, cottoning on to an idea, seeing the difficulty.” The word “comprehend” itself is a metaphor, with an etymology in the Latin word prehendere, which means “to seize.” English has a propensity for those sorts of metaphors, with foreign loan words from Anglo-Saxon, Frisian, and Norman; Latin, French, Italian, and Spanish; Abenaki, Wolof, and Urdu, which don’t announce themselves as metaphors precisely because of their foreignness—and yet that rich vein of the figurative runs through everything. Don Paterson gives two examples in his voluminous study The Poem, explaining how a word as common as “tunnel” is from the “Medieval English tonel, a wide-mouthed net used to trap birds, so its first application to ‘subterranean passage’ will have been metaphorical—and would inevitably have carried the connotation ‘trap’ for a little while.” Similarly, the word “urn” comes from “the Latin urere, to burn, bake,” the word itself holding a connotative poetry of immolation. In both these examples, “the original neologists will have been aware of their relation to earlier terms,” Paterson writes, “but this conscious awareness can be lost very rapidly.” If you’re lucky enough to have a subscription to the Oxford English Dictionary, you can trace the etymology of prosaic words and find their glamorous metaphorical beginnings. Within the seemingly hollow throat of every word resides the voice of metaphor, no matter how faint its call to us may be.

The manner in which metaphors fossilize into everyday words is not unlike the process during the fetid days of the Carboniferous Period when giant spiders and scorpions, salamanders and sharks, scaled trees and seeded plants would sink into some shallow tropical pool and fossilize until dug out of the ground by a coal miner in Wales or West Virginia. Just as miners occasionally find deposits in the shape of a trilobite, so too do some dead metaphors appear as obvious clichés, but the bulk of our language is so dark and hard that it might as well be bituminous. Yet as burning coal releases heat, so too does the glow of meaning come off of these rocks that we call words, the once vital energy of metaphor still hidden inside. “However stone dead such metaphors seem,” writes I.A. Richards in The Philosophy of Rhetoric, “we can easily wake them up.” Such evolutionary development is what linguists call “semantic change,” as tangible and concrete words are used as metaphors, transferring intent in a one-way direction towards abstraction. “The mind cannot just manufacture words for abstract concepts out of thin air—all it can do is adapt what is already available. And what’s at hand are simple physical concepts,” writes Deutscher, so that when I say that I’m trying to establish a foundation for my argument, I draw from the concrete metaphors of construction, while it would be odd to appropriate abstraction to describe something tangible. “One thing is certain,” Deutscher says, “nothing can come from nothing.”

Every word is a metaphor; every phrase, no matter how prosaic, is a poem—even if it’s mute. Words don’t correspond to reality; they only correspond to one another. All languages are a tapestry of threads forever floating above the ground. “Metaphor is the transference of a term from one thing to another, whether from genus to species, species to genus, species to species, or by analogy,” as Aristotle defined it in his Poetics, the first book to taxonomize the trope. The word metaphor is itself, rather predictably, a metaphor. The Greek μεταφορά, as Aristotle’s explanation literally states, translates as “to transfer,” with all the connotations of transport, shifting, movement, and travel. As contemporary travelers to Greece still discover, delivery vans have the word metaphora painted on their side, something that was a delight to Ricoeur. No language can lack metaphor for the same reason that no tongue can lack fiction; it’s not an issue of grammar in the same way that there are languages incapable of tense or person, but rather figuration is the domain of rhetoric. Wherever there are words used to designate one thing, the moment somebody uses those terms to refer to something else, we are within the realm of metaphor. This, it must be said, is rather different from encouraging novel metaphors, or enshrining metaphor as a poetic device, or really even noticing that it exists.

During the Middle Ages, there was a literary fascination with metaphor’s gilded siblings—parable and allegory—but the explication of figuration’s significance didn’t move much beyond Aristotle’s original definition. By 1589, the great English rhetorician George Puttenham would still define the term in The Art of English Poesy as involving “an inversion of sense by transport,” and that belief that metaphor gets you somewhere else remains central, metaphorically at least. The contemporary study of metaphor begins with Richards’s invaluable 1936 The Philosophy of Rhetoric. Because the subject went largely unexplored over two millennia, Richards complained that the “detailed analysis of metaphors, if we attempt it with such slippery terms as these, sometimes feels like extracting cube-roots in the head.” He had a method of exactness, however, for Richards presents a novel vocabulary, a vocabulary that—also unsurprisingly—is metaphorical. According to Richards, a metaphor is composed of two parts, the “tenor” and the “vehicle.” The first is whatever it is that’s being described, and the second is that whose attributes are being carried over in the description of the former (shades of that Greek delivery van). “For the Lord God is a sun and a shield,” writes the Psalmist, providing us an opportunity to see how Richards’s reasoning works. Belying the fundamentalist fallacy that the Bible is a literal text—though nothing is—it’s clear to most that God is neither a giant ball of ionized hydrogen undergoing nuclear fusion into helium, nor is He defensive armor. What God is, in King David’s metaphor, is the tenor, because He is what is being described. The attributes that are being borrowed from “sun” and “shield” are those connotations of being life-giving, luminescent, warm, as well as being defensive and protective. “Sun” and “shield” are even gently contradictory, as blocking out the former with the latter can testify; but it’s also a demonstration of how the non-literal can express reality in its glorious paradox. In exactly the same manner, Robert Frost’s central conceit in “The Road Not Taken” uses the vehicle of an arduous and uncertain wooded path for the implied tenor of the narrator’s life. A great richness of poetic metaphor—as separate from cliché—is that it allows for ambiguity of referent, so that meaning is a many-colored thing. What makes a poetic metaphor successful is the delicate interplay between tenor and vehicle. A poet’s task is to gauge that distance just right, and to somehow surprise the reader while doing so, without befuddling them.

Poets are less the unacknowledged legislators of the world than they are its wizards, because the form has “the power to define reality,” as George Lakoff and Mark Johnson write in Metaphors We Live By. Or maybe it’s more accurate to say that those who nurture metaphors are like apiarists, since the figurative is like pollen from a flower and each word a bee, meaning traded in a continual hum. Language—and thus reality—is supersaturated with meaning, everything capable of being transformed into something else. “If all language is metaphorical,” writes David Punter in his short study Metaphor, “then it could also follow that we might want to say that all language is continually involved in a series of acts of translation: translating things which are difficult to apprehend into things we can apprehend.” Just as translation is based in difference, so is all communication. All language is relational, a communion between that-which-is and that-which-isn’t. Because words are always arbitrarily connected to that which they represent, language is intrinsically metaphorical, the tethering of random shapes on a page or the vibration of air molecules to an outside sense, the compulsion of someone with a mind different from your own imagining that which you wish them to. Without metaphor there’s no fiction, and without fiction there’s no metaphor. Without either, there’s no possibility of communication. Metaphor is a bridge that doesn’t exist that you’re still able to cross, for faith in being understood is that which gets you to the other side.

Figurative language encompasses the expansiveness of metaphor; the insularity of metonymy; the granularity of synecdoche; the straightforwardness of simile; the folksiness of parable; the transcendence of allegory. We don’t just read metaphor in literature; humans have always seen it in events, in nature, in the cosmos, in any manner of thinking that sees existence as overdetermined, as meaning permeating that which would otherwise be inert. We see metaphor in the hidden grinning skull of Hans Holbein’s painting The Ambassadors and the melting clocks in Salvador Dalí’s The Persistence of Memory. It’s in F. Scott Fitzgerald’s green light and Herman Melville’s whale. It’s in the Anglo-Saxon form of the “kenning,” where the sky is a “bird-road,” and in Homer’s evocation of the Aegean “wine-dark sea”; in saying that space-time can “curve” and that genes are “selfish”; in René Descartes’s description of the body as a machine and William Paley’s claim that the universe is a clock, and in the understanding that God can be both Mother and Father. Politics is threaded through with metaphor, narrative, and artifice—the most effective means of getting the masses to listen to you, for both good and evil. Metaphor is both what facilitated the horror of the Rwandan genocide when Hutu propagandists described the Tutsis as “cockroaches” and what generates the hopefulness in the socialist call for “bread and roses.” Symbolism undergirds both the eagle in the Seal of the United States of America and that of the Third Reich. That the same animal is used for both only emphasizes how mercurial metaphor happens to be. As Punter explains, “metaphor is never static, and rarely innocent.” Figuration’s precise power and danger come from such slipperiness, as everything is forever morphing and mutating between the metaphorical and the literal.

When Patrick Stewart first came to the United States in 1978, arriving in Los Angeles where he would make his greatest fame as a starship captain, it was as a 37-year-old member of the Royal Shakespeare Company performing the role of Duke Senior in As You Like It at the Ahmanson Theatre. Traipsing through the enchanted forest of Arden, Stewart appeared opposite Alan Howard, who performed as Jaques. During the seventh scene of the second act, Howard (celebrated for voicing Sauron in Peter Jackson’s Lord of the Rings films) uttered the most celebrated extended metaphor in Shakespeare’s literary career of extended metaphors: “All the world’s a stage,/And all the men and women merely players;/They have their exits and their entrances;/And one man in his time plays many parts.” Conventionally understood as referring to the various roles we all perform over the course of our lives, it’s an apt encapsulation of metaphor itself. All of communication is a stage, and every word is merely a player, and one word can play many parts. When it comes to figuration, Jaques’s monologue is right up there with Darmok and Jalad at Tanagra. Because nothing in the English language is as perfect—as immaculate, as blessed, as sacred, as divine—as the word “like.” What it effects is a syntactical transubstantiation, the transformation of one thing into another. If everything—our concepts, our words, our minds—is sealed off behind a wall of unknowability, then metaphor is that which can breach those walls. Whether implied or stated, “like” is a bridge transporting sense across the chasm of difference, providing intimations of how all ideas and things are connected to each other by similarities, no matter how distant. Whether uttered or only implied, “like” is the wispy filament that nets together all things. Perhaps there is naked reality, but we’ll never be able to see it ourselves, always clothing it in finery that is continually put on and taken off. In our life, metaphor is the intimate kiss between difference.

Image Credit: Wikipedia

Nothing Outside the Text

-

Let’s pretend that you work in a windowless office on Madison, or Lex, or Park in the spring of 1954, and you’re hurrying to Grand Central to avoid the rush back to New Rochelle, or Hempstead, or Weehawken. Walking through the Main Concourse with its Beaux Arts arches and its bronzed clock atop the ticketing booth, its cosmic green ceiling fresco depicting the constellations slowly stained by the accumulated cigarette smoke, you decide to purchase an egg-salad sandwich from a vendor near the display with its ticking symphony of arrivals and departures. You glance down at a stack of slender magazines held together with thick twine. The cover illustration is of a buxom brunette wearing a yellow pa’u skirt and hauling an unconscious astronaut in a silver spacesuit through a tropical forest while fur-covered extraterrestrials look on between the palm trees. It’s entitled Universe Science Fiction. And if you were the sort of Manhattan worker who, after clocking in eight hours at the Seagram Building or the Pan Am Building, settles into the commuter train’s seat—after loosening his black-knit tie and lighting a Lucky Strike while watching Long Island go by—ready to be immersed in tales of space explorers and time travelers, then perhaps you parted with a quarter so that Universe Science Fiction’s cheap print would smudge your gray flannel suit. The sort of reading you’d want to cocoon yourself in with nothing outside the text. As advertised, you find a story of just a few hundred words entitled “The Immortal Bard,” by a writer named Isaac Asimov.

Reprinted three years later in Earth Is Room Enough, “The Immortal Bard” is set among sherry-quaffing, tweedy university faculty at their Christmas party, where a boozed-up physicist named Phineas Welch corners the English professor Scott Robertson and explains how he’s invented a method of “temporal transference” able to propel historical figures into the present. Welch resurrects Archimedes, Galileo, and Isaac Newton, but “They couldn’t get used to our way of life… They got terribly lonely and frightened. I had to send them back.” Despite their genius, their thought wasn’t universal, and so Welch conjures William Shakespeare, believing him to be “someone who knew people well enough to be able to live with them centuries away from his own time.” Robertson humors the physicist, treating such claims with a humanist’s disdain, true to C.P. Snow’s observation in The Two Cultures and the Scientific Revolution that “at one pole we have the literary intellectuals, at the other scientists… Between the two a gulf of mutual incomprehension.” Welch explains that the Bard was surprised that he was still taught and studied; after all, he wrote Hamlet in a few weeks, just polishing up an old plot. But when introduced to literary criticism, Shakespeare can’t comprehend anything. “God ha’ mercy! What cannot be racked from words in five centuries? One could wring, methinks, a flood from a damp clout!” So, Welch enrolls him in a Shakespeare course, and suddenly Robertson begins to fear that this story isn’t just a delusion, for he remembers the bald man with a strange brogue who had been his student. “I had to send him back,” Welch declares, because our most flexible and universal of minds had been humiliated. The physicist downs his cocktail and mutters, “you poor simpleton, you flunked him.”

“The Immortal Bard” doesn’t contain several of the details that I include—no sherry, no tweed (though there are cocktails). We have no sense of the characters’ appearances; Welch’s clothes are briefly described, but Robertson is a total blank. A prolific author who penned over 1,000 words a day, Asimov had published more than 500 books across all categories of the Dewey Decimal System (including Asimov’s Guide to Shakespeare) by his death in 1992. Skeletal parsimony was Asimov’s idiom; in his essay “On Style” from Who Done It?, coedited with Alice Laurence, he described his prose as “short words and simple sentence structure,” bemoaning that this characteristic “grates on people who like things that are poetic, weighty, complex, and, above all, obscure.” Had his magisterial Foundation been about driving an ambulance in the First World War rather than an empire spanning the galaxy, it’d be the subject of hundreds of dissertations. If Arthur C. Clarke’s “The Nine Billion Names of God” had been written in Spanish, its author would be hailed as another Jorge Luis Borges; had Asimov’s “The Last Question” originally been in Italian, then he’d be Italo Calvino (American critics respect fantasy in an accent, but then they call it “magical realism”). As it was, critical evaluation was more lackluster, with the editors of 1981’s Dictionary of Literary Biography claiming that since Asimov’s stories “clearly state what they mean in unambiguous language [they] are… difficult for a scholar to deal with because there is little to be interpreted.”

Asimov’s dig comes into sharper focus once he admitted that “The Immortal Bard” was revenge on those professors who’d rankled him by misinterpreting stories—his and others’. It was a castigation of the discipline as practiced in 1954, which meant a group that had dominated the study of literature for three decades—the New Critics. With their rather uninspired name, the New Critics—including I.A. Richards, John Crowe Ransom, Cleanth Brooks, Allen Tate, Robert Penn Warren, William Empson, and a poet of some renown named T.S. Eliot (among others)—so thoroughly revolutionized how literature is studied that explaining why they’re important is like explaining air to a bird. From Yale, Cambridge, and Kenyon, the New Criticism disseminated, trickling down through the rest of the educational system. If an English teacher asked you to analyze metaphors in Shakespeare’s “Sonnet 12”—that was because of the New Critics. If an AP instructor asked you to examine symbolism in F. Scott Fitzgerald’s The Great Gatsby—that was because of the New Critics. If a college professor made you perform a rhetorical analysis of Virginia Woolf’s Mrs. Dalloway—that was because of the New Critics. Most of all, if anyone ever made you conduct a “close reading,” it is the New Critics who are to blame.

According to the New Critics, their job wasn’t an aesthetic one, parsing what was beautiful about literature or how it moved people (the de facto approach in Victorian classrooms, as it still is in popular criticism), but an analytical one. The task of the critic was scientific—to understand how literature worked, and to be as rigorous, objective, and meticulous as possible. That meant bringing nothing to the text but the text. Neither the critic’s concerns nor the author’s mattered more than the placement of a comma, the connotation of a particular word, the length of a sentence. True that they often unfairly denigrated worthy writing because their detailed explications only lent themselves to certain texts. Poetry was elevated above prose; the metaphysical over the Romantic; the “literary” over genre. But if the New Critics were snobbish in their preferences, they also weren’t wrong that there was utility in the text’s authority. W.K. Wimsatt and Monroe Beardsley argued in a 1946 issue of The Sewanee Review that the “design or intention of the author is neither available nor desirable as a standard for judging the success of a work of literary art.” In a more introspective interview, Asimov admitted that what inspired “The Immortal Bard” was his inadequacy at answering audience questions about his own writing. Asimov was right that he deserved more attention from academics, but wrong in assuming that he’d know more than they did. The real kicker of the story might be that Shakespeare actually earned his failing grade.

When New Critics are alluded to in popular culture, it’s as anal-retentive killjoys. Maybe the most salient (and unfair) drubbing the New Critics received was in the 1989 (mis)beloved Peter Weir film Dead Poets Society, featuring Robin Williams’s excellent portrayal of John Keating, an English teacher at elite Welton Academy in 1959. Keating arrives at the rarefied school urging the boys towards carpe diem, confidently telling them that the “powerful play goes on, and you may contribute a verse.” With vitality and passion, Keating introduces the students to Tennyson, Whitman, and Thoreau, and the audience comprehends that this abundantly conservative curriculum is actually an act of daring, at least when contrasted to the New Critical orthodoxy that had previously stultified the children of millionaires. On his first day, wearing corduroy with leather arm patches, Keating asks a student to recite from their approved textbook by Professor J. Evans Pritchard. In a monotone, the student reads “If the poem’s score for perfection is plotted on the horizontal of a graph and its importance is plotted on the vertical, then calculating the total area of the poem yields the measure of its greatness.” We are to understand that after the cold scalpel of analysis cuts the warm flesh of the poem, the poem, if it wasn’t already dead, certainly perishes thereafter. Keating pronounces the argument to be “Excrement,” and demands his students rip the page out, so that a film that defends the humanities depicts the destruction of books. “But poetry, beauty, romance, love, these are what we stay alive for,” Keating tells his charges, and how could dusty Dr. Pritchard compete with a Cartesian coordinate plane?

The fictional Pritchard’s essay is an allusion to Brooks and Warren’s influential 1938 Understanding Poetry, where they write that “Poetry gives us knowledge… of ourselves in relation to the world of experience, and to that world considered… in terms of human purposes and values.” Not soaring in its rhetoric, but not the cartoon from Dead Poets Society either, though also notably not what Keating advocated. He finds room for poetry, beauty, romance, and love, but neglects truth. What Keating champions isn’t criticism, but appreciation, and while the former requires rigor and objectivity, the latter only needs enjoyment. Appreciation in and of itself is fine—but it doesn’t necessarily ask anything of us either. Kevin Dettmar, in his defenestration of the movie in The Atlantic, writes that “passion alone, divorced from the thrilling intellectual work of real analysis, is empty, even dangerous.” Pontificating in front of his captive audience, Keating recites (and misinterprets) poems from Robert Frost and Percy Shelley, demanding that “When you read, don’t just consider what the author thinks, consider what you think.” Estimably sensible, for on the surface an injunction towards critical thought and independence seems reasonable, certainly when reading an editorial, or a column, or a policy proposal—but this is poetry.

His invocation is diametrically opposed to another New Critical principle, also defined by Wimsatt and Beardsley in The Sewanee Review in 1949, and further explicated in their book The Verbal Icon: Studies in the Meaning of Poetry, published the same year as Asimov’s story. Wimsatt and Beardsley say that genuine criticism doesn’t “talk of tears, prickles, or other physiological symptoms, of feeling angry, joyful, hot, cold, or intense, or of vaguer states of emotional disturbance, but of shades of distinction and relation between objects of emotion.” In other words, it’s not all about you. Being concerned with the poem’s effect tells us everything about the reader but little about the poem. Such a declaration might seem arid, sterile, and inert—especially compared to Keating’s energy—and yet there is paradoxically more life in The Verbal Icon. “Boys, you must strive to find your own voice,” Keating tells a room full of the children of wealth, power, and privilege, who will spend the whole 20th century ensuring that nobody has the option not to hear them. Rather than sounding the triumphant horn of independence, this is mere farting on the bugle of egoism. Poetry’s actual power is that it demands we shut up and listen. Wimsatt and Beardsley aren’t asking the reader not to be changed by a poem—it’s the opposite. They’re demanding that we don’t make an idol from our relative, contingent, arbitrary reactions. A poem is a profoundly alien place, a foreign place, a strange place. We do not go there to meet ourselves; we go there to meet something that doesn’t even have a face. Keating treats poems like mirrors, but they’re windows.

With austerity and sternness, New Criticism is an approach that with sola Scriptura exactitude acknowledges nothing above, below, behind, or beyond the text. Equal parts mathematician and mystic, the New Critic deems objectivity the preeminent goal, for in the novel, or poem, or play properly interpreted she has entered a room separate, an empire of pure alterity. An emphasis on objectivity doesn’t entail singularity of interpretation, for though the New Critics believed in right and wrong readings, good and bad interpretations, they reveled in nothing as much as ambiguity and paradox. But works aren’t infinite. If a text can be many things, it also can’t mean everything. What is broken are the tyrannies of relativism, the cult of “I feel” that defined conservative Victorian criticism and, ironically, some contemporary therapeutic manifestations as well. New Criticism drew from the French tradition of explication de texte, the rigorous parsing of grammar, syntax, diction, punctuation, imagery, and narrative. Not only did they supplant Victorian aesthetic criticism’s woolliness, but their method was also estimably democratic (despite their sometimes-conservative political inclinations, especially among the movement known as the Southern Agrarians). Democratic because none of the habituated knowledge of the upper class—the mores of Eton or Phillips Exeter, summering at Bath or Newport, proper dress from Harrods or Brooks Brothers—now made a difference in appreciating Tennyson or Byron. Upper-class codes were replaced with the severe, rigorous, logical skill of being able to understand Tennyson or Byron, with no recourse to where you came from or who you were, but only the words themselves. Appreciation is about taste, discernment, and breeding—it’s about acculturation. Analysis? That’s about poetry.

From that flurry of early talent came Richards’s 1924 Principles of Literary Criticism and 1929 Practical Criticism, Empson’s 1930 Seven Types of Ambiguity, Brooks and Warren’s 1938 Understanding Poetry, Brooks’s classic 1947 The Well Wrought Urn: Studies in the Structure of Poetry, and Wimsatt and Beardsley’s 1954 The Verbal Icon, as well as several essays of Ransom and Eliot. The New Critics did nothing less than upend literature’s study by focusing on words, words, words (as Hamlet would say). “A book is a machine to think with,” Richards wrote in Principles of Literary Criticism, and you may disdain that as chilly, but machines do for us that which we can’t do for ourselves. Reading yourself into a poem is as fallacious as sitting in a parked car and thinking that will get you to Stop & Shop. Lest you think that Richards is sterile, he also affirmed that “Poetry is capable of saving us,” and that’s not in spite of it being a machine, but because of it. They sanctified literature by cordoning it off and making it a universe unto itself, while understanding that its ideal rigidity is like absolute zero, an abstraction of readerly investment that by necessity always lets a little heat in. In practice, the job of the critic is profound in its prosaicness. Vivian Bearing, a fictional English professor in Margaret Edson’s harrowing and beautiful play W;t, which contains the most engaging dramatization of close reading ever committed to stage or screen, describes the purpose of criticism as not to reaffirm whatever people want to believe, but rather that by reading in an “uncompromising way, one learns something from [the] poem, wouldn’t you say?”

There have been assaults upon this bastion over the last several decades, yet even while that mélange of neo-orthodoxies that became ascendant in the ’70s and ’80s when English professor was still a paying job is sometimes interpreted as dethroning the New Critics—the structuralists and post-structuralists, the New Historicists and the Marxists, the Queer Theorists and the post-colonial theorists—their success was an ironic confirmation of the staying power of Wimsatt, Beardsley, Brooks, Warren, Richards, Empson, and so on. “Like all schools of criticism, the New Critics have been derided by their successors,” writes William Logan in his foreword to Garrick Davis’s Praising It New: The Best of the New Criticism, “but they retain an extraordinary influence on the daily practice of criticism.” After all, when a post-structuralist writes about binary oppositions, there are shades of Empson’s paradox and ambiguity; when Roland Barthes wrote in 1967 that “the reader is without history, biography, psychology” there is a restatement of the affective fallacy, and that he was writing in an essay called “The Death of the Author” is the ultimate confirmation of the case against intention. And Jacques Derrida’s deconstruction, that much maligned and misinterpreted word, bane to conservative and balm to radical? It’s nothing more than a different type of close reading—a hyper-attenuated and pure form of it—where pages could be produced on a single comma in James Joyce’s Ulysses. We are still their children.

Though far from an unequivocal celebrant of the New Critics, Terry Eagleton writes in Literary Theory: An Introduction that close reading provided “a valuable antidote to aestheticist chit-chat,” explaining that the method does “more than insist on due attentiveness to the text. It inescapably suggests an attention to… the ‘words on the page.'” Close reading is sometimes slandered as brutal vivisection, but it’s really a manner of possession. In the sifting through of diction and syntax, grammar and punctuation, image and figuration, there are pearls. Take a sterling master—Helen Vendler. Here she is examining Shakespeare’s Sonnet 73, where he writes “In me thou see’st the glowing of such fire/That on the ashes of his youth doth lie.” She notes that the narrator across the lyric “Defines himself not by contrast but by continuity with his earlier state. He is the glowing—a positive word, unlike ruin or fade—of a fire.” Or Vendler on the line in Emily Dickinson’s poem 1779. The poet writes that “To make a prairie it takes a clover and a bee,” and the critic writes “Why ‘prairie’ instead of ‘garden’ or ‘meadow?’ Because only ‘prairie’ rhymes with ‘bee’ and ‘revery,’ and because ‘prairie’ is a distinctly American word.” Or, Vendler, once again, on Walt Whitman, in her book Poets Thinking: Pope, Whitman, Dickinson, Yeats. In Leaves of Grass he begins, “I celebrate myself, and sing myself,” and she observes that “The smallest parallels… come two to a line… When the parallels grow more complex, each requires a whole line, and we come near to the psalmic parallel, so often imitated by [the poet], in which the second verse adds something to the substance of the first.” And those are just examples from Helen Vendler.

When I queried literary studies Twitter about their favorite close readings—to which they responded generously and enthusiastically—I was directed towards Erich Auerbach’s Dante: Poet of the Secular World (to which I’d add Mimesis: The Representation of Reality in Western Literature); Marjorie Garber’s reading of Robert Lowell’s poem “For the Union Dead” in Field Work: Sites in Literary and Cultural Studies; the interpretation of Herman Melville’s Billy Budd in The Barbara Johnson Reader: The Surprise of Otherness; Shoshana Felman’s paper on Henry James titled “Turning the Screw of Interpretation” in Yale French Studies; Susan Howe’s My Emily Dickinson; Randall Jarrell writing about Robert Frost’s “Home Burial” in The Third Book of Criticism; Olga Springer’s Ambiguity in Charlotte Brontë’s Villette; Northrop Frye’s Fearful Symmetry: A Study of William Blake; Vladimir Nabokov on Charles Dickens and Gustave Flaubert in Lectures on Literature; Ian Watt’s The Rise of the Novel: Studies in Defoe, Richardson, and Fielding; Nina Baym on Nathaniel Hawthorne in Novels, Readers, Reviewers: Responses to Fiction in Antebellum America; Minrose Gwin on The Sound and the Fury in The Feminine and Faulkner; Edward Said on Jane Austen’s Mansfield Park in Orientalism; Christopher Ricks’s Milton’s Grand Style (to which I’d add Dylan’s Visions of Sin, which refers not to Thomas but Bob), and so on, and so on, and so on. If looking to analyze my previous gargantuan sentence, just note that I organized said critics by no schema, save to observe how such a diversity includes the old and young, the dead and alive, the traditional and the radical, all speaking to the vitality of something with as stuffy a name as close reading. Risking sentimentality, I’d add other exemplary participants—all of those anonymous graduate students parsing the sentence, all of those undergraduates scanning the line, and every dedicated reader offering attentiveness to the words themselves. That’s all close reading really is.

Naïve to assume that such a practice offers a way out of our current imbroglio. However, when everyone has an opinion but nobody has done the reading, when an article title alone is enough to justify judgment, and when criticism tells us more about the critic than the writing, then perhaps slow, methodical, humble close reading might provide more than just explication. Interpretations are multifaceted, but they are not relative, and for all of its otherworldliness, close reading is built on evidence. Poorly done close reading is merely fan fiction. There is something profound in acknowledging that neither the reader nor the author is preeminent, but rather that the text is the thing. It doesn’t serve to affirm what you already know, but rather to instruct you in something new. To not read yourself into a poem, or a novel, or a play, but to truly encounter another mind—not that of the author, but of literature itself—is as close to religion as the modern age countenances. Close reading is the most demonstrative way to experience that writing and reading are their own dimension.

Let’s pretend that you’re a gig worker, and while waiting to drive for Uber or Seamless, you scroll through an article entitled “Nothing Outside the Text.” It begins in the second person, inserting the reader into the narrative. The author invents a mid-century office worker who is travelling home. Place names are used as signifiers; the author mentions “New Rochelle,” “Hempstead,” and “Weehawken,” respectively in Westchester County, Long Island, and New Jersey, gesturing towards how the city’s workers radiate outward. Sensory details are emphasized—the character buys an “egg-salad sandwich,” and we’re told that he stands near the “display with its ticking symphony,” the last two words unifying the mechanical with the artistic—with the first having an explosive connotation—yet there is an emphasis on regularity, harmony, design. We are given a description of a science fiction magazine the man buys, and are told that he settles into his train seat where the magazine’s “cheap print… smudge[s]” his “gray flannel suit,” possibly an allusion to the 1955 Sloan Wilson novel of that name. This character is outwardly conformist, but his desire to be “immersed in tales of space explorers and time travelers” signals something far richer about his inner life. Finally, if you’re this modern gig worker reading about that past office worker, you might note that the latter is engaging in the “sort of reading you’d want to cocoon yourself in, a universe to yourself with nothing outside the text.” And in close reading, whether you’re you or me, the past reader or the present, the real or imagined, all that the text ever demands of us—no more and no less—is to enter into that universe on its own terms. For we have always been, if we’re anything, citizens of this text.

Image Credit: Pixabay

This Isn’t the Essay’s Title

-

“The test of a first-rate intelligence is the ability to hold two opposed ideas in the mind at the same time, and still retain the ability to function.” —F. Scott Fitzgerald, “The Crack-Up” (1936)

“You’re so vain, you probably think this song is about you.” —Carly Simon, “You’re So Vain” (1972)

On a December morning in 1947 when three fellows at Princeton’s Institute for Advanced Study set out for the federal courthouse in Trenton, it was decided that the job of making sure that the brilliant but naively innocent logician Kurt Gödel didn’t say something intemperate at his citizenship hearing would fall to Albert Einstein. Economist Oskar Morgenstern would drive, Einstein rode shotgun, and a nervous Gödel sat in the back. With squibs of low winter light, both wave and particle, dappled across the rattling windows of Morgenstern’s car, Einstein turned back and asked, “Now, Gödel, are you really well prepared for this examination?” There had been no doubt that the philosopher had adequately studied, but whether it was proper to be fully honest was another issue. Less than two centuries before, the signatories of the U.S. Constitution had supposedly crafted a document defined by separation of powers and coequal government, checks and balances, action and reaction. “The science of politics,” wrote Alexander Hamilton in “Federalist No. 9,” “has received great improvement,” though as Gödel discovered, clearly not perfection. With a completism that only a Teutonic logician was capable of, Gödel had carefully read the foundational documents of American political theory, he’d pored over the Federalist Papers and the Constitution, and he’d made an alarming discovery.

It’s believed that while studying Article V, the portion that details the process of amendment, Gödel realized that there was no safeguard against that article itself being amended. Theoretically, a sufficiently powerful political movement with legislative and executive authority could rapidly amend the articles of amendment so that a potential demagogue would be able to rule by fiat, all while such tyranny was perfectly constitutional. A paradox at the heart of the Constitution—something that supposedly guaranteed democracy having coiled within it rank authoritarianism. All three men driving to Trenton had a keen awareness of tyranny; all were refugees from Nazi Germany; all had found safe haven on the pristine streets of suburban Princeton. After the Anschluss, Gödel was a stateless man, and though raised Protestant he was viewed as suspect by the Nazis and forced to emigrate. Gödel, with his wife, departed Vienna by the Trans-Siberian railroad, crossed from Japan to San Francisco, and then took the remainder of his sojourn by train to Princeton. His path had been arduous and he’d earned America, so when Gödel found a paradox at the heart of the Constitution, his desire to rectify it was born from patriotic duty. At the hearing, the judge asked Gödel how it felt to become a citizen of a nation where it was impossible for the government to fall into anti-democratic tyranny. But it could, Gödel told him, and “I can prove it.” Apocryphally, Einstein kicked the logician’s chair and ended that syllogism.

Born in Austria-Hungary, citizen of Czechoslovakia, Austria, Germany, and finally the United States, Gödel’s very self-definition was mired in incompleteness, contradiction, and unknowability. From parsing logical positivism among luminaries such as Rudolf Carnap and Moritz Schlick, enjoying apfelstrudel and espresso at the Café Reichsrat on Rathausplatz while they discussed the philosophy of mathematics, Gödel now rather found himself eating apple pie and drinking weak coffee in the Yankee Doodle Tap Room on Nassau Street—and he was grateful. Gone were the elegant Viennese wedding-cake homes of the Ringstrasse, replaced with Jersey’s clapboard colonials; no more would Gödel debate logic among the rococo resplendence of the University of Vienna, but at Princeton he was at least across the hall from Einstein. “The Institute was to be a new kind of research center,” writes Ed Regis in Who Got Einstein’s Office?: Eccentricity and Genius at the Institute for Advanced Study. “It would have no students, no teachers, and no classes,” the only responsibility being pure thought, so that its fellows could be purely devoted to theory. Its director J. Robert Oppenheimer (of Manhattan Project fame) called it an “intellectual hotel;” physicist Richard Feynman was less charitable, referring to it as a “lovely house by the woods” for “poor bastards” no longer capable of keeping up. Regardless, it was to be Gödel’s final home, and there was something to that.

Seventeen years before his trip to Trenton, it was at the Café Reichsrat that he presented the discovery with which he’d forever be inextricably linked—Gödel’s Incompleteness Theorems. In 1930 Gödel irrevocably altered mathematics by demonstrating that the dream of completism that had dogged deduction since antiquity was only a mirage. “Any consistent formal system,” argues Gödel in his first theorem, “is incomplete… there are statements of the language… which can neither be proved nor disproved.” In other words, no consistent system of axioms rich enough for arithmetic can prove every true statement expressible within it—the rationalist dream of a unified, self-evidently provable system is only so much fantasy. Math, it turns out, will never be depleted, since there can never be a solution to all mathematical problems. In Gödel’s formulation, a system must either sometimes produce falsehoods or must contain truths that it cannot prove; it can never be both complete and consistent. As the cognitive scientist Douglas Hofstadter explained in his countercultural classic Gödel, Escher, Bach: An Eternal Golden Braid, “Relying on words to lead you to the truth is like relying on an incomplete formal system to lead you to the truth. A formal system will give you some truth, but… a formal system, no matter how powerful—cannot lead to all truths.” In retrospect, the smug certainties of American exceptionalism should have been no match for Gödel, whose scalpel-like mind had already eviscerated mathematics, philosophy, and logic, to say nothing of some dusty parchment once argued over in Philadelphia.

His theorems rest on a variation of what’s known as the “Liar’s Paradox,” which asks what the logical status of a proposition such as “This statement is false” might be. If that sentence is telling the truth, then it must be false, but if it’s false, then it must be true, ad infinitum, in an endless loop. For Gödel, that proposition is amended to “This sentence is not provable,” with his reasoning demonstrating that a sufficiently powerful formal system can’t settle that proposition, since if the statement could be proved it would be false, and if it can’t be proved then it’s true—a truth forever beyond the system’s reach, yet another grueling loop. As with the Constitution and its paeans to democracy, so must mathematics be rendered perennially useful while still falling short of perfection. The elusiveness of certainty bedeviled Gödel throughout his life; a famously paranoid man, he was pushed into a scrupulous anxiety by the assassination of his friend Schlick by a Nazi student in 1936. After the death of his best friend Einstein in 1955 he became increasingly isolated. “Gödel’s sense of intellectual exile deepened,” explains Rebecca Goldstein in Incompleteness: The Proof and Paradox of Kurt Gödel. “The young man in the dapper white suit shriveled into an emaciated man, entombed in a heavy overcoat and scarf even in New Jersey’s hot humid summers, seeing plots everywhere… His profound isolation, even alienation, from his peers provided fertile soil for that rationality run amuck which is paranoia.” When his beloved wife fell ill in 1977, Gödel quit eating since she could no longer prepare his meals. The ever-logical man whose entire career had demonstrated the fallibility of rationality had concluded that only his wife could be trusted not to poison his food, and so when she was unable to cook, he properly reasoned (given the axioms he had defined) that it made more sense to simply quit eating. When he died, Gödel weighed only 50 pounds.

Gödel’s thought was enmeshed in that orphan of logic that we call paradox. As was Einstein’s, that man who converted time into space and space into time, who explained how energy and mass were the same thing so that (much to his horror) the apocalyptic false dawn of Hiroshima was the result. Physics in the 20th century had cast off the intuitive coolness of classical mechanics, discovering that contradiction studded the foundation of reality. There was Werner Heisenberg with his uncertainty over the location of individual subatomic particles, Louis de Broglie and the strange combination of wave and particle that explained the behavior of light, Niels Bohr who understood atomic nuclei as if they were smeared across space, and the collapsing wave functions of Erwin Schrödinger for whom it could be imagined that a hypothetical feline was capable of being simultaneously alive and dead. Science journalist John Gribbin explains in Schrödinger’s Kittens and the Search for Reality: Solving the Quantum Mysteries that contemporary physics is defined by “paradoxical phenomena as photons (particles of light) that can be in two places at the same time, atoms that go two ways at once… [and how] time stands still for a particle moving at light speed.” Western thought has long prized logical consistency, but physics in the 20th century abolished all of that in glorious absurdity, and from those contradictions emerged modernity—the digital revolution, semiconductors, nuclear power, all built on paradox.

The keystone of classical logic is the so-called “Law of Non-Contradiction.” Simply put, something cannot both be and not be what it happens to be simultaneously, or if symbolic logic is your jam: ¬(p ∧ ¬p), and I promise you that’s the only formula you will see in this essay. Aristotle said that between two contradictory statements one must be correct and the other false—“it will not be possible to be and not to be the same thing” he writes in The Metaphysics—but the anarchic potential of the paradox greedily desires truth and its antithesis alike. And again, in the 17th century the philosopher Gottfried Wilhelm Leibniz tried to succinctly ward off contradiction in his New Essays on Human Understanding when he declared, “Every judgement is either true or false,” and yet paradoxes fill the history of metaphysics like landmines studded across the Western Front. Paradox is the great counter-melody of logic—it is the question of whether an omnipotent God could will Himself unable to do something, and it’s the eye-straining M.C. Escher lithograph “Waterfall” with its intersecting Penrose triangles showing a stream cascading from an impossible trough. Paradox is the White Queen’s declaration in Lewis Carroll’s Through the Looking Glass that “sometimes I believed as many as six impossible things before breakfast,” and the Church Father Tertullian’s creedal statement that “I believe it because it is absurd.” The cracked shadow logic of our intellectual tradition, paradox is confident though denounced by philosophers as sham-faced; it is troublesome and not going anywhere. When a statement is made synonymous with its opposite, then traditional notions of propriety are dispelled and the fun can begin. “But one must not think ill of the paradox,” writes Søren Kierkegaard in Philosophical Fragments, “for the paradox is the passion of thought, and the thinker without the paradox is like the lover without passion: a mediocre fellow.”

As a concept, it may have found its intellectual origin on the sunbaked, dusty, scrubby, hilly countryside of Crete. The mythic homeland of the Minotaur, who is man and beast, human and bull, a walking, thinking, raging horned paradox covered in cowhide and imprisoned within the labyrinth. Epimenides, an itinerant Cretan philosopher some seven centuries before Christ, supposedly said that “All Cretans are liars” (St. Paul actually quotes this assertion in his epistle to Titus). A version of the aforementioned Liar’s Paradox thus ensues. If Epimenides is telling the truth then he is lying, and if he is lying then he is telling the truth. This class of paradoxes has multiple variations (in the Middle Ages they were known as “insolubles”—the unsolvable). For example, consider two sentences vertically arranged; the upper one reads “The statement below is true” and the lower says “The statement above is false,” and again the reader is caught in a maddening feedback loop. Martin Gardner, who for several decades penned the delightful “Mathematical Games” column in Scientific American, asks in Aha! Gotcha: Paradoxes to Puzzle and Delight, “Why does this form of the paradox, in which a sentence talks about itself, make the paradox clearer? Because it eliminates all ambiguity over whether a liar always lies and a truth-teller always tells the truth.” The paradox is a function of language, and in that way is the cousin to tautology, save for the former describing propositions that are always necessarily both true and false.

Any intrinsic meaning is elusive in all of this, so that it would be easy to reject it as rank stupidity, but paradoxes provide a crucial service. In paradox, we experience the breakdown of language and of literalism. Whether paradoxes are glitches in how we arrange our words or symptoms of something more intrinsic, they signify a null-space where the regular ways of thinking, of understanding, of writing, no longer hold. Few crafters of the form are as synonymous with paradox as the fifth-century BCE philosopher Zeno of Elea. Consider his famed dichotomy paradox, wherein Zeno concludes that motion itself must be impossible, since the movement from point A to point B first requires crossing half the distance, then half of what remains, and so on forever (so that the destination itself can never be reached). Or his celebrated arrow paradox, wherein Aristotle explains in Physics that “If everything when it occupies an equal space is at rest at that instant of time, and if that which is in location is always occupying such a space at any moment, the flying arrow is therefore motionless at that instant of time and at the next instant of time.” And yet the arrow still moves. Roy Sorensen explains in A Brief History of the Paradox that the form “developed from the riddles of Greek folklore” (as with the Sphinx’s famous query in Sophocles’s Oedipus Rex), so that words have always mediated these conundrums, while Anthony Gottlieb writes in The Dream of Reason: A History of Philosophy from the Greeks to the Renaissance that “ingenious paradoxes… try to discredit commonsense views by demonstrating that they lead to unacceptable consequences,” in a gambit as rhetorical as it is analytical. Often connected primarily with mathematics and philosophy, paradox is fundamentally a literary genre, and one ironically (or paradoxically?) associated with the failure of language itself. All of the great authors of paradox—the pre-Socratics, Zen masters, Jesus Christ—were at their core storytellers; they were writers. Words stretched to incomprehension and narratives unspooled are their fundamental medium. Epimenides’s utterance triggers a collapse of meaning, but where the literal perishes there is room made for the figurative. Paradox is the mother of poetry.

I’d venture that the contradictions of life are the subject of all great literature, but paradoxes appear in more obvious forms, too. “There was only one catch and that was Catch-22,” writes Joseph Heller. The titular regulation of Heller’s Catch-22 concerned the mental state of American pilots fighting in the Mediterranean during the Second World War, with the policy such that if somebody asks to be grounded because of mental infirmity, the request itself demonstrates their sanity, since anyone who would want to fly must clearly be insane, so that it’s impossible to avoid fighting. The captain was “moved very deeply by the absolute simplicity of this clause of Catch-22 and let out a respectful whistle.” Because politics is often the collective social function of reductio ad absurdum, political novels make particularly adept use of paradox. George Orwell did something similar in his celebrated (and oft-misinterpreted) novel of dystopian horror 1984, wherein the state apparatus trumpets certain commandments, such as “War is peace. /Freedom is slavery. /Ignorance is strength.” Perhaps such dialectics are the (non-Marxist) socialist Orwell’s parody of Hegelian double-speak, a mockery of that supposed engine of human progress that goes through thesis-antithesis-synthesis. Within paradox there is a certain freedom, the ability to understand that contradiction is an attribute of our complex experience, but when statements are also defined as their opposite, meaning itself can be the casualty. Paradox understood as a means to enlightenment bestows anarchic freedom; paradox understood as an end unto itself is nihilism.

Political absurdities are born out of the inanity of rhetoric and the severity of regulation, but paradox can entangle not just society, but the fabric of reality as well. Science fiction is naturally adept at examining the snarls of existential paradox, with time travel a favored theme. Paul Nahin explains in Time Machines: Time Travel in Physics, Metaphysics, and Science Fiction that temporal paradoxes are derived from the simple question of “What might happen if a time traveler changed the past?” This might seem an issue entirely of hermetic concern, save that in contemporary physics neither general relativity nor quantum mechanics precludes time travel (indeed certain interpretations of those theories downright necessitate it). So even the idea of being able to move freely through past, present, and future has implications for how reality is constituted, whether or not we happen to be the ones stepping out of the tesseract. “The classic change-the-past paradox is, of course, the so-called grandfather paradox,” writes Nahin, explaining that it “poses the question of what happens if an assassin goes back in time and murders his grandfather before his (the time-travelling murderer’s) own father is born.” The grandfather’s murder requires a murderer, but for that murderer to be born the grandfather must not have been murdered; only then can the murderer travel back in time and kill his ancestor, and again we’re in a strange loop.

Variations exist as far back as the golden age of the pulps, appearing in magazines like Amazing Stories as early as 1929. More recently, Ray Bradbury explored the paradox in “A Sound of Thunder,” where he is explicit about the paradoxical implications that any travel to the past will alter the future in baroque ways, with a 21st-century tourist accidentally killing a butterfly in the Cretaceous, leading to the election of an openly fascistic U.S. president millions of years into the future (though the divergence of parallel universes is often proffered as a means of avoiding such implications). In Bradbury’s estimation, every single thing in history, every event, every incident, is “an exquisite thing,” a “small thing that could upset balances and knock down a line of small dominoes and then big dominoes and then gigantic dominoes all down the years across Time.” This conundrum need not only be phrased in patricidal terms, for what all temporal paradoxes have at their core is an issue of causality—if we imagine that time progresses from past through future, then what happens when those terms get all mixed up? How can we possibly understand a past that’s influenced by a future that in turn has been affected by the past?

Again, no issue of scholastic quibbling, for though we experience time as moving forward like one of Zeno’s arrows, the physics itself tells us that past, present, and future are constituted in entirely stranger ways. One version of the grandfather paradox involves, rather than grisly murder, the transfer of information from the future to the past; for example, in Tim Powers’s novel The Anubis Gates, a time traveler is stranded in the early 19th century. The character realizes that “I could invent things—the light bulb, the internal combustion engine… flush toilets.” But he abandons this hubris, for “any such tampering might cancel the trip I got here by, or even the circumstances under which my mother and father met.” Many readers will perhaps be aware of temporal paradoxes from the Robert Zemeckis Back to the Future film trilogy (which, for what it lacks in patricide, makes up for in Oedipal sentiments), notably a scene in which Marty McFly inadvertently introduces Chuck Berry to his own song “Johnny B. Goode.” Ignoring the troubling implications that a suburban white teenager had to somehow teach the Black inventor of rock ‘n’ roll his own music, Back to the Future presents a classic temporal paradox—if McFly first heard “Johnny B. Goode” on Berry’s records, and Berry first heard the song from McFly, then where did the song come from in the first place? (Perhaps from God.)

St. Augustine asks in the Confessions, “What is time, then? If nobody asks me, I know; but if I were desirous to explain it to one who should ask me, I plainly do not know.” Paradox sprouts from the fertile soil of our own incomprehension, and to its benefit there is virtually nothing that humans fully understand, at least not really. Time is the oddest thing of all, if we honestly confront the enormity of it. I’m continually surprised that I can’t easily walk into 1992 as if it were a room in my house. No surprise then that time and space are so often explored in the literature of paradox. Oxymoron and irony are the milquetoast cousins of paradox, but poetry at its most polished, pristine, and adamantine elevates contradiction into an almost religious principle. Among the 17th-century poets who worked in the tradition of John Donne, paradox was often a central aspect of what critics have called a “metaphysical conceit.” These brilliant, crystalline, rhetorical turns are often like Zeno’s paradoxes rendered into verse, expanding and compressing time and space with a dialectical glee. An example of this comes from the good Dr. Donne, master of both enigma and the erotic, who in his poem “The Good-Morrow” imagined two lovers who have made “one little room an everywhere.” The narrator and the beloved’s bed-chamber—perhaps there is heavy wooden paneling on the wall and a canopy bed near a fireplace burning green wood, a full moon shining through the mottled crown glass window—is as if a singularity where north, south, east, and west, past, present, and future, are all collapsed into a point. Even more obvious is Donne in “The Paradox,” wherein he writes that “Once I loved and died; and am now become/Mine epitaph and tomb;/Here dead men speak their last, and so do I,” the talking corpse its own absurdity made flesh.

So taken were the 20th-century scholars known as the New Critics with the ingenuity of metaphysical conceits that Cleanth Brooks would argue in his classic The Well Wrought Urn: Studies in the Structure of Poetry that the “language of poetry is the language of paradox.” Donne and Andrew Marvell, George Herbert, and Henry Vaughan used paradox as a theme and a subject—but to write poetry itself is paradoxical. To write fiction is paradoxical. Even to write nonfiction is paradoxical. To write at all is paradoxical. A similar sentiment concerning the representational arts is conveyed in the Belgian surrealist painter René Magritte’s much parodied 1929 work “The Treachery of Images.” Magritte presents an almost absurdly recognizable smoking pipe, polished to a totemistic brown sheen with a shiny black mouthpiece, so basically obvious that it might as well be from an advertisement, and beneath it he writes in cursive script “Ceci n’est pas une pipe”—“This is not a pipe.” A seemingly blatant contradiction, for what could the words possibly relate to other than the picture directly above them? But as Magritte told an interviewer, “if I had written on my picture ‘This is a pipe,’ I’d have been lying!” For you see, Magritte’s image is not a pipe, it is an image of a pipe. Like Zeno’s paradoxes, what may initially seem to be simple-minded contrarianism, a type of existential trolling if you will, belies a more subtle observation. The philosopher Michel Foucault writes in his slender volume This Is Not a Pipe that “Contradiction could exist only between two statements,” but that in the painting “there is clearly but one, and it cannot be contradictory because the subject of the propositions is a simple demonstrative.” According to Foucault, the picture, though self-referential, is not a paradox in the logical sense of the word. And yet there is an obvious contradiction between the viewer’s experience of the painting, and the reality that they’ve not looked upon some carefully carved and polished pipe, but rather only brown and black oil carefully applied to stretched canvas.

This, then, is the “treachery” of which Magritte speaks, the paradox that is gestated within that gulf where meaning resides, a valley strung between the-thing-in-itself and the way in which we represent the-thing-in-itself. Writing is in some ways even more treacherous than painting, for at least Magritte’s picture looks like a pipe—perhaps other than some calligraphic art, literature appears as nothing so much as abstract squiggles. Moby-Dick is not a whale and Jay Gatsby is not a man. They are less than a picture of a pipe, for we have not even images of them, only ink-stained books, and the abject abstraction of mere letters. And yet the paradox is that from that nothingness is generated the most sumptuous something; just as the illusion of painting can trick one into the experience of the concrete, so does the more bizarre phenomenon of the literary imagination make you hallucinate characters that are generated from the non-figurative alphabet. From this essay, if I’ve done even a somewhat adequate job, you’ve hopefully been able to envision Gödel and Einstein bundled into a car on the Jersey turnpike, windows frosted with nervous breath and laughter, the sun rising over the wooded Pine Barrens—or to imagine John and Anne Donne bundled together under an exquisite blanket of red and yellow and blue and green, the heavy oak door of their chamber closed tight against the English frost—but of course you’ve seen no such thing. You’ve only skimmed through your phone while sitting on the toilet, or toggled back and forth between open tabs on your laptop. Literature is paradoxical because it necessitates the invention of entire realities out of the basest nothing; the treachery of representation is that “This is not a pipe” is a principle that applies to absolutely all of the written word, and yet when we read a novel or a poem we can smell the burning tobacco.

All of literature is a great enigma, a riddle, a paradox. What the Zen masters of Japanese Buddhism call a koan. Religion is too often maligned for being haunted by the hobgoblin straw-man of consistency, and yet the only real faith is one mired in contradiction, and few practices embrace paradox quite like Zen. Central to Zen is the breaking down of the dualities that separate all of us from absolute being, the distinction between the I and the not-I. As a means to do this, Zen masters deploy the enigmatic stories, puzzles, sayings, and paradoxes of the koan, with the goal of forcing the initiate toward the para-logical, a catalyst for the instantaneous enlightenment known as satori. Koans are sometimes reduced to the “What is the sound of one hand clapping?” variety of puzzle (though that is indeed a venerable koan), but the scholar D.T. Suzuki explains in An Introduction to Zen Buddhism that these apparently “paradoxical statements are not artificialities contrived to hide themselves behind a screen of obscurity; but simply because the human tongue is not an adequate organ for expressing the deepest truth of Zen, the latter cannot be made the subject of logical exposition; they are to be experienced in the inmost soul when they become for the first time intelligible.” A classic koan, attributed to the ninth-century Chinese monk Linji Yixuan, famously says “If you meet the Buddha, kill him.” Linji’s point is similar to Magritte’s—“This is not the Buddha.” It’s a warning about falling into the trap of representation, of failing to resist the treachery of images, and yet the paradox is that the only way we have of communicating is through the fallible, inexact, medium of words. Zen is the only religion whose purpose is to overcome religion, and everything else for that matter. It asks us to use its paradoxes as a ladder by which we can climb toward ultimate being—and then we’re to kick that ladder over. In its own strange way, literature is the ultimate koan, all of these novels and plays, poems and essays, all words, words, words meaning nothing and signifying everything, gesturing towards a Truth beyond truth, and yet nothing but artfully arranged lies (and even less than that, simply arrayed squiggles on a screen). To read is to court its own type of enlightenment, of transcendence, and not just because of the questions literature raises, but because of literature’s very existence in the first place.

Humans are themselves the greatest of paradoxes: someone who is kind can harbor flashes of rage, the cruelest of people are capable of genuine empathy, our greatest pains often lead to salvation and we’re sometimes condemned by that which we love. In a famous 1817 letter to his brothers, the English Romantic poet John Keats extolled the most sublime of literature’s abilities, which was to dwell in “uncertainties, mysteries, doubts, without any irritable reaching after fact and reason,” a quality that he called “negative capability.” There is an irony in our present’s abandonment of nuance, for ours is a paradoxical epoch through and through—an era of unparalleled technological superiority and appalling barbarity, of instantaneous knowledge and virtually no wisdom. A Manichean age as well—which valorizes consistency above all other virtues, though it is that most suburban of values—yet Keats understood that if we’re to give any credit to literature, and for that matter any credit to people, we must be comfortable with complexity and contradiction. Negative capability is what separates the moral from the merely didactic. In all of our baroque complexity, paradox is the operative mode of literature, the only rhetorical gambit commensurate with displaying the full spectrum of what it means to be a human. We are all such glorious enigmas—creatures of finite dimension and infinite worth. None of us deserve grace, and yet all of us are worthy of it, a moral paradox that makes us beautiful not in spite of its cankered reality, but because of it. The greatest of paradoxes is that within that contradictory form, there is the possibility of genuine freedom—of liberation.

Image Credit: Wikipedia

Circles of the Damned

-

Maybe during this broiling summer you’ve seen the footage—in one striking video, women and men stand dazed on a boat sailing away from the Greek island of Evia, watching as ochre flames consume their homes in the otherwise dark night. Similar hellish scenes are unfolding in Algeria, Tunisia, and Libya, as well as in Turkey and Spain. Currently Siberia, an unlikely place for such a conflagration, is experiencing the largest wildfire in recorded history, joined by large portions of Canada. As California burns, the global nature of our immolation is underscored by horrific news around the world, a demonstration of the United Nations’ Intergovernmental Panel on Climate Change’s conclusion that such disasters are “unequivocally” anthropogenic, with the authors signaling a “code red” for the continuation of civilization. On Facebook, the mayor of a town in Calabria mourned that “We are losing our history, our identity is turning to ashes, our soul is burning,” and though he was writing specifically about the fires raging in southern Italy, it’s a diagnosis for a dying world as well.

Seven centuries ago, another Italian wrote in The Divine Comedy, “Abandon all hope ye who enter here,” which seems just as applicable in 2021 as it did in 1321. That exiled Florentine had similar visions of conflagration, describing how “Above that plain of sand, distended flakes/of fire showered down; their fall was slow –/as snow descends on alps when no wind blows… when fires fell, /intact and to the ground.” This September sees the 700th anniversary of both the completion of The Divine Comedy and the death of its author Dante Alighieri. But despite the chasms of history that separate us, his writing about how “the never-ending heat descended” holds a striking resonance. During our supposedly secular age, the singe of the inferno feels hotter when we’ve pushed our planet to the verge of apocalyptic collapse. Dante, you must understand, is ever applicable in our years of plague and despair, tyranny and treachery.

People are familiar with The Divine Comedy’s tropes even if they’re unfamiliar with Dante. Because all of it—the flames and sulphur, the mutilations and the shrieks, the circles of the damned and the punishments fitted to the sin, the descent into subterranean perdition and the demonic cacophony—finds its origin with him. Neither Reformation nor revolution has dispelled the noxious fumes from the inferno. There must be a distinction between the triumphalist claim that Dante says something vital about the human condition, and the objective fact that in some ways Dante actually invented the human condition (or a version of it). When watching the videos of people escaping from Evia, it took me several minutes to understand what it was that I was looking at, and yet those nightmares have long existed in our culture, as Dante gave us a potent vocabulary to describe Hell centuries ago. For even to deny Hell is to deny something first completely imagined by a medieval Florentine.

Son of a prominent family, enmeshed in conflicts between the papacy and the Holy Roman Empire, a respected though ultimately exiled citizen, Dante resided far from the shores of modernity, though the broad contours of his poem, with its visceral conjurations of the afterlife, are worth repeating. “Midway upon the journey of our life/I found myself within a forest dark,” Dante famously begins. At the age of 35, Dante descends into the underworld, guided by the ancient Roman poet Virgil and sustained by thoughts of his platonic love for the lady Beatrice. That Inferno constitutes only the first third of The Divine Comedy—subsequent sections consider Purgatory and Heaven—and yet remains the most read speaks to something cursedly intrinsic in us. The poet descends like Orpheus, Ulysses, and Christ before him deep into the underworld, journeying through nine concentric circles, each more brutal than the previous. Perdition is a space of “sighs and lamentations and loud cries” filled with “Strange utterances, horrible pronouncements, /accents of anger, words of suffering, /and voice shrill and faint and beating hands” who are buffeted “forever through that turbid, timeless air, /like sand that eddies when a whirlwind swirls.” Cosmology is indistinguishable from ethics, so that each circle is dedicated to particular sins: the second circle is reserved for crimes of lust, the third for those of gluttony, the fourth for greed, the wrathful reside in the fifth circle, the sixth is the domain of the heretics, the seventh is for the violent, all those guilty of fraud live in the eighth, and at the very bottom that first rebel Satan is eternally punished alongside all traitors.

Though The Divine Comedy couldn’t help but reflect the concerns of Dante’s century, he formulated a poetics of damnation so tangible and disturbing that it’s still the measure of hellishness, wherein he “saw one/Rent from the chin to where one breaks wind. /Between his legs were hanging down his entrails;/His heart was visible, and the dismal sack/That makes excrement of what is eaten.” Lest it be assumed that this is simply sadism, Dante is cognizant of how gluttony, envy, lust, wrath, sloth, covetousness, and pride could just as easily reserve him a space. Which is part of his genius; Dante doesn’t just describe Hell, which in its intensity provides an unparalleled expression of pain, but he also manifests a poetry of justice, where he’s willing to implicate himself (even while placing several of his own enemies within the circles of the damned).

No doubt the tortures meted out—being boiled alive for all eternity, forever swept up in a whirlwind, or masticated within the freezing mouth of Satan—are monstrous. The poet doesn’t disagree—often he expresses empathy for the condemned. But the disquiet that we and our fellow moderns might feel is in part born out of a broad theological movement that occurred over the centuries in how people thought about sin. During the Reformation, both Catholics and Protestants began to shift the model of what sin is away from the Seven Deadly Sins, and towards the more straightforward Ten Commandments. For sure there was nothing new about the Decalogue, and the Seven Deadly Sins haven’t exactly gone anywhere, but what took hold—even subconsciously—was a sense that sins could be reduced to a list of literal injunctions. Don’t commit adultery, don’t steal, don’t murder. Because we often think of sin as simply a matter of broken rules, the psychological acuity of Dante can be obscured. But the Seven Deadly Sins are rather more complicated—we all have to eat, but when does it become gluttony? We all have to rest, but when is that sloth?

An interpretative brilliance of the Seven Deadly Sins is that they explain how an excess of otherwise necessary human impulses can pervert us. Every human must eat; most desire physical love; we all need the regeneration of rest—but when we slide into gluttony, lust, sloth, and so on, it can feel as if we’re sliding into the slime that Dante describes. More than a crime, sin is a mental state which causes pain—both within the person who is guilty and to those who suffer because of those actions. In Dante’s portrayal of the stomach-dropping, queasy, nauseous, never-ending uncertainty of the damned’s lives, the poet conveys a bit of their inner predicament. The Divine Comedy isn’t some punitive manual, a puritan’s little book of punishments. Rather than a catalogue of what tortures match which crimes, Dante’s book expresses what sin feels like. Historian Jeffrey Burton Russell writes in Lucifer: The Devil in the Middle Ages how in hell “we are weighted down by sin and stupidity… we sink downward and inward… narrowly confined and stuffy, our eyes gummed shut and our vision turned within ourselves, drawn down, heavy, closed off from reality, bound by ourselves to ourselves, shut in and shut off… angry, hating, and isolated.”

If such pain were only experienced by the guilty, that would be one thing, but sin affects all within the human community who suffer as a result of pride, greed, wrath, and so on. There is a reason why the Seven Deadly Sins are what they are. In a world of finite resources, to valorize the self above all others is to take food from the mouths of the hungry, to hoard wealth that could be distributed to the needy, to claim vengeance as one’s own when it is properly the purview of society, to enjoy recreation upon the fruits of somebody else’s labor, to reduce another human being to a mere body, to covet more than you need, or to see yourself as primary and all others as expendable. Metaphor is the poem’s currency, and what’s more real than the intricacies of how organs are pulled from orifices is how sin—that disconnect between the divine and humanity—is experienced. You don’t need to believe in a literal hell—I don’t—to see what’s radical in Dante’s vision. What Inferno offers isn’t just grotesque descriptions, increasingly familiar though they may be on our warming planet, but also a model of thinking about responsibility to each other in a connected world.

Such is the feeling of the anonymous anarchist collective The Invisible Committee, whose manifesto—a surprise hit when published in France—opined that “No bonds are innocent” in our capitalist era, for “We are already situated within the collapse of a civilization,” structuring its tract around the circles of Dante’s inferno. Since its composition, The Divine Comedy has run like a molten vein through culture both rarefied and popular, from being considered by T.S. Eliot, Samuel Beckett, Primo Levi, and Derek Walcott to being referenced in horror films, comic books, and rock music. Allusion is one thing, but what we can see with our eyes is another—as novelist Francine Prose writes in The Guardian, those images of people fleeing from Greek wildfires are “as if Dante filmed the Inferno on his iPhone.” For centuries artists have mined Inferno for raw materials, but now in the sweltering days of the Anthropocene we are enacting it. To note that our present appears as a place where Hell has been pulled up from the bowels of the earth is a superficial observation, for though Dante presciently gives us a sense of what perdition feels like, he crucially also provided a means to identify the wicked.

Denizens of the nine circles were condemned because they worshiped the self over everybody else; now the rugged individualism that is the heretical ethos of our age has made man-made apocalypse probable. ExxonMobil drills for petroleum in the heating Arctic and the apocalyptic QAnon cult proliferates across the empty chambers of Facebook and Twitter; civil wars are fought in the Congo over the tin, tungsten, and gold in the circuit boards of the Android you’re reading this essay with; children in Vietnam and Malaysia sew the clothing we buy at The Gap and Old Navy; and the simplest request for people to wear masks so as to protect the vulnerable goes unheeded in the name of “freedom,” as our American Midas Jeff Bezos barely flies to outer space while workers in Amazon warehouses are forced to piss in bottles rather than be granted breaks. Responsibility for our predicament is unequally distributed; those in the lowest circle are the ones who belched out carbon dioxide for profit knowing full well the effects, those who promoted a culture of expendable consumerism and valorized the rich at the expense of the poor. Late capitalism’s operative commandment is to pretend that all seven of the deadly sins are virtues. Literary scholar R.W.B. Lewis describes the “Dantean principle that individuals cannot lead a truly good life unless they belong to a good society,” which means that all of us are in a lot of trouble. Right now, the future looks a lot less like paradise and more like inferno. Dante writes that “He listens well who takes notes.” Time to pay attention.

Bonus Link: Is There a Poet Laureate of the Anthropocene?

Image Credit: Wikipedia

Who’s There?: Every Story Is a Ghost Story

-

“One need not be a chamber to be haunted.” —Emily Dickinson

Drown Memorial Hall was only a decade old when it was converted into a field hospital for students stricken with the flu in the autumn of 1918. A stolid, grey building of three stories and a basement, Drown Hall sits halfway up South Mountain where it looks over the Lehigh Valley to the federal portico of the white-washed Moravian Central Church across the river, and the hulking, rusting ruins of Bethlehem Steel a few blocks away. Composed of stone the choppy texture of the north Atlantic in the hour before a squall, with yellow windows buffeted by mountain hawks and grey Pennsylvania skies. Built in honor of Lehigh University’s fourth president, a mustachioed Victorian chemistry professor, Drown was intended as a facility for leisure, exercise, and socialization, housing (among other luxuries) bowling alleys and chess rooms. Catherine Drinker Bowen enthused in her 1924 History of Lehigh University that Drown exuded “dignity and, at the same time, a certain at-home-ness to every function held there,” that the building “carries with it a flavor and spice which makes the hotel or country club hospitality seem thin, flat and unprofitable.” If Drown was a monument to youthful exuberance, innocent pluck, and boyish charm, then by the height of the pandemic it had become a cenotaph to cytokine storms. Only a few months after basketballs and Chuck Taylors skidded across its gymnasium floor, those same young men lay on cots hoping not to succumb to the illness. Twelve men would die of the influenza in Drown.

After its stint as a hospital, Drown would return to being a student center, then the Business Department, and by the turn of our century the English Department. It was in that final capacity that I got to know Drown a decade ago, when I was working on my PhD. Toward the end of my first year, I had to go to my office in the after-hours dusk, when lurid orange light breaks through the cragged and twisted branches of still leafless trees in the cold spring, looking like nothing so much as jaundiced fingers twisting the black bars of a broken cage, or like the spindly embers of a church’s burnt roof, fires still crackling through the collapsing wood. I had to print a seminar paper for a class on 19th-century literature, and then quickly adjourn to my preferred bar. When I keyed into the locked building, it was empty, silent save for the eerie neon hum of the never-used vending machines and the unnatural pooling luminescence of perennially flickering fluorescent lights in the stairwells at either end of the central hall. While in a basement computer lab, I suddenly heard a burst of noise upstairs travel rapidly from one end of the hall to the other—the unmistakable sound of young men dribbling a basketball. Telling myself that it must be the young children of one of the department’s professors, I shakily ascended. As soon as I got to the top the noise ceased. The lights were out. The building was still empty. Never has an obese man rolled down that hill quite as quickly as I did in the spring of 2011.

There are several rational explanations—students studying in one of the classrooms even after security would have otherwise locked up. Or perhaps the sound did come from some faculty kids (though to my knowledge nobody was raising adolescents at that time). Maybe there was something settling strangely, concrete shifting oddly or water rushing quickly through a pipe (as if I didn’t know the difference between a basketball game and a toilet flushing). Yet when all the explanations are depleted, I know what I heard and what it sounded like, and I still have no idea what it was. Nor is this the only ghost story that I could recount—there was the autumn of 2003 when, walking back at 2 a.m. after the close of the library at Washington and Jefferson College, feet unsteady on slicked brown leaves blanketing the frosted sidewalk, I noted an unnatural purple light emanating from a half-basement window of Macmillan Hall, built in 1793 (having been the encampment of Alexander Hamilton during the Whiskey Rebellion) and the oldest university building west of the Alleghenies. A few seconds after observing the shining, I heard a high-pitched, unnatural, inhuman banshee scream—some kind of poltergeist cry—and being substantially thinner in that year I was able to book it quickly back to my dorm. Or in 2007, while I was living in Scotland and touring the cavernous subterranean vaults underneath the South Bridge in Edinburgh’s Old Town, when I saw a young chav, who decided to make obscene hand gestures within a cairn that the tour guide assured us had “evil trapped within it,” later break down as if he was being assaulted by some unseen specter. Then there was the antebellum farm house in the Shenandoah Valley that an ex-girlfriend lived in, one room being so perennially cold and eerie that nobody who visited ever wanted to spend more than a few minutes in it. A haunted space in a haunted land where something more elemental than intellect screams at you that something cruel happened there.

Paranormal tales are popular, even among those who'd never deign to believe in something like a poltergeist, because they speak to the ineffable that we feel in those arm-hair-raised, scalp-shrinking, goose-bumped moments when we can't quite fully explain what we felt, or heard, or saw. I might not actually believe in ghosts, but when I hear the dribbling of a basketball down an empty and dark hall, I'm not going to stick around to find out what it is. No solitary person is ever fully a skeptic when they're alone in a haunted house. Count me on the side of science journalist Mary Roach, who in Spook: Science Tackles the Afterlife writes that "I guess I believe that not everything we humans encounter in our lives can be neatly and convincingly tucked away inside the orderly cabinetry of science. Certainly, most things can… but not all. I believe in the possibility of something more." Haunting is by definition ambiguous—if we could say with any certainty that the supernatural was real, it would, I suppose, simply be the natural.

Franco-Bulgarian philosopher Tzvetan Todorov formulated a critical model of the supernatural in his study The Fantastic: A Structural Approach to a Literary Genre, in which he argued that stories about unseen realms could be divided between the "uncanny" and the "marvelous." The former are narrative elements that can ultimately be explained rationally, i.e., supernatural plot points that prove to be dreams, hallucinations, drug trips, hoaxes, illusions, or anything unmasked by the Scooby-Doo gang. The latter are things that are actually occult, supernatural, divine. When it's unclear whether a given incident in a story is uncanny or marvelous, it occupies that in-between space of the fantastic, which is where any honest account of a ghostly experience has to be categorized. "The fantastic is that hesitation experienced by a person who knows only the laws of nature, confronting an apparently supernatural event," writes Todorov, and that is a succinct description of my Drown anomaly. Were it simply uncanny, then I suppose my spectral fears would have been assuaged if, upon my ascent, I had found a group of living young men playing impromptu pick-up basketball. For the experience to be marvelous, I'd have to know beyond any doubt that what I heard were actual spirits. As it is, it's the uncertainty of the whole event—the strange, spooky, surreal ambiguity—that makes the incident fantastic. "What I'm after is proof," writes Roach. "Or evidence, anyway—evidence that some form of disembodied consciousness persists when the body closes up shop. Or doesn't persist." I've got no proof or disproof either, only the distant memory of sweaty palms and a racing heart.

Ghosts may haunt chambers, but they also haunt books; they might float through the halls of Drown, but they even more fully possess the books that line that building's shelves. Traditional ghosts animate literature, from the canon to the penny dreadful, including what the Victorian critic Matthew Arnold grandiosely termed the "best which has been thought and said" as well as lurid paperbacks with their garish covers. We're so obsessed with something seen just beyond the field of vision, something vibrating at a frequency that human ears can't quite detect—from medieval Danish courts to the Overlook Hotel, Hill House to the bedroom of Ebenezer Scrooge—that we're perhaps liable to wonder if there is something to ghostly existence. After all, places are haunted, lives are haunted, stories are haunted. Such is the nature of ghosts; we may overlook their presence, their flitting and meandering through the pages of our canonical literature, but they're there all the same (for a place can be haunted whether you notice that it is or not).

How often do you forget that the greatest work in the language is basically a ghost story? William Shakespeare's Hamlet is fundamentally a good old-fashioned yarn about a haunted house (in addition to being a revenge tragedy and a pirate tale). The famed soliloquy of the Danish prince dominates our cultural imagination, but the most cutting bit of poetry is the eerie line that begins the play: "Who's there?" Like any good supernatural tale, Hamlet begins in confusion and disorientation, as the sentry Bernardo, arriving to relieve Francisco on Elsinore's ramparts, utters that shaky two-word interrogative; soon afterward, he and Marcellus espy the silent ghost of the murdered king. Can you imagine being in the audience, sometime around 1600 when it was a widespread belief that there are more things in heaven and earth than can be dreamt of in our philosophies, and hearing the quivering question asked in the darkness, faces illuminated by tallow candle, the sense that there is something just beyond our experience that has come from beyond? The status of Hamlet's ghost is ambiguous; some critics have interpreted the specter as a product of the prince's madness, while others claim that the spirit is clearly real. Such uncertainty speaks to what's fantastic about the ghost, as ambiguity haunts the play. Notice that Bernardo doesn't ask "What's there?" His question is phrased toward a personality with agency, even as the immaterial spirit of Hamlet's dead father exists in some shadow-realm between life and death.

A ghost's status was no trifling issue—it got to the core of salvation and damnation. Protestants held that souls could not wander the earth; the dead were either rewarded in heaven or punished in hell, so any apparent ghost must necessarily be a demon. Yet Shakespeare's play seems to make clear that Hamlet's father has indeed returned, perhaps as an inhabitant of that way station known as purgatory, that antechamber to eternity whereby the ghost can ascend from the Bardo to skulk around Elsinore for the space of a prologue. Of course, when Shakespeare wrote Hamlet, ostensibly good Protestant that he was, he should have held no faith in purgatory, that abode of ghosts being in large part what caused Luther to nail his theses to the door of Wittenberg's Castle Church. When the final Thirty-Nine Articles of the Church of England were ratified in 1571 (three decades before the play's premiere), it was Article 22 that declared belief in purgatory a "thing vainly invented, and grounded upon no warranty of scripture; but rather repugnant to the word of God." According to Stephen Greenblatt's argument in Hamlet in Purgatory, the ghost isn't merely Hamlet's father, but also a haunting from the not-so-distant Catholic past, which the official settlement had supposedly stripped away along with the rood screens and bejeweled altars. Elsinore's haunting is not just that of King Hamlet's ghost, but also of those past remnants that the reformers were unable to completely bury. Greenblatt writes that for Shakespeare purgatory "was a piece of poetry" drawn from a "cultural artery" whereby the author had released a "startling rush of vital energy." There is a different set of ambiguities at play in Hamlet, not least of which is how this "spirit of health or goblin damned" is to be situated between orthodoxy and heresy. In asking "who" the ghost is, Bernardo is simultaneously asking what it is, where it comes from, and how such a thing can exist. So simple, so understated, so arresting is the first line of Hamlet that I'm apt to say that Bernardo's question is the great concern of all supernatural literature, if not all literature. Within Hamlet there is a tension between survival and extinction, for though the prince calls death the "undiscovered country from whose bourn no traveler returns," he himself must know that's not quite right. After all, his own father came back from the dead (a role that Shakespeare is said to have played himself).

Shakespeare's ghoul is less ambiguous than Charles Dickens's, for the ghost of Jacob Marley who visits Ebenezer Scrooge in A Christmas Carol is accused of being simply an "undigested bit of beef, a blot of mustard, a crumb of cheese, a fragment of underdone potato. There's more of gravy than of grave about you, whatever you are!" Note that the querying final clause ends in an exclamation point rather than a question mark, and there's no longer any asking who somebody is, only what they are. Because A Christmas Carol has been filtered through George C. Scott, Patrick Stewart, Bill Murray, and the Muppets, there is a tendency to forget just how terrifying Dickens's novella actually is. The ultimately repentant Scrooge and his visitations from a trinity of moralistic specters offer up visions of justice that are gothic in their capacity to unsettle: the neuter sprite that is the Ghost of Christmas Past, with its holly and summer flowers; the hail-fellow-well-met Bacchus that is the Ghost of Christmas Present; and the grim memento mori visage of the reaper that is the Ghost of Christmas Yet to Come (not to mention Marley, padlocked and in fetters). This is Dickens in the mold of Dante, for whom haunting is its own form of retribution, a means of purging us of our iniquities and allowing for redemption. Andrew Smith writes in The Ghost Story 1840-1920: A Cultural History that "Dickens's major contribution to the development of the ghost story lies in how he employs allegory in order to encode wider issues relating to history, money, and identity." Morality itself—the awesome responsibility impinging on us every second of existence—can be a haunting. The literal haunting of Scrooge is more uncertain, for perhaps his specters are gestated from madness, hallucination, nightmare, or, as he initially said, indigestion. Dickens's ghosts are ambiguous—as they always must be—but the didactic sentiment of A Christmas Carol can't be.

Nothing is more haunting than history, especially a wicked one, and few tales are as cruel as that of the United States. Gild the national narrative all we want; American triumphalism is a psychological coping mechanism. This country, born out of colonialism, genocide, and slavery, is a massive haunted burial ground, and we all know that graveyards are where ghosts dwell. As Leslie Fiedler explained in Love and Death in the American Novel, the nation itself is a "gothic fiction, nonrealistic and negative, sadist and melodramatic—a literature of darkness and the grotesque." America is haunted by the weight of its injustice; on this continent are the traces of the Pequot and Abenaki, Mohawk and Mohegan, Apache and Navajo whom the settlers murdered; in this country are the filaments of women and men held in bondage for three centuries, and from every tree hang those murdered by our American monstrosity. That so many Americans willfully turn away from this—the Faustian bargain that demands acquiescence—speaks not to the absence of haunting; to the contrary, it speaks of how we live among the possessed still, a nation of demoniacs. William Faulkner's observation in Requiem for a Nun that "The past is never dead. It's not even past" isn't any less accurate for being so omnipresent. No author conveyed the sheer depth and extent of American haunting quite like Toni Morrison, who for all that she accomplished must also be categorized among the greatest authors of ghost stories. To ascribe that genre to a novel like Morrison's Beloved is to acknowledge that the most accurate depictions of our national trauma have to be horror stories if they're to tell the truth. "Anything dead coming back to life hurts"—there's both wisdom and warning in Morrison's adage.

Beloved's plot is as chilling as an autumnal wind off the Ohio River, the story of the formerly enslaved woman Sethe, whose Cincinnati home is haunted by the ghost of her murdered child, sacrificed in the years before the Civil War to prevent her being returned to bondage in Kentucky. Canonical literature and mythology have explored the cruel incomprehensibility of infanticide—think of Euripides's Medea—but Sethe's not-irrational desire to send her "babies where they would be safe" is why Beloved's tragedy is so difficult to contemplate. When a mysterious young woman named Beloved arrives, Sethe becomes convinced that the girl is the spirit of her murdered child. Ann Hostetler, in her essay from the collection Toni Morrison: Memory and Meaning, writes that Beloved's ghost "was disconcerting to many readers who expected some form of social or historical realism as they encountered the book for the first time." She argues, however, that the "representation of history as the return of the repressed…is also a modernist strategy," whereby "loss, betrayal, and trauma is something that must be exorcized from the psyche before healing can take place." Just because we might believe ourselves to be done with ghosts doesn't mean that ghosts are done with us. Phantoms so often function as the allegorical because whether or not specters are real, haunting very much is. We're haunted by the past, we're haunted by trauma, we're haunted by history. "This is not a story to pass on," Morrison writes, but she understands better than anyone that we can't help but pass it on.

Historical trauma is more occluded in Stephen King's The Shining, though that hasn't stopped exegetes from interpreting the novel about homicide in an off-season Colorado resort as being concerned with the dispossession of Native Americans, particularly in the version of the story as rendered by director Stanley Kubrick. The Overlook Hotel is built upon a Native American burial ground, and Navajo and Apache wall hangings are scattered throughout the resort. Such conjectures about The Shining are explored with delighted aplomb in director Rodney Ascher's documentary Room 237 (named after the most haunted place in the Overlook), but as a literary critical question, there's much that's problematic in asking what any given novel, or poem, or movie is actually about; an analysis is on much firmer ground when we're concerned with how a text works (a notably different issue). So, without discounting the hypothesis that The Shining is concerned with the genocide of indigenous Americans, the narrative itself tells a more straightforward story of haunting, as a bevy of spirits drive blocked, recovering alcoholic writer and aspiring family destroyer Jack Torrance insane. Kubrick's adaptation is iconic—Jack Nicholson as Torrance (with whom he shares a first name) breaking through a door with an axe; his son, young Danny Torrance, escaping through a nightmarish, frozen hedge maze (which replaces the novel's menacing topiary animals); the crone in room 237 coming out of the bathtub; the blood pouring from the elevators; the ghostly roaring '20s speakeasy with its chilling bartender; and whatever the man in the bear costume was. Also, the twins.

Still, it could be observed that the only substantiated supernatural phenomenon is the titular "shining" that afflicts both Danny and the Overlook's gentle cook, Dick Hallorann. "A lot of folks, they got a little bit of shine to them," Dick explains. "They don't even know it. But they always seem to show up with flowers when their wives are feelin blue with the monthlies, they do good on school tests they don't even study for, they got a good idea how people are feelin as soon as they walk into a room." Understood as hyper-empathy, the shining makes it possible that Danny is merely privy to the psychotic breaks his father is having rather than to visions that are actually real. While Jack descends further and further into madness, the status of the spectral beings' existence is ambiguous (a point not everyone agrees on, however). It's been noted that in the film, the appearance of a ghost is always accompanied by that of a mirror, so that The Shining's hauntings are really manifestations of Jack's fractured psyche. Narrowly violating my own warning concerning the question of "about," I'll note how much of The Shining is concerned with Jack's alcoholism, the ghostly bartender a psychic avatar of all that the writer has refused to face. Not just one of the greatest ghost stories of the 20th century, The Shining is also one of the great novels of addiction, an exploration of how we can be possessed by our own deficiencies. Mirrors can be just as haunted as houses. Notably, when King wrote The Shining, he was in the midst of his own full-blown alcoholism, so strung out that he barely remembers writing some of his doorstopper novels (he's now been sober for more than 30 years). As he notes in The Shining, "We sometimes need to create unreal monsters and bogies to stand in for all the things we fear in our real lives."

If Hamlet, A Christmas Carol, Beloved, and The Shining report on apparitions, then there are novels, plays, and poems that are imagined by their creators as themselves being chambers haunted by something in between life and death. Such a conceit offers an even more clear-eyed assessment of what's so unsettling about literature—this medium, this force, this power—capable of affecting what we think, and see, and say as if we ourselves were possessed. As a trope, haunted books literalize a profound truth about the written word, and uneasily push us towards acknowledging the innate spookiness of language, where the simplest of declarations is synonymous with incantation. Robert W. Chambers's collection of short stories The King in Yellow conjures one of the most terrifying examples of a haunted text, wherein an imaginary play that shares the title of the book is capable of driving its audience to pure madness. "Strange is the night where the black stars rise, / And strange moons circle through the skies," reads a verse from the play; innocuous, if eerie, though it's in the subtlety that the demons get you. Chambers would go on to influence H.P. Lovecraft, who conceived of his own haunted book in the form of the celebrated grimoire The Necronomicon, which he explains was "Composed by Abdul Alhazred, a mad poet of Sanaa in Yemen, who was said to have flourished during the period of the Ommiade caliphs, circa 700 A.D.," and who rendered into his secret book the dark knowledge of the elder gods responsible for his being "seized by an invisible monster in broad daylight and devoured horribly before a large number of fright-frozen witnesses." Despite the Necronomicon's fictionality, a multitude of occultists have claimed over the years that Lovecraft's haunted volume is based in reality (you can buy said books online).

Then there are the works that are themselves haunted. Shakespeare's Macbeth is the most notorious example, its themes of witchcraft long lending it an infernal air, with superstitious directors and actors calling it the "Scottish play" in lieu of its actual title, lest some of the spells within bring ruin to a production. Similar rumors dog Shakespeare's contemporary Christopher Marlowe and his play Doctor Faustus, with a tradition holding that the incantations offered upon the stage summoned actual demons. With less notoriety, the tradition of "book curses" was thought a foolproof way to guard against the theft of the written word—a practice that dates back as far as the Babylonians, but that reached its apogee during the Middle Ages, when scribes would affix to manuscript colophons warnings about what should befall an unscrupulous thief. "Whoever steals this Book of Prayer / May he be ripped apart by swine, / His heart splintered, this I swear, / And his body dragged along the Rhine," writes Simon Vostre in his 1502 Book of Hours. To curse a book is perhaps different from a stereotypical haunting, yet both of these phenomena, not-of-this-world as they are, assume a disembodied consciousness made manifest in otherwise inert matter; the curse is a way of extending yourself and your influence beyond your demise. It is a way of making yourself a ghost, and it worked, at least in keeping those books intact. "If you ripped out a page, you were going to die in agony. You didn't want to take the chance," Marc Drogin dryly notes in Anathema! Medieval Scribes and the History of Book Curses.

Not all writing is cursed, but surely all of it is haunted. Literature is a catacomb of past readers, past writers, past books. Traces of those who are responsible for creation linger among the words on a page; Shakespeare can't hear us, but we can still hear him (and don't ghosts wander through those estate houses upon the moors unaware that they've died?). Disenchantment has supposedly been our lot since Luther, or Newton, or Darwin chased the ghosts away, leaving behind this perfect mechanistic clockwork universe with no need for superfluous hauntings. Yet like Hamlet's father returned from a purgatory that we're not supposed to believe in, we're unwilling to acknowledge the specters right in front of us. Of all the forms of expression that humanity has worked with—painting, music, sculpture—literature is the eeriest. Poetry and fiction are both incantation and conjuration, the spinning of specters and the invoking of ghosts; to read is very literally to listen to somebody who isn't there, and who might not have been for a long while. All writing is occult, because it's the creation of something from ether, and magic is simply a way of acknowledging that—a linguistic practice, an attitude, a critical method more than a body of spells. We should be disquieted by literature; we should be unnerved. Most of all, we should be moved by the sheer incandescent amazement that such things as fiction, and poetry, and performance are real. Every single volume upon the shelves of Drown, every book in an office, every thumbed and underlined play sitting on a desk, is more haunted than that building. Reader, if you seek enchantments, turn to any printed page. If you look for a ghost, hear my voice in your own head.

Bonus Link: Binding the Ghost: On the Physicality of Literature

Image Credit: Flickr/Kevin Dooley