On Obscenity and Literature

“But implicit in the history of the First Amendment is the rejection of obscenity as utterly without redeeming social importance.” —Associate Justice William J. Brennan Jr., Roth v. United States (1957)

Interviewer: Speaking of blue, you’ve been accused of vulgarity. Mel Brooks: Bullshit! —Playboy (February, 1975)

On a spring evening in 1964 at the Café Au Go Go in Greenwich Village, several undercover officers from the NYPD’s vice squad arrested Lenny Bruce for public obscenity. Both Bruce and the club’s owner Howard Solomon were shouldered out through the crowded club to waiting squad cars, their red and blue lights reflected in the dirty puddles pooled on the pavement of Bleecker Street. For six months the two men would stand trial, with Bruce’s defense attorney calling on luminaries from James Baldwin to Allen Ginsberg, Norman Mailer to Bob Dylan, to attest to the stand-up’s right to say whatever he wanted in front of a paying audience. “He was a man with an unsettling sense of humor,” write Ronald K.L. Collins and David M. Skover in The Trials of Lenny Bruce: The Fall and Rise of an American Icon. “Uncompromising, uncanny, unforgettable, and unapologetic…His words crossed the law and those in it. He became intolerable to people too powerful to ignore. When it was over, not even the First Amendment saved him.” The three-judge tribunal sentenced Bruce to four months in a workhouse. Released on bail, he never served a day of his sentence, overdosing on morphine in his Hollywood Hills bungalow two years later. He wouldn’t receive a posthumous pardon until 2003.

“Perhaps at this point I ought to say a little something about my vocabulary,” Bruce wrote in his (still very funny) How to Talk Dirty and Influence People: An Autobiography. “My conversation, spoken and written, is usually flavored with the jargon of the hipster, the argot of the underworld, and Yiddish.” Alongside jazz, Jewish-American comedy is one of the few uniquely American contributions to world culture, and if that comparison can be drawn further, then Bruce was the equivalent of Dizzy Gillespie or Miles Davis—he was the one who broke it wide open. Moving comedy away from the realm of the Borscht Belt one-liner, Bruce exemplified the emerging paradigm of stand-up as a spoken-word riff of personal reflection and social commentary, one that was often incredibly obscene. The Catskills comedian Henny Youngman may have been inescapably Jewish, but Bruce was unabashedly so. And, as he makes clear, his diction proudly drew from the margins; he heard more truth in the dialect of the ethnic Other than in mainstream politeness, more honesty in the junky’s language than in the platitudes of the square, more righteous confrontation in the bohemian’s obscenity than in the pieties of the status quo. Among the comics of that golden age of stand-up, only Richard Pryor was his equal in bravery and genius, and despite the fact that some of his humor is dated today, books like How to Talk Dirty and Influence People still radiate an excitement that a mere burlesque performer could challenge the hypocrisy and puritanism of a state that would just as soon see James Joyce’s Ulysses and D.H. Lawrence’s Lady Chatterley’s Lover banned and their publishers hauled to jail as it would actually confront any of the social ills that infected the body politic.

What separates Bruce from any number of subsequent comics is that within his performances there was a fully articulated theory of language. “Take away the right to say the word ‘fuck’ and you take away the right to say ‘fuck the government,’” he is reported to have said, and this is clearly and crucially true. That’s one model of obscenity’s utility: its power to lower the high and to raise the low, with vulgarity afforded an almost apocalyptic power of resistance. A naivety runs through the comedian’s work, however: Bruce sometimes doesn’t afford language enough power. In one incendiary performance from the early ’60s, Bruce went through a litany of ethnic slurs for Black people, Jews, Italians, Hispanics, Poles, and the Irish, finally arguing that “it’s the suppression of the word that gives it the power, the violence, the viciousness.” He imagines a scenario whereby the president would introduce members of his cabinet by using those particular words, and concludes that following such a moment those slurs wouldn’t “mean anything anymore, then you could never make some six-year-old black kid cry because somebody called him” that word at school. Bruce’s idealism is almost touching—let it not be doubted that he genuinely believed language could work in this way—but it’s also empirically false. Having died a half-century ago, he can’t be faulted for his ignorance on this score, but now that we have a president who does basically what Bruce imagined his hypothetical Commander-in-Chief doing, I think we can emphatically state that the repetition of such ugliness does nothing to dispel its power.

Discussions about obscenity often devolve into this bad-faith dichotomy—the prudish schoolmarms with their red pens painting over anything blue and the brave defenders of free speech pushing the boundaries of acceptable discourse. The former hold that there is a certain power to words that must be tamed, while the latter champion the individual’s right to say what they want to say. When the issue is phrased in such a stark manner, it occludes a more discomforting reality—maybe words are never simply utterances, maybe words can be dangerous, maybe words can enact evil things, and maybe every person has an ultimate freedom to use those words as they see fit (notably a different claim than that people should be able to use them without repercussion). Bruce’s theory of language is respectably semiotic, a contention about the arbitrary relationship between signifier and signified, whereby that chain of connection can be severed by simple repetition, as when sense flees from a word said over and over again, whether it’s “potato” or “xylophone.” But he was ultimately wrong (as is all of structural and post-structural linguistics)—language is never exactly arbitrary, it’s not really semiotic. We need theurgy to explain how words work, because in an ineffable and numinous way, words are magic. When it comes to obscenity in particular, whether the sexual or the scatological, the racial or the blasphemous, we’re considering a very specific form of that magic, and while Bruce is correct that a prohibition on slurs would render resistance to oppression all the more difficult, he’s disingenuous in not also admitting that such language can provide a means of cruelty in its own right. If you couldn’t say obscenities, then a certain prominent tweeter of almost inconceivable power and authority couldn’t deploy them almost hourly against whatever target he sees fit. This is not an argument for censorship, mind you, but it is a plea to be honest in our accounting.

Obscenity as social resistance doesn’t have the same cachet it once did, nor is it always interpreted as unassailably progressive (as it was for Bruce and his supporters). In our current season of a supposed Jacobin “cancel culture,” words have been ironically re-enchanted with the spark of danger that was once associated with them. Whether or not those who claim that there is some sort of left McCarthyism policing language are correct, it’s relatively anodyne to acknowledge that right now words are endowed with a significance not seen since Bruce appeared in a Manhattan courtroom. Whatever your own stance on the role that offensiveness plays in civilized society, obscenity can only be theorized through multiple perspectives. Four-letter words inhabit a nexus of society, culture, faith, linguistics, and morality (and the law). A “fuck” is never just a “fuck,” and a shit by any other name wouldn’t smell as pungent. Grammatically, obscenities are often classified as “intensifiers,” that is, words that exist to emphasize the emotionality of a given declaration—think of them as oral exclamation marks. Writing in Holy Sh*t: A Brief History of Swearing, Melissa Mohr explains that vulgarity is frequently “important for the connotation it carries and not for its literal meaning.” Such a distinction came into play in 2003 after the Irish singer Bono of U2 was cited by the Federal Communications Commission when, upon winning a Golden Globe, he exclaimed “fucking brilliant.” The commission’s Enforcement Bureau initially decided that Bono’s f-bomb wasn’t indecent since its use clearly wasn’t in keeping with the sexual definition of the word, a verdict that was later rescinded higher up within the FCC.

“Historically,” Mohr writes, “swearwords have been thought to possess a deeper, more intimate connection to the things they represent than do other words,” and in that regard the pencil-necked nerds at the FCC ironically showed more respect for the dangerous power of fucking than did Bono. If vigor of emotion were all one wanted from language, any number of milquetoast words would work as well as a vulgarity, and yet obscenity (even if uttered due to a stubbed toe) is clearly doing something a bit more transcendent than more PG terms—for both good and bad. Swearing can’t help but have an incantatory aspect to it; we swear oaths, and we’re specifically forbidden by the Decalogue from taking the Lord’s name in vain. Magnus Ljung includes the religious in his typology of profanity, offered in Swearing: A Cross-Cultural Linguistic Study, as one of “five major themes that recur in the swearing of the majority of the languages discussed and which are in all likelihood also used in most other languages featuring swearing.” Alongside religious profanity, Ljung recognizes themes according to scatology, sex organs, sexual activities, and family insults. To this, inevitably, must also be added ethnic slurs. Profanity is by definition profane, dealing with the bloody, pussy, jizzy reality of what it means to be alive (and thus the lowering of the sacred into that oozy realm is part of what blasphemously shocks). Obscenity has a quality of the theological about it, even while religious profanities have declined in their ability to shock an increasingly secular society.

Today a word like “bloody” sounds archaic or Anglophilic, and almost wholly inoffensive, even while its (now forgotten) reference to Christ’s wounds would have been scandalous to an audience reared on the King James Bible. This was the problem that confronted the television writer David Milch, who created the classic HBO western Deadwood. The resultant drama (with dialogue largely composed in iambic pentameter) was noted as having the most per capita profanity of any show ever to air, but in 1870s Dakota most of those swears would have been religious in nature. Since having Al Swearengen (a perfect name if ever there was one) sound like Yosemite Sam would have dulled the shock of his speech, Milch elected to transform his characters’ language into scatological obscenities and ethnic slurs, the latter of which still have the ability to upset an audience in a way that “by Christ’s wounds!” simply doesn’t. When Swearengen offers up his own theory of language to A.W. Merrick, who edits Deadwood’s newspaper, arguing that “Just as you owning a print press proves only an interest in the truth, meaning up to a fucking point, slightly more than us others maybe, but short of a fucking anointing or the shouldering of a sacred burden—unless of course the print press was gift of an angel,” he provides a nice synthesis of the blasphemous and the sexual. The majority of the copious swears in Deadwood are of the scatological, sexual, or racial sort, and they hit the eardrum with far more force than denying the divinity of Christ does. When Milch updated the profanity of the 19th century, he knew what would disturb contemporary audiences, and it wasn’t tin-pot sacrilege.

All of which is to say that while obscenity has a social context, with what’s offensive being beholden to the mores of a particular century, the form itself universally involves the transgression of propriety, with the details merely altered to the conventions of a time and place. As an example, watch The Aristocrats, the 2005 documentary that magician Penn Jillette produced with the director Paul Provenza, which features dozens of tellings of the almost unspeakably taboo joke of the same name. The joke has long been an after-hours exercise in which comedians try to one-up each other in the degree of profanity offered, and the film presents several iconic performers giving their variations on it. When I saw the film after it came out, the audience was largely primed for the oftentimes extreme sexual and scatological permutations of the joke, but it was the tellings that involved racial slurs and ethnic stereotypes that stunned the other theatergoers. It’s the pushing of boundaries in and of itself, rather than the subject in question, that designates something as an obscenity. According to Sigmund Freud in his (weirdly funny) The Joke and Its Relation to the Unconscious, vulgar humor serves a potent psychological purpose, allowing people “to enjoy undisguised obscenity” that is normally repressed so as to keep “whole complexes of impulses, together with their derivatives, away from consciousness.” Obscenity thus acts as a civilizational pressure valve for humanity’s chthonic impulses.

That words which are considered obscene are often found in the vocabulary of the marginalized isn’t incidental, and it recommends spicy language as a site of resistance. English swearing draws directly from one such point of contact between our “higher” and our “lower” language. The majority of English swears have a Germanic origin, as opposed to a more genteel Romance origin (whether from French or Latin). In keeping with their Teutonic genesis, they tend to have an abrasive, guttural, jagged quality to their sounds, the better to convey an onomatopoeic quality. Take a look at the list which comprises comedian George Carlin’s 1972 bit “Seven Words You Can Never Say on Television.” Four of them definitely have an Old English etymology, traceable back to the West Germanic dialect of the Angles, Saxons, Frisians, and Jutes who occupied Britain in the later centuries of the first millennium. Three of them – the one that rudely refers to female genitalia, the one that tells you to rudely do something sexual, and the one that tells you to do that thing to your mother – may have Latin or Norman origins, though linguists think they’re just as likely to come from what medievalists used to call “Anglo-Saxon.” Most of these words had no obscene connotations in their original context; in Old English the word for urine is simply “piss,” and the word for feces is “shit.” Nothing dirty about either word until the eleventh-century Norman invasion of Britain privileged the French over the English. That stratification, however, gives a certain gutter enchantment to those old prosaic terms, endowing them with the force of a swear. Geoffrey Hughes writes in Swearing: A Social History of Foul Language, Oaths, and Profanities in English that the “Anglo-Saxon element… provides much more emotional force than does the Norman French or the Latin. Copulating pandemonium! conveys none of the emotional charge of the native equivalent fucking hell!” Invasion, oppression, and brutality mark those words which we consider to be profane, but they also give them their filthy enchantments.

What’s clear is that the class connotations of what Bruce called an “argot” can’t be ignored. Swearing is the purview of criminals and travelers, pirates and rebels, highwaymen and drunks. For those lexicographers who assembled lists of English words in the early modern era, swearing, or “canting,” provided an invaluable window into the counter-cultural consciousness. The Irish playwright Richard Head compiled The Canting Academy, or Devil’s Cabinet Opened in 1673, arguably the first full-length English “dictionary,” appearing decades before Dr. Johnson’s staider 1755 A Dictionary of the English Language. Decades before Head’s book, short pamphlets by respectable playwrights from Thomas Dekker to Thomas Middleton had similarly illuminated readers on the criminal element’s language—other examples, included as appendices within books, such as Thomas Harman’s A Caveat or Warning for Common Cursitors, go back well into the 16th century. Such “canting guides,” exploring the seamy underbelly of the cockney capital, were prurient pamphlets that illustrated the salty diction of thieves and rogues for the entertainment of the respectable classes. One of the most popular examples was the anonymously edited A New Dictionary of the Terms Ancient and Modern of the Canting Crew, first printed in 1698. Within, readers could learn the definitions of insults from “blobber-lipped” to “jobber-not.” Such dictionaries (which included words like “swindler” and “phony,” which still survive today) drew from the English underclass, with a motley vocabulary made up of words from rough-hewn English, Romani, and ultimately Yiddish, among other origins.

A direct line runs from the vibrant, colorful, and earthy diction of canting to cockney rhyming slang, or to the endangered dialect of Polari used for decades by gay men in Great Britain, who lived under the constant threat of state punishment. All of these tongues are “obscene,” but that’s a function of their oppositional status to received language. Nothing is “dirty” about them; they are, rather, rebellions against “proper” speech, “dignified” language, “correct” talking, and they challenge that codified violence implied by the mere existence of the King’s English. Their differing purposes, and respective class connotations and authenticity, are illustrated by a joke wherein a hobo asks a nattily dressed businessman for some change. “’Neither a borrower nor a lender be’—that’s William Shakespeare,” says the businessman. “’Fuck you’—that’s David Mamet,” responds the panhandler. A bit of a disservice to the Bard, however, who along with Dekker and Middleton could cant with the best of them. For example, within the folio one will find “bawling, blasphemous, incharitable dog,” “paper fac’d villain,” and “embossed carbuncle,” among other similarly colorful phrases.

An entire history could be written about the earliest instances of notable obscenities, which of course necessitates trawling the Oxford English Dictionary for dirty words that appear particularly early. For “shit,” there is a 1585 instance of the word in a Scottish “flyting,” an extemporaneous poetic rhyme-battle held in Middle Scots, which took place between Patrick Hume and Alexander Montgomerie. The greatest example of the form is the 15th-century Flyting of Dunbar and Kennedy, containing the first printed instance of the word “fuck.” In the OED, our good friend the dirty lexicographer Richard Head supplies the earliest example given in the entry for “fuck” as a noun, the profanity appearing in his play Hic et Ubique: or, The Humors of Dublin, wherein a character says “I did creep in…and there I did see [him] putting the great fuck upon my wife.” And the dictionary reflects the etymological ambiguity concerning the faux-francophone/faux-Virgilian word “dildo,” giving earliest attribution to the playwright Robert Greene, who in his 1590 comedy Never Too Late wrote “Dildido dildido, Oh love, oh love, I feel thy rage rumble below and above.” Swearing might be a radical alternative to received language, but it pulses through literature like a counter-history, a shadow realm of the English tongue’s full capabilities. It is a secret language, the twinned double of more respectable letters, and it’s impossible to understand Geoffrey Chaucer without his scatological jokes or Shakespeare minus his bawdy insults. After all, literature is just as much Charles Bukowski as T.S. Eliot; it’s William S. Burroughs and not just Ezra Pound.

Sometimes those dichotomies about what language is capable of are reconciled within the greatest of literature. A syllabus of the immaculate obscene would include the Marquis de Sade’s 120 Days of Sodom, Charles Baudelaire’s The Flowers of Evil, Gustave Flaubert’s Madame Bovary, Joyce’s Ulysses, Lawrence’s Lady Chatterley’s Lover, Vladimir Nabokov’s Lolita, Henry Miller’s Tropic of Cancer (smuggled out of a Barnes & Noble by yours truly when I was 16), and Irvine Welsh’s Trainspotting. Along with his fellow Scotsman James Kelman, Welsh shows the full potential of obscenity to present an assault on the pieties of the bourgeoisie, mocking Madison Avenue sophistry when he famously implores the reader to “Choose rotting away, pishing and shiteing yersel in a home, a total fuckin embarrassment tae the selfish, fucked-up brats ye’ve produced. Choose life.” Within the English language, looming above all as the progenitor of literary smut, is the great British author John Cleland, who in 1748 published our first pornographic novel in Fanny Hill: Or, Memoirs of a Woman of Pleasure, wherein he promised “Truth! stark naked truth, is the word, and I will not so much as take the pains to bestow the strip of a gauze-wrapper on it.” Cleland purposefully wrote Fanny Hill entirely in euphemisms and double entendres, but the lack of dirty words couldn’t conceal the fact that the orgiastic bildungsroman about a middle-aged nymphomaniac was seen as unspeakably filthy. The novel has the distinction of being the longest banned work in U.S. history, first prohibited by the Massachusetts Supreme Court in 1821, only to be sold legally after the U.S. Supreme Court ruled that its censorship was unconstitutional in 1966. The same year that Bruce was found face-down, naked and dead, in his California bathroom.

It’s a goddamn unequivocal fucking triumph of the human spirit that any fucking wanker can march into a public library and check out a copy of Fanny Hill. That liberty was hard fought for, and we should look askance at anyone who’d throw it away too cavalierly. But there is also something disingenuous about dismissing all those who suppressed works like Fanny Hill or Ulysses or Lady Chatterley’s Lover as mere prigs and prudes. A work is never censored because it isn’t powerful; it’s attacked precisely because of that coiled, latent energy that exists within words, none more so than those we’ve labeled forbidden. If the debate over free speech and censorship is drenched in a sticky oil of bad faith, then that slick spills over into all corners. My fellow liberals will mock the conservative perspective that says film or comic books or video games or novels are capable of moving someone to action, sometimes very ugly action—but of course literature is capable of doing this. Why would we read literature otherwise? Why would we create it otherwise? The censor with his black marker in some ways does due service to literature, acknowledging its significance and its uncanny effect. To claim that literature shouldn’t be censored because all literature is safe is not just fallacious, it’s disrespectful. The far more difficult principle is that literature shouldn’t be censored despite the fact that it’s so often dangerous.

Like any grimoire or incantation, obscenity can be used to liberate and to oppress, to free and to enslave, to bring down those in power but also to froth a crowd into the most hideous paroxysms of fascistic violence. So often the moralistic convention holds that “punching down” is never funny, but the dark truth is that it often is. What we do with that reality is the measure of us as people, because obscenity is neither good nor bad, but all its power resides in the mouth of whoever wields it. What we think of as profanity is a rupture within language, a dialectic undermining conventional speech, what the Greeks called an aporia, the moment when rhetoric breaks down. Obscenity is when language declares war on itself, often with good cause. Writing in Rabelais and His World, the great Russian critic Mikhail Bakhtin defined what he called the “carnivalesque,” that is, the principle that structured much medieval and Renaissance performance and literature, whereby the “principle of laughter and the carnival spirit on which the grotesque is based destroys…seriousness and all pretense.” Examining the Shrovetide carnivals that inaugurated pre-Reformation Lent, Bakhtin optimistically saw something liberatory in the ribald spectacle of upended hierarchies, where the farting, shitting, pissing, vomiting hilarity of the display rendered authority foolish. “It frees human consciousness,” Bakhtin wrote, “and imagination for new potentialities.”

An uneasy and ambivalent undercurrent threads through Bakhtin’s argument, though. If the carnival allowed for a taste of emancipation, there was also always the possibility that it was just more bread and circuses, a way to safely “rebel” without actually challenging the status quo. How many of our fucks and shits are just that, simply the smearing of feces on our playpen walls? Even worse, what happens when the carnival isn’t organized by plucky peasants to mock the bishops and princes, but when the church and state organize those mocking pageants themselves? Bakhtin didn’t quite anticipate the troll, nor did Bruce for that matter. Gershon Legman writes in the standard text Rationale of the Dirty Joke: An Analysis of Sexual Humor that “Under the mask of humor, our society allows infinite aggressions, by everyone and against everyone. In the culminating laugh of the listener or observer…the teller of the joke betrays his hidden hostility.” Can’t you take a joke? Because I was just joking. Legman’s reading of obscenity is crucial—it’s never just innocent, it’s never just nothing, it’s never just words. And it depends on who is saying them, and to whom they’re being said. Because swearing is so intimately tied to the theological, the use of profanity literally takes on the aura of damnation. It’s not that words aren’t dangerous—they are. But that doesn’t mean we must suture our mouths, even as honesty compels us to admit that danger. What we do with this understanding is the process that we call civilization. Because if Lenny Bruce had one unassailable and self-evident observation, it was that “Life is a four-letter word.” How could it be fucking otherwise?


Ten Ways to Save the World

1. In a purple-walled gallery of the Smithsonian American Art Museum, you can visit the shrine constructed by Air Force veteran and janitor James Hampton for Jesus Christ’s return. Entitled “Throne of the Third Heaven of the Nations’ Millennium General Assembly,” the altar and its paraphernalia were constructed to serve as temple objects for the messiah, who, according to Hampton, based on visions he had of Moses in 1931, the Virgin Mary in 1946, and Adam in 1949, shall arrive in Washington, D.C. His father had been a part-time gospel singer and Baptist preacher, but Hampton drew not just from Christianity; he also brought the Afrocentric folk traditions of his native South Carolina to bear in his composition. Decorated with passages from Daniel and Revelation, Hampton’s thinking (worked out in secret over 14 years in his Northwest Washington garage) is explicated in his 100-page manifesto St. James: The Book of the 7 Dispensations (dozens of pages are still in an uncracked code). Claiming that he had received a revised version of the Decalogue, Hampton declared himself in his notebook to be “Director, Special Projects for the State of Eternity.” His work is a fugue of word salad, a concerto of pressured speech, a staging ground for the incipient millennium—Hampton’s shrine is a triumph.

As if the bejeweled shield of the Urim and the Thummim were constructed not by Levites in ancient Jerusalem, but by a janitor in Mt. Vernon. Exodus and Leviticus give specifications for those liturgical objects of the Jewish Temple—the other-worldly gilded cherubim, their wings touching the hem of infinity, huddled over the Ark of the Covenant; the woven brocade curtain with its many-eyed Seraphim rendered in fabric of red and gold; the massive candelabrum of the ritual menorah. The materials with which the Jews built their Temple were cedar and sandstone, gold and precious jewels. When God commanded Hampton to build his new shrine, the materials were light-bulbs and aluminum foil, door frames and chair legs, pop cans and cardboard boxes, all held together with glue and tape. The overall effect is, if lacking in gold and cedar, transcendent nonetheless. Hampton’s construction looks almost Mesoamerican, aluminum foil delicately hammered onto carefully measured cardboard altars, the names of prophets and patriarchs from Ezekiel to Abraham rendered across them.

Every day Hampton would return from his job at the General Services Administration, where he would mop floors and disinfect counters, and for untold hours he’d assiduously sketch out designs based on his dreams, carefully applying foil to wood and cardboard, constructing crowns from trash he’d collected on U Street. What faith would compel this, what belief to see it finished? Nobody knew he was doing it. Hampton would die of stomach cancer in 1964, never married, and with few friends or family. The shrine would be discovered by a landlord angry about late rent. Soon it would come to the attention of reporters, and then the art mavens who thrilled to the discovery of “outsider” art—that is, work accomplished by the uneducated, the mentally disturbed, the impoverished, the religiously zealous. “Throne of the Third Heaven of the Nations’ Millennium General Assembly” would be purchased and donated to the Smithsonian (in part through the intercession of artist Robert Rauschenberg), where it would be canonized as the Pieta of American visionary art, outsider art’s Victory of Samothrace.

Hampton wasn’t an artist though—he was a prophet. He was Elijah and Elisha awaiting Christ in the desert. Daniel Wojcik writes in Outsider Art: Visionary Worlds and Trauma that “apocalyptic visions often have been expressions of popular religiosity, as a form of vernacular religion, existing at a grassroots level apart from the sanction of religious authority.” In that regard Hampton was like so many prophets before him, just working in toilet paper and beer cans rather than papyrus—he was Mt. Vernon’s Patmos. Asking if Hampton was mentally ill is the wrong question; it’s irrelevant whether he was schizophrenic or bipolar. Etiology only goes so far in deciphering the divine language, and who are we, so sure of ourselves, to say that the voice in a janitor’s head wasn’t that of the Lord? Greg Bottoms writes in Spiritual American Trash: Portraits from the Margins of Art and Faith that Hampton “knew he was chosen, knew he was a saint, knew he had been granted life, this terrible, beautiful life, to serve God.” Who among us can say that he was wrong? In his workshop, Hampton wrote on a piece of paper “Where there is no vision, the people perish.” There are beautiful and terrifying things hidden in garages all across America; there are messiahs innumerable. Hampton’s shrine is strange, but it is oh so resplendent.

2. By the time Brother James Nayler genuflected before George Fox, the founder of the Quaker Society of Friends, his tongue had already been bored through with a hot iron poker and the letter “B” (for “Blasphemer”) had been branded onto his forehead by civil authorities. The two had not gotten along in the past, arguing over the theological direction of the Quakers, but by 1659 Nayler was so broken by their mutual enemies that he was forced to drag himself to Fox’s parlor and beg forgiveness. Three years had changed the preacher’s circumstances, for it was in imitation of the original Palm Sunday that in 1656 Nayler had triumphantly entered the port city of Bristol upon the back of a donkey, the religious significance of the performance inescapable to anyone. A supporter noted in a diary that Nayler’s “name is no more to be called James but Jesus,” while in private writings Fox noted that “James ran out into imaginations… and they raised up a great darkness in the nation.”

At the start of 1656, Nayler was imprisoned, and when Fox visited him in his cell, the latter demanded that the former kiss his foot (belying the Quaker reputation for modesty). “It is my foot,” Fox declared, but Nayler refused, with the confidence of a man who would reenact Christ’s entrance into Jerusalem. Guided by the Inner Light that Quakers saw as supplanting even the gospels, Nayler thought of his mission in messianic terms, and organized his ministry to reflect that. Among the “Valiant Sixty,” the itinerant preachers who carried the early Quaker message across Britain, Nayler was the most revolutionary, condemning slavery, enclosure, and private property. The tragedy of Nayler is that he happened not to actually be the messiah. Before his death, following an assault by a highwayman in 1660, Nayler would write that his “hope is to outlive all wrath and contention, and to wear out all exaltation and cruelty, or whatever is of a nature contrary to itself.” He was 42, beating Christ by almost a decade.

“Why was so much fuss made?” asks Christopher Hill in his classic The World Turned Upside Down: Radical Ideas During the English Revolution. “There had been earlier Messiahs—William Franklin, Arise Evans who told the Deputy Recorder of London that he was the Lord his God…Gadbury was the Spouse of Christ, Joan Robins and Mary Adams believed they were about to give birth to Jesus Christ.” Hill’s answer to the question of Nayler’s singularity is charitable: none of the others actually seemed dangerous, since they were merely “holy imbeciles.” The 17th century, especially around the time of the English civil wars, was an age of blessed insanity, messiahs proliferating like dandelions after a spring shower. There were John Reeve, Laurence Clarkson, and Lodowicke Muggleton, who took turns arguing over which of them were the two witnesses mentioned in Revelation, and who held that God had absconded from heaven and that the job was now open. Abiezer Coppe, prophet of a denomination known as the Ranters, demonstrated that designation in his preaching and writing. One prophet, TheaurauJohn Tany (who designated himself “King of the Jews”), simply declared “What I have written, I have written,” including the radical message that hell was liberated and damnation abolished. Regarding the here and now, Tany had some similarly radical prescriptions, including to “feed the hungry, clothe the naked, oppress none, set free them bounden.”

There have been messianic claimants from first-century Judea to contemporary Utah. When St. Peter was still alive there was the Samaritan magician Simon Magus, who used Christianity as magic and could fly, only to be knocked from the sky during a prayer-battle with the apostle. In the third century the Persian prophet Mani founded a religion that fused Christ with the Buddha, won adherents from Gibraltar to Jinjiang, and endured for more than a millennium (with its teachings smuggled into Christianity by former adherent St. Augustine). A little before Mani, a Phrygian prophet named Montanus declared himself an incarnation of the Holy Spirit, along with his consorts Priscilla and Maximilla. Prone to fits of convulsing revelation, Montanus declared “Lo, the man is as a lyre, and I fly over him as a pick.” Most Church Fathers denounced Montanism as rank heresy, but not Tertullian, who, despite being the progenitor of Latin theology, was never named St. Tertullian because of those enthusiasms. During the Middle Ages, at a time when stereotype might have it that orthodoxy reigned triumphant, mendicants and messiahs, some whose names aren’t preserved to history and some who amassed thousands of followers, proliferated across Europe. Norman Cohn remarks in The Pursuit of the Millennium that for one eighth-century Gaulish messiah named Aldebert, followers “were convinced that he knew all their sins…and they treasured as miracle-working talismans the nail parings and hair clippings he distributed among them.” Pretty impressive, but none of us get off work for Aldebert’s birthday.

More recently, other messiahs include the 18th-century prophetess and mother of the Shakers Ann Lee, the 20th-century founder of the Korean Unification Church Sun Myung Moon (known for his elaborate mass weddings and for owning the conservative Washington Times), and the French test-car driver Claude Vorilhon, who renamed himself Raël and announced that he was the son of an extraterrestrial named Yahweh (more David Bowie’s The Rise and Fall of Ziggy Stardust and the Spiders from Mars than Paul’s epistles). There are as many messiahs as there are people; there are malicious messiahs and benevolent ones, deluded head-cases and tricky confidence men, visionaries of transcendent bliss and sputtering weirdos. What unites all of them is an observation made by Reeve that God speaks to them as “to the hearing of the ear as a man speaks to a friend.”

3. Hard to identify Elvis Presley’s apotheosis. Could have been the ’68 Comeback Special, Elvis decked in black leather and warbling “One Night” in that snarl-mumble, his years in the wilderness (precipitated by his manager Col. Tom Parker’s disastrous gambit to have the musician join the army, only for the industry to move on from his rockabilly style) ending in resurrection on a Burbank sound stage. Or maybe it was earlier, on The Milton Berle Show in 1956, performing Big Mama Thornton’s hit “Hound Dog” while gyrating on the Hollywood stage, leading a critic for the New York Daily News to opine that Elvis “gave an exhibition that was suggestive and vulgar, tinged with the kind of animalism that should be confined to dives and bordellos.” A fair candidate for that moment of earthly transcendence could be traced back to 1955, when in Sun Records’ dusty Memphis studio Elvis covered Junior Parker’s “Mystery Train,” crooning out in a voice both shaky and confident over his guitar’s nervous warble “Train I ride, sixteen coaches long/Train I ride, sixteen coaches long/Well that long black train, got my baby and gone.” But in my estimation, and at the risk of sacrilege, Elvis’s ascension happened on Aug. 16, 1977, when he died on the toilet in the private bathroom of his tacky and opulent Graceland estate.

The story of Elvis’s death has the feeling of both apocrypha and accuracy, and like any narrative that comes from that borderland country of the mythic, it contains more truth than the simple facts can impart. His expiration is a uniquely American death, but not an American tragedy, for Elvis was able to get just as much out of this country as the country ever got out of him, and that’s ultimately our true national dream. He grabbed the nation by its throat and its crotch, and with pure libidinal fury was able to incarnate himself as the country. All of the accoutrements—the rhinestone jumpsuits, the karate and the Hawaiian schtick, the deep-fried peanut-butter-banana-and-bacon sandwiches, the sheer pill-addicted corpulence—are what make him our messiah. Even the knowing, obvious, and totally mundane observation that he didn’t write his own music misses the point. He wasn’t a creator—he was a conduit. Greil Marcus writes in Mystery Train: Images of America in Rock ‘n’ Roll Music of the “borders of Elvis Presley’s delight, of his fine young hold on freedom…[in his] touch of fear, of that old weirdness.” That’s the Elvis that saves, the Elvis of “That’s All Right (Mama)” and “Shake, Rattle, and Roll,” those strange hillbilly tracks, that weird chimerical sound—complete theophany then and now.

There’s a punchline quality to that contention about white-trash worshipers at the Church of Elvis, all of those sightings in the Weekly World News, the Las Vegas impersonators of various degrees of girth, the appearance of the singer in the burnt pattern of a tortilla. This is supposedly a faith that takes its pilgrimage to Graceland as if it were a zebra-print Golgotha, that visits Tupelo as if it were Nazareth. John Strausbaugh takes an ethnographer’s calipers to Elvism, arguing in E: Reflections on the Birth of the Elvis Faith that Presley has left in his wake a bona fide religion, with its own liturgy, rituals, sacraments, and scripture. “The fact that outsiders can’t take it seriously may turn out to be its strength and its shield,” he writes. “Maybe by the time Elvism is taken seriously it will have quietly grown too large and well established to be crushed.” There are things less worthy of your worship than Elvis Presley. If we were to think of an incarnation of the United States, of a uniquely American messiah, few candidates would be more all-consumingly like the collective nation. In his appetites, his neediness, his yearning, his arrogance, his woundedness, his innocence, his simplicity, his cunning, his coldness, and his warmth, he was the first among Americans. Elvis is somehow both rural and urban, northern and southern, country and rock, male and female, white and black. Our contradictions are reconciled in him. “Elvis lives in us,” Strausbaugh writes. “There is only one King and we know who he is.” We are Elvis and He was us.

4. A hideous slaughter followed the settlers as they drove deep into the continent. On that western desert, where the lurid sun’s bloodletting upon the burnt horizon signaled the end of each scalding day, a medicine man and prophet of the Paiute people had a vision. In a trance, Wodziwob received an oracular missive, that “within a few moons there was to be a great upheaval or earthquake… the whites would be swallowed up, while the Indians would be saved.” Wodziwob would be the John the Baptist to a new movement, for though he would die in 1872, the ritual practice that he taught—the Ghost Dance—would become a rebellion against the genocidal policy of the U.S. Government. For Wodziwob, the Ghost Dance was an affirmation, but it has also been remembered as a doomed moment. “To invoke the Ghost Dance has been to call up an image of indigenous spirituality by turns militant, desperate, and futile,” writes Louis S. Warren in God’s Red Son: The Ghost Dance Religion and the Making of Modern America, “a beautiful dream that died.” But a dream that was enduring.

While in a coma precipitated by scarlet fever, during a solar eclipse, on New Year’s Day of 1889, a Northern Paiute Native American who worked on a Carson City, Nevada, ranch and was known as Jack Wilson by his coworkers and as Wovoka to his own people, fell into a mystical vision not unlike Wodziwob’s. Wovoka met many of his dead family members, he saw the prairie that exists beyond that which we can see, and he held counsel with Jesus Christ. Wovoka was taught the Ghost Dance, and learned what Sioux Chief Lame Deer would preach, that “the people…could dance a new world into being.” When Wovoka returned he could control the weather, he was able to compel hail from the sky, he could form ice with his hands on the most sweltering of days. The Ghost Dance would spread throughout the western United States, embraced by the Paiute, the Dakota, and the Lakota. Lame Deer said that the Ghost Dance would roll up the earth “like a carpet with all the white man’s ugly things—the stinking new animals, sheep and pigs, the fences, the telegraph poles, the mines and factories. Underneath would be the wonderful old-new world.”

A messiah is simultaneously the most conservative and the most radical of figures, preaching a return to a perfected world that never existed but also the overturning of everything of this world, of the jaundiced status quo. The Ghost Dance married the innate strangeness of Christianity to the familiarity of native religion, and like both it provided a blueprint for how to overthrow the fallen things. Like all true religion, the Ghost Dance was incredibly dangerous. That was certainly the view of the U.S. Army and the Bureau of Indian Affairs, which saw an apocalyptic faith as a danger to white settler-colonials and their indomitable, zombie-like push to the Pacific. Manifest Destiny couldn’t abide the hopefulness of a revival as simultaneously joyful and terrifying as the Ghost Dance, and so an inevitable confrontation awaited.

The Miniconjou Lakota people, forced into the Pine Ridge Reservation by 1890, were seen as particularly rebellious, in part because their leader Spotted Elk was an adherent. Using Lakota resistance to disarmament as a pretext, the army opened fire on the gathered Miniconjou, and more than 150 people (mostly women and children) would be slaughtered in the Wounded Knee Massacre. As surely as the Romans threw Christians to the lions and Cossacks rampaged through the Jewish shtetls of eastern Europe, so too were the initiates of the Ghost Dance persecuted, murdered, and martyred by the U.S. Government. Warren writes that the massacre “has come to stand in for the entire history of the religion, as if the hopes of all of its devoted followers began and ended in that fatal ravine.” Wounded Knee was the Calvary of the Ghost Dance faith, but if Calvary has any meaning it’s that crucified messiahs have a tendency not to remain dead. In 1973 a contingent of Oglala Lakota and members of the American Indian Movement occupied Wounded Knee, and the activist Mary Brave Bird defiantly performed the Ghost Dance once again.

5. Rabbi Menachem Mendel Schneerson arrived in the United States via Paris, via Berlin, and ultimately via Kiev. He immigrated to New York on the eve of America’s entry into the Second World War, and in the years that followed six million Jews were immolated in Hitler’s ovens—some two-thirds of the Jews of Europe. Becoming the Lubavitcher Rebbe in 1950, Schneerson was a refugee from a broken Europe that had devoured itself. Schneerson’s denomination of Hasidism had emerged in the century after the vicious Cossack-led pogroms that punctuated life in 17th-century eastern Europe, when many Jews turned towards the sect’s founder, the Baal Shem Tov. His proper name was Rabbi Israel ben Eliezer, and his title (often shortened to “Besht”) meant “Master of the Good Name,” for the Baal Shem Tov incorporated Kabbalah into a pietistic movement that enshrined emotion over reason, feeling over logic, experience over philosophy. David Biale writes in Hasidism: A New History that the Besht espoused “a new method of ecstatic joy and a new social structure,” a fervency that lit a candle against persecution’s darkness.

When Schneerson convened a gathering of Lubavitchers in a Brooklyn synagogue for Purim in 1953, a black cloud enveloped the Soviet Union. Joseph Stalin was beginning to target Jews whom he implicated in the “Doctors’ Plot,” an invented accusation that Jewish physicians were poisoning the Soviet leadership. The state propaganda organ Pravda denounced these supposed members of a “Jewish bourgeois-nationalist organization… The filthy face of this Zionist spy organization, covering up their vicious actions under the mask of charity.” Four gulags were constructed in Siberia, with the understanding that Russian Jews would be deported and perhaps exterminated. Less than a decade after Hitler’s suicide, Schneerson would look out into the congregation of swaying black-hatted Lubavitchers and see a people marked for extinction.

And so on that evening, Schneerson explicated the finer points of Talmudic exegesis, the questions of why evil happens in the world and what man’s and G-d’s roles are in containing that wickedness. Witnesses said that the very countenance of the rabbi was transformed as he declared that he would speak the words of the living G-d. Enraptured in contemplation, Schneerson connected the Persian courtier Haman’s war against the Jews with Stalin’s upcoming campaign, he invoked G-d’s justice and mercy, and he implored the divine to intervene and prevent the Soviet dictator from completing that which Hitler had begun. The Rebbe denounced Stalin as the “evil one,” and as he shouted it was said that his face transformed into a “holy fire.”

Two days later Moscow State Radio announced that Stalin had fallen ill and died. The exact moment of his expiration, it was said, came as a group of Lubavitch Jews prayed that G-d would still the hand of the tyrant and punish his iniquities. Several weeks later, the Soviet leadership would admit that the Doctors’ Plot was a government ruse invented by Stalin, and they exonerated all of those who’d been punished as a result of the baseless accusations. By the waning days of the Soviet Union, the Lubavitcher Rebbe would address crowds gathered in Red Square by telescreen while the Red Army Band performed Hasidic songs. “Was this not the victory of the Messiah over the dark forces of the evil empire, believers asked?” write Samuel Heilman and Menachem Friedman in The Rebbe: The Life and Afterlife of Menachem Mendel Schneerson.

The Mashiach (“anointed one” in Hebrew) is neither the Son of G-d nor the incarnate G-d, and his goal is arguably more that of liberation than salvation (whatever either term means). Just as Christianity has had many pseudo-messiahs, so is Jewish history littered with figures whom some believers saw as the anointed one (Christianity is merely the most successful of these offshoots). During the Second Jewish-Roman War of the second century, the military commander Simon bar Kokhba was lauded as the messiah, even as his defeat led to Jewish exile from the Holy Land. During the 17th century, the Ottoman Jew Sabbatai Zevi amassed a huge following of devotees who believed him the messiah come to subvert and overthrow the strictures of religious law itself. Zevi was defeated by the Ottomans not through crucifixion, but through conversion (which is much more dispiriting). A century later, the Polish libertine Jacob Frank would declare that the French Revolution was the apocalypse, that Christianity and Judaism must be synthesized, and that he was the messiah. Compared to them, the Rebbe was positively orthodox (in all senses of the word). He also never claimed to be the messiah.

What all share is the sense that to exist is to be in exile. That is the fundamental lesson and gift of Judaism, born from the particularities of Jewish suffering. Diaspora is not just a political condition, or a social one; diaspora is an existential state. We are all marooned from our proper divinity, shattered off from G-d’s being—alone, disparate, isolated, alienated, atomized, solipsistic. If there is to be any redemption it’s in suturing up those shards, collecting those bits of light cleaved off from the body of G-d when He dwelled in resplendent fullness before the tragedy of creation. Such is the story of going home but never reaching that destination, yet continuing nevertheless. What gives this suffering such beauty, what redeems the brokenness of G-d, is the sense that it’s that very shattering that imbues all of us with holiness. It is what the German-Jewish philosopher Walter Benjamin describes in On the Concept of History as the sacred reality that holds that “every second was the narrow gate, through which the Messiah could enter.”

6. When the Living God landed at Palisadoes Airport in Kingston, Jamaica, on April 21, 1966, he couldn’t immediately disembark from his Ethiopian Airlines flight from Addis Ababa. More than 100,000 people had gathered at the airport, the air thick with the sticky, sweet smell of ganja, the airstrip so overwhelmed with worshipers come to greet the Conquering Lion of the Tribe of Judah, His Imperial Majesty Haile Selassie I, King of Kings, Lord of Lords, Elect of God, Power of the Trinity, of the House of Solomon, Amhara Branch, noble Ras Tafari Makonnen, that there was a fear the plane itself might tip over. The Ethiopian Emperor, incarnation of Jah and the second coming of Christ, remained in the plane for a few minutes until a local religious leader, the drummer Ras Mortimer Planno, was allowed to organize the emperor’s descent.

Finally, after several tense minutes, the crowd pulled back long enough for the emperor to disembark onto the tarmac, the first time that Selassie would set foot on the fertile soil of Jamaica, a land distant from the Ethiopia that he’d ruled over for 36 years (excluding 1936 to 1941, when his home country was occupied by the Italian fascists). Jamaica was where he’d first been acknowledged as the messianic promise of the African diaspora. A year after his visit, while being interviewed by the Canadian Broadcasting Corporation, Selassie was asked what he made of the claims about his status. “I told them clearly that I am a man,” he said, “that I am mortal…and that they should never make a mistake in assuming or pretending that a human being is emanated from a deity.” The thing about being a messiah, though, is that adoration does not depend on the consent of the one being worshiped.

Syncretic and born from the Caribbean experience, practiced from Kingston, Jamaica, to Brixton, London, Rastafarianism is a mélange of Christian, Jewish, and uniquely African symbols and beliefs, with its own novel rhetoric concerning oppression and liberation. Popularized throughout the West because of the indelible catchiness of reggae, with its distinctive muted third beat, and the charisma of the musician Bob Marley, who was the faith’s most famous ambassador, Rastafarianism is sometimes offensively reduced in people’s minds to dreadlocks and spliff smoke. Ennis Barrington Edmonds places the faith’s true influence in its proper context, writing in Rastafari: From Outcasts to Culture Bearers that “the movement has spread around the world, especially among oppressed people of African origins… [among those] suffering some form of oppression and marginalization.”

Central to the narrative of Rastafarianism is the reluctant messiah Selassie, a life-long member of the Ethiopian Orthodox Tewahedo Church. Selassie’s reign had been prophesied by the Jamaican Protestant evangelist Leonard Howell, who claimed that the crowning of an independent Black king in an Africa dominated by European colonialism would mark the dawn of a messianic dispensation. A disciple of Black nationalist Marcus Garvey, whom Howell met when both lived in Harlem, the minister read Psalm 68:31’s injunction that “Ethiopia shall soon stretch out her hands unto God” as being fulfilled in Selassie’s coronation. Some sense of this reverence is imparted by a Rastafarian named Reuben in Emily Raboteau’s Searching for Zion: The Quest for Home in the African Diaspora, who explained that “Ethiopia was never conquered by outside forces. Ethiopia was the only independent country on the continent of Africa…a holy place.” Sacred Ethiopia, the land where the Ark of the Covenant was preserved.

That the actual Selassie never embraced Rastafarianism, was not particularly benevolent in his own rule, and was indeed deposed by a revolutionary Marxist junta is of no account. Rather, what threads through Rastafarianism is what Raboteau describes as a “defiant, anticolonialist mind-set, a spirit of protest… and a notion that Africa is the spiritual home to which they are destined to return.” Selassie’s biography bore no resemblance to the lives of residents of the Trenchtown slum of Kingston where the veneration of a distant African king began, but his name served as rebellion against all agents of Babylon in the hopes of a new Zion. Rastafarianism found in Selassie the messiah who was needed, and in their faith there is a proud way of spiritually repudiating the horrors of the trans-Atlantic slave trade. What their example reminds us of is that a powerful people have no need of a messiah, that he rather always dwells amongst the dispossessed, regardless of what his name is.

7. Among the Persian Sufis there is no blasphemy in staining your prayer rug red with shiraz. Popular throughout Iran and into central Asia, where the faiths of Zoroaster and Mani had both once been dominant, Sufism drew upon those earlier mystical and poetic traditions and incorporated them into Islam. The faith of the dervishes, the piety of the wali, the Sufi tradition is that of Persian miniatures painted in stunning, colorful detail, of the poetry of Rumi and Hafez. Often shrouded in rough woolen coats and felt caps, the Sufis practice a mystical version of faith that’s not dissimilar to Jewish kabbalah or Christian hermeticism, an inner path that the 20th-century writer Aldous Huxley called the “perennial philosophy.” As with other antinomian faiths, the Sufis often skirt the line of what’s acceptable and what’s forbidden, seeing in heresy intimations of a deep respect for the divine.

A central poetic topos of Sufi practice is what’s called shath, that is, deliberately shocking utterances that exist to shake believers out of pious complacency, to awaken within them that which is subversive about God. One master of the form was Mansour al-Hallaj, born to Persian-speaking parents (with a Zoroastrian grandfather) in the ninth century during the Abbasid Caliphate. Where most Sufi masters were content to keep their secrets preserved for initiates, al-Hallaj crafted a movement democratized for the mass of Muslims, while also generating a specialized language for speaking of esoteric truths, expressed in “antithesis, breaking down language into prepositional units, and paradox,” as the scholar Carl W. Ernst writes in his translation Hallaj: Poems of a Sufi Martyr. Al-Hallaj’s knowledge and piety were deep—he had memorized the Koran by the age of 12 and he prostrated himself before a replica of Mecca’s Kaaba in his Baghdad garden—but so was his commitment to the radicalism of shath. When asked where Allah was, he once replied that the Lord was within his turban; on another occasion he answered that question by saying that God was under his cloak. Finally in 922, borrowing one of the 99 names of God, he declared “I am the Truth.”

The generally tolerant Abbasids decided that something should be done about al-Hallaj, and so he was tied to a post along the Tigris River, repeatedly punched in the face, lashed several times, decapitated, and finally his headless body was hung over the water. His last words were “Akbar al-Hallaj”—“Al-Hallaj is great.” An honorific normally offered to God, but for this self-declared heretical messiah his name and that of the Lord were synonyms. What’s sacrilegious about this might seem clear, save that al-Hallaj’s Islamic piety was such that he interpreted the claim as the natural culmination of Tawhid, the strictness of Islamic monotheism pushed to its logical conclusion—there is but one God, and everything is God, and we are all in God. Idries Shah explains the tragic failure of interpretation among religious authorities in The Sufis, writing that the “attempt to express a certain relationship in language not prepared for it causes the expression to be misunderstood.” The court, as is obvious, did not agree.

If imagining yourself as the messiah could get you decapitated by the 10th-century Abbasids, then in 20th-century Michigan it only got you institutionalized. Al-Hallaj implied that he was the messiah; each of the psychiatric patients in Milton Rokeach’s 1964 study The Three Christs of Ypsilanti thought that he alone was the authentic messiah (with much displeasure ensuing when they met). Based on three years of observation at the Ypsilanti State Hospital starting in 1959, Rokeach treated this trinity of paranoid schizophrenics. Initially Rokeach thought that the meeting of the Christs would disabuse them all of their delusions, as if the law of logical non-contradiction meant anything to a psychotic. But the messiahs were steadfast in their faith—each was singular and the others were imposters. Then Rokeach and his graduate students introduced fake messages from other divine beings, in a gambit that the psychiatrist apologized for two decades later and that would most definitely land him before an ethics board today. Finally Rokeach granted them the right to their insanities, each of the Christs of Ypsilanti continuing in his merry madness. “It’s only when a man doesn’t feel that he’s a man,” Rokeach concludes, “that he has to be a god.”

Maybe. Or maybe the true madness of the Michigan messiahs was that each thought himself the singular God. They weren’t in error in believing that each of them was the messiah; they were in error in denying that truth in their fellow patients. Al-Hallaj would have understood, declaring before his executioner that “all that matters for the ecstatic is that the Unique shall reduce him to Unity.” The Christs may have benefited more from Sufi treatment than from psychotherapy. Clyde Benson, Joseph Cassel, and Leon Gabor all thought themselves to be God, but al-Hallaj knew that he was (and that You reading this are as well). Those three men may have been crazy, but al-Hallaj was a master of what the Buddhist teacher Wes Nisker calls “crazy wisdom.” In his guidebook The Essential Crazy Wisdom, Nisker celebrates the sacraments of “clowns, jesters, tricksters, and holy fools,” who understand that “we live in a world of many illusions, that the emperor has no clothes, and that much of human belief and behavior is ritualized nonsense.” By contrast, the initiate in crazy wisdom, whether gnostic saint or Kabbalist rabbi, Sufi master or Zen monk, prods at false piety to reveal deeper truths underneath. “I saw my Lord with the eye of the heart,” al-Hallaj wrote in one poem, “I asked, ‘Who are You?’/He replied, ‘You.’”

8. “Bob” is the least likely-looking messiah. With his generic handsomeness, his executive haircut dyed black and tightly parted on the left, the avuncular pipe that jauntily sticks out of his tight smile, “Bob” looks like a stock image of a 1950s paterfamilias (his name is always spelled with quotation marks). Like a clip-art version of Mad Men’s Don Draper, or Utah Sen. Mitt Romney. “Bob” is also not real (which may or may not distinguish him from other messiahs), but rather the central figure in the parody Church of the SubGenius. Supposedly a traveling salesman, J.R. “Bob” Dobbs had a vision of JHVH-1 (the central God in the church) in a homemade television set, and he then went on the road to evangelize. Founded by countercultural slacker heroes Ivan Stang and Philo Drummond in 1979 (though each claims that “Bob” was the actual progenitor), the Church of the SubGenius is a veritable font of crazy wisdom, promoting the anarchist practice of “culture jamming” and parody in the promulgation of a faith where it’s not exactly clear what’s serious and what isn’t.

“Bob” preaches a doctrine of resistance against JHVH-1 (or Jehovah 1), the demiurge who seems as if a cross between Yahweh and a Lovecraftian elder god. JHVH-1 intended for “Bob” to encourage a pragmatic, utilitarian message about the benefits of a work ethic, but contra his square appearance, the messiah preferred to advocate that his followers pursue a quality known as slack. Never clearly defined (though its connotations are obvious), slack is to the Church of the SubGenius what the Tao is to Taoism or the Word is to Christianity, both the font of all reality and that which gives life meaning. Converts to the faith include luminaries like the underground cartoonist Robert Crumb, Pee-wee Herman Show creator Paul Reubens, Talking Heads founder David Byrne, and of course Devo’s Mark Mothersbaugh. If any commandment most resonates with the emotional tenor of the church, it’s “Bob’s” holy injunction that “Fuck ’em if they can’t take a joke.”

The Church of the SubGenius is oftentimes compared to another parody religion that finds its origins in a similar countercultural milieu, though it first appeared almost two decades earlier, in 1963, under the ominous name of Discordianism. Drawing from Hellenic paganism, Discordianism holds as one of its central axioms in the Principia Discordia (written by founders Greg Hill and Kerry Wendell Thornley under the pseudonyms of Malaclypse the Younger and Omar Khayyam Ravenhurst) that the “Aneristic Principle is that of apparent order; the Eristic Principle is that of apparent disorder. Both order and disorder are man made concepts and are artificial divisions of pure chaos, which is a level deeper than is the level of distinction making.” Where other mythological systems see chaos as being tamed and subdued during ages primeval, the pious Discordian understands that disorder and disharmony remain the motivating structure of reality. To that end, the satirical elements—its faux scripture, its faux mythology, and its faux hierarchy—are paradoxically faithful enactments of its central metaphysics.

For those of a conspiratorial bent, it’s worth noting that Thornley first conceived of the movement after leaving the Marine Corps, where he had been an associate of Lee Harvey Oswald. It sounds a little like one of the baroque plots in Robert Anton Wilson and Robert Shea’s The Illuminatus! Trilogy. A compendium of occult and conspiratorial lore whose narrative complexity recalls James Joyce or Thomas Pynchon, The Illuminatus! Trilogy was an attempt to produce for Discordianism what Dante crafted for Catholicism or John Milton for Protestantism: a work of literature commensurate with theology. Stang and Drummond, it should be said, were avid readers of The Illuminatus! Trilogy. “There are periods of history when the visions of madmen and dope fiends are a better guide to reality than the common-sense interpretation of data available to the so-called normal mind,” writes Wilson. “This is one such period, if you haven’t noticed already.” And how.

Inventing religions and messiahs wasn’t merely an activity for 20th-century pot smokers. The fear of uncovering the Christ who isn’t actually there can be seen as early as the 10th century, when the Qarmatian warlord Abu Tahir al-Jannabi wrote about a supposed tract that referenced the “three imposters,” an atheistic denunciation of Moses, Jesus, and Muhammad. This equal opportunity apostasy, attacking all three children of Abraham, haunted monotheism over the subsequent millennium, as the infernal manuscript was attributed to several different figures. In the 13th century, Pope Gregory IX said that the Holy Roman Emperor Frederick II had authored such a work (the latter denied it). Within Giovanni Boccaccio’s 14th-century The Decameron there is reference to the “three imposters,” and in the 17th century, Sir Thomas Browne credited the Italian Protestant refugee Bernardino Ochino with having composed a manifesto against the major monotheistic faiths.

What’s telling is that everyone feared the specter of atheism, but no actual text existed. They were scared of a possibility without an actuality, terrified of a dead God who was still alive. It wouldn’t be until the 18th century that an actual text was supplied, in the form of an anonymous French pamphlet of 1719, the Treatise of the Three Imposters. That work, going through several different editions over the next century, drew on the naturalistic philosophy of Benedict Spinoza and Thomas Hobbes to argue against the supernatural status of religion. Its author, possibly the bibliographer Prosper Marchand, argued that the “attributes of the Deity are so far beyond the grasp of limited reason, that man must become a God himself before he can comprehend them.” One imagines that the prophets of the Church of the SubGenius and Discordianism, inventors of gods and messiahs aplenty, would concur. “Just because some jackass is an atheist doesn’t mean that his prophets and gods are any less false,” preaches “Bob” in The Book of the SubGenius.

9. The glistening promise of white-flecked SPAM coated in greasy aspic as it slips out from its corrugated blue can, plopping onto a metal plate with a satisfying thud. A pack of Lucky Strikes, with its red circle in a field of white, crinkle of foil framing a raggedly opened end, sprinkle of loose tobacco at the bottom as a last cigarette is fingered out. Heinz Baked Beans, sweet in their tomato gravy, the yellow label with its picture of a keystone slick with the juice from within. Coca-Cola—of course Coca-Cola—its ornate calligraphy on a cherry red can, the saccharine nose pinch within. Products of American capitalism, the greatest and most all-encompassing faith of the modern world, left behind on the Vanuatuan island of Tanna by American servicemen during the Second World War.

More than 1,000 miles east of Australia, Tanna was home to airstrips and naval bases, and for the local Melanesians, the 300,000 GIs housed on their island indelibly marked their lives. Certainly the first time most had seen the descent of airplanes from the blue of the sky, the first time most had seen jangling Jeeps careening over the paths of Tanna’s rainforests, the first time most had seen naval destroyers on the pristine Pacific horizon. Building upon a previous cult that had caused trouble for the jointly administered colonial British-French Condominium, the Melanesians claimed that the precious cargo of the Americans could be accessed through the intercession of a messiah known as John Frum—believed to have possibly been a serviceman who’d introduced himself as “John from Georgia.”

Today the John Frum cult still exists on Tanna. A typical service can include the raising of the flags of the United States, the Marine Corps, and the state of Georgia, while shirtless youths with “USA” painted on their chests march with faux rifles made out of sticks (other “cargo cults” are more Anglophilic, with one worshiping Prince Philip). Anthropologists first noted the emergence of the John Frum religion in the immediate aftermath of the Americans’ departure, with Melanesians apparently constructing landing strips and air traffic control towers from bamboo, speaking into leftover tin cans as if they were radio controls, all to attract back the quasi-divine Americans and their precious “cargo.” Anthropologist Holger Jebens in After the Cult: Perceptions of Other and Self in West New Britain (Papua New Guinea) describes “cargo cults” as having as their central goal the acquisition of “industrially manufactured Western goods brought by ship or aeroplane, which, from the Melanesian point of view, are likely to have represented a materialization of the superior and initially secret power of the whites.” In this interpretation, the recreated ritual objects molded from bamboo and leaves are offerings to John Frum so that he will return from the heavenly realm of America bearing precious cargo.

If all this sounds sort of dodgy, then you’ve reason to feel uncomfortable. More recently, some anthropologists have questioned the utility of the phrase “cargo cult,” and the interpretation of the function of those practices. Much of the previous model, mired in the discipline’s own racist origins, posits the Melanesians and their beliefs as “primitive,” with all of the attendant connotations to that word. In the introduction to Beyond Primitivism: Indigenous Religious Traditions and Modernity, Jacob K. Olupona writes that some scholars have redefined the relationship between ourselves and the Vanuatuans by “challenging the notion that there is a fundamental difference between modernity and indigenous beliefs.” It’s easy for an anthropologist to espy the bamboo landing field and assume that what was being enacted was a type of magical conspicuous consumption, a yearning on the part of the Melanesians to take part in our own self-evidently superior culture. The only thing that’s actually self-evident in such a view, however, is a parochial and supremacist positioning that gives little credit to unique religious practices. Better to borrow the idea of allegory in interpreting the “cargo cults,” both for the way that mode of thinking may shape the symbolism of their rituals and for what those practices could reflect about our own culture.

Peter Worsley writes in The Trumpet Shall Sound: A Study of “Cargo” Cults in Melanesia that “a very large part of world history could be subsumed under the rubric of religious heresies, enthusiastic creeds and utopias,” and this seems accurate. So much of the prurient focus, though, falls on the material side of the faith, and consequently the John Frum religion is judged as superficial. Easy for those of us who’ve never been hungry to look down on praying for food, easy for those with access to medicine to pretend that materialism is a vice. Mock praying for SPAM at your own peril; compared to salvation, I at least know what the former is. Christ’s multiplication of the loaves and fishes in the Gospel of John, after all, was a prime instance of generating cargo. John Frum, whether he is real or not, is a radical figure, for while a cursory glance at his cult might suggest that what he promises is capitalism, it’s actually the exact opposite. For those who pray to John Frum are not asking for work, or labor, or a Protestant work ethic, but rather delivery from bondage; they are asking to be shepherded into a post-scarcity world. John Frum is not intended to deliver us to capitalism, but rather to deliver us from it. John Frum is not an American, but he is from that more perfect America that exists only in the Melanesian spirit.

10. On Easter of 1300, within the red Romanesque walls of the Cistercian monastery of Chiaravalle, a group of Umiliati Sisters were convened by Maifreda da Pirovano at the grave of the Milanese noblewoman Guglielma. Having passed two decades before, the mysterious Guglielma was possibly the daughter of King Premysl Otakar I of Bohemia, having come to Lombardy with her son following her husband’s death. Wandering the Italian countryside as a beguine, Guglielma preached an idiosyncratic gospel, claiming that she was an incarnation of the Holy Spirit, and that her passing would usher in the third historical dispensation, destroying the patriarchal Roman Catholic Church in favor of a final covenant to be administered through women. If Christ was the new Adam come to overturn the Fall, then his bride was Guglielma, the new Eve, who rectified the inequities of the old order and whose saving grace came for humanity, but particularly for women.

Now a gathering of nuns convened at her unassuming shrine, in robes of ashen gray and scapulars of white. There Maifreda would perform a Mass, transubstantiating the wafer and wine into the body and blood of Christ. The Guglielmites would elect Maifreda the first Pope of their Church. Five hundred years after the supposed election of the apocryphal Pope Joan, this obscure order of women praying to a Milanese aristocrat would confirm Maifreda as the Holy Mother of their faith. That same year the Inquisition would execute 30 Guglielmites—including la Papessa. “The Spirit blows where it will and you hear the sound of it,” preached Maifreda, “but you know not whence it comes or whither it goes.”

Maifreda was to Guglielma as Paul was to Christ: apostle, theologian, defender, founder. She was the great explicator of the “true God and true human in the female sex…Our Lady is the Holy Spirit.” Strongly influenced by a heretical strain of Franciscans who were themselves shaped by the Calabrian mystic Joachim of Fiore, Maifreda held that covenantal history could be divided into three ages, the first era of Law and God the Father, the second of Grace and Christ the Son, and the third and coming age of Love and the Daughter known as the Holy Spirit. Barbara Newman writes in From Virile Woman to WomanChrist: Studies in Medieval Religion and Literature that after Guglielma’s “ascension the Holy Spirit would found a new Church, superseding the corrupt institution in Rome.” For Guglielma’s followers, drawn initially from the aristocrats of Milan but increasingly popular among more lowly women, these doctrines allowed for self-definition and resistance against both Church and society. Guglielma was a messiah and she arrived for women, and her prophet was Maifreda.

Writing of Guglielma, Newman says that “According to one witness… she had come in the form of a woman because if she had been male, she would have been killed like Christ, and the whole world would have perished.” In a manner she was killed, some 20 years after her death, when her bones were disinterred from Chiaravalle and placed on the pyre where Maifreda would be burnt alongside two of her followers. Pope Boniface VIII would not abide another claimant to the papal throne—especially from a woman. But even while Maifreda would be immolated, the woman to whom she gave the full measure of sacred devotion would endure, albeit at the margins. Within a century Guglielma would be repurposed into St. Guglielma, a pious woman who suffered under the false accusation of heresy, and who was noted as particularly helpful in interceding against migraines. But her subversive import wasn’t entirely dampened over the generations. When the Renaissance painter Bonifacio Bembo was commissioned to paint an altarpiece around 1445 in honor of the Council of Florence (an unsuccessful attempt at rapprochement between the Catholic and Orthodox Churches) he depicted God crowning Christ and the Holy Spirit. Christ appears as can be expected, but the Holy Spirit has Guglielma’s face.

Maifreda survived in her own hidden way as well, and also through the helpful intercession of Bembo. The altarpiece he crafted had been commissioned by members of the powerful Visconti family, leaders in Milan’s anti-papal Ghibelline party, and they also requested that the artist produce 15 decks of Tarot cards. The so-called Visconti-Sforza deck, the oldest surviving example of the form, doesn’t exist in any complete set, having been broken up and distributed to various museums. Several of these cards would enter the collection of the American banker J.P. Morgan, where they’d be stored in his Italianate mansion on Madison Avenue. A visitor can see cards from the Visconti-Sforza deck that include the fool in his jester’s cap and mottled pants, the skeletal visage of death, and most mysterious of all, la Papessa—the female Pope. Bembo depicts a regal woman, in ash-gray robes and white scapular, the papal tiara upon her head. In the 1960s, the scholar Gertrude Moakley observed that the female pope’s distinctive dress indicates her order: she was one of the Umiliati. Maifreda herself was a first cousin of the Visconti, the family preserving the memory of the female pope in Tarot. On Madison Avenue you can see a messiah who for centuries was shuffled between any number of other figures, never sure of when she might be dealt again. Messiahs are like that, often hidden—and frequently resurrected.

Bonus Links: “Ten Ways to Live Forever,” “Ten Ways to Change Your God,” “Ten Ways to Look at the Color Black”


A Fraternity of Dreamers

“There is no syllable one can speak that is not filled with tenderness and terror, that is not, in one of those languages, the mighty name of a god.” —Jorge Luis Borges, “The Library of Babel” (1941)

“Witness Mr. Henry Bemis, a charter member in the fraternity of dreamers. A bookish little man whose passion is the printed page…He’ll have a world all to himself…without anyone.” —Rod Serling, “Time Enough at Last,” The Twilight Zone (1959)

When entering a huge library—whether its rows of books are organized under a triumphant dome, or they’re encased within some sort of vaguely Scandinavian structure that’s all glass and light, or they simply line dusty back corridors—I must confess that I’m often overwhelmed with a massive surge of anxiety. One must be clear about the nature of this fear—it’s not from some innate dislike of libraries, the opposite actually. The nature of my trepidation is very exact, though as far as I know there’s no English word for it (it seems like the sort of sentiment for which the Germans might have an untranslatable phrase). This fear concerns the manner in which the enormity of a library’s collection forces me to confront the sheer magnitude of all that I don’t know, all that I will never know, all that I can never know. When walking into the red-brick modernist hangar of the British Library, which houses all of those brittle books within a futuristic glass cube that looks like a robot’s heart, or the neo-classical Library of Congress with its green patina roof, or Pittsburgh’s large granite Carnegie Library main branch, smoked dark with decades of mill exhaust and kept under guard by a bronze statue of William Shakespeare, my existential angst is the same. If I start to roughly estimate the number of books per row, the number of rows per room, the number of rooms per floor, that readerly angst can become severe. This symptom can even be present in smaller libraries; I have felt it alike in the small-town library of Washington, Penn., on Lincoln Avenue and in the single room of the Southeast Library of Washington D.C. on Pennsylvania Avenue. Intrinsic to my fear are those intimations of mortality whereby even a comparatively small collection makes me confront the fact that in a limited and hopefully not-too-short life I will never be able to read even a substantial fraction of that which has been written. All those novels, poems, and plays; all those sentiments, thoughts, emotions, dreams, wishes, aspirations, desires, and connections—completely inaccessible because of the sheer fact of finitude.

Another clarification is in order—my fear isn’t the same as worrying that I’ll be found out for having never read any number of classical or canonical books (or those of the pop, paperback variety either). There’s a scene in David Lodge’s classic and delicious campus satire Changing Places: A Tale of Two Campuses in which a group of academics play a particularly cruel game, as academics are apt to do, one that asks participants to name a venerable book they’re expected to have read but have never opened. Higher point-values are awarded the more canonical a text is; what the neophytes don’t understand is that the trick is to mention something standard enough that they can still get the points for having not read it (like Laurence Sterne’s Tristram Shandy) but not so standard that they’ll look like an idiot for having never read it. One character—a recently hired English professor—is foolish enough to admit that he skipped Hamlet in high school. The other academics are stunned into silence. His character is later denied tenure. So, at the risk of making the same error, I’ll lay it out and admit to any number of books that the rest of you have probably read, but that I only have a glancing Wikipedia familiarity with: Marcel Proust’s Remembrance of Things Past, James Joyce’s Finnegans Wake, Don DeLillo’s White Noise, David Foster Wallace’s Infinite Jest. I’ve never read Harper Lee’s To Kill a Mockingbird, which is ridiculous and embarrassing, and I feel bad about it. I’ve also never read Jonathan Franzen’s The Corrections, though I don’t feel bad about that (however I’m wary that I’ve not read the vast bulk of J.K. Rowling’s Harry Potter books). Some of those previously mentioned books I want to read, others I don’t; concerning the latter category, some of those titles make me feel bad about my resistance to them, others I haven’t thought twice about (I’ll let you guess individual titles’ statuses).

I offer this contrition only as a means of demonstrating that my aforementioned fear goes beyond simple imposter syndrome. There are any number of reasons why we wish we’d read certain things, and why we feel an attendant moroseness for not having done so—the social stigma of admitting such things, a feeling of not being educated enough or worldly enough, the simple fact that there might be stuff that we’d like to read, but inclination, will power, or simply time has gotten in the way. The anxiety that libraries can sometimes give me is of a wholly more cosmic nature, for something ineffable affects my sense of self when I realize that the majority of human interaction, expression, and creativity shall forever be unavailable to me. Not only is it impossible for me to read the entirety of literature, it’s impossible to approach even a fraction of it—a fraction of a fraction of it. Some several blocks from where I now write is the Library of Congress, the largest collection in the world, which according to its website contains 38 million books (that’s excluding other printed material from posters to pamphlets). If somebody read a book a day—leaving aside, of course, the length of each book—it would take about 104,109 years and change to read everything within that venerable institution (ignoring the fact that about half-a-million to a million new titles are published every year in English alone, and that I was also too unconcerned to factor in leap years).

If you’re budgeting your time, may I suggest the British Library, which, though it has a much larger collection of other textual ephemera, has a more manageable 13,950,000 books, which would take you a breezy 38,219 years or so to get through. If you’re of a totalizing personality, then according to Google engineers in a 2010 study estimating the number of books ever written, you’ll have to wade through 129 million volumes of varying quality. That would take you 353,425 years to read. Of course this ignores all of that which has been written but not bound within a book—all of the jottings, the graffiti, the listings, the diaries, the text messages, the letters, and the aborted novels for which the authors have wisely or unwisely hit “Delete.” Were some hearty and voracious reader to consume one percent of one percent of all that’s ever been written—and then only one percent of that—they’d be the single most well-read individual to ever live. When we reach the sheer scale of how much human beings have expressed, have written, we enter the realm of metaphors that call for comparisons to grains of sand on the beach or stars in our galaxy. We depart the realm of literary criticism and enter that of cosmology. No wonder we require curated reading lists.
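For readers inclined to check the back-of-the-envelope arithmetic above, here is a minimal sketch in Python, assuming the same book-a-day pace, a 365-day year, and the collection sizes cited in this essay; the names and figures below are simply the numbers quoted above, not bibliographic data of any kind.

```python
# Back-of-the-envelope reading times at one book per day,
# using the collection sizes cited in the essay above.
BOOKS_PER_YEAR = 365  # ignoring leap years, as the essay does

collections = {
    "Library of Congress": 38_000_000,
    "British Library": 13_950_000,
    "Every book ever written (Google's 2010 estimate)": 129_000_000,
}

for name, books in collections.items():
    years = books / BOOKS_PER_YEAR
    print(f"{name}: roughly {years:,.0f} years of daily reading")

# Prints roughly 104,110, 38,219, and 353,425 years, in line with the figures above.
```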

For myself, there’s an unhealthy compulsion towards completism in the attendant tsuris over all that I’ll never be able to read. Perhaps there is something stereotypically masculine in the desire to conquer all of those unread worlds, something toxic in that need. After all, in those moments of readerly ennui there’s little desire for the experience, little need for quality, only that desire to cross titles off of some imagined list. Assume it were even possible to read all that has been thought and said, whether sweetness and light or bile and heft, and consider what purpose that accomplishment would even have. Vaguely nihilistic the endeavor would be, reminding me of that old apocryphal story about the conqueror, recounted by everyone from the 16th-century theologian John Calvin to Hans Gruber in Die Hard, that “Alexander the Great…wept, as well indeed he might, because there were no more worlds to conquer,” as the version of that anecdote is written in Washington Irving’s 1835 collection Salmagundi: Or, The Whim-whams and Opinions of Launcelot Langstaff, Esq. and Others. Poor Alexander of Macedon, son of Philip, educated by Aristotle, and witness to bejeweled Indian war-elephants bathing themselves on the banks of the Indus and the lapis lazuli encrusted Hanging Gardens of Babylon, the gleaming white pyramid of Cheops and the massive gates of Persepolis. Alexander’s map of the world was dyed red to mark his complete possession—he’d conquered everything that there was to be conquered. And so, following the death of his lover Hephaestion, he holed up in Nebuchadnezzar’s Babylonian palace, and he binged for days. Then he died. An irony though, for Alexander hadn’t conquered, or even been to, all the corners of the world. He’d never sat on black sand beaches in Hokkaido with the Ainu, he’d never drunk ox-blood with the Masai or hunted the giant moa with the Maori, nor had he been on a walkabout in the Dreamtime with the Anangu Pitjantjatjara or stood atop Ohio’s Great Serpent Mound or seen the grimacing stone heads of the Olmec. What myopia, what arrogance, what hubris—not to conquer the world, but to think that you had. Humility is warranted whether you’re before the World or the Library.

Alexander’s name is forever associated not just with martial ambitions, but with voluminous reading lists and never-ending syllabi as well, due to the library in the Egyptian city to which he gave his name, what historian Roy MacLeod describes in The Library of Alexandria: Centre of Learning in the Ancient World as “unprecedented in kingly purpose, certainly unique in scope and scale…destined to be far more ambitious [an] undertaking than a mere repository of scrolls.” The celebrated Library of Alexandria, the contents of which are famously lost to history, supposedly confiscated every book from each ship that came past the lighthouse of the city, had its scribes make a copy of the original, and then returned the counterfeit to the owners. This bit of bibliophilic chicanery was instrumental to the mission of the institution—the Library of Alexandria wasn’t just a repository of legal and religious documents, nor even a collection of foundational national literary works, but supposedly an assembly that in its totality would match all of the knowledge in the world, whether from Greece and Egypt, Persia and India. Matthew Battles writes in Library: An Unquiet History that Alexandria was “the first library with universal aspirations; with its community of scholars, it became a prototype of the university of the modern era.” Alexander’s library yearned for completism as much as its namesake had yearned to control all parts of the world; the academy signified a new, quixotic emotion—the desire to read, know, and understand everything. By virtue of the world being so much smaller at the time (at least as far as any of the librarians working there knew), such an aspiration was even theoretically possible.

“The library of Alexandria was comprehensive, embracing books of all sort from everywhere, and it was public, open to anyone with fitting scholarly or literary qualifications,” writes Lionel Casson in Libraries in the Ancient World. The structure overseen by the Ptolemaic Dynasty, descended from one of Alexander’s generals, was much more of a wonder of the ancient world than the Lighthouse in the city’s harbor. Within its walls, whose appearance is unclear to us, Aristophanes of Byzantium was the first critic to divide poetry into lines, 70 Jews convened by Ptolemy II translated the Hebrew Torah into the Greek Septuagint, and the geographer Eratosthenes correctly calculated the circumference of the Earth. Part of the allure of Alexandria, especially to any bibliophile in this fraternity of dreamers, is the fact that the vast bulk of what was kept there is entirely lost to history. Her card catalogue may have included lost classical works like Aristotle’s second book of the Poetics on comedy (a plot point in Umberto Eco’s medieval noir The Name of the Rose), Protagoras’s essay “On the Gods,” the prophetic books of the Sibyllines, Cato the Elder’s seven-book history of Rome, the tragedies of the rhetorician Cicero, and even the comic mock-epic Margites supposedly written by Homer.

More than the specter of all that has been lost, Alexandria has become synonymous with the folly of anti-intellectualism, as its destruction (variously, and often erroneously, attributed to Romans, Christians, and Muslims) is a handy and dramatic narrative to illustrate the eclipse of antiquity. Let’s keep some perspective though—let’s crunch some numbers again. According to Robin Lane Fox in The Classical World: An Epic History from Homer to Hadrian, the “biggest library…was said to have grown to nearly 500,000 volumes.” Certainly not a collection to scoff at, but Alexander’s library, which drew from the furthest occident to the farthest orient, held only a sixth of all the books in the Library of Congress’s Asian Collection; Harvard University’s Widener Library has 15 times as many books (and that’s not including the entire system); the National Library of Iran, housed not far from where Alexander himself died, holds 30 times the volumes of that ancient collection. The number of books held by the Library of Alexandria would have been perfectly respectable in the collection of a small midwestern liberal arts college. By contrast, according to physicist Barak Shoshany, answering a question on Quora, if the 5 zettabytes of the Internet were to be printed, then the resultant stack of books would have to fit on a shelf “4×10^11 km or about 0.04 light years thick,” the last volume floating somewhere near the Oort Cloud. Substantially larger shelves would be needed, it goes without saying, than whatever was kept in the storerooms of Alexandria with that cool Mediterranean breeze curling the edges of those papyri.

To read all of those scrolls, codices, and papyri at Alexandria would take our intrepid ideal reader a measly 1,370 years. More conservative historians estimate that the Library of Alexandria may have housed only 40,000 books—if that is the case, then it would take you a little more than a century to read (if you’re still breezing through a book a day). That’s theoretically the lifetime of someone gifted with just a bit of longevity. All of this numeric stuff misses the point, though. It’s all just baseball card collecting, because what the Library of Alexandria represented—accurately or not—was the dream that it might actually be possible to know everything worth knowing. But since the emergence of modernity some half-millennium ago, and the subsequent fracturing of disciplines into ever more finely tuned fields of study, it’s been demonstrated just how much of a fantasy that goal is. There’s a certain disposition that’s the intellectual equivalent of Alexander, and our culture has long celebrated that personality type—the Renaissance Man (and it always seems gendered thus). Just as there was always another land to be conquered over the next mountain range, pushing through the Kush and the Himalayan foothills, so too does the Renaissance Man have some new type of knowledge to master, say geophysics or the conjugation of Akkadian verbs. Nobody except for Internet cranks or precocious and delusional autodidacts actually believes in complete mastery of all fields of knowledge anymore; by contrast, for all that’s negative about graduate education, one clear and unironic benefit is that it taught me the immensity and totality of all of the things that I don’t know.
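The same rough arithmetic extends to the ancient figures, again only as a sketch built from the numbers quoted above: the two estimates for Alexandria’s holdings, and the 4×10^11 km shelf from the Quora answer converted into light-years as a sanity check.

```python
# Rough checks on the figures cited above: Alexandria at a book a day,
# and the quoted 4e11 km shelf of printed internet converted to light-years.
BOOKS_PER_YEAR = 365
KM_PER_LIGHT_YEAR = 9.461e12  # kilometers in one light-year

for scrolls in (500_000, 40_000):  # high and low estimates for Alexandria's holdings
    print(f"{scrolls:,} scrolls: about {scrolls / BOOKS_PER_YEAR:,.0f} years of daily reading")

shelf_km = 4e11  # shelf length as quoted in the essay
print(f"Shelf of the printed internet: about {shelf_km / KM_PER_LIGHT_YEAR:.2f} light-years")
# Prints about 1,370 years, 110 years, and 0.04 light-years, in line with the text.
```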

Alexandria’s destruction speaks to an unmistakable romance about that which we’ll never be able to read, but it also practically says something about a subset of universal completism—our ability to read everything that has survived from a given historical period. By definition it’s impossible to actually read all of classical literature, since the bulk of it is no longer available, but to read all of the Greek and Roman writing which survives—that is doable. It’s been estimated that less than one percent of classical literature has survived to the modern day, with Western cultural history sometimes reading as a story of luck and monks equally preserving that inheritance. It would certainly be possible for any literate individual to read all of Aristophanes’s plays, all of Plato’s dialogues, all of Juvenal’s satires. Harvard University Press’s venerable Loeb Classical Library, preserving Greek and Latin literature in their distinctive minimalist green and red covered volumes, currently has 530 titles available for purchase. Though it doesn’t encompass all that survives from the classical period, it comes close. An intrepid and dogged reader would be able to get through them, realistically, in a few years (comprehension is another matter).

If you need to budget your time, all of the Anglo-Saxon writing that survives—that which didn’t get sewn into the back-binding of some inherited English psalm book or find itself used as kindling in the 16th century when Henry VIII dissolved the monasteries—is contained across some four major poetic manuscripts, though 100 more general manuscripts endure. The time period that I’m a specialist in, which now goes by the inelegant name of the “early modern period” but which everybody else calls the “Renaissance,” is arguably the first period for which no scholar would be capable of reading every primary source that endures. Thanks to its relative proximity to our own time, and the preponderance of titles gestated by the printing press, it would be impossible for anyone to read everything produced in those centuries. For every William Shakespeare play, there are hundreds of yellowing political pamphlets about groups with names like “Muggletonians;” for every John Milton poem, a multitude of religious sermons on subjects like double predestination. You have to be judicious in what you choose to read, since one day you’ll be dead. This reality should be instrumental in any culture wars détente—canons exist as a function of pragmatism.

The canon thus functions as a kind of short-cut to completism (if you want to read through all of the Penguin Classics editions, with their iconic black covers and their little avian symbol, that’s a meager 1,800 titles to get through). Alexandria’s delusion about gathering all of that which has been written, and perhaps educating oneself from that corpus, drips down through the history of Western civilization. We’ve had no shortage of Renaissance Men who, even if they hadn’t read every book ever written, perhaps at least roughly knew where all of them could be found in the card catalogue. Aristotle was a polymath who not only knew the location of those works but credibly wrote many of them himself (albeit all that remains are student lecture notes), arguably founding fields as diverse as literary criticism and dentistry. In the Renaissance, when one would assume the attendant Renaissance Man would be most celebrated, there was a preponderance of figures for whom it was claimed that they had mastered all disciplines that could be mastered (and were familiar with the attendant literature review). Leonardo da Vinci, Blaise Pascal, Athanasius Kircher, Isaac Newton, and Gottfried Wilhelm Leibniz have all been configured as Renaissance Men, their writings respectively encompassing not just art, mathematics, theology, physics, and philosophy, but also aeronautics, gambling, sinology, occultism, and diplomacy.

Stateside, both Benjamin Franklin and Thomas Jefferson (a printer and a book collector) are classified as such, and more recently figures as varied as Nikola Tesla and Noam Chomsky are sometimes understood as transdisciplinary polymaths (albeit ones for whom it would be impossible to have read all that can be read, even if it appears as such). It’s hard to disentangle the canonization of such figures from the social impulse to be “well read,” but in the more intangible and metaphysical sense, beyond wanting to seem smart because you want to seem smart, the icon of the Renaissance Man can’t help but appeal to that completism, that desire for immortality that is prodded by the anxiety that libraries inculcate in me. My patron saint of polymaths is the 17th-century German Jesuit Kircher, beloved by fabulists from Eco to Jorge Luis Borges for his writings that encompassed everything from mathematics to hieroglyphic translation. Paula Findlen writes in the introduction to her anthology Athanasius Kircher: The Last Man Who Knew Everything that he was the “greatest polymath of an encyclopedic age,” yet when his rival Renaissance Man Leibniz first dipped into the voluminous mass of Kirchermania he remarked that the priest “understands nothing.”

Really, though, that’s true of all people. I’ve got a PhD and I can’t tell you how lightbulbs work (unless they fit into Puritan theology somehow). Kircher’s translations of Egyptian were almost completely and utterly incorrect. As was much of what else he wrote on, from mineralogy to Chinese history. He may have had an all-encompassing understanding of all human knowledge during that time period, but Kircher wasn’t right very often. That’s all right; the same criticism could be leveled at his interlocutor Leibniz. Same as it ever was, and applicable to all of us. We’re not so innocent anymore; the death of the Renaissance Man is like the death of God (or of a god). The sheer amount of that which is written, the sheer number of disciplines that exist to explain every facet of existence, should disabuse us of the idea that there’s any way to be well-educated beyond the most perfunctory meaning of that phrase. In that gulf between our desire to know and the poverty of our actual understanding are any number of mythic figures who try to close that gap, troubled figures from Icarus to Dr. Faustus who demonstrate the hubris of wishing to read every book, to understand every idea. A term should exist for the anxiety that those examples embody, the quaking fear before the enormity of all that we don’t know. Perhaps the readerly dilemma, or even textual anxiety.

A full accounting of the nature of this emotion compels me to admit that it goes beyond simply fearing that I’ll never be able to read all of the books in existence, or in some ideal library, or, if I’m being honest, even in my own library. The will towards completism alone is not the only attribute of textual anxiety, for a not dissimilar queasiness can accompany related (though less grandiose) activities than the desire to read all books that were ever written. To wit—sometimes I’ll look at the ever-expanding pile of books that I’m to read, including volumes that I must read (for reviews or articles) and those that I want to read, and I’m paralyzed by that ever-growing paper cairn. Such debilitation isn’t helpful; the procrastinator’s curse is that it’s a personality defect that’s the equivalent of emotional quicksand. Against this foolish inclination towards completism—desiring everything and thus acquiring nothing—I sometimes use Francis Bacon’s advice from his essays of 1625, that “Some books are to be tasted, others to be swallowed, and some few to be chewed and digested,” as a type of mantra against textual anxiety, and it mostly works. Perhaps I should learn to cross-stitch it as a prayer to display amongst my books, which, even if I haven’t read all of them, have at least been opened (mostly).

But textual anxiety manifests itself in a far weirder way, one that I think gets to the core of what makes the emotion so disquieting. When I’m reading some book that I happen to be enjoying, some random novel picked up from the library or purchased at an airport to pass the time, but not one of the works that I perennially turn toward—Walt Whitman and John Milton, John Donne and Emily Dickinson—I’m sometimes struck with a profound melancholy born from the fact that I shall never read these sentences again. Like meeting a friendly stranger who somehow indelibly marks your life by the tangible reality of their being, but who will return to anonymity. Then it occurs to me that even those things I do read again and again, Leaves of Grass and Paradise Lost, I will one day also read for the last time. What such textual anxiety trades in, like all things of humanity, is my fear of finality, of extinction, of death. That’s at the center of this, isn’t it? The simultaneous fear of there being no more worlds to conquer and the fear that the world never can be conquered. Such consummation, the obsession with having it all, evidences a rather immature disposition.

It’s that Alexandrian imperative, but if there is somebody wiser and better to emulate, it’s the old cynic Diogenes of Sinope, the philosophical vagabond who spent his days living in an Athenian pot. Diogenes Laertius reports in Lives of the Eminent Philosophers that “When [Diogenes] was sunning himself…Alexander came and stood over him and said: ‘Ask me for anything you want.’ To which he replied, ‘Stand out of my light.’” And so, the man with everything was devoid of things to give to the man with nothing. There’s something instructive in that when it comes to this fear that there are things you’ll never be able to read, things you’ll never be able to know. The point, Diogenes seems to be saying, is to enjoy the goddamn light. Everything in that. I recall once admitting my fear about all that I don’t know, all of those books that I’ll never open, to a far wiser former professor of mine. This was long before I understood that an education is knowing what you don’t know, understanding that there are things you will never know, and worshiping at the altar of your own sublime ignorance. When I explained this anxiety of all of these rows and rows of books to never be opened, she was confused. She said, “Don’t you see? That just means that you’ll never run out of things to read.” Real joy, it would seem, comes in the agency to choose, for if you were able to somehow give your attention equally to everything, then you’d suffer the omniscient imprisonment that only God is cursed with. The rest of us are blessed with the endless, regenerative, confusing, glorious, hidden multiplicity of experience in discrete portions. Laertius writes, “Alexander is reported to have said, ‘Had I not been Alexander, I should have liked to be Diogenes.’”


Stories in Formaldehyde: The Strange Pleasures of Taxonomizing Plot

Somewhere within the storerooms of London’s staid, gray-faced Tate Gallery (for it’s currently no longer on exhibit) is an 1834 painting by J.M.W. Turner entitled “The Golden Bough.” Rendered in that painter’s characteristic sfumato of smeared light and smoky color, Turner’s composition depicts a scene from Virgil’s epic Aeneid wherein the hero is commanded by that seven-centuries-old prophetic crone, the Sibyl of Cumae, to make an offering of a golden bough from a sacred tree growing upon the shores of crystalline blue Lake Avernus to the goddess Proserpina, if he wishes to descend to Hades and see the shadow of his departed father. “Obscure they went through dreary shades, that led/Along the waste dominions of the dead,” translated John Dryden in 1697, using his favored totemistic Augustan rhyming couplets, as Aeneas descends further into the Underworld, its entrance a few miles west of Naples. As imagined by Turner, the area around the volcanic lake is pleasant, if sinister; bucolic, if eerie; pastoral, if unsettling. A dapple of light marks the portal whereby pilgrims journey into perdition; in the distance tall, slender trees topped with a cap of branches jut up throughout the landscape. A columned temple is nestled within the scrubby hills overlooking the field. The Sibyl stands with a scythe so that the vegetable sacrifice can be harvested, postlapsarian snakes slither throughout, and the Fates revel in mummery near hell’s doorway. Rather than severe tones of blood red and sulfurous black, earthy red and cadaverous green, Turner opted to depict Avernus in soft blues and grays, and the result is all the more disquieting. Here, the viewer might think, is what the passage between life and death must look like—muted, temperate, serene, the transition from one to the next barely even noticeable.

As with the best of Turner’s paintings, with his eye for color the visual equivalent of perfect pitch, it is the texture of hues that renders, if not some didactic message about his subject, a general emotional sense, a sentiment hard to describe, registering at a pitch that can barely be heard and yet altering one’s feelings in the moment. Such was the sense conveyed by the Scottish folklorist James George Frazer, who borrowed the artist’s title for his landmark 1890 study The Golden Bough: A Study in Comparative Religion, describing on his first page how the painting is “suffused with the golden glow of imagination in which the divine mind of Turner steeped and transfigured even the fairest natural landscape.” This scene, Frazer enthuses, “is a dream-like vision of the little woodland…[where] Dian herself might still linger by this lonely shore, still haunt these woodlands wild.” An influential remnant of a supremely Victorian enthusiasm for providing quasi-scientific gloss to the categorization of mythology, Frazer’s study provided a taxonomy of classical myth so as to find certain similarities, the better to construct a grand, unified theory of ancient religion (or what Edward Casaubon in George Eliot’s Middlemarch, written two decades before, might call The Key to All Mythologies). First viewing Turner’s canvas, the rationalist Frazer was moved by the painting’s mysteriousness, the way in which the pool-blue sky and the shining hellmouth trade in nothing as literal as mere symbolism, but wherein the textured physicality—the roughness of the hill and the ominous haze of the clouds, dusk’s implied screaming cicadas and the cool of the evening—conveys an ineffable feeling. Despite pretensions to an analysis more logical, Frazer intimates the numinous (for, how couldn’t he?). “Who does not know Turner’s picture of the Golden Bough?” he writes.

His argument in The Golden Bough was that religions originated as primitive fertility cults, dedicated to the idea of sacrifice and resurrection, and that from this fundamentally magical worldview would evolve more sophisticated religions, to finally be supplanted by secular science. The other argument of The Golden Bough is implicit in the book’s very existence—that structure can be ascertained within the messy morass of disparate myths. To make this argument he drew from sources as diverse as Virgil and the Nootka people of British Columbia, classifying, categorizing, and organizing data as surely as a biologist preserving specimens in a jar of formaldehyde. And like Charles Darwin measuring finch beaks, or Thomas Huxley pinning butterflies to wood blocks, Frazer believed that diversity was a mask for similarity.

As reductionist as his arguments are, and as disputed as his conclusions may be, Frazer’s influence was outsize among anthropologists, folklorists, writers, and especially literary critics, who thrilled to the idea that some sort of unity could be found in the chaotic variety of narratives that constitute world mythology. “I am a plain practical man,” Frazer writes, “not one of your theorists and splitters of hairs and choppers of logic,” and while it’s true that The Golden Bough evidences a more imaginative disposition, it still takes part in that old quixotic desire to find some Grand Unified Theory of Narrative. While Frazer’s beat was myth, he was still a reporter of stories, and percolating like a counter-rhythm within discussions of narrative is that old desire, the yearning to find the exact number of plots that it is possible to tell. Frazer, for all that was innovative about his thought, was neither the first nor the last to treat stories like animals in a genus, narratives as if creatures in a phylum.

That grand tradition claims there are only 36 stories that can be told, or seven, or four. Maybe there is really only one tale, the story of wanting something and not getting it, which is after all the contour of this story itself—the strange endurance of the sentiment that all narrative can be easily classified into a circumscribed, finite, and relatively small number of possibilities. While I’ve got my skepticism about such an endeavor—seeing those suggested systems as erasing the particularity of stories, as occluding what makes them unique in favor of mutilating them into some Procrustean bed—I’d be remiss not to confess that I also find these theories immensely pleasing. There is something to be said for the cool rectilinear logic that claims any story, from Middlemarch to Fifty Shades of Grey, Citizen Kane to Gremlins 2, can be stripped down to its raw schematics and analyzed as one of a set of fundamental, universal, eternal plots that existed before Gilgamesh’s cuneiform was wedged into wet clay.

Christopher Booker claims in The Seven Basic Plots: Why We Tell Stories that “wherever men and women have told stories, all over the world, the stories emerging to their imaginations have tended to take shape in remarkably similar ways,” differences in culture, language, or faith be damned. With some shading, Booker uses the archetypal psychoanalysis of Carl Jung to claim that every single narrative, whether in epic or novel, film or comic, can be slotted into 1) overcoming the monster (Beowulf, George Lucas’s Star Wars), 2) rags to riches (Charlotte Bronte’s Jane Eyre, Horatio Alger stories), 3) the quest (Homer’s The Odyssey, Steven Spielberg’s Raiders of the Lost Ark), 4) voyage and return (The Ramayana, J.R.R. Tolkien’s The Hobbit), 5) comedy (William Shakespeare’s Twelfth Night, the Coen Brothers’ The Big Lebowski), 6) tragedy (Leo Tolstoy’s Anna Karenina, Arthur Penn’s Bonnie and Clyde) or 7) rebirth (Charles Dickens’s A Christmas Carol, Harold Ramis’s Groundhog Day).

That all of these parenthetically referenced works are, of course, astoundingly different from each other in character, setting, and most of all language is irrelevant to Booker’s theory. While allowing for more subtlety than my potted overview suggests, Booker still concludes that “there are indeed a small number of plots which are so fundamental to the way we tell stories that it is virtually impossible for any storyteller ever entirely to break away from them.” Such a claim is necessary to Booker’s contention that these narratives are deeply nestled in our collective unconscious, a repository of themes, symbols, and archetypes that are “our basic genetic inheritance,” which he then proffers as an explanation for why humans tell stories at all.

The Seven Basic Plots, published in 2004 after 34 years of labor, is the sort of critical work that doesn’t appear much anymore. Audacious to the point of impudence, ambitious to the level of crack-pottery, Booker’s theory seems more at home in a seminar held by Frazer than in contemporary English departments more apt to discuss gender, race, and class in Jane Austen’s Pride and Prejudice than they are the Orphic themes of rebirth as manifested in that same novel. Being the sort of writer who both denied anthropogenic climate change and defended asbestos (for real), Booker had the conservative’s permanent sense of paranoid aggrievement concerning the treatment of his perspectives. So, let me be clear—contra Booker’s own sentiments, I don’t think that the theories in The Seven Basic Plots are ignored by literary critics because of some sort of politically correct conspiracy of silence; I think that they’re ignored because they’re not actually terribly correct or useful. When figuring out the genealogical lineage of several different species of Galapagos Island finches, similarity becomes a coherent arbiter; however, difference is more important when thinking through what makes exemplary literature exemplary. Genre, and by proxy plot, is frequently more an issue of marketing than anything else. That’s not to say that questions of genre have no place in literary criticism, but they are normally the least interesting (“What makes this gothic novel gothic?”). No stranger to such thinking himself, author Kurt Vonnegut may have solved the enigma by elucidating the most basic of monomyths—“man falls into hole, man gets out of hole.”

Booker isn’t after marketing, however; he’s after the key to all mythologies. Like Frazer before him, he won’t be the last critic enraptured by the idea of a Periodic Table of Plots, one capable of explaining Fyodor Dostoevsky’s Crime and Punishment as well as Weekend at Bernie’s. If you wish to blame somebody for this line of thinking, as with most disciplines of human endeavor from ethics to dentistry, look to Aristotle as the culprit. The philosopher’s “four conflicts,” man against himself, man against man, man against nature, and man against the gods, have long been a convenient means of categorizing plots. The allure of there being a limited number of plots is that it makes both reading and writing theoretically easier. The denizens of high culture literary criticism have embraced the concept periodically, as surely as those producing paperbacks promising that a hit book can be easily plotted out from a limited tool kit. Georges Polti, of Providence, Rhode Island, and later Paris, France, wrote The Thirty-Six Dramatic Situations in 1895, claiming that all stories could be categorized in that number of scenarios, including plots of “Crime pursued by vengeance” and “Murderous adultery.” “Thirty-six situations only!” Polti enthuses. “There is to me, something tantalizing about the assertion.” Polti’s book has long been popular as a sort of lo-fi randomizer for generating stories, and its legacy lives on in works like Ronald B. Tobias’s 20 Master Plots and How to Build Them and Victoria Lynn Schmidt’s A Writer’s Guide to Characterization: Archetypes, Heroic Journeys, and Other Elements of Dynamic Character Development.

There is also a less pulpy, tonier history surrounding the thinking that everything can be brewed down to a handful of elemental plots. My attitude was a bit glib earlier, as there is something to be said for the utility of such thinking, and indeed entire academic disciplines have grown from that assumption. Folklorists use a classification system called the “Aarne-Thompson-Uther Index,” where a multitude of plot-types are given numbers (“Cinderella” is 510A, for example), which can be useful to trace the ways in which stories have evolved and altered over both distance and time. Unlike Polti’s 36 plots, Tobias’s 20, or Booker’s seven, Stith Thompson’s Motif-Index of Folk-Literature goes to six volumes of folk tales, fairy tales, legends, and myths, but the basic idea is the same: plots exist in a finite number (including “Transformation: man to animal” and “Magic strength resides in hair”). As with the system of classification invented by Francis James Child in The English and Scottish Popular Ballads, or the Roud Folk Song Index, the Aarne-Thompson-Uther Index is more than just a bit of shell collecting; it’s a system of categorization that helps folklorists make sense of the diversity of oral literature, with scholar Alan Dundes enthusing that the system was among the “most valuable tools in the professional folklorist’s arsenal of aids for analysis.” Morphological approaches define the discipline known as “narrative theory,” which draws from a similar theoretical inclination as that of the ATU Index. All of these methodologies share a commitment to understanding literature less through issues of grammar, syntax, and diction, and more in terms of plot and story. For those who read with an eye towards narrative, there is frequently an inclination, sentiment, or hunch that all stories and novels, films and television shows, epics and lyrics, comics and plays, can have their fat, gristle, and tallow boiled away to leave just the broth and a plot that’s as clean as a bone.

Such a faith was popular among the Russian Formalists, sometimes incongruously known as the Prague School (after where many of them, as Soviet exiles, happened to settle), including Roman Jakobson, Viktor Shklovsky, and Vladimir Propp, the last of whom wrote Morphology of the Folktale, reducing those stories to a narrative abstraction that literally looks like mathematics. A similar movement was that of French structuralism, as exemplified by its founder the linguist Ferdinand de Saussure, and as later practiced by the anthropologist Claude Levi-Strauss and the literary critic Roland Barthes. In the Anglophone world, with the exception of some departments that are enraptured to narratology, literary criticism has often focused on the evisceration of a text with the scalpel of close reading rather than the measurement of plot with the calipers of taxonomy. Arguably that’s led to the American critical predilection towards “literary” fiction over genre fiction, the rejection of science fiction, fantasy, horror, and romance as being unserious in favor of all of those beautifully crafted stories in The New Yorker where the climax is the main character looking out the window, sighing, and taking a sip of coffee, while realizing that she was never happy, not really.

There are exceptions to the critical valorization of language over plot, however, none more so than in the once mighty but now passé writings of Canadian theorist Northrop Frye. Few scholars in the English-speaking world were more responsible for that once enthusiastic embrace of taxonomic criticism than this United Church of Canada minister and professor at Toronto’s Victoria College. Frye was enraptured to the psychoanalyst Carl Jung’s theories of how fundamental archetypes structure our collective unconscious, and he believed that a similar approach could be applied to narrative, that a limited number of plots structured our way of thinking and approaching stories. In works like Fearful Symmetry on William Blake, and his all-encompassing Anatomy of Criticism, Frye elucidated a complex, baroque, and elegant system of categorizing stories, the better to interpret them properly. “What if criticism is a science as well as an art?” Frye asked, wishing to approach literature like a taxonomist, as if novels were a multitude of plants and animals just awaiting Linnaean classification. For those who read individual poems or novels as exemplary texts, explaining what makes them work, Frye would say that they’re missing the totality of what literature is. “Criticism seems to be badly in need of a coordinating principle,” he writes, “a central hypothesis which, like the theory of evolution in biology, will see the phenomena it deals with as part of a whole.”

Frye argued that this was to be accomplished by identifying that which was universal in narrative, where works could be rendered of their unique flesh down to their skeletons, which we would then find to be myths and archetypes. From this anodyne observation, Frye spun out a complex classification system for all Western literature, one where he identifies the exact archetypes that define poetry and prose, where he flings about terms like “centripetal” and “centrifugal” to interpret individual texts, and where phrases like the “kerygmatic mode” are casually used. Anatomy of Criticism is true to its title; Frye carves up the cadaver of literature and arrives at an admittedly intoxicating theory of everything. “Physics is an organized body of knowledge about nature, and a student of it says that he is learning physics, not nature,” Frye writes. “Art, like nature, has to be distinguished from the systematic study of it, which is criticism.” In Frye’s physics, there are five “modes” of literature, including the mythic, romantic, high mimetic, low mimetic, and ironic; these are then cross-listed with tragic, comic, and thematic forms; what are then derived are genres with names like the dionysian, the elegiac, the aristophanic, and so on. Later in the book he supplies a complex theory of symbolism, a methodology concerning imagery based on the Platonic Great Chain of Being, and a thorough taxonomy of genre. In what’s always struck me as one of the odder (if ingenious) parts of Anatomy of Criticism, Frye ties genres specifically to certain seasons, so that comedy is a spring form, romance belongs to the summer, autumn is a time of tragedy, and winter births irony. How one reads books from those tropical places where the seasons divide neatly between rainy and dry speaks to a particular chauvinism on the Canadian’s part.

For most viewers of public television, however, their introduction to the “There-are-only-so-many-stories” conceit wasn’t Frye, but rather a Sarah Lawrence College professor who was the titular subject of journalist Bill Moyers’s 1988 PBS documentary Joseph Campbell and the Power of Myth. Drawing largely from his 1949 study The Hero with a Thousand Faces, Campbell became the unlikely star of the series that promulgated his theory of the “monomyth,” the idea that a single story threads through world mythology and is often focused on what he termed “the hero’s journey.” Viewers were drawn to Campbell’s airy insights about the relationship between Akkadian mythology and Star Wars (a film which George Lucas admitted was heavily influenced by the folklorist’s ideas), and his vaguely countercultural pronouncement that one should “Follow your bliss!,” despite his own right-wing politics (which according to some critics could run the gamut from polite Reaganism to fascist sympathizing). Both Frye and Campbell exhibited a wide learning, but arguably only the former’s was particularly deep. With an aura of crunchy tweediness, Campbell seemed like the sort of professor who would talk to students about the Rubaiyat of Omar Khayyam in an office that smelled of patchouli, a threadbare oriental rug on the dusty floor, knick-knacks assembled while studying in India and Japan, and a collapsing bookshelf jammed with underlined paperback copies of Friedrich Nietzsche and Arthur Schopenhauer above his desk. Campbell, in short, looked like what we expect a liberal arts teacher to look like, and for some of his critics (like Dundes, who called him a “non-expert” and an “amateur”) that gave him an unearned authority.

But what an authority he constructed, the hero with only one theory to explain everything! Drawing from Jung, Frazer, and all the rest of the usual suspects, Campbell argued in his most famous book that broad archetypes structure all narrative, wherein a “hero ventures forth from the world of common day into a region of supernatural wonder: fabulous forces are there encountered and a decisive victory is won: the hero comes back from this mysterious adventure with the power to bestow boons on his fellow man.” Whether Luke Skywalker venturing out from Tatooine or Gilgamesh leaving Uruk, the song remains the same, Campbell says. Gathering material from the ancient Near East and Bronze Age Ireland, the India of the Mahabharata and Hollywood screenplays, Campbell claimed that his monomyth was the skeleton key to all narrative, a story whose parsing could furthermore lead to understanding, wisdom, and self-fulfillment among those who are hip to its intricacies. The Hero with a Thousand Faces naturally flattered the pretensions of some artists and writers, what with its implications that they were conduits connected directly to the collective unconscious. Much as with Freud and the legions of literary critics who applied his theories to novels and film, if Campbell works well in interpreting lots of movies, it’s because those directors (from Lucas to Stanley Kubrick) happened to be reading him. The monomyth can begin to feel like the critical equivalent of the intelligent design advocate who knows God exists, because why else would we have been given noses on which to so conveniently hold our glasses?

Campbell’s politics, and indeed those of his theory, are ambivalent. His comparative approach superficially seems like the pluralistic, multicultural, ecumenical perspective of the Sarah Lawrence professor that he was, but at the same time the flattening of all stories into this one monomyth does profound violence to the particularity of myths innumerable. There is a direct line between Campbell and the mythos-laden mantras of poet Robert Bly and his Iron John: A Book About Men, the tome that launched a thousand drum circles of suburban dads trying to engage their naturalistic masculinity in vaguely homoerotic forest rituals, or of Canadian psychologist/alt-right apologist Jordan Peterson, who functions as basically a Dollar Store version of the earlier folklorist. Because myths are so seemingly elemental, mysterious telegrams from the ancient past, whose logic seems imprinted into our unconscious, it’s hard not to see the attraction of a Campbell. And yet whenever someone starts talking about “mythos” it can start to feel like you’re in the presence of a weirdo who practices “rune magik,” unironically wonders if they’re an ubermensch, and has an uncomfortably racist Google search history. We think of the myth as the purview of the hippie, but it’s just as often the province of the jackbooted authoritarian, for Campbell’s writings fit comfortably with a particularly reactionary view of life, which should fit uncomfortably with the rest of us. “Marx teaches us to blame society for our frailties, Freud teaches us to blame our parents,” Campbell wrote in the posthumously published Pathways to Bliss, but the “only place to look for blame is within: you didn’t have the guts to bring up your full moon and live the life that was your potential.” Yeah, that’s exactly it. People can’t afford healthcare or get a job because they didn’t bring up their full moon…

The problem is that if you take Campbell too seriously then everything begins to look like it was written by Campbell. To wit, the monomyth is supposed to go through successive stages, from the hero’s origin in an ordinary world where he receives a “call to adventure,” to being assisted by a mentor who leads him through a “guarded threshold” where he is tested on a “road of trials,” to finally facing his ultimate ordeal. After achieving success, the hero returns to the ordinary world wiser and better, improving the lives of others through the rewards that have been bestowed upon him. The itinerary is more complex than this in The Hero with a Thousand Faces, but this should be enough to convey that Campbell’s schema is general enough that it can be applied to anything, but particular enough that it gives the illusion of rigor. Think of Jesus Christ, called to be the messiah and assisted by John the Baptist, tempted by Satan in the desert, and after coming into Jerusalem facing torture at the hands of the Romans, before his crucifixion and harrowing of hell, only to be resurrected with the promise of universal human salvation. Now, think of Jeff Lebowski, called to be the Dude and assisted by Walter Sobchak, tempted by Jackie Treehorn, battling the nihilists, only to return in time for the bowling finals. Other than speaking deep into the souls of millions of people, it should be uncontroversial to say that the gospels and the Coen Brothers’ The Big Lebowski are only the same story in the most glaring of superficial ways, and yet the quasi-conspiratorial theory of the monomyth promises secret knowledge that says that they are.

But here’s the thing—stories aren’t hydrogen, plots aren’t oxygen, narratives aren’t carbon. You can’t reduce the infinity of human experience into a Periodic Table, except in the most perfunctory of ways. To pretend that the tools of classification are the same as the insights of interpretation is to grind the Himalayas into Iowa; it’s to cut so much from the bone that the only meal you’re left with is that of a skeleton. When all things are reduced to monomyth, the enthusiast can’t recognize the exemplary, the unique, the individual, the subjective, the idiosyncratic, because some individual plot doesn’t have a magical wizard shepherding the hero to the underworld, or whatever. It’s to deny the possibility of some new story, of some innovation in narrative; it’s to spurn the Holy Grail of uniqueness. Still, some sympathy must be offered as to why these models appeal to us, of how archetypal literary criticism appeals to our inner stamp collectors. With apologies to Voltaire, if narrative didn’t exist it would be necessary to invent it—and everything else too. The reasons why archetypal criticism is so appealing are legion—it imposes a unity on chaos, provides a useful measure of how narratives work, and gives the initiate the sense that they have knowledge that is applicable to everything from The Odyssey to Transformers.

But a type of critical madness lies in the idolatry of confusing methodological models for the particularity of actual stories. Booker writes of stories that are “Rags to Riches,” but that reductionism is an anemic replacement for inhabiting Pip’s mind when he pines for Estella in Charles Dickens’s Great Expectations; he classifies Bram Stoker’s Dracula as being about “Overcoming the Monster,” but that simplification is at the expense of that purple masterpiece’s paranoia, its horror, its hunger, its sexiness. There are no stories except in the details. To forget that narratives are infinite is a slur against them; it’s the blasphemy of pretending that every person is the same as every other. For in a warped way, there is but one monomyth, but it’s not what the stamp collectors say it is. In all of their variety, diversity, and multiplicity, every tale is a creation myth because every tale is created. From the raw material of life is generated something new, and in that regard we’re not all living variations of the same story, we’re all living within the same story.

Bonus Links: The Purpose of Plot: An Argument with Myself; The Million Basic Plots; On Not Going Out of the House: Thoughts About Plotlessness

Image Credit: Wikimedia Commons.

Letter from Wartime

“This is the flower of the partisan,/o bella ciao, bella ciao, bella ciao ciao ciao,/this is the flower of the partisan/who died for freedom.” —Italian Partisan Song, “Bella Ciao”
“Heard about Houston? Heard about Detroit?/Heard about Pittsburgh, PA?/You oughta know not to stand by the window/Somebody see you up there.” —Talking Heads, “Life During Wartime”
In the hours before Hurricane Sandy slammed into the northeastern United States, my apartment in Bethlehem (Pennsylvania), which was 100 miles and a few hours from the Atlantic, was filled with the unmistakable smell of the shore. Stolid son of the Alleghenies that I am, I’d never experienced the full onslaught of a hurricane before. This almost miasmic odor I associated with vacation—a fragrance inextricably connected to the Jersey boardwalk and Massachusetts beaches, of salt-water taffy and lobster rolls—suddenly permeating my living room, whose window looked out on a hulking, rusting former steel mill, felt borderline apocalyptic. As is the nature of things apocalyptic, it’s the incongruity that is alarming. As it was for some frightened 17th-century peasant reading a pamphlet foretelling doom because of the appearance of a mysterious comet in the heavens or the birth of a two-headed calf. The unexpected, the unusual, the unforeseen act as harbingers.
A landlocked home smelling like the beach is perhaps not as dramatic as those earlier examples, of course, and yet as with a sun-shower or the appearance of frost in May, there is a certain surrealism in things being turned upside down. That disruption in the nature of things makes it feel like worse disorder is coming. As it did, certainly, those hours before climate-change-conjured Sandy knocked out transformers, their explosions lighting up the horizon an oozing green all through the night, the winds howling past my building on its hill overlooking the river, where ultimately the power was out for more than a week, and roads were made impassable by the felled centuries-old oaks and maples which dotted the Lehigh Valley. It was the eerie stillness in the air before the storm came that impressed itself upon me (so much so that this isn’t the first time I’ve written about it), those last few moments of normalcy before the world ended, but when you could tell it was coming, and there was nothing to do but charge your phone and reinforce your windows to withstand the impact from all of the debris soon to be buffeted about. Can you smell the roiling, stormy, boiling sea in the air right now?
“If destruction be our lot,” state representative Abraham Lincoln told a crowd gathered at the Young Men’s Lyceum of Springfield, Ill., in the winter of 1838, “we must ourselves be its author and finisher. As a nation of freemen, we must live through all time, or die by suicide.” Historical parallels outlive their critical utility; some of us have made a cottage industry out of comparing whatever is in our newsfeeds to the Peasants’ Rebellion or the English civil wars. In the realm of emotion, however, in psychological reality, is the autumn of 2020 what it felt like to learn that Polish defenses had been overrun by the Nazi blitzkrieg? To apprehend the dull shake of those guns of August a generation before? To read news that Ft. Sumter had fallen? As Franco’s war in Spain was to the world war, as Bleeding Kansas was to the civil, are we merely in the antechamber to a room that contains far worse horrors? Ultimately no year is like any but itself, so that we’re already cursed enough to live during these months of pandemic and militia, of incipient authoritarianism contrasted with the uncertain hope for renewal. On the ground it can’t help but feel like one of those earlier moments, so that we’re forced to fiddle about with the inexact tool of historical comparison, of metaphor and analogy. Something of what Lincoln said, more than something, seems applicable now. “Suicide” might not be the right word though, unless we think of the national body politic as a single organism in and of itself. Certainly there are connotations of self-betrayal, but it’s more accurate to see this season of national immolation as what it is—a third of the country targeting another third while the remaining third stays non-committal on what stand they’ll take when everything starts to finally fall apart.
We shouldn’t misread Lincoln’s choice of word as indicating an equivalence of sides; in this split in the national psyche there is the malignant and the non-malignant, and it’s a moral cowardice to conflate those two. On one side we have a groundswell movement on behalf of civil and human rights, a progressive populism that compels the nation to stand up for its always unrealized and endlessly deferred ideals; on the other we have the specter of authoritarianism, of totalitarianism, of fascism. This is not an issue of suicide, it’s one of an ongoing attempted homicide, and if you’re to ever not shrink away from mirrors for the rest of your life—even if the bad guys should win (as they might)—then choose your side accordingly. And figure out that you don’t even have to like your allies, much less love them, to know that they’re better than the worst people in the room. If you bemoan “cancel culture” and “social justice warriors” but not the extrajudicial kidnapping of activists by paramilitaries, then you are at best a hypocrite and a fool, and at worst a bad-faith actor justifying the worst of the U.S. government. If your concern is with the rhetorical excesses of a few college kids on Twitter, but you’re silent about the growing fascist cult currently in control of the federal executive, the federal judiciary, half of the federal legislature, and a majority of state governments (not to speak of the awesome power of the military), then you’ve already voted with your words. If you’re disturbed by property destruction, but not the vigilante murder of protestors, then you’ve already made your decision. We all have to imagine that speaking out might still mean something; we have to pretend that voting might make a difference; we all have to live with ourselves as citizens and human beings. What I’m writing about is something different, however. What I’m writing about is what it feels like to be living through the blood-red dusk of a nation.
When the Romans left Britain, it was so sudden and surprising that we still have record of the shock amongst the locals over the retraction of the empire from their frosted shores. The medieval British monk Gildas the Wise, and after him the Venerable Bede, who drew on his account, record that in the immediate years following this abandonment, an appeal was sent to the capital for assistance. “The barbarians drive us to the sea,” wrote the Britons’ leaders, “the sea drives us to the barbarians; between these two means of death, we are either killed or drowned.” Under the protection of the imperial hegemon, the British Celts built an advanced civilization. Aqueducts brought water into the towns and cities, concrete roads lined paths through the countryside. One imagines that the mail arrived on time. In a shockingly brief period, however, all of that was abandoned, the empire having retracted back into itself and left those for whom it was responsible at the mercy of those who wished to pick apart its bones. Three centuries later, the inhabitants of England no longer even remembered Rome; an anonymous Anglo-Saxon poet writes of a ruined settlement, that “This masonry is wondrous; fates broke it/courtyard pavements were smashed… Roofs are fallen, ruinous towers,/the frosty gate with frost on cement is ravaged,/chipped roofs are torn, fallen,/undermined by old age.” Have you seen American infrastructure lately? By the eighth century, that silent scop singing his song of misinterpreted past glories can’t even imagine by what technology a city like Londinium was made possible. He writes that “the work of giants is decaying,” because surely men couldn’t have moved stones that large into place.
Because historical parallel is such a fickle science, an individual of very different political inclinations than myself might be apt to misunderstand my purposes. They may see some sort of nativist warning in my allegory about Picts and Scots pushing beyond Hadrian’s great, big beautiful wall. Such a reading is woefully incorrect, for the barbarians that I identify are not some mythic subaltern beyond the frontier, but rather the conspiratorially minded fanatics now amassing at the polls, the decadent parsers of tweets who believe in satanic cabals, and the personality cultists who’ve all but abandoned a belief in democracy. As the Greek poet Constantine Cavafy wrote, “Why isn’t anything going on in the senate?/Why are the senators sitting there without legislating?/Because the barbarians are coming today.” We’re beyond the point of disagreeing without being disagreeable; the era of going high when they go low is as chimerical as it ever was. There is something different in the United States today, and I know that you feel it; something noxious, toxic, sick, diseased, and most of all decadent. The wealthiest nation on Earth with such iniquity, where pandemic burnt—still burns—through the population while the gameshow host emperor froths his supporters into bouts of political necromancy. There is no legislation today because it increasingly feels like this is not a nation of laws, but something lower and uglier.
When I say that there is a decadence, I mean it in the fullest sense of that word. Not in the way that some reactionaries mean, always with their bad faith interpretations; nor exactly in the manner that my fellow leftists often mean, enraptured as they are to that ghost called “materialism.” Rather I mean a fallenness of spirit, a casual cruelty that if I were a praying man I’d identify as being almost devilish. Perhaps there are satanic cabals after all, just not where the letter-people think (I suspect the call is actually coming from within the White House). Since the republic was founded, we’ve fancied ourselves Rome, always fearing the Caesar who never seems to finally cross the Potomac. That’s the thing with self-fulfilling prophecies. Now the denizens of the fading order of Pax Americana seem every bit as incredulous at collapse as those poor Britons a millennium-and-a-half ago. Writing in The Irish Times, the great critic Fintan O’Toole notes that “Over more than two centuries, the United States has stirred a very wide range of feelings in the rest of the world: love and hatred, fear and hope, envy and contempt, awe and anger. But there is one emotion that has never been directed towards the U.S. until now: pity.” I can genuinely say that I appreciate his sentiment.
When I lived in Europe, I couldn’t help but feel that there was ironically something younger about my friends—I imagine it would seem compounded today. The irony comes from the traditional stereotype of “The American,” this rustic well-meaning hayseed, this big, bountiful, beautiful soul traipsing on his errand into the wilderness. If America was a land without history, then the Old World was supposedly death-haunted, all those Roman ruins testament to the brutality that marked that continent, not least of all in the last century. Such was the public relations that marked this hemisphere from its supposed discovery onward—but how easily we forget the blood that purchased this place, a land which was never virginal, but that was raped from the beginning. I envy Europeans. I envy their social democracy and their welfare states, their economic safety nets and their sense of communal goodwill (no matter how frayed or occasionally hypocritical). Every European I met, the English and Scots, the French and Italians, seemed more carefree, seemed more youthful. They seemed to have the optimism that Americans are rumored to have but of which there is no remaining evidence as the third decade of this millennium begins. During the early days of the pestilence the Italians were locked inside all of those beautiful old stone buildings of theirs. Now they’re sitting outside in cafes and trattorias, going to movies and concerts. We’re of course doing those things too, but the difference is that we have more than 200,000 dead and counting, and from the top on down it seems like few care. A French friend of mine once asked how Americans are able to go to the grocery store, the theater, the public park, without fear of getting shot. In the end, America will get you, whether by bullet or microbe. As a nation of freemen, we’re a traumatized people…

One of the few outsiders to really get our number was D.H. Lawrence, who in his Studies in Classic American Literature noted that “The essential American soul is hard, isolate, stoic, and a killer. It has never yet melted.” How could it be otherwise, in a nation built on stolen land by stolen people? America’s story is a gothic tale, a house built on a Native American burial ground. The legacies of bloodshed, of assault, of exploitation, of oppression that mark this forge of modernity ensure that it’s hard to be otherwise, even if we’re not allowed to ever admit such unpatriotic things. In that sense I don’t wonder if it wasn’t inevitable that we’d eventually be led—against the wishes of the majority—by this fool who promises to steal an election while accusing his adversary of the same, who will no doubt refuse to concede even when it becomes clear that he’s lost. We’re continually told by nice, liberal, and morally correct commentators that this is not who we are, but the American president is a philandering, sociopathic carnival barker who sells bullshit to people who can’t be so brain dead as to not know that it’s bullshit, all because they hate people who look different from them more than they love their own children. He’s Elmer Gantry, Harold Hill, “Buzz” Windrip.  He’s the unholy union of P.T. Barnum and Andrew Jackson. What could be more American?
Of course our saving grace has always been that we’re a covenantal nation, defined by supposed adherence to an abstract set of universal values. No land for anything as mundane as blood and soil (even though those ghouls at Charlottesville spread their terror for exactly that reason). There was something scriptural in the idealism that John Winthrop maintained in 1630, whereby national sustenance was in “our community as members of the same body,” or Lincoln in 1863 providing encomium for “government of the people, by the people, for the people,” and Barack Obama in 2004 declaring the American mantra to be one of “Hope in the face of difficulty, hope in the face of uncertainty, the audacity of hope.” That old saw about life, liberty, and the pursuit of happiness. No nation since that of the ancient Hebrews was so fully founded upon an idea—this idea that is by definition so utopian and so completely unattainable that to be a satisfied American is to make your peace with heartbreak, or else to see yourself become either delusional or cold and cruel.
There is an idea of America and the reality of the United States, and all of our greatest literature, rhetoric, and philosophy lives in the infinite gap between them, our letters always being an appraisal of the extent of our disappointment. “The promises made in the Declaration of Independence and the Constitution,” writes critic Greil Marcus in The Shape of Things to Come: Prophecy and the American Voice, “were so great that their betrayal was part of the promise.” Thus the greatest of American political modes from the Puritans to Obama would be the jeremiad. Thus our most native of literature, be it Mark Twain’s The Adventures of Huckleberry Finn or Ralph Ellison’s Invisible Man, charts the exigencies of a dream deferred. All of American literature is a tragedy. What we’re living through now isn’t a tragedy, however—it’s a horror novel. Only the most naïve of fools wouldn’t be aware that that strain of malignancy runs through our country’s narrative—all of the hypocrisies, half-truths, and horrors—that define us from the moment when the word “America” was first printed on Martin Waldseemüller and Mathias Ringmann’s map of the world in 1507. In Stephen Vincent Benet’s classic short story “The Devil and Daniel Webster,” Old Scratch himself says that “When the first wrong was done to the first Indian, I was there. When the first slaver put out for the Congo, I stood on her deck…I am merely an honest American like yourself—and of the best descent.” What would Eden be, after all, without the serpent? The thing with devils is that they imply there must be angels; if you can find proof of hell, that indicates that there might be a heaven, somewhere. That’s the corollary to the failed covenant, that even with all of the hypocrisy, half-truth, and horror, there is that creed—unfulfilled, but still stated. Freedom of expression. Equal opportunity. The commonwealth of all people. Do I write jeremiads myself? Very well then.


I only do so to remind us that the confidence man huckster (who as I write this is only a few miles down Pennsylvania Avenue undoubtedly conspiring on what nightmares he’ll unleash upon his fellow citizens when he doesn’t get his way) is an American, if a cankered one. Take solace, though, because America isn’t just Stephen Miller, but Harriet Tubman and John Brown also; it’s not only Steve Bannon, but Frederick Douglass and Elizabeth Cady Stanton; more than Donald Trump, it’s also Eugene Debs and Dorothy Day, James Baldwin and Emma Goldman, Harvey Milk and Ruth Bader Ginsburg. Such a litany of secular saints is of course inconsistent, contradictory, and I’ll unabashedly confess a bit maudlin. But that’s okay—we need not all agree, we need not all be saints, to still be on the side of those beings in any such Manichean struggle. More than just angels can fight demons; the only thing required is the ability to properly name the latter. Because if American history is anything, if the American idea is anything, it’s a contradictory story, that dialectical struggle that goes back through the mystic chords of memory, a phrase which I once read somewhere. The contradictions of American culture once again threaten to split the whole thing apart. Make your plans accordingly, because the battle always continues.
For such is the great moral struggle of this century. It is against neofascism and its handmaiden of a cultish, twisted civil religion. It requires the breaking of this fractured American fever dream, where a vaccine is far from assured. Right now it seems like our choices are authoritarianism or apocalypse, though perhaps there are always reasons to hope for more. What’s coming, I can’t be sure of, but that lyric of the great prophet Leonard Cohen, “I’ve seen the future, brother/It is murder,” echoes in my numbed brain. Whether we can stand athwart history and yell “Stop!”, whether there is the possibility of effecting genuine change, whether we can still salvage a country of decency, justice, and freedom—I’m unsure. What I do know is that whether or not any of those things can happen, we must live our political lives with a categorical imperative that acts as if they can. Not least so that we’re able to live with ourselves alone in the rooms of our minds. Live with at least some convictions, live spiritually like the men remembered in poet Genevieve Taggard’s lyric in honor of those veterans of the Abraham Lincoln Brigade, Americans (mostly socialists, communists, and anarchists) who went to Spain to fight the fascists in the years before the Second World War. “They were human. Say it all; it is true. Now say/When the eminent, the great, the easy, the old,/And the men on the make/Were busy bickering and selling,/Betraying, conniving, transacting, splitting hairs,/Writing bad articles, signing bad papers,/Passing bad bills,/Bribing, blackmailing,/Whimpering, meaching, garroting, – they/Knew and acted.”
Bonus Links: Letter from the Other Shore; Letter from the Pestilence; Steal This Meme: Beyond Truth and Lies; On Pandemic and Literature
Image Credit: SnappyGoat.

Humble Words Organized Beautifully: Ward Farnsworth on Style

There is something disappointing in the fact that despite his immaculate New Yorker essays and his perfect children’s books (the latter of which are nothing to scoff at), E.B. White will most likely be primarily remembered for his over-valorized and pedestrian composition guide based on punched-up lecture notes from his old Cornell professor William Strunk. I’ll confess that White’s Stuart Little moves me more than Michael Cunningham’s The Hours, even if the former is about an anthropomorphic mouse who falls in love with a bird and the latter is about Virginia Woolf killing herself, but when it comes to The Elements of Style, I’m left completely cold. Linguist Geoffrey Pullum gleefully demolished the shrine to Strunk and White in a Chronicle of Higher Education piece where he condemned the “book’s toxic mix of purism, atavism, and personal eccentricity,” which is not “underpinned by a proper grounding in English grammar.” Still you’ll find precocious English majors and pretentious English professors who cling to White and Strunk’s guide as if holy writ, repeating their dogma of the best (or only) writing as “being specific, definite and concrete” or that “Vigorous writing is concise,” as if those were postulates of physics and not aesthetic suggestions mediated through a particular time and place (with the attendant masculine obsessions of that time and place).

The Elements of Style endures, however, like some antique bacterium in Siberian permafrost released by climate change and threatening us all; something better left as a relic of the time that produced it rather than as a universal guide to good composition. Its authors arrogantly pose laws as if they were the Author of the Decalogue, and their stylistic affectations are configured as inviolate rules of grammar. Strunk and White are mummies of the Lost Generation, bound in typewriter ribbon and pickled with scotch, and their adjective-slaying, adverb-slaughtering, violent Fitzgeraldian demands to kill your darlings reflect a type of writing that’s only one example in the many-mansioned house that is literature. It’s not that the advice they give leads to bad writing, and if concision is your aim then by all means dog-ear those pages of their book. It’s rather that Strunk and White exclude anything with a glint of the maximalist, a hint of the baroque, a whiff of the rococo, a sniff of the Byzantine, or—egad!—even a touch of the purple. They make totems of simplicity, fetishes of concision, idols of conventionality. I can’t in good conscience tell students that they should “Prefer the standard to the offbeat” or that they should “not affect a breezy manner.” Literary style, as with clothing, is an issue not of dressing like somebody else, but of being the most fully you that you can be (as Queer Eye’s Tan France would no doubt confirm). If Brooks Brothers is your thing, then by all means let Strunk and White be your guide, but never forget that the wardrobe goes back a way. “It’s sad,” Pullum writes of the regard in which the book is still held a century after it was written. For 13 years I’ve taught college composition, and for all 13 of them I’ve refused to teach The Elements of Style.

Luckily there is no shortage of composition guides of varying qualities and uses, and I’ve taught a cornucopia of them (veggies both rotten and succulent). The nearest to Strunk and White is George Orwell’s “Politics and the English Language,” which for what it shares with The Elements of Style—in demands for the elimination of excess words, the denigration of the noble passive voice, and the provincialism that piously intones that we should “Never use a foreign phrase, a scientific word, or a jargon word”—still has crusted about it some interesting philosophical observations about the relationship of language to thought. Many of my colleagues are partial to Gerald Graff’s They Say/I Say, which is less a style guide than it is a general first-year composition guide, focusing on the way in which arguments can be posed—a useful volume, even if its Stanford-doctorate-holding author’s affectation of being a reformed ‘50s greaser who learned the importance of higher education leaves Generation Z befuddled. Several years ago, I put in a book order for Stanley Fish’s How to Write a Sentence: And How to Read One, enchanted by his confession that “Some people are bird watchers…I belong to the tribe of sentence watchers.” Yet like many Fish books it seemed like a good idea at the time. For similar reasons, I never even thought to crack the spine of Steven Pinker’s The Sense of Style: The Thinking Person’s Guide to Writing in the 21st Century, with its promise to bring cognitive science to bear on the humble scribbler’s trade.

Style guides that deserve less snark include Karen Elizabeth Gordon’s delightfully gothic and grammatical The Transitive Vampire: The Ultimate Handbook of Grammar for the Innocent, the Eager, and the Doomed, Sam Leith’s excellent oratory and composition guide Words Like Loaded Pistols: Rhetoric from Aristotle to Obama (as terrible a title as that may be), and the wonderfully practical and pop-culture-infused primer by Arthur Plotnik, Spunk & Bite: A Writers’ Guide to Bold, Contemporary Style (as great a title as that is). I’d be remiss not to mention Random House’s chief copy editor Benjamin Dreyer’s engaging Dreyer’s English: An Utterly Correct Guide to Clarity and Style, which is a chatty, if thorough, encomium for the lost art of line editing, and which I’ve been pleased to read but have never taught. I’ve neither taught nor read Francis-Noël Thomas’s Clear and Simple as the Truth: Writing Classic Prose, with its promise of continental elegance and the Attic style, with its sophisticated sense that “learning to write cannot be reduced to acquiring writing skills,” but the loss is mine.

Now our bookshelves can include a new title by Ward Farnsworth, dean of the University of Texas Law School, entitled Farnsworth’s Classical English Style. This title concludes a trilogy of Farnsworth’s, joining the unlikely cult hits of Farnsworth’s Classical English Rhetoric and Farnsworth’s Classical English Metaphor, bringing to a close his series of vaguely Victorian, vaguely tweedy, and vaguely Anglophilic guides to style and writing (joining The Practicing Stoic, which I reviewed for The Millions). In his preface, Farnsworth acquits himself well in the style-guide turf war between linguistic prescriptivists and descriptivists, noting with admirable writerly latitudinarianism that “Most modern books offer advice: write this way, not that way. This book does not offer advice of that kind.” Belying the slightly fussy affectation that the book presents, from its title evocative of the 19th century, to the bulk of Farnsworth’s examples coming from writers like Dickens, Churchill, Lincoln, and Dr. Johnson, his philosophy of composition is wonderfully anarchic when compared to the partisans of prescription who dominate the writing classroom and the style-guide racket. The packaging says “conservative,” but the spirit says “rip the pages out of your book.” Farnsworth’s Classical English Style is a Molotov cocktail wrapped in paisley; a hand-grenade cushioned in madras. “Books on style usually state precepts that have merit but that talented writers violate often. Much of this book is about the violations and reasons to commit them,” Farnsworth writes, but “Our topic, in part, is when to make exceptions.” A manifesto for the preppy revolutionary of the writing seminar. Because every single dictum of the Strunk-White Industrial Complex is complicated by Farnsworth; within the pages of his book you will encounter a defense of passive voice, a glorying in the labyrinthine curve of sentences that pack clause upon clause, and a celebration of the Latinate (though not at the expense of the Saxon).

Orwell offered six rules of writing; Strunk and White had their 11 principles (as if writing were a management course), but Farnsworth correctly notes that the “only rule really worth worrying about is simple: have a reason for whatever you do in your writing.” The whole of the law is thoughtfulness; otherwise do what thou wilt. Consequently, Farnsworth grinds the sacred cows of muscular modernism’s style guides into bloody hamburger. If a revolution refers to a turning back (a ‘revolving’ back), then Farnsworth doesn’t so much reject the icons of efficient writing—Papa Hemingway and Cormac McCarthy, with all their swagger and exterminated adjectives—as turn back to the rich classical rhetoric of the era that modernism supplanted, valorizing Thomas Paine and Charles Dickens, Mary Wollstonecraft and Frederick Douglass—with all of their biblical phrasing, their classical allusion, their gargantuan sentences and erudite diction. None of these writers would have passed the Strunk and White smell test; they could be long-winded, they could be multisyllabic, they could be meandering, allusive, and illusive. Eighteenth- and 19th-century prose calls to mind the Victorian critic Thomas Macaulay’s ambivalent assessment of Dr. Johnson, that “All his books are written in a learned language, in a language which nobody hears from his mother or his nurse, in a language in which nobody ever quarrels, or drives bargains, or makes love, in a language in which nobody ever thinks.” Which naturally makes one ask if some things might not precisely call for an unnatural language; which makes one apprehend that there might be more things in heaven and earth than are dreamt of in The Elements of Style.

Farnsworth notes that “different styles are right for different occasions,” and while he doffs his cap to the style guides that reign triumphant, he also makes clear that “This book treats efficiency as important but as not enough.” The modernists and their intellectual descendants, who have long dominated the teaching of correct style, advocated for clipped sentences and put punctuation, adverbs, and adjectives on the endangered list, and they derived their conclusions in part because they were constrained by material conditions. A journalist like White was naturally limited by the tyranny of the margin, and much of style guide orthodoxy comes from print reporting where one really did have to be judicious with verbiage. Little wonder that efficiency and concision, in all of their capitalist and Protestant dreariness, became the shibboleths of proper style. Farnsworth’s point is that things were not always as such, nor do they always have to be so. If advocates for simplicity believe that the “purpose of a piece of writing is to transmit meaning to the reader; so, the writer’s job is to make the meaning easier to understand,” they’ll find little to agree with in Farnsworth’s Classical English Style. He summarizes the White-Strunk Consensus as being that good writing uses “simple words… short sentences… [and is] direct.” From that ideal follows an exterminator’s crusade against “Needless words, needless length, or needless anything,” since they are rank inefficiencies. Farnsworth challenges every single one of those assertions.

But he’s not a complete libertine, and if anything, Farnsworth’s Classical English Style provides some deeper and more useful axioms of writing. Farnsworth offers a novel encapsulation of what makes good writing good, whether in the King James Bible or the Gettysburg Address. He hypothesizes that “Rhetorical power doesn’t come from just being clear or just being concise or by pushing in any other one direction. It’s usually created by some sort of push and pull, or in a word by contrasts.” Where a traditional style guide would emphasize shortness and simplicity as a means to writerly elegance, Farnsworth says it’s in the display of difference that writing becomes most fully engaging, with the union of oppositions, “usually one of friction or contrast, between two things.” Illustration of this is provided when he analyzes the average sentence structure and length in the writing of Supreme Court Justice Oliver Wendell Holmes, arguably the greatest American legal writer (and thus close to Farnsworth’s lawyerly heart). While taking care to emphasize that sentence length should be varied, most contemporary style guides hew towards the mantra that shorter is better. Farnsworth compares two paragraphs of legal writing, one from Holmes and the other from the entirely more pedestrian writer (and man) Chief Justice William Rehnquist. What Farnsworth discovers isn’t that Holmes’s sentences are either shorter or longer on average than those of Rehnquist; rather, Holmes’s shorter sentences are much shorter and his longer sentences are much longer. The consensus perspective might be that briefer sentences are preferable, and the oppositional pugilist might challenge that, but the reality is more nuanced—it was in the juxtaposition of such extremity, Farnsworth argues, that Holmes’s talents lay. (It would be fascinating to see an analysis of Antonin Scalia, for what that justice lacked in scruples or ideas he made up for in style.)

The rhythm of like and unlike is what becomes Farnsworth’s stylistic equivalent of the unified field theory of physics; it is (not unconvincingly) his argument for what separates the sublime from the passable. A shift in difference can take many forms; he writes that the “two things might be plain and fancy words, long and short sentences, hard and soft syllables, high or rich substance and low or simple style…the concrete and the abstract, the passive and the active, the dignified and the coarse, detachment from the audience and engagement with it.” The argument, he seems to be saying, is that strict prescriptions for what words to pick, how long to make your sentences, and what tone to affect all miss the point—the whole game is in setting up some pair of dueling Manichean principles and letting that tension be the energy that propels the prose. The whole thing puts me in mind of an old comic strip that one sees posted upon many a faculty office door. Underground cartoonist Matt Groening, now very wealthy from being the creator of The Simpsons, had an entry in his strip Life in Hell entitled “The 9 Types of College Professor,” which included the “Single Theory-to-Explain-Everything Maniac.” His warning about this genus of academic was that their “Theory may be correct.”

That’s a bit of what Farnsworth’s contention feels like to me—I’m skeptical, but his rhetorical analysis goes a long way toward reverse engineering what makes great prose effective. The strength of his theory is that it’s general enough, for disjuncts in prose can be manifested in a variety of ways. He demonstrates the power of his observation when turning towards the Gettysburg Address, often configured as an exemplar of the plain-spoken Attic style. “The beauty and power of Lincoln’s wording,” however, “lies not in a relentless use of Saxon words but in the movement between earthy language and airier words and phrases that elevate.” Contrary to the Orwell prescription that native English etymology must always be preferred, Farnsworth says that Lincoln’s genius was in knowing how to weave the various strains of English together, so that “The Saxon words create feeling and convey simplicity and sincerity. They hit home. The Latinate words evoke thought and connect the images to concepts and ideals. The sound and tone of each balances the sound and tone of the other.”

If there is a drawback to Farnsworth’s Classical English Style, it’s that he leans so heavily on excerpts from “Shakespeare and the King James Bible, from Lincoln and Churchill, from Charles Dickens and Mark Twain.” His contention that “writers of lasting stature still make the best teachers” is well taken; I always tell my students that there is no shortcut to good writing, that it can only come about by reading all of the time. But the problem with the canon is that it’s a surprisingly anemic syllabus. Farnsworth says that the “premise of the book” is that these canonical authors provide “a set of lessons on style drawn from writers whose words have stood the test of time.” To which it could be retorted that that’s true for some of them; Lincoln and Churchill are unambiguously great, G.K. Chesterton is a bit too sherry-pickled for my taste, and the passages from the Irish parliamentarian Henry Grattan were certainly good, but I don’t know if he’s withstood the test of time in the sense that he doesn’t appear on a Barnes & Noble tote bag. At a certain point the litany of selections from Cardinal Newman and Patrick Henry, Daniel Webster and Edmund Burke gets tiring. Farnsworth doth protest too much about there not being enough contemporary examples exhibiting the exact qualities he celebrates, pernicious modernist minimalism or not. Farnsworth’s Classical English Style would have benefited from some Joan Didion or James Baldwin, if we’re hunting for great sentences.

A small quibble, that. Because while Farnsworth’s tastes might be conservative, his appreciation is radical. Farnsworth’s Classical English Style has a great benefit in exploding all of the pious certitude of every grammarian who has yelled at somebody for ending a sentence in a preposition, every knife-wielding composition teacher who with red pen excises succulent meat from the bare bone of the sentence, every sectarian of shortness declaiming that their way of writing is the only way of writing. Questions of composition pedagogy are often configured as a pitched battle between conservative prescriptivists and liberal descriptivists, the former drafting laws that must be followed and the latter simply describing language as it’s actually used. Any number of conservative commentators who decry the so-called degradation of contemporary language, blaming texting or pop music, are within the prescriptivist camp even while most teachers of writing are firmly descriptivist. It would be easy to see the author names that Farnsworth uses to illustrate his points and to assume he’s in the conservative faction, and it’s true that he opposes a certain literary entropy, but he’s not mounting the same argument that people who decry text-speak are making. “Those who wring their hands about the decline of the language sometimes worry too much about the wrong things,” Farnsworth writes. “They observe with horror that people confuse uninterested with disinterested, or don’t know when to say fewer and when to say less, or fumble in their use of the apostrophe or other punctuation marks.” But grammar is incidental, in many ways; the true decline has been one of style, in part ironically pushed by the very people who claimed to be defending the honor of English. He argues that “the more meaningful decline of the language doesn’t involve the presence of mistakes. It involves absences that are easier to overlook: the abandonment of half the orchestra, the erosion of rhetorical ability, the dwindling of attention spans, the scarcity of speech that inspires and rouses and strikes deep.”

Farnsworth’s Classical English Style is a worthy rejoinder to The Elements of Style. If Strunk and White sent half the orchestra home, furloughing the whole grab-bag of rhetorical tropes that make language musical, from anaphora to zeugma, then Farnsworth is passing around a collection plate to make sure that we can still hear their music. He’s correct that the composition guides that dominate our definitions of good writing have worthy observations in them—there is no shame in that mode, as long as we acknowledge it as one among many. But in them the full variety of ways of writing is reduced, minimized, obscured. Farnsworth is right to mount a defense of the beleaguered Byzantine. It puts me in mind of a volley in the Pedant’s War that I once got into with a colleague who objected to my favored use of the rhetorical trope of asyndeton, the practice of deleting conjunctions so as to effect a kind of breathless rhythm when listing ideas at the end of a sentence. For him this was unconventional and distracting; for me, it gave the structure of the sentence exactly what I wanted—a sense of things as rushed, energetic, incomplete. Count me in Farnsworth’s camp. What he offers is a beautiful stylistic disestablishmentarianism, a sentiment that gives student and writer alike the permission to be breezy—the permission to prefer the offbeat to the standard. Follow the call: enjoy the unconventional word, the meandering sentence, the affected rhetorical trope. Extremism in defense of the baroque is no vice; moderation in pursuit of minimalism is no virtue.

Bonus Links: “A Year in Reading: Ward Farnsworth”; “Thinking Makes It So: Ward Farnsworth Reframes the Stoics with Wit and Insight”; “Ward Farnsworth Doesn’t Mess Around: On ‘Classical English Metaphor’”; “A Review! A Review! Farnsworth’s Classical English Rhetoric”; “Scenes From Our Unproduced Screenplay: ‘Strunk & White: Grammar Police’”; “Prescriptivists vs. Descriptivists: The Fifth Edition of The American Heritage Dictionary”; “The Impediments of Style: Advice from Steven Pinker and the CIA”

On Isolation and Literature

“The mind is its own place, and in itself/Can make a Heav’n of Hell, a Hell of Heav’n.” —John Milton, Paradise Lost (1667)

“All of humanity’s problems stem from man’s inability to sit quietly in a room alone.” —Blaise Pascal, Pensées (1670)

In a field between Sharpsburg, Md., and Antietam Creek in the fall of 1862, more than 21,000 men would become casualties in a single day. In a photograph of the battle’s aftermath taken by Alexander Gardner of Mathew Brady’s studio (the battle that the South named after Sharpsburg and the North after Antietam), there is a strewing of bodies in front of the Dunker Church maintained by a sect of Pennsylvania Dutch Baptists. In their repose, the men no longer have any concerns; in the photograph it’s difficult to tell who wears blue and who wears grey, for death has never been prejudiced. Americans had never experienced such destruction before, such death, such a rupture from what they had defined previously as normal. If Americans had been cursed with their own erroneous sense of exceptionality in the decades before the Civil War, believing that suffering was something that only foreigners were susceptible to, then that carnage temporarily disabused them of their sense of superiority. Drew Gilpin Faust writes in This Republic of Suffering: Death and the American Civil War that the “impact and meaning of the war’s death toll went beyond the sheer numbers. Death’s significance for the Civil War generation arose as well from its violation of prevailing assumptions about life’s proper end—about who should die, when and where, and under what circumstances.”

The United States was unprepared for the extremity of this thing—22,717 men killed, wounded, or missing in a day—with roughly three-quarters of a million perishing by the war’s end. Faust writes that “Americans of the immediate prewar era continued to be more closely acquainted with death than are their twenty-first-century counterparts,” though if the state of exception demonstrated by the war proves anything, it’s that nobody should be so sanguine concerning his privileges. One survivor of Antietam, a member of the Massachusetts 15th named Roland Bowen, castigated a friend who wanted ghoulish details of the battle. He writes in a letter that such images “will do you no good and that you will be more mortified after the facts are told than you are now.” Such suffering couldn’t be circumscribed by something as insignificant as mere words, nor was it the task of Bowen to supply such texture in fulfilling his friend’s prurient fascination. The task of putting words to this horror belonged to somebody with no allegiance to anything as crass as the literal, and paradoxically it wouldn’t come from somebody who was actually witness to the horrors. A year before Antietam’s blood-letting, a 31-year-old woman sequestered in her bedroom in a yellow house in Amherst, Mass., would presciently write on the back of an envelope that “I felt a Funeral, in my Brain,/And mourners to and fro/Kept treading – treading – till it seemed/That Sense was breaking through.”

Emily Dickinson is American literature’s most significant recluse. She is our hermit, our anchorite, our holy isolate. Despite Dickinson’s self-imposed solitude, first limiting herself to Amherst, then to her family’s house, and finally to her own bedroom, where she would speak to visitors from the half-opened door, her poetry is the greatest literary engagement with the trauma of the war. She was a spiritual seismograph, transcribing and interpreting the vibrations that she detected through the land itself, and though she never saw the battlefields at Antietam or Gettysburg, never even leaving Massachusetts, her 1,789 short lyrics are the fullest encapsulation of that event, even while it’s never specifically mentioned—though lines like “My Life had stood – a Loaded Gun” evidence her mood.

Only a handful of her poems were published in Dickinson’s own lifetime, normally anonymously, with a notable example being a few lyrics included in the 1864 newspaper Drum Beat, whose proceeds went to the care of Union soldiers. The war’s apparent absence from her poetry is incongruously proof of its presence, for as Susan Howe writes in My Emily Dickinson, the “Civil War broke something loose in her own divided nature.” Other figures like Walt Whitman and Herman Melville produced brilliant poetry about the war as well, but the absence of explicit language about battlefield deaths in Dickinson’s verse is a demonstration of Bowen’s warning that mere reportage “will do you no good.” She isolates not only herself, but the meaning of her poems, from the brutal reality of that American apocalypse—such isolation mimics the brutality of the event all the more completely. “I have an appetite for silence,” she wrote, for “silence is infinity.”

Within the cocoon of that silence, Dickinson made herself a conduit for the blood-sacrifice then taking place; despite being in solitude, she was not solitary; despite being isolated, she was not an isolate. There are two ways of producing literature; from her multitudinous contemporary Whitman there was the gospel of extroversion, the smithy of the crowd whereby the throngs are the source of his energy and he finds “letters from God dropt in the street.” His engagement with the war was visceral, forged in Washington D.C.’s hospitals where he tended to the injured. Dickinson’s poetry of seclusion was more abstract, but no less pertinent, and from her introversion a different variety of poetry could be produced. Dickinson is too often reduced to mere recluse, transformed into a crank sequestered in an attic, but she was actually a brilliant performance artist for whom the process was as integral as the product. Buddhist scholar Stephen Batchelor writes in The Art of Solitude that there is more to that state “than just being alone. True solitude is a way of being that needs to be cultivated. You cannot switch it on or off at will. Solitude is an art…When you practice solitude, you dedicate yourself to the care of the soul.” Dickinson’s isolation wasn’t just how she crafted her verse; in some sense it was her verse.

“Interiority” is one of those literary critical jargon terms that are overly maligned, for it expresses something useful about this quality of consciousness we share, a term for the many mansions in our head. There is a breadth and depth to the human experience—and the experience of the human experience (if I’m to be meta)—that only “interiority” can really convey. Douglas Hofstadter writes in I Am a Strange Loop that “what we call ‘consciousness’ was a kind of mirage…a very peculiar kind of mirage…Since it was a mirage that perceived itself…It was almost as if this slippery phenomenon called ‘consciousness’ lifted itself up by its own bootstraps, almost as if it made itself out of nothing.” Such an ex nihilo self-creation can only take place alone, of course. And in her solitude, Dickinson, like all hermits, made the very substance of her thoughts a living work.

Some people, perhaps most people, live their lives on the outside, all thoughts conveyed in a running monologue to the world. But the isolation of crafting literature, even if done in a crowded room, is such that any writer (and reader) must be by definition solitary, even while entire swaths of existence are contained inside one human skull. Such is the idealism of Dickinson when she claims that “To make a prairie…The revery alone will do.” Isolation is the hard kernel of literature. Beyond the relatively prosaic fact that there have been reclusive writers and secluded characters, isolation is also the fundamental medium of both reading and writing, traceable back to our inherited numinous sense and to the intimation of hidden works, all that we shall never read but that nonetheless radiates outward into the world with beauty.

A history of isolation is a history of literature, albeit a secret one. Historically we’ve valorized men of action, but it’s people of seclusion who just as easily move the Earth. Think of Christ’s Lenten vigil in the Judaean wilderness, the way in which Satan tempted him and the Son of God so easily resisted, the Lord kept company only by silence and perdition. Isolation is a counternarrative of human existence; for every vainglorious general like Alexander the Great conquering to the ends of the Earth, there is the philosopher Diogenes living naked in an Athenian pot and imploring the former to get out of his sunlight. Emperor Qin Shi Huang can be answered by the sage Lao-Tzu riding off by himself unto the west, and every bishop and pope can be matched by the Desert Fathers and anchorites. For every Andrew Carnegie, there should be a Henry David Thoreau in his Walden cabin. Isolation is one of the fundamental themes of literature, the kiln of experience whereby a human is able to discover certain aspects of character, personality, and existence through journeying to the center of their being (though results certainly vary).

In fiction, there are the recluses damaged by their toxic loneliness: think of Miss Havisham in Charles Dickens’s Great Expectations in her filthy, tattered wedding dress sadistically toying with Pip and Estella, or of the psychically diseased anonymous narrator in Fyodor Dostoevsky’s Notes from the Underground who can intone that “To be acutely conscious is a disease, a real, honest-to-goodness disease.” For all of the sociopathic recluses, biding their time under floorboards and going mad in attics, there are also positive depictions of isolation. Daniel Defoe’s titular character in Robinson Crusoe, shipwrecked upon a desert isle and thus forced to recreate civilization anew, has often been understood as a representative citizen of hard-working, sober, industrious modernity, whereby “We never see the true state of our condition till it is illustrated to us by its contraries, nor know how to value what we enjoy, but by the want of it.” Something similar occurs in Jack London’s The Call of the Wild, where, writing of the isolation of the Alaskan wild, he could declare that there is an “ecstasy that marks the summit of life, and beyond which life cannot rise. And such is the paradox of living, this ecstasy comes when one is most alive, and it comes as a complete forgetfulness that one is alive.” Isolation, however, is far more than a subject authors describe now and then.

Few religious traditions are lacking in the hermitage or the monastery, at least in some form. Among the ecstatic Hasidim there are stories of rabbis like the Baal Shem Tov, who lived at least part of his life as a hermit, and according to tradition the second-century founder of Kabbalah, Rabbi Shimon bar Yochai, resided in a cave for 12 years as the Romans destroyed Jerusalem, exploring mystical secrets. When he could finally leave, bar Yochai’s focus was so intense that he was able to smite people with his eyes. Christianity may have adopted monasticism from the Essenes, an esoteric first-century Jewish sect, and from the Desert Fathers onward, men who lived their lives sitting atop tall columns or meditating among the sands of Sinai, it developed a full-fledged system of religious seclusion. The wages of silence have defined the lives of figures as varied as the nun and sacred composer Hildegard of Bingen in the 12th century and the Trappist monk and activist Thomas Merton in the 20th. For Merton, to be silent was to be radical; in Choosing to Love the World: On Contemplation, he explains that his vows are “saying No to all the concentration camps, the aerial bombardments, the staged political trials, the judicial murders, the racial injustices, the economic tyrannies, and the whole socio-economic apparatus.” Hermits abound in Hinduism, Buddhism, and Jainism as much as in Judaism and Christianity. To be a hermit, to make sister or brother with your own isolation, is to commit a profound act of courage. To have no company other than yourself can be dangerous. As the 12th-century Sufi Muslim hermit Al-Ghazali said, “I have poked into every dark recess, I have made an assault on every problem, I have plunged every abyss.” What’s brought back from the emptiness at the beating heart of every ego is something ineffable, available only to those willing to look for it.

BBC reporter Peter France asks in Hermits: The Insights of Solitude if it is “possible that solitude confers insights not available to society? Could it be that the human condition, even the ways we’re related to each other, is better understood by those who have opted out of relationships?” Certainly there has long been a religious tradition of answering that question in the affirmative; a literary one as well, if we think of authors like Dickinson and Thoreau as psychological astronauts who returned from inner space with observations not accessible to those of us enmeshed in the cacophonous din of everyday social interaction. In a more modern sense, imagine the singular focus, the elemental personality, the bare simplicity of Christopher Thomas Knight’s life. From 1986 until 2013, the North Pond Hermit of Maine’s Belgrade Lakes pushed the stolid and taciturn New England personality to extremes, living alone in the woods and surviving on supplies pilfered from nearby camps and burglarized cabins, while only once speaking a single word, “Hi,” to a hiker sometime in the ’90s.

When finally apprehended by local police, the harmless eccentric supposedly told the officers that he thought Thoreau had been a “dilettante.” A voracious reader, Knight had consumed Walden and thousands of other books, often on the subject of solitude, and if he found Thoreau lacking, he saw great value in other works. Lao-Tzu’s Tao Te Ching, the collected essays of Ralph Waldo Emerson, and of course the collected poetry of Dickinson were all held in high esteem by the hermit. Journalist Michael Finkel recounts his conversations with Knight in The Stranger in the Woods: The Extraordinary Story of the Last True Hermit, in which the dedicated Mainer claims that “solitude bestows an increase in something valuable…my perception. But…when I applied my increased perception to myself, I lost my identity. There was no audience, no one to perform for.” Stripped of all of the socially constructed, arbitrary, imposed, and chosen definitions of the self, Knight had become a singular consciousness, an aggregate of experience divorced from the humdrum job of applying meaning, significance, or gloss to the fact that things happen one after the other. “To put it romantically,” he said, “I was completely free.” Knight had become, in a manner, God.

Hermits such as these are cracked, their extremity such that they dance on the precipice of either saintliness or madness. When journeying to the center of one’s own mind, care must be taken not to lose it. Knight was able to return relatively unscathed from both the chill New England forests and from the solitary experience of having only himself for company for almost three decades, but he is, as extreme hermits go, in some sense lucky. For example, there’s Christopher McCandless, the subject of Jon Krakauer’s bestseller Into the Wild, who followed his Jack London dreams into the Alaskan brush, where the aspiring naturalist’s decomposing corpse would be discovered next to his extensive diaries in a broken-down bus that he’d been living in. Krakauer writes that to venture alone into the wild is to “constantly feel the abyss at your back,” where the “siren song of the void puts you on edge,” yet as with Knight, a type of elemental solitariness emerges as well.

In such solitude, existence could become “a clear-eyed dream,” wherein a “trancelike state settles over your efforts.” There is a danger with dreams, however, for it’s never clear to the dreamer himself just how clear-eyed they actually are, and the zealotry of a McCandless, who confuses the echoes in his own mind for other voices, can easily transmogrify into a different type of hermit. Witness the former mathematics-professor-turned-recluse-turned-domestic-terrorist Theodore Kaczynski. In Walden, Thoreau claimed that “the mass of men lead lives of quiet desperation.” That’s clearly not identical to Kaczynski’s claim in his anarcho-ecological manifesto Industrial Society and Its Future that “almost everyone hates somebody at some time or other, whether he admits it to himself or not,” but there are some resonances. A special type of wag would claim that the Unabomber was simply an angrier Thoreau with a chemistry set, but they both arguably are examples of not unrelated strains of American antisocial individualism, albeit with incredibly different outcomes. “Whosoever is delighted in solitude,” wrote Francis Bacon in the 17th century, “is either a wild beast or a god.”

Thoreau would be instrumental to any cultural accounting of isolation; from a literary perspective he’s the most obvious of hermits in our national tradition. His biographer Robert D. Richardson notes in Henry Thoreau: A Life of the Mind (illustrated by Barry Moser) that the hermit “argued with himself in his journal…about his need for solitude versus the merits of society,” an incredibly American argument, and as Richardson notes, his conclusion was also particularly American: “He came down repeatedly for solitude.” Thoreau’s understanding of solitude derived from that sense of the frontier, that desire for unlimited elbow room that distinguishes this country from the Old World. When my wife and I used to live in Massachusetts, it took less than half an hour to drive to the recreation of Thoreau’s cabin on the shores of that glacial pond, and it’s understandable why he chose to spend a few years there pretending to live off the land. It’s an exceedingly pleasant space, and his example spoke to some sort of romancing of solitude that exists deep within my Pennsylvania soul.

I’m a city boy who grew up in a house that was so close to our neighbor that we could hear when that gentleman sneezed, and I’ve lived in apartments since I was 18, yet I harbor some delusion that if given half a chance I’d live in the middle of nowhere surrounded by nobody. A foolish dream, but my rightful inheritance as an American. The author of Walden is the progenitor of that particular counter-cultural vision—it’s not wrong to see him as a kind of proto-hippie, living off the land and spouting mistranslated versions of the Upanishads, the grandfather of both John Muir and Edward Abbey. But the boot-strapping, the rugged individualism, the obsession with industriousness—Thoreau is a proto-survivalist as well. When he writes in an 1847 diary that “Disobedience is the true foundation of liberty,” it sounds as much a creed for somebody with a “Don’t Tread on Me” flag decal on their truck as for somebody with a “Coexist” bumper sticker on their Volvo. As our favorite hermit, Thoreau is in some manner the patriarch of us all, both left and right, liberal and conservative, anarchist and libertarian. He proves that Americans are never so much like one another as when we’re alone.

Thoreau isn’t our only reclusive writer. Gravity’s Rainbow author Thomas Pynchon eschews almost all media spectacle and refuses to supply an author photo more recent than a senior yearbook picture (though he did do a voice on The Simpsons); Harper Lee retired after To Kill a Mockingbird from Manhattan to her hometown of Monroeville, Ala., where she supported the local high school drama club; and Cormac McCarthy, living rugged and without punctuation in a New Mexico trailer, only broke his silence to appear on The Oprah Winfrey Show, as one does, when the book club read The Road (he has since been more chatty). For most supposedly hermit authors, the reality might be more prosaic; as an irate Pynchon told CNN, “My belief is that ‘recluse’ is a code word generated by journalists… meaning ‘doesn’t like to talk to reporters.’”

No accounting of literary isolation can credibly ignore J.D. Salinger, whose The Catcher in the Rye, despite being read less frequently by adolescents today than it once was, remains the Great American Teenage Novel. Within The Catcher in the Rye there are intimations of that profound desire to be left alone, with its main character Holden Caulfield ruminating that “I’m sort of glad that they’ve got the atomic bomb invented. If there’s ever another war, I’m going to sit right the hell on top of it. I’ll volunteer for it, I swear to God I will.” Salinger’s biography has an unmistakable romance: the scion of the Upper East Side who was feted by The New Yorker and The Paris Review, a brilliant enfant terrible who produced perfect short stories and the immaculate novel with which he’s most associated, only to retire to rural New Hampshire, reject all media and appearances, and yet continue to write prodigiously until his death in 2010.

His last story appeared in The New Yorker in 1965, yet according to journalist and novelist Joyce Maynard in At Home in the World, her memoir about her affair with Salinger in 1972 when he was 53 and she was only 18, he “works on his fiction daily,” claiming that since he’d last been published he’d written two more novels. By the time of his death, Salinger had written 13 more. His dedication to the craft itself was pure—in a rare interview with The New York Times in 1974 Salinger said that “There is a marvelous peace in not publishing…I like to write. I live to write. But I write just for myself and my own pleasure.” The purest form of composition. Upon his death, journalists discovered that the residents of Salinger’s New Hampshire hamlet fully knew who he was, but helped to steer gawkers away from the author. Katie Zezima writes in The New York Times that “Here Mr. Salinger was just Jerry, a quiet man who arrived early to church suppers, [and] nodded hello while buying a newspaper at the general store.” In keeping with a venerable New England sentiment that perhaps both Knight and Thoreau would recognize, a woman quoted in Zezima’s article says that his neighbors did “respect him. He was an individual who just wanted to live his life.” It’s unknown if any of them read those 15 novels.

Solitude and quiet are often figured, albeit in less extreme forms than Salinger’s, as integral to the process of composition. Susan Sontag opined that “One can never be alone enough to write”; Ernest Hemingway said that “Writing, at its best, is a lonely life”; and the Romantic poet John Keats enthused that “my Solitude is sublime.” Peruse the rightly celebrated author interviews in The Paris Review, and you will discover that a space carved out for the self’s sovereignty is one of the few things that unite writers with their varying schedules, methods of research, and editorial eccentricities. The private John Updike, fully inhabiting the 9-5 ethos of his suburban Pennsylvania middle-class youth, wrote his novels in a rented office above a restaurant in Ipswich, Mass.; Wallace Stevens composed his poetry in the quiet of his own head, pounding out the scansions in his steps as he walked to his job in a Hartford, Conn., insurance office. Isolation can take many forms, not all of them literal, but in the pragmatic necessity of a writer needing a “room of one’s own,” as Virginia Woolf famously described it, there needs to be, well, a room of one’s own.

Woolf’s formulation was of course gendered; her point throughout A Room of One’s Own was the way in which the details of domestic responsibility (among other factors) contributed to the silencing of women. In short, if you don’t have the dedicated space and the leisure time in which to write in peace, you’re not going to be writing in peace. Philosopher Gaston Bachelard says as much in The Poetics of Space, noting that the house is “psychologically complex,” whereby “the nooks and corners of solitude are the bedroom.” Living collectively dampens the interior a bit, which is why the privacy of an individual space becomes instrumental in producing individual works. “The house, the bedroom, the garret in which we were alone,” Bachelard writes, “furnished the framework for an interminable dream, one that poetry alone, through the creation of a poetic work, could succeed in achieving completely.”

A direct causal relationship could be drawn between the architectural development of independent bedrooms in the early modern period and the evolution of the novel, the literary genre that most aggressively displays subjectivity. Like all things we take for granted, the bedroom, or the office, or the study, and all places of individual and solitary repose have their own history. Bill Bryson explains in At Home: A Short History of Private Life that the “inhabitants of the medieval hall had no bedrooms in which to retire,” and that “Sleeping arrangements appear to have remained relaxed for a long time.” The word “bedroom” itself didn’t exist until much later; Bryson writes that as “a word to describe a dedicated sleeping chamber… [it] didn’t become common until” the 17th century, the exact period in which the novel began to emerge.

People wrote long narratives before the novel and the bedroom, and for that matter there have always been venerable forms of collaborative writing as well. Yet the possibility of privacy and solitude—not just for a St. Jerome sequestered in his study or a Trappist who’s taken his vow of silence—arguably contributed to certain literary forms that not only require isolation in their production, but in some sense mimic a type of isolation as well. To argue that writing requires quiet is in some ways too prosaic an observation. Writing is silence—writing is isolation. By shielding themselves in the cocoon of composition, writers are in some sense able to create rooms of their own, wherever they happen to be writing. Writing can function as its own type of sensory deprivation, an activity that can erase the outside world in the construction of a new internal one. Think of Stevens lost in his rhythmic reverie pounding out poems on his way to the office, or of James Baldwin writing Go Tell It on the Mountain in a busy Parisian café. Being ensconced within the process of writing, letting yourself become a conduit for words (from wherever they come), is a type of armor against the outside world; it’s a form of isolation that can be brought with you wherever you go.

Which is also true of reading, the only cultural medium that is purely mental and can be done in any situation, circumstance, or setting, and that, if we’re considering its non-digital forms, can be rapaciously consumed free of any outside interference, a universe cordoned off in a book. Reading a book on the bus or a subway, in a Starbucks or on a park bench, is a manner of building your own room within the public. It’s the profoundest type of privacy there can be as you generate an entire new reality alongside the author of the words you’re reading. When language was primarily an oral form, it was delivered collectively, and there is great power in that. But the proliferation of literacy several centuries ago, the promulgation of affordable print, and innovations like the Renaissance Venetian printer Aldus Manutius’s cheap and portable octavo made it possible for people to dream with their books not just while sequestered in a monastery, but anywhere that they pleased, from the encampments by the side of Renaissance Europe’s roads to a New York City taxicab.

Alberto Manguel describes the innovation of silent reading (which St. Augustine famously remarked upon in the fourth century), whereby readers could “exist in interior space…[where] the text itself, protected from outsiders by its covers, became the reader’s own possession, the reader’s intimate knowledge, whether in the busy scriptorium, the market-place or the home.” When I argue that the history of literature is the history of isolation, I mean something more than that writers often require solitude, or that the hermit is a popular figure to be explored in fiction. Rather, the deep vein that runs through the experience and definition of both reading and writing is precisely the sort of solitude that Manguel describes. Isolation is not merely a medium for literature, nor just a method of creating it; it is the very substance of literature itself.

If there is something special about seclusion, about quiet, about aloneness that defines our literature, if something about isolation defined the shift from oral cultures to written ones, then perhaps it’s in the imitation of that original Author who was the first to compose in quarantine. Judaism’s God was distinguished from His pagan colleagues by being a singular creator-being, by forming His world not out of raw materials in collaboration with a pantheon, but of His own singular volition, from nothing at all. Nobody could have been more isolated than God in the beginning, and no literary work emerging from aloneness has been more powerful than existence itself. Such a model, of creation ex nihilo, is the operational essence of literary composition, a medium that requires no performers and no audience and exists only in the transit from one mind to another (and not necessarily even that).

As Dickinson approached her death, she asked her sister Lavinia to immolate her correspondence, but left no instructions concerning the poetry, perhaps assuming it didn’t even merit mention. It appears that Dickinson never intended her work for publication, that the author and audience were one, the purest form of poetry conceivable. Lavinia recognized the poems’ brilliance, which is the only reason that we’re able to read them today. What’s remarkable isn’t that they were almost destroyed, but that Dickinson’s poems survived at all. How many comparable works were penned by names unknown, by women and men innumerable? How much is written and read in glorious isolation never to extend to an audience beyond its creator? What shall be written in the coming days and weeks and months of isolation, existing only for the delight and glory of its creators, and none the less for its impermanence? In the end all works are immolated. Even that of the Creator shall be deleted. That Book’s glory is no less because of it.


Letter from the Other Shore

“Beyond right and wrong there is a field. I’ll meet you there.”
—Jalal ad-Din Muhammad Rumi

They’ve constructed tent hospitals in Central Park across Fifth Avenue from Mt. Sinai Hospital, and the foreboding is so palpable to me, the sense that what’s coming can’t be prepared for so visceral, that I can barely stand to consider it. New York used to be home, at least for the better part of most weeks when I’d commute in from small-town northeastern Pennsylvania to stay with my now-wife while she was completing a residency in the city.

Every decent person loves New York, and some indecent too, but that it stands as the greatest of American cities is so axiomatic that I care not to even make an argument on behalf of it. Central Park is the great lung of Manhattan; when my wife was at work I’d wander the paths, the Ramble, the meadows where medics now work. There are few places—for many of us—as evocative of what a better world could look like. Think of it: unlike all of those royal pleasure palaces of the old world, Olmsted’s lush urban garden is free and open to all. And now the dying too. All I will say is that I’ve heard from those who still live in the city (for anyone in publishing knows a lot of New Yorkers) that right now the sirens are deafening, that there are refrigerated trucks parked outside the hospitals because of the morgue overflow, and that EMS is working longer and harder hours than they did during 9/11. Speaking of that seminal event that inaugurated adulthood for those of my generation—for that was a disquieting year to be an 18-year-old man—sometime this week our nation will begin to suffer deaths equivalent to the World Trade Center attack every single day until this burning stops.

According to the almost certainly sugarcoated predictions of the man with the unenviable task of being the chief epidemiologist for our current, cankered administration, this pestilence could see 200,000 Americans die in the next few months—more than three times as many Americans as died in Vietnam. If one consults the terrifying Imperial College London report, the reality—if nothing was done and social distancing was ignored—would be closer to 2.2 million women and men. That’s more than twice as many Americans as died in the four years of the Civil War. When the rebels fired on Ft. Sumter and Washington D.C.’s precarious position too many miles south of the Mason-Dixon Line made it an obvious target for Confederate invasion, President Abraham Lincoln ordered the capital to be heavily fortified. And in a few months, it became the most solidly protected city on Earth. Lincoln was not necessarily an optimist, but he was a hopeful man, and that is a difference. One thing that he wasn’t was a denialist; when he refused to abandon Washington, he knew what the score was, capable of seeing from the balcony of the White House a massive Confederate flag flying from an Alexandria hotel across the Potomac, the pestilence already infecting the body politic. Regardless of the city’s fortifications, there were still incursions into the District of Columbia. The Battle of Fort Stevens, late in the war during 1864, occurred when Confederate Lt. Gen. Jubal Early invaded just over the northern border of the city from Maryland. When remembered at all, it’s sometimes configured as just an unsuccessful scouting mission. But there were almost 1,000 casualties. The same number as the average losses we’re about to suffer every day.

Washington D.C. is my home now; spring really is prettier here than it is further north, albeit perhaps less earned after the warm winters. The cherry blossoms bloomed early this year; I gather they’ve been doing that more frequently of late. I haven’t been to the National Mall for a few weeks, even though it’s less than a mile away. We’re new to the city, so I still don’t totally intuit that this is where I live. When boredom compels me to go for a brief drive, the neighborhood looking like nothing so much as the bougie Mid-Atlantic streets of my Pittsburgh upbringing, I’ll occasionally turn down one of those narrow, brick-lined, rectilinear alphabet streets and suddenly see the Capitol dome. The experience always strikes me as strange and dreamlike, as if I’d forgotten where I was for the past few weeks. All that dysfunction, all that callousness, all that refusal to see what we face while giving people the bandage of a one-time $1,200 check, mere blocks from where I’m in quarantine.

Not far from the site of Early’s rebellious perfidy is the National Arboretum, maintained by the Department of Agriculture. Though a poor substitute for Central Park, the space is not without its charms, not least of which is the surreal spectacle of the National Capitol Columns, an arrangement of 22 of the original Corinthian columns from the United States Capitol, looking like nothing so much as some abandoned temple in a field. They’re uncanny, eerie, unsettling—like seeing the debris of a lost civilization that happens to be your own. A few weeks ago, before social isolation became de facto policy, my wife and her brother drove with me through the arboretum to see if we could spot any of the cherry blossoms from the car windows. The bonsai museum and the visitor center were closed, but the paths were packed with people meandering in groups, as if nothing was different here, as if there was no need for fortifications at all, as if they couldn’t hear Jubal Early moving in from the north.

The sirens are not yet deafening here, though I hear them more frequently. More medevacs flying low over Capitol Hill, too. Whatever is coming is coming. It no longer feels like we’re on the Potomac, but waiting to cross the River Styx. I figure it might behoove me to gather some of my thoughts in an epistle here from the opposite bank of that river. Because I fear that none of us are prepared for what’s coming; none of us can truly comprehend the enormity of the changes that will take place, even if some of us had our ears to the ground and could hear those hoofbeats coming months ago. Anyone who isn’t an abject denialist, somebody enraptured by false paeans of positivity or an adherent of the death cult that currently masquerades as this nation’s governing party, can intuit the heat in the atmosphere, all that horror and sadness that’s already happened, that’s waiting to come. Those people dead in New York, around America, around the world. All those stories, all the narratives brought to an end. If you’ve got your empathetic radio tuned into the frequencies that are coming out of every corner of the land, then the songs you’re hearing are in a minor key. Obituaries are starting to fill up with mentions of the virus, and those strange icons of celebrity have it: Prince Charles, Tom Hanks—and, most heartbreaking to me, it has taken the brilliant folkie John Prine. There’s an unreality to the whole thing as the seemingly unassailable rich and famous succumb to the pestilence. I wonder if it will soon seem more real to those blocking up the road in the arboretum.

Never forget that less than a few weeks ago several members of the chattering class of columnists who bolster the delusions, lies, and taunts of the junta were “simply considering” the possibility that it might be worth it to have a few million Americans die—the elderly, those with preexisting conditions, and a bunch of the unlucky of the rest of us—to jumpstart the economy. As if an economy that demanded a blood sacrifice of citizens were an economy worth having. If we remember our villains after some of us have survived, then the pharaohs of the supply-side cult governing from the White House and the Senate should forever be branded with the travesty of their intentions, a cruel pantomime of their self-described “pro-life” positions. Some commentators described them as offering the populace up like infants to the bull-headed Canaanite deity Moloch, who immolates the innocent in the fiery cauldron of his bronze stomach. It’d be an overwrought metaphor if it weren’t precisely what they were doing. Anger is my most reliable emotion; I can convert sadness, depression, anxiety into its familiar and comfortable contours, so for at least a few hours of the day I let myself feel that hatred towards the ghouls a few blocks down Pennsylvania Avenue. Otherwise I’m like the rest of you, with little idea what soundtrack to put on as we rocket towards the singularity. I’ve no clue how one prepares for something like this, what one expects April, or May, or June to look like when facing an abyss that feels like the end of the world every day. Right now, I’m adhering to that old Program mantra of “One Day at a Time” and that seems to work while I’m white-knuckling it through the apocalypse. That means steadfastly following social distancing and getting proficient with disinfection. What one should also do, of course, is believe the science, believe in medicine, listen to the doctors and the epidemiologists who know what they’re talking about (no matter how disturbing) and ignore the pundits, politicians, and talking heads who trade in masturbatory, sociopathic tweets while people die.

I’m under no illusions that what I’m doing right now makes a contribution, for the best thing that all of us can do is to exile ourselves from this world. The woman I love more than my own heartbeat goes off to deal with this on the frontlines every day, so I know that anything I offer is paltry. Good Romantic that I am, I of course believe in the power of words, the transcendence of poetry, the power to reach out and connect to others who are suffering. That’s not just lip service; I do believe it, even while I think that washing your hands can be as immaculate as a poem, staying inside as triumphant as a novel. So, what I want to make clear is that right now I’m writing for myself, and should any of that be useful to some of you then I am grateful. But I’m fundamentally offering a non-essential service, and it does no harm to my ego to admit that. What’s difficult is to know what to turn to when facing something this unexpected, this enormous. Peruse Facebook and Twitter right now and you’ll find a way of talking that’s expected of late-stage capitalism, or post-modernity, or whatever the fuck we’re supposed to call it. Snarky, outraged, absurd at times, perennially aggrieved, concerned with piffling bullshit. I suspect that by summer many of us won’t be talking that way anymore. I think, if we can, we should try and turn to something a bit more permanent, a bit more real, to help us hold our heads above water for a few more minutes even while the water is burning our lungs.

In the coming weeks, the coming months, this whole damned year, there will be death. This will be a season of death. All of us will lose people we know, lose people that we love. The famous will die, and the unknown will too. Both the poor and the rich; the powerful and the powerless. Unless you were witness to atrocities in Syria and Iraq, unless you are a refugee from El Salvador or Honduras, or a survivor of when this government let young men die by the thousands simply because of who they loved, then little will prepare you for such staggering loss, I think. This devouring reminds me of a poem of crystalline beauty by the underread Irish poet Eavan Boland from 2008’s New Collected Poems. In Boland’s appropriately named “Quarantine” she writes of “the worst hour of the worst season/of the worst year of a whole people” during the Great Hunger in 1847, when the potato blight and its attendant famine decimated Ireland. A million women and men dead, a million more forced into exile across the ocean. Victims of potato mold, yes; but more accurately humans killed by the negligent or actively murderous policy of their colonial rulers. Into that abyss, that cacophony of numbers and statistics, she reminds us that all of those millions were human beings, that each death was the conclusion of a unique story, placed into a mass grave and dusted over with soil.

Boland writes of “a man set out from the workhouse with his wife. /He was walking—they were both walking—north.” Across this broken world, this scarred earth, Boland describes how the wife “was sick with famine fever and could not keep up. /He lifted her and put her on his back. /He walked like that west and west and north. /Until at nightfall under freezing stars they arrived.” As a poet, Boland is no fabulist, no nostalgist, no sentimentalist. She does not give in to the charming narcotic of optimism, and abides not by keeping spirits up. Boland is, however, resplendent with grace—in the full religious implications of that word. She writes, “Let no love poem ever come to this threshold. /There is no place here for the inexact/praise of the easy graces and sensuality of the body.” Romanticism is a luxury that Boland can’t countenance for her characters, not in Ireland, not in the black year of ’47. This is a poem about “what they suffered. How they lived/And what there is between a man and a woman. /And in which darkness it can best be proved.” She writes not of happy endings, but of the possibility, the reality of love. In her third stanza, the middle one, Boland writes:
In the morning they were both found dead.
Of cold. Of hunger. Of the toxins of a whole history.
But her feet were held against his breastbone.
The last heat of his flesh was his last gift to her.
What I’m saying to you is that I know not who among us shall live or die, but Christ I pray that we all have the ability to be the breastbone.

I’ve decided to write an obituary for our dying world while I’m still well, while most of you are still well. The world is convulsing. I’ve no idea what it will ultimately look like, nor does anyone else for that matter. Jeff Sharlet and Peter Manseau wrote about the aftermath of 9/11 in Killing the Buddha: A Heretic’s Bible, asking “How many times can the world end? How many times can it begin again? As often as you survive. As often as you tell the story. The apocalypse is always now, but so is the creation.” This seems right to me—the world is ending now. But something else is coming out of it. Possibly it could be a far worse world, the authoritarians and aspiring dictators using pandemic as an excuse to further tighten the noose, the obscenely wealthy retiring to their palaces as inequity grows even starker and the people who bag our groceries are forced into a virtual death sentence as disease runs rampant. Or perhaps our current moment of unlikely solidarity, our new consciousness of what work is, what work requires, will continue unabated; maybe there will be new demands for justice, new victories for equity, for fairness, for fundamental human dignity. In our current touchless epoch it’s impossible to know. All that can be offered is the breastbone, the reminder that you must give to those you love, even as the world ends.

I can list what I do know will be on that other side, what will be there after the world stops ending. Whenever we emerge, whenever we’ve buried our dead, whenever we’ve mourned the losses and tabulated the incalculable grief that we can barely comprehend in this darkest of Lents, I say to you that the following things shall be waiting: A plate of half sours at the Second Avenue Deli. The way Manhattan looks at sunset when first espied from a bus as it turns around the cliffs of Weehawken toward the tunnel. The perfumed scent of a magnolia tree at dawn. Primanti Brothers sandwiches. Calloused hands of strangers grasped together in a church basement as they utter the Serenity Prayer. Roadside rib festivals where flimsy napkins do literally nothing to sop up the mess as you eat. Corny and wonderful beachside art festivals where everything is pastel and painted on driftwood. Baseball (but the Pirates will sadly still suck). Dog parks where the concentrated joy is almost unimaginable. Refreshing summer breeze and spray rolling off the forks of the Ohio River. Scorching hot sand at Singing Beach in Manchester-by-the-Sea, and the attendant mystery-meat hot dog purchased from a bored teenager. Ridiculous small-town music festivals where you can pay bottom dollar to hear classic rock warriors on their epic road downward, and yet they still absolutely shred it. Pittsburgh’s skyline when you first emerge from that tunnel. Ice cream trucks. Cheesesteaks made with the worst meat but with the best of intentions. Cannoli. The Metropolitan Museum of Art’s alabaster-gleaming Roman room. The old men playing chess in Washington Square Park. Holding hands. Falling in love. The cherry blossoms. Central Park. The world on the other side of what’s coming will not look exactly like this one. But there will be a world. I hope that most of us can meet there.

Letter from the Pestilence

One particularly brutal winter, more than half a decade ago now, I used to find myself fantasizing about stripping down to my underwear and t-shirt and calmly walking out into the massive field of snow that blanketed the flat lot across from my apartment complex so that I could quietly freeze to death. These daydreams came on casually, and it was only after a few times of realizing that I had been online looking up “What does it feel like to die from the cold?” that I understood I might be in trouble. For several weeks following an arctic blast, my small Pennsylvania town was covered in snow and ice, which the tax-averse city fathers did little to clear. Though I obviously wasn’t ensconced inside my apartment for the entirety of that time, able to mostly slide down the hill to my job, and more frequently to the bar where I could get blackout drunk and somehow amazingly get back home, the isolation felt both metaphorical and literal. During that time I mostly kept company with box wine, liquor, beer, and a Netflix subscription, and despite my Google searches I thankfully never saw fit to try my experiment during those blackouts. Weather wasn’t the cause of my depression obviously; my father was dying of a terminal illness at the time, I had yet to figure out that I should stop drinking, and there is some betrayal in my brain chemistry. But the chill permanence of the starkly beautiful and isolated landscape was certainly an affirmation of the pathetic fallacy, every bit as trite as if I’d made it up for a book. I eventually came out of the depression—as one does.

If you’ve ever been depressed, then you know that sometimes it feels like you’ve been wearing dark sunglasses on a bright day; a strange film seems to cover everything and muck up the synapses in your brain. There might be drama to some people’s depression, and while there was certainly anxiety and the dull hiss of fear punctuated by moments of panic in mine, for much of it there was a surprisingly low volume. It occupied you all the time, but there was something almost relaxed about it, like the way the moment before you freeze to death is supposed to feel: a gentle letting go of one sense after the other. One of the signs of depression is that you lose interest in things that you love. In a clinical sense, that was true for me; I abandoned a lot of the intellectualization of literature that was my passion (and my paying job as a graduate student), but in a far deeper way it wasn’t accurate at all. Maybe I didn’t want to write interpretations of poetry, teach the novels that I kept on teaching, or talk about drama in graduate seminars, but words were stripped to their most elemental and jagged for me, boiled down and rendered into a broth that I kept on drinking. This isn’t going to be where I set up a false dichotomy between thinking and feeling, between interpretation and experience, nor is it a rejection of the critical discussion of literature. I truly derive pleasure from those things, and it was a blessing when my desire to engage them returned.

But when everything was stripped away from my desire, when I could scarcely feel love, least of all for myself, the words were still there. At the core of the humanities, it turns out, remains the human. Sometimes reading felt like running in place in a swimming pool; sometimes I was so distracted by my malady that the connection between sense and syntax was all but severed. I was lucky enough that I could still do it though—and with a dim awareness that it was because I had no choice if I was going to come out on the other side. During this period, either ironically or appropriately, I read Andrew Solomon’s stunning personal etiology The Noonday Demon: An Atlas of Depression. Much like Leslie Jamison in The Recovering: Intoxication and Its Aftermath, Solomon gives an account of our shared affliction in terms of history and medicine, and offers his own dark nights of the soul. Solomon writes, “To give up the essential conflict between what we feel like doing and what we do, to end the dark moods that reflect that conflict and its difficulties—this is to give up what it is to be human, of what is good in human beings.” The Noonday Demon’s great power is that it doesn’t reduce depression to character building, nor does it simply explain it away, but it does give some scaffolding of meaning to the experience of meaninglessness. Solomon’s prose is exemplary, his empathy is complete, and though I don’t personally know him, reading The Noonday Demon was just enough of a connection—as weak as my transponders were—that a bit of static electricity was able to power me through until I got better.

I only bring this up because currently we’re all in the pest house. What a strange thing, this social isolation, this self-quarantining. Suddenly societal survival depends on all of us anointing ourselves as depressives, staying sequestered in our homes as whatever hell burns through the immune systems of our fellow humans.

When I was depressed, I drew hope from the fact that other people weren’t depressed; it felt like if my world was unravelling there was at least a world. The surrealism of our current moment is that none of us have that same luxury anymore. Ironically there’s something democratic in our common situation, the way in which we’re all feeling the same fear, the same uncertainty, the same panic, worry, anger, and anxiety. A solidarity, finally. If there’s anything different between our current situation and personal depression—the sense of doom, the preoccupation with a malignant force, and the inability to fully immerse yourself in that which gives joy make this feel like a type of cultural depression—it’s that there’s also a weird joviality out there. The often funny social media gallows humor (I’m partial to a picture of the advertising mascot Mr. Clean with the caption “He left us when we needed him most.”) and the odd confessions with strangers, like the Trader Joe’s checkout guy who told me he’d miss karaoke most of all.

The depressed, ironically, might have an immunological consciousness more prepared for the quarantine that is now necessary. No longer kept in my apartment by diminished serotonin levels and several feet of snow, now it is the coronavirus that keeps me home. “Depression at its worst is the most horrifying loneliness,” Solomon writes, “and from it I learned the value of intimacy.” Our school has been those feelings of nothing that have trained us in the art of that most human of things, our need for connection, precisely at the moment when it’s necessary to sever those ties. “You cannot draw a depressed person out of their misery,” Solomon correctly notes, but “You can, sometimes, manage to join someone in the place where he resides.” We live in an ugly era—mean, intemperate, cruel, cynical, narcissistic. Everyone says that of their age, but doesn’t it feel a bit more true of our own? Now, as if the Earth has a breaking fever, it seems as if the very planet itself is shaking us off. We’re all in that dark place now; some of us will get sick, as well. Many of us will. We will all require a kindness in that.

Just as Solomon was once something I was able to hold onto—however slightly—that returned me to life, there must be an engagement with each other, with that which we’ve created, with that which exists to make connection, with that which joins us in the places where we reside. Creation can’t be a luxury, nor is it just entertainment, or a way of passing time. Recently, I saw a video of an empty street in Florence, where women and men are quarantined in the city where Boccaccio was once sequestered from the plague, the same pestilence that took Petrarch’s beloved Laura de Noves. From an open window, a strong baritone voice from an unseen man starts singing in an Italian I can’t understand, then a woman joins in somewhere down the alley, then another man, then another. Even the feral dogs in the street are barking joyfully by the end. All of them were isolated, but none were alone. Creation must be a kindness.

Image Credit: Needpix.com.

On Pandemic and Literature

Less than a century after the Black Death descended into Europe and killed 75 million people—as much as 60 percent of the population (90 percent in some places) dead in the five years after 1347—an anonymous Alsatian engraver with the fantastic appellation of “Master of the Playing Cards” saw fit to depict St. Sebastian, the patron saint of plague victims. Making his name, literally, from the series of playing cards he produced at the moment when the pastime first became popular in Germany, the engraver decorated his suits with bears and wolves, lions and birds, flowers and woodwoses. The Master of the Playing Cards’s largest engraving, however, was the aforementioned depiction of the unfortunate third-century martyr who suffered by order of the Emperor Diocletian. A violent image, but even several generations after the worst of the Black Death, Sebastian still resonated with a populace who remembered that “To many Europeans, the pestilence seemed to be the punishment of a wrathful Creator,” as John Kelly notes in The Great Mortality: An Intimate History of the Black Death, the Most Devastating Plague of All Time.

The cult of Sebastian had grown in the years between the Black Death and the engraving, and during that interim the ancient martyr had become associated with plague victims. His suffering reminded people of their own lot: the sense that more hardship was inevitable, that purpled buboes looked like the wounds left by arrows pulled from Sebastian’s flesh after his attempted execution, and, most of all, that which patch of bruised skin would be arrow-pierced seemed as random as who should die from plague. Produced around 1440, when any direct memory of the greatest bubonic plague had long since passed (even as smaller outbreaks recurred for centuries), the Master of the Playing Cards presents a serene Sebastian, tied to a short tree while four archers pierce him with arrows. Unlike more popular depictions of the saint, such as Andrea Mantegna’s painting made only four decades later, or the explicitly lithe and beautiful Sebastians that El Greco and Peter Paul Rubens painted in the 16th and 17th centuries, respectively, the engraver gives us a calm, almost bemused, martyr. He has an accepting smile on his face. Two arrows protrude from his puckered flesh. More are clearly coming. Sebastian didn’t just become associated with the plague as a means of saintly intercession, but also because in his narrative there was the possibility of metaphor to make sense of the senseless. Medical historian Roy Porter writes in Flesh in the Age of Reason: The Modern Foundations of Body and Soul that the “Black Death of the mid-fourteenth century and subsequent outbreaks…had, of course, cast a long, dark shadow, and their aftermath was the culture of the Dance of Death, the worm-corrupted cadaver, the skull and crossbones and the charnel house.” All of this accoutrement, which endures even today in the cackling skulls of Halloween and the pirates’ flag, serves, if not to make pandemic comprehensible, then at least to tame it a bit. Faced with calamity, people told stories and made images intended to do exactly that. Religion supplied the largest storehouse of ready-made narratives, even while the death toll increasingly made traditional belief untenable. John Hatcher writes in The Black Death: A Personal History that many lost “faith in their religion and…[abandoned] themselves to fate,” where fatality is as unpredictable as where an arrow will land.

A different, though not unrelated, narrative was depicted 40 years later. Made by the Swedish painter Albertus Pictor, and applied to the white walls of the rustic Täby Church north of Stockholm, the mural presents what appears to be a wealthy merchant playing a (losing) game of chess against Death. Skeletal and grinning, Death appears with the same bony, twisted smile that lies beneath the mask of every human face, the embodiment and reminder of everyone’s ultimate destination. Famously the inspiration for director Ingmar Bergman’s 1957 film The Seventh Seal, Pictor’s mural is a haunting memento mori, a very human evocation of the desperate flailing against the inevitable. Both pictures tell stories about the plague, about the lengths we’ll go to in order to survive. They convey how, in pandemic, predictability disappears; they are narratives about the failure of narratives themselves. What both of them court are Brother Fate and his twin Sister Despair. The wages of fortune are a matter of which cards you’re dealt, of the tension between strategy and luck as you try to keep your bishop or rook from being taken. Life may be a game, but none of us are master players, and sometimes we’re dealt a very bad hand.

There has always been literature of pandemic because there have always been pandemics. What marks the literature of plague, pestilence, and pandemic is a commitment to forge, if not some sense of explanation, then at least a sense of meaning out of the raw experience of panic, horror, and despair. Narrative is an attempt to stave off meaninglessness, and in the void of the pandemic, literature serves the purpose of trying, however desperately, to stop the bleeding. It makes sense that the most famous literary work to come out of the plague is Giovanni Boccaccio’s 1353 The Decameron, with its frame conceit of 100 bawdy, hilarious, and erotic stories told by seven women and three men over 10 days while they’re quarantined in a Tuscan villa outside Florence. As pandemic rages through northern Italy, Boccaccio’s characters distract themselves with funny, dirty stories, but the anxious conviction of those young women and men self-exiled within cloistered walls is that “Every person born into this world has a natural right to sustain, preserve and defend” their own life, so that storytelling becomes its own palliative, drowning out the howling of those dying on the other side of the ivy-covered stone walls.

Pandemic literature exists not just to analyze the reasons for the pestilence—that may not even be its primary purpose. Rather, the telling of stories is a reminder that sense still exists somewhere, that if there is not meaning outside of the quarantine zone, there’s at least meaning within our invented stories. Literature is a reclamation against that which illness represents—that the world is not our own. As the narrator of Albert Camus’s The Plague says as disease ravages the town of Oran in French Algeria, there is an “element of abstraction and unreality in misfortune. But when an abstraction starts to kill you, you have to get to work on it.” When confronted with the erraticism of etiology, the arbitrariness of infection, the randomness of illness, we must contend with the reality that we are not masters of this world. We have seemingly become such lords of nature that we’ve altered the very climate and geologists have named our epoch after humanity itself, and yet a cold virus can have more power than an army. Disease is not metaphor, symbol, or allegory; it is simply something that kills you without consideration. Story is a way of trying to impart a bit of the consideration that nature ignores.

The necessity of literature in the aftermath of pandemic is movingly illustrated in Emily St. John Mandel’s novel Station Eleven. Mostly taking place several years after the “Georgia Flu” has killed the vast majority of humans on the planet and civilization has collapsed, Mandel’s novel follows a troupe of Shakespearean actors as they travel by caravan across a scarred Great Lakes region on either side of the U.S.-Canadian border. “We bemoaned the impersonality of the modern world,” Mandel writes, “but that was a lie.” Station Eleven is, in some sense, a love letter to a lost world, which is to say the world (currently) of the reader. Our existence “had never been impersonal at all,” she writes, and the novel gives moving litanies of all that was lost in the narrative’s apocalypse, from chlorinated swimming pools to the mindlessness of the Internet. There is a tender love of every aspect of our stupid world, so that the crisis can only be explained by the fact that we were so interconnected: “There had always been a massive delicate infrastructure of people, all of them working unnoticed around us, and when people stop going to work, the entire operation grinds to a halt.” As survivors struggle to rebuild, it’s the job of narrative to supply meaning to that which disease has taken away, or as the motto painted on the wagon of the traveling caravan has it: “Survival is insufficient.”

The need to tell stories, to use narrative to prove some continuity with a past obliterated by pandemic, is the motivating impulse of English professor James Smith, the main character in Jack London’s largely forgotten 1912 post-apocalyptic novel, The Scarlet Plague. With shades of Edgar Allan Poe, London imagines a 2013 outbreak of hemorrhagic fever called the “Red Death.” Infectious, fast-moving, and fatal, the plague wipes out the vast majority of the world’s population, so that some six decades after the pestilence first appears, Smith can scarcely believe that his memories of a once sophisticated civilization aren’t illusions. Still, the former teacher is compelled to tell his grandchildren about the world before the Red Death, even if he sometimes imagines that they are lies. “The fleeting systems lapse like foam,” writes London. “That’s it—foam, and fleeting. All man’s toil upon the planet was just so much foam.”

The Scarlet Plague ends in a distant 2073, the same year in which Mary Shelley’s 1826 forerunner of the pandemic novel, The Last Man, is set. Far less famous than Shelley’s Frankenstein, her largely forgotten novel is arguably just as groundbreaking. As with Station Eleven, narrative and textuality are the central concerns of the novel; when the last man himself notes that “I have selected a few books; the principal are Homer and Shakespeare—But the libraries of the world are thrown open to me,” there is the sense that even in the finality of his position there is a way in which words can still define our reality, anemic though it may now be. Displaying the uneasiness about the idea of fictionality that often marked 19th-century novels, Shelley frames what you’re reading as transcriptions of parchments containing ancient oracular predictions, which the author herself discovered while exploring caves outside of Naples that had once housed the Cumaean Sibyl.

The novel is a roman à clef: its main character, an aristocrat named Lionel Verney, is a masculinized version of Shelley herself, living through the emergence of global pandemic in 2073 up through the beginning of the 22nd century, when he earns the titular status of The Last Man. All of Shelley’s characters are stand-ins for her friends, the luminaries of the rapidly waning Romantic age, from Lord Byron, who is transformed into Lord Raymond, a passionate if incompetent leader of England who bungles that nation’s response to the pandemic, to her own husband, Percy, who becomes Adrian, the son of the previous king, who has chosen instead to embrace republicanism. By the time Verney begins his solitary pilgrimage across a desolated world, with only the ghosts of Homer and Shakespeare and an Alpine sheepdog whom he adopts for company, he still speaks in a first person addressed to an audience of nobody. “Thus around the shores of deserted earth, while the sun is high, and the moon waxes or wanes, angels, the spirits of the dead, and the ever-open eye of the Supreme, will behold…the LAST MAN.” In a world devoid of people, Verney becomes the book and the inert world becomes the reader.

The Last Man’s first-person narration, ostensibly directed to a world absent of people who could actually read it, points to a deeper reason for the existence of language than mere communication—to construct a world upon the ruins, to bear a type of witness, even if it’s solitary. Language need not be for others; that it’s for ourselves is often good enough. Literature thus becomes affirmation; more than that, it becomes rebellion, a means of saying within pandemic that we once existed, and that microbe and spirochete can’t abolish our voices, even if our bodies should wither. That’s one of the most important insights of Tony Kushner’s magisterial play Angels in America: A Gay Fantasia on National Themes. Arguably the most canonical text to emerge from the horror of the AIDS crisis, Kushner’s epic appears in two parts, “Millennium Approaches” and “Perestroika,” and it weaves together two narrative threads: the story of wealthy WASP scion Prior Walter’s HIV diagnosis and his subsequent abandonment by his scared lover, Louis Ironson, and the arrival in New York City of the closeted Mormon Republican Joe Pitt, who works as a law clerk and kindles an affair with Louis.

Angels in America combines subjects as varied as Jewish immigration in the early 20th century, Kabbalistic and Mormon cosmology (along with a baroque system of invented angels), the reprehensible record of the closeted, red-baiting attorney and Joseph McCarthy acolyte Roy Cohn, and the endurance of the gay community struggling against the AIDS epidemic and its activism opposing the quasi-genocidal non-policy of conservative politicians like Ronald Reagan. If all that sounds heady, Kushner’s play came from the eminently pragmatic issue of how a community survives a plague. Born from the pathbreaking work of activist groups like ACT UP, Angels in America has, for all its mythological concerns, an understanding that pandemics and politics are inextricably connected. In answering who deserves treatment and how such treatment will be allocated, we’ve already departed from the realm of disinterested nature. “There are no gods here, no ghosts and spirits in America, no spiritual past,” says Louis, “there’s only the political, and the decoys and the ploys to maneuver around the inescapable battle of politics.” Throughout Angels in America there is an expression of the human tragedy of pandemic, the way that beautiful young people in the prime of life can be murdered by their own bodies. Even Cohn, that despicable quasi-fascist who evidences so little of the human himself, is entitled to some tenderness when, upon his death, kaddish is recited for him—by the spirit of Ethel Rosenberg, the supposed Soviet spy in whose execution the lawyer was instrumental.

At the end of the play, Prior stands at Bethesda Fountain in Central Park, with all the attendant religious implications of that place’s name, and intones that “This disease will be the end of many of us, but not nearly all, and the dead will be commemorated and will struggle on with the living, and we are not going away. We won’t die secret deaths anymore… We will be citizens. The time has come.” In telling stories, there is not just a means of constructing meaning, or even endurance, but indeed of survival.

Fiction is not the only means of expressing this, of course, or even necessarily the most appropriate. Journalist Randy Shilts accomplished something similar to Kushner in his classic account And the Band Played On: Politics, People, and the AIDS Epidemic, which soberly, clinically, and objectively chronicled the initial outbreaks of the disease among San Francisco’s gay community. In a manner not dissimilar to Daniel Defoe in his classic A Journal of the Plague Year (even though that book is fictionalized), Shilts gives an epidemiological account of the numbers, letting the horror speak through science more effectively than if it had been rendered in poetry. Such staidness is its own requirement and can speak powerfully to the reality of the event, whereby “the unalterable tragedy at the heart of the AIDS epidemic…[was that] By the time America paid attention to the disease, it was too late to do anything about it,” the shame of a nation in which Reagan’s press secretary Larry Speakes would actually publicly laugh at the idea of a “gay plague.” Shilts waited until he finished And the Band Played On to be tested for HIV himself, worried that a positive diagnosis would alter his journalistic objectivity. He would die of AIDS-related complications in 1994, having borne witness to the initial years of the epidemic, abjuring the cruel inaction of government policy with the disinfectant of pure facts.

Most people who read about pandemics, however, turn to pulpier books: paperback airport novels like Michael Crichton’s clinical, fictionalized report about an extraterrestrial pathogen, The Andromeda Strain; Robin Cook’s nightmare fuel about a California Ebola outbreak, Outbreak; and Stephen King’s magisterial post-apocalyptic epic The Stand, which I read in the summer of 1994 and which remains the longest sustained narrative I think I’ve ever engaged with. Because these books are printed on cheap paper and have the sorts of garish covers intended more for mass consumption than prestige, they’re dismissed as prurient or exploitative. Such are the tired distinctions between genre and literary fiction: though these books are paced for suspense, entertainment is as integral to them as aesthetics, and they have just as much to say about the fear and experience of illness as any number of more explicitly “serious” works.

The Stand is an exemplary demonstration of just what genre fiction is capable of, especially when it comes to the elemental fears surrounding plague that seem to have been somehow encoded within our cultural DNA for more than seven centuries. Written as an American counterpart to J.R.R. Tolkien’s The Lord of the Rings trilogy, The Stand depicts a United States that completely unravels over the course of one summer after a government “Super-Flu” bioweapon nicknamed “Captain Trips” escapes containment. In that aftermath, King presents a genuinely apocalyptic struggle between good and evil that’s worthy of Revelation, but intrinsic to this tale of pestilence is the initial worry that accompanies a scratchy throat, watery eyes, a sniffling nose, and a cough that seemingly won’t go away. If anything, King’s vision is resolutely in keeping with the medieval tradition of fortuna so expertly represented by the Master of the Playing Cards or Pictor, a wisdom that when it comes to disease, “Life was such a wheel that no man could stand upon it for long. And it always, at the end, came round to the same place again,” as King writes.

Far from being exploitative, from only offering readers the exquisite pleasure of vicariously imagining all of society going to complete shit, much genre fiction has a radical empathy at its core. Readers of Robert Kirkman and Tony Moore’s graphic novels The Walking Dead (or the attendant television series) and viewers of George Romero’s brilliant zombie classics may assume that they’ll always be the ones to survive Armageddon, but those works can force us into a consideration of the profound contingency of our own lives. Cynics might say that the enjoyment derived from zombie narratives is that they provide a means of imagining that most potent of American fantasies—the ability to shoot your neighbor with no repercussions. More than that, however, I think they tell us something about the feebleness of our civilization.

This is what critic Susan Sontag notes in Illness as Metaphor: pandemic supplies “evidence of a world in which nothing important is regional, local, limited; in which everything that can circulate does, and every problem is, or is destined to become, worldwide,” so that products and viruses alike move freely in a globalized world. The latter can then disrupt the former, with plague proving the precariousness of the supply lines that keep food on grocery store shelves and electricity in the socket, the shockingly narrow band separating hot breakfast and cold beer from the nastiness, brutishness, and shortness of life anarchic. Such is the grim knowledge of Max Brooks’s World War Z, where “They teach you how to resist the enemy, how to protect your mind and spirit. They don’t teach you how to resist your own people.” If medieval art and literature embraced the idea of fate, whereby it’s impossible to know who shall be first and who shall be last once the plague rats have entered port, then contemporary genre fiction has a similar democratic vision, a knowledge that wealth, power, and prestige can mean little after you’ve been coughed on. When the Black Death came to Europe, no class was spared; it took the sculptor Andrea Pisano and the banker Giovanni Villani, the painter Ambrogio Lorenzetti and the poet Ieuan Gethin, the mystic Richard Rolle and the philosopher William of Ockham, and the father, mother, and friends of Boccaccio. Plague upended society more than any revolution could, and there was a strange egalitarianism to the paupers’ body-pit covered in lye.

Sontag, again, writes that “Illness is the night-side of life, a more onerous citizenship. Everyone who is born holds dual citizenship, in the kingdom of the well and in the kingdom of the sick. Although we all prefer to use only the good passport, sooner or later each of us is obliged, at least for a spell, to identify ourselves as citizens of that other place.” Such equality motivated the greatest of medieval artistic themes to emerge from the Black Death, that of the Danse Macabre or “Dance of Death.” In such imagery, painters and engravers would depict paupers and princes, popes and peasants, all linking hands with grinning brown skeletons, hair clinging to mottled pates and cadaverous flesh hanging from bones, dancing in a circle across a bucolic countryside. In the anonymous Totentanz of 1460, the narrator writes, “Emperor, your sword won’t help you out/Scepter and crown are worthless here/I’ve taken you by the hand/For you must come to my dance.” During the Black Death, the fearful and the deniers alike explained the disease as the result of a confluence of astrological phenomena or noxious miasmas; they claimed it was punishment for sin, or they blamed religious and ethnic minorities within their midst. To some, the plague was better understood as a “hoax” than as reality. The smiling skulls of the Danse Macabre laugh at that sort of cowardly narcissism, for they know that pestilence is a feature of our reality, and reality has a way of collecting its debts.

Illness sees no social stratification—it comes for bishop and authoritarian theocrat, king and germaphobic president alike. The final theme of the literature of pandemic, born from the awareness that this world is not ours alone, is that we can’t avert our eyes from the truth, no matter how cankered and ugly it may be. Something can be both true and senseless. The presence of disease is evidence of that. When I was little, my grandma told me stories about when she was a girl during the 1918 Spanish influenza pandemic, which took as many as 75 million people. She described how, in front of the courthouse of her small Pennsylvania town, wagons arrived carting coffins for those who perished. Such memories are recounted to create meaning, to bear witness, to make sense, to warn, to exclaim that we were here, that we’re still here. Narrative can preserve and remake the world as it falls apart. Such is the point of telling any story. Illness reminds us that the world isn’t ours; literature lets us know that it is—sometimes. Now—take stock. Be safe. Most of all, take care of each other. And wash your hands.

Image Credit: Wikimedia Commons.