On Obscenity and Literature

“But implicit in the history of the First Amendment is the rejection of obscenity as utterly without redeeming social importance.” —Associate Justice William J. Brennan Jr., Roth v. United States (1957)

Interviewer: Speaking of blue, you’ve been accused of vulgarity. Mel Brooks: Bullshit! —Playboy (February 1975)

On a spring evening in 1964 at the Café Au Go Go in Greenwich Village, several undercover officers from the NYPD’s vice squad arrested Lenny Bruce for public obscenity. Both Bruce and the club’s owner Howard Solomon were shouldered out through the crowded club to waiting squad cars, their red and blue lights reflected off the dirty puddles pooled on the pavement of Bleecker Street. For six months the two men would stand trial, with Bruce’s defense attorney calling on luminaries from James Baldwin to Allen Ginsberg, Norman Mailer to Bob Dylan, to attest to the stand-up’s right to say whatever he wanted in front of a paying audience. “He was a man with an unsettling sense of humor,” write Ronald K.L. Collins and David M. Skover in The Trials of Lenny Bruce: The Fall and Rise of an American Icon. “Uncompromising, uncanny, unforgettable, and unapologetic…His words crossed the law and those in it. He became intolerable to people too powerful to ignore. When it was over, not even the First Amendment saved him.” The three-judge tribunal sentenced Bruce to four months in a workhouse. Released on bail, he never served a day of his sentence, overdosing on morphine in his Hollywood Hills bungalow two years later. He wouldn’t receive a posthumous pardon until 2003.

“Perhaps at this point I ought to say a little something about my vocabulary,” Bruce wrote in his (still very funny) How to Talk Dirty and Influence People: An Autobiography. “My conversation, spoken and written, is usually flavored with the jargon of the hipster, the argot of the underworld, and Yiddish.” Alongside jazz, Jewish-American comedy is one of the few uniquely American contributions to world culture, and if that comparison can be drawn further, then Bruce was the equivalent of Dizzy Gillespie or Miles Davis—he was the one who broke it wide open. Moving comedy away from the realm of the Borscht Belt one-liner, Bruce exemplified the emerging paradigm of stand-up as a spoken-word riff of personal reflection and social commentary, while often being incredibly obscene. The Catskills comedian Henny Youngman may have been inescapably Jewish, but Bruce was unabashedly so. And, as he makes clear, his diction proudly drew from the margins, hearing more truth in the dialect of the ethnic Other than in mainstream politeness, more honesty in the junky’s language than in the platitudes of the square, more righteous confrontation in the bohemian’s obscenity than in the pieties of the status quo. Among the comics of that golden age of stand-up, only Richard Pryor was his equal in bravery and genius, and though some of his humor is dated today, books like How to Talk Dirty and Influence People still radiate an excitement that a mere burlesque performer could challenge the hypocrisy and puritanism of a state that would just as soon see James Joyce’s Ulysses and D.H. Lawrence’s Lady Chatterley’s Lover banned and their publishers hauled to jail as actually confront any of the social ills that infected the body politic.

What separates Bruce from any number of subsequent comics is that within his performances there was a fully articulated theory of language. “Take away the right to say the word ‘fuck’ and you take away the right to say ‘fuck the government,’” he is reported to have said, and this is clearly and crucially true. That’s one model of obscenity’s utility: its power to lower the high and to raise the low, with vulgarity afforded an almost apocalyptic power of resistance. A naivety runs through the comedian’s work, however: Bruce sometimes doesn’t afford language enough power. In one incendiary performance from the early ’60s, Bruce went through a litany of ethnic slurs for Black people, Jews, Italians, Hispanics, Poles, and the Irish, finally arguing that “it’s the suppression of the word that gives it the power, the violence, the viciousness.” He imagines a scenario whereby the president would introduce members of his cabinet by using those particular words, and concludes that following such a moment those slurs wouldn’t “mean anything anymore, then you could never make some six-year-old black kid cry because somebody called him” that word at school. Bruce’s idealism is almost touching—let it not be doubted that he genuinely believed language could work in this way—but it’s also empirically false. Having died a half-century ago, he can’t be faulted for his ignorance on this score, but now that we have a president who does more or less what Bruce imagined his hypothetical Commander-in-Chief doing, we can emphatically state that the repetition of such ugliness does nothing to dispel its power.

Discussions about obscenity often devolve into this bad-faith dichotomy—the prudish schoolmarms with their red pens painting over anything blue and the brave defenders of free speech pushing the boundaries of acceptable discourse. The former hold that there is a certain power to words that must be tamed, while the latter champion the individual right to say what they want to say. When the issue is phrased in such a stark manner, it occludes a more discomforting reality—maybe words are never simply utterances, maybe words can be dangerous, maybe words can enact evil things, and maybe every person has an ultimate freedom to use those words as they see fit (notably a different claim than that people should be able to use them without repercussion). Bruce’s theory of language is respectably semiotic, a contention about the arbitrary relationship between signifier and signified, whereby that chain of connection can be severed by simple repetition, as when sense flees from a word said over and over again, whether it’s “potato” or “xylophone.” But he was ultimately wrong (as is all of structural and post-structural linguistics)—language is never exactly arbitrary, it’s not really semiotic. We need theurgy to explain how words work, because in an ineffable and numinous way, words are magic. When it comes to obscenity in particular, whether the sexual or the scatological, the racial or the blasphemous, we’re considering a very specific form of that magic, and while Bruce is correct that a prohibition on slurs would render resistance to oppression all the more difficult, he’s disingenuous in not also admitting that obscenity can provide a means of cruelty in its own right. If you couldn’t say obscenities then a certain prominent tweeter of almost inconceivable power and authority couldn’t deploy them almost hourly against whatever target he sees fit. This is not an argument for censorship, mind you, but it is a plea to be honest in our accounting.

Obscenity as social resistance doesn’t have the same cachet it once did, nor is it always interpreted as unassailably progressive (as it was for Bruce and his supporters). In our current season of a supposed Jacobin “cancel culture,” words have been ironically re-enchanted with the spark of danger that was once associated with them. Whether or not those who claim that there is some sort of left McCarthyism policing language are correct, it’s relatively anodyne to acknowledge that right now words are endowed with a significance not seen since Bruce appeared in a Manhattan courtroom. Whatever your own stance on the role that offensiveness plays in civilized society, obscenity can only be theorized through multiple perspectives. Four-letter words inhabit a nexus of society, culture, faith, linguistics, and morality (and the law). A “fuck” is never just a “fuck,” and a shit by any other name wouldn’t smell as pungent. Grammatically, obscenities are often classified as “intensifiers,” that is, placeholders that exist to emphasize the emotionality of a given declaration—think of them as oral exclamation marks. Writing in Holy Sh*t: A Brief History of Swearing, Melissa Mohr explains that vulgarity is frequently “important for the connotation it carries and not for its literal meaning.” Such a distinction came into play in 2003 after the Irish singer Bono of U2 was cited by the Federal Communications Commission for exclaiming “fucking brilliant” upon winning a Golden Globe. The commission’s Enforcement Bureau initially decided that Bono’s f-bomb wasn’t indecent since its use clearly wasn’t in keeping with the sexual definition of the word, a verdict that was later rescinded higher up within the FCC.

“Historically,” Mohr writes, “swearwords have been thought to possess a deeper, more intimate connection to the things they represent than do other words,” and in that regard the pencil-necked nerds at the FCC ironically showed more respect to the dangerous power of fucking than did Bono. If vigor of emotion were all one was looking for in language, any number of milquetoast words would work as well as a vulgarity, and yet obscenity (even if uttered due to a stubbed toe) is clearly doing something a bit more transcendent than more PG terms—for both good and bad. Swearing can’t help but have an incantatory aspect to it; we swear oaths, and we’re specifically forbidden by the Decalogue from taking the Lord’s name in vain. Magnus Ljung includes religious themes in his typology of profanity, offered in Swearing: A Cross-Cultural Linguistic Study, as one of “five major themes that recur in the swearing of the majority of the languages discussed and which are in all likelihood also used in most other languages featuring swearing.” Alongside religious profanity, Ljung recognizes themes according to scatology, sex organs, sexual activities, and family insults. To this, inevitably, must also be added ethnic slurs. Profanity is by definition profane, dealing with the bloody, pussy, jizzy reality of what it means to be alive (and thus the lowering of the sacred into that oozy realm is part of what blasphemously shocks). Obscenity has a quality of the theological about it, even while religious profanities have declined in their ability to shock an increasingly secular society.

Today a word like “bloody” sounds archaic or Anglophilic, and almost wholly inoffensive, even while its (now forgotten) reference to Christ’s wounds would have been scandalous to an audience reared on the King James Bible. This was the problem that confronted the television writer David Milch, who created the classic HBO western Deadwood. The resultant drama (with dialogue largely composed in iambic pentameter) was noted as having the most per capita profanity of any show to ever air, but in 1870s Dakota most of those swears would have been religious in nature. Since having Al Swearengen (a perfect name if ever there was one) sound like Yosemite Sam would have dulled the shock of his speech, Milch elected to recast his characters’ language as scatological profanity and ethnic slurs, the latter of which still have the ability to upset an audience in a way that “by Christ’s wounds!” simply doesn’t. When Swearengen offers up his own theory of language to A.W. Merrick, who edits Deadwood’s newspaper, arguing that “Just as you owning a print press proves only an interest in the truth, meaning up to a fucking point, slightly more than us others maybe, but short of a fucking anointing or the shouldering of a sacred burden—unless of course the print press was gift of an angel,” he provides a nice synthesis of the blasphemous and the sexual. The majority of the copious swears in Deadwood are of the scatological, sexual, or racial sort, and they hit the eardrum with far more force than denying the divinity of Christ does. When Milch updated the profanity of the 19th century, he knew what would disturb contemporary audiences, and it wasn’t tin-pot sacrilege.

All of which is to say that while obscenity has a social context, with what’s offensive being beholden to the mores of a particular century, the form itself universally involves the transgression of propriety, with the details merely altered to the conventions of a time and place. As an example, watch the 2005 documentary The Aristocrats, produced by the magician Penn Jillette, which features dozens of tellings of the almost unspeakably taboo joke of the same name. The joke was long an after-hours tradition in which comedians would try to one-up each other in the degree of profanity offered, and Jillette’s film presents several iconic performers giving variations on the sketch. When I saw the film after it came out, the audience was largely primed for the oftentimes extreme sexual and scatological permutations of the joke, but it was the tellings that involved racial slurs and ethnic stereotypes that stunned the other theatergoers. It’s the pushing of boundaries in and of itself, rather than the subject in question, that designates something as an obscenity. According to Sigmund Freud in his (weirdly funny) The Joke and Its Relation to the Unconscious, vulgar humor serves a potent psychological purpose, allowing people “to enjoy undisguised obscenity” that is normally repressed so as to keep “whole complexes of impulses, together with their derivatives, away from consciousness.” Obscenity thus acts as a civilizational pressure valve for humanity’s chthonic impulses.

That words which are considered obscene are often found in the vocabulary of the marginalized isn’t incidental, and it recommends spicy language as a site of resistance. English swearing draws directly from one such point of contact between our “higher” and our “lower” language. The majority of English swears have a Germanic origin, as opposed to a more genteel Romance origin (whether from French or Latin). In keeping with their Teutonic genesis, they tend to have an abrasive, guttural, jagged quality to their sounds, the better to convey an onomatopoeic quality. Take a look at the list that comprises comedian George Carlin’s 1972 bit “Seven Words You Can Never Say on Television.” Four of them definitely have an Old English etymology, traceable back to the West Germanic dialect of the Angles, Saxons, Frisians, and Jutes who occupied Britain in the later centuries of the first millennium. Three of them – the one that rudely refers to female genitalia, the one that tells you to rudely do something sexual, and the one that tells you to do that thing to your mother – may have Latin or Norman origins, though linguists think they’re just as likely to come from what medievalists used to call “Anglo-Saxon.” Most of these words had no obscene connotations in their original context; in Old English the word for urine is simply “piss,” and the word for feces is “shit.” Nothing was dirty about either word until the eleventh-century Norman invasion of Britain privileged the French over the English. That stratification, however, gives a certain gutter enchantment to those old prosaic terms, endowing them with the force of a swear. Geoffrey Hughes writes in Swearing: A Social History of Foul Language, Oaths, and Profanities in English that the “Anglo-Saxon element… provides much more emotional force than does the Norman French or the Latin. Copulating pandemonium! conveys none of the emotional charge of the native equivalent fucking hell!” Invasion, oppression, and brutality mark those words which we consider to be profane, but they also give them their filthy enchantments.

What’s clear is that the class connotations of what Bruce called an “argot” can’t be ignored. Swearing is the purview of criminals and travelers, pirates and rebels, highwaymen and drunks. For those lexicographers who assembled lists of English words in the early modern era, swearing, or “canting,” provided an invaluable window into the counter-cultural consciousness. The Irish playwright Richard Head compiled The Canting Academy, or Devil’s Cabinet Opened in 1673, arguably the first full-length English “dictionary,” appearing decades before Dr. Johnson’s staider 1755 A Dictionary of the English Language. Decades before Head’s book, short pamphlets by respectable playwrights from Thomas Dekker to Thomas Middleton similarly illuminated readers on the criminal element’s language—other examples that were included as appendices within books, such as Thomas Harman’s A Caveat or Warning for Common Cursitors, go back well into the 16th century. Such “canting guides,” exploring the seamy underbelly of the cockney capital, were prurient pamphlets that illustrated the salty diction of thieves and rogues for the entertainment of the respectable classes. One of the most popular examples was the anonymously edited A New Dictionary of the Terms Ancient and Modern of the Canting Crew, first printed in 1698. Within, readers could learn the definitions of insults from “blobber-lipped” to “jobber-not.” Such dictionaries (which included words like “swindler” and “phony,” both of which survive today) drew from the English underclass, with a motley vocabulary made up of words from rough-hewn English, Romani, and ultimately Yiddish, among other origins.

A direct line runs from the vibrant, colorful, and earthy diction of canting to cockney rhyming slang, or to the endangered dialect of Polari used for decades by gay men in Great Britain, who lived under the constant threat of state punishment. All of these tongues are “obscene,” but that’s a function of their oppositional status to received language. Nothing is “dirty” about them; they are, rather, rebellions against “proper” speech, “dignified” language, “correct” talking, and they challenge that codified violence implied by the mere existence of the King’s English. Their differing purposes, and respective class connotations and authenticity, are illustrated by a joke wherein a hobo asks a nattily dressed businessman for some change. “’Neither a borrower nor a lender be’—that’s William Shakespeare,” says the businessman. “’Fuck you’—that’s David Mamet,” responds the panhandler. A bit of a disservice to the Bard, however, who along with Dekker and Middleton could cant with the best of them. For example, within the folio one will find “bawling, blasphemous, incharitable dog,” “paper fac’d villain,” and “embossed carbuncle,” among other similarly colorful examples.

An entire history could be written about early instances of noted swears, which of course necessitates trawling the Oxford English Dictionary for examples of dirty words that appear particularly early. For “shit,” there is a 1585 instance of the word in a Scottish “flyting,” an extemporaneous poetic rhyme-battle held in Middle Scots, which took place between Patrick Hume and Alexander Montgomerie. The greatest example of the form is the 15th-century Flyting of Dunbar and Kennedy, containing the first printed instance of the word “fuck.” In the OED, our good friend the dirty lexicographer Richard Head has the earliest example given in the entry for the word “fuck,” the profanity appearing as a noun in his play Hic et Ubique: or, The Humors of Dublin, wherein a character says “I did creep in…and there I did see [him] putting the great fuck upon my wife.” And the dictionary reflects the etymological ambiguity concerning the faux-francophone/faux-Virgilian word “dildo,” giving earliest attribution to the playwright Robert Greene, who in 1590 in his comedy Never Too Late wrote “Dildido dildido, Oh love, oh love, I feel thy rage rumble below and above.” Swearing might be a radical alternative to received language, but it pulses through literature like a counter-history, a shadow realm of the English tongue’s full capabilities. It is a secret language, the twinned double of more respectable letters, and it’s impossible to understand Geoffrey Chaucer without his scatological jokes or Shakespeare without his bawdy insults. After all, literature is just as much Charles Bukowski as T.S. Eliot; it’s William S. Burroughs and not just Ezra Pound.

Sometimes those dichotomies about what language is capable of are reconciled within the greatest of literature. A syllabus of the immaculate obscene would include the Marquis de Sade’s 120 Days of Sodom, Charles Baudelaire’s The Flowers of Evil, Gustave Flaubert’s Madame Bovary, Joyce’s Ulysses, Lawrence’s Lady Chatterley’s Lover, Vladimir Nabokov’s Lolita, Henry Miller’s Tropic of Cancer (smuggled out of a Barnes & Noble by yours truly when I was 16), and Irvine Welsh’s Trainspotting. Along with his fellow Scotsman James Kelman, Welsh shows the full potential of obscenity to present an assault on the pieties of the bourgeois, mocking Madison Avenue sophistry when he famously implores the reader to “Choose rotting away, pishing and shiteing yersel in a home, a total fuckin embarrassment tae the selfish, fucked-up brats ye’ve produced. Choose life.” Within the English language, looming above all as the progenitor of literary smut, is the great British author John Cleland, who in 1748 published our first pornographic novel in Fanny Hill: Or, Memoirs of a Woman of Pleasure, wherein he promised “Truth! stark naked truth, is the word, and I will not so much as take the pains to bestow the strip of a gauze-wrapper on it.” Cleland purposefully wrote Fanny Hill entirely in euphemisms and double entendres, but the lack of dirty words couldn’t conceal the fact that the orgiastic bildungsroman about a middle-aged nymphomaniac was seen as unspeakably filthy. The novel has the distinction of being the longest-banned work in U.S. history, first prohibited by the Massachusetts Supreme Court in 1821, only to be sold legally after the U.S. Supreme Court ruled that its censorship was unconstitutional in 1966. The same year that Bruce was found face-down, naked and dead, in his California bathroom.

A goddamn unequivocal fucking triumph of the human spirit that any fucking wanker can march up into a public library and check out a copy of Fanny Hill. That liberty was hard fought for, and we should look askance at anyone who’d throw it away too cavalierly. But there is also something disingenuous in dismissing all those who suppressed works like Fanny Hill or Ulysses or Lady Chatterley’s Lover as mere prigs and prudes. A work is never censored because it isn’t powerful; it’s attacked precisely because of that coiled, latent energy that exists within words, none more so than those that we’ve labeled as forbidden. If the debate over free speech and censorship is drenched in a sticky oil of bad faith, then that slick spills over into all corners. My fellow liberals will mock the conservative perspective that says film or comic books or video games or novels are capable of moving someone to action, sometimes very ugly action—but of course literature is capable of doing this. Why would we read literature otherwise? Why would we create it otherwise? The censor with his black marker in some ways does due service to literature, acknowledging its significance and its uncanny effect. To claim that literature shouldn’t be censored because all literature is safe is not just fallacious, it’s disrespectful. The far more difficult principle is that literature shouldn’t be censored despite the fact that it’s so often dangerous.

Like any grimoire or incantation, obscenity can be used to liberate and to oppress, to free and to enslave, to bring down those in power but also to froth a crowd into the most hideous paroxysms of fascistic violence. So often the moralistic convention holds that “punching down” is never funny, but the dark truth is that it often is. What we do with that reality is the measure of us as people, because obscenity is neither good nor bad, but all power resides within the mouth of whoever wields it. What we think of as profanity is a rupture within language, a dialectic undermining conventional speech, what the Greeks called an aporia, the moment that rhetoric breaks down. Obscenity is when language declares war on itself, often with good cause. Writing in Rabelais and His World, the great Russian critic Mikhail Bakhtin defined what he called the “carnivalesque,” that is, the principle that structured much medieval and Renaissance performance and literature, whereby the “principle of laughter and the carnival spirit on which the grotesque is based destroys…seriousness and all pretense.” Examining the Shrovetide carnivals that inaugurated pre-Reformation Lent, Bakhtin optimistically saw something liberatory in the ribald display of upended hierarchies, where the farting, shitting, pissing, vomiting hilarity of the display rendered authority foolish. “It frees human consciousness,” Bakhtin wrote, “and imagination for new potentialities.”

An uneasy and ambivalent undercurrent threads through Bakhtin’s argument, though. If the carnival allowed for a taste of emancipation, there was also always the possibility that it was just more bread and circuses, a way to safely “rebel” without actually challenging the status quo. How many of our fucks and shits are just that, simply the smearing of feces on our playpen walls? Even worse, what happens when the carnival isn’t organized by plucky peasants to mock the bishops and princes, but when the church and state organize those mocking pageants themselves? Bakhtin didn’t quite anticipate the troll, nor did Bruce for that matter. Gershon Legman writes in the standard text Rationale of the Dirty Joke: An Analysis of Sexual Humor that “Under the mask of humor, our society allows infinite aggressions, by everyone and against everyone. In the culminating laugh of the listener or observer…the teller of the joke betrays his hidden hostility.” Can’t you take a joke? Because I was just joking. Legman’s reading of obscenity is crucial—it’s never just innocent, it’s never just nothing, it’s never just words. And it depends on who is saying them, and to whom they’re being said. Because swearing is so intimately tied to the theological, the use of profanity literally takes on the aura of damnation. It’s not that words aren’t dangerous—they are. But that doesn’t mean we must suture our mouths, even as honesty compels us to admit that danger. What we do with this understanding is the process that we call civilization. Because if Lenny Bruce had one unassailable and self-evident observation, it was that “Life is a four-letter word.” How could it be fucking otherwise?

Bonus Link: Pussy Riot: One Woman’s Vagina Takes on Japan’s Obscenity Laws

Image Credit: Flickr/Jeniffer Moo

Secrets That Hold Us

1.
“I didn’t grow up with my mother,” Mom tells me. We’re sitting on the front verandah looking out at the grass before us wilting in the afternoon sun and the dwarf coconut trees that line one side of the driveway. The coconut fronds dip with the breeze, revealing green- and yellow-husked coconuts. It’s hot on the verandah; the aluminum awning, put there years earlier to shield against sun and rain, traps the late afternoon heat as well.
“I was 12 before I knew my mother,” she says.
My mother is responding to some transgression of mine, what specifically I don’t remember. Perhaps something I neglected to tell her. There’s a lot we didn’t talk about then and don’t talk about now—a trait I came by honestly, I now know.
To grow up in Jamaica is to hear and know these stories of mothers who have migrated abroad or to Kingston, leaving their children to be raised by another relative. Barrel children, we call them, a term that stems from the fact that the migrating parents often send barrels of food and clothes back home for their children. But this is not my mother’s story.
I am in my 20s when she tells me about meeting her mother for the first time. My mother was 12, living in Clarendon on the southern side of Jamaica with her father’s twin sisters, a cousin, and three of her father’s other children. She talks about that day as if it’s nothing remarkable: A visitor comes and one of the aunts tells my mother to take the woman to another relative’s house some yards away on a vast tract of land that is subdivided for various relatives. They’re on their way when my grandaunt calls my mother back. It turns out they were watching to see what my mother would do, how long it would take her to start quizzing the visitor.
“That’s your mother,” my grandaunt tells her. My mother was a baby when her mother left her there with the aunts, too young to recognize her own mother those 12 years later.
Among her siblings, my mother’s story is not unique. My mother’s father worked with the now-defunct railway service in Jamaica, crossing the country on the east-west train line. I don’t know how or where he met the mothers of his children. But it seems he collected his offspring and deposited them in Clarendon for his sisters to raise. “I don’t know what arrangement Papa Stanley had with her,” my mother says of her parents.
My mother’s mother—Mother Gwen—raised three boys. “She gave away the girl and kept the boys,” my mother says. In it, I hear the sting of abandonment and the loss of having grown up without a concrete reason for her mother’s absence. It’s years before my mother sees her mother again, and by then she’s an adult with a life far different from that of her mother and the sons her mother kept and raised.
2.
My mother came of age with the generation that ushered in Jamaica’s independence from Britain in 1962, and she lives the Jamaican dream, that of moving off the island for a better life elsewhere. Generation after generation of Jamaicans have moved abroad—some to Panama to help build the canal, some to Cuba to work the sugar cane fields, others to Britain during the Windrush era, and countless others to America. My parents married in 1967, and a year later they moved to America for undergraduate studies at Tuskegee University, and later to Urbana, Ill., for my father’s graduate studies. Driven away by the cold, winter’s wrath, snow measured in feet instead of inches, my parents returned to Jamaica in 1971 with the first of their three daughters. My father settled into a job with a unit of a bauxite company and my mother began teaching early childhood education, supervising teachers and later teaching at a teachers college.
In 1977, they bought the house they still own—a split-level with a front lawn that slopes down to the street. The house overlooks a valley, where a river once ran. Across the valley is the main road into town and a forested cliff side. We sit on that verandah, my mother and I, looking out as she tells me about this grandmother I hadn’t known about until recently. The openness of the yard, the grass sloping down to the road, contrasts with the family secret she’s held on to for so long.
3.
I am already out of college and in my early 20s when my sisters and I rediscover Mother Gwen. We had, the three of us, come to our separate conclusions that our maternal grandmother had long been dead. I don’t have a specific reason for thinking she was dead—no memory of a funeral, no snippets of a conversation at the back of my mind. But growing up, we made weekly or biweekly trips to visit our relatives in different parts of the island: my paternal grandparents in Anchovy, a small town in the rambling hills that look down on Montego Bay; my maternal grandfather in Kingston; the grandaunts who raised my mother in the woods of Clarendon amidst coffee and cocoa and citrus plants. Now, I suspect that I assumed she had passed on because she was not among the relatives we visited, and unlike my other relatives, I have no specific memory of her—no Christmas or Boxing Day dinners, no visits to her after Easter Sunday services.
Grandma—my father’s mother—I remember clearly. She baked birthday cakes in three different sizes because my sisters and I celebrated our birthdays in back-to-back-to-back months—April, May, and June. My older sister got the largest cake, I got the mid-sized cake, and my younger sister got the smallest cake. Grandma divided money in a similar manner—$20, $10 and $5. Our birth order mattered to her.
I don’t have childhood memories of Mother Gwen, or the sons she raised without my mother. Instead, I have a vague recollection of an unfinished house, a metal drum by the side of the house, and large red- and orange-colored fish swimming in the makeshift aquarium. I don’t know if the house of my memory and the house where we rediscovered our grandmother are the same, but it’s the memory that came to me when we walked into the house that Sunday afternoon and my mother said to her mother, who was slowly going blind, “These are my daughters, your granddaughters.” The specifics of the conversation are lost to me now but I imagine we must have talked with our grandmother and our uncles about our studies, our lives in America. Our uncles gave us trinkets they made or bought to sell—bracelets with the Jamaican colors: black, green and yellow.
Even now, I can see the shock on my sisters’ faces—eyebrows going up, eyes widening, exaggerated blinks. How could we have lived so long, on an island as small as Jamaica, without knowing our mother’s mother still lived?
4.
On the verandah that day, my mother tells me that she feared her brothers, feared they would harm her children in some way. In response, she kept us away. My mother doesn’t give a concrete reason for fearing her brothers but vaguely says something about the differences in their lives, what she had built of her life and what they hadn’t made of theirs—and the possibility that potential jealousy of what she had accomplished would bubble over. Unlike my mother who taught at a teachers college for most of my childhood, her brothers—all Rastafarians—made and sold trinkets in various roadside stalls and markets. But they had feared her return to their lives, thinking that perhaps she’d come to claim what she thought should be hers. Perhaps her brothers said something that gave my mother pause.
Without my sisters and me, my mother drifted in and out of Mother Gwen’s life, not taking us back until we were in our 20s, young adults embarking on our own lives. By the time I rediscovered my grandmother and her sons, it was too late for her to become grandma or her sons to become uncles.
But later in her life, my mother truly became Mother Gwen’s daughter, driving some 60 miles every Saturday to Spanish Town—where Mother Gwen lived with the only surviving son—with bags of clean laundry and bags and boxes of groceries: fish, chicken, yam, bread, eggs, pumpkin, thyme, and sometimes a pot of fresh soup. Then she’d call out to Mother Gwen, blind then and a little hard of hearing, before walking into the room where she slept or sat in the doorway to catch the Jamaican breeze. My mother gathered the soiled bed sheets and clothes, hand washed some before she left, and hung them on the line in the back to dry. The rest my mother packed to take home where she washed them, only to exchange them for a newly soiled batch a few days later.
On the few occasions I was there, my uncle hovered, keeping watch over my mother, and when he had me for an audience, he turned the small verandah into a stage and spouted word-for-word Marcus Garvey speeches he had rehearsed, every inflection perfectly placed, his eyes staring straight ahead as if looking at the words scrawled in the air. Thin and wiry, he talked about the occasions on which he’d been invited to recite a Marcus Garvey speech, the opportunities he’d missed to make a career out of this ability of his. Sometimes he talked at length about reasons for a decision or reasons he hadn’t been able to make more of his life—his mother, of course, was the main reason.
Mother Gwen died blind, completely dependent on the daughter she had given up.


5.
Mother Gwen left her daughter, but my mother didn’t leave hers. For my mother, I think her mother’s absence is like a shroud she can never remove. And I think it’s why she was always there for her children.
I only have two memories of my mother not being with us: once she took a sabbatical from the college where she taught and spent some time in New York. I don’t recall the length of her absence, just that I got sick the very day she left; the helper who was with us fed me tea made from the leaf of a lime tree and cream soda. I’ve never liked either since. The second time—the summer after my older sister graduated from high school—my mother took my sister to New York and left my younger sister and me at home with our father. We’d always traveled together—my sisters, our mother, and I—so this was new. The morning after their departure, after my father had left for work, the phone rang—an operator with a collect call for my mother. In the background, I heard a cousin saying Papa Stanley had died. I didn’t even have to accept the charge, for I already knew the message I had to pass on: my mother’s father was dead. Mom returned within the week.
6.
The absence of my mother’s mother lives with me, too; an obsession I cannot shake, a recurring motif that springs from my unconscious into most of my longer pieces of fiction. These days, the recurring theme in my novels is mothers who don’t raise their children. My first novel, River Woman, is about a young woman who loses her son, and whose mother returns to Jamaica after years abroad for her grandson’s funeral. Their reunion is fraught with tension. My second novel, Tea by the Sea, is about another mother who spends 17 years searching for a daughter taken from her at birth.
While I didn’t set out to write my mother’s story, the ideas of abandonment and loss and belonging have crept into my work and remained there, a lurking obsession that I don’t yet seem able to escape. Perhaps it’s my unconscious attempt to reach under the layers of family stories to discover why my mother holds on to these family secrets and stores them, as if they will be her undoing.
Sometimes I think the secrets my mother holds trap her into bearing responsibility for her father’s transgressions, for the circumstances of her birth. It’s not her burden to carry, and yet she does. I see it in her response to the news of another brother, another of her father’s children, whom she learned about not long before her father’s death. She had known the young man, taught him at school, knew him with a surname different from her own. As she tells it, her father said the young man, now grown, was coming to visit him. Why, my mother asked. “He’s my son.” Even now, years and years later, my mother still hasn’t fully accepted him. She says, “I didn’t know him as a brother,” and recalls his mother naming him as another man’s child—a jacket, in Jamaican terms—as if those circumstances are her brother’s to bear.
As a writer, I can fictionalize the reasons family members hold onto secrets long after they have lost their usefulness. Or, in fiction, they can let them go, setting free the secrets that trap a child into bearing responsibility for a parent’s mistakes.
7.
My mother is nearing 80. Cancer has weakened her body, perhaps lowered her defenses. She invited her brother—the only one of her mother’s three sons still alive—his daughter, and two grandchildren to visit her home. They came on a Sunday in March, the first time in the 42 years my parents have owned that house that he had visited. She made cupcakes with her grandniece. And when my mother talks of the visit there is joy in her voice.
I imagine them on the verandah looking out at the expanse of green before them: a brother and a sister nearer to the end of their lives than the beginning, both reaching across the years to their memories of the mother who held them together.
Image Credit: Pikist.

Ten Ways to Save the World

1. In a purple-walled gallery of the Smithsonian American Art Museum, you can visit the shrine constructed by Air Force veteran and janitor James Hampton for Jesus Christ’s return. Entitled “Throne of the Third Heaven of the Nation’s Millennium General Assembly,” the altar and its paraphernalia were constructed to serve as temple objects for the messiah, who, according to Hampton’s visions of Moses in 1931, the Virgin Mary in 1946, and Adam in 1949, shall arrive in Washington, D.C. His father had been a part-time gospel singer and Baptist preacher, but Hampton drew not just from Christianity; he also brought the Afrocentric folk traditions of his native South Carolina to bear in his composition. Decorated with passages from Daniel and Revelation, the shrine was built in secret over 14 years in his Northwest Washington garage, and Hampton’s thinking is explicated in his 100-page manifesto St. James: The Book of the 7 Dispensations (dozens of pages are still in an uncracked code). Claiming that he had received a revised version of the Decalogue, Hampton declared himself in his notebook to be “Director, Special Projects for the State of Eternity.” His work is a fugue of word salad, a concerto of pressured speech. A staging ground for the incipient millennium, Hampton’s shrine is a triumph.

As if the bejeweled shield of the Urim and the Thummim were constructed not by Levites in ancient Jerusalem, but by a janitor in Mt. Vernon. Exodus and Leviticus give specifications for those liturgical objects of the Jewish Temple—the other-worldly cherubim gilded, their wings touching the hem of infinity huddled over the Ark of the Covenant; the woven brocade curtain with its many-eyed Seraphim rendered in fabric of red and gold; the massive candelabra of the ritual menorah. The materials with which the Jews built their Temple were cedar and sandstone, gold and precious jewels. When God commanded Hampton to build his new shrine, the materials were light-bulbs and aluminum foil, door frames and chair legs, pop cans and cardboard boxes, all held together with glue and tape. The overall effect is, if lacking in gold and cedar, transcendent nonetheless. Hampton’s construction looks almost Mesoamerican, aluminum foil delicately hammered onto carefully measured cardboard altars, the names of prophets and patriarchs from Ezekiel to Abraham rendered.

Every day Hampton would return from his job at the General Services Administration, where he would mop floors and disinfect counters, and for untold hours he’d assiduously sketch out designs based on his dreams, carefully applying foil to wood and cardboard, constructing crowns from trash he’d collected on U Street. What faith would compel this, what belief to see it finished? Nobody knew he was doing it. Hampton would die of stomach cancer in 1964, never having married, with few friends or family. The shrine would be discovered by a landlord angry about late rent. Soon it would come to the attention of reporters, and then the art mavens who thrilled to the discovery of “outsider” art—that is, work accomplished by the uneducated, the mentally disturbed, the impoverished, the religiously zealous. “Throne of the Third Heaven of the Nation’s Millennium General Assembly” would be purchased and donated to the Smithsonian (in part through the intercession of artist Robert Rauschenberg), where it would be canonized as the Pietà of American visionary art, outsider art’s Victory of Samothrace.

Hampton wasn’t an artist though—he was a prophet. He was Elijah and Elisha awaiting Christ in the desert. Daniel Wojcik writes in Outsider Art: Visionary Worlds and Trauma that “apocalyptic visions often have been expressions of popular religiosity, as a form of vernacular religion, existing at a grassroots level apart from the sanction of religious authority.” In that regard Hampton was like so many prophets before him, just working in toilet paper and beer cans rather than papyrus—he was Mt. Vernon’s Patmos. Asking if Hampton was mentally ill is the wrong question; it’s irrelevant if he was schizophrenic or bipolar. Etiology only goes so far in deciphering the divine language, and who are we to say with such certainty that the voice in a janitor’s head wasn’t that of the Lord? Greg Bottoms writes in Spiritual American Trash: Portraits from the Margins of Art and Faith that Hampton “knew he was chosen, knew he was a saint, knew he had been granted life, this terrible, beautiful life, to serve God.” Who among us can say that he was wrong? In his workshop, Hampton wrote on a piece of paper “Where there is no vision, the people perish.” There are beautiful and terrifying things hidden in garages all across America; there are messiahs innumerable. Hampton’s shrine is strange, but it is oh so resplendent.

2. By the time James Nayler genuflected before George Fox, the founder of the Quaker Society of Friends, his tongue had already been bored through with a hot iron poker and the letter “B” (for “Blasphemer”) had been branded onto his forehead by civil authorities. The two had not gotten along in the past, arguing over the theological direction of the Quakers, but by 1659 Nayler had been so broken by their mutual enemies that he was forced to drag himself to Fox’s parlor and beg forgiveness. Three years had changed the preacher’s circumstances, for it was in imitation of the original Palm Sunday that in 1656 Nayler had triumphantly entered the sleepy seaside town of Bristol upon the back of a donkey, the religious significance of the performance inescapable to anyone. A supporter noted in a diary that Nayler’s “name is no more to be called James but Jesus,” while in private writings Fox noted that “James ran out into imaginations… and they raised up a great darkness in the nation.”

At the start of 1656, Nayler was imprisoned, and when Fox visited him in his cell, Fox demanded that Nayler kiss his foot (belying the Quaker reputation for modesty). “It is my foot,” Fox declared, but Nayler refused. Such was the confidence of a man who had reenacted Christ’s entrance into Jerusalem. Guided by the Inner Light that Quakers saw as supplanting even the gospels, Nayler thought of his mission in messianic terms, and organized his ministry to reflect that. Among the “Valiant Sixty,” itinerant preachers who were too radical even for the Quakers, Nayler was the most revolutionary, condemning slavery, enclosure, and private property. The tragedy of Nayler is that he happened not to actually be the messiah. Before his death, following an assault by a highwayman in 1660, Nayler would write that his “hope is to outlive all wrath and contention, and to wear out all exaltation and cruelty, or whatever is of a nature contrary to itself.” He was 42, beating Christ by almost a decade.

“Why was so much fuss made?” asks Christopher Hill in his classic The World Turned Upside Down: Radical Ideas During the English Revolution. “There had been earlier Messiahs—William Franklin, Arise Evans who told the Deputy Recorder of London that he was the Lord his God…Gadbury was the Spouse of Christ, Joan Robins and Mary Adams believed they were about to give birth to Jesus Christ.” Hill’s answer to the question of Nayler’s singularity is charitable, writing that none of the others actually seemed dangerous, since they were merely “holy imbeciles.” The 17th century, especially around the time of the English civil wars, was an age of blessed insanity, messiahs proliferating like dandelions after a spring shower. There were John Reeve, Laurence Clarkson, and Lodowicke Muggleton, who took turns arguing over which of them were the two witnesses mentioned in Revelation, and who held that God had absconded from heaven and that the job was now open. Abiezer Coppe, prophet of a denomination known as the Ranters, demonstrated that designation in his preaching and writing. One prophet, TheaurauJohn Tany (who designated himself “King of the Jews”), simply declared “What I have written, I have written,” including the radical message that hell was liberated and damnation abolished. Regarding the here and now, Tany had similarly radical prescriptions, including to “feed the hungry, clothe the naked, oppress none, set free them bounden.”

There have been messianic claimants from first-century Judea to contemporary Utah. When St. Peter was still alive there was the Samaritan magician Simon Magus, who used Christianity as magic and could fly, only to be knocked from the sky during a prayer-battle with the apostle. In the third century the Persian prophet Mani founded a religion that fused Christ with the Buddha, had adherents from Gibraltar to Jinjiang, and enjoyed a reign that lasted more than a millennium (with its teachings smuggled into Christianity by former adherent St. Augustine). A little before Mani, a Phrygian prophet named Montanus declared himself an incarnation of the Holy Spirit, along with his consorts Priscilla and Maximilla. Prone to fits of convulsing revelation, Montanus declared “Lo, the man is as a lyre, and I fly over him as a pick.” Most Church Fathers denounced Montanism as rank heresy, but not Tertullian, who despite being the progenitor of Latin theology was never named “St. Tertullian” because of those enthusiasms. During the Middle Ages, at a time when stereotype might have it that orthodoxy reigned triumphant, mendicants and messiahs, some whose names aren’t preserved to history and some who amassed thousands of followers, proliferated across Europe. Norman Cohn remarks in The Pursuit of the Millennium that for one eighth-century Gaulish messiah named Aldebert, followers “were convinced that he knew all their sins…and they treasured as miracle-working talismans the nail parings and hair clippings he distributed among them.” Pretty impressive, but none of us get off work for Aldebert’s birthday.

More recently, other messiahs include the 18th-century prophetess and mother of the Shakers Ann Lee, the 20th-century founder of the Korean Unification Church Sun Myung Moon (known for his elaborate mass weddings and for owning the conservative Washington Times), and the French test car driver Claude Vorilhon, who renamed himself Raël and announced that he was the son of an extraterrestrial named Yahweh (more David Bowie’s The Rise and Fall of Ziggy Stardust and the Spiders from Mars than Paul’s epistles). There are as many messiahs as there are people; there are malicious messiahs and benevolent ones, deluded head-cases and tricky confidence men, visionaries of transcendent bliss and sputtering weirdos. What unites all of them is an observation made by Reeve that God speaks to them as “to the hearing of the ear as a man speaks to a friend.”

3. Hard to identify Elvis Presley’s apotheosis. Could have been the ’68 Comeback Special, decked in black leather warbling “Suspicious Minds” in that snarl-mumble, Elvis’s years in the wilderness, precipitated by his manager Col. Tom Parker’s disastrous gambit to have the musician join the army only to see all of the industry move on from his rockabilly style, ended with a resurrection on a Burbank sound stage. Or maybe it was earlier, on The Milton Berle Show in 1956, performing Big Mama Thornton’s hit “Hound Dog” to a droopy dog while gyrating on the Hollywood stage, leading a critic for the New York Daily News to opine that Elvis “gave an exhibition that was suggestive and vulgar, tinged with the kind of animalism that should be confined to dives and bordellos.” A fair candidate for that moment of earthly transcendence could be traced back to 1955, when in Sun Records’ dusty Memphis studio Elvis would cover Junior Parker’s “Mystery Train,” crooning out in a voice both shaky and confident over his guitar’s nervous warble “Train I ride, sixteen coaches long/Train I ride, sixteen coaches long/Well that long black train, got my baby and gone.” But in my estimation, and at the risk of sacrilege, Elvis’s ascension happened on Aug. 16, 1977, when he died on the toilet in the private bathroom of his tacky and opulent Graceland estate.

The story of Elvis’s death has the feeling of both apocrypha and accuracy, and like any narrative that comes from that borderland country of the mythic, it contains more truth than the simple facts can impart. His expiration is a uniquely American death, but not an American tragedy, for Elvis was able to get just as much out of this country as the country ever got out of him, and that’s ultimately our true national dream. He grabbed the nation by its throat and its crotch, and with pure libidinal fury was able to incarnate himself as the country. All of the accoutrements—the rhinestone jumpsuits, the karate and the Hawaiian schtick, the deep-fried peanut-butter-banana-and-bacon sandwiches, the sheer pill-addicted corpulence—are what make him our messiah. Even the knowing, obvious, and totally mundane observation that he didn’t write his own music misses the point. He wasn’t a creator—he was a conduit. Greil Marcus writes in Mystery Train: Images of America in Rock ‘n’ Roll of the “borders of Elvis Presley’s delight, of his fine young hold on freedom…[in his] touch of fear, of that old weirdness.” That’s the Elvis that saves, the Elvis of “That’s All Right (Mama)” and “Shake, Rattle, and Roll,” those strange hillbilly tracks, that weird chimerical sound—complete theophany then and now.

There’s a punchline quality to that contention about white-trash worshipers at the Church of Elvis, all of those sightings in the Weekly World News, the Las Vegas impersonators of various degrees of girth, the appearance of the singer in the burnt pattern of a tortilla. This is supposedly a faith that takes its pilgrimage to Graceland as if it were a zebra-print Golgotha, that visits Tupelo as if it were Nazareth. John Strausbaugh takes an ethnographer’s calipers to Elvism, arguing in E: Reflections on the Birth of the Elvis Faith that Presley has left in his wake a bona fide religion, with its own liturgy, rituals, sacraments, and scripture. “The fact that outsiders can’t take it seriously may turn out to be its strength and its shield,” he writes. “Maybe by the time Elvism is taken seriously it will have quietly grown too large and well established to be crushed.” There are things less worthy of your worship than Elvis Presley. If we were to think of an incarnation of the United States, of a uniquely American messiah, few candidates would be more all-consumingly like the collective nation. In his appetites, his neediness, his yearning, his arrogance, his woundedness, his innocence, his simplicity, his cunning, his coldness, and his warmth, he was the first among Americans. Elvis is somehow both rural and urban, northern and southern, country and rock, male and female, white and black. Our contradictions are reconciled in him. “Elvis lives in us,” Strausbaugh writes. “There is only one King and we know who he is.” We are Elvis and He was us.

4. A hideous slaughter followed those settlers as they drove deep into the continent. On that western desert, where the lurid sun’s bloodletting upon the burnt horizon signaled the end of each scalding day, a medicine man and prophet of the Paiute people had a vision. In a trance, Wodziwob received an oracular missive, that “within a few moons there was to be a great upheaval or earthquake… the whites would be swallowed up, while the Indians would be saved.” Wodziwob would be the John the Baptist to a new movement, for though he would die in 1872, the ritual practice that he taught—the Ghost Dance—would become a rebellion against the genocidal policy of the U.S. Government. For Wodziwob, the Ghost Dance was an affirmation, but it has also been remembered as a doomed moment. “To invoke the Ghost Dance has been to call up an image of indigenous spirituality by turns militant, desperate, and futile,” writes Louis S. Warren in God’s Red Son: The Ghost Dance Religion and the Making of Modern America, “a beautiful dream that died.” But it was a dream that endured.

While in a coma precipitated by scarlet fever, during a solar eclipse, on New Year’s Day of 1889, a Northern Paiute Native American who worked on a Carson City, Nevada, ranch and was known as Jack Wilson by his coworkers and as Wovoka to his own people, fell into a mystical vision not unlike Wodziwob’s. Wovoka met many of his dead family members, he saw the prairie that exists beyond that which we can see, and he held counsel with Jesus Christ. Wovoka was taught the Ghost Dance, and learned what Sioux Chief Lame Deer would preach, that “the people…could dance a new world into being.” When Wovoka returned he could control the weather, he was able to compel hail from the sky, he could form ice with his hands on the most sweltering of days. The Ghost Dance would spread throughout the western United States, embraced by the Paiute, the Dakota, and the Lakota. Lame Deer said that the Ghost Dance would roll up the earth “like a carpet with all the white man’s ugly things—the stinking new animals, sheep and pigs, the fences, the telegraph poles, the mines and factories. Underneath would be the wonderful old-new world.”

A messiah is simultaneously the most conservative and the most radical of figures, preaching a return to a perfected world that never existed but also the overturning of everything of this world, of the jaundiced status quo. The Ghost Dance married the innate strangeness of Christianity to the familiarity of native religion, and like both it provided a blueprint for how to overthrow the fallen things. Like all true religion, the Ghost Dance was incredibly dangerous. That was certainly the view of the U.S. Army and the Bureau of Indian Affairs, which saw an apocalyptic faith as a danger to white settler-colonials and their indomitable, zombie-like push to the Pacific. Manifest Destiny couldn’t abide the hopefulness of a revival as simultaneously joyful and terrifying as the Ghost Dance, and so an inevitable confrontation awaited.

The Miniconjou Lakota people, forced onto the Pine Ridge Reservation by 1890, were seen as particularly rebellious, in part because their leader Spotted Elk was an adherent. Under the pretext of Lakota resistance to disarmament, the army opened fire on the gathered Miniconjou, and more than 150 people (mostly women and children) would be slaughtered in the Wounded Knee Massacre. As surely as the Romans threw Christians to the lions and Cossacks rampaged through the Jewish shtetls of eastern Europe, so too were the initiates of the Ghost Dance persecuted, murdered, and martyred by the U.S. Government. Warren writes that the massacre “has come to stand in for the entire history of the religion, as if the hopes of all of its devoted followers began and ended in that fatal ravine.” Wounded Knee was the Calvary of the Ghost Dance faith, but if Calvary has any meaning it’s that crucified messiahs have a tendency not to remain dead. In 1973 a contingent of Oglala Lakota and members of the American Indian Movement occupied Wounded Knee, and the activist Mary Brave Bird defiantly performed the Ghost Dance, again.

5. Rabbi Menachem Mendel Schneerson arrived in the United States via Paris, via Berlin, and ultimately via Kiev. He immigrated to New York on the eve of America’s entry into the Second World War, and in the years that followed six million Jews, roughly two-thirds of the Jews of Europe, were immolated in Hitler’s ovens. Becoming the Lubavitcher Rebbe in 1950, Schneerson was a refugee from a broken Europe that had devoured itself. Schneerson’s denomination of Hasidism had emerged in the 18th century, after the vicious Cossack-led pogroms that punctuated life in 17th-century eastern Europe, when many Jews turned towards the sect’s founder, the Baal Shem Tov. His proper name was Rabbi Israel ben Eliezer, and his title (often shortened to “Besht”) meant “Master of the Good Name,” for the Baal Shem Tov incorporated Kabbalah into a pietistic movement that enshrined emotion over reason, feeling over logic, experience over philosophy. David Biale writes in Hasidism: A New History that the Besht espoused “a new method of ecstatic joy and a new social structure,” a fervency that lit a candle against persecution’s darkness.

When Schneerson convened a gathering of Lubavitchers in a Brooklyn synagogue for Purim in 1953, a black cloud enveloped the Soviet Union. Joseph Stalin was beginning to target Jews whom he implicated in the “Doctors’ Plot,” an invented accusation that Jewish physicians were poisoning Soviet leadership. The state propaganda organ Pravda denounced these supposed members of a “Jewish bourgeois-nationalist organization… The filthy face of this Zionist spy organization, covering up their vicious actions under the mask of charity.” Four gulags were constructed in Siberia, with the understanding that Russian Jews would be deported and perhaps exterminated. Less than a decade after Hitler’s suicide, and Schneerson would look out into the congregation of swaying black-hatted Lubavitchers, and would see a people marked for extinction.

And so on that evening, Schneerson explicated the finer points of Talmudic exegesis, the question of why evil happens in the world, and what role man and G-d play in containing that wickedness. Witnesses said that the very countenance of the rabbi was transformed as he declared that he would speak the words of the living G-d. Enraptured in contemplation, Schneerson connected the Persian courtier Haman’s war against the Jews with Stalin’s coming campaign, invoked G-d’s justice and mercy, and implored the divine to intervene and prevent the Soviet dictator from completing that which Hitler had begun. The Rebbe denounced Stalin as the “evil one,” and as he shouted it was said that his face became a “holy fire.”

Two days later Moscow State Radio announced that Stalin had fallen ill and died. The exact moment of his expiration was when a group of Lubavitch Jews had prayed that G-d would still the hand of the tyrant and punish his iniquities. Several weeks later, and Soviet leadership would admit that the Doctors’ Plot was a government ruse invented by Stalin, and they exonerated all of those who’d been punished as a result of baseless accusations. By the waning days of the Soviet Union, the Lubavitcher Rebbe would address crowds gathered in Red Square by telescreen while the Red Army Band performed Hasidic songs. “Was this not the victory of the Messiah over the dark forces of the evil empire, believers asked?” write Samuel Heilman and Menachem Friedman in The Rebbe: The Life and Afterlife of Menachem Mendel Schneerson.

The Mashiach (“anointed one” in Hebrew) is neither the Son of G-d nor the incarnate G-d, and his goal is arguably more that of liberation than salvation (whatever either term means). Just as Christianity has had many pseudo-messiahs, so is Jewish history littered with figures whom some believers saw as the anointed one (Christianity is merely the most successful of these offshoots). During the Second Jewish-Roman War of the second century, the military commander Simon bar Kokhba was lauded as the messiah, even as his defeat led to Jewish exile from the Holy Land. During the 17th century, the Ottoman Jew Sabbatai Zevi amassed a huge following of devotees who believed him the messiah come to subvert and overthrow the strictures of religious law itself. Zevi was defeated by the Ottomans not through crucifixion, but through conversion (which is much more dispiriting). A century later, and the Polish libertine Jacob Frank would declare that the French Revolution was the apocalypse, that Christianity and Judaism must be synthesized, and that he was the messiah. Compared to them, the Rebbe was positively orthodox (in all senses of the word). He also never claimed to be the messiah.

What all share is the sense that to exist is to be in exile. That is the fundamental lesson and gift of Judaism, born from the particularities of Jewish suffering. Diaspora is not just a political condition, or a social one; diaspora is an existential state. We are all marooned from our proper divinity, shattered off from G-d’s being—alone, disparate, isolated, alienated, atomized, solipsistic. If there is to be any redemption it’s in gathering up those shards, collecting those bits of light cleaved off from the body of G-d when He dwelled in resplendent fullness before the tragedy of creation. Such is the story of going home but never reaching that destination, yet continuing nevertheless. What gives this suffering such beauty, what redeems the brokenness of G-d, is the sense that it’s that very shattering that imbues all of us with holiness. It is what the German-Jewish philosopher Walter Benjamin describes in On the Concept of History as the sacred reality in which “every second was the narrow gate, through which the Messiah could enter.”

6. When the Living God landed at Palisadoes Airport in Kingston, Jamaica, on April 21, 1966, he couldn’t immediately disembark from his Ethiopian Airlines flight from Addis Ababa. More than 100,000 people had gathered at the airport, the air thick with the sticky, sweet smell of ganja, the airstrip so overwhelmed with worshipers come to greet the Conquering Lion of the Tribe of Judah, His Imperial Majesty Haile Selassie I, King of Kings, Lord of Lords, Elect of God, Power of the Trinity, of the House of Solomon, Amhara Branch, noble Ras Tafari Makonnen, that there was a fear the plane itself might tip over. The Ethiopian Emperor, incarnation of Jah and the second coming of Christ, remained in the plane for a few minutes until a local religious leader, the drummer Ras Mortimer Planno, was allowed to organize the emperor’s descent.

Finally, after several tense minutes, the crowd pulled back long enough for the emperor to disembark onto the tarmac, the first time that Selassie would set foot on the fertile soil of Jamaica, a land distant from the Ethiopia that he’d ruled over for 36 years (excluding 1936 to 1941, when his home country was occupied by the Italian fascists). Jamaica was where he’d first been acknowledged as the messianic promise of the African diaspora. A year after his visit, while being interviewed by the Canadian Broadcasting Corporation, Selassie was asked what he made of the claims about his status. “I told them clearly that I am a man,” he said, “that I am mortal…and that they should never make a mistake in assuming or pretending that a human being is emanated from a deity.” The thing about being a messiah, though, is that whether or not you’re to be adored never depends on the consent of the worshiped.

Syncretic and born from the Caribbean experience, and practiced from Kingston, Jamaica, to Brixton, London, Rastafarianism is a mélange of Christian, Jewish, and uniquely African symbols and beliefs, with its own novel rhetoric concerning oppression and liberation. Popularized throughout the West because of the indelible catchiness of reggae, with its distinctive muted third beat, and the charisma of the musician Bob Marley, who was the faith’s most famous ambassador, Rastafarianism is sometimes offensively reduced in people’s minds to dreadlocks and spliff smoke. Ennis Barrington Edmonds places the faith’s true influence in its proper context, writing in Rastafari: From Outcasts to Culture Bearers that “the movement has spread around the world, especially among oppressed people of African origins… [among those] suffering some form of oppression and marginalization.”

Central to the narrative of Rastafarianism is the reluctant messiah Selassie, a life-long member of the Ethiopian Orthodox Tewahedo Church. Selassie’s reign had been prophesied by the Jamaican Protestant evangelist Leonard Howell, who claimed that the crowning of an independent Black king in an Africa dominated by European colonialism would mark the dawn of a messianic dispensation. A disciple of Black nationalist Marcus Garvey, whom Howell met when both lived in Harlem, the minister read Psalm 68:31’s injunction that “Ethiopia shall soon stretch out her hands unto God” as being fulfilled in Selassie’s coronation. Some sense of this reverence is imparted by a Rastafarian named Reuben in Emily Raboteau’s Searching for Zion: The Quest for Home in the African Diaspora, who explained that “Ethiopia was never conquered by outside forces. Ethiopia was the only independent country on the continent of Africa…a holy place.” Sacred Ethiopia, the land where the Ark of the Covenant was preserved.

That the actual Selassie never embraced Rastafarianism, was not particularly benevolent in his own rule, and was indeed deposed by a revolutionary Marxist junta, is of no account. Rather what threads through Rastafarianism is what Raboteau describes as a “defiant, anticolonialist mind-set, a spirit of protest… and a notion that Africa is the spiritual home to which they are destined to return.” Selassie’s biography bore no resemblance to the lives of the residents of the Trenchtown slum of Kingston where the veneration of a distant African king began, but his name served as rebellion against all agents of Babylon in the hopes of a new Zion. Rastafarianism found in Selassie the messiah who was needed, and in their faith there is a proud way of spiritually repudiating the horrors of the trans-Atlantic slave trade. What their example reminds us of is that a powerful people have no need of a messiah, that he always dwells instead amongst the dispossessed, regardless of what his name is.

7. Among the Persian Sufis there is no blasphemy in staining your prayer rug red with shiraz. Popular throughout Iran and into central Asia, where the faith of Zoroaster and Mani had both once been dominant, Sufism drew upon those earlier mystical and poetic traditions and incorporated them into Islam. The faith of the dervishes, the piety of the wali, the Sufi tradition is that of Persian miniatures painted in stunning, colorful detail, of the poetry of Rumi and Hafez. Often shrouded in rough woolen coats and felt caps, the Sufis practice a mystical version of faith that’s not dissimilar to Jewish Kabbalah or Christian hermeticism, an inner path that the 20th-century writer Aldous Huxley called the “perennial philosophy.” As with other antinomian faiths, the Sufis often skirt the line of what’s acceptable and what’s forbidden, seeing in heresy intimations of a deep respect for the divine.

A central poetic topos of Sufi practice is what’s called shath, that is, deliberately shocking utterances that exist to shake believers out of pious complacency, to awaken within them that which is subversive about God. One master of the form was Mansour al-Hallaj, born to Persian-speaking parents (with a Zoroastrian grandfather) in the ninth century during the Abbasid Caliphate. Where most Sufi masters were content to keep their secrets preserved for initiates, al-Hallaj crafted a movement democratized for the mass of Muslims, while also generating a specialized language for speaking of esoteric truths, expressed in “antithesis, breaking down language into prepositional units, and paradox,” as the translator Carl W. Ernst writes in Hallaj: Poems of a Sufi Martyr. Al-Hallaj’s knowledge and piety were deep—he had memorized the Koran by the age of 12 and he prostrated himself before a replica of Mecca’s Kaaba in his Baghdad garden—but so was his commitment to the radicalism of shath. When asked where Allah was, he once replied that the Lord was within his turban; on another occasion he answered that question by saying that God was under his cloak. Finally, borrowing one of the 99 names of God, he declared “I am the Truth.”

The generally tolerant Abbasids decided that something should be done about al-Hallaj, and so in 922 he was tied to a post along the Tigris River, repeatedly punched in the face, lashed several times, decapitated, and finally his headless body was hung over the water. His last words were “Akbar al-Hallaj”—“Al-Hallaj is great.” An honorific normally reserved for God, but for this self-declared heretical messiah his name and that of the Lord were synonyms. What’s sacrilegious about this might seem clear, save that al-Hallaj’s Islamic piety was such that he interpreted the claim as the natural culmination of Tawhid, the strictness of Islamic monotheism pushed to its logical conclusion—there is but one God, and everything is God, and we are all in God. Idries Shah explains the tragic failure of interpretation among religious authorities in The Sufis, writing that the “attempt to express a certain relationship in language not prepared for it causes the expression to be misunderstood.” The court, as is obvious, did not agree.

If imagining yourself as the messiah could get you decapitated by the 10th-century Abbasids, then in 20th-century Michigan it only got you institutionalized. Al-Hallaj implied that he was the messiah, but the psychiatric patients in Milton Rokeach’s 1964 study The Three Christs of Ypsilanti each thought that they were the authentic messiah (with much displeasure ensuing when they met). Based on three years of observation at the Ypsilanti State Hospital starting in 1959, Rokeach treated this trinity of paranoid schizophrenics. Initially Rokeach thought that the meeting of the Christs would disabuse them all of their delusions, as if the law of logical non-contradiction might mean anything to a psychotic. But the messiahs were steadfast in their faith—each was singular and the others were imposters. Then Rokeach and his graduate students introduced fake messages from other divine beings, a gambit that the psychiatrist apologized for two decades later and one that would most definitely land him before an ethics board today. Finally Rokeach granted them the right to their insanities, each of the Christs of Ypsilanti continuing in their merry madness. “It’s only when a man doesn’t feel that he’s a man,” Rokeach concludes, “that he has to be a god.”

Maybe. Or maybe the true madness of the Michigan messiahs was that each thought themselves the singular God. They weren’t in error that each of them was the messiah; they were in error in denying that truth in their fellow patients. Al-Hallaj would have understood, declaring before his executioner that “all that matters for the ecstatic is that the Unique shall reduce him to Unity.” The Christs may have benefited more from Sufi treatment than from psychotherapy. Clyde Benson, Joseph Cassel, and Leon Gabor all thought themselves to be God, but al-Hallaj knew that he was (and that You reading this are as well). Those three men may have been crazy, but al-Hallaj was a master of what the Buddhist teacher Wes Nisker calls “crazy wisdom.” In his guidebook The Essential Crazy Wisdom, Nisker celebrates the sacraments of “clowns, jesters, tricksters, and holy fools,” who understand that “we live in a world of many illusions, that the emperor has no clothes, and that much of human belief and behavior is ritualized nonsense.” The initiate in crazy wisdom, whether gnostic saint or Kabbalist rabbi, Sufi master or Zen monk, prods at false piety to reveal deeper truths underneath. “I saw my Lord with the eye of the heart,” al-Hallaj wrote in one poem, “I asked, ‘Who are You?’/He replied, ‘You.’”

8. “Bob” is the least likely-looking messiah. With his generic handsomeness, his executive haircut dyed black and tightly parted on the left, the avuncular pipe that jauntily sticks out of his tight smile, “Bob” looks like a stock image of a 1950s paterfamilias (his name is always spelled with quotation marks). Like a clip-art version of Mad Men’s Don Draper, or Utah Sen. Mitt Romney. “Bob” is also not real (which may or may not distinguish him from other messiahs), but rather the central figure in the parody Church of the SubGenius. Supposedly a traveling salesman, J.R. “Bob” Dobbs had a vision of JHVH-1 (the central God in the church) in a homemade television set, and he then went on the road to evangelize. Founded by countercultural slacker heroes Ivan Stang and Philo Drummond in 1979 (though each claims that “Bob” was the actual progenitor), the Church of the SubGenius is a veritable font of crazy wisdom, promoting the anarchist practice of “culture jamming” and parody in the promulgation of a faith where it’s not exactly clear what’s serious and what isn’t.

“Bob” preaches a doctrine of resistance against JHVH-1 (or Jehovah 1), the demiurge who seems like a cross between Yahweh and a Lovecraftian elder god. JHVH-1 intended for “Bob” to encourage a pragmatic, utilitarian message about the benefits of a work ethic, but contra his square appearance, the messiah preferred to advocate that his followers pursue a quality known as slack. Never clearly defined (though its connotations are obvious), slack is to the Church of the SubGenius what the Tao is to Taoism or the Word is to Christianity, both the font of all reality and that which gives life meaning. Converts to the faith include luminaries like the underground cartoonist Robert Crumb, Pee-wee Herman Show creator Paul Reubens, Talking Heads founder David Byrne, and of course Devo’s Mark Mothersbaugh. If there is any commandment that most resonates with the emotional tenor of the church, it’s “Bob’s” holy commandment that “Fuck ‘em if they can’t take a joke.”

The Church of the SubGenius is oftentimes compared to another parody religion that finds its origins in a similar countercultural milieu, though it first appeared almost two decades earlier, in 1963: the one known by the ominous name of Discordianism. Drawing from Hellenic paganism, Discordianism holds as one of its central axioms in the Principia Discordia (written by founders Greg Hill and Kerry Wendell Thornley under the pseudonyms of Malaclypse the Younger and Omar Khayyam Ravenhurst) that the “Aneristic Principle is that of apparent order; the Eristic Principle is that of apparent disorder. Both order and disorder are man made concepts and are artificial divisions of pure chaos, which is a level deeper than is the level of distinction making.” Where other mythological systems see chaos as being tamed and subdued during ages primeval, the pious Discordian understands that disorder and disharmony remain the motivating structure of reality. To that end, the satirical elements—its faux scripture, its faux mythology, and its faux hierarchy—are paradoxically faithful enactments of its central metaphysics.

For those of a conspiratorial bent, it’s worth noting that Thornley first conceived of the movement after leaving the Marine Corps, where he had been an associate of Lee Harvey Oswald. It sounds a little like one of the baroque plots in Robert Anton Wilson and Robert Shea’s The Illuminatus! Trilogy. A compendium of occult and conspiratorial lore whose narrative complexity recalls James Joyce or Thomas Pynchon, The Illuminatus! Trilogy was an attempt to produce for Discordianism what Dante crafted for Catholicism or John Milton for Protestantism: a work of literature commensurate with theology. Stang and Drummond, it should be said, were avid readers of The Illuminatus! Trilogy. “There are periods of history when the visions of madmen and dope fiends are a better guide to reality than the common-sense interpretation of data available to the so-called normal mind,” writes Wilson. “This is one such period, if you haven’t noticed already.” And how.

Inventing religions and messiahs wasn’t merely an activity for 20th-century pot smokers. Fear of uncovering the Christ who isn’t actually there can be seen as early as the 10th century, when the Iranian warlord Abu Tahir al-Jannabi wrote about a supposed tract that referenced the “three imposters,” an atheistic denunciation of Moses, Jesus, and Muhammad. This equal-opportunity apostasy, attacking all three children of Abraham, haunted monotheism over the subsequent millennium, as the infernal manuscript was attributed to several different figures. In the 13th century, Pope Gregory IX said that the Holy Roman Emperor Frederick II had authored such a work (the latter denied it). Within Giovanni Boccaccio’s 14th-century The Decameron there is reference to the “three imposters,” and in the 17th century, Sir Thomas Browne credited the Italian Protestant refugee Bernardino Ochino with having composed a manifesto against the major monotheistic faiths.

What’s telling is that everyone feared the specter of atheism, but no actual text existed. They were scared of a possibility without an actuality, terrified of a dead God who was still alive. It wouldn’t be until the 18th century that writing would actually be supplied, in the form of the anonymous French pamphlet of 1719, the Treatise of the Three Imposters. That work, going through several different editions over the next century, drew on the naturalistic philosophy of Benedict Spinoza and Thomas Hobbes to argue against the supernatural status of religion. Its author, possibly the bibliographer Prosper Marchand, argued that the “attributes of the Deity are so far beyond the grasp of limited reason, that man must become a God himself before he can comprehend them.” One imagines that the prophets of the Church of the SubGenius and Discordianism, inventors of gods and messiahs aplenty, would concur. “Just because some jackass is an atheist doesn’t mean that his prophets and gods are any less false,” preaches “Bob” in The Book of the SubGenius.

9. The glistening promise of white-flecked SPAM coated in greasy aspic as it slips out from its corrugated blue can, plopping onto a metal plate with a satisfying thud. A pack of Lucky Strikes, with its red circle in a field of white, crinkle of foil framing a raggedly opened end, sprinkle of loose tobacco at the bottom as a last cigarette is fingered out. Heinz Baked Beans, sweet in their tomato gravy, the yellow label with its picture of a keystone slick with the juice from within. Coca-Cola—of course Coca-Cola—its ornate calligraphy on a cherry red can, the saccharine nose pinch within. Products of American capitalism, the greatest and most all-encompassing faith of the modern world, left behind on the Vanuatuan island of Tanna by American servicemen during the Second World War.

More than 1,000 miles northeast of Australia, Tanna was home to airstrips and naval bases, and for the local Melanesians, the 300,000 GIs housed on their island indelibly marked their lives. Certainly the first time most had seen the descent of airplanes from the blue of the sky, the first time most had seen jangling Jeeps careening over the paths of Tanna’s rainforests, the first time most had seen naval destroyers on the pristine Pacific horizon. Building upon a previous cult that had caused trouble for the jointly administered colonial British-French Condominium, the Melanesians claimed that the precious cargo of the Americans could be accessed through the intercession of a messiah known as John Frum—believed to have possibly been a serviceman who’d introduced himself as “John from Georgia.”

Today the John Frum cult still exists on Tanna. A typical service can include the raising of the flags of the United States, the Marine Corps, and the state of Georgia, while shirtless youths with “USA” painted on their chests march with faux rifles made out of sticks (other “cargo cults” are more Anglophilic, with one worshiping Prince Philip). Anthropologists first noted the emergence of the John Frum religion in the immediate aftermath of the Americans’ departure, with Melanesians apparently constructing landing strips and air traffic control towers from bamboo, speaking into leftover tin cans as if they were radio controls, all to attract back the quasi-divine Americans and their precious “cargo.” Anthropologist Holger Jebens in After the Cult: Perceptions of Other and Self in West New Britain (Papua New Guinea) describes “cargo cults” as having as their central goal the acquisition of “industrially manufactured Western goods brought by ship or aeroplane, which, from the Melanesian point of view, are likely to have represented a materialization of the superior and initially secret power of the whites.” In this interpretation, the recreated ritual objects molded from bamboo and leaves are offerings to John Frum so that he will return from the heavenly realm of America bearing precious cargo.

If all this sounds sort of dodgy, then you’ve reason to feel uncomfortable. More recently, some anthropologists have questioned the utility of the phrase “cargo cult,” and the interpretation of the function of those practices. Much of the previous model, mired in the discipline’s own racist origins, posits the Melanesians and their beliefs as “primitive,” with all of the attendant connotations to that word. In the introduction to Beyond Primitivism: Indigenous Religious Traditions and Modernity, Jacob K. Olupona writes that some scholars have redefined the relationship between ourselves and the Vanuatuans by “challenging the notion that there is a fundamental difference between modernity and indigenous beliefs.” It is easy for an anthropologist, espying the bamboo landing field, to assume that what was being enacted was a type of magical conspicuous consumption, a yearning on the part of the Melanesians to take part in our own self-evidently superior culture. The only thing that’s actually self-evident in such a view, however, is a parochial and supremacist positioning that gives little credit to unique religious practices. Better to borrow the idea of allegory in interpreting the “cargo cults,” considering both how that way of thinking may shape the symbolism of their rituals and what those practices could reflect about our own culture.

Peter Worsley writes in The Trumpet Shall Sound: A Study of “Cargo” Cults in Melanesia that “a very large part of world history could be subsumed under the rubric of religious heresies, enthusiastic creeds and utopias,” and this seems accurate. So much of the prurient focus on the John Frum religion is preoccupied with the material nature of the faith, and consequently it is judged as superficial. Easy for those of us who’ve never been hungry to look down on praying for food, easy for those with access to medicine to pretend that materialism is a vice. Mock praying for SPAM at your own peril; unlike salvation, I at least know what SPAM is. Among Christ’s miracles in the Gospel of John, after all, was the multiplication of loaves and fishes, a prime instance of generating cargo. John Frum, whether he is real or not, is a radical figure, for while a cursory glance at his cult might suggest that what he promises is capitalism, it’s actually the exact opposite. For those who pray to John Frum are not asking for work, or labor, or a Protestant work ethic, but rather deliverance from bondage; they are asking to be shepherded into a post-scarcity world. John Frum is not intended to deliver us to capitalism, but rather to deliver us from it. John Frum is not an American, but he is from that more perfect America that exists only in the Melanesian spirit.

10. On Easter of 1300, within the red Romanesque walls of the Cistercian monastery of Chiaravalle, a group of Umiliati Sisters were convened by Maifreda da Pirovano at the grave of the Milanese noblewoman Guglielma. Dead for two decades, the mysterious Guglielma was possibly the daughter of King Premysl Otakar I of Bohemia, having come to Lombardy with her son following her husband’s death. Wandering the Italian countryside as a beguine, Guglielma preached an idiosyncratic gospel, claiming that she was an incarnation of the Holy Spirit, and that her passing would inaugurate the third historical dispensation, destroying the patriarchal Roman Catholic Church in favor of a final covenant to be administered through women. If Christ was the new Adam come to overturn the Fall, then his bride was Guglielma, the new Eve, who rectified the inequities of the old order and whose saving grace came for humanity, but particularly for women.

Now a gathering of nuns convened at her humble shrine, in robes of ashen gray and scapulars of white. There Maifreda would perform a Mass, transubstantiating the wafer and wine into the body and blood of Christ. The Guglielmites would elect Maifreda the first Pope of their Church. Five hundred years after the supposed election of the apocryphal Pope Joan, and this obscure order of women praying to a Milanese aristocrat would confirm Maifreda as the Holy Mother of their faith. That same year the Inquisition would execute 30 Guglielmites—including la Papessa. “The Spirit blows where it will and you hear the sound of it,” preached Maifreda, “but you know not whence it comes or whither it goes.”

Maifreda was to Guglielma as Paul was to Christ: apostle, theologian, defender, founder. She was the great explicator of the “true God and true human in the female sex…Our Lady is the Holy Spirit.” Strongly influenced by a heretical strain of Franciscans shaped by the Calabrian mystic Joachim of Fiore, Maifreda held that covenantal history could be divided into three: the first era of Law and God the Father, the second of Grace and Christ the Son, and the third and coming age of Love and the Daughter known as the Holy Spirit. Barbara Newman writes in From Virile Woman to WomanChrist: Studies in Medieval Religion and Literature that after Guglielma’s “ascension the Holy Spirit would found a new Church, superseding the corrupt institution in Rome.” For Guglielma’s followers, drawn initially from the aristocrats of Milan but increasingly popular among more lowly women, these doctrines allowed for self-definition and resistance against both Church and society. Guglielma was a messiah and she arrived for women, and her prophet was Maifreda.

Writing of Guglielma, Newman says that “According to one witness… she had come in the form of a woman because if she had been male, she would have been killed like Christ, and the whole world would have perished.” In a manner she was killed, some 20 years after her death, when her bones were disinterred from Chiaravalle and placed on the pyre where Maifreda would be burnt alongside two of her followers. Pope Boniface VIII would not abide another claimant to the papal throne—especially a woman. But even while Maifreda would be immolated, the woman to whom she gave the full measure of sacred devotion would endure, albeit at the margins. Within a century Guglielma would be repurposed into St. Guglielma, a pious woman who suffered under the false accusation of heresy, and who was noted as particularly helpful in interceding against migraines. But her subversive import wasn’t entirely dampened over the generations. When the Cremonese Renaissance painter Bonifacio Bembo was commissioned to paint an altarpiece around 1445 in honor of the Council of Florence (an unsuccessful attempt at rapprochement between the Catholic and Orthodox Churches), he depicted God crowning Christ and the Holy Spirit. Christ appears as can be expected, but the Holy Spirit has Guglielma’s face.

Maifreda survived in her own hidden way as well, also through the helpful intercession of Bembo. The altarpiece that he crafted had been commissioned by members of the powerful Visconti family, leaders in Milan’s anti-papal Ghibelline party, and they also requested that the artist produce 15 decks of Tarot cards. The so-called Visconti-Sforza deck, the oldest surviving example of the form, doesn’t exist in any complete set, having been broken up and distributed to various museums. Several of these cards would enter the collection of the American banker J.P. Morgan, where they’d be stored in his Italianate Manhattan mansion. A visitor can see cards from the Visconti-Sforza deck that include the fool in his jester’s cap and mottled pants, the skeletal visage of death, and most mysterious of all, la Papessa—the female Pope. Bembo depicts a regal woman, in ash-gray robes and white scapular, the papal tiara upon her head. In the 1960s, the scholar Gertrude Moakley observed that the female pope’s distinctive dress indicates her order: she was one of the Umiliati. Maifreda herself was a first cousin of a Visconti, the family preserving the memory of the female pope in Tarot. On Madison Avenue you can see a messiah who for centuries was shuffled between any number of other figures, never sure of when she might be dealt again. Messiahs are like that, often hidden—and frequently resurrected.

Bonus Links: Ten Ways to Live Forever; Ten Ways to Change Your God; Ten Ways to Look at the Color Black


A Fraternity of Dreamers

“There is no syllable one can speak that is not filled with tenderness and terror, that is not, in one of those languages, the mighty name of a god.” —Jorge Luis Borges, “The Library of Babel” (1941)

“Witness Mr. Henry Bemis, a charter member in the fraternity of dreamers. A bookish little man whose passion is the printed page…He’ll have a world all to himself…without anyone.” —Rod Serling, “Time Enough at Last,” The Twilight Zone (1959)

When entering a huge library—whether its rows of books are organized under a triumphant dome, or they’re encased within some sort of vaguely Scandinavian structure that’s all glass and light, or they simply line dusty back corridors—I must confess that I’m often overwhelmed with a massive surge of anxiety. One must be clear about the nature of this fear—it’s not from some innate dislike of libraries, the opposite actually. The nature of my trepidation is very exact, though as far as I know there’s no English word for it (it seems like the sort of sentiment that the Germans might have an untranslatable word for). This fear concerns the manner in which the enormity of a library’s collection forces me to confront the sheer magnitude of all that I don’t know, all that I will never know, all that I can never know. When walking into the red-brick modernist hangar of the British Library, which houses all of those brittle books within a futuristic glass cube that looks like a robot’s heart, or the neo-classical Library of Congress with its green patina roof, or Pittsburgh’s large granite Carnegie Library main branch, smoked dark with decades of mill exhaust and kept guard by a bronze statue of William Shakespeare, my existential angst is the same. If I start to roughly estimate the number of books per row, the number of rows per room, the number of rooms per floor, that angst can become severe. The symptom can be present even in smaller libraries; I’ve felt it alike in the small-town library of Washington, Penn., on Lincoln Avenue and in the single room of the Southeast Library of Washington D.C. on Pennsylvania Avenue. Intrinsic to my fear are those intimations of mortality whereby even a comparatively small collection must make me confront the fact that in a limited and hopefully not-too-short life I will never be able to read even a substantial fraction of that which has been written. All those novels, poems, and plays; all those sentiments, thoughts, emotions, dreams, wishes, aspirations, desires, and connections—completely inaccessible because of the sheer fact of finitude.

Another clarification is in order—my fear isn’t the same as worrying that I’ll be found out for having never read any number of classical or canonical books (or those of the pop, paperback variety either). There’s a scene in David Lodge’s classic and delicious campus satire Changing Places: A Tale of Two Campuses in which a group of academics play a particularly cruel game, as academics are apt to do, one that asks participants to name a venerable book they’re expected to have read but have never opened. Higher point-values are awarded the more canonical a text is; what the neophytes don’t understand is that the trick is to mention something standard enough that they can still get the points for having not read it (like Laurence Sterne’s Tristram Shandy) but not so standard that they’ll look like an idiot for having never read it. One character—a recently hired English professor—is foolish enough to admit that he skipped Hamlet in high school. The other academics are stunned into silence. His character is later denied tenure. So, at the risk of making the same error, I’ll lay it out and admit to any number of books that the rest of you have probably read, but that I only have a glancing Wikipedia familiarity with: Marcel Proust’s Remembrance of Things Past, James Joyce’s Finnegans Wake, Don DeLillo’s White Noise, David Foster Wallace’s Infinite Jest. I’ve never read Harper Lee’s To Kill a Mockingbird, which is ridiculous and embarrassing, and I feel bad about it. I’ve also never read Jonathan Franzen’s The Corrections, though I don’t feel bad about that (however I’m sheepish that I’ve not read the vast bulk of J.K. Rowling’s Harry Potter books). Some of those previously mentioned books I want to read, others I don’t; concerning the latter category, some of those titles make me feel bad about my resistance to them, others I haven’t thought twice about (I’ll let you guess individual titles’ statuses).

I offer this contrition only as a means of demonstrating that my aforementioned fear goes beyond simple imposter syndrome. There are any number of reasons why we wish we’d read certain things, and why we feel attendant moroseness for not having done so—the social stigma of admitting such things, a feeling of not being educated enough or worldly enough, the simple fact that there might be stuff that we’d like to read, but inclination, willpower, or simply time has gotten in the way. The anxiety that libraries can sometimes give me is of a wholly more cosmic nature, for something ineffable affects my sense of self when I realize that the majority of human interaction, expression, and creativity shall forever be unavailable to me. Not only is it impossible for me to read the entirety of literature, it’s impossible to approach even a fraction of it—a fraction of a fraction of it. Several blocks from where I now write is the Library of Congress, the largest collection in the world, which according to its website contains 38 million books (that’s excluding other printed material from posters to pamphlets). If somebody read a book a day, which of course depends on the length of the book, it would take about 104,109 years and change to read everything within that venerable institution (ignoring the fact that about half a million to a million new titles are published every year in English alone, and that I was also too unconcerned to factor in leap years).

If you’re budgeting your time, may I suggest the British Library, which, though it has a much larger collection of other textual ephemera, has a more manageable 13,950,000 books, which would take you a breezy 38,219 years to get through. If you’re of a totalizing personality, then according to Google engineers in a 2010 study estimating the number of books ever written, you’ll have to wade through 129 million volumes of varying quality. That would take you 353,425 years to read. Of course this ignores all of that which has been written but not bound within a book—all of the jottings, the graffiti, the listings, the diaries, the text messages, the letters, and the aborted novels for which the authors have wisely or unwisely hit “Delete.” Were some hearty and voracious reader to consume one percent of one percent of all that’s ever been written—or even one percent of that one percent—they’d be the single most well-read individual to ever live. When we reach the sheer scale of how much human beings have expressed, have written, we enter the realm of metaphors that call for comparisons to grains of sand on the beach or stars in our galaxy. We depart the realm of literary criticism and enter that of cosmology. No wonder we require curated reading lists.
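If you’d like to check my arithmetic, the back-of-the-envelope calculation is simple enough to sketch in a few lines of Python (a minimal illustration, assuming only the steady pace of one book per day and the flat 365-day year used above; the collection counts are the estimates already cited):

```python
# Back-of-the-envelope: years needed to read a collection at one book per day.
# The collection counts are the published estimates cited in the text; the pace
# (and the flat 365-day year, leap days ignored) are the text's own assumptions.
BOOKS_PER_YEAR = 365

collections = {
    "Library of Congress": 38_000_000,
    "British Library": 13_950_000,
    "Every book ever written (2010 Google estimate)": 129_000_000,
}

for name, count in collections.items():
    print(f"{name}: ~{count / BOOKS_PER_YEAR:,.1f} years")

# Library of Congress: ~104,109.6 years
# British Library: ~38,219.2 years
# Every book ever written (2010 Google estimate): ~353,424.7 years
```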

For myself, there’s an unhealthy compulsion towards completism in the attendant tsuris over all that I’ll never be able to read. Perhaps there is something stereotypically masculine in the desire to conquer all of those unread worlds, something toxic in that need. After all, in those moments of readerly ennui there’s little desire for the experience, little need for quality, only that desire to cross titles off of some imagined list. Assume it were even possible to read all that has been thought and said, whether sweetness and light or bile and heft, and consider what purpose that accomplishment would even have. The endeavor would be vaguely nihilistic, reminding me of that old apocryphal story about the conqueror, recounted by everyone from the 16th-century theologian John Calvin to Hans Gruber in Die Hard, that “Alexander the Great…wept, as well indeed he might, because there were no more worlds to conquer,” as the version of that anecdote is written in Washington Irving’s 1835 collection Salmagundi: Or, The Whim-whams and Opinions of Launcelot Langstaff, Esq. and Others. Poor Alexander of Macedon, son of Philip, educated by Aristotle, and witness to bejeweled Indian war-elephants bathing themselves on the banks of the Indus and the lapis lazuli encrusted Hanging Gardens of Babylon, the gleaming white pyramid of Cheops and the massive gates of Persepolis. Alexander’s map of the world was dyed red with his complete possession—he’d conquered everything that there was to be conquered. And so, following the poisoning of his lover Hephaestion, he holed up in Nebuchadnezzar’s Babylonian palace, and he binged for days. Then he died. An irony though, for Alexander hadn’t conquered, or even been to, all the corners of the world. He’d never sat on black sand beaches in Hokkaido with the Ainu, he’d never drunk ox-blood with the Masai or hunted the giant moa with the Maori, nor had he been on a walkabout in the Dreamtime with the Anangu Pitjantjatjara or stood atop Ohio’s Great Serpent Mound or seen the grimacing stone heads of the Olmec. What myopia, what arrogance, what hubris—not to conquer the world, but to think that you had. Humility is warranted whether you’re before the World or the Library.

Alexander’s name is forever associated not just with martial ambitions, but with voluminous reading lists and never-ending syllabi as well, due to the library in the Egyptian city to which he gave his name, what historian Roy MacLeod describes in The Library of Alexandria: Centre of Learning in the Ancient World as “unprecedented in kingly purpose, certainly unique in scope and scale…destined to be far more ambitious [an] undertaking than a mere repository of scrolls.” The celebrated Library of Alexandria, the contents of which are famously lost to history, supposedly confiscated every book from each ship that came past the lighthouse of the city, had its scribes make a copy of the original, and then returned the counterfeit to the owners. This bit of bibliophilic chicanery was instrumental to the mission of the institution—the Library of Alexandria wasn’t just a repository of legal and religious documents, nor even a collection of foundational national literary works, but supposedly an assembly that in its totality would match all of the knowledge in the world, whether from Greece and Egypt, Persia and India. Matthew Battles writes in Library: An Unquiet History that Alexandria was “the first library with universal aspirations; with its community of scholars, it became a prototype of the university of the modern era.” Alexander’s library yearned for completism as much as its namesake had yearned to control all parts of the world; the academy signified a new, quixotic emotion—the desire to read, know, and understand everything. By virtue of the world’s being so much smaller at the time (at least as far as any of the librarians working there knew), such an aspiration was even theoretically possible.

“The library of Alexandria was comprehensive, embracing books of all sort from everywhere, and it was public, open to anyone with fitting scholarly or literary qualifications,” writes Lionel Casson in Libraries in the Ancient World. The structure overseen by the Ptolemaic Dynasty, descendants of Alexander’s general Ptolemy, was much more of a wonder of the ancient world than the Lighthouse in the city’s harbor. Within its walls, whose appearance is unclear to us, Aristophanes of Byzantium was the first critic to divide poetry into lines, 70 Jews convened by Ptolemy II translated the Hebrew Torah into the Greek Septuagint, and the geographer Eratosthenes correctly calculated the circumference of the Earth. Part of the allure of Alexandria, especially to any bibliophile in this fraternity of dreamers, is the fact that the vast bulk of what was kept there is entirely lost to history. Her card catalogue may have included lost classical works like Aristotle’s second book of the Poetics, on comedy (a plot point in Umberto Eco’s medieval noir The Name of the Rose), Protagoras’s essay “On the Gods,” the prophetic books of the Sibyllines, Cato the Elder’s seven-book history of Rome, the tragedies of the rhetorician Cicero, and even the comic mock-epic Margites supposedly written by Homer.

More than the specter of all that has been lost, Alexandria has become synonymous with the folly of anti-intellectualism, as its destruction (variously, and often erroneously, attributed to Romans, Christians, and Muslims) is a handy and dramatic narrative to illustrate the eclipse of antiquity. Let’s keep some perspective though—let’s crunch some numbers again. According to Robin Lane Fox in The Classical World: An Epic History from Homer to Hadrian, the “biggest library…was said to have grown to nearly 500,000 volumes.” Certainly not a collection to scoff at, but Alexander’s library, which drew from the furthest occident to the farthest orient, had only a sixth as many books as the Library of Congress’s Asian Collection; Harvard University’s Widener Library has 15 times as many books (and that’s not including the entire system); the National Library of Iran, housed not far from where Alexander himself died, has 30 times the volumes of that ancient collection. The number of books held by the Library of Alexandria would have been perfectly respectable in the collection of a small midwestern liberal arts college. By contrast, according to physicist Barak Shoshany on a Quora question, if the 5 zettabytes of the Internet were to be printed, then the resultant stack of books would have to fit on a shelf “4×10¹¹ km or about 0.04 light years thick,” the last volume floating somewhere near the Oort Cloud. Substantially larger shelves would be needed, it goes without saying, than whatever was kept in the storerooms of Alexandria with that cool Mediterranean breeze curling the edges of those papyri.
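Shoshany’s figure, for what it’s worth, survives a quick sanity check (a minimal sketch, assuming nothing beyond the standard conversion of roughly 9.46 trillion kilometers to the light year):

```python
# Sanity check: is a shelf 4x10^11 km long really about 0.04 light years?
KM_PER_LIGHT_YEAR = 9.461e12  # standard astronomical conversion

shelf_km = 4e11  # the shelf length quoted above
print(f"{shelf_km / KM_PER_LIGHT_YEAR:.3f} light years")  # prints: 0.042 light years
```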

To read all of those scrolls, codices, and papyri at Alexandria would take our intrepid ideal reader a measly 1,370 years to get through. More conservative historians estimate that the Library of Alexandria may have housed only 40,000 books—if that is the case, then it would take you a little more than a century to read (if you’re still breezing through a book a day). That’s theoretically the lifetime of someone gifted with just a bit of longevity. All of this numeric stuff misses the point, though. It’s all just baseball card collecting, because what the Library of Alexandria represented—accurately or not—was the dream that it might actually be possible to know everything worth knowing. But since the emergence of modernity some half-millennium ago, and the subsequent fracturing of disciplines into ever more finely tuned fields of study, it’s been demonstrated just how much of a fantasy that goal is. There’s a certain disposition that’s the intellectual equivalent of Alexander, and our culture has long celebrated that personality type—the Renaissance Man (and it always seems gendered thus). Just as there was always another land to be conquered over the next mountain range, pushing through the Hindu Kush and the Himalayan foothills, so too does the Renaissance Man have some new type of knowledge to master, say geophysics or the conjugation of Akkadian verbs. Nobody except for Internet cranks or precocious and delusional autodidacts actually believes in complete mastery of all fields of knowledge anymore; by contrast, for all that’s negative about graduate education, one clear and unironic benefit is that it taught me the immensity and totality of all of the things that I don’t know.

Alexandria’s destruction speaks to an unmistakable romance about that which we’ll never be able to read, but it also practically says something about a subset of universal completism—our ability to read everything that has survived from a given historical period. By definition it’s impossible to actually read all of classical literature, since the bulk of it is no longer available, but to read all of the Greek and Roman writing which survives—that is doable. It’s been estimated that less than one percent of classical literature has survived to the modern day, with Western cultural history sometimes reading as a story of luck and monks alike preserving that inheritance. It would certainly be possible for any literate individual to read all of Aristophanes’s plays, all of Plato’s dialogues, all of Juvenal’s satires. Harvard University Press’s venerable Loeb Classical Library, preserving Greek and Latin literature in its distinctive minimalist green and red covered volumes, currently has 530 titles available for purchase. Though it doesn’t encompass all that survives from the classical period, it comes close. An intrepid and dogged reader would be able to get through them, realistically, in a few years (comprehension is another matter).

If you need to budget your time, all of the Anglo-Saxon writing that survives, that which didn’t get sewn into the back-binding of some inherited English psalm book or end up as kindling in the 16th century when Henry VIII dissolved the monasteries, is contained across some four major poetic manuscripts, though around 100 more general manuscripts endure. The time period that I’m a specialist in, which now goes by the inelegant name of the “early modern period” but which everybody else calls the “Renaissance,” is arguably the first period for which no scholar would be capable of reading every primary source that endures. The beneficiary of relative proximity to our own time, and of a preponderance of titles gestated through the printing press, the era produced more than anyone could ever read. For every William Shakespeare play, there are hundreds of yellowing political pamphlets about groups with names like “Muggletonians;” for every John Milton poem, a multitude of religious sermons on subjects like double predestination. You have to be judicious in what you choose to read, since one day you’ll be dead. This reality should be instrumental in any culture wars détente—canons exist as a function of pragmatism.

The canon thus functions as a kind of shortcut to completism (if you want to read through all of the Penguin Classics editions, with their iconic black covers and little avian symbol, that’s a meager 1,800 titles to get through). Alexandria’s delusion of gathering all of that which has been written, and perhaps educating oneself from that corpus, drips down through the history of Western civilization. We’ve had no shortage of Renaissance Men who, even if they hadn’t read every book ever written, perhaps at least roughly knew where all of them could be found in the card catalogue. Aristotle was a polymath who not only knew the location of those works but credibly wrote many of them (albeit all that remains are student lecture notes), arguably the founder of fields as diverse as literary criticism and dentistry. In the Renaissance, when one would assume the attendant Renaissance Man was most celebrated, there was a preponderance of figures who were claimed to have mastered all disciplines that could be mastered (and to be familiar with the attendant literature review). Leonardo da Vinci, Blaise Pascal, Athanasius Kircher, Isaac Newton, and Gottfried Wilhelm Leibniz have all been configured as Renaissance Men, their writings respectively encompassing not just art, mathematics, theology, physics, and philosophy, but also aeronautics, gambling, sinology, occultism, and diplomacy as well.

Stateside both Benjamin Franklin and Thomas Jefferson (a printer and a book collector) are classified as such, and more recently figures as varied as Nikola Tesla and Noam Chomsky are sometimes understood as transdisciplinary polymaths (albeit ones for whom it would be impossible to have read all that can be read, even if it appears as such). Hard to disentangle the canonization of such figures from the social impulse to be “well read,” but in the more intangible and metaphysical sense, beyond wanting to seem smart because you want to seem smart, the icon of the Renaissance Man can’t help but appeal to that completism, that desire for immortality that is prodded by the anxiety that libraries inculcate in me. My patron saint of polymaths is the 17th-century German Jesuit Kircher, beloved by fabulists from Eco to Jorge Luis Borges, for his writings that encompassed everything from mathematics to hieroglyphic translation. Paula Findlen writes in the introduction to her anthology Athanasius Kircher: The Last Man Who Knew Everything that he was the “greatest polymath of an encyclopedic age,” yet when his rival Renaissance Man Leibniz first dipped into the voluminous mass of Kirchermania he remarked that the priest “understands nothing.”

Really, though, that’s true of all people. I’ve got a PhD and I can’t tell you how lightbulbs work (unless they fit into Puritan theology somehow). Kircher’s translations of Egyptian were almost entirely incorrect. As was much of what else he wrote on, from mineralogy to Chinese history. He may have had an all-encompassing understanding of human knowledge for his time period, but Kircher wasn’t right very often. That’s alright; the same criticism could be leveled at his interlocutor Leibniz. Same as it ever was, and applicable to all of us. We’re not so innocent anymore; the death of the Renaissance Man is like the death of God (or of a god). The sheer amount of that which is written, the sheer number of disciplines that exist to explain every facet of existence, should disabuse us of the idea that there’s any way to be well-educated beyond the most perfunctory meaning of that phrase. In that gulf between our desire to know and the poverty of our actual understanding are any number of mythic figures who somehow close that gap, troubled figures from Icarus to Dr. Faustus who demonstrate the hubris of wishing to read every book, to understand every idea. A term should exist for the anxiety that those examples embody, the quaking fear before the enormity of all that we don’t know. Perhaps the readerly dilemma, or even textual anxiety.

A full accounting of the nature of this emotion compels me to admit that it goes beyond simply fearing that I’ll never be able to read all of the books in existence, or in some ideal library, or if I’m being honest even in my own library. The will towards completism alone is not the only attribute of textual anxiety, for a not dissimilar queasiness can accompany activities related to (though less grandiose than) the desire to read all books that were ever written. To wit—sometimes I’ll look at the ever-expanding pile of books that I’m to read, including volumes that I must read (for reviews or articles) and those that I want to read, and I’m paralyzed by that ever-growing paper cairn. Such debilitation isn’t helpful; the procrastinator’s curse is that it’s a personality defect that’s the equivalent of emotional quicksand. Against this foolish inclination towards completism—desiring everything and thus acquiring nothing—I sometimes use Francis Bacon’s claim from his essays of 1625 that “Some books should be tasted, some devoured, but only a few should be chewed and digested thoroughly” as a type of mantra against textual anxiety, and it mostly works. Perhaps I should learn to cross-stitch it as a prayer to display amongst my books, which, even if I haven’t read all of them, have at least been opened (mostly).

But textual anxiety manifests itself in a far weirder way, one that I think gets to the core of what makes the emotion so disquieting. When I’m reading some book that I happen to be enjoying, some random novel picked up from the library or purchased at an airport to pass the time, but not the works that I perennially turn toward—Walt Whitman and John Milton, John Donne and Emily Dickinson—I’m sometimes struck with a profound melancholy born from the fact that I shall never read these sentences again. Like meeting a friendly stranger who somehow indelibly marks your life by the tangible reality of their being, but who will return to anonymity. Then it occurs to me that even those things I do read again and again, Leaves of Grass and Paradise Lost, I will one day also read for the last time. What such textual anxiety trades in, like all things of humanity, is my fear of finality, of extinction, of death. That’s at the center of this, isn’t it? The simultaneous fear of there being no more worlds to conquer and the fear that the world never can be conquered. Such consummation, the obsession with having it all, evidences a rather immature disposition.

It’s that Alexandrian imperative, but if there is somebody wiser and better to emulate it’s the old cynic Diogenes of Sinope, the philosophical vagabond who spent his days living in an Athenian pot. Laertius reports in Lives of the Eminent Philosophers that “When [Diogenes] was sunning himself…Alexander came and stood over him and said: ‘Ask me for anything you want.’ To which he replied, ‘Stand out of my light.’” And so, the man with everything was devoid of things to give to the man with nothing. There is something indicative in that exchange when it comes to this fear that there are things you’ll never be able to read, things you’ll never be able to know. The point, Diogenes seems to be saying, is to enjoy the goddamn light. Everything in that. I recall once admitting my fear about all that I don’t know, all of those books that I’ll never open, to a far wiser former professor of mine. This was long before I understood that an education is knowing what you don’t know, understanding that there are things you will never know, and worshiping at the altar of your own sublime ignorance. When I explained this anxiety of all of these rows and rows of books to never be opened, she was confused. She said, “Don’t you see? That just means that you’ll never run out of things to read.” Real joy, it would seem, comes in the agency to choose, for if you were able to somehow give your attention equally to everything then you’d suffer the omniscient imprisonment that only God is cursed with. The rest of us are blessed with the endless, regenerative, confusing, glorious, hidden multiplicity of experience in discrete portions. Laertius writes, “Alexander is reported to have said, ‘Had I not been Alexander, I should have liked to be Diogenes.’”

Image Credit: Wikipedia.

Stories in Formaldehyde: The Strange Pleasures of Taxonomizing Plot

Somewhere within the storerooms of London’s staid, gray-faced Tate Gallery (for it’s not currently on exhibit) is an 1834 painting by J.M.W. Turner entitled “The Golden Bough.” Rendered in that painter’s characteristic sfumato of smeared light and smoky color, Turner’s composition depicts a scene from Virgil’s epic Aeneid wherein the hero is commanded by that centuries-old prophetic crone, the Sibyl of Cumae, to make an offering of a golden bough from a sacred tree growing upon the shores of crystalline blue Lake Avernus to the goddess Proserpina, if he wishes to descend to Hades and see the shadow of his departed father. “Obscure they went through dreary shades, that led/Along the waste dominions of the dead,” translated John Dryden in 1697, using his favored totemistic Augustan rhyming couplets, as Aeneas descends further into the Underworld, its entrance a few miles west of Naples. As imagined by Turner, the area around the volcanic lake is pleasant, if sinister; bucolic, if eerie; pastoral, if unsettling. A dapple of light marks the portal whereby pilgrims journey into perdition; in the distance tall, slender trees topped with a cap of branches jut up throughout the landscape. A columned temple is nestled within the scrubby hills overlooking the field. The Sibyl stands with a scythe so that the vegetable sacrifice can be harvested, postlapsarian snakes slither throughout, and the Fates revel in mummery near hell’s doorway. Rather than severe tones of blood red and sulfurous black, earthy red and cadaverous green, Turner opted to depict Avernus in soft blues and greys, and the result is all the more disquieting. Here, the viewer might think, is what the passage between life and death must look like—muted, temperate, serene, barely even noticeable from one transition to the next.

As with the best of Turner’s paintings, with his eye for color the visual equivalent of perfect pitch, it is the texture of hues that renders, if not some didactic message about his subject, a general emotional sense, a sentiment hard to describe, registering at a pitch that can barely be heard and yet alters one’s feelings in the moment. Such was the sense conveyed by the Scottish folklorist James George Frazer, who borrowed the artist’s title for his landmark 1890 study The Golden Bough: A Study in Comparative Religion, describing on his first page how the painting is “suffused with the golden glow of imagination in which the divine mind of Turner steeped and transfigured even the fairest natural landscape.” This scene, Frazer enthuses, “is a dream-like vision of the little woodland…[where] Dian herself might still linger by this lonely shore, still haunt these woodlands wild.” An influential remnant of a supremely Victorian enthusiasm for giving quasi-scientific gloss to the categorization of mythology, Frazer’s study provided a taxonomy of classical myth so as to find certain similarities, the better to construct a grand, unified theory of ancient religion (or what Edward Casaubon in George Eliot’s Middlemarch, written two decades before, might call The Key to All Mythologies). First viewing Turner’s canvas, the rationalist Frazer was moved by the painting’s mysteriousness, the way in which the pool-blue sky and the shining hellmouth trade in nothing as literal as mere symbolism, but wherein the textured physicality—the roughness of the hill and the ominous haze of the clouds, dusk’s implied screaming cicadas and the cool of the evening—conveys an ineffable feeling. Despite pretensions to a more logical analysis, Frazer intimates the numinous (for, how couldn’t he?). “Who does not know Turner’s picture of the Golden Bough?” he writes.

His argument in The Golden Bough was that religions originated as primitive fertility cults, dedicated to the idea of sacrifice and resurrection, and that from this fundamentally magical worldview would evolve more sophisticated religions, to finally be supplanted by secular science. The other argument of The Golden Bough is implicit in the book’s very existence—that structure can be ascertained within the messy morass of disparate myths. To make this argument he drew from sources as diverse as Virgil and the Nootka people of British Columbia, classifying, categorizing, and organizing data as surely as a biologist preserving specimens in a jar of formaldehyde. And like Charles Darwin measuring finch beaks, or Thomas Huxley pinning butterflies to wood blocks, Frazer believed that diversity was a mask for similarity.

As reductionist as his arguments are, and as disputed as his conclusions may be, Frazer’s influence was outsize among anthropologists, folklorists, writers, and especially literary critics, who thrilled to the idea that some sort of unity could be found in the chaotic variety of narratives that constitute world mythology. “I am a plain practical man,” Frazer writes, “not one of your theorists and splitters of hairs and choppers of logic,” and while it’s true that The Golden Bough evidences a more imaginative disposition, it still takes part in that old quixotic desire to find some Grand Unified Theory of Narrative. While Frazer’s beat was myth, he was still a reporter of stories, and percolating like a counter-rhythm within discussions of narrative is that old desire, the yearning to find the exact number of plots that it is possible to tell. Frazer, for all that was innovative about his thought, was neither the first nor the last to treat stories like animals in a genus, narratives as if creatures in a phylum.

That grand tradition claims there are only 36 stories that can be told, or seven, or four. Maybe there is really only one tale, the story of wanting something and not getting it, which is after all the contour of this story itself—the strange endurance of the sentiment that all narrative can be easily classified into a circumscribed, finite, and relatively small number of possibilities. While I’ve got my skepticism about such an endeavor—seeing those suggested systems as erasing the particularity of stories, of occluding what makes them unique by mutilating them to fit some Procrustean bed—I’d be remiss not to confess that I also find these theories immensely pleasing. There is something to be said for the cool rectilinear logic that claims any story, from Middlemarch to Fifty Shades of Grey, Citizen Kane to Gremlins 2, can be stripped down to its raw schematics and analyzed as one of a handful of fundamental, universal, eternal plots that existed before Gilgamesh’s cuneiform was wedged into wet clay.

Christopher Booker claims in The Seven Basic Plots: Why We Tell Stories that “wherever men and women have told stories, all over the world, the stories emerging to their imaginations have tended to take shape in remarkably similar ways,” differences in culture, language, or faith be damned. With some shading, Booker uses the archetypal psychoanalysis of Carl Jung to claim that every single narrative, whether in epic or novel, film or comic, can be slotted into 1) overcoming the monster (Beowulf, George Lucas’s Star Wars), 2) rags to riches (Charlotte Brontë’s Jane Eyre, Horatio Alger stories), 3) the quest (Homer’s The Odyssey, Steven Spielberg’s Raiders of the Lost Ark), 4) voyage and return (The Ramayana, J.R.R. Tolkien’s The Hobbit), 5) comedy (William Shakespeare’s Twelfth Night, the Coen Brothers’ The Big Lebowski), 6) tragedy (Leo Tolstoy’s Anna Karenina, Arthur Penn’s Bonnie and Clyde) or 7) rebirth (Charles Dickens’s A Christmas Carol, Harold Ramis’s Groundhog Day).

That all of these parenthetically referenced works are, of course, astoundingly different from each other in character, setting, and most of all language is irrelevant to Booker’s theory. While allowing for more subtlety than my potted overview suggests, Booker still concludes that “there are indeed a small number of plots which are so fundamental to the way we tell stories that it is virtually impossible for any storyteller ever entirely to break away from them.” Such a claim is necessary to Booker’s contention that these narratives are deeply nestled in our collective unconscious, a repository of themes, symbols, and archetypes that are “our basic genetic inheritance,” which he then proffers as an explanation for why humans tell stories at all.

The Seven Basic Plots, published in 2004 after 34 years of labor, is the sort of critical work that doesn’t appear much anymore. Audacious to the point of impudence, ambitious to the level of crack-pottery, Booker’s theory seems more at home in a seminar held by Frazer than in contemporary English departments more apt to discuss gender, race, and class in Jane Austen’s Pride and Prejudice than the Orphic themes of rebirth as manifested in that same novel. Being the sort of writer who both denied anthropogenic climate change and defended asbestos (for real), Booker had the conservative’s permanent sense of paranoid aggrievement concerning the treatment of his perspectives. So, let me be clear—contra Booker’s own sentiments, I don’t think that the theories in The Seven Basic Plots are ignored by literary critics because of some sort of politically correct conspiracy of silence; I think that they’re ignored because they’re not actually terribly correct or useful. When figuring out the genealogical lineage of several different species of Galapagos Island finches, similarity becomes a coherent arbiter; however, difference is more important when thinking through what makes exemplary literature exemplary. Genre, and by proxy plot, is frequently more an issue of marketing than anything else. That’s not to say that questions of genre have no place in literary criticism, but they are normally the least interesting (“What makes this gothic novel gothic?”). No stranger to such thinking himself, Kurt Vonnegut may have solved the enigma with the most basic of monomyths—“man falls into hole, man gets out of hole.”

Booker isn’t after marketing, however; he’s after the key to all mythologies. Like Frazer before him, Booker is not the first critic enraptured by the idea of a Periodic Table of Plots, capable of explaining both Fyodor Dostoevsky’s Crime and Punishment and Weekend at Bernie’s, and he won’t be the last. If you wish to blame somebody for this line of thinking, as with most disciplines of human endeavor from ethics to dentistry, look to Aristotle as the culprit. The philosopher’s “four conflicts,” man against himself, man against man, man against nature, and man against the gods, have long been a convenient means of categorizing plots. The allure of there being a limited number of plots is that it makes both reading and writing theoretically easier. The denizens of high-culture literary criticism have embraced the concept periodically, as surely as those producing paperbacks promising that a hit book can be easily plotted out from a limited tool kit. Georges Polti, of Providence, Rhode Island, and later Paris, France, wrote The Thirty-Six Dramatic Situations in 1895, claiming that all stories could be categorized in that number of scenarios, including plots of “Crime pursued by vengeance” and “Murderous adultery.” “Thirty-six situations only!” Polti enthuses. “There is to me, something tantalizing about the assertion.” Polti’s book has long been popular as a sort of lo-fi randomizer for generating stories, and its legacy lives on in works like Ronald B. Tobias’s 20 Master Plots and How to Build Them and Victoria Lynn Schmidt’s A Writer’s Guide to Characterization: Archetypes, Heroic Journeys, and Other Elements of Dynamic Character Development.

There is also a less pulpy, tonier history surrounding the thinking that everything can be boiled down to a handful of elemental plots. My attitude concerning such thinking was a bit glib earlier, as there is something to be said for its utility, and indeed entire academic disciplines have grown from that assumption. Folklorists use a classification system called the “Aarne-Thompson-Uther Index,” where a multitude of plot-types are given numbers (“Cinderella” is 510A, for example), which can be useful to trace the ways in which stories have evolved and altered over both distance and time. Unlike Polti’s 36 plots, Tobias’s 20, or Booker’s seven, Stith Thompson’s Motif-Index of Folk-Literature runs to six volumes of folk tales, fairy tales, legends, and myths, but the basic idea is the same: plots exist in a finite number (including “Transformation: man to animal” and “Magic strength resides in hair”). As with the system of classification invented by Francis James Child in The English and Scottish Popular Ballads, or the Roud Folk Song Index, the Aarne-Thompson-Uther Index is more than just a bit of shell collecting; it’s a system of categorization that helps folklorists make sense of the diversity of oral literature, with scholar Alan Dundes enthusing that the system was among the “most valuable tools in the professional folklorist’s arsenal of aids for analysis.” Morphological approaches define the discipline known as “narrative theory,” which draws from a similar theoretical inclination as that of the ATU Index. All of these methodologies share a commitment to understanding literature less through issues of grammar, syntax, and diction, and more in terms of plot and story. For those who read with an eye towards narrative, there is frequently an inclination, sentiment, or hunch that all stories and novels, films and television shows, epics and lyrics, comics and plays, can have their fat, gristle, and tallow boiled away to leave just the broth and a plot that’s as clean as a bone.

Such a faith was popular among the Russian Formalists, sometimes incongruously known as the Prague School (after where many of them, as Soviet exiles, happened to settle), including Roman Jakobson, Viktor Shklovsky, and Vladimir Propp, the last of whom wrote Morphology of the Folktale, reducing those stories to a narrative abstraction that literally looks like mathematics. A similar movement was that of French structuralism, as exemplified by its founder the linguist Ferdinand de Saussure, and as later practiced by the anthropologist Claude Lévi-Strauss and the literary critic Roland Barthes. In the Anglophone world, with the exception of some departments enraptured by narratology, literary criticism has often focused on the evisceration of a text with the scalpel of close reading rather than the measurement of plot with the calipers of taxonomy. Arguably that’s led to the American critical predilection towards “literary” fiction over genre fiction, the rejection of science fiction, fantasy, horror, and romance as being unserious in favor of all of those beautifully crafted stories in The New Yorker where the climax is the main character looking out the window, sighing, and taking a sip of coffee, while realizing that she was never happy, not really.

There are exceptions to the critical valorization of language over plot, however, none more so than the once mighty but now passé writings of Canadian theorist Northrop Frye. Few scholars in the English-speaking world were more responsible for that once enthusiastic embrace of taxonomic criticism than this United Church of Canada minister and professor at Toronto’s Victoria College. Frye was enraptured by the psychoanalyst Carl Jung’s theories of how fundamental archetypes structure our collective unconscious, and he believed that a similar approach could be applied to narrative, that a limited number of plots structured our way of thinking about and approaching stories. In works like Fearful Symmetry on William Blake, and his all-encompassing Anatomy of Criticism, Frye elucidated a complex, baroque, and elegant system of categorizing stories, the better to interpret them properly. “What if criticism is a science as well as an art?” Frye asked, wishing to approach literature like a taxonomist, as if novels were a multitude of plants and animals just awaiting Linnaean classification. For those who read individual poems or novels as exemplary texts, explaining what makes them work, Frye would say that they’re missing the totality of what literature is. “Criticism seems to be badly in need of a coordinating principle,” he writes, “a central hypothesis which, like the theory of evolution in biology, will see the phenomena it deals with as part of a whole.”

Frye argued that this was to be accomplished by identifying that which is universal in narrative, where works could be rendered of their unique flesh down to their skeletons, which we would then find to be myths and archetypes. From this anodyne observation, Frye spun out a complex classification system for all Western literature, one where he identifies the exact archetypes that define poetry and prose, where he flings about terms like “centripetal” and “centrifugal” to interpret individual texts, and where phrases like the “kerygmatic mode” are casually used. Anatomy of Criticism is true to its title; Frye carves up the cadaver of literature and arrives at an admittedly intoxicating theory of everything. “Physics is an organized body of knowledge about nature, and a student of it says that he is learning physics, not nature,” Frye writes. “Art, like nature, has to be distinguished from the systematic study of it, which is criticism.” In Frye’s physics, there are five “modes” of literature: the mythic, romantic, high mimetic, low mimetic, and ironic; these are then cross-listed with tragic, comic, and thematic forms; what are then derived are genres with names like the dionysian, the elegiac, the aristophanic, and so on. Later in the book he supplies a complex theory of symbolism, a methodology concerning imagery based on the Platonic Great Chain of Being, and a thorough taxonomy of genre. In what’s always struck me as one of the odder (if ingenious) parts of Anatomy of Criticism, Frye ties genres specifically to certain seasons, so that comedy is a spring form, romance belongs to the summer, autumn is a time of tragedy, and winter births irony. How one reads books in those tropical places where seasons neatly divide between rainy and dry speaks to a particular chauvinism on the Canadian’s part.

For most viewers of public television, however, their introduction to the “There-are-only-so-many-stories” conceit wasn’t Frye, but rather a Sarah Lawrence College professor who was the titular subject of journalist Bill Moyers’s 1988 PBS documentary Joseph Campbell and the Power of Myth. Drawing largely from his 1949 study The Hero with a Thousand Faces, Campbell became the unlikely star of the series that promulgated his theory of the “monomyth,” the idea that a single story threads through world mythology, often focused on what he termed “the hero’s journey.” Viewers were drawn to Campbell’s airy insights about the relationship between Akkadian mythology and Star Wars (a film which George Lucas admitted was heavily influenced by the folklorist’s ideas), and his vaguely countercultural pronouncement that one should “Follow your bliss!,” despite his own right-wing politics (which according to some critics could run the gamut from polite Reaganism to fascist sympathizing). Both Frye and Campbell exhibited a wide learning, but arguably only the former’s was particularly deep. With an aura of crunchy tweediness, Campbell seemed like the sort of professor who would talk to students about the Rubaiyat of Omar Khayyam in an office that smelled of patchouli, with a threadbare oriental rug on the dusty floor, knick-knacks assembled while studying in India and Japan, and a collapsing bookshelf jammed with underlined paperback copies of Friedrich Nietzsche and Arthur Schopenhauer above his desk. Campbell, in short, looked like what we expect a liberal arts teacher to look like, and for some of his critics (like Dundes, who called him a “non-expert” and an “amateur”) that gave him an unearned authority.

But what an authority he constructed, the hero with only one theory to explain everything! Drawing from Jung, Frazer, and all the rest of the usual suspects, Campbell argued in his most famous book that broad archetypes structure all narrative, wherein a “hero ventures forth from the world of common day into a region of supernatural wonder: fabulous forces are there encountered and a decisive victory is won: the hero comes back from this mysterious adventure with the power to bestow boons on his fellow man.” Whether Luke Skywalker venturing out from Tatooine or Gilgamesh leaving Uruk, the song remains the same, Campbell says. Gathering material from the ancient Near East and Bronze Age Ireland, the India of the Mahabharata and Hollywood screenplays, Campbell claimed that his monomyth was the skeleton key to all narrative, a story whose parsing could furthermore lead to understanding, wisdom, and self-fulfillment among those who are hip to its intricacies. The Hero with a Thousand Faces naturally flattered the pretensions of some artists and writers, what with its implication that they were conduits connected directly to the collective unconscious. Much as with Freud and the legions of literary critics who applied his theories to novels and film, if Campbell works well in interpreting lots of movies, it’s because those directors (from Lucas to Stanley Kubrick) happened to be reading him. The monomyth can begin to feel like the critical equivalent of the intelligent design advocate who knows God exists, because why else would we have been given noses on which to so conveniently rest our glasses?

Campbell’s politics, and indeed those of his theory, are ambivalent. His comparative approach superficially seems like the pluralistic, multicultural, ecumenical perspective of the Sarah Lawrence professor that he was, but at the same time the flattening of all stories into this one monomyth does profound violence to the particularity of myths innumerable. There is a direct line between Campbell and the mythos-laden mantras of poet Robert Bly and his Iron John: A Book About Men, the tome that launched a thousand drum circles of suburban dads trying to engage their naturalistic masculinity in vaguely homoerotic forest rituals, or of Canadian psychotherapist/alt-right apologist Jordan Peterson, who functions as basically a Dollar Store version of the earlier folklorist. Because myths are so seemingly elemental, mysterious telegrams from the ancient past whose logic seems imprinted into our unconscious, it’s hard not to see the attraction of a Campbell. And yet whenever someone starts talking about “mythos” it inevitably can start to feel like you’re potentially in the presence of a weirdo who practices “rune magik,” unironically wonders if they’re an ubermensch, and has an uncomfortably racist Google search history. We think of the myth as the purview of the hippie, but it’s just as often the provenance of the jackbooted authoritarian, for Campbell’s writings fit comfortably with a particularly reactionary view of life, which should fit uncomfortably with the rest of us. “Marx teaches us to blame society for our frailties, Freud teaches us to blame our parents,” Campbell wrote in the posthumously published Pathways to Bliss, but the “only place to look for blame is within: you didn’t have the guts to bring up your full moon and live the life that was your potential.” Yeah, that’s exactly it. People can’t afford healthcare or get a job because they didn’t bring up their full moon…

The problem is that if you take Campbell too seriously then everything begins to look like it was written by Campbell. To wit, the monomyth is supposed to go through successive stages, from the hero’s origin in an ordinary world where he receives a “call to adventure,” to being assisted by a mentor who leads him through a “guarded threshold” where he is tested on a “road of trials,” to finally facing his ultimate ordeal. After achieving success, the hero returns to the ordinary world wiser and better, improving the lives of others through the rewards that have been bestowed upon him. The itinerary is more complex than this in The Hero with a Thousand Faces, but this should be enough to convey that Campbell’s schema is general enough that it can be applied to anything, yet particular enough that it gives the illusion of rigor. Think of Jesus Christ, called to be the messiah and assisted by John the Baptist, tempted by Satan in the desert, and after coming into Jerusalem facing torture at the hands of the Romans, before his crucifixion and harrowing of hell, only to be resurrected with the promise of universal human salvation. Now, think of Jeff Lebowski, called to be the Dude and assisted by Walter Sobchak, tempted by Jackie Treehorn, battling the nihilists, only to return in time for the bowling finals. Other than speaking deep into the souls of millions of people, it should be uncontroversial to say that the gospels and the Coen Brothers’ The Big Lebowski are only the same story in the most glaring of superficial ways, and yet the quasi-conspiratorial theory of the monomyth promises secret knowledge saying that they are.

But here’s the thing—stories aren’t hydrogen, plots aren’t oxygen, narratives aren’t carbon. You can’t reduce the infinity of human experience into a Periodic Table, except in the most perfunctory of ways. To pretend that the tools of classification are the same as the insights of interpretation is to grind the Himalayas into Iowa; it’s to cut so much from the bone that the only meal you’re left with is that of a skeleton. When all things are reduced to monomyth, the enthusiast can’t recognize the exemplary, the unique, the individual, the subjective, the idiosyncratic, because some individual plot doesn’t have a magical wizard shepherding the hero to the underworld, or whatever. It’s to deny the possibility of some new story, of some innovation in narrative; it’s to spurn the Holy Grail of uniqueness. Still, some sympathy must be offered as to why these models appeal to us, of how archetypal literary criticism appeals to our inner stamp collectors. With apologies to Voltaire, if narrative didn’t exist it would be necessary to invent it—and everything else too. The reasons why archetypal criticism is so appealing are legion—such systems impose a unity on chaos, provide a useful measure of how narratives work, and give the initiate the sense that they have knowledge applicable to everything from The Odyssey to Transformers.

But a type of critical madness lies in the idolatry of confusing methodological models for the particularity of actual stories. Booker writes of stories that are “Rags to Riches,” but that reductionism is an anemic replacement for inhabiting Pip’s mind when he pines for Estella in Charles Dickens’s Great Expectations; he classifies Bram Stoker’s Dracula as being about “Overcoming the Monster,” but that simplification is at the expense of that purple masterpiece’s paranoia, its horror, its hunger, its sexiness. There are no stories except in the details. To forget that narratives are infinite is a slur against them; it’s the blasphemy of pretending that every person is the same as every other. For in a warped way, there is but one monomyth, but it’s not what the stamp collectors say it is. In all of their variety, diversity, and multiplicity, every tale is a creation myth because every tale is created. From the raw material of life is generated something new, and in that regard we’re not all living variations of the same story, we’re all living within the same story.

Bonus Links:
—The Purpose of Plot: An Argument with Myself
—The Million Basic Plots
—On Not Going Out of the House: Thoughts About Plotlessness

Image Credit: Wikimedia Commons.

Letter from Wartime

“Questo è il fiore del partigiano,/o bella ciao, bella ciao, bella ciao ciao ciao,/questo è il fiore del partigiano/morto per la libertà.” [“This is the flower of the partisan,/oh bella ciao, bella ciao, bella ciao ciao ciao,/this is the flower of the partisan/who died for freedom.”] —Italian Partisan Song, “Bella Ciao”
“Heard about Houston? Heard about Detroit?/Heard about Pittsburgh, PA?/You oughta know not to stand by the window/Somebody see you up there.” —Talking Heads, “Life During Wartime”
In the hours before Hurricane Sandy slammed into the northeastern United States, my apartment in Bethlehem (Pennsylvania), which was 100 miles and a few hours from the Atlantic, was permeated by the unmistakable smell of the shore. Stolid son of the Alleghenies that I am, I’d never experienced the full onslaught of a hurricane before. This almost miasmic odor I associated with vacation—a fragrance inextricably connected to the Jersey boardwalk and Massachusetts beaches, of salt-water taffy and lobster rolls—suddenly suffusing my living room, whose window looked out on a hulking, rusting former steel mill, felt borderline apocalyptic. As is the nature of things apocalyptic, it’s the incongruity that is alarming, as it was for some frightened 17th-century peasant reading a pamphlet foretelling doom because of the appearance of a mysterious comet in the heavens or the birth of a two-headed calf. The unexpected, the unusual, the unforeseen act as harbingers.
A landlocked home smelling like the beach is perhaps not as dramatic as those earlier examples, of course, and yet as with a sun-shower or the appearance of frost in May, there is a certain surrealism in things being turned upside down. That disruption in the nature of things makes it feel like worse disorder is coming. As it did, certainly, those hours before climate-change-conjured Sandy knocked out transformers, their explosions lighting up the horizon an oozing green all through the night, the winds howling past my building on its hill overlooking the river, where ultimately the power was out for more than a week, and roads were made impassable by the felled centuries-old oaks and maples which dotted the Lehigh Valley. It’s the eerie stillness in the air before the storm came that impressed itself upon me (so much so that this isn’t the first time I’ve written about it), those last few moments of normalcy before the world ended, but when you could tell it was coming, and there was nothing to do but charge your phone and reinforce your windows to withstand the impact from all of the debris soon to be buffeted about. Can you smell the roiling, stormy, boiling sea in the air right now?
“If destruction be our lot,” state representative Abraham Lincoln told a crowd gathered at the Young Men’s Lyceum of Springfield, Ill., in the winter of 1838, “we must ourselves be its author and finisher. As a nation of freemen, we must live through all time, or die by suicide.” Historical parallels outlive their critical utility; some of us have made a cottage industry out of comparing whatever is in our newsfeeds to the Peasants’ Rebellion or the English civil wars. In the realm of emotion, however, in psychological reality, is the autumn of 2020 what it felt like to learn that Polish defenses had been overrun by the Nazi blitzkrieg? To apprehend the dull shake of those guns of August a generation before? To read news that Ft. Sumter had fallen? As Franco’s war in Spain was to the world war, as Bleeding Kansas was to the Civil War, are we merely in the antechamber to a room that contains far worse horrors? Ultimately no year is like any but itself, so that we’re already cursed enough to live during these months of pandemic and militia, of incipient authoritarianism contrasted with the uncertain hope for renewal. On the ground it can’t help but feel like one of those earlier moments, so that we’re forced to fiddle about with the inexact tools of historical comparison, of metaphor and analogy. Something of what Lincoln said, more than something, seems applicable now. “Suicide” might not be the right word though, unless we think of the national body politic as a single organism in and of itself. Certainly there are connotations of self-betrayal, but it’s more accurate to see this season of national immolation as what it is—a third of the country targeting another third while the remaining third remains non-committal on what stand they’ll take when everything starts to finally fall apart.
We shouldn’t misread Lincoln’s choice of word as indicating an equivalence of sides; in this split in the national psyche there is the malignant and the non-malignant, and it’s a moral cowardice to conflate the two. On one side we have a groundswell movement on behalf of civil and human rights, a progressive populism that compels the nation to stand up for its always unrealized and endlessly deferred ideals; on the other we have the specter of authoritarianism, of totalitarianism, of fascism. This is not an issue of suicide, it’s one of an ongoing attempted homicide, and if you’re ever to not shrink away from mirrors for the rest of your life—even if the bad guys should win (as they might)—then choose your side accordingly. And figure out that you don’t even have to like your allies, much less love them, to know that they’re better than the worst people in the room. If you bemoan “cancel culture” and “social justice warriors” but not the extrajudicial kidnapping of activists by paramilitaries, then you are at best a hypocrite and a fool, and at worst a bad-faith actor justifying the worst of the U.S. government. If your concern is with the rhetorical excesses of a few college kids on Twitter, but you’re silent about the growing fascist cult currently in control of the federal executive, the federal judiciary, half of the federal legislature, and a majority of state governments (not to speak of the awesome power of the military), then you’ve already voted with your words. If you’re disturbed by property destruction, but not the vigilante murder of protestors, then you’ve already made your decision. We all have to imagine that speaking out might still mean something; we have to pretend like voting might make a difference; we all have to live with ourselves as citizens and human beings. What I’m writing about is something different, however. What I’m writing about is what it feels like to be living through the blood-red dusk of a nation.
When the Romans left Britain, it was so sudden and surprising that we still have record of the shock amongst the locals over the retraction of the empire from their frosted shores. The medieval British monk Gildas the Wise, followed later by the Venerable Bede, records that in the immediate years following this abandonment, an appeal was sent to the capital for assistance. “The barbarians drive us to the sea,” wrote the Britons’ leaders, “the sea drives us to the barbarians; between these two means of death, we are either killed or drowned.” Under the protection of the imperial hegemon, the British Celts built an advanced civilization. Aqueducts brought water into the towns and cities, concrete roads lined paths through the countryside. One imagines that the mail arrived on time. In a shockingly brief period, however, all of that was abandoned, the empire having retracted into itself and left those for whom it was responsible at the mercy of those who wished to pick apart its bones. Three centuries later, the inhabitants of England no longer even remembered Rome; an anonymous Anglo-Saxon poet writes of a ruined settlement that “This masonry is wondrous; fates broke it/courtyard pavements were smashed… Roofs are fallen, ruinous towers,/the frosty gate with frost on cement is ravaged,/chipped roofs are torn, fallen,/undermined by old age.” Have you seen American infrastructure lately? By the eighth century, that silent scop singing his song of misinterpreted past glories can’t even imagine by what technology a city like Londinium was made possible. He writes that “the work of giants is decaying,” because surely men couldn’t have moved stones that large into place.
Because historical parallel is such a fickle science, an individual of very different political inclinations than my own might be apt to misunderstand my purposes. They may see some sort of nativist warning in my allegory about Picts and Scots pushing beyond Hadrian’s great, big beautiful wall. Such a reading is woefully incorrect, for the barbarians that I identify are not some mythic subaltern beyond the frontier, but rather the conspiratorially minded fanatics now amassing at the polls, the decadent parsers of tweets who believe in satanic cabals, and the personality cultists who’ve all but abandoned a belief in democracy. As the Greek poet Constantine Cavafy wrote, “Why isn’t anything going on in the senate?/Why are the senators sitting there without legislating?/Because the barbarians are coming today.” We’re beyond the point of disagreeing without being disagreeable; the era of going high when they go low is as chimerical as it ever was. There is something different in the United States today, and I know that you feel it; something noxious, toxic, sick, diseased, and most of all decadent. The wealthiest nation on Earth with such iniquity, where pandemic burnt—still burns—through the population while the gameshow host emperor froths his supporters into bouts of political necromancy. There is no legislation today because it increasingly feels like this is not a nation of laws, but something lower and uglier.
When I say that there is a decadence, I mean it in the fullest sense of that word. Not in the way that some reactionaries mean, always with their bad-faith interpretations; nor exactly in the manner that my fellow leftists often mean, enraptured as they are by that ghost called “materialism.” Rather I mean a fallenness of spirit, a casual cruelty that if I were a praying man I’d identify as being almost devilish. Perhaps there are satanic cabals after all, just not where the letter-people think (I suspect the call is actually coming from within the White House). Since the republic was founded, we’ve fancied ourselves Rome, always fearing the Caesar who never seems to finally cross the Potomac. That’s the thing with self-fulfilling prophecies. Now the denizens of the fading order of Pax Americana seem every bit as incredulous at collapse as those poor Britons a millennium-and-a-half ago. Writing in The Irish Times, the great critic Fintan O’Toole notes that “Over more than two centuries, the United States has stirred a very wide range of feelings in the rest of the world: love and hatred, fear and hope, envy and contempt, awe and anger. But there is one emotion that has never been directed towards the U.S. until now: pity.” I can genuinely say that I appreciate his sentiment.
When I lived in Europe, I couldn’t help but feel that there was, ironically, something younger about my friends—I imagine the impression would be compounded today. The irony comes from the traditional stereotype of “The American,” this rustic well-meaning hayseed, this big, bountiful, beautiful soul traipsing on his errand into the wilderness. If America was a land without history, then the Old World was supposedly death-haunted, all those Roman ruins testament to the brutality that marked that continent, not least of all in the last century. Such was the public relations that marked this hemisphere from its supposed discovery onward—but how easily we forget the blood that purchased this place, a land which was never virginal, but that was raped from the beginning. I envy Europeans. I envy their social democracy and their welfare states, their economic safety nets and their sense of communal goodwill (no matter how frayed or occasionally hypocritical). Every European I met, the English and Scots, the French and Italians, seemed more carefree, seemed more youthful. They seemed to have the optimism that Americans are rumored to have but of which there is no remaining evidence as the third decade of this millennium begins. During the early days of the pestilence the Italians were locked inside all of those beautiful old stone buildings of theirs. Now they’re sitting outside in cafes and trattorias, going to movies and concerts. We’re of course doing those things too, but the difference is that we have more than 200,000 dead and counting, and from the top on down it seems like few care. A French friend of mine once asked how Americans are able to go to the grocery store, the theater, the public park, without fear of getting shot. In the end, America will get you, whether by bullet or microbe. As a nation of freemen, we’re a traumatized people…

One of the few outsiders to really get our number was D.H. Lawrence, who in his Studies in Classic American Literature noted that “The essential American soul is hard, isolate, stoic, and a killer. It has never yet melted.” How could it be otherwise, in a nation built on stolen land by stolen people? America’s story is a gothic tale, a house built on a Native American burial ground. The legacies of bloodshed, of assault, of exploitation, of oppression that mark this forge of modernity ensure that it’s hard to be otherwise, even if we’re never allowed to admit such unpatriotic things. In that sense I wonder if it wasn’t inevitable that we’d eventually be led—against the wishes of the majority—by this fool who promises to steal an election while accusing his adversary of the same, who will no doubt refuse to concede even when it becomes clear that he’s lost. We’re continually told by nice, liberal, and morally correct commentators that this is not who we are, but the American president is a philandering, sociopathic carnival barker who sells bullshit to people who can’t be so brain-dead as to not know that it’s bullshit, all because they hate people who look different from them more than they love their own children. He’s Elmer Gantry, Harold Hill, “Buzz” Windrip. He’s the unholy union of P.T. Barnum and Andrew Jackson. What could be more American?
Of course our saving grace has always been that we’re a covenantal nation, defined by supposed adherence to an abstract set of universal values. No land for anything as mundane as blood and soil (even though those ghouls at Charlottesville spread their terror for exactly that reason). There was something scriptural in the idealism that John Winthrop maintained in 1630, in which national sustenance was in “our community as members of the same body,” or Lincoln in 1863 providing encomium for “government of the people, by the people, for the people,” and Barack Obama in 2004 declaring the American mantra to be one of “Hope in the face of difficulty, hope in the face of uncertainty, the audacity of hope.” That old saw about life, liberty, and the pursuit of happiness. No nation since that of the ancient Hebrews was so fully founded upon an idea—this idea that is by definition so utopian and so completely unattainable that to be a satisfied American is to make your peace with heartbreak, or else to see yourself become either delusional or cold and cruel.
There is an idea of America and the reality of the United States, and all of our greatest literature, rhetoric, and philosophy lives in that infinite gap between them, our letters always being an appraisal of the extent of our disappointment. “The promises made in the Declaration of Independence and the Constitution,” writes critic Greil Marcus in The Shape of Things to Come: Prophecy and the American Voice, “were so great that their betrayal was part of the promise.” Thus the greatest of American political modes from the Puritans to Obama would be the jeremiad. Thus our most native literature, be it Mark Twain’s The Adventures of Huckleberry Finn or Ralph Ellison’s Invisible Man, charts the exigencies of a dream deferred. All of American literature is a tragedy. What we’re living through now isn’t a tragedy, however—it’s a horror novel. Only the most naïve of fools wouldn’t be aware of that strain of malignancy running through our country’s narrative—all of the hypocrisies, half-truths, and horrors—that has defined us from the moment when the word “America” was first printed on Martin Waldseemüller and Matthias Ringmann’s map of the world in 1507. In Stephen Vincent Benét’s classic short story “The Devil and Daniel Webster,” Old Scratch himself says that “When the first wrong was done to the first Indian, I was there. When the first slaver put out for the Congo, I stood on her deck…I am merely an honest American like yourself—and of the best descent.” What would Eden be, after all, without the serpent? A thing about devils is that they imply there must be angels; if you can find proof of hell, that indicates that there might be a heaven, somewhere. That’s the corollary to the failed covenant, that even with all of the hypocrisy, half-truth, and horror, there is that creed—unfulfilled, but still stated. Freedom of expression. Equal opportunity. The commonwealth of all people. Do I write jeremiads myself? Very well then.


I only do so to remind us that the confidence-man huckster (who as I write this is only a few miles down Pennsylvania Avenue, undoubtedly conspiring on what nightmares he’ll unleash upon his fellow citizens when he doesn’t get his way) is an American, if a cankered one. Take solace, though, because America isn’t just Stephen Miller, but Harriet Tubman and John Brown also; it’s not only Steve Bannon, but Frederick Douglass and Elizabeth Cady Stanton; more than Donald Trump, it’s also Eugene Debs and Dorothy Day, James Baldwin and Emma Goldman, Harvey Milk and Ruth Bader Ginsburg. Such a litany of secular saints is of course inconsistent, contradictory, and I’ll unabashedly confess a bit maudlin. But that’s okay—we need not all agree, we need not all be saints, to still be on the side of the angels in any such Manichean struggle. More than just angels can fight demons; the only thing required is the ability to properly name the latter. Because if American history is anything, if the American idea is anything, it’s a contradictory story, that dialectical struggle that goes back through the mystic chords of memory, a phrase which I once read somewhere. The contradictions of American culture once again threaten to split the whole thing apart. Make your plans accordingly, because the battle always continues.
For such is the great moral struggle of this century. It is against neofascism and its handmaiden, a cultish, twisted civil religion. It requires the breaking of this fractured American fever dream, for which a vaccine is far from assured. Right now it seems like our choices are authoritarianism or apocalypse, though perhaps there are always reasons to hope for more. What’s coming, I can’t be sure of, but that lyric of the great prophet Leonard Cohen, “I’ve seen the future, brother/It is murder,” echoes in my numbed brain. Whether we can stand athwart history and yell “Stop!” or not, whether there is the possibility to effect genuine change, whether we can still salvage a country of decency, justice, and freedom—I’m unsure. What I do know is that whether or not any of those things can happen, we must live our political lives with a categorical imperative that acts as if they can. Not least so that we’re able to live with ourselves alone in the rooms of our minds. Live with at least some convictions; live spiritually like the men remembered in poet Genevieve Taggard’s lyric in honor of the veterans of the Abraham Lincoln Brigade, Americans (mostly socialists, communists, and anarchists) who went to Spain to fight the fascists in the years before the Second World War. “They were human. Say it all; it is true. Now say/When the eminent, the great, the easy, the old,/And the men on the make/Were busy bickering and selling,/Betraying, conniving, transacting, splitting hairs,/Writing bad articles, signing bad papers,/Passing bad bills,/Bribing, blackmailing,/Whimpering, meaching, garroting, – they/Knew and acted.”
Bonus Links:
—Letter from the Other Shore
—Letter from the Pestilence
—Steal This Meme: Beyond Truth and Lies
—On Pandemic and Literature
Image Credit: SnappyGoat.

How ‘The Hitchhiker’s Guide to the Galaxy’ Saved My Life

The Hitchhiker’s Guide to the Galaxy wastes no time putting its earthbound readers in their place.

“Uncharted.” “Unfashionable.” “Unregarded.” That’s how Douglas Adams describes our interstellar neighborhood in the first sentence of the book. Then he zooms in: “Orbiting this sun at a distance of roughly 98 million miles is an utterly insignificant little blue-green planet whose ape-descended life forms are so amazingly primitive, they still think digital watches are a pretty neat idea.” It’s among literature’s most savage burns (1970s category), and it’s directed at our entire species.

The rest of the book and its sequels, four by Adams and one by Eoin Colfer, proceed from there. Earth is labeled in the titular galaxy-spanning reference book as “harmless,” an entry later expanded after 15 years of research by roving-reporter-turned-galactic-Virgil Ford Prefect to “mostly harmless.” The most intelligent beings on the planet are revealed to be mice. By the end of chapter three only two humans are left alive, and one of them is Arthur Dent, an incompetent bore of a Dante who just wants to go home but can’t because first his house and then his entire planet have been demolished.

When I first came into contact with the book not quite 20 years ago, it taught me that it was okay to be an Arthur Dent. It was exactly the lesson I needed to hear at the time, even if it didn’t stop me from wishing I was cooler than I was. (I still aspire to be Ford Prefect.)

I was 13, and I hadn’t yet acquired even the barest hint of teenage indifference. Everything felt as though it mattered desperately. Every school test and every sporting confrontation became a measure of some critical part of my self-worth. And when I failed at them—or more accurately, “failed to meet my own ludicrously high expectations”—the only way I knew to release the emotion of this disappointment was through tears. Basically, I cried often and hard all the way through eighth grade, and that was how everyone knew me: as a smart kid wound tighter than a tourniquet. I was teased a lot and I was ashamed nearly all of the time and I could not for the life of me figure out how to stop it, despite the best intentions of my parents, who only ever put pressure on me to try to alleviate the pressure I put on myself.

The book twisted open the stuck valve in my psyche. I loved it from the first paragraph of putdowns, so much so that a friend and I spent our first encounter with it passing it back and forth, alternating chapters. We each finished it in a day. We might have torn it in half if we hadn’t already begun treating it as a holy text. We started carrying towels with us immediately. I bought a satchel to carry my books to high school in because that’s what Ford Prefect used. My crying ceased almost at once, even though I wasn’t quite sure why.

I didn’t pick up on the theme of the destruction of hubris while reading the series the summer before ninth grade. I didn’t recognize how the one thing nearly every character has in common is an outsized impression of his or her own importance. I didn’t tie together how the series is filled with protagonists and antagonists acting out their grandiose ambitions only to be thwarted by the most prosaic of means. The 7.5-million-year calculation of the ultimate answer to life, the universe, and everything produces the hilariously incomprehensible answer of 42. A convoluted scheme to meet the ultimate ruler of the galaxy turns up a recluse who habitually doubts anything outside his Cartesian Self, including his cat. In the sequel The Restaurant at the End of the Universe, the most effective torture device in the galaxy is the Total Perspective Vortex, which shows whoever is placed inside how insignificant they are compared to the enormity of the universe. In the third book, Life, the Universe and Everything, Arthur’s brief delusion of glory at Lord’s Cricket Ground nearly dooms all existence. (The universe is saved by his ineptitude.) The only final sin in Adams’s galaxy is ego, and the only character able to transgress and emerge unscathed is Zaphod Beeblebrox, the two-headed, three-armed outlaw president of the galaxy who lives all but oblivious to the consequences of his actions, and whom the author eventually got bored with and abandoned.

Instead, what my barely-teenaged self took from the book was that my existence was fundamentally absurd. The world is simultaneously too complex and too dumb for it to be anything else. Our rules and systems and desires and foibles are all constantly interacting in bizarre and incalculable ways. The book is a populist gateway to existentialism, exploring the tension between free will and an indifferent universe. “The chances of finding out what’s really going on are so absurdly remote that the only thing to do is to say, ‘Hang the sense of it,’ and keep yourself busy,” says the planetary designer Slartibartfast in the first book. Like the existentialists, Adams—and his series—hardly advocates total surrender in the face of the world’s folly. Slartibartfast would come to lead Ford and Arthur on a quest to save the universe from a bellicose planet of raging xenophobes by locating a series of cricket-themed MacGuffins in Life, the Universe and Everything.

But to believe, even as a 13-year-old boy, that you can brute force your way through life’s unpredictability by being successful at everything you set out to accomplish is to put yourself on the path toward becoming yet another of the planet’s surfeit of megalomaniacs. A person who will stop at nothing to get what he wants often ends up getting stopped by some nothing. (Though, as recent history keeps showing us, perhaps not often enough.) Sometimes in life you’re the state-of-the-art spaceship that can tamper with the rules of reality to dodge a pair of ground-to-air nuclear missiles and sometimes you’re the bowl of petunias, called into being by that tampering, miles above a strange planet ready to plummet to your doom.

Adams’s way of viewing the galaxy was so innately appealing to me that I came to adopt it as my own, meeting the universe with my own absurdity whenever possible but coupling that with a healthy degree of skepticism for the parts of its illogic that weren’t mostly harmless. This is how the Hitchhiker’s Guide taught me to be a grown-up, some combination of spinning things further and sillier and backing away slowly. It is still the mode in which I operate as an adult. It feels kind of similar to how many people operate on the Internet, which is perhaps not always the healthiest of all relationships to have with the world, but for me it was a short-term boon.

The humor and wit that made Adams’s story of the planet’s destruction popular around the world allowed his arguments to slip through unnoticed and slice the Gordian knot of my self-absorption. “If life is going to exist in a Universe of this size, then the one thing it cannot afford to have is a sense of proportion,” Adams writes in The Restaurant at the End of the Universe, but that’s exactly what he gave me. I received, if not total perspective, then at least a new and much-needed lighter one. He helped me find my own small, singular place in the universe.

The Time My Grandma Was in ‘Playboy’

It’s the kind of startling revelation that seems a little too good to be true: that at some point prior to the mid-1970s, my grandmother—wife, mother, and pillar of her midwestern community—was featured in Playboy magazine.

Not as a centerfold, but as a writer.

I first learned of this a quarter century after my grandmother’s death, while visiting my parents’ home in Fort Wayne, Ind.

One minute I’m sorting through boxes of my grandmother’s writings—lamenting her lack of success, despite her lifetime of effort—and the next, my mother’s remarking, “Well, she did publish in Playboy.”

I lift my head from the sheaves of typed pages.

“Wait, where?”

“Playboy,” my mother repeats. “At least I think so. You should really ask your aunt and uncle.”

I do ask my aunt and uncle. The former has no recollection, while the latter retains only the foggiest memory. As my uncle tells it, he was home from college in the early 1970s when his mother mentioned her publication in passing. He can’t recall the context, only the gist: that his mom published in the most popular magazine in his dormitory.

Which is to say nothing of its popularity nationwide.

In 1972, Playboy’s circulation topped 7 million; by today’s numbers, that’s the subscriber bases of Time and Sports Illustrated combined. In those days, Playboy readers were middle-class, middle-aged, well-coiffed professionals who drove new cars, dreamed of travel, and knew how to sip a good scotch. That is, if the magazine’s 1960s-era “What kind of man reads Playboy?” ad campaign is to be believed.

While the majority of these readers enjoyed the magazine for its articles…seriously, readers ought to enjoy the magazine for its articles! In 2017, following the death of Playboy founder Hugh Hefner, Pete Vernon of the Columbia Journalism Review remarked that Hefner had printed “more serious journalism and fiction than just about any other magazine publisher.” Even a cursory glance at the magazine’s table of contents confirms it. Over the years, it showcased many of the 20th century’s most celebrated writers: Kurt Vonnegut, Jr., Bernard Malamud, Joyce Carol Oates, Ursula K. Le Guin, John Updike, and Vladimir Nabokov, among others.

But had it showcased Corrine Hutner Wittenberg, too?
*
In April of 1948, The Folio—Indiana University’s campus literary magazine—published a poem titled “Anniversary” by a poet known simply as “C.H.” At the time of the poem’s publication, C.H.—a thinly veiled pseudonym for Corrine Hutner—was a 19-year-old English major at IU. Written eight months prior to her mother’s death, the poem anticipates with astonishing prescience the grief soon to come, exploring the complexities that attend the anniversary of a death.

“My neighbor woman comes to call / And says she happened to be close / And only stopped to rest awhile— / How strange she calls upon a ghost.”

Reading it, I can’t help but note the echoes of Robert Frost in its phrasing. Or the rhythmical hints of Shakespeare in its lines. Yet what strikes me most is the narrator’s reference to herself as a “ghost,” a vanishing act seemingly replicated by my 19-year-old future grandmother’s refusal to use her full name for her first publication.
*
Days after learning of my grandma’s alleged publication, I take my search to the search engines, stringing together one set of keywords after another in the hopes of confirming or denying the claim. I receive no such closure. Though I find nothing to prove that Corrine Hutner, or Corrine Hutner Wittenberg, or “C.H.” ever published within Playboy’s glossy pages, I also find nothing to the contrary. For me, the absence of proof merely means I’m not looking in the right places.

Next, I call the magazine’s corporate office. Despite my long and earnest message, I receive no reply. I mull, I speculate, I theorize. And eventually I arrive at this: if my grandmother published under a pseudonym in her earliest work, might she also have employed one later, particularly for a publication like Playboy, whose reputation would’ve spelled trouble for a female high school English teacher like her? What would her midwestern colleagues have thought? Or the members of her synagogue? Or her bridge partner? Or my grandfather? Or my mom?
*
In a 1966 issue of American Judaism, my grandmother published a story titled “In the Talmudic Tradition.” She published it under the name by which we all knew her: Corrine Wittenberg. Her bio makes no mention of Playboy or any other publication, stating simply that she “teaches English in a Fort Wayne, Ind., high school.”

Her story, which runs all of three and a half pages, is about a group of Jewish women who, over afternoon coffee, question a rabbi on issues pertaining to good and evil in the world. Though the rabbi hardly gets a word in, the women leave satisfied with the answers that they themselves have supplied. Especially Rose Frankel, a widow of 11 years, who, while on her drive home, at last comes to terms with her own long-held grief.

I read my grandmother’s words carefully:

“There is more to man than just his intellect, Rose Frankel concluded; there’s a continuation of things, a pattern.”

Every time I reread the story, I see more glimpses of Corrine Wittenberg in Rose Frankel.

I wonder: What if Rose Frankel published in Playboy?
*
Since I’m only a paywall away from learning the truth, I invest eight bucks in a one-month pass to Playboy’s digital archive. The archive stretches back to the magazine’s beginnings in 1953, so I limit my search to the years 1966 through 1971, the period that—based on my uncle’s foggy memory—seems likeliest for my grandma’s publication.

Beginning with January 1966 (which features an interview with Princess Grace and a story by Roald Dahl), I scan the pages for any and all female-sounding pseudonyms: Rose Frankel, most of all. Making my job easier is the inconvenient truth of Playboy’s gendered publishing history in that era. While women are, indeed, a focal point of the magazine, their role is mainly confined to pictures rather than print. In fact, throughout these years, I’m hard-pressed to find a woman’s name in any byline. And certainly no Rose Frankels.

Changing tack, I leave the pseudonyms behind and turn to content. What subjects might my grandmother have written about?

Since much of what I know of my grandma’s creative work revolves around her American Judaism story, I search the contents for the most Jewish-themed pieces I can find. But I can only find works by well-known writers who were not my grandmother: Sol Weinstein’s “On the Secret Service of His Majesty the Queen” (August 1966), Harry Golden’s “God Bless the Gentile” (December 1966), and Isaac Bashevis Singer’s “The Riddle” (January 1967).

Bleary-eyed, I close my laptop. I’m 50 issues deep, and so far, Corrine Wittenberg and her alter egos are nowhere. Had I known my grandma better, I might’ve had some insight into this mystery. But the truth is, I barely knew her. We shared 10 years on this earth, and she lived just miles away, but that time and distance did little to bring us close.

I think we enjoyed our time together, though admittedly, much of what I remember of her is confined to a single rainy afternoon. I was nine or so, and we’d just come across a recipe for chocolate-dipped bananas. We immediately got to work, melting the chocolate in the pot, sliding the bananas onto popsicle sticks, and then placing them on a cookie sheet in the freezer.

Though we’d both dedicate our lives to writing, she died before we could ever discuss our shared passion. What might our relationship have been like if we’d had another 10 years? Or at least a few more rainy afternoons?

Maybe, over chocolate-dipped bananas, the older me would’ve said, “So Grandma, tell me about the time you published in Playboy.”

“Ah,” she’d have smiled. “Now that’s a story, indeed…”
*
After reading my grandmother’s boxes of creative work, I turn to her less creative work: her journal, her query letters, her high school lecture notes. The more I read, the more a single word emerges: “evidently.”

Evidently, evidently, evidently.

The word pops from the pages as if aglow in neon light.

This is my Hail Mary. My shot in the dark. My once more unto the breach.

Leaping to my laptop, I return to Playboy’s archive, then use the “search” feature to determine when and where (and with what frequency) the word “evidently” ever graced the magazine’s glossy pages.

I count four instances in the January 1967 issue (featuring an interview with Fidel Castro), and five in the October 1970 issue. The other issues make occasional use of the word, but not so much as the ones noted above.

I trace every “evidently” back to the byline, but eventually the “evidentlys” run out.

My throat tightens at the inevitable truth.

Evidently, my grandma was never here.
*
A few weeks back, I donated a box of magazines to a thrift store.

Not Playboy’s, but literary magazines.

Not just any old literary magazines, but the ones that had been kind enough—or foolhardy enough—to publish my early work. Most of these publications occurred throughout my graduate school years, and I’d been gobsmacked and grateful every time.

I’d like to tell you I read those magazines cover to cover. (I’m sure Playboy readers would like to say the same of that publication.) But the truth is, I barely flipped through them. Back then I fetishized the publication credit far more than the piece itself. In my defense, I was a broke grad student in search of a job. Every line on a CV, I liked to think, brought me one step closer to being employed. It would’ve been nice to dedicate an afternoon to pleasurably paging through a literary magazine, but it seemed an indulgence I could hardly afford. If I wanted a job, I’d have to write my way into one.

I dumped those magazines and never looked back.

I wonder if my grandma might’ve held on to hers a bit tighter.
*
Weeks after failing to find a Playboy publication that likely never existed, I stumble upon a startling photograph of my grandmother in a 1967 high school yearbook. In it, my 40-year-old grandma is the epitome of style, complete with skirt, matching jacket, and what appear to be pearls. Her right hand holds a textbook as her eyes point toward a student just out of frame. A soft smile graces her lips, while the front row students reflect it back, their pencils at the ready. If the photo’s caption is to be believed, the class is engaged in an “interesting discussion about medieval English literature.”

Suddenly I see my grandmother in a way I never have before: attentive, engaged, and seemingly satisfied. She appears fully in control of her classroom, and dare I say it, hopeful.

And why shouldn’t she have been?

The photo was taken the year after her publication in American Judaism and 27 years before her death. In the time between, she’d write thousands of pages that no one would ever read, but how could she have known that then?

For all she knew, maybe a publication in Playboy awaited her.

I like to think that my grandma’s truest legacy can’t be found in the pages of Playboy but on her death certificate instead.

Typed beneath “Occupation” are the words “English Teacher.”

But did her commitment to her students make up for the success that eluded her in her literary life? Did she find sufficient satisfaction in a stack of Chaucer papers? Or in a sentence properly diagrammed?

Had she lived long enough, I would’ve asked her these questions, along with the most important question of all.

“So Grandma, tell me about the time you didn’t publish in Playboy.”

“Ah,” she’d say. “Now that’s a story, indeed…”

Bonus Links:
B.J. Hollars Explores the Midwest’s Strangest Corners
The Time I Opened for Bon Iver: On Allowing Failure to Flourish

My First Year as a Mother, I Only Read Women Authors. Here’s What I Learned.

When I was six months pregnant, I moved across the world, and I found myself thinking a lot about containers. First, in order to move I had to put everything I owned, including books, into containers. Then those containers had to be loaded into a shipping container that went across the Atlantic. My old life had to be folded and put away in the trunks of memory as I said goodbye to friends, quit a job I was sorry to leave, broke the lease on my one-bedroom apartment, and signed the paperwork for my spousal visa. And in the third trimester, it had become more and more obvious that my body was itself a container—one that was struggling to contain a writhing, wriggling being.

The boxes carrying my things arrived at my new home a few weeks before my due date, and I put my books back on the shelf, not knowing how to organize them. When would I get to read again? As I waited for contractions to start, I read as fast as I could, soaking up the alone time. But it turned out having a baby didn’t mean I couldn’t find time to read. It just meant reading was different. I read on my Kindle app on my iPhone as the baby nursed. I spent long, lonely maternity leave days browsing the shelves at the local library, and then I read my picks while the baby napped.

But what should I read, now that I inhabited this strange, new life? The world was no longer defined by containers—I was outside of all the boxes now, wandering around in a cold new world and watching over a vulnerable, needy being who didn’t know or care what I was thinking about.

I decided to take on a year-long experiment of reading only women authors. My energy to read—and especially to be an engaged, opinionated reader—was dwindling. I wanted to find inspiration and understanding in the voices of other women. It was reductive, I knew, to imagine other women were the solution, but at the same time I craved reductive thinking. I just wanted things to be simple, and to work.

Early in the year, I found a book that let me look inside another new mother’s postpartum mind, and I recognized my own warped perceptions. The book was Little Labors by Rivka Galchen. “She had appeared as an animal,” Galchen writes about her newborn daughter. “A previously undiscovered old-world monkey, but one with whom I could communicate deeply: it was an unsettling, intoxicating, against-nature feeling. A feeling that felt like black magic. We were rarely apart.” Galchen’s book is in fragments, in dreamlike observations and factually presented metaphors that echoed my own disordered internal world. Suddenly, I felt like I was in a container again—a box labeled “Mothers like Rivka Galchen.” I was sure my experiment was working.

But right away I found another book that made me just as sure it wasn’t. I read Lisa Taddeo’s Three Women, a book that showed me the futility of even trying to imagine a single, unified female narrative. It was a book I read eagerly, because it promised to tell me something about women’s desire. My own poorly timed child had been the product of my unruly desires, which I struggled to understand in their aftermath. But after I finished racing through the pages, I was disappointed. I’d seen nothing of myself. I was alienated again; I didn’t fit into this story. Women talking to women did not create a container in which all women fit.

I read Ottessa Moshfegh’s My Year of Rest and Relaxation, and I felt an old, familiar container opening back up again, like an inviting little side door on my slog towards the dim light at the end of the tunnel. This was the familiar feeling of geeking out over a new writer (even if I’d come late to the party on this one). It was like I was 17 and holding the first issue of the Believer magazine all over again. The escapism of fandom was intoxicating. The narrator of this book cloistered herself in her Upper East Side apartment and dulled her days with sleeping pills. I was away from society, too, pushing my baby around in a stroller through our deserted, November-dark suburb, so empty during the weekdays that it was like a post-apocalyptic dystopia. I’d see another mother pushing a baby on the horizon and we’d skitter away from each other like dead leaves, rather than risk interaction. I related to My Year of Rest and Relaxation and I was in awe of My Year of Rest and Relaxation. It felt good to care about something like that again.

I read Hanya Yanagihara’s A Little Life, and I was outraged. This book had been so hyped, was so incredibly, vastly long—and it was so, so bad. Again, it felt good to care. It felt good to be invested enough to be outraged. The rage I felt at having completed such a frustrating and deeply problematic tragedy/fantasy was a healthy, jumping-jacks kind of rage.

I read How Not to Hate Your Husband After Kids by Jancee Dunn and The Book You Wish Your Parents Had Read by Philippa Perry, and I thought both were helpful, but that both also needed to be read by my husband for their strategies to work. Suddenly the idea of women’s self-help seemed like an unfair racket: women lecturing women about how we could help ourselves, I thought, as if we existed in a vacuum. It was depressing, and the opposite of escapism. It made the world harsh again. I needed a balance, I thought, between escaping to contained pockets of my old life and crashing into the boundless difficulties of my new one.

I read Conversations with Friends by Sally Rooney (I loved it, I devoured it, but I could barely remember being 19 and it made me feel like I’d wasted my life). I read The Idiot by Elif Batuman (same, pretty much, although it made me glad not to be 19 anymore, instead of regretful). I read My Sister, the Serial Killer by Oyinkan Braithwaite and Everything I Know About Love by Dolly Alderton.

The year wore on, and the days began bringing our suburb a buttery June light, and my son saw trees with leaves for the first time, and I found something that helped. I began to read collections of essays that used the authors’ stories to talk about the stories of the wider world, and vice versa. These were The Collected Schizophrenias by Esmé Weijun Wang, Against Memoir by Michelle Tea, Coventry by Rachel Cusk, and Trick Mirror by Jia Tolentino. Once I read one, the rest dominoed after. In these essays, I heard women speaking about themselves and their own stories, without any trace of self-absorption or self-pity.

Wang writes about her experiences with a type of schizophrenia, and then weaves the history and experience of others with schizophrenia into, out of, and around her story. Cusk writes about her parents giving her the silent treatment, and her sadness about the difficulties of mothering adolescents, with an unemotional clarity that gives a kind of grounded, factual comfort to these painful experiences.

It was as though the essayists were opening and shutting a door to the outside world as they wrote, letting its weather into their internal world. At other times, they stood outside themselves and looked in through their windows. The interior and exterior were distinct, and there was a power in the dramatic, architectural interplay created between the two. I luxuriated in the richness of these essays, and I was inspired. I wanted to write about my own journey, my difficulty with motherhood, and my new country. Most of all, I wanted to feel the solidity of knowing how to talk about these things: to paint a square picture with black-and-white lines, then to run colors all over the top, creating something both coherent and ambiguous. Wanting to do that didn’t mean I knew how to, yet. But it was a start.

Image Credit: WikiArt.

Please! Hold Off on That Novel Coronavirus Novel!

Got some bad news for you, on the off chance your bad news supply chain is breaking down. American publishers have gone on a spending spree in hopes of snagging breakout books spawned by the coronavirus pandemic. “Three months into the biggest public health and economic crisis of our era,” The New York Times reports, “authors and publishers are racing to produce timely accounts of the coronavirus outbreak, with works that range from reported narratives about the science of pandemics and autobiographical accounts of being quarantined, to spiritual guides on coping with grief and loss, to a book about the ethical and philosophical quandaries raised by the pandemic, written by the Slovenian philosopher Slavoj Žižek.”

The operative words in that sentence are racing and timely because they point to an irony that can be viewed as an axiom: writing that’s forged in the cauldron of a crisis almost always winds up being undercooked. A writer racing to be timely is, by definition, not pausing to digest, muse, rethink, revise. Some of the forthcoming writing about the pandemic might throb with immediacy, but the bulk of it will likely be solipsistic and slapdash, especially the fiction and diaries and, ugh, those autobiographical accounts of being quarantined. In this case, writing that’s timely is likely to be ephemeral, destined to fade soon after the virus runs its course or gets vanquished by a vaccine. Remember the immediate wake of the 9/11 terrorist attacks, all those proclamations that irony was dead? Irony didn’t merely rise from the ashes of the Twin Towers; it went on to become the gyroscope of much contemporary fiction, sometimes for better, mostly for worse, and by now it has become a universal mechanism for coping with day-to-day life in a rattled world. And that was before this pandemic descended.

Some of the forthcoming plague books might prove me wrong, especially the nonfiction titles about the economic fallout of the pandemic, frontline accounts from overwhelmed hospitals, forensic studies of how the virus took root in human hosts, and a forthcoming collection of case studies of how Covid-19 and other infectious diseases spread. (The question must be asked: who’s racing to write the books about cooking, binge TV watching, pet grooming, and Donald Trump’s golf scores during this pandemic?)

Far less promising is the coming glut of personal accounts, whether they’re fiction, poetry, diaries, or journals. Exhibit A: the ongoing “Pandemic Journal” series in The New York Review of Books, which features writers all over the world sending in personal dispatches. These accounts blur after a while because they swim in a soup of sameness and lack the specificity that brings writing to life. When everybody in the world is doing the same thing, just how unique or interesting can it be? For instance, we learn that there are chronic toilet paper shortages in both London and Sydney (and, I’m guessing, in every other hamlet on the planet). Ali Bhutto writes from Karachi that the usual hum of traffic coming through the bedroom window “has been replaced by silence” (Ditto here in downtown Manhattan). Liza Batkin writes from Rhinebeck, N.Y., that she had to pause to ask her mother if she should dry the dishes with a dish towel or a paper one (I know the feeling). Christopher Robbins writes from New York that “a playground writhing with children in 60-degree weather feels downright sinister” (Got that right). If I know these feelings, do I benefit from knowing that millions of other people know them, too?

Exhibit B: a recent issue of The New York Times Magazine, which features a roster of writers relating “What We’ve Learned in Quarantine.” Among the predictable lessons are that many people liken quarantine to being in prison or at war, yet there are salutary rewards to be found in such solitary activities as braiding your own hair, learning to play the piano, watching birds, and photographing your daughters. Most of these accounts barely rise to the level of tepid uplift, and they’re further proof of just how difficult it is to say something wise, or even original, about a pandemic. If you doubt this, I present Exhibit C: a recent essay in The New York Times Book Review by Michiko Kakutani, who was struck by the eerie silence and emptiness of the streets in New York, which, she reminds us, used to be known as “the city that never sleeps.” When the high priestess of American lit crit is reduced to borrowing clichés from Ol’ Blue Eyes, you know you’re in trouble. Kakutani then reminds us just how primitive life was in 17th-century London when the bubonic plague descended: “There was no Purell back then, no Clorox wipes or Lysol spray, no grocery deliveries from Fresh Direct and Whole Foods, no Netflix or Roku to help pass the time.” Thanks for the heads-up, Michiko!

Now I’d be the last person to knock writers who have the good sense and the good luck to get paid for their work. So on one hand, I say bravo to all the writers with freshly inked contracts for pandemic books. On the other hand, I would like to make a simple plea, especially to the writers of poetry and fiction: don’t rush, take your time, let the current horrors seep in deep before you try to make art out of this nightmare we’re all living through. For inspiration, novelists and poets and short story writers should look at the examples set by two writers, one from the 18th century, the other working today.

Daniel Defoe took his time before writing about his era’s horrific calamity, publishing A Journal of the Plague Year in 1722, nearly 60 years after the bubonic plague ravaged London in 1665. The book purports to be a first-person account of that grim year, and its rich detail and plausibility led many readers to regard it as a work of nonfiction rather than what it was—a deeply researched work of imaginative historical fiction. (Defoe was five years old during the plague.) The Nobel laureate Orhan Pamuk has spent the past four years researching and writing an historical novel called Nights of Plague, about an outbreak of bubonic plague that killed millions in Asia in 1901. Before putting pen to paper, Defoe and Pamuk had the good sense to let time do its work of giving traumas context and perspective.

Back on April 8, Edan Lepucki, a gifted novelist and one very funny mother of three, imagined the “least anticipated” fiction that might come out of the pandemic that was then beginning to unleash its ghastly fury. If ever there was a time that demanded a good laugh, this was it. And Lepucki delivered, imagining novels with such titles as Social Distance Warrior, The Spread, and my personal favorite, Stay-at-Home Mom. This last, in Lepucki’s overheated imagining, is the story of a woman named Hannah who’s cooped up in her tiny Brooklyn apartment with her husband and daughter and feels her sanity slipping. Slipping so badly, in fact, that “sometimes she imagines cutting off her own arms and legs and hoisting her bleeding torso into her rollaway suitcase and zipping it up (with her teeth) and rotting there forever.” (After being cooped up in my tiny apartment for 11 weeks, I know the feeling.) “And,” Hannah muses, “how all this is better than her old publishing job where she was regularly expected to kiss the egomaniacal asses of Bookstagrammers who never read the novels they posed next to succulents and mugs of bone broth.” Now there’s a novel coronavirus novel I would pay good money to read.

Image Credit: health.mil.

Bonus Link:
On Pandemic and Literature