Extinguishing the Self: On Robert Stone

1.
Until the pandemic forced us into hiatus, I curated a reading series for emerging writers in New York City. For 13 years, we met monthly at KGB Bar, a literary venue in the East Village. The bar was rarely full, but it was always a chore to get people to quiet down; we encouraged readers to invite friends, family, and other writers. On the best nights, the place was full of convivial anticipation, like we were throwing a big, bold send-off before these promising writers lit out to new territory.

When I stood at the podium before events, the sights and sounds often reminded me of the wet, snowy Sunday evening when I heard the late, great Robert Stone read in the very same room. He was in the last years of a long life. His flowing beard was all white and emphysema made a whisper of his gravelly voice. His audience had dwindled, and there were fewer people in the room that night than on evenings when I hosted readings for less accomplished authors. This is just one of the many lessons Stone taught me: that you can be nominated for four National Book Awards and a Pulitzer—and still face a half-empty room at the end of your career.

You can get the full fathom five of Stone’s biography from any of a dozen sources. Stone himself wrote a memoir, Prime Green, that tracks through his Catholic school boyhood, the burden of growing up as the son/caretaker of a schizophrenic mother, and the hows and whys of his decision to run away and join the Navy as a young man. At the beginning of his career, he famously tripped with Ken Kesey and the Merry Pranksters. His second novel was made into a movie starring Nick Nolte. He once joked to an interviewer that he always ended up running into the much more famous author Paul Auster at parties. He was among the literati in the era when being seen around town was a part-time job, long before Entertainment Tonight and ages before social media made celebrity a full-time out-of-body experience.

“You were part of that world,” the interviewer Christopher Bollen said to him in 2013, talking about the druggy counterculture era. “But you have a rare career in that you moved beyond it. How?” Stone’s response was characteristic of the man: pointed, honest, and unglamorous. “I really, really wanted to write,” he said. He knew that his reach was beyond his grasp as a young writer. “I wanted to be a goof on the bus, but I wanted to write more.” So he went to work.

If you consult with the sages at Encyclopedia.com, this is how all that effort worked out for him: “Robert Anthony Stone (born 1937) was an American novelist whose preoccupations were politics, the media, and the random, senseless violence and cruelty that pervade contemporary life both in the United States and in parts of the world where the United States’ influence has extended, such as Latin America and Vietnam. His vision of the world is dark but powerful.”

Well, yes; but also, no.

The novelist Madison Smartt Bell, in an encomium in The New Yorker after Stone died in 2015, claimed that all Stone novels include the character of “a man whose idealism has been blunted by experience.” Certainly, this is true of Stone’s best books, by which I mean (in order): Dog Soldiers, A Flag for Sunrise, and Damascus Gate. In that same 2015 essay, Bell observes that for all the protagonists of these books, the main narrative is of a redemption that must be earned. Nothing is handed to them. “Stone and his characters struggle with all received ideas at a very high level of intellectual honesty.”

In interviews and essays, Stone never denied that he wrote stories he hoped would capture the fancy of readers. He was not writing for his private muse. Nor was he a David Foster Wallace, tortured by inner Furies, pouring his thoughts onto the page in a losing bid for freedom. You can still watch Stone speak in numerous video interviews on YouTube; he smiles often, wears professorial jackets and ties, and lounges at tables beside a fire. Stone wrote big, rollicking stories like Conrad, Melville, and Dickens because those were the kinds of stories that he loved and that were large enough to suit his themes. He was a writer who lived in the world and wrote stories full of living.

On the word-by-word level, his work has the jostle and sting of real life; as a writer he inhabited the people in the stories in order to tell their tales. Speaking of A Flag for Sunrise with Kay Bonetti in 1982, he expresses the surprise he felt when two of his characters broke out into a dramatic quarrel at one point. “The day I started writing that piece I didn’t realize that was going to happen,” he said. “It just developed as I wrote the dialogue and imagined myself into the situation.”

For all his timeliness of story and milieu, however, you cannot approach a Robert Stone novel at high speed. He published four books after 1998; all of them have strengths, but none of them feels quite of our time. I suspect this is why his popularity began to wane after the publication of Damascus Gate. You either slow down and let the chemicals of his words do their thing, or you might as well fly on by.

2.
Stone’s best claim to literary fame is the 1975 National Book Award, when the selection committee picked his second novel, Dog Soldiers, as its fiction prizewinner. Stone’s description of his academic experience at Stanford a few years earlier could just as well describe this deeply paranoid masterpiece: “I spent a lot of my time, when I should have been writing, experiencing death and transfiguration and rebirth on LSD in Palo Alto.”

What were the National Book Award judges thinking when they chose to award the prize to this novel, a druggy, rough tale of a playwright-turned-journalist who loses his shit in Saigon and manages to ravage his entire life before the last page? Cast in granite prose, oracular in the best and worst ways, full of scenes that show but confide less than a gruff Midwestern boyfriend, Dog Soldiers has a thrilling plot, but I’m not sure I could tell you what happens in it, even on a close reread. Did the judges find in the book a reflection, darkly, of the chaotic post-Nixonian world in which they lived? Certainly, this is the easy go-to explanation for the adjunct profs who include it on reading lists and the marketing copywriters who prepared promo material for the latest reissue in 2018. It’s a book about hippies! ’Nam! Failed authority! LSD! Well, yes; and no. Dig a little deeper, and, as in Stone’s fiction, a complicated, interconnected counter-story begins to take shape.

Fact: the year before Dog Soldiers won the National Book Award, the award was discontinued, briefly. So perfectly Stone. In 1974, the prize jury chose Gravity’s Rainbow by Thomas Pynchon, a writer so writerly that he refused to give interviews. This was apparently the last straw for the publishers who underwrote the prize. They cut their funding. Completely. National Book Award organizers refused to give up, though. They assembled a temporary committee to give their award one more time. They begged the likes of Exxon and Jackie Kennedy Onassis to donate enough moola to keep the lights on. It was one more sign of the times in an age when no institutions seemed like they were going to last. Exactly the kind of world that Dog Soldiers paints in miniature. A perfect choice. Almost as if it were the work of fate. Fate of the kind that flickers in the flames of Stone’s best work. Fate that you can laugh at and say you don’t believe in, but that still has a chance of being true. Robert Stone had to win the prize that year. Because we all needed an author preoccupied by outsiders to be granted the status of a literary insider—so he could go on writing, thinking, and teaching all of us for the next four decades.

3.
The first time I tried to read Robert Stone, I couldn’t stand his prose style. I was 22. Stone’s second masterpiece, A Flag for Sunrise, was on a grad school syllabus that also included the likes of Clockers, Under Western Eyes, and Gentlemen Prefer Blondes. The sine qua non for inclusion was that each book operated in a genre but also rose above its intentions, a concept that my thesis advisor and mentor, David Plante, also inculcated into our thinking in weekly creative writing workshops. He never said as much, but I believe that what Plante was trying to teach us was that if we were to become decent writers, which is to say writers worth our salt, we had to be contextual readers.

My first reading of Stone was troubled by the fact that I was addicted to Cormac McCarthy at the time. For my money, McCarthy is perhaps the only other major male author of the late 20th century who writes convincingly in the cut of the moment. McCarthy and Stone were born within four years of each other. They both had a stint in the military. Both wrote painfully slowly, labored over their craft, and had very little commercial success at first. But in form and vision, they are opposing calculations on either side of an equals sign.

I read perhaps two pages of A Flag for Sunrise before putting the book back down again. The pace felt too slow. The sentences were sharp but stilted. Characters kept starting and stopping and staring. There was a nun with a man’s name. A lieutenant who was clearly also a drunk. Familiar and unfamiliar all at once. I attended the seminar session on the book without having read A Flag for Sunrise at all. How very Robert Stone of me. I had high hopes for the novel; I was myself trying to write a book that I envisioned as a literary novel with a great plot. That perfect fusion of high and low culture. But I was too eager, too hurried in my work, too starry-eyed with the idea of being done.

Cormac McCarthy novels reward you on a page-by-page basis, or at least they do if his stiff prosaic mescal is your kind of thing. A Stone novel takes longer to get going, and even longer to alter your insides. A Cormac cocktail hits you before the ice cubes melt; the work of Robert Stone will only be clear the next morning, when you realize that you blacked out hours before you got home.

I returned to A Flag for Sunrise a decade later. I revisited Stone in part because I had run out of Graham Greene novels worth my time. After my Cormac McCarthy phase ended, I suffered a long bout of Greene fever. God, how I adored Greene’s books. I still have flash burns on my heart from the pages of The Quiet American, The Heart of the Matter, and The Power and the Glory. To learn more about Greene as a writer, I had even gone so far as to read Greene on Capri, the memoir by Shirley Hazzard (herself a great writer, criminally overlooked both before and after she died).

The rediscovery of Stone was a relief, and a blessing, but not because he was aping Greene. There are plenty of lesser writers who do just that. No, Stone was a find because he added to what Greene was doing. His work possesses the urgency of Greene—the sense of people battling against the dark authorities of this world—but also something else, something that took me many novels and many hours of consideration to realize was lacking in Greene’s novels: a love of living.

Stone was often asked by interviewers for his thoughts on Graham Greene. He was never ambiguous: “He is not a favorite of mine,” he told Charles Ruas in a 1981 interview. He speculated that his antipathy was due to being compelled by nuns to read Greene and Waugh as a schoolboy. Stone was still clinging to that story when he spoke in 1982 to the Missouri Review. But by the end of his life, during his 2013 interview with Christopher Bollen, he no longer felt the need to tiptoe around his deeper feelings. “I always knew I hated Graham Greene,” he said, “even though I thought he was a really good writer.”

Stone’s antipathy, I think, was not professional so much as personal. Graham Greene, for all his talent as a writer, was not a good man. Just about 10 years after Greene died, the Daily Mail wrote a long, dark hit piece on him. The article is a slog through a great writer’s sins. A photo of Greene in late age is captioned as follows: “A man without honour: Graham Greene was an alcoholic who abandoned his wife and two children for affairs with a series of married mistresses.” I learned from the article that late in life, Greene tried to start a brothel on a Portuguese island. And that he shared a house in Italy with an avowed pederast. Asked about his estranged children, Greene is reported as saying: “I think my books are my children.” Graham Greene was the kind of person that no one would want to be constantly compared with. Not if you really cared about the company you were perceived to keep. And certainly not if you were someone as humanistic, thoughtful, and apparently kind-hearted as Robert Stone.

Perhaps the important difference between Stone and Greene is that while Stone “really, really” wanted to be a writer, he wanted equally to be a good person. I don’t mean in the personal sense, although that seems to have mattered to Stone, too. I mean in the sense of saying things that help guide his readers to a better understanding and appreciation of the world. In the very first words of a taped interview with the Writer’s Institute in 1996, Stone says that people need stories in the same way that the waking mind needs dreams. We put together narratives in order to make sense of life. The punch line of a joke, he goes on to say, is actually a forced recognition of how things are. There is no natural narrative of things; it’s all just out there. “It is up to human will and human ingenuity to compose all this into a narrative.”

4.
If at this point you have in your mind the image of Robert Stone as a neo-Papa standing at his desk and writing out novels in longhand, then you are mistaken. Nor was he a Melvillian scrivener hunched over a desk for hours to write in a slanted longhand better suited for logging barrels of salt pork. The galloping narrative of his books will put you in mind of Stendhal; the moral weight of his vision is on a level with Dostoevsky. But those two novelists dictated their work to stenographers. None of this applied to Stone. He was a typing man. He joined the Navy as a radio operator, as he reports in his memoir. Later, as he told Bollen in 2013, he learned to type by taking Morse code. “I was using the typewriter from day one,” he said.

Not an Olivetti, either. A Paris Review feature from 1985 describes Stone as working in a cluttered attic at a table just large enough to hold his word processor. That’s right, a fully modern word processor. Unlike with Cormac McCarthy, the image of the artist is not meant to be confused with the images in the work. Stone was neither ascetic nor saint. He was just a writer, a big-hearted one.

A picture of Stone in the mid-2010s in the Paris Review shows no fewer than two computer screens. One of them, a laptop with his reading glasses resting on the keys, is—I regret to inform the steampunks among you—almost certainly a MacBook. He was not a simple throwback. Or a caricature. His wife was a waitress when he met her. He remained married to her for his entire adult life. They lived in a simple house in Connecticut on the shore, and their two children, a boy and a girl, grew up and moved out and started their own lives as children everywhere are wont to do.

His work reads as if it were composed to the tune of clanging blacksmiths and left to cool under the stars somewhere far from land. This is the conundrum of good writing. It can take you anywhere. But in Stone’s case, the words you read were almost certainly crafted in a quiet place, by one person, typing in solitude, hopeful of the value of the time spent, but equally certain that it may never mean anything much to anyone. This is the gamble.

Stone suffered to bring the right words forth in the years before acclaim and even afterward. He worked on his first novel, A Hall of Mirrors, for six years. In the era of hot takes and from-the-hip tweets, six years is an eternity. But it is what it takes when you are groping in the dark as a writer.

I find the story of how long Stone labored on his initial book to be both inspirational and validating. I spent six years working on my first novel. It was my thesis while at Columbia. Stone labored over his work while a fellow at Stanford. He had to keep working on the manuscript after he graduated, as did I. He struggled to work and write at the same time. As he told the Paris Review about that first book: “I’d work for twenty weeks and then be on unemployment for twenty weeks and so on. So it took me a long, long time to finish it.” This is the writing life. I am writing the first draft of this essay while I sit on a wooden bench in a coffee shop in Harlem where ironically they are playing Creedence Clearwater Revival’s 1971 single, “Have You Ever Seen the Rain?” I have not been employed full time for months, except for a few consulting gigs. I have been desperately writing this whole time, concocting and executing the first draft of a new novel and rounding out essays like this one that have been ricocheting around in the steel drum of my mind for ages. You find a way to get the work done whenever and however you can, almost as a sidebar to real life. And yet it’s the part of life that you most want to talk about with an interviewer from a literary magazine. Stone anticipates everything that I feel as a writer. There is this long exchange, from that same Paris Review interview, which might as well be a diary entry from my own life, except in my case the book that I’m jazzed over is called Likeness, and it won’t win a National Book Award, because I’m no Robert Stone, but the feelings are all the same:
INTERVIEWER
Is writing easy for you? Does it flow smoothly?
STONE
It’s goddamn hard. Nobody really cares whether you do it or not. You have to make yourself do it. I’m very lazy and I suffer as a result. Of course, when it’s going well there’s nothing in the world like it. But it’s also very lonely. If you do something you’re really pleased with, you’re in the crazy position of being exhilarated all by yourself. I remember finishing one section of Dog Soldiers—the end of Hicks’s walk—in the basement of a college library, working at night, while the rest of the place was closed down, and I staggered out in tears, talking to myself, and ran into a security guard. It’s hard to come down from a high in your work—it’s one of the reasons writers drink.
Stone never figured out how to write quickly. He kept his standards on the top shelf. He spoke about this in one of his last interviews, with Tin House. The editor asks him about the plot of his final novel, Death of the Black-Haired Girl. (A novel that, I must confess, I could not finish.) He insists that he does not have a plan for his work; that it just unfolds as he discovers it. “So you are not,” the interviewer asks, “in the Nabokov camp of treating your characters like ‘galley slaves’?” I can almost hear Robert Stone chuckling in response. “Well,” he said, “I don’t treat them very well. But, no.”

In an interview with Kay Bonetti, in 1982, she said: “Some critics feel you lost control of the structure in both A Hall of Mirrors [his first novel] and Dog Soldiers.” Stone’s response: “Yes. I guess I lost control.” And then he adds, importantly, and perfectly in tune with his Zen persona: “I’m pretty satisfied with the way they turned out.” Later in the same interview, he elaborates: “I see a great deal of human life limited, poisoned, frustrated, by fear and ignorance and the violence that comes from it…I think some of the people I write about are trying to get above that and get around it somehow.”

How a Stone novel ends is perhaps more important than any other fact about it. The ending is where at long last the slowly moving lines converge. The end is the closest we will ever get to the direct sunlight of his ideas. I remember distinctly where I read the ending of Damascus Gate. I was seated on a subway car headed to the Upper West Side apartment where I lived with my wife and daughter. I was an established adult by then, full-time job, mortgage, a little girl who called me papa. All that fell away as I read the book. The only world I knew as I hurtled under the streets of New York was the world of the catacombs under Jerusalem as reported to me by Stone. The characters are lost, confused, and the predators and prey are all mixed up. As a reader, I recall my heart pounding as I turned the pages. But truth be told, I also remember being confused. Like, seriously confused. As a character in a Stone novel might say: What the actual fuck is going on?

All of Stone’s work is about the confusing fate that lies in wait behind the world of likely events. The startling break. The upsetting loss, when all the odds were in your favor. Being confused, overwrought, out of luck, or nearly so—all of Stone’s characters arrive at this moment. And then they get up and push onward. You may or may not like his heroes. But you have to admire their will to live. There are moments in his work that anticipate the modern anti-heroes of Breaking Bad or True Detective. I cannot be the only person who saw a dark reflection in the ending of True Detective season two, when Frank Semyon bleeds out in the desert, and the ending of Dog Soldiers, when the mortally wounded Hicks walks as far as he can along a railroad track. Both men are deeply flawed and filled with hallucinations. Both men are dead long before they realize it.

Arguably, it is in film and television that you can locate Stone’s true heirs. Plenty of male novelists try to mug their way through tough-guy first novels à la Stone, but in so doing they confuse him with the likes of Hemingway, Mailer, and Roth. There’s no strut to his prose; there’s nothing self-aggrandizing in Stone’s work, nor did there seem to be in the man. If anything, his work is about the extinguishment of the self in a Buddhist sense. “You’ll never find Robert Stone in a Robert Stone book,” Wallace Stegner is said to have famously remarked after reading Dog Soldiers.

5.
Five years have passed since Stone’s death. Other than a brief burst of appreciation after his passing, in the form of admiring words from peers and former students alike, at this point his floating pyre has drifted out to sea. I suspect that the rolling tide of literary canonization will not bring him back to shore. His vision is too intentionally arch; his prose style far too mandarin. This saddens me, but I do not think that it would sadden Stone; certainly the man that I met once, very briefly, had no other expectation for what would happen in the world that went on without him.

I heard Stone speak and read from his memoir in December of 2009, on the Sunday evening when he appeared on a double bill for a book promo event at KGB Bar. There was a snowstorm coming, according to the weather reports, and a wet snow had started. I suspect this depressed turnout a little. But it also made the room feel brighter and warmer.

Stone arrived shortly after I did, entering alongside a taller, younger man. Later, I would learn this was Madison Smartt Bell. Bell was an accomplished novelist in his own right, and he had a book of his own to promote; but there was in his posture, his gesture, his way of introducing Stone to all of us, a clear deference to the literary lion in our midst. For his part, Stone had no airs. He had, I would learn later, visited the bar numerous times for readings. In an Identity Theory interview in 2003, he said to the interviewer: “I was in KGB last night and I think it’s very vital, even more vital than it used to be.” He seemed at ease when he stood at the lectern, adjusting the rickety lamp to illuminate the pages he carried. He wore reading glasses on the end of his nose. His eyes smiled when he glanced up.

He read a section from his memoir and the entire text of a short story. The story, “Honeymoon,” would appear three years later in his second story collection, Fun with Problems. At the story’s end, the main character swims to his death while scuba diving, plunging into the “uncolored world of fifteen fathoms. The weight of the air took him down the darkening wall.”

Afterward, the room was still with the quiet trance of a heady draught. Stone took a place at the wall near a corner to sign books. I hadn’t realized there would be a signing. Like a fool, I had not thought to bring any books. Clearly, others had a better sense of what to expect. One young man had brought a handled paper bag full of Robert Stone hardcovers. Stone, with a chortle, signed each one.

Someone from Houghton Mifflin was selling paperback copies of Prime Green. I bought one and then apologized when I slid it under Stone’s nose. I loved Dog Soldiers, I told him. I should have brought a copy with me. Indeed, I had read the book just two months earlier in huge gulping doses. He nodded. Your work is inspirational, I said. I had spent much of my time in line figuring out what to say, what wouldn’t be too fawning but would still convey the proper reverence.

“You’re a writer?” he asked.

“Yes,” I said, although I felt sheepish making this claim.

“Are you working on something? Is it going well?”

“I’m–I’m trying,” I said.

He nodded. He understood. He had seen thousands of versions of me before. I suspect he saw me as a lost child, one so alone as to not know how to ask for help. I was, at this time, twice divorced from literary agents, unpublished after a decade of trying, with not even a short story publication to my name. He told me, simply, to keep at it. That the writing is its own reward. The kind of wisdom that, to a young man, seems like resignation, but that to a man at middle age sounds a lot like fortitude and patience. He asked my name, double-checked how to spell it, and then wrote his name and a quick line of encouragement on the title page and handed the book back to me.

After that I went down the bar’s long creaky stairs and out into the wet snowy night and back into the uncertainty of a writing life still largely unlived. I have been thinking about our short exchange ever since.

In a conversation with Robert Birnbaum after the release of Damascus Gate, Stone spoke about the epigraph in the book: “Losing it is as good as having it.” This is a line devoid of poetry and hardly worth an epigraph, unless you’ve bought into the long arrow path that passes through Stone’s oeuvre. As Stone explained to his interviewer, the quote wiped him out when he first heard it. “That which we have,” he said, “we invariably lose. And at the same time, it can’t be taken away.”

Stone was trying to say this same thing nearly two decades earlier at the end of A Flag for Sunrise, when Holliwell is being rescued by a father and son who don’t speak English and seem hesitant to take the bloodied protagonist into their boat: “A man has nothing to fear, he thought to himself, who understands history.”

Losing everything, Stone tells us, is far better than never having anything at all. That full ironic detachment is a lesson that still resonates in our post-Cold War, post-American, pandemic-rankled world—with the empire teetering, so many of our heroes in retreat, and the very idea of grand masters in question, when the notion of a canon is more punch line than party line. Who can be a master? Who can speak for us all? Who is worthy? No one, obviously. But there are some voices that offer more to a listener than others. Stone’s is one of them.

Image Credit: Publishers Weekly.

Gravity and Grace and the Virus

In “Theses on the Philosophy of History,” Walter Benjamin famously writes of history as one long catastrophe, an atmosphere we continue to breathe in. “The tradition of the oppressed teaches us that the ‘state of emergency’ in which we live is not the exception but the rule…The current amazement that the things we are experiencing are ‘still’ possible in the twentieth century is not philosophical.” In the most often-cited thesis, Benjamin offers an image of catastrophe as physical and devastating, a continuous process of ruin blowing against a body. “This is how one pictures the angel of history… Where we perceive a chain of events, he sees one single catastrophe which keeps piling wreckage upon wreckage and hurls it in front of his feet.” Our happiness, Benjamin muses, “exists only in the air we have breathed, among people we could have talked to.” The idea, needless to say, has gained new relevance in the age of aerosols and droplets, of mass death and the fears of proximity. The air we breathe in has never seemed so central to our health and happiness. Benjamin and his wife survived the 1918 influenza epidemic, though I don’t know how much was known about transmission in that time.

In the early months of the pandemic, I didn’t know yet to be worried about aerosols. Like the rest of the country, I was still in a deep antagonism with surfaces, wondering whether I could infect myself with coronavirus if I touched a doorknob and then my pillow. But it was clear that something was crumbling, that it would not be solved easily, and (so?) I found myself deep in the work of Jewish philosophers writing during the Holocaust.

It seemed perverse to want to read about historical devastation when there was so much right around me. Why pick now to develop an obsession with the Holocaust? Yet it was strangely comforting to read things that had been written in a time of crisis—our inherited crisis, it always seemed growing up, despite the fact that my own family’s link was indirect and generations removed—and yet one that was not this crisis, this unbearable time. Written under the sign of catastrophe, the works, which included Gershom Scholem’s wonderful biography Walter Benjamin: The Story of a Friendship, Benjamin’s “Theses on the Philosophy of History,” and Simone Weil’s Gravity and Grace, all carried with them a newly familiar awareness of devastation and breakdown. It was the same old devastation I’d known as a Jewish child growing up in prosperous America, the endless and yet definitely historical loss that unfolded and unfolded, but now newly strange, because it was familiar. As I read about Benjamin’s despair at the increasingly inflated German mark, my best friend texted me from Brooklyn, our hometown, to ask for flour.

I didn’t read survivors. For some time I’d found it difficult to think or talk about the Holocaust head-on at all. I was struck by an essay-review in Jewish Currents in which Helen Betya Rubinstein writes, of Sheila Heti’s most recent novel, that “Motherhood is a book written around, or about, the Shoah.” Although the novel seemed to conflate a family curse with the Holocaust, Rubinstein noted, it also refused or was unable to make this connection in explicit terms. Could it be that it was no longer possible to write about the Holocaust in this explicit way? Was it necessary to write about it only subterraneously, lightly, glancingly—as though catching something off guard? And is this always true of devastation? (The articles have already been written about the strange absence of the 1918 flu in literature of the time and after.) Rubinstein fears “that the body of literature about the Shoah has too much saturated the culture, and is too full of errors and missteps, to withstand another, divergent attempt;…that it’s impossible to refer to that history without carrying the weight of all the ways the story’s already been told, including the ways it’s been misrepresented, manipulated, and abused.”

I felt these things too. What I hated in Holocaust literature and film was the American heroism so often implicit in them, the way you needed all those bodies to make an audience feel something. I thought it might help to read philosophy and biography, not memoir or fiction, and so to look sideways without looking away. Benjamin, Scholem, and Weil took their own unusual routes, straight through crisis.

Weil, a secular French Jew who longed to convert to Catholicism, died in a sanatorium, starving herself because, she said, she did not want to eat more than the rations available to the people of France. She was an extreme person whose extremity has engendered imitation and inspiration in the work of Fanny Howe and Chris Kraus, among others, and whose life has often been of greater interest to readers than her difficult work. She didn’t have any recorded romantic or sexual relationships, or children. She held people to sometimes impossibly high standards; she did not like Simone de Beauvoir, one of her few fellow female classmates at the elite university Ecole Normale Supérieure, and pronounced her bourgeois. (In this way of course I also find her charming.) She worked in factories, and taught the children of working people, who often found her bizarre and unrelatable (she was herself bourgeois, in a purely descriptive sense, and they were usually Christians) but lovable. In 1942, she’d gone away to New York, where she was safe, but then came back.

In March, I looked through her signature work, Gravity and Grace, for a useful quote, but it was all about detachment, which seemed implausible, if not impossible. I was reading everything out of context. Or more in context than I ever had before? Art, Weil wrote, offers the only consolation we should seek or wish to give: “These [works of art] help us through the mere fact that they exist.” About love: “To love purely is to consent to distance, it is to adore the distance between ourselves and that which we love.” This had never felt truer, as I scanned flight schedules for a plane I wouldn’t take to be with my parents in New York, in the epicenter, an apartment with no room for self-isolation.

“Not to exercise all the power at one’s disposal is to endure the void,” Weil wrote. “This is contrary to all the laws of nature. Grace alone can do it.” At the time, I took this as an explanation of what it felt like to obey lockdown instead of exercising whatever power I might have to protect myself and the people around me—that is, to rush to Brooklyn. But Weil herself was always keen to go to where the action was. She went to Spain to fight in the Spanish Civil War, although, once there, she quickly burned her feet outside of active duty and had to be rescued by her parents.

Next I moved to Benjamin, who killed himself while fleeing from France into Spain. Another strange wartime death. He and his group of fellow Jewish refugees had been turned away at the border and would be sent back to Vichy France and the camps shortly. Like Weil, Benjamin has been memorialized as physically awkward and ungainly, almost slapstick in his tragedy; a female acquaintance, according to Scholem, once said of him that he was “so to speak, incorporeal.” His was a tragic death, it has been said, because the officials did not, in the end, send his group back. Those who say so suggest that he could have lived, or else that his death was what convinced the Spanish authorities of the seriousness of his group’s crisis, that the storm cloud hovering over their heads was real. A member of the group wrote, in a letter to Theodor Adorno, that she had paid Spanish officials in the area for five years for a gravestone for Benjamin, but when Hannah Arendt went to the site only months later, she found no such marking. Later, when visitors began to come to see the grave, a marking materialized. Gershom Scholem writes thoughtfully and compassionately of his dear friend, who always disappointed him with his ardent promises to turn his attention—soon!—more fully to Judaism and the study of Hebrew. “During that year I thought that Benjamin’s turn to an intensive occupation with Judaism was close at hand,” Scholem writes of 1921. It was not.

What did these works offer me? Something about the authors’ ability to live and work in crisis, a personal crisis in some ways, unevenly distributed as all crises are, although in many other ways a global one. Something about their continued commitment to their work, their continued ability to produce great thought—not that I was capable of deep thought in March or expected to be anytime soon. Something about being myself, as I imagine most of us are, the product of crisis and catastrophe, the child (some generations removed) of those who did not die, people who escaped. Or was this self-aggrandizement, a case of American triumphalism? There were Nazis in the streets and I wanted to know why this was so important to me, what it meant to think about disaster. I wanted to know what history is for, especially but not only Jewish history. Could I use it, and if so, how?

“We experience good only by doing it,” Simone Weil wrote, in a completely different world, in a completely different context, that is also and always our world and our context.

Image Credit: Wikimedia Commons.

What the Literature About Contemporary Korean Women’s Lives Illuminates About Our Own

There was an infamous flasher who lurked around the school gate. He was a local who’d been showing up at the same time and place for years…On cloudy days, he would appear at the empty lot that directly faced the windows of the all-girls’ classroom eight. Jiyoung was in that class in the eighth grade.
Kim Jiyoung, Born 1982 by Cho Nam-Joo

The recent Jeffrey Toobin “incident” of his masturbatory penile exposure during a work call with colleagues at The New Yorker enraged me. It was welcome news that The New Yorker has fired him, though the magazine did not cite a reason; the lack of professionalism during a work call should be obvious. His other employer, CNN, where he serves as chief legal analyst, said that following the “Zoom incident,” Toobin “asked for some time off” and that the network had granted it. He will be just fine.

My anger has to do with not just the incident itself but also the subsequent jokey “there-but-for-the-grace-of-God-go-I” responses from the pundit lad-o-sphere, steamrollering over the fact that most women first involuntarily encounter the weaponized penis as children. I was 11 in rural Minnesota when first exposed to a flasher on the street who also threatened me with a broom handle. The response by the male adults in my life to my tears and upset was gales of laughter. The flasher, a drifter, lurked around our small town for days, unbothered by police or other authorities, until he tried flashing a well-built woman who got in a physical altercation with him and he was driven from town. Just a few years later, when I was barely an adolescent, I tried to place an ad in our town newspaper to sell my horse. My parents grabbed the phone out of my hands when they heard me shouting, “I don’t know how long a horse’s penis is, my horse is a mare!”—me not understanding that the man with the trustworthy, inquiring adult voice was not interested in purchasing my horse but was assaulting me via phone. A quick survey of friends suggests my experience is not unique but rather very typical: this hostile atmosphere begins when girls are still children, continues with everyday misogyny (including impossible standards set by advertising), and proceeds when the weaponized penis enters the workplace. During my years working at an investment bank, a call asking a trader about price/earnings ratios would often devolve into the equivalent of horse-penis talk. For example, when pictures of a colleague’s new baby were met with a vulgar observation by our section chief about how big the newborn’s penis was. Or when an announcement of new female analysts featured Playboy centerfolds instead of employee pictures. Unfortunately, I couldn’t hang up. Instead I had to endure this just to do my job—and it was clear I was being docked invisible points for being “overly sensitive.”

In economics, this concept of the negative collateral effects of an action is called “disutility.” The poisoning of the environment and climate change would be a disutility of the energy sector. Which is why disutility often goes unexamined. And when it is examined, it is relegated to marginalized fields that are less than exciting (to the self-described “alpha” trader), like economic sustainability. It’s something I encountered only because of my interest in global development economics—because it comes up as a lone STEM voice of dissent in discussions of why we shouldn’t just send all our old toxic iPhone trash to undeveloped countries and pay them (minimally) for the inevitable cancer they will get (see also: the theory of comparative advantage).

I was struck, then, by a number of recent novels set in Korea looking at the cost of sexism baldly and directly. In fact, what has been described as the Korean #MeToo movement began as a grenade that went off in the insular Korean literary community: in 2017, the journal Hwanghae Literature ran a feminist issue; in it, a poem, “Monster,” by Choi Young-mi accused “En,” a fictional character, of sexual misconduct. The details of “En” match up directly with Ko Un, arguably Korea’s most famous poet and novelist, oft-cited as Korea’s best hope for the Nobel Prize and whose work appears in most middle school and high school textbooks. Other Korean women quickly corroborated this accusation, suggesting Ko had gone decades unpunished for such conduct, for using his stature to sexually harass young female writers. Ko responded in a nuclear fashion, suing Choi for one billion won ($886,500) for defamation. But now that Pandora’s box has been opened, the spotlight has also turned to the movie and K-pop industries. Even the mayor of Seoul, who received unequivocal praise for his handling of the Covid crisis, became embroiled in his own sexual harassment scandal, and committed suicide after the allegations were made public.

The novel Kim Jiyoung, Born 1982 by Cho Nam-Joo, translated in its elegant minimalism by Jamie Chang, tells the story of a young housewife who seems to be going through a nervous breakdown and extreme depression. But when we see her extremely typical yet psychologically and physically violent coming of age, rendered in a dispassionate narration that only makes the horror more total, her condition seems less a breakdown than a normal response to being a girl and then a young woman dealing with everyday misogyny. In the case of the flasher in Kim Jiyoung, Born 1982, when the adults brush it off, some bolder girls jump the flasher, tie him up, and bring him to the police station. Jiyoung is a tentative person, and it doesn’t escape her notice that the girls who rise up to defend themselves are then punished by their school. She manages to get into college, gets a boyfriend, but overhears her male friends describing her as “spat-out gum” because her boyfriend broke up with her. She is similarly dismissed when seeking employment. It seems clearer and clearer to her that taking a stand or resisting societal norms doesn’t get the rebels anywhere. She is cognizant of and frustrated by her lack of options, and half decides, half falls into marriage and childbearing, holding out hope that the conventional route will get her some modicum of satisfaction. She even gives up her job, which, even though she didn’t like it per se, gave her a level of independence and self-actualization that she recognizes only once she has quit. But she finds out later that even work was sullied: shortly after she’d left, a scandal erupted when it was discovered that the male security team had set up spycams in the bathrooms, uploading images to a porn site and sharing them with male coworkers. Thus, the routes of possible escape or alternatives fade away.

Cho, a television scriptwriter, said in an interview that she wrote the book in two months because her own life, basically, provided the entire backdrop she needed. The book hit a nerve in Korea and became a bestseller—not just in Korea but internationally; it’s been translated into 18 languages.

Han Kang’s The Vegetarian (translated by Deborah Smith) is another bestseller with a wide audience in the English-language world. More atmospheric than plotted, it centers on a Korean housewife who suddenly stops eating meat after having a disturbing dream. Her indifferent and unloving husband tries to adapt, but at a family gathering her refusal to eat meat becomes insulting to her father, who goads her husband and brother into force-feeding her meat. She fights back and ends up institutionalized; her husband, the narrator, is left pondering her putative delusions and the mental instability of a woman who won’t submit to the behavior men want to see from her.

If I Had Your Face by Korean American author Frances Cha follows a group of young women who happen to live in the same building as they navigate lives in Seoul, a glittery place of neon and high-rises, and a place that is widely known as the plastic surgery capital of the world. A place where an estimated one in three women will elect to have a procedure before the age of 30, and where it is impossible to merely take the subway and not see ads promising life transformation—for men and women, but more for women—everywhere. The exquisitely beautiful, cosmetically enhanced Kyuri works at a “room salon,” a fancy place where men pay a premium to consort only with the “prettiest” women. Her roommate is a natural-faced artist dating the rich son of a chaebol family. Down the hall is Ara, a mute hairdresser, whose plain-looking roommate Sujin has a dubious dream to undergo expensive and painful plastic surgery to achieve Kyuri’s looks and work in a similar salon. Sujin’s quest for a beauty she feels will make her happy and financially stable is long and painful. Somewhat more peripherally, the women are aware of the young mother Wonna, on the first floor, who is swimming against not just impossible standards of beauty and sexism but also a cutthroat economy, which was memorably limned in Bong Joon-ho’s award-winning Parasite. These obstacles may make the college dreams she has for her children just an illusion.

Looked at one way, these novels share a depressing commonality of women ground down or driven “crazy” by an inescapable patriarchy where misogyny is not just baked in, but baked into older women’s (i.e., the in-laws’) non-support of the younger. Adding to the collection of fiction, Choi Seung-ja’s newly released poetry collection, Phone Bells Keep Ringing for Me (translated by Won-Chung Kim and Cathy Park Hong), declares its poems “short as a shriek,” both witness and battle cry, reminding us that canons full of male authors gloss over the societal structures that have kept women largely silent in literature as well as in politics and culture, via a strict and narrow set of rules about what is “acceptable” behavior–and art–by women. Korean poetry is often marked by the pastoral, and poetry by women comes with expectations to be lyrical and decorous in subject. Choi, then, explodes that idea. For instance, Korean culture reveres the seasons; autumn is often considered the most attractive season for its vivid colors, infused with melancholy by the nearness of winter. Choi’s “Dog Autumn” begins with:
Dog autumn attacks.
Syphilis autumn.
And death visits
one of twilight’s paralyzed legs.
Spring, the season of renewal, is also considered an attractive, tender season, with flowers like azalea and cherry blossoms representative of its beauty and ephemerality. In Choi’s hands “Spring” is
…of the lonely, unmarried thirty-three-year-old woman…
In the spring, plants and grass bloom,
and even garbage grows fresh.
The trash pile grows bigger in my mouth.
I cannot swallow it or vomit it up.
Readers would be shortchanging themselves to think of these books as a kind of anthropological look at a Confucian society that favors males. These books fit in perfectly with contemporary, narratively inventive English-language novels looking at women’s lives; recent examples that come to mind include the comedic (but ultimately serious) Fleishman Is in Trouble by Taffy Brodesser-Akner, a novel that also uses the wife’s absence to make points about the structures of matrimony and sexism. There is also Helen Phillips’s hard-to-classify The Need, which uses the tropes of horror (a great choice) to examine motherhood and female agency.

These works, however, tend not to reference the structures of patriarchy and misogyny as overtly as the Korean novels do, which draw straight lines from those societal structures, largely outside women’s control, and show that even recognizing and resisting them isn’t a clear path to equality (or even equity). In the end, withdrawal, absenting oneself from normal existence in the most disruptive way possible, is one of the few “effective” strategies to draw attention to these women’s stories. Kim Jiyoung, Born 1982 in particular unselfconsciously emphasizes its themes, unafraid of seeming didactic. With mini-essays about gender statistics and rates of labor-force participation, embedded footnotes referencing Pew- and Guttmacher-style data, this is a novel whose narrative inventiveness fuses fact and fiction (not unlike Melville’s digressions on the whaling industry); through it we can see Kim Jiyoung’s story placed within the context of an entire country, for what are individual data points but actual individuals?

In the end, Ko Un’s billion-won suit against the poet Choi Young-mi was dismissed, for, as the Korea Herald reported, Choi’s consistent testimony and that of other witnesses convinced the Seoul Central District Court that she was telling the truth. Choi thanked the judiciary for “bringing justice,” but the truth is that, while fending off Ko’s power play of a public lawsuit, she was still charged the disutility of the damage Ko did to her career, her time, and her peace of mind. Similarly, for every woman who’s had to put up with a hostile workplace—like a man masturbating during a work meeting in front of his female colleagues—this is a kind of tax that, like gender pay disparities, should be reconsidered and compensated. But as Cho Nam-Joo suggests in her novel, with men continuing to hold the reins of power, it’s no surprise that what women do continues to be undervalued. Kim Jiyoung gives it all up to be the best mother, only to be called a “mom-roach” by some young office lads having coffee in the same park where she, buckling with fatigue, is taking her baby out in the stroller, the damages she’s undergone woefully and forever unacknowledged: “Probably because the moment you put a price on something, someone has to pay.”

Letting the Days Go By

Back on Feb. 29, the extra day in this year of years, I stayed up to watch Saturday Night Live and saw David Byrne’s remarkable performance of “Once in a Lifetime.” I know the song well, although I usually get the title wrong, referring to it as “letting the days go by” or “same as it ever was,” which, along with “once in a lifetime,” are lyrics from the chorus. My husband Mike has pointed out to me that both of my misremembered titles have a meaning that is opposite to “Once in a Lifetime.” But one of the reasons I love “Once in a Lifetime” is that it is somewhat at odds with itself, not only lyrically, but musically as well—although I didn’t really understand that until I read an article on NPR, watched a video, listened to multiple versions of the song on Spotify, and finally, after years of having it on my to-watch list, saw Jonathan Demme’s concert film Stop Making Sense. Yes, I fell down a “Once in a Lifetime” rabbit hole.

This was in early March, before quarantine had even started. But a certain anxiety had set in. At the playground, parents were standing farther apart during after-school chats, and everyone had stopped sharing snacks. I happened to have two annual check-ups scheduled for the first week of March: my doctor and dentist. My doctor assured me that the new coronavirus was nothing to worry about, while my dentist advised me to come in immediately for a follow-up appointment to fill a cavity.

The day after I got my tooth filled, I went into Manhattan for a morning screening of Crip Camp, a documentary that I was planning to review for my blog. I could have asked for a screening link, but I wanted to go into Manhattan. Maybe I knew, in the back of my mind, that it would be the last time. I wore gloves on the bus and subway, and a lightweight scarf over my nose and mouth. I got off the train at Eighth Avenue and walked four long blocks to the Landmark 57, a new theater that I’d never gone to before. I didn’t realize, until I got there, that it was in the silver, triangular building that my son refers to as “the cake slice” whenever we drive up the West Side Highway. I took a photo of it with my phone before I headed home.

The photos that follow the cake slice building are mostly of my kids, stuck inside, in crowded messy rooms, or of my kids, outdoors, in wide-open empty spaces. One of our favorite places to go for our daily Cuomo-approved constitutional was the parking lot of the shipyard, a place that is usually pretty empty, but which had become totally barren of cars, pedestrians, and commuters. Gone were the cruise ships and the ferries and the water taxis. My kids scootered past piles of empty shipping containers and yelled as loud as they wanted. Then they went home and watched Disney+ while Mike and I tried to figure out where we should go.

We had been planning a move from Brooklyn to Queens—a future I was so certain of at the beginning of this year that I had begun to research nursery schools in our new, chosen neighborhood. I was concerned about the timing of everything, and I feared I was too late for the school I wanted, and too late with the camp sign-ups, and I was annoyed with my anxieties because I never wanted to be the person who was worried about getting her kids into a particular school or themed day camp.

I first listened to “Once in a Lifetime” when I was a teenager.
I heard the lyrics as a description of mindless acquiescence to the status quo: “You may find yourself in a beautiful house/with a beautiful wife/and you may ask yourself, ‘Well…how did I get here?’” Byrne’s performance of the song in Stop Making Sense seemed to support my interpretation. Halfway through the song he dons a bizarrely oversized suit, making him look like a child in an adult’s clothes, or an overstuffed puppet. His dance moves suggest sleepwalking as he stumbles backward and repeatedly hits himself on the forehead with an open palm, as if trying to wake up—or realizing a terrible mistake. To my teenage ears, even the chorus was damning: I thought the phrase “letting the days go by” was a kind of accusation, a way of saying, You’re wasting your life.

I hear the song a little differently now. Now, it strikes me as a description of how we get through life: we let the days go by, riding on the backs of accumulated habits. But sometimes we stop and wonder: how did we get here?

I’m having one of those moments now. Maybe you are, too. The pandemic has had a way of slowing everyone down. Also, we moved to the suburbs. So did several of our friends. And so we are literally finding ourselves in strange houses. We’re part of a wave of New Yorkers that you may have read about in the newspaper: people who left the city in the wake of the virus, seeking more space for home offices and classrooms. Mike and I are now renting the first floor of a house in Montclair, New Jersey, that is at least twice as big as our old apartment and a lot cheaper. We’re kicking ourselves that we didn’t move out here years ago—except that we never would have, because all of our friends were in Brooklyn and Mike could walk to work and I didn’t have to drive anywhere. I can’t drive, by the way—I mean, I can, but it’s been 20 years since I’ve gotten behind the wheel. Montclair, fortunately, is very walkable, but the simple fact that I’m going to have to start driving again—that I won’t get everywhere on my own two feet or in the company of strangers on public transportation—is what leaves me really feeling the question: How did I get here?

When Byrne was on SNL in February, he was promoting his limited-run musical, American Utopia, which was a kind of career retrospective for him. You can see it now on HBO; Spike Lee filmed it shortly before Broadway was shut down. I watched it last weekend and that’s what led me down the “Once in a Lifetime” rabbit hole a second time. Byrne has always struck me as a slightly detached performer, but in American Utopia he seems much more emotionally present, or maybe just more expressive. He sings more forcefully now and he dances with more precision. He’s older, too. You hear the strain and age in his voice as he aims for higher notes; maybe you worry a little when he executes a back-bending dance move—as he does frequently for “Once in a Lifetime.”

When I saw Byrne on SNL, I thought he had the look of a mad, prophesying preacher, someone who’d come to a new realization of the divine late in life. The mystical imagery of the lyrics jumped out at me: had there always been so much water? Had the color blue always been so important? Above all, had it always been an ecstatic song? The joyful version of “Once in a Lifetime” was always there. You can hear it in Angelique Kidjo’s cover, which she sings melodically, at a slightly faster tempo, with a horn section and back-up singers. And you can hear it in the instrumentation of the original.
But I don’t think you can hear it in Byrne’s early vocal renditions. When he was young, I don’t think he really knew what he had on his hands. I think he was just following his intuition and trying to make the song work.  My personal theory of “Once in a Lifetime” is that the song has a will of its own, and that it wanted to exist in the world. It came together very slowly, starting with a bass line that The Talking Heads recorded during a jam session inspired by the music of Fela Kuti. Bassist Tina Weymouth gets credit for coming up with the riff, but she claims that her husband, drummer Chris Frantz, yelled it to her during rehearsal as and adjustment to what she had been playing. When the song was ready to be arranged, producer Brian Eno misheard the rhythm of the riff, adding a rest at the beginning of the measure. When he realized his mistake, he decided he liked the odd arrangement, and wrote a call-and-response chorus to go with it. The band thought the chorus sounded like a preacher leading a prayer, which led Byrne to the weird world of televangelists. He found his now-iconic lyrics—“And you may find yourself…”—by imitating their Biblically-tinged cadences. Borrowed beats, borrowed lyrics, misheard bass lines, bad transcriptions: the basis of this song about the unconscious way we move through life was made without much conscious thought. And yet it is probably one of The Talking Heads most beloved songs, the kind of song people know without even realizing they know it. A few weeks after we moved to our new apartment, my landlord, who lives above us, filled me in on all the local lore. Apparently, Montclair used to be a weekend destination for Broadway performers. She tells me that the house down the street from us hosted marvelous parties, and that Marlene Deitrich was a frequent guest. How my landlord knows this, I don’t know, but I was able to fact-check the next bit of ancient gossip she shared, which was that the house across the street was once occupied by a musician and composer named Herman Hupfeld. You’ve probably never heard of him, but in 1931, he wrote “As Time Goes By”—one of those songs you know without knowing, and may find yourself humming every once in a while.

Image Credit: Wikimedia Commons.

Who’s Afraid of Theory?

In a fit of indignation, the editors of the journal Philosophy and Literature ran a "Bad Writing Contest" from 1995 to 1998 to highlight jargony excess among the professoriate. Inaugurated during the seventh inning of the Theory Wars, the contest placed Philosophy and Literature firmly amongst the classicists, despairing at the influence of various critical "isms." For the final year that the contest ran, the "winner" was Judith Butler, then a Berkeley philosophy professor and author of the classic work Gender Trouble: Feminism and the Subversion of Identity. The selection that caused such tsuris, a labyrinthine sentence from the journal Diacritics, finds Butler opining that the "move from a structuralist account in which capital is understood to structure social relations in relatively homologous ways to a view of hegemony in which power relations are subject to repetition, convergence, and rearticulation brought the question of temporality into the thinking of structure," and so on. If the editors' purpose was to mock Latinate diction, then the "Bad Writing Contest" successfully made Butler the target of sarcastic opprobrium, with editorial pages using the incident as another volley against "fashionable nonsense" (as Alan Sokal and Jean Bricmont called it) supposedly reigning ascendant from Berkeley to Cambridge.

The Theory Wars, that is, the administrative argument over what role various strains of 20th-century continental European thought should play in the research and teaching of the humanities, have never exactly gone away, even while departments shutter and university work is farmed out to poorly paid contingent faculty. Today you're just as likely to see aspersions on the use of critical theory appear in fevered, paranoid Internet threads warning about "Cultural Marxism" as you are on the op-ed pages of the Wall Street Journal, even while at many schools literature requirements are being cut, so as to make the whole debate feel more like a Civil War reenactment than the Battle of Gettysburg. In another sense, however, Butler's partisans seem to have very much won the argument from the '80s and '90s—as sociologically inflected Theory-terms from "intersectionality" to "privilege" have migrated from Diacritics to Twitter (though often as critical malapropism)—ensuring that this war of attrition isn't headed to armistice anytime soon.

So, what exactly is "Theory?" For scientists, a "theory" is a model based on empirical observation that is used to make predictions about natural phenomena; for the layperson, a "theory" is a type of educated guess or hypothesis. For practitioners of "critical theory," the phrase means something a bit different. A critical theorist engages in interpretation, reading culture (from epic poems to comic books) to explain how its social context allows or precludes certain readings, beyond whatever aesthetic affinity the individual may feel. Journalist Stuart Jeffries explains the history (or "genealogy," as they might say) of one strain of critical theory in his excellent Grand Hotel Abyss: The Lives of the Frankfurt School, describing how a century ago an influential group of German Marxist social scientists, including Theodor Adorno, Max Horkheimer, Walter Benjamin, and Herbert Marcuse, developed a trenchant vocabulary for "what they called the culture industry," so as to explore "a new relationship between culture and politics." At the Frankfurt Institute for Social Research, a new critical apparatus was developed for the dizzying complexity of industrial capitalism, and so words like "reify" and "commodity fetish" (as well as that old Hegelian chestnut "dialectical") became humanistic bywords.

Most of the original members of the Frankfurt School were old-fashioned gentlemen, more at home with Arnold Schoenberg's 12-tone avant-garde than with Jelly Roll Morton and Bix Beiderbecke, content to read Thomas Mann rather than Action Comics. Several decades later, a different institution, the Centre for Contemporary Cultural Studies at the University of Birmingham in the United Kingdom, would apply critical theory to popular culture. These largely working-class theorists, including Stuart Hall, Paul Gilroy, Dick Hebdige, and Angela McRobbie (with a strong influence from Raymond Williams), would use a similar vocabulary to that developed by the Frankfurt School, but they'd extend the focus of their studies into considerations of comics and punk music, slasher movies and paperback novels, while also bringing issues of race and gender to bear in their writings.

In rejecting the elitism of their predecessors, the Birmingham School democratized critical theory, so that the Slate essay on whiteness in Breaking Bad or the Salon hot take about gender in Game of Thrones can be traced on a direct line back through Birmingham. What these scholars shared with Frankfurt, alongside a largely Marxian sensibility, was a sense that “culture was an important category because it helps us to recognize that one life-practice (like reading) cannot be torn out of a large network constituted by many other life-practices—working, sexual orientation, [or] family life,” as elucidated by Simon During in his introduction to The Cultural Studies Reader. For thinkers like Hall, McRobbie, or Gilroy, placing works within this social context wasn’t necessarily a disparagement, but rather the development of a language commensurate with explaining how those works operate. With this understanding, saying that critical theory disenchants literature would be a bit like saying that astronomical calculations make it impossible to see the beauty in the stars.

A third strain influenced "Theory" as it developed in American universities towards the end of the 20th century, and it's probably the one most stereotypically associated with pretension and obfuscation. From a different set of intellectual sources, French post-structural and deconstructionist thought developed in the '60s and '70s at roughly the same time as the Birmingham School. Sometimes broadly categorized as "postmodernist" thinkers, French theory included writers of varying hermeticism like Jacques Derrida, Michel Foucault, Gilles Deleuze, Jean-François Lyotard, Jacques Lacan, and Jean Baudrillard, who supplied English departments with a Gallic air composed of equal parts black leather and Gauloises smoke. François Cusset provides a helpful primer in French Theory: How Foucault, Derrida, Deleuze, & Co. Transformed the Intellectual Life of the United States, the best single-volume introduction to the subject. He writes that these "Ten or twelve more or less contemporaneous writers," despite their not inconsiderable differences, are united by a "critique of the subject, of representation, and of historical continuity," their focus being the "critique of 'critique' itself, since all of them interrogate in their own way" the very idea of tradition. French theory was the purview of Derridean deconstruction, or of Foucauldian analysis of social power structures, the better to reveal the clenched fist hidden within a velvet glove (and every fist is clenched). For traditionalists the Frankfurt School's Marxism (arguably never all that Marxist) was bad enough; with French theory there was a strong suspicion of at best relativism, at worst outright nihilism.

Theory has an influence simultaneously more and less enduring than is sometimes assumed. Its critics in the '80s and '90s warned that it signaled the dissolution of the Western canon, yet I can assure you from experience that undergraduates never stopped reading Shakespeare, even if a chapter from Foucault's Discipline and Punish might have made it onto the syllabus (and it bears repeating that, contra his reputation for difficulty, Foucault was a hell of a prose stylist). But if current online imbroglios are any indication, its influence has been wide and unexpected, for as colleges pivot towards a business-centered STEM curriculum, the old fights about critical theory have simply migrated online. Much of the criticism against theory in the first iteration of this dispute was about what such thinkers supposedly said (or what people thought they were saying), but maybe even more vociferous were the claims about how they were saying things. The indictment of theory then becomes not just an issue of metaphysics, but one of style. It's the claim that nobody can argue with a critical theorist because the writing itself is so impenetrable, opaque, and confusing. It's the argument that if theory reads like anything, it reads like bullshit.

During the height of these curricular debates there was a cottage industry of books that took scholarly rhetoric as their target, not least of which were conservative screeds like Allan Bloom's The Closing of the American Mind: How Higher Education Has Failed Democracy and Impoverished the Souls of Today's Students and E.D. Hirsch Jr.'s The Dictionary of Cultural Literacy. Editors Will H. Corral and Daphne Patai claim in the introduction to their pugnacious Theory's Empire: An Anthology of Dissent that "Far from responding with reasoned argument to their critics, proponents of Theory, in the past few decades, have managed to adopt just about every defect in writing that George Orwell identified in his 1946 essay 'Politics and the English Language.'" D.G. Myers in his contribution to the collection (succinctly titled "Bad Writing") excoriates Butler in particular, writing that the selection mocked by Philosophy and Literature was "something more than 'ugly' and 'stylistically awful'… [as] demanded by the contest's rules. What Butler's writing actually expresses is simultaneously a contempt for her readers and an absolute dependence on their good opinion."

Meanwhile, the poet David Lehman parses Theory's tendency towards ugly rhetorical self-justification in Signs of the Times: Deconstruction and the Fall of Paul de Man, in which he recounts the sordid affair whereby a confidant of Derrida and an esteemed Yale professor was revealed to have written Nazi polemics during the German occupation of his native Belgium. Lehman also provides ample denunciation of Theory's linguistic excess, writing that for the "users of its arcane terminology it confers elite status… Less a coherent system of beliefs than a way of thinking." By 1996, even Duke University English professor Frank Lentricchia (in a notoriously Theory-friendly department) would snark in his Lingua Franca essay "Last Will and Testament of an Ex-Literary Critic" (reprinted in Quick Studies: The Best of Lingua Franca), "Tell me your theory and I'll tell you in advance what you'll say about any work of literature, especially those you haven't read."

No incident illustrated more for the public the apparent vapidity of Theory than the so-called “Sokal Affair” in 1996, when New York University physics professor Alan Sokal wrote a completely meaningless paper composed in a sarcastic pantomime of critical theory-speak entitled “Transgressing the Boundaries: Toward a Transformative Hermeneutics of Quantum Gravity” which was accepted for publication in the prestigious (Duke-based) journal Social Text, with his hoax simultaneously revealed by Lingua Franca. Sokal’s paper contains exquisite nonsense such as the claim that “postmodern sciences overthrow the static ontological categories and hierarchies characteristic of modernist science” and that “these homologous features arise in numerous seemingly disparate areas of science, from quantum gravity to chaos theory… In this way, the postmodern sciences appear to be converging on a new epistemological paradigm.” Sokal’s case against Theory is also, fundamentally, about writing. He doesn’t just attack critical theory for what he perceives as its dangerous relativism, but also at the level of composition, writing in Fashionable Nonsense: Postmodern Intellectuals’ Abuse of Science that such discourse “exemplified by the texts we quote, functions in part as a dead end in which some sectors of the humanities and social sciences have gotten lost.” He brags that “one of us managed, after only three months of study, to master the postmodernist lingo well enough to publish an article in a prestigious journal.” Such has long been the conclusion among many folks that Theory is a kind of philosophical Mad Libs disappearing up its own ass, accountable to nobody but itself and the departments that coddle it. Such was the sentiment which inspired the programmers of the Postmodern Essay Generator, which as of 2020 is still algorithmically throwing together random Theory words to create full essays with titles like “Deconstructing Surrealism: Socialism, surrealism and deconstructivist theory” (by P. Hans von Ludwig) and “Social realism and the capitalist paradigm of discourse” (by Agnes O. McElwaine).

Somebody’s thick black glasses would have to be on too tight not to see what’s funny in this, though there’s more than a bit of truth in the defense of Theory that says such denunciations are trite, an instance of anti-intellectualism as much as its opposite. Defenses of Theory in the wake of Sokal’s ruse tended to, not unfairly, query why nobody questions the rarefied and complex language of the sciences but blanches when the humanities have a similarly baroque vocabulary. Status quo objections to that line of thinking tend to emphasize the humanness of the humanities; the logic being that if we’re all able to be moved by literature, we have no need to have experts explain how that work of literature operates (as if being in possession of a heart would make one a cardiologist). Butler, for her part, answered criticism leveled against her prose style in a (well written and funny!) New York Times editorial, where she argues, following a line of Adorno’s reasoning, that complex prose is integral to critical theory because it helps to make language strange, and forces us to interrogate that which we take for granted. “No doubt, scholars in the humanities should be able to clarify how their work informs and illuminates everyday life,” Butler admits, “Equally, however, such scholars are obliged to question common sense, interrogate its tacit presumptions and provoke new ways of looking at a familiar world.”

With which I heartily agree, but that doesn't mean that the selection of Butler's mocked by Philosophy and Literature is any good. It costs me little to admit that the sentence is at best turgid, obtuse, and inelegant, and at worst utterly incomprehensible. It costs me even less to admit that that's probably because it's been cherry-picked, stripped of context, and labeled as such so that it maximizes potential negative impressions. One can defend Butler—and Theory—without justifying every bit of rhetorical excess. Because what some critics disparage about Theory—its obscurity, its rarefied difficulty, its multisyllabic technocratic purpleness—is often true. When I arrived in my master's program, in a department notoriously Theory-friendly, I blanched as much as Allan Bloom would have at being invited to roadie on the Rolling Stones' Steel Wheels Tour. For an undergraduate enmeshed in the canon, and still enraptured by that incredibly old-fashioned (but still intoxicating) claim of the Victorian critic Matthew Arnold in Culture and Anarchy that the purpose of education was to experience "the best which has been thought and said," post-structuralism was a drag. By contrast, many of my colleagues, most of them in fact, loved Theory; they thrilled to its punkish enthusiasms, its irony-laden critiques, its radical suspicion of the best which has been thought and said. Meanwhile I despaired that there were no deconstructionists in Dead Poets Society.

I can no longer imagine that perspective. It's not quite that I became a "Theory Head," as one calls those sad young men reading Deleuze and Félix Guattari while smoking American Spirit cigarettes, but I did learn to stop worrying and love Theory (in my own way). What I learned is that Theory begins to make sense once you learn the language (whether it takes you three months or longer), and that it's innately, abundantly, and estimably useful when you have to actually explain how culture operates, not just whether you happen to like a book or not. A poet can write a blazon for her beloved, but an anatomist is needed to perform the autopsy. Some of this maturity came in realizing that literary criticism has always had its own opacity; that if we reject "binary opposition," we would have to get rid of "dactylic hexameter" as well. The humanities have always invented new words to describe the things of this world that we experience in culture. That's precisely the practice attacked by John Martin Ellis, who in his jeremiad Against Deconstruction took on Theory's predilection towards neologism, opining that "there were plenty of quite acceptable ordinary English words for the status of entrenched ideas and for the process of questioning and undermining them." All of that difference, all of that hegemony, and so much phallogocentrism… But here's the thing—sometimes heteroglossia by any other name doesn't smell as sweet.

There is something anachronistic in proffering a defense of Theory in the third decade of the new millennium; something nostalgic or even retrograde. Who cares anymore? Disciplinary debates make little sense as the discipline itself has imploded, and the anemic cultural studies patois of the Internet hardly seems to warrant the same reflection, either in defense or condemnation. In part, though, I'd suggest that it's precisely the usefulness of these words, and their popularity among those who learned them through cultural osmosis and not through instruction, that necessitates a few statements in their exoneration. All of the previous arguments on their behalf—that the humanities require their own jargon, that this vocabulary provides an analytical nuance that the vernacular doesn't—strike me as convincing. And the criticism that an elite coterie uses words like "hegemonic" as a shibboleth is also valid, but that's not an argument to abandon the words—it's an argument to instruct more people on what they mean.

But I'd like to offer a different claim to utility, and that's that Theory isn't just useful but beautiful. Reading the best of Theory can feel more like reading poetry than philosophy, and all of those chewy multisyllabic words can be like honey in the mouth. Any student of linguistics or philology—from well before Theory—understands that synonyms are mythic and that an individual word has a connotative life that is rich and unique. Butler defends the Latinate, writing that for a student "words such as 'hegemony' appears strange," but that they may discover that beyond its simpler meaning "it denotes a dominance so entrenched that we take it for granted, and even appear to consent to it—a power that's strengthened by its invisibility." Not only that, I'd add that "hegemony," with its angular consonants hidden like a sharp rock in the middle of a snowball, conveys a sense of power beyond either brute strength or material plenty. Hegemony has something of the mysterious about it, the totalizing, the absolute, the wickedly divine. To simply replace it with the word "power" is to drain it of its impact. I've found this with many of those words: they're like occult tone poems conveying a hidden and strange knowledge; they're able to give texture to a picture that would otherwise be flat. Any true defense of Theory must, I contend, give due deference to the sharp beauty that these sometimes-hermetic words convey.

As a totally unscientific sample, I queried a number of my academic (and recovering academic) colleagues on social media to see what words they would add to a list of favorite terms; the jargon that others might roll their eyes at, or hear as grad school clichés, but that is estimably useful, and dare I say it—beautiful. People's candidates could be divided in particular ways, including words that remind us of some sort of action, words that draw strength from an implied metaphorical imagery, and words that simply have an aural sense that's aesthetically pleasing (and these are by no means exhaustive or exclusive). For example, Derrida's concept of "deconstruction," a type of methodological meta-analysis that reveals internal contradictions within any text, so as to foreground interpretations that might be hidden, was a popular favorite. "Deconstruction" sounds like an inherently practical term, a word that contractors rather than literary critics might use: the prefix connotes ripping things down while the rest of the word gestures towards building them (back?) up. A similar word that several respondents mentioned, albeit one with less of a tangible feel to it, was "dialectics," which was popularized in the writings of the 19th-century German philosopher Georg Wilhelm Friedrich Hegel, was mediated through Karl Marx, and was then applied to everything by the Frankfurt School. As with many of these terms, "dialectics" has variable meaning depending on who is using it, but it broadly refers to an almost evolutionary process whereby the internal contradictions of a concept are reconciled, propelling thought into the future. For all the materialist deployment of the term by Marx and his followers, the actual word has an almost mystical gloss to it, the trochaic rhythm of the word with its up-down-up-down beat evoking the process of thesis-antithesis-synthesis to which the term itself applies. Something about the very sound of "dialectic" evokes both cutting and burying to me, the psychic struggle that the word is supposed to describe.

Then there are the words that are fueled with metaphorical urgency, short poems in their own right, often appropriated from other disciplines. Foucault used words like "genealogy" or "archeology" when some might think that "history" would be fine, and yet those words do something subtly different than the plodding narrative implied by the more prosaic word. With the former there is a sense of telling a story that connects ideas, trends, and themes within a causal network of familial relations; the latter recalls excavation and the revealing of that which remains hidden (or cursed). Deleuze and Guattari borrowed the term "rhizome" from botany, where it originally described the complex branching of root systems, now reapplied to how non-hierarchical systems of knowledge propagate. "Rhizome" pays homage to something of beauty from a different way of understanding the world—it is not filching, it is honoring. The Italian Marxist Antonio Gramsci similarly borrowed the term "subaltern," later popularized by Gayatri Chakravorty Spivak, for whom it came to designate communities of colonized people who are simultaneously exoticized and erased by imperial powers. The word itself was a term used for junior officers in the British colonial service. Finally, I'm partial to "interiority" myself, used to denote fictional representations of consciousness or subjectivity. Yet "interiority," with its evocation of a deep subterranean network or the domestic spaces of a many-roomed mansion, says something about consciousness that the more common word doesn't quite capture.

My favorite critical jargon word, however, is "liminal." All of us who work on academic Grub Street have our foibles, the go-to scholarly tics marking our prose like an oily fingerprint left on Formica. We all know the professor with their favored jargon turn (often accompanied by an equivalent hand movement, like an intricate form of Neapolitan), or the faculty member who is given to yelling out "Hegemonic!" at inopportune times. Thus, I can't help but sprinkle my own favored term into my writing like paprika in Budapest goulash. My love for the word, used to designate things that are in-between, transitioning, and not quite formed, has less to do with its utility than with the mysterious sense of the sounds that animate it. It's always been oddly onomatopoeic to me, maybe because it's a near homophone of "illuminate," and makes me think of dusk, my favorite time of day. When I hear "liminal" it reminds me of moonbeams and cicadas at sunset; it reminds me that the morning star still endures even at dawn. An affection for the term has only a little to do with what's useful about it, and everything to do with that connotative ladder that stretches out beyond its three syllables. I suspect that when we love these words, this jargon, it's an attraction to their magic, the uncanny poetry hidden behind the seemingly technocratic. The best of Theory exists within that liminal space, between criticism and poetry; justifying itself by recourse to the former, but always actually on the side of the latter—even if it doesn't know it.

Image Credit: Wikipedia

On Obscenity and Literature

“But implicit in the history of the First Amendment is the rejection of obscenity as utterly without redeeming social importance.” —Associate Justice William J. Brennan Jr., Roth v. United States (1957)

Interviewer: Speaking of blue, you’ve been accused of vulgarity. Mel Brooks: Bullshit! —Playboy (February, 1975)

On a spring evening in 1964 at the Café Au Go Go in Greenwich Village, several undercover officers from the NYPD's vice squad arrested Lenny Bruce for public obscenity. Both Bruce and the club's owner Howard Solomon were shouldered out through the crowded club to waiting squad cars, their red and blue lights reflected off the dirty puddles pooled on the pavement of Bleecker Street. For six months the two men would stand trial, with Bruce's defense attorney calling on luminaries from James Baldwin to Allen Ginsberg, Norman Mailer to Bob Dylan, to attest to the stand-up's right to say whatever he wanted in front of a paying audience. "He was a man with an unsettling sense of humor," write Ronald K.L. Collins and David M. Skover in The Trials of Lenny Bruce: The Fall and Rise of an American Icon. "Uncompromising, uncanny, unforgettable, and unapologetic…His words crossed the law and those in it. He became intolerable to people too powerful to ignore. When it was over, not even the First Amendment saved him." The three-judge tribunal sentenced Bruce to four months' punishment in a workhouse. Released on bail, he never served a day of his conviction, overdosing on morphine in his Hollywood Hills bungalow two years later. He wouldn't receive a posthumous pardon until 2003.

"Perhaps at this point I ought to say a little something about my vocabulary," Bruce wrote in his (still very funny) How to Talk Dirty and Influence People: An Autobiography. "My conversation, spoken and written, is usually flavored with the jargon of the hipster, the argot of the underworld, and Yiddish." Alongside jazz, Jewish-American comedy is one of the few uniquely American contributions to world culture, and if that comparison can be drawn further, then Bruce was the equivalent of Dizzy Gillespie or Miles Davis—he was the one who broke it wide open. Moving comedy away from the realm of the Borscht Belt one-liner, Bruce exemplified the emerging paradigm of stand-up as a spoken-word riff of personal reflection and social commentary, while often being incredibly obscene. The Catskills comedian Henny Youngman may have been inescapably Jewish, but Bruce was unabashedly so. And, as he makes clear, he proudly drew his diction from the margins, hearing more truth in the dialect of the ethnic Other than in mainstream politeness, more honesty in the junky's language than in the platitudes of the square, more righteous confrontation in the bohemian's obscenity than in the pieties of the status quo. Among the comics of that golden age of stand-up, only Richard Pryor was his equal in bravery and genius, and despite the fact that some of his humor is dated today, books like How to Talk Dirty and Influence People still radiate an excitement that a mere burlesque performer could challenge the hypocrisy and puritanism of a state that would just as soon see James Joyce's Ulysses and D.H. Lawrence's Lady Chatterley's Lover banned and their publishers hauled to jail as actually confront any of the social ills that infected the body politic.

What separates Bruce from any number of subsequent comics is that within his performances there was a fully articulated theory of language. "Take away the right to say the word 'fuck' and you take away the right to say 'fuck the government,'" he is reported to have said, and this is clearly and crucially true. That's one model of obscenity's utility: its power to lower the high and to raise the low, with vulgarity afforded an almost apocalyptic power of resistance. There is a naivety, however, that runs through the comedian's work, and that's that Bruce sometimes doesn't afford language enough power. In one incendiary performance from the early '60s, Bruce went through a litany of ethnic slurs for Black people, Jews, Italians, Hispanics, Poles, and the Irish, finally arguing that "it's the suppression of the word that gives it the power, the violence, the viciousness." He imagines a scenario whereby the president would introduce members of his cabinet by using those particular words, and concludes that following such a moment those slurs wouldn't "mean anything anymore, then you could never make some six-year-old black kid cry because somebody called him" that word at school. Bruce's idealism is almost touching—let it not be doubted that he genuinely believed language could work in this way—but it's also empirically false. Having died a half-century ago, he can't be faulted for his ignorance on this score, but now that we have a president who basically does what Bruce imagined his hypothetical Commander-in-Chief doing, I think we can emphatically state that the repetition of such ugliness does nothing to dispel its power.

Discussions about obscenity often devolve into this bad-faith dichotomy—the prudish schoolmarms with their red pens painting over anything blue and the brave defenders of free speech pushing the boundaries of acceptable discourse. The former hold that there is a certain power to words that must be tamed, while the latter champion the individual right to say what they want to say. When the issue is phrased in such a stark manner, it occludes a more discomforting reality—maybe words are never simply utterances, maybe words can be dangerous, maybe words can enact evil things, and maybe every person has an ultimate freedom to use those words as they see fit (notably a different claim than that people should be able to use them without repercussion). Bruce's theory of language is respectably semiotic, a contention about the arbitrary relationship between signifier and signified, whereby that chain of connection can be severed by simple repetition, as when sense flees from a word said over and over again, whether it's "potato" or "xylophone." But he was ultimately wrong (as is all of structural and post-structural linguistics)—language is never exactly arbitrary, it's not really semiotic. We need theurgy to explain how words work, because in an ineffable and numinous way, words are magic. When it comes to obscenity in particular, whether the sexual or the scatological, the racial or the blasphemous, we're considering a very specific form of that magic, and while Bruce is correct that a prohibition on slurs would render resistance to oppression all the more difficult, he's disingenuous in not also admitting that it can provide a means of cruelty in its own right. If you couldn't say obscenities, then a certain prominent tweeter of almost inconceivable power and authority couldn't deploy them almost hourly against whatever target he sees fit. This is not an argument for censorship, mind you, but it is a plea to be honest in our accounting.

Obscenity as social resistance doesn't have the same cachet it once did, nor is it always interpreted as unassailably progressive (as it was for Bruce and his supporters). In our current season of a supposed Jacobin "cancel culture," words have been ironically re-enchanted with the spark of danger that was once associated with them. Whether or not those who claim that there is some sort of left McCarthyism policing language are correct, it's relatively anodyne to acknowledge that right now words are endowed with a significance not seen since Bruce appeared in a Manhattan courtroom. Whatever your own stance on the role that offensiveness plays in civilized society, obscenity can only be theorized through multiple perspectives. Four-letter words inhabit a nexus of society, culture, faith, linguistics, and morality (and the law). A "fuck" is never just a "fuck," and a shit by any other name wouldn't smell as pungent. Grammatically, obscenities are often classified as "intensifiers," that is, placeholders that exist to emphasize the emotionality of a given declaration—think of them as oral exclamation marks. Writing in Holy Sh*t: A Brief History of Swearing, Melissa Mohr explains that vulgarity is frequently "important for the connotation it carries and not for its literal meaning." Such a distinction came into play in 2003 after the Irish singer Bono of U2 was cited by the Federal Communications Commission when, upon winning a Golden Globe, he exclaimed "fucking brilliant." The commission's Enforcement Bureau initially decided that Bono's f-bomb wasn't indecent since its use clearly wasn't in keeping with the sexual definition of the word, a verdict that was later rescinded higher up within the FCC.

"Historically," Mohr writes, "swearwords have been thought to possess a deeper, more intimate connection to the things they represent than do other words," and in that regard the pencil-necked nerds at the FCC ironically showed more respect to the dangerous power of fucking than did Bono. If vigor of emotion were all one was looking for in language, any number of milquetoast words would work as well as a vulgarity, and yet obscenity (even if uttered due to a stubbed toe) is clearly doing something a bit more transcendent than more PG terms—for both good and bad. Swearing can't help but have an incantatory aspect to it; we swear oaths, and we're specifically forbidden by the Decalogue from taking the Lord's name in vain. Magnus Ljung includes religious themes in his typology of profanity, offered in Swearing: A Cross-Cultural Linguistic Study, as one of "five major themes that recur in the swearing of the majority of the languages discussed and which are in all likelihood also used in most other languages featuring swearing." Alongside religious profanity, Ljung recognizes themes concerning scatology, sex organs, sexual activities, and family insults. To this, inevitably, must also be added ethnic slurs. Profanity is by definition profane, dealing with the bloody, pussy, jizzy reality of what it means to be alive (and thus the lowering of the sacred into that oozy realm is part of what blasphemously shocks). Obscenity has a quality of the theological about it, even while religious profanities have declined in their ability to shock an increasingly secular society.

Today a word like "bloody" sounds archaic or Anglophilic, and almost wholly inoffensive, even while its (now forgotten) reference to Christ's wounds would have been scandalous to an audience reared on the King James Bible. This was the problem that confronted the television writer David Milch, who created the classic HBO western Deadwood. The resultant drama (with dialogue largely composed in iambic pentameter) was noted as having the most per capita profanity of any show to ever air, but in 1870s Dakota most of those swears would have been religious in nature. Since having Al Swearengen (a perfect name if ever there was one) sound like Yosemite Sam would have dulled the shock of his speech, Milch elected to transform his characters' language into scatological and ethnic slurs, the latter of which still has the ability to upset an audience in a way that "by Christ's wounds!" simply doesn't. When Swearengen offers up his own theory of language to A.W. Merrick, who edits Deadwood's newspaper, arguing that "Just as you owning a print press proves only an interest in the truth, meaning up to a fucking point, slightly more than us others maybe, but short of a fucking anointing or the shouldering of a sacred burden—unless of course the print press was gift of an angel," he provides a nice synthesis of the blasphemous and the sexual. The majority of the copious swears in Deadwood are of the scatological, sexual, or racial sort, and they hit the eardrum with far more force than denying the divinity of Christ does. When Milch updated the profanity of the 19th century, he knew what would disturb contemporary audiences, and it wasn't tin-pot sacrilege.

All of which is to say that while obscenity has a social context, with what's offensive being beholden to the mores of a particular century, the form itself universally involves the transgression of propriety, with the details merely altered to the conventions of a time and place. As an example, watch the 2005 documentary The Aristocrats, produced by the magician Penn Jillette, which features dozens of tellings of the almost unspeakably taboo joke of the same name. The joke was long an after-hours exercise in which comedians would try to one-up each other in the degree of profanity offered, and the film presents several iconic performers giving variations on the sketch. When I saw the film after it came out, the audience was largely primed for the oftentimes extreme sexual and scatological permutations of the joke, but it was the tellings that involved racial slurs and ethnic stereotypes that stunned the other theatergoers. It's the pushing of boundaries in and of itself, rather than the subject in question, that designates something as an obscenity. According to Sigmund Freud in his (weirdly funny) The Joke and Its Relation to the Unconscious, vulgar humor serves a potent psychological purpose, allowing people "to enjoy undisguised obscenity" that is normally repressed so as to keep "whole complexes of impulses, together with their derivatives, away from consciousness." Obscenity thus acts as a civilizational pressure valve for humanity's chthonic impulses.

That words which are considered obscene are often found in the vocabulary of the marginalized isn't incidental, and it recommends spicy language as a site of resistance. English swearing draws directly from one such point of contact between our "higher" and our "lower" language. The majority of English swears have a Germanic origin, as opposed to a more genteel Romance origin (whether from French or Latin). In keeping with their Teutonic genesis, they tend to have an abrasive, guttural, jagged quality to their sounds, the better to convey an onomatopoeic force. Take a look at the list that comprises comedian George Carlin's 1972 bit "Seven Words You Can Never Say on Television." Four of them definitely have an Old English etymology, traceable back to the West Germanic dialect of the Angles, Saxons, Frisians, and Jutes who occupied Britain in the later centuries of the first millennium. Three of them – the one that rudely refers to female genitalia, the one that tells you to rudely do something sexual, and the one that tells you to do that thing to your mother – may have Latin or Norman origins, though linguists think they're just as likely to come from what medievalists used to call "Anglo-Saxon." Most of these words had no obscene connotations in their original context; in Old English the word for urine is simply "piss," and the word for feces is "shit." Nothing was dirty about either word until the eleventh-century Norman invasion of Britain privileged the French over the English. That stratification, however, gives a certain gutter enchantment to those old prosaic terms, endowing them with the force of a swear. Geoffrey Hughes writes in Swearing: A Social History of Foul Language, Oaths, and Profanities in English that the "Anglo-Saxon element… provides much more emotional force than does the Norman French or the Latin. Copulating pandemonium! conveys none of the emotional charge of the native equivalent fucking hell!" Invasion, oppression, and brutality mark those words which we consider to be profane, but they also give them their filthy enchantments.

What's clear is that the class connotations of what Bruce called an "argot" can't be ignored. Swearing is the purview of criminals and travelers, pirates and rebels, highwaymen and drunks. For those lexicographers who assembled lists of English words in the early modern era, swearing, or "canting," provided an invaluable window into the counter-cultural consciousness. The Irish playwright Richard Head compiled The Canting Academy, or Devil's Cabinet Opened in 1673, arguably the first full-length English "dictionary," appearing decades before Dr. Johnson's staider 1755 A Dictionary of the English Language. Decades before Head's book, short pamphlets by respectable playwrights from Thomas Dekker to Thomas Middleton similarly illuminated readers on the criminal element's language—other examples that were included as appendices within books, such as Thomas Harman's A Caveat or Warning for Common Cursitors, go back well into the 16th century. Such "canting guides," exploring the seamy underbelly of the cockney capital, were prurient pamphlets that illustrated the salty diction of thieves and rogues for the entertainment of the respectable classes. One of the most popular examples was the anonymously edited A New Dictionary of the Terms Ancient and Modern of the Canting Crew, first printed in 1698. Within, readers could learn the definitions of insults from "blobber-lipped" to "jobber-not." Such dictionaries (which included words like "swindler" and "phony," which still survive today) drew from the English underclass, with a motley vocabulary made up of words from rough-hewn English, Romani, and ultimately Yiddish, among other origins.

A direct line runs from the vibrant, colorful, and earthy diction of canting to cockney rhyming slang, or to the endangered dialect of Polari used for decades by gay men in Great Britain, who lived under the constant threat of state punishment. All of these tongues are "obscene," but that's a function of their oppositional status to received language. Nothing is "dirty" about them; they are, rather, rebellions against "proper" speech, "dignified" language, "correct" talking, and they challenge that codified violence implied by the mere existence of the King's English. Their differing purposes, and respective class connotations and authenticity, are illustrated by a joke wherein a hobo asks a nattily dressed businessman for some change. "'Neither a borrower nor a lender be'—that's William Shakespeare," says the businessman. "'Fuck you'—that's David Mamet," responds the panhandler. A bit of a disservice to the Bard, however, who along with Dekker and Middleton could cant with the best of them. For example, within the folio one will find "bawling, blasphemous, incharitible dog," "paper fac'd villain," and "embossed carbuncle," among other similarly colorful examples.

An entire history could be written about early instances of notable obscenities, which of course necessitates trawling the Oxford English Dictionary for dirty words that appear particularly early. For "shit," there is a 1585 instance of the word in a Scottish "flyting," an extemporaneous poetic rhyme-battle held in Middle Scots, which took place between Patrick Hume and Alexander Montgomerie. The greatest example of the form is the 15th-century Flyting of Dunbar and Kennedy, containing the first printed instance of the word "fuck." In the OED, our good friend the dirty lexicographer Richard Head has the earliest example given in the entry for the word "fuck," the profanity appearing as a noun in his play Hic et Ubique: or, The Humors of Dublin, wherein a character says "I did creep in…and there I did see [him] putting the great fuck upon my wife." And the dictionary reflects the etymological ambiguity concerning the faux-francophone/faux-Virgilian word "dildo," giving earliest attribution to the playwright Robert Greene, who in 1590 in his comedy Never Too Late wrote "Dildido dildido, Oh love, oh love, I feel thy rage rumble below and above." Swearing might be a radical alternative to received language, but it pulses through literature like a counter-history, a shadow realm of the English tongue's full capabilities. It is a secret language, the twinned double of more respectable letters, and it's unthinkable to understand Geoffrey Chaucer without his scatological jokes or Shakespeare minus his bawdy insults. After all, literature is just as much Charles Bukowski as T.S. Eliot; it's William S. Burroughs and not just Ezra Pound.

Sometimes those dichotomies about what language is capable of are reconciled within the greatest of literature. A syllabus of the immaculate obscene would include the Marquis de Sade's 120 Days of Sodom, Charles Baudelaire's The Flowers of Evil, Gustave Flaubert's Madame Bovary, Joyce's Ulysses, Lawrence's Lady Chatterley's Lover, Vladimir Nabokov's Lolita, Henry Miller's Tropic of Cancer (smuggled out of a Barnes & Noble by yours truly when I was 16), and Irvine Welsh's Trainspotting. Along with his fellow Scotsman James Kelman, Welsh shows the full potential of obscenity to present an assault on the pieties of the bourgeois, mocking Madison Avenue sophistry when he famously implores the reader to "Choose rotting away, pishing and shiteing yersel in a home, a total fuckin embarrassment tae the selfish, fucked-up brats ye've produced. Choose life." Within the English language, looming above all as the progenitor of literary smut, is the great British author John Cleland, who in 1748 published our first pornographic novel in Fanny Hill: Or, Memoirs of a Woman of Pleasure, wherein he promised "Truth! stark naked truth, is the word, and I will not so much as take the pains to bestow the strip of a gauze-wrapper on it." Cleland purposefully wrote Fanny Hill entirely in euphemisms and double entendres, but the lack of dirty words couldn't conceal the fact that the orgiastic bildungsroman about a middle-aged nymphomaniac was seen as unspeakably filthy. The novel has the distinction of being the longest-banned work in U.S. history, first prohibited by the Massachusetts Supreme Court in 1821, only to be sold legally after the U.S. Supreme Court ruled that its censorship was unconstitutional in 1966. The same year that Bruce was found face-down, naked and dead, in his California bathroom.

It is a goddamn unequivocal fucking triumph of the human spirit that any fucking wanker can march into a public library and check out a copy of Fanny Hill. That liberty is one that was hard fought for, and we should look askance at anyone who'd throw it away too cavalierly. But there is also something disingenuous in dismissing all those who suppressed works like Fanny Hill or Ulysses or Lady Chatterley's Lover as mere prigs and prudes. A work is never censored because it isn't powerful; it's attacked precisely because of that coiled, latent energy that exists within words, never more so than those that we've labeled as forbidden. If the debate over free speech and censorship is drenched in a sticky oil of bad faith, then that slick spills over into all corners. My fellow liberals will mock the conservative perspective that says film or comic books or video games or novels are capable of moving someone to action, sometimes very ugly action—but of course literature is capable of doing this. Why would we read literature otherwise? Why would we create it otherwise? The censor with his black marker in some ways does due service to literature, acknowledging its significance and its uncanny effect. To claim that literature shouldn't be censored because all literature is safe is not just fallacious, it's disrespectful. The far more difficult principle is that literature shouldn't be censored despite the fact that it's so often dangerous.

Like any grimoire or incantation, obscenity can be used to liberate and to oppress, to free and to enslave, to bring down those in power but also to froth a crowd into the most hideous paroxysms of fascistic violence. So often the moralistic convention holds that "punching down" is never funny, but the dark truth is that it often is. What we do with that reality is the measure of us as people, because obscenity is neither good nor bad, but all power resides in the mouth of whoever wields it. What we think of as profanity is a rupture within language, a dialectic undermining conventional speech, what the Greeks called an aporia that constitutes the moment that rhetoric breaks down. Obscenity is when language declares war on itself, often with good cause. Writing in Rabelais and His World, the great Russian critic Mikhail Bakhtin defined what he called the "carnivalesque," that is, the principle that structured much medieval and Renaissance performance and literature, whereby the "principle of laughter and the carnival spirit on which the grotesque is based destroys…seriousness and all pretense." Examining the Shrovetide carnivals that inaugurated pre-Reformation Lent, Bakhtin optimistically saw something liberatory in the ribald display of upended hierarchies, where the farting, shitting, pissing, vomiting hilarity rendered authority foolish. "It frees human consciousness," Bakhtin wrote, "and imagination for new potentialities."

An uneasy and ambivalent undercurrent threads through Bakhtin's argument, though. If the carnival allowed for a taste of emancipation, there was also always the possibility that it was just more bread and circuses, a way to safely "rebel" without actually challenging the status quo. How many of our fucks and shits are just that, simply the smearing of feces on our playpen walls? Even worse, what happens when the carnival isn't organized by plucky peasants to mock the bishops and princes, but when the church and state organize those mocking pageants themselves? Bakhtin didn't quite anticipate the troll, nor did Bruce for that matter. Gershon Legman writes in the standard text Rationale of the Dirty Joke: An Analysis of Sexual Humor that "Under the mask of humor, our society allows infinite aggressions, by everyone and against everyone. In the culminating laugh of the listener or observer…the teller of the joke betrays his hidden hostility." Can't you take a joke? Because I was just joking. Legman's reading of obscenity is crucial—it's never just innocent, it's never just nothing, it's never just words. And it depends on who is saying them, and to whom they're being said. Because swearing is so intimately tied to the theological, the use of profanity literally takes on the aura of damnation. It's not that words aren't dangerous—they are. But that doesn't mean we must suture our mouths, even as honesty compels us to admit that danger. What we do with this understanding is the process that we call civilization. Because if Lenny Bruce had one unassailable and self-evident observation, it was that "Life is a four-letter word." How could it be fucking otherwise?

Bonus Link: Pussy Riot: One Woman's Vagina Takes on Japan's Obscenity Laws

Image Credit: Flickr/Jeniffer Moo

Secrets That Hold Us

1.
"I didn't grow up with my mother," Mom tells me. We're sitting on the front verandah looking out at the grass before us wilting in the afternoon sun and the dwarf coconut trees that line one side of the driveway. The coconut fronds dip with the breeze, revealing green- and yellow-husked coconuts. It's hot on the verandah; the aluminum awning, put there years earlier to block the sun and rain, traps the late afternoon heat as well.
“I was 12 before I knew my mother,” she says.
My mother is responding to some transgression of mine, what specifically I don't remember. Perhaps something I neglected to tell her. There's a lot we didn't talk about then and don't talk about now—a trait I came by honestly, I now know.
To grow up in Jamaica is to hear and know these stories of mothers who have migrated abroad or to Kingston, leaving their children to be raised by another relative. Barrel children, we call them, a term that stems from the fact that the migrating parents often send barrels of food and clothes back home for their children. But this is not my mother’s story.
I am in my 20s when she tells me about meeting her mother for the first time. My mother was 12, living in Clarendon on the southern side of Jamaica with her father's twin sisters, a cousin, and three of her father's other children. She talks about that day as if it's nothing remarkable: A visitor comes and one of the aunts tells my mother to take the woman to another relative's house some yards away on a vast tract of land that is subdivided for various relatives. They're on their way when my grandaunt calls my mother back. It turns out they were watching to see what my mother would do, how long it would take her to start quizzing the visitor.
“That’s your mother,” my grandaunt tells her. My mother was a baby when her mother left her there with the aunts, too young to recognize her own mother those 12 years later.
Among her siblings, my mother’s story is not unique. My mother’s father worked with the now-defunct railway service in Jamaica, crossing the country on the east-west train line. I don’t know how or where he met the mothers of his children. But it seems, he collected his offspring and deposited them in Clarendon for his sisters to raise. “I don’t know what arrangement Papa Stanley had with her,” my mother says of her parents.
My mother’s mother—Mother Gwen—raised three boys. “She gave away the girl and kept the boys,” my mother says. In it, I hear the sting of abandonment and the loss of having grown up without a concrete reason for her mother’s absence. It’s years before my mother sees her mother again, and by then she’s an adult with a life far different from that of her mother and the sons her mother kept and raised.
2.
My mother came of age with the generation that ushered in Jamaica's independence from Britain in 1962, and she lives the Jamaican dream, that of moving off the island for a better life elsewhere. Generation after generation of Jamaicans have moved abroad—some to Panama to help build the canal, some to Cuba to work the sugar cane fields, others to Britain during the Windrush era, and countless others to America. A year after my parents married in 1967, they moved to America for undergraduate studies at Tuskegee University, and later to Urbana, Ill., for my father's graduate studies. Driven away by the cold, winter's wrath, snow measured in feet instead of inches, my parents returned to Jamaica in 1971, with the first of their three daughters. My father settled into a job with a unit of a bauxite company and my mother began teaching early childhood education, supervising teachers and later teaching at a teachers college.
In 1977, they bought the house they still own—a split-level with a front lawn that slopes down to the street. The house overlooks a valley, where a river once ran. Across the valley is the main road into town and a forested cliff side. We sit on that verandah, my mother and I, looking out as she tells me about this grandmother I hadn’t known about until recently. The openness of the yard, the grass sloping down to the road, contrasts with the family secret she’s held on to for so long.
3.
I am already out of college and in my early 20s when my sisters and I rediscover Mother Gwen. We had, the three of us, come to our separate conclusions that our maternal grandmother had long been dead. I don’t have a specific reason for thinking she was dead—no memory of a funeral, no snippets of a conversation at the back of my mind. But growing up, we made weekly or biweekly trips to visit our relatives in different parts of the island: my paternal grandparents in Anchovy, a small town in the rambling hills that look down on Montego Bay; my maternal grandfather in Kingston; the grandaunts who raised my mother in the woods of Clarendon amidst coffee and cocoa and citrus plants. Now, I suspect that I assumed she had passed on because she was not among the relatives we visited, and unlike my other relatives, I have no specific memory of her—no Christmas or Boxing Day dinners, no visits to her after Easter Sunday services.
Grandma—my father’s mother—I remember clearly. She baked birthday cakes in three different sizes because my sisters and I celebrated our birthdays in back-to-back-to-back months—April, May, and June. My older sister got the largest cake, I got the mid-sized cake, and my younger sister got the smallest cake. Grandma divided money in a similar manner—$20, $10 and $5. Our birth order mattered to her.
I don’t have childhood memories of Mother Gwen, or of the sons she raised without my mother. Instead, I have a vague recollection of an unfinished house, a metal drum by the side of the house, and large red- and orange-colored fish swimming in the makeshift aquarium. I don’t know if the house of my memory and the house where we rediscovered our grandmother are the same, but it’s the memory that came to me when we walked into the house that Sunday afternoon and my mother said to her mother, who was slowly going blind, “These are my daughters, your granddaughters.” The specifics of the conversation are lost to me now, but I imagine we must have talked with our grandmother and our uncles about our studies, our lives in America. Our uncles gave us trinkets they made or bought to sell—bracelets with the Jamaican colors: black, green and yellow.
Even now, I can see the shock on my sisters’ faces—eyebrows going up, eyes widening, exaggerated blinks. How could we have lived so long, on an island as small as Jamaica, without knowing our mother’s mother still lived?
4.
On the verandah that day, my mother tells me that she feared her brothers, feared they would harm her children in some way. In response, she kept us away. My mother doesn’t give a concrete reason for fearing her brothers but vaguely says something about the differences in their lives, what she had built of her life and what they hadn’t made of theirs—and the possibility that jealousy of what she had accomplished would bubble over. Unlike my mother, who taught at a teachers college for most of my childhood, her brothers—all Rastafarians—made and sold trinkets in various roadside stalls and markets. But they had feared her return to their lives, thinking that perhaps she’d come to claim what she thought should be hers. Perhaps her brothers said something that gave my mother pause.
Without my sisters and me, my mother drifted in and out of Mother Gwen’s life, not taking us back until we were in our 20s, young adults embarking on our own lives. By the time I rediscovered my grandmother and her sons, it was too late for her to become grandma or her sons to become uncles.
But later in her life, my mother truly became Mother Gwen’s daughter, driving some 60 miles every Saturday to Spanish Town—where Mother Gwen lived with her only surviving son—with bags of clean laundry and bags and boxes of groceries: fish, chicken, yam, bread, eggs, pumpkin, thyme, and sometimes a pot of fresh soup. Then she’d call out to Mother Gwen, blind by then and a little hard of hearing, before walking into the room where she slept or sat in the doorway to catch the Jamaican breeze. My mother gathered the soiled bed sheets and clothes, hand-washed some before she left, and hung them on the line in the back to dry. The rest my mother packed to take home, where she washed them, only to exchange them for a newly soiled batch a few days later.
On the few occasions I was there, my uncle hovered, keeping watch over my mother, and when he had me for an audience, he turned the small verandah into a stage and spouted word-for-word Marcus Garvey speeches he had rehearsed, every inflection perfectly placed, his eyes staring straight ahead as if looking at the words scrawled in the air. Thin and wiry, he talked about the occasions on which he’d been invited to recite a Marcus Garvey speech, the opportunities he’d missed to make a career out of this ability of his. Sometimes he talked at length about the reasons for a decision or the reasons he hadn’t been able to make more of his life—his mother, of course, was the main reason.
Mother Gwen died blind, completely dependent on the daughter she had given up.


5.
Mother Gwen left her daughter, but my mother never left hers. For my mother, I think her mother’s absence is like a shroud she can never remove. And I think it’s why she was always there for her children.
I only have two memories of my mother not being with us. Once, she took a sabbatical from the college where she taught and spent some time in New York. I don’t recall the length of her absence, just that I got sick the very day she left; the helper who was with us fed me tea made from the leaf of a lime tree and cream soda. I’ve never liked either since. The second time—the summer after my older sister graduated from high school—my mother took my sister to New York and left my younger sister and me at home with our father. We’d always traveled together—my sisters, our mother, and I—so this was new. The morning after their departure, after my father had left for work, the phone rang—an operator with a collect call for my mother. In the background, I heard a cousin saying Papa Stanley had died. I didn’t even have to accept the charge, for I already knew the message I had to pass on: my mother’s father was dead. Mom returned within the week.
6.
The absence of my mother’s mother lives with me, too: an obsession I cannot shake, a recurring motif that springs from my unconscious into most of my longer pieces of fiction. These days, the recurring theme in my novels is mothers who don’t raise their children. My first novel, River Woman, is about a young woman who loses her son, and whose mother returns to Jamaica after years abroad for her grandson’s funeral. Their reunion is fraught with tension. My second novel, Tea by the Sea, is about another mother who spends 17 years searching for a daughter taken from her at birth.
While I didn’t set out to write my mother’s story, the ideas of abandonment and loss and belonging have crept into my work and remained there, a lurking obsession that I don’t yet seem able to escape. Perhaps, it’s my unconscious attempt to reach under the layers of family stories to discover why my mother holds on to these family secrets and stores them, as if they will be her undoing.
Sometimes I think the secrets my mother holds trap her into bearing responsibility for her father’s transgressions, for the circumstances of her birth. It’s not her burden to carry, and yet she does. I see it in her response to the news of another brother, another of her father’s children, whom she learned about not long before her father’s death. She had known the young man, taught him at school, knew him with a surname different from her own. As she tells it, her father said the young man, now grown, was coming to visit him. Why, my mother asked. “He’s my son.” Even now, years and years later, my mother still hasn’t fully accepted him. She says, “I didn’t know him as a brother,” and recalls his mother naming him as another man’s child—a jacket, in Jamaican terms—as if those circumstances are her brother’s to bear.
As a writer, I can fictionalize the reasons family members hold onto secrets long after they have lost their usefulness. Or, on the page, they can let them go, setting free the secrets that trap a child into bearing responsibility for a parent’s mistakes.
7.
My mother is nearing 80. Cancer has weakened her body, perhaps lowered her defenses. She invites her brother—the only one of her mother’s three sons still alive—his daughter, and two grandchildren to visit her home. They come on a Sunday in March, the first time in the 42 years my parents have owned that house that he has visited. She makes cupcakes with her grandniece. And when my mother talks of the visit there is joy in her voice.
I imagine them on the verandah looking out at the expanse of green before them: a brother and a sister nearer to the end of their lives than the beginning, both reaching across the years to their memories of the mother who held them together.

Ten Ways to Save the World

1. In a purple-walled gallery of the Smithsonian American Art Museum, you can visit the shrine constructed by Air Force veteran and janitor James Hampton for Jesus Christ’s return. Entitled “Throne of the Third Heaven of the Nation’s Millennium General Assembly,” the altar and its paraphernalia were constructed to serve as temple objects for the messiah, who, according to Hampton, based on visions he had of Moses in 1931, the Virgin Mary in 1946, and Adam in 1949, shall arrive in Washington, D.C. His father had been a part-time gospel singer and Baptist preacher, but Hampton drew not just from Christianity; he also brought the Afrocentric folk traditions of his native South Carolina to bear in his composition. The shrine is decorated with passages from Daniel and Revelation, and Hampton’s thinking (worked out in secret over 14 years in his Northwest Washington garage) is explicated in his 100-page manifesto St. James: The Book of the 7 Dispensations (dozens of pages are still in an uncracked code). In that notebook, claiming to have received a revised version of the Decalogue, Hampton declared himself “Director, Special Projects for the State of Eternity.” His work is a fugue of word salad, a concerto of pressured speech. A staging ground for the incipient millennium, Hampton’s shrine is a triumph.

As if the bejeweled shield of the Urim and the Thummim were constructed not by Levites in ancient Jerusalem, but by a janitor in Mt. Vernon. Exodus and Leviticus give specifications for those liturgical objects of the Jewish Temple—the other-worldly cherubim gilded, their wings touching the hem of infinity as they huddle over the Ark of the Covenant; the woven brocade curtain with its many-eyed Seraphim rendered in fabric of red and gold; the massive candelabra of the ritual menorah. The materials with which the Jews built their Temple were cedar and sandstone, gold and precious jewels. When God commanded Hampton to build his new shrine, the materials were light bulbs and aluminum foil, door frames and chair legs, pop cans and cardboard boxes, all held together with glue and tape. The overall effect is, if lacking in gold and cedar, transcendent nonetheless. Hampton’s construction looks almost Mesoamerican, aluminum foil delicately hammered onto carefully measured cardboard altars, the names of prophets and patriarchs from Ezekiel to Abraham rendered across them.

Every day Hampton would return from his job at the General Services Administration, where he mopped floors and disinfected counters, and for untold hours he’d assiduously sketch out designs based on his dreams, carefully applying foil to wood and cardboard, constructing crowns from trash he’d collected on U Street. What faith would compel this, what belief to see it finished? Nobody knew he was doing it. Hampton would die of stomach cancer in 1964, never married, and with few friends or family. The shrine would be discovered by a landlord angry about late rent. Soon it would come to the attention of reporters, and then the art mavens who thrilled to the discovery of “outsider” art—that is, work accomplished by the uneducated, the mentally disturbed, the impoverished, the religiously zealous. “Throne of the Third Heaven of the Nation’s Millennium General Assembly” would be purchased and donated to the Smithsonian (in part through the intercession of the artist Robert Rauschenberg), where it would be canonized as the Pietà of American visionary art, outsider art’s Victory of Samothrace.

Hampton wasn’t an artist though—he was a prophet. He was Elijah and Elisha awaiting Christ in the desert. Daniel Wojcik writes in Outsider Art: Visionary Worlds and Trauma that “apocalyptic visions often have been expressions of popular religiosity, as a form of vernacular religion, existing at a grassroots level apart from the sanction of religious authority.” In that regard Hampton was like so many prophets before him, just working in toilet paper and beer cans rather than papyrus—he was Mt. Vernon’s Patmos. Asking if Hampton was mentally ill is the wrong question; it’s irrelevant whether he was schizophrenic or bipolar. Etiology only goes so far in deciphering the divine language, and who are we, so sure of ourselves, to say that the voice in a janitor’s head wasn’t that of the Lord? Greg Bottoms writes in Spiritual American Trash: Portraits from the Margins of Art and Faith that Hampton “knew he was chosen, knew he was a saint, knew he had been granted life, this terrible, beautiful life, to serve God.” Who among us can say that he was wrong? In his workshop, Hampton wrote on a piece of paper “Where there is no vision, the people perish.” There are beautiful and terrifying things hidden in garages all across America; there are messiahs innumerable. Hampton’s shrine is strange, but it is oh so resplendent.

2. By the time Brother James Nayler genuflected before George Fox, the founder of the Quaker Society of Friends, his tongue had already been bored with a hot iron poker and the letter “B” (for “Blasphemer”) had been branded onto his forehead by civil authorities. The two had not gotten along in the past, arguing over the theological direction of the Quakers, but by 1659 Nayler had been so broken by their mutual enemies that he was forced to drag himself to Fox’s parlor and beg forgiveness. Three years had changed the preacher’s circumstances, for it was in imitation of the original Palm Sunday that in 1656 Nayler had triumphantly entered the seaside town of Bristol upon the back of a donkey, the religious significance of the performance inescapable to anyone. A supporter noted in a diary that Nayler’s “name is no more to be called James but Jesus,” while in private writings Fox noted that “James ran out into imaginations… and they raised up a great darkness in the nation.”

In 1656, Nayler was imprisoned, and when Fox visited him in his cell, Fox demanded that Nayler kiss his foot (belying the Quaker reputation for modesty). “It is my foot,” Fox declared, but Nayler refused. Such was the confidence of a man who reenacted Christ’s entrance into Jerusalem. Guided by the Inner Light that Quakers saw as supplanting even the gospels, Nayler thought of his mission in messianic terms, and organized his vehicle to reflect that. Among the “Valiant Sixty,” the itinerant preachers who carried the early Quaker message across Britain, Nayler was the most revolutionary, condemning slavery, enclosure, and private property. The tragedy of Nayler is that he happened to not actually be the messiah. Before his death, following an assault by a highwayman in 1660, Nayler would write that his “hope is to outlive all wrath and contention, and to wear out all exaltation and cruelty, or whatever is of a nature contrary to itself.” He was 42, beating Christ by almost a decade.

“Why was so much fuss made?” asks Christopher Hill in his classic The World Turned Upside Down: Radical Ideas During the English Revolution. “There had been earlier Messiahs—William Franklin, Arise Evans who told the Deputy Recorder of London that he was the Lord his God…Gadbury was the Spouse of Christ, Joan Robins and Mary Adams believed they were about to give birth to Jesus Christ.” Hill’s answer to the question of Nayler’s singularity is charitable: none of the others actually seemed dangerous, since they were merely “holy imbeciles.” The 17th century, especially around the time of the English civil wars, was an age of blessed insanity, messiahs proliferating like dandelions after a spring shower. There were John Reeve, Laurence Clarkson, and Lodowicke Muggleton, who took turns arguing over which of them were the two witnesses mentioned in Revelation, and who claimed that God had absconded from heaven and that the job was now open. Abiezer Coppe, prophet of a denomination known as the Ranters, demonstrated that designation in his preaching and writing. One prophet, TheaurauJohn Tany (who designated himself “King of the Jews”), simply declared “What I have written, I have written,” including the radical message that hell was liberated and damnation abolished. Regarding the here and now, Tany had similarly radical prescriptions, including to “feed the hungry, clothe the naked, oppress none, set free them bounden.”

There have been messianic claimants from first-century Judea to contemporary Utah. When St. Peter was still alive there was the Samaritan magician Simon Magus, who used Christianity as magic and could fly, only to be knocked from the sky during a prayer-battle with the apostle. In the third century the Persian prophet Mani founded a religion that fused Christ with the Buddha, that had adherents from Gibraltar to Jinjiang, and a reign that lasted more than a millennium (with its teachings smuggled into Christianity by the former adherent St. Augustine). A little before Mani, and a Phrygian prophet named Montanus declared himself an incarnation of the Holy Spirit, along with his consorts Priscilla and Maximilla. Prone to fits of convulsing revelation, Montanus declared “Lo, the man is as a lyre, and I fly over him as a pick.” Most Church Fathers denounced Montanism as rank heresy, but not Tertullian, who, despite being the progenitor of Latin theology, was never renamed “St. Tertullian” because of those enthusiasms. During the Middle Ages, at a time when stereotype might have it that orthodoxy reigned triumphant, mendicants and messiahs proliferated across Europe, some whose names aren’t preserved to history and some who amassed thousands of followers. Norman Cohn remarks in The Pursuit of the Millennium that for one eighth-century Gaulish messiah named Aldebert, followers “were convinced that he knew all their sins…and they treasured as miracle-working talismans the nail parings and hair clippings he distributed among them.” Pretty impressive, but none of us get off work for Aldebert’s birthday.

More recently, other messiahs include the 18th-century prophetess and mother of the Shakers Ann Lee, the 20th-century founder of the Korean Unification Church Sun Myung Moon (known for his elaborate mass weddings and for owning the conservative Washington Times), and the French test car driver Claude Vorilhon, who renamed himself Raël and announced that he was the son of an extraterrestrial named Yahweh (more David Bowie’s The Rise and Fall of Ziggy Stardust and the Spiders from Mars than Paul’s epistles). There are as many messiahs as there are people; there are malicious messiahs and benevolent ones, deluded head-cases and tricky confidence men, visionaries of transcendent bliss and sputtering weirdos. What unites all of them is an observation made by Reeve that God speaks to them “to the hearing of the ear as a man speaks to a friend.”

3. Hard to identify Elvis Presley’s apotheosis. Could have been the ’68 Comeback Special, decked in black leather, warbling in that snarl-mumble. Elvis’s years in the wilderness had been precipitated by his manager Col. Tom Parker’s disastrous gambit to have the musician join the army, only for the industry to move on from his rockabilly style; now he was resurrected on a Burbank sound stage. Or maybe it was earlier, on The Milton Berle Show in 1956, performing Big Mama Thornton’s hit “Hound Dog” while gyrating on the Hollywood stage, leading a critic for the New York Daily News to opine that Elvis “gave an exhibition that was suggestive and vulgar, tinged with the kind of animalism that should be confined to dives and bordellos.” A fair candidate for that moment of earthly transcendence could be traced back to 1955, when in Sun Records’ dusty Memphis studio Elvis covered Junior Parker’s “Mystery Train,” crooning out in a voice both shaky and confident over his guitar’s nervous warble, “Train I ride, sixteen coaches long/Train I ride, sixteen coaches long/Well that long black train, got my baby and gone.” But in my estimation, and at the risk of sacrilege, Elvis’s ascension happened on Aug. 16, 1977, when he died on the toilet in the private bathroom of his tacky and opulent Graceland estate.

The story of Elvis’s death has the feeling of both apocrypha and accuracy, and like any narrative that comes from that borderland country of the mythic, it contains more truth than the simple facts can impart. His expiration is a uniquely American death, but not an American tragedy, for Elvis was able to get just as much out of this country as the country ever got out of him, and that’s ultimately our true national dream. He grabbed the nation by its throat and its crotch, and with pure libidinal fury was able to incarnate himself as the country. All of the accoutrements—the rhinestone jumpsuits, the karate and the Hawaiian schtick, the deep-fried peanut-butter-banana-and-bacon sandwiches, the sheer pill-addicted corpulence—are what make him our messiah. Even the knowing, obvious, and totally mundane observation that he didn’t write his own music misses the point. He wasn’t a creator—he was a conduit. Greil Marcus writes in Mystery Train: Images of America in Rock ‘n’ Roll of the “borders of Elvis Presley’s delight, of his fine young hold on freedom…[in his] touch of fear, of that old weirdness.” That’s the Elvis that saves, the Elvis of “That’s All Right (Mama)” and “Shake, Rattle, and Roll,” those strange hillbilly tracks, that weird chimerical sound—complete theophany then and now.

There’s a punchline quality to that contention about white-trash worshipers at the Church of Elvis, all of those sightings in the Weekly World News, the Las Vegas impersonators of various degrees of girth, the appearance of the singer in the burnt pattern of a tortilla. This is supposedly a faith that takes its pilgrimage to Graceland as if it were zebra-print Golgotha, that visits Tupelo as if it were Nazareth. John Strausbaugh takes an ethnographer’s calipers to Elvism, arguing in E: Reflections on the Birth of the Elvis Faith that Presley has left in his wake a bona fide religion, with its own liturgy, rituals, sacraments, and scripture. He writes: “The fact that outsiders can’t take it seriously may turn out to be its strength and its shield. Maybe by the time Elvism is taken seriously it will have quietly grown too large and well established to be crushed.” There are things less worthy of your worship than Elvis Presley. If we were to think of an incarnation of the United States, of a uniquely American messiah, few candidates would so completely resemble the collective nation. In his appetites, his neediness, his yearning, his arrogance, his woundedness, his innocence, his simplicity, his cunning, his coldness, and his warmth, he was the first among Americans. Elvis is somehow both rural and urban, northern and southern, country and rock, male and female, white and black. Our contradictions are reconciled in him. “Elvis lives in us,” Strausbaugh writes. “There is only one King and we know who he is.” We are Elvis and He was us.

4. A hideous slaughter followed those settlers as they drove deep into the continent. On that western desert, where the lurid sun’s bloodletting upon the burnt horizon signaled the end of each scalding day, a medicine man and prophet of the Paiute people had a vision. In a trance, Wodziwob received an oracular missive: “within a few moons there was to be a great upheaval or earthquake… the whites would be swallowed up, while the Indians would be saved.” Wodziwob would be the John the Baptist to a new movement, for though he would die in 1872, the ritual practice that he taught—the Ghost Dance—would become a rebellion against the genocidal policy of the U.S. Government. For Wodziwob, the Ghost Dance was an affirmation, but it has also been remembered as a doomed moment. “To invoke the Ghost Dance has been to call up an image of indigenous spirituality by turns militant, desperate, and futile,” writes Louis S. Warren in God’s Red Son: The Ghost Dance Religion and the Making of Modern America, “a beautiful dream that died.” But it was a dream that endured.

While in a coma precipitated by scarlet fever, during a solar eclipse on New Year’s Day of 1889, a Northern Paiute man who worked on a Carson City, Nevada, ranch, known as Jack Wilson to his coworkers and as Wovoka to his own people, fell into a mystical vision not unlike Wodziwob’s. Wovoka met many of his dead family members, he saw the prairie that exists beyond that which we can see, and he held counsel with Jesus Christ. Wovoka was taught the Ghost Dance, and learned what Sioux Chief Lame Deer would preach, that “the people…could dance a new world into being.” When Wovoka returned he could control the weather, he was able to compel hail from the sky, he could form ice with his hands on the most sweltering of days. The Ghost Dance would spread throughout the western United States, embraced by the Paiute, the Dakota, and the Lakota. Lame Deer said that the Ghost Dance would roll up the earth “like a carpet with all the white man’s ugly things—the stinking new animals, sheep and pigs, the fences, the telegraph poles, the mines and factories. Underneath would be the wonderful old-new world.”

A messiah is simultaneously the most conservative and the most radical of figures, preaching a return to a perfected world that never existed but also the overturning of everything of this world, of the jaundiced status quo. The Ghost Dance married the innate strangeness of Christianity to the familiarity of native religion, and like both it provided a blueprint for how to overthrow the fallen things. Like all true religion, the Ghost Dance was incredibly dangerous. That was certainly the view of the U.S. Army and the Bureau of Indian Affairs, which saw an apocalyptic faith as a danger to white settler-colonials and their indomitable, zombie-like push to the Pacific. Manifest Destiny couldn’t abide the hopefulness of a revival as simultaneously joyful and terrifying as the Ghost Dance, and so an inevitable confrontation awaited.

The Miniconjou Lakota people, forced onto the Pine Ridge Reservation by 1890, were seen as particularly rebellious, in part because their leader Spotted Elk was an adherent. Using Lakota resistance to disarmament as a pretext, the army opened fire on the gathered Miniconjou, and more than 150 people (mostly women and children) would be slaughtered in the Wounded Knee Massacre. As surely as the Romans threw Christians to the lions and Cossacks rampaged through the Jewish shtetls of eastern Europe, so too were the initiates of the Ghost Dance persecuted, murdered, and martyred by the U.S. Government. Warren writes that the massacre “has come to stand in for the entire history of the religion, as if the hopes of all of its devoted followers began and ended in that fatal ravine.” Wounded Knee was the Calvary of the Ghost Dance faith, but if Calvary has any meaning it’s that crucified messiahs have a tendency not to remain dead. In 1973 a contingent of Oglala Lakota and members of the American Indian Movement occupied Wounded Knee, and the activist Mary Brave Bird defiantly performed the Ghost Dance, again.

5. Rabbi Menachem Mendel Schneerson arrived in the United States via Paris, via Berlin, and ultimately via Kiev. He immigrated to New York on the eve of America’s entry into the Second World War, and in the years that followed six million Jews were immolated in Hitler’s ovens—some two-thirds of the Jews of Europe. Becoming the Lubavitcher Rebbe in 1950, Schneerson was a refugee from a broken Europe that had devoured itself. Schneerson’s denomination of Hasidism had emerged after the vicious Cossack-led pogroms that punctuated life in 17th-century eastern Europe, when many Jews turned towards the sect’s founder, the Baal Shem Tov. His proper name was Rabbi Israel ben Eliezer, and his title (often shortened to “Besht”) meant “Master of the Good Name,” for the Baal Shem Tov incorporated Kabbalah into a pietistic movement that enshrined emotion over reason, feeling over logic, experience over philosophy. David Biale writes in Hasidism: A New History that the Besht espoused “a new method of ecstatic joy and a new social structure,” a fervency that lit a candle against persecution’s darkness.

When Schneerson convened a gathering of Lubavitchers in a Brooklyn synagogue for Purim in 1953, a black cloud enveloped the Soviet Union. Joseph Stalin was beginning to target Jews whom he implicated in the “Doctors’ Plot,” an invented accusation that Jewish physicians were poisoning the Soviet leadership. The state propaganda organ Pravda denounced these supposed members of a “Jewish bourgeois-nationalist organization… The filthy face of this Zionist spy organization, covering up their vicious actions under the mask of charity.” Four gulags were constructed in Siberia, with the understanding that Russian Jews would be deported and perhaps exterminated. Less than a decade after Hitler’s suicide, and Schneerson would look out into the congregation of swaying black-hatted Lubavitchers, and would see a people marked for extinction.

And so on that evening, Schneerson expounded on the finer points of Talmudic exegesis, on questions of why evil happens in the world and what man’s and G-d’s roles are in containing that wickedness. Witnesses said that the very countenance of the rabbi was transformed as he declared that he would speak the words of the living G-d. Enraptured in contemplation, Schneerson connected the Persian courtier Haman’s war against the Jews to Stalin’s coming campaign; he invoked G-d’s justice and mercy, and implored the divine to intervene and prevent the Soviet dictator from completing that which Hitler had begun. The Rebbe denounced Stalin as the “evil one,” and as he shouted it was said that his face transformed into a “holy fire.”

Two days later Moscow State Radio announced that Stalin had fallen ill and died. The exact moment of his expiration came as a group of Lubavitch Jews prayed that G-d would still the hand of the tyrant and punish his iniquities. Several weeks later, and the Soviet leadership would admit that the Doctors’ Plot was a government ruse invented by Stalin, and they exonerated all of those who’d been punished as a result of the baseless accusations. By the waning days of the Soviet Union, the Lubavitcher Rebbe would address crowds gathered in Red Square by telescreen while the Red Army Band performed Hasidic songs. “Was this not the victory of the Messiah over the dark forces of the evil empire, believers asked?” write Samuel Heilman and Menachem Friedman in The Rebbe: The Life and Afterlife of Menachem Mendel Schneerson.

The Mashiach (“anointed one” in Hebrew) is neither the Son of G-d nor the incarnate G-d, and his goal is arguably more that of liberation than salvation (whatever either term means). Just as Christianity has had many pseudo-messiahs, so is Jewish history littered with figures whom some believers saw as the anointed one (Christianity is merely the most successful of these offshoots). During the Second Jewish-Roman War of the second century, the military commander Simon bar Kokhba was lauded as the messiah, even as his defeat led to Jewish exile from the Holy Land. During the 17th century, the Ottoman Jew Sabbatai Zevi amassed a huge following of devotees who believed him the messiah come to subvert and overthrow the strictures of religious law itself. Zevi was defeated by the Ottomans not through crucifixion, but through conversion (which is much more dispiriting). A century later, and the Polish libertine Jacob Frank would declare that the French Revolution was the apocalypse, that Christianity and Judaism must be synthesized, and that he was the messiah. Compared to them, the Rebbe was positively orthodox (in all senses of the word). He also never claimed to be the messiah.

What all share is the sense that to exist is to be in exile. That is the fundamental lesson and gift of Judaism, born from the particularities of Jewish suffering. Diaspora is not just a political condition, or a social one; diaspora is an existential state. We are all marooned from our proper divinity, shattered off from G-d’s being—alone, disparate, isolated, alienated, atomized, solipsistic. If there is to be any redemption it’s in suturing up those shards, collecting those bits of light cleaved off from the body of G-d when He dwelled in resplendent fullness before the tragedy of creation. Such is the story of going home but never reaching that destination, yet continuing nevertheless. What gives this suffering such beauty, what redeems the brokenness of G-d, is the sense that it’s that very shattering that imbues all of us with holiness. It is what the German-Jewish philosopher Walter Benjamin describes in On the Concept of History as the sacred reality in which “every second was the narrow gate, through which the Messiah could enter.”

6. When the Living God landed at Palisadoes Airport in Kingston, Jamaica, on April 21, 1966, he couldn’t immediately disembark from his Ethiopian Airlines flight from Addis Ababa. More than 100,000 people had gathered at the airport, the air thick with the sticky, sweet smell of ganja, the airstrip so overwhelmed with worshipers come to greet the Conquering Lion of the Tribe of Judah, His Imperial Majesty Haile Selassie I, King of Kings, Lord of Lords, Elect of God, Power of the Trinity, of the House of Solomon, Amhara Branch, noble Ras Tafari Makonnen, that there was a fear the plane itself might tip over. The Ethiopian Emperor, incarnation of Jah and the second coming of Christ, remained in the plane for a few minutes until a local religious leader, the drummer Ras Mortimer Planno, was allowed to organize the emperor’s descent.

Finally, after several tense minutes, the crowd pulled back long enough for the emperor to disembark onto the tarmac, the first time that Selassie would set foot on the fertile soil of Jamaica, a land distant from the Ethiopia that he’d ruled over for 36 years (excluding the years from 1936 to 1941, when his home country was occupied by the Italian fascists). Jamaica was where he’d first been acknowledged as the messianic promise of the African diaspora. A year after his visit, while being interviewed by the Canadian Broadcasting Corporation, Selassie was asked what he made of the claims of his status. “I told them clearly that I am a man,” he said, “that I am mortal…and that they should never make a mistake in assuming or pretending that a human being is emanated from a deity.” The thing about being a messiah, though, is that being adored does not depend on the consent of the one who is worshiped.

Syncretic and born from the Caribbean experience, and practiced from Kingston, Jamaica, to Brixton, London, Rastafarianism is a mélange of Christian, Jewish, and uniquely African symbols and beliefs, with its own novel rhetoric concerning oppression and liberation. Popularized throughout the West by the indelible catchiness of reggae, with its distinctive muted third beat, and by the charisma of the musician Bob Marley, who was the faith’s most famous ambassador, Rastafarianism is sometimes offensively reduced in people’s minds to dreadlocks and spliff smoke. Ennis Barrington Edmonds places the faith’s true influence in its proper context, writing in Rastafari: From Outcasts to Culture Bearers that “the movement has spread around the world, especially among oppressed people of African origins… [among those] suffering some form of oppression and marginalization.”

Central to the narrative of Rastafarianism is the reluctant messiah Selassie, a life-long member of the Ethiopian Orthodox Tewahedo Church. Selassie’s reign had been heralded by the Jamaican Protestant evangelist Leonard Howell, who claimed that the crowning of an independent Black king in an Africa dominated by European colonialism marked the dawn of a messianic dispensation. A disciple of the Black nationalist Marcus Garvey, whom Howell met when both lived in Harlem, the minister read Psalm 68:31’s injunction that “Ethiopia shall soon stretch out her hands unto God” as being fulfilled in Selassie’s coronation. Some sense of this reverence is imparted by a Rastafarian named Reuben in Emily Raboteau’s Searching for Zion: The Quest for Home in the African Diaspora, who explained that “Ethiopia was never conquered by outside forces. Ethiopia was the only independent country on the continent of Africa…a holy place.” Sacred Ethiopia, the land where the Ark of the Covenant was preserved.

That the actual Selassie neither embraced Rastafarianism nor was particularly benevolent in his own rule, and was indeed deposed by a revolutionary Marxist junta, is of no account. Rather, what threads through Rastafarianism is what Raboteau describes as a “defiant, anticolonialist mind-set, a spirit of protest… and a notion that Africa is the spiritual home to which they are destined to return.” Selassie’s biography bore no similarity to the lives of residents of the Trenchtown slum of Kingston, where the veneration of a distant African king began, but his name served as a rebellion against all agents of Babylon in the hopes of a new Zion. Rastafarianism found in Selassie the messiah who was needed, and in that faith there is a proud way of spiritually repudiating the horrors of the trans-Atlantic slave trade. What their example reminds us of is that a powerful people have no need of a messiah, that he rather always dwells amongst the dispossessed, regardless of what his name is.

7. Among the Persian Sufis there is no blasphemy in staining your prayer rug red with shiraz. Popular throughout Iran and into central Asia, where the faiths of Zoroaster and Mani had both once been dominant, Sufism drew upon those earlier mystical and poetic traditions and incorporated them into Islam. The faith of the dervishes, the piety of the wali, the Sufi tradition is that of Persian miniatures painted in stunning, colorful detail, of the poetry of Rumi and Hafez. Often shrouded in rough woolen coats and felt caps, the Sufis practice a mystical version of faith that’s not dissimilar to Jewish kabbalah or Christian hermeticism, an inner path that the 20th-century writer Aldous Huxley called the “perennial philosophy.” As with other antinomian faiths, the Sufis often skirt the line of what’s acceptable and what’s forbidden, seeing in heresy intimations of a deep respect for the divine.

A central poetic topos of Sufi practice is what’s called shath, that is, deliberately shocking utterances that exist to shake believers out of pious complacency, to awaken within them that which is subversive about God. One master of the form was Mansour al-Hallaj, born to Persian-speaking parents (with a Zoroastrian grandfather) in the ninth century during the Abbasid Caliphate. Where most Sufi masters were content to keep their secrets preserved for initiates, al-Hallaj crafted a movement democratized for the mass of Muslims, while also generating a specialized language for speaking of esoteric truths, expressed in “antithesis, breaking down language into prepositional units, and paradox,” as Husayn ibn Mansur writes in Hallaj: Poems of a Sufi Martyr. Al-Hallaj’s knowledge and piety were deep—he had memorized the Koran by the age of 12 and he prostrated himself before a replica of Mecca’s Kaaba in his Baghdad garden—but so was his commitment to the radicalism of shath. When asked where Allah was, he once replied that the Lord was within his turban; on another occasion he answered that question by saying that God was under his cloak. Finally in 922, borrowing one of the 99 names of God, he declared “I am the Truth.”

The generally tolerant Abbasids decided that something should be done about al-Hallaj, and so he was tied to a post along the Tigris River, repeatedly punched in the face, lashed several times, decapitated, and finally his headless body was hung over the water. His last words were “Akbar al-Hallaj”—“Al-Hallaj is great.” An honorific normally offered to God, but for this self-declared heretical messiah his name and that of the Lord were synonyms. What’s sacrilegious about this might seem clear, save that al-Hallaj’s Islamic piety was such that he interpreted the claim as the natural culmination of Tawhid, the strictness of Islamic monotheism pushed to its logical conclusion—there is but one God, and everything is God, and we are all in God. Idries Shah explains the tragic failure of interpretation among religious authorities in The Sufis, writing that the “attempt to express a certain relationship in language not prepared for it causes the expression to be misunderstood.” The court, as is obvious, did not agree.

If imagining yourself as the messiah could get you decapitated by the 10th-century Abbasids, then in 20th-century Michigan it only got you institutionalized. Al-Hallaj implied that he was the messiah, but the psychiatric patients in Milton Rokeach’s 1964 study The Three Christs of Ypsilanti each thought that he was the authentic messiah (with much displeasure ensuing when they met). Based on three years of observation at the Ypsilanti State Hospital starting in 1959, Rokeach treated this trinity of paranoid schizophrenics. Initially Rokeach thought that the meeting of the Christs would disabuse them all of their delusions, as if the law of logical non-contradiction meant anything to a psychotic. But the messiahs were steadfast in their faith—each was singular and the others were imposters. Then Rokeach and his graduate students introduced fake messages from other divine beings, a gambit that the psychiatrist apologized for two decades later and one that would most definitely land him before an ethics board today. Finally Rokeach granted them the right to their insanities, each of the Christs of Ypsilanti continuing in his merry madness. “It’s only when a man doesn’t feel that he’s a man,” Rokeach concludes, “that he has to be a god.”

Maybe. Or maybe the true madness of the Michigan messiahs was that each thought himself the singular God. They weren’t in error that each of them was the messiah; they were in error in denying that truth in their fellow patients. Al-Hallaj would have understood, declaring before his executioner that “all that matters for the ecstatic is that the Unique shall reduce him to Unity.” The Christs may have benefited more from Sufi treatment than from psychotherapy. Clyde Benson, Joseph Cassel, and Leon Gabor all thought themselves to be God, but al-Hallaj knew that he was (and that You reading this are as well). Those three men may have been crazy, but al-Hallaj was a master of what the Buddhist teacher Wes Nisker calls “crazy wisdom.” In his guidebook The Essential Crazy Wisdom, Nisker celebrates the sacraments of “clowns, jesters, tricksters, and holy fools,” who understand that “we live in a world of many illusions, that the emperor has no clothes, and that much of human belief and behavior is ritualized nonsense.” By contrast, the initiate in crazy wisdom, whether gnostic saint or Kabbalist rabbi, Sufi master or Zen monk, prods at false piety to reveal deeper truths underneath. “I saw my Lord with the eye of the heart,” al-Hallaj wrote in one poem, “I asked, ‘Who are You?’/He replied, ‘You.’”

8. “Bob” is the least likely-looking messiah. With his generic handsomeness, his executive haircut dyed black and tightly parted on the left, the avuncular pipe that jauntily sticks out of his tight smile, “Bob” looks like a stock image of a 1950s paterfamilias (his name is always spelled with quotation marks). Like a clip-art version of Mad Men’s Don Draper, or Utah Sen. Mitt Romney. “Bob” is also not real (which may or may not distinguish him from other messiahs), but rather the central figure in the parody Church of the SubGenius. Supposedly a traveling salesman, J.R. “Bob” Dobbs had a vision of JHVH-1 (the central God in the church) in a homemade television set, and he then went on the road to evangelize. Founded by the countercultural slacker heroes Ivan Stang and Philo Drummond in 1979 (though each claims that “Bob” was the actual founder), the Church of the SubGenius is a veritable font of crazy wisdom, promoting the anarchist practice of “culture jamming” and parody in the promulgation of a faith where it’s not exactly clear what’s serious and what isn’t.

“Bob” preaches a doctrine of resistance against JHVH-1 (or Jehovah 1), the demiurge who seems like a cross between Yahweh and a Lovecraftian elder god. JHVH-1 intended for “Bob” to encourage a pragmatic, utilitarian message about the benefits of a work ethic, but contra his square appearance, the messiah preferred to advocate that his followers pursue a quality known as slack. Never clearly defined (though its connotations are obvious), slack is to the Church of the SubGenius what the Tao is to Taoism or the Word is to Christianity, both the font of all reality and that which gives life meaning. Converts to the faith include luminaries like the underground cartoonist Robert Crumb, Pee-wee Herman Show creator Paul Reubens, Talking Heads founder David Byrne, and of course Devo’s Mark Mothersbaugh. If any commandment captures the emotional tenor of the church, it’s “Bob’s” holy injunction: “Fuck ‘em if they can’t take a joke.”

The Church of the SubGenius is oftentimes compared to another parody religion that springs from a similar countercultural milieu, though it first appeared almost two decades earlier, in 1963, and goes by the ominous name of Discordianism. Drawing from Hellenic paganism, Discordianism holds as one of its central axioms in the Principia Discordia (written by its founders Greg Hill and Kerry Wendell Thornley under the pseudonyms of Malaclypse the Younger and Omar Khayyam Ravenhurst) that the “Aneristic Principle is that of apparent order; the Eristic Principle is that of apparent disorder. Both order and disorder are man made concepts and are artificial divisions of pure chaos, which is a level deeper than is the level of distinction making.” Where other mythological systems see chaos as being tamed and subdued during ages primeval, the pious Discordian understands that disorder and disharmony remain the motivating structure of reality. To that end, the satirical elements—its faux scripture, its faux mythology, and its faux hierarchy—are paradoxically faithful enactments of its central metaphysics.

For those of a conspiratorial bent, it’s worth noting that Thornley first conceived of the movement after leaving the Marine Corps, and that he was an associate of Lee Harvey Oswald. It sounds a little like one of the baroque plots in Robert Anton Wilson and Robert Shea’s The Illuminatus! Trilogy. A compendium of occult and conspiratorial lore whose narrative complexity recalls James Joyce or Thomas Pynchon, The Illuminatus! Trilogy was an attempt to produce for Discordianism what Dante crafted for Catholicism or John Milton for Protestantism: a work of literature commensurate with theology. Stang and Drummond, it should be said, were avid readers of The Illuminatus! Trilogy. “There are periods of history when the visions of madmen and dope fiends are a better guide to reality than the common-sense interpretation of data available to the so-called normal mind,” writes Wilson. “This is one such period, if you haven’t noticed already.” And how.

Inventing religions and messiahs wasn’t merely an activity for 20th-century pot smokers. The fear of uncovering a Christ who isn’t actually there can be seen as early as the 10th century, when the Iranian warlord Abu Tahir al-Jannabi wrote about a supposed tract that referenced the “three imposters,” an atheistic denunciation of Moses, Jesus, and Muhammad. This equal-opportunity apostasy, attacking all three children of Abraham, haunted monotheism over the subsequent millennium, as the infernal manuscript was attributed to several different figures. In the 13th century, Pope Gregory IX said that the Holy Roman Emperor Frederick II had authored such a work (the latter denied it). Within Giovanni Boccaccio’s 14th-century The Decameron there is reference to the “three imposters,” and in the 17th century, Sir Thomas Browne credited the Italian Protestant refugee Bernardino Ochino with having composed a manifesto against the major monotheistic faiths.

What’s telling is that everyone feared the specter of atheism, but no actual text existed. They were scared of a possibility without an actuality, terrified of a dead God who was still alive. It wouldn’t be until the 18th century that such a text would actually be supplied, in the form of an anonymous French pamphlet of 1719, the Treatise of the Three Imposters. That work, which went through several different editions over the next century, drew on the naturalistic philosophy of Benedict Spinoza and Thomas Hobbes to argue against the supernatural status of religion. Its author, possibly the bibliographer Prosper Marchand, argued that the “attributes of the Deity are so far beyond the grasp of limited reason, that man must become a God himself before he can comprehend them.” One imagines that the prophets of the Church of the SubGenius and Discordianism, inventors of gods and messiahs aplenty, would concur. “Just because some jackass is an atheist doesn’t mean that his prophets and gods are any less false,” preaches “Bob” in The Book of the SubGenius.

9. The glistening promise of white-flecked SPAM coated in greasy aspic as it slips out from its corrugated blue can, plopping onto a metal plate with a satisfying thud. A pack of Lucky Strikes, with its red circle in a field of white, crinkle of foil framing a raggedly opened end, sprinkle of loose tobacco at the bottom as a last cigarette is fingered out. Heinz Baked Beans, sweet in their tomato gravy, the yellow label with its picture of a keystone slick with the juice from within. Coca-Cola—of course Coca-Cola—its ornate calligraphy on a cherry red can, the saccharine nose pinch within. Products of American capitalism, the greatest and most all-encompassing faith of the modern world, left behind on the Vanuatuan island of Tanna by American servicemen during the Second World War.

More than 1,000 miles east of Australia, Tanna was home to airstrips and naval bases, and for the local Melanesians, the 300,000 GIs housed on their island indelibly marked their lives. Certainly the first time most had seen the descent of airplanes from the blue of the sky, the first time most had seen jangling Jeeps careening over the paths of Tanna’s rainforests, the first time most had seen Naval destroyers on the pristine Pacific horizon. Building upon a previous cult that had caused trouble for the jointly administered colonial British-French Condominium, the Melanesians claimed that the precious cargo of the Americans could be accessed through the intercession of a messiah known as John Frum—believed to have possibly been a serviceman who’d introduced himself as “John from Georgia.”

Today the John Frum cult still exists on Tanna. A typical service can include the raising of the flags of the United States, the Marine Corps, and the state of Georgia, while shirtless youths with “USA” painted on their chests march with faux rifles made out of sticks (other “cargo cults” are more Anglophilic, with one worshiping Prince Philip). Anthropologists first noted the emergence of the John Frum religion in the immediate aftermath of the Americans’ departure, with Melanesians apparently constructing landing strips and air traffic control towers from bamboo, speaking into leftover tin cans as if they were radios, all to attract back the quasi-divine Americans and their precious “cargo.” The anthropologist Holger Jebens, in After the Cult: Perceptions of Other and Self in West New Britain (Papua New Guinea), describes “cargo cults” as having as their central goal the acquisition of “industrially manufactured Western goods brought by ship or aeroplane, which, from the Melanesian point of view, are likely to have represented a materialization of the superior and initially secret power of the whites.” In this interpretation, the recreated ritual objects molded from bamboo and leaves are offerings to John Frum so that he will return from the heavenly realm of America bearing precious cargo.

If all this sounds sort of dodgy, then you’ve reason to feel uncomfortable. More recently, some anthropologists have questioned the utility of the phrase “cargo cult,” and the interpretation of the function of those practices. Much of the previous model, mired in the discipline’s own racist origins, posits the Melanesians and their beliefs as “primitive,” with all of the attendant connotations of that word. In the introduction to Beyond Primitivism: Indigenous Religious Traditions and Modernity, Jacob K. Olupona writes that some scholars have defined the relationship between ourselves and the Vanuatuans by “challenging the notion that there is a fundamental difference between modernity and indigenous beliefs.” It’s easy for an anthropologist, espying the bamboo landing field, to assume that what was being enacted was a type of magical conspicuous consumption, a yearning on the part of the Melanesians to take part in our own self-evidently superior culture. The only thing that’s actually self-evident in such a view, however, is a parochial and supremacist positioning that gives little credit to unique religious practices. Better to borrow the idea of allegory in interpreting the “cargo cults,” considering both how that way of thinking may shape the symbolism of their rituals and what those practices could reflect about our own culture.

Peter Worsley writes in The Trumpet Shall Sound: A Study of “Cargo” Cults in Melanesia that “a very large part of world history could be subsumed under the rubric of religious heresies, enthusiastic creeds and utopias,” and this seems accurate. So much of the prurient focus on the John Frum religion is preoccupied with its materialism, and consequently the faith is judged as superficial. Easy for those of us who’ve never been hungry to look down on praying for food, easy for those with access to medicine to pretend that materialism is a vice. Mock praying for SPAM at your own peril; of SPAM and salvation, I at least know what the former is. The miracle of the loaves and fishes in the Gospel of John, after all, was a prime instance of generating cargo. John Frum, whether he is real or not, is a radical figure, for while a cursory glance at his cult might suggest that what he promises is capitalism, it’s actually the exact opposite. For those who pray to John Frum are not asking for work, or labor, or a Protestant work ethic, but rather deliverance from bondage; they are asking to be shepherded into a post-scarcity world. John Frum is not intended to deliver us to capitalism, but rather to deliver us from it. John Frum is not an American, but he is from that more perfect America that exists only in the Melanesian spirit.

10. On Easter of 1300, within the red Romanesque walls of the Cistercian monastery of Chiaravalle, a group of Umiliati Sisters were convened by Maifreda da Pirovano at the grave of the Milanese noblewoman Guglielma. The mysterious Guglielma, who had died two decades before, was possibly the daughter of King Premysl Otakar I of Bohemia and had come to Lombardy with her son following her husband’s death. Wandering the Italian countryside as a beguine, Guglielma preached an idiosyncratic gospel, claiming that she was an incarnation of the Holy Spirit, and that her passing would usher in the third historical dispensation, destroying the patriarchal Roman Catholic Church in favor of a final covenant to be administered through women. If Christ was the new Adam come to overturn the Fall, then his bride was Guglielma, the new Eve, who rectified the inequities of the old order and whose saving grace came for humanity, but particularly for women.

Now a gathering of nuns convened at her inauspicious shrine, in robes of ashen gray and scapulars of white. There Maifreda would perform a Mass, transubstantiating the wafer and wine into the body and blood of Christ. The Guglielmites would elect Maifreda the first Pope of their Church. Five hundred years after the supposed election of the apocryphal Pope Joan, this obscure order of women praying to a Milanese aristocrat would confirm Maifreda as the Holy Mother of their faith. That same year the Inquisition would execute 30 Guglielmites—including la Papessa. “The Spirit blows where it will and you hear the sound of it,” preached Maifreda, “but you know not whence it comes or whither it goes.”

Maifreda was to Guglielma as Paul was to Christ: apostle, theologian, defender, founder. She was the great explicator of the “true God and true human in the female sex…Our Lady is the Holy Spirit.” Strongly influenced by a heretical strain of Franciscans who followed the Calabrian mystic Joachim of Fiore, Maifreda held that covenantal history was tripartite, with the first era of Law and God the Father, the second of Grace and Christ the Son, and the third and coming age of Love and the Daughter known as the Holy Spirit. Barbara Newman writes in From Virile Woman to WomanChrist: Studies in Medieval Religion and Literature that after Guglielma’s “ascension the Holy Spirit would found a new Church, superseding the corrupt institution in Rome.” For Guglielma’s followers, drawn initially from the aristocrats of Milan but increasingly popular among more lowly women, these doctrines allowed for self-definition and resistance against both Church and society. Guglielma was a messiah and she arrived for women, and her prophet was Maifreda.

Writing of Guglielma, Newman says that “According to one witness… she had come in the form of a woman because if she had been male, she would have been killed like Christ, and the whole world would have perished.” In a manner she was killed after all, some 20 years after her death, when her bones were disinterred from Chiaravalle and placed on the pyre where Maifreda would be burnt alongside two of her followers. Pope Boniface VIII would not abide another claimant to the papal throne—especially from a woman. But even while Maifreda would be immolated, the woman to whom she gave the full measure of sacred devotion would endure, albeit at the margins. Within a century Guglielma would be repurposed into St. Guglielma, a pious woman who suffered under the false accusation of heresy, and who was noted as particularly helpful in interceding against migraines. But her subversive import wasn’t entirely dampened over the generations. When the Renaissance painter Bonifacio Bembo was commissioned to paint an altarpiece around 1445 in honor of the Council of Florence (an unsuccessful attempt at rapprochement between the Catholic and Orthodox Churches) he depicted God crowning Christ and the Holy Spirit. Christ appears as can be expected, but the Holy Spirit has Guglielma’s face.

Maifreda survived in her own hidden way as well, and also through the helpful intercession of Bembo. The altar that he crafted had been commissioned by members of the powerful Visconti family, leaders in Milan’s anti-papal Ghibelline party, and they also asked the artist to produce 15 decks of Tarot cards. The so-called Visconti-Sforza deck, the oldest surviving example of the form, doesn’t exist in any complete set, having been broken up and distributed to various museums. Several of these cards would enter the collection of the American banker J.P. Morgan, where they’d be stored in his Italianate Madison Avenue mansion. A visitor can see cards from the Visconti-Sforza deck that include the fool in his jester’s cap and mottled pants, the skeletal visage of death, and most mysterious of all, la Papessa—the female Pope. Bembo depicts a regal woman, in ash-gray robes and white scapular, the papal tiara upon her head. In the 1960s, the scholar Gertrude Moakley observed that the female pope’s distinctive dress indicates her order: she was an Umiliati. Maifreda herself was a first cousin of a Visconti, the family preserving the memory of the female pope in Tarot. On Madison Avenue you can see a messiah who for centuries was shuffled between any number of other figures, never sure of when she might be dealt again. Messiahs are like that, often hidden—and frequently resurrected.

Bonus Links:
—Ten Ways to Live Forever
—Ten Ways to Change Your God
—Ten Ways to Look at the Color Black

Image Credit: Wikipedia.

A Fraternity of Dreamers

“There is no syllable one can speak that is not filled with tenderness and terror, that is not, in one of those languages, the mighty name of a god.” —Jorge Luis Borges, “The Library of Babel” (1941)

“Witness Mr. Henry Bemis, a charter member in the fraternity of dreamers. A bookish little man whose passion is the printed page…He’ll have a world all to himself…without anyone.” —Rod Serling, “Time Enough at Last,” The Twilight Zone (1959)

When entering a huge library—whether its rows of books are organized under a triumphant dome, or they’re encased within some sort of vaguely Scandinavian structure that’s all glass and light, or they simply line dusty back corridors—I must confess that I’m often overwhelmed with a massive surge of anxiety. One must be clear about the nature of this fear—it’s not from some innate dislike of libraries, the opposite actually. The nature of my trepidation is very exact, though as far as I know there’s no English word for it (it seems like some sort of sentiment that the Germans might have an untranslatable phrase for). This fear concerns the manner in which the enormity of a library’s collection forces me to confront the sheer magnitude of all that I don’t know, all that I will never know, all that I can never know. When walking into the red-brick modernist hangar of the British Library, which houses all of those brittle books within a futuristic glass cube that looks like a robot’s heart, or the neo-classical Library of Congress with its green patina roof, or Pittsburgh’s large granite Carnegie Library main branch smoked dark with decades of mill exhaust and guarded by a bronze statue of William Shakespeare, my existential angst is the same. If I start to roughly estimate the number of books per row, the number of rows per room, the number of rooms per floor, that readerly angst can become severe. The symptom can be present even in smaller libraries; I have felt it alike in the small-town library of Washington, Penn., on Lincoln Avenue and in the single room of the Southeast Library of Washington D.C. on Pennsylvania Avenue. Intrinsic to my fear are those intimations of mortality whereby even a comparatively small collection must make me confront the fact that in a limited and hopefully not-too-short life I will never be able to read even a substantial fraction of that which has been written. All those novels, poems, and plays; all those sentiments, thoughts, emotions, dreams, wishes, aspirations, desires, and connections—completely inaccessible because of the sheer fact of finitude.

Another clarification is in order—my fear isn’t the same as worrying that I’ll be found out for having never read any number of classical or canonical books (or those of the pop, paperback variety either). There’s a scene in David Lodge’s classic and delicious campus satire Changing Places: A Tale of Two Campuses in which a group of academics play a particularly cruel game, as academics are apt to do, that asks participants to name a venerable book they’re expected to have read but have never opened. Higher point-values are awarded the more canonical a text is; what the neophytes don’t understand is that the trick is to mention something standard enough that they can still get the points for having not read it (like Laurence Sterne’s Tristram Shandy) but not so standard that they’ll look like an idiot for having never read it. One character—a recently hired English professor—is foolish enough to admit that he skipped Hamlet in high school. The other academics are stunned into silence. His character is later denied tenure. So, at the risk of making the same error, I’ll lay it out and admit to any number of books that the rest of you have probably read, but that I only have a glancing Wikipedia familiarity with: Marcel Proust’s Remembrance of Things Past, James Joyce’s Finnegans Wake, Don DeLillo’s White Noise, David Foster Wallace’s Infinite Jest. I’ve never read Harper Lee’s To Kill a Mockingbird, which is ridiculous and embarrassing, and I feel bad about it. I’ve also never read Jonathan Franzen’s The Corrections, though I don’t feel bad about that (however, I am a bit sheepish that I’ve not read the vast bulk of J.K. Rowling’s Harry Potter books). Some of those previously mentioned books I want to read, others I don’t; concerning the latter category, some of those titles make me feel bad about my resistance to them, others I haven’t thought twice about (I’ll let you guess individual titles’ statuses).

I offer this contrition only as a means of demonstrating that my aforementioned fear goes beyond simple imposter syndrome. There are any number of reasons why we wish we’d read certain things, and why we feel an attendant moroseness for not having done so—the social stigma of admitting such things, a feeling of not being educated enough or worldly enough, the simple fact that there might be stuff that we’d like to read, but inclination, will power, or simply time has gotten in the way. The anxiety that libraries can sometimes give me is of a wholly more cosmic nature, for something ineffable affects my sense of self when I realize that the majority of human interaction, expression, and creativity shall forever be unavailable to me. Not only is it impossible for me to read the entirety of literature, it’s impossible to approach even a fraction of it—a fraction of a fraction of it. Some several blocks from where I now write is the Library of Congress, the largest collection in the world, which according to its website contains 38 million books (that’s excluding other printed material from posters to pamphlets). If somebody read a book a day, which of course depends on the length of the book, it would take about 104,109 years and change to read everything within that venerable institution (ignoring the fact that about half-a-million to a million new titles are published every year in English alone, and that I was also too unconcerned to factor in leap years).

If you’re budgeting your time, may I suggest the British Library, which, though it has a much larger collection of other textual ephemera, has a more manageable 13,950,000 books, which would take you a breezy 38,219 years to get through. If you’re of a totalizing personality, according to Google engineers from a 2010 study estimating the number of books ever written, you’ll have to wade through 129 million volumes of varying quality. That would take you 353,425 years to read. Of course this ignores all of that which has been written but not bound within a book—all of the jottings, the graffiti, the listings, the diaries, the text messages, the letters, and the aborted novels for which the authors have wisely or unwisely hit “Delete.” Were some hearty and voracious reader to consume even one percent of one percent of one percent of all that’s ever been written, they’d be the single most well-read individual to ever live. When we reach the sheer scale of how much human beings have expressed, have written, we enter the realm of metaphors that call for comparisons to grains of sand on the beach or stars in our galaxy. We depart the realm of literary criticism and enter that of cosmology. No wonder we require curated reading lists.
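The arithmetic here is nothing more exotic than division by an assumed pace of a book a day. For anyone inclined to check the figures, or to swap in a more honest reading speed, a minimal back-of-envelope sketch might look like the following; the collection sizes are simply the ones cited above, and the book-a-day pace is, of course, the fanciful assumption:

```python
# Back-of-envelope reading-time estimates, using the collection sizes cited in this essay.
# The book-a-day pace is the (fanciful) assumption; adjust books_per_day for realism.

def years_to_read(num_books: int, books_per_day: float = 1.0) -> float:
    """Years required to read num_books at a rate of books_per_day, ignoring leap years."""
    return num_books / (books_per_day * 365)

collections = {
    "Library of Congress": 38_000_000,
    "British Library": 13_950_000,
    "Every book ever written (Google, 2010 estimate)": 129_000_000,
    "Library of Alexandria (high estimate)": 500_000,
}

for name, size in collections.items():
    print(f"{name}: about {years_to_read(size):,.1f} years")
```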

For myself, there’s an unhealthy compulsion towards completism in the attendant tsuris over all that I’ll never be able to read. Perhaps there is something stereotypically masculine in the desire to conquer all of those unread worlds, something toxic in that need. After all, in those moments of readerly ennui there’s little desire for the experience, little need for quality, only that desire to cross titles off of some imagined list. Assume it were even possible to read all that has been thought and said, whether sweetness and light or bile and heft, and consider what purpose that accomplishment would even have. Vaguely nihilistic the endeavor would be, reminding me of that old apocryphal story about the conqueror, recounted by everyone from the 16th-century theologian John Calvin to Hans Gruber in Die Hard, that “Alexander the Great…wept, as well indeed he might, because there were no more worlds to conquer,” as the version of that anecdote is written in Washington Irving’s 1835 collection Salmagundi: Or, The Whim-whams and Opinions of Launcelot Langstaff, Esq. and Others. Poor Alexander of Macedon, son of Philip, tutored by Aristotle, and witness to bejeweled Indian war-elephants bathing themselves on the banks of the Indus and the lapis lazuli encrusted Hanging Gardens of Babylon, the gleaming white pyramids at Giza and the massive gates of Persepolis. Alexander’s map of the world was dyed red as his complete possession—he’d conquered everything that there was to be conquered. And so, following the poisoning of his lover Hephaestion, he holed up in Nebuchadnezzar’s Babylonian palace, and he binged for days. Then he died. An irony though, for Alexander hadn’t conquered, or even been to, all the corners of the world. He’d never sat on black sand beaches in Hokkaido with the Ainu, he’d never drunk ox-blood with the Masai or hunted the giant Moa with the Maori, nor had he been on a walkabout in the Dreamtime with the Anangu Pitjantjatjara or stood atop Ohio’s Great Serpent Mound or seen the grimacing stone-heads of the Olmec. What myopia, what arrogance, what hubris—not to conquer the world, but to think that you had. Humility is warranted whether you’re before the World or the Library.

Alexander’s name is forever associated not just with martial ambitions, but with voluminous reading lists and never-ending syllabi as well, due to the library in the Egyptian city to which he gave his name, what historian Roy MacLeod describes in The Library of Alexandria: Centre of Learning in the Ancient World as “unprecedented in kingly purpose, certainly unique in scope and scale…destined to be far more ambitious [an] undertaking than a mere repository of scrolls.” The celebrated Library of Alexandria, the contents of which are famously lost to history, supposedly confiscated every book from each ship that came past the lighthouse of the city, had its scribes make a copy of the original, and then returned the counterfeit to the owners. This bit of bibliophilic chicanery was instrumental to the mission of the institution—the Library of Alexandria wasn’t just a repository of legal and religious documents, nor even a collection of foundational national literary works, but supposedly an assembly that in its totality would match all of the knowledge in the world, whether from Greece and Egypt, Persia and India. Matthew Battles writes in Library: An Unquiet History that Alexandria was “the first library with universal aspirations; with its community of scholars, it became a prototype of the university of the modern era.” Alexander’s library yearned for completism as much as its namesake had yearned to control all parts of the world; the academy signified a new, quixotic emotion—the desire to read, know, and understand everything. Because the world was so much smaller at the time (at least as far as any of the librarians working there knew), such an aspiration was even theoretically possible.

“The library of Alexandria was comprehensive, embracing books of all sort from everywhere, and it was public, open to anyone with fitting scholarly or literary qualifications,” writes Lionel Casson in Libraries in the Ancient World. The structure overseen by the Ptolemaic Dynasty, the heirs to Alexander’s empire in Egypt, was much more of a wonder of the ancient world than the Lighthouse in the city’s harbor. Within its walls, whose appearance is unclear to us, Aristophanes of Byzantium was the first critic to divide poetry into lines, 70 Jews convened by Ptolemy II translated the Hebrew Torah into the Greek Septuagint, and the geographer Eratosthenes correctly calculated the circumference of the Earth. Part of the allure of Alexandria, especially to any bibliophile in this fraternity of dreamers, is the fact that the vast bulk of what was kept there is entirely lost to history. Her card catalogue may have included lost classical works like Aristotle’s second book of Poetics on comedy (a plot point in Umberto Eco’s medieval noir The Name of the Rose), Protagoras’s essay “On the Gods,” the prophetic books of the Sibyllines, Cato the Elder’s seven-book history of Rome, the tragedies of the rhetorician Cicero, and even the comic mock-epic Margites supposedly written by Homer.

More than the specter of all that has been lost, Alexandria has become synonymous with the folly of anti-intellectualism, as its destruction (variously, and often erroneously, attributed to Romans, Christians, and Muslims) is a handy and dramatic narrative to illustrate the eclipse of antiquity. Let’s keep some perspective though—let’s crunch some numbers again. According to Robin Lane Fox in The Classical World: An Epic History from Homer to Hadrian the “biggest library…was said to have grown to nearly 500,000 volumes.” Certainly not a collection to scoff at, but Alexander’s library, which drew from the furthest occident to the farthest orient, held only a sixth as many books as the Library of Congress’s Asian Collection; Harvard University’s Widener Library has 15 times as many books (and that’s not including the entire system); the National Library of Iran, housed not far from where Alexander himself died, has 30 times the volumes of that ancient collection. The number of books held by the Library of Alexandria would have been perfectly respectable in the collection of a small midwestern liberal arts college. By contrast, according to physicist Barak Shoshany on a Quora question, if the 5 zettabytes of the Internet were to be printed, then the resultant stack of books would have to fit on a shelf “4×10¹¹ km or about 0.04 light years thick,” the last volume floating somewhere near the Oort Cloud. Substantially larger shelves would be needed, it goes without saying, than whatever was kept in the storerooms of Alexandria with that cool Mediterranean breeze curling the edges of those papyri.

To read all of those scrolls, codices, and papyri at Alexandria would take our intrepid ideal reader a measly 1,370 years. More conservative historians estimate that the Library of Alexandria may have housed only 40,000 books—if that is the case, then it would take you a little more than a century to read (if you’re still breezing through a book a day). That’s theoretically the lifetime of someone gifted with just a bit of longevity. All of this numeric stuff misses the point, though. It’s all just baseball card collecting, because what the Library of Alexandria represented—accurately or not—was the dream that it might actually be possible to know everything worth knowing. But since the emergence of modernity some half-millennium ago, and the subsequent fracturing of disciplines into ever more finely tuned fields of study, it’s been demonstrated just how much of a fantasy that goal is. There’s a certain disposition that’s the intellectual equivalent of Alexander, and our culture has long celebrated that personality type—the Renaissance Man (and it always seems gendered thus). Just as there was always another land to be conquered over the next mountain range, pushing through the Kush and the Himalayan foothills, so too does the Renaissance Man have some new type of knowledge to master, say geophysics or the conjugation of Akkadian verbs. Nobody except for Internet cranks or precocious and delusional autodidacts actually believes in complete mastery of all fields of knowledge anymore; by contrast, for all that’s negative about graduate education, one clear and unironic benefit is that it taught me the immensity and totality of all of the things that I don’t know.

Alexandria’s destruction speaks to an unmistakable romance about that which we’ll never be able to read, but it also practically says something about a subset of universal completism—our ability to read everything that has survived from a given historical period. By definition it’s impossible to actually read all of classical literature, since the bulk of it is no longer available, but to read all of the Greek and Roman writing which survives—that is doable. It’s been estimated that less than one percent of classical literature has survived to the modern day, with Western cultural history sometimes reading as a story of luck and monks, in equal measure, preserving that inheritance. It would certainly be possible for any literate individual to read all of Aristophanes’s plays, all of Plato’s dialogues, all of Juvenal’s satires. Harvard University Press’s venerable Loeb Classical Library, preserving Greek and Latin literature in their distinctive minimalist green and red covered volumes, currently has 530 titles available for purchase. Though it doesn’t encompass all that survives from the classical period, it comes close. An intrepid and dogged reader would be able to get through them, realistically, in a few years (comprehension is another matter).

If you need to budget your time, all of the Anglo-Saxon writing that survives, that which didn’t get sewn into the back-binding of some inherited English psalm book or find itself as kindling in the 16th century when Henry VIII dissolved the monasteries, is contained across some four major poetic manuscripts, though 100 more general manuscripts endure. The time period that I’m a specialist in, which now goes by the inelegant name of the “early modern period” but which everybody else calls the “Renaissance,” is arguably the first period for which no scholar would be capable of reading every primary source that endures. Thanks to its relative proximity to our own time, and to the preponderance of titles gestated by the printing press, it would be impossible for anyone to read everything produced in those centuries. For every William Shakespeare play, there are hundreds of yellowing political pamphlets about groups with names like “Muggletonians;” for every John Milton poem, a multitude of religious sermons on subjects like double predestination. You have to be judicious in what you choose to read, since one day you’ll be dead. This reality should be instrumental in any culture wars détente—canons exist as a function of pragmatism.

The canon thus functions as a kind of short-cut to completism (if you want to read through all of the Penguin Classics editions with their iconic black covers and little avian symbol, that’s a meager 1,800 titles to get through). Alexandria’s delusion about gathering all of that which has been written, and perhaps educating oneself from that corpus, drips down through the history of Western civilization. We’ve had no shortage of Renaissance Men who, even if they hadn’t read every book ever written, perhaps at least roughly knew where all of them could be found in the card catalogue. Aristotle was a polymath who not only knew the location of those works, but credibly wrote many of them (albeit all that remains are student lecture notes), arguably the founder of fields as diverse as literary criticism and dentistry. In the Renaissance, when one would assume the attendant Renaissance Man was most celebrated, there was a preponderance of figures who were claimed to have mastered all disciplines that could be mastered (and to be familiar with the attendant literature review). Leonardo da Vinci, Blaise Pascal, Athanasius Kircher, Isaac Newton, and Gottfried Wilhelm Leibniz have all been configured as Renaissance Men, their writings respectively encompassing not just art, mathematics, theology, physics, and philosophy, but also aeronautics, gambling, sinology, occultism, and diplomacy as well.

Stateside both Benjamin Franklin and Thomas Jefferson (a printer and a book collector) are classified as such, and more recently figures as varied as Nikola Tesla and Noam Chomsky are sometimes understood as transdisciplinary polymaths (albeit ones for whom it would be impossible to have read all that can be read, even if it appears as such). Hard to disentangle the canonization of such figures from the social impulse to be “well read,” but in the more intangible and metaphysical sense, beyond wanting to seem smart because you want to seem smart, the icon of the Renaissance Man can’t help but appeal to that completism, that desire for immortality that is prodded by the anxiety that libraries inculcate in me. My patron saint of polymaths is the 17th-century German Jesuit Kircher, beloved by fabulists from Eco to Jorge Luis Borges, for his writings that encompassed everything from mathematics to hieroglyphic translation. Paula Findlen writes in the introduction to her anthology Athanasius Kircher: The Last Man Who Knew Everything that he was the “greatest polymath of an encyclopedic age,” yet when his rival Renaissance Man Leibniz first dipped into the voluminous mass of Kirchermania he remarked that the priest “understands nothing.”

Really, though, that’s true of all people. I’ve got a PhD and I can’t tell you how lightbulbs work (unless they fit into Puritan theology somehow). Kircher’s translations of Egyptian were almost completely and utterly incorrect. As was much of what else he wrote on, from mineralogy to Chinese history. He may have had an all-encompassing familiarity with the human knowledge of his era, but Kircher wasn’t right very often. That’s all right; the same criticism could be leveled at his interlocutor Leibniz. Same as it ever was, and applicable to all of us. We’re not so innocent anymore; the death of the Renaissance Man is like the death of God (or of a god). The sheer amount of that which is written, the sheer number of disciplines that exist to explain every facet of existence, should disabuse us of the idea that there’s any way to be well-educated beyond the most perfunctory meaning of that phrase. In that gulf between our desire to know and the poverty of our actual understanding are any number of mythic figures who somehow close that gap; troubled figures from Icarus to Dr. Faustus who demonstrate the hubris of wishing to read every book, to understand every idea. A term should exist for the anxiety that those examples embody, the quaking fear before the enormity of all that we don’t know. Perhaps the readerly dilemma, or even textual anxiety.

A full accounting of the nature of this emotion compels me to admit that it goes beyond simply fearing that I’ll never be able to read all of the books in existence, or in some ideal library, or if I’m being honest even in my own library. The will towards completism alone is not the only attribute of textual anxiety, for a not dissimilar queasiness can accompany related (though less grandiose) activities than the desire to read all books that were ever written. To wit—sometimes I’ll look at the ever-expanding pile of books that I’m to read, including volumes that I must read (for reviews or articles) and those that I want to read, and I’m paralyzed by that growing paper cairn. Such debilitation isn’t helpful; the procrastinator’s curse is that it’s a personality defect that’s the equivalent of emotional quicksand. Against this foolish inclination towards completism—desiring everything and thus acquiring nothing—I sometimes use Francis Bacon’s claim from his essays of 1625 that “Some books should be tasted, some devoured, but only a few should be chewed and digested thoroughly,” as a type of mantra against textual anxiety, and it mostly works. Perhaps I should learn to cross-stitch it as a prayer to display amongst my books, which, even if I haven’t read all of them, have at least been opened (mostly).

But textual anxiety manifests itself in a far weirder way, one that I think gets to the core of what makes the emotion so disquieting. When I’m reading some book that I happen to be enjoying, some random novel picked up from the library or purchased at an airport to pass the time, but not the works that I perennially turn toward—Walt Whitman and John Milton, John Donne and Emily Dickinson—I’m sometimes struck with a profound melancholy born from the fact that I shall never read these sentences again. Like meeting a friendly stranger who somehow indelibly marks your life by the tangible reality of their being, but who will return to anonymity. Then it occurs to me that even those things I do read again and again, Leaves of Grass and Paradise Lost, I will one day also read for the last time. What such textual anxiety trades in, like all things of humanity, is my fear of finality, of extinction, of death. That’s at the center of this, isn’t it? The simultaneous fear of there being no more worlds to conquer and the fear that the world never can be conquered. Such consummation, the obsession with having it all, evidences a rather immature disposition.

It’s that Alexandrian imperative, but if there is somebody wiser and better to emulate it’s the old cynic Diogenes of Sinope, the philosophical vagabond who spent his days living in an Athenian pot. Laertius reports in Lives of the Eminent Philosophers that “When [Diogenes] was sunning himself…Alexander came and stood over him and said: ‘Ask me for anything you want.’ To which he replied, ‘Stand out of my light.’” And so, the man with everything was devoid of things to give to the man with nothing. There’s something indicative in that when it comes to this fear that there are things you’ll never be able to read, things you’ll never be able to know. The point, Diogenes seems to be saying, is to enjoy the goddamn light. Everything in that. I recall once admitting my fear about all that I don’t know, all of those books that I’ll never open, to a far wiser former professor of mine. This was long before I understood that an education is knowing what you don’t know, understanding that there are things you will never know, and worshiping at the altar of your own sublime ignorance. When I explained this anxiety of all of these rows and rows of books never to be opened, she was confused. She said, “Don’t you see? That just means that you’ll never run out of things to read.” Real joy, it would seem, comes in the agency to choose, for if you were able to somehow give your attention equally to everything then you’d suffer the omniscient imprisonment that only God is cursed with. The rest of us are blessed with the endless, regenerative, confusing, glorious, hidden multiplicity of experience in discrete portions. Laertius writes, “Alexander is reported to have said, ‘Had I not been Alexander, I should have liked to be Diogenes.’”

Image Credit: Wikipedia.

Stories in Formaldehyde: The Strange Pleasures of Taxonomizing Plot

Somewhere within the storerooms of London’s staid, gray-faced Tate Gallery (for it’s not currently on exhibit) is an 1834 painting by J.M.W. Turner entitled “The Golden Bough.” Rendered in that painter’s characteristic sfumato of smeared light and smoky color, Turner’s composition depicts a scene from Virgil’s epic Aeneid wherein the hero is commanded by that seven-centuries-old prophetic crone, the Sibyl of Cumae, to make an offering of a golden bough from a sacred tree growing upon the shores of crystalline blue Lake Avernus to the goddess Proserpina, if he wishes to descend to Hades and see the shadow of his departed father. “Obscure they went through dreary shades, that led/Along the waste dominions of the dead,” translated John Dryden in 1697, using his favored totemistic Augustan rhyming couplets, as Aeneas descends further into the Underworld, its entrance a few miles west of Naples. As imagined by Turner, the area around the volcanic lake is pleasant, if sinister; bucolic, if eerie; pastoral, if unsettling. A dapple of light marks the portal whereby pilgrims journey into perdition; in the distance tall, slender trees topped with a cap of branches jut up throughout the landscape. A columned temple is nestled within the scrubby hills overlooking the field. The Sibyl stands with a scythe so that the vegetable sacrifice can be harvested, postlapsarian snakes slither throughout, and the Fates revel in mummery near hell’s doorway. Rather than severe tones of blood red and sulfurous black, earthy red and cadaverous green, Turner opted to depict Avernus in soft blues and grays, and the result is all the more disquieting. Here, the viewer might think, is what the passage between life and death must look like—muted, temperate, serene, the transition barely even noticeable from one state to the next.

As with the best of Turner’s paintings, with his eye for color the visual equivalent of perfect pitch, it is the texture of hues that renders, if not some didactic message about his subject, a general emotional sense, a sentiment hard to describe, registering at a pitch that can barely be heard and yet alters one’s feelings in the moment. Such was the sense conveyed by the Scottish folklorist James George Frazer, who borrowed the artist’s title for his landmark 1890 study The Golden Bough: A Study in Comparative Religion, describing on his first page how the painting is “suffused with the golden glow of imagination in which the divine mind of Turner steeped and transfigured even the fairest natural landscape.” This scene, Frazer enthuses, “is a dream-like vision of the little woodland…[where] Dian herself might still linger by this lonely shore, still haunt these woodlands wild.” An influential remnant of a supremely Victorian enthusiasm for providing quasi-scientific gloss to the categorization of mythology, Frazer’s study supplied a taxonomy of classical myth so as to find certain similarities, the better to provide a grand, unified theory of ancient religion (or what Edward Casaubon in George Eliot’s Middlemarch, written two decades before, might call The Key to All Mythologies). First viewing Turner’s canvas, the rationalist Frazer was moved by the painting’s mysteriousness, the way in which the pool-blue sky and the shining hellmouth trade in nothing as literal as mere symbolism, but wherein the textured physicality—the roughness of the hill and the ominous haze of the clouds, dusk’s implied screaming cicadas and the cool of the evening—conveys an ineffable feeling. Despite pretensions to an analysis more logical, Frazer intimates the numinous (for, how couldn’t he?). “Who does not know Turner’s picture of the Golden Bough?” he writes.

His argument in The Golden Bough was that religions originated as primitive fertility cults, dedicated to the idea of sacrifice and resurrection, and that from this fundamentally magical worldview would evolve more sophisticated religions, to finally be supplanted by secular science. The other argument of The Golden Bough is implicit in the book’s very existence—that structure can be ascertained within the messy morass of disparate myths. To make this argument he drew from sources as diverse as Virgil and the Nootka people of British Columbia, classifying, categorizing, and organizing data as surely as a biologist preserving specimens in a jar of formaldehyde. And like Charles Darwin measuring finch beaks, or Thomas Huxley pinning butterflies to wood blocks, Frazer believed that diversity was a mask for similarity.

As reductionist as his arguments are, and as disputed as his conclusions may be, Frazer’s influence was outsize among anthropologists, folklorists, writers, and especially literary critics, who thrilled to the idea that some sort of unity could be found in the chaotic variety of narratives that constitute world mythology. “I am a plain practical man,” Frazer writes, “not one of your theorists and splitters of hairs and choppers of logic,” and while it’s true that The Golden Bough evidences a more imaginative disposition, it still takes part in that old quixotic desire to find some Grand Unified Theory of Narrative. While Frazer’s beat was myth, he was still a reporter of stories, and percolating like a counter-rhythm within discussions of narrative is that old desire, the yearning to find the exact number of plots that it is possible to tell. Frazer, for all that was innovative about his thought, was neither the first nor the last to treat stories like animals in a genus, narratives as if creatures in a phylum.

That grand tradition claims there are only 36 stories that can be told, or seven, or four. Maybe there is really only one tale, the story of wanting something and not getting it, which is after all the contour of this story itself—the strange endurance of the sentiment that all narrative is easily classifiable into a circumscribed, finite, and relatively small number of possibilities. While I’ve got my skepticism about such an endeavor—seeing those suggested systems as erasing the particularity of stories, of occluding what makes them unique in favor of mutilating them to fit some Procrustean bed—I’d be remiss not to confess that I also find these theories immensely pleasing. There is something to be said about the cool rectilinear logic that claims any story, from Middlemarch to Fifty Shades of Grey, Citizen Kane to Gremlins 2, can be stripped down to its raw schematics and analyzed as fundamental, universal, eternal plots that existed before Gilgamesh’s cuneiform was wedged into wet clay.

Christopher Booker claims in The Seven Basic Plots: Why We Tell Stories that “wherever men and women have told stories, all over the world, the stories emerging to their imaginations have tended to take shape in remarkably similar ways,” differences in culture, language, or faith be damned. With some shading, Booker uses the archetypal psychoanalysis of Carl Jung to claim that every single narrative, whether in epic or novel, film or comic, can be slotted into 1) overcoming the monster (Beowulf, George Lucas’s Star Wars), 2) rags to riches (Charlotte Bronte’s Jane Eyre, Horatio Alger stories), 3) the quest (Homer’s The Odyssey, Steven Spielberg’s Raiders of the Lost Ark), 4) voyage and return (The Ramayana, J.R.R. Tolkien’s The Hobbit), 5) comedy (William Shakespeare’s Twelfth Night, the Coen Brothers’ The Big Lebowski), 6) tragedy (Leo Tolstoy’s Anna Karenina, Arthur Penn’s Bonnie and Clyde) or 7) rebirth (Charles Dickens’s A Christmas Carol, Harold Ramis’s Groundhog Day).

That all of these parenthetically referenced works are, of course, astoundingly different from each other in character, setting, and most of all language, is irrelevant to Booker’s theory. While allowing for more subtlety than my potted overview suggests, Booker still concludes that “there are indeed a small number of plots which are so fundamental to the way we tell stories that it is virtually impossible for any storyteller ever entirely to break away from them.” Such a claim is necessary to Booker’s contention that these narratives are deeply nestled in our collective unconscious, a repository of themes, symbols, and archetypes that are “our basic genetic inheritance,” which he then proffers as an explanation for why humans tell stories at all.

The Seven Basic Plots, published in 2004 after 34 years of labor, is the sort of critical work that doesn’t appear much anymore. Audacious to the point of impudence, ambitious to the level of crack-pottery, Booker’s theory seems more at home in a seminar held by Frazer than in contemporary English departments more apt to discuss gender, race, and class in Jane Austen’s Pride and Prejudice than they are the Orphic themes of rebirth as manifested in that same novel. Being the sort of writer who both denied anthropogenic climate change and defended asbestos (for real), Booker had the conservative’s permanent sense of paranoid aggrievement concerning the treatment of his perspectives. So, let me be clear—contra Booker’s own sentiments, I don’t think that the theories in The Seven Basic Plots are ignored by literary critics because of some sort of politically correct conspiracy of silence; I think that they’re ignored because they’re not actually terribly correct or useful. When figuring out the genealogical lineage of several different species of Galapagos Island finches, similarity becomes a coherent arbiter; however, difference is more important when thinking through what makes exemplary literature exemplary. Genre, and by proxy plot, is frequently more an issue of marketing than anything. That’s not to say that questions of genre have no place in literary criticism, but they are normally the least interesting (“What makes this gothic novel gothic?”). No stranger to such thinking himself, author Kurt Vonnegut may have solved the enigma with the most basic of monomyths elucidated—“man falls into hole, man gets out of hole.”

Booker isn’t after marketing, however; he’s after the key to all mythologies. Like Frazer before him, he is neither the first nor the last critic enraptured by the idea of a Periodic Table of Plots, capable of explaining both Fyodor Dostoevsky’s Crime and Punishment and Weekend at Bernie’s. If you wish to blame somebody for this line of thinking, as with most disciplines of human endeavor from ethics to dentistry, look to Aristotle as the culprit. The philosopher’s “four conflicts,” man against himself, man against man, man against nature, and man against the gods, have long been a convenient means of categorizing plots. The allure of there being a limited number of plots is that it makes both reading and writing theoretically easier. The denizens of high culture literary criticism have embraced the concept periodically, as surely as those producing paperbacks promising that a hit book can be easily plotted out from a limited tool kit. Georges Polti, of Providence, Rhode Island, and later Paris, France, wrote The Thirty-Six Dramatic Situations in 1895, claiming that all stories could be categorized in that number of scenarios, including plots of “Crime pursued by vengeance” and “Murderous adultery.” “Thirty-six situations only!” Polti enthuses. “There is to me, something tantalizing about the assertion.” Polti’s book has long been popular as a sort of lo-fi randomizer for generating stories, and its legacy lives on in works like Ronald B. Tobias’s 20 Master Plots and How to Build Them and Victoria Lynn Schmidt’s A Writer’s Guide to Characterization: Archetypes, Heroic Journeys, and Other Elements of Dynamic Character Development.

There is also a less pulpy, tonier history surrounding the thinking that everything can be brewed down to a handful of elemental plots. My attitude was a bit glib earlier, as there is something to be said for the utility of this approach, and indeed entire academic disciplines have grown from that assumption. Folklorists use a classification system called the “Aarne-Thompson-Uther Index,” where a multitude of plot-types are given numbers (“Cinderella” is 510A, for example), which can be useful to trace the ways in which stories have evolved and altered over both distance and time. Unlike Polti’s 36 plots, Tobias’s 20, or Booker’s seven, Stith Thompson’s Motif-Index of Folk-Literature goes to six volumes of folk tales, fairy tales, legends, and myths, but the basic idea is the same: plots exist in a finite number (including “Transformation: man to animal” and “Magic strength resides in hair”). As with the system of classification invented by Francis James Child in The English and Scottish Popular Ballads, or the Roud Folk Song Index, the Aarne-Thompson-Uther Index is more than just a bit of shell collecting, but rather a system of categorization that helps folklorists make sense of the diversity of oral literature, with scholar Alan Dundes enthusing that the system was among the “most valuable tools in the professional folklorist’s arsenal of aids for analysis.” Morphological approaches define the discipline known as “narrative theory,” which draws from a similar theoretical inclination as that of the ATU Index. All of these methodologies share a commitment to understanding literature less through issues of grammar, syntax, and diction, and more in terms of plot and story. For those who read with an eye towards narrative, there is frequently an inclination, sentiment, or hunch that all stories and novels, films and television shows, epics and lyrics, comics and plays, can have their fat, gristle, and tallow boiled away to leave just the broth and a plot that’s as clean as a bone.

It was a faith popular among the Russian Formalists, sometimes incongruously known as the Prague School (after where many of them, as Soviet exiles, happened to settle), including Roman Jakobson, Viktor Shklovsky, and Vladimir Propp, the last of whom wrote Morphology of the Folktale, reducing those stories to a narrative abstraction that literally looks like mathematics. A similar movement was that of French structuralism, as exemplified by its founder the linguist Ferdinand de Saussure, and as later practiced by the anthropologist Claude Levi-Strauss and the literary critic Roland Barthes. In the Anglophone world, with the exception of some departments that are enraptured by narratology, literary criticism has often focused on the evisceration of a text with the scalpel of close reading rather than the measurement of plot with the calipers of taxonomy. Arguably that’s led to the American critical predilection towards “literary” fiction over genre fiction, the rejection of science fiction, fantasy, horror, and romance as being unserious in favor of all of those beautifully crafted stories in The New Yorker where the climax is the main character looking out the window, sighing, and taking a sip of coffee, while realizing that she was never happy, not really.

There are exceptions to the critical valorization of language over plot, however, none more so than in the once mighty but now passé writings of Canadian theorist Northrop Frye. Few scholars in the English-speaking world were more responsible for that once enthusiastic embrace of taxonomic criticism than this United Church of Canada minister and professor at Toronto’s Victoria College. Frye was enraptured by the psychoanalyst Carl Jung’s theories of how fundamental archetypes structure our collective unconscious, and he believed that a similar approach could be applied to narrative, that a limited number of plots structured our way of thinking and approaching stories. In works like Fearful Symmetry on William Blake, and his all-encompassing Anatomy of Criticism, Frye elucidated a complex, baroque, and elegant system of categorizing stories, the better to interpret them properly. “What if criticism is a science as well as an art?” Frye asked, wishing to approach literature like a taxonomist, as if novels were a multitude of plants and animals just awaiting Linnaean classification. For those who read individual poems or novels as exemplary texts, explaining what makes them work, Frye would say that they’re missing the totality of what literature is. “Criticism seems to be badly in need of a coordinating principle,” he writes, “a central hypothesis which, like the theory of evolution in biology, will see the phenomena it deals with as part of a whole.”

Frye argued that this was to be accomplished by identifying that which was universal in narrative, where works could be rendered down from their unique flesh into their skeletons, which we would then find to be myths and archetypes. From this anodyne observation, Frye spun out a complex classification system for all Western literature, one where he identifies the exact archetypes that define poetry and prose, where he flings about terms like “centripetal” and “centrifugal” to interpret individual texts, and where phrases like the “kerygmatic mode” are casually used. Anatomy of Criticism is true to its title; Frye carves up the cadaver of literature and arrives at an admittedly intoxicating theory of everything. “Physics is an organized body of knowledge about nature, and a student of it says that he is learning physics, not nature,” Frye writes. “Art, like nature, has to be distinguished from the systematic study of it, which is criticism.” In Frye’s physics, there are five “modes” of literature, including the mythic, romantic, high mimetic, low mimetic, and ironic; these are then cross-listed with tragic, comic, and thematic forms; what are then derived are genres with names like the dionysian, the elegiac, the aristophanic, and so on. Later in the book he supplies a complex theory of symbolism, a methodology concerning imagery based on the Platonic Great Chain of Being, and a thorough taxonomy of genre. In what’s always struck me as one of the odder (if ingenious) parts of Anatomy of Criticism, Frye ties genres specifically to certain seasons, so that comedy is a spring form, romance belongs to the summer, autumn is a time of tragedy, and winter births irony. How one reads books from those tropical places where the seasons neatly divide between rainy and dry speaks to a particular chauvinism on the Canadian’s part.

For most viewers of public television, however, their introduction to the “There-are-only-so-many-stories” conceit wasn’t Frye, but rather a Sarah Lawrence College professor who was the titular subject of journalist Bill Moyers’s 1988 PBS documentary Joseph Campbell and the Power of Myth. Drawing largely from his 1949 study The Hero with a Thousand Faces, Campbell became the unlikely star of the series that promulgated his theory of the “monomyth,” the idea that a single story threads through world mythology, often focused on what he termed “the hero’s journey.” Viewers were drawn to Campbell’s airy insights about the relationship between Akkadian mythology and Star Wars (a film which George Lucas admitted was heavily influenced by the folklorist’s ideas), and his vaguely countercultural pronouncement that one should “Follow your bliss!,” despite his own right-wing politics (which according to some critics could run the gamut from polite Reaganism to fascist sympathizing). Both Frye and Campbell exhibited a wide learning, but arguably only the former’s was particularly deep. With an aura of crunchy tweediness, Campbell seemed like the sort of professor who would talk to students about the Rubaiyat of Omar Khayyam in an office that smelled of patchouli, a threadbare oriental rug on the dusty floor, knick-knacks assembled while studying in India and Japan, and a collapsing bookshelf jammed with underlined paperback copies of Friedrich Nietzsche and Arthur Schopenhauer above his desk. Campbell, in short, looked like what we expect a liberal arts teacher to look like, and for some of his critics (like Dundes, who called him a “non-expert” and an “amateur”) that gave him an unearned authority.

But what an authority he constructed, the hero with only one theory to explain everything! Drawing from Jung, Frazer, and all the rest of the usual suspects, Campbell argued in his most famous book that broad archetypes structure all narrative, wherein a “hero ventures forth from the world of common day into a region of supernatural wonder: fabulous forces are there encountered and a decisive victory is won: the hero comes back from this mysterious adventure with the power to bestow boons on his fellow man.” Whether Luke Skywalker venturing out from Tatooine or Gilgamesh leaving Uruk, the song remains the same, Campbell says. Gathering material from the ancient Near East and Bronze Age Ireland, the India of the Mahabharata and Hollywood screenplays, Campbell claimed that his monomyth was the skeleton key to all narrative, a story whose parsing could furthermore lead to understanding, wisdom, and self-fulfillment among those who are hip to its intricacies. The Hero with a Thousand Faces naturally flattered the pretensions of some artists and writers, what with its implication that they were conduits connected directly to the collective unconscious. Much as with Freud and the legions of literary critics who applied his theories to novels and film, if Campbell works well in interpreting lots of movies, it’s because those directors (from Lucas to Stanley Kubrick) happened to be reading him. The monomyth can begin to feel like the critical equivalent of the intelligent design advocate who knows God exists, because why else would we have been given noses on which to so conveniently hold our glasses?

Campbell’s politics, and indeed those of his theory, are ambivalent. His comparative approach superficially seems like the pluralistic, multicultural, ecumenical perspective of the Sarah Lawrence professor that he was, but at the same time the flattening of all stories into this one monomyth does profound violence to the particularity of myths innumerable. There is a direct line between Campbell and the mythos-laden mantras of poet Robert Bly and his Iron John: A Book About Men, the tome that launched a thousand drum circles of suburban dads trying to engage their naturalistic masculinity in vaguely homoerotic forest rituals, or of Canadian psychotherapist/alt-right apologist Jordan Peterson, who functions as basically a Dollar Store version of the earlier folklorist. Because myths are so seemingly elemental, mysterious telegrams from the ancient past, whose logic seems imprinted into our unconscious, it’s hard not to see the attraction of a Campbell. And yet whenever someone starts talking about “mythos” it can start to feel like you’re in the presence of a weirdo who practices “rune magik,” unironically wonders if they’re an ubermensch, and has an uncomfortably racist Google search history. We think of the myth as the purview of the hippie, but it’s just as often the province of the jackbooted authoritarian, for Campbell’s writings fit comfortably with a particularly reactionary view of life, which should fit uncomfortably with the rest of us. “Marx teaches us to blame society for our frailties, Freud teaches us to blame our parents,” Campbell wrote in the posthumously published Pathways to Bliss, but the “only place to look for blame is within: you didn’t have the guts to bring up your full moon and live the life that was your potential.” Yeah, that’s exactly it. People can’t afford healthcare or get a job because they didn’t bring up their full moon…

The problem is that if you take Campbell too seriously then everything begins to look like it was written by Campbell. To wit, the monomyth is supposed to go through successive stages, from the hero’s origin in an ordinary world where he receives a “call to adventure,” to being assisted by a mentor who leads him through a “guarded threshold” where he is tested on a “road of trials,” to finally facing his ultimate ordeal. After achieving success, the hero returns to the ordinary world wiser and better, improving the lives of others through the rewards that have been bestowed upon him. The itinerary is more complex than this in The Hero with a Thousand Faces, but this should be enough to convey that Campbell’s schema is general enough that it can be applied to anything, but particular enough that it gives the illusion of rigor. Think of Jesus Christ, called to be the messiah and assisted by John the Baptist, tempted by Satan in the desert, and after coming into Jerusalem facing torture at the hands of the Romans, before his crucifixion and harrowing of hell, only to be resurrected with the promise of universal human salvation. Now, think of Jeff Lebowski, called to be the Dude and assisted by Walter Sobchak, tempted by Jackie Treehorn, battling the nihilists, only to return in time for the bowling finals. Other than speaking deep into the souls of millions of people, it should be uncontroversial to say that the gospels and the Coen Brothers’ The Big Lebowski are the same story only in the most glaring of superficial ways, and yet the quasi-conspiratorial theory of the monomyth promises secret knowledge that says that they are.

But here’s the thing—stories aren’t hydrogen, plots aren’t oxygen, narratives aren’t carbon. You can’t reduce the infinity of human experience into a Periodic Table, except in the most perfunctory of ways. To pretend that the tools of classification are the same as the insights of interpretation is to grind the Himalayas into Iowa; it’s to cut so much from the bone that the only meal you’re left with is that of a skeleton. When all things are reduced to monomyth, the enthusiast can’t recognize the exemplary, the unique, the individual, the subjective, the idiosyncratic, because some individual plot doesn’t have a magical wizard shepherding the hero to the underworld, or whatever. It’s to deny the possibility of some new story, of some innovation in narrative; it’s to spurn the Holy Grail of uniqueness. Still, some sympathy must be offered as to why these models appeal to us, of how archetypal literary criticism appeals to our inner stamp collectors. With apologies to Voltaire, if narrative didn’t exist it would be necessary to invent it—and everything else too. The reasons why archetypal criticism is so appealing are legion—such systems impose a unity on chaos, provide a useful measure of how narratives work, and give the initiate the sense that they have knowledge that is applicable to everything from The Odyssey to Transformers.

But a type of critical madness lies in the idolatry of confusing methodological models for the particularity of actual stories. Booker writes of stories that are “Rags to Riches,” but that reductionism is an anemic replacement for inhabiting Pip’s mind when he pines for Estella in Charles Dickens’s Great Expectations; he classifies Bram Stoker’s Dracula as being about “Overcoming the Monster,” but that simplification is at the expense of that purple masterpiece’s paranoia, its horror, its hunger, its sexiness. There are no stories except in the details. To forget that narratives are infinite is a slur against them; it’s the blasphemy of pretending that every person is the same as every other. For in a warped way, there is but one monomyth, but it’s not what the stamp collectors say it is. In all of their variety, diversity, and multiplicity, every tale is a creation myth because every tale is created. From the raw material of life is generated something new, and in that regard we’re not all living variations of the same story; we’re all living within the same story.

Bonus Links:
—The Purpose of Plot: An Argument with Myself
—The Million Basic Plots
—On Not Going Out of the House: Thoughts About Plotlessness

Image Credit: Wikimedia Commons.