King, God, Smart-Ass Author: Reconsidering Metafiction

“Whoever is in charge here?” —Daffy Duck, Merrie Melodies (1953)

Like all of us, Daffy Duck was perennially put upon by his Creator. The sputtering, stuttering, rageful waterfowl’s life was a morass of indignity, embarrassment, anxiety, and existential horror. Despite all of the humiliation Daffy had to contend with, the aquatic bird was perfectly willing to shake his wings at the unfair universe. As expertly delivered by voice artist Mel Blanc, Daffy could honk “Who is responsible for this? I demand that you show yourself!” In his brilliant and classic 1953 Merrie Melodies episode “Duck Amuck,” animator Chuck Jones presents Daffy as a veritable Everyduck, a sinner in the hands of a smart-assed illustrator. “Duck Amuck” has remained a canonical episode in the Warner Brothers cartoon catalog, its postmodern, metafictional experimentation heralded for its daring and cheekiness. Any account of what critics very loosely term “postmodern literature”—with its playfulness, its self-referentiality, and its breaking of the fourth wall—that considers Italo Calvino, Jorge Luis Borges, Vladimir Nabokov, and Paul Auster but not Jones is only telling part of the metafictional story. Not for nothing, but two decades ago, “Duck Amuck” was added to the National Film Registry by the Library of Congress as an enduring piece of American culture.

Throughout the episode, Jones depicts increasingly absurd metafictional scenarios involving Daffy’s sublime suffering. Jones first imagines Daffy as a swordsman in a Three Musketeers parody, only to have him wander into a shining, white abyss as the French Renaissance background fades away. “Look Mac,” Daffy asks, never one to let ontological terror impinge on his sense of personal justice, “what’s going on here?” Jones wrenches the poor bird from the musketeer scenery to the blinding whiteness of the nothing-place, then to a bucolic pastoral, and finally to a paradisiacal Hawaiian beach. Daffy’s admirable sense of his own integrity remains intact, even throughout his torture. Pushed through multiple parallel universes, wrenched, torn, and jostled through several different realities, Daffy shouts “All right wise guy, where am I?”  

But eventually not even his own sense of identity is allowed to continue unaffected, as the God-animator turns him into a country-western singer who can only produce jarring sound effects from his guitar, or as a transcendent paintbrush recolors Daffy blue. At one point the animator’s pencil impinges into Daffy’s world, erasing him, negating him, making him nothing. Daffy’s very being, his continued existence, depends on the whims of a cruel and capricious God; his is the world of Shakespeare’s King Lear, where the Earl of Gloucester utters his plaintive cry, “As flies to wanton boys are we to th’ gods; / They kill us for their sport.” Or at least they erase us. Finally, like Job before the whirlwind, Daffy implores, “Who is responsible for this? I demand that you show yourself!” As the view pans upward, into that transcendent realm of paper and ink where the animator-God dwells, it’s revealed to be none other than the trickster par excellence, Bugs Bunny. “Ain’t I a stinker?” the Lord saith.

Creation, it should be said, is not accomplished without a certain amount of violence. According to one perspective, we can think of Daffy’s tussling with Bugs as being a variation on that venerable old Aristotelian narrative conflict of “Man against God.” If older literature was focused on the agon (as the Greeks put it) between a human and a deity, and modernist literature concerned itself with the conflict that resulted as people had to confront the reality of no God, then the wisdom goes that our postmodern moment is fascinated with the idea of a fictional character searching out his or her creator. According to narrative theorists, that branch of literary study that concerns itself with the structure and organization of story and plot (not synonyms, incidentally), such metafictional affectations are technically called metalepsis. H. Porter Abbott in his invaluable The Cambridge Introduction to Narrative explains that such tales involve a “violation of narrative levels” when a “storyworld is invaded by an entity or entities from another narrative level.”

Metalepsis can be radical in its execution, as when an “extradiegetic narrator” (that means somebody from outside the story entirely) enters into the narrative, as in those narratives where an “author appears and starts quarreling with one of the characters,” Abbott writes. We’ll see that there are precedents for that sort of thing, but whether interpreted as gimmick or deep reflection on the idea of literature, the conceit that has a narrator enter into the narrative as if in a theophany is most often associated with something called, not always helpfully, “postmodernism.” Whatever that much-maligned term might mean, in popular parlance it has an association with self-referentiality, recursiveness, and metafictional playfulness (even if readers might find cleverness such as that exhausting). The term might as well be thought of as referring to our historical preponderance of literature that knows that it is literature.

With just a bit of British disdain in his critique, The New Yorker literary critic James Wood writes in his pithy and helpful How Fiction Works that “postmodern novelists… like to remind us of the metafictionality of all things.” Think of the crop of experimental novelists and short story writers from the ’60s, such as John Barth in his Lost in the Funhouse, where one story is to be cut out and turned into an actual Moebius strip; Robert Coover in the classic and disturbing short story “The Babysitter,” in which a variety of potential realities and parallel histories exist simultaneously in the most mundane of suburban contexts; and John Fowles in The French Lieutenant’s Woman, in which the author also supplies multiple “forking paths” to the story and where the omniscient narrator occasionally appears as a character in the book. Added to this could be works where the actual first-person author themselves becomes a character, such as Auster’s New York Trilogy, or Philip Roth’s Operation Shylock (among other works where he appears as a character). Not always just as a character, but as the Creator, for if the French philosopher Roland Barthes killed off the idea of such a figure in his seminal 1967 essay “The Death of the Author,” then much of the period’s literature resurrected Him. Wood notes, perhaps in response to Barthes, that “A certain kind of postmodern novelist…is always lecturing us: ‘Remember, this character is just a character. I invented him.’” Metafiction is when fiction thinks about itself.

Confirming Wood’s observation, Fowles’s narrator writes in The French Lieutenant’s Woman, “This story I am telling is all imagination. These characters I create never existed outside my own mind…the novelist stands next to God. He may not know all, yet he tries to pretend that he does.” Metafictional literature like this is supposed to interrogate the idea of the author, the idea of the reader, the very idea of narrative. When the first line to Calvino’s If on a Winter’s Night a Traveler is “You are about to begin reading Italo Calvino’s new novel, If on a Winter’s Night a Traveler,” it has been signaled that the narrative you’re entering is supposed to be different from those weighty tomes of realism that supposedly dominated in previous centuries. If metalepsis is a favored gambit of our experimental novelists, then it’s certainly omnipresent in our pop culture as well, beyond just “Duck Amuck.”

A list of sitcoms that indulge the conceit includes 30 Rock, Community, Scrubs, and The Fresh Prince of Bel-Air. The final example, which after all was already an experimental narrative about a wholesome kid from West Philly named Will played by a wholesome rapper from West Philly named Will Smith, was a font of avant-garde fourth-wall breaking deserving of Luigi Pirandello or Bertolt Brecht. Prime instances would include the season five episodes “Will’s Misery,” which depicts Carlton running through the live studio audience, and “Same Game, Next Season,” in which Will asks “If we so rich, why we can’t afford no ceiling?” with the camera panning up to show the rafters and lights of the soundstage. Abbott writes that metafiction asks “to what extent do narrative conventions come between us and the world?” which in its playfulness is exactly what The Fresh Prince of Bel-Air is doing, forcing its audience to consider how “they act as invisible constructors of what we think is true, shaping the world to match our assumptions.”

Sitcoms like these are doing what Barth, Fowles, and Coover are doing—they’re asking us to examine the strange artificiality of fiction, this illusion in which we’re asked by a hidden author to hallucinate and enter a reality that isn’t really there. Both audience and narrator are strange, abstracted constructs; their literal counterparts of reader and writer aren’t much more comprehensible. When we read a third-person omniscient narrator, it would be natural to ask “Who exactly is supposed to be recounting this story?” Metafiction is that which does ask that question. It’s the same question that the writers of The Office confront us with when we wonder, “Who exactly is collecting all of that documentary footage over those nine seasons?”

Far from being simply a postmodern trick, metalepsis as a conceit and the metafiction that results have centuries’ worth of examples. Interactions between creator and created, and certainly author and audience, have a far more extensive history than both a handful of tony novelists from the middle of the 20th century and the back catalog of Nick at Nite. For those whose definition of the novel doesn’t consider anything written before 1945, it might come as a shock that all of the tricks we associate with metafiction thread so deep into history that realist literature can seem the exception rather than the rule. This is obvious in drama; the aforementioned theater term “breaking the fourth wall” attests to the endurance of metalepsis in literature. As a phrase, it goes back to Molière in the 17th century, referring to when characters in a drama acknowledge their audience, when they “break” the invisible wall that separates the action of the stage from that of the observers in their seats. If Molière coined the term, it’s certainly older than even him. In all of those asides in Shakespeare—such as that opening monologue of Richard III when the title villain informs all of us who are joining him on his descent into perdition that “Now is the winter of our discontent”—we’re, in some sense, to understand ourselves as being characters in the action of the play itself.  

As unnatural as Shakespearean asides may seem, they don’t have the same sheer metaleptic import of metafictional drama from the avant-garde theater of the 20th century. Pirandello’s classic experimental play Six Characters in Search of an Author is illustrative here, a high-concept work in which unfinished and unnamed characters arrive at a Pirandello production asking their creator to more fully flesh them out. As a character named the Father explains, the “author who created us alive no longer wished…materially to put us into a work of art. And this was a real crime.” A real crime because to be a fictional character means that you cannot die, even though “The man, the writer, the instrument of the creation will die, but his creation does not die.” An immaculate creation outliving its creator, more blessed than the world that is forever cursed to be ruled over by its God. But first Pirandello’s unfortunates must compel their God to grant them existence; they need a “fecundating matrix, a fantasy which could rise and nourish them: make them live forever!” If this seems abstract, you should know that such metaleptic tricks were staged long before Pirandello, and Shakespeare for that matter. Henry Medwall’s 1497 Fulgens and Lucrece, the first secular play in the entire English canon, has two characters initially named “A” and “B” who argue about a play only to have it revealed that the work in question is actually Medwall’s, which the audience is currently watching. More than a century later, metafictional poses were still being explored by dramatists, a prime and delightful example being Shakespeare’s younger contemporary Francis Beaumont’s The Knight of the Burning Pestle. In that Jacobean play of 1607, deploying a conceit worthy of Thomas Pynchon’s The Crying of Lot 49, Beaumont imagines the production of a play-within-a-play entitled The London Merchant. In the first act, two characters climb onto the stage from the audience, one simply called “Citizen” and the other “Wife,” and begin to heckle and critique The London Merchant and its perceived unfairness to the rapidly ascending commercial class. The Knight of the Burning Pestle allows the audience to strike back, the Citizen cheekily telling the actor reading the prologue, “Boy, let my wife and I have a couple of stools / and then begin; and let the grocer do rare / things.”

Historical metalepsis can also be seen in what are called “frame tales,” that is, stories-within-stories that nestle narratives together like Russian dolls. Think of the overarching narrative of Geoffrey Chaucer’s 14th-century The Canterbury Tales with its pilgrims telling each other their stories as they make their way to the shrine of Thomas Becket, or of Scheherazade recounting her life-saving anthology to her murderous husband in One Thousand and One Nights, as compiled from folktales during the Islamic Golden Age from the eighth to 14th centuries. Abbott describes frame tales by explaining that “As you move to the outer edges of a narrative, you may find that it is embedded in another narrative.” Popular in medieval Europe, and finding their structure from Arabic and Indian sources that go back much further, frame tales are basically unified anthologies where an overarching narrative supplies its own meta-story. Think of Giovanni Boccaccio’s 14th-century Decameron, in which seven women and three men each tell 10 stories to pass the time while they’re holed up in a villa outside of Florence to await the passage of the Black Death through the city. The 100 resulting stories are ribald, earthy, and sexy, but present through all of their telling is an awareness of the tellers, this narrative about a group of young Florentines in claustrophobic, if elegant, quarantine. “The power of the pen,” one of Boccaccio’s characters says on their eighth day in exile, “is far greater than those people suppose who have not proved it by experience.” Great enough, it would seem, to create a massive sprawling world with so many stories in it. “In my father’s book,” the character would seem to be saying of his creator Boccaccio, “there are many mansions.”

As metaleptic as frame tales might be, a reader will note that Chaucer doesn’t hitch up for that long slog into Canterbury himself, nor does Boccaccio find himself eating melon and prosciutto while quaffing chianti with his aristocrats in The Decameron. But it would be a mistake to assume that older literature lacks examples of the “harder” forms of metalepsis, that writing before the 20th century is devoid of the Author-God appearing to her characters as if God on Sinai. So-called “pre-modern” literature is replete with whimsical experimentation that would seem at home in Nabokov or Calvino: audiences directly addressed on stage, books speaking as themselves to their readers, authors appearing in narratives as creators, and fictions announcing their fictionality.

Miguel de Cervantes’s 17th-century Don Quixote plays with issues of representation and artificiality when the titular character and his trusty squire, Sancho Panza, visit a print shop that is producing copies of the very book you are reading, the errant knight and his sidekick then endeavoring to prove that it is an inferior plagiarism of the real thing. Cervantes’s narrator reflects at an earlier point in the novel about the novel itself, enthusing that “we now enjoy in this age of ours, so poor in light entertainment, not only the charm of his veracious history, but also of the tales and episodes contained in it which are, in a measure, no less pleasing, ingenious, and truthful, than the history itself.” Thus Cervantes, in what is often considered the first novel, can lay claim to being the progenitor of both realism and metafictionality.

Following Don Quixote’s example could be added other metafictional works that long precede “postmodernism,” including Laurence Sterne’s 18th-century The Life and Opinions of Tristram Shandy, Gentleman, where the physical book takes time to mourn the death of a central character (when an all-black page is printed); the Polish count Jan Potocki’s underread late-18th-century The Manuscript Found in Saragossa, with not just its fantastic cast of Iberian necromancers, kabbalists, and occultists, but its intricate frame structure and forking paths (not least of which include reference to the book that you’re reading); James Hogg’s Satanic masterpiece The Private Memoirs and Confessions of a Justified Sinner, in which the author himself makes an appearance; and Jane Austen’s Northanger Abbey, in which the characters remark on how it feels as if they’re in a gothic novel (or perhaps a parody of one). Long before Barthes killed the Author, writers were conflating themselves as creator with the Creator. As Sterne notes, “The thing is this. That of all the several ways of beginning a book which are now in practice throughout the known world, I am confident my own way of doing it is the best—I’m sure it is the most religious—for I begin with writing the first sentence—and trusting to Almighty God for the second.”

Sterne’s sentiment provides evidence as to why metafiction is so alluring and enduring, despite its minimization by some critics who dismiss it as mere trick while obscuring its long history. What makes metalepsis such an intellectually attractive conceit is not simply that it makes us question how literature and reality interact, but what it implies about the Author whom Sterne gestures toward—“Almighty God.” The author of Tristram Shandy understood, as all adept priests of metafiction do (whether explicitly or implicitly), that at its core, metalepsis is theological. In questioning and confusing issues of characters and writers, narrators and readers, actors and audience, metafiction experiments with the very idea of creation. Some metafiction privileges the author as being the supreme God of the fiction, as in The French Lieutenant’s Woman, and some casts its lot with the characters, as in The Knight of the Burning Pestle. Some metafiction is “softer” in its deployment, allowing the characters within a narrative to give us stories-within-stories; other metafiction is “harder” in how emphatic it is about the artifice and illusion of fiction, as in Jones’s sublime cartoon. What all of them share, however, is an understanding that fiction is a strange thing, an illusion whereby, whether we’re gods or penitents, we’re all privy to a world spun from something as ephemeral as letters and breath. Wood asks, “Is there a way in which all of us are fictional characters, parented by life and written by ourselves?” And the metaphysicians of metafiction answer in the affirmative.

As a final axiom, to join my claims that metafiction is literature thinking about itself and that metalepsis has a far longer history than is often surmised, I’d argue that because all fiction—all literature—is artifice, all of it is in some sense metafiction. What defines fiction, what makes it different from other forms of language, is that quality of metalepsis. Even if not explicitly stated, the differing realms of reality implied by the very existence of fiction imply something of the meta. Abbott writes that “World-making is so much a part of most narratives that some narrative scholars have begun to include it as a defining feature of narrative,” and with that I heartily concur. Even our scripture is metafictional, for what else are we to call the Bible, in which Moses is both author and character, and where his death itself is depicted? In metafiction perspective is confused, writer turns to reader, narrator to character, creator to creation. There is no more apt a description of metafiction, of fiction, of life than that which is offered by Prospero at the conclusion of The Tempest: “Our revels now are ended. These our actors, / As I foretold you, were all spirits and / Are melted into air, into thin air.” For Prospero, the “great globe itself…all which it inherit, shall dissolve / And, like this insubstantial pageant faded…We are such stuff / As dreams are made on, and our little life / Is rounded with a sleep.” Nothingness before and nothingness after, but with everything in between, just like the universe circumscribed by the cover of a book. Metafiction has always defined literature; we’ve always been characters in a novel that somebody else is writing.

Judging God: Rereading Job for the First Time

“And can you then impute a sinful deed/To babes who on their mothers’ bosoms bleed?” —Voltaire, “Poem on the Lisbon Disaster” (1755)

“God is a concept/By which we measure our pain.” —John Lennon, “God” (1970)

The monks of the Convent of Our Lady of Mount Carmel would have just finished their Terce morning prayers of the canonical hours when an earthquake struck Lisbon, Portugal, on All Saints Day in 1755. A fantastic library was housed at the convent, more than 5,000 volumes of St. Augustine and St. Thomas Aquinas, Bonaventure and Albertus Magnus, all destroyed as the red-tiled roof of the building collapsed. From those authors there had been endless words produced addressing theodicy, the question of evil, and specifically how and why an omniscient and omnibenevolent God would allow such malevolent things to flourish. Both Augustine and Aquinas affirmed that evil had no positive existence in its own right, that it was merely the absence of good in the way that darkness is the absence of light. The ancient Church Father Irenaeus posited that evil is the result of human free will, and so even natural disaster was due to the affairs of women and men. By the 18th century, a philosopher like Gottfried Leibniz (too Lutheran and too secular for the monks of Carmo) would optimistically claim that evil is an illusion, for everything that happens is in furtherance of a divine plan whereby ours is the “best of all possible worlds,” even in Lisbon on November 1, 1755. On that autumn day in the Portuguese capital, the supposedly pious of the Carmo Convent were faced with visceral manifestations of that question of theodicy in a city destroyed by tremor, water, and flame.

No more an issue of scholastic abstraction, of syllogistic aridness, for in Lisbon perhaps 100,000 of the monks’ fellow subjects would die in what was one of the most violent earthquakes ever recorded. Death came in the initial collapse of the houses, palaces, basilicas, and cathedrals; in the tsunami that pushed in from the Atlantic and up the Tagus River; in the fires that ironically resulted from the preponderance of votive candles lit to honor the holy day; and in the pestilence that broke out among the debris and filth of the once proud capital. Lisbon was the seat of a mighty Catholic empire, which had spread the faith as far as Goa, India, and Rio de Janeiro, Brazil; its inhabitants were adherents to stern Iberian Catholicism, and the clergy brooked no heresy in the kingdom. Yet all of that faith and piety appeared to make no difference to the Lord; for the monks of the Carmo Convent who survived their home’s destruction, their plaintive cry might as well have been that of Christ’s final words upon the cross in the Gospel of Matthew: “My God, my God, why hast thou forsaken me?”

Christ may have been the Son of God, but with his dying words he was also a master of intertextual allusion, for his concluding remarks are a quotation of another man, the righteous gentile from the land of Uz named Job. If theodicy is the one insoluble problem of monotheism, the viscerally felt and empirically verifiable reality of pain and suffering in a universe supposedly divine, then Job remains the great brief for those of us who feel like God has some explaining to do. Along with other biblical wisdom literature like Ecclesiastes or Song of Songs, Job is one of those scriptural books that can sometimes appear as if some divine renegade snuck it into the canon. What Job takes as its central concern is the reality described by journalist Peter Trachtenberg in his devastating The Book of Calamities: Five Questions about Suffering, when he writes that “Everybody suffers: War, sickness, poverty, hunger, oppression, prison, exile, bigotry, loss, madness, rape, addiction, age, loneliness.” Job is what happens when we ask why those things are our lot with an honest but broken heart.

I’ve taught Job to undergraduates before, and I’ve sometimes been surprised by their lack of shock when it comes to what’s disturbing about the narrative. By way of synopsis, the book tells the story of a man who, the poem makes clear, is unassailably righteous, and how Satan, in his first biblical appearance (and counted as a son of God to boot), challenges the Creator, maintaining that Job’s piety is only a result of his material and familial well-being. The deity answers the devil’s charge by letting the latter physically and psychologically torture blameless Job, so as to demonstrate that the Lord’s human servant would never abjure Him. In Bar-Ilan University Bible professor Edward Greenstein’s masterful Job: A New Translation, the central character is soberly informed that “Your sons and your daughters were eating and drinking wine in/the house of their brother, the firstborn,/When here: a great wind came from across the desert;/it affected the four corners of the house;/it fell on the young people and they died.”—and the final eight words of the last line convey in their simplicity the defeated and personal nature of the tragedy. Despite the decimation of Job’s livestock, the death of his children, the rejection of his wife, and finally the contraction of an unsightly and painful skin ailment (perhaps boils), “Job did not commit-a-sin – /he did not speak insult” against God.

Job didn’t hesitate to speak against his own life, however. He bemoans his own birth, wishing that the very circumstances of his existence could be erased, declaring “Let the day disappear, the day I was born, /And the night that announced: A man’s been conceived!” Sublimely rendered in both their hypocrisy and idiocy are three “friends” (a later interpolation that is the basis of the canonical book adds a fourth) who come to console Job, but in the process they demonstrate the inadequacy of any traditional theodicy in confronting the nature of why awful things frequently happen to good people. Eliphaz informs Job that everything, even the seemingly evil, takes part in God’s greater and fully good plan, that the Lord “performs great things too deep to probe, /wondrous things, beyond number.” The sufferer’s interlocutor argues, as Job picks at his itchy boils with a bit of pottery, perhaps remembering the faces of his dead children when they were still infants, that God places “the lowly up high, /So the stooped attain relief.” Eliphaz, of whom we know nothing other than that he speaks like a man who has never experienced loss, is the friend who counsels us that everything works out in the end even when we’re at our child’s funeral.

Bildad, on the other hand, takes a different tack, arguing that if Job’s sons “committed a sin against him, /He has dispatched them for their offense,” the victim-blaming logic that from time immemorial has preferred to ask what the raped was wearing rather than why the rapist rapes. Zophar angrily supports the other two, and the latecomer Elihu emphasizes God’s mystery and Job’s impudence to question it. To all of these defenses of the Lord, Job responds that even “were I in the right, his mouth would condemn me. /(Even) were I innocent, he would wrong me… It is all the same. /And so, I declare:/The innocent and the guilty he brings to (the same) end.” In an ancient world where it’s taken as a matter of simple retributive ethics that God will punish the wicked and reward the just, Job’s realism is both more world-weary and more humane than the clichés offered by his fair-weather friends. “Why do the wicked live on and live well,/Grow old and gain in power and wealth?” asks Job, and from 500 BCE unto 2019 that remains the central question of ethical theology. As concerns any legitimate, helpful, or moving answer from his supposed comforters, Greenstein informs us that “They have nothing to say.”

Finally, in the most dramatic and figuratively adept portion of the book, God Himself arrives from a whirlwind to answer the charges of Job’s “lawsuit” (as Greenstein renders the disputation). The Lord never answers Job’s demand to know why he has been forced to suffer, answering instead with non sequiturs about His own awesomeness, preferring to rhetorically answer the pain of a father who has buried his children by asking “Where were you when I laid earth’s foundations?/Tell me – if you truly know wisdom!/Who set its dimensions? Do you know? /Who stretched the measuring line?… Have you ever reached the sources of Sea, /And walked on the bottom of Ocean?” God communicates in the rhetoric of sarcastic grandeur, mocking Job by saying “You must know… Your number of days is so many!” Of course, Job had never done any of those things, but an unempathetic God also can’t imagine what it would be like to have a Son die—at least not yet. But that is getting ahead of ourselves, and a reader can’t help but notice that for all of His poetry from the whirlwind, and all of His frustration at the intransigent questioning of Job, the Lord never actually answers why such misfortune has befallen this man. Rather, God continually emphasizes His greatness to Job’s insignificance, His power to Job’s feebleness, His eternity to Job’s transience.

The anonymous author’s brilliance is deployed in the book’s dramatic irony, for even if Job doesn’t know why he suffers, we know. Greenstein explains that readers “know from the prologue that Job’s afflictions derive from the deity’s pride, not from some moral calculus.” Eliphaz, Bildad, Zophar, and Elihu can pontificate about the unknowability of God’s reasons while Job is uncertain whether he’s done anything wrong that merits such treatment, but the two omniscient figures in the tale—God and the reader—know why the former did what He did. Because the Devil told him to. Finally, as if acknowledging His own culpability, God “added double to what Job had had,” which means double the livestock, and double the children. There is a cruelty in that, the grieving father expected simply to replace his dead family with a newer, shinier, fresher, and most of all alive brood. And so, both Job and God remain silent for the rest of the book. In the ordering of the Hebrew scriptures, God remains silent for the rest of the Bible, so that when “Job died old and sated with days” we might wonder if it isn’t the deity Himself who has expired, perhaps from shame. The wisdom offered in the book of Job is the knowledge that justice is sacred, but not divine; justice must be ours, meaning that our responsibility to each other is all the more important.

This is admittedly an idiosyncratic take on the work, and one that I don’t wish to project onto Greenstein. But the scholar does argue that a careful philological engagement with the original Hebrew renders the story far more radical than has normally been presented. Readers of Job have normally been on the side of his sanctimonious friends, and especially the Defendant who arrives out of His gassy whirlwind, but the author of Job is firmly on the side of the plaintiff. If everyone from medieval monks to the authors of Midrash, from Sunday school teachers to televangelists, has interpreted Job as being about the inscrutability of God’s plans and the requirement that we passively take our undeserved pain as part of providence, then Greenstein writes that “there is a very weak foundation in biblical parlance for the common rendering.” He argues that “much of what has passed as translation of Job is facile and fudged,” having been built upon the accumulated exegetical detritus of centuries rather than returning to the Aleph, Bet, and Gimmel of Job itself.

Readers of a more independent bent have perhaps detected sarcasm in Job’s response, or a dark irony in God’s restitution of new and better replacement children for the ones that He let Satan murder. For my fellow jaundiced and cynical heretics who have long maintained that Job still has some fight in him even after God emerges from the whirlwind to chastise him for daring to question the divine plan, Greenstein has good news. Greenstein writes that the “work is not mainly about what you thought it was; it is more subversive than you imagined; and it ends in a manner that glorifies the best in human values.” He compares a modern translation of Job 42:6, where Job speaks as a penitent before the Lord, exclaiming “Therefore I despise myself/and repent in dust and ashes.” Such a message seems straightforward—because he questions the divine plan, Job hates himself, and is appropriately humbled. Yet Greenstein contrasts the modern English with a more accurate version based on an Aramaic text found in the Dead Sea Scrolls, where the man of Uz more ambiguously says “Therefore I am poured out and boiled up, and I will become dust.” This isn’t a declaration of surrender; this is an acknowledgement of loss. Greenstein explains that while both the rabbis and the Church wished to see Job repentant and in surrender, that’s not what the original supports. Rather, “Job is parodying God, not showing him respect. If God is all about power and not morality and justice, Job will not condone it through acceptance.” Thus, when we write that the book’s subject is judging God, we should read the noun as the object and not the subject of that phrase.

As a work, the moral universe of Job is complex; when compared to other, older ancient Near Eastern works, even exemplary ones like Gilgamesh, its ambiguities mark it as a work of poetic sophistication. Traditionally dated from the period about a half-millennium before the Common Era, Job is identified with the literature of the Babylonian Exile, when the Jews had been conquered and forcibly exiled to the land of the Euphrates. Such historical context is crucial in understanding Job’s significance, for the story of a righteous man who suffered for no understandable reason mirrored the experience of the Jews themselves, while Job’s status as a gentile underscored a dawning understanding that God was not merely a deity for the Israelites, but that indeed his status was singular and universal. When the other gods are completely banished from heaven, however, the problem of evil rears its horned head, for when the Lord is One, who then must be responsible for our unearned pain?

Either the most subversive or the most truthful of scriptural books (maybe both), Job has had the power to move atheist and faithful alike, evidence for those who hate God and anxious warning for those that love Him. Former Jesuit Jack Miles enthuses in God: A Biography that “exegetes have often seen the Book of Job as the self-refutation of the entire Judeo-Christian tradition.” Nothing in the canon of Western literature, be it Sophocles’s Oedipus Rex or William Shakespeare’s Hamlet, quite achieves the dramatic pathos of Job. Consider its terrors, its ambiguities, its sense of injustice, and its impartation that “our days on earth are a shadow.” Nothing written has ever achieved such a sense of universal tragedy. After all, the radicalism of the narrative announces itself, for Job concerns itself with the time that God proverbially sold his soul to Satan in the service of torturing not just an innocent man, but a righteous one. And that when questioned on His justification for doing such a thing, the Lord was only able to respond by reiterating His own power in admittedly sublime but ultimately empty poetry. For God’s answer to Job’s question of theodicy is an explanation of how, but not an explanation of why—and when you’re left to scrape boils off yourself with a pottery shard after your 10 children have died in a collapsing house, that’s no explanation at all.

With Greenstein’s translation, we’re able to hear Job’s cry not in his native Babylonian, or the Hebrew of the anonymous poet who wrote his tale, or the Aramaic of Christ crucified on Golgotha, or even the stately if antiquated early modern English of the King James Version, but rather in a fresh, contemporary, immediate vernacular that makes the title character’s tribulations our own. Our Job is one who can declare “I am fed up,” and something about the immediacy of that declaration makes him our contemporary in ways that underscore the fact that billions are his proverbial sisters and brothers. Greenstein’s accomplishment makes clear a contention that is literary, philosophical, and religious: that the Book of Job is the most sublime masterpiece of monotheistic faith, because what its author says is so exquisitely rendered and so painfully true. Central to Greenstein’s mission is a sense of restoration, for Job is often taught and preached as simply being about humanity’s required humility before the divine, and the need to prostrate ourselves before a magnificent God whose reasons are inscrutable.

By restoring to Job its status as a subversive classic, Greenstein does service to the worshiper and not the worshiped, to humanity and not our oppressors. Any work of translation exists in an uneasy stasis between the original and the adaptation, a one-sided negotiation across millennia where the original author has no say. My knowledge of biblical Hebrew is middling at best, so I’m not suited to speak to the transgressive power of whoever the anonymous poet of Job was, but whatever words she or he chose, I can speak to Greenstein’s exemplary poetic sense. At its core, part of what makes this version of Job so powerful is how it exists in contrast to those English versions we’ve encountered before, from the sublime plain style of King James to the bubblegum of the Good News Bible. Unlike those traditional renderings, the words “God” and “Lord,” with their associations of goodness, appear nowhere in this translation. Rather, Greenstein keeps the original “Elohim” (which I should point out is plural), or the unpronounceable, vowelless tetragrammaton, rendered as “YHWH.” Job is made new through the deft use of the literary technique known as defamiliarization, making that which is familiar once again strange (and thus all the more radical and powerful).

Resurrecting the lightning bolt that is Job’s poetry does due service to the original. Job’s subject is not just theodicy, but the failures of poetry itself. When Job defends himself against the castigation of Eliphaz, Bildad, Zophar, and Elihu, it’s not just a theological issue, but a literary-critical one as well. The suffering man condemns their comforts and callousness, but also their clichés. That so much of what his supposed comforters say is drawn from biblical books like Psalms and Proverbs testifies to Job’s transgressive knack for scriptural self-reflection. As a poet, the author is able to carefully display exactly what she or he wishes to depict, an interplay between the presented and hidden that has an uncanny magic. When the poet writes that despite all of Job’s tribulations, “Job did not commit-a-sin with his lips,” it forces us to ask if he committed sin in his heart and his head, if his soul understandably screamed out at God’s injustice while his countenance remained pious. This conflict between inner and outer appearance is the logic of the novelist, a type of interiority that we associate more with Gustave Flaubert or Henry James than we do with the Bible.

When it comes to physical detail, the book is characteristically minimalist. Like most biblical writing, the author doesn’t present much of a description of either God or Satan, though that makes their presentation all the more eerie and disturbing. When God asks Satan where he has been, the arch-adversary responds “From roving the earth and going all about it.” Satan’s first introduction in all of Western literature, for never before was that word used for the singular character, thus brings with it those connotations of ranging, stalking, and creeping that have so often accrued about the reptilian creature who is always barely visible out of the corner of our eyes. Had we been given more serpentine exposition on the character, cloven hooves and forked tails, it would lack the unsettling nature of what’s actually presented. But when the author wants to be visceral, she or he certainly can be. Few images are as enduring in their immediacy as that of Job, who “took a potsherd with which to scratch himself as he sits in/the ash-heap.” His degradation, his tribulation, his shame still resonate 2,500 years later.

The trio (and later the quartet) of Job’s judgmental colleagues render theodicy as a poetry of triteness, while Job’s poetics of misery is commensurate with the enormity of his fate. “So why did you take me out of the womb?” he demands of God, “Would I had died, with no eye seeing me!/Would I had been as though I had been;/Would I had been carried from womb to tomb”—here Greenstein borrowing a favored rhyming congruence from the arsenal of English’s Renaissance metaphysical poets. Eliphaz and Elihu offer maudlin bromides, but Job can describe those final things with a proficiency that shows how superficial his friends are. Job fantasizes about death as a place “I go and do not return, /To a land of darkness and deep-shade;/A land whose brightness is like pitch-black, /Deep-shade and disorder;/That shines like pitch-black.” That contradictory image, of something shining in pitch-black, is an apt definition of God Himself, who, while He may be master of an ordered, fair, and just universe in most of the Bible, is in Job the creator of a “fundamentally amoral world,” as Greenstein writes.

If God from the whirlwind provides better poetry than his defenders could, His theology is just as empty and callous. Greenstein writes that “God barely touches on anything connected to justice or the providential care of humanity,” and it’s precisely the case that for all of His literary power, the Lord dodges the main question. God offers descriptions of the universe’s creation and of the maintenance of all things that order reality; He conjures the enormities of size and time, and even provides strangely loving descriptions of his personal pets, the fearsome hippopotamus Behemoth and the terrifying whale/alligator Leviathan. Yet for all of that detail, exquisitely rendered, God never actually answers Job. Bildad or Elihu would say that God has no duty to explain himself to a mere mortal like Job, that the man of Uz deserves no justification for his punishment in a life that none of us ever chose to begin. That, however, obscures the reality that even if Job doesn’t know the reasons behind what happened, we certainly know.

Greenstein’s greatest contribution is making clear that not only does God not answer Job’s pained query, but that the victim knows that fact. And he rejects it. Job answers God with “I have spoken once and I will not repeat…I will (speak) no more.” If God answers with hot air from the whirlwind, the soon-to-be-restored Job understands that silent witness is a more capable defense against injustice, that quiet is the answer to the whims of a capricious and cruel Creator. God justifies Himself by bragging about His powers, but Job tells him “I have known you are able to do all;/That you cannot be blocked from any scheme.” Job has never doubted that God has it within his power to do the things that have been done; rather, he wishes to understand why they’ve been done, and all of the beautiful poetry marshaled from that whirlwind can’t do that. The Creator spins lines and stanzas as completely as He once separated the night from the day, but Job covered in his boils couldn’t care less. “As a hearing by ear I have heard you,” the angry penitent says, “And now my eye has seen you. That is why I am fed up.”

Traditional translations render the last bit so as to imply that a repentant Job has taken up dust and ashes in a pose of contrition, but the language isn’t true to the original. It is more correct, Greenstein writes, to see Job as saying “I take pity on ‘dust and ashes,’” that Job’s sympathy lies with humanity, that Job stands not with God but with us. Job hated not God, but rather His injustice, and such a position is a form of love for everybody else. Miles places great significance on the fact that the last time the deity speaks (at least according to the ordering of books in the Hebrew Scriptures) is from the whirlwind. According to Miles, when the reality of Job’s suffering is confronted by the empty beauty of the Lord’s poetry, a conclusion can be drawn: “Job has won. The Lord has lost.” That God never speaks to man again (at least in the ordering of the Hebrew Scriptures) is a type of embarrassed silence, and for Miles the entirety of the Bible is the story of how humanity was able to overcome God, to overcome our need of God. Job is, in part, about the failure of beautiful poetry when confronted with loss, with pain, with horror, with terror, with evil. After Job there can be no poetry. But if Job implies that there can be no poetry, it paradoxically still allows for prayer. Job in his defiance of God teaches us a potent lesson: that dissent is the highest form of prayer, for what could possibly take the deity more seriously? In challenging, castigating, and criticizing the Creator, Job fulfills the highest form of reverence that there possibly can be.


The Universe in a Sentence: On Aphorisms

“A fragment ought to be entirely isolated from the surrounding world like a little work of art and complete in itself like a hedgehog.” —Friedrich Schlegel, Athenaeum Fragments (1798)

“I dream of immense cosmologies, sagas, and epics all reduced to the dimensions of an epigram.” —Italo Calvino, Six Memos for the Next Millennium (1988)

From its first capital letter to the final period, an aphorism is not a string of words but rather a manifesto, a treatise, a monograph, a jeremiad, a sermon, a disputation, a symposium. An aphorism is not a sentence, but rather a microcosm unto itself; an entrance through which a reader may walk into a room the dimensions of which even the author may not know. Our most economic and poetic of prose forms, the aphorism does not feign argumentative completism like the philosophical tome, nor does it compel certainty as does the commandment—the form is cagey, playful, and mysterious. To either find an aphorism in the wild, or to peruse examples in a collection that mounts them like butterflies nimbly held in place with push-pin on Styrofoam, is to have a literary-naturalist’s eye for the remarkable, for the marvelous, for the wondrous. And yet there has been, at least until recently, a strange critical lacuna as concerns aphoristic significance. Scholar Gary Morson writes in The Long and Short of It: From Aphorism to Novel that though they “constitute the shortest [of] literary genres, they rarely attract serious study. Universities give courses on the novel, epic, and lyric…But I know of no course on…proverbs, wise sayings, witticisms and maxims.”

An example of literary malpractice, for to consider an aphorism is to imbibe the purest distillation of a mind contemplating itself. In an aphorism every letter and word counts; every comma and semicolon is an invitation for the reader to discover the sacred contours of her own thought. Perhaps answering Morson’s observation, critic Andrew Hui writes in his new study A Theory of the Aphorism: From Confucius to Twitter that the form is “Opposed to the babble of the foolish, the redundancy of bureaucrats, the silence of mystics, in the aphorism nothing is superfluous, every word bears weight.” An aphorism isn’t a sentence—it’s an earthquake captured in a bottle. It isn’t merely a proverb, a quotation, an epigraph, or an epitaph; it’s fire and lightning circumscribed by the rules of syntax and grammar, where rhetoric itself becomes the very stuff of thought. “An aphorism,” Friedrich Nietzsche aphoristically wrote, “is an audacity.”

If brevity and surprising disjunction are the elements of the aphorism, then in some ways we’re living in a veritable renaissance of the form, as all of the flotsam of our fragmented digital existence from texting to Twitter compels us toward the ambiguous and laconic (even while obviously much of what’s produced is sheer detritus). Hui notes the strangely under-theorized nature of the aphorism, observing that at a “time when a presidency can be won and social revolutions ignited by 140-character posts…an analysis of the short saying seems to be as crucial as ever” (not that we’d put “covfefe” in the same category as Blaise Pascal).

Despite the subtitle to Hui’s book, A Theory of the Aphorism thankfully offers neither plodding history nor compendium of famed maxims. Rather Hui presents a critical schema to understand what exactly the form does, with its author positing that the genre is defined by “a dialectical play between fragments and systems.” Such a perspective on aphorism sees it not as the abandoned step-child of literature, but rather the very thing itself, a discursive and transgressive form of critique that calls into question all received knowledge. Aphorism is thus both the substance of philosophy and the joker that calls the very idea of literature and metaphysics into question.  The adage, the maxim, and the aphorism mock completism. Jesus’s sayings denounce theology; Parmenides and Heraclitus deconstruct philosophy. Aphoristic thought is balm and succor against the rigors of systemization, the tyranny of logic. Hui writes that “aphorisms are before, against, and after philosophy.” Before, against, and after scripture too, I’ll add. And maybe everything else as well. An aphorism may feign certainty, but the very brevity of the form ensures it’s an introduction, even if its rhetoric wears the costume of conclusion.

Such is why the genre is so associated with gnomic philosophy, for any accounting of aphoristic origins must wend its way through the fragments of the pre-Socratic metaphysicians, the wisdom literature of the ancient Near East, and the Confucian and Taoist scholars of China. All of these disparate phenomena can loosely be placed during what the German philosopher Karl Jaspers called the “Axial Age,” a half-millennium before the Common Era when human thought began to move towards the universal and abstract. From that era (as very broadly constituted) we have Heraclitus’s “Nature loves to hide,” Lao-Tzu’s “The spoken Tao is not the real Tao,” and Jesus Christ’s “The kingdom of God is within you.” Perhaps the aphorism was an engine for that intellectual transformation, the sublimities of paradox breaking women and men out of the parochialism that marked earlier ages. Regardless of why the aphorism’s birth coincides with that pivotal moment in history, that era was the incubator for some of our earliest (and greatest) examples.

For the Greek philosophers who pre-date Socrates, and more importantly his systematic advocate Plato, metaphysics was best done in the form of cryptic utterance. It shouldn’t be discounted that the gnomic quality of thinkers like Parmenides, Heraclitus, Democritus, and Zeno might be due to the disappearance of their full corpus over the past 2,500 years. Perhaps erosion of stone and fraying of papyrus have generated such aphorisms. Entropy is, after all, our final and most important teacher. Nevertheless, aphorism is rife in the pre-Socratic philosophy that remains, from Heraclitus’s celebrated observation that “You can’t step into the same river twice” to Parmenides’s exactly opposite contention that “It is indifferent to me where I am to begin, for there shall I return again.” Thus is identified one of the most difficult qualities of the form—that it’s possible to say conflicting things and that by virtue of how you say them you’ll still sound wise. A dangerous form, the aphorism, for it can confuse rhetoric for knowledge. Yet perhaps that’s too limiting a perspective, and maybe it’s better to think of the chain of aphorisms as a great and confusing conversation; a game in which both truth and its opposite can still be true.

A similar phenomenon is found in wisdom literature, a mainstay of Jewish and Christian writing from the turn of the Common Era, as embodied both in canonical scripture such as Job, Ecclesiastes, Proverbs, and Song of Songs, and in more exotic apocryphal books known for their sayings like The Gospel of Thomas. Wisdom literature was often manifested in the form of a listing of sayings or maxims, and since the 19th century, biblical scholars who advocated the so-called “two-source hypothesis” have argued that the earliest form of the synoptic New Testament gospels was an (as yet undiscovered) document known simply as “Q,” which consisted of nothing but aphorisms spoken by Christ. This conjectured collection of aphorisms theoretically became the basis for Matthew and Luke, who (borrowing from Mark) constructed a narrative around the bare-bones sentences. Similarly, the eccentric, hermetic, and surreal Gospel of Thomas, which the book of John was possibly written in riposte to, is an example of wisdom literature at its purest, an assemblage of aphorisms somehow both opaque and enlightening. “Split a piece of wood; I am there,” cryptically says Christ in the hidden gospel only rediscovered at Nag Hammadi, Egypt, in 1945, “Lift up the stone, and you will find me there.”

From its Axial-Age origins, the aphorism has a venerable history. Examples were transcribed in the self-compiled commonplace books of the Middle Ages and the Renaissance, in which students were encouraged to record their favorite fortifying maxims for later consultation. The logic behind such exercises was, as anthologizer John Gross writes in The Oxford Book of Aphorisms, that such sayings “tease and prod the lazy assumptions lodged in the reader’s mind; they warn us how insidiously our vices can pass themselves off as virtues; they harp shamelessly on the imperfections and contradictions which we would rather ignore.” Though past generations were instructed on proverbs and maxims, today “Aphorisms are often derided as trivial,” as Aaron Haspel writes in the introduction to Everything: A Book of Aphorisms, despite the fact that “most people rule their lives with four or five of them.”

The last five centuries have seen no shortage of aphorists who are the originators of those four or five sayings that you might live your life by, gnomic authors who speak in the prophetic utterance of the form like Desiderius Erasmus, François de La Rochefoucauld, Pascal, Benjamin Franklin, Voltaire, William Blake, Nietzsche, Ambrose Bierce, Oscar Wilde, Gustave Flaubert, Franz Kafka, Ludwig Wittgenstein, Dorothy Parker, Theodor Adorno, and Walter Benjamin. Hui explains that “Aphorisms are transhistorical and transcultural, a resistant strain of thinking that has evolved and adapted to its environment for millennia. Across deep time, they are vessels that travel everywhere, laden with freight yet buoyant.” In many ways, modernity has proven an even more prodigious environment for the digressive and incomplete form, standing both in opposition to the systemization of knowledge that has defined the last half-millennium, while also embodying the aesthetic of fragmented bricolage that sometimes seems as if it were our birthright. Contemporary master of the form Yahia Lababidi writes in Where Epics Fail: Meditations to Live By that “An aphorist is not one who writes in aphorisms but one who thinks in aphorisms.” In our fractured, fragmented, disjointed time, where we take wisdom where we find it, we’re perhaps all aphorists, because we can’t think in any other way.

Anthologizer James Lough writes in his introduction to Short Flights: Thirty-Two Modern Writers Share Aphorisms of Insight, Inspiration, and Wit that “Our time is imperiled by tidal waves of trivia, hectored by a hundred emails a day, colored by this ad slogan or that list of things to do, each one sidetracking our steady, focused awareness.” What might seem to be one of the most worrying detriments of living in post-modernity—our digitally slackening attention spans and inattention to detail—is the exact same quality that allows contemporary aphorists the opportunity to dispense arguments, insight, enlightenment, and wisdom in a succinct package, what Lough describes as a “quickly-digested little word morsel, delightful and instructive, that condenses thought, insight, and wordplay.”

Our century has produced brilliant aphorists who have updated the form while making use of its enduring and universal qualities of brevity, metaphor, arresting detail, and the mystery that can be implied by a few short words that seem to gesture towards something slightly beyond our field of sight, and who embody Gross’s description of the genre as one which exemplifies a “concentrated perfection of phrasing which can sometimes approach poetry in its intensity.” Authors like Haspel, Lababidi, Don Paterson in The Fall at Home: New and Selected Aphorisms, and Sarah Manguso in 300 Arguments may take as their subject matter issues of current concern, from the Internet to climate change, but they do it in a form that wouldn’t seem out of place on a bit of frayed pre-Socratic papyrus.

Consider the power and poignancy of Manguso’s maxim “Inner beauty can fade, too.” In only five words, and one strategically placed comma that sounds almost like a reserved sigh, Manguso demonstrates one of the uncanniest powers of the form: it can remind you of something that you’re already innately aware of, something that relies on the nature of the aphorism to illuminate that which we’d rather obscure. Reading 300 Arguments is like this. Manguso bottles epiphany, the strange acknowledgment of encountering that which you always knew but could never quite put into words yourself, like discovering that the gods share your face. Lababidi does something similar in Where Epics Fail, giving succinct voice to the genre’s self-definition, writing that “Aphorisms respect the wisdom of silence by disturbing it, but briefly.” Indeed, self-referentiality is partially at the core of modern aphorisms; many of Paterson’s attempts are maxims considering themselves, like an ouroboros biting its tail. In the poet’s not unfair estimation, an aphorism is “Hindsight with murderous purpose.”

Lest I be accused of uncomplicated enthusiasms in offering an encomium for the aphorism, let it be said that the form can be dangerous, that it can confuse brevity and wit with wisdom and knowledge, that rhetoric (as has been its nature for millennia) can pantomime understanding as much as express it. Philosopher Julian Baggini makes the point in Should You Judge This Book by Its Cover: 100 Takes on Familiar Sayings and Quotations, writing that aphorisms “can be too beguiling. They trick us into thinking we’ve grasped a deep thought by their wit and brevity. Poke them, however, and you find they ride roughshod over all sorts of complexities and subtleties.” The only literary form where rhetoric and content are as fully unified as the aphorism is arguably poetry proper, but ever fighting Plato’s battle against the versifiers, Baggini is correct that there’s no reason why an expression in anaphora or chiasmus is more correct simply because the prosody of its rhetoric pleases our ears.

Economic statistician Nassim Nicholas Taleb provides ample representative examples in The Bed of Procrustes: Philosophical and Practical Aphorisms of how a pleasing adage need not be an accurate statement. There’s an aesthetically harmonious tricolon in his contention that the “three most harmful addictions are heroin, carbohydrates, and a monthly salary,” and though of those I’ve only ever been addicted to bread, pizza, and pasta, I’ll say from previous addictions that I could add a few more harmful vices to Taleb’s list. Or, when he opines that “Writers are remembered for their best work, politicians for their worst mistakes, and businessmen are almost never remembered,” I have to object. I’d counter Taleb’s pithy aphorism by pointing out that both John Updike and Philip Roth are remembered for their most average writing, that an absurd preponderance of things in our country are named for Ronald Reagan, whose entire political career was nothing but mistakes, and that, contra the claim that businessmen are never remembered, my entire Pittsburgh upbringing, where everything is named after a Carnegie, Frick, Mellon, or Heinz, demonstrates otherwise. The Bed of Procrustes is like this, full of sentences that sound as if they came from Delphi, but when you spend a second contemplating Taleb’s claim that “If you find any reason why you and someone are friends, you are not friends,” you’ll come to the conclusion that, far from being an experience-tested maxim of wisdom, it’s simply inaccurate.

Critic Susan Sontag made one of the best arguments against aphoristic thinking in her private writings, as published in As Consciousness Is Harnessed to Flesh: Journals and Notebooks, 1964-1980, claiming that “An aphorism is not an argument; it is too well-bred for that.” She makes an important point—that the declarative confidence of the aphorism can serve to announce any number of inanities and inaccuracies as if they were true by simple fiat. Sontag writes that “Aphorism is aristocratic thinking: this is all the aristocrat is willing to tell you; he thinks you should get it fast, without spelling out all the details.” Providing a crucial counter-position to the uncomplicated celebration of all things aphoristic, Sontag rightly observes that “To write aphorisms is to assume a mask—a mask of scorn, of superiority,” which would certainly seem apropos when encountering the claim that if you can enumerate why you enjoy spending time with a friend they’re not really your friend. “We know at the end,” Sontag writes, that “the aphorist’s amoral, light point-of-view self-destructs.”

There is an irony in Sontag’s critique of aphorisms, for embedded within her prose, like any number of shining stones found in a muddy creek, are sentences that themselves would make great prophetic adages. Aphorisms are like that though; even with Sontag’s and Baggini’s legitimate criticism of the form’s excesses, we can’t help but approach, consider, think, and understand in that genre for which brevity is the essence of contemplation. In a pose of self-castigation, Paterson may have said that the “aphorism is already a shadow of itself,” but I can’t reject that microcosm, that inscribed reality within a few words, that small universe made cunningly. Even with all that I know about the risks of rhetoric, I cannot pass sentence on the sentence. Because an aphorism is open-ended, it is disruptive; as such, it doesn’t preclude, but rather opens; the adage both establishes and abolishes its subject, simultaneously. Hui writes that the true subject of the best examples of the form is “The infinite,” for “either the aphorism’s meaning is inexhaustible or its subject of inquiry—be it God or nature of the self—is boundless.”

In The Aphorism and Other Short Forms, author Ben Grant writes that in the genre “our short human life and eternity come together, for the timelessness of the truth which the aphorism encapsulates can only be measured against our own ephemerality, of which the brevity of the aphorism serves as an apt expression.” I agree with Grant’s contention, but I would amend one thing—the word “truth.” Perhaps that’s what’s problematic about Baggini’s and Sontag’s criticism, for we commit a category mistake when we assume that aphorisms exist only to convey some timeless verity. Rather, I wonder if what Hui describes as the “most elemental of literary forms,” those “scattered lines of intuition…[moving by] arrhythmic leaps and bounds” underscored by “an atomic quality—compact yet explosive” aren’t defined by the truth, but rather by play.

Writing and reading aphorisms is the play of contemplation, the joy of improvisation; such play is the very nature of the aphorism. We read such sayings not for the finality of truth, but for the possibilities of maybe. For a short investment of time and an economy of words we can explore the potential of ideas that, even if inaccurate, would sink far longer and more self-serious works. That is the form’s contribution. An aphorism is all wheat and no chaff, all sweet and no gaffe.


The Lion and the Eagle: On Being Fluent in “American”

“How tame will his language sound, who would describe Niagara in language fitted for the falls at London bridge, or attempt the majesty of the Mississippi in that which was made for the Thames?” —North American Review (1815)

“In the four quarters of the globe, who reads an American book?” —Edinburgh Review (1820)

Turning from an eastern dusk and towards a western dawn, Benjamin Franklin miraculously saw a rising sun on the horizon after having done some sober demographic calculations in 1751. Not quite yet the aged, paunchy, gouty, balding raconteur of the American Revolution, but rather only the slightly paunchy, slightly gouty, slightly balding raconteur of middle age, Franklin examined the data concerning the births, immigration, and quality of life in the English colonies by contrast to Great Britain. In his Observations Concerning the Increase of Mankind, Franklin noted that, a century hence, “the greatest Number of Englishmen will be on this Side of the Water” (while also taking time to make a number of patently racist observations about a minority group in Pennsylvania—the Germans). For the scientist and statesman, such magnitude implied inevitable conclusions about empire.


Whereas London and Manchester were fetid, crowded, stinking, chaotic, and over-populated, Philadelphia and Boston were expansive, fresh, and had room to breathe. In Britain land was at a premium, but in America there was the seemingly limitless expanse stretching towards an unimaginable West (which was, of course, already populated by people). In the verdant fecundity of the New World, Franklin imagined (as many other colonists did) that a certain “Merry Old England” that had been supplanted in Europe could once again be resurrected, a land defined by leisure and plenty for the largest number of people. Such thoughts occurred, explains Henry Nash Smith in his classic study Virgin Land: The American West as Symbol and Myth, because “the American West was nevertheless there, a physical fact of great if unknown magnitude.”

As Britain expanded into that West, Franklin argued that the empire’s ambitions should shift from the nautical to the agricultural, from the oceanic to the continental, from sea to land. Smith describes Franklin as a “far-seeing theorist who understood what a portentous role North America” might play in a future British Empire. Not yet an American, but still an Englishman—and the gun smoke of Lexington and Concord still 24 years away—Franklin enthusiastically prophesies in his pamphlet that “What an Accession of Power to the British Empire by Sea as well as Land!”

A decade later, he’d write in a 1760 missive to one Lord Kames that “I have long been of opinion, that the foundations of the future grandeur and stability of the British empire lie in America.” And so, with some patriotism as a trueborn Englishman, Benjamin Franklin could perhaps imagine the Court of St. James transplanted to the environs of the Boston Common, the Houses of Parliament south of Manhattan’s old Dutch Wall Street, the residence of George II of Great Britain moved from Westminster to Philadelphia’s Southwest Square. After all, it was the Anglo-Irish philosopher and poet George Berkeley, writing his lyric “Verses on the Prospect of Planting Arts and Learning in America” in his Rhode Island manse, who could gush that “Westward the course of empire takes its way.”

But even if aristocrats could perhaps share Franklin’s ambitions for a British Empire that stretched from the white cliffs of Dover to San Francisco Bay (after all, christened “New Albion” by Francis Drake), the idea of moving the capital to Boston or Philadelphia seemed anathema. Smith explained that it was “asking too much of an Englishman to look forward with pleasure to the time when London might become a provincial capital taking orders from an imperial metropolis somewhere in the interior of North America.” Besides, it was a moot point, since history would intervene. Franklin’s pamphlet may have been written a quarter century before, but the gun smoke of Lexington and Concord was wafting, and soon the idea of a British capital on American shores seemed both an alternative history and a historical absurdity.

There is something both evocative and informative, however, in this counterfactual: imagining a retinue of the Queen’s Guard processing down the neon skyscraper canyon of Broadway, or the dusk splintering off of the gold dome of the House of Lords at the crest of Beacon Hill overlooking Boston Common. Don’t mistake my enthusiasm for this line of speculation as evidence of a reactionary monarchism; I’m very happy that such a divorce happened, even if less than amicably. What does fascinate me is the way in which the cleaving of America from Britain affected how we understand each other, the ways in which we became “two nations divided by a common language,” as variously Mark Twain and George Bernard Shaw have been reported to have waggishly uttered.

Even more than the relatively uninteresting issue that the British spell their words with too many u’s, or say weird things like “lorry” when they mean truck, or “biscuit” when they mean cookie, is the more theoretical but crucial issue of how national literatures are defined. Why are American and British literature two different things if they’re mostly written in English, and how exactly do we delineate those differences? It can seem arbitrary that the supreme Anglophiles Henry James and T.S. Eliot are (technically) Americans, and that their British counterparts W.H. Auden and Dylan Thomas can seem so fervently Yankee. Then there is what we do with the early folks: was the “tenth muse,” colonial poet Anne Bradstreet, British because she was born in Northampton, England, or American because she died in Ipswich, Mass.? Was Thomas More “American” because Utopia is in the Western Hemisphere, was Shakespeare a native because he dreamt of The Tempest? Such divisions make us question how language relates to literature, how literature interacts with nationality, and what that says vis-à-vis the differences between Britain and America, the lion and the eagle.

A difficulty emerges in separating two national literatures that share a common tongue, and that’s because traditionally literary historians equated “nation with a tongue,” as critic William C. Spengemann writes in A New World of Words: Redefining Early American Literature, explaining how what gave French literature or German literature a semblance of “identity, coherence, and historical continuity” was that they were defined by language and not by nationality. By such logic, if Dante was an Italian writer, Jean-Jacques Rousseau a French one, and Franz Kafka a German author, then it was because those were the languages in which they wrote, despite their being, respectively, Florentine, Swiss, and Czech.

Americans, however, largely speak in the tongue of the country that once held dominion over the thin sliver of land that stretched from Maine to Georgia, and as far west as the Alleghenies. Thus, an unsettling conclusion has to be drawn: perhaps the nationality of Nathaniel Hawthorne, Herman Melville, and Emily Dickinson was American, but the literature they produced would be English, so that with horror we realize that Walt Whitman’s transcendent “barbaric yawp” would be growled in the Queen’s tongue. Before he brilliantly complicates that logic, Spengemann sheepishly concludes, “writings in English by Americans belong, by definition, to English literature.”

Sometimes misattributed to linguist Noam Chomsky, it was actually the scholar of Yiddish Max Weinreich who quipped that a “language is a dialect with an army and a navy.” If that’s the case, then a national literature is the bit of territory that that army and navy police. But what happens when that language is shared by two separate armies and navies? To what nation does the resultant literature then belong? No doubt there are other countries where English is the lingua franca, yet all of this anxiety over the difference between British and American literature doesn’t seem quite so involved as regards British literature and its relationship to poetry and prose from Canada, Australia, or New Zealand, for that matter.

Sometimes those national literatures, and especially English writing from former colonies like India and South Africa, are still folded into “British Literature.” So Alice Munro, Les Murray, Clive James, J.M. Coetzee, and Salman Rushdie could conceivably be included in anthologies or survey courses focused on “British Literature”—though they’re Canadian, Australian, South African, and Indian—whereas it would seem absurd to similarly include William Faulkner and Toni Morrison in the same course. Some anthologizers who are seemingly unaware that Ireland isn’t Great Britain will even include James Joyce and W.B. Yeats alongside Gerard Manley Hopkins and D.H. Lawrence as somehow transcendentally “English,” but the similar (and honestly less offensive) audacity of including Robert Lowell or Sylvia Plath as “English” writers is unthinkable.

Perhaps it’s simply a matter of shared pronunciation, the superficial similarity of accent that makes us lump Commonwealth countries (and Ireland) together as “English” literature, but something about that strict division between American and British literature seems more deliberate to me, especially since they ostensibly do share a language. There’s an irony in that, though, for as a sad and newly independent American said in 1787, “a national language answers the purpose of distinction: but we have the misfortune of speaking the same language with a nation, who, of all people in Europe, have given, and continue to give us the fewest proofs of love.”

Before embracing English, or pretending that “American” was something different from English, both the Federal Congress and several state legislatures considered making variously French, German, Greek, and Hebrew our national tongue, before wisely rejecting the idea of one preferred language. So unprecedented and so violent was the breaking of America with Britain (it did, after all, signal the end of the first British Empire), and so conscious was the construction of a new national identity in the following century, that it seems inevitable that these new Federalists would also declare an independence for American literature.

One way this was done, shortly after the independence of the Republic, is exemplified by the fantastically named New York Philological Society, whose 1788 charter proclaimed their founding “for the purpose of ascertaining and improving the American Tongue.” If national literatures were defined by language, and America’s language was already the inheritance of the English, well then, these brave philologists would just have to transform English into American. Historian Jill Lepore writes in A Is for American: Letters and Other Characters in the Newly United States that the society was created in the “aftermath of the bloody War for Independence” in the hope that “peacetime America would embrace language and literature and adopt…a federal, national language.”

Lepore explains the arbitrariness of national borders; their contingency in the time period when the United States was born; and the manner in which, though language is tied to nationality, such an axiom is shot through with ambiguity, for countries are diverse of speech and the majority of a major language’s speakers historically don’t reside in the birthplace of that tongue. For the New York Philological Society, it was imperative to come up with the solution of an American tongue, to “spell and speak the same as one another, but differently from people in England.” So variable were (and are) the dialects of the United States, from the non-rhotic dropped “r” of Boston and Charleston, which in the 18th century was just developing in imitation of English merchants, to the guttural, piratical swagger of Western accents in the Appalachians, that one lexicographer hoped to systematize such diversity into a unique American language. As one of those assembled scholars wrote, “Language, as well as government should be national…America should have her own distinct from all the world.” His name was Noah Webster and he intended his American Dictionary to birth an entirely new language.

It didn’t of course, even if “u” fell out of “favour.” Such was what Lepore describes as an orthographic declaration of independence, for “there iz no alternativ” as Webster would write in his reformed spelling. But if Webster’s goal was the creation of a genuine new language, he inevitably fell short, for the fiction that English and “American” are two separate tongues is as political as is the claim that Bosnian and Serbian, or Swedish and Norwegian, are different languages (or that English and Scots are the same, for that matter). British linguist David Crystal writes in The Stories of English that “The English language bird was not freed by the American manoeuvre,” his Anglophilic spelling a direct rebuke to Webster, concluding that the American speech “hopped out of one cage into another.”  

Rather, the intellectual class turned towards literature itself to distinguish America from Britain, seeing in the establishment of writers who surpassed Geoffrey Chaucer or Shakespeare a national salvation. Melville, in his consideration of Hawthorne and American literature more generally, predicted in 1850 that “men not very much inferior to Shakespeare are this day born on the banks of the Ohio.” Three decades before that, the critic Sydney Smith had a different evaluation of the former colonies and their literary merit, snarking interrogatively in the Edinburgh Review that “In the four quarters of the globe, who reads an American book?” In the estimation of the Englishmen (and Scotsmen) who defined the parameters of the tongue, Joel Barlow, Philip Freneau, Hugh Henry Brackenridge, Washington Irving, Charles Brockden Brown, and James Fenimore Cooper couldn’t compete with the sublimity of British literature.

Yet, by 2018, British critics weren’t quite as smug as Smith had been, as more than 30 authors protested the decision four years earlier to allow Americans to be considered for the prestigious Man Booker Prize. In response to the winning of the prize by George Saunders and Paul Beatty, English author Tessa Hadley told The New York Times “it’s as though we’re perceived…as only a subset of U.S. fiction, lost in its margins and eventually, this dilution of the community of writers plays out in the writing.” Who in the four corners of the globe reads an American novel? Apparently, the Man Booker committee. But Hadley wasn’t, in a manner of speaking, wrong in her appraisal. By the 21st century, British literature had become just a branch of American literature. The question is how did we get here?

Only a few decades after Smith wrote his dismissal, writers would start to prove Melville’s contention, including of course the author of Benito Cereno and Billy Budd himself. In the decade before the Civil War there was a Cambrian Explosion of American letters, as suddenly Romanticism had its last and most glorious flowering in the form of Ralph Waldo Emerson, Henry David Thoreau, Hawthorne, Melville, Emily Dickinson, and of course Walt Whitman. Such was the movement whose parameters were defined by the seminal scholar F.O. Matthiessen in his 1941 classic American Renaissance: Art and Expression in the Age of Emerson and Whitman, which included studies of all of those figures save (unfortunately) Dickinson.

The Pasadena-born-but-Harvard-trained Matthiessen remains one of the greatest professors to ever ask his students to close-read a passage of Emerson. American Renaissance was crucial in discovering American literature as something distinct from British literature and great in its own right. Strange to think now, but for most of the 20th century the teaching of American literature was left either to history classes or to the nascent (State Department-funded) discipline of American Studies. English departments, true to the very name of the field, tended to see American poetry and novels as beneath consideration in the study of “serious” literature. The general attitude of American literary scholars about their own national literature can be summed up by Victorian critic Charles F. Richardson, who, in his 1886 American Literature, opined that “If we think of Shakespeare, Bunyan, Milton, the seventeenth-century choir of lyrists, Sir Thomas Browne, Jeremy Taylor, Addison, Swift, Dryden, Gray, Goldsmith, and the eighteenth-century novelists…what shall we say of the intrinsic worth of most of the books written on American soil?” And that was a book by an American about American literature!

On the eve of World War II, Matthiessen approached his subject in a markedly different way, and while scholars had granted the field a measure of respectability, American Renaissance was chief in radically altering the manner in which we thought about writing in English (or “American”). Matthiessen surveyed the diversity of the 1850s from the repressed Puritanism of Hawthorne’s The Scarlet Letter to the Pagan-Calvinist cosmopolitan nihilism of Moby-Dick and the pantheistic manual that is Thoreau’s Walden in light of the exuberant homoerotic mysticism of Whitman’s Leaves of Grass. Despite such diversity, Matthiessen concluded that the “one common denominator of my five writers…was their devotion to the possibilities of democracy.” Matthiessen was thanked for his discovery of American literature by being hounded by the House Un-American Activities Committee over his left-wing politics, until the scholar jumped from the 12th floor of Boston’s Hotel Manger in 1950. The critic’s commitment to democracy was all the more poignant in light of his death, for Matthiessen understood that it’s in negotiation, diversity, and collaboration that American literature truly distinguished itself. Here was an important truth, what Spengemann explains when he argues that American literature is defined not by the language in which it is written (as with British literature), but rather that “American literature had to be defined politically.”

When considering the American Renaissance, an observer might be tempted to whisper “Westward the course of empire,” indeed. Franklin’s literal dream of an American capital to the British Empire never happened. Yet by the decade when the United States would wrench itself apart in an apocalyptic civil war, America would ironically become the figurative capital of the English language. From London the energy, vitality, and creativity of the English language would move westward to Massachusetts in the twilight of Romanticism, and after a short sojourn in Harvard and Copley Squares, Concord and Amherst, it would migrate towards New York City with Whitman, a city which, if not the capital of the English, became the capital of the English language. From the 1850s onward, British literature became a regional variation of American literature, a branch on the latter’s tree, a mere phylum in its kingdom.

English literary critics of the middle part of the 19th century didn’t note this transition; it’s arguable whether British reviewers in the middle part of the 20th century did either, but if they didn’t, it’s an act of critical malpractice. For who would trade the epiphanies in ballad-meter that are the lyrics of Dickinson in favor of the arid flatulence of Alfred, Lord Tennyson? Who would reject the maximalist experimentations of Melville for the reactionary nostalgia of chivalry in Walter Scott? Something ineffable crossed the Atlantic during the American Renaissance, and the old problem of how we could call American literature distinct from British when both are written in English was solved—the latter is just a subset of the former.

Thus, Hadley’s fear is a reality, and has been for a while. Decades before Smith mocked the pretensions of American genius, the English gothic novelist (and son of the first prime minister) Horace Walpole wrote, in a 1774 letter to Horace Mann, that the “next Augustan age will dawn on the other side of the Atlantic. There will, perhaps, be a Thucydides at Boston, a Xenophon at New York, and, in time, a Virgil at Mexico, and an Isaac Newton at Peru. At last, some curious traveler from Lima will visit England and give a description of the ruins of St. Paul’s.” Or maybe Walpole’s curious traveler was from California, as our language’s literature has ever moved west, with Spengemann observing that if by the American Renaissance the “English-speaking world had removed from London to the eastern seaboard of the United States, carrying the stylistic capital of the language along with it…[then] Today, that center of linguistic fashion appears to reside in the vicinity of Los Angeles.” Franklin would seem to have gotten his capital, staring out onto a burnt ochre dusk over the Pacific Palisades, as westward the course of empire has deposited history in that final location of Hollywood.

John Leland writes in Hip: The History that “Three generations after Whitman and Thoreau had called for a unique national language, that language communicated through jazz, the Lost Generation and the Harlem Renaissance…American cool was being reproduced, identically, in living rooms from Paducah to Paris.” Leland concludes it was “With this voice, [that] America began to produce the popular culture that would stamp the 20th century as profoundly as the great wars.” What’s been offered by the American vernacular, by American literature as broadly constituted, including not just our letters but popular music and film as well, is a rapacious, energetic, endlessly regenerative tongue whose power derives not from circumscription but from its porous and absorbent ability to draw from the variety of languages that have been spoken in this land. Not just English, but languages from Algonquin to Zuni.

No one less than the great grey poet Whitman himself wrote, “We Americans have yet to really learn our own antecedents, and sort them, to unify them. They will be found ampler than has been supposed, and in widely different sources.” In remarks addressed to an assembly gathered to celebrate the 333rd anniversary of Santa Fe, Whitman surmised that “we tacitly abandon ourselves to the notion that our United States have been fashion’d from the British Islands only, and essentially form a second England only—which is a very great mistake.” He of the multitudinous crowd, the expansive one, the incarnation of America itself, understood better than most that the English tongue alone can never define American literature. Our literature is Huck and Jim heading out into the territories, and James Baldwin drinking coffee by the Seine; Dickinson scribbling on the backs of envelopes and Miguel Piñero at the Nuyorican Poets Café. Do I contradict myself? Very well then. Large?—of course. Contain multitudes?—absolutely.

Spengemann glories in the fact that “if American literature is literature written by Americans, then it can presumably appear in whatever language an American writer happens to use,” be it English or Choctaw, Ibo or Spanish. Rather than debating what the capital of the English language is, I advocate that there shall be no capitals. Rather than making borders between national literatures we must rip down those arbitrary and unnecessary walls. There is a national speech unique to everyone who puts pen to paper; millions of literatures for millions of speakers. How do you speak American? By speaking English. And Dutch. And French. And Yiddish. And Italian. And Hebrew. And Arabic. And Ibo. And Wolof. And Iroquois. And Navajo. And Mandarin. And Japanese. And of course, Spanish. Everything can be American literature because nothing is American literature. Its promise is not just a language, but a covenant.


From Father Divine to Jim Jones: On the Phenomenon of American Messiahs

“And you can call me an egomaniac, megalomaniac, or whatever you wish, with a messianic complex. I don’t have any complex, honey. I happen to know I’m the messiah.” —Rev. Jim Jones, FBI Tape Q1059-1

“There’s a Starman, waiting in the sky/He’d like to come and meet us/But he thinks he’d blow our minds.” —David Bowie

Philadelphia, once the second-largest city in the British Empire, was now the capital of these newly united states when the French diplomat and marquis François Barbé-Marbois attended a curious event held at the Fourth Street Methodist Church in 1782. Freshly arrived in the capital was a person born to middling stock in Cumberland, R.I., and christened as Jemima Wilkinson, but who, since becoming possessed with the spirit of Jesus Christ in October of that revolutionary year of 1776, had been known as “The Comforter,” “Friend of Sinners,” “The All-Friend,” and most commonly as “Public Universal Friend,” subsequently wearing only masculine garments and answering only to male pronouns. Such is the description of the All-Friend as given by Adam Morris in American Messiahs: False Prophets of a Damned Nation, where the “dark beauty of the Comforter’s androgynous countenance” appeared as a “well-apportioned female body cloaked in black robes along with a white or purple cravat, topped by a wide-brimmed hat made of gray beaver fur.” Reflecting back on the theophany that was responsible for his adoption of the spirit of Christ, the Public Universal Friend wrote about how the “heavens were open’d And She saw too [sic] Archangels descending from the east, with golden crowns upon there heads, clothed in long white Robes, down to their feet.”

Even though the Quakers, with their reliance on revelation imparted from an “Inner Light,” were already a progressive (and suspect) denomination, heresy such as the All-Friend’s earned him rebuke and excommunication from the church of his childhood. But that was of no consequence, as the Public Universal Friend would begin a remarkably successful campaign of evangelization across war-torn New England. Preaching a radical gospel that emphasized gender equality and communalism, the All-Friend was the leader of an emerging religious movement and a citizen of the new Republic when he arrived at the very heart of Quakerism that was Philadelphia, where he hoped to convert multitudes towards this new messianic faith.

Barbé-Marbois was excited to see the new transgendered messiah, writing that “Jemima Wilkinson has just arrived here…Some religious denomination awaited her with apprehension, others with extreme impatience. Her story is so odd, her dogmas so new, that she has not failed to attract general attention.” The All-Friend was not impressed by rank or pedigree, however, and having been tipped off to the presence of the marquis among the congregation, he proceeded to castigate Barbé-Marbois, declaring, “Do these strangers believe that their presence in the house of the Lord flatter me? I disdain their honors, I scorn greatness and good fortune.” For all of the All-Friend’s anger at the presence of this sinner in his temple, Barbé-Marbois was still charmed by the messiah, writing of how the All-Friend “has chosen a rather beautiful body for its dwelling…beautiful features, a fine mouth, and animated eyes,” adding that his “travels have tanned her a little.” He also noted that they “would have burned her in Spain and Portugal.”

Such is the sort of spectacular story that readers will find in Morris’s deeply researched, well-argued, exhaustive, and fascinating new book. A San Francisco-based translator and freelance journalist whose work has appeared in The Believer, The Los Angeles Review of Books, and Salon, Morris provides in his first book a counter-history of a shadow America whose importance and influence are no less for their oddity. The narrative stretches from the All-Friend and his heterodox preaching as the last embers of the First Great Awakening died out, through case studies that include the 19th-century Spiritualist messiahs Thomas Lake Harris and Cyrus Teed (the latter known to his followers as “Koresh,” after the Persian king in the Hebrew Scriptures) and the massive real-estate empire of Harlem-based Father Divine and his racially egalitarian Peace Mission, and finally to the dashed promise and horror of Jim Jones and the massacre of the Peoples Temple Agricultural Project at Jonestown, Guyana, in 1978, which took more than 900 lives and was the most brutal religiously based murder in American history until 9/11. Morris doesn’t just collect these figures at random; he argues that “a direct lineage connects Anne Lee, the Shaker messiah who arrived to American shores in 1774, to Jim Jones, the holy-rolling Marxist,” making his claim based not just on similar ideological affinities but oftentimes with evidence of direct historical contact as well, a chain of messianic influences starting at the very origin of the nation and functioning as a subversive counter-melody to the twin American idols of the market and evangelicalism.

Often dismissively rejected as “cult leaders,” these disparate figures, Morris argues, are better understood as the founders of new religions, unusual though they may seem to us. In this contention, Morris draws on a generation of religious studies scholars who’ve long chafed at the analytical inexactitude of the claim that some groups are simply “cults” composed of easily brainwashed followers and paranoid charlatan leaders with baroque metaphysical claims, a sentiment that was understandable after the horror of Jonestown, but which neglects the full complexities and diversity of religion as a site for human meaning. America has had no shortage of groups, both malicious and benign, that sought to spiritually transform the world and denounce the excesses and immoralities of American culture, while sometimes surpassing those excesses and embracing convoluted theological claims.

In new religious movements there is a necessary radical critique of inequity and injustice; as Jonestown survivor Laura Kohl recalls, the utopian impulse that motivated the Peoples’ Temple was originally “at the very cutting edge of the way we wanted society to evolve. So we wanted to be totally integrated and we wanted to have people of every socio-economic level, every racial background, everything, all included under one roof.” When their experiment began, people like Kohl couldn’t anticipate that it would end with mass suicide instigated by an increasingly paranoid amphetamine addict, because in the beginning “we were trying to be role models of a society, of a culture that was totally inclusive and not discriminatory based on education or race or socio-economic level. We all joined with that in mind.”

Such is the inevitable tragedy of messianism—it requires a messiah. Whatever the idealistic import of their motivations, members of such groups turned their hopes, expectations, and faith towards a leader who inevitably would begin to fashion himself or herself as a savior. Whether you slur them as “cults,” or soberly refer to them as “new religions,” such groups stalk the American story, calling to mind not just industrious Shakers making their immaculate wooden furniture in the 19th century, but also Bhagwan Shree Rajneesh looking positively beatified in his follower-funded Mercedes Benz, Marshall Applewhite dead in his Nikes awaiting the Hale-Bopp Comet to shepherd Heaven’s Gate to astral realms, and David Koresh immolated with 75 of his fellow Branch Davidians after a 51-day standoff with the FBI and the ATF.

While it would be a mistake to obscure the latent (and sometimes not-so-latent) darkness that can lie at the core of some of these groups—the exploitation, megalomania, and extremism—Morris convincingly argues that simply rejecting such groups as cults does no service to understanding them. This sentiment, that some religions are legitimate while others are irretrievably “cults,” is often mouthed by so-called “deprogrammers,” frequently representatives of evangelical Christian denominations or sham psychologists whose charlatanry could compete with that of the founders of these new faiths themselves. Morris claims that part of what’s so threatening about the figures he investigates, and not other equally controlling phenomena from Opus Dei to the Fellowship, is that unlike those groups, leaders like Ann Lee, Father Divine, and even Jones provided a “viable alternative to the alienation of secularized industrial urbanism, and a politico-spiritual antidote to the anodyne mainline Protestantism that increasingly served as a handmaiden to big business.”

For Morris, such figures and groups are genuinely countercultural, and for all of their oftentimes tragic failings, they’ve provided the only genuine resistance to the forward march of capitalism in American history. These groups are seen as dangerous in a way that Opus Dei and the Fellowship aren’t because they so fully interrogate the basic structures of our society in a manner that those more accepted cults don’t, in part because those mainstream groups exist precisely to uphold the ruling class. As Teed wrote, “Christianity… is but the dead carcass of a once vital and active structure. It will rest supine till the birds of prey, the vultures and cormorants of what is falsely called liberalism have picked its bones of its fleshly covering leaving them to dry to bleach and decompose.”

“Far more than their heretical beliefs,” Morris writes, it is the “communistic and anti-family leanings of American messianic movements [that] pose a threat to the prevailing socio-economic order.” Lee may have thought that she was the feminine vessel for the indwelling presence of Christ, but she also rejected the speculation-based industrialization wreaking havoc on the American countryside; Teed believed that the Earth was hollow and that we lived in its interior, but he also penned cogent denunciations of Gilded Age capitalism, and modeled ways of organizing cooperative communities that didn’t rely on big business; Father Divine implied that he ascended bodily from heaven, but he also built a communally owned empire that included some of the first integrated businesses and hotels in the United States; and Jones claimed that he was variously the reincarnation of the Pharaoh Akhenaten, the Buddha, Christ, and Vladimir Lenin, but he was also a staunch fighter against segregation, and he and his wife were the first Indiana couple to adopt an African-American child. Such inconsistencies don’t invalidate these figures’ legitimate beliefs, but they make the ultimate conclusions of so many of these movements all the more tragic.

Though not expressly stated this way by Morris, there is a sense in American Messiahs that the titular figures are both the most and least American of us. The least because they embrace a utopian communism anathema to the national character, and the most because what could be more boot-strapping in its rugged individualism than declaring yourself to be a god? Such are the paradoxes inherent in a commune run by a king (or queen). These new religions steadfastly reject the cutthroat avarice that defines American business in favor of a subversive collectivism. At the center of the social theory and economics of every group, from the Universal Friends to the Peoples’ Temple, is a model of communal living, with Morris writing that such organization “represents the ultimate repudiation of the values and institutions that Americans historically hold dear: it rejects not only the sacrosanct individualism on which American culture thrives, but also the nuclear family unit that evolved alongside industrial capitalism.”

In this way, whether in a Shaker commune or one of Father Divine’s Peace Missions, the intentional community offers “an escape from the degrading alienation of capitalist society by returning—once more—to cooperative and associative modes of living modeled by the apostolic church.” But at the same time, there is the inegalitarian contradiction of allowing a woman or man to be preeminent among the rest, to fashion themselves as a messiah, as a type of absolute CEO of the soul. Such is the marriage of the least American of collectivisms with a rugged individualism pushed to the breaking point. And so, America is the land of divine contradictions, for as Harris wrote in his manifesto Brotherhood of the New Life, “I have sought to fold the genius of Christianity, to fathom its divine import, and to embody its principles in the spirit and body of our own America.”

Despite the antinomian excesses of these messiahs, Morris is correct in his argument that the social and economic collectivism promoted and explored by such groups is among the most successful instances of socialism in the United States. A vibrant radical left in America has always been under attack by both capital and the government, so that enduring instances of socialism more often find themselves in a religious context than in a secular one, as failed communes from Brook Farm to Fruitlands can attest. Additionally, because it was the reform-minded French sociologist “Charles Fourier, rather than Karl Marx, who set the agenda of socialist reform in antebellum America,” as Morris writes, there was never the opportunity for a working-class, secular communist party to develop in the United States as it did in Europe. Into that void came sects and denominations that drew freely from America’s religious diversity, marrying Great Awakening evangelism to early modern occultism and 19th-century spiritualism so as to develop heterodox new faiths that often cannily and successfully questioned the status quo; what Morris describes as a “nonsectarian, ecumenical Christian church rebuilt around Christian principles of social justice [that] could be an instrument of radical social reform.”

And though their struggles for racial, gender, and class egalitarianism are often forgotten, Morris does the important job of resuscitating an understanding of how groups from the Shakers to Teed’s Koreshan Unity functioned as vanguards for progressive ideals that altered the country for the better. In ways that are both surprising and crucial to remember, such groups were often decades, if not centuries, ahead of their fellow Americans in embracing the axiom that all people are created equal, in goading the nation to live up to its highest ideals, and in demonstrating the efficacy of religious movements to effect social change and to show that there are alternative ways to structure our communities. After all, it was in the 18th century that Shakers taught a form of nascent feminism and LGBTQ equality, with their founder Mother Ann Lee publicly praying to “God, Our Mother and Father.”

In groups like the Shakers and the Society of Universal Friends there was an “incipient feminism, which continued to develop” and that defined American messianism over the centuries in the attraction of a “predominantly female following often through promises of equal rights among the faithful.” Women were able to find leadership positions that existed nowhere else in American society at the time, and they would also be emancipated from mandated domestic drudgery and abuse, allowing them an almost unparalleled freedom. As Morris notes, “For the tens of thousands of Americans who attended [these revivals], it was likely the first time any had ever seen a woman permitted to stand at a public lectern.” To read the radicalism of such a spectacle as mere performance, or to ignore their subversive language as simple rhetoric, is to deny that which is transgressive, and perhaps even redemptive, in figures otherwise marginalized in our official histories. An 18th-century church acknowledged the evils of misogyny and made its rectification one of its chief goals. Nineteenth-century new religions denounced institutional racism from the pulpit long before emancipation. There is admittedly something odd in Spiritualist churches holding seances where the spirits of Thomas Jefferson and George Washington would possess mediums and speak to the assembled, but the fact that those founders then “tended to express regret for their participation in the slave economy, and advanced that a more temperate and egalitarian society would ease the heavenward path of American Christians” isn’t just an attempt at rationalization or absolution, but an acknowledgement of deep historical malignancies in our society that people still have trouble understanding today.

Regarding America’s shameful history of racial inequality, many of these so-called messiahs were often far ahead of conventional politics as well. Father Divine’s Peace Missions are an illustrative example. Possibly born to former slaves in Rockville, Md., and christened George Baker Jr., the man who would come to be called Father Divine studied the emerging loose movement known as “New Thought” and developed his own metaphysics grounded in egalitarianism and the same adoptionist heresy that other pseudo-messiahs had embraced, namely that it was possible to be infused with the spirit of Christ in a way that made you equivalent to Christ. Furthermore, as radical as it was to see a woman preaching, it was just as subversive for Americans to imagine Christ as a black man, and Father Divine’s mission was suffused with the belief that “if God first chose to dwell among a people dispossessed of their land and subject to the diktats of a sinful empire, it was logical for him to return as an African American in the twentieth century.”

Morris explains that it was Father Divine’s contention, as well as that of the other messiahs, that “Such a church might rescue the nation from its spiritual stagnation and the corresponding failure to live up to its democratic ideals.” At the height of the Great Depression, Father Divine built a racially integrated fellowship in a series of communally owned Peace Missions that supplied employment and dignity to thousands of followers, composed of “black and white, rich and poor, illiterate and educated.” His political platform, which he agitated for in circles that included New York City’s Republican (and sometimes Socialist) mayor Fiorello LaGuardia, as well as President Franklin Delano Roosevelt, called for laws requiring insurance, caps on union dues, the abolition of the death penalty, a ban on assault weapons, and the repeal of racially discriminatory laws. Even while Morris’s claim that “Father Divine was the most well-known and influential civil rights leader on the national stage between the deportation of Marcus Garvey and the emergence of Martin Luther King Jr.” might need a bit more exposition to be convincing, he does make his case that this enigmatic and iconoclastic figure deserves far more attention and credit than he’s been given.

At times American Messiahs can suffer a bit from its own enthusiasm. It’s always an engaging read, but Morris’s claims can occasionally be a bit too sweeping. After reading quotations from pseudo-messiahs like Teed, who writes that “The universe, is an alchemico-organic dynamo… Its general form is that of the perfect egg or shell, with its central vitellus at or near the center of the sphere,” such exhaustive litanies become exhausting, because strange cosmologies are, well, strange. When you read about the course of study at Teed’s College of Life, where for $50 participants could learn “analogical biology and analogical physiology, disciplines that taught the ‘laws of universal form,'” you can’t help but wish that just a smidgen more cynicism was expressed on Morris’s part. There is something admirable in Morris taking such figures on their own terms, but it’s hard not to approach them with some skepticism. When it comes to hucksters claiming to be possessed by the indwelling presence of God, it’s difficult not to declare that sometimes a crank is a crank is a crank, and a nutcase by any other name would still smell of bullshit.

Still, Morris’s argument that we have to understand the messiahs as instances of often stunningly successful countercultural critique of American greed and inequity is crucial, and he’s right that we forget those lessons at our own peril. If we read Teed only for his bizarre anti-Copernican cosmology positing that we live within the Earth itself, and forget that he also said that the “question for the people of to-day to consider is that of bread and butter. It must henceforth be a battle to the death between organized labor and organized capital,” then we forget the crucial left-wing role that such groups have often played in American history. Even more important is the comprehension that such groups supplied a potent vocabulary, a sacred rhetoric that often spoke to people and that conceived of the political problems we face in a manner more astute and moving than simple secular analysis did. It’s not incidental that from the Shakers to the Amish, when it comes to successful counter-cultures, it’s the religious rather than the secular communes that endure. When Father Divine’s follower who went by the name John Lamb said that “The philosophies of men…were inadequate to cope with humanity’s problems,” he was wisely speaking a truth just as accurate today as during the Great Depression, when “clouds of tyranny” were wafting in from Europe. Say what you will about Morris’s messiahs, but they were often women and men who understood the score, and they posed solutions to those problems regardless of how heterodox we may find their methods.

Morris writes that “The American messianic impulse is based on a fundamentally irrefutable truth first observed by the Puritans: the injustices of capitalist culture cannot be reformed from within.” You can bracket out your theories of a hollow Earth and your spirit medium seances, but that observation is the only one worth remembering, a profound lesson to be learned from the example of the women and men who most steadfastly lived in opposition to those idols of American culture. And yet, we can’t forget where the excesses of such opposition can sometimes lead—it’s hard to wash the taste of the poisoned Kool-Aid of Jonestown from our mouths. With such a tragic impasse, what are those of us who wish there were utopias supposed to do? The great failure of these egalitarian experiments is that they paradoxically ended up enshrining a figure above the rest, that in rejecting American individualism they strangely embraced it in its purest and most noxious forms. If we all look into those stark, foreboding mirror sunglasses favored by Jim Jones to conceal his drug-addled and bloodshot eyes, I wonder if what we see isn’t our own reflection staring back at us, for good and ill. Perhaps there is an answer in that, for in that reflection we can see not just one man, but rather a whole multitude. What’s needed, if possible, is a messianism without a messiah.

Marks of Significance: On Punctuation’s Occult Power

“Prosody, and orthography, are not parts of grammar, but diffused like the blood and spirits throughout the whole.” —Ben Jonson, English Grammar (1617)

Erasmus, author of The Praise of Folly and the most erudite, learned, and scholarly humanist of the Renaissance, was enraptured by the experience, by the memory, by the very idea of Venice. For 10 months from 1507 to 1508, Erasmus would be housed in a room of the Aldine Press, not far from the piazzas of St. Mark’s Square with their red tiles burnt copper by the Adriatic sun, the glory and the stench of the Grand Canal wafting into the cell where the scholar would expand his collection of 3,260 proverbs entitled Thousands of Adages, his first major work. For Venice was home to a “library which has no other limits than the world itself,” a watery metropolis and an empire of dreams that was “building up a sacred and immortal thing.”

Erasmus composed to the astringent smell of black ink rendered from the resin of gall nuts, the rhythmic click-click-click of movable type of alloyed lead-tin keys being set, and the whoosh of paper feeding through the press. From that workshop would come more than 100 titles of Greek and Latin, all published with the indomitable Aldus Manutius’s watermark, an image filched from an ancient Roman coin depicting a strangely skinny Mediterranean dolphin inching down an anchor. Reflecting on that watermark (which has since been filched again, by the modern publisher Doubleday), Erasmus wrote that it symbolized “all kinds of books in both languages, recognized, owned and praised by all to whom liberal studies are holy.” Adept in humanistic philology, Erasmus made an entire career by understanding the importance of a paragraph, a phrase, a word. Of a single mark. As did his publisher.

Erasmus's printer was visionary. The Aldine Press was the first in Europe to produce books made not by folding the sheets of paper in a bound book once (as in a folio), or twice (as in a quarto), but three times, to produce volumes that could be as small as four to six inches, the so-called octavo. Such volumes could be put in a pocket, constituting the forerunner of the paperback, which Manutius advertised as "portable small books." Now volumes no longer had to be cumbersome tomes chained to the desk of a library; they could be squirreled away in a satchel, the classics made democratic. When laying the typeface for a 1501 edition of Virgil in the new octavo form, Manutius charged a Bolognese punchcutter named Francesco Griffo with designing a font that appeared calligraphic. Borrowing the poet Petrarch's handwriting, Griffo invented a slanted typeface that printers quickly learned could denote emphasis, which came to be named after the place of its invention: italic.

However, it was an invention seven years earlier that restructured not just how language appears, but indeed the very rhythm of sentences; for, in 1494, Manutius introduced a novel bit of punctuation, a jaunty little man with leg splayed to the left as if he were pausing to hold open a door for the reader before they entered the next room, the odd mark at the caesura of this byzantine sentence that is known to posterity as the semicolon. Punctuation exists not in the wild; it is not a function of how we hear the word, but rather of how we write the Word. It is what the theorist Walter Ong described in his classic Orality and Literacy as marks that are "even farther from the oral world than letters of the alphabet are: though part of a text they are unpronounceable, nonphonemic." None of our notations are implied by mere speech; they are creatures of the page: comma, and semicolon; (as well as parentheses and what Ben Jonson appropriately referred to as an "admiration," but what we call an exclamation mark!)—the pregnant pause of a dash and the grim finality of a period. Has anything been left out? Oh, the ellipses…

No doubt the prescriptivist critic of my flights of grammatical fancy in the previous few sentences would note my unorthodox usage, but I do so only to emphasize how contingent and mercurial our system of marking written language was until around four or five centuries ago. Manutius may have been the greatest of European printers, but from Johannes Gutenberg to William Caxton, the era's publishers oversaw the transition from manuscript to print alongside an equivalent metamorphosis of language from oral to written, from the ear to the eye. Paleographer Malcolm Parkes writes in his invaluable Pause and Effect: An Introduction to the History of Punctuation in the West that such a system is a "phenomenon of written language, and its history is bound up with that of the written medium." Since the invention of script, there has been a war of attrition between the spoken and the written; battle lines drawn between rhetoricians and grammarians, between sound and meaning. Such a distinction is explained by linguist David Crystal in Making a Point: The Persnickety Story of English Punctuation: "writing and speech are seen as distinct mediums of expression, with different communicative aims and using different processes of composition."

Obviously, the process of making this distinction has been going on for quite a long time. The moment the first wedged stylus pressed into wet Mesopotamian clay was the beginning of it, through ancient Greek diacritical and Hebrew pointing systems, up through when medieval scribes first began to separate words from endless scriptio continua, whichbrookednogapsbetweenwordsuntiltheendofthemiddleages. Reading, you see, was normally accomplished out loud, and the written word was less a thing-unto-itself and more a representation of a particular event—that is, the event of speaking. When this is the guiding metaphysic of writing, punctuation serves a simple purpose—to indicate how something is to be read aloud. Like the luftpause of musical notation, the nascent end stops and commas of antiquity didn't exist to clarify syntactical meaning, but only to let you know when to take a breath. Providing an overview of punctuation's genealogy, Alberto Manguel writes in A History of Reading how by the seventh century, a "combination of points and dashes indicated a full stop, a raised or high point was equivalent to our comma," an innovation of Irish monks who "began isolating not only parts of speech but also the grammatical constituents within a sentence, and introduced many of the punctuation marks we use today."

No doubt many of you, uncertain on the technical rules of comma usage (as many of us are), were told in elementary school that a comma designates when a breath should be taken, only to discover by college that that axiom was incorrect. Certain difficulties, with, that, way of writing, a sentence—for what if the author is Christopher Walken or William Shatner? Enthusiast of the baroque that I am, I'm sympathetic to writers who use commas as Hungarians use paprika. I'll adhere to the claim of David Steel, who in his 1785 Elements of Punctuation wrote that "punctuation is not, in my opinion, attainable by rules…but it may be procured by a kind of internal conviction." Steven Roger Fischer correctly notes in his A History of Reading (distinct from the Manguel book of the same title) that "Today, punctuation is linked mainly to meaning, not to sound." But as late as 1589 the rhetorician George Puttenham could, in his Art of English Poesie, as Crystal explains, define a comma as the "shortest pause," a colon as "twice as much time," and an end stop as a "full pause." Our grade school teachers weren't wrong in a historical sense, for that was the purpose of commas, colons, and semicolons: to indicate pauses of certain lengths when scripture was being read aloud. All of the written word would have been quietly murmured under the breath of monks in the buzz of a monastic scriptorium.

For grammarians, punctuation has long been claimed as a captured soldier in the war of attrition between sound and meaning, these weird little marks enlisted in the cause of language as a primarily written thing. Fischer explains that "universal, standardized punctuation, such as may be used throughout a text in consistent fashion, only became fashionable…after the introduction of printing." Examine medieval manuscripts and you'll find that the orthography, that is the spelling and punctuation (insomuch as it exists), is completely variable from author to author—in keeping with an understanding that writing exists mainly as a means to perform speaking. By the Renaissance, print necessitated a degree of standardization, though far from uniform. This can be attested to by the conspiratorially minded who are flummoxed by Shakespeare's name being spelled several different ways while he was alive, or by the anarchistic rules of 18th-century punctuation, the veritable golden age of the comma and semicolon. When punctuation becomes not just a matter of telling a reader when to breathe, but a syntactical unit that conveys particular meanings, meanings that can be altered by the choice or placement of these funny little dots, then a degree of rigor becomes crucial. As Fischer writes, punctuation came to convey "almost exclusively meaning, not sound," and so the system had to become fixed in some sense.

If I may offer an additional conjecture, it would seem to me that there was a fortuitous confluence of both the technology of printing and the emergence of certain intellectual movements within the Renaissance that may have contributed to the elevation of punctuation. Johanna Drucker writes in The Alphabetic Labyrinth: The Letters in History and Imagination how Renaissance thought was gestated by "strains of Hermetic, Neo-Pythagorean, Neo-Platonic and kabbalistic traditions blended in their own peculiar hybrids of thought." Figures like the 15th-century Florentine philosophers Marsilio Ficino and Giovanni Pico della Mirandola reintroduced Plato into an intellectual environment that had sustained itself on Aristotle for centuries. Aristotle rejected the otherworldliness of his teacher Plato, preferring rather to muck about in the material world of appearances, and when medieval Christendom embraced Aristotle, it modeled his empirical perspective. Arguably the transcendent nature of words is less important in such a context; what difference does the placement of a semicolon make if it's not conveying something of the eternal realm of the Forms? But the Florentine Platonists like Ficino were concerned with such things, for as he writes in Five Questions Concerning the Mind (printed in 1495—one year after the first semicolon), the "rational soul…possesses the excellence of infinity and eternity…[for we] characteristically incline toward the infinite." In Renaissance Platonism, the correct ordering of words, and their corralling with punctuation, is a reflection not of speech, but of something larger, greater, higher. Something infinite and eternal; something transcendent. And so, we have the emergence of a dogma of correct punctuation, of standardized spelling—of a certain "orthographic Platonism."

Drucker explains that Renaissance scholars long searched "for a set of visual signs which would serve to embody the system of human knowledge (conceived of as the apprehension of a divine order)." In its most exotic form this involved the construction of divine languages, the parsing of Kabbalistic symbols, and the embrace of alchemical reasoning. I'd argue in a more prosaic manner that such orthographic Platonism is the wellspring for all prescriptivist approaches to language, where the manipulation of the odd symbols that we call letters and punctuation can lend itself to the discovery of greater truths, an invention that allows us "to converse even with the absent," as Parkes writes. In the workshops of the Renaissance, at the Aldine Press, immortal things were made of letters and eternity existed between them, with punctuation acting as the guideposts to a type of paradise. And so it can remain for us.

Linguistic prescriptivists will bemoan the loss of certain standards, of how text speak signals an irreducible entropy of communication, or how the abandonment of arbitrary grammatical rules is like some sign out of Revelation. Yet such reactionaries are not the true guardians of orthographic Platonism, for we must take wisdom where we find it, in the appearance, texture, and flavor of punctuation. Rules may be arbitrary, but the choice of particular punctuation—be it the pregnant pause of the dash or the rapturous shouting of the exclamation mark—matters. Literary agent Noah Lukeman writes in A Dash of Style: The Art and Mastery of Punctuation that punctuation is normally understood as simply "a convenience, a way of facilitating what you want to say." Such a limited view, implicit whether one treats punctuation as an issue of sound or one of meaning, ignores the occult power of the question mark, the theurgy in a comma. The orthographic Platonists at the Aldine Press understood that so much depended on a semicolon; that it signified more than a longer-than-average pause or the demarcation of an independent clause. Lukeman argues that punctuation is rarely "pondered as a medium for artistic expression, as a means of impacting content," yet in the most "profound way…it achieves symbiosis with the narration, style, viewpoint, and even the plot itself."

Keith Houston in Shady Characters: The Secret Life of Punctuation, Symbols, and Other Typographical Marks claims that "Every character we type or write is a link to the past;" every period takes us back to the dots that Irish monks used to signal the end of a line; every semicolon back to Manutius's Venetian workshop. Punctuation, like the letters it serves, has a deep genealogy; its use places us in a chain of connotation and influence that goes back centuries. More than that, each mark of punctuation has a unique terroir; each does things that give the sentence a waft, a wisdom, a rhythm particular to it. Considering the periods of Ernest Hemingway, the semicolons of Edgar Allan Poe and Herman Melville, and Emily Dickinson's sublime dash, Lukeman writes that "Sentences crash and fall like the waves of the sea, and work unconsciously on the reader. Punctuation is the music of language."

To get overly hung up on punctuation as either an issue of putting marks in the right place, or of letting the reader know when they can gulp some air, is to miss the point—a comma is a poem unto itself, an exclamation point is an epic! Cecelia Watson writes in her new book, Semicolon: The Past, Present, and Future of a Misunderstood Mark, that Manutius's invention "is a place where our anxieties and our aspirations about language, class, and education are concentrated." And she is, of course, correct, as evidenced by all of those partisans of aesthetic minimalism from Kurt Vonnegut to Cormac McCarthy who've impugned the Aldine mark's honor. But what a semicolon can do that other marks can't! How it can connect two complete ideas into a whole; a semicolon is capable of unifications that a comma is too weak to manage alone. As Adam O'Fallon Price writes in The Millions, "semicolons…increase the range of tone and inflection at a writer's disposal." Or take the exclamation mark, a symbol that I've used roughly four times in my published writing, but which I deploy no less than 15 times per average email. It is a maligned mark due to its emotive enthusiasms, but Nick Ripatrazone observes in The Millions that "exclamation marks call attention toward themselves in poems: they stand straight up." Punctuation, in its own way, is conscious; it's an algorithm, as much thought itself as a schematic showing the process of thought.

To take two poetic examples, what would Walt Whitman be without his exclamation mark; what would Dickinson be without her dash? They didn't simply use punctuation for the pause of breath, nor to logically differentiate things with some grammatical-mathematical precision. Rather, they did those things but also so much more, for the union of exhalation and thought gestures toward that higher realm the Renaissance originators of punctuation imagined. What would Whitman's "Pioneers! O pioneers!" from the 1865 Leaves of Grass be without the exclamation point? What argument could be made if that ecstatic mark were abandoned? What of the solemn mysteries in the portal that is Dickinson's dash when she writes that "'Hope' is the thing with feathers –"? Orthographic Platonism instills in us a wisdom behind the arguments of rhetoricians and grammarians; it reminds us that more than simple notation, each mark of punctuation is a personality, a character, a divinity in itself.

My favorite illustration of that principle is in dramatist Margaret Edson's sublime play W;t, the only theatrical work that I can think of that has New Critical close reading as one of its plot points. In painful detail, W;t depicts the final months of Dr. Vivian Bearing, a professor of 17th-century poetry at an unnamed, elite, eastern university, after she has been diagnosed with Stage IV cancer. While undergoing chemotherapy, Bearing often reminisces on her life of scholarship, frequently returning to memories of her beloved dissertation adviser, E.M. Ashford. In one flashback, Bearing remembers being castigated by Ashford for sloppy work, an interpretation of John Donne's Holy Sonnet VI based on an incorrectly punctuated edition of the cycle. Ashford asks her student "Do you think the punctuation of the last line of this sonnet is merely an insignificant detail?" In the version used by Bearing, Donne's immortal line "Death be not proud" is end-stopped with a semicolon, but as Ashford explains, the proper means of punctuation, as based on the earliest manuscripts of Donne, is simply a comma. "And death shall be no more, comma, Death thou shalt die."

Ashford imparts to Bearing that so much can depend on a comma. The professor tells her student that "Nothing but a breath—a comma—separates life from everlasting…With the original punctuation restored, death is no longer something to act out on a stage, with exclamation points…Not insuperable barriers, not semicolons, just a comma." Ashford declares that "This way, the uncompromising way, one learns something from this poem, wouldn't you say?" Such is the mark of significance, an understanding that punctuation is as intimate as breath, as exalted as thought, and as powerful as the union between them—infinite, eternal, divine.


Another Person’s Words: Poetry Is Always the Speaker

Blessedly, we are speakers of languages not of our own invention, and as such none of us are cursed in only a private tongue. Words are our common property; it would take a brave iconoclast to write entirely in some Adamic dialect of her own invention, her dictionary locked away (though from the Voynich Manuscript to Luigi Serafini's Codex Seraphinianus, some have tried). Almost every word you or I speak was first uttered by somebody else—the key is entirely in the rearrangement. Sublime to remember that every possible poem, every potential play, every single novel that could ever be written is hidden within the Oxford English Dictionary. The answer to every single question too, for that matter. The French philosophers Antoine Arnauld and Claude Lancelot enthuse in their 1660 Port-Royal Grammar that language is a "marvelous invention of composing out of 25 or 30 sounds that infinite variety of expressions which, whilst having in themselves no likeness to what is in our mind, allow us to… [make known] all the various stirrings of our soul." Dictionaries are oracles. It's simply an issue of putting those words in the correct order. Language is often spoken of in terms of inheritance, where regardless of our own origins speakers of English are the descendants of Walt Whitman's languid ecstasies, Emily Dickinson's psalmic utterances, the stately plain style of the King James Bible, the witty innovations of William Shakespeare, and the earthy vulgarities of Geoffrey Chaucer; not to forget the creative infusions of foreign tongues, from Norman French and Latin, to Ibo, Algonquin, Yiddish, Spanish, and Punjabi, among others. Linguist John McWhorter puts it succinctly in Our Magnificent Bastard Tongue: The Untold History of English, writing that "We speak a miscegenated grammar."

There is a glory to this, our words indicating people and places different from ourselves, our diction an echo of a potter in a Bronze Age East Anglian village, a canting rogue in London during the golden era of Jacobean theater, or a Five Points Bowery Boy in antebellum New York. Nicholas Ostler, with an eye toward its diversity of influence, its spread, and its seeming omnipresence, writes in Empires of the Word: A Language History of the World that "English deserves a special position among world languages" as it is a "language with a remarkably varied history." Such history perhaps gives the tongue a universal quality, making it a common inheritance of humanity. True of any language, but when you speak it would be a fallacy to assume that your phrases, your idioms, your sentences, especially your words are your own. They've been passed down to you. Metaphors of inheritance can be either financial or genetic; the former has it that our lexicon is some treasure collectively willed to us, the latter posits that in the DNA of language, our nouns are adenine, verbs are as if cytosine, adjectives like guanine, and adverbs are thymine. Either sense of inheritance has its uses as a metaphor, and yet they're both lacking to me in some fundamental way—too crassly materialist, too eugenic. The proper metaphor isn't inheritance, but consciousness. I hold that a language is as if a living thing, or to be more specific, as if a thinking thing. Maybe this isn't a metaphor at all; perhaps we're simply conduits for the thoughts of something bigger than ourselves, the contemplations of the language which we speak.

Philosopher George Steiner, forever underrated, writes in his immaculate After Babel: Aspects of Language and Translation that "Language is the highest and everywhere the foremost of those assents which we human beings can never articulate solely out of our own means." We're neurons in the mind of language, and our communications are individual synapses in that massive brain that's spread across the Earth's eight billion inhabitants, and back generations innumerable. When that mind becomes aware of itself, when language knows that it's language, we call those particular thoughts poetry. Argentinean critic (and confidant of Jorge Luis Borges) Alberto Manguel writes in A Reader on Reading that poetry is "proof of our innate confidence in the meaningfulness of wordplay;" it is that which demonstrates the eerie significance of language itself. Poetry is when language consciously thinks.

More than rhyme and meter, or any other formal aspect, what defines poetry is its self-awareness. Poetry is the language which knows that it's language, and that there is something strange about being such. Certainly, part of the purpose of all the rhetorical accoutrement which we associate with verse, from rhythm to rhyme scheme, is to make the artifice of language explicit. Guy Deutscher writes in The Unfolding of Language: An Evolutionary Tour of Mankind's Greatest Invention that the "wheels of language run so smoothly" that we rarely bother to "stop and think about all the resourcefulness that must have gone into making it tick." Language is pragmatic; most communication doesn't need to self-reflect on, well, how weird the very idea of language is. How strange it is that we can construct entire realities from variations in the breath that comes out of our mouths, or the manipulation of ink stains on dead trees (or of liquid crystals on a screen). "Language conceals its art," Deutscher writes, and he's correct. When language decides to stop concealing, that's when we call it poetry.

Verse accomplishes that unveiling in several different ways, chief among them the use of the rhetorical and prosodic tricks, from alliteration to terza rima, which we associate with poetry. Among the most elemental and beautiful aspects of language to which poetry draws attention are the axioms implied earlier in this essay – that the phrases and words we speak are never our own – and that truth is found not in the invention, but in the rearrangement. In Problems of Dostoevsky's Poetics, the Russian literary theorist Mikhail Bakhtin wrote that we receive "the word from another's voice and filled with that other voice." Our language is not our own, nor is our literature. We communicate in a tongue not of our own creation; we don't have conversations, we are the conversation. Bakhtin reasons that our "own thought finds the world already inhabited." Just as the organization of words into enjambed lines and those lines into stanzas demonstrates the beautiful unnaturalness of language, so too do allusion, bricolage, and what theorists call intertextuality make clear to us that we're not individual speakers of words, but that words are speakers of us. Steiner writes in Grammars of Creation that "the poet says: 'I never invent.'" This is true: the poet never invents; none of us do. We only rearrange—and that is beautiful.

True of all language, but few poetic forms are as honest about this as a forgotten Latin genre from late antiquity known as the cento. Rather than inheritance and consciousness, the originators of the cento preferred the metaphor of textiles. For them, all of poetry is like a massive patchwork garment, squares of fabric borrowed from disparate places and sewn together to create a new whole. Such a metaphor is an apt explanation of what exactly a cento is – a novel poem that is assembled entirely from rearranged lines written by other poets. Centos were written perhaps as early as the first century, but the fourth-century Roman poet Decimus Magnus Ausonius was the first to theorize about their significance and to give rules for their composition. In the prologue to the Cento Nuptialis, where he composed a poem about marital consummation from fragments of Virgil derived from The Aeneid, Georgics, and Eclogues, Ausonius explained that he had, "out of a variety of passages and different meanings," created something new which is "like a puzzle."

The editors of The Penguin Dictionary of Literary Terms and Literary Theory explain that while forgotten today, the cento was "common in later antiquity." Anthologizer and poet David Lehman writes in The American Scholar that "Historically, the intent was often homage, but it could and can be lampoon," with critic Edward Hirsch writing in A Poet's Glossary that they "may have begun as school exercises." Though it's true that they were written for educational reasons, to honor or mock other poets, or as showy performances of lyrical erudition (the author exhibiting their intimacy with Homer and Virgil), none of these explanations does service to the cento's significance. To return to my admittedly inchoate axioms of earlier, one function of poetry is to plunge "us into a network of textual relations," as the theorist Graham Allen writes in Intertextuality. Language is not the province of any of us, but rather a common treasury; its other purpose being what Steiner describes as the "re-compositions of reality, of articulate dreams, which are known to us as myths, as poetry, as metaphysical conjecture." That's to say that the cento remixes poetry, recombines reality, so as to illuminate some fundamental truth hitherto hidden. Steiner claims that a "language contains within itself the boundless potential of discovery," and the cento is a reminder that fertile are the recombinations of poetry that have existed before, that literature is a rich, many-varied compost from which beautiful new specimens can grow toward the sun.

Among authors of centos, this is particularly true of the fourth-century Roman poet Faltonia Betitia Proba. Hirsch explains that one of the purposes of the cento, beyond the pedagogical or the parodic, was to "create Christian narratives out of pagan text," as was the case with Proba's Cento virgilianus, the first major Christian epic by a woman poet. Allen explains that "Works of literature, after all, are built from systems, codes and traditions established by previous works of literature;" what Proba's cento did was a more literal expression of that fundamental fact. The classical past posed a difficulty for proud Roman Christians, for how were the faithful to grapple with the paganism of Plato, the Sibyls, and Virgil? One solution was typological, that is, the assumption that if Christianity was true, and yet pagan poets like Virgil still spoke the truth, then such truth must be encoded within the verse itself, part of the process of Interpretatio Christiana whereby pagan culture was reinterpreted along Christian lines.

Daughter of Rome that she was, Proba would not abandon Virgil, but Christian convert that she also was, it became her task to internalize that which she loved about her forerunner and to repurpose him, to place old wine into new skins. Steiner writes that an aspect of authorship is that the "poet's consciousness strives to achieve perfect union with that of the predecessor," and though those lyrics are "historically autonomous," as reimagined by the younger poet they are "reborn from within." This is perhaps true of how all influence works, but the cento literalizes that process in the clearest manner. And so Proba's solution was to rearrange, remix, and recombine the poetry of Virgil so that the Christianity within could emerge, like a sculptor chipping away all of the excess marble in a slab to reveal the statue hidden inside.

Inverting the traditional pagan invocation of the muse, Proba begins her epic (the proem being the only original portion) with both conversion narrative and poetic exhortation, writing that she is "baptized, like the blest, in the Castalian font – / I, who in my thirst have drunk libations of the Light – / now begin my song: be at my side, Lord, set my thoughts / straight, as I tell how Virgil sang the offices of Christ." Thus she reimagines the prophetic Augustan poet, who died two decades before the Nazarene was born. Drawing from a tradition which claimed Virgil's fourth Eclogue predicted Christ's birth, Proba rearranged 694 lines of the poet to retell stories from Genesis, Exodus, and the Gospels, the lack of Hebrew names in the Roman original forcing her to use general terms which appear in Virgil, like "son" and "mother," when writing of Jesus and Mary. Proba's paradoxically unoriginal originality (or is it original unoriginality?) made her popular in the fourth and fifth centuries, the Cento virgilianus taught to catechists alongside Augustine, often surpassing the Confessions and City of God in popularity. Yet criticism of Proba's aesthetic quality from figures like Jerome and Pope Gelasius I ensured a millennium-long eclipse of her poem, forgotten until its rediscovery in the Renaissance.

Rearranging the pastoral Eclogues, Proba envisions Genesis in another poet's Latin words, writing that there is a "tree in full view with fruitful branches;/divine law forbids you to level with fire or iron,/by holy religious scruple it is never allowed to be disturbed./And whoever steals the holy fruit from this tree,/will pay the penalty of death deservedly;/no argument has changed my mind." There is something uncanny about the way that such a familiar myth is reimagined in the arrangement of a different myth; the way in which Proba is a redactor of Virgil's words, shaping them into (or pulling out from them) this other, different, unintended narrative. Scholars have derided her poem as juvenilia ever since Jerome (jealously) castigated her talent by calling her an "old chatterbox," but to be able to organize, shift, and shape another poet's corpus into orthodox scripture is an unassailable accomplishment. Writers of the Renaissance certainly thought so, for a millennium after Proba's disparagement, a new generation of humanists resuscitated her.

Cento virgilianus was possibly the first work by a woman to be printed, in 1474; a century before that, the father of Renaissance poetry, Petrarch, extolled her virtues in a letter to the wife of the Holy Roman Emperor, and she was one of the subjects of Giovanni Boccaccio's 1374 On Famous Women, his 106-entry consideration of female genius from Eve to Joanna, the crusader Queen of Jerusalem and Sicily. Boccaccio explains that Proba collected lines of Virgil with such "great skill, aptly placing the entire lines, joining the fragments, observing the metrical rules, and preserving the dignity of the verses, that no one except an expert could detect the connections." As a result of her genius, a reader might think that "Virgil had been a prophet as well as an apostle," the cento suturing together the classic and the Hebraic, Athens and Jerusalem.

Ever the product of his time, Boccaccio could still only appreciate Proba's accomplishment through the lens of his own misogyny, writing that the "distaff, the needle, and weaving would have been sufficient for her had she wanted to lead a sluggish life like the majority of women." Boccaccio's myopia prevented him from seeing that that was the precise nature of Proba's genius – she was a weaver. The miniatures which illustrate a 15th-century edition of Boccaccio give truth to this, for despite the chauvinism of the text, Proba is depicted in gold-threaded red with a white habit upon her head, a wand holding aloft a beautiful, blue expanding sphere studded with stars, a strangely scientifically accurate account of the universe, as the poet sings the song of Genesis in the tongue of Virgil. Whatever anonymous artist saw fit to depict Proba as a mage understood her well; for that matter they understood creation well, for perhaps God can generate ex nihilo, but artists must always gather their material from fragments shored against their ruin.

In our own era of allusion, reference, quotation, pastiche, parody, and sampling, you'd think that the cento would have new practitioners and new readers. There is something of the disk jockeys Danger Mouse, Fatboy Slim, and Girl Talk in the idea of remixing a tremendous number of independent lines into some synthesized newness; something oddly of the Beastie Boys' Paul's Boutique in the very form of the thing. But centos proper are actually fairly rare in contemporary verse, despite T.S. Eliot's admission that "mature poets steal." Perhaps with more charity, Allen argues that reading is a "process of moving between texts. Meaning becomes something which exists between a text and all the other texts to which it refers and relates." But while theorists have an awareness of the ways in which allusion dominates the modernist and post-modernist sensibility—what theorists who use the word "text" too often call "intertextuality"—the cento remains as obscure as other abandoned poetic forms from the Anacreontic to the Zajal (look them up). Lehman argues that modern instances of the form are "based on the idea that in some sense all poems are collages made up of other people's words; that the collage is a valid method of composition, and an eloquent one."

Contemporary poets who've tried their hand include John Ashbery, who wove together Gerard Manley Hopkins, Lord Byron, and Eliot; as well as Peter Gizzi, who in "Ode: Salute to the New York School" made a cento from poets like Frank O'Hara, James Schuyler, and Ashbery. Lehman has tried his own hand at the form, to great success. In commemoration of his Oxford Book of American Poetry, he wrote a cento for The New York Times that's a fabulous chimera, its anatomy drawn from a diversity indicative of the sweep and complexity of four centuries of verse, including among others Robert Frost, Hart Crane, W.H. Auden, Gertrude Stein, Elizabeth Bishop, Edward Taylor, Jean Toomer, Anne Bradstreet, Henry Wadsworth Longfellow, Edna St. Vincent Millay, Wallace Stevens, Robert Pinsky, Marianne Moore, and, this being a stolidly American poem, our grandparents Walt Whitman and Emily Dickinson.

Lehman contributed an ingenious cento sonnet in The New Yorker assembled from various Romantic and modernist poets, his final stanza reading “And whom I love, I love indeed,/And all I loved, I loved alone,/Ignorant and wanton as the dawn,” the lines so beautifully and seamlessly flowing into one another that you’d never notice that they’re respectively from Samuel Taylor Coleridge, Edgar Allan Poe, and William Butler Yeats. Equally moving was a cento written by editors at the Academy of American Poets, which in its entirety reads:

In the Kingdom of the Past, the Brown-Eyed Man is King
Brute. Spy. I trusted you. Now you reel & brawl.
After great pain, a formal feeling comes–
A vulturous boredom pinned me in this tree
Day after day, I become of less use to myself,
The hours after you are gone are so leaden.
Take this rather remarkable little poem on its own terms. Its ambiguity is striking, and the lyric is all the more powerful for it. To whom is the narrator speaking, who has been trusted and has apparently violated that loyalty? Note how the implied narrative of the poem breaks after the dash that end-stops the third line. In the first part of the poem, we have declarations of betrayal, somebody impugned as "Brute. Spy." But from that betrayal, that "great pain," there is some sort of transformation of feeling; neither acceptance nor forgiveness, but almost a tacit defeat, the "vulturous boredom." The narrator psychologically, possibly physically, withers. They become less present to themselves, "of less use to myself." And yet there is something to be said for the complexity of emotions we often have toward people, for though it seems that this poem expresses the heartbreak of betrayal, the absence of its subject is still that which affects the narrator, so that the "hours…are so leaden." Does the meaning of the poem change when I tell you that it was stitched together by those editors, drawn from a diversity of different poets in different countries living at different times? That the "true" authors are Charles Wright, Marie Ponsot, Dickinson, Sylvia Plath, and Samuel Beckett?

Steiner claims that "There is, stricto sensu, no finished poem. The poem made available to us contains preliminary versions of itself. Drafts [and] cancelled versions." I'd go further than Steiner even, and state that there is no individual poet. Just as all drafts are ultimately abandoned rather than completed, so is the task of completion ever deferred to subsequent generations. All poems, all writing, and all language for that matter, are related to something else written or said by someone else at some point in time, a great chain of being going back to beginnings immemorial. We are, in the most profound sense, always finishing each other's sentences. Far from something to despair at, this truth is something that binds us together in an untearable patchwork garment, where our lines and words have been loaned from somewhere else, given with the solemn promise that we must pay it forward. We're all just lines in a conversation that began long ago, and thankfully shall never end. If you listen carefully, even if it requires a bit of decipherment or decoding, you'll discover the answer to any query you may have. Since all literature is a conversation, all poems are centos. And all poems are prophecies whose meanings have yet to be interpreted.


Ten Ways to Change Your God

“Well, it may be the devil or it may be the Lord / But you’re gonna have to serve somebody.” —Bob Dylan (1979)

1. Walking Cambridge's Trinity Lane in 1894, Bertrand Russell had an epiphany concerning the ontological proof for God's existence, becoming an unlikely convert won over by logical argumentation. In Russell's essay "Why I Became a Philosopher," included in Amelie Oksenberg Rorty's anthology The Many Faces of Philosophy: Reflections from Plato to Arendt, the logician explains how his ruminations turned to fervor, writing that "I had gone out to buy a tin of tobacco; on my way back, I suddenly threw it up in the air, and exclaimed as I caught it: 'Great Scott, the ontological argument is sound.'" An atheist had a brief conversion—of a sort.

Not exactly Saul being confronted with the light that (quoting Euripides's The Bacchae) told him "It is hard for thee to kick against the pricks," or Augustine in his Confessions recounting that after a ghostly young voice told him to "Take up and read!", he turned to Paul's epistles. Russell's conversion was a bit more abstract—of the head rather than the heart. In his flat cap, tweed jacket, and herringbone bow tie, he was converted not by the Holy Spirit, but by a deductive syllogism. Envision the co-author of Principia Mathematica, which rigorously reduced all of mathematics to logic, suddenly being moved by the spirit.

Derived by the medieval monk Anselm of Canterbury in his 1078 Proslogion, the ontological argument holds that since existence must be a property of perfection, and God is a priori defined as a perfect being, then quod erat demonstrandum: God must exist. Russell explains this metaphysical trick in his Nobel Prize-winning History of Western Philosophy: a "Being who possesses all other perfections is better if He exists than if He does not, from which it follows that if he does not He is not the best possible Being."

From Aquinas to Rene Descartes, there is a venerable history of attempting to prove the existence of an omniscient, omnipotent, omnipresent deity, though as Nathan Schneider writes in God in Proof: The Story of a Search from the Ancients to the Internet, these arguments are “taught, argued about, and forgotten, sometimes saving a person’s particular faith, sometimes eroding it, and usually neither.” In defense of Anselm, nobody in the 11th century doubted God’s existence, and such proofs weren’t designed to convince, but rather to glory in divinity. As a subsequent defense, his proof has endured in a manner that other proofs haven’t. Cosmology and evolution have overturned most others, making them seem primitive to the point of adorableness, but Anselm endures.

Still, the syllogism can't help but seem like a bit of a magic trick, defining God into existence rather than establishing even what type of God we're to believe in. Critics of Anselm maintain that existence isn't a property in the same way that other qualities are. We can imagine all sorts of characters with all sorts of qualities, but that doesn't mean that they have to exist. Defenders of Anselm would claim that God isn't like any other character, since a perfect thing that doesn't exist can't be said to be a perfect thing, and God is a perfect thing. Critics of that would say that it's possible to conceive of a perfect city, but that doesn't mean you can buy an Amtrak ticket there, nor would a benevolent God allow Penn Station to look as it does. As the puzzle-writer (and theist) Martin Gardner notes in his delightful The Whys of a Philosophical Scrivener, "I agree with the vast majority of thinkers who see the proof as no more than linguistic sleight-of-hand."

Eventually Russell's new faith diffused like incense from a swinging thurible. If philosophy got Russell into this mess, then it also got him out. Russell explains that Immanuel Kant in his Critique of Pure Reason would "demolish all the purely intellectual proofs of the existence of God." But what faith had Russell gained on Trinity Lane? It wasn't a belief in the God after whom that street was named, nor was it the Lord of Abraham, Isaac, and Jacob. What Russell's faith was in, had always been in, and would always be in, was the power of reason, and in that he was unwavering.

David Hume, another of Russell's antecedents, wrote in his 1739 A Treatise of Human Nature that "Reason is, and ought only to be the slave of the passions." We're going to believe what we're going to (dis)believe, and we'll concoct the reasons for it later. For his part, late in life, Russell was asked how he'd respond if upon death he was brought before God's throne and asked why he had dared not to believe. Russell said that he'd answer "Not enough evidence!"

2. According to mercurial family lore, when my paternal grandmother's grandfather, August Hansmann, boarded a New York-bound steamship two years after the American Civil War and one year after his native Hanover was subsumed into Prussia, he brought along with him a copy of the Dutch philosopher Baruch Spinoza's Tractatus Theologico-Politicus, denounced when it was printed in 1670 as "a book forged in hell…by the devil himself." Like Spinoza, Hansmann was a Jew who lived among gentiles, and like Spinoza, he understood that being Other in a narrative not written by yourself had tragic consequences.

Born illegitimate, Hansmann was raised Jewish even though his father was Christian; a man who understood how being two things sometimes meant that you were seen as nothing, he also knew the strange freedom of how dictated faith is no faith at all. Similarly, Spinoza was a Sephardic Jew of converso background whose Portuguese ancestors practiced their Judaism in secret until Dutch freedom allowed them to reinvent their hidden faiths. Hansmann encountered Spinoza’s celebration of religious liberty, “where everyone’s judgement is free and unshackled, where each may worship God as his conscience dictates, and where freedom is esteemed before all things.” For the pious lens grinder, content to work by the tulip-lined canals of red-brick Amsterdam, religious truth can only be discovered without shackles, divinity only visible if you’re not compelled by Church or State.

When the Jews of Spain and Portugal were forced to convert to Catholicism, many secretly practiced the mitzvoth, venerating the Sabbath, abjuring treyf, and kissing mezuzahs surreptitiously concealed within the ceramic blue slipper of the Virgin. As scholar Karen Armstrong notes in The Battle for God, these were people who "had been forced to assimilate to a…culture that did not resonate with their inner selves." When finally able to practice their religion in Holland, many of them then discovered that the Judaism of the rabbis was not the same Judaism that they'd imagined, and so they chose to be something else, something completely new—neither Jewish nor Christian, but rather nothing. Armstrong writes that such persecution ironically led to the "first declarations of secularism and atheism in Europe."

Many of those slurred as swinish Marranos found it more honest to live by the dictates of their own reason. Spinoza was the most famous, condemned by his synagogue for writing things like “I say that all things are in God and move in God,” holding that nature is equivalent with the Lord, so that either nothing is God or everything is. Such pantheism is what made some condemn Spinoza as an atheist, and others such as Russell later describe him as a “God-intoxicated man” who saw holiness in every fallen leaf and gurgling creek, his very name, whether “Baruch” or “Benedict” meaning “blessed.”

Rebecca Newberger Goldstein, in Betraying Spinoza: The Renegade Jew Who Gave Us Modernity, asks if he can "be considered…a Jewish thinker?" She argues that his universalism derives from the Mosaic covenant, the monotheism of the Shema extended so that God is Everything. As a result, he is the progenitor of a certain type of rational, secular, progressive, liberal, humane contemporaneity. On that steamer crossing the Atlantic, Hansmann may have read that "freedom [can] be granted without prejudice…but also that without such freedom, piety cannot flourish." My great-great grandfather lived his life as a Jew, but the attraction he saw in Spinoza was that each individual could decide for themselves whether to be Jew, Catholic, Protestant, or nothing.

Hansmann worked as a peddler on the Lower East Side, until the Homestead Act enticed him to Iowa, where he married a Huguenot woman who bore him 10 children, while he worked as a trader among the Native Americans. He refused to raise his children in any religion—Jewish or Protestant—preferring rather that they should decide upon reaching adulthood. And so, a union was made between the Jewish and the Low Church Protestant, rejecting both baptism and bris, so that my grandmother born on the frontier had absolutely no religion at all.

That such things are even possible—to be of no religion—is due in no small part to Spinoza's sacrifice, his congregation having excommunicated him by extinguishing each individual light in the synagogue until the assembled dwelled in darkness. From that expulsion, Spinoza was expected to find refuge among the Protestants—but he didn't. I've a photo from the early years of the 20th century: August Hansmann surrounded by his secular, stolid, midwestern progeny, himself sitting in the center with a thick black beard, and a kippah barely visible upon his head.

3. A long line of Spinoza's ancestors, and my great-great-grandfather's ancestors, would have concluded Pesach evenings with a "Next year in Jerusalem," praying for the reestablishment of the Temple destroyed by the Romans in the first century. Less known than that equally exuberant and plaintive Passover declaration is that, for a brief period in the fourth century, it seemed that the Temple might actually be restored, ironically by Rome's last pagan emperor. Born in Constantinople only six years after the Council of Nicaea convened to define what exactly a Christian was, Julian the Apostate would mount a failed revolution.

His uncle was Rome's first Christian emperor, who conquered by the cross and who turned his Rome over to Christ. Julian was of a different perspective, seeing in the resurrection of Apollo and Dionysius, Jupiter and Athena, the rejuvenation of Rome. He bided his time until military success foisted him onto the throne, and then Julian revealed himself as an initiate into those Eleusinian Mysteries, a celebrant of Persephone and Demeter who greeted the morning sun and prayed for the bounty of the earth, quoted in W. Heinemann's The Works of the Emperor Julian as having written "I feel awe of the gods, I love, I revere, I venerate them."

In Julian's panegyrics, one can smell the burning thyme and sage, feel the hot wax from votive candles, spy the blue moonlight filtered through pine trees in a midnight cedar grove. If Plutarch recorded that the very heavens had once declared "the great god Pan is dead," then Julian prayed for his return; if the oracles at Delphi and the Sibyllines had been silenced by the Nazarene, then the emperor wanted the divinations of those prophets to operate once again. Julian wanted this paganism to be a new faith, an organized, unified, consolidated religion that bore as much similarity to the cohesion of the Christian Church as it did to the rag-tag collection of rituals and superstitions that had defined previous Roman beliefs.

Classicist Robin Lane Fox makes clear in Pagans and Christians that this wasn't simple nostalgia. Fox explains that those who returned to paganism normally did so with "an accompanying philosophy" and that apostasy "always lead to a favor for some systematic belief." The emperor's conversion was a turning back combined with the reformer's desire for regeneration. In paganism, Julian approached origin, genesis, birth—less a conversion than a return to what you should have been, but were denied.

Julian the Apostate endures as a cipher—duplicitous reactionary who'd see Christian Rome turn back, or tolerant visionary who theologically elevated paganism? Christian thinkers had long commandeered classical philosophy; now pagan thinkers were able to apply the same analytical standards to their own beliefs, developing theology as sophisticated as that of Christianity. The American rake and raconteur Gore Vidal repurposed the emperor as a queer hero of liberalism in his unusual 1964 novel Julian, having his protagonist humanely exclaim that "Truth is wherever man has glimpsed divinity." Where some had seen those intimations in Golgotha's sacrifice, the Apostate saw them in the oracles of Chaldea or the groves of Athena.

Far from banning the new faith, Julian declared that "By the gods I do not want the Galileans to be killed or beaten unjustly nor to suffer any other ill." Julian was rather interested in monopolistic trust-busting, and in part that included funding the rebuilding of the Jewish Temple that had been destroyed by the emperor's predecessors. The building of a Third Temple would be terminated when, as a Roman witness to the construction attempts wrote, "fearful balls of fire [broke]…out near the foundations…till the workmen, after repeated scorchings, could approach no more." The Christians attributed the disaster to God; the Jews and Romans to the Christians.

The desire for a pagan Rome would similarly end with Julian's defeat on the battlefields of Persia, an emperor who longed to see old gods born again now forced to declare that "You have won, Galilean." Hard to reverse an eclipse, and so we supplicate on another mournful and deferred day—"Next year at Delphi."

4. The titular character in Julian claims that "academics everywhere are forever attacking one another." During the fourth century, the academic debates were theological, all of those schisms and heresies, excommunications and counter-excommunications between exotic groups with names like the Monophysites and the Chalcedonians, the Arians and the Trinitarians. By the middle of Vidal's 20th century, such disputations were just as rancorous, but theology was now subsumed into politics. Vidal's own politics were strange, broadly left but with a sympathy afforded to the anti-establishmentarians of any ideological persuasion.

Vidal is most celebrated for calling the conservative founder of the National Review, William F. Buckley, a "crypto-Nazi" during a debate on ABC News scheduled to coincide with the 1968 Democratic convention; even the pyrotechnic rainbow of early television was unable to conceal the pure hatred between those two prep school grads. If the earliest years of Christianity saw bishops and monks moving between ever more nuanced theological positions, then the 20th century was an era of political conversion, liberals becoming conservatives and conservatives becoming liberals, with Buckley's magazine a fascinating case study in political apostasy.

Buckley's politics were cradle-to-grave Republican conservatism, even as he garnered a reputation for expelling acolytes of both Ayn Rand and John Birch from the movement as if he were a medieval bishop overseeing a synod (they've long since found a way back in). Entering public life with his 1951 God and Man at Yale: The Superstitions of "Academic Freedom," Buckley understood better than most how ideology is theology by another name (even as I personally revile his politics). In this milieu, National Review was the stodgy, tweedy vanguard of the reactionary intelligentsia, defining a conservative as "someone who stands athwart history, yelling Stop, at a time when no one is inclined to do so."

The problem with a manifesto that defines itself entirely by anti-progress is that such a doctrine can be rather nebulous, and so many of the bright young things Buckley hired for the National Review, such as Joan Didion and Garry Wills, found themselves moving to the left. Such were the subtleties of conversion that Wills could be both the author of Confessions of a Conservative and a journalist placed on Richard Nixon’s infamous “enemies list.”

As people become harder of hearing and their bone density decreases, movement from the left to the right does seem the more predictable narrative. For every Garry Wills, there's a Norman Podhoretz, an Irving Kristol, a David Horowitz, a Christopher Hitchens. Leave it to the armchair Freudians to ascertain what Oedipal complex made those men of the left move toward the Big Daddy of right-wing politics, but what's interesting are the ways in which they refashioned conservatism in a specifically leftist manner. Their migration was not from milquetoast Democratic liberalism, for they'd indeed been far to the left, several of them self-described Trotskyites. And as the Aztecs who became Catholic kept secretly worshiping their old gods, or as basilicas were built atop temples to Mithras, so too did those doctrines of "permanent revolution" find themselves smuggled into neoconservatism.

If politics is but religion by another means, then it's the ideological conversion that strikes us as most scandalous. We've largely ceded the ground on the sacred—what could be less provocative than abandoning Presbyterianism for Methodism? But politics, that's the thing that keeps us fuming for holy war, and we're as titillated by stories of conversion as our ancestors were by tales of heresy and schism. The writer Daniel Oppenheimer observes, in Exit Right: The People Who Left the Left and Reshaped the American Century, that "belief is complicated, contingent, multi-determined. But do we really know it? Do we feel it?" Strange to think that Elizabeth Warren was once a Republican, and the man whom she will beat for the presidency was once a Democrat, but such are the vagaries of God and man, whether at Yale or anywhere else.

5. For all their differences, Buckley and Vidal could at least agree on the martini. Buckley would write in 1977 that a "dry martini even at night is a straightforward invitation for instant relief from the vicissitudes of a long day," and Vidal in his novel Kalki, published a year later, would rhapsodize about the "martini's first comforting haze." On the left or on the right, one thing WASPs concurred about (and though Buckley was technically Catholic he had the soul of an Episcopalian) was the cocktail hour. I've no idea if the two had been drinking before their infamous sparring on ABC, though the insults, homophobia, and violent threats make me suspicious.

Better that they'd followed the path of conversion taken by another prep school boy who moved in their social circles, John Cheever: after his brother checked him into New York's Smithers Alcoholic Rehabilitation Unit on April 9, 1975, he never took another drink. Cheever had lived up to the alcoholic reputation of two American tribes—High Church Protestants and Low Church writers. From the former he inherited both the genes and an affection for gin and scotch on a Westchester porch watching the trains from Grand Central thunder upstate, and from the latter he took the Dionysian myth that conflates the muse with ethanol, pining for inspiration but settling for vomiting in an Iowa City barroom.

Cheever was one of the finest short story writers of the 20th century, his prose as crystalline and perfect as a martini. Such was the company of those other addicts, of Ernest Hemingway and F. Scott Fitzgerald, William Faulkner and Thomas Wolfe. Cheever’s story “The Swimmer” is one of the most perfect distillations of how alcoholism will sneak up on a person, and he avoids the laudatory denials you see in a lesser writer like Charles Bukowski. With the repressed self-awareness that is the mocking curse of all true alcoholics, Cheever would write in his diary some two decades before he got sober that “When the beginnings of self-destruction enter the heart it seems no bigger than a grain of sand,” no doubt understanding how a single drink is too many since a dozen is never enough.

His daughter Susan Cheever, herself a recovering alcoholic, notes in Drinking in America: Our Secret History that “My father’s drinking had destroyed his body, but it had also distorted his character—his soul. The restoration of one man through the simple measure of not drinking was revelatory.” The ancients called them spirits for a reason, and in their rejection there is a conversion of a very literal sort. Cheever—along with his friend Raymond Carver—is the happy exception to the fallacy that finds romance in the gutter-death of literary genius, and he got sober by doing the hard work of Alcoholics Anonymous.

The central text of that organization was compiled by Bill W., the founder of AA; its title is technically Alcoholics Anonymous, but members informally call it “The Big Book.” Past the uninspired yellow-and-blue cover of that tome, Cheever would have read stories where he’d have “found so many areas where we overlapped—not all the deeds, but the feelings of remorse and hopelessness. I learned that alcoholism isn’t a sin, it’s a disease.” And yet the treatment of that disease was akin to a spiritual transformation.

It’s a tired debate whether Alcoholics Anonymous is scripture or not, but I’d argue that anything that so fully transforms the countenance of a person can’t but be a conversion, for as the Big Book says, “We talked of intolerance, while we were intolerant ourselves. We missed the reality and the beauty of the forest because we were diverted by the ugliness of some of its trees.” I once was lost, and now I’m found, so on and so forth. When Cheever died, he had seven sober years—and they made all the difference.

6. Conversion narratives are the most human of tales, for the drama of redemption is an internal one, played out between the protagonist and his demons. Certain tropes recur—the pleasure, the perdition, the contrition, the repentance, the salvation. Augustine understood that we do bad things because bad things are fun—otherwise why would he write in Confessions “Lord, grant me chastity—but not yet”? What readers thrill to are the details, the rake’s regress from dens of iniquity, from gambling, drinking, and whoring to some new-found piety.

For Cheever’s Yankee ancestors, the New England Puritans in whose shadow we’ve uneasily dwelled for the past four centuries, “election” was not a matter of personal choice, but rather grace imparted onto the unworthy human. Easy to see some issues of utility here, for when accumulation of wealth is read as evidence of God’s grace, and it’s also emphasized that the individual has no role in his own salvation, the inevitable result is spiritual disenchantment and marginalization. By the middle of the 18th century, some five generations after the first Pilgrim’s slipper graced Plymouth Rock, the Congregationalist pastors of New England attempted to suture the doubts of their flocks, coming up with “half-way covenants” and jeremiads against backsliding so as to preserve God’s bounty.

Into that increasingly secular society would come an English preacher with a thick Gloucester accent named George Whitefield, who first arrived in the New World in 1738. Technically an Anglican priest, Whitefield was a confidant of John Wesley, the father of Methodism, and from that “hot” faith the preacher would draw a new vocabulary, dispelling John Calvin’s chill with the exhortation that sinners must be born again. Crowds of thousands were compelled to repent, for “Come poor, lost, undone sinner, come just as you are to Christ.” On the Eastern seaboard, the Englishman would preach from Salem to Savannah, more than 10,000 times, drawing massive crowds, even impressing that old blasphemer Benjamin Franklin at one Philadelphia revival (the scientist donated money).

Such was the rhetorical style of what’s called the Great Awakening, when colonial Americans abandoned the staid sermons of the previous century in favor of this shaking, quaking, splitting, fitting preaching. Whitefield and Spinoza shared nothing in temperament, and yet one could imagine that the latter might smile at the liberty that “established fractious sectarianism as its essential character,” as John Howard Smith writes in The First Great Awakening: Redefining Religion in British America, 1725-1775. Whitefield welcomed worshippers into a massive tent—conversion as a means towards dignity and agency.

So ecumenical was Whitefield’s evangelization that enslaved people came in droves to his revivals, those in bondage welcomed as subjects in Christ’s kingdom. Such was the esteem in which the reverend was held that upon his passing in 1770 a black poet from Boston named Phillis Wheatley would regard the “happy saint” as a man whom “in strains of eloquence refin’d/[did] Inflame the heart, and captivate the mind.” Whitefield’s religious charity, it should be said, was limited. He bemoaned the mistreatment of the enslaved even as he advocated for the economic benefits of that very institution.

As different as they were, Whitefield and Malcolm X were both children of this strange Zion that allows such reinvention. Malcolm X speaks in a gospel of both American pragmatism and American power, saying that “I’m for truth, no matter who tells it. I’m for justice, no matter who it’s for or against…I am for whoever and whatever benefits humanity as a whole.” Conversion can be a means of seizing power; conversion can be a means of reinvention.

Activist Audre Lorde famously wrote that “The master’s tools will never dismantle the master’s house,” and for a young Harlem ex-con born Malcolm Little, the Christianity of Wheatley and Whitefield would very much seem to be the domain of the plantation’s manor, so that conversion to a slave religion is no conversion at all. Mocking the very pieties of the society that Whitefield preached in, Malcolm X would declare “We didn’t land on Plymouth Rock—Plymouth Rock landed on us.” Malcolm X’s life was an ongoing narrative of conversion, of the desire to transform marginalization into power. As quoted by Alex Haley in The Autobiography of Malcolm X, the political leader said “I have no mercy or compassion in me for a society that will crush people, and then penalize them for not being able to stand up.”

Transformation defined his rejection of Christianity, his membership in the Nation of Islam, and then finally his conversion to orthodox Sunni Islam. Such is true even in the rejection of his surname for the free-floating signifier of “X,” identity transformed into a type of stark, almost algebraic, abstraction. If America is a land of conversion narratives, then The Autobiography of Malcolm X is ironically one of the most American. Though as Saladin Ambar reminds us in Malcolm X at Oxford Union, his “conversion was indeed religious, but it was also political,” with all that that implies.

7. It is a truth universally acknowledged, that an apostate in possession of a brilliant spiritual mind, must be in want of a religion. If none of the religions that already exist will do, then it becomes her prerogative to invent a better one and convert to that. Critic Harold Bloom writes in The American Religion that “the religious imagination, and the American Religion, in its fullest formulations, is judged to be an imaginative triumph.” America has always been the land of religious invention, for when consciences are not compelled, the result is a brilliant multitude of schisms, sects, denominations, cults, and communes. In his Essays, the French Renaissance genius Michel de Montaigne quipped that “Man is certainly stark mad; he cannot make a worm, and yet he makes gods by the dozens.” Who, however, if given the choice between a worm and a god, would ever possibly pick the former? For America is a gene-splicing laboratory of mythology, an in vitro fertilization clinic of faith, and we birth gods by the scores.

Consider Noble Drew Ali, born Timothy Drew in 1886 to former North Carolina slaves who lived amongst the Cherokee. Ali compiled into the Holy Koran of the Moorish Science Temple of America a series of ruminations, meditations, and revelations he had concerning what he called the “Moorish” origins of African-Americans. Drawing freely from Islam, Christianity, Buddhism, Hinduism, and the free-floating occultism popular in 19th-century America, Ali became one of the first founders of an Afrocentric faith in the United States, his movement the original spiritual home to Wallace Fard Muhammad, founder of the Nation of Islam. Ali writes that the “fallen sons and daughters of the Asiatic Nation of North America need to learn to love instead of hate; and to know of their higher self and lower self. This is the uniting of the Holy Koran of Mecca for teaching and instructing all Moorish Americans.”

Ali drew heavily from mystical traditions, combining his own idiosyncratic interpretations of Islam with Freemasonry and Rosicrucianism. Such theurgy was popular in the 19th century, a melancholic era when the almost million dead from Antietam and Gettysburg called out to the living, who responded with séance and Ouija board. Historian Drew Gilpin Faust recounts in This Republic of Suffering: Death and the American Civil War that “Many bereaved Americans…unwilling to wait until their own deaths reunited them with lost kin…turned eagerly to the more immediate promises of spiritualism.” The 19th century saw mass conversions to a type of magic, a pseudo-empirical faith whose sacraments were technological—the photographing of ghostly ectoplasm, or the receipt of telegraphs from beyond the veil of perception.

Spiritualism wasn’t merely a general term for this phenomenon, but the name of an actual organized denomination (one that still exists). Drawing from 18th-century occultists like Emanuel Swedenborg and Franz Mesmer, the first Spiritualists emerged out of the rich soil of upstate New York, the “Burned-Over District” of the Second Great Awakening (sequel to Whitefield’s First). Such beliefs held that the dead were still among us, closer than our very breath, and that spirits could interact with the inert matter of our world, souls intermingled with the very atoms of our being.

Peter Manseau writes in The Apparitionists: A Tale of Phantoms, Fraud, Photography, and the Man Who Captured Lincoln’s Ghost, that “It was a time when rapidly increasing scientific knowledge was regarded not as the enemy of supernatural obsessions, but an encouragement…Electricity had given credence to notions of invisible energies…The telegraph had made communication possible over staggering distances, which raised hopes of receiving messages from the great beyond.”

Among the important founders of the movement were the Fox Sisters of Hydesville, N.Y., three siblings who in 1848 claimed that they’d been contacted by spirits, including one named “Mr. Splitfoot,” who communicated in raps, knocks, and clicks. Decades later, Margaret Fox would admit that it was a hoax, since a “great many people when they hear the rapping imagine at once that the spirits are touching them. It is a very common delusion.” Despite the seeming credulity of the movement’s adherents, Spiritualists were crucial reformers, with leaders like Cora L.V. Scott and Paschal Beverly Randolph embracing abolitionism, temperance, civil rights, suffragism, and labor rights. When the cause is good, perhaps it doesn’t matter which god’s vestments you wear.

And of course the great American convert to a religion of his own devising is Joseph Smith. America’s dizzying diversity of faith confused young Smith, who asked “Who of all these parties are right, and how shall I know?” From the same upstate environs as the Fox Sisters, Smith was weaned on a stew of evangelicalism and occultism, a child of the Second Great Awakening, who in those flinty woods of New York dreamt of finding shining golden tablets left by angels. Writing in No Man Knows My History: The Life of Joseph Smith, scholar Fawn M. Brodie notes that for the New England and New York ancestors of Smith there was a “contempt for the established church which had permeated the Revolution, which had made the federal government completely secular, and which was in the end to divorce the church from the government of every state.”

Smith rather made America itself his invented religion. Stephen Prothero writes in American Jesus: How the Son of God Became a National Hero that there is a tendency of “Americans to make their nation sacred—to view its citizens as God’s chosen people.” Yet it was only Smith’s Mormons who so completely literalized such a view, for the Book of Mormon describes this as “a land which is choice above all other lands.” The effect was electrifying; Brodie writes: “In the New World’s freedom the church had disintegrated, its ceremonies had changed, and its stature had declined.” What remained was a vacuum in which individual minds could dream of new faiths. Spinoza would recognize such independence, his thin face framed by his curled wig, reflected back from the polished glow of one of Moroni’s tablets excavated from the cold ground of Palmyra, N.Y.

8. “In the beginning there was the Tao, and the Tao was God,” reads John 1:1 as translated in the Chinese Union Version bible commissioned by several Protestant denominations between 1890 and 1919. Appropriating the word “Tao” makes an intuitive sense, arguably closer to the Neo-Platonist language of “Logos” as the term is rendered in the koine Greek than to the rather confusing terminology of “the Word” as it’s often translated in English.

Read cynically, this bible could be seen as a disingenuous use of Chinese terminology so as to make Christianity feel less foreign and more inviting, a Western wolf in Mandarin robes. More charitably, such syncretism could be interpreted as an attempt to find the universal core between those two religions, a way of honoring truth regardless of language. Conversion not between faiths, but above them. Perhaps naïve, but such a position might imply that conversion isn’t even a possibility, that all that is needed in the way of ecumenism is to place the right words with the right concepts.

The earliest synthesis between Taoism, Buddhism, Confucianism, and Christianity is traceable to the seventh century. At the Mogao Caves in Dunhuang, Gansu Province, a cache called the Jingjiao Documents, penned during the Tang Dynasty and attributed to the students of a Syrian monk named Alopen, was rediscovered in 1907. Alopen was a representative of that massive eastern branch of Christianity slurred by medieval European Catholics as being “Nestorian,” after the bishop who precipitated their schism at a fifth-century church council (the theological differences are arcane, complicated, and for our purposes unimportant).

During those years of late antiquity, European Christendom was a backwater; before the turn of the first millennium the Catholicos of Baghdad would have been a far more important cleric than the Pope was, for as scholar Philip Jenkins explains in The Lost History of Christianity: The Thousand Year Golden Age of the Church in the Middle East, Africa, and Asia—and How It Died, the “particular shape of Christianity with which we are familiar is a radical departure from what was for well over a millennium the historical norm…For most of its history, Christianity was a tricontinental religion, with powerful representation in Europe, Africa, and Asia.”

In 635, Alopen was an evangelist to a pluralistic civilization whose history went back millennia. His mission was neither colonial nor mercantile, and as a religious scholar he had to make Christianity appealing to a populace content with their beliefs. And so, Alopen converted the Chinese by first converting Christianity. As with the translators of the Chinese Union Version bible, Alopen borrowed Taoist and Buddhist concepts, configuring the Logos of John as the Tao, sin as karma, heaven as nirvana, and Christ as an enlightened Bodhisattva.

Sinologist Martin Palmer, writing in The Jesus Sutras: Rediscovering the Lost Scrolls of Taoist Christianity, argues that Alopen avoided “what many missionaries have tried to do—namely, make people adapt to a Western mind-set.” Rather, Alopen took “seriously the spiritual concerns of China.” Alopen was successful enough that some 150 years after his arrival, a limestone stele was engraved in both Chinese and Syriac celebrating the history of Chinese Christianity. With a massive cross at its top, the Xi’an stele announced itself as a “Memorial of the Propagation in China of the Luminous Religion from Rome.” During a period of anti-Buddhist persecution in the ninth century, when all “foreign” religions were banned, the stele was buried, and by 986 a visiting monk reported that “Christianity is extinct in China.”

Like Smith uncovering his golden tablets, workers in 1625 excavated the Xi’an stele, and recognizing it as Christian, sent for the Jesuits who were then operating as missionaries to the Ming Court. Portuguese priest Alvaro Semedo, known to the court as Xie Wulu, saw the stele as evidence of Christian continuity; other clergy were disturbed that the monument was from a sect that the Church itself had deemed heretical 1,000 years before. German Jesuit polymath Athanasius Kircher supplied a Latin translation of the stele, enthusing in his China Illustrata that the stele’s rediscovery happened by God’s will “at this time when the preaching of the faith by way of the Jesuits pervaded China, so that old and new testimonies…would go forth…and so the truth of the Gospel would be clear to everyone.” But was it so clear, this strange gospel of the Tao?

Much of Kircher’s book was based on his colleague Fr. Matteo Ricci’s accounts of the Ming Court. Ricci had taken to wearing the robes of a Confucian scholar, borrowing from both Confucius and Lao-Tzu in arguing that Catholicism was a form of those older religions. The Dominicans and Franciscans in China were disturbed by these accommodations, and by 1645 (some 35 years after Ricci had died) the Vatican ruled against the Jesuits (though this was a process that went back and forth). Maybe there is something fallacious in simply pretending all religions are secretly the same. Prothero writes in God Is Not One: The Eight Rival Religions That Run the World, that we often have “followed scholars and sages down the rabbit hole into a fantasy world in which all gods are one.” Catholicism is not Taoism, and that preserves the integrity of both.

But Ricci’s attitude was a bold one, and in considering different beliefs, he was arguably a forerunner of pluralistic tolerance. We risk abandoning something beautiful if we reject the unity that Alopen and Ricci worked for, because perhaps there is a flexibility to conversion, a delightful promiscuity to faith. Examining one of the Chinese water-colors of Ricci, resplendent in the heavenly blue silk of the panling lanshan with a regal, heavy, black putou on his head, a Roman inquisitor may have feared who exactly was converting whom.  

9. In the painting, Sir Francis Dashwood—11th Baron le Despencer and Great Britain’s Chancellor of the Exchequer from 1762 to 1763—is depicted as if he were St. Francis of Assisi. Kneeling in brown robes, the aristocrat is a penitent in some rocky grove, a hazy blue-grey sfumato marking the countryside visible through a gap in the stones. In the corner is a silver platter, grapes and cherries tumbled onto the soil of this pastoral chapel, as if to remind the viewer of life’s mutability, “Vanity of vanities” and all the rest of it. Some tome—perhaps the Bible?—lies open slightly beyond the nobleman’s gaze, and with hand to breast, Dashwood contemplates what looks like a crucifix. But something is amiss in this portrait painted by Dashwood’s friend, that great chronicler of 18th-century foibles William Hogarth. The crucifix—it’s not Christ on the cross, but a miniature nude woman with her head thrown back. Suddenly the prurient grin on the stubbly face of Dashwood makes more sense.

If you happen to be an expert on 18th-century French pornography, you might notice that it’s not the gospels that lie open, spine cracked, next to Dashwood, but a copy of Nicolas Chorier’s Elegantiae Latini sermonis; were you familiar with the intricacies of Westminster politics in the 1760s, you might also observe that rather than a golden, crescent halo above the baron’s head, it’s actually a cartoon of the Earl of Sandwich in lunar profile.

Raised in the anti-Catholic environment of British high society, Dashwood saw his disdain for religion incubated during a roguish youth spent on the fashionable Grand Tour of the continent—he was expelled from the Papal States. In the anonymously written 1779 Nocturnal Revels, a two-volume account of prostitution in London, the author claims that Dashwood “on his return to England, thought that a burlesque institution in the name of St. Francis, would mark the absurdity of such Societies; and in lieu of the austerities and abstemiousness there practiced, substitute convivial gaiety, unrestrained hilarity, and social felicity.”

To house his “Franciscans,” Dashwood purchased a former Cistercian abbey in Buckinghamshire that overlooked the Thames, and in dazzling stained glass had inscribed above its entrance the famous slogan from the Abbey of Thelema in Francois Rabelais’s 16th-century classic Gargantua and Pantagruel—“Do What Thou Wilt.” Its grounds were decorated with statues of Dionysus—Julian the Apostate’s revenge—and the gothic novelist (and son of a Prime Minister) Horace Walpole wrote that the “practice was rigorously pagan: Bacchus and Venus were the deities to whom they almost publicly sacrificed; and the nymphs and the hogsheads that were laid in against the festivals of this new church.” Within those gothic stone walls, Dashwood’s compatriots very much did do what they would, replacing sacramental wine with liquor, the host with feasting, and the Mass with their orgies. The Monks of Medmenham Abbey, founded upon a Walpurgis Night in 1752, initiated occasional worshipers including the respected jurist Robert Vansittart, John Montagu, 4th Earl of Sandwich, the physician Benjamin Edward Bates II, the parliamentarian George Bubb Dodington, and in 1758 they hosted a colonial scientist named Benjamin Franklin (fresh from a Whitefield revival no doubt).

Such gatherings were not uncommon among the bored upper classes of European society; Black Masses were popular among French aristocrats into the 17th century, and in Britain punkish dens of obscenity like Dashwood’s were known as “Hell-Fire Clubs.” Evelyn Lord writes in her history The Hellfire Clubs: Sex, Satanism and Secret Societies that long before Dashwood ever convened his monks, London had been “abuzz with rumors of highborn Devil-worshipers who mocked the established Church and religion, and allegedly supped with Satan,” with the apparently non-Satanic members of Parliament pushing for anti-blasphemy legislation.

That’s the thing with blasphemy though—there’s no Black Mass without first the Mass, no Satan without God. Irreverent, impious, and scandalous though Dashwood may have been, such activities paradoxically confirm faith. Lord writes that the “hell-fire clubs represented an enduring fascination with the forbidden fruit offered by the Devil…But the members of these clubs faced a dilemma: if they believed in Satan and hell-fire, did they by implications believe in a supernatural being, called God, and a place called Heaven?” Should the sacred hold no charged power, were relics simply bits of rag and bone, then there would be no electricity in their debasement; were a crucifix meaningless, then there would be no purpose in rendering it pornographic. A blasphemous conversion, it turns out, may just be another type of conversion.

Geoffrey Ashe argues in The Hell-Fire Clubs: Sex, Rakes and Libertines that Thelema is an antinomian ethic that can be traced from Rabelais through the Hell-Fire Clubs down to today. He writes that such a history is “strange and unsettling. It discloses scenes of pleasure and laughter, and also some of the extremist horrors ever conceived. It introduces us to cults of the Natural, the Supernatural; to magic, black and otherwise.” Dashwood’s confraternity encompasses figures as diverse as the Marquis de Sade, the notorious occultist Aleister Crowley (who had Rabelais’s motto carved above the entrance to his own monastery in Sicily), and LSD evangelist Timothy Leary. Fear not the blasphemer, for such is merely a cracked prophet of the Lord. As Master Crowley himself wrote in Aceldama: A Place to Bury Strangers, “I was in the death struggle with self: God and Satan fought for my soul those three long hours. God conquered – now I have only one doubt left—which of the twain was God?”

10. When the Blessed Kateri Tekakwitha, lily of the Mohawks and the sainted maiden of the Iroquois village of Kahnawake, laid her head upon her death-bed one chill spring in 1680, it was said that the disfiguring scars left by the smallpox she’d contracted vanished from her beautiful corpse. There in the dread wilderness of New France, where spring snows fall blue and deep and the horizon is marked with warm smoke from maple longhouses and fallen acorns are soggy under moccasin slippers, America’s indigenous saint would die. A witness recorded that Tekakwitha’s face “suddenly changed about a quarter of an hour after her death, and became in a moment so beautiful.” A fellow nun recorded that the evening of the saint’s death, she heard a loud knock at her door, and Tekakwitha’s voice saying “I’ve come to say good-bye; I’m on my way to heaven.”

Tekakwitha’s short decades were difficult, as they must by necessity be for anyone who becomes a saint. She was a victim of a world collapsing in on itself, of the political, social, economic, and ecological calamities precipitated by the arrival of the very people whose faith she would convert to, one hand holding a bible and a crucifix, the other a gun—all of them covered in the invisible killing virus. Despite it being the religion of the invaders, Tekakwitha had visions of the Virgin and desired conversion, and so she journeyed over frozen Quebec ground to the village of the “Black Robes” who taught that foreign faith.

When Tekakwitha met with the Jesuits, they told the Iroquois woman not of the Tao, nor did they speak of heaven; rather, they chanted a hymn of Karonhià:ke, the realm from which the father of all things did send his only son to die. Of her own accord, Tekakwitha meditated on the words of the Jesuits, her confessor Fr. Cholenec recording that she finally said “I have deliberated enough,” and she willingly went to the baptismal font. She has for the past three centuries been America’s indigenous saint, a symbol of Christ reborn on this land, the woman of two cultures whom William T. Vollmann describes in his novel Fathers and Crows as “Tekakwitha…praying besides the Cross of maple wood she had made.”

Much controversy follows such conversions: are we to read Tekakwitha—who endures as a symbol of syncretism between Christianity and indigenous spirituality—as a victim? As a willing penitent? As some cross between the two? In his novel Beautiful Losers, the Canadian poet, novelist, and songwriter Leonard Cohen says of Tekakwitha that a “saint does not dissolve the chaos.” Tekakwitha is not a dialectic to resolve the contradictions between the Catholic and the Iroquois, the French and the Mohawk. She is not an allegory, a parable, a metaphor, or an example—she is Tekakwitha, a woman.

If we are to draw any allegorizing lesson from her example, it must be this—conversion, like death, is something that is finally done alone. Who are we to parse her reasons for embracing that faith, and how can we fully inhabit the decisions of Julian, or Spinoza, or Hansmann, or Ricci? Nothing can be more intimate, or sometimes more surprising, than the turn of a soul, the conversion of a woman or man. We aren’t known to one another; we’re finally known only to God—though it’s impossible to say which one. When Tekakwitha’s appearance changed, was this an indication of saintliness? Of her true form? Of the beatified face when it looks upon the creator-god Ha-wen-ni-yu? All that can be said of conversion is that it’s never final; we’re always in the process of being changed, and we can pray that it’s possible to alter our broken world in return. Converts, like saints, do not reconcile the chaos; they exist amidst it. In hagiography, we find not solution, but mystery—as sacred and holy as footprints on a virgin Canadian snow, finally to be erased as the day turns to night.


Philosophizing the Grave: Learning to Die with Costica Bradatan

“It was the hemlock that made Socrates great.” —Seneca

“Honorable purpose in life invites honorable purpose in death.” —David Buckel

On an early spring morning in 2018, when the stars were still out and Manhattan glowed in all of its wasteful wattage across the East River, a 60-year-old retired lawyer named David Buckel made his way past the Grand Army Plaza and the Brooklyn Museum down to Prospect Park. In those hours before the dawn, Buckel dug a shallow circle into the black dirt, which investigators believed he made to prevent the spread of the fire he was about to set, having understood that our collective future would have enough burning. The former attorney paused to send an email to The New York Times—it read: “Most humans on the planet now breathe air made unhealthy by fossil fuels, and many die early deaths as a result—my early death by fossil fuel reflects what we are doing to ourselves”—doused himself in gasoline, and lit a match. Buckel was pronounced dead by 6:30 a.m.

As a civil rights attorney, Buckel’s life was one of unqualified success—an early supporter of Lambda Legal, he’d fought doggedly for LGBTQ rights not just in New York, but in seemingly inhospitable places from Nebraska to Iowa. As a human being, his life was defined by companionship and love—raising a daughter with his husband of 34 years alongside the girl’s biological mother and her wife. And as a community member, his life was committed to stewardship and responsibility—rather than using wasteful fossil fuels he walked several miles every day to the Brooklyn Botanic Garden where, in retirement, he organized the center’s composting program. By all accounts Buckel’s life was centered around justice, both personal and ecological, which should go to some length in explaining his death on April 14, though the philosopher Costica Bradatan reminds us that wherever “self-immolators are going they are not asking anyone with them: their deaths are fierce, but remain exclusively their own.”

With some incongruity, I thought about Buckel’s sacrifice as I sat by my apartment complex’s pool on a hot New England summer day (they seem hotter and hotter), welcoming my 35th birthday reading Bradatan’s poignant, provocative, astute, moving, thoughtful Dying for Ideas: The Dangerous Lives of the Philosophers. A philosopher at Texas Tech University, as well as an honorary research professor at Australia’s University of Queensland, Bradatan has in his role as religion editor for The Los Angeles Review of Books consistently proven the utility of philosophical and theological ideas when it comes to the art of living. In Dying for Ideas, Bradatan examines those who sacrificed their lives for ideas through a “purely secular martyrdom” (though one of his subjects was a saint): the deaths of thinkers for whom the nature and impact of their executions “confers a discrete sublimity upon these philosophers’ gestures. It is great to die for God, but it may be greater to die for no God.” Like the women and men Bradatan writes of, from Thomas More awaiting his beheading in the Tower of London’s turret to the anti-Soviet dissident Jan Patočka tortured in a Prague investigation chamber, Buckel had lived his life, and his death, for an idea. In Buckel’s death, Bradatan might argue, we see not just suicide, but an argument about life, where what “we feel toward the person performing [self-immolation]…is in fact a complex mix of fear and respect, of fascination and repulsion, attraction and revulsion, all at once.”

What Dying for Ideas clarifies is that we must not look away from the pyre; we must consider these deaths that often turn out to be the “threshold where history ends and mythology begins,” as Bradatan writes. Buckel attempted a sacrifice similar to those of the Buddhist nuns and monks who immolated themselves during the Vietnam War, people of incomparable bravery and detachment whom Buckel had long admired. In short, the attorney wished to turn his body into a candle “helping others to find the right direction,” as Bradatan might have put it. As we perhaps approach our own collective martyrdom, measuring the rising tide and warmth alike, there is a pressing wisdom that we must grapple with, what the medievals called the Ars moriendi, the art of the good death. Can we notice the warmth of Buckel’s auto-da-fé amidst the escalating temperatures, or do such flames recede into the apocalyptic heat?

Morbid thoughts to have in the sunniness of a Boston June, listening to a “’90s Hits” Spotify playlist. Yet in the hazy days of the Anthropocene, it’s hard not to feel the uncanniness of Buckel’s life and death, lived with authenticity, while the rest of us wait for the ice caps to melt listening to Radiohead and Soundgarden on our smartphones. While taking a break from Bradatan’s astute readings of those martyrs like Socrates, Hypatia, and Giordano Bruno who died not for God or country, but rather for philosophical ideas, I paused to scroll through my Twitter feed, only to come across a Vice article titled “New Report Suggests ‘High Likelihood of Human Civilization Coming to an End’ Starting in 2050.” That’s the year that I’ll be (theoretically) “celebrating” my 66th birthday. If I hadn’t paid attention to why Buckel died when it happened, it would behoove me to take notice now.

Dying for Ideas came out three years before Buckel’s suicide, but the glow of his burning body couldn’t help but give off more light by which to read Bradatan’s book. He writes of a death like Buckel’s, that it “does not always mean the negation of life—sometimes it has the paradoxical capacity of enhancing it, of intensifying it to the point of, yes, breathing new life into life.” As Socrates died for the good, as Hypatia died for reason, as More died for faith, and as Bruno died for magic, so Buckel died for the Earth. When scheduling my reviews and writing projects for the summer, I hadn’t intended to necessarily be reading Dying for Ideas on my 35th birthday, but there’s something appropriate in having Bradatan as my Virgil for the date that Dante described in The Divine Comedy as being “Midway upon the journey of our life/[when] I found myself within a dark forest.” Now the dark wood has been paved over, and all of its pollinating insects are going extinct, so for a member of the millennial generation facing the demise of our entire ecosystem and the possible extinction of humanity, both Buckel’s death and Bradatan’s invocation of those who faced their own demise with magnanimity and wisdom hang humid in the air.

Dying for Ideas embodies the French Renaissance essayist Michel de Montaigne’s conviction that “To philosophize is to learn to die.” Bradatan makes the compelling case that to evaluate philosophers in their entirety, you must ascertain not just the rationality of their arguments, but whether their lives were consistent with their claims—and that for a few singular philosophers, whether their deaths also make their arguments. For philosophers like Bradatan, philosophy is as much a method as a discipline; not merely the realm of the logical syllogism, but something that prepares us for the “perilous journey from the agora to the scaffold or the pyre.” If you’ve ever taken an undergraduate philosophy course, Montaigne’s aphorism might not strike you as accurate, for the Anglo-American academic discipline goes a long way toward perennially proving Henry David Thoreau’s crack that there are a lot of philosophy professors, but precious few philosophers.

In the American and British academy, the vast majority of university departments tend to orient themselves towards what’s called analytical philosophy, a tradition that, rather than engage those eternal philosophical questions concerning the examined life, is content with precise, logical anal retention; pleased to enumerate all of the definitions of the word “is” rather than to question how it is that we should live. Fortunately, Bradatan is a disciple of that other contemporary wing, the continental philosophy of 20th-century France and Germany that is still capable of approaching the discipline as “an act of living…[that] often boils down to…learning how to face death—an art of dying,” as Bradatan puts it. He brings to bear not the rigid, rigorous, logical arguments of analytical philosophers like A.J. Ayer, Saul Kripke, and Willard Van Orman Quine—men who for all their brilliance and importance did little to illuminate the philosophical question of how we are to live—in favor of the impressionistic and poetic truths embodied by thinkers like Simone Weil, Martin Heidegger, and most of all Pierre Hadot.

The last philosopher has never quite achieved the reputation he deserves in the Anglophone world, his philosophy concerning “care for the self” filtered instead through more popular disciples, such as the historian Michel Foucault. In Hadot’s estimation, the pre-Socratic philosophers of ancient Greece approached their thought not as a means of abstract contemplation, or as a way of producing one more publication for the tenure review file, but rather as a solemn, rigorous, unsparingly honest examination and method that served to transform one’s very life. As Hadot wrote in What Is Ancient Philosophy?, there is “No discourse worthy of being called philosophical, that is separated from the philosophical life.” For philosophers in Hadot’s lineage, including Bradatan, to approach philosophy as if it were simply the academic discipline that investigates the history of ideas, or with even more futility as a type of logical game, is to abdicate a solemn responsibility. To reduce philosophy to nothing more than another university department with journals and conferences is to reject philosophy as something that can change your life.

Thus do we honor Socrates, that ugly little gadfly who in acknowledging his ignorance was paradoxically the wisest man in Athens. Socrates is Bradatan’s first martyr-philosopher, the most celebrated of men who died for an idea. Such is the power of the event, the forced drinking of hemlock at the hands of an Athenian state that claimed Socrates had corrupted the youths of the city and preached against her gods. He is not an uncomplicated figure, whether as the detached, calm, almost-absurdly-professorial presence of clean lines and smooth surfaces in French painter Jacques-Louis David’s The Death of Socrates, or in radical journalist I.F. Stone’s The Trial of Socrates, with its revisionist interpretation of the anti-democratic teacher’s death as being in some sense warranted.

What Bradatan does differently is that he reads the death of Socrates itself as an argument to be interpreted—treating the moment of extinction itself as one would a proof. The historical Socrates is himself mute; though he appears in a few scenes of the historian Xenophon, and while he is the beguiling central character in the dialogues of his student Plato, no actual writing by the founder of Western philosophy himself exists. Considering Plato’s treatment of Socrates in dialogues like The Republic, The Symposium, The Crito, and The Apology, Bradatan writes that the teacher’s “voice is authoritative, compelling, commanding, almost annoyingly so. Yet his silences—whenever they occur…these silences can be unbearable. This is Socrates at his most uncanny.” With all of the words of Socrates that we have being mediated through the mouth of another, Bradatan rather returns to his death as the ultimate silence, a masterpiece to be read as rigorously as any dialogue of Plato’s.

In Socrates’s willing execution, Bradatan sees a consistency of purpose and a representation of how the philosopher argued we should live our lives. Had Socrates capitulated to the court, had he admitted to wrong-doing and served a lighter sentence, it would be to invalidate his own teaching. Such consistency of purpose is something that united the martyrs he examines, no matter how different their actual thought may have been. Bradatan writes that “With the spectacle of their dying bodies alone they had to express whatever they could not communicate through all their rhetorical mastery…death was the most effective means of persuasion…such a death is a philosophical work in its own right—sometimes a masterpiece.”

When Bruno was hoisted above the burning green wood of Rome’s Campo de’ Fiori, he abjured the penitential crucifix presented by the tonsured Dominicans, offering back rather a blasphemous stream of invective in his thick Neapolitan accent, the proud heretic to the very end. And from his execution, though Bruno was himself an advocate for magic, occultism, and hermeticism, the Italian Enlightenment would find inspiration against the forces of Inquisition and superstition that had condemned him; his death becoming a far more potent argument than anything he ever actually wrote. More than a millennium before, Hypatia had made a very different argument after Christian zealots under the command of Alexandria’s bishop Cyril grabbed the Neo-Platonist mystic and mathematician, dragged her through the filthy streets of that cosmopolitan city, ultimately stripped her of her clothes, and then cut her very flesh from her body using either sharpened pottery shards or oyster shells depending on the account. Supposedly Hypatia uttered no words or cries. Hypatia’s silence had its own lessons: the silence of a woman in a culture that denied half the population its bodily autonomy, of a rational mystic who thought the material world fallen and an illusion. Though none of Hypatia’s writings survive, she has often been figured as the first feminist, her silent protest even as she was murdered making a powerful argument about the power of obstinate quiet. By contrast, More approached the infinite not with silence, but with wry British humor, the ever ironic and mutable author of Utopia ascending the scaffold and telling his executioner, “See me safe up, as for my coming down, I can shift for myself.”

All of us will rendezvous with the eternal one day, and most philosophy professors will die in their beds. As Bradatan confesses, “No matter how hard we fight, how graceful our dance, how bold our stance, the end is always the same: total annihilation.” But these philosophers were those whose death was an apotheosis, albeit accomplished in different ways: from Socrates’s lecturing to Hypatia’s silence, Bruno’s cursing to More’s joking, all in their varied ways confirming Bradatan’s contention that “the point is not to avoid death but to live without fear and humiliation before it comes.” Were that the only critical intervention of Dying for Ideas, it’d be powerful enough, but Bradatan makes his most important (and audacious) contribution concerning the societal importance of such sacrifice.

Drawing on the work of the French anthropologist René Girard, Bradatan applies what’s called the “scapegoat mechanism,” Girard’s contention that all of human civilization is propelled by the occasional sacrifice of some kind of innocent agent on whom all of the sins of the culture are imparted. This is most obvious in Christianity, as Girard writes of the “non-violent God who willingly becomes a victim in order to free us from our violence” in Evolution and Conversion: Dialogues on the Origins of Culture. As Christ was supposed to be a sacrifice who died for the world, so Bradatan argues that these philosophers’ deaths functioned as sacrifices for the truth, the radical honesty that Foucault calls parrhesia. The philosopher possesses a radical freedom and powerful powerlessness that is perhaps only matched by that of the jester; she has the agency to confront and compel the truth where others are mired in lies and delusions, and in her sacrifice the philosopher dies so that we may light our way out of our ignorance. All of Bradatan’s examples are partisans of parrhesia, and all took leave of this world at moments of profound social and cultural dislocation and crisis. Socrates was sacrificed by a democratic state extricating itself from years of tyranny, Hypatia skinned by a crowd that watched the eclipse of paganism and the ascendancy of Christianity, Bruno immolated upon pyres set by men whose religious certainties were challenged by the scientific revolution, More decapitated as Christendom was fractured by the Reformation, and Patočka tortured by a communist state in which the people no longer had any faith. Bradatan explains that scapegoats are needed in places where “An atmosphere of doom settles in, society is divided, the crisis persists. Everything seems to move in circles, the usual remedies don’t seem to work. Something radical is needed.” What we see in Socrates’s calm disputation or More’s scaffold stand-up act is not simply death, but the sacrifice of parrhesia so that the rest of us may know truth, an example of thinkers whom Bradatan describes as “mystics [who] are God’s traffickers” that “routinely smuggle bits of eternity into our corrupted world.”

So, I return to my phone, skimming accounts of collapsing ice shelves and flooding river banks, reading articles about how humanity may have less than a few decades left, pushed to the brink by the irrational, insatiable hungers of our economic system and its supporting faith. As analysis, Bradatan offers us a claim about how some brave souls die for ideas, how their sacrifices are meant to illuminate those malignancies that threaten a society in collapse. Women and men like Bruno and Hypatia are set apart from the realm of the rest of us; they are, in the original sense of the word, sacred. According to Bradatan, when the sacred “erupts into our lives…its presence is unmistakable, its effects lasting, its memory haunting.” The martyred, these sacred women and men, are separate from us regular folk who are more content to scroll through Twitter by the pool rather than die for an idea. If ever a people needed a holy sacrifice uttering parrhesia on the scaffold, a sacred scapegoat shouting the truth from a pyre, it is our own. In the social media frenzy of orthographically challenged White House tweets and the hermetic reading of Mueller investigation tea leaves, the media moved on from Buckel’s martyrdom in Prospect Park with disturbing promptness—just another horrific story receding deep into our news feeds, his radical honesty drowned out in the static of so much otherwise meaningless information permeating our apocalyptic age. Perhaps it’s time to listen to him, before it’s too late.


Is There a Poet Laureate of the Anthropocene?

“Annihilating all that’s made/To a green thought in a green shade.” -Andrew Marvell, 1681

Sometime in 1612, the genius dramatist, unofficial Poet Laureate, and all-around Falstaffian personality that was the writer Ben Jonson imagined a sort of epic voyage down London’s now-long-lost Fleet River. Jonson’s epic concerned neither the harrowing of hell, nor the loss of Paradise, or at least it didn’t do either in quite the manner that Dante had or that Milton would. Rather, Jonson envisioned the travails of his characters on this industrial Styx as less sacred and more profane, lacking transcendence but making up for it in the sheer fecundity of sewage that floated upon the canal that today flows underneath the bohemian environs of Camden Town, and whose tinkling can be heard through sewer grates in Clerkenwell.

In Jonson’s mock-epic “On the Famous Voyage,” the bucolic and pastoral have been replaced with offal and shit, where “Arses were heard to croak, instead of frogs,” and a thick crust called “Ycleped Mud” composed of excrement and refuse was known to bob like moss on the surface of the water. So disgusting are the noxious fumes from both commerce and latrine that Jonson’s colleague, the poet Sir John Harington, would write in his 1596 A New Discourse of a Stale Subject Called the Metamorphosis of Ajax that such smells “are two of those pains of Hell…and therefore I have endeavored in my poor buildings to avoid those two inconveniences as much as I may.” And so, at his manor of Kelston, he constructed the forerunner of the modern flushing toilet.

Conventions of pastoral poetry often had characters with names like Strephon and Chloe in repose upon halcyon shores; Jonson’s is rather a sort of anti-pastoral, one in keeping with the grime and dirt that increasingly defined his era.  On the Fleet River, “The sinks ran grease, and hair of measled hogs, /The heads, houghs, entrails, and the hides of dogs.” Flowing from the center green of the city out to the Thames, the Fleet was polluted with the garbage of nascent industry, a slick, oily stream; a fetid and beshitted, offal-filled cesspool, made glistening with the rendering of animal fat and the corpses of dogs and cats. Notorious prisons like Ludgate and Newgate were on the banks of that now-subterranean river; plague-ridden slums filled with rural transplants clung to the brown shores of the Fleet. 

In his delightful The Time Traveler’s Guide to Elizabethan England, social historian Ian Mortimer describes the privies that would have been in homes lining rivers like the Fleet: a “twelve-foot shaft of several hundred gallons of decomposing excrement and urine…seeping into the clay, for two or three years.” Mortimer reminds us, however, that though “Noisome smells and noxious fumes are common” in early modern England, this “does not mean that people do not notice them.” Indeed, both the population growth of London as well as the beginnings of mass industry, from leather tanning to wool dying, would have wafted new smells into the nostrils of the English. As Jonson wrote with infernal gleam, “Your Fleet Lane Furies…That, with still-scalding streams, make this place hell.”

Filth had been a topic of literary expression long before Jonson; one need only read Geoffrey Chaucer, Francois Rabelais, or Giovanni Boccaccio to know that poets have long sung not just of heaven, but of the asshole, the piss-bucket, and the privy as well. I’d venture that “On the Famous Voyage” does something a little bit different from the scatological depictions in The Canterbury Tales, however. Because Jonson’s London was so much bigger than Chaucer’s, because it was just beginning to be ensnared in the environmental degradations of industrialization, the scope of the olfactory and hygienic assault is greater than a medieval writer could have imagined. “On the Famous Voyage” is satirical verse, yes; but it’s also an ecological poem.

Almost three centuries before the Romantic poet William Blake would castigate the “dark Satanic Mills” of Britain’s industrial revolution, Jonson gave expression to misgivings about how London was quickly erasing nature in the name of mercantile aspiration. Throughout the 16th century, London expanded from the former sleepy agrarian capital of a sleepy agrarian kingdom into what would soon be the largest city on Earth. Around when Jonson was born, the city’s population was roughly 70,000 people; by the time he wrote “On the Famous Voyage,” it had grown to 200,000. Only a half-century later and London was home to half-a-million women and men. Emily Cockayne writes in Hubbub: Filth, Noise & Stench in England that “London was a wealthy bustling and expanding city, but infrastructural development could not keep pace and parts of the city became increasingly crowded, dirty and noisy.” Jonson didn’t just speak of dirt, he sang about waste; he didn’t just talk of filth, he was revolted by garbage. “On the Famous Voyage” was among the first of what critics call ecopoems, and that’s because it’s an early missive from the beginning of our current geological epoch of the Anthropocene.

That term has become recently trendy in academic discussions of literature, a designation borrowed from geologists and climatologists to clarify the ways in which the Earth has been inextricably altered by humanity. With coinage frequently credited to the Nobel Prize-winning chemist Paul Crutzen, the Anthropocene is supposed to define our current era, when people have (mostly for worse) altered the environment of the world in such a way that we’ve become the dominant actor and force in the acidity of the ocean, the thickness of the ozone layer, the very temperature of the planet. Legal scholar Jedediah Purdy explains in After Nature: A Politics for the Anthropocene that “we have made the world our anthill: the geological layers we are now laying down on the earth’s surface are marked by our chemicals and other industrial emissions, the pollens of our crops, and the absence of the many species we have driven to extinction.”

Scientists disagree on when it’s appropriate to mark the beginnings of the Anthropocene. As the period is most spectacularly defined by anthropogenic climate catastrophe, the Industrial Revolution of the 19th century, with its harnessing of fossil fuels such as coal and oil, would seem an appropriate starting point. Others place the Anthropocene’s creation moment as recently as the Trinity test of the first atomic bomb in 1945, or as long ago as 10 millennia, when agriculture first emerged on the silty banks of the Tigris and Euphrates. At the risk of betraying my own early-modern-minded myopia, a credible case could be made for Jonson’s era as the dawn of the Anthropocene, which would have certain implications for how we read him and his compatriots in our own day, when the United Nations Intergovernmental Panel on Climate Change’s 2018 report concludes that we may have less than a decade to avert the worst results of global warming.

There are social, cultural, technological, and economic reasons for understanding the 16th and 17th centuries as the earliest decades of the Anthropocene. By Jonson’s birth there had already been a century of the Columbian Exchange, whereby the flora and fauna of the western and eastern hemispheres, which had been separated for millions of years, suddenly circumnavigated the world in a flurry of trade that radically altered the planet. Many of the economic and social trends that we associate with modernity—colonialism, globalization, and capitalism—see their start in Jonson’s century. The Renaissance also helps us to understand the interactions between climate and humanity, as the women and men of Jonson’s day were in the midst of what’s been called the “Little Ice Age.” During that period, temperatures plummeted, possibly due to the reforestation of North America brought about by plague and genocide that decimated native populations. Arguably, this process didn’t end until the cumulative effect of the Industrial Revolution’s mass emissions of carbon-dioxide began to warm the planet—obviously an ongoing process. During those years of snow and ice, Europe appeared radically different from the way it does today, as accounts of Tudor fairs upon the frozen Thames or the grey winter paintings of Pieter Bruegel attest. Philipp Blom in Nature’s Mutiny: How the Little Ice Age of the Long Seventeenth Century Transformed the West and Shaped the Present writes that “Climate change…affected everyone. There was no escaping the weather.” What’s crucial to remember is that though the thermometer’s mercury was headed in a different direction than it is today, the weather of Jonson’s day was also shaped profoundly by the affairs of people.

An important result of humanity’s changed relationship to nature—the alteration that defines the Anthropocene—is the emergence of new urban spaces, new relationships between people and place that fundamentally changed the experience of what it means to be an inhabitant of Earth. Something new is happening in “On the Famous Voyage”: Jonson has produced a literature that isn’t just about hygiene (or the lack thereof) but about mass pollution. For such lyrics to be written, the conditions of crowded, filthy, industrialized urbanity were required. “On the Famous Voyage” is about environmental collapse. Though rarely thought of as such, Jonson is an ecopoet at the precise moment in history when we redefined our relationship to nature—for the worse. Which is precisely what calls for a reevaluation not just of Jonson, but of that entire tribe of under-read 17th-century poets whom he influenced and who called themselves the “Tribe of Ben,” posterity remembering them (when it does) as the Cavalier poets: writers like Robert Herrick, John Suckling, Thomas Carew, Richard Lovelace, and, most famous of them, though only occasionally categorized in their company, Andrew Marvell. Editor Miriam K. Starkman writes in her introduction to 17th-Century English Poetry that for the Cavalier poets, “External nature is…[the] most direct referent, source of beauty, joy, and mutability.”

For the Cavaliers, the pastoralism of classical poets like Hesiod and Virgil had much to recommend it. One of their favored genres was the “country-house poem,” where their ecological concerns become apparent. In Jonson’s 1616 “To Penshurst,” he described the manor of Sir Robert Sidney with a language very different from that which he deployed four years earlier in his panegyric to the Fleet River. In “To Penshurst” Jonson extols this estate with its “better marks, of soil, of air, /Of wood, of water,” this “lower land, that to the river bends, /Thy sheep, thy bullocks, kine, and calves do feed;/The middle grounds thy mares and horses breed.” Jonson’s is a rhetoric of Eden; with prelapsarian tongue he describes:

…thy orchard fruit, thy garden flowers,
Fresh as the air, and new as are the hours.
The early cherry, with the later plum,
Fig, grape, and quince, each in his time doth come;
The blushing apricot and woolly peach
Hang on thy walls, that every child may reach.

Such prelapsarian evocation of Eden is a common trope in country-house poems, what Starkman described as “vague overtones of a faded fragrance, a world just lost.” Unlike the Puritan, the Cavalier does not simply mourn paradises lost, but rather preserves a bit of that charged immanence within nature as it is now, allowing for the possibility of transcendence just below surface appearances. What the country-house poem presents is paradise in verse, a lyric crafted by the human mind as surely as a garden is planted by human hands, with the verse itself becoming a type of perfection that you can step into. Consider Jonson’s clear influence in Marvell’s almost-perfect “The Garden,” published in 1681:

Ripe apples drop about my head;
The luscious clusters of the vine
Upon my mouth do crush their wine;
The nectarine and curious peach
Into my hands themselves do reach;
Stumbling on melons as I pass,
Ensnar’d with flow’rs, I fall on grass.

By imagining a world without the fall, poems such as these present us with the possibility of a future where the fall has been reversed, where the exploitation of nature has ceased. “On the Famous Voyage,” with its bawdy, earthy, fecal corporeality, may seem a long distance from Penshurst Place. Yet pastoralism and its discontents are actually part of the same project; depicting nature in its abundance and depicting the exploitation of the environment share a similar ideological underpinning. Starkman explains that for the Cavaliers, there is a “stoical awareness of the tragedy of Nature, the garden of innocence violated by experience.” Whether writing about bucolic orchards or shit-contaminated rivers, whether talking of nature or its violation, what these poems take as their subject is the environment. What their critics might say they lack in ingenuity, the Cavaliers more than make up for in ecological prescience.

Drawing inspiration from Jonson’s verse, the Cavaliers have historically (and with much reductionism) been made to contrast with the other dominant tradition of 17th-century English poetry, the metaphysical school influenced by John Donne, and including George Herbert, Henry Vaughan, and Thomas Traherne. The distinction owes much to Dr. Johnson (no relation to Ben), who in his 1781 Lives of the Most Eminent English Poets described the Cavalier as being concerned with “sprightliness and dignity,” a verse which “endeavors to be gay,” where the poetry is “liberally supplied with… soft images; for beauty is more easily found.” By contrast, Dr. Johnson saw the metaphysicals as writing poetry where the “most heterogeneous ideas are yoked by violence together; nature and art are ransacked for illustrations, comparisons, and allusions.” Something, perhaps, can be seen in the fact that where the Cavaliers were content to observe nature, the metaphysicals mined the environment for metaphors, as if it were a precious non-renewable resource hidden below the broken crust of the world. Poetry such as Donne’s was defined by the so-called “metaphysical conceit,” the deployment of a metaphor that was surprising and novel—Dr. Johnson’s “heterogeneous ideas… yoked by violence together.”

If the Cavaliers were plain-spoken, the metaphysicals were sophisticated; the former literal and physical, the latter metaphorical and spiritual; the first were backward-looking pastoral conservatives, the second forward-looking aesthetic radicals. Then there are the political and sectarian splits of the coming English civil wars, with the Cavaliers (true to their courtly name) associated with High Church religion while fighting on behalf of the Royalist cause. John Stubbs writes in Reprobates: The Cavaliers of the English Civil War that “the cavaliers were elegant gentlemen, chivalrous if sometimes dissipated,” though by contrast their political adversaries “the roundheads were religious and social revolutionaries.” Such a difference could presumably be seen in their writing. Cavalier verse lends itself to the almost pagan imagery of a poem like Herrick’s 1648 “The Argument of His Book,” which indeed could be read as an ars poetica for the entire tradition:

I sing of brooks, of blossoms, birds, and bowers:
Of April, May, of June, and July-flowers.
I sing of Maypoles, hock-carts, wassails, wakes,
Of bride-grooms, brides, and of their bridal-cakes.
I write of youth, of love, and have access
By these, to sing of cleanly-wantonness.

Nobody would mistake that sentiment for a Puritan ethos. Yet there is a simplicity to the traditional division; it implies that the Cavaliers were lacking in Christianity (though Herrick was a priest), or that the metaphysicals were lacking in sensuality—and anybody who has read Donne knows that that’s not the case. Literary historians often still teach the split between those two 17th-century literary traditions as an archetypal and Manichean struggle between abstraction and literalism, metaphysical sophistication and sentimental pastoralism. Despite the crudeness of such a formulation, there is a romanticism in understanding 17th-century poetry as divided between the head of the Puritan and the heart of the Cavalier. Scholar Earl Miner observes in an essay included in the Norton Critical Edition of Ben Jonson and the Cavalier Poets that the Cavalier ideal “reflects many things: a conservative outlook, a response to a social threat, classical recollections, love of a very English way of life, and a new blending of old ideas.”

Dr. Johnson, it should be said, cared not for the metaphysicals; his poetic conservatism and political royalism predisposed him to the Cavaliers, but that preference has not been commonly held for a very long time. For the literary modernists of the early 20th century, the Cavaliers seemed naïve, sentimental, simple, pastoral; the poet T.S. Eliot in his 1921 essay on the metaphysicals argued that “civilization comprehends great variety and complexity… The poet must become more and more comprehensive, more allusive, more indirect, in order to force, to dislocate if necessary, language into his meaning.” For Eliot and others like him, the metaphysicals with their ingenious and sophisticated rhetoric, their complexity and abstraction, were a model to be emulated, and the Cavaliers, well, not so much.

The result is that the Cavaliers have seen a critical eclipse over the course of the past century. The metaphysicals dwelled amongst the stars, but the Cavaliers were content to muck in the dirt, and perhaps to linger on the beauty of the rosebuds while they were there. The ideology of the Cavalier was seen as hopelessly archaic when confronting the complexity of modernity. Not for nothing, the Cavaliers—and their royalist political program—are associated with Maypoles and mummers’ parades, feast-days and carnival, and all the rest of the lackadaisical accoutrements conflated with a Merry Old England swept aside by Puritanism and modern capitalism. The Cavalier is thus a figure of naïve romanticism, Stubbs writing of how “Everyone can picture him…with his lovelocks, his broad hat, his mantle and bucket-topped boots, the basket-handled rapier at his side, a buskin covering his satin doublet.” It’s true that the Cavaliers were often aristocratic (though not always), often royalist (though some like Marvell equivocated with chameleon-like urgency depending on politics), and that their verse could be plain-spoken and conservative (though deceptively so).

But we need not abandon them because of their embrace of a royalist politics; we need not obscure them because they spoke not to Eliot, or because modernists didn’t find their verse sufficiently complicated. To slur the Cavaliers as “conservative” is to perform a political category mistake; it’s to impose the conditions of the present day onto a period where exact parallels are impossible to find. Michael Schmidt writes in Lives of the Poets that the Cavaliers mark the “beginnings of a literary tradition that takes pastoral convention into the actual countryside and finds in the harmony between nature and nurture a civilizing theme.” They have at the core of their ethics an understanding of an inseparable connection between nature and humanity that is almost “pagan in attitude.” For all of their reputation as being steadfastly traditionalist, and as much as their enthusiasms for the Caroline regime strike us as reactionary, the Cavaliers’ embrace of nature does have a radical message, standing as it does in opposition to the environmental exploitation that in our current day takes us to the precipice of complete collapse.

Stubbs described how the “cavalier and the puritan are potent archetypes. The puritan upholds the work ethic and the will to give up pleasure, scourging the soul for flaws. In the cavalier we have the individualist, more attuned to the passing moment and in greater touch with his desires.” How could we not recommend them, the Cavaliers, standing as they did in opposition to positivism, Puritanism, and privatization—forces that threaten to destroy our world—at the precise moment when those forces first emerged? They are often dismissed for lacking seriousness, for their enthusiasm for sport and drink, for their indulgence, foppery, and libertinism; but could we not identify such values as precisely those that should be valorized? Could we not see in their celebration of fairs, feasts, festivals, flora, and fauna a denunciation of work, industry, commerce, and all the rest of the alienated soullessness that now threatens us with ecological collapse?

Now is the precise moment to consider the earliest body of ecopoems ever penned—at the moment when the Anthropocene dawned. There is as much of Extinction Rebellion in Cavalier poetry as there is royalism. In embracing nature, they rejected the Puritanism that threatens our world, and in the process, what emerged was a powerful aesthetic of “Anarchopastoralism.” Schmidt writes that a “long time must pass before an anachronism is released back into time,” but if ever there was a moment to embrace the radical ecopoetics of the Cavaliers and their Anarchopastoralism, it’s in our current warm winter of the late Anthropocene. Too often dismissed by the ruthless individualists of modernism as embarrassing throwbacks engaged in medieval affectations, the Cavaliers actually offered a complex meditation on the relationship of humanity to nature, and how the violation of the latter compels the same for the former.

What the modernists saw as so rightly evocative in metaphysical poetry—the abstraction, the ingenuity, the philosophical sophistication—is arguably the foundation of the very alienation that has so easily separated us from nature; an inadvertent capitulation to the inhuman perspective that treats both people and the environment as mere commodities. This is not to blame the metaphysicals—that would be absurd, and I’m too much in love with the verse of Donne and Herbert to ever countenance such a thing. Besides, I may argue that the Cavaliers are more than just charming, but it’d be a hard claim to count Lovelace the poetic equal of Donne. What the metaphysical poets did accomplish, however, is a certain achievement of abstraction; a product of the age that allowed for mechanistic metaphors for human anatomy, where the French philosopher René Descartes could argue contra all experience that animals are simply little machines. Such a perspective is one that has, not surprisingly, pushed us deeper into the Anthropocene.

To court reductionism once again, it’s the Puritanism that’s so dangerous in the metaphysicals, but we might yet be saved by the paganism in the Cavaliers; we may yet find our proper relationship to what Herrick called the “civil wilderness.” Stubbs writes that the “puritan is more dominant in recent times, present in the astonishing intellectual and physical achievements of the modern era—achieved at crushing human cost.” Might we not find room for the Cavalier then? For theirs is a theology that Starkman described as “under the influences of Neo-Platonism,” a “sensibility…well on its way to secular transcendentalism,” where nature “is divine.”

Perhaps the most crucial, if most subtle, difference between the metaphysicals and the Cavaliers is in their approach towards time, mutability, finality, and death. With a touch of critical eccentricity, I claim that the metaphysicals’ approach to the hereafter is one of memento mori, while the Cavaliers’ is one of carpe diem. The first asks a penitent to remember, while they are alive, that one day they shall be dead; the second is the exhortation that because you shall be dead one day, you must “seize the day” in the present. Memento mori is the aesthetic of spoiled fruit and hourglasses depicted in Dutch vanitas paintings; it’s the winged skull on a Puritan’s grave. Carpe diem, by contrast, is the drained wine-glass, the chicken bone cleared of meat. They are not necessarily mutually exclusive positions, but as aesthetics they differ by giving the metaphysicals a gloss of piety, prayer, and death-obsession; the Cavaliers one of a lusty embrace of the moment. Carpe diem is the convention that allows Herrick to implore virgins to “Gather ye rosebuds while ye may, /Old Time is still a-flying;/And this same flower that smiles today/To-morrow will be dying.”

Though it is often simply read as an injunction to live life to the fullest, Stubbs correctly notes that this poem from Herrick’s 1648 Hesperides is “almost an austere lyric.” For all its lack of the apparent sobriety of memento mori, a poem read as carpe diem is counterintuitively more severe. Without claiming that any of the poets across both traditions were anything other than (mostly) orthodox Christians, the differences between a memento mori and a carpe diem perspective are crucial. While it would seem that the latter would encourage us to live a life that could be wasteful, the opposite is actually true. Without the consolations of eternity, we are to make our lives as fit as possible while we’re actually living them, for when “now to the abyss I pass” (as Marvell wrote in 1651), any continued action becomes an impossibility.

What the ecopoetics of the Cavaliers offer us, in this era where (to repurpose a lyric of Carew), “the winters [are] gone, the earth hath lost/Her snow-whited robes, and now no more the frost/Candies the grasse, or casts an ycie creame/Vpon the silver Lake, or Chrystall streame,” is a type of wisdom. Memento mori may ask us to reflect on the singularity of our own death, yet such a view presupposes a passing moment separating this life from the next, an entrance into eternity where all may be reconciled, all may be answered, all may be saved. But we can’t wait for such a moment of enlightenment, or for saviors other than ourselves to provide entrance into the next scene. Carpe diem, contrary to its reputation, does not necessarily hold such a naïve faith. What “Gather ye rosebuds” reminds us of is not just our own mortality, but that of Arcadia as well. It’s an elegy for a dying world. The Cavalier intuits that the garden is not a symbol for anything higher; the garden is all that we have—and it’s good enough. Now our task is to preserve it.