This Isn’t the Essay’s Title

-

“The test of a first-rate intelligence is the ability to hold two opposed ideas in the mind at the same time, and still retain the ability to function.” —F. Scott Fitzgerald, “The Crack-Up” (1936)

“You’re so vain, you probably think this song is about you.” —Carly Simon, “You’re So Vain” (1972)

On a December morning in 1947, three fellows of Princeton’s Institute for Advanced Study set out for the federal courthouse in Trenton, and the job of making sure that the brilliant but naive logician Kurt Gödel didn’t say something intemperate at his citizenship hearing fell to Albert Einstein. The economist Oskar Morgenstern drove, Einstein rode shotgun, and a nervous Gödel sat in the back. With squibs of low winter light, both wave and particle, dappled across the rattling windows of Morgenstern’s car, Einstein turned back and asked, “Now, Gödel, are you really well prepared for this examination?” There was no doubt that the philosopher had adequately studied; whether it was prudent for him to be fully honest was another issue. Less than two centuries before, the signers of the U.S. Constitution had supposedly crafted a document defined by separation of powers and coequal government, checks and balances, action and reaction. “The science of politics,” wrote Alexander Hamilton in “Federalist No. 9,” “has received great improvement,” though as Gödel discovered, clearly not perfection. With a completism that only a Teutonic logician was capable of, Gödel had carefully read the foundational documents of American political theory—he’d pored over the Federalist Papers and the Constitution—and he’d made an alarming discovery.

It’s believed that while studying Article V, the portion that details the process of amendment, Gödel realized that there was no safeguard against that article itself being amended. Theoretically, a sufficiently powerful political movement with legislative and executive authority could rapidly amend the articles of amendment, so that a potential demagogue would be able to rule by fiat, all while such tyranny remained perfectly constitutional. A paradox at the heart of the Constitution—the document that supposedly guaranteed democracy had rank authoritarianism coiled within it. All three men driving to Trenton had a keen awareness of tyranny; all were refugees from Nazi-controlled Europe; all had found safe haven on the pristine streets of suburban Princeton. After the Anschluss, Gödel was a stateless man, and though raised Protestant he was suspect to the Nazis and forced to emigrate. Gödel, with his wife, departed Vienna on the Trans-Siberian Railway, crossed from Japan to San Francisco, and then took the remainder of his sojourn by train to Princeton. His path had been arduous and he’d earned America, so when Gödel found a paradox at the heart of the Constitution, his desire to rectify it was born from patriotic duty. At the hearing, the judge asked Gödel how it felt to become a citizen of a nation where it was impossible for the government to fall into anti-democratic tyranny. But it could, Gödel told him, and “I can prove it.” Apocryphally, Einstein kicked the logician’s chair and ended that syllogism.

Born in Austria-Hungary, a citizen of Czechoslovakia, Austria, Germany, and finally the United States, Gödel was a man whose very self-definition was mired in incompleteness, contradiction, and unknowability. Where once he had parsed logical positivism among luminaries such as Rudolf Carnap and Moritz Schlick, enjoying apfelstrudel and espresso at the Café Reichsrat on Rathausplatz while they discussed the philosophy of mathematics, Gödel now found himself eating apple pie and drinking weak coffee in the Yankee Doodle Tap Room on Nassau Street—and he was grateful. Gone were the elegant Viennese wedding-cake homes of the Ringstrasse, replaced with Jersey’s clapboard colonials; no more would Gödel debate logic amid the rococo resplendence of the University of Vienna, but at Princeton he was at least across the hall from Einstein. “The Institute was to be a new kind of research center,” writes Ed Regis in Who Got Einstein’s Office? Eccentricity and Genius at the Institute for Advanced Study. “It would have no students, no teachers, and no classes,” the only responsibility being pure thought, so that its fellows could be wholly devoted to theory. Its director J. Robert Oppenheimer (of Manhattan Project fame) called it an “intellectual hotel”; physicist Richard Feynman was less charitable, referring to it as a “lovely house by the woods” for “poor bastards” no longer capable of keeping up. Regardless, it was to be Gödel’s final home, and there was something to that.

Seventeen years before his trip to Trenton, it was at the Café Reichsrat that he presented the discovery with which he’d forever be inextricably connected—Gödel’s incompleteness theorems. In 1930 Gödel irrevocably altered mathematics when he demonstrated that the dream of completism that had dogged deduction since antiquity was only a mirage. “Any consistent formal system,” argues Gödel in his first theorem, “is incomplete… there are statements of the language… which can neither be proved nor disproved.” In other words, no consistent set of axioms rich enough to express arithmetic can prove every true statement within its own system—the rationalist dream of a unified, self-evidently provable system is only so much fantasy. Math, it turns out, will never be depleted, since there can never be a solution to all mathematical problems. In Gödel’s formulation, a system must either sometimes produce falsehoods or sometimes generate unprovable truths, but it can never consistently render only completely provable truths. As the cognitive scientist Douglas Hofstadter explained in his countercultural classic Gödel, Escher, Bach: An Eternal Golden Braid, “Relying on words to lead you to the truth is like relying on an incomplete formal system to lead you to the truth. A formal system will give you some truth, but… a formal system, no matter how powerful—cannot lead to all truths.” In retrospect, the smug certainties of American exceptionalism should have been no match for Gödel, whose scalpel-like mind had already eviscerated mathematics, philosophy, and logic, to say nothing of some dusty parchment once argued over in Philadelphia.

His theorems rest on a variation of what’s known as the “Liar’s Paradox,” which asks what the logical status of a proposition such as “This statement is false” might be. If that sentence is telling the truth, then it must be false, but if it’s false, then it must be true, ad infinitum, in an endless loop. For Gödel, that proposition is amended to “This sentence is not provable,” and his reasoning demonstrates that a sufficiently powerful formal system can’t settle that proposition either way: to prove the statement is to render it false, while to leave it unprovable is to render it true—a truth the system can never reach—yet another grueling loop. As with the Constitution and its paeans to democracy, so must mathematics be rendered perennially useful while still falling short of perfection. The elusiveness of certainty bedeviled Gödel throughout his life; famously paranoid, he was pushed into a scrupulous anxiety by the assassination of his friend Schlick at the hands of a Nazi student in 1936. After the death of his best friend Einstein in 1955 he became increasingly isolated. “Gödel’s sense of intellectual exile deepened,” explains Rebecca Goldstein in Incompleteness: The Proof and Paradox of Kurt Gödel. “The young man in the dapper white suit shriveled into an emaciated man, entombed in a heavy overcoat and scarf even in New Jersey’s hot humid summers, seeing plots everywhere… His profound isolation, even alienation, from his peers provided fertile soil for that rationality run amuck which is paranoia.” When his beloved wife fell ill in 1977, Gödel quit eating, since she could no longer prepare his meals. The ever-logical man whose entire career had demonstrated the fallibility of rationality had concluded that only his wife could be trusted not to poison his food, and so when she was unable to cook, he properly reasoned (by the axioms he had defined) that it made more sense to simply quit eating. When he died, Gödel weighed only 65 pounds.

Gödel’s thought was enmeshed in that orphan of logic that we call paradox. As was Einstein’s—that man who converted time into space and space into time, who explained how energy and mass were the same thing, so that (much to his horror) the apocalyptic false dawn of Hiroshima was the result. Physics in the 20th century had cast off the intuitive coolness of classical mechanics, discovering that contradiction studded the foundation of reality. There was Werner Heisenberg with his uncertainty over the location of individual subatomic particles, Louis de Broglie and the strange combination of wave and particle that explained the behavior of matter, Niels Bohr who understood electrons as if they were smeared across space, and the collapsing wave functions of Erwin Schrödinger, for whom a hypothetical feline could be imagined as simultaneously alive and dead. Science journalist John Gribbin explains in Schrödinger’s Kittens and the Search for Reality: Solving the Quantum Mysteries that contemporary physics is defined by “paradoxical phenomena as photons (particles of light) that can be in two places at the same time, atoms that go two ways at once… [and how] time stands still for a particle moving at light speed.” Western thought has long prized logical consistency, but physics in the 20th century abolished all of that in glorious absurdity, and from those contradictions emerged modernity—the digital revolution, semiconductors, nuclear power, all built on paradox.

The keystone of classical logic is the so-called “Law of Non-Contradiction.” Simply put, something cannot both be and not be what it happens to be simultaneously—or, if symbolic logic is your jam: ¬(p ∧ ¬p), and I promise you that’s the only formula you will see in this essay. Aristotle said that between two contradictory statements one must be correct and the other false—“it will not be possible to be and not to be the same thing,” he writes in the Metaphysics—but the anarchic potential of the paradox greedily desires both truth and its antitheses. Again in the 17th century, the philosopher Gottfried Wilhelm Leibniz tried to succinctly ward off contradiction in his New Essays on Human Understanding when he declared, “Every judgement is either true or false,” and yet paradoxes fill the history of metaphysics like landmines studded across the Western Front. Paradox is the great counter-melody of logic—it is the question of whether an omnipotent God could will Himself unable to do something, and it’s the eye-straining M.C. Escher lithograph “Waterfall,” with its intersecting Penrose triangles showing a stream cascading from an impossible trough. Paradox is the White Queen’s declaration in Lewis Carroll’s Through the Looking-Glass that “sometimes I’ve believed as many as six impossible things before breakfast,” and the Church Father Tertullian’s creedal statement that “I believe it because it is absurd.” The cracked shadow logic of our intellectual tradition, paradox is confident though denounced by philosophers as sham-faced; it is troublesome and not going anywhere. When a statement is made synonymous with its opposite, then traditional notions of propriety are dispelled and the fun can begin. “But one must not think ill of the paradox,” writes Søren Kierkegaard in Philosophical Fragments, “for the paradox is the passion of thought, and the thinker without the paradox is like the lover without passion: a mediocre fellow.”

As a concept, the paradox may have found its intellectual origin on the sunbaked, dusty, scrubby, hilly countryside of Crete. The mythic homeland of the Minotaur, who is man and beast, human and bull, a walking, thinking, raging horned paradox covered in cowhide and imprisoned within the labyrinth. Epimenides, an itinerant philosopher some seven centuries before Christ, supposedly said that “All Cretans are liars” (St. Paul actually quotes this assertion in his epistle to Titus). A version of the aforementioned Liar’s Paradox thus ensues: if Epimenides is telling the truth then he is lying, and if he is lying then he is telling the truth. This class of paradoxes has multiple variations (in the Middle Ages they were known as “insolubles”—the unsolvable). For example, consider two sentences vertically arranged; the upper one reads “The statement below is true” and the lower “The statement above is false,” and again the reader is caught in a maddening feedback loop. Martin Gardner, who for several decades penned the delightful “Mathematical Games” column in Scientific American, asks in Aha! Gotcha: Paradoxes to Puzzle and Delight, “Why does this form of the paradox, in which a sentence talks about itself, make the paradox clearer? Because it eliminates all ambiguity over whether a liar always lies and a truth-teller always tells the truth.” The paradox is a function of language, and in that way is cousin to the tautology, save that where a tautology is always true, a paradox is necessarily both true and false at once.
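If you prefer your insolubles mechanical, the feedback loop can even be simulated. The following minimal sketch (in Python; the function name and step cutoff are illustrative assumptions of mine, not anything from Gardner) assigns truth values according to each sentence’s own rule and shows that no consistent assignment ever emerges:

```python
def liars_loop(max_steps=8):
    """Iterate the two-sentence 'insoluble':
    upper: 'The statement below is true'
    lower: 'The statement above is false'
    Starting from a provisional truth value, apply each
    sentence's rule and look for a stable assignment."""
    upper = True  # provisional assumption about the upper sentence
    for step in range(max_steps):
        lower = upper            # upper's claim: the lower sentence is true
        next_upper = not lower   # lower's claim: the upper sentence is false
        print(f"step {step}: upper={upper}, lower={lower}")
        if next_upper == upper:  # a fixed point would mean consistency
            print("consistent assignment found")
            return
        upper = next_upper
    print("no fixed point: the assignment oscillates without end")

liars_loop()
```

Run it and the truth values flip forever, Gardner’s maddening feedback loop made visible: the paradox admits no stable solution, only oscillation.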

Intrinsic meaning is elusive in all of this, so it would be easy to reject paradox as rank stupidity, but paradoxes provide a crucial service. In paradox, we experience the breakdown of language and of literalism. Whether paradoxes are glitches in how we arrange our words or due to something more intrinsic, they signify a null-space where the regular ways of thinking, of understanding, of writing, no longer hold. Few crafters of the form are as synonymous with paradox as the fifth-century BCE philosopher Zeno of Elea. Consider his famed dichotomy paradox, wherein Zeno concludes that motion itself must be impossible, since the movement from point A to point B always necessitates a halving of distance, forever (and so the destination itself can never be reached). Or his celebrated arrow paradox, which Aristotle explains in the Physics: “If everything when it occupies an equal space is at rest at that instant of time, and if that which is in locomotion is always occupying such a space at any moment, the flying arrow is therefore motionless at that instant of time and at the next instant of time.” And yet the arrow still moves. Roy Sorensen explains in A Brief History of the Paradox that the form “developed from the riddles of Greek folklore” (as with the Sphinx’s famous query in Sophocles’s Oedipus Rex), so that words have always mediated these conundrums, while Anthony Gottlieb writes in The Dream of Reason: A History of Philosophy from the Greeks to the Renaissance that “ingenious paradoxes… try to discredit commonsense views by demonstrating that they lead to unacceptable consequences,” in a gambit as rhetorical as it is analytical. Often connected primarily with mathematics and philosophy, paradox is fundamentally a literary genre, and one ironically (or paradoxically?) associated with the failure of language itself. All of the great authors of paradox—the pre-Socratics, the Zen masters, Jesus Christ—were at their core storytellers; they were writers. Words stretched to incomprehension and narrative unspooled are their fundamental medium. Epimenides’s utterance triggers a collapse of meaning, but where the literal perishes, room is made for the figurative. Paradox is the mother of poetry.

I’d venture that the contradictions of life are the subject of all great literature, but paradoxes appear in more obvious forms, too. “There was only one catch and that was Catch-22,” writes Joseph Heller. The titular regulation of Heller’s Catch-22 concerns the mental state of American pilots flying in the Mediterranean during the Second World War: a man who asks to be grounded on account of insanity has, by showing rational concern for his own life, demonstrated his sanity, while anyone willing to keep flying must clearly be insane, so that either way it’s impossible to avoid fighting. The captain was “moved very deeply by the absolute simplicity of this clause of Catch-22 and let out a respectful whistle.” Because politics is often the collective social function of reductio ad absurdum, political novels make particularly adept use of paradox. George Orwell did something similar in his celebrated (and oft-misinterpreted) novel of dystopian horror 1984, wherein the state apparatus trumpets certain commandments, such as “War is peace. / Freedom is slavery. / Ignorance is strength.” Perhaps such dialectics are the (non-Marxist) socialist Orwell’s parody of Hegelian double-speak, a mockery of that supposed engine of human progress that moves through thesis-antithesis-synthesis. Within paradox there is a certain freedom, the ability to understand that contradiction is an attribute of our complex experience, but when statements are also defined as their opposite, meaning itself can be the casualty. Paradox understood as a means to enlightenment bestows anarchic freedom; paradox understood as an end unto itself is nihilism.

Political absurdities are born out of the inanity of rhetoric and the severity of regulation, but paradox can entangle not just society but the fabric of reality as well. Science fiction is naturally adept at examining the snarls of existential paradox, with time travel a favored theme. Paul Nahin explains in Time Machines: Time Travel in Physics, Metaphysics, and Science Fiction that temporal paradoxes are derived from the simple question of “What might happen if a time traveler changed the past?” This might seem an issue of entirely hermetic concern, save that in contemporary physics neither general relativity nor quantum mechanics precludes time travel (indeed, certain interpretations of those theories downright necessitate it). So even the idea of being able to move freely through past, present, and future has implications for how reality is constituted, whether or not we happen to be the ones stepping out of the tesseract. “The classic change-the-past paradox is, of course, the so-called grandfather paradox,” writes Nahin, explaining that it “poses the question of what happens if an assassin goes back in time and murders his grandfather before his (the time-travelling murderer’s) own father is born.” The grandfather’s murder requires a murderer, but for the murderer in question to be born there is also the requirement that the grandfather not be murdered, so that the murderer is able to travel back in time and kill his ancestor, and again we’re in a strange loop.

Variations exist as far back as the golden age of the pulps, appearing in magazines like Amazing Stories as early as 1929. More recently, Ray Bradbury explored the paradox in “A Sound of Thunder,” which is explicit about the implication that any travel to the past will alter the future in baroque ways: a 21st-century tourist accidentally kills a butterfly in the Cretaceous, leading to the election of an openly fascistic U.S. president millions of years later (though the divergence of parallel universes is often proffered as a means of avoiding such implications). In Bradbury’s estimation, every single thing in history, every event, every incident, is “an exquisite thing,” so that even a butterfly is a “small thing that could upset balances and knock down a line of small dominoes and then big dominoes and then gigantic dominoes all down the years across Time.” This conundrum need not be phrased only in patricidal terms, for what all temporal paradoxes have at their core is an issue of causality—if we imagine that time progresses from past through future, then what happens when those terms get mixed up? How can we possibly understand a past that’s influenced by a future that in turn has been affected by the past?

Again, this is no issue of scholastic quibbling, for though we experience time as moving forward like one of Zeno’s arrows, the physics itself tells us that past, present, and future are constituted in entirely stranger ways. One version of the grandfather paradox involves, rather than grisly murder, the transfer of information from the future to the past; for example, in Tim Powers’s novel The Anubis Gates, a time traveler is stranded in the early 19th century. The character realizes that “I could invent things—the light bulb, the internal combustion engine… flush toilets.” But he abandons this hubris, for “any such tampering might cancel the trip I got here by, or even the circumstances under which my mother and father met.” Many readers will perhaps be aware of temporal paradoxes from Robert Zemeckis’s Back to the Future film trilogy (which makes up in Oedipal sentiment for what it lacks in patricide), notably a scene in which Marty McFly inadvertently introduces Chuck Berry to his own song “Johnny B. Goode.” Ignoring the troubling implication that a suburban white teenager had to somehow teach the Black inventor of rock ‘n’ roll his own music, Back to the Future presents a classic temporal paradox—if McFly first heard “Johnny B. Goode” on Berry’s records, and Berry first heard the song from McFly, then whence did the song actually come? (Perhaps from God.)

St. Augustine asks in the Confessions, “What is time, then? If nobody asks me, I know; but if I were desirous to explain it to one who should ask me, I plainly do not know.” Paradox sprouts from the fertile soil of our own incomprehension, and to its benefit there is virtually nothing that humans truly understand. Time is the oddest thing of all, if we honestly confront the enormity of it. I’m continually surprised that I can’t easily walk into 1992 as if it were a room in my house. No surprise, then, that time and space are so often explored in the literature of paradox. Oxymoron and irony are the milquetoast cousins of paradox, but poetry at its most polished, pristine, and adamantine elevates contradiction into an almost religious principle. Among the 17th-century poets who worked in the tradition of John Donne, paradox was often a central aspect of what critics have called a “metaphysical conceit.” These brilliant, crystalline rhetorical turns are often like Zeno’s paradoxes rendered into verse, expanding and compressing time and space with a dialectical glee. Take the good Dr. Donne, master of both enigma and the erotic, who in his poem “The Good-Morrow” imagined two lovers who have made “one little room an everywhere.” The narrator and the beloved’s bed-chamber—perhaps there is heavy wooden paneling on the wall and a canopy bed near a fireplace burning green wood, a full moon shining through the mottled crown-glass window—is as if a singularity where north, south, east, and west, past, present, and future, are all collapsed into a point. Even more obvious is Donne in “The Paradox,” wherein he writes that “Once I loved and died; and am now become / Mine epitaph and tomb; / Here dead men speak their last, and so do I,” the talking corpse its own absurdity made flesh.

So taken were the 20th-century scholars known as the New Critics with the ingenuity of metaphysical conceits that Cleanth Brooks would argue in his classic The Well Wrought Urn: Studies in the Structure of Poetry that the “language of poetry is the language of paradox.” Donne, Andrew Marvell, George Herbert, and Henry Vaughan used paradox as a theme and a subject—but to write poetry itself is paradoxical. To write fiction is paradoxical. Even to write nonfiction is paradoxical. To write at all is paradoxical. A similar sentiment concerning the representational arts is conveyed in the Belgian surrealist painter René Magritte’s much-parodied 1929 work “The Treachery of Images.” Magritte presents an almost absurdly recognizable smoking pipe, polished to a totemistic brown sheen with a shiny black mouthpiece, so basically obvious that it might as well be from an advertisement, and beneath it he writes in cursive script “Ceci n’est pas une pipe”—“This is not a pipe.” A seemingly blatant contradiction, for what could the words possibly relate to other than the picture directly above them? But as Magritte told an interviewer, “if I had written on my picture ‘This is a pipe,’ I’d have been lying!” For you see, Magritte’s image is not a pipe; it is an image of a pipe. Like Zeno’s paradoxes, what may initially seem to be simple-minded contrarianism, a type of existential trolling if you will, belies a more subtle observation. The philosopher Michel Foucault writes in his slender volume This Is Not a Pipe that “Contradiction could exist only between two statements,” but that in the painting “there is clearly but one, and it cannot be contradictory because the subject of the propositions is a simple demonstrative.” According to Foucault, the picture, though self-referential, is not a paradox in the logical sense of the word. And yet there is an obvious contradiction between the viewer’s experience of the painting and the reality that they’ve not looked upon some carefully carved and polished pipe, but only brown and black oil applied to stretched canvas.

This, then, is the “treachery” of which Magritte speaks, the paradox gestated within that gulf where meaning resides, a valley strung between the-thing-in-itself and the way in which we represent the-thing-in-itself. Writing is in some ways even more treacherous than painting, for at least Magritte’s picture looks like a pipe—other than some calligraphic art, literature appears as nothing so much as abstract squiggles. Moby-Dick is not a whale and Jay Gatsby is not a man. They are less than a picture of a pipe, for we have not even images of them, only ink-stained books and the abject abstraction of mere letters. And yet the paradox is that from that nothingness is generated the most sumptuous something; just as the illusion of painting can trick one into the experience of the concrete, so does the more bizarre phenomenon of the literary imagination make you hallucinate characters generated from the non-figurative alphabet. From this essay, if I’ve done even a somewhat adequate job, you’ve hopefully been able to envision Gödel and Einstein bundled into a car on the road to Trenton, windows frosted with nervous breath and laughter, the sun rising over the wooded Pine Barrens—or to imagine John and Anne Donne bundled together under an exquisite blanket of red and yellow and blue and green, the heavy oak door of their chamber closed tight against the English frost—but of course you’ve seen no such thing. You’ve only skimmed through your phone while sitting on the toilet, or toggled back and forth between open tabs on your laptop. Literature is paradoxical because it necessitates the invention of entire realities out of the basest nothing; the treachery of representation is that “This is not a pipe” is a principle that applies to absolutely all of the written word, and yet when we read a novel or a poem we can smell the burning tobacco.

All of literature is a great enigma, a riddle, a paradox: what the Zen masters of Japanese Buddhism call a koan. Religion is too often maligned for being haunted by the hobgoblin straw-man of consistency, and yet the only real faith is one mired in contradiction, and few practices embrace paradox quite like Zen. Central to Zen is the breaking down of the dualities that separate all of us from absolute being, the distinction between the I and the not-I. As a means to do this, Zen masters deploy the enigmatic stories, puzzles, sayings, and paradoxes of the koan, with the goal of forcing the initiate toward the para-logical, a catalyst for the instantaneous enlightenment known as satori. The koan is sometimes reduced to the “What is the sound of one hand clapping?” variety of puzzle (though that is indeed a venerable koan), but the Zen scholar D.T. Suzuki explains in An Introduction to Zen Buddhism that these apparently “paradoxical statements are not artificialities contrived to hide themselves behind a screen of obscurity; but simply because the human tongue is not an adequate organ for expressing the deepest truth of Zen, the latter cannot be made the subject of logical exposition; they are to be experienced in the inmost soul when they become for the first time intelligible.” A classic koan, attributed to the ninth-century Chinese monk Linji Yixuan, famously says, “If you meet the Buddha, kill him.” Linji’s point is similar to Magritte’s—“This is not the Buddha.” It’s a warning about falling into the trap of representation, of refusing to resist the treachery of images, and yet the paradox is that the only way we have of communicating is through the fallible, inexact medium of words. Zen is the only religion whose purpose is to overcome religion, and everything else for that matter. It asks us to use its paradoxes as a ladder by which we can climb toward ultimate being—and then we’re to kick that ladder over. In its own strange way, literature is the ultimate koan: all of these novels and plays, poems and essays, all words, words, words meaning nothing and signifying everything, gesturing towards a Truth beyond truth, and yet nothing but artfully arranged lies (and even less than that, simply arrayed squiggles on a screen). To read is to court a type of enlightenment, of transcendence, and not just because of the questions literature raises, but because of literature’s very existence in the first place.

Humans are themselves the greatest of paradoxes: someone who is kind can harbor flashes of rage, the cruelest of people are capable of genuine empathy, our greatest pains often lead to salvation, and we’re sometimes condemned by that which we love. In a famous 1817 letter to his brothers, the English Romantic poet John Keats extolled the most sublime of literature’s abilities, that of dwelling in “uncertainties, mysteries, doubts, without any irritable reaching after fact and reason,” a quality that he called “negative capability.” There’s an irony in our present abandonment of nuance, for ours is a paradoxical epoch through and through—an era of unparalleled technological superiority and appalling barbarity, of instantaneous knowledge and virtually no wisdom. Ours is a Manichean age as well, one that valorizes consistency above all other virtues, though it is that most suburban of values; yet Keats understood that if we’re to give any credit to literature, and for that matter any credit to people, we must be comfortable with complexity and contradiction. Negative capability is what separates the moral from the merely didactic. In all of our baroque complexity, paradox is the operative mode of literature, the only rhetorical gambit commensurate with displaying the full spectrum of what it means to be a human. We are all such glorious enigmas—creatures of finite dimension and infinite worth. None of us deserves grace, and yet all of us are worthy of it, a moral paradox that makes us beautiful not in spite of its cankered reality, but because of it. The greatest of paradoxes is that within that contradictory form there is the possibility of genuine freedom—of liberation.

Image Credit: Wikipedia

Circles of the Damned

-

Maybe during this broiling summer you’ve seen the footage—in one striking video, women and men stand dazed on a boat sailing away from the Greek island of Evia, watching as ochre flames consume their homes in the otherwise dark night. Similar hellish scenes are unfolding in Algeria, Tunisia, and Libya, as well as in Turkey and Spain. Siberia, an unlikely place for such a conflagration, is currently experiencing the largest wildfire in recorded history, joined by large portions of Canada. As California burns, the global nature of our immolation is underscored by horrific news around the world, a demonstration of the United Nations’ Intergovernmental Panel on Climate Change’s conclusion that such disasters are “unequivocally” anthropogenic, the report signaling a “code red” for the continuation of civilization. On Facebook, the mayor of a town in Calabria mourned that “We are losing our history, our identity is turning to ashes, our soul is burning,” and though he was writing specifically about the fires raging in southern Italy, it’s a diagnosis for a dying world as well.

Seven centuries ago, another Italian wrote in The Divine Comedy, “Abandon all hope ye who enter here,” which seems just as applicable in 2021 as it did in 1321. That exiled Florentine had similar visions of conflagration, describing how “Above that plain of sand, distended flakes / of fire showered down; their fall was slow— / as snow descends on alps when no wind blows… when fires fell, / intact and to the ground.” This September sees the 700th anniversary of both the completion of The Divine Comedy and the death of its author, Dante Alighieri. But despite the chasms of history that separate us, his writing about how “the never-ending heat descended” holds a striking resonance. During our supposedly secular age, the singe of the inferno feels hotter now that we’ve pushed our planet to the verge of apocalyptic collapse. Dante, you must understand, is ever applicable in our years of plague and despair, tyranny and treachery.

People are familiar with The Divine Comedy’s tropes even if they’re unfamiliar with Dante, because all of it—the flames and sulphur, the mutilations and the shrieks, the circles of the damned and the punishments fitted to the sin, the descent into subterranean perdition and the demonic cacophony—finds its origin with him. Neither Reformation nor revolution has dispelled the noxious fumes of the inferno. There must be a distinction between the triumphalist claim that Dante says something vital about the human condition and the objective fact that in some ways Dante actually invented the human condition (or a version of it). When watching the videos of people escaping from Evia, it took me several minutes to understand what it was that I was looking at, and yet those nightmares have long existed in our culture, for Dante gave us a potent vocabulary to describe Hell centuries ago. Even to deny Hell is to deny something first completely imagined by a medieval Florentine.

Son of a prominent family, enmeshed in conflicts between the papacy and the Holy Roman Empire, a respected though ultimately exiled citizen, Dante resided far from the shores of modernity, though the broad contours of his poem, with its visceral conjurations of the afterlife, are worth repeating. “Midway upon the journey of our life / I found myself within a forest dark,” Dante famously begins. At the age of 35, Dante descends into the underworld, guided by the ancient Roman poet Virgil and sustained by thoughts of his platonic love for the lady Beatrice. That Inferno constitutes only the first third of The Divine Comedy—subsequent sections consider Purgatory and Heaven—and yet remains the most read speaks to something cursedly intrinsic in us. The poet descends like Orpheus, Ulysses, and Christ before him deep into the underworld, journeying through nine concentric circles, each more brutal than the previous. Perdition is a space of “sighs and lamentations and loud cries” filled with “Strange utterances, horrible pronouncements, / accents of anger, words of suffering, / and voices shrill and faint, and beating hands,” all buffeted “forever through that turbid, timeless air, / like sand that eddies when a whirlwind swirls.” Cosmology is indistinguishable from ethics, so that each circle is dedicated to particular sins: the second circle is reserved for crimes of lust, the third for those of gluttony, the fourth for greed; the wrathful reside in the fifth circle, the sixth is the domain of the heretics, the seventh is for the violent, all those guilty of fraud live in the eighth, and at the very bottom that first rebel Satan is eternally punished alongside all traitors.

Though The Divine Comedy couldn’t help but reflect the concerns of Dante’s century, he still formulated a poetics of damnation so tangible and disturbing that it remains the very measure of hellishness, as when he “saw one / Rent from the chin to where one breaks wind. / Between his legs were hanging down his entrails; / His heart was visible, and the dismal sack / That makes excrement of what is eaten.” Lest it be assumed that this is simply sadism, Dante is cognizant of how gluttony, envy, lust, wrath, sloth, covetousness, and pride could just as easily reserve him a space. That is part of his genius: Dante doesn’t just describe Hell, which in its intensity provides an unparalleled expression of pain, but he also manifests a poetry of justice, one in which he’s willing to implicate himself (even while placing several of his own enemies within the circles of the damned).

No doubt the tortures meted out—being boiled alive for all eternity, forever swept up in a whirlwind, or masticated within the freezing mouth of Satan—are monstrous. The poet doesn’t disagree; often he expresses empathy for the condemned. But the disquiet that we moderns might feel is in part born out of a broad theological shift that occurred over the centuries in how people thought about sin. During the Reformation, both Catholics and Protestants began to move the model of sin away from the Seven Deadly Sins and toward the more straightforward Ten Commandments. To be sure, there was nothing new about the Decalogue, and the Seven Deadly Sins haven’t exactly gone anywhere, but what took hold—even subconsciously—was a sense that sins could be reduced to a list of literal injunctions. Don’t commit adultery, don’t steal, don’t murder. Because we often think of sin as simply a matter of broken rules, the psychological acuity of Dante can be obscured. But the Seven Deadly Sins are rather more complicated—we all have to eat, but when does it become gluttony? We all have to rest, but when is that sloth?

An interpretative brilliance of the Seven Deadly Sins is that they explain how an excess of otherwise necessary human impulses can pervert us. Every human must eat; most desire physical love; we all need the regeneration of rest—but when we slide into gluttony, lust, sloth, and so on, it can feel as if we’re sliding into the slime that Dante describes. More than a crime, sin is a mental state that causes pain, both within the person who is guilty and in those who suffer because of that person’s actions. In Dante’s portrayal of the stomach-dropping, queasy, nauseous, never-ending uncertainty of the damned’s lives, the poet conveys a bit of their inner predicament. The Divine Comedy isn’t some punitive manual, a puritan’s little book of punishments. Rather than a catalogue of which tortures match which crimes, Dante’s poem expresses what sin feels like. Historian Jeffrey Burton Russell describes in Lucifer: The Devil in the Middle Ages how in hell “we are weighted down by sin and stupidity… we sink downward and inward… narrowly confined and stuffy, our eyes gummed shut and our vision turned within ourselves, drawn down, heavy, closed off from reality, bound by ourselves to ourselves, shut in and shut off… angry, hating, and isolated.”

If such pain were only experienced by the guilty, that would be one thing, but sin affects all within the human community who suffer as a result of pride, greed, wrath, and so on. There is a reason why the Seven Deadly Sins are what they are. In a world of finite resources, to valorize the self above all others is to take food from the mouths of the hungry, to hoard wealth that could be distributed to the needy, to claim vengeance as one’s own when it is properly the purview of society, to enjoy recreation upon the fruits of somebody else’s labor, to reduce another human being to a mere body, to covet more than you need, or to see yourself as primary and all others as expendable. Metaphor is the poem’s currency, and what’s more real than the intricacies of how organs are pulled from orifices is how sin—that disconnect between the divine and humanity—is experienced. You don’t need to believe in a literal hell—I don’t—to see what’s radical in Dante’s vision. What Inferno offers isn’t just grotesque descriptions, increasingly familiar though they may be on our warming planet, but also a model of thinking about responsibility to each other in a connected world.

Such is the feeling of the anonymous authors who write as the Invisible Committee, whose anarchist manifesto The Coming Insurrection—a surprise hit when published in France—opined that “No bonds are innocent” in our capitalist era, for “We are already situated within the collapse of a civilization,” structuring their tract around the circles of Dante’s inferno. Since its composition, The Divine Comedy has run like a molten vein through culture both rarefied and popular, from being considered by T.S. Eliot, Samuel Beckett, Primo Levi, and Derek Walcott to being referenced in horror films, comic books, and rock music. Allusion is one thing, but what we can see with our eyes is another—as novelist Francine Prose writes in The Guardian, those images of people fleeing from Greek wildfires are “as if Dante filmed the Inferno on his iPhone.” For centuries artists have mined Inferno for raw materials, but now in the sweltering days of the Anthropocene we are enacting it. To note that our present appears as a place where Hell has been pulled up from the bowels of the earth is a superficial observation, for though Dante presciently gives us a sense of what perdition feels like, he crucially also provided a means to identify the wicked.

Denizens of the nine circles were condemned because they worshiped the self over everybody else; now the rugged individualism that is the heretical ethos of our age has made man-made apocalypse probable. ExxonMobil drills for petroleum in the heating Arctic and the apocalyptic QAnon cult proliferates across the empty chambers of Facebook and Twitter; civil wars are fought in the Congo over the tin, tungsten, and gold in the circuit boards of the Android you’re reading this essay with; children in Vietnam and Malaysia sew the clothing we buy at The Gap and Old Navy; and the simplest request for people to wear masks so as to protect the vulnerable goes unheeded in the name of “freedom,” while our American Midas Jeff Bezos barely flies to outer space as workers in Amazon warehouses are forced to piss in bottles rather than be granted breaks. Responsibility for our predicament is unequally distributed: those in the lowest circle are the ones who belched out carbon dioxide for profit knowing full well the effects, those who promoted a culture of expendable consumerism and valorized the rich at the expense of the poor. Late capitalism’s operative commandment is to pretend that all seven of the deadly sins are virtues. Literary scholar R.W.B. Lewis describes the “Dantean principle that individuals cannot lead a truly good life unless they belong to a good society,” which means that all of us are in a lot of trouble. Right now, the future looks a lot less like paradise and more like inferno. Dante writes that “He listens well who takes notes.” Time to pay attention.

Bonus Link: Is There a Poet Laureate of the Anthropocene?

Image Credit: Wikipedia

Who’s There?: Every Story Is a Ghost Story

-

“One need not be a chamber to be haunted.” —Emily Dickinson

Drown Memorial Hall was only a decade old when it was converted into a field hospital for students stricken with the flu in the autumn of 1918. A stolid, grey building of three stories and a basement, Drown Hall sits halfway up South Mountain, where it looks over the Lehigh Valley to the federal portico of the white-washed Moravian Central Church across the river and the hulking, rusting ruins of Bethlehem Steel a few blocks away. It is composed of stone the choppy texture of the North Atlantic in the hour before a squall, its yellow windows buffeted by mountain hawks and grey Pennsylvania skies. Built in honor of Lehigh University’s fourth president, a mustachioed Victorian chemistry professor, Drown was intended as a facility for leisure, exercise, and socialization, housing (among other luxuries) bowling alleys and chess rooms. Catherine Drinker Bowen enthused in her 1924 History of Lehigh University that Drown exuded “dignity and, at the same time, a certain at-home-ness to every function held there,” and that the building “carries with it a flavor and spice which makes the hotel or country club hospitality seem thin, flat and unprofitable.” If Drown was a monument to youthful exuberance, innocent pluck, and boyish charm, then by the height of the pandemic it had become a cenotaph to cytokine storms. Only a few months after basketballs and Chuck Taylors would have skidded across its gymnasium floor, those same men lay on cots hoping not to succumb to the illness. Twelve men would die of the influenza in Drown.

After its stint as a hospital, Drown would return to being a student center, then the Business Department, and by the turn of our century the English Department. It was in that final incarnation that I got to know Drown a decade ago, when I was working on my PhD. Toward the end of my first year, I had to go to my office in the dusk after hours, when lurid orange light breaks through the cragged and twisted branches of still-leafless trees in the cold spring, looking like nothing so much as jaundiced fingers twisting the black bars of a broken cage, or the spindly embers of a church’s burnt roof, fires still crackling through the collapsing wood. I had to print a seminar paper for a class on 19th-century literature, and then quickly adjourn to my preferred bar. When I keyed into the locked building, it was empty, silent save for the eerie neon hum of the never-used vending machines and the unnatural pooling luminescence of perennially flickering fluorescent lights in the stairwells at either end of the central hall. While I was in a basement computer lab, I suddenly heard a burst of noise upstairs travel from one end of the hall and rapidly progress towards the other—the unmistakable sound of young men dribbling a basketball. Telling myself that it must be the young children of one of the department’s professors, I shakily ascended. As soon as I got to the top, the noise ceased. The lights were out. The building was still empty. Never has an obese man rolled down that hill quite as quickly as I did in the spring of 2011.

There are several rational explanations—students studying in one of the classrooms even after security would have otherwise locked up, or perhaps the sound did come from some faculty kids (though to my knowledge nobody was raising adolescents at that time). Maybe something was settling strangely, concrete shifting oddly or water rushing quickly through a pipe (as if I didn’t know the difference between a basketball game and a toilet flushing). Even depleted of all explanations, I know what I heard and what it sounded like, and I still have no idea what it was. Nor is this the only ghost story that I could recount. There was the autumn of 2003 when, walking back at 2 a.m. after the close of the library at Washington and Jefferson College, feet unsteady on slicked brown leaves blanketing the frosted sidewalk, I noted an unnatural purple light emanating from a half-basement window of Macmillan Hall, built in 1793 (having been the encampment of Alexander Hamilton during the Whiskey Rebellion) and the oldest university building west of the Alleghenies. A few seconds after observing the shining, I heard a high-pitched, unnatural, inhuman banshee scream—some kind of poltergeist cry—and being substantially thinner in that year I was able to book it quickly back to my dorm. Or in 2007, while I was living in Scotland, when I toured the cavernous subterranean vaults underneath the South Bridge between the Old and New towns of Edinburgh, and I saw a young chav, who had decided to make obscene hand gestures within a cairn that the tour guide assured us had “evil trapped within it,” later break down as if he were being assaulted by some unseen specter. Then there was the antebellum farmhouse in the Shenandoah Valley that an ex-girlfriend lived in, one room being so perennially cold and eerie that nobody who visited ever wanted to spend more than a few minutes in it. A haunted space in a haunted land, where something more elemental than intellect screams at you that something cruel happened there.

Paranormal tales are popular, even among those who’d never deign to believe in something like a poltergeist, because they speak to the ineffable that we feel in those arm-hair-raised, scalp-shrinking, goose-bumped moments when we can’t quite fully explain what we felt, or heard, or saw. I might not actually believe in ghosts, but when I hear the dribbling of a basketball down an empty and dark hall, I’m not going to stick around to find out what it is. No solitary person is ever fully a skeptic when they’re alone in a haunted house. Count me on the side of science journalist Mary Roach, who in Spook: Science Tackles the Afterlife writes, “I guess I believe that not everything we humans encounter in our lives can be neatly and convincingly tucked away inside the orderly cabinetry of science. Certainly, most things can… but not all. I believe in the possibility of something more.” Haunting is by definition ambiguous—if we could say with any certainty that the supernatural was real, it would, I suppose, simply be the natural.

The Franco-Bulgarian philosopher Tzvetan Todorov formulated a critical model of the supernatural in his study The Fantastic: A Structural Approach to a Literary Genre, in which he argued that stories about unseen realms could be divided between the “uncanny” and the “marvelous.” The former are narrative elements that can ultimately be explained rationally, i.e., supernatural plot points that prove to be dreams, hallucinations, drug trips, hoaxes, illusions, or anything unmasked by the Scooby-Doo gang. The latter are things that are actually occult, supernatural, divine. When it’s unclear whether a given incident in a story is uncanny or marvelous, then it’s in that in-between space of the fantastic, which is where any ghostly experience must honestly be categorized. “The fantastic is that hesitation experienced by a person who knows only the laws of nature, confronting an apparently supernatural event,” writes Todorov, and that is a succinct description of my Drown anomaly. Were it simply uncanny, then I suppose my spectral fears would have been assuaged if upon my ascent I had found a group of living young men playing impromptu pick-up basketball. For the experience to be marvelous, I’d have to know beyond any doubt that what I heard were actual spirits. As it is, it’s the uncertainty of the whole event—the strange, spooky, surreal ambiguity—that makes the incident fantastic. “What I’m after is proof,” writes Roach. “Or evidence, anyway—evidence that some form of disembodied consciousness persists when the body closes up shop. Or doesn’t persist.” I’ve got no proof or disproof either, only the distant memory of sweaty palms and a racing heart.

Ghosts may haunt chambers, but they also haunt books; they might float through the halls of Drown, but they even more fully possess the volumes that line that building’s shelves. Traditional ghosts animate literature from the canon to the penny dreadful, including what the Victorian critic Matthew Arnold grandiosely termed the “best which has been thought and said” as well as lurid paperbacks with their garish covers. We’re so obsessed with something seen just beyond the field of vision, vibrating at a frequency that human ears can’t quite detect—from medieval Danish courts to the Overlook Hotel, from Hill House to the bedroom of Ebenezer Scrooge—that we’re perhaps liable to wonder if there is something to ghostly existence. After all, places are haunted, lives are haunted, stories are haunted. Such is the nature of ghosts; we may overlook their presence, their flitting and meandering through the pages of our canonical literature, but they’re there all the same (for a place can be haunted whether you notice it or not).

How often do we forget that the greatest work in the language is basically a ghost story? William Shakespeare’s Hamlet is fundamentally a good old-fashioned yarn about a haunted house (in addition to being a revenge tragedy and a pirate tale). The famed soliloquy of the Danish prince dominates our cultural imagination, but the most cutting bit of poetry is the eerie line that begins the play: “Who’s there?” Like any good supernatural tale, Hamlet begins in confusion and disorientation, as the sentries Marcellus and Bernardo patrol Elsinore’s ramparts where they will first espy the silent ghost of the murdered king, the latter uttering the shaky two-word interrogative. Can you imagine being in the audience sometime around 1600, when it was a widespread belief that there are more things in heaven and earth than can be dreamt of in our philosophies, and hearing the quivering question asked in the darkness, faces illuminated by tallow candle, the sense that there is something just beyond our experience that has come from beyond? The status of Hamlet’s ghost is ambiguous; some critics have interpreted the specter as a product of the prince’s madness, others claim that the spirit is clearly real. Such uncertainty speaks to what’s fantastic about the ghost, as ambiguity haunts the play. Notice that Bernardo doesn’t ask “What’s there?” His question is phrased towards a personality with agency, even as the immaterial spirit of Hamlet’s dead father exists in some shadow-realm between life and death.

A ghost’s status was no trifling issue—it got to the core of salvation and damnation. Protestants didn’t believe that souls could wander the earth; the dead would either be rewarded in heaven or punished in hell, so ghosts must necessarily be demons. Yet Shakespeare’s play seems to make clear that Hamlet’s father has indeed returned, perhaps as an inhabitant of that way station known as purgatory, that antechamber to eternity whereby the ghost can ascend from the Bardo to skulk around Elsinore for the space of a prologue. Of course, when Shakespeare wrote Hamlet, ostensibly good Protestant that he was, he should have held no faith in purgatory, that abode of ghosts being in large part what drove Luther to nail his theses to the door of Wittenberg’s Castle Church. When the final Thirty-Nine Articles of the Church of England were ratified in 1571 (three decades before the play’s premiere), it was Article 22 that declared belief in purgatory a “thing vainly invented, and grounded upon no warranty of scripture; but rather repugnant to the word of God.” According to Stephen Greenblatt’s argument in Hamlet in Purgatory, the ghost isn’t merely Hamlet’s father, but also a haunting from the not-so-distant Catholic past, which the official settlement had supposedly stripped away along with rood screens and bejeweled altars. Elsinore’s haunting is not just that of King Hamlet’s ghost, but also of those past remnants that the reformers were unable to completely bury. Greenblatt writes that for Shakespeare purgatory “was a piece of poetry” drawn from a “cultural artery” whereby the author had released a “startling rush of vital energy.” There is a different set of ambiguities at play in Hamlet, not least of which is how this “spirit of health or goblin damned” is to be situated between orthodoxy and heresy. In asking “who” the ghost is, Bernardo is simultaneously asking what it is, where it comes from, and how such a thing can exist. So simple, so understated, so arresting is the first line of Hamlet that I’m apt to say that Bernardo’s question is the great concern of all supernatural literature, if not all literature. Within Hamlet there is a tension between the idea of survival and extinction, for though the prince calls death the “undiscovered country from whose bourn no traveler returns,” he himself must know that’s not quite right. After all, his own father came back from the dead (a role that Shakespeare is said to have played himself).

Shakespeare’s ghoul is less ambiguous than those of Charles Dickens, for the ghost of Jacob Marley who visits Ebenezer Scrooge in A Christmas Carol is accused of simply being an “undigested bit of beef, a blot of mustard, a crumb of cheese, a fragment of underdone potato. There’s more of gravy than of grave about you, whatever you are!” Note that the querying nature of the final clause ends in an exclamation rather than a question mark, and that there’s no asking who somebody is; now it’s only what they are. Because A Christmas Carol has been filtered through George C. Scott, Patrick Stewart, Bill Murray, and the Muppets, there is a tendency to forget just how terrifying Dickens’s novella actually is. The ultimately repentant Scrooge and his visitations from a trinity of moralistic specters offer up visions of justice that are gothic in their capacity to unsettle: the neuter sprite that is the Ghost of Christmas Past with its holly and summer flowers; the hail-fellow-well-met Bacchus that is the Ghost of Christmas Present; and the grim memento mori visage of the reaper that is the Ghost of Christmas Yet to Come (not to mention Marley padlocked and in fetters). This is Dickens in the mold of Dante, for whom haunting is its own form of retribution, a means of purging us of our iniquities and allowing for redemption. Andrew Smith writes in The Ghost Story 1840-1920: A Cultural History that “Dickens’s major contribution to the development of the ghost story lies in how he employs allegory in order to encode wider issues relating to history, money, and identity.” Morality itself—the awesome responsibility impinging on us every second of existence—can be a haunting. The literal haunting of Scrooge is more uncertain, for perhaps the specters are gestated from madness, hallucination, nightmare, or, as he initially claims, indigestion. Dickens’s ghosts are ambiguous—as they always must be—but the didactic sentiment of A Christmas Carol can’t be.

Nothing is more haunting than history, especially a wicked one, and few tales are as cruel as that of the United States. Gild the national narrative all that we want, American triumphalism is a psychological coping mechanism. This country, born out of colonialism, genocide, slavery, is a massive haunted burial ground, and we all know that graveyards are where ghosts dwell. As Leslie Fiedler explained in Love and Death in the American Novel, the nation itself is a “gothic fiction, nonrealistic and negative, sadist and melodramatic—a literature of darkness and the grotesque.” America is haunted by the weight of its injustice; on this continent are the traces of the Pequot and Abenaki, Mohawk and Mohegan, Apache and Navajo whom the settlers murdered; in this country are the filaments of women and men held in bondage for three centuries, and from every tree hang those murdered by our American monstrosity. That so many Americans willfully turn away from this—the Faustian bargain that demands acquiescence—speaks not to the absence of haunting; to the contrary, it speaks of how we live among the possessed still, a nation of demoniacs. William Faulkner’s observation in Requiem for a Nun that “The past is never dead. It’s not even past” isn’t any less accurate for being so omnipresent. No author conveyed the sheer depth and extent of American haunting quite like Toni Morrison, who for all that she accomplished must also be categorized among the greatest authors of ghost stories. To ascribe such a genre to a novel like Morrison’s Beloved is to acknowledge that the most accurate depictions of our national trauma have to be horror stories if they’re to tell the truth. “Anything dead coming back to life hurts”—there’s both wisdom and warning in Morrison’s adage.

Beloved’s plot is as chilling as an autumnal wind off the Ohio River: the story of the formerly enslaved woman Sethe, whose Cincinnati home is haunted by the ghost of her murdered child, sacrificed in the years before the Civil War to prevent her being returned to bondage in Kentucky. Canonical literature and mythology have explored the cruel incomprehensibility of infanticide—think of Euripides’s Medea—but Sethe’s desire, hardly irrational, to send her “babies where they would be safe” is why Beloved’s tragedy is so difficult to contemplate. When a mysterious young woman named Beloved arrives, Sethe becomes convinced that the girl is the spirit of her murdered child. Ann Hostetler, in her essay from the collection Toni Morrison: Memory and Meaning, writes that Beloved’s ghost “was disconcerting to many readers who expected some form of social or historical realism as they encountered the book for the first time.” She argues, however, that the “representation of history as the return of the repressed…is also a modernist strategy,” whereby “loss, betrayal, and trauma is something that must be exorcized from the psyche before healing can take place.” Just because we might believe ourselves to be done with ghosts doesn’t mean that ghosts are done with us. Phantoms so often function as the allegorical because whether or not specters are real, haunting very much is. We’re haunted by the past, we’re haunted by trauma, we’re haunted by history. “This is not a story to pass on,” Morrison writes, but she understands better than anyone that we can’t help but pass it on.

Historical trauma is more occluded in Stephen King’s The Shining, though that hasn’t stopped exegetes from interpreting the novel about homicide in an off-season Colorado resort as being concerned with the dispossession of Native Americans, particularly in the version of the story rendered by director Stanley Kubrick. The Overlook Hotel is built upon a Native American burial ground, and Navajo and Apache wall hangings are scattered throughout the resort. Such conjectures about The Shining are explored with delighted aplomb in director Rodney Ascher’s documentary Room 237 (named after the most haunted place in the Overlook), but as a literary-critical question, there’s much that’s problematic in asking what any given novel, or poem, or movie is actually about; an analysis is on much firmer ground when we’re concerned with how a text works (a notably different issue). So, without discounting the hypothesis that The Shining is concerned with the genocide of indigenous Americans, the narrative itself tells a more straightforward story of haunting, as a bevy of spirits drives the blocked, recovering alcoholic writer and aspiring family destroyer Jack Torrance insane. Kubrick’s adaptation is iconic—Jack Nicholson as Torrance (with whom he shares a first name) breaking through a door with an axe; his son, young Danny Torrance, escaping through a nightmarish, frozen hedge-maze; the crone in room 237 coming out of the bathtub; the blood pouring from the elevators; the ghostly roaring ’20s speakeasy with its chilling bartender; and whatever the man in the bear costume was. Also, the twins.

Still, it could be observed that the only substantiated supernatural phenomenon is the titular “Shining” that afflicts both Danny and the Overlook’s gentle cook, Dick Hallorann. “A lot of folks, they got a little bit of shine to them,” Dick explains. “They don’t even know it. But they always seem to show up with flowers when their wives are feelin blue with the monthlies, they do good on school tests they don’t even study for, they got a good idea how people are feelin as soon as they walk into a room.” As hyper-empathy, the shining makes it possible that Danny is merely privy to the psychotic breaks his father is having rather than to visions that are actually real. While Jack descends further and further into madness, the status of the spectral beings’ existence remains ambiguous (a point not everyone agrees on, however). It’s been noted that in the film, the appearance of a ghost is always accompanied by that of a mirror, so that The Shining’s hauntings are really manifestations of Jack’s fractured psyche. Narrowly violating my own warning concerning the question of “about,” I’ll note how much of The Shining is concerned with Jack’s alcoholism, the ghostly bartender a psychic avatar of all that the writer has refused to face. Not just one of the greatest ghost stories of the 20th century, The Shining is also one of the great novels of addiction, an exploration of how we can be possessed by our own deficiencies. Mirrors can be just as haunted as houses. Notably, when King wrote The Shining, he was in the midst of his own full-blown alcoholism, so strung out that he barely remembers writing some of his doorstopper novels (he’s now been sober for more than 30 years). As he notes in The Shining, “We sometimes need to create unreal monsters and bogies to stand in for all the things we fear in our real lives.”

If Hamlet, A Christmas Carol, Beloved, and The Shining report on apparitions, then there are novels, plays, and poems that are imagined by their creators as themselves being chambers haunted by something in between life and death. Such a conceit offers an even more clear-eyed assessment of what’s so unsettling about literature—this medium, this force, this power—capable of affecting what we think, and see, and say as if we ourselves were possessed. As a trope, haunted books literalize a profound truth about the written word, and uneasily push us towards acknowledging the innate spookiness of language, where the simplest of declarations is synonymous with incantation. Robert W. Chambers’s collection of short stories The King in Yellow conjures one of the most terrifying examples of a haunted text, wherein an imaginary play that shares the title of the book is capable of driving its audience to pure madness. “Strange is the night where black stars rise, / And strange moons circle through the skies,” reads verse from the play; innocuous, if eerie, though it’s in the subtlety that the demons get you. Chambers would go on to influence H.P. Lovecraft, who conceived of his own haunted book in the form of the celebrated grimoire the Necronomicon, which he explains was “Composed by Abdul Alhazred, a mad poet of Sanaa in Yemen, who was said to have flourished during the period of the Ommiade caliphs, circa 700 A.D.,” and who rendered into his secret book the dark knowledge of the elder gods who were responsible for his being “seized by an invisible monster in broad daylight and devoured horribly before a large number of fright-frozen witnesses.” Despite the Necronomicon’s fictionality, a multitude of occultists have claimed over the years that Lovecraft’s haunted volume is based in reality (you can even buy purported editions online).

Then there are the works themselves which are haunted. Shakespeare’s Macbeth is the most notorious example, its themes of witchcraft long lending it an infernal air, with superstitious directors and actors calling it the “Scottish play” in lieu of its actual title, lest some of the spells within bring ruin to a production. Similar rumors dog Shakespeare’s contemporary Christopher Marlowe and his play Doctor Faustus, with a tradition holding that the incantations offered upon the stage summoned actual demons. With less notoriety, the tradition of “book curses” was a foolproof way to guard against the theft of the written word—a practice that dates back as far as the Babylonians, but that reached its apogee during the Middle Ages, when scribes would affix to manuscript colophons warnings about what should befall an unscrupulous thief. “Whoever steals this Book of Prayer/May he be ripped apart by swine, /His heart splintered, this I swear, /And his body dragged along the Rhine,” writes Simon Vostre in his 1502 Book of Hours. To curse a book is perhaps different from a stereotypical haunting, yet both of these phenomena, not-of-this-world as they are, assume disembodied consciousness as manifest among otherwise inert matter; the curse is a way of extending yourself and your influence beyond your demise. It’s to make yourself a ghost, and it worked to keep those books complete. “If you ripped out a page, you were going to die in agony. You didn’t want to take the chance,” Marc Drogin dryly notes in Anathema! Medieval Scribes and the History of Book Curses.

Not all writing is cursed, but surely all of it is haunted. Literature is a catacomb of past readers, past writers, past books. Traces of those who are responsible for creation linger among the words on a page; Shakespeare can’t hear us, but we can still hear him (and don’t ghosts wander through those estate houses upon the moors unaware that they’ve died?). Disenchantment has supposedly been our lot since Luther, or Newton, or Darwin chased the ghosts away, leaving behind this perfect mechanistic clockwork universe with no need for superfluous hauntings. Yet like Hamlet’s father returned from a purgatory that we’re not supposed to believe in, we’re unwilling to acknowledge the specters right in front of us. Of all of the forms of expression that humanity has worked with—painting, music, sculpture—literature is the eeriest. Poetry and fiction are both incantation and conjuration, the spinning of specters and the invoking of ghosts; it is very literally listening to somebody who isn’t there, and might not have been for a long while. All writing is occult, because it’s the creation of something from ether, and magic is simply a way of acknowledging that—a linguistic practice, an attitude, a critical method more than a body of spells. We should be disquieted by literature; we should be unnerved. Most of all, we should be moved by the sheer incandescent amazement that such things as fiction, and poetry, and performance are real. Every single volume upon the shelves of Drown, every book in an office, every thumbed and underlined play sitting on a desk, is more haunted than that building. Reader, if you seek enchantments, turn to any printed page. If you look for a ghost, hear my voice in your own head.

Bonus Link:
Binding the Ghost: On the Physicality of Literature

Image Credit: Flickr/Kevin Dooley

Agentless Agency: On Submitting to Lit Journals

-

1.
Last month, I came across a sentence in Cal Newport’s A World Without Email that took my breath away: Outsource what you don’t do well. Newport describes how one entrepreneur’s decision to hire a part-time assistant swiftly drove up the startup’s efficiency and the entrepreneur’s satisfaction with his job. I put down the book and watched two dogs wrestle in my neighbor’s yard. Newport’s dictum had sparked an idea that seemed so scandalous, so alluring, so taboo, that it might just work…

What if I could find the funds in my teaching salary to hire a writing assistant for a few hours a week? Namely, someone to submit my stories and essays for me?

Creating work has never been an issue. I began composing short stories and poems as a kid, majored in creative writing in college, and attended an MFA program, where I largely worked on fiction. Yet I rarely submitted the stories that I’d spent months and years polishing.

Professors urged us to submit regularly, to create Excel spreadsheets, to amass rejections and keep going. But the whole system felt so opaque and unrewarding: you submitted a story, waited for months on end, then received a polite form rejection, if that. Every now and then a personal note would come through, suggesting that you send something else. Discouraged by the rejections, I rarely did.

When I managed to actually publish work, the path to success seemed difficult to repeat. In one instance, a college professor kindly nominated me for an Emerging Writers issue. Afterward, an editor at a tiny lit mag reached out, urging me to submit a story. I did so, and somehow the piece wound up being selected as an O. Henry Prize story that year. All of it—the professor’s nomination, the solicitation, the prize—came down to incredible, unthinkable luck, and no real work on my part, aside from writing and editing the piece itself.

When it came time to look for an agent, a friend offered to put me in touch with his. She eventually agreed to shop my story collection around, but didn’t get any bites. And since the agent regularly misspelled my name in emails, I figured I wasn’t her first priority. Then another friend from grad school, who had since begun agenting, reached out and offered to represent me. I said yes, and she recommended I try my hand at a novel before selling the story collection. I wrote the novel in a couple years, she sold it, and I wound up with a generous book deal and a great editor, even if the novel itself didn’t sell very well.

I assumed that this method would continue for the rest of my career: I’d write another novel, she’d sell it, basta. But none of my drafts seemed to satisfy her, and after five years, we parted ways. Now, on my own (cue the Les Mis soundtrack), with 20 years of writing and publishing under my belt, I still feel squeamish when it comes to submissions. I’ve begun writing more nonfiction, and have had some luck placing personal essays, although this, too, feels scattershot.

Meanwhile, there’s a groaning file labeled WRITING on my computer that contains, I swear, dozens of standalone pieces—poems, short stories, flash fiction and nonfiction, essays, novel drafts, a memoir—all of which silently rebuke me whenever I open Microsoft Word. I’m proud of that work. I think most of it holds up (even if, skimming an old short story the other day, I realized I’d need to substitute a Spotify mix for a character’s “CD-burning”).

Agentless, as the majority of writers are, how do we find our own agency? My fiancé, Alejandro, ironically, is exactly where I was 10 years ago: poised to finish and publish his first novel. He has done well with submissions: an American Short Fiction prize two years ago turned into a Best American prize last year. He seems less fazed by the whole slush pile prospect: as I type this, he’s in his office next door, shortening a short story for a Guernica submission. Is it his scrappy, thick-skinned approach (he applied to the Michener Program four years in a row before an acceptance) or is he innocent of an industry weariness that my 20 years in the biz has conferred, like a professional tennis player’s sore shoulder?

2.
I love reading business and productivity books because they’re reassuringly matter of fact. But Newport’s suggestion to outsource your headaches is complicated when it comes to submitting creative writing. How can I instruct someone on how to submit my work if I don’t have a reliable process in place? Should I hire a marketer? A college student? A virtual assistant? A freelance publicist?

Ideally, I would hand my teeming file of writing to a deeply organized soul who would go to town organizing it, strategizing about where to submit, and then send work out like mad, using my cover letters. I could offer bonuses for work that was accepted, along with a fair hourly wage. But with such an enormous lag time between submitting and hearing back, and with acceptance rates so low, it’s hard to create an appealing incentive. And the prospect of sacrificing therapy sessions for a publishing assistant seems dubious, to say the least.

Another one of my favorite productivity gurus, Greg McKeown, whose latest tome, Effortless, I devoured in the way I no longer devour novels (see: industry weariness), suggests asking yourself these questions when approaching a thorny task: How could this be easy? And: How am I making this too complicated?

3.
After finishing Newport’s book, I spent a full week trying to come up with a job description for a writing assistant before deciding I probably just need to do the work. Last week, during a lull from teaching responsibilities, I decided I would look over old pieces and edit them in the morning, and then send each story out to five places in the afternoon. Simple, right? Log into Submittable, copy and paste the cover letter, attach the short story file, basta! (Sadly, anytime my plans end with basta, it’s usually a sign that they’re not going to work out.)

Sitting down at my laptop to submit again reminded me why I always avoided it. Trying to figure out if a magazine is in a reading period. Trying to scout out the appropriate editor on the masthead. (Alejandro, scandalously, told me that he just addresses his letters to an anonymous Editor. Ballsy.) Trying to decide what my list of publications should be. Do I attempt a college admissions approach, with reaches and safeties? But if I’m sending out a bunch of work over the course of several weeks, including several short stories, how to choose which magazine should receive what?

Barf.

4.
For a long time after my divorce, six years ago, I refused to date online. I didn’t want to go through the drama of meeting people who wouldn’t work out. I wanted connection to happen naturally, in the real world. Unfortunately, this meant I jumped at every odd encounter that occasionally crossed my path, just to prove to myself that this organic method was serving me well.

When I finally took the plunge and signed up for dating apps, it took six months of good, shitty, and largely underwhelming dates before meeting Alejandro. I was his first Bumble date, go figure. I told you he was lucky when it came to submissions. And now that I think about it, he totally lured me by touting that recent ASF short story prize in his dating profile, as if I were another magazine editor instead of a romantic prospect.

But maybe there’s something to thinking about submitting as a kind of matchmaking for my creative work rather than as a test of its fundamental worth. Scrolling through lit mags the way I once swiped through faces and profiles. We’re told to go for the most selective publications first, but maybe looking for the friendliest and most intriguing journals would be a more enjoyable prospect. Over the years, my writing has gotten more experimental, and prospective publishers for later work will likely look much different than publishers for work from my 20s and early 30s, just as my romantic partners have changed along with shifts in my personality and my priorities.

5.
The first definition of “submission,” according to Oxford Languages, is “the action or fact of accepting or yielding to a superior force or to the will or authority of another person.” Part of why I’ve avoided submitting in the past is that it always makes me feel so powerless, so… submissive. But perhaps submitting is also about yielding to the truth that my work isn’t for everybody, just as my style of clothing (I’m newly obsessed with vests) or taste in music (’90s country forever!) is off-putting to some.

Okay. New plan. I’m going to approach submissions as an online dating adventure for my writing, and see if I can set my pieces up on some alluring blind dates. After all, it’s way better to imagine my story sipping wine at a candlelit Italian restaurant than drowning in a “slush pile.” First step: submit this essay on submitting.

Image Credit: Pixabay

FOMO, but for Books

-

We were getting ready to go to the community pool last weekend, packing all the things we needed: towels, sunblock, water, change of clothes, etc. My husband glanced in the bag to double-check everything and then casually asked if there was a reason I’d packed two books and a magazine.

“To read,” I told him.

He looked at me. We have two children, one eight and one three. The three-year-old cannot swim. The eight-year-old can and requires an audience. There is perhaps a 10 minute window when I might be able to read uninterrupted. And yet I had to bring those books. Because…what if I did have time? And what if, when I got there, I just wasn’t in the mood to finish Elizabeth Kolbert’s Under a White Sky? (But what if I was?) What if, instead, I wanted to dig into Daniel Okrent’s history of prohibition, Last Call, which was due at the library very soon. Or what if I was in the mood for fiction, and I felt like reading the literary magazine that had just come in the mail?

“You never read at the pool,” he said. “You stay in the water and swim.”

This is true. My husband is the one who reads near bodies of water. He also has a realistic grasp on the number of hours in a day. He always brings one book with him on vacation, and he reads that one book. Sometimes he brings a book he’s already read, to guarantee that he will like it. This summer, he’s been bringing The Sun Also Rises to the pool, a novel he’s read at least three times before. It’s a great summer read. I love the part where they go fishing and have a picnic and keep the wine bottles cold in the stream. Just thinking about that scene makes me want to read that book again.

Sometimes I think I have FOMO, but for books. It’s particularly acute in the summer. When I go on vacation, I always take too many books with me. On my first big trip with my husband—before we had children—I packed five novels and then bought magazines at the airport. I read the magazines on the plane and the novels languished in my bag. There was no time to read. We were traveling around Spain, walking and eating and talking. And I knew that would be the case. Yet I packed the books. In my mind, we were traveling to a place where we would somehow have time to see all the sights and also relax for several hours every morning, and to read books. A place with 30-hour days.  

I ask myself where this fantasy comes from and I think—as with so many things—it goes back to childhood. When I was 10 years old, my family moved from New Hampshire to western Maryland. It was the year of Brood X, and I remember the thick whine of the cicada song in the air when we arrived. I was bewildered by the humid weather and I didn’t know anyone. The kids down the street invited me to Vacation Bible School, so I went there, where I read the Bible and learned the Lord’s Prayer while my mother unpacked. When that was over, my mother took me to the library and told me to get out as many books as I wanted. When we got home, she gave me a glass of lemonade and the foldable wing chair and told me to find a spot outside in the shade to read—preferably a place where she could see me from her office window.

And so began a summer ritual that lasted through middle school. I loved how unrestricted summer reading was. You could read as much as you wanted, and whatever you wanted—there was no one interrupting you to do homework in other subjects or to get ready for soccer practice. You could read more than one book at once, dipping into one and then another and back again. You could skim over the sections you didn’t like; nobody was going to quiz you. You could read books you were too young for—hello, John Irving!—and books you were too old for—hello, Anne of Green Gables for the 10th time! You could read comics and sci-fi and celebrity biographies alongside the classics. You could go ahead and read the books that you might be assigned in high school—Ethan Frome, The Catcher in the Rye, Catch-22, The Adventures of Huckleberry Finn, The Stranger—because what if you weren’t assigned them and then you didn’t get to read them?  

Henry James once wrote, “Summer afternoon, summer afternoon—to me those have always been the two most beautiful words in the English language.” I’ve always assumed this quote refers to reading. I doubt James was hanging clothes out on the line or preparing dinner. He definitely wasn’t sitting by a pool and rating a cannonball jump on a scale of one to five at the request of a small child. I don’t begrudge James his summers, but I do assume that he, unlike most people I know, was able to hold onto them well into adulthood. My leisurely afternoons began to disappear midway through high school, when I started to work in the summers, and were gone by my 20s—though I did have more empty afternoons for reading in my 20s than I do now. But in my 20s, the Internet began to encroach on my time, and I also began to read with greater purpose—to learn the craft of writing or to gain knowledge to think critically about a particular issue. Of course, I read to be entertained, too, and to be absorbed in another person’s way of seeing things. That has never gone away. But the haphazard reading of my childhood summers is gone, and sometimes I think I’m chasing it, when I pack too many books.

Then again, maybe they’ll return to me, sooner than I realize. Just last week, our local library branch finally opened for browsing. My son headed to the children’s area and returned, 20 minutes later, with a tall stack. My pile was more modest—just two books. But as we walked home together and my son chatted about what he would read first, I took vicarious pleasure in his excitement. I wish I could say that I read alongside him, but the truth is I cleaned up, attended to his younger sister, made lunch, and then finished up some of my own work. But summer isn’t over yet, and with a little planning, I may yet sneak in a few afternoons.

Bonus Links:
A Summer Reading List for Wretched Assholes Who Prefer to Wallow in Someone Else’s Misery
Alternate Routes: A Summer Reading Itinerary
The Problem with Summer Reading

Image Credit: Pixabay

Silent All These Years: On Annie Dillard

-

1.
Some years ago, I attended a conference featuring boldface names and their thoughts on the topic of the essay as art. At 39, I’d written three failed novels, and essays felt like the last form left to me. I was desperate for tips, tricks, and whatever writerly chum they throw to audiences at events like these.

“An essay,” said Philip Lopate on the day of the conference, “is an invitation to think alongside me.”

I jotted his words in a Moleskine notebook and have been turning them over in my head and on the page ever since. The best essays are trips to terra nova, yes; but at heart, all essays depend on a simple sense of camaraderie. From the first word to the last, the writer of an essay is a guide, even if the piece never gets out of first gear. Each essay is a fellowship.

By Lopate’s definition, there’s no better essayist than Annie Dillard. Her thoughts go places no one else can see. Following in her path, you can sip the cold fire of eternity, cheat death in a stunt plane, or trace God’s name in sand, salt, or cloud. She didn’t invent the essay. Her most famous work isn’t classified as an essay. But in the cosmos of essayists, there’s Annie Dillard, and there’s everyone else.

2.
It doesn’t cost more than a couple clicks to get the complete text for Dillard’s piece on witnessing a solar eclipse. If you haven’t read it, you should; if you’ve read it before, reread it. “Total Eclipse” was published 40 years ago, but it’s still wild. I read it for the first time one winter night. I had a good hunch the moon would slip between sun and Earth in the narrative. But I wasn’t prepared for a lunatic opera. By the end of the piece I was like, wait, wait, who is this woman?

Despite being a septuagenarian who cut her teeth on theodicy, Annie Dillard is practically a Twitter hashtag. She gets regular hat tip tweets, often quotations from her first prose book, Pilgrim at Tinker Creek, or from The Writing Life, her slim but wise book on craft. Writers admire her sleek writing and crisp turns of phrase, but plenty of her fans just love love love her without knowing how to explain why. I’m not sure I can explain why, either; there are too many good reasons.

For one thing, her prose is sharp as a chert blade. Don’t know what an ancient Solutrean chert blade is? Neither did I, until I read For the Time Being, her last book of nonfiction. “Hold one of these chert blades to the sky,” she writes, describing an ancient knife made of chiseled stone so fine that “it passes light.” She continues:
At its very edge the blade dissolves into the universe at large. It ends imperceptibly at an atom. Each one of these delicate, absurd objects takes hundreds of separate blows to fashion. At each stroke and each pressure flake, the brittle chert might – and, by the record, very often did – snap.
Here’s another reason to love her. She teaches you things. Not showy facts to prove her smarts. Not cheap trivia any sixth grader with a library card could tell you. She delivers the goods for questions you didn’t or wouldn’t have the tenacity to resolve. In a retrospective on her career, The New York Times cited her understanding that “the mundane itself — snails, fireplaces, shrubs, pebbles, socks, minor witticisms — is secretly amazing.”

This March, my nine-year-old asked how the frogs in the park could survive if their pond froze over. I don’t really know, I said. But Dillard knows. She did the work and put the answer right there on page 47 of the 2013 reissue of Pilgrim at Tinker Creek. She’s not showing off when she tells you about how frogs survive the winter by burrowing in mud and breathing through their skin.  She’s just demonstrating that it’s possible to unriddle an unknown, if you put in the work.

Despite all the love you can find for Annie Dillard, despite her perch in the pantheon of great writers, there’s something off-kilter in the way people talk about her.

On its face, this is absurd. She’s part of the canon. She received a National Humanities Medal. She has written books that will outlive her. So what then is it that bothers me about how we talk about Annie Dillard? Why do I feel like there is something we’ve already begun to lose with respect to her?

A few years ago, no less a writer than Geoff Dyer attempted to position Dillard as a single star in the firmament of writers. He surveys her career in search of an answer to the question of what kind of writer she is exactly. He is uniquely gifted at indirectly saying what can’t be said about geniuses and grand artists (as his book on D.H. Lawrence demonstrates). Near the end of his Dillard piece, Dyer triangulates her position with respect to other writers he admires. Then he stops. I don’t want to say he fails. But the final product feels incomplete, as if he saw the summit but never quite reached it.

Such difficulty in classifying Dillard is not unusual. You see it in casual profiles but also in scholarly essays and surveys of her work. Pick an article or two at random and you’ll likely see what I mean. In a CrossCurrents piece on ecotheology from 1995, Dillard is called a mystic, a contemplative, an exegete, a theologian, an ecological guru—and that’s literally just the first page.

In the Chicago Tribune’s 1999 review of For the Time Being, the reviewer rhapsodizes over the book’s sestina-like structure, calling it a “new form.” And it really is sui generis (for my money, it is her finest work), as she weaves a narrative that goes seven times around a weft of eight concepts: birth, sand, China, clouds, numbers, thinker, evil, and now. The Tribune’s reviewer marvels over Dillard’s deft ability to evoke the existential paradox that while each of us “matter not a whit, we also matter profoundly.” A rave review for a masterpiece by a phenom. But the newcomer to Dillard who reads this could be forgiven for thinking: what kind of person could or would dare to take up such a God’s eye view?

All of this high praise is well deserved, but it’s also a problem, a significant one. In all these appreciations, all these assessments, not a single commentator, no matter how gifted or thoughtful, speaks of Dillard as if she belongs. She is a strange katydid, a demon flower. I do not excuse myself from this diagnosis, either. After all, didn’t I begin this piece by putting Dillard into a definitive category of one?

The way we talk about Annie Dillard makes me both sad and afraid. Sad because we are unable to appreciate in full the words she has written if we cannot see how she is, in the final analysis, just one of us. And afraid because affixing someone with otherness is the first stage in allowing that someone to be forgotten, and I am afraid of a world where writing such as hers could surface and then vanish.

3.
As Dillard herself writes, “The way to learn about a writer is to read the text. Or texts.” Consider a few of hers, then.

Her very first book, Tickets for a Prayer Wheel, was a poetry collection pieced together six years after she graduated from college. She found a publisher to bring it out in 1974. But she wasn’t destined to be a poet. Or, at least, not just a poet. Her next move was to do something that poets aren’t supposed to do: she published a book of prose. And she did it before dust could even collect on the first remaindered stacks of her poetry.

I’m calling it her first book of prose, but I could just as well call it the book of prose, as far as literary gatekeepers are concerned. Pilgrim at Tinker Creek (like the earlier book of poetry) came out when Dillard was 28 years old, and it’s the book that catapulted her from young writer to splashy, important transcendentalist almost overnight, or at least that’s what her publishers would have you believe. The edition that I bought earlier this year has a four-page About Annie Dillard section and not one but two afterwords, just in case I lose one, I suppose.

Upon its initial release, Pilgrim at Tinker Creek was praised by many critics, but there were more than a few notable voices with reservations. Eudora Welty famously grumped about not quite understanding some of the book’s lyrical asides. She also wanted more voices. A critic at Kirkus Reviews was downright galling: a brief review claimed Dillard’s reach exceeded her grasp, but patted her head for trying like a good girl.

Real talk: for a failed novelist or wannabe writer, it’s quite soothing to learn that a book the Modern Library now calls a classic had a rather bumpy roll-out. According to a historical note on Dillard’s website, the hardcover copies of Pilgrim didn’t actually sell very well. Only in paperback did it catch on. For all its merits, Pilgrim at Tinker Creek is not a page-turner. It’s a slow, steady, dark burn; the kind of conflagration that sightlessly devours oxygen, killing without the melodramatics of flame.

Is it a heresy to admit that of all Dillard’s nonfiction, Pilgrim at Tinker Creek moves me the least? The book is full of prose marvels, yes; it’s just that it feels like it’s written on spec. As if, full of youthful hubris, she said, Well, this fellow Thoreau wasn’t really doing anything so hard, was he? I can do that. And then she did. I marvel far more at later books like the aforementioned For the Time Being or Teaching a Stone to Talk, her almost perfect essay collection. The structure of Pilgrim flags because it feels forced. Sort of the opposite of the impression given by her second book of prose, Holy the Firm.

Holy the Firm runs less than 15,000 words. Twice as long as a long personal essay. Shorter than a feature in a weekend magazine. Yet I suspect more people have read about how she wrote the book—shack, island, airplane, fire—than have read the book itself. It’s worthy of the flame of her reputation. The letters do not smolder on the page, but the ideas surely do. Kirkus called this outing “a difficult, restless rumination,” and they weren’t wrong. It is a book about learning to live in the afterglow of life’s casual cruelty, about accepting the lot of whatever this thing is, this universe. Basically the plot of all her noteworthy work, which is to say, all of her books from here on out. Once she caught her stride, there was no stopping her.

In her 40s, Dillard joined the Roman Catholic Church. She’d grown up a Protestant but left the church as a teenager. She told an interviewer that she converted in middle age to keep close to God, even though she did not always agree entirely with the people who worshipped around her. In her essay, “An Expedition to the Pole,” she writes:
It is madness to wear ladies’ straw hats and velvet hats to church; we should all be wearing crash helmets. Ushers should issue life preservers and signal flares; they should lash us to our pews. For the sleeping god may wake someday and take offense, or the waking god may draw us out to where we can never return.
There is a way of reading Pilgrim at Tinker Creek that positions Dillard not as Thoreau’s heir but as a contemporary Christian writer and thinker. She employs the language of Christianity and speaks of creation and eternity and grace. But she’s not a religious writer, not really; religious language is just her jumping off point. She’s proposing to meet readers at the village church not because it’s her destination, but because it’s a large landmark we all know. Dillard’s subsequent books push further afield. She continues to insist on questioning God, rather than seeking God’s grace. The disappointment this engendered in some readers is evident, as seen in “Annie Dillard: Mistaken Mystic?”, an article from the journal of an evangelical society. Here the judgment on Dillard is brief, swift, and stark. The writer uses the kind of language that I recognize from growing up in a church: “She does not point to the Bible often enough,” and: “She does not understand how Jesus fulfills all of God’s promises and love.”

In her memoir An American Childhood, Annie Dillard describes how as a teenager she felt estranged from the people in the church pews around her. I recognize the conflict. I was a Sunday-School kid. I stood for the role of Joseph in the Christmas play. I sang in Easter programs. The judgy tone of the aforementioned journal is the tone that many keepers of the flame would use to dismiss someone who is being difficult because he or she is questioning rather than blindly accepting. Rigidity about how to believe, what to believe, an emphasis on dogma, on prescribed process, on blind tradition: this part of religious education stuck in my craw. But I accepted it. I had to. What other way of looking at the world was available to me back then?

In her book Living by Fiction, Dillard writes: “Can we not loose the methods of literary criticism on the raw world? May we not analyze the breadth of our experience? We can and may – but only if we consider the raw world as a text … as a work of art.”

If someone had put this book, any book by Annie Dillard, in my hands as a teenager, I would have turned the pages with quaking hands. She speaks in a register that anyone acquainted with religion can recognize. But she has the temerity to also point a finger at the heavens and say, You made this, now explain it.

By the time Dillard turned 50, she’d left the Catholic Church; she was striking out on her own again. No surprise. When it comes to God, Dillard is all about interrogation, not devotion; she wants an audience with the divine because she’s got questions, not because she wants to receive beatific truth. Again and again across her work, there’s a sense of inquiry, the inquiry of an alert human on the move through life. A life where God has gone silent, even if you believe he made this place where we all live.

In the end, she emulated that silence, intentionally or not. For decades she gave space to contemplate the mundane and overlooked aspects of life. She sought out and drew attention to silences, holy and profane, large and small, whatever caught her eye. She gave voice to details others would miss. Then, after a time, she found her own place in the silence and stepped into it.

4.
She wrote 11 original books over the course of 30 years and then, abruptly, she stopped. Suddenly, nothing. She has given a few interviews, sat with some journalists. A blurb here and there. But mostly, she’s gone silent; she has not published any form of nonfiction in two decades. Theories come and go as to why. But the only thing for certain is the totality of her withdrawal as a creator.

A prolonged silence can prompt questions that words or new texts cannot. Is she out of things to say? Is she angry? Is her message complete?

If Dillard were writing this essay, then I’m certain she would have found a way to note God’s famous periods of silence. After the Holocaust. After the earthquake at Lisbon. For the 400 years between the Book of Malachi and the Gospel of Matthew.

What does God’s silence mean? What is he doing? What is he thinking? What does Annie Dillard’s silence mean? What is she thinking? Does she have dementia? Does God have no heart? Does she have nothing left to say? Does God have nothing left to tell us?

Five years ago, the Atlantic ran a mean-spirited profile entitled “Where Have You Gone, Annie Dillard?” The piece begins with hearty praise for Dillard, but the meal turns sour long before the last course. The article annoys even as it explicates. The writer rightly criticizes people who class Dillard as nouveau Thoreau; but he botches the landing by veering into questions of intent, as if intentions were clear-cut as texts. Most cruelly, the article suggests that the very nature of Dillard’s inquiry is what has damned her to silence. That she was doomed to leave us hanging. Because she’s got the heartless eye of a distant god. Because she’s a cold fish. This is the scat of cheap iconoclasm. It doesn’t hang together as an argument. And it doesn’t match up with the Dillard who shows up in profiles, interviews, and the memoirs of former students.

One of Dillard’s best uncollected essays is titled, “How I Wrote the Moth Essay and Why.” In a few pages, she assesses both a previous piece of writing and her life at the time. She uses the critic’s scalpel in two directions at once. The resulting essay lays bare Dillard’s personal fears (childlessness, loneliness) and her preoccupations (what is the point of me? what am I trying to say?), and then she shows how all this motivated her to produce a piece of writing that was and wasn’t about those things. She also reveals her method of composition: She does not begin by looking down from an imperious height. She has no foreknowledge of what she is writing toward or about, at least not at first. She uses data from past journals and produces a “babbling” first draft and then a surgically altered revision. She is an artisan, working a chert blade.

During the first decade of her silence, there were rumors that perhaps she was suffering from cognitive decline. In 2016, John Freeman, a former editor for Granta, tracked Dillard to Cripple Creek, Va., where she lived with her husband, the biographer Bob Richardson. Freeman wrote a profile of her current life, the out-of-the-way cabin, the backwoods store, the books she keeps reading. The passages about Dillard and her husband are the best part of the piece. Freeman writes: “Watching Annie and Bob over breakfast, editing each other’s stories and officiating over the presentation of flatware, coffee, second and third helpings, it’s clear that whatever came before, this is the show.” Of his wife, Richardson says: “She’s the smartest person I’ve ever met.” It’s sweet, the kind of thing that you want to hear from people who are entering into old age together. By the end of Freeman’s profile, Dillard addresses her silence, albeit with a kind of shrug. “I switched to painting,” she says. You don’t get the sense that she regrets the change-up, or that she did it because her project ran out of steam. She just moved on. Not deeper into loneliness or isolation: rather into a more private kind of fellowship, the journey of real companionship.

But what about the writing? Is there any more writing? There is, and there isn’t. In a 2016 interview with Melissa Block of National Public Radio, Dillard speaks like someone who is aware of how many pages are left in her text of the world and who has reconciled herself to the limits that are built into our time here. She talks about writing. She’s still writing. But not for us. Really, just for herself, mostly. She’s already put more than her fair share into the world of texts, more than enough to keep the rest of us going.
DILLARD: I write a lot of emails. I write in my own journal when something extraordinary or funny happens. And there’s some nice imagery in there. I don’t think of what to do with it.

BLOCK: You don’t think about another book at this point.

DILLARD: No, I don’t. I had good innings, as the British say. I wrote for 38 years at the top of my form, and I wanted to quit on a high note.
Annie Dillard seems to have almost no other choice but to prod life, poke it, search every place in it for hidden, buried meaning, and then produce her own text, or texts, for herself sometimes, sometimes for others, sometimes for purposes she’s not even sure of, one more link in the long textual chain of being. There is a word for this kind of writer, the kind that acts like a guide, the sort who enters into a fellowship with you and brings you with her wherever her thoughts lead, high or low, grand or lowly, who writes about the world without even knowing what to do with it: essayist.

To be an essayist is a fine purpose, but a purpose lasts only so long. As with lives. Socrates is a man. All men die. Socrates must, well, you know the line. Last summer, during the haphazard malaise of the coronavirus pandemic, Dillard’s husband Bob Richardson died. He said of her in an interview near the end of his life: “I learned from her that you have to go all out every day. Hold nothing back. The well will refill.”

So we go on, filling the silence. Or, perhaps, finding the silences and listening to what they will tell us.

The day that I began writing this essay, I went for a run. This is how most of my essays begin. At first, they’re word knots and phrases in my head. Thoughts and observations and quotes and ideas, a mute mess. I go for a run and the ideas begin to knock loose from one another and then back together again in a pattern that makes sense, that gathers into something far stronger.

I ran as I often do along the edge of Riverside Park, past pre-war buildings on the Upper West Side, past plaques that commemorate where J. Robert Oppenheimer lived or where Edgar Allan Poe wrote his poems. I thought about history. I thought about how we all end up swallowed by silence. Traffic had snarled the on- and off-ramps from the West Side Highway. I thought about how even before we’re lost in history, we’re lost in our daily lives. I also thought of the Annie Dillard quote I see more than any other in social media posts: “How we spend our days is, of course, how we spend our lives.”

I stopped for Gatorade near the Museum of Natural History. On Columbus Avenue, I noticed a new bookstore. One I’d never visited before. I wandered inside to have a look. An essay section near the rear offered two columns of books. I went straight to the D’s. But Dillard’s books weren’t there. I searched the Staff Recommendations table. I searched the display table for Our Favorite Reads. Nothing. This made me angry. I didn’t turn over the displays or drive out the cashiers, but I kind of wanted to. Someone needs to change this, I thought. More people need to know about Dillard. That’s when I realized what I needed to do. That’s when I understood where this essay needed to go.

One more thing happened. The important thing. Exiting the bookstore in a huff, tired, irritated, I noticed the wreckage of a black BMW at the curb. It had been there earlier. But I’d walked right past it. I hadn’t slowed enough to notice. It was a total ruin, wheels gone, windows shattered, hood crumpled, left side flat as if smashfisted by a giant. There was no explanatory sign, no one standing watch, no one paying mind. Just a casual profundity. Like a perfect monument to all we don’t see, can’t see, won’t see, the marvelous terrible strange that is here and will be gone soon and no one will believe us that it was there once but is now gone because how do you frame the unexpected? You can’t frame or explain: you can only point and try to make others see. Did you see the wreck? Did you read Holy the Firm? Did you look up in time for the eclipse? Do you know Annie Dillard? Did I hear you call my name, or was that the voice of God?

Bonus Link:
Line, Run, Breath: On Annie Dillard and the Circuitous Work of Writing

Image Credit: Wikipedia

Edgar Allan Poe: Self-Help Guru

You may have read a lot of Edgar Allan Poe. Chances are you’ve read a little whether you’re a fan or not. Poe’s influence, as James Wood wrote of Flaubert, is “almost too familiar to be visible,” while Poe’s work is standard in school curricula across the U.S. and beyond. I still remember how, when I was living in Australia a few years ago and happened to be seated at a coworker’s kitchen table, her eight-year-old son burst in, gushing about “The Raven,” impressed that a poem, of all things, could be so scary.

Still, Poe’s work has its less-visited corners. Take “A Chapter of Suggestions,” an 1844 essay in which Poe set aside literary criticism to advance a different and, to my eye, more personal set of ideas. There is some profound shit in there. The first paragraph goes like this, no preamble:
In the life of every man there occurs at least one epoch, when the spirit seems to abandon, for a brief period, the body, and, elevating itself above mortal affairs just so far as to get a comprehensive and general view, makes thus an estimate of its humanity, as accurate as is possible, under any circumstances, to that particular spirit. The soul here separates itself from its own idiosyncrasy, or individuality… All the important good resolutions which we keep—all startling, marked regenerations of character—are brought about at these crises of life.
A quick and dirty translation? Once or twice in our lifetimes, you and I will experience dissociative moments that allow us to glimpse our humanity beyond the idiosyncratic snags of our personalities. Understanding the self in fact requires transcending the self, however briefly. And such crises may lead to breakthroughs, epiphanies, moments of much-longed-for change. It’s like what people say now about eating shrooms.

Poe, we can be pretty sure, wasn’t writing under any psychedelic influence, but he did have his own experiences of anxiety, depression, and/or periodic breaks with reality. He wrote “A Chapter of Suggestions” for money, netting 50 cents a page when it was published in an 1845 gift book, which may have helped alleviate some of his anxiety, if only for a moment. The reason you and I might find it a funny, poignant reading experience today is because Poe’s “Suggestions” sound a lot like contemporary self-help. He veers through a series of disconnected paragraphs, rattling on about probability, the imagination, why disappointed artists may drink too much in midlife, and more. Here are his bugbears, weaknesses, obsessions, hopes—plus a little how-to, dusted with 19th-century pop science and transparent wish-thinking—in one place and under 2,000 words.

In paragraph four, he remarks how often our first grasp of an idea, our initial intuitive impression, turns out to be the most accurate. We know this from our childhood reading, Poe says. We encounter a poem as a kid and we love it. Later, we grow up only to scoff at the same poem. Then we reach yet another stage in which we return to our initial impression, the right one.

What’s startling now, even eerie, is how accurate Poe’s description of this process is for those of us who—like my coworker’s bowled-over son—loved “The Raven” as children, thrilling to its dark rhythms and atmosphere of glamorous doom. Later, as undergrads or grad students, we put that enthusiasm behind us, knowing without having to ask that, past a certain age, it’s not cool to like Poe. And then, once we’ve graduated, lost whatever grasp we ever gained of theory, and reentered civilian life, we recognize all over again how good “The Raven” really is, how effective and successful. Yes, it is a carefully calculated shot at reaching fame by satisfying popular taste, and that is precisely why it works.

Or take paragraph seven, in which Poe tells us that being a genius means attracting haters, “a set of homunculi, eager to grow notorious by the pertinacity of their yelpings,” but also that, crucially, not everyone who attracts haters is a genius:
All men of genius have their detractors; but it is merely a non distributio medii to argue, thence, that all men who have their detractors are men of genius. Yet, undoubtedly, of all despicable things, your habitual sneerer at real greatness, is the most despicable…Their littleness is measured by the greatness of those whom they have reviled.
Clock the pettiness of detractors by the outsize nature of what they attack: Now that’s insightful.

Skipping to paragraph nine, Poe informs us that geniuses like getting drunk. It grows out of habits picked up in youth, he says, but later becomes more about somehow coping with the unfairness of existence: “The earnest longing for artificial excitement, which, unhappily, has characterized too many eminent men, may thus be regarded as a psychal want, or necessity. . . a struggle of the soul to assume the position which, under other circumstances, would have been its due.”

Readers familiar with Poe’s blog-like Marginalia series and his galaxy-brained prose-poem Eureka will spot familiar strains of thinking in this paragraph and the larger “Chapter.” There is Poe’s tendency to make grand pronouncements about the character of geniuses—something he got on a tear about with some frequency—and all of which, it should come as no surprise, might be applied directly to himself. Call it self-justification, but there might be more to it than that. It’s these moments when we get the keenest insight into Poe’s own turbulent and amazingly successful life—from the person who lived it. How to make sense of its devastating ups and downs, victories and reversals?

Well, he tells us. Such self-knowledge as he ever gained came from moments of profound crisis, as well as sidelong looks and returning to first impressions—or so we might conclude from reading the “Chapter.” And maybe we might now recognize that the same process gives us our best chance of attaining wisdom and self-knowledge in our own lives, too.

Elsewhere, Poe liked to present himself as a towering, one-take auteur. Think of “The Philosophy of Composition,” the essay in which he dubiously, maybe-kinda satirically claimed that he wrote “The Raven” according to some ultra-precise formula. Whereas in “A Chapter of Suggestions” he shows himself to be no stranger to the sorts of questions you and I ask our cracked ceilings at 4 a.m. He wondered about his own lurching, intermittent progress, about the faults and failings of his character—what the scholar Jerome McGann once listed as “his lies, his follies, his plagiarisms, his hypocrisies”—and whether these might sink him.

Maybe the most surprising thing about “A Chapter of Suggestions” is how, in the 177 years since its publication, and in contrast to the rest of Poe’s work, it’s received so little critical attention. If there’s a single paper dedicated to it in all of JSTOR, I haven’t found it. Burton R. Pollin, who wrote an introduction identifying its origins and genre, concluded that the essay might fall under “general philosophy,” alongside other species of vaguely high-minded magazine filler common in Poe’s era.

We might go a step further. Poe, it seems to me, was pondering the subject of personal growth. Where he landed was somewhere between Coleridge at his most aphoristic, metaphysical and self-justifying, and Tim Ferriss, author of The 4-Hour Work Week, who’s now in a psychedelic phase and who’s never really gotten credit, either, for the range of his thought.

Besides: lies, follies, hypocrisies? That’s most of us, if we’re honest. No wonder then that the ways we come to insight are so strange, glancing, sidelong. Or that we tend to second-guess our initial impressions, only later coming to realize, to know—and, of course, to rationalize. It’s weird to be human, even as we’re unable to compare it with, say, being a macaque or an orchid. We lurch from crisis to crisis, and what knowledge we ever arrive at is likely to surprise us. You can almost see Poe enacting this process in his essay. It makes me think that our childhood assessments of him were right all along: in his darkness, odd rhythms and peculiar wisdom, Poe really was cool. And still is.

Bonus Links:
Edgar Allan Poe Was a Broke-Ass Freelancer
Was Jordan Peele’s ‘Us’ Inspired by an Edgar Allan Poe Story?
Twenty-Five Ways to Roast a Raven: The Spiciest Criticism of Edgar Allan Poe
Poe’s ‘Eureka’ Is a Galaxy-Brained Space Opera for Our Times

Image Credit: Flickr/Alvaro Tapia.

On Memory and Literature

-

My grandmother’s older sister Pauline Stoops, a one-room school teacher born in 1903, had lived in a homestead filled with poetry, which sat on a bluff overlooking the wide and brown Mississippi as it meandered southward through Hannibal, Missouri. Pauline’s task was to teach children aged six to seventeen history, math, geography, and science; her students learned about the infernal compromise which admitted Missouri into the union as a slave state and they imagined when the Great Plains were a shallow and warm inland sea millions of years ago; they were taught the strange hieroglyphics of the quadratic equation and the correct whoosh of each cursive letter (and she prepared them oatmeal for lunch every day as well). Most of all, she loved teaching poetry — the gothic morbidities of Edgar Allan Poe and the sober patriotism of John Greenleaf Whittier, the aesthetic purple of Samuel Taylor Coleridge and the mathematical perfection of Shakespeare. She was a woman who, when I knew her, was given to extemporaneous recitations of memorized Walt Whitman. She lived in the Midwest for decades, until she followed the rest of her mother’s family eastward to Pennsylvania, her siblings having moved to Reading en masse during the Depression, tracing backwards a trail that had begun with distant relations. Still, Hannibal remained one of Pauline’s favorite places, in part because of its mythic role, this town where Mark Twain had imagined Huck Finn and Tom Sawyer playing as pirates along the riverbank.

“I had been to school most all the time and could spell and read and write just a little,” recounts the titular protagonist in The Adventures of Huckleberry Finn, “and could say the multiplication table up to six times seven is thirty-five, and I don’t reckon I could ever get any further than that if I was to live forever. I don’t take no stock in mathematics, anyway.” Huck’s estimation of poetry is slightly higher, even if he doesn’t read much. He recalls coming across some books while visiting the wealthy Grangerfords, including John Bunyan’s Pilgrim’s Progress, which was filled with statements that “was interesting, but tough,” and another entitled Friendship’s Offering that was “full of beautiful stuff and poetry; but I didn’t read the poetry.” Had Huck been enrolled in Mrs. Stoops’ classroom he would have learned verse from a slender book simply entitled One Hundred and One Poems with a Prose Supplement, compiled by anthologizer Roy Cook in 1916. When clearing out Pauline’s possessions with my grandma, we came across a 1920 edition of Cook’s volume, with favorite lines underlined and pages dog-eared, scraps of paper now a century old used as bookmarks. Cook’s anthology was incongruously printed by the Cable Piano Company of Chicago (conveniently located at the corner of Wabash and Jackson), and included advertisements for their Kingsbury and Conover models, with which they promised student progress even for those with only “a feeble trace of musical ability,” proving that in the United States Mammon can make a pilgrimage to Parnassus.

The flyleaf announced that it was “no ordinary collection,” being “convenient,” “authoritative,” and most humbly “adequate,” while emphasizing that at fifteen cents its purchase would “Save many a trip to the Public Library, or the purchase of a volume ten to twenty times its cost.” Some of the names are familiar — William Wordsworth, Alfred Lord Tennyson, Oliver Wendell Holmes, Rudyard Kipling (even if many are less than critically fashionable today). Others are decidedly less canonical — Francis William Bourdillon, Alexander Anderson, Edgar A. Guest (the last of whom wrote pablum like “You may fail, but you may conquer – / See it through!”). It goes without saying that One Hundred and One Poems with a Prose Supplement is overwhelmingly male and completely white. Regardless, there’s a charm to the book, from the antiquated Victorian sensibility to the huckster commercialism. Even more strange and moving was my grandmother’s reaction to this book bound with a brown hardcover made crooked by ten decades of heat and moisture, cold and entropy, the pages inside turned the texture of fall sweetgum and ash leaves as they drop into the Mississippi. When I mentioned Henry Wadsworth Longfellow, my grandmother (twenty years Pauline’s junior) began to recite perfectly from memory “By the shores of Gitche Gumee, / By the shining Big-Sea-Water,” going on for several stanzas of Longfellow’s distinctive percussive trochaic tetrameter. No doubt she hadn’t read “The Song of Hiawatha” in decades, perhaps half a century, and yet the rhythm and meter came back to her as if she were the one looking at the book and not me.

My grandmother’s formal education ended at Centerville High School in Mystic, Iowa in 1938; I’ve been fortunate enough to go through graduate school and receive an advanced degree in literature. Of the two of us, only she had large portions of poetry memorized; I, on the other hand, have a head that’s full of references from The Simpsons. If I could recall more than a quarter of a single Holy Sonnet by John Donne I’d be amazed, yet I have the entirety of the Steve Miller Band’s “The Joker” memorized for some reason. Certainly, I have bits and pieces here and there, “Death be not proud” or “Batter my heart three-personed God” and so on, but when it comes to making such verse part of my bones and marrow, I find that I’m rather dehydrated. Memorization was once central to pedagogy, when it was argued that committing verse to instantaneous recall was a way of preserving cultural legacies, that it trained students in rhetoric, and that it was a means of building character. Something can seem pedantic about such poetry recitation: the province of fussy antiquarians, apt to start unspooling long reams of Robert Burns or Edward Lear in an unthinking cadence, readers who properly hit the scansion but let the meaning come out with the wrong emphasis. Still, such an estimation can’t help but leave the flavor of sour grapes on my tongue where poetry should be, and so the romanticism of the practice must be acknowledged.

Writing exists so that we don’t have to memorize, and yet there is something tremendously moving about recalling words decades after you first encountered them. Memorization’s consequence, writes Catherine Robson in Heart Beats: Everyday Life and the Memorized Poem, was that “these verses carried the potential to touch and alter the worlds of the huge numbers of people who took them to heart.” Books can burn, but as long as a poem endures in a person’s consciousness, that person is in possession of a treasure. “When the topic of verse memorization is raised today,” writes Robson, “the invocation is often couched within a lament.” Now that we’re all possessors of personal supercomputers that can instantly connect us to whole libraries, there can seem little sense in making iambs and trochees part of one’s soul. The soul has been outsourced to our smartphones, and we’ve all become cyborgs, carrying our memories in our pockets rather than our brains. But such melancholy over forgetfulness has an incredibly long history. Socrates formulated the most trenchant of those critiques, with Plato noting in the Phaedrus that his teacher had once warned that people will “cease to exercise memory because they rely on that which is written, calling things to remembrance no longer from within themselves, but by means of external marks.” It’s important to consider where Socrates places literature: “within” — like heart, brain, or spleen — rather than in some dead thing discarded on inked reeds. According to Socrates, writing is idolatrous; the difference between memorization and the written word is the difference between reality and a painting of it. Though it must be observed that the only reason we care who Socrates happens to be is because Plato wrote his words down.

Poetry most evokes literature’s first role as a vehicle of memory, because the tricks of prosody – alliteration and assonance; consonance and rhyme – endured because they’re amenable to quick recall. Not only do such attributes make it possible to memorize poetry, they facilitate its composition as well. For literature wasn’t first written on papyrus but rather in the mind, and that was the medium through which it was recorded for most of its immeasurably long history. Since the invention of writing, we’ve tended to think of composition as the work of a solitary figure committing their ideas to the eternity of paper, but the works of antiquity were a collaborative affair. Albert Lord explains in his 1960 classic The Singer of Tales that “oral epic song is narrative poetry composed in a manner evolved over many generations by singers of tales who did not know how to write; it consists of the building of metrical lines and half lines by means of formulas and formulaic expressions and of the building of songs by the use of themes.” Lord had accompanied his adviser, the folklorist and classicist Milman Parry, to the Balkans in 1933 and then again in 1935, where they recorded the oral poetry of the largely illiterate Serbo-Croatian bards. They discovered that recitations were based in “formulas” that not only made remembering epics easier, but also made their performances largely improvisational, even if the contours of a narrative remained consistent. From their observations, Parry and Lord developed the “Oral-Formulaic Theory of Composition,” arguing that pre-literate epics could be mixed and matched in a live telling, by using only a relatively small number of rhetorical tropes, the atoms of spoken literature.

Some of these formulas — phrases like “wine-dark sea” and “rosy-fingered dawn,” for example — are familiar to any reader of The Iliad and The Odyssey, and the two discovered that such utterances had a history in the Balkans and the Peloponnesus going back millennia. There’s an important difference between relatively recent works like Virgil’s The Aeneid (albeit composed two millennia ago) and the epics of Homer that predate it by at least eight centuries. When Virgil sat down to pen “I sing of arms and man,” he wasn’t actually singing. He was probably writing, while whoever it was — whether woman or man, women or men — that invoked “Sing to me of the man, Muse, the man of twists and turns” most likely did utter those words accompanied by a lyre. The Aeneid is a work of literacy, while those of Homer are of orality, which is to say they were composed through memory. Evocatively, there is some evidence that the name “Homer” isn’t a proper noun. It may be an archaic Greek verb, a rough translation being “to speak,” or better yet “to remember.” There were many homers, each of them remembering their own unique version of such tales, until they were forgotten into the inert volumes of written literature.

Socrates’ fears aren’t without merit. Just as the ability to Google anything at any moment has left contemporary minds atrophied with relaxation, so too does literacy have an effect on recall. With no need to be skilled in remembering massive amounts of information, reading and writing made our minds surprisingly porous. From the Celtic fringe of Britain to the Indus Valley, from the Australian Outback to the Great Plains of North America, ethnographers recount the massive amounts of information which pre-literate peoples were capable of retaining. Poets, priests, and shamans were able to memorize (and adapt when needed) long passages by deft manipulation of rhetorical trope and mnemonic device. Where literacy was introduced, there was a marked decline in people’s ability to memorize. For example, writing in the journal Australian Geographer, linguist Nick Reid explains that the oral literature of the aboriginal Narrangga people contains narrative details which demonstrate an accurate understanding of the geography of Yorke Peninsula some 12,500 years ago, before melting glaciers irrevocably altered the coastline. Three hundred generations of Narrangga have memorized and told tales of marshy lagoons which no longer exist, an uninterrupted chain of recitation going back an astounding thirteen millennia. Today, if every single copy of Jonathan Franzen’s The Corrections and David Foster Wallace’s Infinite Jest were to simultaneously vanish, who among us would be able to recreate those books?

It turns out that some folks have been able to train their minds to store massive amounts of language. Among pious Muslims, people designated as Hafiz have long been revered for their ability to memorize the 114 surahs of the holy Quran. Allah’s words are thus written into the heart of the reverential soul, so that language becomes as much a part of a person as the air which fills the lungs or the blood which flows in the veins. “One does not have to read long in Muslim texts,” writes William Graham in Beyond the Written Word: Oral Aspects of Scripture in the History of Religion, “to discover how the ring of the qur’anic text cadences the thinking, writing, and speaking of those who live with and by the Qur’an.” The practice remains relatively common in the Islamic world, where the memorization not just of Longfellow or a few fragments of T.S. Eliot, but of an entire book, is still accomplished to a surprising degree. Then there are those who, through some mysterious cognitive gift (or curse, depending on perspective), possess eidetic memory, and have the ability to commit entire swaths of text to memory without the need for mnemonic devices or formulas. C.S. Lewis could supposedly quote from memory any particular line of John Milton’s Paradise Lost that he was asked about; similar claims have been made about critic Harold Bloom.

Prodigious recall need not only be the purview of otherworldly savants, as people have used methods similar to those of a Hafiz or a Narrangga storyteller to consume a book. Evangelical minister Tom Meyer, also known as the “Bible Memory Man,” has memorized twenty books of scripture, while actor John Basinger used his stage skills to memorize all ten-thousand-odd lines of Paradise Lost, with an analysis some two decades later demonstrating that he was still able to recite the epic with some 88% accuracy. As elaborated on by Lois Parshley in Nautilus, Basinger used personal associations of physical movement and spatial location to “deep encode” the poem; she quotes him as saying that Milton is a “cathedral I carry around in my mind… a place that I can enter and walk around at will.” No other type of art is like this — you can remember what a painting looks like, you can envision a sculpture, but only music and literature can be preserved and carried with you, and the former requires skills beyond memorization. As a scholar I’ve been published in Milton Studies, but if Basinger and I were marooned on an island somewhere, or trapped in the unforgiving desert, only he would be in actual possession of Paradise Lost, while I sadly sputtered half-remembered epigrams about justifying the ways of God to man.

Basinger, who claims that he still loses his car keys all the time, was able to memorize the twelve books of Milton by associating certain lines with particular movements, so that the thrust of an elbow may be man’s first disobedience, the kick of a leg better to reign in hell than to serve in heaven. There is a commonsensical wisdom in understanding that memory has always been encoded in the body, so that our legs and arms think as surely as our brains do. Walter Ong explains in Orality and Literacy that “Bodily activity beyond mere vocalization is not… contrived in oral communication, but is natural and even inevitable.” Right now, I’m composing while stooped over, in the servile position of desk sitting, with a pain in my back and a crick in my neck, but for the ancient bards of oral cultures the unspooling of literature would have been done through a sweep of the arms or the trot of a leg, motion and memory connected in a walk. Paradise Lost, as Basinger committed it to memory, was also a “cathedral,” a place that he could go to, and this is one of the most venerable means of memorizing massive amounts of writing. During the Renaissance, itinerant humanists used to teach the ars memoriae, a set of practical skills designed to hone memory. Chief among these tutors was the sixteenth-century Italian occultist, defrocked Dominican, and heretic Giordano Bruno, who took as students King Henry III of France and the Holy Roman Emperor Rudolf II (later he’d be burnt at the stake in Rome’s Campo de’ Fiori, though for unrelated reasons).

Bruno used many different methodologies, including mnemonics, associations, and repetitions, but his preferred approach was something called the method of loci. “The first step was to imprint on the memory a series of loci or places,” writes Dame Frances Yates in The Art of Memory. “In order to form a series of places in memory… a building is to be remembered, as spacious and varied a one as possible, the forecourt, the living room, bedrooms, and parlors, not omitting statues and other ornaments with which the rooms are decorated.” In a strategy dating back to Cicero and Quintilian, Bruno taught that the “images by which the speech is to be remembered… are then placed in imagination on the memorial places which have been memorized in the building,” so that “as soon as the memory… requires to be revived, all these places are visited in turn and the various deposits demanded of their custodians.” Bruno had his students build in their minds what are called “memory palaces,” architectural imaginings whereby a line of prose may be associated with an opulent oriental rug, a stanza of poetry with a blue Venetian vase upon a mantle, an entire chapter with a stone finishing room in some chateau; the candlesticks, fireplace kindling, cutlery, and tapestries each hinged to their own fragment of language, so that recall can be accessed through a simple stroll in the castle of your mind.

It all sounds very esoteric, but it actually works. Even today, competitive memorization enthusiasts (this is a real thing) use the same tricks that Bruno taught. Science journalist Joshua Foer recounts how these very same methods were instrumental in his winning the 2006 USA Memory Championship, storing poems in his mind by associating them with places as varied as Camden Yards and the East Wing of the National Gallery of Art, so that he “carved each building up into loci that would serve as cubbyholes for my memories.” The method of loci is older than Bruno, than even Cicero and Quintilian, and from Camden Yards and the National Gallery to Stonehenge and the Nazca Lines, spatial organization has been a powerful tool. Archeologist Lynne Kelly claims that many Neolithic structures actually functioned as means for oral cultures to remember text, arguing in Knowledge and Power in Prehistoric Societies: Orality, Memory, and the Transmission of Culture that “Circles or lines of stones or posts, ditches or mounds enclosing open space… serve as memory theaters beautifully.”

Literature is simultaneously vehicle, medium, preserver, and occasionally betrayer of memory. Just as our own recollections are more mosaic than mirror (gathered from small pieces that we’ve assembled as a narrative with varying degrees of success), so too does writing impose order on one thing after another. Far more than memorized lines, or associating stanzas with rooms, or any mnemonic trick, memory is the ether of identity, but it is fickle, changing, indeterminate, and unreliable. Fiction and non-fiction, poetry and prose, drama and essay — all are built with bricks of memory, but with a foundation set on wet sand. Memory is the half-recalled melody of a Mister Rogers’ Neighborhood song played for your son though you last heard it decades ago; it’s the way that a certain laundry detergent smells like Glasgow in the fall and a particular deodorant like Boston in the cool summer; how the crack of the bat at PNC Park brings you back to Three Rivers Stadium, and Jagged Little Pill always exists in 1995. And memory is also what we forget. Our identities are simply an accumulation of memories — some the defining moments of our lives, some of them half-present and only to be retrieved later, and some constructed after the fact.

“And once again I had recognized the taste of the crumb of madeleine soaked in her decoction of lime-flowers which my aunt used to give me,” Marcel Proust writes in the most iconic scene of Remembrance of Things Past, and “immediately the old gray house upon the street, where her room was, rose up like the scenery of a theater.” If novels are a means of excavating the foundations of memory, then Proust’s magnum opus is possibly more associated with how the deep recesses of the mind operate than any other fiction. A Proustian madeleine, signifying all the ways in which sensory experiences trigger visceral, almost hallucinatory memories, has become a cultural mainstay, even while most have never read Remembrance of Things Past. So universal is the phenomenon, the way in which the taste of Dr. Pepper can propel you back to your grandmother’s house, or Paul Simon’s Graceland can place you on the Pennsylvania Turnpike, that Proust’s madeleine has become the totem of how memories remain preserved in tastes, sounds, smells. Incidentally, the olfactory bulb of the brain, which processes odors, is close to the hippocampus where memories are stored, so that Proust’s madeleine is a function of neuroanatomy. Your madeleine need not be a delicately crumbed French cookie dissolving in tea; it could just as easily be a Gray’s Papaya waterdog, a Pat’s cheesesteak, or a Primanti Brothers’ sandwich (all of those work for me). Proust’s understanding of memory is sophisticated, for while we may humor ourselves into thinking that our experiences are recalled with verisimilitude, the reality is that we shuffle and reshuffle the past, we embellish and delete, and what’s happened to us can return as easily as it’s disappeared. “The uncomfortable reality is that we remember in the same way that Proust wrote,” argues Jonah Lehrer in Proust Was a Neuroscientist. “As long as we have memories to recall, the margins of those memories are being modified to fit what we know now.”

Memory is the natural subject of all novels, since the author composes from the detritus of her own experience, but also because the form is (primarily) a genre of nostalgia, of ruminating in the past (even an ostensibly invented one). Some works are more explicitly concerned with memory, their authors reflecting on its malleability, plasticity, and endurance. Consider Tony Webster in Julian Barnes’ The Sense of an Ending, ruminating on the traumas of his school years, noting that we all live with the assumption that “memory equals events plus time. But it’s all much odder than this. Who was it said that memory is what we thought we’d forgotten? And it ought to be obvious to us that time doesn’t act as a fixative, rather as a solvent.” Amnesia is the shadow version of memory, all remembrance haunted by that which we’ve forgotten. Kazuo Ishiguro’s parable of collective amnesia The Buried Giant imagines a post-Arthurian Britannia wherein “this land had become cursed with a mist of forgetfulness,” so that it’s “queer the way the world’s forgetting people and things from only yesterday and the day before that. Like a sickness come over us all.” Jorge Luis Borges imagines the opposite scenario in his short story “Funes the Memorious,” detailing his friendship with a fictional Uruguayan boy who after a horse-riding accident is incapable of forgetting a single detail of his life. He can remember “every crevice and every molding of the various houses.” What’s clear, despite Funes being “as monumental as bronze,” is that if remembering is the process of building a narrative for ourselves, then ironically it requires forgetting. Funes’ consciousness is nothing but hyper-detail, and with no means to cull based on significance or meaning, it all comes to him as an inchoate mass, so that he was “almost incapable of ideas of a general, Platonic sort.”

Between the cursed amnesiacs of Ishiguro and the damned hyperthymesiac of Borges are Barnes’ aging characters, who like most of us remember some things, while finally forgetting most of what’s happened. Tellingly, a character like Tony Webster does something which comes closest to writing — he preserves the notable stuff and deletes the rest. Funes is like an author who can’t bring himself to edit, and the Arthurian couple of Ishiguro’s tale are those who never put pen to paper in the first place. Lehrer argues that Proust “believed that our recollections were phony. Although they felt real, they were actually elaborate fabrications,” for we are always in the process of editing and reediting our pasts, making up new narratives in a process of revision that only ends with death. This is to say that memory is basically a type of composition — it’s writing. From an assemblage of things which happen to us — anecdotes, occurrences, traumas, intimacies, dejections, ecstasies, and all the rest — we impose a certain order on the past; not that we necessarily invent memories (though that happens), but rather that we decide which memories are meaningful, we imbue them with significance, and then we structure them so that our lives take on the texture of a narrative. We’re able to say that had we not been in the Starbucks near Union Square that March day, we might never have met our partner, or that if we hadn’t slept in and missed that job interview, we’d never have stayed in Chicago. “Nothing was more central to the formation of identity than the power of memory,” writes Oliver Sacks in The River of Consciousness, “nothing more guaranteed one’s continuity as an individual,” even as “memories are continually worked over and revised and that their essence, indeed, is recategorization.” We’re all romans à clef in the picaresque of our own minds, but bit characters in the novels written by others.

A century ago, the analytical philosopher Bertrand Russell wrote in The Analysis of Mind that there is no “logical impossibility in the hypothesis that the world sprang into existence five minutes ago, exactly as it then was, with a population that ‘remembered’ a wholly unreal past.” Like most metaphysical speculation there’s something a bit sophomoric about this, though Russell admits as much when he writes that “I am not here suggesting that the non-existence of the past should be entertained as hypothesis,” only that, speaking logically, nobody can fully “disprove the hypothesis.” This is a more sophisticated version of something known as the “Omphalos Argument” — a cagey bit of philosophical book-keeping that had been entertained since the eighteenth century — whereby evidence of the world’s “supposed” antiquity (fossils, geological strata, etc.) was seen as a devilish hoax, and thus the relative youthfulness of the world’s age could be preserved alongside biblical inerrancy (the multisyllabic Greek word means “navel,” as in Eve and Adam’s bellybuttons). The five-minute hypothesis was entertained as a means of thinking about radical skepticism, where not only is all that we see, hear, smell, taste, and touch a fiction, but our collective memories are a fantasy as well. Indeed, Russell is correct in a strictly logical sense; I’m writing this at 4:52 P.M. on April 20th, 2021, and there is no way that I can rely on any outside evidence, or my own memories, or your memories, to deductively and conclusively prove with complete certainty that the universe wasn’t created at 4:47 P.M. on April 20th, 2021 (or by whatever calendar our manipulative robot-alien overlords count the hours, I suppose).

What such a grotesque possibility misses is that it doesn’t matter in the slightest. In some ways, it’s already true; the past is no longer here and the future has yet to occur; we’ve always been just created in this eternal present (whatever time we might ascribe to it). To remember is to narrate, to re-remember is still to narrate, and to narrate is to create meaning. Memories are who we are — the fundamental particles of individuality. Literature, then, is a type of cultural memory: a conscious thing whose neurons are words, and whose synapses are what authors do with those words. Writing is memory made manifest, a conduit for preserving our identity outside of the prison of our own skulls. There is a risk here, though. Memory fails all of us to varying degrees — some in a catastrophic way — and everyone is apt to forget most of what’s happened to them. “Memory allows you to have a sense of who you are and who you’ve been,” argues Lisa Genova in Remember: The Science of Memory and the Art of Forgetting. Those neurological conditions which “ravage the hippocampus” are particularly psychically painful, with Genova writing that “If you’ve witnessed someone stripped bare of his or her personal history by Alzheimer’s disease, you know firsthand how essential memory is to the experience of being human.” To argue that our memories are ourselves is dangerous, for what happens when our past slips away from view? Pauline didn’t suffer from Alzheimer’s, though in her last years she was afflicted by dementia. I no longer remember what she looked like, exactly, this woman alive for both Kitty Hawk and the Apollo mission. I can no longer recall what her voice sounded like. What exists once our memories are deleted from us, when our narratives have unraveled? What remains after that deletion is something called the soul. When I dream about Pauline, I see and hear her perfectly.

Image: Pexels/Jordane Mathieu.

The Hunger Artist: Thoreau and the Irony of Performance Art

-

After spending almost a year translating English professor Laura Dassow Walls’s most recent biography, Henry David Thoreau: A Life, I was finally done. I thought I deserved some celebration, something fun, fiction perhaps. So, I took The Norton Anthology of Short Fiction from my bookshelf and flipped to a random page: “A Hunger Artist” by Franz Kafka. At first, I was disappointed by the serendipity. As a teenager, I had read the story twice in Chinese—it revolves around a weird man who starves to death for a performance—but I decided to go with the flow. This time, the story made me tremble. You may think I say this because my mind was still full of Thoreau, but it is true: “A Hunger Artist” is a portrait of Thoreau’s life.

Thoreau is now widely regarded as a nature writer and political activist, but a close look at both his life and works suggests an inherent performative quality. Take Walden and “Civil Disobedience,” two of his most famous pieces. He displayed his rejection of industrialization and materialism by living by the lake for two years, two months, and two days; after being confined for one night in a Concord jail, he wrote “Civil Disobedience,” which embodied his resistance to slavery and the Mexican-American War. As Laura Dassow Walls beautifully puts it, Thoreau, known for his endeavors in “the experiment of life,” aspired “to turn life itself, even the simplest acts of life, into a form of art.” However, this performance-artist side also makes him controversial.

For example, during his Walden years, the practitioner of avowed self-sufficiency went back home every weekend for dinner, and his mother probably did his laundry. Hypocrite? Yes, that’s what Kathryn Schulz calls him in her famous 2015 New Yorker piece, “Pond Scum.” But I wonder if hypocrisy is avoidable in any public staging: any dramatized gesture might strike others as fake. The problem of performance is also far more complicated than that. With any expressive art form, something is always lost along the way; this results in a disparity between what the performers think of their acts and what the audience takes away from them.

A decade before his suicide, David Foster Wallace wrote a short critique of Kafka’s humor in “Laughing with Kafka”: “Kafka’s comedy is always also tragedy, and this tragedy always also an immense and reverent joy.” It is Wallace’s style to drop bombs of recondite wisdom without further explanation. But he offers an interesting lens through which to view both the hunger artist and Thoreau: while they offer their lives as tragic, the audience always receives them as comic.

Both Kafka’s hunger artist and Thoreau, in their own ways, have very serious religious motivations. The fictional character is a fasting performer, and the climax of his show “was fixed by his impresario at forty days,” a loud echo of Jesus Christ’s forty days of fasting in the desert. In American Nonviolence: A History of an Idea, theology professor Ira Chernus argues, “Thoreau’s religious life, which was for him the sum total of his life, was a quest for direct experience of this spiritual process of ultimate reality.” To Thoreau, God’s “Higher Laws” manifest most strongly in nature, where he first saw the interconnectedness of all reality. For example, in his first book, A Week on the Concord and Merrimack Rivers, Thoreau was shaken by the image of innocent fish thrown into the hydraulic machinery of the Billerica Dam. Soon, he saw similar power and injustice rampant in human society: slaves controlled by their owners; Native Americans expelled by Anglo immigrants; Mexicans threatened by the war of conquest. Thoreau’s various roles—spiritual seeker, writer, abolitionist, naturalist, and environmentalist—aligned with one another in his religious pursuit; he tried to live up to his moral ideals.

However, in the modern world, serious religious practice happens and stays in the church. In the public sphere, a secular audience tends to receive everything—religious performance included—as entertainment. Therefore, the more seriously the performers act, the more entertaining they become. As Kafka writes: “He made no secret of this, yet people did not believe him, at best they set him down as modest, most of them, however, thought he was out for publicity or else was some kind of cheat who found it easy to fast because he had discovered a way of making it easy, and then had the impudence to admit the fact, more or less.” Because nobody fasts anymore, only Kafka’s hunger artist knows that fasting is the easiest thing in the world. But even a simple message like this gets warped by the public’s skepticism.

The last thought in the quote—“some kind of cheat”—is the same accusation Schulz levels against Thoreau’s grand Walden show: he “kept going home for cookies and company.” (Note the secular word choice here.) Yet Thoreau is a bit different from Kafka’s performer. The reason Thoreau had to head back to Concord so often is perhaps more daunting than cheerful. As Walls explains in her biography, “Thoreau kept on taking jobs as the town handyman, just as he’d done for years—jobs on which he depended for his modest but still necessary income.” He did carpentry, painted houses, and built fences for a dollar a day. He didn’t live comfortably in his cabin romanticizing his ascetic life, as Schulz implies. Thoreau’s Walden years were as difficult as the rest of his early life. Evidence suggests that he didn’t even have a “loo” in his “lake house.” But in Thoreau’s time, even the poverty he wore as a badge seemed ridiculous to others. Ralph Waldo Emerson, Thoreau’s mentor and close friend, tried to make sense of Thoreau’s actions from a secular perspective, but he too ended in contempt: “I cannot help counting it a fault in him that he had no ambition.”

Human flaws and distress can often strike a humorous note in a secular context. Our laughter has a cruel nature; we take pleasure in feeling superior to others. Take physical appearance, for example. Centuries ago, Aristotle had already identified the link between ugliness and comedy in Poetics: “Comedy, as we have said, is a representation of inferior people, not indeed in the full sense of the word bad, but the laughable is a species of the base or ugly. It consists of some blunder or ugliness that does not cause pain or disaster…” Charlie Chaplin was devastatingly handsome, but he knew that he needed a toothbrush mustache, a derby hat, and a duck-like gait to appear comical. In “A Hunger Artist,” Kafka adopted an “anti-hero” to add to the character’s absurdity. He looks “pallid in black tights, with his ribs sticking out so prominently.” He is so odd that the only suitable place for him is in a cage among the straw. Nobel laureate Gabriel García Márquez’s “A Very Old Man with Enormous Wings” is a similar story. The protagonist—the supposed “angel”—is bald, toothless, and has “huge buzzard wings, dirty and half-plucked.” To match his appearance, he is shut in a chicken coop. Similarly, and unfortunately, Thoreau was born ugly. When the soon-to-be famous author Nathaniel Hawthorne came to live in Concord in 1842, he thought the 25-year-old Thoreau “a singular character…ugly as sin, long-nosed, queer-mouthed, and with uncouth and somewhat macabre behavior.” Although Hawthorne would later claim that Thoreau’s ugliness suited his honest and agreeable character, I find his use of the expression “ugly as sin” very interesting. Today, the phrase has lost much of its religious connotation; at the time, however, Thoreau’s sorry appearance seemed to suggest some hidden, inner flaw. This was not only because he lacked the charisma that naturally accompanies beauty, but because his failure to live up to God’s image seemed to contradict his self-portrait of a god-like, moral man.

Through performance, ugliness—among other human flaws—is received by the audience as otherness. (Consider, for example, our reaction to Chaplin’s characters: it is not that they look ugly—well, they do—but that they look odd, and thus hilarious.) Still, we must remember, as Aristotle says, that the strangeness must not “cause pain or disaster,” or else people won’t laugh. Wallace uses the phrase “entertainment as reassurance” to distinguish American humor (think of Tom and Jerry) from Kafka’s humor. Wallace suggests that Kafka’s jokes are unsettling and thus inaccessible to American college students. But I think the balance between eccentricity and comedy is present in Kafka’s stories; it is the audience, not Kafka, that searches for “reassurance.”

From the very beginning, eccentricity offends people because it violates social norms. In “What Is to Be Done about the Problem of Creepy Men?,” her discussion about people’s judgment of “creepiness,” law scholar Heidi Matthews reminds us that our “gut” has more to do with “regulating the boundaries of social mores than keeping us safe.” She has a point there, but I would argue that social norms are our primary source of security. So, to cope with the uncanniness of eccentricity in others, we try to explain their behaviors in a way that will solidify the validity of our social rules.

Consider the media coverage of any appalling crime. The first thing journalists do is seek out explanations for the macabre behavior, which is usually when the family shit comes in. We are satisfied with the fact that the perpetrator was, for example, abused by his father in his childhood. We feel safe because, as long as we prescribe family values to our children, they won’t grow into psychopaths. Wallace, in the same essay on Kafka’s humor, mentions some of the tropes Kafka plays on in “A Hunger Artist.” The word “anorexia” shares an etymological root with the Greek word for “longing.” Therefore, we can read the protagonist’s strange behavior as “starved for attention or love-starved.” We don’t know whether or not he fasts in order to build connections with people; yet when we believe that he does, we are not troubled by his strange conduct.

Then, to further strengthen our wounded sense of security, we emphasize the otherness of the “other” even more. When someone commits a horrifying crime, the newspapers are eager to interview his classmates, teachers, neighbors, and even those who only had chance encounters with him; they are searching for any possible hints as to the nature of his otherness. Therefore, as readers, we feel relieved that we can always detect those signs in a potential criminal and thus avoid danger. Also, because the odd—as the word and its synonyms suggest—are rare, once we lock them up, we will be fine. Once we feel secure, we can devour their eccentricity with pleasure in the same way we relish celebrity gossip.

It is no coincidence that the stories mentioned above—“A Hunger Artist” and “A Very Old Man with Enormous Wings”—both employ metaphors of confinement: the cage and the chicken coop. We keep distinct boundaries between us and the other; as long as these boundaries are in place, freak shows are amusing. Yet the most disturbing moment comes when the eccentric claim they are no different from us, that they abide by social norms, and that we should see ourselves in them. Towards the end of Kafka’s story, the hunger artist confesses he is a normal person:
“Are you still fasting?” asked the overseer, “when on earth do you mean to stop?” “Forgive me, everybody,” whispered the hunger artist; only the overseer, who had his ear to the bars, understood him. “Of course,” said the overseer, and tapped his forehead with a finger to let the attendants know what state the man was in, “we forgive you.” “I always wanted you to admire my fasting,” said the hunger artist. “We do admire it,” said the overseer, affably. “But you shouldn’t admire it,” said the hunger artist. “Well then we don’t admire it,” said the overseer, “but why shouldn’t we admire it?” “Because I have to fast, I can’t help it,” said the hunger artist. “What a fellow you are,” said the overseer, “and why can’t you help it?” “Because,” said the hunger artist, lifting his head a little and speaking, with his lips pursed, as if for a kiss, right into the overseer’s ear, so that no syllable might be lost, “because I couldn’t find the food I liked. If I had found it, believe me, I should have made no fuss and stuffed myself like you or anyone else.” These were his last words, but in his dimming eyes remained the firm though no longer proud persuasion that he was still continuing to fast.
This is the moment when we lose our laughter. The artist hints at the possibility that any of us could become him, just like that. But Kafka is still able to maintain the comedy by showing people’s desperation in clinging to their safety nets. In the story, the way people forget the strange artist is by buttressing his otherness. After his death, a panther is put into the cage to replace him. Unlike the pathetic fasting performer, the animal is full of life and shows no nostalgia for his lost freedom. The ending achieves two things. First, it erases people’s sad memories by offering something completely different. Second, it reassures people of the artist’s otherness. If someone should mention the hunger artist again, people can point at the animal cage, suggesting the late performer was not even human.

There is a similar tension between Thoreau’s lifelong performance and his spectators. Many readers, though they admire him, find his self-righteous and didactic tone unbearable. (Consider this quote in the opening chapter of Walden: “Most of the luxuries, and many of the so-called comforts of life, are not only not indispensable, but positive hindrances to the elevation of mankind.”) I tend to view these “teachings” as his confessions; I can even imagine him speaking in the same voice as the hunger artist: I have to live a principled life, I can’t help it…I couldn’t find an existing ethical lifestyle.

Thoreau, like Kafka’s hunger artist, was addicted to confessing. He admitted his hypocrisy. He fussed about human nature. Take his attitude toward eating meat: when he was with friends, he ate whatever was served. But alone in the woods, he interrogated himself about the ethics of eating animals. “I have no doubt that it is a part of the destiny of the human race, in its gradual improvement, to leave off eating animals, as surely as the savage tribes have left off eating each other when they came in contact with the more civilized.” He also groaned about the immoral modern world of which he was a part. After he took to natural science, he questioned its ethics: “The inhumanity of science concerns me as when I am tempted to kill a rare snake that I may ascertain its species—I feel that this is not the means of acquiring true knowledge.” When land surveying finally brought him steady income at the age of 32, he found himself becoming an accomplice in destroying his beloved nature. The forest he had surveyed just a year before was clear-cut and subdivided into fifty-two house lots by its owner. “Trade curses everything it handles,” Thoreau remarked. His reaction resonated with his antipathy towards commerce back in his Harvard years: materialism, he said in a debate, “enslaves us, turns us into brutes. To be human is to cast off these material desires and walk forth, freely, into paradise.” Ironically, even his most honest gesture to fight against immorality seems suspicious and hence quixotic. “You all know,” he warned his neighbors in his first public speech in Concord, “the lecturer who speaks against money is being paid for his words—and that’s the lesson you remember.”

Thoreau’s words are disturbing to us because they reveal our hypocrisy. We don’t want to be plagued by moral quandaries every minute of our lives. In truth—like the aforementioned fictional characters—Thoreau “lived in a cage” throughout his performance career, as he spent much of his time in isolation. On the one hand, he never joined any political organization. His faith in individualism was consistent with his faith in the moral freedom promised by God. Thoreau was careful to avoid any coercion and believed “shared religious or moral values will enhance community only if they are adopted voluntarily.” On the other hand, the public was eager to paint his heroic singularity as eccentricity. After that work was done, his audience could proudly conclude that Thoreau’s solitude led to his isolation; it was his personal fault, and the spectators were let off the hook. “Poor Thoreau,” Schulz derides him in her New Yorker article. “He, too, was the victim of a kind of shipwreck—for reasons of his own psychology, a castaway from the rest of humanity.” Criticism like Schulz’s was typical during Thoreau’s lifetime, too. After Thoreau’s imprisonment, Emerson defended his own adherence to the social norm—paying taxes—by scolding his young friend: “Your true quarrel is with the state of Man.” When people laughed at Thoreau’s quirkiness, they successfully simplified and silenced his message. That is why, in her same-titled essay “Civil Disobedience,” Hannah Arendt doubts Thoreau’s politics by quoting scholar Nicholas W. Puner: “Civil disobedience practiced by a single individual is unlikely to have much effect. He will be regarded as an eccentric more interesting to observe than to oppress.”

For Wallace, the central comedy of Kafka’s work is the horrific struggle that Kafka’s characters undergo to establish and confirm their human selves. Wallace’s view reminds me of Albert Camus’s final analysis of Sisyphus in The Myth of Sisyphus: “The struggle itself toward the heights is enough to fill a man’s heart. One must imagine Sisyphus happy.” Whereas one can only speculate about the hearts of Kafka’s characters or Sisyphus, Thoreau exuded joy and hope. Ever since Congress passed the Fugitive Slave Act in 1850, he had been grilling himself with this dreadful query: “I walk toward one of our ponds, but what signifies the beauty of nature when men are base?” But Thoreau never let himself wallow in despair. In a journal entry that would later appear at the end of “Slavery in Massachusetts,” Thoreau captured a silver lining in nature: “But it chanced the other day that I scented a white water-lily, and a season I had waited for had arrived. It is the emblem of purity.” As Walls argues, rooted in “the slime and muck of earth,” those pure, fragrant flowers symbolize Thoreau’s belief in the “purity and courage” that will be born of “the sloth and vice of man, the decay of humanity.”

Though Thoreau and Kafka’s characters may experience a similarly absurd reality during their lifetimes, their afterlives are as different as night and day. The characters in Kafka’s fiction are often trapped in time. The hunger artist has to repeat the 40-day performance over and over again. Gregor, in The Metamorphosis, is first urged by human time and then tormented by bug-time. In the eponymous short story, Kafka’s Poseidon is burdened with endless paperwork and never gets to see the oceans. (Yes, another tragedy turned comedy!) The only thing that saves them from the labyrinth of time is death—which, then again, leads to nothingness, meaninglessness, and the irredeemable. In contrast, as Walls shows in her biography, Thoreau was able to believe in “the constant slow work of creation” by enlarging the scale of time:
It was easy to see destruction, which is sudden and spectacular: everyone hears the crash of a falling tree. But who hears the growth of a tree, the constant slow work of creation? “Nature is slow but sure.” She wins the race by perseverance; she knows that seeds have many uses, not just to reproduce their kind. “If every acorn of this year’s crop is destroyed, never fear! She has more years to come.” Here was his [Thoreau’s] solution to the baffling waste of the white oak crop: what made no sense on a human scale could be understood by lengthening the measure of time to the scale of the planet. The man who was running out of time now thought as if he had all the time, literally, in the world.
Consequently, death transformed Thoreau into a living soul in his books—pure and bodiless—the state he longed for when he was alive. Over time and space, he himself facilitates creation in the way he favored: he inspires his readers to grow voluntarily, freely, and deliberately. Probably the two most famous Thoreauvian readers are Martin Luther King Jr. and Mahatma Gandhi. Roughly a century after Thoreau’s death, both Gandhi and Dr. King put Thoreau’s philosophy into practice. In India and the United States, officers resigned their posts, jails were filled with conscientious objectors, and the “machinery” of the unjust system was “clogged.” Through his seemingly ineffective political struggle, Thoreau was able to elevate those who came after him to a different point in that struggle.

While people today still find themselves stuck in absurd, Kafkaesque situations, we mustn’t deny the slow but sure progress of human civilization: the abolition of slavery, the creation of the National Park Service, women’s suffrage, the end of segregation—just to name some of the most visible examples in the United States. Indeed, as Thoreau once wrote in a letter to Harrison Blake, a Harvard Divinity School graduate, “It is not in vain that man speaks to man. This is the value of literature.” Despite all the voyeurism, blasphemy, and suppression, Thoreau’s life of performance art is truly an immense and reverent joy.

Protagonists

-

This essay is excerpted from the author’s memoir-in-progress, The Vulgar American.

In May of 2019, I’d been living in the U.K. for four months, and suddenly I was hearing from my mom a lot less frequently. She wasn’t feeling well. She was often at the doctor’s. She was losing weight. Having trouble keeping food down. They kept prescribing her things for stomach ulcers and sending her home.

But nothing about your diet has changed, I said.

I took a lot of Advil the other day, she said.

You always take Advil.

I know, but this was a lot. And with wine.

I don’t think so, Mom. I knew that when she said wine, she meant one glass, two at most. Are you stressed about something?

No, she said.

Then it doesn’t make sense.

*

I was taught that an easy way to write fiction was to create a character who wants something. And then to place obstacles in front of that character. And watch them overcome those obstacles.

Every time I tried to write fiction this way, I failed.

Sometimes I didn’t understand a person wanting something that wasn’t achievable. Sometimes I didn’t understand an obstacle that could be overcome.

The obstacles life threw at me were disasters that couldn’t be faced or overcome or fixed. They stopped me. They stopped me and on the other side of obstacles like that, I no longer wanted the things I wanted before.

And what was it that I wanted? I wanted to go to grad school, so I applied to many and went to the one program that offered me a full ride (both times). I wanted to go to Canada, so I applied to grad school again, and I didn’t get in, so my family didn’t get to go. I had to want something else, something new, because the things I wanted had clear paths and clear dead ends.

Was I supposed to dream bigger? Was I supposed to dream more abstractly? Was I supposed to be pining after someone or something unlikely? Was I supposed to, in the end, surprise myself?

*

When I wrote Naamah, she didn’t want anything. She was set on a path by her husband and had to make the best of it.

As her author, I offered her things, escapes, gifts, friendships, lovers—the opposite of obstacles. I could never have sat down to the page and offered that woman obstacles, forced them on her.

I felt like No-Face in Spirited Away, holding out handfuls of gold to Chihiro, making a gentle noise, uh, uh.

And as I was not offering gold or greed, as I was not judging her, ever, Naamah took her time and considered her gifts, and often gently refused them, just as Chihiro had.

*

A woman often cannot want.

*

A woman makes do.

A woman makes do well, but that is not to be confused with a woman who is happy.

*

I often considered the happiness of a woman who wants and pursues. But I did not write about that woman because I had never met a woman like that.

*

When my mother was getting sick, I was running out of my medicine. The doctors in the U.K. gave me the exact prescriptions I’d had before, but when I went to the pharmacy, the Trazodone wasn’t there.

What do you mean?

We don’t have it.

Yes. I understand that. When will you have it?

We don’t know.

I stopped myself from asking What do you mean? again. Instead I said, What do I do?

We’ll call you when we have it.

But you don’t know when that will be?

No.

No. I said it to myself.

*

The day I found out about my mother’s pancreatic cancer diagnosis, I walked into my doctor’s practice and said I needed grief counseling. They made me an appointment to discuss it with my doctor for 10 minutes.

*

I’m changing so many words to Americanize my language. I went to my surgery. They made an appointment for me to see my GP. I lived in a flat. My son went to primary school. Next he would go to secondary school. I said ground floor to get around not saying first floor. And so on. And so on.

I was living a life I almost recognized in a language I almost recognized. The new words and appropriate contexts weren’t hard to learn, and yet they made everything seem slightly off, like Britain was the other side of a shimmering field that separated two parallel universes, and the big differences were in the healthcare system and the thing I missed most was Target.

*

I learned how to sleep without the Trazodone. I didn't fall asleep immediately anymore, but it happened. Perhaps because my first response to my mother's sickness was that I became much more prone to sleep. At night. During the day. Anytime no one needed anything of me.

I no longer needed anything of myself. Not to eat or write or amuse myself or feel purpose. Nothing.

I’d tell myself that I’d enjoy going to see a movie, and then I’d sleep in the theater. I was like the old man in the back row. I should have sat next to him. My snoring woke me up.

At another point in my life, I would have been embarrassed. Not then.

*

I was most awake from 3:30 pm to 6:30 pm—the hours between when my son came home from school and my husband came home from work. For three hours every day I was alive and brimming and engaged. That was what I could handle.

And those hours brought me joy.

I was living for those hours and those hours made me understand what I was living for and what life was and what I wanted more of, the most of. Though this is more about what happened after she died, and not about early that summer when she got sick.

*

My mother read more than anyone I knew. More than any English teacher, more than any writer, more than any student of writing. She read the Booker Prize winner every year and then she’d read the finalists and then she’d go back and read the winners from the ‘70s when she got bored.

She’d fall in love with an author and read every book they’d ever written. I remember when she did this with Atwood, and I fell in love with the dust jacket of The Blind Assassin with its thick paper and gold lettering. I was 16.

She’d read all the classics—everything by Dickens, Tolstoy, Hemingway. And she’d read the less literary, too. She knew when to give me The Clan of the Cave Bear. When to give me The Thorn Birds. When to give me The Fountainhead.

And when I wanted literary: The Source, Mila 18, Marjorie Morningstar. My whole reading life was guided by her. Often I’d already read the books assigned in English class a few years before at her recommendation. The Catcher in the Rye, A Separate Peace, Of Mice and Men.

She didn’t mind when I didn’t like something—like For Whom the Bell Tolls. She gave me the next book. Sometimes we read a breakout hit one right after the other. She’d finish and then the copy would become mine. This is how we read the Best American Short Stories collection every year, too.

She was honest and frank about the books and their shortcomings, but she always read them to the end. She could always find something to admire.

When she got sick she couldn’t read, and she didn’t want to talk about books either. We were both thinking it. I was bringing up books she would never get to read.

*

I didn’t think about whether my mother was a woman who wanted things unless she asked me to.

Do you think I should have been a teacher? she asked.

I don’t know. You loved teaching.

This was after she retired. She had taken more and more classes in different art forms since then. Sculpture, upholstery, basket-weaving.

I think I should have been an artist. Or I think I could have been an artist. Or Do you think I should have been an artist?

Artists don’t make much money. It’s not a good life.

She knew when I spoke like this, I was speaking about my own life, but she didn’t comment on that part of what I was saying, that underlying simmer.

I think I could have, she said.

*

When I was very young, I realized my mother hadn’t done things because she had had me. She often talked about the travel she had thought she would do. I cried.

Why are you crying? she asked me.

I tried to explain that I’d kept her from what she wanted. I’d changed her. I’d held her up. I’d stopped her.

No, she said. Well, yes, she said. Of course you did, she said. But because I wanted you to. Because I wanted you.

It wasn’t easy to explain for either of us.

*

My mother didn’t say—Don’t worry. We can still do those things. Or I can still do those things.

I swore to myself that when I had a child, I would still do everything I wanted to do. I would strap him to me and take him with me and off we would go.

It was a naïve way to think about life and children and wants. But it helped me to have the child I wanted, and then keep living.

*

I also thought about my mother differently after that, as if she had submitted to her life. Given in and given up.

That, too, was naïve.

She had shaped her life. She had filled it with the things she wanted—her children—and then provided for us. And she’d done it better than I ever could.

And when we got older she traveled and made art and had her musings about her what-ifs.

Life is long and different parts of it capture us, and that can look like resignation, but it's the imposition of the things we want, and the needs of those things change, and life changes again. And if you never felt resigned to it, then you weren't. You were waiting.

Naamah, and all the women in my books, are apologies to my mother, for misunderstanding.

*

How did you teach a life like that to fiction writers? How did you teach a woman’s life?

Was the craft of fiction around desire and obstacle taught by men and for men, for men protagonists and men readers? Was fiction part of the patriarchy? Was the craft of fiction part of the patriarchy?

The answers were all yes, and the answers were always surprising only because I had never thought of the questions before.

*

The pharmacy called two months later and said they had my Trazodone.

I don’t want it, I said. Even though I did want it. I wanted it badly. But I didn’t want to come to rely on it again, and then be told I could not have it.

My whole life was a reminder of everything I had come to rely on that was no longer mine to have.
