Elegy of the Walker

-

By the conclusion of Mildred Lissette Norman’s 2,200-mile hike of the Appalachian Trail in 1952—the steep snow-covered peaks of New Hampshire’s White Mountains; the autumnal cacophony of Massachusetts’ brown, orange, and red Berkshires; the verdant greens of New York’s Hudson Highlands and Pennsylvania’s Alleghenies; the misty roll of Virginia’s Blue Ridge; the lushness of North Carolina and Tennessee’s Great Smoky Mountains—she would wear down the soles of her blue Sperry Top-Siders into a hatchwork of rubber threads, the rough canvas of the shoes ripping apart at the seams. “There were hills and valleys, lots of hills and valleys, in that growing up period,” Norman, who became the first woman to hike the trail in its entirety in a single season, would recall. The Top-Siders were lost to friction, but along with wearing out 28 additional pairs of shoes over the next three decades, she would also gain a new name—Peace Pilgrim. The former secretary would (legally) rechristen herself after a mystical experience somewhere in New England, convinced that she would “remain a wanderer until mankind has learned the way of peace.”


Peace Pilgrim’s mission began at the Rose Bowl Parade in 1953, gathering signatures on a petition to end the Korean War. From Pasadena she trekked over the Sierra Nevada, the hardscrabble southwest, the expansive Midwestern prairies, and the roll of the Appalachians, into the concrete forest of New York City. She gained spectators, acolytes, and detractors; there was fascination with this 44-year-old woman, wearing a simple blue tunic emblazoned in white capital letters with “Walking Coast to Coast for Peace,” her greying hair kept up in a bun and her pockets containing only a collapsible toothbrush, a comb, and a ballpoint pen. By the time she died in 1981, she had traversed the United States seven times. “After I had walked almost all night,” she recalled in one of the interviews posthumously collected into Peace Pilgrim: Her Life and Work in Her Own Words, “I came out into a clearing where the moonlight was shining down… That night I experienced the complete willingness, without any reservations, to give my life to something beyond myself.” It was the same inclination that compelled Abraham to walk into Canaan, penitents to trace Spain’s Camino de Santiago, and the whirling Mevlevi dervishes to traipse through the Afghan bush. It was an inclination toward God.

Something about the plodding of one foot after another, the syncopation mimicking the regularity of our heartbeat, the single-minded determination to get from point A to point B (wherever those mythic locations happen to be), gives walking the particular enchantments that only the most universal of human activities can have. Whether a stroll, jog, hike, run, saunter, plod, trek, march, parade, patrol, ramble, constitutional, wander, perambulation, or just plain walk, the universal action of moving left foot-right foot-left foot-right foot marks humanity indelibly, so common that it seemingly warrants little comment if you’re not a podiatrist. But when it comes to the subject, there are as many narratives as there are individual routes, for as Robert Macfarlane writes in The Old Ways: A Journey on Foot, “a walk is only a step away from a story, and every path tells.” Loath we should be to let such an ostensibly basic act pass without some consideration.

Rebecca Solnit writes in Wanderlust: A History of Walking that “Like eating or breathing, [walking] can be invested with wildly different cultural meanings, from the erotic to the spiritual, from the revolutionary to the artistic.” Walking is leisure and punishment, introspection and exploration, supplication and meditation, even composition. As a tool for getting lost, both literally and figuratively, for fully inhabiting our being, walking can empty out our selfhood; it is a mechanism for transmuting a noun into a verb, for transforming the walker into the walking. When a person has pushed themselves so that their heart pumps like a piston, so that they feel the sour burn of blisters and the chafing of denim, so that breathing’s rapidity is the only focus, then there is something akin to pure consciousness (or possibly I’m just fat). And of course, there is all that you can simply observe with that consciousness, unhampered by screens, so that walking is “powerful and fundamental,” as Cheryl Strayed writes in her bestseller Wild: From Lost to Found on the Pacific Crest Trail, an account of how Strayed hiked thousands of miles following the death of her mother and learned “what it was like to walk for miles with no reason other than to witness the accumulation of trees and meadows, mountains and deserts, streams and rocks, rivers and grasses, sunrises and sunsets… It seemed to me it had always felt like this to be a human in the wild.”

Maybe that sense of being has always attended locomotion, ever since a family of Australopithecus pressed their calloused heels into the cooling volcanic ash of Laetoli in Tanzania some 3.7 million years ago. Discovered in 1976 by Mary Leakey’s team, the preserved footprints were the earliest example of hominin bipedalism. The traces of two adults and a child suggested parents strolling with a toddler, as if they were Adam and Eve with either Cain or Abel. Laetoli’s footprints are smaller than those of a modern human, but they lack the divergent toe of other primates, and they indicate that whoever left them moved from the heel of the foot to the ball, as most of us do. Crucially, there were no knuckle impressions left, so they didn’t move in the manner that chimpanzees and gorillas do. “Mary’s footprint trail was graphically clear,” explains Virginia Morell in Ancestral Passions: The Leakey Family and the Quest for Humankind’s Beginnings, “the early hominoids stood tall and walked as easily on two legs as any Homo sapiens today… it was apparently this upright stance, rather than enlarged crania, that first separated these creatures from other primates.”

The adults were a little under five feet tall and under 100 pounds, covered in downy brown fur, with sloping brows and protruding jaws, the face of an ape but the uncanny eyes of a human, the upright walking itself transforming them into the latter. Walking preceded words, the ambulation perhaps necessitating the speaking. Australopithecus remained among the pleasant green plains of east Africa, but by the time anatomically modern humans had evolved in the area that is now Kenya, Tanzania, and Ethiopia, walking became the engine by which people disseminated through the world. Meandering was humanity’s insurance; as Nicholas Wade writes in Before the Dawn: Recovering the Lost History of Our Ancestors, as recently as 50,000 years ago the “ancestral human population, the first to possess the power of fully articulate modern speech, may have numbered only 5,000 people, confined to a homeland in northeast Africa.” In such small numbers, and in such a circumscribed area, humanity was prisoner to circumstance, where an errant volcano, drought, or epidemic could have easily consigned us to oblivion. Walking as far as we could was our salvation.

Humans would walk out of Africa into Asia, and perhaps by a combination of simple boats and swimming down the Indonesian coast into Australia, across the Bering Strait into North America, and over the Panamanian isthmus into South America, with distant islands like Madagascar, New Zealand, and Iceland waiting for sailing technology to ferry people to their shores millennia after we left the shade of the Serengeti’s bulbous baobab trees. We think of our ancestors as living in a small world, but theirs was an expansive realm, all the more so since it wasn’t espied through a screen. They were partner to burrowing meerkats peeking over the dry scrub of the Kalahari, to nesting barn owls overlooking the crusted, moss-covered bark of the ancient Ciminian Forest, to the curious giant softshell turtles of the Yangtze River. To walk is to be partner to the natural world; it is to fully inhabit being an embodied self. Choosing to be a pedestrian today is to reject the diabolic speed of both automobile and computer. Macfarlane writes in The Wild Places that nature is “invaluable to us precisely because… [it is] uncompromisingly different… you are made briefly aware of a world at work around and beside our own, a world operating in patterns and purposes that you do not share.” To sojourn into a world so foreign was the birthright of the first humans, and it still is today, if you choose it.

All continents, albeit mostly separated by unsettlingly vast oceans, are in some form or another connected by thin strips of land here or there, something to the advantage of the Scottish Victorian explorer Sir George Thompson, who walked from Canada to Western Europe via Siberia. More recently there was the English explorer George Meegan, who from 1977 to 1983 endeavored to walk from Patagonia to the northern tip of North America, which involved inching up South America’s Pacific coast, crossing the Darién Gap into Central America, circling the Gulf Coast and walking up the Atlantic shore, following the Canadian border, and then walking from the Yukon into Alaska. Meegan’s expedition covered 19,019 miles, the longest recorded uninterrupted walk. Affected by the same nervous propulsion that possibly compelled that first generation to leave home, Meegan explains in The Longest Walk: The Record of Our World’s First Crossing of the Entire Americas that “once the idea seemed to be a real possibility, once I thought I could do it, I had to do it.” Along the way Meegan wore out 12 pairs of hiking boots, got stabbed once, and met peanut-farmin’ Jimmy Carter at his Georgia homestead.

Meegan’s route was that of the first Americans, albeit accomplished in reverse. The Americas were the most recent large landmass to be settled, and those earliest walkers observed a verdant expanse, for as Craig Childs describes the Paleolithic continent in Atlas of a Lost World: Travels in Ice Age America, the land east of the Bering Strait was a “mosaic of rivers and grasslands… horizons continuing on as if constantly giving birth to themselves—mammoths, Pleistocene horses, and giant bears strung out as far as the eye could see. It must have seemed as if there was no end, the generosity of this planet unimaginable.” It is a venerable (and dangerous) tradition to see America as an unspoiled Paradise, but it’s not without its justifications, and it is one that I’ve been tempted to embrace during my own errands into the wilderness. I was raised not far from Pittsburgh’s Frick Park, a 644-acre preserve set within the eastern edge of the city, a bramble of moss-covered rocky creeks and surprisingly steep ravines, a constructed forest primeval meant to look as it did when the Iroquois lived there, and there I too could convince myself that I was a settler upon the frontier. Like all sojourns into the woods, my strolls in Frick Park couldn’t help but have a bit of the mythic about them, especially at dusk.

One day when I was a freshman, a friend and I took a stack of cheap, pocket-sized, rough orange-covered New Testaments which the Gideons, who were standing the requisite constitutionally mandated distance from our high school, had been assiduously handing out as part of an ill-considered attempt to convert our fellow students. In a fit of adolescent blasphemy, we went to a Frick Park path and walked through the cooling October forest as twilight fell, ripping the cheap pages from the bibles and letting them fall like crisp leaves to the woods’ floor, or maybe inadvertently as a trail of Eden’s apple seeds. There is no such thing as blasphemy unless you already assent to the numinous, and while our heretical stroll turned God on his head, a different pilgrim had once roamed woods like these. During the Second Great Awakening, when revivals of strange fervency and singular belief burnt down the Appalachian edge, a pious adherent of the mystical Swedenborgian faith named John Chapman was celebrated for traipsing through Pennsylvania, Ohio, Indiana, and Illinois. Briefly a resident of the settlement of Grant’s Hill in what is today downtown Pittsburgh, several miles west of Frick Park, Chapman made it his mission to spread the gospel of the New Church, along with the planting of orchards. Posterity remembers him as Johnny Appleseed.

Folk memory has Johnny Appleseed fixed in a particular (and peculiar) way: the bearded frontiersman, wearing a rough brown burlap coffee sack, a copper pot on his head, and a trundled bag over his shoulder as the barefoot yeoman plants apples across the wide expanse of the West. I’d wager there is a strong possibility you thought he was as apocryphal as John Henry or Paul Bunyan, but Chapman was definitely real; a Protestant St. Francis of whom it was said that he walked with a tamed wolf and that due to his creaturely benevolence even the mosquitoes would spare him their sting. Extreme walkers become aspects of nature, their souls like the migratory birds that trace lines over the earth’s curvature. Johnny Appleseed’s walking was a declaration of common ownership over the enormity of this land. Sometime in the 1840s, Chapman found himself listening to the outdoor sermonizing of a fire-and-brimstone Methodist preacher in Mansfield, Ohio. “Where is the primitive Christian, clad in coarse raiment, walking barefoot to Jerusalem?” the minister implored the crowd, judging them for their materialism, frivolity, and immorality. Finally, a heretofore silent Johnny Appleseed, grown tired of the uncharitable harangue, ascended the speaker’s platform and hiked one grime-covered, bunion-encrusted, and blistered black foot underneath the preacher’s nose. “Here’s your primitive Christian,” he supposedly declared. Even Johnny Appleseed’s gospel was of walking.

“John Chapman’s appearance at the minister’s stump,” writes William Kerrigan in Johnny Appleseed and the American Orchard: A Cultural History, made the horticulturalist a “walking manifestation of a rejection of materialism.” Not just figuratively a walking embodiment of such spirituality, but literally a walking incarnation of it. The regularity of putting one foot after the other has the rhythm of the fingering of rosary beads or the turning of prayer wheels; both intimately physical and yet paradoxically a means of transcending our bodies. “Walking, ideally, is a state in which the mind, the body, and the world are aligned,” writes Solnit, “as though they were three characters finally in conversation together, three notes suddenly making a chord.” Hence walking as religious devotion, from the Australian Aborigine on walkabout amidst the burnt-ochre Outback, to the murmuring pilgrim tracing the labyrinth underneath the stone flying buttresses of Chartres Cathedral, to the hajji walking over hot sands towards Mecca, to the Orthodox Jew graced with the gift of deliberateness as she walks to shul on Shabbat. Contemplative, meditative, and restorative, walking can, religiously speaking, also be penitential. In 2009 the Irish Augustinian Fr. Michael Mernagh walked from Cork to Dublin on a pilgrimage of atonement, undertaken single-handedly in penitence for the Church’s shameful silence regarding child sexual abuse. Not just a pilgrimage but a protest, with Fr. Mernagh saying of the rot infecting the Church that the “more I have walked the more I feel it is widespread beyond our comprehension.” Atonement is uncomfortable, painful even. As pleasant as a leisurely stroll can be, a penitential hike should strain the lungs, burn the muscles. If penitence isn’t freely taken, however, then it’s no longer penitence. And if there’s no reason for contrition, then it’s something else—punishment. Or torture.

“The drop outs began,” recalled Lt. Col. William E. Dyess. “It seemed that a great many of the prisoners reached the end of their endurance at about the same time. They went down by twos and threes. Usually, they made an effort to rise. I never can forget their groans and strangled breathing as they tried to get up. Some succeeded.” Many didn’t. A Texan Army Air Forces officer with movie-star good looks, Dyess had been fighting with the 21st Pursuit Squadron in the Philippines when the Japanese Imperial Army invaded in 1942. Along with some 80,000 fellow American and Filipino troops, he would be marched the 70 miles from Mariveles to Camp O’Donnell. The prisoners were denied food and water and forced to walk in the punishing heat of the jungle sun, with temperatures well over 100 degrees; it’s estimated that possibly 26,000 of them—a third of those captured—perished in that scorching April of 1942.

Bayonet and heat, bullet and sun, all goaded the men to put one foot in front of the other until many of them couldn’t any more. Transferred from the maggot- and filth-infested Camp O’Donnell, where malaria and dengue fever took even more Filipinos and Americans, Dyess was eventually able to escape and make it back to American lines. While convalescing in White Sulphur Springs, W.Va., Dyess narrated his account—the first eyewitness American testimony to the Bataan Death March—to Chicago Tribune writer Charles Leavelle. The military prohibited its release, and Leavelle would not see the publication of Bataan Death March: A Survivor’s Account until 1944. Dyess never read it; he had died a year before. His P-38G-10-LO Lightning lost an engine during takeoff at a Glendale, Calif., airport, and rather than risk civilian casualties by abandoning the plane, Dyess crashed it into a vacant lot, so that his life would be taken by flying rather than by walking.

Walking can remind us that we’re alive, so that it’s all the more obscene when such a human act is turned against us, when the pleasure of exertion turns into the horror of exhaustion, the gentle burn in muscles transformed into spasms, breathing mutated into sputtering. Bataan’s nightmare, among several, was that it was walking that couldn’t stop, wasn’t allowed to stop. “It is the intense pain that destroys a person’s self and world,” writes philosopher Elaine Scarry in The Body in Pain: The Making and Unmaking of the World, “a destruction experienced spatially as either the contraction of the universe down to the immediate vicinity of the body or as the body swelling to fill the entire universe.” Torture reminds us that we’re reducible to bodies; it particularizes and universalizes our pain. With some irony, walking does something similar, with the exertion of moving legs and swinging arms, our wide-ranging mobility announcing us as citizens of the universe. Hence the hellish irony of Bataan; or the 2,200 miles from Georgia to Oklahoma that more than 100,000 Cherokee, Muscogee, Seminole, Chickasaw, and Choctaw were forced to walk by the U.S. federal government between 1830 and 1850; or the January 1945 40-mile march of 56,000 Auschwitz prisoners to the Loslau train station in sub-zero temperatures; or the 2.5 million residents of Phnom Penh, Cambodia, forced to evacuate into the surrounding countryside by the Khmer Rouge in 1975. These walks were hell, prisoners followed by guards with guns and German shepherds, over the hard, dark ground.

Harriet Tubman’s walks were also in the winter, feet trying to gain uncertain purchase upon frozen bramble, stumbling over cold ground and slick, snow-covered brown leaves, and she too was pursued by men with rifles and dogs. She covered distances similar to those of the forced marches, but Tubman was headed to another destination, and that has made all the difference. Crossing the Mason-Dixon line in 1849, Tubman recalled that “I looked at my hands to see if I was the same person. There was such glory over everything; the sun came like gold through the trees, and over the fields, and I felt like I was in Heaven.” But she wasn’t in Heaven, she was in Pennsylvania. For Tubman, and for the seventy enslaved people whom she liberated on thirteen daring missions back into Maryland, walking was arduous, walking was frightening, walking was dangerous, but more than anything walking was the price of freedom.

Tubman would rightly come to be known as the Moses of her people (another prodigious walker), descending into the antebellum South like Christ harrowing Hell. The network of safe houses and sympathetic abolitionists who shepherded the enslaved out of Maryland, Virginia, and North Carolina into Pennsylvania, New England, and Canada, who quartered the enslaved in cold, dusty, cracked root cellars and hidden passageways, used multiple means of transportation. People hid in the backs of wagons underneath moldering produce, they availed themselves of steamships, and sometimes the Underground Railroad was a literal railroad. One enterprising man named Henry “Box” Brown even mailed himself from Richmond to Philadelphia, the same year Tubman arrived in the Quaker City. But if the Underground Railroad was anything, it was mostly a process of putting one foot before the other on the long walk to the north.

Familiar with the ebbs and flows of the brackish Chesapeake as it lapped upon the western shores of Dorchester County, Tubman was able to interpret mossy rocks and trees to orient herself, navigating by the Big Dipper and Polaris. Her preferred time of travel was naturally at night, and winter was the best season to abscond, deploying the silence and cold of the season as camouflage. Dorchester is only a scant 150 miles from Philadelphia, but those even further south – South Carolina, Georgia, even Mississippi – would also walk to freedom. Eric Foner explains in Gateway to Freedom: The Hidden History of the Underground Railroad that “Even those who initially escaped by other means ended up having to walk significant distances.” Those nights on the Underground Railroad must have been terrifying: hearing the resounding barks of Cuban hounds straining at slave catchers’ leashes, the metallic taste of fear sitting in mouths, bile rising up in throats. Yet what portals of momentary grace and beauty were there, those intimations of the sought-after freedom? To see the graceful free passage of a red-tailed hawk over the Green Ridge, the bramble thickets along the cacophonous Great Falls of the cloudy Potomac, the luminescence of a blue moon reflected on a patch of thick ice in the Ohio River?

During that same decade, the French dictator Napoleon III was, through ambitious city planning, inventing an entirely new category of walker – the peripatetic urban wanderer. Only a few months after Tubman arrived in Philadelphia, and four thousand miles across the ocean, something new in the annals of human experience would open at Paris’ Au Coin de la Rue – a department store. For centuries, walking was simply a means of getting from one place to another: from home to the market, from market to the church. With the department store, or the glass-covered merchant streets known as arcades, people were enticed not just to walk somewhere, but rather to walk everywhere. Such were the beginnings of the category of perambulator known as the flâneur, a word that is untranslatable, but carries connotations of wandering, idling, loafing, sauntering.

Being a flâneur means simply walking without purpose other than to observe: strolling down Le Havre Boulevard and eyeing the window displays of fashions woven in cashmere and mohair at Printemps, espying the bakeries of Montmartre laying out macarons and pain au chocolat, passing the diners in the outdoor brasseries of the Left Bank eating coq au vin. Before the public planner Georges-Eugène Haussmann’s radical Second Empire reforms, Paris was a crowded, fetid, confusing, and disorganized assemblage of crooked and narrow cobblestoned streets and dilapidated half-timbered houses. Afterwards it became a metropolis of wide, magnificent boulevards, parks, squares, and museums. Most distinctively, there were the astounding 56,573 gas lamps that had been assembled by 1870, lit by a legion of allumeurs at dusk, so that people could walk at night. If Haussmann – and Napoleon III – were responsible for the arrival of the flâneur, then it was because the city finally had things worth seeing. For the privileged flâneur, to walk wasn’t the means to acquire freedom – to walk was freedom.

“For the perfect flâneur,” writes poet Charles Baudelaire in 1863, “it is an immense joy to set up house in the heart of the multitude, amid the ebb and flow of movement, in the midst of the fugitive and the infinite… we might liken him to a mirror as vast as the crowd itself; or to a kaleidoscope gifted with consciousness.” Every great city’s pedestrian-minded public thoroughfares—the Avenue des Champs-Élysées and Cromwell Road; Fifth Avenue and Sunset Boulevard—are the rightful territory of the universal flâneur. Writers like Baudelaire compared the idyll of city walking to that other 19th-century innovation, photography; the flâneur existed inside a living daguerreotype, as if they had entered the hazy atmosphere of an impressionist painting, the gas lamps illuminating the drizzly fog of a Parisian evening. For the 20th-century German philosopher Walter Benjamin, who analyzed the activity in his uncompleted magnum opus The Arcades Project, the flâneur was the living symbol of modernity; as Benjamin wrote, the “crowd was the veil from behind which the familiar city as phantasmagoria beckoned to the flâneur.”

I often played the role of flâneur during the two years my wife and I lived in Manhattan, in a bit of much-coveted rent-controlled bliss. When your estate is 200 square feet, there’s not much choice but to be a flâneur, and so we occupied ourselves with night strolls through the electric city, becoming familiar with the breathing and perspiring of the metropolis. Each avenue has its esprit de place: stately residential York, commercial First and Second with their banks and storefronts, imperial Madison with its regal countenance, Park with its aura of old money, barely reformed Lexington with its intimations of past seediness. In the city at night, we availed ourselves of the intricate pyrotechnic window displays of Barneys and Bloomingdale’s, of the bohemian leisure of the Strand’s displays in front of Central Park, of the linoleum cavern underneath the 59th Street Bridge, and of course Grand Central Station, the United States’ least disappointing public space.

Despite gentrification, rising inequity, and now the pandemic, New York still amazingly functions according to what the geographer Edward Soja describes in Thirdspace as a place where “everything comes together… subjectivity and objectivity, the abstract and the concrete, the real and the imagined, the knowable and the unimaginable.” Yet Manhattan is perhaps more a nature preserve for the flâneur, as various economic and social forces over the past few decades have conspired to make our species extinct. The automobile would seem to be a natural predator for the type, and yet even in the deepest environs of the pedestrian-unfriendly suburbs the (now largely closed) American shopping mall fulfilled much the same function as Baudelaire’s arcades. To stroll, to see, to be seen. A new threat has emerged in the form of Amazon, which portends the end of the brick-and-mortar establishment, the coronavirus perhaps sounding the final death knell of the flâneur. If that type of walker was birthed by the industrial revolution, then it now appears that late capitalism will be his demise, undone by our new tyrant Jeff Bezos.

The rights of the flâneur were never equally distributed; the classic accounts make scant mention of a walker needing to be hyperaware of her surroundings, of needing to carry keys in her fist, or of having to arm herself with mace. While it’s not an entirely unknown word, flâneuse is a much rarer term, and it’s clear that the independence and assumed safety that pedestrian exploration implies is more often than not configured as masculine. Women have, of course, been just as prodigious in mapping the urban space with their feet as have men, with Lauren Elkin joking in Flâneuse: Women Walk the City in Paris, New York, Tokyo, Venice, and London that many accounts assume that a “penis were a requisite walking appendage, like a cane.” She provides a necessary corrective to the male-heavy history of the flâneur, while also acknowledging that the risks are different for women. Describing the anonymity that such walking requires, Elkin writes that “We would love to be invisible the way a man is. We’re not the ones to make ourselves visible… it’s the gaze of the flâneur that makes the woman who would join his ranks too visible to slip by unnoticed.”

As a means of addressing this inequity that denies more than half the world’s population safe passage through public spaces, the activist movement Take Back the Night held its first march in Philadelphia, after the 1975 murder of microbiologist Susan Alexander Speeth as she was walking home. Take Back the Night used one of the most venerable of protest strategies—the march—as a means of expressing solidarity, security, defiance, and rage. Andrea Dworkin stated the issue succinctly in her treatise “The Night and Danger,” explaining that “Women are often told to be extra careful and take precautions when going out at night… So when women struggle for freedom, we must start at the beginning by fighting for freedom of movement… We must recognize that freedom of movement is a precondition for everything else.” Often beginning with a candlelight vigil, participants do exactly that which they’re so often prevented from doing—walking freely at night. The paeans to walking penned by men so often wax rhapsodic about the freedom of the flâneur but forget how gendered the simple act of walking is. Dworkin’s point is that women never forget it.

Few visuals are quite as powerful as seeing thousands of women and men moving with intentionality through a public space, hoisting placards and signs, chanting slogans, and reminding the powers that be what mass mobilization looks like. There is a debate to be had about the efficacy of protest. But at its absolute most charged, a protest seems like it can change the world: thousands of feet walking as one, every marcher a small cell in a mighty Leviathan. In that uncharacteristically warm February of 2003, I joined the 5,000 activists who marched through the Pittsburgh neighborhood of Oakland against the impending war in Iraq. There were the old hippies wearing t-shirts against the Vietnam War, the slightly drugged-out-looking college-aged Nader voters, Muslim women in vermillion hijabs and men in olive keffiyehs, the Catholic Workers and the Jews for Palestine, the slightly menacing balaclava-wearing anarchists, and of course your well-meaning liberals such as myself.

We marched past Carnegie Mellon’s frat row, young Republicans jeering us with cans of Milwaukee’s Best, through the brutalist concrete caverns of the University of Pittsburgh’s campus, under the watchful golem that was the towering gothic Cathedral of Learning, and up to the Fifth Avenue headquarters of CMU’s Software Engineering Institute, a soulless mirrored cube reflecting the granite gargoyles, blackened by decades of steel mill exhaust, watchfully positioned on St. Paul’s Cathedral across the street. Supposedly both the SEI and the adjacent RAND offices had DoD contracts, developing software that would be used for drone strikes and smart bombs. With righteous (and accurately placed) indignation, the incensed crowd chanted, and we felt like a singular being. On that same day, in 650 cities around the world, 11 million others marched in history’s largest global protest. It felt as if by walking we’d stop the invasion. Reader, we did not stop the invasion.

Despite those failures, the experience is indicative of how walking alters consciousness. Not just in a political sense, but in a personal one as well (though the two are not easily disentangled). There is a methodology for examining how walking alters our subjectivity, a discipline with the lofty and vaguely threatening name of “psychogeography.” Theorist Guy Debord saw the practice as a means of reenchanting space and place, developing a concept called dérive, which translates from the French as “drifting,” whereby participants “drop their usual motives for movement and action, their relations, their work and leisure activities, and let themselves be drawn by the attractions of the terrain and the encounters they find there,” as he is quoted in the Situationist International Anthology. Practicing a sort of hyper-attenuated version of being the flâneur, psychogeographers perceived familiar streets, squares, and cities from an entirely different perspective. Other psychogeographical activities included tracing out words by the route a walker takes through the city, or mapping smells and sounds. The whole thing has an anarchic sensibility about it, but with the whimsy of the Dadaists, while just as enthused with praxis as with theory.

For example, in his travelogue Psychogeography the Anglo-American novelist Will Self journeys from JFK to Manhattan on foot, sneakers crunching over refuse alongside the Van Wyck, through the metropolitan detritus of those scrubby brown patches that populate the null-voids between somewhere and somewhere else. Nothing can really compare to entering New York on foot, as Self did. It’s fundamentally different from arriving in a cab driving underneath the iconic steel girders of the Manhattan Bridge, or being ferried into the Parthenon that is Grand Central, or even disembarking from a Peter Pan Bus in the grody cavern of Port Authority. Walking is a “means of dissolving the mechanized matrix which compresses the space-time continuum,” Self writes, with the walker acting as “an insurgent against the contemporary world, an ambulatory time traveler.” For the psychogeographers, how we move is how we think, so that if we wish to change the latter, we must first alter the former.

So it would seem. Writing in the 18th century, Jean-Jacques Rousseau remarked in The Confessions that “I can only meditate when I’m walking… my mind only works with my legs.” In his autobiography Ecce Homo, Friedrich Nietzsche enjoined, “Sit as little as possible; do not believe any idea that was not born in the open air and of free movement…. All prejudices emanate from the bowels.” Meanwhile, his contemporary Søren Kierkegaard wrote that “I have walked myself into my best thoughts.” The most celebrated of the walking philosophers, Immanuel Kant, took his daily constitutional across Königsberg’s bridges with such regularity that merchants set their watches by him. Wallace Stevens famously used to compose his poems as he stomped across the antiseptic landscape of Hartford, Conn. He walked as scansion, his wingtips pounding out iambs and trochees with the wisdom that understands verse is as much of the body as it is of the mind, so that “Perhaps/The truth depends on a walk.” Walking is a type of consciousness, a type of thinking. Walking is a variety of reading, the landscape unspooling as the most shining of verse, so that every green leaf framed against a church’s gothic tower in a dying steel town, both glowing with an inner light out of the luminescence of the golden hour, is the most perfect of poems, only to be seen by her who gives it the time to be considered.


Henry Vaughan’s Eternal Alchemy

-

Mercury has a boiling point of 674.1 degrees Fahrenheit and a freezing point of -37.89 degrees, rendering it the only metal that’s liquid at room temperature. Malleable, fluid, transitory—the element rightly lends itself to the adjective “mercurial,” a paradoxical substance that has the shine of silver and the flow of water, every bit as ambiguous as the Roman god from whom it derives its name. Alchemists were in particular drawn to mercury’s eccentric behavior, as Sam Kean explains in The Disappearing Spoon: And Other True Tales of Madness, Love, and the History of the World from the Periodic Table of the Elements, writing that they “considered mercury the most potent and poetic substance in the universe.” Within sequestered granite basements and hidden cross-timbered attics, in cloistered stone monasteries and under the buttresses of university halls, alchemists tried to encounter eternity, and very often their medium of conjuration was mercury. The 13th-century English natural philosopher Roger Bacon writes in Radix Mundi that “our work is hidden… in the body of mercury,” while in the 17th century the German physician Michael Maier declared in Symbola aurea mensae that “Art makes metals out of… mercury.” Quicksilver, which pools like some sort of molten treasure, is one of the surprising things of this world. To examine mercury and its undulations is to see time itself moving, the metal appearing acted upon by entropy and flux, the disintegration of all that is solid into pure water. Liquid metal mercury is a metaphysical conceit.

Alchemy has been practiced since antiquity, but the decades before and during the Scientific Revolution were a golden age for the discipline. In England alone there were practitioners like John Dee, Edward Kelley, and Robert Boyle (who was also a chemist proper), and then there was the astrologer and necromancer Isaac Newton, who later had some renown as a physicist and mathematician. Such were the marvels of the 17th century, this “age of mysteries!” as described by the Anglo-Welsh poet Henry Vaughan—born 400 years ago tomorrow. He could have had in mind his twin brother, Thomas, among the greatest alchemists of that era, whose “gazing soul would dwell an hour,/And in those weaker glories spy/Some shadows of eternity.” Writing under the name Eugenius Philalethes, Thomas was involved in a project that placed occultism at the center of knowledge, seeing in the manipulation of matter answers to fundamental questions about reality. Though Henry is the more remembered of the two brothers today (“remembered” being a relative term), the pair were intellectually seamless in their own time, seeing in both poetry and alchemy a common hermetic purpose. Four centuries later, however, alchemy no longer seems an avenue to eternity. Thomas’s own demise demonstrated a deficiency of those experiments, for despite mercury’s supposed poetic qualities, among its more tangible properties is an extreme toxicity: when heated in a glass it can be accidentally inhaled, and when handled by an ungloved hand (as alchemists were apt to do) it can be absorbed through the skin. The resultant poisoning has several symptoms, not least of which are muscle spasms, vision and hearing problems, hallucinations, and ultimately death. Such was the fate of Thomas after some mercury got up his nose in that apocalyptic year of 1666, when plague and fire destroyed London.

Thomas was an Anglican vicar removed from his position for “being a common drunkard, a common swearer… a whoremaster,” and he would be satirized nearly a century later by Jonathan Swift in A Tale of a Tub as the greatest author of nonsense “ever published in any language,” yet he was still in pursuit of something very particular and important, though it’s perhaps easy to mock alchemy nearly four centuries on. In works like Anima Magica Abscondita; or A Discourse on the Universal Spirit of Nature and Aula Lucis, or The House of Light, Thomas evidenced a noble and inquisitive spirit about nature, a desire to both “Have thy heart in heaven and thy hands upon the earth,” as he writes in the first of those two volumes. Thomas searched for eternity, a sense of what the fundamental units of existence are, and he rearranged matter and energy until it killed him.

“Their eyes were generally fixed on higher things,” writes Michael Schmidt in Lives of the Poets, and indeed whether in the manipulation of chemicals or of words, both Vaughans desired the transcendent, the infinite, the eternal; what Henry called “authentic tidings of invisible things.” This essay is not mainly about Thomas—it’s about Henry, a man who rather than making matter vibrate chose to arrange words, and in abandoning chemicals for poetry came much closer to eternity. Thomas was an alchemist of things and Henry was one of words, but the goal was identical—”a country/Afar beyond the stars,” where things are shining and perfect. This necessarily compels a question—how eternal can any poetic voice ever actually be? Mine is not a query about cultural endurance; I’m not asking how long a poet like Vaughan will be studied, read, treasured. I’m asking how possible it is for Vaughan—for any poet—to ascend to the perspective of Aeternitas, to strip away past, present, and future, to see all as if one gleaming moment of light, divorced from our origins, our context, our personality, and in that mystical second to truly view existence as if from heaven? And, for the purpose of the poet (and the reader), how possible is it to convey something of this experience in the fallen and imperfect medium of language?

Because he was a man of rare and mystical faith, for whom the higher ecstasies of metaphysical reverence subsumed the lowly doldrums of moralistic preening, it can be easy to overlook that Vaughan—like all of us who are born and die—lived a specific life. The son of minor Welsh gentry (whose second language was English, often writing “in imitation of Welsh prosody” as the editors of the Princeton Handbook of Poetic Terms note), Vaughan was most likely a soldier among the Royalists, brother of the esteemed scholar Thomas (of course), and a physician for his small borderlands town, until dying in 1695 at the respectable age of 74, the “very last voice contained entirely within what many regards as the great century of English poetry” as Schmidt writes. Vaughan may not be as celebrated as other visionary poets, yet he deserves to be included alongside William Blake and Emily Dickinson as among the most immaculate of them. Such perfection took time.

His earliest “secular” verse is composed of largely middling attempts at aping the Cavalier Poets with their celebrations of leisure and the pastoral, yet sometime around 1650, “Vaughan seems to have experienced a spiritual upheaval more in the nature of a regeneration than of a conversion. He violently rejected secular poetry and turned to devotion,” as Miriam K. Starkman explains in 17th Century English Poetry. This turning of the soul was perhaps initiated by the trauma of the Royalist loss in the English Civil War, the death of his brother William, and most of all his discovery of the brilliant metaphysical poet and Anglican theologian George Herbert. But regardless of the reasons, his verse and faith took on a shining, luminescent quality, and his lyrics intimated the character of burning sparks struck from hot iron. “Holy writing must strive (by all means) for perfection and true holiness,” Vaughan writes in the preface to his greatest collection, 1655’s Silex Scintillans, “that a door may be opened to him in heaven” (the Latinate title translates to “The Fiery Flint,” in keeping with his contention that “Certain divine rays break out of the soul in adversity, like sparks of fire out of the afflicted flint”).

He was, first and foremost, a Christian, and an Anglican one at that, writing his verse during the Puritan Interregnum when his church was abolished, and those of his theological position were prohibited from their community and liturgy. In his prose treatise of 1652, The Mount of Olives, or Solitary Devotions, he offers “this my poor Talent to the Church,” now a “distressed Religion” for whom “the solemn and public places of meeting… [are] vilified and shut up.” To these traumas must be added Vaughan’s marginalization in England as a Welshman, for as Schmidt writes, “Wales, [is] his true territory,” indeed so much so that he called himself the “Silurian” after the fearsome Celtic tribe that had once made his birthplace their home. In championing Vaughan, or any poet, as “eternal,” we risk reducing them, subtracting that which makes them human. Silex Scintillans is consciously written in a manner whereby it can be easy to strip the lyrics of theological particularity, which makes him an easy poet for the heretics among us to champion, and yet Vaughan (ecstatic though he may be) was an orthodox Anglican.

Vaughan’s particular genius is in being able to write from a perspective that seems eternal, for theology may be of man but faith is of God, and he is a poet who understands the difference. The result is a verse that, though written by an Anglican Welshman in the 17th century, reads (when it’s soaring the highest) as if it came from some place alien, strange, and beautiful. Consider one of his most arresting poems from Silex Scintillans, titled “The World.” Along with John Donne and Dickinson, Vaughan is among the best crafters of first lines in the history of poetry, writing “I saw Eternity the other night,/Like a great ring of pure and endless light.” This is a poem that begins like the Big Bang. A simple rhyming couplet, its epigraphic minimalism lends itself to the very eternity of which it speaks. To my contemporary ear, there is something oddly colloquial about Vaughan’s phrasing, speaking of seeing “Eternity the other night” like you might mention having run into a friend somewhere, even though what the narrator has experienced is “Time in hours, days, years,/Driv’n by the spheres/Like a vast shadow mov’d; in which the world/And all her train were hurl’d.”

In its understatement there’s something almost funny about the line, as the casual tone before the enjambment transitions into the sublime cosmicism after the first comma. That’s the other thing—the narrator had this transcendent experience “the other night”—the past tense is crucial. Eternity is possible, but it’s only temporary (at least while we live on earth). Schmidt correctly observes that Vaughan’s lyric “does not sustain intensity throughout, dwindling to deliberate allegory,” though that’s true of any poem which begins with so powerful and memorable a line—Donne and Dickinson weren’t ever able to sustain such energy through an entire lyric either. What’s so powerful in “The World” is that this inevitable rhetorical decline is reminiscent of the actual mystical experience itself, whereby that enchanted glow must necessarily be diminished over time. Leah Marcus argues in Childhood and Cultural Despair: A Theme and Variations in Seventeenth-Century Literature that the “dominant mood in Vaughan’s poetry is pessimism, and a sense of deep loss which occasional moments of vision can only partly alleviate.” Such an interpretation is surely correct, for loss marked not only Vaughan’s life, but his mystical experiences as well, where despite his spiritual certainty (and he is not a poet of doubt), transcendence itself must be bounded by space and time, a garden we are only sometimes permitted to visit.

Better to describe Vaughan as the poet of eternity deferred, an ecstatic who understands that in a fallen world paradise is only ever momentary. “My Soul, there is a country/Afar beyond the stars,” Vaughan writes in another poem entitled “Peace,” continuing “Where stands a winged sentry… There, above noise and danger/Sweet Peace sits, crown’d with smiles.” He writes with certainty, but not with dogmatism; perhaps assured of his own election, he doesn’t mistake his earthly present for his heavenly future, and that combination of assurance and humility lends the lyrics their eternal mode. Note the reference to where perfection can be found, possibly an echo of Hamlet’s “undiscovered country,” as well as the cosmological associations of such paradise being located deep within the outer universe. Along with his contemporary Thomas Traherne, Vaughan is one of the great interstellar poets, though such imagery must necessarily be read as allegorical, the mystic translating experience into metaphor. “Private experience is communicated in the language of Anglican Christianity,” observes Schmidt, and such language must forever be contingent, a gesture towards the ineffable but by definition not the ineffable itself. “His achievement,” Schmidt writes, “is to bring the transcendent almost within reach of the senses.”

There are brilliant poets and middling ones, influential and obscure, radical and conservative, but the coterie of those able to look upon eternity with transparent eye and encapsulate their experience in prosody, no matter how relative and subjective, are few. During the Renaissance, Herbert with his “something understood” was one of these poets; Donne with his “little world made cunningly” was another. Others include Gerard Manley Hopkins, with his vision of God’s grandeur “shining from shook foil,” in the 19th century, alongside Christina Rossetti, who had “no words, no tears.” In the 20th there was Robert Frost (hackneyed though he is misremembered as being) intoning that “Nature’s first green is gold,” and Marianne Moore, for whom “Mysteries expound mysteries;” Jean Toomer’s “lengthened tournament for flashing gold,” and Denise Levertov, who could “Praise/god or the gods, the unknown;” Louise Glück, whose hands are “As empty now as at the first note,” Martín Espada, who prays that “every humiliated mouth… fill with the angels of bread,” and Kaveh Akbar, who “always hoped that when I died/I would know why.” Then of course there are the aforementioned Dickinson and Blake, frequently John Milton, often Walt Whitman, and most of the time Traherne. It is hard to discern a through line among the eternal poets, eternal because they seem to have gone to that permanent place and returned with some knowledge. When Schmidt writes that “Vaughan’s chosen territory” was the “area beyond the senses, accessible only to intuition,” I suspect that same description could be made of those on my truncated syllabus.

Henry Vaughan’s poetry demonstrates the difference between eternity and immortality. The latter is the juvenile desire of the alchemists, this intention to transmute base metals to gold, to acquire the philosopher’s stone, to construct homunculi and perhaps live forever. Immortality is understood as being like this life right now, only with more of it. Eternity is something else entirely. Another difference is that immortality isn’t real. Writing in The Mount of Olives, Vaughan describes our lives as a “Wilderness… A darksome, intricate wood full of ambushes and dangers; a forest with spiritual hunter, principalities and powers.” By contrast, eternity is that domain of perfection—it is the gentle buzz of a bee in the stillness of a summer dusk, the scent of magnolia wafting on a breeze, the reflection of moonlight off an unrippling pond. “They are all gone into the world of light!/And I alone sit ling’ring here,” Vaughan writes in another of his perfect opening lines, one of the most poignantly sad sentences in Renaissance poetry. The world-weariness is there, the estrangement, the alienation—but the world of light is still real. “I cannot reach it, and my striving eye/Dazzles at it, as at eternity,” he mournfully writes, and yet “we shall gladly sit/Till all be ready.” What faith teaches is that we’re all exiles from that Zion that is eternity, to which we shall one day return. What Vaughan understands is that if we seek eternity, it is now.

Marvelous Mutable Marvell

-

Twelve years to the day after King Charles I ascended the scaffold at Whitehall to become the first European sovereign decapitated by his subjects, his son had the remains of Oliver Cromwell, the Lord Protector during England’s republican decade and the man who as much as any other was responsible for the regicide, exhumed from his tomb in Westminster Abbey so that the favor could be returned. The Interregnum had signified dreary theocracy, when the theaters were closed, Christmas abolished, and even weekend sports banned. By contrast, Charles promised sensuality, a decadent rejection of austere radicalism. But if there was one area of overlap, it would be in issues of rapprochement, for though pardons were issued to the opposition rank-and-file, Charles still ordered the execution of 31 signers of his father’s judgement, remembered by the subject of this essay as that moment when the king had “bowed his comely head/Down as upon a bed.” Even the dead would not be spared punishment, for in January of 1661 Cromwell was pulled from his grave, flesh still clinging to bone, languid hair still hanging from grey scalp, and the Royalists would hang him from a gibbet at Tyburn.


Eventually Cromwell’s head would be placed on a pike in front of Westminster Hall, where it would molder for the next two decades. Such is the mutability of things. Through this orgy of revenge, a revolutionary who’d previously served as Latin Secretary (tasked with translating diplomatic missives) slunk out of London. Already a celebrated poet, he’d been appointed by Cromwell because of his polemical pamphlets. After the blanket pardon he emerged from hiding, but still Charles had him arrested, perhaps smarting when in 1649 the author had described Royalists as a “credulous and hapless herd, begott’n to servility, and inchanted with these popular institutes of Tyranny.” By any accounting John Milton’s living body should have been treated much the same as Cromwell’s dead one, for he was arrested, imprisoned, and no doubt fully expected to be flayed and hung up at Tyburn. But he would avoid punishment (and go on to write Paradise Lost). Milton’s rescue can be attributed to another poet a decade his junior.

Once a keen supporter of Cromwell, this younger poet was now completely trusted by the new government, and his word was enough to keep Milton from the scaffold. Before the regicide he’d been a royalist, during the Interregnum a republican, and upon the Restoration a royalist again, all of which earned him the sobriquet of “The Chameleon.” That perhaps impugns him too much, for if Andrew Marvell understood anything it was fickle mutability, how in a mercurial reality all that can be steadfast is merely the present. He was a man loyal not to regimes but to friends. He was born on this day 400 years ago, he is among the greatest poets, and the relative silence that accompanies his quadricentenary speaks to his major theme: our eventual oblivion. “Thy beauty shall no more be found,” Marvell’s narrator warns in his most famous poem, and it might as well be prophecy for his own work.

“His grave needs neither rose nor rue nor laurel; there is no imaginary justice to be done; we may think about him… for our own benefit, not his,” writes T.S. Eliot in his seminal essay “Andrew Marvell,” published on this day a century ago in the Times Literary Supplement. Just as John Donne had been known primarily for his sermons, so too was Marvell remembered chiefly as a politician (he was elected as an MP prior to the Restoration, a position he held until his death), until Eliot encouraged critics to take a second look at his verse. By Eliot’s recommendation, Marvell came to be understood as one of the greatest poets of the century, perhaps lesser only than his friend Milton. Jonathan Bate writes in the introduction to the Penguin Classics edition of Andrew Marvell: The Complete Poems that his “small body of highly wrought work constitutes English poetry’s most concentrated imaginative investigation of the conflicting demands of the active and the contemplative life, the self and society, the force of desire and the pressure of morality, the detached mind and the body embedded in its environment.” Perhaps Marvell deserves a little bit of rose or rue or laurel. Yet his quadricentennial passes without much of a mention, though presumably some other writers will (hopefully) consider his legacy today. A quick Google search shows that several academic societies are providing (virtual) commemoration, including a joint colloquium hosted by St. Andrews, and appropriately enough a Marvell website maintained by the University of Hull in his northern English hometown (which features this insightful essay by Stewart Mottram about Marvell and COVID-19). Yet by comparison to other recent literary anniversaries—for Milton, Mary Shelley, Walt Whitman, Charles Dickens, and, of course, William Shakespeare—there has largely been quiet about Marvell, even though Eliot once said regarding his poetry that a “whole civilization resides in these lines.”

Marvell published only a handful of lyrics in his own life, and while he authored examples of virtually every popular genre of the time, from ode to panegyric, country house poem to satire (though not epic), he stretched their bounds, made them unclassifiable and slightly strange. Having written across three major periods of 17th-century English literary history—during the Caroline reign, the Commonwealth, and then the Restoration—he fits awkwardly within both of the two dominant traditions of that age, metaphysical and Cavalier poetry. “Was he classic or romantic, metaphysical or Augustan, Cavalier or Puritan… republican or royalist?” asks Elisabeth Story Donno in the introduction to Andrew Marvell: The Critical Heritage, and indeed such ambiguity threads through his verse and especially his life: this Hull-born son of a stern Calvinist minister, who after dalliances on the continent possibly converted to Catholicism (tarred on his return as an “Italo-Machiavellian”), who served kings and republicans, and who seamlessly switched political and religious allegiances.

The great themes of Marvell’s biography and his poetry are mutability and transitoriness; his verse is marked by the deep sense that as it once was, it no longer is; and as it is, it no longer shall be. “Underlying these themes is the knowledge that in love or action time can’t be arrested or permanence achieved,” writes Michael Schmidt in Lives of the Poets. “A sanctioned social order can be ended with an ax, love is finite, we grow old.” With an almost Taoist sense of the impermanence of life, Marvell casts off opinions and positions with ease, and even his own place within the literary firmament is bound to change and alter. “But at my back I always hear/Time’s winged chariot hurrying near,” as Marvell astutely notes. A poet who has “no settled opinions, except the fundamental ones,” writes Schmidt, though his “political readjustments in times of turmoil have not told against him. There is something honest seeming about everything he does, and running through his actions a constant thread of humane concern.” Steadfast though we all wish we could be, Marvell is the poet of Heraclitus’s stream, always reminding us that this too shall pass. “Even his name is slippery,” writes Bate. “In other surviving documents it is as variously spelt as… Marvell, Mervill, Mervile, Marvel, Mervail. We are not sure how to pronounce it.” A convenient metaphor.

However Marvell pronounced his name, it’s his immaculate words and the enchanted ways that he arranged them that are cause for us to return to him today, a poet whose “charms are as real as they are hard to define,” as Schmidt writes. Marvell has no great drama in his oeuvre, no Hamlet or King Lear, and he penned no epics, no Paradise Lost. Donne, one of his great influences, was more innovative, and George Herbert provided a more unified body of work. And yet his six or so finest poems—”The Garden,” “Upon Appleton House,” “An Horatian Ode Upon Cromwell’s Return from Ireland,” the sequence known as the Mower poems, “Bermudas,” and of course “To His Coy Mistress”—are among the most perfect in the language, all written around 1650, when the poet acted as tutor to the daughter of Lord General Thomas Fairfax, commander of the New Model Army before Cromwell. Schmidt argues that “Most of his poems are in one way or another ‘flawed’… [the] tetrameter couplets he favored prove wearying… Some of the conceits are absurd. Many of the poems… fail to establish a consistent perspective… Other poems are static: an idea is stated and reiterated in various terms but not developed… There are thematic inconsistencies.” Yet despite Schmidt’s criticism of Marvell, he can’t help but conclude that “because of some spell he casts, he is a poet whose faults we not only forgive but relish.”

We relish those faults because of Marvell’s overweening honesty about them, faults themselves a function of a world in flux, which is to say a reality where perfection must always be deferred. In “The Garden,” one of the great pastoral poems, he describes “The mind, that ocean where each kind/Does straight its own resemblance find;/Yet it creates, transcending these,/For other worlds, and other seas;/Annihilating all that’s made.” His “Upon Appleton House,” which takes part in the common if retroactively odd convention of the country-house poem, wherein the terrain of a wealthy estate is described, contrasts the artifice of the garden with its claim to naturalism; in “An Horatian Ode Upon Cromwell’s Return from Ireland,” he gives skeptical praise to the Lord Protector (back from a genocidal campaign across the sea), where the “forward youth that would appear/Must now forsake his Muses dear,/Nor in the shadows sing/His numbers languishing.” Mutability defined not just Marvell’s politics but his poetry; steadfastly Christian (of some denomination or another), he gives due deference to the eternal, but the affairs of men are fickle, and time’s arrow moves in only one direction.

History’s unforgiving erraticism is sometimes more apparent than at others, and Marvell lived through the most chaotic of English centuries. When the Commonwealth that Marvell served collapsed, and Charles II came back from his exile in 1660, it was to triumphant fanfare in a London that was exhausted after years of stern Puritan dictatorship. Crowds thronged the streets wearing sprigs of oak leaves to welcome the young King Charles and the spring of Restoration. In the coronation portrait by John Michael Wright, the 30-year-old king is made to appear the exact opposite of priggish Cromwell; he is bedecked in ribbons and lace, tight white hose and cascading violet velvet, his curly chestnut wig long in the French style (as he would have picked up in the French court) with a thin brunette mustache above a wry smile, a royal scepter in one hand and an orb in the other. If Cromwell was a Puritan hymn, then Charles was a baroque sonata by Henry Purcell. Marvell learned the melodies of both. His modern biographer Nigel Smith notes in Andrew Marvell: The Chameleon that the poet “made a virtue and indeed a highly creative resource of being other men’s (and women’s) mirrors.” What is espied in a mirror, it must be said, depends on what you put in front of it. Mirrors are ever-changing things. Such is mutability.

There must be a few caveats when I argue that Marvell’s fame has diminished. Firstly, to any of my fellow scholars of the Renaissance (now more aridly known as “early modernists”) who are reading: of course I know that you still study and teach Marvell, of course I know that articles, dissertations, presentations, and monographs are still produced on him, of course I know that all of us are familiar with his most celebrated poems. It would be a disingenuous argument to claim that I’m “rediscovering” a poet who remains firmly canonical, at least among my small tribe. Secondly, to any conservatives reading: I’m not suggesting that Marvell has been eclipsed because he’s been “cancelled,” or anything similarly silly, though honestly there would probably be ample reason to do so even if that were a thing (and as perfect a poem as it is, “An Horatian Ode Upon Cromwell’s Return from Ireland” is problematic to say the least), but bluntly his stock isn’t high enough to warrant being a target in that way. Thirdly, and most broadly, I’m not claiming that his presence is absent from our imagination, far from it. If anything, Marvell flits about as a half-remembered reference, a few turns of phrase that lodge in the brain from distant memories of undergraduate British literature survey courses, or a dappling of lines that end up misattributed to some other author (I’ve seen both “Time’s winged chariot” and “green thought in a green shade” identified with Shakespeare). Modern authors as varied as Eliot, Robert Penn Warren, Ursula K. Le Guin, William S. Burroughs, Terry Pratchett, Philip Roth, Virginia Woolf, Archibald MacLeish, Primo Levi, and Arthur C. Clarke either quote, reference, allude to, or directly write about Marvell. Even Stephen King quotes from “To His Coy Mistress” in Pet Sematary. Regardless, on this muted birthday, ask yourself how many people you know are aware of Marvell, one of a handful of the greatest poets writing during the greatest period of English poetry.

“Thus, though we cannot make our Sun/Stand still, yet we will make him run.” So concludes Marvell’s greatest accomplishment, “To His Coy Mistress.” Like an aesthetic syllogism, the poem piles up an abundance of gorgeous imagery in rapid succession to make an argument about how the present must be seized, for immortality is an impossibility. Of course, the conceit of the poem isn’t novel; during the Renaissance it was downright cliché. Carpe diem poetry—you may remember the term from the Peter Weir film Dead Poets Society—goes back to Latin classics by poets like Catullus and Horace. A representative example would be Marvell’s contemporary, the Cavalier poet Robert Herrick, and his lyric that begins “Gather ye rosebuds while ye may.” These poems commonly made an erotic argument, the imagined reader a woman whom the narrator is attempting to seduce into bed with an argument about the finite nature of life. Donne’s obvious precursor “To His Mistress Going to Bed (Elegy 19)” is perhaps the most brilliant example of the form. Bate emphasizes that such a “motif is typical of the Cavalier poets… Marvell, however, takes the familiar Cavalier genre and transposes it to a realm beyond ideology.” When reading Donne, the “poems are written in the very voice of a man in bed with a real woman,” as Bate writes, most likely his beloved wife, Anne, for whom he had no compunctions about blatantly stating his sexual yearning. Marvell, by contrast, “never fleshes out his imaginary mistress,” and the result is that “He is not really interested in either the girl or the Cavalier pose; for him, it is experience itself that we must seize with energy, not one particular lifestyle.” The mistress whom Marvell is trying to woo is his own soul, and the purpose is to consciously live in the present, for that’s all that there is.

“Had we but World enough, and Time,” Marvell begins, “This coyness Lady were no crime.” Ostensibly an argument against chastity, his reasoning holds not just for sex (though the declaration of “My vegetable Love should grow” a few lines later certainly implies that it’s not not about coitus). Eroticism marks the first half of the lyric, explicitly tied to this theme of time’s fleetingness, while imagining the contrary. “A hundred years should go to praise/Thine Eyes, and on thy Forehead Gaze. /Two hundred to adore each breast:/But thirty thousand to the rest. /An Age at least to every part, /And the last Age should show your heart.” Marvell is working within the blazon tradition, whereby the physical attributes of a beloved are individually celebrated, yet Bate is correct that the hypothetical woman to whom the lyric is addressed is a cipher more than a person—for nothing actually is described, merely enumerated. For that matter, the time frames listed—100 years, 30,000, an ambiguous “Age”—all seem random, but as a comment on eternity, there’s a wisdom in understanding that there’s no difference between a second and an epoch. He similarly reduces and expands space, calling upon venerable exoticized images to give a sense of the enormity of the world (and by consequence how small individual realities actually are). “To walk, and pass our long Loves Day. /Thou by the Indian Ganges side… I would/Love you ten year before the Flood:/And you should if you please refuse/Till the Conversion of the Jews.” Marvell’s language (orientalist though it may be) is playful; it calls to mind the aesthetic sensuality of Romantics like Samuel Taylor Coleridge, as when he imagines the mistress picking rubies off the ground. But the playfulness is interrupted by the darker tone of the second half of the poem.

“But at my back I always hear/Time’s winged chariot hurrying near:/And yonder all before us lie/Desarts of vast Eternity.” The theology of “To His Coy Mistress” is ambiguous; are these deserts of vast eternity the same as the immortality of Christian heaven, or do they imply extinction? Certainly, the conflation of death with a desert seems to deny any continuation. It evokes the trepidation of the French Catholic philosopher and Marvell’s contemporary Blaise Pascal, who in his Pensées trembled before the “short duration of my life, swallowed up in an eternity before and after, the little space I fill engulfed in the infinite immensity of spaces whereof I know nothing.” Evocatively gothic language follows: Marvell conjures the beloved’s moldering corpse in “thy marble Vault,” where “Worms shall try/That long preserv’d Virginity: And your quaint Honour turn to dust,” a particularly grotesque image of the phallic vermin amidst a decomposing body, especially as “quaint” was a contemporary pun for genitalia. “The Grave’s a fine and private place, /But none I think do there embrace.” In the final third of the poem, Marvell embraces almost mystical rhetoric. “Let us roll all our Strength, and all/Our sweetness, up into one Ball:/And tear our Pleasures with rough strife, /Thorough the iron gates of Life.” This couple melding together is obviously a sexual image, but he’s also describing a singularity, a monad, a type of null point that exists beyond space and time. If there is one clear message in “To His Coy Mistress,” it’s that this too shall pass, that time will obliterate everyone and everything. But paradoxically, eternity is the opposite of immortality; for if eternity is what we desire, then it’s found in the moment. Traditional carpe diem poetry demands that we live life fully for one day we’ll die; Marvell doesn’t disagree, but his command is that to truly live life fully is to understand that it’s the present that exists eternally, that we only ever live in this present right now, and so we must ask ourselves what the significance of this brief second is. What’s mere fame to that?

Marvell’s star has faded not because he wasn’t a great poet, not because he deserves to be forgotten, not because he’s been replaced by other writers or because of any conscious diminishment of his stature. His fame has ebbed because that’s what happens to people as time moves forward. The arguments about what deserves to be read, taught, and remembered often overlook the fact that forgetting is a function of what it means to be human. What makes “To His Coy Mistress” sublime is that there is a time-bomb hidden within it, the poet’s continuing obsolescence confirming the mutability of our stature, of our very lives. Marvell’s own dwindling fame is a beautiful aesthetic pronouncement, a living demonstration of time’s winged chariot, the buzzing of the wings of oblivion forever heard as a distant hum. After his death in 1678, he was ensconced within seemingly ageless marble, and set within the catacombs of St. Giles-in-the-Fields, a gothic stone church at the center of London, though true to its name it was at the edge of the city when Marvell was alive. Ever as fortune’s wheel turns, the neighborhood is now bathed in perennial neon light—theaters with their marquee signs and the ever-present glow of electronic billboards. Red double-decker buses and black hackney carriages dart past the church where Marvell slumbers, unconscious through empire, through the Industrial Revolution, through the explosions of the Blitz.

“But a Tombstone can neither contain his character, nor is Marble necessary to transmit it to posterity,” reads his epitaph, “it is engraved in the minds of this generation, and will be always legible in his inimitable writings, nevertheless.” Probably not. For one day there will be fewer seminars about Marvell, and then no more; less anthologizing, and then the printings will stop; less interest, except in the minds of antiquarians, and then they too shall forget. One day even Shakespeare shall largely be forgotten as well. The writing on Marvell’s tomb will become illegible, victim to erosion and entropy, and even the banks of the Thames will burst to swallow the city. When he is forgotten, paradoxically maybe especially when he is, Marvell teaches us something about what is transient and what is permanent. Time marches forward in only one direction, and though it may erase memory it can never annihilate that which has happened. Regardless of heaven, Marvell lived, and as for all of us, that’s good enough. His verse still dwells in eternity, whether we utter his name or not, for what is rendered unto us who are mortal is the ever-living life within the blessed paradise of a second, which is forever present.

Image Credit: Wikipedia.

On Literature and Consciousness

-

Not far from where the brackish water of the delta irrigates the silty banks of the Nile, it was once imagined that a man named Sinuhe lived. Some forty centuries ago this gentleman was envisioned to have walked in the cool of the evening under a motley pink sky, a belly full of stewed pigeon, dates, and sandy Egyptian beer. In twelve pages of dense, narrative-rich prose, the author recounts Sinuhe’s service to Prince Senwosret while in Libya, the assassination of the Pharaoh Amenemhat, and the diplomat’s subsequent exile to Canaan wherein he becomes the son-in-law of a powerful chief, subdues several rebellious tribes, and ultimately returns to Egypt where he can once again walk by his beloved Nile before his death. “I, Sinuhe, the son of Senmut and of his wife Kipa, write this,” begins the narrative of the titular official, “I do not write it to the glory of the gods in the land of Kem, for I am weary of gods, nor to the glory of the Pharaohs, for I am weary of their deeds. I write neither from fear nor from any hope of the future, but for myself alone.” This is the greatest opening line in imaginative literature, because it’s the first one ever written. How can the invention of fiction itself be topped?

Whoever pressed stylus to papyrus some eighteen centuries before Christ (The Tale of Sinuhe takes place two hundred years before it was written) has her central character pray that God may “hearken to the prayer of one far away!… may the King have mercy on me… may I be conducted to the city of eternity.” Fitting, since the author of The Tale of Sinuhe birthed her character from pure thought, and in the process invented fiction, invented consciousness, invented thought, even invented being human. Because Sinuhe is eternal — the first completely fictional character to be conveyed in a glorious first-person prose narration. The Tale of Sinuhe is the earliest of an extant type, but there are other examples from ancient Egypt, and indeed Mesopotamia, and then of course ancient Greece and Rome as well. As a means of conveying testimony, or history, or even epic, there is a utility to first-person narration, but The Tale of Sinuhe is something so strange, so uncanny, so odd, that we tend to ignore it by dint of how abundantly common it is today. Whoever wrote this proto-novel was able to conceive of a totally constructed consciousness and to compel her readers to inhabit that invented mind.

Hard to know how common this type of writing was, since so little survives, but among the scraps that we have there is narration that can seem disturbingly contemporary. Written some eight hundred years after Sinuhe’s tale, The Report of Wenamun is a fictional story about a traveling Egyptian merchant who in sojourns throughout Lebanon and Cyprus must confront the embarrassment of being a subject of a once-great but now declining empire. With startling literary realism, Wenamun grapples with his own impotence and obsolescence, including descriptions of finding a character “seated in his upper chamber with his back against a window, and the waves of the great sea of Syria… [breaking] behind him.” What a strange thing to do, not to report on the affairs of kings, or even to write your own autobiography, but rather to manifest from pure ether somebody who never actually lived, and with a charged magic make them come to life. In that image of the ocean breaking upon the strand — the sound of the crashing water, the cawing of gulls, the smell of salt in the air — the author has bottled pure experience and handed it to us three millennia later.

Critic Terry Eagleton claims in The Event of Literature that “Fiction is a question of how texts behave, and of how we treat them, not primarily of genre.” What makes The Tale of Sinuhe behave differently is that it places the reader into the skull of the imagined character, that it works as a submersible pushing somebody deep into the murky darkness of another consciousness, replicating the experience of being another mind. That’s what makes the first-person different: it catalogues the moments which constitute the awareness of another mind — the crumbly texture of a madeleine dunked in tea, the warmth of a shared bed in a rickety old inn on a rainy Nantucket evening, the sad reflective poignancy of pausing to watch the ducks in Central Park — and makes them your own, for a time. The first-person narrative is a machine for transforming one soul into another. Such narration preserves a crystalline moment in words like an insect in amber. The ancient Egyptians believed that baboon-faced Thoth invented writing (sometimes he was an ibis). Perhaps it was this bestial visage who wrote these first fictions? Writing in his introduction to the collection Land of Enchanters: Egyptian Short Stories from the Earliest Times to the Present Day, Bernard Lewis says that the Tale of Sinuhe “employs every sentence construction and literary device known in… [her] period together with a rich vocabulary to give variety and color to… [her] narrative.” Fiction is a variety of virtual reality first invented four thousand years ago.

By combining first-person narration with fictionality, the author built the most potent mechanism for empathy which humans have ever encountered: the ability not just to conjure consciousness from out of nothing, but to inhabit another person’s life. Critic James Wood writes in How Fiction Works that first-person narration is “generally a nice hoax: the narrator pretends to speak to us, while in fact the author is writing to us, and we go along with the deception happily enough.” Wood isn’t wrong – first-person narration, in fact all narration, is fundamentally a hoax, or maybe more appropriately an illusion. What I’d venture is that this chimera, the fantasy of fiction, the mirage of narration, doesn’t just imitate consciousness — it is consciousness. Furthermore, different types of narration exemplify different varieties of consciousness, all built upon that hard currency of experience, so that the first-person provides the earliest intimations of what it means to be a mind in time and space. That nameless Egyptian writer gave us the most potent of incantations — that of the eternal I. “By the end of the 19th century [BCE],” writes Steven Moore in The Novel: An Alternative History, “all the elements of the novel were in place: sustained narrative, dialogue, characterization, formal strategies, rhetorical devices.” Moore offers a revisionist genealogy of the novel, pushing its origins back thousands of years before the seventeenth century, but regardless of how we define the form itself, it’s incontrovertible that The Tale of Sinuhe offers something original. Whether or not we consider the story to be a “novel,” with all of the social and cultural connotations of that word (Moore says it is, I say eh), at the very least Sinuhe is the earliest extant fragment of a fictional first-person narrative, and thus a landmark in the history of consciousness.

What the Big Bang is to cosmology, The Tale of Sinuhe is to literature; what the Cambrian Explosion is to natural history, so narrative is to culture. An entire history of what it means to be a human being could focus entirely on the different persons of narration, and what they say about all the ways in which we can understand reality. Every schoolchild knows that narration breaks down into three persons: the omniscient Father of the third-person, the eerie Spirit of the second-person placing the reader themselves into the narrative, and the Son of the first-person, whereby the reader is incarnated into the character. It’s more complicated than this, of course; there are first-person unreliable narrators and third-person limited omniscient narrators, plural first-person narratives and free indirect discourse, heterodiegetic narrators and focalized homodiegetic characters. Regardless of the specific narrative person a story is written in, the way a narrative is told conveys certain elements of perception. The voice which we choose says something about reality; it becomes its own sort of reality. As Michael McKeon explains in Theory of the Novel: A Historical Approach, the first-person form is associated with “interiority as subjectivity, of character as personality and selfhood, and of plot as the progressive development of the integral individual.”

The history of the novel is a history of consciousness. During the eighteenth century, in the aftershocks of the emergence of the modern novel, first-person faux-memoirs — fictional accounts written with the conceit that they were merely found documents, like Daniel Defoe’s Robinson Crusoe or Jonathan Swift’s Gulliver’s Travels — reflected both the Protestant attraction toward the self-introspection which the novel allowed for and an anxiety over its potentially idolatrous fictionality (the better to pretend that these works actually happened). By the nineteenth century that concern manifested in the enthusiasm for epistolary novels, a slight variation on the “real document” trope for grappling with fictionality, such as Mary Shelley’s Frankenstein and Bram Stoker’s Dracula. Concurrently, the nineteenth century saw the rise of third-person omniscient narration, with all of the intimations of God-like eternity that we associate with novels like Charles Dickens’ Bleak House and Leo Tolstoy’s War and Peace. By contrast, the twentieth century marked the emergence of stream-of-consciousness in high modernist novels such as James Joyce’s Finnegans Wake and William Faulkner’s The Sound and the Fury; and following the precedent of Gustave Flaubert and Jane Austen, authors like Virginia Woolf in Mrs. Dalloway and D.H. Lawrence in The Rainbow made immaculate use of free indirect discourse, where the intimacy of the first person is mingled with the bird’s eye perspective of the third. All of these are radical; all of them miracles in their own way. But only the first-person is able to transplant personal identity itself.

“With the engine stalled, we would notice the deep silence reigning in the park around us, in the summer villa before us, in the world everywhere,” writes Orhan Pamuk in his novel The Museum of Innocence. “We would listen enchanted to the whirring of an insect beginning vernal flight before the onset of spring, and we would know what a wondrous thing it was to be alive in a park on a spring day in Istanbul.” Though I have never perambulated alongside the Bosporus on a spring day, Pamuk is still able to, by an uncanny theurgy, make the experience of his character Kemal my own. Certainly lush descriptions can be manifested in any narrative perspective, and a first-person narrator can also provide bare-bones exposition, yet there is something particularly uncanny about the fact that by staring at ink stains on dead trees we can hallucinate that we’re entirely other people in Istanbul. The existentialist Martin Heidegger argued that the central problem of philosophy was that of “Being,” which is to say the question of what it means to be this self-aware creature interacting with an abstract, impersonal universe. By preserving experience as if a bit of tissue stained on a microscope slide, first-person narration acts as a means of bracketing time outward, the better to comprehend this mystery of Being. “We ourselves are the entities to be analyzed,” Heidegger writes in Being and Time, and what medium is more uniquely suited for that than first-person narration?

The first-person isn’t merely some sort of textual flypaper that captures sensory ephemera which flit before the eyes and past the ears, but it even more uncannily makes the thoughts of another person — an invented person — your own. By rhetorical alchemy it transmutes your being. Think of P.G. Wodehouse’s Bertie, who for all of his flippancy and aristocratic foppishness arrives on the page as a keen and defined personality in his own right (and write). “I don’t know if you have the same experience, but a thing I have found in life is that from time to time, as you jog along, there occur moments which you are able to recognize immediately with the naked eye as high spots,” Wodehouse writes in The Code of the Woosters. “Something tells you that they are going to remain etched, if etched is the word I want, forever on the memory and will come back to you at intervals down the years, as you are dropping off to sleep, banishing that drowsy feeling and causing you to leap on the pillow like a gaffed salmon.” If genteel and relaxed Anglophilia isn’t your thing, you can rather turn into the nameless narrator of Ralph Ellison’s Invisible Man, buffeted and assaulted by American racism and denied self-definition in his almost ontological anonymity, who recalls the “sudden arpeggios of laughter lilting across the tender, springtime grass — gay-welling, far-floating, fluent, spontaneous, a bell-like feminine fluting, then suppressed; as though snuffed swiftly and irrevocably beneath the quiet solemnity of the vespered air now vibrant with somber chapel bells.”

Those who harden literature into reading lists, syllabi, and ultimately the canon have personal, social, and cultural reasons for being attracted to the works that they choose to enshrine, yet something as simple as a compelling narrative voice is what ultimately draws readers. An author who is able to conjure from pure nothingness a personality that seems realer than real is like Prospero summoning specters out of the ether. Denis Johnson was able to incarnate spirits from spirits in his classic novel of junkie disaffection Jesus’ Son. Few novels are able to convey the fractured consciousness of the alcoholic (and I know) as well as Johnson’s does; his unnamed character, known only as “Fuckhead,” filtering through the distillery of his mind every evasion, half-truth, duplicity (to self and others), and finally radical honesty that the drug addict must contend with if they’re to achieve any semblance of sanity. Writing of an Iowa City bar and its bartender, Johnson says that “The Vine had no jukebox, but a real stereo continually playing tunes of alcoholic self-pity and sentimental divorce. ‘Nurse,’ I sobbed. She poured doubles like an angel, right up to the lip of a cocktail glass, no measuring. ‘You have a lovely pitching arm.’” For those normies of a regular constitution, the madness of addictions like Fuckhead’s can seem a simple issue of willpower, yet the distinctive first-person of Jesus’ Son conveys, at least a bit, what it means to be locked inside a cage of your own forging. Fuckhead recalls seeing the aforementioned bartender after he’d gotten clean: “I saw her much later, not too many years ago, and when I smiled she seemed to believe I was making advances. But it was only that I remembered. I’ll never forget you. Your husband will beat you with an extension cord and the bus will pull away leaving you standing there in tears, but you were my mother.”

Alcoholism isn’t the only manifestation of consciousness at war with itself; self-awareness turned inside out so that a proper appraisal of reality becomes impossible. Consider the butler Stevens in Kazuo Ishiguro’s The Remains of the Day, who carries himself with implacable dignity and forbearance, and is committed to the reputation of his master Lord Darlington, while overlooking the aristocrat’s Nazi sympathies. Stevens is a character of great sympathy, despite his own loyalty being so extreme that it becomes a character deficiency. No mind is ever built entirely of abstractions. Rather, our personalities are always constituted by a million prosaic experiences; there is much more of the human in making a cup of coffee or scrubbing a toilet than there is anything that’s simply abstract. “I have remained here on this bench to await the event that has just taken place – namely, the switching on of the pier light,” remembers Stevens, “for a great many people, the evening is the most enjoyable part of the day. Perhaps, then, there is something to his advice that I should cease looking back so much, that I should… make the best of what remains of my day.” A distillation of individual thought as it experiences the world, to move from switching on a light to a reflection on mortality. Ishiguro captures far more of life in Remains of the Day than does a manifesto, a treatise, a syllogism, a theological tract.

By its very existence, fictional literature is an argument about divinity, about humanity, about creativity. That an author is able to compel a reader into the perspective of a radically different human being is the greatest claim that there is to the multiplicity of existence, of the sheer, radiating glory of being. Take Marilynne Robinson’s Rev. John Ames, an elderly Congregationalist minister in small-town Iowa in 1956, who unfolds himself in a series of letters he’s writing to his young son, which constitutes the entirety of the novel Gilead. Devout, reverential, and most of all good, Rev. Ames’ experiences — not just the bare facts of his life but his reactions to them — are exceedingly distant from my own biography. As with so many readers of Gilead, I feel that there is a supreme honor in being able to occupy Ames’ lectern for a time, to read his epistles as if you were his son. In what for me remains one of the most oddly moving passages in recent American literature, Robinson writes how “Once, we baptized a litter of cats.” Ames enumerates the unusual piety of his childhood, and how that compelled a group of similarly religious children, fearful of the perdition which awaited pagan kittens, to dress the animals in doll clothing while uttering Trinitarian invocations and using water to mark the cross on their furry foreheads. “I still remember how those warm little brows felt under the palm of my hand. Everyone has petted a cat, but to touch one like that, with the pure intention of blessing it, is a very different thing.” The way that Ames recounts the impromptu feline conversion is funny, in the way that the things children do often are, especially the image of the meowing cats dressed like babies, their perfidious mother stealing them back by the napes of their necks, all while the event is marked by the young future minister trying to bring as many of the kittens to Christ as he could. “For years we would wonder what, from a cosmic viewpoint, we had done to them. It still seems to be a real question. There is a reality in blessing… It doesn’t enhance sacredness, but it acknowledges it, and there is a power in that. I have felt it pass through me so to speak.”

And so have I, so to speak, felt that same emotion, despite not being a Congregationalist minister, a resident of rural Iowa, or a man born in the last decades of the Victorian era writing letters to his son. What the descriptions of first-person narration accomplish — both the litany of detail in the material world, and the self-reflection on what it means to be an aware being — is the final disproof of solipsism. First-person narration, when done deftly and subtly, proves that yours isn’t the only consciousness, because such prose generates a unique mind and asks you to enter into it; the invitation alone is proof that you aren’t all that exists. Literature itself is a giant conscious being — all of those texts from Sinuhe onward mingling together and interacting across the millennia like synapses firing in a brain. Writing may not be reducible to thought, but it is a type of thought. Fiction is thus an engine for transformation, a mechanism for turning you into another person. Or animal. Or thing. Or an entity omniscient. Writing of the feline baptism, Ames remembers that “The sensation is one of really knowing a creature, I mean really feeling its mysterious life and your own mysterious life at the same time.” This, it seems to me, is an accurate definition of literature as well, so that fiction itself is a type of baptism. Writing and reading, like baptism — and all things which are sacred — are simply the act of really knowing a creature.

Annotate This: On Footnotes

-

Good Protestant that he was, the sixteenth-century English printer Richard Jugge believed in the immutable word of God and in the salvation afforded to the faithful through the inviolate letters of scripture alone. Through reading the gospels the good Christian could find their heavenly reward, not in donations, or morality, or good works. Jugge was ordained in that priesthood of all believers composed of all those who were baptized – no need for black-clad priests to keep the Bible locked away in Latin, because any pious Englishman could well enough read the words for him (or her)self, so that the most humble boy plowing the field knew as much chapter and verse as the Pope himself. The meaning of scripture was simple, straightforward, and literal.

But… sometimes any reader could use some help interpreting the finer points of Hebrew or Koine translation. And… anyone could be forgiven for needing the guidance of expertise to better comprehend what exactly it was that God wanted them to do, think, believe, and feel. What was needed was some explication. Some explanation. Some interpretation. So Jugge, setting the type into his printing press to produce copies of the official Bishop’s Bible translation in 1568, faced a bit of a conundrum. How were stolid English Protestants to properly read – of their own accord – a book written by ancient Jews millennia ago? How could such a foreign book be applied to the great theological disputes of the day, from the nature of the Eucharist to the finer points of double predestination? His solution was in the details, or rather their margins, for in Richard Jugge’s printshop, located just north of St. Paul’s Cathedral and just downward from heaven, was birthed the Reformation’s most enduring contribution to typography – the footnote.[1]

God may have supplied the Word, but Jugge was content to supply the gloss (or at least the typographic organization of it). Jugge’s footnotes, it should be emphasized, were certainly not the first instance of written commentary on scripture (or other books) included within a volume itself. “Notes in the margin and between the lines — so-called glosses — had featured in written documents since time immemorial,” writes Keith Houston in Shady Characters: The Secret Life of Punctuation, Symbols, and Other Typographical Marks. Marginalia has a venerable history in both manuscript and print, and there is a tradition of using the white space of the page’s border to draw attention to something within the text, from the creatures and monsters populating the illustrations of the Book of Kells, beings who sometimes point at and mock transcription errors, to the curious symbol of the manicule, a drawing of a hand looking a bit like one of Terry Gilliam’s Monty Python animations, which gestures to the important bits within a book. “But no traditional form of annotation — from the grammarian’s glosses to the theologian’s allegories to the philologist’s emendations — is identical to the historical footnote,” historian Anthony Grafton reminds us in his invaluable study The Footnote: A Curious History.

Along with other typographical devices that are separate from the main text, from epigraphs to bibliographies, I’ve always had an affection for the lowly footnote. Anybody who has ever had to do advanced academic work has a love/hate relationship with the form; the way in which the piling up of footnotes and endnotes exhibits deep reading (or the performance of it), the scholarly equivalent of getting a stamp in your passport, but also the confusing tangle of references, citations, and long-ago academic debates to which a young researcher must tip their cap in due deference, which can be a perfunctory slog at best. Footnotes can be an exercise in arid, sober, boring credit-giving, but some of the most dynamic monographs have the best stuff squirreled away in the footnotes. One of my favorite footnotes is in Stephen Booth’s masterful New Critical close readings of Shakespeare’s Sonnets, in which, regarding a particularly contentious biographical issue, the editor notes that “Shakespeare was almost certainly homosexual, bisexual, or heterosexual.” Looking for such treasure within the footnotes — digressions, jokes, insecurities, reflections, meditations, disagreement, discourse, conversation, and snark in addition to reference, citation, credit, and acknowledgement — is like a form of textual beachcombing, pulling up something shiny out of a sandpile. As Bruce Anderson enthuses in Stanford Magazine, “Footnotes allow us not only to see the prejudices of old sources, but the biases and convictions of the footnoter himself. They provide readers with the intellectual map that the writer has used to arrive at her conclusions.” Footnotes are several things at once – labyrinth, but also diagram; honeycomb and map; portrait of thought and bibliographic graveyard.

Jugge’s innovation wasn’t commentary, it was knowing when to shunt commentary away at the bottom of the page. It wasn’t his Bible’s only perk, for a book is always more than the words on the page (whatever Protestants or New Critics might claim). David Daniell describes the edition in his study The Bible in English, noting that Jugge’s volume was “lavish in its ornaments in initial letters, fresh title-pages with portraits, 124 distinct woodblock illustrations, and four maps.” Work on the translation which constituted the Bishop’s Bible was largely overseen by Archbishop of Canterbury Matthew Parker, who was vested by Elizabeth I with the task of making a version of scripture that, though it was in English, avoided the overly radical tone of the Geneva Bible, which had been translated by refugees who’d flocked to the Swiss canton during the reign of the queen’s sister. Daniell is less than enthusiastic about Parker’s translation, writing that “scholar though he was, [he] could not write even reasonably pleasing English.”[2] Writing poetry wasn’t the Archbishop’s goal, however, for the new edition was tasked by the queen to avoid the overly Protestant affectations of previous translations.

Translation was always a sectarian affair. In choosing to call some religious authorities “elders,” or even “ministers,” rather than “priests,” and in translating a word normally rendered as “gift” as “grace,” Protestant translators dressed the “naked sense” (as the Renaissance translator William Tyndale described literal reading) of the Hebrew and Greek originals in Protestant clothes. That’s always the point of a footnote: to pick some form of clothing whereby the author or editor renders the text in a certain fashion. In this regard, the Bishop’s Bible (in keeping with its episcopal name) was steadfastly less Protestant than the Geneva Bible (in keeping with its steadfastly Calvinist name). “The marginal notes… are Protestant,” explains Daniell, but Parker exerted himself to “abstain from ‘bitter notes.’” If anything, the Bishop’s Bible necessitated the reduction of marginalia. Instances such as when the Geneva Bible glosses Revelation 11:7, which describes the “beast that cometh out of the bottomless pit,” as matter-of-factly referring to the Pope wouldn’t find room in Jugge’s printing. In fact, most of the marginalia from the Geneva Bible wouldn’t be repeated in the Bishop’s Bible, as all of the writing which crowded around the page would be conveniently placed at the bottom.[3]

Footnotes, marginalia, and glosses such as those in the Bishop’s Bible belong to an awkward category of literature called “paratext,” the ancillary stuff that surrounds the main event of a book, which can include introductions, prefaces, prologues, afterwords, blurbs, bibliographies, copyright information, epigraphs, and even covers and author pictures, among other things. Paratext is basically all that can be skipped while a person can still credibly be said to have read the book in good faith (it takes a dogged completist to include ISBN information in their reading itinerary). Easy to forget that all of those accoutrements which we associate with books, but which fade into the background as mere static, have their own histories separate from the text itself. Gérard Genette writes in Paratexts: Thresholds of Interpretation that a “text is rarely presented in an unadorned state, unreinforced and unaccompanied by a certain number of verbal or other productions.”

Jugge and Parker may have believed that doctrine was verified through recourse to the text alone, but which text? No book is an island; its shores are sandy and diffuse out into the wider ocean of literature, so that paratext does “surround it and extend it,” as Genette notes, for “More than a boundary or a sealed border the paratext is, rather, a threshold.” Stated theology had it that all which was inviolate would be found in the distance between Genesis and Revelation, yet Jugge’s footnotes belied that quixotism with something altogether more utopian — all books must be, whether spoken or not, visible or not, apparent or not, in conversation at all times with a bevy of other works. Footnotes are the receipts of those dialogues. They are, as Chuck Zerby writes in The Devil’s Details: A History of Footnotes, “One of the earliest and most ingenious inventions of humankind… an indispensable tool of the scholar and a source of endlessly varied delight for the layperson.”[4]

Christians were in some ways Johnny-come-latelies to this method of freezing contending minds into the borders of the sacred page. While the footnote was a Protestant innovation, the process of standardizing commentary into a book has precursors in the Jewish Talmud. That massive collection is typographically organized with the primary text of the Mishnah, constituting an assemblage of Jewish oral law, placed at the center of the page, which is then illuminated by rabbinical analysis known as Gemara. A tractate of the Talmud will display a portion of Mishnah with the Gemara, by several different authors, arrayed around the primary text. Gemara can be, at times, quite spirited, with Barry Scott Wimpfheimer writing in The Talmud: A Biography that the text “has the habit of citing its rabbinic authorities and tolerating disagreements.”

For example, in one Talmudic tractate where the Mishnah is concerned with the prohibition on mixing milk with meat, a disagreement erupts within the marginal gloss of the Gemara as to whether fowl is properly considered meat, especially since poultry produce no dairy (and thus the moral imperative of the original commandment would seem irrelevant). Within the Mishnah, Rabbi Akiva says that “Cooking the meat of… a bird in milk is not prohibited by Torah law” (though he allows that it would be forbidden by the oral law). His interlocutor Rabbi Yosei HaGelili argues that a close conjunction of verses in Deuteronomy regarding the prohibition on eating a kid cooked in its mother’s milk, alongside a ban on ingesting carrion, “indicates that the meat of an animal that is subject to be prohibited due to the prohibition of eating an unslaughtered carcass is prohibited to cook in milk. Consequently, one might have thought it prohibited to cook in milk the meat of a bird.” The interpretive gloss of the Gemara is a demonstration of a text’s living properties, the way in which correspondence, dialogue, and debate take place even within the static environs of a book, and how any codex is an artifact of discussion that exists beyond the parameters of the page itself.

That footnotes are in some sense scholarly is central to our connotative sense of them. A book taken off the shelf at random advertises itself as academic if there are footnotes, endnotes, and bibliographical lists littering its pages. We see the footnote as a mark of erudition, as a means of demonstrating knowledge and giving attribution. The footnote is an exercise in due deference. It makes sense that, alongside the categorization system of chapter and verse which structures early modern Bibles (and which was quickly adopted by Catholics), the footnote was both a product of Protestantism and of modernity. Ministers might not have been priests, capable of enacting sacramental changes in our world, but they were like physicians, lawyers, and professors — which is to say bourgeois professionals selling expertise. A minister can’t change water into wine, but what he’s selling is the explanation of how Hebrew conjugation and Greek declension necessitate tithing. Footnotes are a mark of scholarly professionalism, and rhetorically they impart a heady ethos by demonstrating that the author appears to know what they’re talking about.

From one perspective, the footnotes in the Bishop’s Bible marked a new method of composition, whereby a concern with evidence to bolster argumentation helped facilitate the transition from the Renaissance into the Enlightenment. The eighteenth century, the era of encyclopedists and lexicographers, was when the footnote became an academic mainstay; both a sop to the new empiricism, whereby inductive arguments must be made by recourse to something outside the author’s reasoning, but also, counterintuitively, a pale ghost of the ancient reliance on past authorities. John Burrow explains in A History of Histories: Epics, Chronicles, Romances and Inquiries from Herodotus and Thucydides to the Twentieth Century that footnotes in the Enlightenment did “far more than just identify sources… [they were] a way of bringing erudition to the support of the text without cluttering it with documents… an idiosyncratic art form, a commentary… [that] gives rein to a relaxed, garrulous intimacy which acts in counterpoint with the tautly controlled formality of the text.”

Perhaps the eighteenth century’s greatest footnote enthusiast is the historian Edward Gibbon in Decline and Fall of the Roman Empire, who uses the form to bitchily comment on antiquity’s greatest thinkers. Regarding the Church Father Origen’s self-castration, Gibbon notes that the goal was “to disarm the tempter;” when considering Saint Augustine, the historian says that his “learning is too often borrowed and… [his] arguments are too often his own.”[5] The role of footnotes, whether in the Talmud, the Bishop’s Bible, eighteenth-century historiography, or modern scholarship, is ostensibly to provide source and evidence, but footnotes also provide a glimpse into a mind struggling and sometimes fighting with the material. “They evoke the community of historians, scholars and antiquaries, ancient and modern, in a kind of camaraderie of admiration, scorn and sometimes smut,” writes Burrow. Footnotes are a personal, individual, and estimably humane variety of typography.

Books are never self-contained; they are multileveled, contradictory things, and this is demonstrated best by that great engine of contradiction, the novel. When fully compelled, the novelistic imagination brooks no arbitrary unities, instead reveling in the multifaceted, and thus it’s a genre uniquely suited to mimetically presenting a mind in disagreement with itself. Novelists have made great stylistic use of the footnote as a means of demonstrating the simultaneous reverence and disagreement which the form is capable of enacting. Books as varied as James Joyce’s Finnegans Wake, Roberto Bolaño’s Nazi Literature in the Americas, Junot Díaz’s The Brief Wondrous Life of Oscar Wao, Mark Danielewski’s House of Leaves,[6] and of course David Foster Wallace’s Infinite Jest all make use of the footnote’s literary possibilities, allowing parallel narratives to take place in the margins, unseen narrators to comment, and digressions, disagreements, and debates to occur within the white space of the page. The gold standard of the postmodern novel using footnotes to both bifurcate and further a narrative is Vladimir Nabokov’s Pale Fire, a high modernist poem written by the character John Shade, with footnoted annotations by the critic Charles Kinbote containing a narrative that ranges from campus bildungsroman to novel of international intrigue. It’s the sort of book where Shade can wax “For we die every day; oblivion thrives/Not on dry thighbones but on blood-ripe lives,/And our best yesterdays are now foul piles/Of crumpled names, phone numbers and foxed files,” while his biographer can note in the margins “This brand of paper… was not only digestible but delicious.”

There is an irony in the fact that the reformers claimed salvation was afforded through scripture alone at precisely the same moment that they placed that book in a web of all of its mutual influences (and every other book as well). The Bible might be a thing-unto-itself, but its footnotes are an exhibition in how no book can survive without that almost mystical web of mutual interdependence with other books innumerable. “To the inexpert, footnotes look like deep root systems, solid and fixed,” writes Grafton, but to the “connoisseur, however, they reveal themselves as anthills, swarming with constructive and combative activity.” What makes a footnote remarkable, arguably, is that it provides evidence of that constructive and combative activity, since all studies, histories, monographs, novels, poems, and plays are born from a similar struggle between the author and reality, whether or not those scars are displayed. Since no book is written by an infant, or by a disembodied eternal consciousness, or from the pure and absolute ether of the void, every book is the result of a writer reading other books. Footnotes are simply an acknowledgement and demonstration of that, and every book written is threaded through with the spectral presence of footnotes, stated or unstated.

Image Credit: Unsplash/Aaron Burden.

[1] It should be said that though dividing biblical books into chapters and verses was an innovation of Protestants, for whom the doctrine of sola Scriptura necessitated a certain amount of making scripture more convenient for lay readers, it was something which the Catholic Church took to quickly. The same is true for footnotes; by 1582 the Douay-Rheims translation of the Bible, prepared by English priests living on the continent (and predating the King James Bible by three decades), had already adopted Jugge’s innovation of footnotes for its own purposes.
[2] The Hebraist Gerald Hammond is even less charitable about the Bishop’s Bible, describing it as “For the most part… a lazy and ill-informed collection of what had gone before, or, in its original parts, the work of third-rate scholars and second-rate writers.”

[3] Houston explains that “The Renaissance… marked a change… for the marginal note passed from the province of the reader to that of the writer; notes grew longer and more frequent as authors preemptively surrounded the narrative with their own ‘authorized’ commentary.”

[4] Zerby’s text is idiosyncratic, eccentric, and endlessly readable.

[5] I’ve borrowed both of these examples from Anderson’s helpful article.

[6] Which I started to read in 2003 but, though it’s an incredible novel, I put aside about two thirds of the way through. I’d decided that I’d pick up the horror novel, which is ostensibly about a house that is bigger on the inside than the outside, when I was in a psychologically healthier place. I’ve never picked it up again.

Ten Ways to Lose Your Literature

-

“I have built a monument more durable than bronze.” — Horace

1. Inconceivable that a man with a disposition like Ben Jonson’s wouldn’t take an offered drink, and so it can be expected that the poet would enjoy a few when he completed his four-hundred-mile, several-months-long walk from London to Edinburgh in the summer of 1618. Mounted as a proto-publicity stunt, Jonson’s walk was a puckish journey between the island’s two kingdoms, and once the Poet Laureate was in view of the stolid, black-stoned Edinburgh Castle upon its craggy green hill, he’d be hosted by his Scottish equivalent, William Drummond of Hawthornden.

At a dinner on September 26th, the bill for victuals was an astonishing (for the time) £221 6s 4d. Imagine the verse composed at that dinner, the lines jotted onto scraps or belted out as revelries, the lost writing which constitutes the far larger portion of any author’s engagement with words. Also his engagement with ale and sack, for it was Jonson’s loose tongue that led to a discussion about a different type of literature lost to posterity, when he confessed that one of their poetic colleagues, a Reader of Divinity at Lincoln’s Inn named Dr. John Donne, had decades before attempted a scurrilous mock-epic entitled Metempsychosis; or, The Progress of the Soule.

Drawing its title from a Pythagorean doctrine that’s roughly similar to reincarnation, Donne’s Metempsychosis is a poem wherein he sings the “progress of a deathless soul, /Whom Fate, which God made, but doth not control, /Placed in most shapes.” He abandoned the project some 520 lines later. Drummond recorded that Jonson had said that the “conceit of Donne’s transformation… was that he sought the soul of the Apple which Eve pulled, and thereafter made it the soul of a Bitch, then of a she-wolf, and so of a woman.” Written before Donne converted to Protestantism, Metempsychosis was a parody of the Reformation, whereby the “soul” of the forbidden fruit would migrate through various personages in history, contaminating them with its malevolence. Jonson told Drummond that “His general purpose was to have brought in all the bodies of the Heretics from the soul of Cain,” perhaps moving through other villains from Haman to Judas.

As it was, Donne mostly charted the apple’s soul through various animal permutations, ending with Cain rather than starting with him. It seems that Donne feared his own mind, and the inexorable logic that drove his poem to a conclusion which was unacceptable. Donne “wrote but one sheet, and now since he was made a Doctor he repented highly and seeketh to destroy all his poems.” Jonson thought that Donne’s intent was to have the final manifestation of that evil spirit appear in the guise of John Calvin. Others have identified an even more politically perilous coda to Metempsychosis, when the at-the-time staunchly Catholic Donne imagined that the “great soul which here amongst us now/Doth dwell, and moves that hand, and tongue, and brow… (For ‘tis the crown, and last strain of my song)” and assumed that the poet was speaking of Elizabeth I.

Biographer John Stubbs notes in John Donne: The Reformed Soul that Metempsychosis was “a politically directed piece of writing,” which is the biggest reason why none of you will ever read the entire poem. Donne himself wrote to a friend that there must be “an assurance upon the religion of your friendship that no copy shall be taken for any respect.” Metempsychosis is one way of losing your words. Literature as fragment, literature as rough draft, literature as the discarded. The history of writing is also the shadow history of the abandoned, a timeline of false starts and aborted attempts. What Donne wrote of Metempsychosis is, even in its stunted form, the longest poem the lyricist ever penned, and yet it’s a literary homunculus, never brought to fruition. Never burnt upon the pyres by his critics either, because it would never be completed.

2. This is a syllabus of all which you shall never read: Jane Austen’s Sanditon, which exists only as an unfinished fragment written in 1817, the year its author died of Addison’s disease, and which promised to tell the narrative of Sir Edward Denham, whose “great object in life was to be seductive.” John Milton’s epic about King Arthur entitled The Arthuriad. Over two-thirds of the work of Aristotle, with all that survives composed of lecture notes. A thundering abolitionist speech delivered by the former congressman Abraham Lincoln on May 29, 1856, of which one observer said that he spoke “like a giant inspired.” Isle of the Cross by Herman Melville, which told the tale of romance between the Nantucket maiden Agatha Hatch Robertson and a shipwrecked sailor and crypto-bigamist. Melville explained in a letter to Nathaniel Hawthorne that Isle of the Cross concerned “the great patience & endurance, & resignedness of the women of the island in submitting so uncomplainingly to the long, long absences of their sailor husbands” (rejected by publishers; no record survives).

A series of versified narratives of Aesop penned by Socrates, the philosopher famed for despising writing. Hypocritical though Socrates may have been, he inspired Plato to burn all of the poems he wrote, and to argue for the banning of such trifles in The Republic. Sacred scripture has its absences as well, for the Bible references any number of ostensibly divine books which are to be found nowhere today: The Book of the Covenant mentioned in Exodus 24:7, The Book of the Wars of the Lord cited in Numbers 21:14, the Acts of Uzziah whose authority is invoked at 2 Chronicles 26:22, and the evocatively titled Sayings of the Seers, which is called upon at 2 Chronicles 33:19. Stuart Kelly, author of The Book of Lost Books: An Incomplete History of All the Great Books You’ll Never Read, notes that the Bible is “A sepulcher of possible authors, a catafalque of contradictory texts.” Scripture is a “library, but one in ruins.” Scholars know that there was, for example, a Gospel of Perfection and, even more arrestingly, a Gospel of Eve, neither of which made the final cut, and neither of whose whereabouts are known today.

Of the sacred books of the oracular Roman Sibyllines, not a single hexameter survives. Concerning sublime Sappho, she who was celebrated as the tenth muse, only one complete lyric endures. Nor are Socrates and Aristotle the only Greek philosophers for whom posterity records virtually none of their original writings; the great Cynic Diogenes of Sinope, who lived in an urn, masturbated in the Agora, and told Alexander the Great that the only thing the ruler of the world could do for him is get out of the light, supposedly wrote several manifestos, none of which still exists.  

In 1922 a suitcase filled with Ernest Hemingway’s papers, including the draft of a finished World War I novel, was stolen from Paris’s Gare de Lyon railway station. Nearly two decades later, the Marxist theorist Walter Benjamin fled towards the Spanish border after the Nazis invaded France, with a briefcase containing a manuscript. He’d commit suicide in Portbou, Spain, fearing that the SS was following; when his confidant Hannah Arendt arrived in America, she tried to find Benjamin’s book, but the draft is seemingly lost forever. Lord Byron’s manuscripts weren’t misplaced, but were burned at the urging of his Scottish publisher John Murray, who was horrified by the wanton sexuality in the famed rake’s autobiography, not least of which may have been a confession of an incestuous relationship with his half-sister. Nineteenth-century critic William Gifford said that the poet’s recollections were “fit only for a brothel and would have damned Lord Byron to everlasting infamy.”

Franz Kafka desired that the entirety of his unpublished corpus be destroyed by his friend Max Brod, writing that “whatever you happen to find, in the way of notebooks, manuscripts, letters, my own and other people’s, sketches and so on, is to be burned unread to the last page.” Unfortunately for Kafka, but to literature’s benefit, Brod turned out to be a terrible friend. Of that which survives, however, much was incomplete, including Kafka’s novel Amerika with its infamous description of the Statue of Liberty holding a sword aloft over New York Harbor. A very different type of burning was that of the literary theorist Mikhail Bakhtin, who was holed up in a Moscow apartment when the Soviet Union was invaded by the Nazis. Bakhtin was forced to use his manuscript as cigarette paper. The book which he was working on, which would mostly go up in tobacco smoke, was a study of the German novel.  

Of the ninety plays which Euripides wrote, only eighteen survive. Aeschylus also wrote ninety tragedies, but of his, only six can be performed today. Sophocles has seven plays which are extant, though we know that he penned an astounding 123 dramas; from his play The Loves of Achilles only one haunting line survives — “Love feels like the ice held in the hand by children.” William Shakespeare was the author of a lost Love’s Labour’s Won and, with his later collaborator John Fletcher, a lost play based on Miguel de Cervantes’s novel Don Quixote entitled The History of Cardenio, an irresistible phantom text. Hamlet, it has been convincingly argued, was based on an earlier play most likely written by Thomas Kyd, which scholars have given the clinical name of the Ur-Hamlet, though no traces of that original are available. Our friend Jonson also has a significant number of lost works, not least of which was 1597’s The Isle of Dogs, which he cowrote with Thomas Nashe and which was briefly performed at a Bankside theater. Like Donne’s poem, the subject matter of The Isle of Dogs was potentially treasonous, with the Privy Council ruling that it contained “very seditious and slanderous matter,” banning the play, and briefly arresting its two authors.

When it comes to such forgotten, hidden, and destroyed texts, Kelly argues that a “lost book is susceptible to a degree of wish fulfillment. The lost book… becomes infinitely more alluring simply because it can be perfect only in the imagination.” Hidden words have a literary sublimity because they are hidden; their lacunae function as theme. Mysteriousness is the operative mood of lost literature; whether it’s been the victim of water or fire, negligence or malfeasance, history or entropy, what unites these works is their unknowability. They are collectively the great unsolved of literature. There’s a bit of Metempsychosis about it, with a more benign lost soul connecting a varied counter-canon from Aristotle to Byron to Austen to Hemingway. Pythagoras, who believed that all souls and ideas were united by an unseen divine filament which replicated throughout eternity and infinity, would have some insight on the matter. Sadly, none of Pythagoras’ writings happen to survive.

3. The claim that Hernan Cortez was welcomed by Montezuma into Tenochtitlan—that city of verandas, squares, canals, and temples—as if he were the feather-plumed Quetzalcoatl owes much to the accounts gathered by Bernardino de Sahagun. The Franciscan friar assembled a group of Nahuatl speakers to preserve what remained of Aztec culture, including folklore, philosophy, religion, history, and linguistics. This sliver would be preserved in the Florentine Codex, named after the Italian city where it would one day be housed. The Nahuatl authors attempted to resurrect the world of their parents, when Tenochtitlan was larger and more resplendent than any European city, a collection of cobalt-colored palaces, observatories, and libraries, such that the conquistador Bernal Diaz del Castillo recalled “seeing things never before heard of, never before seen.” Miguel Leon-Portilla translates much of the Florentine Codex in The Broken Spears: The Aztec Account of the Conquest of Mexico, an evocation of that which “had been stormed and destroyed, and a great host of people killed and plundered… causing terror wherever they went, until the news of the destruction spread through the whole land.”

Cortez’s conquest took two years and was completed by 1521. Eight years later, the Spanish inquisitor Juan de Zumarraga, fabled in his native land as a witch-hunter, arrived and assembled a massive auto-da-fe of Aztec and Mayan books—with the Mestizo historian Juan Bautista Pomar noting that such treasures “were burned in the royal houses of Nezahualpiltzintli, in a large depository which was the general archive.” If Cortez was guilty of killing thousands of Aztecs, and ultimately millions in the pandemics he spread, then Zumarraga was a murderer of memory. One assaulted the body and the other the mind, but the intent was the same — the extinction of a people. Lucien X. Polastron writes in Books on Fire: The Tumultuous Story of the World’s Great Libraries that the “conquistador was there to kill and capture, the cleric to erase; the bishop fulfilled his mission while satisfying his conscious desire to destroy the pride and memory of the native people.” The Jesuit Jose de Acosta mourned that “We’ve lost many memories of ancient and secret things, that could have been of great utility. This derives from a foolish zeal,” it being left to those like Sahagun to try to redeem the Spanish from this holocaust they’d unleashed.

In A Universal History of the Destruction of Books: From Ancient Sumer to Modern-day Iraq, Fernando Baez argues that “books are not destroyed as physical objects but as links to memory… There is no identity without memory. If we do not remember what we are, we don’t know what we are.” Zumarraga’s atrocity is only one of many examples, including the destruction of the famed library at Alexandria and the 1536 Dissolution of the Monasteries, when the English King Henry VIII would immolate Roman Catholic books in a campaign of terror which destroyed almost the entirety of the early medieval English literary heritage, save for a few token works like Beowulf which would later be rediscovered. Paradoxically, burning books is an acknowledgement of the charged power contained therein.

4. Sometime in 1857, an enslaved woman named Hannah Bond escaped from the North Carolina plantation of John Hill Wheeler. Light-skinned enough to pass as white, Bond dressed in drag and boarded a train due north, eventually arriving in upstate New York. There Bond would board with a Craft family, from whom she would take her new surname. Eventually the newly christened Hannah Craft would pen in careful handwriting a novel entitled The Bondwoman’s Narrative, the title perhaps a pun on its author’s previous slave name. Displaying a prodigious knowledge of the books in Wheeler’s library, which Craft had been able to read in stolen moments, The Bondwoman’s Narrative is a woven quilt of influences (as all novels are); a palimpsest of things read but not forgotten.

There are scraps of Horace Walpole’s gothic potboiler The Castle of Otranto and Charlotte Bronte’s tale of a woman constrained in Jane Eyre; Harriet Beecher Stowe’s sentimentalized slavery in Uncle Tom’s Cabin and the poet Phillis Wheatley’s evocation of bondage; and, more than any other literary influence, that of Charles Dickens’ Bleak House. Drawing from the brutal facts of her own life, The Bondwoman’s Narrative concerns another Hannah’s escape from a plantation. “In presenting this… to a generous public I feel a certain degree of diffidence and self-distrust,” wrote Craft, “I ask myself for the hundredth time How will such a literary venture, coming from a sphere so humble be received?” Written sometime between 1855 (possibly while still in North Carolina) and 1861 (during the earliest days of the Civil War), Craft’s question wouldn’t be answered until 2002, after the manuscript was found squirreled away in a New Jersey attic.

As far as we know, The Bondwoman’s Narrative is the only surviving novel by an enslaved Black woman. There was an assemblage of slave narratives, ranging from Olaudah Equiano’s The Interesting Narrative of 1789 to Solomon Northup’s terrifying 1853 captivity narrative Twelve Years a Slave and Frederick Douglass’ autobiographical trilogy. Purchased at auction by the Harvard scholar Henry Louis Gates, who would edit, annotate, and publish the book, Craft’s rediscovery evokes the biblical story of King Josiah restoring Jerusalem’s Temple after finding the Book of Deuteronomy, its purpose to restore a sense of justice. Craft’s intent was that Americans must “recognize the hand of Providence in giving to the righteous the reward of their works, and to the wicked the fruit of their doings.”

There is no discussing lost literature without consideration of that which is found. Just as all literature is haunted by the potential of oblivion, so all lost books are animated by the redemptive hope of their rediscovery. Craft’s book is the mark of a soul; evidence of that which is left over after the spirit has told us what it needs to tell us, even if it takes centuries to hear. A miracle in its rediscovery, Craft’s book is the rare survivor from hell that teaches us how much is lost as humans are lost. What lyrics, which we shall never read, were written in the minds of those working plantations; what verse revised in the thoughts of those being marched into the gas chamber? This is among the saddest of all lost literature. Craft’s rediscovery provides the divine promise of that canon of lost books—that literature may be lost, but maybe only for a time.

5. During the spring of 2007, I read the entirety of what survives of the pre-Socratic metaphysician Democritus. The assignment took me forty-five minutes. I did it in a Starbucks on Forbes Avenue in Pittsburgh, and my Venti wasn’t even cold by the time I finished. When we consider literature that has been lost, literature which has survived, and literature that has been rediscovered, it must be understood that much is fragmentary — sometimes in the extreme. Democritus, the “laughing philosopher,” the father of science, who first conceived of atoms, endures in about six photocopied pages. “And yet it will be obvious that it is difficult to really know of what sort each thing is,” reads one of Democritus’ surviving fragments, and how correct he was.

Democritus is not unique; most ancient philosophers exist only in quotation, paraphrase, or reputation. No tomes survive of Thales, Heraclitus, Parmenides, Protagoras, or Zeno, only some flotsam and jetsam here and there. As I’ve already mentioned, no words of Pythagoras survive, and all of Aristotle is turgid second-hand lecture notes. The classical inheritance of Greece and Rome exists, where it does, in shards. Harvard University Press’s celebrated Loeb Classical Library, which prints translations of Greek and Latin literature, has 542 books in the entire sequence, from Apollonius to Xenophon. More than can be read in a long weekend, no doubt, but easily accessible during the course of a lifetime (or a decade). It’s easy to assume that the papyri of Athens and Rome were kindling for Christians who condemned such pagan heresy, though that’s largely a slander. The reality is more prosaic, albeit perhaps more disturbing in a different way. Moisture did more to betray the classical past than Christianity did, for decay is a patient monarch willing to wilt Plato as much as a grocery list. Something to remember as we have our endless culture wars about what should or shouldn’t be in the canon. What’s remembered happens to simply be what we have.

Fragmentation defines literature — there is a haunting of all which can’t be. Fragments are faint whispers of canons inaccessible. Lacunae are sometimes structured into writing itself, for literature is a graveyard filled with corpses. Sometimes a body is hidden in plain sight – consider Shakespeare’s Sonnet 146. That poem begins “Poor soul, the centre of my sinful earth, / […] these rebel powers that thee array.” Because the meter is so aggressively broken, it’s understood that a typesetter’s mistake was responsible for the deletion of whatever the poet intended. Jarring to realize that Shakespeare is forever marred with an ellipsis. Don Paterson writes in Reading Shakespeare’s Sonnets: A New Commentary that the aporia in the poem “tends to obsess most commentators,” but that the “poem deserves it; we shouldn’t allow it to be completely ruined by a compositor thinking about his dinner.” Paterson spends several pages trying to use prosody in the service of forensics, entertaining various degrees of plausibility, including Shakespeare having possibly meant to write “fenced by,” “starv’d by,” or “fooled by,” all of which, any good New Critic will tell you, would imply wildly different interpretations.

I’d like to offer an alternative possibility, based not on sober scansion but irresponsible conjecture. Paterson notes that the sonnet is one which “says that the body is a lousy home for the soul, which ends enslaved to its gaudy, pointless, sensual, self-consuming worldliness… it proposes nothing short of renunciation of worldly things, a mortification of the flesh in exchange for the revival and revivification of the spirit.” Maybe then the gap is the point, an indication that the matter of the poem can never really intimate the soul of meaning, where the black hole of the typographical mistake is as if an open grave, an absolute zero of meaning that sublimely demonstrates the theme of the sonnet itself? For the gulf between the printed word and the meanings which animate it is a medium for sublimity, the entirety of all that which we don’t know and can never read as infinite as the universe itself.

In the first century, the Roman critic Longinus, whose identity is unknown beyond his name (another way to lose your literature), argued that the “Sublime leads the listeners not to persuasion, but to ecstasy… the Sublime, giving to speech an invincible power and strength, rises above every listener.” Romantic-era critics saw the sublime in the Alps and the Lake District; American transcendentalists saw it in the Berkshires and Adirondacks. For myself, I gather the trembling fear of the sublime when I step into the Boston Public Library at Copley Square, when I pass the fierce Fifth Avenue lions of the New York Public Library, and when I stand underneath the green-patina roof of the Library of Congress. To be surrounded by the enormity of all that has been written which you shall never read both excites and horrifies me – all the more so when you consider all that is lost to us, whether from misplacement, destruction, or having never been written in the first place (the last category the most sublime of all).

Longinus’s “On the Sublime” is also fragmented, struck through with gaps and errors. He tantalizes us at one point with “There is one passage in Herodotus which is generally credited with extraordinary sublimity,” but there is nothing more sublime than a vacuum, for what follows is nothing. Later he promises that “loosely allied to metaphors are comparisons and similes, differing only in this,” but the page is missing. And at one point he claims that Genesis reads “Let there be light, and there was. Let there be earth, and there was,” though it could be entertained that Longinus is simply quoting an alternative version of the Bible which is lost. His essay was built with words from hidden collections, a gesture towards Alberto Manguel’s observation in The Library at Night that the “weight of absence is as much a feature of any library as the constriction of order or space… by its very existence, [it] conjures up its forbidden or forgotten double.”

6. W.H. Auden was the most ruthless of self-editors, as his decades-long war of attrition against his most celebrated lyric, “September 1, 1939,” demonstrates. Originally published in the New Republic on the occasion of Adolf Hitler’s invasion of Poland, “September 1, 1939” is well-remembered and well-loved for a reason. “I sit in one of the dives/On Fifty-second Street,” where Auden hears news of the panzer divisions rushing towards Warsaw and Krakow. Here among mundanity, where “Faces along the bar/Cling to their average day,” Auden invokes feelings all too familiar to us in the contemporary moment (hence the endurance of the poem), these “Waves of anger and fear” felt by the drinkers at the bar; men who feel “Uncertain and afraid/As the clever hopes expire/Of a low dishonest decade.” Nonetheless, Auden maintains that the “lights must never go out, /The music must always play,” in the penultimate stanza including a line that has moved people for eight decades – “We must love one another or die.”

Eighteen years later, Auden would write to a critic that “Between you and me, I loathe that poem.” He saw it as sentimental pablum; most importantly, Auden felt that it was simply a lie. His biographer Edward Mendelson explains in Early Auden, Later Auden: A Critical Biography that “By his own standards, if not those of his readers, these public poems failed.” In later editions he changed the line to “We must love one another and die,” the conjunction giving an entirely different (albeit literally truer) meaning. The red pen is not easily pulled out once a book is in print, for though he omitted the line in collections released in both 1945 and 1966, it was inevitable that “September 1, 1939” would circulate, even though he wrote that it was “trash which he is ashamed to have written.” This is not lost literature, but rather a failed attempt to bury the word. Impossible to imagine that Auden didn’t despair at the simple fact that there had been a time when he could have strangled the poem in the crib, that before “September 1, 1939” was sent out into the world the sovereign power of the strike-through had still been his.

Luckily for us Auden didn’t do that, but the episode demonstrates that one of the most effective means of losing literature is editing and revising. How many innumerable drafts of famous novels and poems exist, revisions immeasurable? If there is any modernist credo, it’s one of valorizing the red pen. The oft-repeated injunction to “kill your darlings,” and anecdotes about Hemingway writing a staggering forty-seven endings for A Farewell to Arms (though one of his novels is still in some tossed luggage somewhere). Such is the masochism of contemporary composition advice, whereby if there is one inviolate truism it’s that writing isn’t writing unless it’s rewriting. Vladimir Nabokov bragging that “My pencils outlast their erasers”; Truman Capote saying “I believe in the scissors”; and William Strunk and E.B. White’s commandment in The Elements of Style that “A sentence should contain no unnecessary words” (I reject that advice). Lean, muscular, masculine, taut, minimalist prose was the point of writing, and as such loss became an intrinsic part of literature itself.

Hannah Sullivan in her brilliant The Work of Revision examines how the cult of editing emerged, looking at how technology in part facilitated the possibility of multiple drafts. With the introduction of mechanical means of composition (i.e., the typewriter), authors had, for the first time, the possibility to relentlessly write and rewrite, and a certain ethos of toughness surrounding the culling of words developed. “Our irrepressible urge to alter ‘the givens’ helped to create Modernism,” argues Sullivan, and “remakes us right to the end.” In some ways, contemporary technology haunts us with the ghosts of exorcised drafts more than mere typewriters ever could. Sullivan had a record of typed pages to look back at: drafts with underlined and struck-out passages, a cacophony of addition carets and transposition marks and the eternal promise of “TK,” rendered in ink and whiteout, but she had a record.

With word processing, editing and revision can be instantaneous in a manner that they couldn’t be with a Remington, so that drafts exist layered on top of each other, additions and deletions happening rapidly in real time, with no record of what briefly existed before, like some quantum fluctuation. A final copy is the result of writing, but is not writing itself. It rather represents the aftermath of a struggle between the author and the word, merely the final iteration of something massive and copious, spreading its tendrils unseen backwards into a realm of lost literature. Revision is a rhizomatic thing, each one of the branches of potential writing hidden and holding aloft the tiny plant. A final draft is the corpse left over after the life that is writing has ended.

7. How talented an actor must Edwin Forrest have been that on May 10th, 1849 his fans would be willing to riot in his defense after it was perceived that he was being slighted by the rival thespian William Charles Macready? The two had long traded blows in the press over who was the superior Shakespearean actor, and they each had their own partisans. Where Macready was delicate and refined, Forrest was rough-hewn and rugged; Macready delivered his lines with elegance, Forrest with swagger and punch. Advocates for Macready watched their hero perform Hamlet in the palatial theaters of Broadway; the faction of Forrest was content to drink and brawl in front of the Bowery’s stages. Most importantly, Macready was British and Forrest an American. Shakespeare was thus an issue of patriotic loyalty, with Nigel Cliff writing in The Shakespeare Riots: Revenge, Drama, and Death in Nineteenth-Century America that the Bard was “fought over, in frontier saloons no less than in aristocratic salons, with an almost hysterical passion.”

“Early in life,” Forrest once said, “I took a great deal of exercise and made myself what I am, a Hercules.” The “Bowery Boys” of the Five Points slums were delighted by Forrest’s American self-regard. Which actor you preferred, and whose style of delivery you saw as superior, said much about who you were as a person. Forrest was preferred by the working class, both Know Nothing nativists and Irish immigrants thrilled to him, while the Anglophilic New York aristocracy attended Macready plays in droves. Following three nights of rambunctious, mocking “Bowery Boys” buying out seats at Macready’s title performance in Macbeth (of course) at the Astor Place Opera House, a riot exploded, leaving as many as three dozen people dead and over a hundred injured. It was the worst violence in the city since British prison ships dotted the harbor during the Revolution, and would remain so until the Draft Riots burned through New York during the Civil War. All of it over the power of performances that none of us shall ever see.

No literature is more intrinsic to human experience than performance, and no literature is more perishable. The New York World said that Forrest had “head tones that splintered rafters,” and reviewers noted the distinctive elongated pauses of Macready’s delivery, but the fact is that theirs is an art that will never be accessible to us. Sometimes Macready is configured as a stuffy Laurence Olivier to Forrest’s virile Marlon Brando, but more than likely both would have performed in the rigid, stylized manner that reigned supreme in a theater where there was no technology that could amplify voices and where the idea of the naturalistic method would have seemed bizarre. We can’t really know, though — Forrest died in 1872, followed less than six months later by Macready, both within five years of Thomas Edison’s invention of the phonograph.

We know a tremendous amount about the men, and eventually the women, who first performed some of the most iconic of plays centuries before Forrest and Macready. Even with contemporary accounts, however, we’ll never actually be able to see Richard Burbage, knowing rather the names of the characters he first played — Hamlet, Lear, Othello. Likewise, the Elizabethan comedian Richard Tarlton has had his monologues rendered mute by history. Or the Lord Chamberlain’s Men’s great comedic actor William Kempe, famous for his improvisational shit-talking at jeering audiences, though none of his jibes come down to us, even while it is believed that he was instrumental in the composition of one of his most famous characters – Dogberry. And the incomparable Edward Alleyn, confidant of Christopher Marlowe (and son-in-law of Donne), who was regarded as the greatest actor ever to grace the Rose Theater stage, and who mastered a subtle art so ephemeral that it disappeared the moment the play ended. Of this assemblage, Stanley Wells writes in Shakespeare & Co.: Christopher Marlowe, Thomas Dekker, Ben Jonson, Thomas Middleton, John Fletcher and the Other Players in His Story that they were “leading actors who would have been stars whenever they were born.” Well, maybe. Who is to say?

Part of the glory of the theater is its gossamer transience, the way in which each performance is different, how it can’t be replicated. A script is an inert thing, while the play is the thing forever marked by its own impermanence. In the years after Macready and Forrest died, Edison gave us the illusion of eternity, the idea that voices and images could be preserved. Nothing signaled a greater shift in human consciousness over the past millennium than the myth that, whether from very far away or long after somebody’s death (not dissimilar states), a person’s identity could be preserved in an eternal present. We can’t watch Forrest or Macready — but we can watch Olivier and Brando. It seems fundamentally different, a way of catching forever the ephemeral nature of performance, of preserving the fleeting. An illusion, though, for decay just takes longer to come. Film must have seemed a type of immortality, but it’s estimated that 75% of silent films are lost forever, and as many as 90% of all films made before 1929. Flammable nitrate and the fickle junking of Hollywood studios proved as final as death, because not only can you never watch Forrest or Macready, you also can’t see Lon Chaney in London After Midnight, Theda Bara in Cleopatra, or Humor Risk — the first film starring the Marx Brothers.

8. “It is one hundred years since our children left,” reads a cryptic, anonymous missive in the town record of the German city of Hamelin from 1384. It is the only evidence for the basis of that disturbing fairy tale about the Pied Piper, he of mottled cloth and hypnotic music who drew all of the children of the town away from their parents after he’d already depleted it of vermin. Fairy tales operate by strange dream logic, chthonic echoes from the distant past which exist half-remembered in our culture. Hypotheses have been proffered as to what the story may have been based on, why those children were taken. Explanations include that the youth of the city were victims of mass psychosis in a manner similar to the outbreaks of compulsive dancing which marked the late Middle Ages; it’s been suggested that they were victims of plague, that they’d been sold into slavery, or that they’d been recruited by a roving preacher to join one of the ill-fated “Children’s Crusades” that sent thousands of adolescents off to the Levant. Regardless of the basis for the fairy tale, its story has played out in our culture like an idée fixe, appearing in the nineteenth century not just in the Brothers Grimm’s Children’s and Household Tales, but also in poems by Johann Goethe and Robert Browning, with “Pied Piper” a shorthand for the siren’s manipulative call.

Who then “authored” the original story? Do we credit the Grimms’ reinvention as its origin, do we count the source material from which they drew their inspiration, does each text influenced by the tale stand on its own? Was it whatever forlorn ancestor made that annotation in Hamelin’s ledger? The Pied Piper himself? The nature of a fairy tale is that everyone is its reader but nobody is its author. Jack Zipes writes in Why Fairy Tales Stick: The Evolution and Relevance of a Genre that “We respond to these classical tales almost as if we were born with them, and yet, we know full well that they have been socially produced and induced and continue to be generated.” The Grimms and other Romantic-minded folklorists saw the fairy tale as arising spontaneously from the collective genius of the people, and there is a sense in which these anonymous tales are a collaborative venture of composition which takes place over centuries, millennia even. They are, in a sense, examples of lost literature finding itself, their creators’ anonymity a different form of oblivion.

The fairy tales which we all seem to intuitively know — Cinderella, Beauty and the Beast, Rumpelstiltskin — were collected by linguists like the Brothers Grimm, but it was in the twentieth century that folklorists were actually able to categorize them. Chief among the classification systems developed for fairy tales is the Aarne-Thompson-Uther Index, a complex method of charting the various narrative relationships between disparate stories, with an accompanying numeric mark to distinguish individual narratives. Cinderella, for example, is ATU 510A; Beauty and the Beast is ATU 425C. Scholars were thus able to chart stories to their potential beginnings. The tale of Cinderella finds its earliest iteration in the ancient Greek writings of the geographer Strabo; researchers at the University of Durham have been able to ascertain that the Beast first met Belle in a version from an astounding four thousand years ago. Jamie Tehrani and Sara Graça da Silva, the folklorists who used phylogenetic means to chart alterations in Indo-European languages so as to estimate the approximate age of various fairy tales, have claimed that The Smith and the Devil (of which the Faust legend is an iteration) may have first been told six millennia ago.

So many variations, so many lost stories, whispered only to infants in swaddling clothes over millennia. We can never know what exactly the earliest version of those stories was like; we’ll never know the names of those who composed them. Fairy tales pull at our soul like an amputee’s phantom limb, a dull ache of people long since gone whose stories we still tell even though we’ve forgotten the creators. Anonymous literature of this sort is the most intimate, told to children before bedtime, repeated to families preparing food around a kitchen table. “I would venture to guess that Anon,” wrote Virginia Woolf in A Room of One’s Own, “who wrote so many poems without signing them, was often a woman.” Such is the lost literature of our mothers, and our grandmothers, of “Anon” who is the greatest writer who ever lived (or didn’t). Nothing is as intrinsic to our sense of identity as these sorts of stories; when all else is stripped away from us — popular paperbacks, avant-garde experimentation, canonical literature — fairy tales will remain. While our libraries are inscribed with names like “Shakespeare” and “Cervantes,” we’ll never be able to chisel into stone the legion of those who composed Cinderella.

9. Since his brother died, Amadeo Garcia Garcia can only speak his native tongue to another human in his dreams. His language was once used to express love and anger, to console and castigate, to build, to instruct, to preserve; now it is relegated only to nocturnal phantoms. Over the last several decades, fewer and fewer people were able to understand Taushiro, till only Garcia’s immediate family knew the language, and now they’ve all died. A linguistic isolate spoken in the Peruvian Amazon, Taushiro is like all languages in that its syntax and grammar, its morphology and diction, necessarily shape its speakers’ perception of reality.

The journalist Nicholas Casey, who introduced Garcia’s story to the world in a New York Times article, notes that the “entire fate of the Taushiro people now lies with its last speaker, a person who never expected such a burden and has spent much of his life overwhelmed by it.” When it joins that graveyard of discarded language, alongside Akkadian and Manx, Ainu and Etruscan, what will pass is nothing so dry as a dictionary, but an entire vision of the world. Literature is language and all languages are literature, forged collaboratively in the discourse between people. When the only ones left to talk to are the ghosts of dead loved ones in dreams, it’s as if we are hearing the coda for an entire universe.

The linguist K. David Harrison explains in When Languages Die: The Extinction of the World’s Languages and the Erosion of Human Knowledge what exactly is at stake. As of the turn of this century, there were 6,912 distinct languages spoken in the world, albeit with the vast majority of those spoken by exceedingly few people (as with Taushiro and its speakership of one). He explains that 204 of those languages have fewer than ten speakers, and that an additional 344 have no more than a hundred. By the end of this century, the number of spoken languages will be half that previous number, if we’re lucky. Victim to globalization and “development,” the “immense edifice of human knowledge, painstakingly assembled over millennia by countless minds,” Harrison says, “is eroding, vanishing into oblivion.”

Garcia can give us indications of what the stories he heard from his parents were like, of how it feels to speak a language that doesn’t distinguish between numbers, or where diction is whittled down to a pristine simplicity, but we’ll never really know, since none of us can speak Taushiro. It was the anthropologist Edward Sapir and his student Benjamin Whorf who made the fullest argument as to the way these unique qualities produce thought, whereby language isn’t the result of ideas, but rather ideas are the result of language. Their estimation was that things like tense, person, subject-verb-object order, and so on don’t just convey information—they create it. Whorf was an insurance claims adjuster intimately aware of how much of reality depends on the language through which we sieve our experience; it was he who was responsible for the convention of “flammable” things being marked as such, as opposed to the grammatically correct “inflammable,” which he had discovered people took to mean the opposite.

Their Sapir-Whorf Hypothesis is succinctly stated by the first of the two in his 1929 The Status of Linguistics as a Science, when he argued that “Human beings do not live in the objective world alone… but are very much at the mercy of the particular language which has become the medium of expression for their society… The worlds in which different societies live are distinct worlds, not merely the same world with different labels attached.” English is not French is not Greek is not Farsi is not Punjabi. Taushiro is not English. Translation is feeling about in a darkened room and being able to discern the outline of the door, but it doesn’t give one the ability to step through into the other room (only perhaps to hear some muffled conversation with an ear pressed against the wall).

When a tongue has genuinely stopped moving, there is an insurmountable difference separating us from its literature. We’ll never quite get the fear in Elamite accounts of Alexander the Great invading the Achaemenid Empire; nor understand the vanquished pathos of the god Chemosh speaking in his native Moabite; or the longing implicit in the poetry of Andalusian Arabic. Each one of those languages had its own last speaker, as lonely as Garcia, like Lot surveying his destroyed home and thinking he was the last man on Earth, or as it’s said in Taushiro, “Ine aconahive ite chi yi tua tieya ana na’que I’yo lo’.”

10. A thousand virgin trees have been planted in the Nordmarka forest near Oslo. Just saplings today, the Norwegian spruces are flanked by older birch and fir trees, but the new plantings are marked to be felled in 2114, after they’ve grown for a century. At that point, they’ll be pulped and turned to paper, which will be transported to the Deichman Library, which houses a printing press that will be used to produce the first editions of books compiled over the preceding ten decades and maintained in the sanctum of a wooden space known as the “Silent Room.” This is the Scottish artist Katie Paterson’s Future Library Project, in which a different prominent author will contribute a novel every year until the completion date, with the understanding that nobody will be allowed to read their contribution until 2114.

Which means that none of you reading this today will ever be able to parse Margaret Atwood’s novel Scribbler Moon. What its plot is, who the characters she’s created are, or the themes entertained, is all a glorious absence, save for that evocative two-word title. Nor can you read David Mitchell’s From Me Flows What You Call Time, whose title makes it sound as if it were an imagined novel from his Cloud Atlas, the author remarking that taking part in the project was a “vote of confidence in the future.” The most recent contribution has come from native son Karl Ove Knausgaard, an untitled work which may or may not contain descriptions of breakfast that go on for pages. Knausgaard said of Paterson’s vision that it’s “such a brilliant idea, I very much like the thought that you will have readers who are still not born — it’s like sending a little ship from our time to them.”

A vote of confidence in the future is a beautiful description of a beautiful project, if an idiosyncratic one. It’s also a definition of literature, for even though the writer must primarily create for herself, literature still must transmit in the connections between minds. Literature is a vote of confidence in the future, in the present, in the past – it’s a vote of confidence in other people. The Future Library Project is in keeping with those theorists who are concerned with “deep time,” with the profoundly long view and arc of human history as it rushes away from us. The Long Now Foundation of San Francisco is one such organization that encourages all of us to think in the sorts of terms that Paterson does, to understand that innumerable civilizations have fallen and so shall ours, but that there is a way in which history ever moves forward.

Stewart Brand writes of a future Library of Alexandria in The Clock of the Long Now: Time and Responsibility, imagining a “10,000-Year Library… [in] a vast underground complex hewn out of rock – preferably a mountain.” The Long Now Foundation is tentatively taking suggestions for what a 10,000-Year Library might look like, what books should be included, and how we’re to understand the continuity of an institution that would be older than all of recorded human history. “Fantasy immediately calls up a refuge from the present,” Brand writes, “a place of weathered stone walls and labyrinthine stacks of books, at a remote location with far horizons. It is a place for contemplative research and small, immersive conferences on topics of centenary and millennial scope.” Surely he knows that there is something quixotic in this vision, just as Paterson no doubt understands that a century hence it’s quite possible that nobody will be left around to read those books in Oslo.

Literature is forever in the process of being lost, and it’s hubristic to assume that what we read today will be around to be read tomorrow. Nevertheless, that’s the beauty of Paterson and Brand’s dreams: they conceive of a way that all which is lost shall someday be found, that all which is feeble can be preserved. Theirs is a struggle of attrition against that most merciless of editors known as entropy. All literature is a similar resistance against time, mortality, finitude, limitation. To write is to commit an act of faith, to pray that the words you’ve assembled shall last longer than you, and that they’ll hopefully be found by at least someone who shall be, however briefly, changed.

Bonus Links: “Ten Ways to Live Forever,” “Ten Ways to Save the World,” “Ten Ways to Look at the Color Black”

Image Credit: Pexels/Bakr Magrabi.

Letter from the Capitol

-

The Confederate battle standard never flew within the Capitol Building — until January 6th, 2021. During the Civil War, that cankered, perfidious, malignant, cancerous cabal of traitors who grandiosely called themselves the “Confederate States of America” stabbed into the nation’s body at many northern strategic inflection points, and because of these, for a time, it seemed as if they might be triumphant. General John Hunt Morgan’s 2nd Kentucky Cavalry Regiment raided not just in that unfortunate border state, but in 1863 pierced into Indiana and Ohio as well. Morgan would finally surrender in Salineville, Ohio, which latitudinally is almost as far north as Connecticut. Even more incongruously, a year later, 21 veterans of Morgan’s Raid crossed over the Canadian border, that land then colonized by a Southern-sympathizing Great Britain, and attacked the sleepy hamlet of St. Albans, Vermont, robbing the bank and forcing the citizens at gunpoint to swear fealty to the Confederacy. The most violent (and most famous) invasion of the north was the traitor Robert E. Lee’s campaign in Pennsylvania, the goal of which was possibly to capture or burn down Philadelphia, but which was stopped at the infamous “High Water Mark” of the Confederacy when the Union General George G. Meade turned back the Army of Northern Virginia at Gettysburg—a battle that produced more than 50,000 American casualties in three days. During Lee’s campaign in southern Pennsylvania, free Black women and men had to flee north, as the Confederate raiders would send those they kidnapped into southern bondage.

For sheer absurdity, among the closest positions that the rebels ever got to the national capital was the Marshall House Inn in Alexandria, Virginia, where a Confederate flag was displayed that was so large and so tall that Lincoln could see it from the White House across the Potomac. A few weeks after Ft. Sumter, Union troops occupied the city, marching down red-bricked King Street, where slave markets had sold thousands of human beings less than ten miles from the Capitol Building. When Colonel Elmer Ephraim Ellsworth of the 11th New York Volunteer Infantry Regiment ascended to the roof of the hotel to remove the flag, the proprietor of the Marshall House shot him dead, making him the first Union officer killed in the Civil War. Despite being able to see the warped cross of the Confederate battle standard from the portico of the White House, Lincoln steadfastly refused to move the capital to safer points further north, arguing that the abandonment of Washington would be a capitulation to the seditionists.

“Let us be vigilant,” Lincoln telegraphed the worried Maryland governor in 1864, “but keep cool. I hope neither Baltimore nor Washington will be sacked.” Not for lack of desire, as that same year the Confederate Lieutenant General Jubal Early would attack Ft. Stevens in the Northwest Quadrant of the District of Columbia, in a battle that would cost close to nine hundred men. Long had the secessionists dreamed of Washington as the capital of their fake nation. In the decades before the Civil War some imagined a “Golden Circle,” which would be a veritable empire of slavery, with the South Carolina Senator Robert Barnwell Rhett imperially enthusing that “We will expand… over Mexico – over the isles of the sea — over the far-off Southern — until we shall establish a great Confederation,” their twisted nation stretching from Panama to the District of Columbia. Until last week, the Confederate flag had never flown within the Capitol.

There the man casually strolls across the red-and-blue mosaic floor of some antechamber in the Capitol, dressed in jeans and a black hoodie with a tan hunting vest; hoisted over his shoulder is the Confederate flag, its colors matching the tiles. It shouldn’t be lost on anybody that his uniform is the exact same “suspicious” article of clothing which Black pre-teenagers have been shot for wearing, even while this man is able to raid the very seat of government unmolested. Because America is many things, but it is not subtle, the man in the photograph is flanked by two gilt-framed oil paintings. One is of Charles Sumner, the Massachusetts Senator and abolitionist nearly caned to death by an opponent on the legislative floor of this very building, and who denounced the “unutterable wrongs and woes of slavery; profoundly believing that, according to the true spirit of the Constitution, and the sentiments of the fathers, it can find no place under our National Government” before Congress in 1852. The other portrait, almost predictably, is of John C. Calhoun, the South Carolina Senator and Vice President under Andrew Jackson, who in 1837 would declaim that the “relation now existing in the slaveholding states… instead of an evil, [is] a good. A positive good,” and would then gush about what a kind and benevolent slave-master he was. It would be hard to stage a more perfect encapsulation of the American dichotomy than our weekend warrior did on Wednesday, the continual pull between those better angels of our nature and the demons of history, who are never quite exorcized and are often in full possession of the body politic. A power in that grotesque image, the cosplaying Confederate momentarily anointing himself sovereign as he casually strolls through the chamber. Chillingly strolls, one might say, for all of these terrorists acted with impunity, as if they knew there would be no consequences to their actions. It reminds us that the mantra “This isn’t who we are” is at best maudlin and at worst a complete lie.

The siege against the Capitol on the day that Congress met for the constitutionally mandated and largely pro-forma ritual of officially counting the Electoral College votes to certify Joe Biden and Kamala Harris as the rightful victors of the 2020 presidential race can be examined from many directions, of course. Security experts can parse why there was such a profound failure at ensuring the safety of the session; political scientists can explain how social media algorithms have increasingly radicalized adherents of the far-right; historians can place movements like QAnon and the Proud Boys in a genealogy of American nativism and European fascism. Everyone should be able to say that ultimate responsibility lay with the stochastic terrorism promoted by the lame-duck president and his congressional sycophants in the Sedition Caucus, as well as his media enablers, with whom he is clasped in a toxic symbiotic relationship. All those approaches to analysis are valid, but I choose to look at the day as a literary critic and a resident of Washington D.C., because those things are what I am. But incongruity alone, even the uncanny alone, can’t quite provide the full critical lexicon for what we witnessed on our televisions that afternoon, the sense that, even more than an inflection point, we were viewers of a cracked apocalypse. How do we make sense of an attempted American putsch, the almost-nightmare of a coup?

Because the cultural idiom of this nation is Hollywood, and our interpretive lens is by necessity that of the movies, I can’t help but feel that much of what we saw seemed prefigured in film. The terrible logic of America is that our deepest nightmares and desires always have a way of enacting themselves, of moving from celluloid to reality. Look at the photograph of Jake Angeli, the self-styled “QAnon Shaman,” shirtless and bedecked in raccoon fur with buffalo horns upon his head (in pantomime of the very people upon whom this nation enacted genocide), his face smeared in the colors of the American flag, standing at the dais of the Speaker of the House, and tell me that it doesn’t look like a deleted scene from The Postman. Or examine the photograph of a smiling ginger man in a stocking cap emblazoned with “TRUMP,” casually waving as he jauntily strolls underneath the rotunda past John Trumbull’s massive painting Surrender of General Burgoyne, holding under his arm a pilfered wood podium decorated with a gold federal eagle, his hero’s adage that “when the looting starts, the shooting starts” apparently only to be selectively enforced. It looks like something from the post-apocalyptic movie The Book of Eli.

And then, most chillingly (and disturbingly underreported), there was the painstakingly assembled set of gallows, placed a bit beyond the equestrian monument to Ulysses S. Grant, who with great courage and strength broke the first iteration of the Ku Klux Klan; from those gallows one vigilante hung that most American of symbols, a noose. When remembered in light of the black-clad and masked men photographed with guns and zip-ties, it should make all of us consider just how much more tragic this violation, which was already a grotesque abomination, could have been. Horrifying to recall that the narrative conceit in Margaret Atwood’s The Handmaid’s Tale (and its television adaptation) that allowed for the theocratic dictatorship to ascend to power was the mass murder of a joint session of Congress. Sometimes #Resistance liberals get flak for their fears of fascism, but it would be easier to mock those anxieties if our country didn’t so often look like a science fiction dystopia.

It’s my suspicion that pop culture — that literature — is capable of picking up on some sort of cultural supersonic wavelength, those deep historical vibrations that diffuse in circles outward from our present into both past and future. There is something incantatory about those visions generated in word and special-effect, so that the eeriness of seeing marauding fascists overtake the Capitol grounds feels like something we’ve seen before. Think of all the times we’ve watched the monuments of Washington D.C. destroyed on film. Last week — while half paying attention to a block of cheesy apocalypse movies on the Syfy network that were supposed to count down the days left in the year — I saw the U.S.S. John F. Kennedy aircraft carrier pushed into the city by an Atlantic tsunami, where it rolled across the National Mall and crushed the White House in Roland Emmerich’s godawful 2012. I’ve seen the executive mansion punctuated by bombs and dotted with bullet holes in the spectacularly corny Antoine Fuqua movie Olympus Has Fallen, and according to Thrillist the Capitol itself has been laid waste in no fewer than nine movies, including The Day After Tomorrow, Earth vs. the Flying Saucers, G.I. Joe: Retaliation, Independence Day, Olympus Has Fallen, Superman II, White House Down, and X-Men: Days of Future Past. Probably the impulse to watch this sort of thing is equal parts vicarious thrill and enactment of deep fears. I remember that when I saw Independence Day (also by Emmerich, the Kurosawa of schlock) after it came out, the 1996 theater audience erupted into cheers and claps when the impenetrable wall of exploding alien flames incinerated its way across D.C. and shattered the white dome of the Capitol like an egg thrown into a fireplace. Was that applause an expressed opinion about Newt Gingrich? About Bill Clinton? Something darker?

After the terrorist attacks on 9/11, now almost twenty years ago, there was a profoundly shortsighted prediction that the hideous spectacle of Americans seeing the World Trade Center collapse would forever cure us of our strange desire to see our most famous buildings, and the people within them, destroyed. A perusal of the Olympian corpus of the Marvel Cinematic Universe (seemingly the only entertainment which Hollywood bothers to produce anymore) will testify that such an estimation was, to put it lightly, premature. French philosopher Guy Debord could have told us this in 1967 in his Society of the Spectacle, wherein he noted that “all of life presents itself as an immense accumulation of spectacles. Everything that was directly lived has moved away into a representation,” to which it could be added that the inverse is also accurate – everything that has been represented has seemingly moved into life. Which doesn’t mean that scenes like those which we witnessed on Wednesday aren’t affecting – no, the opposite is true. People reach for the appraisal that “it looks like a movie” not to be dismissive, but rather because cinema is the most powerful mythopoesis that we’re capable of.

What’s needed, of course, is a vocabulary commensurate with what exactly all of us saw. A rhetoric capable of grappling with defilement, with violation, with desecration, but because all we have is movies, that’s what we’re forced to draw upon. They gave us the ability to think about the unthinkable before it happened; the chance to see the unseeable before it was on our newsfeeds. If the vision of the screen is anemic, that’s not necessarily our fault — we measure the room of our horror with the tools which we’ve inherited. Few square miles of our civic architecture are quite so identified with our quasi-sacred sense of American civil religion as the grounds of the U.S. Capitol, and so the spectacle of a clambering rabble (used as a Trojan Horse for God knows what more nefarious group of actors) calls to mind fiction far more than it does anything which actually has happened. That’s the cruelty of our current age — that so frequently our lives resemble the nightmare more than the awakening. The Capitol siege was very much an apocalypse in the original Greek sense of the word: an unveiling, a rupture in normal history that signals why all of this feels so cinematic — though it’s hard to tell if it’s the beginning or ending of the movie, and what genre we’re exactly in. As Timothy Denevi writes about the assault in LitHub, “What is a culmination, after all, except the moment in which everything that could happen finally does? Where are we supposed to go from there?”

Important to remember that everything which could happen has already happened before, at some point. That’s what the bromide about this not being who we are gets wrong — this is, at least partially, who we’ve always been, albeit not in this exact or particular way. What happened at the eastern edge of the Mall this week has shades of the Wilmington Insurrection of 1898, in which a conspiracy of white supremacists plotted against the Black leadership of the North Carolina city and ushered in Jim Crow at the cost of hundreds of lives (and then untold millions over the next century). The assault on the Capitol has echoes of the Election Riots of 1874, when members of the White League attacked Black voters in Eufaula, Alabama, leaving behind dozens of wounded women and men, and seven corpses. These are two examples of hundreds of similar events that shamefully litter our nation’s history, though most citizens have never heard of them. Hell, most people didn’t know about the Tulsa race massacre of 1921 — still less than a century ago — until HBO’s Watchmen dramatized it. The issue is exactly the same: white supremacists think that only their votes count, and will do anything to enforce that conviction.

That the supporters of the man who currently occupies the Oval Office believe any number of insane and discounted conspiracy theories about election fraud — claims rejected in some sixty lawsuits and a 9-0 Supreme Court decision — is in some ways to miss the point. Listen to their language — the man who instigated Wednesday’s riot emphasizes that he simply wants to count “legal” votes. Ask yourself what that means, and then realize why the fevered rage of his mob focuses on places like Detroit, Philadelphia, and Atlanta. If the only people who’d been allowed to vote were white people, then Trump would have won the election in his claimed landslide — that’s what he and his supporters mean by “legal” votes. The batshit insane theories are just fan fiction to occlude the actual substance of their political belief. Such anti-democratic sentiment is also an American legacy, an American curse. The connection between what happened on Capitol Hill and in Wilmington, Eufaula, and Tulsa; or Fort Bend, Texas in 1888; or Lake City, South Carolina in 1897; or Ocoee, Florida in 1920; or Rosewood, Florida in 1923 (you can look them all up); or any of thousands of other incidents, may seem tangential. It isn’t.

When I lived in Massachusetts there was a sense of history that hung thick in the air, all of those centuries back to the gloomy Puritans and their gothic inheritance. Historical markers punctuated the streets of Boston and her suburbs, and there was that rightfully proud Yankee ownership of the American Revolution. Our apartment was only a mile or so from the Lexington battle green where that shot heard around the world rang out, and I used to sometimes grab a coffee and read a magazine on one of its benches in what was effectively a pleasant park, battle green thoughts in a green shade. Part of me wanted to describe this part of the country as haunted, and perhaps it is, but its ghosts seem to belong to a distant world, a European world. By contrast, when I moved to Washington D.C., the American specters moved into much clearer focus. If Massachusetts seems defined by the Revolution, then the District of Columbia, and Maryland, and Virginia are indelibly marked by the much more violent, more consequential, more important, and more apocalyptic conflagration of the Civil War. In his classic Love and Death in the American Novel, the critic Leslie Fiedler described the nation as “bewilderingly and embarrassingly, a gothic fiction, nonrealistic and negative, sadist and melodramatic — a literature of darkness and the grotesque in a land of light and affirmation.” Our national story is a Jekyll and Hyde tale about the best and worst aspirations at conflict within the Manichean breast of a nation which fancied itself Paradise but ended up somewhere points further south.

Because I have a suspicion that poetry is capable of telling the future, that everything which can or will happen has already been rendered into verse somewhere (even if obscured), a snatch of verse from a Greek poet accompanied my doomscrolling this week. “Why isn’t anything going on in the senate?” Constantin Cavafy asked in 1898, “Why are the senators sitting there without legislating?” I thought about it when I first heard that the mob was pounding at the Capitol door; it rang in my brain when I saw the photographs of them parading through that marble forest of Statuary Hall, underneath that iron dome painted a pristine white. “Because the barbarians are coming today,” Cavafy answered himself. I thought about it when I looked at the garbage strewn through the halls, the men with their feet up on legislators’ desks, cackling at the coup they’d pulled. “What’s the point of senators making laws now? Once the barbarians are here, they’d do the legislating.” For now, it seems that the barbarians have either been pushed back or left of their own accord. In that interim, what will be done to make sure that they don’t return? Because history and poetry have taught us that they always do.

Image credit: Pexels/Harun Tan.

On Dreams and Literature

-

“We are such stuff/As dreams are made on, and our little life/Is rounded with a sleep.” — William Shakespeare, The Tempest (1611)

“A candy-colored clown they call the sandman/tiptoes to my room every night/just to sprinkle stardust and to whisper/’Go to sleep, everything is alright.’” — Roy Orbison, “In Dreams” (1963)

Amongst the green-dappled Malvern Hills, where sunlight spools onto spring leaves like puddles of water in autumn, a peasant named Will is imagined to have fallen asleep on a May day when both the warmth and the light induce dreams. Sometime in the late fourteenth century (as near as we can tell, between 1370 and 1390), the poet William Langland wrote of a character in Piers Plowman who shared his name and happened to fall asleep. “In a summer season,” Langland begins, “when soft was the sun, /I clothed myself in a cloak as I shepherd were… And went wide in the world wonders to hear.” Presentism is a critical vice, a fallacy of misreading yourself into a work, supposedly especially perilous if the work is nearly seven centuries old. Hard not to commit that sin sometimes. “But on a May morning,” Langland writes, and I note his words those seven centuries later on a May afternoon, when the sun is similarly soft, and the inevitable drowsiness of warm contentment takes over my own nodding head and drowsy eyes, so that I can’t help but see myself in the opening stanza of Piers Plowman.

“A marvel befell me of fairy, methought./I was weary with wandering and went me to rest/Under a broad bank by a brook’s side,/And as I lay and leaned over and looked into the water/I fell into a sleep for it sounded so merry.” Good close readers that we are all supposed to be, it’s imperative that we don’t read into the poem things that aren’t actually in it, and yet I can’t help but imagine what that daytime nocturne was like. The soft gurgle of a creek through English fields, the feeling of damp grass underneath dirtied hands, and of scratchy cloak against unwashed skin; the sunlight tanning the backs of his eyelids; that dull, corpuscular red of daytime sleep, the warmth of day’s glow flushing his cheeks, and the almost preternatural quiet save for some bird chirping. The sort of slumber you fall into on a train that rocks you in the sunlight of late afternoon. It sounds nice.

Piers Plowman is of a medieval poetic genre known as a dream allegory, or even more enticingly as a dream vision. Most famous of these is Dante Alighieri’s The Divine Comedy, whose central character (who, as in Piers Plowman, shares the poet’s name) discovers himself in a less pleasant wood than does Will, for “When I had journeyed half of our life’s way,/I found myself within a shadowed forest,/for I had lost the path that does not stray.” The Middle Ages didn’t originate the dream vision, but they were the golden age of the form, when poets could express mystical truths in journeys that only happened within heads resting upon rough, straw-stuffed pillows. Langland’s century alone saw Geoffrey Chaucer’s Parliament of Fowls, John Gower’s Vox Clamantis, John Lydgate’s The Temple of Glass, and the anonymously written Pearl (by the same lady or gent who wrote Sir Gawain and the Green Knight). Those are only English examples (or I should say examples by the English; Gower was writing in Latin), for the form was popular in France and Italy as well. A.C. Spearing explains in Medieval Dream-Poetry that while sleeping “we undergo experiences in which we are freed from the constraints of everyday possibility, and which we feel to have some hidden significance,” a sentiment which motivated the poetry of Langland and Dante.

Dante famously claimed that his visions — of perdition, purgatory, and paradise — were not dreams, and yet everything in The Divine Comedy holds to the genre’s conventions. Both Langland and Dante engage the strange logic of the nocturne, the way in which the subconscious seems to rearrange and illuminate reality in a manner that the prosaicness of overrated wakefulness simply cannot. Dante writes that the “night hides things from us,” but his epic is itself proof that the night can just as often reveal things. Within The Divine Comedy Dante is guided through the nine circles of hell by the Roman poet Virgil, from the antechamber of the inferno wherein dwell the righteous pagans and classical philosophers, down through the frozen environs of the lowest domain whereby Lucifer forever torments and is tormented by that trinity of traitors composed of Cassius, Brutus, and Judas. Along the way Dante is privy to any number of nightmares, from self-disemboweling prophets to lovers forever buffeted around on violent winds (bearing no similarity to a gentle Malvern breeze). In the Purgatorio and Paradiso he is spectator to far more pleasant scenes (though telling that more people have read Inferno, as our nightmares are always easiest to remember), whereby he sees a heaven that’s the “color that paints the morning and evening clouds that face the sun,” almost a description of the peacefulness of accidentally nodding off on an early summer day.

Both The Divine Comedy and Piers Plowman express verities accessed by the mind in repose; Langland’s poem, though it begins not in a dark wood but in a sunny field, embodies mystical apprehensions as surely as does Dante’s. A key difference is that Langland’s allegory is so obvious (as anyone who has seen the medieval play Everyman can attest is true of the period). Characters named after the Seven Deadly Sins, or called Patience, Clergy, and Scripture (and Old Age, Death, and Pestilence) all interact with Will — whose name has its own obvious implications. By contrast, Dante’s characters bear a resemblance to actual people (or they are actual people, from Aristotle in Limbo to Thomas Aquinas in Heaven), even while the events depicted are seemingly more fantastic (though in Piers Plowman Will witnesses both the fall of man and the harrowing of hell). Both are, however, written in the substance of dreams. Forget the didactic obviousness of allegory, the literal cipher that defines that form, and believe that in a field between Worcestershire and Hertfordshire Will did plumb the mysteries of eternity while sleeping. What makes the dream vision a chimerical form is that maybe he did. That’s the thing with dreams and their visions; there is no need to suspend disbelief. We’re not in the realm of fantasy or myth, for in dreams order has been abolished, anything is possible, and nothing is prohibited, not even flouting the arid rules of logic.

There is a danger to this, for to dream is to court the absolute when we’re at our most vulnerable, to find eternity in a sleep. Piers Plowman had the taint of heresy about it, as it inspired the revolutionaries of 1381’s Peasants’ Rebellion, as well as the adherents of a schismatic group of proto-Protestants known as Lollards. Arguably the crushing of the rebellion led to an attendant attack by authorities on vernacular literature like Piers Plowman, in part explaining the general dismalness of English literature in the fifteenth century (which, excluding Malory and Skelton, is the worst century of writing). Scholars have long debated the relationship between Langland and Lollardy, but we’ll let others more informed tease out those connections. The larger point is that dreaming can get you in trouble. That’s because dreaming is the only realm in which we’re simultaneously complete sovereign and lowly subject; the cinema we watch when our eyes are closed. Sleep is a domain that can’t be reached by monarch, tyrant, state, or corporation — it is our realm.

Dreams have a radical import in George Orwell’s dystopian classic 1984. You’ll recall that in that novel the main character, Winston Smith, is a minor bureaucrat in totalitarian Oceania. Every aspect of Smith’s life is carefully controlled; his life is under total surveillance, all speech is regulated (or made redundant by Newspeak), and even the truth is censored, altered, and transformed (which the character himself has a role in). Yet his dreams are one aspect of his life which the government can’t quite control, for Smith “had dreamed that he was walking through a pitch-dark room. And someone sitting to one side of him had said as he passed: ‘We shall meet in the place where there is no darkness.’” Sleep is an anarchic space where the dreamer is at the whims of something much larger and more powerful than themselves, and by contrast where sometimes the dreamer finds themselves transformed into a god. A warning here though — when total independence erupts from our skulls into the wider world (for after all, it is common to mutter in one’s sleep) there is the potential that our unconsciousness can betray us. Smith, after all, is always monitored by his telescreen.

Whether it’s the fourteenth century or the twenty-first, dreaming remains bizarre. Whether we reduce dreams to messages from the gods and the dead, or repressed memories and neuroses playing in the nursery of our unconscious, or simply the random electric flickering of neurons, the fact that we spend long stretches of our life submerging ourselves in bizarre parallel dimensions is so odd that I can’t help but wonder why we don’t talk about it more (beyond painful conversations recounting dreams). So strange is it that we spend a third of our lives journeying to fantastic realms where every law of spatiality and temporality and every axiom of identity and principle of logic is flouted, that you’d think we’d conduct ourselves with a bit more humility when dismissing that which seems fantastic in the experience of those from generations past who’ve long since gone to eternal sleep. Which is just to wonder: when William Langland dreamt, is it possible that he dreamt of me?

Even with our advancements in the modern scientific study of the phenomenon, its mysteriousness hasn’t entirely dissipated. If our ancestors saw in dreams portents and prophecies, then this oracular aspect was only extended by Sigmund Freud’s The Interpretation of Dreams. He who inaugurated the nascent field of psychoanalysis explained dreams as a complex tapestry of wish fulfillment and sublimation, an encoded narrative that mapped onto the patient’s waking life and that could be deciphered by the trained therapist. Freud writes that there “exists a psychological technique by which dreams may be interpreted and that upon the application of this method every dream will show itself to be a senseful psychological structure which may be introduced into an assignable place in the psychic activity of the waking state.” Not so different from Will sleeping in his field. The origin may be different — Langland sees in dreams visions imparted from God, while Freud finds their source in the holy unconscious — but the idea isn’t dissimilar. Dreaming imparts an ordered and ultimately comprehensible message, even though the imagery may be cryptic.

Freud has been left to us literary critics (who’ve even grown tired of him over the past generation), and science has abandoned terms like id, ego, and superego in favor of neurons and biochemistry, synapses and serotonin. For neurologists, dreaming is a function of the prefrontal cortex powering down during REM sleep, and of the hippocampus severing its waking relationship with the neocortex, allowing for a bit of a free-for-all in the brain. Scientists have discovered much about how and why dreaming happens — what parts of the brain are involved, what cycles of wakefulness and restfulness a person will experience, when dreaming evolved, and what functions (if any) it could possibly serve. Gone are the simple reductionisms of dream interpretation manuals with their categorized entries about your teeth falling out or showing up naked to your high school biology final. Neuroscientists favor a more sober view of dreaming, whereby random bits of imagery and thought thrown out by your groggy, chemically induced brain rearrange themselves into a narrative which isn’t really a narrative. Still, as Andrea Rock notes in The Mind at Night: The New Science of How and Why We Dream, “it’s impossible for scientists to agree on something as seemingly simple as the definition of dreaming.” If we’re such stuff as dreams are made on, the forensics remain inconclusive.

Not that dreaming is exclusively a human activity. Scientists have been able to demonstrate that all mammals have some form of nocturnal hallucination, from gorillas to duck-billed platypuses, dolphins to hamsters. Anyone with a dog has seen their friend fall into a deep reverie; their legs pump as if they’re running, and occasionally they’ll even startle-bark themselves awake. One summer day my wife and I entertained our French bulldog by having her chase a sprinkler’s spray. She flapped her jowly face at the cool gush of water with a happiness that no human is capable of. That evening, while she was asleep, she began to flap her mouth again, finally settling into a deeper reverie where she just smiled. Dreaming may not be an exclusively mammalian affair — there are indications that both birds and reptiles dream — though it’s harder to study creatures more distant from us. Regardless, the evidence is that animals have been sleeping, perchance to dream, for a very long time. Rock writes that “Because the more common forms of mammals we see today branched off from the monotreme line about 140 million years ago… REM sleep as it exists in most animals also emerged at about the time that split occurred.” We don’t know if dinosaurs dreamt, but something skittering around and out of the way of their feet certainly did.

If animal brains are capable of generating pyrotechnic missives, and if dreaming goes back to the Cretaceous, what then of the future of dreaming? If dogs and donkeys, cats and camels are capable of dreaming, will artificial intelligences dream? This is the question asked by Philip K. Dick’s Do Androids Dream of Electric Sheep?, which was itself the source material for Ridley Scott’s science fiction film classic Blade Runner. Dick was an author as obsessed with illusion and reality, the unattainability of truth, and doubt as any writer since Plato. In his novel’s account of the bounty hunter Rick Deckard’s decommissioning of sentient androids, there is his usual examination of what defines consciousness, and the ways in which its illusions can present realities. “Everything is true… Everything anybody has ever thought,” one character says, a pithy encapsulation of the radical potential of dreams. Dick imagined robots capable of dreaming with such verisimilitude that they misapprehended themselves to be human, but as it turns out our digital tools are already able to slumber in silicon.

Computer scientists at Google have investigated what images are produced by a complex artificial neural network as it “dreams,” allowing the devices to filter the various images they’ve encountered and to recombine, recontextualize, and regenerate new pictures. In The Atlantic, Adrienne LaFrance writes that the “computer-made images feature scrolls of color, swirling lines, stretched faces, floating eyeballs, and uneasy waves of shadow and light. The machines seemed to be hallucinating, and in a way that appeared uncannily human.” Artificial intelligence has improved to an unsettling degree in just the past decade, and though a constructed mind capable of easily passing the Turing Test has yet to be created, that seems more an issue of time than possibility. If all of the flotsam and jetsam of the internet could coalesce into a collective consciousness emerging from the digital primordial like some archaic demigod birthing Herself from chaos, what dreams could be generated therein? Or if it’s possible to program a computer, a robot, an android, an automaton to dream, then what oracles of Artificial Intelligence could be birthed? I can’t help but thrill to the idea that we’ll be able to program a desktop version of the Delphic Oracle analyzing its own microchipped dreams. “The electric things have their life too,” Dick wrote.
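For readers curious about what that machine “dreaming” amounts to in practice, the popular technique (often called DeepDream) is essentially gradient ascent on the image itself: nudge the pixels until some chosen layer of a pretrained network fires more loudly. What follows is a minimal sketch of that idea in Python, using PyTorch and a stock VGG16; the layer choice, step size, and iteration count are illustrative assumptions of mine, not Google’s actual pipeline.

import torch
import torchvision.models as models

# A pretrained image classifier; its middle layers respond to textures,
# eyes, feathers, and the other motifs that surface in "dream" images.
# (Loading DEFAULT weights downloads them on first run.)
vgg = models.vgg16(weights=models.VGG16_Weights.DEFAULT).features.eval()

def dream(image, layer_index=20, steps=30, lr=0.05):
    # Gradient ascent: adjust the image so that the chosen layer's
    # activations grow, amplifying whatever that layer already "sees."
    image = image.clone().requires_grad_(True)
    for _ in range(steps):
        activation = image
        for i, module in enumerate(vgg):
            activation = module(activation)
            if i == layer_index:
                break
        loss = activation.norm()  # how loudly is this layer firing?
        loss.backward()
        with torch.no_grad():
            image += lr * image.grad / image.grad.abs().mean()
            image.grad.zero_()
    return image.detach()

# Real DeepDream starts from a photograph; random noise works as a demo.
hallucination = dream(torch.rand(1, 3, 224, 224))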

We’ve already developed AI capable of generating completely realistic-looking but totally fictional women and men. Software engineer Philip Wang created ThisPersonDoesNotExist.com, a program which does exactly what its title advertises: it gives you a picture of a person who doesn’t exist. Trained on a trove of actual photographs, Wang’s site uses something called a generative adversarial network to create pictures of people who never lived. If you refresh the site, you’ll see that the humans dreamt up by the neural network aren’t cartoons or caricatures, but photorealistic images so accurate that they look like they could be used for a passport. So far I’ve been presented with an attractive butch woman with sparkling brown eyes, a broad smile, and short curly auburn hair; a strong-jawed man in his 30s with an unfortunate bowl cut and a day’s worth of stubble who looks a bit like swimmer Michael Phelps; and a nerdy-looking Asian man with a pleasant smile and horn-rimmed glasses. Every single person the AI presented looked completely average and real, so that if I encountered them in the grocery store or at Starbucks I wouldn’t think twice, and yet not a single one of them was real. I’d read once (though I can’t remember where) that every invented person we encounter in our dreams has a correlate in somebody we once met briefly in real life, a waitress or a store clerk whose paths we crossed for a few minutes dredged up from the unconscious and commissioned into our narrative. I now think that all of those people come from ThisPersonDoesNotExist.com.
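The adversarial idea itself is simple enough to sketch: a generator invents images from random noise while a discriminator learns to call them fake, and each network improves by trying to defeat the other. The toy training step below, again in Python with PyTorch, is only a schematic of that tug-of-war; the tiny fully-connected networks and the 28-by-28 image size are assumptions for illustration, and the StyleGAN model behind sites like Wang’s is vastly larger and trained on enormous photograph collections.

import torch
import torch.nn as nn

latent_dim = 64

# Generator: random noise in, flattened 28x28 "image" out.
generator = nn.Sequential(
    nn.Linear(latent_dim, 256), nn.ReLU(),
    nn.Linear(256, 28 * 28), nn.Tanh(),
)

# Discriminator: image in, estimated probability of being real out.
discriminator = nn.Sequential(
    nn.Linear(28 * 28, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1), nn.Sigmoid(),
)

loss_fn = nn.BCELoss()
g_opt = torch.optim.Adam(generator.parameters(), lr=2e-4)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=2e-4)

def train_step(real_images):
    batch = real_images.size(0)
    fake_images = generator(torch.randn(batch, latent_dim))

    # Teach the discriminator: real photos score 1, invented ones score 0.
    d_opt.zero_grad()
    d_loss = (loss_fn(discriminator(real_images), torch.ones(batch, 1))
              + loss_fn(discriminator(fake_images.detach()), torch.zeros(batch, 1)))
    d_loss.backward()
    d_opt.step()

    # Teach the generator: fool the discriminator into answering 1.
    g_opt.zero_grad()
    g_loss = loss_fn(discriminator(fake_images), torch.ones(batch, 1))
    g_loss.backward()
    g_opt.step()

# Stand-in "real" data; an actual face model trains on photographs.
train_step(torch.rand(16, 28 * 28) * 2 - 1)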

One fictional person who recurs in many of our dreams is “This Man,” a pudgy, unattractive balding man with thick eyebrows and an approachable smile who was the subject of Italian marketer Andrea Natella’s now-defunct website “Ever Dream This Man?” According to Natella, scores of people had dreams about the man (occasionally nightmares) across all continents and in dozens of countries. Blake Butler, writing in Vice Magazine, explains that “His presence seems both menacing and foreboding at the same time, unclear in purpose, but haunting to those in whom he does appear.” This Man doesn’t particularly look like any famous figure, nor is he so generic that his presence can be dismissed as mere coincidence. There’s a spooky resonance to this guy who looks like he manages a diner on First Avenue and 65th emerging simultaneously in thousands of peoples’ dreams (his cameo is far less creepy once you’re aware of the website). Multiple hypotheses were proffered, ranging from This Man being the product of the collective unconscious as described by Carl Jung to Him being a manifestation of God appearing to people from Seattle to Shanghai (my preferred theory). As it turns out, he was simply the result of a viral marketing campaign.

Meme campaigns aside, the sheer weirdness of dreams can’t quite exorcize them of a supernatural import — we’re all looking for portents, predictions, and prophecies. Being submerged into what’s effectively another universe can’t help but alter our sense of reality, or at least make us question what exactly that word means. For years now I’ve had dreams that take place in the same recurring location — a detailed, complex, baroque alternate version of my hometown of Pittsburgh. This parallel-universe Pittsburgh roughly maps onto the actual place, though it appears much larger and there are notable differences. Downtown, for example, is a network of towering, interconnected skyscrapers all accessible from within one another (there’s a good bookstore there); a portion of Squirrel Hill is given over to a Wild West experience set. It’s not that I have the same dreams about this place; it’s that the place is the same, regardless of what happens to me in those dreams when I’m there. So much so that I experience the uncanny feeling of not dreaming, but rather of sliding into some other dimension. An eerie feeling comes to me from a life closer than my own breath, existing somewhere in the space between atoms, and yet totally invisible to my conscious eye.

Such is the realm of seers and shamans, poets and prophets, as well as no doubt yourself — the dream realm is accessible to everyone, its internal messages arriving from a universe hidden within, whereby the muse and oracle are within your own skull. Long have serendipitous missives arisen from our slumber, even while we debate their ultimate origin. Social activist Julia Ward Howe wrote “Battle Hymn of the Republic” when staying at Washington D.C.’s Willard Hotel in 1861, the “dirtiest, dustiest filthiest place I ever saw.” While “in a half dreaming state” she heard a group of Union soldiers marching down Pennsylvania Avenue singing “John Brown’s Body,” and based on that song Howe composed her own hymn while in a reverie. Howe’s dreaming was in keeping with a melancholic era enraptured by spiritualism and occultism, for as she recalled, “attacks of versification had visited me in the night.” The apocalyptic Civil War altered peoples’ dreams, it would seem. Jonathan White explores the sleep-world of nineteenth-century Americans in his unusual and exhaustive study Midnight in America: Darkness, Sleep, and Dreams During the Civil War, arguing that peoples’ “dream reports were often remarkably raw and unfiltered… vividly bringing to life the horrors of the conflict; for others, nighttime was an escape from the hard realities of life and death in wartime.”

Every era imparts its own images, symbols, and themes into dreams, so that collective analysis can tell us about the concerns of any given era. White writes that during the Civil War people used dreams to relive “distant memories or horrific experiences in battle, longing for a return to peace and life as they had known it before the war, kissing loved ones [they] had not seen for years, communing with the dead, traveling to faraway places they wished they could see in real life,” which, even if the particulars may be different, is not so altered from our current reposes. One of the most famous of Civil War dreamers was Abraham Lincoln, whose own morbid visions were in keeping with slumber’s prophetic purposes. Only days before his assassination, Lincoln recounted to his bodyguard that he’d had an eerily realistic dream in which he wandered from room to room in the White House. “I heard subdued sobs,” Lincoln said, as “if a number of people were weeping.” The president was disturbed by the sound of mourning, “so mysterious and so shocking,” until he arrived in the East Room. “Before me was a catafalque, on which rested a corpse wrapped in funeral vestments,” the body inside being that of Lincoln himself. Such dreams are significant — as the disquieting quarantine visions people have had over the past two months can attest. We should listen — they have something to tell us.

Within literature dreams always seem to have something to say, a realm of the fantastic visited in works as diverse as L. Frank Baum’s Wizard of Oz, Charles Dickens’ A Christmas Carol, Neil Gaiman’s Sandman, and Lewis Carroll’s Alice in Wonderland. The dream kingdom is a place where the laws of physics are muted, where logic and reason no longer hold domain, and the wild kings of absurdity are allowed to reign triumphant. Those aforementioned works are ones in which characters like Dorothy, Ebenezer Scrooge, Morpheus, and Alice are subsumed into a fantastical dream realm, but there are plenty of books with more prosaic dream sequences, from Mr. Lockwood’s harrowing nightmare in Emily Brontë’s Wuthering Heights to Raskolnikov’s violent childhood dreams in Fyodor Dostoevsky’s Crime and Punishment. “I’ve dreamt in my life dreams that have stayed with me ever after, and changed my ideas,” writes Brontë, “they’ve gone through and through me, like wine through water, and altered the color of my mind.” Then there is the literature that emerges from dreams, the half-remembered snippets and surreal plot lines, the riffs of dialogue and the turns of phrase that are birthed from the baked brain of night. Think of the poppy reveries of Thomas De Quincey’s Confessions of an English Opium-Eater or the delicious purpose of Samuel Taylor Coleridge’s “Kubla Khan,” written in a similar drug haze until the poet was interrupted by that damned person from Porlock.

Spearing writes that “from the time of the Homeric poems down to the modern novel, it is surely true that the scene of the great bulk of Western literature has not been the internal world of the mind, in which dreams transact themselves, but the outer, public world of objective reality,” but this misses an important point. All novels actually occur in the internal world of the mind, no matter how vigorous their subjects may be. I’ll never be able to see the exact same cool colors of Jay Gatsby’s shirts that you envision, nor will I hear the exact timbre of Mr. Darcy’s voice that you imagine, in the same way that no photographs or drawings or paintings can be brought back from the place you go to when you sleep. Dreaming and reading are unified in being activities of fully created, totally self-contained realities. Furthermore, there is a utopian freedom in this, for that closed-off dimension, that pinched-off universe which you travel to in reveries nocturnal or readerly is free of the contagion of the corrupted outside world. There are no pop-up ads in dreams, there are no telemarketers calling you. Even our nightmares are at least our own. Here, as in the novel, the person may be truly free.

Dreaming is the substance of literature. It’s what comes before, during, and after writing and reading, and there can be no fiction or poetry without it. There is no activity in waking life more similar to dreaming than reading (and by proxy writing, which is just self-directed reading). Both necessitate the complete creation of a totally constructed universe contained within your own head and accessible only to the individual. The only difference between reading and dreaming is who directs the story. As in a book, so in our slumber: the world which is entered is one that is singular to the dreamer or reader. What you see when you close your eyes is forever foreign to me, as I may never enter the exact same story-world that you do when you crack open a novel. “Life, what is it but a dream?” Carroll astutely asks.

We spend a third of our day in dream realms, which is why philosophers and poets have always rightly been preoccupied with them. Dreams necessarily make us question that border between waking and sleeping, truth and falsity, reality and illusion. That is the substance of storytelling as well, and that shared aspect between literature and dreaming is just as important as the oddity of existing for a spell in entirely closed off, totally self-invented, and completely free worlds. What unites the illusions of dreams and our complete ownership of them is subjectivity, and that is the charged medium through which literature must forever be conducted. Alfred North Whitehead once claimed that all of philosophy was mere footnotes to Plato — accurate to say that all of philosophy since then has been variations on the theme of kicking the tires of reality and questioning whether this exact moment is lived or dreamt.

The pre-Socratic metaphysician Gorgias was a radical solipsist who thought that all the world was the dream of God, and the dreamer was himself. Plato envisioned our waking life as but a pale shadow of a greater world of Forms. René Descartes in Meditations on First Philosophy forged a methodology of radical doubt, whereby he imagined that a malicious demon could potentially deceive him into thinking that the “sky, the air, the earth, colors, shapes, sounds and all external things are merely the delusions of dreams which he has devised to ensnare my judgment. I shall consider myself as not having hands or eyes, or flesh, or blood or senses, but as falsely believing that I have all these things.” So, from the assumption that everything is a dream, Descartes tried to latch onto anything that could be certain. Other than his own mind, he wasn’t able to find much. In dreams there is the beginning of metaphysics, for nothing else compels us to consider that the world which we see is not the world which there is, and yet such philosophical speculation needs no philosophers, since children engage in it from the moment they can first think.

When I was a little kid, I misunderstood that old nursery rhyme “Row Your Boat.” When it queried whether “life is but a dream,” I took that literally to mean that all which we experience is illusion, specter, artifice. In my own abstract way I assumed that, according to the song, all of this which we see: the sun and moon, the trees and flowers, our friends and family, is but a dream. And I wondered what it would be like when I woke up, and to whom I would recount that marvelous dream. “I had the strangest dream last night,” I imagined telling faces unknown with names unconveyed. I assumed the song meant that all of this, for all of us, was a dream — and who is to know what that world might look like when you wake up? Such a theme is explored in pop culture from the cyberpunk dystopia The Matrix to the finale of the sitcom Newhart, because this sense of unreality, of dreams impinging on our not-quite-real world, is hard to shake. Writing about a classic metaphysical thought-experiment known as the Omphalos Argument (from the Greek for “navel,” as relating to a question of Eden), the philosopher Bertrand Russell wrote in The Analysis of Mind that “There is no logical impossibility… that the world sprang into being five minutes ago, exactly as it then was, with a population that ‘remembered’ a wholly unreal past.” Perhaps we’ve just dozed off for a few minutes then? Here’s the thing though — even if all of this is a dream — it doesn’t matter. Because in dreams you’re innocent. In dreams you’re free.

Image credit: Pexels/Erik Mclean.

A Year in Reading: Ed Simon

-

So. How are we expected to begin these things? How can I write about reading in this year of all years, this Annus Horribilis of American authoritarianism, American division, American plague? There’s no judgment in that question – it’s genuine. Because to not state the obvious would be callous: at the time of this writing there have been a quarter of a million deaths that were largely preventable, if there had only been a modicum of concern from both the government and the collective citizenry.

At the same time, to wallow in all of that misfortune, the pandemic death count rising, the spate of police murders of Black citizens, the brazen incitements to violence from the thankfully defeated president, could just be more fodder for doomscrolling (the term popularized by the journalist Karen K. Ho). No doubt you’re familiar with this activity, for the correct answer to the question of “What did you read this year?” would be “Facebook, Reddit, and Twitter. CNN, The New York Times, and The Washington Post. Comment sections. Comment sections. Comment sections.” If anything quite expressed the emotional tenor of this wicked reality for most of us, it was the feeling of being dead-eyed and exhausted, eyeballs vibrating in their sockets and blood straining in our temples, ensconced in the cold glow of the smartphone screen as we endlessly stared at travesty after travesty. Androids with our Androids.

Being who I am, I’ve got an inclination to write about the triumph of reading, the warmth from pages expressing the ineffable separateness of these people whom we happen to share the world with for a bit. The way in which literature acts as conduit for connection, the building of worlds with words, kingdoms of interiority claimed through the audacious act of writing, and so on. But do you know what I actually did with most of my free time? Doomscrolling. Just like you. How could it be otherwise? Companion to our worry, companion to our fear, companion to our free minutes. Endlessly scrolling through our social media newsfeeds fed that demon of acedia nestled in each individual skull, simultaneously giving us the illusion of control, the strange pleasure of anxiety, and the empty calories that filled our bellies but did nothing to finally satiate our hunger.

Nothing new in this, which Daniel Defoe described of 1665 in his novel A Journal of the Plague Year, whereby the “apprehension of the people was likewise strangely increased… [more] addicted to prophecies and astrological conjurations, dreams, and old wives’ tales than ever they were before or since,” something to keep in mind as I endlessly refreshed Nate Silver. It reminded me of the childhood feeling that I used to have after hours of Nintendo: that shaky, bile-stomached emotion that I imagine senior citizens feeding quarters into Atlantic City slot machines must feel. Easier to pretend that this was a type of reading: knowing facts without reflection, horror without wisdom.

Yet I did read books this year. If I’m being honest, I didn’t read terribly widely or terribly deeply, and there is a distinct before and after as regards the plague, but I still forced myself to read, even if it was at a glacial speed compared to normal, even if it was sometimes joyless. I did so because I felt that I had to, in the same way you white-knuckle it through flight turbulence by humming to yourself. I did it because I was scared that if I didn’t, I might forget how. And through that, I still had beautiful moments of reading, incandescent ones, transcendent ones. Books were still able to move me when two thousand people had died, and when two hundred thousand people had. Reading may sometimes feel like a frivolity, but it isn’t. All of that stuff I said in the third paragraph, the quasi-mocking tone about how I’m apt to argue that literature is about connection? Well, you knew I was setting that up rhetorically to knock it down. I don’t always feel that sentiment to be true, but you need not feel something to know it’s true (then again, I’ve always been a works-instead-of-faith guy). Don’t fault me for being predictable.

This is the third year I’ve been lucky enough to write one of these features for The Millions, and maybe it’s the English teacher in me, but I always have a need to tie together what I’ve read into some sort of cohesive syllabus. In summers past I used to theme my beach reading around subjects; one year I read novels according to the very specific criterion that they had to be about tremendous changes which happened in an instant (Tom Perrotta’s The Leftovers; Kevin Brockmeier’s The Illumination); in another season, all of the works on my docket were contemporary novels of manners (Jeffrey Eugenides’s The Marriage Plot; Dean Bakopoulos’s My American Unhappiness). This season of pandemic, it seemed that the dominant subject of the novels which I read was family.

Almost every novel which I pleasure-read in 2020 examined family in its multitudinous complexity: happy families and broken families; families of fate and families of choice; tragic families and triumphant families. I couldn’t have known it on New Year’s Day, but there was something appropriate in this, for this year was – in all of its darkness – for many a year of family. In the elemental stillness of quarantine people got to know their families with a new intimacy (for good and bad); some broods found themselves broken, some made new again. Most crucially, and at the risk of being maudlin, the pandemic distilled to an immaculate purity the centrality of family. My family’s own year was divided by the beautiful caesura of welcoming our first child into this world, the miracle of new life deserving of every cliché that can be said about it, a grace and gift that all of the beautiful rhetoric I can muster would scarcely be worthy of.

If novels serve any purpose, it’s to act as engines of empathy (whether or not that makes the world a better place is a question for somebody at a higher pay grade), and so I was able to see a bit of myself in Jonathan Safran Foer’s description of being a new father in his doorstopper of a book Here I Am. Jacob Bloch reminisces on moments with his first son: “the smell of the back of his neck; how to collapse an umbrella stroller with one hand… the transparency of new eyelids… my own inability to forgive myself for the moments I looked away and something utter[ly] inconsequential happened, but happened.” While Jacob and I share a parent’s love and a District of Columbia mailing address, the Blochs of Cleveland Park live in a slightly different universe from my own, though one marked by similarly tumultuous global crises, a throwback to the great male mid-century novelist canon for our century, set against the backdrop of a potentially apocalyptic war in the Middle East.

The Blochs are an unhappy family. Jacob is petty, anxious, and narcissistic; his wife Julia is unfulfilled; his father Irv is opinionated and hypocritical; his grandfather Isaac is a suicidal Holocaust survivor; his children Sam, Max, and Benjy each have their fair share of neuroses for being so young; and his Israeli cousin Tamir is simultaneously boastful and sensitive, flashy and wise. Across the daily travails of the Bloch family, from the threat of a cancelled Bar Mitzvah to the indiscretions and infidelities to the sufferings of a beloved elderly family dog (which lent itself to one of the most moving scenes I read this year), there is the omnipresent question of Judaism and its relation to Israel, played out in a world where antisemitism is very much not a past phenomenon. Envy has always made it difficult for me to appreciate Foer, but for its occasional indulgences, Here I Am is a novel of profound beauty – especially in its dialogue, though all writers should have some humility. When Jacob gets into a fight with Max about the respective influence of Roth versus Kanye West, his son responds about the former that “First of all, I’ve never even heard of that person.”

From Cleveland Park to Harlem, Imbolo Mbue imagines a very different family experience in Behold the Dreamers, though perhaps not such a very different family (for all parents want what is good for their children). Jende Jonga has overstayed his three-month visa, and has brought over from their native Cameroon his wife Neni and their young son. Jende works as a livery driver until his cousin is able to get him a job as a private chauffeur for Clark Edwards, an investment banker at Lehman Brothers in 2007. Mbue depicts the ways in which money and legal status affect two radically different groups of people during the last major economic collapse. Fundamentally a novel about the American Dream, which is to say a novel about money and the way it differentiates one man from another, Behold the Dreamers movingly and brilliantly tells the sort of New York story that can be so easy to overlook.

Immigration is at the core of Behold the Dreamers – what it means to forever fear deportation, the sort of hard work that puts a pain in the back and feet that require five Tylenol at a time, the crowding of a one-bedroom Uptown apartment with husband, wife, son, and newborn daughter. So triumphant are the dreams of immigrant aspiration that there is a surreal beauty in a (c. 2008) boast that “He will take us to a restaurant in the Trump Hotel… He will hire Donald Trump himself to cook steak for us,” so that the nativist is made to humbly genuflect before the very sort of people whom he has subsequently tortured. Mbue writes about her characters with such a humane tenderness that even when they’re cruel, or shortsighted, or fearful, there is still a fundamental love which makes their full humanity apparent, so that by the conclusion a reader will even have some sympathy for the investment banker who is implicated in all that went wrong in 2008. With almost perfect pitch for how people talk to one another, Mbue moves from the kitchens of Harlem where Cameroonians prepare ekwang and ndole, to the gilded living rooms of Park Avenue and the spacious backyards of the Hamptons. “Why did you come to America if your town is so beautiful?” Clark asks his driver. “Jende laughed, a brief uneasy laugh. ‘But sir,’ he said. ‘America is America.’”

Both of these books came to me from the neighborhood mainstay of Capitol Hill Books, across the street from the red-bricked environs of the palatial nineteenth-century Eastern Market. The proprietors of the bookstore had an ingenious concept whereby readers would fill out a form about their reading preferences and an upper limit on how much money they’d be willing to spend, and the store would then compile a sealed grab-bag of mystery tomes to be left in front of the shop at an agreed-upon time, like some sort of illicit literary handoff. My main method of finding totally new books, not pushed by algorithm or article, was precluded after the libraries closed, and so Capitol Hill Books’ invitation to take a literary leap into the unknown was a welcome diversion. Because the store is an amazing place, only a few blocks from the Library of Congress and the Supreme Court, with creased, underlined paperback volumes crammed into every conceivable inch of the converted townhouse (including the bathroom), and because the coronavirus has demolished the economy and small-business people received little of the relief which they were due from the federal government, I’m going to feature several other independent bookstores in Washington D.C. that deserve your money more than the website named after a South American rainforest. Please consider buying from them – you don’t even have to live in the District (though of course I encourage you to buy from your own local independents – if you’re a fellow Pittsburgher I can attest to the glories of Classic Lines, Amazing Books & Records, and White Whale Bookstore).

Maybe save some of your lucre for the funky, cool Solid State Books on H Street, in the neighborhood variously called NoMA or the Atlas District, depending on which gentrifying real estate agent you talk to. Solid State Books is the type of simultaneously sleek and cozy storefront that calls for you to wander in after a dinner of Ethiopian or Caribbean food, coffee in hand, as you paw through the delicious tables of new novels. It embodies the urbanity of bookstore wandering that’s become all too rare in mid-sized American cities, and though the pandemic makes that singular joy impossible right now, Solid State is available for curbside pickup. Consider purchasing Annie Liontas’s Let Me Explain You or Mary Beth Keane’s Ask Again, Yes, two novels that share with Behold the Dreamers a sense of immigrant possibility (and failure, pain, and tribulation) in the greater New York metro area. If Mbue has a love for the city from Malcolm X Boulevard down to Washington Square Park, then Liontas looks across the Hudson to the great Jersey Purgatory of Meadowlands strip malls, oil refineries, and diners, all the way down I-95 to New York’s greatest suburb of Philadelphia. It’s there that Stavros Stavros Mavrakis owns the Gala Diner, and where, following a series of prophetic intimations concerning his impending death, he sends accusatory emails to his three daughters and his ex-wife. “I, Stavros Stavros, have ask God to erase the mistakes of my life; and God has answer, in a matter of speaking, That it is best to Start Over, which requires foremost that We End All that is Stavros Stavros. No, not with suicide. With Mercy.”

Liontas’ character is King Lear as filtered through Nikos Kazantzakis, and in her main character’s incorrigibility – his yearning, his toxicity, and his potential for grace – she writes a tragi-comic parable about the American Dream. Let Me Explain You is a fractured fairy tale recounted by Stavros Stavros and his broken, suffering, and triumphant daughters Stavroula, Litza, and Ruby. The Gala’s proprietor is one of the most distinctive voices since, well, Jonathan Safran Foer’s Ukrainian narrator Alex in Everything Is Illuminated, and Stavros Stavros’s hilarious and moving exposition marks Liontas as a major talent. Within Let Me Explain You there is an excavation of the layers of pride and woundedness, success and failure, which mark much of the immigrant experience, a digging deep into the strata of its characters’ histories. Liontas goes beyond the smudged and laminated menus of the Gala – the plates of crispy gyro meat smothered in tzatziki; the pork roll, egg, and cheese sandwiches; the disco fries covered in gravy; and the flimsy blue-and-white cups of cheap coffee with their ersatz meander design – to demonstrate that Shakespearean drama can happen even in Camden County.

Keane’s Ask Again, Yes takes place in points farther north, along the section of the Acela corridor immediately above New York, as the upwardly mobile suburbs of Westchester stretch onward from outside the Bronx to leafy Connecticut, in communities like New Rochelle, Scarsdale, and Gillam. The last place is where two NYPD rookies – Francis Gleason and Brian Stanhope – who worked the same beat together in the 1970s Death Wish era of urban blight, coincidentally find themselves as neighbors, both following a suburban dream of fenced-in lawns, Fourth of July grilling, and strip mall supermarkets. Like both Stavros Stavros and Jende, Francis is also an immigrant, this time from the west of Ireland. “One minute he’d been standing in a bog on the other side of the Atlantic,” Keane writes, “and the next thing he knew he was a cop. In America. In the worst neighborhood of the best known city in the world.”

A reserved man, Francis isn’t particularly fond of Brian’s American volume, or of the latter’s erratic wife Anne Stanhope, who, like Gleason, was also Irish-born. Despite Francis’ reservations about the Stanhopes, their children – young Kate Gleason and Peter Stanhope – develop an intense adolescent romance that spans decades and has combustible implications for the families. The story features a single instance of incredible violence, the trauma of which alters both the Gleasons and the Stanhopes, forcing them to ask how life is lived after such a rupture. Keane’s novel is that rare thing in our contemporary era, where the culture industry has for too long been obsessed with anti-heroes and gentle nihilism – it’s a narrative of genuine moral significance, just as concerned with redemption as damnation, that takes contrition as seriously as that which gets you to the point where grace is even necessary.

If you still haven’t gotten New York City out of your system, and if pandemic restrictions have you missing colleges and universities (as Zoom instruction is inevitably so much more anemic), then consider picking up a copy of James Gregor’s campus novel Going Dutch from East City Bookshop. A charming Capitol Hill mainstay that’s half-descended into a basement right on Pennsylvania Avenue, not far from the string of restaurants and shops known as Barracks Row, East City Bookshop has excellent sections of history, politics, and contemporary novels, and is the sort of place where you can get twee mugs produced by the Unemployed Philosophers’ Guild. It’s the sort of bookstore that, if it were in the Village, could predictably be perused by Gregor’s characters Richard and Anne, two New York University comparative literature grad students who enter into a strange psychosexual affair. Both are working on their dissertations in medieval Italian literature, but only Anne can be said to have any preternatural talent in her scholarship, which Richard is more than happy to exploit in his own research. While Richard unsuccessfully flits through Grindr, he and Anne fall closer and closer together, the two eventually agreeing to a relationship that is equal parts sex and plagiarism. “Part of him found her annoying,” Gregor writes of Richard’s feelings towards Anne, “another part was curious to observe her. There was something both needling and captivating about her that he couldn’t explain… emitting waves of musky, indeterminately foreign glamor… [he] found himself strangely excited by her presence in the classroom. It wasn’t attraction exactly, but he felt the blurred outlines of that category.” Anne is a very particular type of paradoxically worldly ingenue, a spinster with an edge, and her relationship with Richard falls deeper and deeper into pathology and the pathetic.

Washington D.C. and Los Angeles are some 2,654 miles apart, but a visit to Dupont Circle’s classic Kramer’s (because of the coffee bar it features, it is now officially known as Kramer Books and Afterwords) can bestow upon you sunny California in novel form, with three titles that feature the Golden State in all of its seedy resplendence – Tracy Chevalier’s At the Edge of the Orchard, Patrick Coleman’s The Churchgoer, and The Millions’ staff writer Edan Lepucki’s Woman No. 17. District hullabaloo had it that the storied Kramer’s was potentially going to leave its Dupont Circle location, making the neighborhood infinitely poorer, but luckily the owners opted to continue their lease on the storefront where Monica Lewinsky once purchased a copy of Walt Whitman’s Leaves of Grass for Bill Clinton. Once our plague year has ended, shoppers will still be able to stop into the Connecticut Avenue location in this neighborhood of embassies and gay bars, and pick up any of the aforementioned California titles (in the meantime, consider ordering them online).

For pure folkloric Americana, Chevalier’s At the Edge of the Orchard is an equally beautiful and brutal novel, immaculate in its consummate weirdness. Chevalier recounts the tale of Robert Goodenough, son of Ohio apple growers James and Sadie Goodenough, who in the decade before the Civil War searches for tree saplings in northern California on behalf of a British naturalist who sells them to countrymen with the unusual desire to grow sequoias and redwoods on the grounds of English country estates. While traipsing through the hills north of San Francisco, humbled by the forest cathedrals of the redwoods, Robert relives the traumas of the unspeakable domestic violence in the frontier country which left him an orphan. “Though grafted at the same time, they had grown up to be different sizes; it always surprised James that the tree could turn out as varied as his children.” Chevalier’s novel examines the ways that human motivations can be as unpredictable as the route that branching roots might take, pruning back the exigencies of an individual human life to an elemental, almost folkloric essence, and testing the soil of myth and memory to write a luminescent novel that’s part fairy tale, part parable, part Greek tragedy, and part Western.

A different American myth is explored in Coleman’s The Churchgoer, a brilliant neo-noir that, true to that venerable genre’s greatest of conventions, places its seedy subject matter of sex and criminality in the estimably pleasant, sunny, forever-75-degrees environs of southern California. Mark Haines is a recovering alcoholic and drug addict, a night watch security guard, a San Diego beach bum, and a former youth pastor who has lost any faith in the God that failed him. He becomes embroiled in the affairs of a mysterious and beautiful young runaway (as one does) named Cindy Liu, a woman who comes from the same world of evangelical platitudes and megachurch hypocrisies as he does, and when she goes missing and his night watch partner is murdered (perhaps connected?), Haines embarks on an investigation every bit worthy of Dashiell Hammett or Raymond Chandler. Reflecting on a former parishioner who may be involved in sundry affairs, Mark notes that “I didn’t like any of this. I didn’t like being questioned… If they wanted to know what he was afraid of when he was seventeen, what he asked for prayers about, how many times a week on average he committed the sin of self-pollution against his better intentions, I could dig all that out from somewhere in my brain… [but] Confession usually pulled up well short of the deeper truth.” The true pleasure of Coleman’s novel isn’t plot (though the speed at which the pages turn would recommend it for that alone), but rather language, which is always true of the best noir books. The Churchgoer tastes like a gulp of cold black coffee into which a cigarette has been ashed at an AA meeting; it sounds like the static of a television left on until 3 a.m. and the hum of a neon light in the bar window of an Oceanside dive; it feels like insomnia and paranoia.

Lepucki makes great use of the oppressive sunlight of California in her Hitchcockian domestic tragicomedy Woman No. 17. In this, her second novel after the excellent post-apocalyptic California, Lepucki explores the sultry side of the Hollywood Hills, where wealthy writer Lady Daniels hires a college student as a live-in nanny to care for her young son while the former finishes an experimental memoir, made possible by alimony from her still-close film producer ex-husband. “It was summer. The heat had arrived harsh and bright, bleaching the sidewalks and choking the flowers before they had a chance to wilt… I preferred to stay at home: ice cubes in the dog bowl, Riesling in the freezer,” Lady says. Alternating between Lady and S., the art student whom she hires without a proper vetting, Woman No. 17 explores the intersections of obsession and sexuality, transgression and performance, in recounting how S. becomes increasingly unhinged in an “art project” which involves imitating her alcoholic mother and seducing Lady’s mute, adolescent, older son. As At the Edge of the Orchard explores the traumas of family, and The Churchgoer examines what it means both to be rejected by family and to construct a new family of your own volition, so too does Lepucki interrogate the illusions of intimacy and the way in which the mask we choose to wear can quickly become our face.

As the final two novels I’m writing about take as their subject the very soul of the nation, I recommend that you put in an order to buy Nell Zink’s Doxology and Kathleen Alcott’s America Was Hard to Find at the District of Columbia literary institution of Politics and Prose. Perhaps the most foundational of bookstores in the D.C. literary ecosystem, Politics and Prose shares a Cleveland Park setting (or at least half of one) with Zink’s much-anticipated novel, while Alcott’s America Was Hard to Find ranges over the entire continent, and the surface of the moon as well. Drawing its title from a poem by the radical priest and anti-Vietnam War activist Father Daniel Berrigan, Alcott’s novel is a bildungsroman for the American century. Audaciously reimagining the last fifty years of history, America Was Hard to Find tells the story of the brief liaison of Air Force pilot Vincent Kahn and bartender Fay Fern, which results in the birth of their illegitimate son Wright. Kahn goes on to become the first man to walk on the moon, and Fay a domestic terrorist in a far-left group similar to the Weather Underground or the Symbionese Liberation Army. It’s easy to imagine the two as proxies for a type of Manichean struggle in the American spirit – the square astronaut and the radical hippie. Yet Alcott is far too brilliant an author to pen simple allegory or didactic parable, for America Was Hard to Find is the sort of novel where mystery and the fundamental unknowability of both the national psyche and the psyches of the people condemned to populate it are expressed in shining prose on every page.

“The moon was everything he had loved about the high desert,” Alcott writes of Kahn’s first sojourn on that celestial body, “where nothing was obscured, available to you as far as you wished to look, but cast in tones that better fit the experience, the grays that ran from sooty to metallic, the pits dark as cellars. Most astonishing was the sky, a black he had never seen before, dynamic and exuberant. With a grin he realized the only apt comparison. It was glossy like a baby girl’s church shoes – like patent leather.”

Alcott’s prose is so lyrical, so gorgeous, that it can be almost excruciating to read (I mean this as a compliment), a work so perfectly poetic that a highlighter would run out of ink before you’re a tenth of the way through the novel. There are scenes of arresting, heart-breaking beauty, none more so than those recounting the doomed life of Wright, a gay man who perishes in our country’s last plague. “There is a kind of understanding that occurs just after,” writes Alcott. “If we are lucky, we catch it at the door on our way out, watch it enter the rooms we have left. It is not always possible to tell the exact moment you have separated from the earth. So much of what we know for certain is irrelevant by the time we know it.”

True to its title, there is something almost sacramental in Zink’s Doxology, with its poignant ruminations on both ecology and aesthetics as told through a generation-spanning story focused on Pam and Daniel Svoboda and their precocious daughter Flora. Pam and Daniel were originally two-thirds of a Lower East Side ’80s and ’90s rock band situated somewhere on the spectrum between post-punk and grunge; the final member of their trio is Joe, a gentle musical genius with undiagnosed Williams Syndrome who was the only one to go on to any type of success before overdosing on September 11, 2001. Split between New York City and the Washington D.C. of Pam’s Fugazi-listening, Adams Morgan-clubbing youth, Doxology is an ultimately uncategorizable book about the connections of family forged in hardship and the transcendent power of creation. Zink’s narration is refreshingly Victorian, having no problem dwelling in exposition and displaying the full omniscience we require of our third-person narrators (though her Author as God has a sense of humor). Daniel “was an eighties hipster. But that can be forgiven, because he was the child of born-again Christian dairy-farm workers from Racine, Wisconsin,” while Joe’s “father was a professor of American history at Columbia, his mother had been a forever-young party girl in permanent overdrive who could drink all night, sing any song and fake the piano accompaniment, and talk to anybody about anything. In 1976 she died.”

Contrary to the order in which I’ve recounted this syllabus, I read Doxology in January, and as with Lauren Groff’s excellent speculative epic Arcadia, Zink’s novel moves into the near future from its 2019 publication date. Recounting the effect that historical events like Desert Storm, 9/11, and the financial collapse of 2008 have on the Svobodas, not to mention the election of Donald J. Trump, Doxology ends in the summer of 2020, a year after it was written and half a year after I read it. Flora lives in Washington, having been effectively raised by her grandparents, and in our infernal year as imagined by Zink she is a wounded environmental activist living in the Trumpian twilight. “On the last Wednesday in July, Washington was bathed in an acrid mist. The roses and marble facades stood sweating in air that stank of uncertainty. It was a smell that ought to be rising from burning trash, not falling from the sky as fawn-colored haze.”

Some sort of ecological catastrophe has befallen the United States – perhaps a meltdown at a nuclear power plant – and the burnt ochre sun struggling through pink overcast skies speaks to the omnipresence of death. The Trump administration, of course, denies any knowledge, telling people that they should simply live their lives, and FOX News runs exposés about noodle thickness rather than the radioactive plume which seems to be spreading over the east coast. Such is the uncanny prescience that only a brilliant writer can impart; I remember finishing Zink’s novel and wondering what awaited us in the months ahead. Unnerving to think of it now, but when I read Doxology I’d yet to have worn a face mask outside, or heard of “social distancing.” I’d yet to have felt the itchy anxiety that compels you to continually use hand-sanitizer, or to flinch whenever you hear a cough during the few minutes a day when your dog’s bladder compels you to leave your apartment. When I read Doxology, already fearful for the year ahead, not a single American had yet died of this new disease, and I hadn’t yet heard the word coronavirus.


Who’s Afraid of Theory?

-

In a fit of pique, the editors of the journal Philosophy and Literature ran a “Bad Writing Contest” from 1995 to 1998 to highlight jargony excess among the professoriate. Inaugurated during the seventh inning of the Theory Wars, Philosophy and Literature placed itself firmly amongst the classicists, despairing at the influence of various critical “isms.” For the final year that the contest ran, the “winner” was Judith Butler, then a Berkeley philosophy professor and author of the classic work Gender Trouble: Feminism and the Subversion of Identity. The selection which caused such tsuris was from the journal Diacritics, a labyrinthine sentence where Butler opines that the “move from a structuralist account in which capital is understood to structure social relations in relatively homologous ways to a view of hegemony in which power relations are subject to repetition, convergence, and rearticulation brought the question of temporality into the thinking of structure,” and so on. If the editors’ purpose was to mock Latinate diction, then the “Bad Writing Contest” successfully made Butler the target of sarcastic opprobrium, with editorial pages using the incident as another volley against the “fashionable nonsense” (as Alan Sokal and Jean Bricmont called it) supposedly reigning ascendant from Berkeley to Cambridge.

The Theory Wars – that is, the administrative argument over what role various strains of 20th-century continental European thought should play in the research and teaching of the humanities – have never exactly gone away, even while departments shutter and university work is farmed out to poorly paid contingent faculty. Today you’re just as likely to see aspersions on the use of critical theory appear in fevered, paranoid Internet threads warning about “Cultural Marxism” as you are on the op-ed pages of the Wall Street Journal, even while at many schools literature requirements are being cut, which makes the whole debate feel more like a Civil War reenactment than the Battle of Gettysburg. In another sense, however, Butler’s partisans seem to have very much won the argument of the ’80s and ’90s – as sociologically inflected Theory-terms from “intersectionality” to “privilege” have migrated from Diacritics to Twitter (though often as critical malapropism) – ensuring that this war of attrition isn’t headed to armistice anytime soon.

So, what exactly is “Theory?” For scientists, a “theory” is a model based on empirical observation that is used to make predictions about natural phenomena; for the lay-person a “theory” is a type of educated guess or hypothesis. For practitioners of “critical theory,” the phrase means something a bit different. A critical theorist engages in interpretation, reading culture (from epic poems to comic books) to explain how its social context allows or precludes certain readings, beyond whatever aesthetic affinity the individual may feel. Journalist Stuart Jeffries explains the history (or “genealogy,” as they might say) of one strain of critical theory in his excellent Grand Hotel Abyss: The Lives of the Frankfurt School, describing how a century ago an influential group of German Marxist social scientists, including Theodor Adorno, Max Horkheimer, Walter Benjamin, and Herbert Marcuse, developed a trenchant vocabulary for “what they called the culture industry,” so as to explore “a new relationship between culture and politics.” At the Frankfurt Institute for Social Research, a new critical apparatus was developed for the dizzying complexity of industrial capitalism, and so words like “reify” and “commodity fetish” (as well as that old Hegelian chestnut “dialectical”) became humanistic bywords.

Most of the original members of the Frankfurt School were old-fashioned gentlemen, more at home with Arnold Schoenberg’s 12-tone avant-garde than with Jelly Roll Morton and Bix Beiderbecke, content to read Thomas Mann rather than Action Comics. Several decades later, a different institution – the Centre for Contemporary Cultural Studies at the University of Birmingham in the United Kingdom – would apply critical theory to popular culture. These largely working-class theorists, including Stuart Hall, Paul Gilroy, Dick Hebdige, and Angela McRobbie (with a strong influence from Raymond Williams), would use a vocabulary similar to that developed by the Frankfurt School, but they’d extend the focus of their studies into considerations of comics and punk music, slasher movies and paperback novels, while also bringing issues of race and gender to bear in their writings.

In rejecting the elitism of their predecessors, the Birmingham School democratized critical theory, so that the Slate essay on whiteness in Breaking Bad or the Salon hot take about gender in Game of Thrones can be traced on a direct line back through Birmingham. What these scholars shared with Frankfurt, alongside a largely Marxian sensibility, was a sense that “culture was an important category because it helps us to recognize that one life-practice (like reading) cannot be torn out of a large network constituted by many other life-practices—working, sexual orientation, [or] family life,” as elucidated by Simon During in his introduction to The Cultural Studies Reader. For thinkers like Hall, McRobbie, or Gilroy, placing works within this social context wasn’t necessarily a disparagement, but rather the development of a language commensurate with explaining how those works operate. With this understanding, saying that critical theory disenchants literature would be a bit like saying that astronomical calculations make it impossible to see the beauty in the stars.

A third strain influenced “Theory” as it developed in American universities towards the end of the 20th century, and it’s probably the one most stereotypically associated with pretension and obfuscation. From a different set of intellectual sources, French post-structural and deconstructionist thought developed in the ’60s and ’70s at roughly the same time as the Birmingham School. Sometimes broadly categorized as “postmodernist” thinkers, French theory included writers of varying hermeticism like Jacques Derrida, Michel Foucault, Gilles Deleuze, Jean-François Lyotard, Jacques Lacan, and Jean Baudrillard, who supplied English departments with a Gallic air composed of equal parts black leather and Gauloises smoke. François Cusset provides a helpful primer in French Theory: How Foucault, Derrida, Deleuze, & Co. Transformed the Intellectual Life of the United States, the best single-volume introduction to the subject. He writes that these “ten or twelve more or less contemporaneous writers,” despite their not inconsiderable differences, are united by a “critique of the subject, of representation, and of historical continuity,” their focus being the “critique of ‘critique’ itself, since all of them interrogate in their own way” the very idea of tradition. French theory was the purview of Derridean deconstruction, or of Foucauldian analysis of social power structures, the better to reveal the clenched fist hidden within a velvet glove (and every fist is clenched). For traditionalists the Frankfurt School’s Marxism (arguably never all that Marxist) was bad enough; with French theory there was a strong suspicion of at best relativism, at worst outright nihilism.

Theory has an influence simultaneously more and less enduring than is sometimes assumed. Its critics in the ’80s and ’90s warned that it signaled the dissolution of the Western canon, yet I can assure you from experience that undergraduates never stopped reading Shakespeare, even if a chapter from Foucault’s Discipline and Punish might have made it onto the syllabus (and it bears repeating that, contra his reputation for difficulty, Foucault was a hell of a prose stylist). But if current online imbroglios are any indication, Theory’s influence has been wide and unexpected, for as colleges pivot towards a business-centered STEM curriculum, the old fights about critical theory have simply migrated online. Much of the criticism against Theory in the first iteration of this dispute was about what such thinkers supposedly said (or what people thought they were saying), but maybe even more vociferous were the claims about how they were saying things. The indictment of Theory then becomes not just an issue of metaphysics, but one of style. It’s the claim that nobody can argue with a critical theorist because the writing itself is so impenetrable, opaque, and confusing. It’s the argument that if theory reads like anything, it reads like bullshit.

During the height of these curricular debates there was a cottage industry of books that took scholarly rhetoric as their target, not least of which were conservative screeds like Allan Bloom’s The Closing of the American Mind: How Higher Education Has Failed Democracy and Impoverished the Souls of Today’s Students and E.D. Hirsch Jr.’s The Dictionary of Cultural Literacy. Editors Will H. Corral and Daphne Patai claim in the introduction to their pugnacious Theory’s Empire: An Anthology of Dissent that “Far from responding with reasoned argument to their critics, proponents of Theory, in the past few decades, have managed to adopt just about every defect in writing that George Orwell identified in his 1946 essay ‘Politics and the English Language.’” D.G. Myers, in his contribution to the collection (succinctly titled “Bad Writing”), excoriates Butler in particular, writing that the selection mocked by Philosophy and Literature was “something more than ‘ugly’ and ‘stylistically awful’… [as] demanded by the contest’s rules. What Butler’s writing actually expresses is simultaneously a contempt for her readers and an absolute dependence on their good opinion.”

Meanwhile, the poet David Lehman parses Theory’s tendency towards ugly rhetorical self-justification in Signs of the Times: Deconstruction and the Fall of Paul de Man, in which he recounts the sordid affair whereby a confidant of Derrida and esteemed Yale professor was revealed to have written Nazi polemics during the German occupation of his native Belgium. Lehman also provides ample denunciation of Theory’s linguistic excess, writing that for the “users of its arcane terminology it confers elite status… Less a coherent system of beliefs than a way of thinking.” By 1996, even Duke University English professor Frank Lentricchia (in a notoriously Theory-friendly department) would snark in his Lingua Franca essay “Last Will and Testament of an Ex-Literary Critic” (reprinted in Quick Studies: The Best of Lingua Franca): “Tell me your theory and I’ll tell you in advance what you’ll say about any work of literature, especially those you haven’t read.”

No incident did more to illustrate for the public the apparent vapidity of Theory than the so-called “Sokal Affair” of 1996, when New York University physics professor Alan Sokal wrote a completely meaningless paper, composed in a sarcastic pantomime of critical theory-speak and entitled “Transgressing the Boundaries: Toward a Transformative Hermeneutics of Quantum Gravity,” which was accepted for publication in the prestigious (Duke-based) journal Social Text, with the hoax simultaneously revealed in Lingua Franca. Sokal’s paper contains exquisite nonsense such as the claim that “postmodern sciences overthrow the static ontological categories and hierarchies characteristic of modernist science” and that “these homologous features arise in numerous seemingly disparate areas of science, from quantum gravity to chaos theory… In this way, the postmodern sciences appear to be converging on a new epistemological paradigm.” Sokal’s case against Theory is also, fundamentally, about writing. He doesn’t just attack critical theory for what he perceives as its dangerous relativism, but also at the level of composition, writing in Fashionable Nonsense: Postmodern Intellectuals’ Abuse of Science that such discourse, “exemplified by the texts we quote, functions in part as a dead end in which some sectors of the humanities and social sciences have gotten lost.” He brags that “one of us managed, after only three months of study, to master the postmodernist lingo well enough to publish an article in a prestigious journal.” Such has long been the conclusion among many: that Theory is a kind of philosophical Mad Libs disappearing up its own ass, accountable to nobody but itself and the departments that coddle it. Such was the sentiment which inspired the programmers of the Postmodern Essay Generator, which as of 2020 is still algorithmically throwing together random Theory words to create full essays with titles like “Deconstructing Surrealism: Socialism, surrealism and deconstructivist theory” (by P. Hans von Ludwig) and “Social realism and the capitalist paradigm of discourse” (by Agnes O. McElwaine).

Somebody’s thick black glasses would have to be on too tight not to see what’s funny in this, though there’s more than a bit of truth in the defense of Theory that says such denunciations are trite, an instance of anti-intellectualism as much as its opposite. Defenses of Theory in the wake of Sokal’s ruse tended to, not unfairly, query why nobody questions the rarefied and complex language of the sciences but everyone blanches when the humanities have a similarly baroque vocabulary. Status quo objections to that line of thinking tend to emphasize the humanness of the humanities; the logic being that if we’re all able to be moved by literature, we have no need for experts to explain how a work of literature operates (as if being in possession of a heart would make one a cardiologist). Butler, for her part, answered the criticism leveled against her prose style in a (well-written and funny!) New York Times editorial, where she argues, following a line of Adorno’s reasoning, that complex prose is integral to critical theory because it helps to make language strange, and forces us to interrogate that which we take for granted. “No doubt, scholars in the humanities should be able to clarify how their work informs and illuminates everyday life,” Butler admits. “Equally, however, such scholars are obliged to question common sense, interrogate its tacit presumptions and provoke new ways of looking at a familiar world.”

With which I heartily agree, but that doesn’t mean that the selection of Butler’s mocked by Philosophy and Literature is any good. It costs me little to admit that the sentence is at best turgid, obtuse, and inelegant, and at worst utterly incomprehensible. It costs me even less to admit that that’s probably because it’s been cherry-picked, stripped of context, and labeled as such so that it maximizes potential negative impressions. One can defend Butler – and Theory – without justifying every bit of rhetorical excess. Because what some critics disparage about Theory – its obscurity, its rarefied difficulty, its multisyllabic technocratic purpleness – is often true. When I arrived in my master’s program, in a department notoriously Theory-friendly, I blanched as much as Allan Bloom would have upon being invited to roadie for the Rolling Stones’ Steel Wheels Tour. For an undergraduate enmeshed in the canon, and still enraptured by that incredibly old-fashioned (but still intoxicating) claim of the Victorian critic Matthew Arnold in Culture and Anarchy that the purpose of education was to experience “the best which has been thought and said,” post-structuralism was a drag. By contrast, many of my colleagues, most of them in fact, loved Theory; they thrilled to its punkish enthusiasms, its irony-laden critiques, its radical suspicion of the best which has been thought and said. Meanwhile I despaired that there were no deconstructionists in Dead Poets Society.

I can no longer imagine that perspective. It’s not quite that I became a “Theory Head,” as one calls all of those sad young men reading Gilles Deleuze and Félix Guattari while smoking American Spirit cigarettes, but I did learn to stop worrying and love Theory (in my own way). What I learned is that Theory begins to make sense once you learn the language (whether it takes you three months or longer), and that it’s innately, abundantly, and estimably useful when you have to actually explain how culture operates, not just whether you happen to like a book or not. A poet can write a blazon for her beloved, but an anatomist is needed to perform the autopsy. Some of this maturity came in realizing that literary criticism has always had its own opacity; that if we reject “binary opposition,” we would have to get rid of “dactylic hexameter” as well. The humanities have always invented new words to describe the things of this world that we experience in culture. That’s precisely the practice attacked by John Martin Ellis, who in his jeremiad Against Deconstruction took on Theory’s predilection towards neologism, opining that “there were plenty of quite acceptable ordinary English words for the status of entrenched ideas and for the process of questioning and undermining them.” All of that difference, all of that hegemony, and so much phallogocentrism… But here’s the thing – sometimes heteroglossia by any other name doesn’t smell as sweet.

There is something anachronistic in proffering a defense of Theory in the third decade of the new millennium; something nostalgic or even retrograde. Who cares anymore? Disciplinary debates make little sense when the discipline itself has imploded, and the anemic cultural studies patois of the Internet hardly seems to warrant the same reflection, either in defense or condemnation. In part, though, I’d suggest that it’s precisely the ubiquity of these words, and their popularity among those who learned them through cultural osmosis rather than instruction, that necessitates a few statements in their exoneration. All of the previous arguments on their behalf – that the humanities require their own jargon, that this vocabulary provides an analytical nuance that the vernacular doesn’t – strike me as convincing. And the criticism that an elite coterie uses words like “hegemonic” as a shibboleth is also valid, but that’s not an argument to abandon the words – it’s an argument to instruct more people in what they mean.

But I’d like to offer a different claim to utility, and that’s that Theory isn’t just useful, but that it’s beautiful. Reading the best of Theory is like reading poetry more than philosophy, and all of those chewy multisyllabic words can be like honey in the mouth. Any student of linguistics or philology – from well before Theory – understands that synonyms are mythic and that an individual word has a connotative life that is rich and unique. Butler defends the Latinate, writing that to a student a word such as “hegemony” appears strange, but that the student may discover that beyond its simpler meaning “it denotes a dominance so entrenched that we take it for granted, and even appear to consent to it—a power that’s strengthened by its invisibility.” Not only that, I’d add that “hegemony,” with its angular consonants hidden like a sharp rock in the middle of a snowball, conveys a sense of power beyond either brute strength or material plenty. Hegemony has something of the mysterious about it, the totalizing, the absolute, the wickedly divine. To simply replace it with the word “power” is to drain it of its impact. I’ve found this with many of these words: they’re like occult tone poems conveying a hidden and strange knowledge; they’re able to give texture to a picture that would otherwise be flat. Any true defense of Theory must, I contend, give due deference to the sharp beauty that these sometimes-hermetic words convey.

As a totally unscientific sample, I queried a number of my academic (and recovering academic) colleagues on social media to see what words they would add to a list of favorite terms; the jargon that others might roll their eyes at, or hear as grad school clichés, but that is estimably useful, and dare I say it – beautiful. People’s candidates could be divided in particular ways, including words that remind us of some sort of action, words that draw strength from an implied metaphorical imagery, and words that simply have an aural sense that’s aesthetically pleasing (and these categories are by no means exhaustive or exclusive). For example, Derrida’s concept of “deconstruction,” a type of methodological meta-analysis that reveals internal contradictions within any text, so as to foreground interpretations that might be hidden, was a popular favorite. “Deconstruction” sounds like an inherently practical term, a word that contractors rather than literary critics might use; the prefix connotes ripping things down while the rest of the word gestures towards building them (back?) up. A similar word that several respondents mentioned, albeit one with less of a tangible feel to it, was “dialectics,” which was popularized in the writings of the 19th-century German philosopher Georg Wilhelm Friedrich Hegel, mediated through Karl Marx, and then applied to everything by the Frankfurt School. As with many of these terms, “dialectics” has variable meaning depending on who is using it, but it broadly refers to an almost evolutionary process whereby the internal contradictions of a concept are reconciled, propelling thought into the future. Despite the materialist deployment of the term by Marx and his followers, the actual word has an almost mystical gloss to it, the trochaic rhythm of the word with its up-down-up-down beat evoking the process of thesis-antithesis-synthesis to which the term applies. Something about the very sound of “dialectic” evokes both cutting and burying to me, the psychic struggle that the word is supposed to describe.

Then there are the words fueled with metaphorical urgency, short poems in their own right, often appropriated from other disciplines. Foucault used words like “genealogy” or “archeology” when some might think that “history” would be fine, and yet those words do something subtly different than the plodding narrative implied by the more prosaic word. The former carries a sense of telling a story that connects ideas, trends, and themes within a causal network of familial relations, while the latter recalls excavation and the revealing of that which remains hidden (or cursed). Deleuze and Guattari borrowed the term “rhizome” from botany, where it originally described the complex branching of root systems, and reapplied it to how non-hierarchical systems of knowledge propagate. “Rhizome” pays homage to something of beauty from a different way of understanding the world – it is not filching, it is honoring. The Italian Marxist Antonio Gramsci similarly borrowed the term “subaltern,” later popularized by Gayatri Chakravorty Spivak, for whom it came to designate communities of colonized people who are simultaneously exoticized and erased by imperial powers; the word itself was a term used for junior officers in the British colonial service. Finally, I’m partial to “interiority” myself, used to denote fictional representations of consciousness or subjectivity. “Interiority,” with its evocation of a deep subterranean network or the domestic spaces of a many-roomed mansion, says something about consciousness that the more common word doesn’t quite capture.

My favorite critical jargon word, however, is “liminal.” All of us who work on academic Grub Street have our foibles, the go-to scholarly tics marking our prose like an oily fingerprint left on Formica. We all know the professor with their favored jargon turn of phrase (often accompanied by an equivalent hand movement, like an intricate form of Neapolitan gesture), or the faculty member who might be given to yelling out “Hegemonic!” at inopportune times. Thus, I can’t help but sprinkle my own favored term into my writing like paprika into Budapest goulash. My love for the word, used to designate things that are in-between, transitioning, and not quite formed, has less to do with its utility than with the mysterious sense of the sounds that animate it. It’s always been oddly onomatopoeic to me, maybe because it’s a near-homophone of “illuminate,” and makes me think of dusk, my favorite time of day. When I hear “liminal” it reminds me of moonbeams and cicadas at sunset; it reminds me that the morning star still endures even at dawn. An affection for the term has only a little to do with what’s useful about it, and everything to do with that connotative ladder that stretches out beyond its three syllables. I suspect that when we love these words, this jargon, it’s an attraction to their magic, the uncanny poetry hidden behind the seemingly technocratic. The best of Theory exists within that liminal space, between criticism and poetry; justifying itself by recourse to the former, but always actually on the side of the latter – even if it doesn’t know it.

Image Credit: Wikipedia