Letter from the Capitol

-

The Confederate battle standard never flew within the Capitol Building — until January 6th, 2021. During the Civil War, that cankered, perfidious, malignant, cancerous cabal of traitors who grandiosely called themselves the “Confederate States of America” made many strategic thrusts northward, stabbing into the nation’s body, and because of these, for a time, it seemed as if they might be triumphant. General John Hunt Morgan’s 2nd Kentucky Cavalry Regiment raided not just that unfortunate border state, but in 1863 pierced into Indiana and Ohio as well. Morgan would finally surrender in Salineville, Ohio, which lies almost as far north as Connecticut. Even more incongruously, a year later, 21 veterans of Morgan’s Raid crossed over from Canada, that land then colonized by a Southern-sympathizing Great Britain, and attacked the sleepy hamlet of St. Albans, Vermont, robbing its banks and forcing the citizens at gunpoint to swear fealty to the Confederacy. The most violent (and most famous) invasion of the North was the traitor Robert E. Lee’s campaign in Pennsylvania, the goal of which may have been to capture or burn Philadelphia, but which was stopped at the infamous “High Water Mark” of the Confederacy when Union General George G. Meade turned back the Army of Northern Virginia at Gettysburg—a battle that produced more than 50,000 casualties in three days. During Lee’s campaign in southern Pennsylvania, free Black women and men had to flee north, as the Confederate raiders would send those they kidnapped into southern bondage.

For sheer absurdity, among the closest positions the rebels ever held to the national capital was the Marshall House Inn in Alexandria, Virginia, where a Confederate flag was displayed that was so large and so tall that Lincoln could see it from the White House across the Potomac. A few weeks after Ft. Sumter, Union troops occupied the city, marching down red-bricked King Street where slave markets had sold thousands of human beings less than ten miles from the Capitol Building. When Colonel Elmer Ephraim Ellsworth of the 11th New York Volunteer Infantry Regiment ascended to the roof of the hotel to remove the flag, the proprietor of the Marshall House shot him dead, making Ellsworth the first Union officer killed in the Civil War. Despite being able to see that rebel banner from the portico of the White House, Lincoln steadfastly refused to move the capital to safer points further north, arguing that the abandonment of Washington would be a capitulation to the seditionists.

“Let us be vigilant,” Lincoln telegraphed to the worried Maryland governor in 1864, “but keep cool. I hope neither Baltimore nor Washington will be sacked.” Not for lack of desire, as that same year Confederate Lieutenant General Jubal Early would attack Ft. Stevens in the Northwest Quadrant of the District of Columbia, in a battle that would cost close to nine hundred casualties. Long had the secessionists dreamed of Washington as the capital of their fake nation. In the decades before the Civil War some imagined a “Golden Circle,” a veritable empire of slavery, with the South Carolina Senator Robert Barnwell Rhett imperially enthusing that “We will expand… over Mexico – over the isles of the sea — over the far-off Southern — until we shall establish a great Confederation,” their twisted nation stretching from Panama to the District of Columbia. Until last week the Confederate flag had never flown within the Capitol.

In one photograph a man casually strolls across the red-and-blue mosaic floor of some antechamber in the Capitol, dressed in jeans and a black hoodie with a tan hunting vest; hoisted over his shoulder is the Confederate flag, its colors matching the tiles. It shouldn’t be lost on anybody that his uniform is the exact same “suspicious” article of clothing that Black teenagers have been shot for wearing, even while this man is able to raid the very seat of government unmolested. Because America is many things, but it is not subtle, the man in the photograph is flanked by two gilt-framed oil paintings. One is of Charles Sumner, the Massachusetts Senator and abolitionist nearly caned to death by an opponent on the legislative floor of this very building, and who denounced the “unutterable wrongs and woes of slavery; profoundly believing that, according to the true spirit of the Constitution, and the sentiments of the fathers, it can find no place under our National Government” before Congress in 1852. The other portrait, almost predictably, is of John C. Calhoun, the South Carolina Senator and Vice President under Andrew Jackson, who in 1837 would declaim that the “relation now existing in the slaveholding states… instead of an evil, [is] a good. A positive good,” and would then gush about what a kind and benevolent slave-master he was. It would be hard to stage a more perfect encapsulation of the American dichotomy than our weekend warrior did on Wednesday, the continual pull between those better angels of our nature and the demons of history, who are never quite exorcized and are often in full possession of the body politic. There is a power in that grotesque image, the cosplaying Confederate momentarily anointing himself sovereign as he casually strolls through the chamber. Chillingly strolls, one might say, for all of these terrorists acted with impunity, as if they knew there would be no consequences for their actions. It reminds us that the mantra “This isn’t who we are” is at best maudlin and at worst a complete lie.

The siege of the Capitol on the day that Congress met for the constitutionally mandated and largely pro-forma ritual of officially counting the Electoral College votes to certify Joe Biden and Kamala Harris as the rightful victors of the 2020 presidential race can be examined from many directions, of course. Security experts can parse why there was such a profound failure at ensuring the safety of the session; political scientists can explain how social media algorithms have increasingly radicalized adherents of the far right; historians can place movements like QAnon and the Proud Boys in a genealogy of American nativism and European fascism. Everyone should be able to say that ultimate responsibility lies with the stochastic terrorism promoted by the lame-duck president and his congressional sycophants in the Sedition Caucus, as well as his media enablers, with whom he is clasped in a toxic symbiotic relationship. All those approaches to analysis are valid, but I choose to look at the day as a literary critic and a resident of Washington D.C., because those things are what I am. But incongruity alone, even the uncanny alone, can’t quite provide the full critical lexicon for what we witnessed on our televisions that afternoon, the sense that, even more than an inflection point, we were viewers of a cracked apocalypse. How do we make sense of an attempted American putsch, the almost-nightmare of a coup?

Because the cultural idiom of this nation is Hollywood, and our interpretive lenses are by necessity those of the movies, I can’t help but feel that much of what we saw seemed prefigured in film. The terrible logic of America is that our deepest nightmares and desires always have a way of enacting themselves, of moving from celluloid to reality. Look at the photograph of Jake Angeli, the self-styled “QAnon Shaman,” shirtless and bedecked in raccoon fur with buffalo horns upon his head (in pantomime of the very people upon whom this nation enacted genocide), his face smeared in the colors of the American flag, standing at the dais of the Speaker of the House, and tell me that it doesn’t look like a deleted scene from The Postman. Or examine the photograph of a smiling ginger man in a stocking cap emblazoned with “TRUMP,” casually waving as he jauntily strolls underneath the rotunda past John Trumbull’s massive painting Surrender of General Burgoyne, holding under his arm a pilfered wooden podium decorated with a gold federal eagle, his hero’s adage that “when the looting starts, the shooting starts” apparently only to be selectively enforced. It looks like something from the post-apocalyptic movie The Book of Eli.

And then, most chillingly (and disturbingly underreported), there was the painstakingly assembled gallows, placed a bit beyond the equestrian monument to Ulysses S. Grant, who with great courage and strength broke the first iteration of the Ku Klux Klan; from the gallows one vigilante hung that most American of symbols, a noose. When remembered in light of the black-clad and masked men photographed with guns and zip-ties, it should make all of us consider just how much more tragic this violation, which was already a grotesque abomination, could have been. Horrifying to recall that the narrative conceit in Margaret Atwood’s The Handmaid’s Tale (and its television adaptation) that allowed for the theocratic dictatorship to ascend to power was the mass murder of a joint session of Congress. Sometimes #Resistance liberals get flak for their fears of fascism, but it would be easier to mock those anxieties if our country didn’t so often look like a science fiction dystopia.

It’s my suspicion that pop culture — that literature — is capable of picking up on some sort of cultural wavelength, those deep historical vibrations that diffuse in circles outward from our present into both past and future. There is something incantatory about those visions generated in word and special effect, so that the eeriness of seeing marauding fascists overtake the Capitol grounds feels like something we’ve seen before. Think of all the times we’ve watched the monuments of Washington D.C. destroyed on film. Last week — while half paying attention to a block of cheesy apocalypse movies on the Syfy network that were supposed to count down the days left in the year — I saw the U.S.S. John F. Kennedy aircraft carrier pushed into the city by an Atlantic tsunami, where it rolled across the National Mall and crushed the White House in Roland Emmerich’s godawful 2012. I’ve seen the executive mansion punctured by bombs and dotted with bullet holes in the spectacularly corny Antoine Fuqua movie Olympus Has Fallen, and according to Thrillist the Capitol itself has been laid waste in no fewer than nine movies, including The Day After Tomorrow, Earth vs. the Flying Saucers, G.I. Joe: Retaliation, Independence Day, Olympus Has Fallen, Superman II, White House Down, and X-Men: Days of Future Past. Probably the impulse to watch this sort of thing is equal parts vicarious thrill and enactment of deep fears. I remember that when I saw Independence Day (also by Emmerich, the Kurosawa of schlock) in 1996, the theater audience erupted into cheers and claps when the impenetrable wall of exploding alien flames incinerated its way across D.C. and shattered the white dome of the Capitol like an egg thrown into a fireplace. Was that applause an expressed opinion about Newt Gingrich? About Bill Clinton? Something darker?

After the terrorist attacks on 9/11, now almost twenty years ago, there was a profoundly shortsighted prediction that the hideous spectacle of Americans seeing the World Trade Center collapse would forever cure us of our strange desire to see our most famous buildings, and the people within them, destroyed. A perusal of the Olympian corpus of the Marvel Cinematic Universe (seemingly the only entertainment which Hollywood bothers to produce anymore) will testify that such an estimation was, to put it mildly, premature. The French philosopher Guy Debord could have told us this in 1967 in The Society of the Spectacle, wherein he noted that “all of life presents itself as an immense accumulation of spectacles. Everything that was directly lived has moved away into a representation,” to which it could be added that the inverse is also true – everything that has been represented has seemingly moved into life. Which doesn’t mean that scenes like those which we witnessed on Wednesday aren’t affecting – no, the opposite is true. People reach for the appraisal that “it looks like a movie” not to be dismissive, but rather because cinema is the most powerful mythopoesis that we’re capable of.

What’s needed, of course, is a vocabulary commensurate with what exactly all of us saw, a rhetoric capable of grappling with defilement, with violation, with desecration. But because all we have is movies, that’s what we’re forced to draw upon. They gave us the ability to think about the unthinkable before it happened; the chance to see the unseeable before it was on our newsfeeds. If the vision of the screen is anemic, that’s not necessarily our fault — we measure the room of our horror with the tools which we’ve inherited. Few square miles of our civic architecture are quite so identified with our quasi-sacred sense of American civil religion as the grounds of the U.S. Capitol, and so the spectacle of a clambering rabble (used as a Trojan Horse for God knows what more nefarious group of actors) calls to mind fiction far more than it does anything which has actually happened. That’s the cruelty of our current age — that so frequently our lives resemble the nightmare more than the awakening. The Capitol siege was very much an apocalypse in the original Greek sense of the word: an unveiling, a rupture in normal history, which is why all of this feels so cinematic — though it’s hard to tell if it’s the beginning or the ending of the movie, and what genre exactly we’re in. As Timothy Denevi writes about the assault in LitHub, “What is a culmination, after all, except the moment in which everything that could happen finally does? Where are we supposed to go from there?”

Important to remember that everything which could happen has already happened before, at some point. That’s what the bromide about this not being who we are gets wrong — this is, at least partially, who we’ve always been, albeit not in this exact or particular way. What happened at the eastern edge of the Mall this week has shades of the Wilmington Insurrection of 1898, in which a conspiracy of white supremacists plotted against the Black leadership of the North Carolina city and ushered in Jim Crow at the cost of hundreds of lives (and then untold millions over the next century). The assault on the Capitol has echoes of the Election Riots of 1874, when members of the White League attacked Black voters in Eufaula, Alabama, leaving behind dozens of wounded women and men, and seven corpses. These are two examples of hundreds of similar events that shamefully litter our nation’s history, though most citizens have never heard of them. Hell, most people didn’t know about the Tulsa race massacre of 1921 — still less than a century ago — until HBO’s Watchmen dramatized it. The issue is exactly the same: White supremacists think that only their votes count, and will do anything to enforce that conviction.

That the supporters of the man who currently occupies the Oval Office believe any number of insane and discredited conspiracy theories about election fraud — claims rejected in some sixty lawsuits and a 9-0 Supreme Court decision — in some ways misses the point. Listen to their language — the man who instigated Wednesday’s riot emphasizes that he simply wants to count “legal” votes. Ask yourself what that means, and then realize why the fevered rage of his mob focuses on places like Detroit, Philadelphia, and Atlanta. If the only people who’d been allowed to vote were white people, then he would have won the election in his claimed landslide — that’s what he and his supporters mean by “legal” votes. The batshit insane theories are just fan fiction to occlude the actual substance of their political belief. Such anti-democratic sentiment is also an American legacy, an American curse. The connection between what happened on Capitol Hill and in Wilmington, Eufaula, and Tulsa; or Fort Bend, Texas in 1888; or Lake City, South Carolina in 1897; or Ocoee, Florida in 1920; or Rosewood, Florida in 1923 (you can look them all up); or any number of thousands of other incidents, may seem tangential. It isn’t.

When I lived in Massachusetts there was a sense of history that hung thick in the air, all of those centuries back to the gloomy Puritans and their gothic inheritance. Historical markers punctuated the streets of Boston and her suburbs, and there was that rightfully proud Yankee ownership of the American Revolution. Our apartment was only a mile or so from the Lexington battle green where that shot heard round the world rang out, and I used to sometimes grab a coffee and read a magazine on one of its benches in what was effectively a pleasant park, battle green thoughts in a green shade. Part of me wanted to describe this part of the country as haunted, and perhaps it is, but its ghosts seem to belong to a distant world, a European world. By contrast, when I moved to Washington D.C., the American specters moved into much clearer focus. If Massachusetts seems defined by the Revolution, then the District of Columbia, and Maryland, and Virginia are indelibly marked by the much more violent, more consequential, more important, and more apocalyptic conflagration of the Civil War. In his classic Love and Death in the American Novel, the critic Leslie Fiedler described the nation as “bewilderingly and embarrassingly, a gothic fiction, nonrealistic and negative, sadist and melodramatic — a literature of darkness and the grotesque in a land of light and affirmation.” Our national story is a Jekyll and Hyde tale about the best and worst aspirations in conflict within the Manichean breast of a nation that fancied itself Paradise but ended up some points further south.

Because I have a suspicion that poetry is capable of telling the future, that everything which can or will happen has already been rendered into verse somewhere (even if obscured), a snatch of verse from a Greek poet accompanied my doomscrolling this week. “Why isn’t anything going on in the senate?” Constantine Cavafy asked in 1898. “Why are the senators sitting there without legislating?” I thought about it when I first heard that the mob was pounding at the Capitol door; it rang in my brain when I saw the photographs of them parading through that marble forest of Statuary Hall, underneath that iron dome painted a pristine white. “Because the barbarians are coming today,” Cavafy answered himself. I thought about it when I looked at the garbage strewn through the halls, the men with their feet up on legislators’ desks, cackling at the coup they’d pulled. “What’s the point of senators making laws now? Once the barbarians are here, they’d do the legislating.” For the moment, it seems that the barbarians have either been pushed back or left of their own accord. In that interim, what will be done to make sure that they don’t return? Because history and poetry have taught us that they always do.

Image credit: Pexels/Harun Tan.

Pandemic Life by the Numbers: The Golden Five

-

This post was produced in partnership with Bloom, a literary site that features authors whose first books were published when they were 40 or older.

Five people

Our social worlds have contracted — my inner pod is down to five. We “check in.” We trade links and photos by text. We don’t beat around the bush, we say what we’re feeling & thinking, what’s happening in today’s hanging-in-there news—textable snippets, emojis emojis emojis.
We make phone dates (Zoom & FaceTime feel like “work” now). Some of us are single, some are hunkering down with partners / families. Those of us who live alone feel acutely but do not readily verbalize the painful fact of long stretches without physical touch. Thank goodness for the dog, who is honorary number six.

Five pounds

In the beginning, I moved in with my sister. Together we cooked creative meals to stay sane, to feel warm and nourished and connected. We ate well. In the evenings we watched “Kim’s Convenience” with my nephews and ate large portions of sweets or bowlfuls of crunchy salty things. We ate badly. Then my nephews went to sleep, and out came the wine and whiskey. We drank too much.

We talked late into the night about how grateful we were, how anxious we were, how hard life was even before the pandemic (dredging up our childhood as the wine flowed); how a totally different kind of weirdness and uncertainty had replaced the former difficulty, and how life was always stressful and frightening, pandemic or no pandemic … so hey, let’s drink some more. We put on weight.

In the middle, I moved back to my own apartment. Subscribed to a vegetable delivery service and cooked a lot and did yoga and turned over a healthier leaf. Summer came, and the endless heat wave sucked away my appetite. I went back to work. I rode my bike from uptown to downtown and back again. Outdoor activity made everything seem just a little normal, intermittently pleasurable. Most of the weight came off.

In part two of the middle, election season took over, and everything went to shit again. Night-snacking, drinking, over-eating, hangovers, more anxiety-snacking, more drinking; rinse and repeat. The stakes (political, existential) were enormous; the possibility of compounded disaster utterly real. Heaviness—of body and spirit—returned.

In part three of the middle, the election came and went, and I began to sleep better. That particular anxiety—considering seriously leaving the country, starting a new life away from the daily tyranny of the idiot monster—quieted. But with the change of seasons and raging infections, venturing outside and the simplest tasks became again nerve-wracking. Keep your distance; move briskly; avert your face; wash wash wash your hands.

In the end, now—part one of the end—it’s those five pounds. I feel them all the time, weighing me down physically and mentally. They are the pounds that keep you from feeling ready and energized when you wake; in-the-zone while you work; relaxed in your clothes and in your soul. You know what I mean. I’m pretty sure you do. We won’t be feeling fully energized or relaxed again for some time.

Five minutes

For many of us, life has slowed down. No more rush-hour commuting, running from work to gym to kids’ soccer practice to the nursing home to a meal with friends. We’re doing less because less is do-able. Our stresses are more about isolation, destabilization, reorientation; less about time…

…if we lean into it, that is. We could, certainly, fill up that time with other multi-tasking, time-crunching activities. I’ve found myself opting for “five more minutes.” Slow it down, pause, take a moment. It’s just five minutes, but you begin to see—the meaningfulness of small increments.

Five minutes earlier means sitting with a cup of coffee, savoring it, and thinking of nothing at all before starting on a day of brain-squeezing work. Five more minutes allows for a slow stroll to and from the subway, on which non-mission-critical thoughts can occur (I miss her, I wonder how she’s doing), neighborhood storefronts, new and old, get noticed (order takeout this weekend, they look like they need business), and notes for an essay or story can be written in a notebook (Five Minutes; The Golden Five?). Five more minutes means biking up the hill, working up a good sweat and burn, then cruising down the other side with the wind in your face, instead of avoiding the hill altogether; or it means not running that red light, not nearly slamming into a pedestrian or, worse, a car. Five more minutes gives the dog five more minutes of gleeful sniffing and one or two more doggie encounters—the value of which could be, who knows, infinitely greater than what five minutes of comparable pleasure means to us.

Sometimes it’s not about what you do or don’t do in those five minutes; it’s about gentle transitions from one thing to the next. If you do yoga, you’ve heard this, about the signals you send to your self when you jerk your body around, from standing to sitting, from sitting to kneeling, from left to right. With five more minutes, you can finish doing X; take a breath; move calmly, maybe even gracefully, into Y.  It makes a difference. It makes all the difference.

Five dollars

Many of us have less work and less money. We are also spending less and/or differently. In NYC, the loss of bar & restaurant life is (I know, cry me a river, in the grand scheme of things it probably sounds melodramatic) tragic. Certainly it’s tragic for restaurant owners and workers, and for the city’s economy. For consumers, refraining from gathering in bars & restaurants in NYC is like… living on the coast and having no access to the ocean.

Structure helps. A plan. The eating-out budget is diminished, but I want these restaurants to make it. So it’s all about 5-dollar takeout lunches, a couple times a week. Mamoun’s falafel. Banh mi from Saigon Shack. Tofu shrimp in garlic sauce (the 1/2 portion) from my local Chinese takeout. Lebanese Zaatar. Curry beef or roast pork pastry from Fay Da bakery. Two cheese slices with a can of soda from pretty much any pizza joint. A whole wheat everything bagel with a shmear of lox cream cheese from Bo’s. For a healthy option, a green protein smoothie from the smoothie place run by Big Russ’s barber shop. And yes, even occasionally steam table oxtails and collard greens (well-managed, socially distanced, sanitized setup) from Jacob’s Soul Food.

Five more months

Who’s to say — though the real-science people have been basically right all along. Five more months seems likely. Buckle in. Mask up. Take a breath, or two, or five. Check in with your single friends. Check in with your partnered friends. Spend your stimulus money on restaurants if you can. Don’t give away or let out your clothes yet: we’ll all find our better selves and our energy again on the other side.

Image credit: Pexels/cottonbro; Photos courtesy of the author.

On Dreams and Literature

-

“We are such stuff/As dreams are made on, and our little life/Is rounded with a sleep.” — William Shakespeare, The Tempest (1611)

“A candy-colored clown they call the sandman/tiptoes to my room every night/just to sprinkle stardust and to whisper/’Go to sleep, everything is alright.’” — Roy Orbison, “In Dreams” (1963)

Amongst the green-dappled Malvern Hills, where sunlight spools onto spring leaves like puddles of water in autumn, a peasant named Will is imagined to have fallen asleep on a May day when both the warmth and the light induce dreams. Sometime in the late fourteenth century (as near as we can tell, between 1370 and 1390), the poet William Langland wrote of a character in Piers Plowman who shared his name and happened to fall asleep. “In a summer season,” Langland begins, “when soft was the sun, /I clothed myself in a cloak as I shepherd were… And went wide in the world wonders to hear.” Presentism is a critical vice, a fallacy of misreading yourself into a work, supposedly especially perilous if it’s one that’s nearly seven centuries old. Hard not to commit that sin sometimes. “But on a May morning,” Langland writes, and I note his words those seven centuries later on a May afternoon, when the sun is similarly soft, and the inevitable drowsiness of warm contentment takes over my own nodding head and heavy eyes so that I can’t help but see myself in the opening stanza of Piers Plowman.

“A marvel befell me of fairy, methought./I was weary with wandering and went me to rest/Under a broad bank by a brook’s side,/And as I lay and leaned over and looked into the water/I fell into a sleep for it sounded so merry.” Good close readers that we are all supposed to be, it’s imperative that we don’t read into the poem things that aren’t actually in it, and yet I can’t help but imagine what that daytime nocturne was like. The soft gurgle of a creek through English fields, the feeling of damp grass underneath dirtied hands, and of scratchy cloak against unwashed skin; the sunlight tanning the backs of his eyelids; that dull, corpuscular red of daytime sleep, the warmth of day’s glow flushing his cheeks, and the almost preternatural quiet save for some bird chirping. The sort of sleep you fall into on a train that rocks you in the sunlight of late afternoon. It sounds nice.

Piers Plowman is of a medieval poetic genre known as a dream allegory, or even more enticingly as a dream vision. Most famous of these is Dante Alighieri’s The Divine Comedy, whose central character (who, as in Piers Plowman, shares the poet’s name) discovers himself in a less pleasant wood than does Will: “When I had journeyed half of our life’s way,/I found myself within a shadowed forest,/for I had lost the path that does not stray.” The Middle Ages didn’t originate the dream vision, but it was the golden age of the form, when poets could express mystical truths in journeys that only happened within heads resting upon rough, straw-stuffed pillows. Langland’s century alone saw Geoffrey Chaucer’s Parliament of Fowls, John Gower’s Vox Clamantis, John Lydgate’s The Temple of Glass, and the anonymously written Pearl (by the same lady or gent who wrote Sir Gawain and the Green Knight). Those are only English examples (or I should say examples by the English; Gower was writing in Latin), for the form was popular in France and Italy as well. A.C. Spearing explains in Medieval Dream-Poetry that while sleeping “we undergo experiences in which we are freed from the constraints of everyday possibility, and which we feel to have some hidden significance,” a sentiment which motivated the poetry of Langland and Dante.

Dante famously claimed that his visions — of perdition, purgatory, and paradise — were not dreams, and yet everything in The Divine Comedy holds to the genre’s conventions. Both Langland and Dante engage the strange logic of the nocturne, the way in which the subconscious seems to rearrange and illuminate reality in a manner that the prosaicness of overrated wakefulness simply cannot. Dante writes that the “night hides things from us,” but his epic is itself proof that the night can just as often reveal things. Within The Divine Comedy Dante is guided through the nine circles of hell by the Roman poet Virgil, from the antechamber of the inferno wherein dwell the righteous pagans and classical philosophers, down through the frozen environs of the lowest domain, where Lucifer forever torments and is tormented by that trinity of traitors composed of Cassius, Brutus, and Judas. Along the way Dante is privy to any number of nightmares, from self-disemboweling prophets to lovers forever buffeted around on violent winds (bearing no similarity to a gentle Malvern breeze). In the Purgatorio and Paradiso he is spectator to far more pleasant scenes (though it’s telling that more people have read Inferno, as our nightmares are always easiest to remember), where he sees a heaven that’s the “color that paints the morning and evening clouds that face the sun,” almost a description of the peacefulness of accidentally nodding off on an early summer day.

Both The Divine Comedy and Piers Plowman express verities accessed by the mind in repose; Langland’s poem, though it begins not in a dark wood but in a sunny field, embodies mystical apprehensions as surely as Dante’s does. A key difference is that Langland’s allegory is so obvious (as anyone who has seen the medieval play Everyman can attest is true of the period). Characters named after the Seven Deadly Sins, or called Patience, Clergy, and Scripture (and Old Age, Death, and Pestilence) all interact with Will — whose name has its own obvious implications. By contrast, Dante’s characters bear a resemblance to actual people (or they are actual people, from Aristotle in Limbo to Thomas Aquinas in Heaven), even while the events depicted are seemingly more fantastic (though in Piers Plowman Will witnesses both the fall of man and the harrowing of hell). Both are, however, written in the substance of dreams. Forget the didactic obviousness of allegory, the literal cipher that defines that form, and believe that in a field between Worcestershire and Herefordshire Will did plumb the mysteries of eternity while sleeping. What makes the dream vision a chimerical form is that maybe he did. That’s the thing with dreams and their visions; there is no need to suspend disbelief. We’re not in the realm of fantasy or myth, for in dreams order has been abolished, anything is possible, and nothing is prohibited, not even flouting the arid rules of logic.

There is a danger to this, for to dream is to court the absolute when we’re at our most vulnerable, to find eternity in a sleep. Piers Plowman had the taint of heresy about it, as it inspired the revolutionaries of the 1381 Peasants’ Revolt, as well as the adherents of a schismatic group of proto-Protestants known as Lollards. Arguably the crushing of the rebellion led to an attendant attack by authorities on vernacular literature like Piers Plowman, in part explaining the general dismalness of English literature in the fifteenth century (which, excluding Malory and Skelton, is the worst century of writing). Scholars have long debated the relationship between Langland and Lollardy, but we’ll let others more informed tease out those connections. The larger point is that dreaming can get you in trouble. That’s because dreaming is the only realm in which we’re simultaneously complete sovereign and lowly subject, the cinema we watch when our eyes are closed. Sleep is a domain that can’t be reached by monarch, tyrant, state, or corporation — it is our realm.

Dreams have a radical import in George Orwell’s dystopian classic 1984. You’ll recall that in that novel the main character, Winston Smith, is a minor bureaucrat in totalitarian Oceania. Every aspect of Smith’s life is carefully controlled; his life is under total surveillance, all speech is regulated (or made redundant by Newspeak), and even the truth is censored, altered, and transformed (a process in which the character himself plays a role). Yet his dreams are one aspect of his life which the government can’t quite control, for Smith “had dreamed that he was walking through a pitch-dark room. And someone sitting to one side of him had said as he passed: ‘We shall meet in the place where there is no darkness.’” Sleep is an anarchic space where the dreamer is at the whims of something much larger and more powerful than themselves, and by contrast where sometimes the dreamer finds themselves transformed into a god. A warning here though — when total independence erupts from our skulls into the wider world (for after all, it is common to mutter in one’s sleep) there is the potential that your unconscious can betray you. Smith, after all, is always monitored by his telescreen.

Whether it’s the fourteenth century or the twenty-first, dreaming remains bizarre. Whether we reduce dreams to messages from the gods and the dead, or to repressed memories and neuroses playing in the nursery of our unconscious, or simply to the random electric flickering of neurons, the fact that we spend long stretches of our life submerging ourselves in bizarre parallel dimensions is so odd that I can’t help but wonder why we don’t talk about it more (beyond painful conversations recounting dreams). So strange is it that we spend a third of our lives journeying to fantastic realms where every law of spatiality and temporality and every axiom of identity and principle of logic is flouted, that you’d think we’d conduct ourselves with a bit more humility when dismissing that which seems fantastic in the experience of those from generations past who’ve long since gone to eternal sleep. Which is just to wonder: when William Langland dreamt, is it possible that he dreamt of me?

Even with our advancements in the modern scientific study of the phenomenon, the mysteriousness of dreams hasn’t entirely dissipated. If our ancestors saw in dreams portents and prophecies, then this oracular aspect was only extended by Sigmund Freud’s The Interpretation of Dreams. The man who inaugurated the nascent field of psychoanalysis explained dreams as a complex tapestry of wish fulfillment and sublimation, an encoded narrative that mapped onto the patient’s waking life and that could be deciphered by the trained therapist. Freud writes that there “exists a psychological technique by which dreams may be interpreted and that upon the application of this method every dream will show itself to be a senseful psychological structure which may be introduced into an assignable place in the psychic activity of the waking state.” Not so different from Will sleeping in his field. The origin may be different — Langland sees in dreams visions imparted from God and Freud finds their origin in the holy unconscious, but the idea isn’t dissimilar. Dreaming imparts an ordered and ultimately comprehensible message, even though the imagery may be cryptic.

Freud has been left to us literary critics (who’ve even grown tired of him over the past generation), and science has abandoned terms like id, ego, and superego in favor of neurons and biochemistry, synapses and serotonin. For neurologists, dreaming is a function of the prefrontal cortex powering down during REM sleep, and of the hippocampus severing its waking relationship with the neocortex, allowing for a bit of a free-for-all in the brain. Scientists have discovered much about how and why dreaming happens — what parts of the brain are involved, what cycles of wakefulness and restfulness a person will experience, when dreaming evolved, and what functions (if any) it could possibly serve. Gone are the simple reductionisms of dream interpretation manuals with their categorized entries about your teeth falling out or showing up naked to your high school biology final. Neuroscientists favor a more sober view of dreaming, whereby random bits of imagery and thought thrown out by your groggy, chemically addled brain rearrange themselves into a narrative which isn’t really a narrative. Still, as Andrea Rock notes in The Mind at Night: The New Science of How and Why We Dream, “it’s impossible for scientists to agree on something as seemingly simple as the definition of dreaming.” If we’re such stuff as dreams are made on, the forensics remain inconclusive.

Not that dreaming is exclusively a human activity. Scientists have been able to demonstrate that all mammals have some form of nocturnal hallucination, from gorillas to duck-billed platypuses, dolphins to hamsters. Anyone with a dog has seen their friend fall into a deep reverie; their legs pump as if they’re running, and occasionally they’ll even startle-bark themselves awake. One summer day my wife and I entertained our French bulldog by having her chase a sprinkler’s spray. She flapped her jowly face at the cool gush of water with a happiness that no human is capable of. That evening, while she was asleep, she began to flap her mouth again, finally settling into a deeper reverie where she just smiled. Dreaming may not be an exclusively mammalian affair — there are indications that both birds and reptiles dream — though it’s harder to study creatures more distant from us. Regardless, the evidence is that animals have been sleeping, perchance to dream, for a very long time. Rock writes that “Because the more common forms of mammals we see today branched off from the monotreme line about 140 million years ago… REM sleep as it exists in most animals also emerged at about the time that split occurred.” We don’t know if dinosaurs dreamt, but something skittering around and out of the way of their feet certainly did.

If animal brains are capable of generating pyrotechnic missives, and if dreaming goes back to the Cretaceous, what then of the future of dreaming? If dogs and donkeys, cats and camels are capable of dreaming, will artificial intelligences dream? This is the question asked by Philip K. Dick’s Do Androids Dream of Electric Sheep?, which was itself the source material for Ridley Scott’s science fiction film classic Blade Runner. Dick was an author as obsessed with illusion and reality, the unattainability of truth, and doubt as any writer since Plato. In his novel’s account of the bounty hunter Rick Deckard’s decommissioning of sentient androids, there is his usual examination of what defines consciousness, and the ways in which its illusions can present realities. “Everything is true… Everything anybody has ever thought,” one character says, a pithy encapsulation of the radical potential of dreams. Dick imagined robots capable of dreaming with such verisimilitude that they misapprehended themselves to be human, but as it turns out our digital tools are able to slumber in silicon.

Computer scientists at Google have investigated the random images produced by a complex artificial neural network as it “dreams,” allowing the devices to filter the various images they’ve encountered and to recombine, recontextualize, and regenerate new pictures. In The Atlantic, Adrienne LaFrance writes that the “computer-made images feature scrolls of color, swirling lines, stretched faces, floating eyeballs, and uneasy waves of shadow and light. The machines seemed to be hallucinating, and in a way that appeared uncannily human.” Artificial Intelligence has improved to an unsettling degree in just the past decade, even if a constructed mind capable of easily passing the Turing Test has yet to be created; that seems more an issue of time than of possibility. If all of the flotsam and jetsam of the internet could coalesce into a collective consciousness emerging from the digital primordial like some archaic demigod birthing Herself from chaos, what dreams could be generated therein? Or if it’s possible to program a computer, a robot, an android, an automaton to dream, then what oracles of Artificial Intelligence could be birthed? I can’t help but thrill to the idea that we’ll be able to program a desktop version of the Delphic Oracle analyzing its own microchipped dreams. “The electric things have their life too,” Dick wrote.

We’ve already developed AI capable of generating completely realistic-looking but totally fictional women and men. Software engineer Philip Wang created ThisPersonDoesNotExist.com, which does exactly what its name advertises: it gives you a picture of a person who doesn’t exist. Using an algorithm that combs through actual images, Wang’s site employs something called a generative adversarial network to create pictures of people who never lived. If you refresh the site, you’ll see that the humans dreamt of by the neural network aren’t cartoons or caricatures, but photorealistic images so accurate that they look like they could be used for a passport. So far I’ve been presented with an attractive butch woman with sparkling brown eyes, a broad smile, and short curly auburn hair; a strong-jawed man in his 30s with an unfortunate bowl cut and a day’s worth of stubble who looks a bit like the swimmer Michael Phelps; and a nerdy-looking Asian man with a pleasant smile and horn-rimmed glasses. Every single person the AI presented looked completely average and real, so that if I encountered them in the grocery store or at Starbucks I wouldn’t think twice, and yet not a single one of them was real. I’d read once (though I can’t remember where) that every invented person we encounter in our dreams has a counterpart in somebody we once met briefly in real life, a waitress or a store clerk whose path we crossed for a few minutes, dredged up from the unconscious and commissioned into our narrative. I now think that all of those people come from ThisPersonDoesNotExist.com.

One fictional person who recurs in many of our dreams is “This Man,” a pudgy, unattractive, balding man with thick eyebrows and an approachable smile who was the subject of Italian marketer Andrea Natella’s now-defunct website “Ever Dream This Man?” According to Natella, scores of people had dreams about the man (occasionally nightmares) across all continents and in dozens of countries. Blake Butler, writing in Vice Magazine, explains that “His presence seems both menacing and foreboding at the same time, unclear in purpose, but haunting to those in whom he does appear.” This Man doesn’t particularly look like any famous figure, nor is he so generic that his presence can be dismissed as mere coincidence. There is a spooky resonance to this guy, who looks like he manages a diner at First Avenue and 65th, emerging simultaneously in thousands of people’s dreams (his cameo is far less creepy after you’re aware of the website). Multiple hypotheses were proffered, ranging from This Man being the product of the collective unconscious as described by Carl Jung to Him being a manifestation of God appearing to people from Seattle to Shanghai (my preferred theory). As it turns out, he was simply the result of a viral marketing campaign.

Meme campaigns aside, the sheer weirdness of dreams can’t quite exorcize them of a supernatural import — we’re all looking for portents, predictions, and prophecies. Being submerged into what’s effectively another universe can’t help but alter our sense of reality, or at least make us question what exactly that word means. For years now I’ve had dreams that take place in the same recurring location — a detailed, complex, baroque alternate version of my hometown of Pittsburgh. This parallel universe Pittsburgh roughly maps onto the actual place, though it appears much larger and there are notable differences. Downtown, for example, is a network of towering, interconnected skyscrapers all accessible from within one another (there’s a good bookstore there); a portion of Squirrel Hill is given over to a Wild West experience set. It’s not that I have the same dreams about this place; it’s that the place is the same, regardless of what happens to me in those dreams when I’m there. So much so that I experience the uncanny feeling of not dreaming, but rather of sliding into some other dimension. An eerie feeling comes to me from a life closer than my own breath, existing somewhere in the space between atoms, and yet totally invisible to my conscious eye.

Such is the realm of seers and shamans, poets and prophets, as well as, no doubt, yourself — the dream realm is accessible to everyone. These are internal messages from a universe hidden within, where the muse and the oracle reside inside your own skull. Long have serendipitous missives arisen from our slumber, even while we debate their ultimate origin. The social activist Julia Ward Howe wrote “Battle Hymn of the Republic” while staying at Washington D.C.’s Willard Hotel in 1861, the “dirtiest, dustiest filthiest place I ever saw.” While “in a half dreaming state” she heard a group of Union soldiers marching down Pennsylvania Avenue singing “John Brown’s Body,” and based on that song Howe composed her own hymn while in a reverie. Howe’s dreaming was in keeping with a melancholic era enraptured by spiritualism and occultism, for, as she recalled, “attacks of versification had visited me in the night.” The apocalyptic Civil War altered people’s dreams, it would seem. Jonathan White explores the sleep-world of nineteenth-century Americans in his unusual and exhaustive study Midnight in America: Darkness, Sleep, and Dreams During the Civil War, arguing that people’s “dream reports were often remarkably raw and unfiltered… vividly bringing to life the horrors of the conflict; for others, nighttime was an escape from the hard realities of life and death in wartime.”

Every era imparts its own images, symbols, and themes into dreams, so that collective analysis can tell us about the concerns of any given era. White writes that during the Civil War people used dreams to relive “distant memories or horrific experiences in battle, longing for a return to peace and life as they had known it before the war, kissing loved ones that had not seen for years, communing with the dead, traveling to faraway places they wished they could see in real life,” which, even if the particulars may be different, is not so altered from our current reposes. One of the most famous Civil War dreamers was Abraham Lincoln, whose own morbid visions were in keeping with slumber’s prophetic purposes. Only days before his assassination, Lincoln recounted to his bodyguard that he’d had an eerily realistic dream in which he wandered from room to room in the White House. “I heard subdued sobs,” Lincoln said, as “if a number of people were weeping.” The president was disturbed by the sound of mourning, “so mysterious and so shocking,” until he arrived in the East Room. “Before me was a catafalque, on which rested a corpse wrapped in funeral vestments,” the body inside being that of Lincoln. Such dreams are significant — as the disquieting quarantine visions people have had over the past two months can attest. We should listen — they have something to tell us.

Within literature dreams always seem to have something to say, a realm of the fantastic visited in works as diverse as L. Frank Baum’s Wizard of Oz, Charles Dickens’ A Christmas Carol, Neil Gaiman’s Sandman, and Lewis Carroll’s Alice in Wonderland. The dream kingdom is a place where the laws of physics are muted, where logic and reason no longer hold dominion, and the wild kings of absurdity are allowed to reign triumphant. Those aforementioned works are ones in which characters like Dorothy, Ebenezer Scrooge, Morpheus, and Alice are subsumed into a fantastical dream realm, but there are plenty of books with more prosaic dream sequences, from Mr. Lockwood’s harrowing nightmare in Emily Brontë’s Wuthering Heights to Raskolnikov’s violent childhood dreams in Fyodor Dostoevsky’s Crime and Punishment. “I’ve dreamt in my life dreams that have stayed with me ever after, and changed my ideas,” writes Brontë, “they’ve gone through and through me, like wine through water, and altered the color of my mind.” Then there is the literature that emerges from dreams, the half-remembered snippets and surreal plot lines, the riffs of dialogue and the turns of phrase that are birthed from the baked brain of night. Think of the poppy reveries of Thomas De Quincey’s Confessions of an English Opium-Eater or the delicious purpose of Samuel Taylor Coleridge’s “Kubla Khan,” written in a similar drug haze until the poet was interrupted by that damned person from Porlock.

Spearing writes that “from the time of the Homeric poems down to the modern novel, it is surely true that the scene of the great bulk of Western literature has not been the internal world of the mind, in which dreams transact themselves, but the outer, public world of objective reality,” but this misses an important point. All novels actually occur in the internal world of the mind, no matter how vigorous their subjects may be. I’ll never be able to see the exact same cool colors of Jay Gatsby’s shirts that you envision, nor will I hear the exact timbre of Mr. Darcy’s voice that you imagine, in the same way that no photographs or drawings or paintings can be brought back from the place you go to when you sleep. Dreaming and reading are unified in being activities of fully created, totally self-contained realities. Furthermore, there is a utopian freedom in this, for that closed-off dimension, that pinched-off universe which you travel to in reveries nocturnal or readerly is free of the contagion of the corrupted outside world. There are no pop-up ads in dreams; there are no telemarketers calling you. Even our nightmares are at least our own. Here, as in the novel, a person may be truly free.

Dreaming is the substance of literature. It’s what comes before, during, and after writing and reading, and there can be no fiction or poetry without it. There is no activity in waking life more similar to dreaming than reading (and by proxy writing, which is just self-directed reading). All necessitate the complete creation of a totally constructed universe constrained within your own head and accessible only to the individual. The only difference between reading and dreaming is who directs the story. As in a book, so in our slumber: the world which is entered is one that is singular to the dreamer/reader. What you see when you close your eyes is forever foreign to me, just as I may never enter the exact same story-world that you do when you crack open a novel. “Life, what is it but a dream?” Carroll astutely asks.

We spend a third of our day in dream realms, which is why philosophers and poets have always rightly been preoccupied with them. Dreams necessarily make us question the border between waking and sleeping, truth and falsity, reality and illusion. That is the substance of storytelling as well, and that shared aspect between literature and dreaming is just as important as the oddity of existing for a spell in entirely closed-off, totally self-invented, and completely free worlds. What unites the illusions of dreams and our complete ownership of them is subjectivity, and that is the charged medium through which literature must forever be conducted. Alfred North Whitehead once claimed that all of philosophy was mere footnotes to Plato; it would be just as accurate to say that all of philosophy since then has been variations on the theme of kicking the tires of reality and questioning whether this exact moment is lived or dreamt.

The pre-Socratic metaphysician Gorgias was a radical solipsist who thought that all the world was the dream of God and the dreamer was himself. Plato envisioned our waking life as but a pale shadow of a greater world of Forms. René Descartes in Meditations on First Philosophy forged a methodology of radical doubt, whereby he imagined that a malicious demon could potentially deceive him into thinking that the “sky, the air, the earth, colors, shapes, sounds and all external things are merely the delusions of dreams which he has devised to ensnare my judgment. I shall consider myself as not having hands or eyes, or flesh, or blood or senses, but as falsely believing that I have all these things.” So, from the assumption that everything is a dream, Descartes tried to latch onto anything that could be certain. Other than his own mind, he wasn’t able to find much. In dreams there is the beginning of metaphysics, for nothing else compels us to consider that the world which we see is not the world which there is, and yet such philosophical speculation needs no philosophers, since children engage in it from the moment they can first think.

When I was a little kid, I misunderstood that old nursery rhyme “Row Your Boat.” When it claimed that “life is but a dream,” I took that literally to mean that all which we experience is illusion, specter, artifice. In my own abstract way I assumed that according to the song, all of this which we see: the sun and moon, the trees and flowers, our friends and family, is but a dream. And I wondered what it would be like when I woke up, and to whom I would recount that marvelous dream. “I had the strangest dream last night,” I imagined telling faces unknown with names unconveyed. I assumed the song meant that all of this, for all of us, was a dream — and who is to know what that world might look like when you wake up? Such a theme is explored in pop culture from the cyberpunk dystopia The Matrix to the sitcom finale of Newhart, because this sense of unreality, of dreams impinging on our not-quite-real world, is hard to shake. Writing about a classic metaphysical thought experiment known as the Omphalos Argument (from the Greek for “navel,” as relating to a question of Eden), the philosopher Bertrand Russell wrote in The Analysis of Mind that “There is no logical impossibility… that the world sprang into being five minutes ago, exactly as it then was, with a population that ‘remembered’ a wholly unreal past.” Perhaps we’ve just dozed off for a few minutes then? Here’s the thing though — even if all of this is a dream — it doesn’t matter. Because in dreams you’re innocent. In dreams you’re free.

Image credit: Pexels/Erik Mclean.

Extinguishing the Self: On Robert Stone

-

1.
Until the pandemic forced us into hiatus, I curated a reading series for emerging writers in New York City. For 13 years, we met monthly at KGB Bar, a literary venue in the East Village. The bar was rarely full, but it was always a chore to get people to quiet down; we encouraged readers to invite friends, family, and other writers. On the best nights, the place was full of convivial anticipation, like we were throwing a big, bold send-off before these promising writers lit out to new territory.

Standing at the podium before events, the sights and sounds often reminded me of the wet, snowy Sunday evening when I heard the late, great Robert Stone read in the very same room. He was in the last years of a long life. His flowing beard was all white and emphysema made a whisper of his gravelly voice. His audience had dwindled and there were fewer people in the room that night than on evenings when I hosted readings for less accomplished authors. This is just one of the many lessons Stone taught me: that you can be nominated for four National Book Awards and a Pulitzer—and still face a half-empty room at the end of your career.

You can get the full fathom five of Stone’s biography from any of a dozen sources. Stone himself wrote a memoir, Prime Green, that tracks through his Catholic school boyhood, the burden of growing up as the son/caretaker of a schizophrenic mother, and the hows and whys of his decision to run away and join the Navy as a young man. At the beginning of his career, he famously tripped with Ken Kesey and the Merry Pranksters. His second novel was made into a movie starring Nick Nolte. He once joked to an interviewer that he always ended up running into the much more famous author Paul Auster at parties. He was among the literati in the era when being seen around town was a part-time job, long before Entertainment Tonight and ages before social media made celebrity a full-time out-of-body experience.

“You were part of that world,” the interviewer Christopher Bollen said to him in 2013, talking about the druggy counterculture era. “But you have a rare career in that you moved beyond it. How?” Stone’s response was characteristic of the man: pointed, honest, and unglamorous. “I really, really wanted to write,” he said. He knew that his reach was beyond his grasp as a young writer. “I wanted to be a goof on the bus, but I wanted to write more.” So he went to work.

If you consult with the sages at Encyclopedia.com, this is how all that effort worked out for him: “Robert Anthony Stone (born 1937) was an American novelist whose preoccupations were politics, the media, and the random, senseless violence and cruelty that pervade contemporary life both in the United States and in parts of the world where the United States’ influence has extended, such as Latin America and Vietnam. His vision of the world is dark but powerful.”

Well, yes; but also, no.

The novelist Madison Smartt Bell, in an encomium in The New Yorker after Stone died in 2015, claimed that all Stone novels include the character of “a man whose idealism has been blunted by experience.” Certainly, this is true of Stone’s best books, by which I mean (in order): Dog Soldiers, A Flag for Sunrise, and Damascus Gate. In that same 2015 essay, Bell observes that for all the protagonists of these books, the main narrative is of a redemption that must be earned. Nothing is handed to them. “Stone and his characters struggle with all received ideas at a very high level of intellectual honesty.”

In interviews and essays, Stone never denied that he wrote stories he hoped would capture the fancy of readers. He was not writing for his private muse. Nor was he a David Foster Wallace, tortured by inner Furies, pouring his thoughts onto the page in a losing bid for freedom. You can still watch Stone speak in numerous video interviews on YouTube; he smiles often, wears professorial jackets and ties, and lounges at tables beside a fire. Stone wrote big, rollicking stories like Conrad, Melville, and Dickens because those were the kind of stories that he loved and were large enough to suit his themes. He was a writer who lived in the world and wrote stories full of living.

On the word-by-word level, his work has the jostle and sting of real life; as a writer he inhabited the people in the stories in order to tell their tales. Speaking of A Flag for Sunrise with Kay Bonetti in 1982, he expresses the surprise he felt when two of his characters broke out into a dramatic quarrel at one point. “The day I started writing that piece I didn’t realize that was going to happen,” he said. “It just developed as I wrote the dialogue and imagined myself into the situation.”

For all his timeliness of story and milieu, however, you cannot approach a Robert Stone novel at high speed. He published four books after 1998; all of them have strengths, but none of them feels quite of our time. I suspect this is why his popularity began to wane after the publication of Damascus Gate. You either slow down and let the chemicals of his words do their thing, or you might as well fly on by.

2.
Stone’s best claim to literary fame is the 1975 National Book Award, when the selection committee picked his second novel, Dog Soldiers, as its fiction prizewinner. Stone’s description of his academic experience at Stanford a few years earlier could just as well describe this deeply paranoid masterpiece: “I spent a lot of my time, when I should have been writing, experiencing death and transfiguration and rebirth on LSD in Palo Alto.”

What were the National Book Award judges thinking when they chose to award the prize to this novel, a druggy, rough tale of a playwright-turned-journalist who loses his shit in Saigon and manages to ravage his entire life before the last page? Cast in granite prose, oracular in the best and worst ways, full of scenes that show but confide less than a gruff Midwestern boyfriend, Dog Soldiers has a thrilling plot, but I’m not sure I could tell you what happens in it, even on a close reread. Did the judges find in the book a reflection, darkly, of the chaotic post-Nixonian world in which they lived? Certainly, this is the easy go-to explanation for the adjunct profs who include it on reading lists and the marketing copywriters who prepared promo material for the latest reissue in 2018. It’s a book about hippies! ’Nam! Failed authority! LSD! Well, yes; and no. Dig a little deeper, and, as with Stone’s fiction, a complicated, interconnected counter story begins to take shape.

Fact: the year before Dog Soldiers won the National Book Award, the award was discontinued, briefly. So perfectly Stone. In 1974, the prize jury chose Gravity’s Rainbow by Thomas Pynchon, a writer so writerly that he refused to give interviews. This was apparently the last straw for the publishers who underwrote the prize. They cut their funding. Completely. National Book Award organizers refused to give up, though. They assembled a temporary committee to give their award one more time. They begged the likes of Exxon and Jackie Kennedy Onassis to donate enough moola to keep the lights on. It was one more sign of the times in an age when no institutions seemed like they were going to last. Exactly the kind of world that Dog Soldiers paints in miniature. A perfect choice. Almost as if it were the work of fate. Fate of the kind that flickers in the flames of Stone’s best work. Fate that you can laugh at and say you don’t believe in, but that still has a chance of being true. Robert Stone had to win the prize that year. Because we all needed an author preoccupied by outsiders to be granted the status of a literary insider—so he could go on writing, thinking, and teaching all of us for the next four decades.

3.
The first time I tried to read Robert Stone, I couldn’t stand his prose style. I was 22. Stone’s second masterpiece, A Flag for Sunrise, was on a grad school syllabus that also included the likes of Clockers, Under Western Eyes, and Gentlemen Prefer Blondes. The sine qua non for inclusion was that each book operated in a genre but also rose above its intentions, a concept that my thesis advisor and mentor, David Plante, also inculcated into our thinking in weekly creative writing workshops. He never said as much, but I believe that what Plante was trying to teach us was that if we were to become decent writers, which is to say writers worth our salt, we had to be contextual readers.

My first reading of Stone was troubled by the fact that I was addicted to Cormac McCarthy at the time. For my money, McCarthy is perhaps the only other major male author of the late 20th century who writes convincingly in the cut of the moment. McCarthy and Stone were born within four years of each other. They both had a stint in the military. Both wrote painfully slowly, labored over their craft, and had very little commercial success at first. But in form and vision, they are opposing calculations on either side of an equals sign.

I read perhaps two pages of A Flag for Sunrise before putting the book back down again. The pace felt too slow. The sentences were sharp but stilted. Characters kept starting and stopping and staring. There was a nun with a man’s name. A lieutenant who was clearly also a drunk. Familiar and unfamiliar all at once. I attended the seminar session on the book without having read A Flag for Sunrise at all. How very Robert Stone of me. I had high hopes for the novel; I was myself trying to write a book that I envisioned as a literary novel with a great plot. That perfect fusion of high and low culture. But I was too eager, too hurried in my work, too starry-eyed with the idea of being done.

Cormac McCarthy novels reward you on a page-by-page basis, or at least they do if his stiff prosaic mescal is your kind of thing. A Stone novel takes longer to get going, and even longer to alter your insides. A Cormac cocktail hits you before the ice cubes melt; the work of Robert Stone will only be clear the next morning, when you realize that you blacked out hours before you got home.

I returned to A Flag for Sunrise a decade later. I revisited Stone in part because I had run out of Graham Greene novels worth my time. After my Cormac McCarthy phase ended, I suffered a long bout of Greene fever. God, how I adored Greene’s books. I still have flash burns on my heart from the pages of The Quiet American, The Heart of the Matter, and The Power and the Glory. To learn more about Greene as a writer, I had even gone so far as to read Greene on Capri, the memoir by Shirley Hazzard (herself a great writer, criminally overlooked both before and after she died).

The rediscovery of Stone was a relief, and a blessing, but not because he was aping Greene. There are plenty of lesser writers who do just that. No, Stone was a find because he added to what Greene was doing. His work possesses the urgency of Greene—the sense of people battling against the dark authorities of this world—but also something else, something that took me many novels and many hours of consideration to realize was lacking in Greene’s novels: a love of living.

Stone was often asked by interviewers for his thoughts on Graham Greene. He was never ambiguous: “He is not a favorite of mine,” he told Charles Ruas in a 1981 interview. He speculated that his antipathy was due to being compelled by nuns to read Greene and Waugh as a schoolboy. Stone was still clinging to that story when he spoke in 1982 to the Missouri Review. But by the end of his life, during his 2013 interview with Christopher Bollen, he no longer felt the need to tiptoe around his deeper feelings. “I always knew I hated Graham Greene,” he said, “even though I thought he was a really good writer.”

Stone’s antipathy, I think, was not professional so much as personal. Graham Greene, for all his talent as a writer, was not a good man. Just about 10 years after Greene died, the Daily Mail wrote a long, dark hit piece on him. The article is a slog through a great writer’s sins. A photo of Greene in late age is captioned as follows: “A man without honour: Graham Greene was an alcoholic who abandoned his wife and two children for affairs with a series of married mistresses.” I learned from the article that late in life, Greene tried to start a brothel on a Portuguese island. And that he shared a house in Italy with an avowed pederast. Asked about his estranged children, Greene is reported as saying: “I think my books are my children.” Graham Greene was the kind of person that no one would want to be constantly compared with. Not if you really cared about the company that you are perceived to keep. And certainly not if you were someone as humanistic, thoughtful, and apparently kind-hearted as Robert Stone.

Perhaps the important difference between Stone and Greene is that while Stone “really, really” wanted to be a writer, he wanted equally to be a good person. I don’t mean in the personal sense, although that seems to have mattered to Stone, too. I mean in the sense of saying things that help guide his readers to a better understanding and appreciation of the world. In the very first words of a taped interview with the Writer’s Institute in 1996, Stone says that people need stories in the same way that the waking mind needs dreams. We put together narratives in order to make sense of life. The punch line of a joke, he goes on to say, is actually a forced recognition of how things are. There is no natural narrative of things; it’s all just out there. “It is up to human will and human ingenuity to compose all this into a narrative.”

4.
If at this point you have in your mind the image of Robert Stone as a neo-Papa standing at his desk and writing out novels in longhand, then you are mistaken. Nor was he a Melvillian scrivener hunched over a desk for hours to write in a slanted longhand better suited for logging barrels of salt pork. The galloping narrative of his books will put you in mind of Stendhal; the moral weight of his vision is on a level with Dostoevsky’s. But those two novelists dictated their work to stenographers. None of this applied to Stone. He was a typing man. He joined the Navy as a radio operator, as he reports in his memoir. Later, as he told Bollen in 2013, he learned to type by taking Morse code. “I was using the typewriter from day one,” he said.

Not an Olivetti, either. A Paris Review feature from 1985 describes Stone as working in a cluttered attic at a table just large enough to hold his word processor. That’s right, a fully modern word processor. Unlike with Cormac McCarthy, the image of the artist is not meant to be confused with the images in the work. Stone was neither ascetic nor saint. He was just a writer, a big-hearted one.

A picture of Stone in the mid-2010s in the Paris Review shows no fewer than two computer screens. One of them, a laptop with his reading glasses resting on the keys, is—I regret to inform the steampunks among you—almost certainly a MacBook. He was not a simple throwback. Or a caricature. His wife was a waitress when he met her. He remained married to her for his entire adult life. They lived in a simple house in Connecticut on the shore, and their two children, a boy and a girl, grew up and moved out and started their own lives as children everywhere are wont to do.

His work reads as if it were composed to the tune of clanging blacksmiths and left to cool under the stars somewhere far from land. This is the conundrum of good writing. It can take you anywhere. But in Stone’s case, the words you read were almost certainly crafted in a quiet place, by one person, typing in solitude, hopeful of the value of the time spent, but equally certain that it may never mean anything much to anyone. This is the gamble.

Stone suffered to bring the right words forth in the years before acclaim and even afterward. He worked on his first novel, A Hall of Mirrors, for six years. In the era of hot takes and from-the-hip tweets, six years is an eternity. But it is what it takes when you are groping in the dark as a writer.

I find the story of how long Stone labored on his initial book to be both inspirational and validating. I spent six years working on my first novel. It was my thesis while at Columbia. Stone labored over his work while a fellow at Stanford. He had to keep working on the manuscript after he graduated, as did I. He struggled to work and write at the same time. As he told the Paris Review about that first book: “I’d work for twenty weeks and then be on unemployment for twenty weeks and so on. So it took me a long, long time to finish it.” This is the writing life. I am writing the first draft of this essay while I sit on a wooden bench in a coffee shop in Harlem where ironically they are playing Creedence Clearwater Revival’s 1971 single, “Have You Ever Seen the Rain?” I have not been employed full time for months, except for a few consulting gigs. I have been desperately writing this whole time, concocting and executing the first draft of a new novel and rounding out essays like this one that have been ricocheting around in the steel drum of my mind for ages. You find a way to get the work done whenever and however you can, almost as a sidebar to real life. And yet it’s the part of life that you most want to talk about with an interviewer from a literary magazine. Stone anticipates everything that I feel as a writer. There is this long exchange, from that same Paris Review interview, which might as well be a diary entry from my own life, except in my case the book that I’m jazzed over is called Likeness, and it won’t win a National Book Award, because I’m no Robert Stone, but the feelings are all the same:
INTERVIEWER
Is writing easy for you? Does it flow smoothly?
STONE
It’s goddamn hard. Nobody really cares whether you do it or not. You have to make yourself do it. I’m very lazy and I suffer as a result. Of course, when it’s going well there’s nothing in the world like it. But it’s also very lonely. If you do something you’re really pleased with, you’re in the crazy position of being exhilarated all by yourself. I remember finishing one section of Dog Soldiers—the end of Hicks’s walk—in the basement of a college library, working at night, while the rest of the place was closed down, and I staggered out in tears, talking to myself, and ran into a security guard. It’s hard to come down from a high in your work—it’s one of the reasons writers drink.
Stone never figured out how to write quickly. He kept his standards on the top shelf. He spoke about this in one of his last interviews, with Tin House. The editor asks him about the plot of his final novel, Death of the Black-Haired Girl. (A novel that, I must confess, I could not finish.) He insists that he does not have a plan for his work; that it just unfolds as he discovers it. “So you are not,” the interviewer asks, “in the Nabokov camp of treating your characters like ‘galley slaves’?” I can almost hear Robert Stone chuckling in response. “Well,” he said, “I don’t treat them very well. But, no.”

In an interview with Kay Bonetti, in 1982, she said: “Some critics feel you lost control of the structure in both A Hall of Mirrors [his first novel] and Dog Soldiers.” Stone’s response: “Yes. I guess I lost control.” And then he adds, importantly, and perfectly in tune with his Zen persona: “I’m pretty satisfied with the way they turned out.” Later in the same interview, he elaborates: “I see a great deal of human life limited, poisoned, frustrated, by fear and ignorance and the violence that comes from it…I think some of the people I write about are trying to get above that and get around it somehow.”

How a Stone novel ends is perhaps more important than any other fact about it. The ending is where at long last the slowly moving lines converge. The end is the closest we will ever get to the direct sunlight of his ideas. I remember distinctly where I read the ending of Damascus Gate. I was seated on a subway car headed to the Upper West Side apartment where I lived with my wife and daughter. I was an established adult by then, full-time job, mortgage, a little girl who called me papa. All that fell away as I read the book. The only world I knew as I hurtled under the streets of New York was the world of the catacombs under Jerusalem as reported to me by Stone. The characters are lost, confused, and the predators and prey are all mixed up. As a reader, I recall my heart pounding as I turned the pages. But truth be told, I also remember being confused. Like, seriously confused. As a character in a Stone novel might say: What the actual fuck is going on?

All of Stone’s work is about the confusing fate that lies in wait behind the world of likely events. The startling break. The upsetting loss, when all the odds were in your favor. Being confused, overwrought, out of luck, or nearly so—all of Stone’s characters arrive at this moment. And then they get up and push onward. You may or may not like his heroes. But you have to admire their will to live. There are moments in his work that anticipate the modern anti-heroes of Breaking Bad or True Detective. I cannot be the only person who saw a dark reflection in the ending of True Detective season two, when Frank Semyon bleeds out in the desert, and the ending of Dog Soldiers, when the mortally wounded Hicks walks as far as he can along a railroad track. Both men are deeply flawed and filled with hallucinations. Both men are dead long before they realize it.

Arguably, it is in film and television where you can locate Stone’s true heirs. Plenty of male novelists try to mug their way through tough-guy first novels à la Stone, but in so doing they confuse him with the likes of Hemingway, Mailer, and Roth. There’s no strut to his prose; there’s nothing self-aggrandizing in Stone’s work, nor did there seem to be in the man. If anything, his work is about the extinguishment of the self in a Buddhist sense. “You’ll never find Robert Stone in a Robert Stone book,” Wallace Stegner is famously said to have remarked after reading Dog Soldiers.

5.
Five years have passed since Stone’s death. Other than a brief burst of appreciation after his passing, in the form of admiring words from peers and former students alike, at this point his floating pyre has drifted out to sea. I suspect that the rolling tide of literary canonization will not bring him back to shore. His vision is too intentionally arch; his prose style far too mandarin. This saddens me, but I do not think that it would sadden Stone; certainly the man that I met once, very briefly, had no other expectation for what would happen in the world that went on without him.

I heard Stone speak and read from his memoir in December of 2009, on the Sunday evening when he appeared on a double bill for a book promo event at KGB Bar. There was a snowstorm coming, according to the weather reports, and a wet snow had started. I suspect this depressed turnout a little. But it also made the room feel brighter and warmer.

Stone arrived shortly after I did, entering alongside a taller, younger man. Later, I would learn this was Madison Smartt Bell. Bell was an accomplished novelist in his own right, and he had a book of his own to promote; but there was in his posture, his gesture, his way of introducing Stone to all of us, a clear deference for the literary lion in our midst. For his part, Stone had no airs. He had, I would learn later, visited the bar numerous times for readings. In an Identity Theory interview in 2003, he said to the interviewer: “I was in KGB last night and I think it’s very vital, even more vital than it used to be.” He seemed at ease when he stood at the lectern, adjusting the rickety lamp to illuminate the pages he carried. He wore reading glasses on the end of his nose. His eyes smiled when he glanced up.

He read a section from his memoir and the entire text of a short story. The story, “Honeymoon,” would later appear in his second story collection, Fun with Problems. At the story’s end, the main character swims to his death while scuba diving, plunging into the “uncolored world of fifteen fathoms. The weight of the air took him down the darkening wall.”

Afterward, the room was still with the quiet trance of a heady draught. Stone took a place at the wall near a corner to sign books. I hadn’t realized there would be a signing. Like a fool, I had not thought to bring any books. Clearly, others had a better sense of what to expect. One young man had brought a handled paper bag full of Robert Stone hardcovers. Stone, with a chortle, signed each one.

Someone from Houghton Mifflin was selling paperback copies of Prime Green. I bought one and then apologized when I slid it under Stone’s nose. I loved Dog Soldiers, I told him. I should have brought a copy with me. Indeed, I had read the book just two months earlier in huge gulping doses. He nodded. Your work is inspirational, I said. I had spent much of my time in line figuring out what to say, what wouldn’t be too fawning but would still convey the proper reverence.

“You’re a writer?” he asked.

“Yes,” I said, although I felt sheepish making this claim.

“Are you working on something? Is it going well?”

“I’m–I’m trying,” I said.

He nodded. He understood. He had seen thousands of versions of me before. I suspect he saw me as a lost child, one so alone as to not know how to ask for help. I was, at this time, twice divorced from literary agents, unpublished after a decade of trying, with not even a short story publication to my name. He told me, simply, to keep at it. That the writing is its own reward. The kind of wisdom that, to a young man, seems like resignation, but that to a man at middle age sounds a lot like fortitude and patience. He asked my name, double checked how to spell it, and then wrote his name and a quick line of encouragement on the title page and handed the book back to me.

After that I went down the bar’s long creaky stairs and out into the wet snowy night and back into the uncertainty of a writing life still largely unlived. I have been thinking about our short exchange ever since.

In a conversation with Robert Birnbaum after the release of Damascus Gate, Stone spoke about the epigraph in the book: “Losing it is as good as having it.” This is a line devoid of poetry and hardly worth an epigraph, unless you’ve bought into the long arrow path that passes through Stone’s oeuvre. As Stone explained to his interviewer, the quote wiped him out when he first heard it. “That which we have,” he said, “we invariably lose. And at the same time, it can’t be taken away.”

Stone was trying to say this same thing a decade earlier at the end of A Flag for Sunrise, when Holliwell is being rescued by a father and son who don’t speak English and seem hesitant to take the bloodied protagonist into their boat: “A man has nothing to fear, he thought to himself, who understands history.”

Losing everything, Stone tells us, is far better than never having anything at all. That full ironic detachment is a lesson that still resonates in our post-Cold War, post-American, pandemic-rankled world—with the empire teetering, so many of our heroes in retreat, and the very idea of grand masters in question, when the notion of a canon is more punch line than party line. Who can be a master? Who can speak for us all? Who is worthy? No one, obviously. But there are some voices that offer more to a listener than others. Stone’s is one of them.

Image Credit: Publishers Weekly.

Gravity and Grace and the Virus

-

In “Theses on the Philosophy of History,” Walter Benjamin famously writes of history as one long catastrophe, an atmosphere we continue to breathe in. “The tradition of the oppressed teaches us that the ‘state of emergency’ in which we live is not the exception but the rule…The current amazement that the things we are experiencing are ‘still’ possible in the twentieth century is not philosophical.” In the most often-cited thesis, Benjamin offers an image of catastrophe as physical and devastating, a continuous process of ruin blowing against a body. “This is how one pictures the angel of history… Where we perceive a chain of events, he sees one single catastrophe which keeps piling wreckage upon wreckage and hurls it in front of his feet.” Our happiness, Benjamin muses, “exists only in the air we have breathed, among people we could have talked to.” The idea, needless to say, has gained new relevance in the age of aerosols and droplets, of mass death and the fears of proximity. The air we breathe in has never seemed so central to our health and happiness. Benjamin and his wife survived the 1918 influenza epidemic, though I don’t know how much was known about transmission in that time.

In the early months of the pandemic, I didn’t know yet to be worried about aerosols. Like the rest of the country, I was still in a deep antagonism with surfaces, wondering whether I could infect myself with coronavirus if I touched a doorknob and then my pillow. But it was clear that something was crumbling, that it would not be solved easily, and (so?) I found myself deep in the work of Jewish philosophers writing during the Holocaust.

It seemed perverse to want to read about historical devastation when there was so much right around me. Why pick now to develop an obsession with the Holocaust? Yet it was strangely comforting to read things that had been written in a time of crisis—our inherited crisis, it always seemed growing up, despite the fact that my own family’s link was indirect and generations removed—and yet one that was not this crisis, this unbearable time. Written under the sign of catastrophe, the works I turned to included Gershom Scholem’s wonderful biography Walter Benjamin: The Story of a Friendship; Benjamin’s “Theses on the Philosophy of History”; and Simone Weil’s Gravity and Grace, all of which carried with them a newly familiar awareness of devastation and breakdown. It was the same old devastation I’d known as a Jewish child growing up in prosperous America, the endless and yet definitely historical loss that unfolded and unfolded, but now newly strange, because it was familiar. As I read about Benjamin’s despair at the increasingly inflated German mark, my best friend texted me from Brooklyn, our hometown, to ask for flour.

I didn’t read survivors. For some time I’d found it difficult to think or talk about the Holocaust head-on at all. I was struck by an essay-review in Jewish Currents in which Helen Betya Rubinstein writes, of Sheila Heti’s most recent novel, that “Motherhood is a book written around, or about, the Shoah.” Although the novel seemed to conflate a family curse with the Holocaust, Rubinstein noted, it also refused or was unable to make this connection in explicit terms. Could it be that it was no longer possible to write about the Holocaust in this explicit way? Was it necessary to write about it only subterraneously, lightly, glancingly—as though catching something off guard? And is this always true of devastation? (The articles have already been written about the strange absence of the 1918 flu in literature of the time and after.) Rubinstein fears “that the body of literature about the Shoah has too much saturated the culture, and is too full of errors and missteps, to withstand another, divergent attempt;…that it’s impossible to refer to that history without carrying the weight of all the ways the story’s already been told, including the ways it’s been misrepresented, manipulated, and abused.”

I felt these things too. What I hated in Holocaust literature and film was the American heroism so often implicit in them, the way you needed all those bodies to make an audience feel something. I thought it might help to read philosophy and biography, not memoir or fiction, and so to look sideways without looking away. Benjamin, Scholem, and Weil took their own unusual routes, straight through crisis.

Weil, a secular French Jew who longed to convert to Catholicism, died in a sanatorium, starving herself because, she said, she did not want to eat more than the rations available to the people of France. She was an extreme person whose extremity has engendered imitation and inspiration in the work of Fanny Howe and Chris Kraus, among others, and whose life has often been of greater interest to readers than her difficult work. She didn’t have any recorded romantic or sexual relationships, or children. She held people to sometimes impossibly high standards; she did not like Simone de Beauvoir, one of her few fellow female classmates at the elite university École Normale Supérieure, and pronounced her bourgeois. (In this way of course I also find her charming.) She worked in factories, and taught the children of working people, who often found her bizarre and unrelatable (she was herself bourgeois, in a purely descriptive sense, and they were usually Christians) but lovable. In 1942, she’d gone away to New York, where she was safe, but then came back.

In March, I looked through her signature work, Gravity and Grace, for a useful quote, but it was all about detachment, which seemed implausible if not impossible. I was reading everything out of context. Or more in context than I ever had before? Art, Weil wrote, offers the only consolation we should seek or wish to give: “These [works of art] help us through the mere fact that they exist.” About love: “To love purely is to consent to distance, it is to adore the distance between ourselves and that which we love.” This had never felt truer, as I scanned flight schedules for a plane I wouldn’t take to be with my parents in New York, in the epicenter, an apartment with no room for self-isolation.

“Not to exercise all the power at one’s disposal is to endure the void,” Weil wrote. “This is contrary to all the laws of nature. Grace alone can do it.” At the time, I took this as an explanation of what it felt like to obey lockdown instead of exercising whatever power I might have to protect myself and the people around me—that is, to rush to Brooklyn. But Weil herself was always keen to go to where the action was. She went to Spain to fight in the Spanish Civil War, although, once there, she quickly burned her feet outside of active duty and had to be rescued by her parents.

Next I moved to Benjamin, who killed himself while fleeing from France into Spain. Another strange wartime death. He and his group of fellow Jewish refugees had been turned away at the border and would be sent back to Vichy France and the camps shortly. Like Weil, Benjamin has been memorialized as physically awkward and ungainly, almost slapstick in his tragedy; a female acquaintance, according to Scholem, once said of him that he was “so to speak, incorporeal.” His was a tragic death, it has been said, because the officials did not, in the end, send his group back. Those who say so suggest that he could have lived, or else that his death was what convinced the Spanish authorities of the seriousness of his group’s crisis, that the storm cloud hovering over their heads was real. A member of the group wrote, in a letter to Theodor Adorno, that she had paid Spanish officials in the area for five years for a gravestone for Benjamin, but when Hannah Arendt went to the site only months later, she found no such marking. Later, when visitors began to come to see the grave, a marking materialized. Gershom Scholem writes thoughtfully and compassionately of his dear friend, who always disappointed him with his ardent promises to turn his attention—soon!—more fully to Judaism and the study of Hebrew. “During that year I thought that Benjamin’s turn to an intensive occupation with Judaism was close at hand,” Scholem writes of 1921. It was not.

What did these works offer me? Something about the authors’ ability to live and work in crisis, a personal crisis in some ways, unevenly distributed as all crises are, although in many other ways a global one. Something about their continued commitment to their work, their continued ability to produce great thought—not that I was capable of deep thought in March or expected to be anytime soon. Something about being myself, as I imagine most of us are, the product of crisis and catastrophe, the child (some generations removed) of those who did not die, people who escaped. Or was this self-aggrandizement, a case of American triumphalism? There were Nazis in the streets and I wanted to know why this was so important to me, what it meant to think about disaster. I wanted to know what history is for, especially but not only Jewish history. Could I use it, and if so, how?

“We experience good only by doing it,” Simone Weil wrote, in a completely different world, in a completely different context, that is also and always our world and our context.

Image Credit: Wikimedia Commons.

What the Literature About Contemporary Korean Women’s Lives Illuminates About Our Own

-

There was an infamous flasher who lurked around the school gate. He was a local who’d been showing up at the same time and place for years…On cloudy days, he would appear at the empty lot that directly faced the windows of the all-girls’ classroom eight. Jiyoung was in that class in the eighth grade.
Kim Jiyoung, Born 1982 by Cho Nam-Joo

The recent Jeffrey Toobin “incident” of his masturbatory penile exposure during a work call with colleagues at The New Yorker enraged me. It was welcome news that The New Yorker fired him, though the magazine did not cite a reason—the lack of professionalism during a work call should be obvious. His other employer, CNN, where he serves as head legal analyst, said that following the “Zoom incident,” Toobin “asked for some time off” and that the network had granted it. He will be just fine.

My anger has to do with not just the incident itself but also the subsequent jokey “there-but-for-the-grace-of-God-go-I” responses from the pundit lad-o-sphere, steamrollering over the fact that most women first involuntarily encounter the weaponized penis as children. I was 11 in rural Minnesota when first exposed to a flasher on the street who also threatened me with a broom handle. The response by the male adults in my life to my tears and upset was gales of laughter. The flasher, a drifter, lurked around our small town for days, unbothered by police or other authorities, until he tried flashing a well-built woman who got in a physical altercation with him and he was driven from town. Just a few years later, when I was barely an adolescent, I tried to place an ad in our town newspaper to sell my horse. My parents grabbed the phone out of my hands when they heard me shouting, “I don’t know how long a horse’s penis is, my horse is a mare!”—me not understanding that the man with the trustworthy, inquiring adult voice was not interested in purchasing my horse but was assaulting me via phone. A quick survey of friends suggests my experience is not unique but rather very typical, how this hostile atmosphere begins when girls are still children, continues with everyday misogyny (including impossible standards set by advertising), and proceeds when the weaponized penis enters the workplace. During the years I worked at an investment bank, a call asking a trader about price/earnings ratios would often devolve into the equivalent of horse-penis talk. For example, when pictures of a colleague’s new baby were met with a vulgar observation by our section chief about how big the newborn’s penis was. Or when an announcement of new female analysts featured Playboy centerfolds instead of employee pictures. Unfortunately, I couldn’t hang up. Instead I had to endure this just to do my job—and it was clear I was being docked invisible points for being “overly sensitive.”

In economics, this concept of the negative collateral effects of an action is called “disutility.” The poisoning of the environment and climate change would be a disutility of the energy sector. Which is why disutility is rarely looked at. And when it is, it is relegated to less-than-exciting (to the self-described “alpha” trader), marginalized fields like economic sustainability. It’s something I encountered only because of my interest in global development economics—because it comes up as a lone STEM voice of dissent in discussions of why not just send all our old toxic iPhone trash to undeveloped countries and pay them (minimally) for the inevitable cancer they will get (see also: the Theory of Competitive Advantage).

I was struck, then, by a number of recent novels set in Korea that look at the cost of sexism baldly and directly. In fact, what has been described as the Korean #MeToo movement began as a grenade that went off in the insular Korean literary community: in 2017, the journal Hwanghae Literature ran a feminist issue; in it, a poem, “Monster,” by Choi Young-mi accused “En,” a fictional character, of sexual misconduct. The details of “En” match up directly with Ko Un, arguably Korea’s most famous poet and novelist, oft-cited as Korea’s best hope for the Nobel Prize and whose work appears in most middle school and high school textbooks. Other Korean women quickly corroborated this accusation, suggesting Ko had gone decades unpunished for such conduct, for using his stature to sexually harass young female writers. Ko responded in a nuclear fashion, suing Choi for one billion won ($886,500) for defamation. But now that Pandora’s box has been opened, the spotlight has also turned to the movie and K-pop industries. Even the mayor of Seoul, who received unequivocal praise for his handling of the Covid crisis, became embroiled in his own sexual harassment scandal, and committed suicide after the allegations were made public.

The novel Kim Jiyoung, Born 1982 by Cho Nam-Joo, translated in its elegant minimalism by Jamie Chang, tells the story of a young housewife who seems to be going through a nervous breakdown and extreme depression. But when we see her extremely typical yet psychologically and physically violent coming of age, rendered in a dispassionate narration that makes the horror all the more total, her condition seems less a breakdown than a normal response to being a girl and then a young woman dealing with everyday misogyny. In the case of the flasher in Kim Jiyoung, Born 1982, when the adults brush it off, some bolder girls jump the flasher, tie him up, and bring him to the police station. Jiyoung is a tentative person, and it doesn’t escape her notice that the girls who rise up to defend themselves are then punished by their school. She manages to get into college, gets a boyfriend, but overhears her male friends describing her as “spat-out gum” because her boyfriend broke up with her. She is similarly dismissed when seeking employment. It seems clearer and clearer to her that taking a stand or resisting societal norms doesn’t get the rebels anywhere. She is cognizant of and frustrated by her lack of options, and half decides, half falls into marriage and childbearing, holding out hope the conventional route will get her some modicum of satisfaction. She even gives up her job, which, she realizes only once she’s quit, gave her a level of independence and self-actualization even though she didn’t like the job per se. But she finds out later that even work was sullied: shortly after she’d left, a scandal erupted when it was discovered that the male security team had set up spycams in the bathrooms, uploading images to a porn site and sharing them with male coworkers. Thus, the routes of possible escape or alternatives fade away.

Cho, a television scriptwriter, said in an interview that she wrote the book in two months because her own life, basically, provided the entire backdrop she needed. The book hit a nerve in Korea and became a bestseller—not just in Korea but internationally; it’s been translated into 18 languages.

Han Kang’s The Vegetarian (translated by Deborah Smith) is another bestseller with a wide audience in the English-language world. More atmospheric than plotted, it centers on a Korean housewife who suddenly stops eating meat after having a disturbing dream. Her indifferent and unloving husband tries to adapt, but at a family gathering her refusal to eat meat becomes insulting to her father, who goads her husband and brother into force-feeding her meat. She fights back and ends up institutionalized; her husband, the narrator, is left pondering her putative delusions and the mental instability of a woman who won’t submit to the behavior men want to see from her.

If I Had Your Face by Korean American author Frances Cha follows a group of young women who happen to live in the same building as they navigate lives in Seoul, a glittery place of neon and high-rises, and a place that is widely known as the plastic surgery capital of the world. A place where an estimated one in three women will elect to have a procedure before the age of 30, and where it is impossible to merely take the subway and not see ads promising life transformation—for men and women, but more for women—everywhere. The exquisitely beautiful, cosmetically enhanced Kyuri works at a “room salon,” a fancy place where men pay a premium to consort only with the “prettiest” women. Her roommate is a natural-faced artist dating the rich son of a chaebol family. Down the hall is Ara, a mute hairdresser, whose plain-looking roommate Sujin has a dubious dream to undergo expensive and painful plastic surgery to achieve Kyuri’s looks and work in a similar salon. Sujin’s quest for a beauty she feels will make her happy and financially stable is long and painful. Somewhat more peripherally, the women are aware of the young mother Wonna, on the first floor, who is swimming against not just impossible standards of beauty and sexism but also a cutthroat economy, which was memorably limned in Bong Joon-ho’s award-winning Parasite. These obstacles may make the college dreams she has for her children just an illusion.

Looked at one way, these novels share a depressing commonality of women ground down or driven “crazy” by an inescapable patriarchy where misogyny is not just baked in, but baked into older women’s (i.e., the in-laws’) non-support of the younger. Adding to the collection of fiction, Choi Seung-ja’s newly released poetry collection, Phone Bells Keep Ringing for Me (translated by Won-Chung Kim and Cathy Park Hong), declares poems “short as a shriek,” both witness and battle cry, reminding us that canons full of male authors gloss over the societal structures that have kept women largely silent in literature as well as in politics and culture, via a strict and narrow set of rules about what is “acceptable” behavior, and art, by women. Korean poetry is often marked by the pastoral, and poetry by women comes with expectations that it be lyrical and decorous in subject. Choi, then, explodes that idea. For instance, Korean culture reveres the seasons; autumn is often considered the most attractive season for its vivid colors infused with melancholy because of the nearness of winter. Choi’s “Dog Autumn” begins with:
Dog autumn attacks.
Syphilis autumn.
And death visits
one of twilight’s paralyzed legs.
Spring, the season of renewal, is also considered an attractive, tender season, with flowers like azalea and cherry blossoms representative of its beauty and ephemerality. In Choi’s hands “Spring” is
…of the lonely, unmarried thirty-three-year-old woman…
In the spring, plants and grass bloom,
and even garbage grows fresh.
The trash pile grows bigger in my mouth.
I cannot swallow it or vomit it up.
Readers would be shortchanging themselves to think of these books as a kind of anthropological look at a Confucian society that favors males. These books fit in perfectly with narratively inventive contemporary English-language novels looking at women’s lives. Recent examples that come to mind include the comedic (but ultimately serious) Fleishman Is in Trouble by Taffy Brodesser-Akner, a novel that also uses the wife’s absence to make points about the structures of matrimony and sexism. There is also Helen Phillips’s hard-to-classify The Need, which uses tropes of horror (a great choice) to examine motherhood and female agency.

These works, however, tend not to reference the structures of patriarchy and misogyny as overtly as the Korean novels do, which draw straight lines from societal structures largely outside women’s control and show that even recognizing and resisting them isn’t a clear path to equality (or even equity). In the end, withdrawal, absenting oneself from normal existence in the most disruptive way possible, is one of the few “effective” strategies to draw attention to these women’s stories. Kim Jiyoung, Born 1982 in particular unselfconsciously emphasizes its themes, unafraid of seeming didactic. With mini essays about gender statistics and rates of labor-force participation, embedded with footnotes referencing Pew- and Guttmacher-type statistics, the novel’s narrative inventiveness fuses fact and fiction (not unlike Melville’s digressions on the whaling industry), letting us see Kim Jiyoung’s story within the context of an entire country; for what are individual data points but actual individuals?

In the end, Ko Un’s billion-won suit against the poet Choi Young-mi was dismissed, for, as the Korea Herald reported, Choi’s consistent testimony and that of other witnesses convinced the Seoul Central District Court that she was telling the truth. Choi thanked the judiciary for “bringing justice,” but the truth is that, while fending off Ko’s power-play of a public lawsuit, she was still charged the disutility of the damages she suffered from Ko, to her career, her time, and her peace of mind. Similarly, for every woman who’s had to put up with a hostile workplace—like a man masturbating during a work meeting in front of his female colleagues—this is a kind of tax that, like gender pay disparities, should be reconsidered and compensated. But as Cho Nam-Joo suggests in her novel, with men continuing to hold the reins of power, it’s no surprise that what women do continues to be undervalued. Kim Jiyoung gives it all up to be the best mother, only to be called a “mom-roach” by some young office lads having coffee in the same park where she, buckling with fatigue, is taking her baby out in the stroller, the damages she’s undergone woefully and forever unacknowledged: “Probably because the moment you put a price on something, someone has to pay.”

Letting the Days Go By

-

Back on Feb. 29, the extra day in this year of years, I stayed up to watch Saturday Night Live and saw David Byrne’s remarkable performance of “Once in a Lifetime.” I know the song well, although I usually get the title wrong, referring to it as “letting the days go by” or “same as it ever was,” which, along with “once in a lifetime,” are lyrics from the chorus. My husband Mike has pointed out to me that both of my misremembered titles have a meaning that is opposite to “Once in a Lifetime.” But one of the reasons I love “Once in a Lifetime” is that it is somewhat at odds with itself, not only lyrically, but musically as well—although I didn’t really understand that until I read this article on NPR, watched this video, listened to multiple versions of the song on Spotify, and finally, after years of having it on my to-watch list, saw Jonathan Demme’s concert film Stop Making Sense. Yes, I fell down a “Once in a Lifetime” rabbit hole.

This was in early March, before quarantine had even started. But a certain anxiety had set in. At the playground, parents were standing farther apart during after-school chats, and everyone had stopped sharing snacks. I happened to have two annual check-ups scheduled for the first week of March: my doctor and dentist. My doctor assured me that the new coronavirus was nothing to worry about, while my dentist advised me to come in immediately for a follow-up appointment to fill a cavity. The day after I got my tooth filled, I went into Manhattan for a morning screening of Crip Camp, a documentary that I was planning to review for my blog. I could have asked for a screening link, but I wanted to go into Manhattan. Maybe I knew, in the back of my mind, that it would be the last time. I wore gloves on the bus and subway, and a lightweight scarf over my nose and mouth. I got off the train at Eighth Avenue and walked four long blocks to the Landmark 57, a new theater that I’d never gone to before. I didn’t realize, until I got there, that it was in the silver, triangular building that my son refers to as “the cake slice” whenever we drive up the West Side Highway. I took a photo of it with my phone before I headed home.

The photos that follow the cake slice building are mostly of my kids, stuck inside, in crowded messy rooms, or of my kids, outdoors, in wide-open empty spaces. One of our favorite places to go for our daily Cuomo-approved constitutional was the parking lot of the shipyard, a place that is usually pretty empty, but which had become totally barren of cars, pedestrians, and commuters. Gone were the cruise ships and the ferries and the water taxis. My kids scootered past piles of empty shipping containers and yelled as loud as they wanted. Then they went home and watched Disney+ while Mike and I tried to figure out where we should go. We had been planning a move from Brooklyn to Queens—a future I was so certain of at the beginning of this year that I had begun to research nursery schools in our new, chosen neighborhood. I was concerned about the timing of everything, and I feared I was too late for the school I wanted, and too late with the camp sign-ups, and I was annoyed with my anxieties because I never wanted to be the person who was worried about getting her kids into a particular school or themed day camp.

I first listened to “Once in a Lifetime” when I was a teenager.
I heard the lyrics as a description of mindless acquiescence to the status quo: “You may find yourself in a beautiful house/with a beautiful wife/and you may ask yourself, ‘Well…how did I get here?’” Byrne’s performance of the song in Stop Making Sense seemed to support my interpretation. Halfway through the song he dons a bizarrely oversized suit, making him look like a child in adult’s clothes, or an overstuffed puppet. His dance moves suggest sleepwalking as he stumbles backward and repeatedly hits himself on the forehead with an open palm, as if trying to wake up—or realizing a terrible mistake. To my teenage ears, even the chorus was damning: I thought the phrase “letting the days go by” was a kind of accusation, a way of saying, You’re wasting your life.

I hear the song a little differently now. Now, it strikes me as a description of how we get through life: we let the days go by, riding on the backs of accumulated habits. But sometimes we stop and wonder: how did we get here? I’m having one of those moments now. Maybe you are, too. The pandemic has had a way of slowing everyone down. Also, we moved to the suburbs. So did several of our friends. And so we are literally finding ourselves in strange houses. We’re part of a wave of New Yorkers that you may have read about in the newspaper: people who left the city in the wake of the virus, seeking more space for home offices and classrooms. Mike and I are now renting the first floor of a house in Montclair, New Jersey, that is at least twice as big as our old apartment and a lot cheaper. We’re kicking ourselves that we didn’t move out here years ago—except that we never would have, because all of our friends were in Brooklyn and Mike could walk to work and I didn’t have to drive anywhere. I can’t drive, by the way—I mean, I can, but it’s been 20 years since I’ve gotten behind the wheel. Montclair, fortunately, is very walkable, but the simple fact that I’m going to have to start driving again—that I won’t get everywhere on my own two feet or in the company of strangers on public transportation—is what leaves me really feeling the question: How did I get here?

When Byrne was on SNL in February, he was promoting his limited-run musical, American Utopia, which was a kind of career retrospective for him. You can see it now on HBO; Spike Lee filmed it shortly before Broadway was shut down. I watched it last weekend and that’s what led me down the “Once in a Lifetime” rabbit hole a second time. Byrne has always struck me as a slightly detached performer, but in American Utopia he seems much more emotionally present, or maybe just more expressive. He sings more forcefully now and he dances with more precision. He’s older, too. You hear the strain and age in his voice as he aims for higher notes; maybe you worry a little when he executes a back-bending dance move—as he does frequently for “Once in a Lifetime.” When I saw Byrne on SNL, I thought he had the look of a mad, prophesying preacher, someone who’d come to a new realization of the divine late in life. The mystical imagery of the lyrics jumped out at me: had there always been so much water? Had the color blue always been so important? Above all, had it always been an ecstatic song?

The joyful version of “Once in a Lifetime” was always there. You can hear it in Angelique Kidjo’s cover, which she sings melodically, at a slightly faster tempo, with a horn section and back-up singers. And you can hear it in the instrumentation of the original.
But I don’t think you can hear it in Byrne’s early vocal renditions. When he was young, I don’t think he really knew what he had on his hands. I think he was just following his intuition and trying to make the song work.

My personal theory of “Once in a Lifetime” is that the song has a will of its own, and that it wanted to exist in the world. It came together very slowly, starting with a bass line that the Talking Heads recorded during a jam session inspired by the music of Fela Kuti. Bassist Tina Weymouth gets credit for coming up with the riff, but she claims that her husband, drummer Chris Frantz, yelled it to her during rehearsal as an adjustment to what she had been playing. When the song was ready to be arranged, producer Brian Eno misheard the rhythm of the riff, adding a rest at the beginning of the measure. When he realized his mistake, he decided he liked the odd arrangement, and wrote a call-and-response chorus to go with it. The band thought the chorus sounded like a preacher leading a prayer, which led Byrne to the weird world of televangelists. He found his now-iconic lyrics—“And you may find yourself…”—by imitating their Biblically-tinged cadences. Borrowed beats, borrowed lyrics, misheard bass lines, bad transcriptions: this song about the unconscious way we move through life was made without much conscious thought. And yet it is probably one of the Talking Heads’ most beloved songs, the kind of song people know without even realizing they know it.

A few weeks after we moved to our new apartment, my landlord, who lives above us, filled me in on all the local lore. Apparently, Montclair used to be a weekend destination for Broadway performers. She tells me that the house down the street from us hosted marvelous parties, and that Marlene Dietrich was a frequent guest. How my landlord knows this, I don’t know, but I was able to fact-check the next bit of ancient gossip she shared, which was that the house across the street was once occupied by a musician and composer named Herman Hupfeld. You’ve probably never heard of him, but in 1931, he wrote “As Time Goes By”—one of those songs you know without knowing, and may find yourself humming every once in a while.

Image Credit: Wikimedia Commons.

Who’s Afraid of Theory?

-

In a pique of indignation, the editors of the journal Philosophy and Literature ran a “Bad Writing Contest” from 1995 to 1998 to highlight jargony excess among the professoriate. Inaugurated during the seventh inning of the Theory Wars, the contest placed Philosophy and Literature firmly amongst the classicists, despairing at the influence of various critical “isms.” For the final year that the contest ran, the “winner” was Judith Butler, then a Berkeley philosophy professor and author of the classic work Gender Trouble: Feminism and the Subversion of Identity. The selection which caused such tsuris was from the journal Diacritics, a labyrinthine sentence where Butler opines that the “move from a structuralist account in which capital is understood to structure social relations in relatively homologous ways to a view of hegemony in which power relations are subject to repetition, convergence, and rearticulation brought the question of temporality into the thinking of structure,” and so on. If the editors’ purpose was to mock Latinate diction, then the “Bad Writing Contest” successfully made Butler the target of sarcastic opprobrium, with editorial pages using the incident as another volley against “fashionable nonsense” (as Alan Sokal and Jean Bricmont called it) supposedly reigning ascendant from Berkeley to Cambridge.

The Theory Wars, that is, the administrative argument over what role various strains of 20th-century continental European thought should play in the research and teaching of the humanities, have never exactly gone away, even while departments shutter and university work is farmed out to poorly-paid contingent faculty. Today you’re just as likely to see aspersions on the use of critical theory appear in fevered, paranoid Internet threads warning about “Cultural Marxism” as you are on the op-ed pages of the Wall Street Journal, even while at many schools literature requirements are being cut, so as to make the whole debate feel more like a Civil War reenactment than the Battle of Gettysburg. In another sense, however, Butler’s partisans seem to have very much won the argument from the ‘80s and ‘90s—as sociologically inflected Theory-terms from “intersectionality” to “privilege” have migrated from Diacritics to Twitter (though often as critical malapropism)—ensuring that this war of attrition isn’t headed to armistice anytime soon.

So, what exactly is “Theory?” For scientists, a “theory” is a model based on empirical observation that is used to make predictions about natural phenomena; for the layperson a “theory” is a type of educated guess or hypothesis. For practitioners of “critical theory,” the phrase means something a bit different. A critical theorist is concerned with interpretation, engaging with culture (from epic poems to comic books) to explain how their social context allows or precludes certain readings, beyond whatever aesthetic affinity the individual may feel. Journalist Stuart Jeffries explains the history (or “genealogy,” as they might say) of one strain of critical theory in his excellent Grand Hotel Abyss: The Lives of the Frankfurt School, describing how a century ago an influential group of German Marxist social scientists, including Theodor Adorno, Max Horkheimer, Walter Benjamin, and Herbert Marcuse, developed a trenchant vocabulary for “what they called the culture industry,” so as to explore “a new relationship between culture and politics.” At the Frankfurt Institute for Social Research, a new critical apparatus was developed for the dizzying complexity of industrial capitalism, and so words like “reify” and “commodity fetish” (as well as that old Hegelian chestnut “dialectical”) became humanistic bywords.

Most of the original members of the Frankfurt School were old-fashioned gentlemen, more at home with Arnold Schoenberg’s 12-tone avant-garde than with Jelly Roll Morton and Bix Beiderbecke, content to read Thomas Mann rather than Action Comics. Several decades later, a different institution, the Centre for Contemporary Cultural Studies at the University of Birmingham in the United Kingdom, would apply critical theory to popular culture. These largely working-class theorists, including Stuart Hall, Paul Gilroy, Dick Hebdige, and Angela McRobbie (with a strong influence from Raymond Williams), would use a similar vocabulary as that developed by the Frankfurt School, but they’d extend the focus of their studies into considerations of comics and punk music, slasher movies and paperback novels, while also bringing issues of race and gender to bear in their writings.

In rejecting the elitism of their predecessors, the Birmingham School democratized critical theory, so that the Slate essay on whiteness in Breaking Bad or the Salon hot take about gender in Game of Thrones can be traced on a direct line back through Birmingham. What these scholars shared with Frankfurt, alongside a largely Marxian sensibility, was a sense that “culture was an important category because it helps us to recognize that one life-practice (like reading) cannot be torn out of a large network constituted by many other life-practices—working, sexual orientation, [or] family life,” as elucidated by Simon During in his introduction to The Cultural Studies Reader. For thinkers like Hall, McRobbie, or Gilroy, placing works within this social context wasn’t necessarily a disparagement, but rather the development of a language commensurate with explaining how those works operate. With this understanding, saying that critical theory disenchants literature would be a bit like saying that astronomical calculations make it impossible to see the beauty in the stars.

A third strain influenced “Theory” as it developed in American universities towards the end of the 20th century, and it’s probably the one most stereotypically associated with pretension and obfuscation. From a different set of intellectual sources, French post-structural and deconstructionist thought developed in the ‘60s and ‘70s at roughly the same time as the Birmingham School. Sometimes broadly categorized as “postmodernist” thinkers, French theory included writers of varying hermeticism like Jacques Derrida, Michel Foucault, Gilles Deleuze, Jean-François Lyotard, Jacques Lacan, and Jean Baudrillard, who supplied English departments with a Gallic air composed of equal parts black leather and Gauloises smoke. François Cusset provides a helpful primer in French Theory: How Foucault, Derrida, Deleuze, & Co. Transformed the Intellectual Life of the United States, the best single-volume introduction on the subject. He writes that these “Ten or twelve more or less contemporaneous writers,” despite their not inconsiderable differences, are united by a “critique of the subject, of representation, and of historical continuity,” their focus being the “critique of ‘critique’ itself, since all of them interrogate in their own way” the very idea of tradition. French theory was the purview of Derridean deconstruction, or of Foucauldian analysis of social power structures, the better to reveal the clenched fist hidden within a velvet glove (and every fist is clenched). For traditionalists the Frankfurt School’s Marxism (arguably never all that Marxist) was bad enough; with French theory there was a strong suspicion of at best relativism, at worst outright nihilism.

Theory has an influence simultaneously more and less enduring than is sometimes assumed. Its critics in the ‘80s and ‘90s warned that it signaled the dissolution of the Western canon, yet I can assure you from experience that undergraduates never stopped reading Shakespeare, even if a chapter from Foucault’s Discipline and Punish might have made it onto the syllabus (and it bears repeating that, contra his reputation for difficulty, Foucault was a hell of a prose stylist). But if current online imbroglios are any indication, its influence has been wide and unexpected, for as colleges pivot towards a business-centered STEM curriculum, the old fights about critical theory have simply migrated online. Much of the criticism against theory in the first iteration of this dispute was about what such thinkers supposedly said (or what people thought they were saying), but maybe even more vociferous were the claims about how they were saying things. The indictment of theory then becomes not just an issue of metaphysics, but one of style. It’s the claim that nobody can argue with a critical theorist because the writing itself is so impenetrable, opaque, and confusing. It’s the argument that if theory reads like anything, it reads like bullshit.

During the height of these curricular debates there was a cottage industry of books that took aim precisely at scholarly rhetoric, not least of which were conservative screeds like Allan Bloom’s The Closing of the American Mind: How Higher Education Has Failed Democracy and Impoverished the Souls of Today’s Students and E.D. Hirsch Jr.’s The Dictionary of Cultural Literacy. Editors Will H. Corral and Daphne Patai claim in the introduction to their pugnacious Theory’s Empire: An Anthology of Dissent that “Far from responding with reasoned argument to their critics, proponents of Theory, in the past few decades, have managed to adopt just about every defect in writing that George Orwell identified in his 1946 essay ‘Politics and the English Language.’” D.G. Myers in his contribution to the collection (succinctly titled “Bad Writing”) excoriates Butler in particular, writing that the selection mocked by Philosophy and Literature was “something more than ‘ugly’ and ‘stylistically awful’… [as] demanded by the contest’s rules. What Butler’s writing actually expresses is simultaneously a contempt for her readers and an absolute dependence on their good opinion.”

Meanwhile, the poet David Lehman parses Theory’s tendency towards ugly rhetorical self-justification in Signs of the Times: Deconstruction and the Fall of Paul de Man, in which he recounts the sordid affair whereby a confidant of Derrida and esteemed Yale professor was revealed to have written Nazi polemics during the German occupation of his native Belgium. Lehman also provides ample denunciation of Theory’s linguistic excess, writing that for the “users of its arcane terminology it confers elite status… Less a coherent system of beliefs than a way of thinking.” By 1996, even Duke University English professor Frank Lentricchia (in a notoriously Theory-friendly department) would snark in his Lingua Franca essay “Last Will and Testament of an Ex-Literary Critic” (reprinted in Quick Studies: The Best of Lingua Franca), “Tell me your theory and I’ll tell you in advance what you’ll say about any work of literature, especially those you haven’t read.”

No incident illustrated more for the public the apparent vapidity of Theory than the so-called “Sokal Affair” in 1996, when New York University physics professor Alan Sokal wrote a completely meaningless paper composed in a sarcastic pantomime of critical theory-speak entitled “Transgressing the Boundaries: Toward a Transformative Hermeneutics of Quantum Gravity,” which was accepted for publication in the prestigious (Duke-based) journal Social Text, with his hoax simultaneously revealed by Lingua Franca. Sokal’s paper contains exquisite nonsense such as the claim that “postmodern sciences overthrow the static ontological categories and hierarchies characteristic of modernist science” and that “these homologous features arise in numerous seemingly disparate areas of science, from quantum gravity to chaos theory… In this way, the postmodern sciences appear to be converging on a new epistemological paradigm.” Sokal’s case against Theory is also, fundamentally, about writing. He doesn’t just attack critical theory for what he perceives as its dangerous relativism, but also at the level of composition, writing in Fashionable Nonsense: Postmodern Intellectuals’ Abuse of Science that such discourse, “exemplified by the texts we quote, functions in part as a dead end in which some sectors of the humanities and social sciences have gotten lost.” He brags that “one of us managed, after only three months of study, to master the postmodernist lingo well enough to publish an article in a prestigious journal.” Such has long been the conclusion of many: that Theory is a kind of philosophical Mad Libs disappearing up its own ass, accountable to nobody but itself and the departments that coddle it. Such was the sentiment which inspired the programmers of the Postmodern Essay Generator, which as of 2020 is still algorithmically throwing together random Theory words to create full essays with titles like “Deconstructing Surrealism: Socialism, surrealism and deconstructivist theory” (by P. Hans von Ludwig) and “Social realism and the capitalist paradigm of discourse” (by Agnes O. McElwaine).

Somebody’s thick black glasses would have to be on too tight not to see what’s funny in this, though there’s more than a bit of truth in the defense of Theory that says such denunciations are trite, an instance of anti-intellectualism as much as its opposite. Defenses of Theory in the wake of Sokal’s ruse tended to, not unfairly, query why nobody questions the rarefied and complex language of the sciences but blanches when the humanities have a similarly baroque vocabulary. Status quo objections to that line of thinking tend to emphasize the humanness of the humanities; the logic being that if we’re all able to be moved by literature, we have no need for experts to explain how that work of literature operates (as if being in possession of a heart would make one a cardiologist). Butler, for her part, answered criticism leveled against her prose style in a (well-written and funny!) New York Times editorial, where she argues, following a line of Adorno’s reasoning, that complex prose is integral to critical theory because it helps to make language strange, and forces us to interrogate that which we take for granted. “No doubt, scholars in the humanities should be able to clarify how their work informs and illuminates everyday life,” Butler admits, “Equally, however, such scholars are obliged to question common sense, interrogate its tacit presumptions and provoke new ways of looking at a familiar world.”

To which I heartily agree, but that doesn’t mean that the selection of Butler’s mocked by Philosophy and Literature is any good. It costs me little to admit that the sentence is at best turgid, obtuse, and inelegant, and at worst utterly incomprehensible. It costs me even less to admit that that’s probably because it’s been cherry-picked, stripped of context, and labeled as such so that it maximizes potential negative impressions. One can defend Butler—and Theory—without justifying every bit of rhetorical excess. Because what some critics disparage about Theory—its obscurity, its rarefied difficulty, its multisyllabic technocratic purpleness—is often true. When I arrived in my master’s program, in a department notoriously Theory-friendly, I blanched as much as Allan Bloom would have if invited to be a roadie on the Rolling Stones’ Steel Wheels Tour. For an undergraduate enmeshed in the canon, and still enraptured by that incredibly old-fashioned (but still intoxicating) claim of the Victorian critic Matthew Arnold in Culture and Anarchy that the purpose of education was to experience “the best which has been thought and said,” post-structuralism was a drag. By contrast, many of my colleagues, most of them in fact, loved Theory; they thrilled to its punkish enthusiasms, its irony-laden critiques, its radical suspicion of the best which has been thought and said. Meanwhile I despaired that there were no deconstructionists in Dead Poets Society.

I can no longer imagine that perspective. It’s not quite that I became a “Theory Head,” as one calls all of those sad young men reading Deleuze and Félix Guattari while smoking American Spirit cigarettes, but I did learn to stop worrying and love Theory (in my own way). What I learned is that Theory begins to make sense once you learn the language (whether it takes you three months or longer), and that it’s innately, abundantly, and estimably useful when you have to actually explain how culture operates, not just whether you happen to like a book or not. A poet can write a blazon for her beloved, but an anatomist is needed to perform the autopsy. Some of this maturity came in realizing that literary criticism has always had its own opacity; that if we reject “binary opposition,” we would have to get rid of “dactylic hexameter” as well. The humanities have always invented new words to describe the things of this world that we experience in culture. That’s precisely the practice attacked by John Martin Ellis, who in his jeremiad Against Deconstruction took on Theory’s predilection towards neologism, opining that “there were plenty of quite acceptable ordinary English words for the status of entrenched ideas and for the process of questioning and undermining them.” All of that difference, all of that hegemony, and so much phallogocentrism… But here’s the thing—sometimes heteroglossia by any other name doesn’t smell as sweet.

There is something anachronistic in proffering a defense of Theory in the third decade of the new millennium; something nostalgic or even retrograde. Who cares anymore? Disciplinary debates make little sense as the discipline itself has imploded, and the anemic cultural studies patois of the Internet hardly seems to warrant the same reflection, either in defense or condemnation. In part, though, I’d suggest that it’s precisely the usefulness of these words, and their popularity among those who learned them through cultural osmosis and not through instruction, that necessitates a few statements in their exoneration. All of the previous arguments on their behalf—that the humanities require their own jargon, that this vocabulary provides an analytical nuance that the vernacular doesn’t—strike me as convincing. And the criticism that an elite coterie uses words like “hegemonic” as a shibboleth is also valid, but that’s not an argument to abandon the words—it’s an argument to instruct more people on what they mean.

But I’d like to offer a different claim to utility, and that’s that Theory isn’t just useful, but that it’s beautiful. Reading the best of Theory is more like reading poetry than philosophy, and all of those chewy multisyllabic words can be like honey in the mouth. Any student of linguistics or philology—from well before Theory—understands that synonyms are mythic and that an individual word has a connotative life that is rich and unique. Butler defends the Latinate, writing that for a student “words such as ‘hegemony’ appears strange,” but that they may discover that beyond its simpler meaning “it denotes a dominance so entrenched that we take it for granted, and even appear to consent to it—a power that’s strengthened by its invisibility.” Not only that, I’d add that “hegemony,” with its angular consonants hidden like a sharp rock in the middle of a snowball, conveys a sense of power beyond either brute strength or material plenty. Hegemony has something of the mysterious about it, the totalizing, the absolute, the wickedly divine. To simply replace it with the word “power” is to drain it of its impact. I’ve found this with many of those words: they’re like occult tone poems conveying a hidden and strange knowledge; they’re able to give texture to a picture that would otherwise be flat. Any true defense of Theory must, I contend, give due deference to the sharp beauty that these sometimes-hermetic words convey.

As a totally unscientific sample, I queried a number of my academic (and recovering academic) colleagues on social media to see what words they would add to a list of favorite terms; the jargon that others might roll their eyes at, or hear as grad school clichés, but that is estimably useful, and dare I say it—beautiful. People’s candidates could be divided in particular ways, including words that remind us of some sort of action, words that draw strength from an implied metaphorical imagery, and words that simply have an aural sense that’s aesthetically pleasing (and these are by no means exhaustive or exclusive). For example, Derrida’s concept of “deconstruction,” a type of methodological meta-analysis that reveals internal contradictions within any text, so as to foreground interpretations that might be hidden, was a popular favorite word. “Deconstruction” sounds like an inherently practical term, a word that contractors rather than literary critics might use; the prefix connotes ripping things down while the rest of the word gestures towards building them (back?) up. A similar word that several responders mentioned, albeit one with less of a tangible feel to it, was “dialectics,” which was popularized in the writings of the 19th-century German philosopher Georg Wilhelm Friedrich Hegel, was mediated through Karl Marx, and was then applied to everything by the Frankfurt School. As with many of these terms, “dialectics” has variable meaning depending on who is using it, but it broadly refers to an almost evolutionary process whereby the internal contradictions of a concept are reconciled, propelling thought into the future. Even in the materialist deployment of the term by Marx and his followers, the actual word has an almost mystical gloss to it, the trochaic rhythm of the word itself with its up-down-up-down beat evoking the process of thesis-antithesis-synthesis to which the term itself applies. Something about the very sound of “dialectic” evokes both cutting and burying to me, the psychic struggle that the word is supposed to describe.

Then there are the words that are fueled with metaphorical urgency, short poems in their own right, often appropriated from other disciplines. Foucault used words like “genealogy” or “archeology” when some might think that “history” would be fine, and yet those words do something subtly different than the plodding narrative implied by the more prosaic word. With the former there is a sense of telling a story that connects ideas, trends, and themes within a causal network of familial relations, while the latter recalls excavation and the revealing of that which remains hidden (or cursed). Deleuze and Guattari borrowed the term “rhizome” from botany, which originally described the complex branching of root systems, now reapplied to how non-hierarchical systems of knowledge propagate. “Rhizome” pays homage to something of beauty from a different way of understanding the world—it is not filching, it is honoring. The Italian Marxist Antonio Gramsci similarly borrowed the term “subaltern,” later popularized by Gayatri Chakravorty Spivak, for whom it came to designate communities of colonized people who are simultaneously exoticized and erased by imperial powers. The word itself was a term used for junior officers in the British colonial service. Finally, I’m partial to “interiority” myself, used to denote fictional representations of consciousness or subjectivity. Yet “interiority,” with its evocation of a deep subterranean network or the domestic spaces of a many-roomed mansion, says something about consciousness that the more common word doesn’t quite.

My favorite critical jargon word, however, is “liminal.” All of us who work on academic Grub Street have our foibles, the go-to scholarly tics marking our prose like an oily fingerprint left on Formica. We all know the professor with their favored jargon turn (often accompanied by an equivalent hand movement, like an intricate form of Neapolitan), or the faculty member who might be given to yelling out “Hegemonic!” at inopportune times. Thus, I can’t help but sprinkle my own favored term into my writing like paprika in Budapest goulash. My love for the word, used to designate things that are in-between, transitioning, and not quite formed, has less to do with its utility than with the mysterious sense of the sounds that animate it. It’s always been oddly onomatopoeic to me, maybe because it’s a near homophone of “illuminate,” and makes me think of dusk, my favorite time of day. When I hear “liminal” it reminds me of moonbeams and cicadas at sunset; it reminds me that the morning star still endures even at dawn. An affection for the term has only a little to do with what’s useful about it, and everything to do with that connotative ladder that stretches out beyond its three syllables. I suspect that when we love these words, this jargon, it’s an attraction to their magic, the uncanny poetry hidden behind the seemingly technocratic. The best of Theory exists within that liminal space, between criticism and poetry; justifying itself by recourse to the former, but always actually on the side of the latter—even if it doesn’t know it.

Image Credit: Wikipedia

On Obscenity and Literature

-

“But implicit in the history of the First Amendment is the rejection of obscenity as utterly without redeeming social importance.” —Associate Justice William J. Brennan Jr., Roth v. United States (1957)

Interviewer: Speaking of blue, you’ve been accused of vulgarity. Mel Brooks: Bullshit! —Playboy (February, 1975)

On a spring evening in 1964 at the Café Au Go Go in Greenwich Village, several undercover officers from the NYPD’s vice squad arrested Lenny Bruce for public obscenity. Both Bruce and the club’s owner Howard Solomon were shouldered out through the crowded club to waiting squad cars, their red and blue lights reflected off of the dirty puddles pooled on the pavement of Bleecker Street. For six months the two men would stand trial, with Bruce’s defense attorney calling on luminaries from James Baldwin to Allen Ginsberg, Norman Mailer to Bob Dylan, to attest to the stand-up’s right to say whatever he wanted in front of a paying audience. “He was a man with an unsettling sense of humor,” write Ronald K.L. Collins and David M. Skover in The Trials of Lenny Bruce: The Fall and Rise of an American Icon. “Uncompromising, uncanny, unforgettable, and unapologetic…His words crossed the law and those in it. He became intolerable to people too powerful to ignore. When it was over, not even the First Amendment saved him.” The three-judge tribunal sentenced Bruce to four months’ punishment in a workhouse. Released on bail, he never served a day of his conviction, overdosing on morphine in his Hollywood Hills bungalow two years later. He wouldn’t receive a posthumous pardon until 2003.

“Perhaps at this point I ought to say a little something about my vocabulary,” Bruce wrote in his (still very funny) How to Talk Dirty and Influence People: An Autobiography. “My conversation, spoken and written, is usually flavored with the jargon of the hipster, the argot of the underworld, and Yiddish.” Alongside jazz, Jewish-American comedy is one of the few uniquely American contributions to world culture, and if that comparison can be drawn further, then Bruce was the equivalent of Dizzy Gillespie or Miles Davis—he was the one who broke it wide open. Moving comedy away from the realm of the Borscht Belt one-liner, Bruce exemplified the emerging paradigm of stand-up as a spoken-word riff of personal reflection and social commentary, while often being incredibly obscene. The Catskills comedian Henny Youngman may have been inescapably Jewish, but Bruce was unabashedly so. And, as he makes clear, his diction proudly drew from the margins, hearing more truth in the dialect of the ethnic Other than in mainstream politeness, more honesty in the junky’s language than in the platitudes of the square, more righteous confrontation in the bohemian’s obscenity than in the pieties of the status quo. Among the comics of that golden age of stand-up, only Richard Pryor was his equal in bravery and genius, and despite the fact that some of his humor is dated today, books like How to Talk Dirty and Influence People still radiate an excitement that a mere burlesque performer could challenge the hypocrisy and puritanism of a state that would just as soon see James Joyce’s Ulysses and D.H. Lawrence’s Lady Chatterley’s Lover banned and their publishers hauled to jail as it would actually confront any of the social ills that infected the body politic.

What separates Bruce from any number of subsequent comics is that within his performances there was a fully articulated theory of language. “Take away the right to say the word ‘fuck’ and you take away the right to say ‘fuck the government,’” he is reported to have said, and this is clearly and crucially true. That’s one model of obscenity’s utility: its power to lower the high and to raise the low, with vulgarity afforded an almost apocalyptic power of resistance. There is a naivety, however, that runs through the comedian’s work, and that’s that Bruce sometimes doesn’t afford language enough power. In one incendiary performance from the early ’60s, Bruce went through a litany of ethnic slurs for Black people, Jews, Italians, Hispanics, Poles, and the Irish, finally arguing that “it’s the suppression of the word that gives it the power, the violence, the viciousness.” He imagines a scenario whereby the president would introduce members of his cabinet by using those particular words, and concludes that following such a moment those slurs wouldn’t “mean anything anymore, then you could never make some six-year-old black kid cry because somebody called him” that word at school. Bruce’s idealism is almost touching—let it not be doubted that he genuinely believed language could work in this way—but it’s also empirically false. Having died a half-century ago, he can’t be faulted for his ignorance on this score, but now that we have a president who basically does what Bruce imagined his hypothetical Commander-in-Chief doing, I think we can emphatically state that the repetition of such ugliness does nothing to dispel its power.

Discussions about obscenity often devolve into this bad-faith dichotomy—the prudish schoolmarms with their red pens painting over anything blue and the brave defenders of free speech pushing the boundaries of acceptable discourse. The former hold that there is a certain power to words that must be tamed, while the latter champion the individual’s right to say what they want to say. When the issue is phrased in such a stark manner, it occludes a more discomforting reality—maybe words are never simply utterances, maybe words can be dangerous, maybe words can enact evil things, and maybe every person has an ultimate freedom to use those words as they see fit (notably a different claim than that people should be able to use them without repercussion). Bruce’s theory of language is respectably semiotic, a contention about the arbitrary relationship between signifier and signified, whereby that chain of connection can be severed by simple repetition, as when sense flees from a word said over and over again, whether it’s “potato” or “xylophone.” But he was ultimately wrong (as is all of structural and post-structural linguistics)—language is never exactly arbitrary, it’s not really semiotic. We need theurgy to explain how words work, because in an ineffable and numinous way, words are magic. When it comes to obscenity in particular, whether the sexual or the scatological, the racial or the blasphemous, we’re considering a very specific form of that magic, and while Bruce is correct that a prohibition on slurs would render resistance to oppression all the more difficult, he’s disingenuous in not also admitting that obscenity can provide a means of cruelty in its own right. If you couldn’t say obscenities then a certain prominent tweeter of almost inconceivable power and authority couldn’t deploy them almost hourly against whatever target he sees fit. This is not an argument for censorship, mind you, but it is a plea to be honest in our accounting.

Obscenity as social resistance doesn’t have the same cachet it once did, nor is it always interpreted as unassailably progressive (as it was for Bruce and his supporters). In our current season of a supposed Jacobin “cancel culture,” words have been ironically re-enchanted with the spark of danger that was once associated with them. Whether or not those who claim that there is some sort of left McCarthyism policing language are correct, it’s relatively anodyne to acknowledge that right now words are endowed with a significance not seen since Bruce appeared in a Manhattan courtroom. Whatever your own stance on the role that offensiveness plays in civilized society, obscenity can only be theorized through multiple perspectives. Four-letter words inhabit a nexus of society, culture, faith, linguistics, and morality (and the law). A “fuck” is never just a “fuck,” and a shit by any other name wouldn’t smell as pungent. Grammatically, obscenities are often classified as “intensifiers,” that is, placeholders that exist to emphasize the emotionality of a given declaration—think of them as oral exclamation marks. Writing in Holy Sh*t: A Brief History of Swearing, Melissa Mohr explains that vulgarity is frequently “important for the connotation it carries and not for its literal meaning.” Such a distinction came into play in 2003 after the Irish singer Bono of U2 was cited by the Federal Communications Commission when, upon winning a Golden Globe, he exclaimed “fucking brilliant.” The FCC’s Enforcement Bureau initially decided that Bono’s f-bomb wasn’t indecent since its use clearly wasn’t in keeping with the sexual definition of the word, a verdict that was later rescinded higher up within the FCC.

“Historically,” Mohr writes, “swearwords have been thought to possess a deeper, more intimate connection to the things they represent than do other words,” and in that regard the pencil-necked nerds at the FCC ironically showed more respect to the dangerous power of fucking than did Bono. If vigor of emotion were all one wanted from language, any number of milquetoast words would work as well as a vulgarity, and yet obscenity (even if uttered due to a stubbed toe) is clearly doing something a bit more transcendent than more PG terms—for both good and bad. Swearing can’t help but have an incantatory aspect to it; we swear oaths, and we’re specifically forbidden by the Decalogue from taking the Lord’s name in vain. Magnus Ljung includes religious themes in his typology of profanity, offered in Swearing: A Cross-Cultural Linguistic Study, as one of “five major themes that recur in the swearing of the majority of the languages discussed and which are in all likelihood also used in most other languages featuring swearing.” Alongside religious profanity, Ljung recognizes themes according to scatology, sex organs, sexual activities, and family insults. To this, inevitably, must also be added ethnic slurs. Profanity is by definition profane, dealing with the bloody, pussy, jizzy reality of what it means to be alive (and thus the lowering of the sacred into that oozy realm is part of what blasphemously shocks). Obscenity has a quality of the theological about it, even while religious profanities have declined in their ability to shock an increasingly secular society.

Today a word like “bloody” sounds archaic or Anglophilic, and almost wholly inoffensive, even while its (now forgotten) reference to Christ’s wounds would have been scandalous to an audience reared on the King James Bible. This was the problem that confronted television writer David Milch, who created the classic HBO western Deadwood. The resultant drama (with dialogue largely composed in iambic pentameter) was noted as having the most per capita profanity of any show to ever air, but in 1870s Dakota most of those swears would have been religious in nature. Since having Al Swearengen (a perfect name if ever there was one) sound like Yosemite Sam would have dulled the shock of his speech, Milch elected to transform his characters’ language into scatological profanity and ethnic slurs, the latter of which still has the ability to upset an audience in a way that “by Christ’s wounds!” simply doesn’t. When Swearengen offers up his own theory of language to A.W. Merrick, who edits Deadwood’s newspaper, arguing that “Just as you owning a print press proves only an interest in the truth, meaning up to a fucking point, slightly more than us others maybe, but short of a fucking anointing or the shouldering of a sacred burden—unless of course the print press was gift of an angel,” he provides a nice synthesis of the blasphemous and the sexual. The majority of the copious swears in Deadwood are of the scatological, sexual, or racial sort, and they hit the eardrum with far more force than denying the divinity of Christ does. When Milch updated the profanity of the 19th century, he knew what would disturb contemporary audiences, and it wasn’t tin-pot sacrilege.

All of which is to say that while obscenity has a social context, with what’s offensive being beholden to the mores of a particular century, the form itself universally involves the transgression of propriety, with the details merely altered to the conventions of a time and place. As an example, watch the 2005 documentary The Aristocrats, co-created by magician Penn Jillette, which features dozens of tellings of the almost unspeakably taboo joke of the same name. The Aristocrats was long an after-hours joke told by comedians who would try to one-up each other in the degree of profanity offered, and the film presents several iconic performers giving variations on the sketch. When I saw the film after it came out, the audience was largely primed for the oftentimes extreme sexual and scatological permutations of the joke, but it was the tellings that involved racial slurs and ethnic stereotypes that stunned the other theatergoers. It’s the pushing of boundaries in and of itself, rather than the subject in question, that designates something as an obscenity. According to Sigmund Freud in his (weirdly funny) The Joke and Its Relation to the Unconscious, vulgar humor serves a potent psychological purpose, allowing people “to enjoy undisguised obscenity” that is normally repressed so as to keep “whole complexes of impulses, together with their derivatives, away from consciousness.” Obscenity thus acts as a civilizational pressure valve for humanity’s chthonic impulses.

That words which are considered obscene are often found in the vocabulary of the marginalized isn’t incidental, and it recommends spicy language as a site of resistance. English swearing draws directly from one such point of contact between our “higher” and our “lower” language. The majority of English swears have a Germanic origin, as opposed to a more genteel Romance origin (whether from French or Latin). In keeping with their Teutonic genesis, they tend to have an abrasive, guttural, jagged quality to their sounds, the better to convey an onomatopoeic quality. Take a look at the list which comprises comedian George Carlin’s 1972 bit “Seven Words You Can Never Say on Television.” Four of them definitely have an Old English etymology, traceable back to the West Germanic dialect of the Angles, Saxons, Frisians, and Jutes who occupied Britain in the later centuries of the first millennium. Three of them – the one that rudely refers to female genitalia, the one that tells you to rudely do something sexual, and the one that tells you to do that thing to your mother – may have Latin or Norman origins, though linguists think they’re just as likely to come from what Medievalists used to call “Anglo-Saxon.” Most of these words had no obscene connotations in their original context; in Old English the word for urine is simply “piss,” and the word for feces is “shit.” Nothing dirty about either word until the eleventh-century Norman invasion of Britain privileged the French over the English. That stratification, however, gives a certain gutter enchantment to those old prosaic terms, endowing them with the force of a swear. Geoffrey Hughes writes in Swearing: A Social History of Foul Language, Oaths, and Profanities in English that the “Anglo-Saxon element… provides much more emotional force than does the Norman French or the Latin. Copulating pandemonium! conveys none of the emotional charge of the native equivalent fucking hell!” Invasion, oppression, and brutality mark those words which we consider to be profane, but they also give them their filthy enchantments.

What’s clear is that the class connotations of what Bruce called an “argot” can’t be ignored. Swearing is the purview of criminals and travelers, pirates and rebels, highwaymen and drunks. For those lexicographers who assembled lists of English words in the early modern era, swearing, or “canting,” provided an invaluable window into the counter-cultural consciousness. The Irish playwright Richard Head compiled The Canting Academy, or Devil’s Cabinet Opened in 1673, arguably the first full-length English “dictionary,” appearing decades before Dr. Johnson’s staider 1755 A Dictionary of the English Language. Decades before Head’s book, short pamphlets by respectable playwrights from Thomas Dekker to Thomas Middleton similarly illuminated readers on the criminal element’s language—other examples that were included as appendices within books, such as Thomas Harman’s A Caveat or Warning for Common Cursitors, go back well into the 16th century. Such “canting guides,” exploring the seamy underbelly of the cockney capital, were prurient pamphlets that illustrated the salty diction of thieves and rogues for the entertainment of the respectable classes. One of the most popular examples was the anonymously edited A New Dictionary of the Terms Ancient and Modern of the Canting Crew, first printed in 1698. Within, readers could learn the definitions of insults from “blobber-lipped” to “jobber-not.” Such dictionaries (which included words like “swindler” and “phony,” which still survive today) drew from the English underclass, with a motley vocabulary made up of words from rough-hewn English, Romani, and ultimately Yiddish, among other origins.

A direct line runs from the vibrant, colorful, and earthy diction of canting to cockney rhyming slang, or to the endangered dialect of Polari used for decades by gay men in Great Britain, who lived under the constant threat of state punishment. All of these tongues are “obscene,” but that’s a function of their oppositional status to received language. Nothing is “dirty” about them; they are, rather, rebellions against “proper” speech, “dignified” language, “correct” talking, and they challenge that codified violence implied by the mere existence of the King’s English. Their differing purposes, and respective class connotations and authenticity, are illustrated by a joke wherein a hobo asks a nattily dressed businessman for some change. “’Neither a borrower nor a lender be’—that’s William Shakespeare,” says the businessman. “’Fuck you’—that’s David Mamet,” responds the panhandler. A bit of a disservice to the Bard, however, who along with Dekker and Middleton could cant with the best of them. For example, within the folio one will find “bawling, blasphemous, incharitible dog,” “paper fac’d villain,” and “embossed carbuncle,” among other similarly colorful examples.

An entire history could be written about early instances of noted swears, which of course necessitates trawling the Oxford English Dictionary for examples of dirty words that appear particularly early. For “shit,” there is a 1585 instance of the word in a Scottish “flyting,” an extemporaneous poetic rhyme-battle held in Middle Scots, which took place between Patrick Hume and Alexander Montgomerie. The greatest example of the form is the 15th-century Flyting of Dunbar and Kennedy, containing the first printed instance of the word “fuck.” In the OED, our good friend the dirty lexicographer Richard Head has the earliest example given in the entry for the word “fuck,” the profanity appearing as a noun in his play Hic et Ubique: or, The Humors of Dublin, wherein a character says “I did creep in…and there I did see [him] putting the great fuck upon my wife.” And the dictionary reflects the etymological ambiguity concerning the faux-francophone/faux-Virgilian word “dildo,” giving earliest attribution to the playwright Robert Greene, who in his 1590 comedy Never Too Late wrote “Dildido dildido, Oh love, oh love, I feel thy rage rumble below and above.” Swearing might be a radical alternative to received language, but it pulses through literature like a counter-history, a shadow realm of the English tongue’s full capabilities. It is a secret language, the twinned double of more respectable letters, and it’s unthinkable to understand Geoffrey Chaucer without his scatological jokes or Shakespeare minus his bawdy insults. After all, literature is just as much Charles Bukowski as T.S. Eliot; it’s William S. Burroughs and not just Ezra Pound.

Sometimes those dichotomies about what language is capable of are reconciled within the greatest of literature. A syllabus of the immaculate obscene would include the Marquis de Sade’s 120 Days of Sodom, Charles Baudelaire’s The Flowers of Evil, Gustave Flaubert’s Madame Bovary, Joyce’s Ulysses, Lawrence’s Lady Chatterley’s Lover, Vladimir Nabokov’s Lolita, Henry Miller’s Tropic of Cancer (smuggled out of a Barnes & Noble by yours truly when I was 16), and Irvine Welsh’s Trainspotting. Along with his fellow Scotsman James Kelman, Welsh shows the full potential of obscenity to present an assault on the pieties of the bourgeois, mocking Madison Avenue sophistry when he famously implores the reader to “Choose rotting away, pishing and shiteing yersel in a home, a total fuckin embarrassment tae the selfish, fucked-up brats ye’ve produced. Choose life.” Within the English language, looming above all as the progenitor of literary smut, is the great British author John Cleland, who in 1748 published our first pornographic novel in Fanny Hill: Or, Memoirs of a Woman of Pleasure, wherein he promised “Truth! stark naked truth, is the word, and I will not so much as take the pains to bestow the strip of a gauze-wrapper on it.” Cleland purposefully wrote Fanny Hill entirely in euphemisms and double entendres, but the lack of dirty words couldn’t conceal the fact that the orgiastic bildungsroman about a middle-age nymphomaniac was seen as unspeakably filthy. The novel has the distinction of being the longest banned work in U.S. history, first prohibited by the Massachusetts Supreme Court in 1821, only to be sold legally after the U.S. Supreme Court ruled that its censorship was unconstitutional in 1966. The same year that Bruce was found face-down, naked and dead, in his California bathroom.

It is a goddamn unequivocal fucking triumph of the human spirit that any fucking wanker can march up into a public library and check out a copy of Fanny Hill. That liberty is one that was hard fought for, and we should look askance at anyone who’d throw it away too cavalierly. But there is also something disingenuous in dismissing all those who suppressed works like Fanny Hill or Ulysses or Lady Chatterley’s Lover as mere prigs and prudes. A work is never censored because it isn’t powerful; it’s attacked precisely because of that coiled, latent energy that exists within words, none the more so than those that we’ve labeled as forbidden. If the debate over free speech and censorship is drenched in a sticky oil of bad faith, then that slick spills over into all corners. My fellow liberals will mock the conservative perspective that says film or comic books or video games or novels are capable of moving someone to action, sometimes very ugly action—but of course literature is capable of doing this. Why would we read literature otherwise? Why would we create it otherwise? The censor with his black marker in some ways does due service to literature, acknowledging its significance and its uncanny effect. To claim that literature shouldn’t be censored because all literature is safe is not just fallacious, it’s disrespectful. The far more difficult principle is that literature shouldn’t be censored despite the fact that it’s so often dangerous.

Like any grimoire or incantation, obscenity can be used to liberate and to oppress, to free and to enslave, to bring down those in power but also to froth a crowd into the most hideous paroxysms of fascistic violence. So often the moralistic convention holds that “punching down” is never funny, but the dark truth is that it often is. What we do with that reality is the measure of us as people, because obscenity is neither good nor bad, but all power resides within the mouth of whoever wields it. What we think of as profanity is a rupture within language, a dialectic undermining conventional speech, what the Greeks called an aporia, the moment when rhetoric breaks down. Obscenity is when language declares war on itself, often with good cause. Writing in Rabelais and His World, the great Russian critic Mikhail Bakhtin defined what he called the “carnivalesque,” that is, the principle that structured much medieval and Renaissance performance and literature, whereby the “principle of laughter and the carnival spirit on which the grotesque is based destroys…seriousness and all pretense.” Examining the Shrovetide carnivals that inaugurated pre-Reformation Lent, Bakhtin optimistically saw something liberatory in the ribald display of upended hierarchies, where the farting, shitting, pissing, vomiting hilarity of the display rendered authority foolish. “It frees human consciousness,” Bakhtin wrote, “and imagination for new potentialities.”

An uneasy and ambivalent undercurrent threads through Bakhtin’s argument, though. If the carnival allowed for a taste of emancipation, there was also always the possibility that it was just more bread and circuses, a way to safely “rebel” without actually challenging the status quo. How many of our fucks and shits are just that, simply the smearing of feces on our playpen walls? Even worse, what happens when the carnival isn’t organized by plucky peasants to mock the bishops and princes, but when the church and state organize those mocking pageants themselves? Bakhtin didn’t quite anticipate the troll, nor did Bruce for that matter. Gershon Legman writes in the standard text Rationale of the Dirty Joke: An Analysis of Sexual Humor that “Under the mask of humor, our society allows infinite aggressions, by everyone and against everyone. In the culminating laugh of the listener or observer…the teller of the joke betrays his hidden hostility.” Can’t you take a joke? Because I was just joking. Legman’s reading of obscenity is crucial—it’s never just innocent, it’s never just nothing, it’s never just words. And it depends on who is saying them, and to whom they’re being said. Because swearing is so intimately tied to the theological, the use of profanity literally takes on the aura of damnation. It’s not that words aren’t dangerous—they are. But that doesn’t mean we must suture our mouths, even as honesty compels us to admit that danger. What we do with this understanding is the process that we call civilization. Because if Lenny Bruce had one unassailable and self-evident observation, it was that “Life is a four-letter word.” How could it be fucking otherwise?

Bonus Link: Pussy Riot: One Woman’s Vagina Takes on Japan’s Obscenity Laws

Image Credit: Flickr/Jeniffer Moo

Secrets That Hold Us

-

1.
“I didn’t grow up with my mother,” Mom tells me. We’re sitting on the front verandah looking out at the grass before us wilting in the afternoon sun and the dwarf coconut trees that line one side of the driveway. The coconut fronds dip with the breeze, revealing green- and yellow-husked coconuts. It’s hot on the verandah; the aluminum awning, put there years earlier to shield the sun and rain, traps the late afternoon heat as well.
“I was 12 before I knew my mother,” she says.
 My mother is responding to some transgression of mine, what specifically I don’t remember. Perhaps something I neglected to tell her. There’s a lot we didn’t talk about then and don’t talk about now—a trait I earned honestly, I now know.
To grow up in Jamaica is to hear and know these stories of mothers who have migrated abroad or to Kingston, leaving their children to be raised by another relative. Barrel children, we call them, a term that stems from the fact that the migrating parents often send barrels of food and clothes back home for their children. But this is not my mother’s story.
I am in my 20s when she tells me about meeting her mother for the first time. My mother was 12, living in Clarendon on the southern side of Jamaica with her father’s twin sisters, a cousin, and three of her father’s other children. She talks about that day as if it’s nothing remarkable: A visitor comes and one of the aunts tells my mother to take the woman to another relative’s house some yards away on a vast tract of land that is subdivided for various relatives. They’re on their way when my grandaunt calls my mother back. It turns out they were watching to see what my mother would do, how long it would take my mother to start quizzing the visitor.
“That’s your mother,” my grandaunt tells her. My mother was a baby when her mother left her there with the aunts, too young to recognize her own mother those 12 years later.
Among her siblings, my mother’s story is not unique. My mother’s father worked with the now-defunct railway service in Jamaica, crossing the country on the east-west train line. I don’t know how or where he met the mothers of his children. But it seems he collected his offspring and deposited them in Clarendon for his sisters to raise. “I don’t know what arrangement Papa Stanley had with her,” my mother says of her parents.
My mother’s mother—Mother Gwen—raised three boys. “She gave away the girl and kept the boys,” my mother says. In it, I hear the sting of abandonment and the loss of having grown up without a concrete reason for her mother’s absence. It’s years before my mother sees her mother again, and by then she’s an adult with a life far different from that of her mother and the sons her mother kept and raised.
2.
My mother came of age with the generation that ushered in Jamaica’s independence from Britain in 1962, and she lives the Jamaican dream, that of moving off the island for a better life elsewhere. Generation after generation of Jamaicans have moved abroad—some to Panama to help build the canal, some to Cuba to work the sugar cane fields, others to Britain during the Windrush era, and countless others to America. A year after my mother married my father in 1967, my parents moved to America for undergraduate studies at Tuskegee University, and later to Urbana, Ill., for my father’s graduate studies. Driven away by the cold, winter’s wrath, snow measured in feet instead of inches, my parents returned to Jamaica in 1971, with the first of their three daughters. My father settled into a job with a unit of a bauxite company and my mother began teaching early childhood education, supervising teachers and later teaching at a teachers college.
In 1977, they bought the house they still own—a split-level with a front lawn that slopes down to the street. The house overlooks a valley, where a river once ran. Across the valley are the main road into town and a forested cliffside. We sit on that verandah, my mother and I, looking out as she tells me about this grandmother I hadn’t known about until recently. The openness of the yard, the grass sloping down to the road, contrasts with the family secret she’s held on to for so long.
3.
I am already out of college and in my early 20s when my sisters and I rediscover Mother Gwen. We had, the three of us, come to our separate conclusions that our maternal grandmother had long been dead. I don’t have a specific reason for thinking she was dead—no memory of a funeral, no snippets of a conversation at the back of my mind. But growing up, we made weekly or biweekly trips to visit our relatives in different parts of the island: my paternal grandparents in Anchovy, a small town in the rambling hills that look down on Montego Bay; my maternal grandfather in Kingston; the grandaunts who raised my mother in the woods of Clarendon amidst coffee and cocoa and citrus plants. Now, I suspect that I assumed she had passed on because she was not among the relatives we visited, and unlike my other relatives, I have no specific memory of her—no Christmas or Boxing Day dinners, no visits to her after Easter Sunday services.
Grandma—my father’s mother—I remember clearly. She baked birthday cakes in three different sizes because my sisters and I celebrated our birthdays in back-to-back-to-back months—April, May, and June. My older sister got the largest cake, I got the mid-sized cake, and my younger sister got the smallest cake. Grandma divided money in a similar manner—$20, $10 and $5. Our birth order mattered to her.
I don’t have childhood memories of Mother Gwen, or of the sons she raised without my mother. Instead, I have a vague recollection of an unfinished house, a metal drum by the side of the house, and large red- and orange-colored fish swimming in the makeshift aquarium. I don’t know if the house of my memory and the house where we rediscovered our grandmother are the same, but it’s the memory that came to me when we walked into the house that Sunday afternoon and my mother said to her mother, who was slowly going blind, “These are my daughters, your granddaughters.” The specifics of the conversation are lost to me now, but I imagine we must have talked with our grandmother and our uncles about our studies, our lives in America. Our uncles gave us trinkets they made or bought to sell—bracelets in the Jamaican colors: black, green, and yellow.
Even now, I can see the shock on my sisters’ faces—eyebrows going up, eyes widening, exaggerated blinks. How could we have lived so long, on an island as small as Jamaica, without knowing our mother’s mother still lived?
4.
On the verandah that day, my mother tells me that she feared her brothers, feared they would harm her children in some way. In response, she kept us away. My mother doesn’t give a concrete reason for fearing her brothers but vaguely says something about the differences in their lives, what she had built of her life and what they hadn’t made of theirs—and the possibility that jealousy of what she had accomplished would bubble over. Unlike my mother, who taught at a teachers college for most of my childhood, her brothers—all Rastafarians—made and sold trinkets in various roadside stalls and markets. But they had feared her return to their lives, thinking that perhaps she’d come to claim what she thought should be hers. Perhaps her brothers said something that gave my mother pause.
Without my sisters and me, my mother drifted in and out of Mother Gwen’s life, not taking us back until we were in our 20s, young adults embarking on our own lives. By the time I rediscovered my grandmother and her sons, it was too late for her to become grandma or her sons to become uncles.
But later in her life, my mother truly became Mother Gwen’s daughter, driving some 60 miles every Saturday to Spanish Town—where Mother Gwen lived with the only surviving son—with bags of clean laundry and bags and boxes of groceries: fish, chicken, yam, bread, eggs, pumpkin, thyme, and sometimes a pot of fresh soup. Then she’d call out to Mother Gwen, blind then and a little hard of hearing, before walking into the room where she slept or sat in the doorway to catch the Jamaican breeze. My mother gathered the soiled bed sheets and clothes, hand-washed some before she left, and hung them on the line in the back to dry. The rest my mother packed to take home, where she washed them, only to exchange them for a newly soiled batch a few days later.
On the few occasions I was there, my uncle hovered, keeping watch over my mother, and when he had me for an audience, he turned the small verandah into a stage and spouted word-for-word Marcus Garvey speeches he had rehearsed, every inflection perfectly placed, his eyes staring straight ahead as if looking at the words scrawled in the air. Thin and wiry, he talked about the occasions on which he’d been invited to recite a Marcus Garvey speech, the opportunities he’d missed to make a career out of this ability of his. Sometimes he talked at length about reasons for a decision or reasons he hadn’t been able to make more of his life—his mother, of course, was the main reason.
Mother Gwen died blind, completely dependent on the daughter she had given up.


5.
Mother Gwen left her daughter but my mother didn’t. For my mother, I think her mother’s absence is like a shroud she can never remove. And I think it’s why she was always there for her children.
I only have two memories of my mother not being with us: once she took a sabbatical from the college where she taught and spent some time in New York. I don’t recall the length of her absence, just that I got sick the very day she left; the helper who was with us fed me tea made from the leaf of a lime tree and cream soda. I’ve never liked either since. The second time—the summer after my older sister graduated from high school—my mother took my sister to New York and left my younger sister and me at home with our father. We’d always traveled together—my sisters, our mother, and I—so this was new. The morning after their departure, after my father had left for work, the phone rang—an operator with a collect call for my mother. In the background, I heard a cousin saying Papa Stanley had died. I didn’t even have to accept the charges, for I already knew the message I had to pass on: my mother’s father was dead. Mom returned within the week.
6.
The absence of my mother’s mother lives with me, too: an obsession I cannot shake, a recurring motif that springs from my unconscious into most of my longer pieces of fiction. These days, the recurring theme in my novels is mothers who don’t raise their children. My first novel, River Woman, is about a young woman who loses her son, and whose mother returns to Jamaica after years abroad for her grandson’s funeral. Their reunion is fraught with tension. My second novel, Tea by the Sea, is about another mother who spends 17 years searching for a daughter taken from her at birth.
While I didn’t set out to write my mother’s story, the ideas of abandonment and loss and belonging have crept into my work and remained there, a lurking obsession that I don’t yet seem able to escape. Perhaps it’s my unconscious attempt to reach under the layers of family stories to discover why my mother holds on to these family secrets and stores them, as if they will be her undoing.
Sometimes I think the secrets my mother holds trap her into bearing responsibility for her father’s transgressions, for the circumstances of her birth. It’s not her burden to carry, and yet she does. I see it in her response to the news of another brother, another of her father’s children, whom she learned about not long before her father’s death. She had known the young man, taught him at school, knew him with a surname different from her own. As she tells it, her father said the young man, now grown, was coming to visit him. Why, my mother asked. “He’s my son.” Even now, years and years later, my mother still hasn’t fully accepted him. She says, “I didn’t know him as a brother,” and recalls his mother naming him as another man’s child—a jacket, in Jamaican terms—as if those circumstances are her brother’s to bear.
As a writer, I can fictionalize the reasons family members hold onto secrets long after they have lost their usefulness. Or they can let them go, setting free the secrets that trap a child into bearing responsibility for a parent’s mistakes.
7.
My mother is nearing 80. Cancer has weakened her body, perhaps lowered her defenses. She invited her brother—the only one of her mother’s three sons still alive—his daughter, and two grandchildren to visit her home. They came on a Sunday in March, the first time he had visited in the 42 years my parents have owned that house. She made cupcakes with her grandniece. And when my mother talks of the visit, there is joy in her voice.
I imagine them on the verandah looking out at the expanse of green before them: a brother and a sister nearer to the end of their lives than the beginning, both reaching across the years to their memories of the mother who held them together.
Image Credit: Pikist.