Unseasonal Reading: Enjoying Books as They Come

If there’s one thing I have learned about summer reading, it’s that it should have no boundaries. Everything can be a beach read: Leo Tolstoy, John Grisham, Roland Barthes, Karl Ove Knausgaard, even massive presidential biographies if you are a grandpa or my 26-year-old brother. If you’re like me, every book that you have missed out on during the rest of the year becomes fodder for an ambitious—and somewhat behind-the-times—summer reading list. And speaking of behind-the-times, it is perhaps unsurprising that my favorite book so far this summer was Ali Smith’s Winter.

Released in January 2018, Winter is the second novel in Smith’s seasonal quartet, following Autumn. Both Autumn and Winter are lyrical and political, something akin to a modernist experiment: a pastiche of texts, news, fiction. I devoured Autumn in one sitting, sprawled in morning light on my parents’ porch (with my laptop nearby to look up pictures of Pauline Boty’s work). Winter, on the other hand, took me several lazy days of beach vacation to finish. There was something especially delightful about a book that was so cognitively dissonant with my surroundings, an unseasonal reading: to read about a dysfunctional family Christmas in frozen Cornwall while sweaty and sunscreen-y on a beach, floating around a pool on a doughnut-shaped raft, or indoors waiting for aloe to soak into sunburns.

This is of course apropos of Winter and its concern for global warming, its ending grimace at Donald Trump’s summer speech at the Boy Scout Jamboree in West Virginia: “In the middle of summer, it’s winter. White Christmas. God help us, every one.” Winter—and Autumn—remind the reader of the reality of climate change, the way the seasons slide into each other with less variegation than in years past. From Autumn: “The days are unexpectedly mild. It doesn’t feel that far from summer, not really, if it weren’t for the underbite of the day, the lacy creep of the dark and the damp at its edges, the plants calm in the folding themselves away, the beads of the condensation on the webstrings hung between things.” Unseasonal reading has some of this awareness, that soon summer will bleed into all seasons, that heat is building, melting.

But along with the weather, what’s been unseasonal about my reading this summer is my lack of focus. Usually I read about a book a day in the summer, but this year I’ve been lucky to make it through one a week. Although I have read more than last year already (at 43 books at the time of writing and certainly slated to defend my title of reading the most books/pages this year in my annual contest with my dad), there’s something different about my reading lately. I put books down to nap, or to research some coastal town’s wild horse population, or to see friends. All good things, of course, but I find myself perplexed by my slowness. Has my attention span shortened from using a phone too often? Am I overcommitted? Do I have more plans than in years before? Should I visit Assateague?

But when I’m not berating my lack of focus, my unseasonal summer reading reminds me of a favorite phrase from Roland Barthes’ The Pleasure of the Text. He writes that “what I enjoy in a narrative is not directly its content or even its structure, but rather the abrasions I impose on the fine surface: I read on, I skip, I look up, I dip in again.” This “dipping in again” reminds me of reading Winter: a passage or two at a time, looking up, getting a drink, putting the book down to dip in the water, picking it up again.
I was interrupted and unfocused. But Winter demanded to be read slowly, with frequent breaks for swimming and naps—and perhaps, if I can liken myself to Barthes, these abrasions on such a fine surface are what I enjoyed most about it. My favorite lines in Winter remind us, “That’s what winter is: an exercise in remembering how to still yourself then how to come pliantly back to life again. An exercise in adapting yourself to whatever frozen or molten state it brings you.” If that is winter, it looks an awful lot like my summer—an exercise in adaptation, in stillness that will spring to action, hopefully before too long.

Image: Flickr/Thomas Sturm

A Mad Woman on Fire: On Sylvia Plath and Female Rage

“You do not do, you do not do / Any more, black shoe…” The poet’s voice is strong, piercing. Her tone arch, sly. A few lines later, as she proclaims, “Daddy, I have had to kill you,” she sounds like she’s about to burble into a mean, delicious laugh. The poem is Sylvia Plath’s patricidal “Daddy” and the voice her own, recorded for the BBC in October 1962, less than five months before her death at age 30. That day, she read aloud more than a dozen of the poems that would help make up Ariel, the posthumous collection that would make her name, as she herself foretold.

Writing about these same recordings in the New York Review of Books in 1971, Elizabeth Hardwick describes how “taken aback” she was by them. “Clearly, perfectly, staring you down,” she writes, describing Plath’s delivery. “She seemed to be standing at a banquet like Timon, crying, ‘Uncover, dogs, and lap!’”

I still remember my surprise at hearing these recordings on my tinny cassette recorder in my freshman dorm room more than 25 years ago. I suppose I expected something more ethereal, a doomy Ophelia floating down the river. There was, instead, something ferocious about them. First, it unsettled me; then it excited me. Like many women, I saw myself in Plath: a diligent, high-achieving young woman, a striver privately bristling against social conventions and punishing gender roles. (“A living doll, everywhere you look,” she writes in “The Applicant.” “It can sew, it can cook…will you marry it, marry it, marry it?”) But Plath more than bristled; she burned. In so many of her poems and in her mordant, acid-tongued novel The Bell Jar, you can feel the rage rippling off the pages.

Growing up in the Midwest in the 1980s, I was a hard worker, a straight-A student, a people-pleaser, disciplined and dutiful. A Tracy Flick without the swagger. But when I first read “Daddy,” with its incantational rhythm and its audacious analogies (the speaker likens herself to a concentration camp victim, her father to a Nazi, her husband to a vampire), it unleashed something inside me. I wanted to write like that. Bold, risk-taking, confrontational. “The work of a mad woman on fire”: That’s how my creative writing professor charitably classified the deeply derivative poems I wrote under Plath’s spell. But it didn’t matter that they were derivative. What mattered was that I—this well-behaved, compliant young woman—was writing from deeper, darker places, reservoirs of anger and frustration I’d always denied were there.

Even back then, it felt to me like Plath was writing from this secret, shared subterranean place—a place where girls and women let loose all their “unacceptable” feelings: bald ambition, aggression, frustration, resentment, rage. But now, in 2018, that place no longer feels so subterranean. “The anger window is open,” Rebecca Traister wrote last November. “For decades, centuries, it was closed: Something bad happened to you, you shoved it down, you maybe told someone but probably didn’t get much satisfaction—emotional or practical—from the confession.” In the aftermath of the 2016 campaign, the inauguration, the Harvey Weinstein scandal and everything that’s followed, it’s impossible to shake the feeling that the lid’s been torn off. The subterranean is no longer subterranean. Instead, it’s terra not-so-firma. Everything’s different now.

There’s always been a drumbeat of disapproval of Plath’s work, and it often comes down to: She’s just too much.
A 1966 Time review of Ariel refers to “Daddy” as that “strange and terrible poem” Plath composed during her “last sick slide toward suicide,” adding that its style is as “brutal as a truncheon.” M.L. Rosenthal’s 1965 review of Ariel in The Spectator finds several of the poems “hard to penetrate in their morbid secretiveness” and puzzles uncomfortably over the way they seem to “make a weirdly incantatory black magic against unspecified persons and situations.” Years later, in the New Criterion, Bruce Bawer would refer to Plath’s “shrill, deranged” voice in the Ariel poems. She’s too much, too loud, too hysterical; she’s taking up too much space.

It’s fascinating to read this criticism today, when political and cultural rhetoric runs searingly hot, when the standards for hyperbole have dramatically shifted, and when charges of female shrillness resonate more deeply than ever. That’s why I’ve found myself returning to Plath so much in the last two years. Her recordings became the de facto soundtrack to my latest novel, Give Me Your Hand, begun in the early stages of the 2016 presidential campaign and completed just after the inauguration. It’s the story of three brilliant, ambitious women seeking a cure for premenstrual dysphoric disorder, otherwise known as “extreme PMS.” There was something deeply unsettling about trying to bring to life the creeping misogyny within the male-dominated world of scientific research as the headlines blared. One day, it was candidate Donald Trump dismissing Megyn Kelly’s pointed questions by seemingly referencing her menstrual period; the next, it was “Grab them by the pussy.”

Amid the cries of “Lock her up” and “Bern the Witch,” I listened to Plath over and over, but it was different now. Hearing her firm, authoritative voice, I couldn’t help thinking of the incessant criticism of Hillary Clinton’s voice as “shrill” and her laugh as a cackle. I imagined what those critics might think of Plath’s voice: ruthless, relentless, witchy. Suddenly, the recordings seemed to take on an even greater strength. They felt bracing, invigorating. They felt like a battle cry.

There’s a long tradition of playing connect the dots between Plath’s biography and her writing. I, too, used to indulge in this line of thinking, titillated by the details of Plath’s life: her struggles with depression, her tumultuous marriage to poet Ted Hughes, her suicide. But this approach to her work feels different now too. It’s a way to diminish her writing as “personal” and thus small, even narcissistic. The argument is familiar. It’s the same one that pigeonholes fiction by and about women as “domestic novels” or movies about women as “chick flicks.” But as with those novels and movies, the issues with which Plath wrestles are the ones now consuming us: fear of female ambition, fear of the female body and the female voice, and perhaps most of all, fear of female power. The subterranean space Plath gave voice to for decades has become our daily landscape, and she serves as our sage, our guide. “I know the bottom,” she writes in “Elm.” “I know it with my great tap root: / It is what you fear. / I do not fear it: I have been there.”

It’s no surprise, then, that Plath seems more omnipresent than ever: Kirsten Dunst is developing The Bell Jar for the screen, a recent auction of Plath’s belongings garnered $551,862, and an exhibit of her belongings at the National Portrait Gallery featured everything from a ribboned ponytail of her hair to her Girl Scout uniform.
But there’s something particularly exhilarating about young women responding to her with the same fervency I did a quarter century ago. Just as I once listened to that cassette of Plath reading, millennial women now tattoo The Bell Jar’s “I am I am I am” on their forearms, or fill their Tumblr and Instagram feeds with Plath quotes, excerpts and stanzas transformed into cris de coeur. The hands-down favorite is from “Lady Lazarus.” “Beware,” the speaker warns. “Out of the ash / I rise with my red hair / And I eat men like air.”

Lately, whenever I talk with a young woman about Plath or teach “Daddy” to undergrads, I think about the first critique of Plath I ever heard. It was a movie scene: a man visiting a woman’s apartment for the first time plucks a copy of Ariel from her bookshelf. “Interesting poetess,” he says, “whose tragic suicide was misinterpreted as romantic by the college-girl mentality.” The woman fumbles shyly and replies, self-effacingly, “Oh, yeah, right. I don’t know. Some of her poems seem neat.” Watching this exchange, I felt the burn of recognition. Because at the time, I was that “college girl” in thrall to Plath. I’m a cliché, I thought. But on a deeper level, I felt dismissed, diminished.

The movie, of course, was Woody Allen’s Annie Hall. Now, when I watch that scene—the man’s casual dismissal, the woman’s abashed response, the laughter in the audience—it looks so different. Everything looks different now.

Image: Flickr/summonedbyfells

Playing with Guns: Parenting in the Age of the Active Shooter

The first time I hold a gun I am 3 years old. We are visiting my father’s business partner. My dad is getting into strawberries, which means he’s getting into the drugs he can hide in the strawberry truck. In the living room, I slip my chubby 3-year-old hand between the cushions of the brown furry couch and pull out the gun. It’s heavy, but not cold, in my hand. I turn around and squeal, “Treasure!” “Bring it to me, Sara,” my mother’s steady voice requests. And I do. I don’t throw the loaded gun. I don’t fire it accidentally. I toddle over and hand it gently to my mother. It’s the story I tell about the gun that keeps this memory with me. A story about my mother saving me.

Now, I am the mother of an almost 5-year-old boy. We’re navigating the issue of violence. I drop my son off at 8:30 a.m., punching in a four-digit code at the gate of his school and then another four-digit code at the main door. I get a voicemail in the afternoon, a computer telling me there is a hoax school shooting threat against the school district. The robotic voicemail says it’s part of a series of internet hoaxes. How will I save him if I’m not there? How do I talk to my son about these possibilities, even if they seem improbable? The American Psychiatric Association and the American Academy of Pediatrics recommend that parents avoid discussing mass shootings with young children until they are around 8. What should I tell my son now?

When I read articles about how to talk to children about guns, I repeatedly encounter the phrase “the new normal.” The new normal is preparing your family for an active shooter situation. The new normal is not “stop, drop, and roll” but “run, hide, and fight.” My son only understands these concepts as dramatic play.

At the playground, an older boy has a green plastic AR-15 he points at his playmates. He spins in a circle on the wood chips, chanting pop-pop-pop-pop. The younger boys gawk in awe. My son wants to touch it. My father carried his gun with intention when he and my uncle walked into a grocery store during the ’80s, armed, faces covered, ready to take everything they wanted. Will this boy become the teenager who trades his plastic toy for live weapons? Will he just play video games? Where is the line between pretending and life, between playing and violence?

The gun I played with as a teenager was real. I’m 14, in the Mojave Desert, standing next to the boy I want to impress. The sun burns my skin. I’m holding a barely cold can of Natural Ice. I push the can down into the dirt as more boys gather around behind me. Half-drunk and laughing, they don’t wear shirts. They have tan, muscled stomachs they won’t get to keep. We’re all slathered in dirt. “It’ll have a kickback,” he tells me, and I take the shotgun confidently to show that I know what this means. Kickback. I’m shooting at nothing, pointing at the vast sand beyond the four-wheeler. I pull the trigger. “Hah!” Laughter shoots from his body as the shotgun forces itself, hard, into mine, punching me in the chest. The boys erupt. I’m now not just the girl someone might hook up with when we get wasted. I’m the girl who fired the gun. The girl with the bruises all over her chest. I pick up the beer and drink it, holding the gun with my other hand.

As parents, we’re the last generation to remember a childhood before school shootings. I graduated high school in 1999, the same year Columbine began this new era. Emma Gonzalez and the Peace Warriors from Chicago are my heroes. They’re teenagers.
Their stories, their accomplishments are the books I want to read to my son, the media I want him to watch, the lunch pail I want him to carry. His heroes are the Rescue Bots, Star Wars Jedis, and Ninjago. We could so easily get guns. We felt so cool playing with them, drinking cheap beer in the desert. How dangerous that this kind of gunplay seems innocent now.

After finding a gun in the couch at 3 and firing one at 14, I’m now trying to teach my son not to point a finger gun: “I’m only shooting the bad guys, Mommy.” Who are the bad guys? Why are they bad? My son is already a potential fighter, a potential soldier, trying to cope with sophisticated male expectations of violence through play.

“Why did you bite her?” “I wanted the bike.” “Do you think there was maybe a better way to get a turn, a way where you don’t hurt your friend?” His face reddens as he whimpers, “Yes. But I already had a turn. I wanted to keep my turn.” His intentions are so clear. The preschooler’s lack of self-awareness gives my son such direct access to the reasons for his actions, but it also makes it challenging for him to regulate his impulses. If I bit someone in front of me in line at the DMV because I wanted their turn, I might get arrested. It would take me so long to psychologically suss out why I committed such a spontaneous act of violence. But my son can say it. He knows why he bites his friends.

My son’s gunplay is pretend. Mine was real. My dad’s. I am standing here facing the media, the toys, the lunch pails with heroes wielding guns, sabers, and swords, standing here with the responsibility to make sure my son is a peaceful person. It’s easy to feel outnumbered, exhausted, and unsure. In those moments of uncertainty, it’s easy to want a shortcut to intention, to a certainty that I’m parenting for peace and respect for all persons. I’m not the toddler pulling the gun from the couch or the girl drinking in the desert; I’m a mother parenting in the age of the active shooter. I can’t guarantee my son’s safety or be certain of who he will grow up to be, but I do know the code to the school gate, to the building door. I’m trying to teach him to put down the finger gun. Dropping my son off at his classroom door each morning, I ask him, “What’s today going to be?” and each morning he replies, “A great day, Mommy.”

Image: Flickr/

On Semicolons and the Rules of Writing

1.

Kurt Vonnegut’s caution against the use of semicolons is one of the most famous and canonical pieces of writing advice, an admonition that has become, so to speak, one of The Rules. More on these rules later, but first the infamous quote in question: “Here is a lesson in creative writing. First rule: Do not use semicolons. They are transvestite hermaphrodites representing absolutely nothing. All they do is show you’ve been to college.”

To begin with the lowest-hanging fruit here—fruit that is actually scattered rotting on the ground—the “transvestite hermaphrodite” bit has not aged well. The quote also, it seems, may have been taken out of context, as it is followed by several more sentences of puzzlingly offensive facetiousness, discussed here. That said, I also have no idea what it means. My best guess is that he means semicolons perform no function that could not be performed by other punctuation, namely commas and periods. This obviously isn’t true—semicolons, like most punctuation, increase the range of tone and inflection at a writer’s disposal. Inasmuch as it’s strictly true that you can make do with commas, the same argument could be made of commas themselves in favor of the even unfussier ur-mark, the period. But that is a bleak thought experiment unless you are such a fan of Ray Carver that you would like everyone to write like him.

Finally, regarding the college part, two things: First, semicolon usage seems like an exceedingly low bar to set for pretentiousness. What else might have demonstrated elitism in Vonnegut’s mind? Wearing slacks? Eating fish? Second, in an era of illiterate racist YouTube comments, to worry about semicolons seeming overly sophisticated would be splitting a hair that no longer exists.

But however serious Vonnegut was being, the idea that semicolons should be avoided has been fully absorbed into popular writing culture. It is an idea pervasive enough that I have had students in my writing classes ask about it: How do I feel about semicolons? They’d heard somewhere (as an aside, the paradoxical mark of any maxim’s influence and reach is anonymity, the loss of the original source) that they shouldn’t use them.

To paraphrase Edwin Starr, semicolons—and rules about semicolons—what are they good for? As we know, semicolons connect two independent clauses without a conjunction. I personally tend to use em dashes in many of these spots, but only when there is some degree of causality, with the clause after the em typically elaborating in some way on the clause before it, idiosyncratic wonkery I discussed in this essay. Semicolons are useful when two thoughts are related, independent yet interdependent, and more or less equally weighted. They could exist as discrete sentences, and yet something would be lost if they were, an important cognitive rhythm. Consider this example by William James:

I sit at the table after dinner and find myself from time to time taking nuts or raisins out of the dish and eating them. My dinner properly is over, and in the heat of the conversation I am hardly aware of what I do; but the perception of the fruit, and the fleeting notion that I may eat it, seem fatally to bring the act about.

The semicolon is crucial here in getting the thought across. Prose of the highest order is mimetic, emulating the narrator or main character’s speech and thought patterns.
The semicolon conveys James’s mild bewilderment at the interconnection of act (eating the raisins) and thought (awareness he may eat the raisins) with a delicacy that would be lost with a period, and even a comma—a comma would create a deceptively smooth cognitive flow, and we would lose the arresting pause in which we can imagine James realizing he is eating, and realizing that somehow an awareness of this undergirds the act. An em dash might be used—it would convey the right pause—but again, ems convey a bit of causality that would be almost antithetical to the sentence’s meaning. The perception follows temporally, but not logically. In fact, James is saying he doesn’t quite understand how these two modes of awareness coexist.

Or consider Jane Austen’s lavish use of the semicolon in this, the magnificent opening sentence of Persuasion:

Sir Walter Elliot, of Kellynch Hall, in Somersetshire, was a man who, for his own amusement, never took up any book but the Baronetage; there he found occupation for an idle hour, and consolation in a distressed one; there his faculties were roused into admiration and respect, by contemplating the limited remnant of the earliest patents; there any unwelcome sensations, arising from domestic affairs, changed naturally into pity and contempt as he turned over the almost endless creations of the last century; and there, if every other leaf were powerless, he could read his own history with an interest which never failed.

Periods could be ably used here, but they would not quite capture the drone of Elliot’s stultifying vanity. Again, form follows function, and the function here is to characterize the arrogantly dull mental landscape of a man who finds comprehensive literary solace in the Baronetage. More than that, the semicolons also suggest the comic agony of being trapped in a room with him—they model the experience of listening to a self-regarding monologue that never quite ends. We hardly need to hear him speak to imagine his pompous tone when he does.

The semicolon’s high-water mark of usage, as shown here, was the mid-18th to mid-/late 19th centuries. This is hardly surprising, given the style of writing during this era: long, elaborately filigreed sentences in a stylistic tradition that runs from Jonathan Swift to the James brothers, a style that can feel needlessly ornate to modern readers. Among other virtues (or demerits, depending on your taste in prose), semicolons are useful for keeping a sentence going. Reflecting on the meaning of whiteness in Moby-Dick, Melville keeps the balls in the air for 467 words; Proust manages 958 in Volume 4 of Remembrance of Things Past during an extended, controversial rumination on homosexuality and Judaism. There is a dual effect in these examples and others like them of obscuring meaning in the process of accreting it, simultaneously characterizing and satirizing the boundaries of human knowledge—a sensible formal tactic during an era when the boundaries of human knowledge were expanding like a child’s balloon.

Stylistically, the latter half of the 20th century (and the 21st) has seen a general shift toward shorter sentences. This seems understandable on two fronts. First—and this is total conjecture—MFA writing programs came to the cultural fore in the 1970s and over the last few decades have exerted an increasing influence on literary culture.
I am far from an MFA hater, but the workshop method does often tend to privilege an economy of storytelling and prose, and whether the relationship is causal or merely correlational, over the last few decades a smooth, professionalized, and unextravagant style has been elevated to a kind of unconscious ideal. This style is reflexively praised by critics: “taut, spare prose” is practically a cliché unto itself. Additionally, personal communication through the 20th century to today has been marked by increasing brevity. Emails supplant letters, texts supplant emails, and emojis supplant texts. It stands to reason that literary writing style and the grammar it favors would, to a degree, reflect modes of popular, nonliterary writing.

Beyond grammatical writing trends, though, semicolons are a tool often used, as exemplified in the Austen and James examples, to capture irony and very subtle shades of narrative meaning and intent. It might be argued that as our culture has become somewhat less interested in the deep excavations of personality found in psychological realism—and the delicate irony it requires—the semicolon has become less useful. Another interesting (though possibly meaningless) chart from Vox displays the trend via some famous authors. As fiction has moved from fine-grained realism into postmodern satire and memoir, has the need for this kind of fine-grained linguistic tool diminished in tandem? Maybe. In any case, I have an affection for the semi, in all its slightly outmoded glory. The orthographical literalism of having a period on top of a comma is, in itself, charming. It is the penny-farthing of punctuation—a goofy antique that still works, still conveys.

2.

A larger question Vonnegut’s anti-semicolonism brings up might be: Do we need rules, or Rules, at all? We seem to need grammatical rules, although what seem to be elemental grammatical rules are likely Vonnegutian in provenance and more mutable than they seem. For instance, as gender norms have become more nuanced, people—myself included—have relaxed on the subject of the indeterminately sexed “they” as a singular pronoun. Likewise, the rule I learned in elementary school about not ending sentences with prepositions. Turns out there’s no special reason for this, and rigid adherence to the rule gives you a limited palette to work with (not a palette with which to work).

We know, on some level, that writing rules are there to be broken at our pleasure, to be used in the service of writing effectively, and yet writing is such a difficult task that we instinctively hew to any advice that sounds authoritative, cling to it like shipwrecked sailors to pieces of rotten driftwood. Some other famous saws that come to mind:

Henry James: “Tell a dream, lose a reader.”

Elmore Leonard: “Never open a book with weather.”

John Steinbeck: “If you are using dialogue—say it aloud as you write it. Only then will it have the sound of speech.”

Annie Dillard: “Do not hoard what seems good for a later place.”

Stephen King: “The road to hell is paved with adverbs.”

And more Kurt Vonnegut: “Every character should want something, even if it is only a glass of water”; “Every sentence must do one of two things—reveal character or advance the action”; “Start as close to the end as possible.”

In the end, of course, writing is a solitary pursuit, and for both good and ill no one is looking over your shoulder.
As I tell my students, the only real writing rule is essentially Aleister Crowley’s Gödelian-paradoxical “Do what thou wilt shall be the whole of the Law.” Or alternately, abide by the words of Eudora Welty: “One can no more say, ‘To write stay home,’ than one can say, ‘To write leave home.’ It is the writing that makes its own rules and conditions for each person.”

Image: Flickr/DaveBleasdale

After the Welfare State: Kathy Acker and the American Health Care System

Chris Kraus’s After Kathy Acker gives readers a long-awaited account of the experimental writer as a living, breathing, fucking, and, frequently, sick human being. Masterful in detail, drawing on deep archival sources and interviews, Kraus’s account grounds Acker’s project in the physical environments of late-20th-century New York, California, and London. Consider, for example, the beautiful sentences Kraus uses to open the book’s main action:

New York City, 1971: The bed, rarely made, floats in a room painted orange with big violet stars. She spends most of her days and nights in the bed, sleeping and writing. Her hair is cut short. Twice, unable to do anything with it, she shaves it off. The inside of the closet is violet, matching the stars. The room could be anywhere, really, although in actual fact it’s on the sixth floor of a building in Washington Heights, upper Manhattan, straddling the corner of Broadway and 163rd Street. There are gates on the two skinny windows, facing north onto 163rd. Even in 1971, the old prewar building, with its large corniced lobby, had seen better days.

Moving through the shabby, abject places where Acker lived and worked, After Kathy Acker adds to a growing literature that examines the lives of bohemian writers of 1970s and 1980s New York and the relationship of these writers with the impoverished around them. Like Arthur Rimbaud and Charles Baudelaire in the previous century, and William Burroughs and Allen Ginsberg in earlier decades, these writers developed sympathies for their neighbors that drew on their lived proximity to such poverty. Acker lived close to the bone in the 1970s, alongside segments of the population rapidly being labeled an unproductive “underclass” responsible for the country’s economic and moral woes. In the early 1970s, Washington Heights was holding on, but barely. It was just across the river from the scene of the city’s worst urban destitution, the South Bronx, and residents knew what was coming. By the 1980s, the neighborhood would be overrun with crime. Acker knew the spaces of poverty well, and these spaces animate her literary project. By detailing Acker’s lived life, Kraus offers new perspectives on the contours of Acker’s work, particularly as these relate to the welfare state that ameliorated poverty.

One of the most prominent features of Washington Heights is the Columbia Presbyterian Medical Center. The center became important to Acker in 1971, when pelvic inflammatory disease drove her through its doors. In her unpublished journals, she recollects, “I walk into wooden hospital, emergency exit for Columbia Presbyterian hospital. Largest room wood walls puke yellow. Or rot yellow. […] In front of folding chairs, wood booths. I sit in chair. I hurt.” The scene is shocking and ordinary, disturbing and workaday. It is one of thousands that occurred in New York alone in the 1970s, especially as the municipal hospital system began to fall apart in the wake of the 1975 fiscal crisis.

Kraus’s I Love Dick is well-known for blurring biography and fiction, and Kraus finds her own project in Acker, showing us how Acker’s project drew on her lived life. The hospital scene from Acker’s journals would resurface throughout her early fiction, appearing in some version in The Childlike Life of the Black Tarantula (1973), I Dreamt I Was a Nymphomaniac (1974), and Great Expectations (1982).
In Tarantula, for example, it becomes “woman vomits blood over floor, wood booths with tiny filthy white curtains for doctors’ rooms in front of wood chairs on wood floor filthy, doctor yells at one man you haven’t taken your medicine now you’re going to die of T.B. man is skeleton sit down not next to anyone we’ll call you man sits down next to me.” In Nymphomaniac: “Ahead of me I see an old guy sitting in a folding chair. Everyone’s screaming pain. A doctor standing next to a filthy curtain which swings from a wood frame says, ‘Mr. Smith’…Doctor walks in: I think she’s a nurse. She tells me she’s a doctor. I lay down, she sticks a silver thing into my cock. Each time she turns it, I hurt more. Everything becomes total pain. I’m screaming. Everyone in the examining room’s screaming. I tell the nurse to stop.”

Acker’s characters have always suffered in exaggerated ways; she’s particularly fond of torture as a motif. Acker figures its contours carefully and slowly, preserving them through their repetition across her work. She brings the simple, terrible space of the emergency room to her larger project, documenting her lived encounter with an impersonal—but useful and helpful—system.

By this point in the national discussion of health care, this scene is familiar: Woman without insurance gets deathly ill and goes to the emergency room or, today, urgent care. What happens next is also familiar: Acker writes in her journal, “Doctor won’t see me until I pay for last visit & this visit. I hate being yelled at. Otherwise I wouldn’t have paid.” To pay, she found work doing sex shows but had to take pain pills during them to “walk without screaming.”

This scene became part of Acker’s literary project. Kraus uses it as an example of what she calls “the strength, and also the weakness” of Acker’s writing: Acker’s tendency to rewrite memories until “they became conduits to something a-personal, until they became myth.” In this case, Kraus means that “in book after book, Acker would describe the cycle of despair of doing sex work to buy medicine so she could keep on doing sex work,” adding, somewhat cynically, that Acker “[crafted] these months of her life into something more allegorical than her actual life on West 163rd Street.” The weakness of this account is that the episode is limited; the strength of this account is that it becomes impossible to ignore after it appears for the third time.

Part of that allegory told a story about institutions like Columbia Presbyterian. Acker’s work, up to and including her death from cancer in 1997, has sometimes been read as a wide attack on the institutions that dominate the private lives of individuals. Acker was associated with a wide body of anti-institutional critical theory, particularly that of Gilles Deleuze, Félix Guattari, and Michel Foucault. Kraus doesn’t indicate whether Acker attended the 1975 Schizo Culture conference run by Sylvère Lotringer, at which all three men spoke. But certainly, by 1977, Acker was published alongside these men in the issue of Semiotext(e) based on the conference. The Schizo Culture conference might be called the birth of the academic left’s long critique of the welfare state, at least in the United States. In the conference proceedings, one finds early examples of the Foucauldian critique that dominated the U.S. academy in the 1980s and 1990s.
This critique often took as its target the welfare state that expanded in the 1950s and 1960s, part of what has been called the Fordist compact between government, industry, and unions. At a local level, New York hospitals had aggressively expanded their mission and physical footprint during this time, increasing in particular their mental health services. It’s doubtless true that this expansion meant more surveillance and control over the poor in particular. But there’s another way to view all this: Even as she evolved this memory into a critique of the medical system, Acker was showing how it worked, especially in New York, for impoverished Americans, while also showing the dire consequences of its limitations. She’s immersed in that waiting room; her journals take care to observe not just the color of the walls but the way doctors are behaving toward other patients, the way everything seems in need of repair, the way everyone has to sit for hours in pain.

In this respect, those years at 163rd Street seem to have marked Acker in ways other than myth, as manifested by passages like this one from Blood and Guts in High School: “Most people are what they sense and if all you see day after day is a mat on the floor that belongs to the rats and four walls with tiny piles of plaster at the bottom, and all you eat is starch, and all you hear is continuous noise, you smell garbage and piss which drips through the walls continually, and all the people you know live like you, it’s not horrible, it’s just…”

Yes, the poor are part of the myth: Acker often used the poor to signal her protagonists’ abjection and position outside of power. But Acker was also not the only one deploying the poor as part of a myth. As the historian Michael Katz emphasizes, the image of the “undeserving poor”—whether represented by “urban crisis” or “underclass”—forms a key part of those discourses aimed at destroying the welfare state, a narrative whereby welfare is blamed for “economic stagnation and social decay.” Such discourses emerged during the Nixon administration and operate with equal vigor through Paul Ryan and Donald Trump. These privatizing discourses were everywhere in the 1970s but took some time to solidify as a target of leftist critique, a problem shaped by everything from the legacy of McCarthyism to the rise of identity politics. Arguably, the rise of French theory in the academy didn’t help.

As the moment of “high theory” has faded in the academy, such discourses have begun to be reassessed. In this context, critics such as Bruce Robbins have begun describing the irony of institutional critique in the post-Fordist moment of the welfare state’s decline. In Upward Mobility and the Common Good, Robbins argues that while the welfare state was an “imperfect historical form,” it nevertheless represents “a defensible common program in which the glaringly different interests of the poor and needy, on the one hand, and elite experts, on the other…appear to be resolved.” In the context of privatization, Robbins argues, “the persistence of the Foucaultian school in interpreting the welfare state as an apparatus of domination” has come to seem, at best, open for debate. Acker may have allied herself with that Foucaultian school, especially in Empire of the Senseless and other late works, which depict hospitals and prisons as part of a Burroughsian nightmare.
In that 1971 emergency room, though, Acker isn’t concerned with being surveilled: She wants conditions to improve, both for herself and for the elderly man with tuberculosis waiting alongside her. As Joshua Freeman describes in Working-Class New York, the system that Acker encountered in 1971, however imperfect, was pretty good. Elsewhere in the country, the postwar medical system was compromised from the beginning. Labor had to fight not only doctors, hospitals, and insurers but often the government as well. New York, though, had two things going for it: a dense concentration of unionized workers and independent-minded doctors. Combined, these produced in the postwar years what Freeman calls “an exceptional health care delivery system.”

As Freeman quickly adds, though, that system came under pressure in the late 1960s. By 1971, The New York Times editorial board was warning its readers of the impossibility of providing quality health care to all Americans in the wake of a threatened strike and the establishment of the Health and Hospitals Corporation, an organization that quickly moved to demand a 5 percent cut in hospitals citywide. After the city’s 1975 fiscal crisis, cuts were made even more widely. Reflecting on a letter Acker wrote to Ron Silliman about New York’s 1975 fiscal crisis, Kraus describes “welfare, unemployment insurance, and disability SSI” as “de facto grants that funded most of New York’s off-the-grid artistic enterprises.” In the 1970s, writers and artists like Acker needed the welfare state.

I don’t say this to point out Acker’s hypocrisy, as if Acker were Ayn Rand requiring Social Security in her dotage. I instead mean to point out that a desire for more care, not less care, has been present in Acker’s work from the beginning. In the midst of the welfare state’s breakdown in the 1970s, Acker in Nymphomaniac showed her readers how the “poor who can have no lovers, who long to die” deserved better than a hospital that “was the worst hell I had ever been in.” Certainly, they deserve better than, as Acker describes in her journals, a choice between going without medical care and doing sex work while taking pain medication “so that I can walk.”

The health care system haunts After Kathy Acker. It can’t help doing so: Acker’s 1997 death was, in part, a consequence of her refusal of chemotherapy, a refusal that Acker herself attributed to the high costs of getting care without health insurance. In other words, while Acker’s refusal is sometimes attributed to her career-long critique of the biopolitical control of health care, it’s equally the case that Acker simply lacked access to health care. Had the Affordable Care Act been in effect in 1996, there’s some likelihood that Acker, as a visiting professor at the San Francisco Art Institute, would have been at the very center of the policy: an independent contractor, one of the “many self-employed people in San Francisco” who “bought their own coverage,” as Kraus suggests Acker might have done without subsidies (and with a $260,000 trust to her name). But in following the idiosyncratic Acker through spaces of poverty, Kraus has already made the case for a different world, one in which the self-employed, like the impoverished, find quality service that doesn’t turn dystopian. That world became more possible with the Affordable Care Act, legislation that Acker the writer might have hated but that Acker the cancer patient might have used.

Image: Columbia University Medical Center

The Problem with Patriotism: A Critical Look at Collective Identity in the U.S. and Germany

1.

In 1984, George Orwell’s year of looming dystopia, I received an academic scholarship to study fine arts and moved to Germany, a country that had embodied modern dystopia to an unprecedented degree. The scholarship, awarded to students of the United States and the United Kingdom, had been created in commemoration of the Berlin Air Lift of 1948–1949. It carried the cumbrous name “Luftbrückendank” (literally “Air Lift Thanks”), which amused my artist colleagues but reminded them that the Allied forces occupying their country still retained a degree of legal jurisdiction over the whole of East and West Germany—inducing in them that strange mixture of irritation, envy, and respect that anything American inspired in Europe at the time.

You lose several important parts of yourself when you move to another country. First you lose your language: struggling to explain things in simple terms, grappling with a new grammar and syntax, you wince at the wince on the face of the person you’re talking to as you stumble through your botched sentence. The next thing you lose is your identity as a citizen, as a member of an ethnic group, as a native. Eventually, however—and this is the strange part—loss leads to gain. You learn to master this new language, grow comfortable in this new culture: You crack jokes, slip into slang; it becomes second nature, and you think in it now, dream in it.

This loss and gain does not, of course, revert to a default setting when I’m back home; it’s permanent, and it colors my perception of America. I am both of us and not of us. I’ve begun the process of acquiring dual citizenship, of declaring not only an emotional but also a formal loyalty to my adopted home—but although I pay my taxes there, am raising my child there, and have lived considerably more than half my life there, my position is and always will be ambivalent. Because I didn’t stay to absorb the changes gradually, adapt to them organically, the current state of affairs—our own new 1984—comes as a profound shock.

2.

Leave America, and you begin to see it as the rest of the world sees it: as an unpredictable, potentially hostile force dedicated exclusively to protecting its own interests; as a gargantuan military power with an aggressive presence on the world stage and a dangerously undereducated populace. We’ve toppled governments, covertly assassinated democratically elected leaders, waged illegal wars that have poisoned and destabilized entire regions around the globe. The enormous postwar bonus we’ve enjoyed—our status as the world’s darlings—has been eroding steadily away, yet incredibly, we still imagine that everyone loves us. Peering wide-eyed from our self-absorbed bubble, we issue Facebook “apologies” to the rest of the world for our mortifying president and his absurd coterie, not quite realizing that the world, at this point, is less interested in how Americans feel than in foreseeing, assessing, and coping with the damage the United States is likely to wreak on world peace, stability, economic justice, and the environment.

James Baldwin, after having spent more than a decade in France, observed that “Europeans refer to Americans as children in the same way that American Negroes refer to them as children, and for the same reason: they mean that Americans…have no key to the experience of others.
Our current relations with the world forcibly suggest that there is more than a little truth to this.” Although Baldwin was conflicted by the feeling that he’d shirked his responsibility by moving abroad, and he returned many times throughout the civil rights era, he also understood that a great deal of his artistic and intellectual maturity had grown out of the distance he’d put between himself and his native country.

A special type of perception arises when we see something we already know fairly well but after a protracted absence: when it’s stripped of its familiarity and all of a sudden becomes a strange new thing—but only for a little while. Habit quickly settles back in and the specialness of this particular type of perception fades. For the first few moments, though, you get a sense that what you’re seeing is essentially reality, divested of its numbing effect. It’s kind of like the mildly hallucinatory state one experiences on psilocybin, and I get an initial jolt of this kind every time I come back to the U.S. It starts in the airport: For a minute or two, the entire scene, including myself standing in a line of passengers waiting to proceed through passport control, feels like an insane asylum in which utter nonsense issues forth from TV screens everywhere while people barely take notice or, worse, watch with interest and don’t find it shocking at all. The fake jokiness of the news, the non-news content of it, the stupefaction, the graphics. The entire appalling reality of what the country has become and the memory that it used to be very different.

When a German friend of mine visited me one year in Brooklyn, she remarked that she felt confused: Everything looked and sounded just as it did in the movies and on TV—the cops-and-robbers blare of the police sirens, the steam rising in thick clouds from the manhole covers—it was all too familiar; she’d seen it all hundreds of times, and therefore nothing seemed real. While I found myself struggling to articulate this indescribable thing—the surreal whatness of things, the sense that everything had fallen under some kind of evil spell—to her, American identity, or Americanness, felt like a simulation of itself.

This narrative insistence, narrative hegemony even, this adamant and endless proclamation of ourselves, is virtually unique to the United States in its power and its exclusion of the rest of the world. It not only permeates every facet of life in the U.S. but also implicitly questions the validity of other cultural identities it has not, in some way, already absorbed. Indeed, other narratives are virtually untranslatable unless they conform to the American script and contain the requisite ingredients. You need a good guy and a bad guy; you need a dream and something standing in the way of that dream. You need inauspicious circumstances that threaten to defeat the hero so that he can take heart, rise to the challenge, and win. The national narrative is a narrative of infantilization, a fairy tale written for children in which love, sex, family, in fact all human endeavor, is sentimentalized, stripped of nuance and ambiguity and all of life’s inherent contradictions. We need everything spelled out; we are a culture with childish notions, even of childhood.
It’s as though the American mind were calibrated to a single overriding narrative: It can be found not only in our movies but in our politics, our journalism, our school curricula; it’s on our baseball fields, in our TED talks, award ceremonies, and courts of law, among our NGOs and in our cartoons and the way we speak about disease and death—virtually anywhere we enact the stories we’ve created about our history, our collective aspirations, our idea of who we are and what we’d like to become. But to disagree with this narrative, to call its major premises into question, is to betray the tribe.

Identity is a construct that forms in response to a psychic need: for protection, for validation, for a sense of belonging in a bewildering world. It’s a narrative; it tells itself stories about itself. But identity is also a reflex, a tribal chant performed collectively to ward off danger, the Other, and even the inevitable. Its rules are simple: They demand allegiance; they require belief in one’s own basic goodness and rightness. It’s a construct based not in fact but on belief, and as such it has far more in common with religion than with reason.

I try for the life of me to understand how the fiction of what this country has become has turned into such a mind-altering force that one can only speak of mass hypnosis or a form of collective psychosis in which the USA still, bafflingly, sees itself as the “greatest nation on Earth,” in which anything that calls what makes America American into question is met not with impartial analysis or self-scrutiny but with indignant and often hostile repudiation. We have, as Baldwin observed in his Collected Essays, “a very curious sense of reality—or, rather…a striking addiction to irreality.” Are we really as brave as we think we are; are we as honest, as enterprising, as free as we think we are? We’re not the envy of the world and haven’t been for a long time, and while this might not match the image we have of ourselves, it’s time to address the cognitive dissonance and look within.

Before the fall of the Berlin Wall, German identity was predicated on the rejection of the Other. And because the two Germanys defined themselves in opposition to one another, because each of them claimed a moral superiority over the other, one that was contingent on the depravity of the other—that blamed the fascist past on the other—anything that touched upon this taboo was a threat.

The ’90s were an intense time to be in Berlin, and it took a foreigner, the French artist Sophie Calle, to see and address what almost every German I knew wasn’t able to: the fact that the country was slipping, post-Reunification, into a state of selective amnesia. Her work Detachment (1996), in which she asked passersby to describe emblems of German Democratic Republic history that had been removed—including the gigantic wreath with hammer and sickle on the Palast der Republik, the former East German parliamentary building that would itself soon be demolished—succinctly captures an era of collective repression during which the past was in the process of being rewritten. The empty space left behind by the removed monument or plaque became a gap in memory that could not be filled by the largely inaccurate descriptions of passersby, no matter how emotionally charged or how close to home.
Eventually, as an entire country struggled with the transition from planned economy to free-market capitalism and it slowly dawned on the former citizens of the GDR that the West was not the utopia they’d imagined it to be, this gap would be filled by “Ostalgie,” a nostalgia for all things East—matched in intensity only by the witch hunts carried out on anyone with even the most tenuous Stasi connection in their past. But then again, idealization and demonization have always gone hand in hand, while maintaining power depends on controlling the narrative and interpreting the evidence.

3.

A few days ago, I took my son to see the Andres Veiel film on Joseph Beuys, which premiered at the Berlinale earlier this year. I hadn’t actually thought about Beuys in a long time, but it seemed to make sense to turn my 16-year-old budding graffiti artist on to some bona fide radicalism in art. The first time I saw Beuys’s work was in the 1979 show at the Guggenheim, and I understood nearly nothing. Gradually, of course, I came to see the ways in which Beuys—both his person and his work—epitomized the psychological drama of postwar West Germany, but what I’ve rediscovered now, after seeing the Veiel film, is that he also had quite a lot to say about America—and that he and his art are as relevant today as they were 40 years ago.

In May of 1974, Beuys, wrapped in felt from the moment his plane touched ground on the American continent, was taken to the Rene Block Gallery in SoHo in an ambulance. The plan was to spend several days locked up in a room with a coyote. I Like America and America Likes Me was Beuys’s cryptic and poetic critique of the country and its violent history; in choosing an animal indigenous to the North American continent for over a million years, an animal that is totemic in Native American mythology and that inspired the coyote deity stories dating back over 10,000 years, the oldest known literature on the continent—an animal that nonetheless fell victim to a bitter, widespread, and ruthless extermination campaign—Beuys instinctively went for America’s Achilles heel: its founding genocide. His action took on the guise of a shamanistic ritual: America was spiritually ill, he contended, the result of a violent psychological repression; there was “a score to be settled with the coyote,” Beuys said, “and only then will this trauma be over.”

“We know,” Baldwin wrote, “that whoever cannot tell himself the truth about his past is trapped in it, is immobilized in the prison of his undiscovered self. This is also true of nations.” When you consider that Germany has been confronting the worst elements of human nature for over 70 years—that its history and its own obsessive and enduring examination of this history have left little in the way of pride—it becomes clear what happens when a national narrative is rewritten through an experience of profound shame. It’s a lesson in humility, one we’d all be well-advised to learn. We’re talking relentless national debate, relentless historical analysis. Picture three full pages of The New York Times dedicated every day, over a period of decades, to uncovering layer upon layer of past atrocities; picture a debate, continuing over decades, over the myriad ways in which history nonetheless continues to collude. Picture the U.S.
getting anywhere near this thorough a look at its own history: the true circumstances surrounding Europe’s arrival in the Americas, for instance, or the ruthless exploitation and eventual extermination of the indigenous population and the never-ending legacy of slavery. Imagine the U.S. making this historical analysis a mandatory part of its school curriculum, nationwide.

In Germany, pride has been taken in an absence of the very pride that led the country to the abyss in the first place, in an aversion to grandiosity and hyperbole and fanaticism, in a distance from and general suspicion of national identity. Now, however, and in spite of all it’s learned, the country is faced with the emergence of a radical right that, for the first time in 60 years, has garnered enough votes to enter the German national parliament—a startling fact that demonstrates the precariousness of any social equilibrium in the face of rabid populist revisionism, no matter how long or bitter the struggle to achieve it.

4.

There’s the old story of the frog sitting in a pot of water on the stove: As the temperature rises, the change is too gradual for the frog to detect the danger and escape to safety. National culture exerts a similar spellbinding effect, in which all forces serve to craft and reinforce a narrative that passes for objective reality. One of nationalism’s most deleterious illusions is that “evil” is something that comes from without—and not something lurking inside each of us, waiting to be activated, waiting to be unleashed. In the words of Baldwin, writing about Shakespeare, “all artists know that evil comes into the world by means of some vast, inexplicable and probably ineradicable human fault.” Taking this thought to its logical conclusion, Einstein claimed in Ideas and Opinions that “in two weeks the sheeplike masses of any country can be worked up by the newspapers into such a state of excited fury that men are prepared to put on uniforms and kill and be killed, for the sake of the sordid ends of a few interested parties.” We are, in other words—despite our prodigious brains—still very much animals, subject to a herd mentality.

But there is also a subtler form of spellbinding, one that lies in acquiescence. Americans know they’re in crisis, understand that their democracy is at risk, yet what I see—with very few exceptions beyond the occasional comparisons to the Weimar era that directly preceded the advent of fascism in Germany—are not efforts to transcend national identity in order to understand the dangerous ways in which the human mind is vulnerable to suggestion and manipulation, but a clambering to recover American “values” and cherished attributes and to reaffirm them.

One of the arenas in which these efforts are enacted is language itself. Orwell foresaw the rewriting of the historical past and the falsification of existing documents, including newspaper archives, books, films, photographs, etc., to bring them into line with party doctrine and prove its infallibility; he predicted the reduction of language as a powerful tool to curtail the radius of human thought for political ends and postulated a semantic system in which words are used to denote their opposite and are thus rendered meaningless. Yet even the broadest political, historical, and psychological analysis of how propaganda has been used throughout the ages to whip up popular support and manipulate the mass mind pales when applied to the phenomenon of fake news, which takes “Newspeak” and multiplies it to kaleidoscopic dimensions.
All the U.S. needs is one good international crisis for the patriotism reflex to kick in: It’s an immediate emotional response, yet what is needed most in times of shock is a suspension of emotion, distance to the forces that would manipulate us. What happens is this: Something shakes us to the marrow, we rally around what makes us feel safe—and it’s the bulwark of national identity we cling to, even if this identity is precisely what clouds our cognitive faculties most. But when someone steps forward and offers a truly critical perspective—Susan Sontag in the days immediately following 9/11 comes to mind here—this is the moment she is held in the greatest suspicion, because critical distance means that she is not part of the emotional bond a reaction to a state of shock brings about, that the observations she makes or the conclusions she draws might find fault not with some evildoing Other but with us, with our own. Better to brand the critic an alien with alien allegiances—in other words, something dangerous: a tainting, a contamination, a contagion. It’s through narrative that reality acquires meaning and becomes intelligible, that it conceives itself, enacts itself. Yet the national narrative has made it virtually impossible for Americans to perceive themselves and the world around them in any accurate or objective way. Words have morphed to the point where they no longer signify anything but rather act as invisible triggers, actively shut thought down and preclude the possibility of communication. Everywhere we look, we see what we want to believe about ourselves. We are, after all, the birthplace of Hollywood: It should come as no surprise, then, that we prefer fairy tales to the laws of nature and the tedious facts of reality; that the boundaries between fact and fiction have not only blurred but have become, to us, undetectable. “We are often condemned as materialists,” Baldwin wrote. “In fact, we are much closer to being metaphysical because nobody has ever expected from things the miracles that we expect.” We’re in the business of inventing superheroes with fabulous, gravity-defying superpowers and have been daydreaming about them for such a long time that it’s entered our collective subconscious, become a part of our DNA. And so we imagine that Robert Mueller and his investigation will save us, or Stormy Daniels and her titillating revelations, or our very own Jeanne d’Arc, Emma Gonzalez, with the incorruptibility of youth and a God-given ability to speak truth to power. As written in the German paper Die Zeit, the “ostentatious vulgarity” of the present American administration “shouldn’t distract us from the fact that…something is happening that goes beyond mere audacity, that cannot really be described, even with the word ‘propaganda,’ a term that today has become inflated and imprecise.…It’s more about doing away with the principle of truth altogether, the categorical differentiation between true and false.” As the author of the article mentions, the philosopher Hannah Arendt analyzed precisely this in an interview with Roger Errera in 1974: “If people are constantly lied to, the result isn’t that they believe the lies, but rather that no one believes anything at all anymore.…And a people that can no longer believe anything cannot make up its mind. It is deprived not only of its capacity to act but also of its capacity to think and to judge. 
And with such a people you can then do what you please.” As far as Germany is concerned, it’s not entirely clear whether the lessons learned and the insights gained during the long and painful process of Vergangenheitsbewältigung (a term referring to Germany’s process of coming to terms with its Nazi past) will be lasting ones, or whether they can immunize the country against the dangers of recurrence. The rapidly growing popularity of PEGIDA (Patriotic Europeans Against the Islamicization of the Occident) and the AfD (Alternative for Germany) and their reactionary anti-immigration and anti-Europe agendas remind us how precarious and contingent the current state of affairs remains, how enduring the fear of the Other continues to be. Yet it’s only when a populace ventures beyond the spellbinding effects of its collective identity—when the national debate succeeds in illuminating the blind spots of this identity to compare one culture’s hallucinations with another’s—that it has a chance at breaking that spell. What unites us most is how quickly our efforts can be instrumentalized for someone else’s purposes—and the ease with which we can be duped and played. We aren’t living in Orwell’s world, yet—we still have time to reclaim some of what’s been lost—but it’s anyone’s guess how much time remains. Image: Flickr/Darron Birgenheier

Baby Steps All the Way: Making the Time to Write a Book

Track practice. An hour and a half. A metal picnic table. Cold enough for hats and gloves, hot enough for shorts and flip-flops. Other parents talking about football and summer camps and the new high school.

Tennis practice. Second-story bleachers. Other parents scattered around, looking at phones or their children, who are learning to serve, to rush the net, to move their feet. Every now and then an intake of breath and a ball bounces into this upper deck. I save my document often.

Lunch break. The cafeteria-style section of Wegman’s grocery store, the overpriced pub in the hotel down the street from my office, the burrito place, the burger place, the salad place, the pho place. Me and my laptop and an hour to eat and write, 40 minutes if you count the drive, a chance to move this story along, just get words down, word count, produce content that may eventually be improved enough to be part of a novel. Maybe.

The bookstore coffee shop. Saturday afternoons. I can write for an hour, an hour and a half, and then I walk around, look at books, get another coffee, move to something else.

I have lost the muscles that used to let me write for extended blocks of time, long afternoons or evenings, three or four hours. I am 47, 48, 49, 50. I have a full-time job and mortgage and a 9-, 10-, 11-, 12-year-old son. This is how I wrote my new book—in 15-minute or half-hour or hourlong sessions of forcing out words between doing other things. These in-betweens are the only time I have. This is not how I thought books were written when I was a young person dreaming of becoming a writer, or when I was a student learning about other writers, or 20 years ago when I was first trying to actually be a writer. Then, I had what now seem like comically long stretches of time lying out before me, impossibly open weekend days or evenings with little to attend to other than a story or a novel or the idea that I should be working on the same. Then a whole bunch of stuff happened and all of a sudden I was a 45-year-old web manager with a young son and wife and too much on my Visa bill and a life full of other things that required my attention. All of a sudden nights were busy, days were busier, and the idea that some kind of ideal writing condition would one day occur became laughable. So: the in-betweens. Practices and lunches and a half hour at the end of the night, at the beginning of the day, as I’m waiting in the doctor’s office or the airport or the Chinese takeout place. I didn’t invent this, and over the past decade I’ve come to realize this is how a lot of us get work done. It’s not romantic, and it doesn’t feel particularly productive. Most of the time, sitting at these sports practices, I feel like an antisocial crackpot. I live in a small town somewhat obsessed with sports and it’s not great to feel like that, to be that person with the headphones on clacking away at a laptop while everybody else is chatting or cheering their kids on. It’s not great to turn down lunch invitations to sit by yourself and work away on something that may never become anything more than a Word file on your computer. Few of the people around me will ever know that my Word file eventually did turn into a book, but they may remember that I was holed up by myself, my face screwing up in disgust every now and then as I put shitty word after shitty word to make shitty sentences and shitty paragraphs. They may remember that Ben’s Daddy was the one that sat over there all by himself with his little computer, tapping away.
In the back of my mind I used to think that maybe they were looking at me and wondering what I was working on, that they would see me hunched over the computer and think of Stephen King or Flannery O’Connor, of the hard work and sacrifice that goes into producing art, but I’m pretty sure they were looking at me and thinking, look at that asshole. Most of the time, that’s how I felt, too, and I know a lot of working people or parents or working parents who feel this same way, like we have five jobs and are doing really shitty at every single one, all the time. Working this way doesn’t feel as good as the scene I imagined when I was 15 or 28: me at a desk, perhaps on a lake, a bottle of bourbon sitting off to the side, while Jann Wenner or my agent or the people from the publisher pace in their skyscrapers in New York, waiting for me to finish my masterwork, my Exile on Main Street or The Last Good Kiss or Jesus’ Son. Now I know better, and all of that seems ridiculous. I know I wrote a story bit by bit, 15 minutes here, two hundred words there, holed up in stolen moments ignoring the fact that lots of things were going on around me. Ignoring, most of all, the fact that I’ll never have those golden blank days stretched out in front of me, a giant desk in a country house with nothing to do but write. Ignoring all of that to see only those 40 minutes that were right in front of me right at that moment. It feels shitty, most of the time, moving things along in these bites of time, stolen moments in between other things. Eventually, though, you cobble enough of them together and you can make something that maybe you can turn into something better, and then you can find enough stolen moments to turn that thing into something good. Driving at night with headlights on, as the saying goes, you can only see a few feet in front of you, but you can get all the way home.

The YMCA. Lots of naked men walking back and forth as I sit here typing away on the bench furthest from the showers. I can hear my son laughing, goofing off with the kids from his swim lesson. Me, a dad, a husband, a guy with a full-time job, a writer who is just trying to scratch out some shitty words one post-swim-lesson shower at a time, finishing an essay. Image: Flickr/qeh4qer

Everything Worthwhile Is Very Far Away: An Excerpt from the New Introduction to ‘Poor White’

1. In 1919, after his first marriage had ended, his family had dissolved, and his career as a mail-order businessman had been tossed aside for artistic dreams, 42-year-old Sherwood Anderson published Winesburg, Ohio, his fourth book of an eventual 27. A quiet, stark volume of nearly plotless biographical sketches, Winesburg was a two-fold Big Bang in American literary history: Its prose style, a deviously intimate third-person omniscience that left no character’s innermost shame unexamined, invented an entire branch of our country’s fiction focused on anonymous, troubled lives, while its overlapping, episodic structure inaugurated the “novel in stories.” These are powerful legacies, perhaps too powerful to overcome: A century on, Anderson is rarely thought of for anything but Winesburg, Ohio, when he is thought of at all. That famous book’s syntax is so spare and haunting and its simple concept so well-executed that curious readers of Poor White, the 1920 novel that immediately followed it, might feel confused or exhausted over the course of its winding plot and frequent pugilistic asides. Here, Anderson is narratively diffuse rather than precise, socially as well as emotionally minded. Winesburg is a catalog of human loneliness and regret, whereas Poor White is, at least by appearances, a decades-spanning rags-to-riches epic. Going from one to the other, as contemporary readers are likely to do, feels like leaving a one-room log cabin for a taxidermy-festooned hunting lodge: there’s a shared sensibility, a familiar woodsy style, but the scope and intention aren’t even comparable. I suspect these curious readers will persevere despite the confusion. For all its imperfection, Poor White spills forth with the same stylistic beauty that Winesburg demonstrates, especially Anderson’s knack for heartbreakingly dense narration. And while the plot does wobble—beginning as an allegory, switching to a new character’s perspective and concerns almost halfway through, and culminating in previously unimaginable violence, without anything resembling emotional closure—it does so from grand intentions, and never without a unifying force of vision. If Winesburg, Ohio created a new literary language to map the human capacity for self-recrimination, then Poor White is an attempt to scale up those techniques and indict the entire industrialized world. Anderson sought to dramatize the creation of a Winesburg and its spiritually flustered denizens, and he sought to show that this one town’s ills are those of its age and its nation. It is an odd book but a great one, thematically and stylistically contiguous with its better-known predecessor. Ultimately, Poor White may even be more immediately relevant to readers a century on, as we contend with a berserk and unsettling new industrialism of our own.

2. Poor White opens about a decade after the Civil War, when Hugh McVey is born to a Missouri layabout who lives in a shack on the banks of the Mississippi River. When Hugh reaches the cusp of manhood, “a railroad pushed its way down along the river” and he takes a job as a station attendant. Soon he is adopted by his new employers, displaced Northeasterners named Henry and Sarah Shepard, and the latter enlists him in the Protestant work ethic: “It is a sin to be so dreamy and worthless,” Hugh hears from his new mother, and he takes this warning as his creed.
These early pages have a whiff of pastiche: Anderson essentially piles on national archetypal images (the mighty river, the life-changing encroachment of trains, industrious New Englanders named Shepard), while Hugh’s quiet intellect and dirt-poor upbringing recall Huck Finn or even Abraham Lincoln. “If I do not move and keep moving, I’ll become like father,” Hugh worries during one of his “dreamy” spells alone in the train station. His anxiety, his fear-driven need to make something of himself, by himself, is emblematic as well, in keeping with what Anderson, in his famous story “The Egg,” called “the American passion for getting up in the world.” The Shepards move on, and Hugh, compelled toward some undefined notion of material success, keeps moving as well. He experiences “three years of wandering” throughout the Midwest, passing for the first time through farmland and hill country. On this pilgrimage he takes a variety of manual labor jobs and realizes, looking at the communities around him, that “a quaint and interesting civilization was being developed. Men worked hard but were much in the open air and had time to think. Their minds reached out toward the solution of the mystery of existence.” This is the late 1880s, and one of the greatest pleasures of Poor White is Anderson’s crystallization of his native Midwest at the very moment of its romantic enshrining. He gives us loping fields of wheat and cabbage, hilltop barns aglow with kerosene lamps, and dirt-road towns where farmers arrive in horse-drawn wagons to purchase supplies or visit the haberdasher—a semi-cartoon naturalism of the kind Thomas Hart Benton and Grant Wood would manage in the visual arts soon after. Anderson’s depiction is romantic and pastoral, but he also writes as a cultural historian, telling us what these people read and what they think. He writes often in a collective third-person, sweeping the whole region along with summaries of industrial development and even occasional flash-forwards to describe what a certain building is “now,” that is, in 1920. Change becomes the guiding principle of this sainted land, not beauty or quiet or, god knows, the individual propensity for philosophical thinking. By the end of the novel’s second book, we have moved from pastiche to historical documentary, just as Hugh has finally arrived in Bidwell, Ohio, the scene of his most prosperous years. Well, not quite Bidwell. He is initially stuck at the train depot in nearby Pickleville—named, we learn, for a pickle factory that has since gone out of business. Sitting in the shadow of this empty factory, attending to the very occasional trains that pull in, Hugh once again begins to dream, this time in “the spirit of the age.” While not concerned with profit, he nevertheless occupies himself with thoughts of labor and invention. One night he surreptitiously observes cabbage pickers in a field and thinks up a machine that could do the backbreaking work on their behalf. This would be the moment of grand triumph in a typical saga of American fortune, but here is where Anderson really begins to interrogate the idea of the self-made man. It isn’t Hugh’s own ambition or talent that brings his invention to profitable life; it’s blind chance and the desperation of others. Steve Hunter, a would-be magnate who embodies the 1890s’ obsession with riches, needs something to impress investors and assumes that the perpetually doodling, socially awkward man down at the Pickleville junction has some idea they can throw money at.
Indeed he does: a mechanical planter, and soon the pickle factory is back up and running to manufacture it. Hugh, who was born in destitution and has yet to acquire so much as a friend, suddenly becomes, in his neighbors’ eyes, “the man who belonged to the new age of iron and steel.” Anderson is not coy about his feelings toward this age. As a young man, Hugh has a vision where he experiences himself as “part of something significant and terrible that was happening to the earth and to the peoples of the earth,” and this turns out to be prophecy. The success of the planter and his subsequent agricultural inventions quickly buries Bidwell’s placid farmland under pavement and factories, and longstanding social conventions are crushed as well. The metonym for this ravaging loss is Joseph Wainsworth, a talented harness-maker whose expert craftsmanship is made obsolete overnight by mass-production. “I know my trade and do not have to bow down to any man,” Wainsworth says when the machine-made harnesses arrive, but he is unconvinced even then, and his growing anger is the most important plot development that Anderson slowly builds under his main story.

3. In Anderson’s vision of socioeconomic progress, both the visionary inventor and the overwhelmed craftsman are like weathervanes spinning in the wind. Things are always happening to his characters, like the railway line that “pushed its way” into Hugh’s early life. Anderson’s artistic breakthrough was his exquisite depiction of this powerlessness and the pain it brings. “All men lead their lives behind a wall of misunderstanding they themselves have built,” as he writes in Poor White, “and most men die in silence and unnoticed behind those walls.” They are all like the local boy Ed Hall, one of the many who gather to see and celebrate newly rich Hugh McVey once the planter becomes nationally known. Standing in an awestruck crowd, Ed wonders aloud, “They say he’s smart. I suppose he wouldn’t tell me nothing. I wish I was smart enough to invent something and maybe get rich.” The book’s only major character who comes close to a life on their own terms is a woman, Clara Butterworth, daughter of Bidwell’s biggest landowner. Clara “did not want stupidly to accept life” and goes to college to find wisdom, experience, and friendship. The closest she comes is a formative relationship with a radical student named Kate Chancellor, with whom, Anderson implies, she had an unacted-upon romantic affinity as well. But life draws her back to Bidwell nonetheless, where, angrily aware of the social requirements for any successful woman, she accepts that she needs to marry. She chooses Hugh McVey, or as Anderson unromantically puts it, “a chain of circumstances...hurled them into marriage.” Despite their wealth, they remain hesitant, incompatible partners, perpetually unhappy and barely even capable of physical intimacy. Anderson himself knew this condition all too well and returned to it throughout his work, including in “The Egg,” which is part of a small group of his stories set in Bidwell. His central theme was man’s tendency to feel “swept aside from his purpose by the complexity of life,” as he writes in Poor White. But this novel is even more frankly autobiographical than his other tales. He may not have been a river rat like Hugh McVey, but Anderson grew up in poverty at the exact same time in history, with a father who underperformed at a series of careers, including, briefly, harness-making.
“A man, if he is any good, never gets over being a boy,” he would later write in the novel Tar: A Mid-West Childhood, and he was forever committed to avoiding his father’s waywardness. Anderson took to heart the various midwestern economic koans he heard as a boy, including “Money makes the mare go.” He went into business early and did his first writing as an ad man before eventually taking over his own catalog company and starting a family in Elyria, Ohio. That first marriage, to a worldly, wealthy businessman’s daughter named Cornelia Lane, was joyless and increasingly suffocating. Anderson’s interest in literature and writing took root in an attic workspace where he sequestered himself to escape from the daily trials of young children and money-chasing. In his memoirs he recounts an episode where, bereft and lonely, he returned from a nighttime walk and spotted Cornelia pacing in their yard. Rather than approach or comfort her, he kept his distance and peered through the shrubbery, watching her suffer in solitude. Much as Hugh McVey witnesses cabbage pickers through the bushes and vows to end their pain, Anderson took this furtive glimpse of quiet dread and built his entire aesthetic and worldview around it. His achievement in Poor White is a kind of double-vision, an ability to dramatize personal angst and national tragedy as one story. Anderson saw the machine age as a betrayal, and felt profit-maximization was a goal unworthy of decent people. He had his own psychic break with that life, a fugue state in 1912 that precipitated his heedless turn to fiction writing and gradual separation from business. Anderson knew the misery that metastasized in the commerce-centered life and wrote this grand, occasionally high-blown novel to warn people that industry will only create more awful nights in the yard for everyone it touches. He had poetry in his soul, and his earnest, mournful writing is saved from bathos by his insistence that everyone else does, as well.

4. In the 21st century, a Midwest-set novel titled Poor White surely spurs thoughts of opioid casualties and postindustrial decay. Anderson’s story is like a prologue to that current one, a warning that the industrialization of the country’s most fertile agricultural region was a rotten idea from the get-go. But its other warnings—about the instant and unregulated accrual of wealth, and the haphazard sorting of fortune and luck in boom times—ring louder to my contemporary ears. Poor White, above all, is about change. It concerns the great lost opportunities of industrialization, which might have improved people’s lives materially but moved them farther away from that concern for “the mystery of existence” that Hugh McVey notices in his countrymen before the machinery arrives. The parallels with the information age are almost too obvious to mention. Armed with the widest access to knowledge in human history, American society has grown more economically unequal. Middle-class wages have stagnated while the effective and actual costs of education, housing, even food, have skyrocketed. Unions are under perennial attack, just as they are in Poor White’s final pages. Blessed with unbelievable convenience in all aspects of life, we produce a literally life-threatening amount of garbage and struggle to find political consensus that the planet is even worth taking steps to save. In response to this hopeless reality, there is currently a widespread belief among readers and writers that fiction teaches empathy.
I find this argument wrong and even wrong-minded, but certainly all good writing shows us potential ways of seeing the world, and few writers have widened the scope of what American literature could show as much as Sherwood Anderson. He took our most cherished national myths of self-reliance and continuous societal improvement and slowed them to a crawl, revealing the confusion and dissociation that result from this gospel of eternal striving. He doesn’t ask us to empathize with other people so much as he forces us to confront our own unquestioned presumptions about ourselves. “Everything worth while is very far away,” thinks Clara on an automobile ride at the end of Poor White. Almost no one in Bidwell has a car at this point, but her father, “who now talked only of the making of machines and money,” unsurprisingly does. She could be speaking for the residents of left-behind towns in contemporary Ohio and elsewhere, or for the millions of us who feel the levers of power are accessible only to the hyper-wealthy. She could be speaking for anyone who reflexively refreshes a social media feed to stave off anxiety. It is the right time for us to rediscover Sherwood Anderson, who interrogated subconscious American assumptions as well as any other writer in our history. What a thrill to have this complex book back in print, to see the ways in which a genius can recognize the culture’s ills in its residents’ sadness—and his own. Poor White is a social history that reaches uncomfortably deep into its subjects’ psyches, a still-relevant tract that aches with raw feeling. We might yet heed its example, and appreciate its vision. Excerpted from John Lingan’s Introduction to the new edition of Poor White by Sherwood Anderson, part of Belt Publishing’s Belt Revivals series. Reprinted with permission from Belt Publishing, copyright 2018.

On Anthony Bourdain in a Tearing World

1. I made the mistake of trying to write about Anthony Bourdain with old episodes of Parts Unknown playing in the background. His sudden, jarring death demanded words and witness both, so the balance seemed simple enough, an earnest attempt at paying my honest respect. Then again, nothing about that work was built for the background. I settled randomly on Season 3, which began in Punjab Province, and the voice and sights and sounds arrested my attention like an urgent sermon. The staccato storytelling felt new again, suckering me into believing this was one I somehow hadn’t seen. I had, possibly twice, and stopped what I was writing to watch it anyway. There’s an intoxicating verve to these episodes, a pace and cadence apart from Bourdain’s contemporaries. How food and history and politics are tapestried together with such delicate deftness. Moving from human to human and scene to scene, embracing each on their level, cameras just as invisible to the subject as they are to you. It’s the kind of show that straight-up shapes the viewer’s desires, to where the answer to the question “Where in the world would you go if you could go there now?” is the episode you saw last—the meal you most want, whatever stranger’s table he made you share. And so I couldn’t write, not right away, because the beauty of what he created, then as now, simply stole my imagination. Only this time, the reverence is cut with confusion. You read into the shades and cocksure gait a poise that’s utterly imperturbable. This is someone who understands precisely where they’re going and why, in life, in craft, on any sunbaked street or arborous boulevard he happens to be smelling. In conversation, be the topic food or something barely tangential (like tripe), Bourdain navigates nuance and niche with the tack of a seawise journo, proud port-ward bent aside. He speaks lovingly of childhood, of family, of friendships forged in kitchen fires and over broken bread in travels even he concedes were an embarrassment of riches. A man who both acknowledges the absurd sequence of forces that sired his career and believes—or carries the air, at least—that he was born to do exactly this. And then you remember he ended it all, eighty-sixed the trip, traded knowing the world’s woes and wonders with a near-peerless depth and fervor for literally anything else. You remember this, and the image of it sinks you like an anchor. After confusion comes the anger. A kind that’s acutely self-aware, in all the ways these tragedies should be teaching us: about the futility of surmising someone’s pain, the folly inherent in conflating confident posture with true wellbeing—but a fleeting anger nonetheless. Asked whose job I’d most like to have, I named Bourdain without a second breath. Surely you would never betray such gifts. Surely, given this life of sensuous verve and immersive encounters, suffering and anguish needn’t apply. You recognize the grave naivety of these thoughts and rationalize them still. I certainly did. But once these close-fisted flails connect with nothing but air, you’re left gasping and grasping for what so many who’ve endured a loss to suicide are forced to lean on, broken and devoid of answers, until those planes once again maybe, hopefully converge: imperfect memories.

2. The first episode I saw was “Iceland.” I can’t recall the exact scene, though it may have been just before the fermented shark. What I do remember is being instantly drawn to the narrative pace, ditto the commanding presence of Bourdain himself.
This particular chapter was filmed in winter, a time when the sun barely musters a cameo, and there must’ve been something about the freeze-hearth back-and-forth that strummed a familiar chord, having been similarly mired in hibernation myself. A friend of mine, a fellow cook with whom I’d helmed a line years before, positively revered the man, plugging his book during smoke breaks and whatever turn-and-burn moments most harkened its juice. To call that first half-episode a revelation would be hyperbolic. I was hooked, however, and inhaled his entire TV canon as quickly as I could. A sweeping journey awaited, complete with warmer climes and an ever-blossoming belief in Bourdain’s effortless gift for wine and song. By the time our son Everett was born the following spring, I’d taken to recycling old episodes between new cuts of Parts Unknown, often while bouncing him to sleep on an exercise ball in the hours before my 5-to-2 a.m. writing shift. It never got stale. “Detroit” alone I watched no fewer than three times in a summer. Bourdain was as staple a routine as tummy time or midnight wakeups, a near-daily escape that had the fortunate virtue of being profoundly informative and poignant. When Rett was diagnosed with cancer that November, our world and purview shrank to the size of a hospital room. Media of every kind became transactional, a respite of denial, a focus so distant from daily life as to be interstellar. We barely noticed the calories we consumed, let alone what fare to seek on the streets of Glasgow or ranches of Montana. Our mantra was survival, of our baby boy and of whatever modicum of normalcy remained. I might’ve stolen an episode here or there. Likely on the Kindle during my post-midnight shift of keeping Rett asleep. He died the following February, at sunset on a Sunday, after four months of chemo and a last-ditch trip to St. Jude. The subsequent weeks unfurled in a blur of sorrow and sympathy and making plans for Rett’s memorial and how we’d honor him through the throbbing life to come. In April, Deana and I embarked on a two-week sojourn to Spain and Portugal. We spent what would’ve been Rett’s first birthday at Montserrat, where we touched the Black Virgin and lit a candle for our son amid his fellow thousands. Tethered though we were to our trauma, it was a chance to be human again, to wade ankles-deep into beauty and history and the pearls of perspective suspended in each. Bourdain was part of the early healing, one of the scant few things that made me feel I was or might be somewhere else, a plane ride past the place we watched our son stop breathing. The dark corners Bourdain revealed—the ugliness and utterly human sins—became part and parcel of the vagabond bliss. It was precisely because we’d unearthed the blunt brutality of life, how it thrills to make meals of us all, that the toasts and grease-sheened plates had me aching to know his silver-haired secrets. He’d become a hero, a vessel for my horribly scarred designs, and I can’t help but ponder whether the collective weight of that brand of worship—the kind that happens between commercial breaks—didn’t somehow hasten his despair.

3. There’s a series of scenes in the Buenos Aires episode where a couch-splayed Bourdain confides in a psychiatrist—Argentinians are per capita tops in this regard, apparently—about his various, seemingly amateur neuroses. About eating an airport hamburger and succumbing to a days-long spiral of depression.
“I’d like to be happy,” he says, in a way that only sounds serious in wrenching hindsight. “I’d like to be happier. I should be happy. I’ve had incredible luck. I’d like to look out the window and say, ‘Hey! Life is good!’ But I don’t.” If not for the clue-searching lens, I’d have likely brushed the scenes aside. Not because Bourdain’s conveyance wasn’t sincere. Rather, because anyone might’ve let slip the same—present company included. These are the things one expects an admitted libertine and lifelong outcast to say, after all. Tortured artist, fill in the rest. But processing what I had—the circumstances surrounding his death, a hundredfold more the daughter bereft of her father’s presence and love—the pain was there, gnawing through whatever tunnels it takes to reach and breach the slowly crumbling will. Absent an answer, speculation fills the gaps. Maybe he couldn’t square his genuine empathy and the white guilt that birthed it. Perhaps all the mirror saw was a failed novelist cursed to TV stardom. Or that there was some heinous act none of us knew about. Absent an answer, depression—this pain below the name—crawls camouflaged towards the light. Mostly I’ll miss him. His snout-to-tale lust for life. The way he turned simple food into accolades. How he made his country listen before it spoke and think before it ate. Bourdain and Zamir wandering Chernobyl in a last-night haze. I’ll miss him because he cared about people—their stories, memories, and mothers’ recipes—even knowing their inherent doom. The world is tearing too fast to lose that kind of ever-weaving thread. The one always aiming to pull it all together, wound by infected wound and damn the collateral blood, to where scars are all that’s left to heal. Whether he believed it or not, whether the wound creators know it or not, Bourdain was a healer. The kind that pulls so tight for so long through so many wounds you wonder how it never broke before. Image: Flickr/Anna Hanks

A Great Short Story Has a Pulse: Donald Barthelme’s ‘Game’

A few months before Donald Barthelme’s “Game” appeared in the July 31, 1965, issue of The New Yorker, a cloud moved over Los Angeles. The cloud originated in Jackass Flats, Nevada, born from a nuclear rocket test. The Atomic Energy Commission wanted a “controlled excursion.” That excursion rode hundreds of miles before drifting into the ocean. This was the era of Dr. Strangelove. A time when hundreds of kilograms of Uranium-235 went missing from a nuclear plant in Pennsylvania. The years when American B-52s secretly flew above Europe, hydrogen bombs at the ready, first strike only a moment away. In a 1982 interview, Barthelme captured the sentiment that charged “Game.” His words could apply to the Cold War, as well as our painful present: “The doomsday clock has been set up a few notches, I gather. The way our present government is talking is absolutely mad.”

“Game,” reprinted in Sixty Stories, is a paranoid, recursive, claustrophobic, uncomfortable tale. It takes place in a single room. Shotwell and the narrator are two military officers stuck in an underground bunker “in Utah, Montana or Idaho” for 133 days, “owing to an oversight.” They are bored; they are frustrated. Shotwell is stubborn. He plays jacks but does not let the narrator join the game. The narrator wants the jacks, but Shotwell stuffs them into his attaché case. This double game—a game of hiding a game—continues as they watch the console. They both are outfitted with .45s, along with hidden backup pistols, and are supposed to shoot “if the other is behaving strangely.” They are strange from the first word to the final word of the story, but strangeness is relative in Barthelme’s box of a story. Their job is significant: If “certain events take place upon the console, we are to insert our keys.” They are to release the “bird,” the missile that will destroy a city. It is a hypothetical situation. It never happens. But it could, and its possibility is the story’s profluence.

Barthelme’s recursive tale moves between several repeated signs—birds, guns, attaché case—but I am most drawn to its anomalies, its variables. A great short story has a pulse. A great short story is tightly wound—no wasted words or breaths—but a great short story has new contours when we return to it. I first read “Game” in the basement of a university library, among the dark stacks of nearly discarded issues of Popular Mechanics. But now, even reading the story in a brightly lit classroom, I can still appreciate how the bunker’s “pale green reinforced concrete walls sweat and the air conditioning zips on and off erratically.” Anaphoric, with the occasional aside and quirk, his sentences are like incantations—liturgical, even (not surprising—although he later lapsed, Barthelme had a nostalgia for Catholicism cultivated by his years at St. Thomas High School in Houston; think James Joyce sneaking into the back of churches, sentimentality tempering one’s skepticism). The narrator wonders if they are subjects of an experiment. Maybe. But we the readers are the experiment, so often, in Barthelme’s fiction. He pushes and strains the expectations we have for fiction. He asks us to play his game, with his rules. What is the syntax of a mind gone mad? Not just any mind—soldiers who can destroy cities. Mass destruction, “Game” suggests, is always in the wrong hands—because such power stains the soul.
In Donald Barthelme: The Genesis of a Cool Sound, Helen Moore Barthelme said that her ex-husband claimed the story had touched an official nerve: “According to Don, ‘Game’ evidently stirred considerable interest among the military. He said that ‘Game’ had knocked them all for a loop in the Pentagon, but ‘not because it was true.’ He later told me that although the story caused a small furor, he heard nothing further about it.” Neither Shotwell nor the narrator can sleep well at the end of “Game.” Two soldiers, close together and underground, cradle and rock each other to sleep. “Game” is a horror story. It is suffocating, and it is simple. We are in that bunker with these soldiers, and whatever postmodern games Barthelme was playing with language, the result is frightening. Sooner or later, the keys will be turned, and the bird will fly. Image: Flickr/B4bees