In a word cloud of writing about Flannery O’Connor’s stories, “grotesque” would be central, medium-large. The word is typically applied to the characters in her stories, whose spiritual deformities are often represented—in problematic authorial shorthand—by some kind of physical deformity. Hulga, in “Good Country People,” is archetypal, missing both a leg and the religious faith of her mother. Or Rufus, in “The Lame Shall Enter First,” with a clubfoot (she was big on legs and feet) that represents his essentially twisted nature. But the dictionary definition of grotesque, “comically or repulsively ugly or distorted,” also speaks to an essential quality of O’Connor’s work. Her stories are morally distorted, and the effect is both ugly and comic. In the denouement of her most famous story, “A Good Man Is Hard to Find,” the Misfit, an escaped convict, has his henchmen systematically march the family members of the Grandmother into the woods and shoot them. The Grandmother, we are given to understand, has brought this upon her family and herself—in her deceitful vanity, she has gotten them lost, and then, having stowed away her verboten cat which attacks her son while he’s driving, gotten them wrecked and stranded. She is to blame. It is a testament to both the power of O’Connor’s writing and the fame of this story that we do not generally find this ending odd. The Grandmother is annoying, true. She is a petty, class-obsessed bigot who wears perfumed cloth violets so that, in the event of a car accident, any onlooker "would know at once that she was a lady." She names her cat “Pitty Sing” (“Pretty Thing” done in a reprehensible Charlie Chan accent). She makes up stories about her youth to impress her bored family. She flatters herself and holds fatuous opinions about the world. She is, in short, almost exactly like my own late grandmother who, ancient pain in the ass though she was, probably did not cosmically or karmically merit the murder of my entire family. 
Yet we take this ending, in the context of the O’Connor universe, as more or less fair—if not deserved, exactly, then somehow structurally congruent. We accede to this kind of disproportionate judgment, and disproportionate punishment, in the majority of her stories. Disproportionality is at the heart of O’Connor’s work, and it represents both the worst and best aspects of her art. A pattern of punitive excess repeats again and again. Julian, in “Everything That Rises Must Converge,” is punished for workaday maternal contempt by having to watch his mother suffer a massive stroke. Asbury, in “The Enduring Chill,” guilty of being a proud, theatrical fool (not to mention, simply twenty-five), is consigned to a sickbed in his mother’s home for the rest of his days. True, we live in a world that can be mightily disproportionate, mightily unfair—we know this—but O’Connor is not merely describing our world and its injustice. These stories are torture boxes, lovingly designed by a master craftsperson to enact maximal punishment for minimal crimes. A flawed character is set inside; that flaw catches on the ineluctable, merciless gears of her narrative logic; the character is ground to dust. In “The Lame Shall Enter First,” the singularly named Sheppard and his little boy, Norton, have suffered the loss of their wife and mother. While volunteering at a local juvenile detention facility, Sheppard takes an interest in local ruffian Rufus Johnson—an intelligent child from a poor, criminal family. Rufus insists that he is evil, a claim repugnant to Sheppard’s rational atheism, and Norton becomes the battleground upon which these competing moral claims fight. Although Sheppard’s obvious sin, in the medieval Catholic framework of O’Connor’s work, is his godlessness, it can be argued that the greater underlying emotional sin is his privileging of Rufus over Norton.
His interest in helping Rufus supersedes his interest in his own son, whom he finds annoying and mildly repulsive, as the boy mirrors his own grief back at him. This is Sheppard’s emotional sin—his charity not beginning at home—and the penalty for this temporary neglect is the death of Norton, who hangs himself. We exit the story as the father kneels over his dead son’s body, utterly bereft in the coldest of all possible universes. This same distorted quality is also largely responsible for the black humor of her writing. She is a very funny writer, with a Mel Brooksian sense of the inherent comedy of the suffering of others. As Brooks put it, “Tragedy is when I cut my finger. Comedy is when you fall into an open sewer and die.” Comedy, in other words, is inherently disproportionate, at least partially premised on the lopsided limits of empathy where the suffering of oneself and others is concerned. Seinfeld’s humor, for example, is based on juxtaposing the main characters’ utter insensitivity to other people’s plights—in particular, Jerry’s breezy unconcern—with their extreme sensitivity to their own trivial problems. A child trapped in a germ-free “bubble” is fodder for jokes, while not being able to get seated at a Chinese restaurant becomes a kind of personal hell. Cartoons are almost completely predicated on a similar dynamic. Wile E. Coyote’s suffering is funny not only for the extreme violence of the physical comedy—the faulty boxes of dynamite and ill-planned catapults—but also, in a meta sense, for the fact of his Sisyphean struggle, the brutality of his universe and its twofold cruelty: assigning him an unappeasable desire for the Road Runner that can only result in his destruction, for all eternity. So it is in much of O’Connor’s writing, which, despite—and because of—its bleakness, is essentially comic.
“She would of been a good woman, if it had been somebody there to shoot her every minute of her life,” says the Misfit, after finally putting the Grandmother out of her misery. With these words, he dispatches the Grandmother’s moral existence as summarily as he has dispatched the lives of the family. It’s justly one of the most famous lines in any short story, and one of the funniest as well, in its bathos and casual horror. In a sense, the entire story is an enactment of Mel Brooks’s epigram, with the audience enjoying the bleakly funny spectacle of the Grandmother, in all her petty obliviousness, leading her family straight toward the open sewer that is the Misfit. Asbury’s bedridden fate in “The Enduring Chill,” while not comic in itself, is comic in the inexorable steps that lead to it. His prideful, self-dramatizing silliness leads to an ill-considered show of solidarity with his mother’s field hands, during which he gulps a bucket of tainted milk. The fact that he does not “deserve” to become gravely ill forever—while arguably a moral flaw in the storytelling—is a pure advantage in the comic rendering of Asbury’s pathetic existence. In a larger sense, there is something horribly comic in the experience of reading an O’Connor story. You pull back the cloth draped over the torture box, see the unlucky character enter, and nervously titter as you learn the terms of their demise. It is awful, and it is funny, and it is funny because it’s awful. It is funny because we live in a world that can be so awful, in which we spend large swaths of our time attempting to avoid that awfulness, and here is an artist training her godlike powers on the creation of perfectly awful worlds for her characters to inhabit. In this sense, the aesthetic experiences of watching Wile E. Coyote and reading Flannery O’Connor are not very different: in both, you are waiting for the two-ton anvil to fall from the sky.
This is also to say, however, that there is something fundamentally cartoonish about the moral unfairness of O’Connor’s work, an unfairness rooted, if one is inclined toward biographical interpretation, in her Catholicism and illness. Here was a genius twice cursed—first with original sin, then with lupus. These outsized personal afflictions surely inflected her worldview, a worldview that allows characters no agency—rarely, if ever, does the cool breeze of free will blow through her humid North Georgia hellscapes. It is an art of typology: types of characters, types of flaws and sins, types of punishments. This monolithic, caricatured quality is both a weakness and a strength, manifesting in ways that are impossible to disentangle. It lends, to my mind, an imposing greatness to the work that sometimes comes at the expense of truth.
Good fiction typically provides few good answers but many good questions. The great novels and stories can often be, however incompletely, expressed as a single, overarching question that the author is working out via narrative. Is the American dream an illusion? (The Great Gatsby); should a person marry for money? (Sense and Sensibility); can the son of God be born in human form and sacrifice himself to save humanity? (Harry Potter and the Deathly Hallows). Good essay, like good fiction, is also mostly engaged in the act of asking questions. But the forms differ in a few crucial aesthetic respects, leaving aside the basic fact of fictionality, which, as we know, can be an overstated difference—nonfiction is often partly invented and much fiction is true, or true enough, but never mind that. Centrally, fiction possesses a narrator that obscures the author. Largely as a function of the narrator’s existence and also simple novelistic convention, most novels seek to attain a smooth narrative surface, an artifactual quality. A great deal of received wisdom regarding fiction craft has to do with the author disappearing in the service of creating John Gardner’s “vivid, continuous dream.” This isn’t to say that essayists don’t also obsessively and endlessly revise to create a polished surface, but the goal is typically not authorial effacement. Maybe an easier way to say it would be that both fiction and essay revolve around formulating questions, but essay very often works the act of questioning—of figuring out what the question is—into the form. Joan Didion pioneered what we think of as the modern essay, a self-conscious blend of journalism, criticism, and personal experience. 
Some Didion essays are intensely focused on one subject, for instance, “On Keeping a Notebook.” But the most Didion-y of Didion’s essays are ones like “Slouching Towards Bethlehem” and “The White Album,” which meander through subject and theme like a car driving home from work via L.A.’s surface streets. “The White Album,” for example, combines the description of her mental instability and compulsive dread with a more panoramic view of her bad-trippy East Hollywood neighborhood in the late ’60s, a personal account that ripples out into larger cultural considerations: the Doors, the Manson murders, and California—always California. Didion’s stylistic legacy serves as both influence and study for Alice Bolin’s Dead Girls, an excellent collection of individual essays and also, to my mind, a fascinating example of the book-length possibilities of the essay form. Dead Girls begins in what seems straightforward-enough fashion with Part One, “The Dead Girl Show,” a quartet of thematically unified essays examining the centrality of the figure of the dead girl in American popular culture. These include “Toward a Theory of the Dead Girl,” about the glut of recent dead-girl TV shows including True Detective, The Killing, and Pretty Little Liars; “Black Hole,” about growing up in the serial killer-y Pacific Northwest; “The Husband Did It,” about true-crime TV shows; and my personal favorite, “The Daughter as Detective,” about Bolin’s father’s taste in Scandinavian crime thrillers. (A side note: It’s not a requirement that you have mystery-addict parents to enjoy this essay, but it could hardly fail to charm someone who, like myself, grew up in a house crammed with Maj Sjöwall and Per Wahlöö mysteries.) Having established its seeming method in Part One, the book veers sideways into Part Two, “Lost in Los Angeles,” essays largely about Bolin’s experience as a 20-something living, for no especially good reason, in L.A.
Aimlessness becomes a dominant theme, as the book shifts gears into writing about freeways, Britney Spears’s celebrity journey, and wandering around graveyards. Perhaps in an attempt to pre-empt readerly confusion at the book’s shape-shifting, Bolin has made it clear, both in the press and in the introduction: This is not just a book of essays about dead girls in pop culture. I understand this concern and will admit to feeling a slight confusion about Bolin’s project immediately after Part One. But proceeding through Part Two, and then Part Three, “Weird Sisters,” about teenage girlhood and the occult, I found myself increasingly glad the book had morphed and kept morphing. The book’s intelligence has a questing quality, a pleasant restlessness as it moves from literary criticism to personal anecdote to academic cultural/political critique and back again, like a jittery moth that never lands for too long on the light it circles. The way Bolin modulates subject and approach metaphorizes both the breadth and slipperiness of her main thematic concern: narratives of female objectification. The book generally proceeds from objective to subjective, mimicking the detached and objectifying eye of its central detective figure in Part One, then moving steadily into subjective, personal territory. Like Indiana Jones switching a bag of sand for gold, Bolin substitutes her younger self for the Dead Girl and, in doing so, bestows agency on the Dead Girl, brings her to life. Part Four, the longform essay “Accomplices,” brings the project to an end and to a thematic whole. In a way, it embodies the entire book, incorporating the major concerns—growing up, white female objectification and privilege, romance and the lack thereof, Los Angeles—into a self-aware meditation on the author’s sentimental education in the context of literary counterparts like Rachel Kushner and Eileen Myles and, yes, Joan Didion.
Bolin seems to be asking whether there is, inherent in the act of writing the classic coming-of-age “Hello to All That” essay, as she puts it, a self-objectification that echoes the deadly cultural objectifications critiqued earlier in the collection. “How can I use the personal essay,” she asks, “instead of letting it use me?” Part Four anatomizes the entire Dead Girls project, simultaneously encapsulating the book and acting as a Moebius strip that returns the reader to the more stylized and essayistic distance of the opening chapters. To be clear, there are many standout and stand-alone individual essays in these sections. The aforementioned “Daughter as Detective,” which, in addition to its many virtues, contains the unforgettable description of Bolin’s father as a “manic pixie dream dad.” “This Place Makes Everyone a Gambler,” a deft personal history of reading and rereading Play It as It Lays that weaves together L.A. noir, Britney Spears, and Dateline NBC. “Just Us Girls,” a touching cultural study of adolescent female friendship. But the book’s biggest triumph, in my opinion, is of a larger, formal nature, as Bolin marshals her themes and interests into a book-length reflection of, and on, the persistent figure of the Dead Girl. Alice was kind enough to field a few of the questions that occurred to me in writing this review, mainly regarding how this book’s singular form came to be.

The Millions: Can you provide a little general background about how the book got written? I'm curious which essays were written first. Also, if there were any pieces that it became apparent needed to be written in the interest of book-length cohesion. I'm especially interested in "Accomplices," which serves so well as an embodiment and critique of the project.

Alice Bolin: This is a little hard to answer because most of the previously published essays in the book are drastically changed from their earlier forms.
I would say the book really started with “The Dead Girl Show” and the essays in the second section about California, which I started writing, hilariously, the second I moved there. I started most of those pieces in 2013 and 2014 in Los Angeles, and that was when I started to see the ideas I'd been working with coming together in some vaguely book-like shape. Most of the essays in the third section, “Weird Sisters,” existed earlier, though, in different versions—I realized late in the game that my preoccupations with witchiness and teen girl pathology pretty obviously dovetailed with the Dead Girl thing. "Accomplices" was the last piece I wrote for the book, and I knew that it was my opportunity to pull up some of the narrative paths I'd laid down earlier, both about Dead Girls and about my own life. The book as a whole is about questioning received narratives, so I had ambitions for it to work as sort of a (sorry) palimpsest, putting forth suppositions and then writing over or revising them. I want there to be some dissonance for the reader.

TM: At what point did the theme of the Dead Girl emerge? Was it obvious from the start? The collection approaches this subject from so many angles; I’m interested in whether there was a certain amount of retrofitting in the revision—that is, were there already completed or published essays that you went back to and revised with the dead girl subject/theme in mind? Or did it all kind of hang together as it does from the start?

AB: I think once I wrote “The Dead Girl Show,” I saw that Dead Girls were a theme that I had been interested in for a very long time. I had already been writing about thrillers, true crime, detective fiction, and horror movies, genres where Dead Girls were everywhere. After that I was thinking about other ways I could write about Dead Girl genres—like in the Nordic Noir essay—and about subjects from other pulp genres that could throw those essays into relief, like pop music or reality TV.
I didn't really do much retrofitting that I can remember, except maybe lines here and there. I have my MFA in poetry, so I have borrowed a lot of the ways I think about a collection from poetry books—that you allow your preoccupations to dictate the shape of the book, instead of the other way around.

TM: The book’s critical mode seems to move somewhat from objective to subjective, and then, in Part Four, to comment on that move. That is (and I realize I might be oversimplifying here, since all these elements exist in all the essays), Part One is predominantly cultural critique, and then Parts Two and Three become increasingly personal. To what extent was this movement something that organically emerged in revision, and to what extent was it conscious?

AB: It's interesting, because in my original draft I had the California essays first, and the Dead Girl essays second—they seemed most important to me, but then my editor was like "Uh, shouldn't Dead Girls be first since that is the title and the whole point of the book?" She was so right. Someone else has pointed out that the book works like a Dead Girl show, with the Dead Girl as bait at the beginning of the book, but the rest of the narrative arc being about something totally different. I love this, but it didn't really occur to me, except maybe intuitively. I definitely wanted the fourth section to critique the strategies of earlier essays, but beyond that, the organization was more by subject than method. I actually wanted to cut the third section late into the drafting process, if that tells you anything about how uncomfortable I am with writing about my own life!

TM: To me, because of the thematic unity and movement of the book, Dead Girls has a somewhat novelistic quality or instinct. Is this something you’re interested in doing? More generally, what’s next?

AB: This is such a nice compliment! I am absolutely interested in experimenting with fiction.
I had a sort of epiphany in the past few months about how my own attitude toward myself in the book is a lot like the detachment novelists have toward their characters—it's the only way I can break through (or maybe... use?) my self-loathing. Anyway, yes I am interested in writing an autobiographical novel sometime in the future, with more details TBA, in maybe like 10 years. I'm also thinking about another very girly essay collection about magazines, social media influencers, and the vintage internet, and more generally, the way women have mediated and monetized their personalities.
1. Kurt Vonnegut’s caution against the use of semicolons is one of the most famous and canonical pieces of writing advice, an admonition that has become, so to speak, one of The Rules. More on these rules later, but first the infamous quote in question: “Here is a lesson in creative writing. First rule: Do not use semicolons. They are transvestite hermaphrodites representing absolutely nothing. All they do is show you've been to college.” To begin with the lowest-hanging fruit here—fruit that is actually scattered rotting on the ground—the “transvestite hermaphrodite” bit has not aged well. The quote also, it seems, may have been taken out of context, as it is followed by several more sentences of puzzlingly offensive facetiousness, discussed here. That said, I also have no idea what it means. My best guess is that he means semicolons perform no function that could not be performed by other punctuation, namely commas and periods. This obviously isn’t true—semicolons, like most punctuation, increase the range of tone and inflection at a writer’s disposal. Inasmuch as it’s strictly true that you can make do with commas, the same argument could be made of commas themselves in favor of the even unfussier ur-mark, the period. But that is a bleak thought experiment unless you are such a fan of Ray Carver that you would like everyone to write like him. Finally, regarding the college part, two things: First, semicolon usage seems like an exceedingly low bar to set for pretentiousness. What else might have demonstrated elitism in Vonnegut’s mind? Wearing slacks? Eating fish? Second, in an era of illiterate racist YouTube comments, to worry about semicolons seeming overly sophisticated would be splitting a hair that no longer exists. But however serious Vonnegut was being, the idea that semicolons should be avoided has been fully absorbed into popular writing culture.
It is an idea pervasive enough that I have had students in my writing classes ask about it: How do I feel about semicolons? They’d heard somewhere (as an aside, the paradoxical mark of any maxim’s influence and reach is its anonymity, the loss of the original source) that they shouldn’t use them. To paraphrase Edwin Starr, semicolons—and rules about semicolons—what are they good for? As we know, semicolons connect two independent clauses without a conjunction. I personally tend to use em dashes in many of these spots, but only when there is some degree of causality, with the clause after the em typically elaborating in some way on the clause before it, idiosyncratic wonkery I discussed in this essay. Semicolons are useful when two thoughts are related, independent yet interdependent, and more or less equally weighted. They could exist as discrete sentences, and yet something would be lost if they were: an important cognitive rhythm. Consider this example by William James:

I sit at the table after dinner and find myself from time to time taking nuts or raisins out of the dish and eating them. My dinner properly is over, and in the heat of the conversation I am hardly aware of what I do; but the perception of the fruit, and the fleeting notion that I may eat it, seem fatally to bring the act about.

The semicolon is crucial here in getting the thought across. Prose of the highest order is mimetic, emulating the narrator or main character’s speech and thought patterns. The semicolon conveys James’s mild bewilderment at the interconnection of act (eating the raisins) and thought (awareness he may eat the raisins) with a delicacy that would be lost with a period, or even a comma—a comma would create a deceptively smooth cognitive flow, and we would lose the arresting pause in which we can imagine James realizing he is eating, and realizing that somehow an awareness of this undergirds the act.
An em dash might be used—it would convey the right pause—but again, ems convey a bit of causality that would be almost antithetical to the sentence’s meaning. The perception follows temporally, but not logically. In fact, James is saying he doesn’t quite understand how these two modes of awareness coexist. Or consider Jane Austen’s lavish use of the semicolon in this, the magnificent opening sentence of Persuasion:

Sir Walter Elliot, of Kellynch Hall, in Somersetshire, was a man who, for his own amusement, never took up any book but the Baronetage; there he found occupation for an idle hour, and consolation in a distressed one; there his faculties were roused into admiration and respect, by contemplating the limited remnant of the earliest patents; there any unwelcome sensations, arising from domestic affairs changed naturally into pity and contempt as he turned over the almost endless creations of the last century; and there, if every other leaf were powerless, he could read his own history with an interest which never failed.

Periods could be ably used here, but they would not quite capture the drone of Elliot’s stultifying vanity. Again, form follows function, and the function here is to characterize the arrogantly dull mental landscape of a man who finds comprehensive literary solace in the Baronetage. More than that, the semicolons also suggest the comic agony of being trapped in a room with him—they model the experience of listening to a self-regarding monologue that never quite ends. We hardly need to hear him speak to imagine his pompous tone when he does. The semicolon’s high-water mark of usage, as shown here, ran from the mid-18th century to the mid-to-late 19th. This is hardly surprising, given the style of writing during this era: long, elaborately filigreed sentences in a stylistic tradition that runs from Jonathan Swift to the James brothers, a style that can feel needlessly ornate to modern readers.
Among other virtues (or demerits, depending on your taste in prose), semicolons are useful for keeping a sentence going. Reflecting on the meaning of whiteness in Moby-Dick, Melville keeps the balls in the air for 467 words; Proust manages 958 in Volume 4 of Remembrance of Things Past during an extended, controversial rumination on homosexuality and Judaism. There is a dual effect in these examples and others like them of obscuring meaning in the process of accreting it, simultaneously characterizing and satirizing the boundaries of human knowledge—a sensible formal tactic during an era when the boundaries of human knowledge were expanding like a child’s balloon. Stylistically, the latter half of the 20th century (and the 21st) has seen a general shift toward shorter sentences. This seems intelligible on two fronts. First—and this is total conjecture—MFA writing programs came to the cultural fore in the 1970s and over the last few decades have exerted an increasing influence on literary culture. I am far from an MFA hater, but the workshop method does often tend to privilege an economy of storytelling and prose, and whether the relationship is causal or merely correlational, over the last few decades a smooth, professionalized, and unextravagant style has been elevated to a kind of unconscious ideal. This style is reflexively praised by critics: “taut, spare prose” is practically a cliché unto itself. Second, personal communication through the 20th century to today has been marked by increasing brevity. Emails supplant letters, texts supplant emails, and emojis supplant texts. It stands to reason that literary writing style and the grammar it favors would, to a degree, reflect modes of popular, nonliterary writing. Beyond grammatical writing trends, though, semicolons are a tool often used, as exemplified in the Austen and James examples, to capture irony and very subtle shades of narrative meaning and intent.
It might be argued that as our culture has become somewhat less interested in the deep excavations of personality found in psychological realism—and the delicate irony it requires—the semicolon has become less useful. Another interesting (though possibly meaningless) chart from Vox displays the trend via some famous authors. As fiction has moved from fine-grained realism into postmodern satire and memoir, has the need for this kind of fine-grained linguistic tool diminished in tandem? Maybe. In any case, I have an affection for the semi, in all its slightly outmoded glory. The orthographical literalism of having a period on top of a comma is, in itself, charming. It is the penny-farthing of punctuation—a goofy antique that still works, still conveys.

2. A larger question Vonnegut’s anti-semicolonism brings up might be: Do we need rules, or Rules, at all? We seem to need grammatical rules, although what seem to be elemental grammatical rules are likely Vonnegutian in provenance and more mutable than they appear. For instance, as gender norms have become more nuanced, people—myself included—have relaxed on the subject of the indeterminately sexed “they” as a singular pronoun. Likewise with the rule I learned in elementary school about not ending sentences with prepositions. Turns out there’s no special reason for this, and rigid adherence to the rule gives you a limited palette to work with (not a palette with which to work). We know, on some level, that writing rules are there to be broken at our pleasure, to be used in the service of writing effectively, and yet writing is such a difficult task that we instinctively hew to any advice that sounds authoritative, clinging to it like shipwrecked sailors to pieces of rotten driftwood. Some other famous saws that come to mind:

Henry James: “Tell a dream, lose a reader.”

Elmore Leonard: “Never open a book with weather.”

John Steinbeck: “If you are using dialogue—say it aloud as you write it.
Only then will it have the sound of speech.”

Annie Dillard: “Do not hoard what seems good for a later place.”

Stephen King: “The road to hell is paved with adverbs.”

And more Kurt Vonnegut: “Every character should want something, even if it is only a glass of water”; “Every sentence must do one of two things—reveal character or advance the action”; “Start as close to the end as possible.”

In the end, of course, writing is a solitary pursuit, and for both good and ill no one is looking over your shoulder. As I tell my students, the only real writing rule is essentially Aleister Crowley’s Gödelian-paradoxical “Do what thou wilt shall be the whole of the Law.” Or alternately, abide by the words of Eudora Welty: “One can no more say, ‘To write stay home,’ than one can say, ‘To write leave home.’ It is the writing that makes its own rules and conditions for each person.”
Has there, in American letters, at least, ever been a better noticer than John Updike? By “noticer,” I don’t mean “writer of detail,” exactly. There are many writers whose use of detail I find more narratively effective: Saul Bellow, Joan Didion, Toni Morrison, Denis Johnson, Stephanie Vaughn, etc. Updike’s use of detail, as I discussed in a recent essay, often impinges upon the realism, or “realism,” of his fictional worlds, overlarding them with sensory detail that too often alerts the reader to the writer’s presence. But in terms of pure, preternatural eye for the minute, the ephemeral, and the easily missed, it is difficult to think of another writer who can compare. Here are a couple of excerpts from “Incest,” one of the earlier stories (contained, in fact, in the wonderful The Early Stories), many of which endeavor to capture the mundane drama of young married life, as played out in a series of similar, cramped New York apartments. In the first excerpt, Lee, husband and father, is cleaning up the sugar his daughter has spilled:

With two sheets of typing paper, using one as a brush and the other as a pan, he cleaned up what she had spilled on the counter, reaching around her, since she kept her position standing on the chair. Her breath floated randomly, like a butterfly, on his forearms as he swept.

Later, he watches his wife drying diapers in the bathroom:

Looking, Lee saw that, as Jane squinted, the white skin at the outside corner of her eye crinkled finely, as dry as her mother’s, and that his wife’s lids were touched with the lashless, grainy, desexed quality of the lids of the middle-aged woman he had met not a dozen times, mostly in Indianapolis, where she kept a huge brick house spotlessly clean and sipped vermouth from breakfast to bed.

The random breath on the forearm like a butterfly, the fine-grained quality of dry eyelid skin—these are signature Updike details.
What other writer would notice, let alone bother to describe, the texture of his wife’s eyelids? He is a master at this kind of description, keying in on the microscopic and near-invisible, not as world-building flourishes, but as primary detail. The child’s breath on Lee’s forearm hairs is a perfect metaphor for Updike’s sensory apparatus—I imagine this apparatus as fine cilia, authorial cat whiskers, delicately picking up the slightest shift in descriptive breeze. This keenness extends to non-sensory details, the internal mechanism of a character’s mundane thoughts. We have closely followed Lee from, in his words, breakfast to bed, and we lie with him as he deploys a favorite tack against the insomnia their tiny, hot apartment causes him, a mental exercise he refers to as the insomnia game. The insomnia game sees Lee working his way through the alphabet in the following manner:

He let the new letter be G. Senator Albert Gore, Benny Goodman, Constance Garnett, David Garnett, Edvard Grieg, Goethe was Wolfgang and Gorki was Maxim. Farley Granger, Graham Greene…

Detailing a character’s nighttime mental routine is unusually perceptive to begin with, but true to form, Updike finds a higher register of his protagonist’s attention to pay attention to, as Lee pauses his list to work through the less familiar foreign first names of Goethe and Gorki. These moments of deep attention are not there to embellish a larger narrative point—they are the point. In the space of several pages, an accumulation of these details creates a world in miniature and a feeling of rare intimacy with its inhabitants. The apartment stories—“Incest,” and others like it: “Snowing in Greenwich Village,” “Sunday Teasing,” “Should Wizard Hit Mommy?,” etc.—seem to me the perfect vehicle for Updike’s rare gifts. There is, after all, a claustrophobia to this kind of detail, and these small city apartments match setting and theme to technique.
The apartments become a kind of panopticon, with Updike’s thousand eyes relentlessly monitoring the stifled desires and tense moods of their main characters—almost invariably a young married man. At the same time, the molecular focus that Updike brings to these stories manages to turn one-bedroom shotguns into universes of discovery. There is a sense in these stories of relentless searching for a truth or truths that can only be found by plumbing down into the granular, the microscopic; these tiny details both encode and reveal the larger hidden structures we move through unaware. A small New York apartment contains an entire life—the miracles (and curses) of marriage, childbirth, sex, and death. The stories themselves follow suit, finding the largest meaning in the smallest plot point. For the most part, these tales have little in common with others of their seeming kin—John Cheever’s earlier stories, for example, many of which also center on young married couples pressure-cooking in tiny apartments. But Cheever, despite being an eccentric fabulist, is conventional in his adherence to traditional plot devices, the need for inciting incidents and escalating tension. There is no Enormous Radio in Updike, just a very small one playing in the background, the crackling sound agitating the room’s ambient emotion, and the humbleness of the device itself obscurely bothering Lee (or Richard or Arthur) trying to read in the next room. We seem to be dropped into these characters’ lives almost at random, the barest wisp of event enough cause for a story to coalesce. In “Walter Briggs,” for example, a couple driving home from a party (cars—or trains, or airplanes—like apartments, are ideal Updike sensory dioramas) attempts to remember people they met on their first wedded vacation:

“How could you forget Roy? And then there was Peg Grace.”
“Peg, Grace. Those huge eyes.”
“And that tiny long nose with the nostrils shaped like water wings,” Claire said.
“Now: tell me the name of her pasty-faced boyfriend.”
“With the waxy blond hair. Lord. I can’t conceivably hope to remember his name. He was only there a week.”

This goes on far longer than it would in the work of other writers, several pages. On and on they drone, exactly the kind of inane, half-focused conversation that composes the atomic structure of married life, exactly the kind of desultory scene typically excised from most stories. And yet: something snags—her enviable memory, and his inability to summon the name of a comic figure at the camp. Later, at the door of sleep, his mild frustration blends with a litany of details from the camp, and the sudden return of the man’s name—Walter Briggs—is like a poignant echo of his old love for his wife. In both their obsession with remembering tiny details, and their ability to do so, these two resemble their creator. Who but Updike would find erotic charge in a nostalgic memory competition? By so heavily foregrounding textural detail in these stories, Updike calls into question what constitutes a story to begin with. There is an aesthetic claim being made, that anything can be a story if you look closely enough. And the domestic sphere, Updike’s natural habitat and milieu, is all stifling closeness—what are marriages if not an infinite series of minute, learned, hateful, and joyful gestures, performed in the tiny theaters of our living rooms and beds? Here, the aesthetic claim grades into something that approaches a moral claim. A reader, waiting in these stories for twist or conflict or denouement, will get to the end unsatisfied, having missed all the intervening action, action that occurs on a moment-to-moment perceptual level. Life is like that, too.
Although modern readers may justly find Updike morally distasteful on many counts—mid-century white male privilege, literary sexism, and political conservatism to name a few—he seems exemplary, at least, in the sense of how much attention a person ought to bring to bear on the banal splendidness that comprises their life. Taken as a whole, the attentiveness that Updike trains on these intermittently peaceful and unpeaceful homes becomes performative and self-justifying. As with a fantastically gifted magician, the show becomes less about the trick itself and more about the dexterity required to perform it. He is constantly finding the edge of his talent and reaching just beyond it for the detail so fine and fleeting that it is preposterous, even for him, to notice it. Yes, on a basic level, this is show-offy. But I sense that it also comes from a generous instinct, a desire to share something with the reader no one else has shared before. As he puts it in “Wife-Wooing”: “An expected gift is not worth giving.”
Punctuation, largely invisible and insignificant for normal people, as it should be, is a highly personal matter for writers. Periods, commas, colons, semicolons: their use or non-use, their order and placement, can represent elaboration, conjecture, doubt, finality. And in aggregate, over the course of a text, the rhythms of punctuation advance an author’s worldview and personality as surely as any plot or theme. Patterns of punctuation usage are the writerly equivalent of an athlete’s go-to moves, or a singer’s peculiar timbre and range—those little dots and squiggles, in a sense, encode your voice. Anthony Powell’s colon (pardon the inadvertent image) is as signature as Kyrie Irving’s crossover or Rihanna’s throaty cry. For me, there is no punctuation mark as versatile and appealing as the em dash. I love the em dash in a way that is difficult to explain, which is, probably, the motivation of this essay. And my love for it is heightened by the fact that many writers never, or rarely, use it—even disdain it. It is not, so to speak, an essential punctuation mark, the same way commas or periods are essential. You can get along without it and most people do. I don’t remember being taught to use it in elementary, middle, or high school English classes; I’m not even sure I was aware of it then, and I have no clear recollection of when or why I began to rely on it, yet it has become an indispensable component of my writing. It might be useful to include an official definition of the em. From The Punctuation Guide: “The em dash is perhaps the most versatile punctuation mark. Depending on the context, the em dash can take the place of commas, parentheses, or colons—in each case to slightly different effect.” The “slightly different” part is, to me, the em dash’s appeal summarized. It is the doppelgänger of the punctuation world, a talented mimic impersonating other punctuation, but not exactly, leaving space to shade meaning.
This space allows different authors to use the em dash in different ways, and so the em dash can be especially revealing of an author’s style, even their character. The maestro of the em dash—as he was with many things (and apologies here, it is difficult not to annoyingly play, or seem to play, on a punctuation mark’s usage while writing about it)—was probably Vladimir Nabokov. The locus of Nabokov’s attention is usually at least half trained on the fictional document he’s producing, so em dashes often serve as a kind of in-text footnote. But in a more general sense, he simply employs them as part of his exemplary stylistic machinery, using them as counterweights against commas, as parenthetical ballast and rhetorical cog. In Lolita, Nabokov is engaged in creating a calibrated ironic voice that half-emulates speech while retaining its smooth literary surface, and em dashes enable a more precise pacing of words and thoughts from the sentence to paragraph level. A representative passage chosen completely at random:

I launched upon an “Histoire abregee de la poesie anglaise” for a prominent publishing firm, and then started to compile that manual of French literature for English-speaking students (with comparisons drawn from English writers) which was to occupy me throughout the forties—and the last volume of which was almost ready for press by the time of my arrest.

I found a job—teaching English to a group of adults in Auteuil.

Notice how the em dashes here, not strictly prescribed by any pressing grammatical need (the first could be justly replaced with a comma, the second eliminated), are used to create an internal structure that bridges paragraphs. The long sentence at the end of the first paragraph closes with a short clause set off by an em dash, and the short sentence at the beginning of the next starts with a shorter clause also enclosed by the em.
The chief effect of this kind of bracketing is, I think, intuitive and rhythmical, adding to Humbert’s pompous purr, but there is a secondary effect of conjoining the ideas of transgression (his arrest) and seeming normalcy (finding a job), a pas de deux central to Lolita’s thematic heart. A more contemporary user of the em dash is Donald Antrim. Antrim’s em dash helps to create a faltering narration that expresses the pervasive emotional mood of his work, an almost paralytic anxiety. Take this first sentence, from the story “Ever Since”:

Ever since his wife had left him—but she wasn’t his wife, was she? he’d only thought of her that way, had begun to think of her that way, since her abrupt departure, the year before, with Richard Bishop—Jonathan had taken up a new side of his personality and become the sort of lurking man who, say, at work or at a party, mainly hovers on the outskirts of other people’s conversations, leaning close but not too close…

The narrator has only just begun to have a thought about Jonathan’s wife before a new thought intrudes, needlessly clarifying who she is to him before we even know who he is. “Needlessly” in story terms, though the larger narrative need is to exemplify, through halting syntax, Jonathan’s excruciatingly circumspect mental process.
This is, to a degree, Antrim’s own process, and we get doses of it even in more remote, comic narration, such as in the long beginning sentence—Antrim is a great lover of long beginning sentences—of “An Actor Prepares”:

Lee Strasberg, a founder of the Group Theatre and the great teacher of the American Method, famously advised his students never to “use”—for generating tears, etc., in a dramatic scene—personal/historical material less than seven years in the personal/historical past; otherwise, the Emotion Memory (the death of a loved one or some like event in the actor’s life that can, when evoked through recall and substitution, hurl open the floodgates, as they say, right on cue, night after night, even during a long run)—this material, being too close, as it were, might overwhelm the artist and compromise the total control required to act the part or, more to the point, act it well; might, in fact, destabilize the play; if, for instance, at the moment in a scene when it becomes necessary for Nina or Gertrude or Macduff to wipe away tears and get on with life; if, at that moment, it becomes impossible for a wailing performer to pull it together; if, in other words, the performer remains trapped in affect long after the character has moved on to dinner or the battlefield—when this happens, then you can be sure that delirious theatrical mayhem will follow.

Here, Antrim actually violates, as he sometimes does, a basic rule of parenthetical em dash usage: that you can use only one set per sentence. The violation of this stricture is unsettling and makes it difficult to keep up with meaning. Which, in a sentence and story about artistic chaos and loss of control, is, of course, the point. Emily Dickinson is probably the best-known user of the dash, to such an extent that “em” might justly be taken as short for “Emily.” She habitually ended lines with em dashes, sometimes to an obvious effect, sometimes not.
Here is her most famous stanza:

Because I could not stop for Death—
He kindly stopped for me—
The Carriage held but just Ourselves—
And Immortality.

What are the dashes doing here? On the one hand, since they don’t serve any obvious syntactical function, they can be read simply as a stylistic tic. But they do create a feeling of hesitation that serves the poetry. Without them, this stanza is a nicely crafted, clever piece of thinking about the inevitability and dignity of death. With them, we feel Dickinson’s hand hovering over the page, considering her subject. This lends a poignancy to the poem, a sense of the artist thinking through her subject, considering the terms of her own death. Her use of the em dash obliquely posits writing as an elaborative act, and in many of her poems the em transforms what would otherwise be somewhat inert, though great, common meter into something alive to itself, process-oriented. My own favorite use of the em dash is for elaboration, similar to the way many writers use colons. As a personal rule, I only use colons in a specific context: that is, if what follows answers the question what? Em dashes I find useful for both narrowing and expanding a train of thought that might lose momentum in a new sentence—in this sense, they also stand in for the semicolon, but semicolons are best used (in my fuddled cosmology of punctuation) as dividing walls between two related but independent thoughts of approximately equal value (I wholly reject, by the way, that old bullshit about eliminating semicolons). In truth, I probably overuse the em, find too much pleasure in asides, in explanation. But I can’t do it, I cannot write terse little impregnable Tobias Wolffian sentences that stand on their own.
Though I can admire a page of these sentences—the calm presiding rationality, like civilized people queueing to exit the building in a fire drill—I am drawn instinctively to the dithering em, some contingency always butting up, worrying the previous sentence before it’s had a chance to end. As Noreen Malone put it in a self-deprecating Slate article, “The problem with the dash—as you may have noticed!—is that it discourages truly efficient writing. It also—and this might be its worst sin—disrupts the flow of a sentence.” This is true. But is efficiency the point or purpose of writing? It seems to me that novels, especially, are almost anti-efficiency devices. Yes, we want to communicate clearly, but sometimes, just as crucially, we also want to clearly communicate the difficulty of communicating clearly.
“I think fiction writing which does not acknowledge the uncertainty of the narrator himself is a form of imposture which I find very, very difficult to take. Any form of authorial writing where the narrator sets himself up as stagehand and director and judge and executor in a text, I find somehow unacceptable. I cannot bear to read books of this kind.” – W. G. Sebald

That this is the age of first person seems undeniable. Essay and memoir are—have been for some time—culturally ascendant, with the lines between fiction and essay increasingly blurred (I’ve written about this here). In its less exalted form, first person dominates our national discourse in many guises: the tell-all, the blog post, the reality confessional booth, the carefully curated social media account, the reckless tweets of our demented president. We are surrounded by a multitude of first-person narratives, vying for our time and attention, and we respond to them, in our work, and increasingly in our art, in first person. My impression, as a writer and teacher, is that over the last 10 or 15 years there has been a paradigmatic move toward first person as the default mode of storytelling. In a workshop of 20 student pieces, I’m now surprised if more than a third are written in third person. When I flip open a story collection or literary magazine, my eye expects to settle on a paragraph liberally girded with that little pillar of self. Anecdotal evidence tends to support this suspicion. A completely random example: six of the last 10 National Book Award winners have been first-person narratives; of the 55 previous NBA winners stretching from 2005 to 1950 (Nelson Algren’s The Man with the Golden Arm), the tally is 40 to 15 in favor of third person. This is, of course, anecdotal, and to a degree statistical noise. Still, it’s suggestive.
As recently as 10 years ago, creative nonfiction specialist jobs barely existed at the university and graduate MFA level; last year, there were more creative nonfiction job openings than comparable tenure-track positions for poets. Essay and memoir classes have sprung up everywhere. Whether this trend is significant and whether it will continue are debatable; that it is a trend seems less so. It worries me that we may be slowly losing the cultural ability or inclination to tell stories in third person. Why does this matter? Because, I believe, third-person narration is the greatest artistic tool humans have devised to tell the story of what it means to be human. In How Fiction Works, James Wood cites Sebald decrying third person as obsolete following the horrors of World War II. Wood comments, “For Sebald, and for many writers like him, standard third-person omniscient narration is a kind of antique cheat.” The general argument, as advanced by Sebald, and more recently, by writers like David Shields and Will Self, seems to go: Flaubertian third-person omniscient narration is a jury-rigged, mechanistic anachronism blithely ignorant of the historical context that renders it obsolete; far from “realism,” it is almost wholly artificial, beginning in the first place with the artifice of a narrator and extending through the sleight-of-hand known as free indirect discourse (crudely put: the blending of narrator and character perceptions). First-person narration, the corollary would go, is more immediate and less contrived. It is authentic. Most people seem to agree. These critical interpretations both reinforce and describe a more popular apprehension of first-person narrative—that it is the most direct and natural form of storytelling. In creative writing classes, teachers will often advise students to employ first person with more overtly raw or emotional material, operating on the rationale that first person has an implicit honesty third does not.
Sebald’s quote—as to the inherent (and therefore inherently truthful) uncertainty of the essayistic perspective—is simply a more sophisticated version of this position, what we might call the naturalistic view of first person. First person, however, contains a contrivance central to its character that third person does not: audience. In first person, someone is addressing someone else, but absent narrative framing to position these someones—a la Holden Caulfield directing his speech to a ghostly doctor—we find ourselves in an inherently ambiguous space: to whom, exactly, is this person talking, and why? The uncertainty of this space, I would argue, is largely filled, intentionally or not, by the voice of the narrator, its presence and authority. Even if this narrator declaims her own uncertainty, she declaims it with certainty, and she declaims it toward an imagined audience, in a speaker/listener relationship. There are no competing voices, no opportunity for the objective telescoping of third person, and so the reader essentially becomes a juror listening to a lawyer’s closing argument. In this sense, all first-person narration is unreliable, or placeable on a continuum of unreliability. It isn’t accidental that the greatest examples of the first-person novel—Lolita, The Good Soldier, Tristram Shandy—make ample use of unreliability and/or intricate frame narration. The best examples of the form lean as heavily as possible on first person’s audience-related pretenses. Third-person narration, in contrast, contains no similar inherent claim to authority, and therefore tends toward a version of the world that is more essentially descriptive in character. A third-person narrative, whether in the form of a short story or War and Peace, is a thing to be inspected by the reader. It is, in a sense, a closed system, a ship in a bottle, and the reader can hold it up to the light to see how closely it resembles a real ship.
If it does, part of the reading experience is to imagine it as the real thing; but it can be assumed, in a kind of contract on the part of intelligent writers and readers, that the shipbuilder is not pretending his model is fit for actual seafaring. In other words, the existence of a third-person narrator—that artificial authority Sebald found intolerable—signals the act of storytelling, and in doing so, encodes a structural uncertainty that first person lacks. Third-person narrators no longer walk onstage and deliver monologues, a la Jane Austen, but we still understand them to be devices in service of telling a story—a contrivance that announces itself as such. They are the artifice that enables the art, and they are truthful as to their own untruthfulness, or perhaps better, their truthlessness. Compared to the explicit machinery of third-person narration, first person’s artifice seems covert, a clandestine operation. This is not necessarily an argument against first-person narration—in able hands, this concealment can be a means of exposing greater truths about the subject of the writing or its writer—but it is an argument against the proposition that first person is somehow more transparent or “honest” than third. The other common objection to third-person narration, and by proxy an argument for first person, also concerns the artificiality of the third-person narrator, not in artistic but rather in experiential terms. This is the second prong of the naturalist argument: it isn’t a thing that exists. No one walks into a room and thinks of themselves, “he walked into a room.” Also, no one simply watches other people walk into a room without being aware of their own frame of reference. And this is true: close third person, via free indirect discourse, models human consciousness with an intimacy that strives toward first person’s access to a character’s thoughts and emotions.
Why then, the argument goes, not dispense with this clumsy intermediary and go right to the source? Counterintuitively, third person achieves an effect, both in spite of and because of its narrator, that is more “realistic” than first. While no one walks into a room and thinks, “he walked into a room,” it can be asserted with even greater force that no one walks into a room and thinks, “I walked into a room.” No one, that is, who isn’t an imbecile or robot—not characters who figure heavily in the canon of great fictional protagonists. The experience of being a human is, in fact, an experience of dual consciousness. Human beings are social creatures, and human existence is an endless negotiation of the immediate, subjective perspective, and the greater objective context. We constantly divide our attention between the first- and third-person points of view, between desiring the shiny object in front of us and figuring out what it means for us to take it: who else wants it, what we have to do to get it, and whether it’s worth taking it from them. In this sense, not only does close third person accurately model human cognition; omniscient third does as well, since, while we cannot read other people’s minds, we are constantly inferring their consciousness—their motives and feelings. The human experience is a kind of constant jumping of these cognitive registers, from pure reptile-brain all the way up to a panoramic moral overview and back down, and human ingenuity has yet to invent a better means of representing this experience in art than the third-person narrator. The apparatus of third-person narration, while wholly artificial, ironically enables the most authentic depiction of the quagmire of personhood. Irony is key here, in both cause and effect.
Third person’s scaffolding of multiple, competing levels of awareness is inherently, structurally ironic; the effect created by these slightly ill-fitting beams and joists, as the demands of narrative push and pull them against each other, is a large-scale, resonant irony. Writing about the ability of narrative to convey humanity’s huge profligacy of type, Adelle Waldman, in a New Yorker piece from 2014, quotes Leo Tolstoy’s depiction of Vronsky:

He was particularly fortunate in that he had a code of rules which defined without question what should and should not be done. The code covered only a very small number of contingencies, but, on the other hand, the rules were never in doubt, and Vronsky, who never thought of infringing them, had never had a moment’s hesitation about what he ought to do. The rules laid it down most categorically that a cardsharper had to be paid, but a tailor had not; that one must not tell a lie to a man, but might to a woman; that one must not deceive anyone, but one may a husband; that one must not forgive an insult but may insult others, etc.

She says, “If someone like Vronsky were to give an account of his moral code, it would not, we can be sure, read in precisely these terms.” This is true but neglects an important aspect of this rendering of Vronsky’s moral code, for we see at once in this passage a social view of Vronsky’s hypocrisy that shades toward a self-awareness of his own hypocrisy. This shading—the ironic bounce of the repeated “never,” the pompous “most categorically”—both enacts Vronsky’s hypocrisy and suggests a shiver of cognitive dissonance, of unease, that seems to come from Vronsky himself.
The point is debatable—maybe Tolstoy is just calling Count Vronsky an asshole—but in a general sense, the ironic space that third person carves out creates a productive ambiguity that deepens character the same way these little ironies of the self, the simultaneity of objective and subjective, deepen human existence the more a person is aware of them. In this case, they suggest a Count Vronsky who is not only an asshole, but also, perhaps, very slightly aware of his own assholishness, as most assholes are. It at least implies that possibility—a complex position unavailable to first person, in which a Vronsky POV would essentially either cop to his own hypocrisy, or strategically introduce it through unwitting revelation in the usual reliably unreliable method. As a thought experiment, try to imagine Ulysses written in the first person, the dueling solitary consciousnesses of Stephen and Bloom. We are, of course, embedded deep in Bloom’s and Stephen’s minds, but we are embedded there via virtuoso free indirect discourse rather than first person. It is surprising, in a way, that Ulysses was not written in first—after all, here we have the summit of stream-of-consciousness narrative, with an emotional and associative immediacy that has informed 100 years of writing all the way to the essayists of the moment. Not only this, but the fracturing of consciousness and Dublin’s social institutions as represented in the book are (as we understand, in a somewhat trite though probably accurate sense) a cultural response to the First World War; per Sebald, we would expect such a narrative to dispense with the puppetry of third-person narration. So why not in first? What would be lost? Among other things, it would more or less be simply a record of human confusion. It would be an exhaustive, exhausting trek through Dublin, unremitting in its assault on our senses.
Ulysses is already exhausting enough in this regard, but many of the moments of relief are moments of perspectival shift: the wider view of Stephen in the classroom, for example, or the anti-Semitic Citizen throwing a biscuit tin at Bloom as he flees the pub, righteous and triumphant. These and similar moments, allowed by the omniscient narration, crucially let in other people, complicating the dominant note of mental claustrophobia. I say crucially, because the novel is not, ultimately, about mental claustrophobia, about being trapped in oneself; it is about the opposite, about the inevitability and value of social connection. A Ulysses in first would represent, in spite of its erudition and catholicity of reference, essentially a shriveling worldview, rather than the enlarging one it offers. HCE: Here Comes Everybody. All of which is to say that the current critical and cultural movement away from third-person narration should be taken seriously, and to some extent—as much as such a thing is possible—resisted. Matters of taste come and go, and it may seem silly to imagine third-person narration disappearing. After all, it has persisted in its current form for going on 300 years. But many pinnacles of high art recede and disappear in the face of changing norms. It was probably similarly hard for the 19th-century art lover to imagine classical portraiture and Renaissance brushwork disappearing. David Shields and similar critics may be dismissed as extreme, but they give voice to a larger cultural impulse, the enthronement of unmediated personal experience and feeling (as though such a thing were possible, even if desirable) as the height of written expression. Reviewing Meghan Daum’s essay collection, The Unspeakable, Roxane Gay writes, “When it comes to the personal essay, we want so much and there is something cannibalistic about our desire. We want essayists to splay themselves bare.
We want to see how much they are willing to bleed for us.” The promotion of this kind of writing is, in turn, a collective response to larger cultural currents, among them the still shockingly recent advent of the Internet and reality television. In this context, it is not hard to imagine omniscient third person, with its many registers of complex irony and representation, becoming the truly outmoded art form that Sebald and others would like it to be, an ornate artifact of a slower and more explicable age. And it’s true that in a very real sense, third person is not the narrative mode of our time. A Henry James novel is essentially the anti-tweet. Its aesthetic roots are in a more contemplative era, an era with fewer distractions and, simultaneously, more incentive to consider one’s place in the larger social context of a world that was rapidly expanding. Now that the world has expanded to its seeming limits, we see an urge to put the blinders on and retreat into the relative safety of personal narrative. This impulse should be resisted. We need to engage with our world and one another, making use of the most sensitive instruments of understanding we have at our disposal.
There is a special joy in finding someone who hates a book as much as you do. But not just any hate, and not just any book. There are plenty of books you may not like: books that fail to deliver on their explicit premise or implicit promise, books with middles that sag like the floor of a teardown, books with underwritten or overworked prose, books that strain credulity, books that could stand to strain credulity a bit more, books that are too long, books with bad covers, books that just aren’t very good. The world is full of books you may not like, and it is not hard to be generous about them, especially as a writer—because, as well you should know, books are incredibly hard to write, and even harder to write well. A genuinely good book is something of a rarity; a genuinely great book is something of a miracle. So if someone, in your estimation, misses the mark, that is an easy thing to forgive, either by way of reading charitably or simply putting down the book in question. The type of book I describe here must be a book that you not only dislike, but that everyone else likes, a lot. By “everyone,” I mean friends and family, the general reading public, and publishing’s commercial/critical apparatus—everyone loves it, and your non-love is so unusual as to make you question your taste. In extreme cases, it may seem as though there’s a conspiracy afoot in the general culture to make you feel insane, a kind of cultural gaslighting. The hatred you develop for these books is different in character from mere non-enjoyment. It is rooted in a feeling of unfairness—not so much being left out, as being the clear-eyed pariah in a horror film, Steve McQueen in The Blob. It is a hatred that begins to feel personal, a resentment that extends from the text in question to the person responsible for the text. Several examples of books like this come to mind from recent years, but I will focus on one in particular, title and author redacted. 
This novel came highly recommended in print and online reviews, and by many of my friends and colleagues. I’ll credit my mother, an unusually wary reader, with some ambivalence, but by and large the response was a tidal wave of praise. It was variously hailed as one of the 10 best of the year, an utterly brilliant book, a landmark performance by [REDACTED]. I found it almost unreadable. The highly touted prose is a lush jumble of metaphors and registers without any controlling logic or taste. Each sentence is like a cordoned-off museum of its own aesthetic particulars, seemingly unrelated to the sentence coming before or after. A single paragraph might blithely mix lavish description, various unconnected similes, a flat and unfunny sex joke or two, and futurist Internet-speak, with no seeming concern as to how any of these elements work together. How, I wondered as I read, did this qualify for anyone as “sparkling prose”? For me, the reading experience was like being trapped in a mudslide, an avalanche of language, each additional word adding to the incoherence. The story is a series of lazy implausibilities cobbled together with all the deftness of a tailor wearing oven mitts. The characters are ludicrous, with ludicrous back stories and ludicrous arcs—they are ludicrous to an extent that feels intentional, satirical, except nothing is discernibly being satirized. Rather than a failed attempt at satire, the general silliness of the proceedings seems to stem from authorial disengagement; the novel itself seems uninterested in why its characters do things and what their level of self-awareness is. They simply move where [REDACTED] needs them to and behave as the story dictates, however unconvincingly. The overall effect is one of contempt for these people and their fictional reality. 
If I’d happened randomly upon the book, I would have simply stopped reading, but in the spirit of someone not getting a joke and asking the teller to repeat it over and over, I kept on, thinking at some point it would click. When it hadn’t halfway through, I finally stopped, feeling deceived. What trick of promotion—what cocktail party legerdemain—had managed to foist this nonsense upon the reading public, had somehow turned this gobbling turkey into a golden goose? That the book had been published at all was vexing enough—though not, seemingly, to the scores of critics and fellow authors who padded the proceedings with an introductory chapter of praise. I moved on to other, less maddening novels, but I’d be lying if I said I didn’t seek out negative Amazon reviews for that paltriest comfort: the spectral two-star outrage of other bewildered readers. The reflexive cynical view—one that is easy to adopt in the thrall of book-hate—is that you have fallen victim to a corporate conspiracy. Publishing decides ahead of time that a certain writer’s moment has come, the engines of commerce and publicity fire up, and the quality of the book in question is a distant secondary concern. While this may contain a kernel of truth, a simpler, more general explanation probably suffices. Namely, there is a great inexorable power in expectation—as social beings, we want to like things we’re supposed to like, and we’re uncomfortable standing at the platform, watching the bullet train of popular opinion shriek by. Human nature, of course, tends toward a herd instinct, but the inclination of readers—and this includes agents, editors, publicists, and critics—to enjoy a book derives mostly from good qualities: kindness and generosity and the urge to like, rather than dislike, things. About a year later, I sat with friends on the second-floor porch of a rented vacation house in the mountains. 
We were enjoying a drink and the view of a late summer sunset over a canopy of trees just beginning to bruise with the colors of autumn. It was cool, birds flew to and fro, and from inside the house drifted music and the smell of roasting meat. The scene felt unimprovable, but then, from out of nowhere, this book came up in conversation and I discovered, to my surprise, two other readers who hated it. Oh, my friends! Oh, the joy! The sweet, delicious bliss of finding other readers who hate a book as much as you! Our individual aversions, long suppressed in unspoken self-doubt, were suddenly given full voice in an ecstasy of communal loathing. At last, we’d found each other. We leaned in with a conspiratorial air. “Godawful,” said my friend B. “I put it down after 50 pages, maybe 40,” said J. I said, “It’s completely ridiculous. I quit halfway; imagine reading the whole thing.” “I did,” said B. “I couldn’t believe I was reading the same book people gushed over all year. It was like I had to keep checking.” This went on too long, as this kind of thing typically does. Twenty, thirty minutes, maybe. We were like children gorging on an unattended cake, so desperate to get in every last piece of frosting and crumb that we misjudged our appetites. The initial greedy sugar rush of shared hatred went away and we crashed into a state of rueful circumspection. B said, “I mean, there’s no question [REDACTED] is talented.” “Yeah,” said J, “and I loved [REDACTED’S] first novel. I think that was part of my disappointment.” The conversation spiraled in once or twice more to how bad the novel was, but with ever-waning enthusiasm. As readers, I think, we share an understanding of what a niche audience we are, and what a niche enterprise writing a novel is. There’s a feeling—and perhaps this goes a way toward explaining why bad literary novels become fêted—that any nominally non-stupid book enjoying a moment of popularity is something to be celebrated. 
Disliking a popular literary novel—that blackest of black swans—is not like disliking a TV show everyone else thinks is great. Being the contrarian in the corner of the party who thinks Breaking Bad is overrated might not win you any friends, but being the reader who thinks a good book is overrated feels uncomfortably close to a betrayal. We settled down. I said, “Yeah, agreed. [REDACTED] really is an interesting writer.” “I’ll read the next one,” said B. “Also,” J said, “I guess we could be wrong about this.” But that was a possibility too terrible to contemplate, and the conversation moved on to other things as we went inside for dinner.
“I came to realize that far more important to me than any plot or conventional sense was the sheer directionality I felt while reading prose, the texture of time as it passed, life’s white machine.” —Ben Lerner, Leaving the Atocha Station

In the creative writing classes I teach, my students—most of them brand new at writing fiction—often go crazy writing plot. Their understanding of fiction, derived from a Stephenie Meyer/J.K. Rowling/Suzanne Collins-heavy reading background (not to mention 18 years of TV and movies), is that in fiction, stuff needs to happen. These early stories are breakneck affairs, full of marriages and divorces and car chases and gunplay and fistfights and murders and suicides and murder-suicides—sometimes spinning several into the same piece. They are B-movie scripts written as prose, mostly expository. Slow down, I advise, boringly. They listen, or pretend to listen, as I explain that literary fiction, the kind I am ostensibly being paid to instruct them how to write, is more character than plot. This is axiomatic: in literature, character is primary, always. Yes, things must happen, but they can be small things, incremental turns of the thousand gears that make up a narrative’s clockwork interior mechanism. And they should be things, ultimately, that derive from character. My students nod and proceed to write stories with gunfighting vampires—but introspective gunfighting vampires—and get an A-minus, probably. Lately, however, I’ve read a spate of critically lauded books published in the last few years that make me feel like one of my students. Where, I wonder, pausing halfway through, is the action? To be sure, there is still plenty of fiction being written with conventional plot, plenty of bestsellers with blood-soaked covers, but it has begun to seem to me that many of today’s best writers are writing fiction in which almost literally nothing happens. 
The prefatory Ben Lerner quote comes from his 2009 novel, the roman-a-clefish Leaving the Atocha Station. In it, Adam Gordon, a 20-something in Madrid on a Fulbright fellowship, wanders around the city smoking pot, thinking about poetry, writing very little of it himself, and freaking out when he isn’t sleeping. In other words, he is a 20-something poet abroad. What event there is, including the titular Madrid bombing, happens to him or near him, not because of him. Early on, for instance, he passively gloms onto a group of strangers in a bar, is befriended by a gregarious Madrileño and his sister, vomits from an excess of mojitos, is driven home but can’t remember the address, is driven to a party, smokes too much pot while listening to a guitar player, pretends to cry while telling the sister his mother has died, and is comforted by her. I suppose there’s a bit of action there, in terms of his weaselly dissembling, but that’s about as far as it goes. The page turns, and we are on to the next day of espressos in the shower and lonely flaneuring around the Prado and the Puerta del Sol and the Gran Via. The point is repeatedly made that Adam cannot really speak Spanish, a narrative choice that, on its face, thematizes the problems Adam has with poetry and representation, but on a practical (read: plot) level also means that he cannot really do anything. His agency is limited by narrative design, clearing maximal novelistic space for his thought process, a process that includes lengthy ruminations on his lack of agency. Teju Cole’s Open City takes this mode a step further, almost entirely dispensing with conventional plot in favor of the narrator Julius’s peregrinations through New York. 
These “aimless wanderings,” as he describes them in Chapter One, also describe the book’s narrative strategy, one in which the real-time event of Julius moving through the city is simply a pretext for him to think about the city, his life, the world, politics, as he says, “noticing himself noticing.” Flaneuring is, of course, nothing new in literary novels, nor, more generally, is the phenomenon of the most important action occurring in thought. What, for example, is Madame Bovary besides the portrait of a woman noticing herself notice her real feelings about her life and her marriage? What seems different is the degree to which these books, and many similar others published in the last few years, feel little pressure to make anything happen or put their characters in any sort of plot-driven crucible. It would be negligent to discuss books lacking event without mentioning that 800-pound gorilla of strategic tedium, Karl Ove Knausgaard’s My Struggle. The Min Kamp series is an ode to inaction, a paean to plotlessness. The first section of Book I, approximately 200 pages, is committed to Karl Ove’s recollection of an adolescent New Year’s Eve, during which he and a friend finagle a twelver of beer, hide it in the woods, and attend a classmate’s boring party. That’s it. Interwoven are lots of digressions about life, death, philosophy, and fatherhood—many of them interesting—but also lots of other digressions into his daily existence, many of punishing mundanity (how many times, for example, are we treated to a scene of Knausgaard entering a gas station and buying a Coke?). Even the weightier second half of the book, given over to an account of the week between his father’s death and funeral, focuses mainly on all the stuff narrative is supposed to leave out: waiting on a flight, cleaning a bathroom, cleaning a kitchen, putting on a pair of pants—all flatly rendered in a po-faced Nordic mock-heroism. 
The static quality imparted in these books is not accidental, is, in fact, purposeful and, in certain senses, the point. You do not, in either reality or in Knausgaard, get the epiphany without the apophany. As a human body is mostly water, a human life is mostly waiting, mostly killing time between the main events. But fiction, as we have understood it for centuries, could be defined, in essence, as the “main events.” Traditionally, fiction seeks to strip out the tangential and irrelevant, the moments between moments, to create an accelerated version of reality in which relatively important things lead to other relatively important things—it isn’t that important things don’t happen in real life, it’s that it takes far too long to get from one to the next, and along the way any narrative outlines become lost in banality, what Lerner, cadging from John Ashbery, describes as “life’s white machine.” Why, then, are some of the smartest and most celebrated writers in modern literature (I count among these, as well, Rachel Cusk, whose incomplete trilogy—Outline, Transit—is a study in passivity and seeming tangentiality; to some extent Elena Ferrante, whose Naples quartet accrues its violence and romance in endless reiterative, sedimentary layers; also Sheila Heti, and an entire generation of creative non-fiction writers, a separate but closely related phenomenon) telling stories that are largely banal, largely life’s white machine? One reason (as is, regrettably, so often the answer to this kind of rhetorical question): the Internet. In moving our consumption of news and media from a top-down to an a la carte model, the Internet has changed the way we are accustomed to receiving and disseminating information. Information of all variety spreads now across a series of nodes on a relatively flat plane of authority, and this change, though most frequently and obviously framed in terms of journalism, has also inevitably impacted the way we read and write novels. 
Put one way, William Shakespeare—or for that matter, Leo Tolstoy or Marilynne Robinson—is to Edward R. Murrow as Teju Cole is to your Twitter feed. This is a facile comparison, but it has some truth, I think. In an era of vertical information—roughly prehistory until 2005—novelists predictably wrote vertically. In the 19th century, a still-religious era with consolidated organs of journalism and publishing, authors wrote with epochal authority, a confidence in the universality of their observations and judgments: it is a truth universally acknowledged, all happy families are alike, etc. Even through the upheavals of the 20th century, there remained an essential confidence in the role of the author as someone bringing the news. Postmodernists like Thomas Pynchon, William Gaddis, and Don DeLillo may have doubted the institutions of their country, may have created new landscapes of American paranoia, but they had little paranoia or doubt with regard to their essential role as shapers and interpreters of a shared reality. As has been noted with tiresome frequency in the wake of the election, we now occupy individuated worlds of curated art, opinion, reality. There may still be objective truth, but it isn’t clear that we can reach consensus on what it is, and we accept fewer and fewer people as authorities. Of course, the word “authority” derives from the word “author,” and the act of writing itself is inherently an act of authority, of assuming the right to create and order words and thereby create sense. But there are degrees of order and sense in narrative, and writers like Lerner, Cole, Cusk, Knausgaard, and many others, it seems to me, are responding to this mood of uncertainty—if I may be grand, this weakening of cultural epistemic authority—by mitigating the authority they exert in their narratives; in general, by moving from the objective to the subjective. This manifests in certain ways—for one thing, in a preponderance of first-person narration. 
Third person assumes the right to speak for, to inhabit, characters other than oneself, and to manipulate these characters, imbuing the text with a unitary consciousness; first person, no matter the degree of artifice, implies a bounded consciousness, the disconnection between people. And disconnected first-person narratives—from blog posts to reality-show confessionals to the infantile tweets of our deranged president—are essentially the narratives of our time. It is not hard to mentally recast these writers as social media types: Knausgaard, the maximalist oversharer; Lerner, the pomo ironist; Cusk, the reticent philosopher; Heti, the more traditional diarist. (As an aside, it may to some extent subliminally flatter the reading public to imagine that a compilation of their status updates and thoughts on their life could—with a little editorial organization—constitute a publishable novel.) And this mitigated fictional authority also manifests in a tendency toward plotlessness and mundanity. The act of making up a story is an act of control, an exertion of order over entropy. The more carefully narrative is created, the more meticulously event is arranged for effect, the greater the implied presence of authorial control in the traditional sense. William Makepeace Thackeray, in Vanity Fair, for example, engages in a decades-long arrangement of characters’ lives, moving Becky Sharp and company around a diorama, into which he occasionally enters to comment: This is what happened to everyone, and why. In contrast, reproducing life “as it happens” in this type of quasi-memoiristic fiction implies a position of authorial neutrality, with no presumption of ordering event for narrative effect or explanatory power: This is what happened to me, and who can say why? It may also be the case that we are, to an extent, culturally fatigued by plot. 
Five minutes of cable news provides sufficient event, enough dramatic twists and lurid intrigue, to sate anyone’s appetite for the fictional stand-in. Any adult with a job and smartphone is inundated with an unprecedented amount of media and advertising. Our attention is competed for nearly incessantly, with previous limits of mental privacy seemingly encroached on further every day—you cannot take a flight, ride in a cab, or gas up your car without a chyron scrolling beneath your weary eyes. Eventless fiction can, through this lens, be seen as a form of cultural protest, a refusal to vie for a reader’s scanty attention via the bright, shiny artifacts of plot. In terms of aesthetic experience, it is also a respite—I suspect many readers find themselves pleasantly lulled by the snowdrifts of Knausgaard’s youth, the quiet calm of the novel’s glacial inaction. During these “interesting times” (as the old saw would have it), an intelligent response may be to write less interesting fiction.
During this hoops-rich period, the frenetic Madness of March having transitioned into the more austere months-long slog of the NBA Playoffs, I found myself fruitlessly poking around for a good basketball novel. I’m both a writer and a great fan of the game -- my podcast, Fan's Notes, pairs the discussion of a novel with a discussion of basketball, usually the NBA. My podcasting partner and I tend to find no shortage of cultural and metaphorical linkage between the two art forms, yet modern literary fiction seems to harbor no special love for this great game. Football has A Fan’s Notes, End Zone, The Throwback Special, Billy Lynn’s Long Halftime Walk. Baseball has The Natural, Shoeless Joe, Underworld, and more recently The Art of Fielding. For Christ’s sake, hockey has yet another Don DeLillo tome, the pseudonymously written Amazons. Where, I find myself wondering, is the great basketball novel? First of all, no, The Basketball Diaries is not a basketball novel. It is a memoir, and it is about heroin -- it features precious little actual basketball. John Updike's Rabbit and Richard Ford's Bascombe books both involve hoops to varying degrees, but not as a central concern or dramatic focus. Under the Frog, by Tibor Fischer, is a very good book about basketball players, but it concerns 1950s Hungary under Mátyás Rákosi’s Stalinist regime, the titular frog deriving from a Hungarian expression for rock bottom. What else is there? Walter Dean Myers wrote several young adult books that revolved around basketball; there’s also Sherman Alexie’s YA novel The Absolutely True Diary of a Part-Time Indian and The Crossover by Kwame Alexander and the Blacktop series by my friend L. J. Alonge -- interestingly, most books about basketball that come to mind seem to be YA written by men of color, while Big Sports Lit is very, very white. There is not, as far as I can tell, a big work of literary fiction for adults that is “about” basketball, in the same sense that Chad Harbach’s The Art of Fielding is “about” baseball. 
Perhaps this has to do with the particular character of these sports. Baseball, with its mano-a-mano pitcher-hitter duels, is perfectly congenial to narrative -- it is itself composed of a series of mini-narratives involving protagonists and antagonists (one way or the other depending on your rooting interests). There is really no moment of solo heroism in any other major sport comparable to the walk-off home run (or strikeout) to end a game; there is likewise no greater sporting scapegoat than Bill Buckner and his ilk. In less dramatic terms, a baseball game is composed of hundreds of discrete individual plays: someone throws a ball, someone hits it, someone fields and throws it, and it is caught again by the first baseman for an out. This is how traditional narrative is structured: a series of explicable interactions between a cast of characters that mount in importance and conflict until a crucial, deciding act resolves the plot. Even the structure of baseball’s gameplay is writerly, with its nine innings constituting nine tidy chapters inside the larger dramatic arc. Football, too, though tritely metaphorized as violent, armed combat -- marching up the field, a war of attrition, a massacre, etc. -- is constituted by many clean moments of contest, various plot points interspersed between the interminable commercial breaks. American football is American in character, pairing a love of mayhem with an equal love of bureaucratic fussiness. The game’s horrifying ultraviolence is committed within the parameters of a rulebook thicker than a Cheesecake Factory menu, meted out in orderly skirmishes, and broken up by five-minute replays to determine the spotting of the ball within a nanometer or two. 
We want war, but we want a safe war, a manageable war in which the actors stay within their prescribed roles -- in which no one, in effect, goes rogue (few things are more pleasurably disconcerting than a broken play and the ensuing spectacle of a four-hundred-pound lineman hurtling toward the end zone). Again, this is very compatible with traditional storytelling, placing maximum visceral conflict and chaos within neat scenes and a hyperrationalized narrative structure. In contrast, the narrative possibilities of basketball seem somehow European in character, closer to fútbol than football (or, as a British student of mine liked to call it, handegg). Inbounds are approximate, as are jump balls. Except in certain key situations, there are no replays, and refereeing occurs on the fly. Mistakes are routinely made, lamented, forgotten. Superstar players -- the protagonists of the game, so to speak -- are coveted, but the play itself is supremely team-oriented. Unlike baseball and football, in which individual statistics are ironclad and fetishized, basketball stats are the subject of endless arguments regarding context. It is curiously difficult to disentangle the individual moments that contribute to an orange ball falling into a hole. Yes, someone shoots it, and yes, often someone assists on the shot, but a hundred other smaller actions, essentially unquantifiable -- screens, shooting gravity, secondary assists, etc. -- go into it as well. And even the countable stats are the subject of debate. Scoring twenty-eight points in a game sounds good until you look at how they were scored, with what efficiency, and giving up how much on the defensive end. Quants -- that is, stat nerds -- regularly put forth the case that a player like Andrew Bogut, a low-scoring defensive bruiser who sets vicious picks, is as valuable as a shooting threat like Isaiah Thomas. There is no comparable ambivalence in the record books of, say, baseball: a home run is a home run is a home run. 
All of which is to say that there is, inherent to basketball’s play, an indeterminacy that may not lend itself to conventional narrative. Moby-Dick versus Heart of Darkness, to throw a strange but perhaps productive analogy at the fridge (and thereby further mix metaphors), is like baseball versus basketball. One is about a majestic, doomed assertion of individual will; one is about ambiguous forces clashing in a mist of doubt and dread. Occasionally a basketball player comes along who is great enough to totally clarify the terms of the game: LeBron James, for example. But these players are surpassingly rare, generational. If the orderliness of baseball and football lends itself generally to narrative, it lends itself specifically to retrospective narrative. In much the same way that we often imagine our lives as a series of cruxes (and model that imagining in our fictions), a football game can be broken down into a series of botched or successful plays, good or bad calls. These sports are almost built to be post-mortemed, in their perfect state only when finished. It seems consonant, then, that big literary sports novels are typically about a character looking back at former greatness and lost innocence -- either personally or culturally, or both. And this type of literary sentimentality, in turn, pervades the cultures of football and baseball, which are forever backward-looking, enshrining and nostalgizing moments, sometimes even as they happen. Memorable plays are almost immediately assigned names as historically pungent as World War II battles: “The Immaculate Reception,” “The Shot Heard Round the World,” “The Catch.” Even the bungled plays have immortal names: “The Fail Mary,” “The Butt Fumble.” There aren’t really similarly fetishized moments in basketball. Its fluid and complex play does not invite the same kind of nostalgic retrospection, and indeed, it is unsentimental about its history to a degree that routinely enrages former greats. 
Basketball could never serve as a good metaphor for America’s glorious past, or even its fallen present (football still serves admirably here: see Billy Lynn’s Long Halftime Walk), but it might be just the sport for a more skeptical and circumspect twenty-first century, an era when we need a literature of certainty less than ever.
John Cheever may be the most misunderstood and miscategorized important American author of the 20th century. On three separate occasions recently, and many more times over the years, I have read articles or interviews that group him stylistically with Raymond Carver. This is mystifying: one would be hard-pressed to think of a body of work more antithetical to Carver’s spare, working-class realism than Cheever’s elegant, upper-class fabulism, where nymphs come to life and families vacation in Italian seaside villages. I can only guess this very bad comparison stems from people not actually having read Cheever, while knowing that 1) he and Carver were drinking buddies at Iowa, and 2) both of their names begin with C and end with VER. He is also often (mis)paired with Richard Yates, a more understandable comparison. Both men served in the Second World War and chronicled the roiling fault lines beneath the tranquility of New York’s far suburbs. Both men were impeccable stylists, although Yates tended toward a rhetorical stylishness powered by limpid prose, while Cheever was, like John Updike, an extravagant sensualist, both in subject matter and descriptive tendency. Both men enjoyed their greatest success with novels, while exerting their greatest artistic mastery in the short story form. But Yates’s world, however dated it may be in 2017, is the world we live in. Cheever’s is not our world and never was. I have no way to verify this, but I suspect in the '50s he was misread as well, though misread more widely. He seems to be writing about the Westchester suburbs -- Shady Hill and Bullet Park, with their sloping lawns and cocktail parties populated by characters recognizable as ur-Don Drapers, ur-Roger Sterlings. Except as we read, the landscape distorts, the familiar becomes strange. Cheever’s stories are, to put it simply, strange, and in them, the Mad Men may really be mad. Take “The Swimmer,” his most famous and familiar. 
Neddy Merrill, half-cocked on gin and tonics during a restorative summer brunch at the house of some friends, decides to return home through several miles of Connecticut exurb by swimming the lengths of contiguous pools. Thus begins a minor odyssey during which we watch as Neddy makes his way, first in drunken delight, but then through rainstorms, colder weather, and the hostility of former friends, gradually growing old and infirm, finally arriving home to find it deserted. What is going on here? In fiction, when unreal elements appear, usually one of two things is happening. In the first case, the unreal actually is real. This describes much of genre fiction, in which the reader expects vampires and aliens to appear -- would, in fact, be disappointed if they didn’t. In literary fiction, too, the unreal may be introduced with a straight face, for effect. Magical realism depends on the introduction of a fantastic element into otherwise grim reality, for instance in Gabriel García Márquez’s “A Very Old Man with Enormous Wings.” The appearance of an angel in a poor Colombian village creates a host of consequences, though a crucial difference between magical realism and, say, fantasy, is that in magical realism the narrative is primarily interested in the village, while in fantasy the author would focus primarily on the old man, his wings, how he got them, and what his home world is like. More typically, in literary fiction, the fantastic occurs as a manifestation of the main character’s disordered psychology. In close third person, the narrative is so intimately linked to a protagonist’s point of view that the world appears in subjective terms, and if the main character is sufficiently disoriented -- drunk, delusional, or simply experiencing very heightened emotion -- aspects of their immediate surroundings may become distorted in a way that reveals their mental state. 
In William Kennedy’s Ironweed, Francis Phelan, an itinerant, guilt-wracked alcoholic, sees the ghosts of dead people he’s known, some of whom he killed. Although the narrative never states that they are apparitions deriving from his fear and shame, it doesn’t need to: we are able to read them as having a kind of immediate corporeality, at least to Francis, while still being utterly unreal, figments. So which of the two is happening in “The Swimmer”? Well, neither, really. On the one hand, it is impossible to read “The Swimmer” and think that the main events of the story are happening as described -- that, in the course of a single afternoon, a man ages 30 years while becoming increasingly destitute and reviled -- unless we believe Neddy Merrill has entered some horrific parallel universe. On the other hand, it is equally impossible to read the events of the story as merely a manifestation of Neddy’s mental state. He’s been drinking as the story starts, but not that much. He is happy, overwhelmingly content in his life, really. Even if we were to read the story as a projection of Neddy’s subsumed life anxieties, it is impossible to imagine him projecting a vision of the world this entirely altered. Neddy finds himself in a third situation, a Cheeverian zone of strangeness between the actual and imagined, crucially of both and neither. Although Cheever makes frequent use of mythical tropes and creatures, it is not myth, not purely figurative. It is not magical realism because the strangeness is not intended to be taken literally -- strangeness in magical realism is almost always encountered and acknowledged by multiple characters, and is, in fact, a device meant to comment on the interlaced relationships that form a society. Strangeness in Cheever performs the opposite function: it is personal, particular, atomizing. 
In another well-known Cheever story, “The Enormous Radio,” a Manhattan couple buy a radio and enjoy it until it begins picking up the conversations of neighbors throughout their building. The wife becomes obsessive, the husband guilt-ridden. The radio threatens to destroy their marriage and is returned. As with “The Swimmer” -- because the other elements in the story are so prosaic, so local and identifiable -- it is very hard to read the story as asking the reader to believe in a magical radio. But also like “The Swimmer,” the events of the story are too sharply defined and internally consistent to be written off as mistake or delusion. The closest available description is dreaming -- Cheever’s protagonists often feel as though they’ve slipped into a dream, their own or someone else’s. And yet this doesn’t seem exactly right, either. The fantastic does seem to be happening, but in an intensely subjective sense, as characters’ fears and desires warp the sturdy fabric of their previously staid realities. Cheever’s preferred locales -- Manhattan, Ossining, Italy -- deform like wax effigies, exposed to the heat of a character’s sudden lusts. This deformation is grotesque and startling in stories like “The Swimmer” and “The Enormous Radio,” and in less famous pieces like “The Chimera” and “Metamorphoses.” But many of Cheever’s less fantastic works operate in the same mode, if quieter. “The Country Husband” begins with Francis Weed nearly dying in a plane crash. He returns to Shady Hill to find everything subtly altered -- more vivid, shot through with erotic feeling, uncomfortably alive. This reads as standard narrative strangeness, i.e., a man has undergone trauma and found his perspective changed. But the next evening, Francis and his wife attend a neighborhood cocktail party, and we find ourselves in a zone of distorted reality. 
Francis suddenly recognizes the neighbors’ maid: when he was serving in France during the war, a French woman who’d been having relations with a German officer was forced to march naked through the town square. The maid is that woman. Normally, we would dismiss such an unlikelihood as a misperception on Francis’s part, but Francis asks after the maid and the hostess confirms she was hired from the same small town in Normandy -- Trénon -- where Francis had been stationed. Misperception is eliminated as an explanation -- it is, we are reassured, the same woman. But this seems wildly improbable, especially given that Francis has just had a paradigm-shifting experience, one that has tilted him toward the mysterious and sensual. A woman is sexually humiliated during the war; years later she reappears in a Westchester suburb, pouring brandy and coffee and serving as an emblem of the main character’s thwarted sexual energy, which later manifests itself in clichéd lust for the babysitter. While many writers could write the near-crash and subsequent vivification of their protagonist’s senses, it is uniquely Cheever to present the maid as a new fact of the landscape and leave the reader to deal with it. What is she doing there? She is real and she is impossible, or so improbable as to amount to the same. Again, like a heavy ball bearing rolling across a piece of tautened cloth, the weight of a protagonist’s anxious desire seems to have distorted the physical reality of his surroundings. In the end, Francis visits a psychiatrist and addresses himself to basement woodworking, a wholesome pastime that also sees him sequestered from the outside world -- not in self-protection, but rather, one senses, protecting Shady Hill from himself in a kind of erotic quarantine. The cumulative effect of these individual fantasias is, paradoxically, a strengthening of the apparatus of social realism in Cheever’s work. 
As in “The Chimera,” when a dream woman emerges from the woods surrounding the home of an unhappily married man, these events are oppositional to the backdrop of reality and routine. The plots of many Cheever stories are, in effect, aberrations, and they do not last. The maid vanishes into the unnoticed shadows of suburban domestic life, and the radio is returned to the store. The fantastic in Cheever is intense, but it is not durable. In the end of most Cheever stories, the force of social expectation tends to smooth these abnormalities over, though it is not always clear how we’re meant to feel about this. At times we sense an opportunity lost; at times the story itself seems to breathe a sigh of relief as the normal rhythms of life reassert themselves. As a social critic, Cheever can be read, therefore, as simultaneously transgressive and conservative. On the one hand, the twin treadmills of suburban family life and postwar American consumerism stifle the human spirit. These visions represent a reaching beyond the borders of societal expectation for something rare and ineffable: sexual, religious, often both. The implication is that there is no adequate means for people to fulfill themselves within the boundaries of their normal lives. Once a Cheever protagonist deviates, they deviate wholly, as in “The Housebreaker of Shady Hill,” in which Johnny Hake is fired and begins plundering the homes of his neighbors for cash. A corollary implication here would be how thin the line is between the normal and the freakishly abnormal, how little occupiable space exists between the two. But this view of life, with the forces of madness held at bay only by an adherence to work and marriage, is itself inherently conservative, in both its diagnosis of disease and prescription for cure. After all, given a binary choice between dull routine and utter chaos, most people will choose the former, and this mostly holds true in Cheever’s stories. 
Johnny Hake is wracked with guilt and, reinstalled in his previous position, returns the money he’s stolen. Francis Weed takes up penitent basement carpentry as a dull corrective while outside, dryads caper in the moonlit shadows of his garden. In a similar backyard, the Chimera, Olga, emerges a last time from the edge of darkened woods, staggering and bleeding, seemingly battered by her imaginer’s self-judgment. It is the tension between these two countervailing urges -- the urge for freedom and the urge for safety -- that lends Cheever’s work much of its enduring power. Though social norms have changed dramatically in the 50 years since his heyday, we still negotiate this axis of desire in our lives. We still veer wildly into chaos and overcorrect back into predictable routine. To survive the mundane crush, we daily create little fantasies that must be destroyed by nightfall.