I have a friend—call him Tom—who, like me, is a writer. Tom has written many novels over a long and enviable publishing career, and his novel-writing philosophy, related to me over various drinks at various bars, can be summarized as follows: Write whatever the hell you write, whatever concept or character or situation has burrowed under your skin and must be freed. Forget commerce and forget audience—you write for an audience of one, and if an editor or reader happens to find it interesting, all the better. A bestseller, in Tom’s view, should merely be a happy alignment of the world’s interests with your own, a momentary occupation of a dominant paradigm that is essentially unplannable. Or not something to be planned, at any rate.

Tom's philosophy holds many advantages. It is pure, uncompromised and uncompromising. It presumably results in the best art, at least if you assume that the most adventurous art takes the least account of money. And it is easily followed, simply by adhering to its lone Thelemic precept: Do what thou wilt. It is, finally, a comforting artistic position for an artist to hold vis-à-vis commerce. If you are utterly beholden to your artistic impulses, you cannot be surprised or mind much when a piece of art does not sell. You did not create it to sell. If it does, great, but whether it does or not is a simple matter of luck, of spinning the wheel. Further, it implies a retroactively absolving determinism—if a lifetime of artistic work has sold no paintings, no albums, no books, why fret? After all, you were always going to do the thing you were going to do, and you were never going to do the thing you weren’t going to do, and the thing you did do was never not going to be unpopular, QED.

This may be a philosophically solid position, but is it necessarily true? I began to ask this question after the publication and non-success—the anti-success—of my first novel.
I wrote the book, as many first-time novelists do, in a kind of prelapsarian innocence, protected from the practical concerns of publication by ignorance and wonder at the odd fact of writing a novel in the first place. In the beginning, I hadn’t even really intended to write a novel; I had simply been working on a short story that kept accumulating pages. In the end, it sold to a trade house, and the whole experience had the hazy quality of a dream, an impression strengthened by the arcane inscrutability of the publishing process. Preparing to write a second novel, I had no such illusions. I had seen the amount of machinery required to make a book, all the stubborn engines of commerce that must be coaxed to life; I had received the distant publication schedules, the important dates that feel imaginary, set nearly two years in the future; most importantly, I had a book come out that didn’t do much of anything besides get some nice reviews. These are lessons that cannot be unlearned, and they come with a circumspection about the projects to which you are willing to commit your time and attention. Suddenly, lots of market-related considerations crept in that would never have occurred to me the first time around. I began to wonder, contra Tom: Could a writer set out to write a popular book? In a largely facetious (though slightly more serious than I'd like to admit) attempt to address this question, I decided to take the most literal possible approach and go through several years of New York Times Best Seller lists. After all, to write a bestseller, it would be helpful to know what has sold best. Making the Times best-seller list may seem like casting a broad net, but counting only literary number ones, I was left with, approximately, All the Light We Cannot See and The Nightingale. So I figured hitting the top ten for a week, at any point over the previous five years, would do it.
Too much further back and you might run into epochal changes of taste, some forgotten mania of the aughts. Also, I didn't have the time. An immediate issue this exercise presented, and a question much larger than the scope of this piece, was deciding what qualifies as “literary fiction.” For my purposes, I included almost anything not having to do with worldwide conspiracies, serial killers, werewolves and shapeshifters and rogue triple agents—i.e., anything not obviously genre. And though they invoke the Bard of Avon, William Shakespeare’s Star Wars series—The Empire Striketh Back, The Jedi Doth Return, I am not making this up—did not make the final cut. (Before moving on to actual findings, a couple of notes after having spent many hours going through roughly 300 of these weekly lists. First—and I realize this is the summit of trite publishing observation—holy shit does James Patterson, or The James Patterson Military Industrial Complex or whatever it is, produce a lot of books. I’m not sure I noticed more than a handful of weeks in the last five years in which some Pattersonian permutation wasn’t on The List. David Baldacci, also. Second, Brad Thor may be the only bestselling genre author with a less plausible name than his protagonist, the relatively mundane “Scot Harvath.” You would think his hero should be named something like Odin Hercules, but no.) Having compiled a long list of recent literary hits, what did I learn? Well, for one thing, start your title with “The.” Around a third of these bestsellers are “The” books: The Goldfinch, The Nightingale, The Martian, The Interestings, The Vacationers, The Girl on the Train. Granted, “the” is a fairly common word in English usage, but I suspect it also holds some subliminal power for prospective readers, announcing a book as official in subject and purpose—the definite article, so to speak.
Just imagine how many more copies All the Light We Cannot See would have sold if it had been titled, for example, The Light We Cannot See (All of It), or The Entirety of Unseen Light. Another smart move is to be famous already. Ideally, have written To Kill a Mockingbird 50 years ago, but otherwise, at least be a known quantity. This, of course, introduces another chicken/egg problem, i.e., how did these writers get to be known quantities before they were? At any rate, surprisingly few authors seem to make the list from out of nowhere. More seriously, write one of two types of books: mysteries or historical fiction, both if possible. In either of these genres, you’re in good shape if you can work in something to do with a famous painting or painter or other noteworthy work of art or artist. Anything to do with marriage and travel to exotic locales, as well. Over and again, a combination of these elements popped up, and the obvious common theme is that of escape: escape into the past, escape into a mystery, escape into aesthetics and culture, escape into imagined relationships, and the literal escape from one’s home to parts unknown. It turns out that the escapist instinct that drives genre fiction sales is alive and well in readers of literary fiction—it simply requires (debatably) better sentences and (usually) less fantastic trappings. With these guidelines in mind, I came up with a few potential novels that wouldn't have seemed out of place on the list. Here's one: a historical mystery based on the life and death of Paul Gauguin. But told from the perspective of his estranged wife, Mette-Sophie, via a diary she keeps as she travels the world, investigating her husband's artistically triumphant and morally bankrupt life after leaving his family. Call it The Journals of the First Mrs. Gauguin. 
A synopsis of this ghostly book, in the style used to query agents, is as follows: When a previously unknown Paul Gauguin painting is discovered in an abandoned apartment in Chicago, art historian Lena Wexler is assigned the job of tracking its provenance—an investigation back through time and place, from Chicago to Miami, from Denmark to France, from Tahiti to, finally, the Marquesas, all with the help of The Journals of the First Mrs. Gauguin. Does this sound like a book people would buy? I think so. I can very easily imagine this book on the coffee table of my mother-in-law, an omnivorous reader of literary bestsellers, classics, and nonfiction who helms a monthly book club. I’m fairly confident that if I queried 20 agents with this synopsis, one or two would request a read. It sounds like a popular book. The only problem is that for it to exist, I would have to write it. And it's not a book I can write. Working through this little thought experiment confirmed what I already knew writing a novel requires: an ineffable, personal spark of interest that catches fire and burns steadily enough not to be extinguished by doubt and creative incapacity; a fire that manifests over time as curiosity about the subject, and about the project itself, how it all turns out. Lacking this deep interest, an otherwise valid project—exciting, interesting, and commercial—remains a theoretically good idea, like going to medical school or quitting social media. Since this essay's inception, I've published another novel and have two more in stages of revision, and I've fully accepted Tom’s point of view: You have to write what you want to write, even if what you want to write won’t usually be what people want to read. You can’t spend two to five years on something for a theoretical, external reward. Or I can’t, anyway, but maybe some people can—if so, The Journals of the First Mrs. Gauguin is all yours.
If you have ever been in a writing workshop, especially of the MFA variety—and good for you if you haven’t—you are likely familiar with the most dreaded response a piece of writing can get. It can be phrased in a variety of ways, sometimes hedged and mealymouthed and sometimes forthrightly insulting, but it is essentially this: Why is this story being told? It’s a curious criticism, one that invites responses that are escalatory and often epistemological in nature, for example: “Why should I care about your opinion?” or “Why is any story told?” or “Why do we bother doing anything?” This line of criticism, especially when it is directed at you and something you wrote, is maddening, both in its dismissiveness and in its unanswerability. And yet it captures something about a lot of writing, even writing that is good, or “good”—well-considered, well-phrased, and well-paced. Lots of good writing has this inert, pointless quality, and sometimes otherwise amateurish, inept writing has a curious vibrancy. Whatever the case, the first test that any piece of writing must pass is answering, in some way, why it exists. To put it another and possibly less clear way, all fiction has to do battle with its own fictional nature. There’s a minor absurdity inherent in the act of entering a world invented by another person, even if we ardently wish to enter such a world. As readers, we sense it in the beginning of any new book—the dipping of toes into the water, the wading in. It isn’t so much realism or factuality that pulls us beneath the fictional surface, but a kind of mysterious weight, the gravity of a story that, for whatever reason, needs to be told. Narrative lacking this quality cannot convince us to read past the bare fact of its arbitrariness, its status as something created. It sits there, only making us aware of the infinitude of stories that can be told, when a good story does the opposite, narrowing the universe to its singular vision and significance.
One of the atomic elements of this weight—the felt necessity of a piece of writing—is a text’s narrative perspective, its particular deployment of POV and tense. And one of the most potently self-justifying combinations of narrative voice and time is first-person retrospective. Done well, it possesses implied narrative motivation as part of its DNA. Here’s the beginning of Stephanie Vaughn’s classic story “Dog Heaven”: Every so often, that dead dog dreams me up again. It’s twenty-five years later. I’m walking along Forty-second Street in Manhattan, the sounds of the city crashing beside me—horns and gearshifts, insults—somebody’s chewing gum holding my foot to the pavement, when that dog wakes from his long sleep and imagines me. I’m sweet again. I’m sweet-breathed and flat-limbed. Our family is stationed at Fort Niagara, and the dog swims his red heavy fur into the black Niagara River… This is a story about childhood. The narrator, Gemma, recalls a year of her childhood as a transient army brat and, in particular, her friendship with a boy named Sparky who drowns in the Niagara. It also details her family life, the triumphant return of the titular dog after running away, her pacifist teacher’s moral lecture on nuclear war, and several other memories of that vivid year on the Canadian border. Notably, it never returns to her actual narrative position, 25 years later in New York. This is an unusual choice for a frame narrative—as readers, we expect what occurs inside a frame narrative to inform our understanding of the frame itself. Frame narratives offer a kind of significance IOU: however long it’s withheld, the story being told will eventually telescope out to the teller, thereby creating a diegetic justification for the telling, which, in turn, helps create a reader’s sense of the telling’s necessity. 
But the frame itself implies a justification for the story’s telling, and what’s interesting about Vaughn’s story (and others in this collection) is that it turns out you don’t really need to know how the story has affected Gemma. That she is telling it at all—that apparently, this era of her childhood comes back to her at random moments during her adult life two decades later—is enough: it must be important. This implied motivation suffuses the whole story and informs our reading experience. What might otherwise read as a nice piece of nostalgia, a bit sentimental, perhaps, with its runaway dog and dead childhood friend, reads as something more serious and complex, something with the weight of subsequent lived experience behind it even if we don’t know what that experience is. Jesus’ Son, by Denis Johnson, employs this dynamic throughout the entire book. For large swaths, sometimes entire stories, we are allowed to forget that this is something that happened, not something that’s happening. But then, every so often, the present-moment narrator slides in and reminds us. “Emergency,” probably the most famous story in the collection, begins in this way: I’d been working in the emergency room for about three weeks, I guess. This was in 1973, before the summer ended. With nothing to do on the overnight shift but batch the insurance reports from the daytime shifts, I just started wandering around, over to the coronary-care unit, down to the cafeteria, et cetera, looking for Georgie, the orderly, a pretty good friend of mine. He often stole pills from the cabinets. With this, we are off with Fuckhead (the narrator) and Georgie as they take drugs and hallucinate and endanger patients, and kill rabbits, and rescue a draft dodger. No more mention that this is in the past, but the fact hangs over the story like one of the Midwestern snowstorms always threatening to blow into Iowa City.
Johnson never elucidates exactly how much time has elapsed between the story and the storytelling, but it feels like a long time—decades, maybe—and this communicated distance is vital, as in the case of Stephanie Vaughn’s stories, to providing depth and significance to what might otherwise feel like mere druggy randomness. This holds true over the course of the book and lends it a coherent shape—in vulgar terms, the shape of a recovery narrative. The narrator’s present-day existence, though faint as a ghost, suggests not only the significance of the stories he’s telling, but the simple fact that he’s around to tell them. Despite the Quaaludes and heroin being taken, the drunk driving, the almost slapstick violence, and the random appearance of handguns, we understand that our storyteller will walk right through, like Jesus on water. The book’s stakes are thereby transformed from pure survival to the meaning and value of survival, to what survival might later entail. Not knowing the exact terms of how this story has affected the narrator is advantageous. It creates a pocket of ambiguity the reader can fill with significance according to their own experience and aesthetics. To me, Jesus’ Son, despite all of the drugs and death and foolish misery, ultimately reads as a triumph. But a different reader might fill those years between telling and told with a sense of loss and regret. In a story from her wonderful collection A Manual for Cleaning Women, entitled “Point of View,” Lucia Berlin makes the case for what I’m calling in this essay the implied narrative motivation—the weight—of third person, arguing that it possesses an authority that first person lacks. Introducing her character, she says: I mean if I just presented to you this woman I’m writing about now… “I’m a single woman in her late fifties. I work in a doctor’s office. I ride home on the bus…” You’d say, Give me a break.
But my story opens with “Every Saturday, after the laundromat and grocery store, she bought the Sunday Chronicle.” You’ll listen to all the compulsive, obsessive boring little details of this woman’s, Henrietta’s, life only because it is written in the third person. You’ll feel, hell if the narrator thinks there is something in this dreary creature worth writing about, there must be. This suggests a productive approach to looking at the effect of various ways of telling a story. As stylized as the written word has become over the centuries, at its heart, narrative is based on oral storytelling. We still, on some level, encounter fiction transmitted via the intermediary of a book in the same way we encounter fiction as told to us by someone. It can be useful to imagine the author as a stranger who sits down beside us at a crowded bar. Berlin’s account seems intuitively true in this respect: if this stranger begins telling us about another stranger, it is simultaneously less engaging and more motivated—that is to say, we would infer that they have a good reason to be unloading this on us, even if in doing so they have to convince us to care about this other, described, person. If, on the other hand, they simply begin talking about themselves and the kind of day they’ve had, the story they tell hinges purely on how interesting and arresting it is, and until they command our full attention with their raconteurial power, we may wonder why we’re supposed to care. We may assume, as is often the case, that this person just likes to hear themselves talk. Finally, to return to the case of retrospective narratives, if a stranger begins their story with “Let me tell you about something that happened to me ten years ago”—as both Vaughn and Johnson do—they probably have our attention.
The duration of elapsed time is suggestive: of potential loss and lessons learned, of all the many reasons they not only remember the story but still feel compelled to tell it all these years later. We may leave the bar thinking not only about their story, but about them, inferring things, writing our own version.
In a word cloud of writing about Flannery O’Connor’s stories, “grotesque” would be central, medium-large. The word is typically applied to the characters in her stories, whose spiritual deformities are often represented—in problematic authorial shorthand—by some kind of physical deformity. Hulga, in “Good Country People,” is archetypal, missing both a leg and the religious faith of her mother. Or Rufus, in “The Lame Shall Enter First,” with a clubfoot (she was big on legs and feet) that represents his essentially twisted nature. But the dictionary definition of grotesque, “comically or repulsively ugly or distorted,” also speaks to an essential quality of O’Connor’s work. Her stories are morally distorted, and the effect is both ugly and comic. In the denouement of her most famous story, “A Good Man Is Hard to Find,” the Misfit, an escaped convict, has his henchmen systematically march the family members of the Grandmother into the woods and shoot them. The Grandmother, we are given to understand, has brought this upon her family and herself—in her deceitful vanity, she has gotten them lost, and then, having stowed away her verboten cat, which attacks her son while he’s driving, gotten them wrecked and stranded. She is to blame. It is a testament to both the power of O’Connor’s writing and the fame of this story that we do not generally find this ending odd. The Grandmother is annoying, true. She is a petty, class-obsessed bigot who wears perfumed cloth violets so that, in the event of a car accident, any onlooker “would know at once that she was a lady.” She names her cat “Pitty Sing” (“Pretty Thing” done in a reprehensible Charlie Chan accent). She makes up stories about her youth to impress her bored family. She flatters herself and holds fatuous opinions about the world. She is, in short, almost exactly like my own late grandmother who, ancient pain in the ass though she was, probably did not cosmically or karmically merit the murder of my entire family.
Yet we take this ending, in the context of the O’Connor universe, as more or less fair—if not deserved, exactly, then somehow structurally congruent. We accede to this kind of disproportionate judgment, and disproportionate punishment, in the majority of her stories. Disproportionality is at the heart of O’Connor’s work, and it represents both the worst and best aspects of her art. A pattern of punitive excess repeats itself again and again in her work. Julian, in “Everything That Rises Must Converge,” is punished for workaday maternal contempt by having to watch his mother suffer a massive stroke. Asbury, in “The Enduring Chill,” guilty of being a proud, theatrical fool (not to mention simply being twenty-five), is consigned to a sickbed in his mother’s home for the rest of his days. True, we live in a world that can be mightily disproportionate, mightily unfair—we know this—but O’Connor is not merely describing our world and its injustice. These stories are torture boxes, lovingly designed by a master craftsperson to enact maximal punishment for minimal crimes. A flawed character is set inside; that flaw catches on the ineluctable, merciless gears of her narrative logic; they are ground to dust. In “The Lame Shall Enter First,” the singularly named Sheppard and his little boy, Norton, have lost, respectively, a wife and a mother. While volunteering at a local juvenile detention facility, Sheppard takes an interest in local ruffian Rufus Johnson—an intelligent child from a poor, criminal family. Rufus insists that he is evil, a claim repugnant to Sheppard’s rational atheism, and Norton becomes the battleground upon which these competing moral claims fight. Although Sheppard’s obvious sin, in the medieval Catholic framework of O’Connor’s work, is his godlessness, it can be argued that the greater underlying emotional sin is his privileging of Rufus over Norton.
His interest in helping Rufus supersedes his interest in his own son, whom he finds annoying and mildly repulsive, as the boy mirrors his own grief back at him. This is Sheppard’s emotional sin—his charity not beginning at home—and the penalty for this temporary neglect is the death of Norton, who hangs himself. We exit the story as the father kneels over his dead son’s body, utterly bereft in the coldest of all possible universes. This same distorted quality is also largely responsible for the black humor of her writing. She is a very funny writer, with a Mel Brooksian sense of the inherent comedy of the suffering of others. As Brooks put it, “Tragedy is when I cut my finger. Comedy is when you fall into an open sewer and die.” Comedy, in other words, is inherently disproportionate, at least partially premised on the lopsided limits of empathy where the suffering of oneself and others is concerned. Seinfeld’s humor, for example, is based on juxtaposing the main characters’ utter insensitivity to other people’s plights—in particular, Jerry’s breezy unconcern—with their extreme sensitivity to their own trivial problems. A child trapped in a germ-free “bubble” is fodder for jokes, while not being able to get seated at a Chinese restaurant becomes a kind of personal hell. Cartoons, likewise, are almost completely predicated on a similar dynamic. Wile E. Coyote’s suffering is funny not only for the extreme violence of the physical comedy—the faulty boxes of dynamite and ill-planned catapults—but also, in a meta sense, for the fact of his Sisyphean struggle, the brutality of his universe and its two-fold cruelty, assigning him an unappeasable desire for the Roadrunner that can only result in his destruction, for all eternity. So it is in much of O’Connor’s writing, which, despite—and because of—its bleakness, is essentially comic.
“She would of been a good woman, if it had been somebody there to shoot her every minute of her life,” says the Misfit, after finally putting the Grandmother out of her misery. With these words, he dispatches the Grandmother’s moral existence as summarily as he has dispatched the lives of the family. It’s justly one of the most famous lines in any short story, and one of the funniest as well, in its bathos and casual horror. In a sense, the entire story is an enactment of Mel Brooks’s epigram, with the audience enjoying the bleakly funny spectacle of the Grandmother, in all her petty obliviousness, leading her family straight toward the open sewer that is the Misfit. Asbury’s bedridden fate in “The Enduring Chill,” while not comic in itself, is comic in the inexorable steps that lead to it. His prideful, self-dramatizing silliness leads to an ill-considered show of solidarity with his mother’s field hands during which he gulps a bucket of tainted milk. The fact that he does not “deserve” to become gravely ill forever, while arguably a moral flaw in the storytelling, is a pure advantage in the comic rendering of Asbury’s pathetic existence. In a larger sense, there is something horribly comic in the experience of reading an O’Connor story. You pull back the cloth draped over the torture box, see the unlucky character enter, and nervously titter as you learn the terms of their demise. It is awful, and it is funny, and it is funny because it’s awful. It is funny because we live in a world that can be so awful, in which we spend large swaths of our time attempting to avoid that awfulness, and here is an artist training her godlike powers on the creation of perfectly awful worlds for her characters to inhabit. In this sense, the aesthetic experiences of watching Wile E. Coyote and reading Flannery O’Connor are not very different: in both, you are waiting for the two-ton anvil to fall from the sky.
This is also to say, however, that there is something fundamentally cartoonish about the moral unfairness of O’Connor’s work, an unfairness rooted, if one is inclined toward biographical interpretation, in her Catholicism and illness. Here was a genius twice cursed—first with original sin, then with lupus. These outsized personal afflictions surely inflected her worldview, a worldview that allows characters no agency—rarely, if ever, does the cool breeze of free will blow through her humid North Georgia hellscapes. It is an art of typology: types of characters, types of flaws and sins, types of punishments. This monolithic, caricatured quality is both a weakness and a strength, manifesting in ways that are impossible to disentangle. It lends, in my mind, an imposing greatness to the work that sometimes comes at the expense of truth.
Good fiction typically provides few good answers but many good questions. The great novels and stories can often be, however incompletely, expressed as a single, overarching question that the author is working out via narrative. Is the American dream an illusion? (The Great Gatsby); should a person marry for money? (Sense and Sensibility); can the son of God be born in human form and sacrifice himself to save humanity? (Harry Potter and the Deathly Hallows). Good essay, like good fiction, is also mostly engaged in the act of asking questions. But the forms differ in a few crucial aesthetic respects, leaving aside the basic fact of fictionality, which, as we know, can be an overstated difference—nonfiction is often partly invented and much fiction is true, or true enough, but never mind that. Centrally, fiction possesses a narrator that obscures the author. Largely as a function of the narrator’s existence and also simple novelistic convention, most novels seek to attain a smooth narrative surface, an artifactual quality. A great deal of received wisdom regarding fiction craft has to do with the author disappearing in the service of creating John Gardner’s “vivid, continuous dream.” This isn’t to say that essayists don’t also obsessively and endlessly revise to create a polished surface, but the goal is typically not authorial effacement. Maybe an easier way to say it would be that both fiction and essay revolve around formulating questions, but essay very often works the act of questioning—of figuring out what the question is—into the form. Joan Didion pioneered what we think of as the modern essay, a self-conscious blend of journalism, criticism, and personal experience. 
Some Didion essays are intensely focused on one subject, for instance, “On Keeping a Notebook.” But the most Didion-y of Didion’s essays are ones like “Slouching Toward Bethlehem” and “The White Album,” which meander through subject and theme like a car driving home from work via L.A.’s surface streets. “The White Album,” for example, combines the description of her mental instability and compulsive dread with a more panoramic view of her bad-trippy east-Hollywood neighborhood in the late ’60s, a personal account which ripples out into larger cultural considerations: the Doors, the Manson murders, and California—always California. Didion’s stylistic legacy serves as both influence and study for Alice Bolin’s Dead Girls, an excellent collection of individual essays and also, to my mind, a fascinating example of the book-length possibilities of the essay form. Dead Girls begins in what seems straightforward-enough fashion with Part One, The Dead Girl Show, a quartet of thematically unified essays examining the centrality of the figure of the dead girl in American popular culture. These include “Toward a Theory of the Dead Girl,” about the glut of recent dead girl TV shows including True Detective, The Killing, and Pretty Little Liars; “Black Hole,” about growing up in the serial killer-y Pacific Northwest; “The Husband Did It,” about true-crime TV shows; and my personal favorite, “The Daughter as Detective,” about Bolin’s father’s taste in Scandinavian crime thrillers. (A side note: It’s not a requirement that you have mystery-addict parents to enjoy this essay, but it could hardly fail to charm someone who, like myself, grew up in a house crammed with Maj Sjöwall and Per Wahlöö mysteries.) Having established its seeming method in Part One, the book veers sideways into Part Two, Lost in Los Angeles, essays largely about Bolin’s experience as a 20-something living, for no especially good reason, in L.A.
Aimlessness becomes a dominant theme, as the book shifts gears into writing about freeways, Britney Spears’s celebrity journey, and wandering around graveyards. Perhaps in an attempt to pre-empt readerly confusion at the book’s shape-shifting, Bolin has made it clear, both in press and in the introduction: This is not just a book of essays about dead girls in pop culture. I understand this concern and will admit to feeling a slight confusion about Bolin’s project immediately after Part One. But proceeding through Part Two, and then Three, Weird Sisters, about teenage girlhood and the occult, I found myself increasingly glad the book had morphed and kept morphing. The book’s intelligence has a questing quality, a pleasant restlessness as it moves from literary criticism to personal anecdote to academic cultural/political critique and back again, like a jittery moth that never lands for too long on the light it circles. The way Bolin modulates subject and approach metaphorizes both the breadth and slipperiness of her main thematic concern: narratives of female objectification. The book generally proceeds from objective to subjective, mimicking the detached and objectifying eye of its central detective figure in Part One, then moving steadily into subjective, personal territory. Like Indiana Jones switching a bag of sand for gold, Bolin substitutes her younger self for the Dead Girl and, in doing so, bestows agency on the Dead Girl, brings her to life. Part Four, the longform essay “Accomplices,” brings the project to an end and to a thematic whole. In a way, it embodies the entire book, incorporating the major concerns—growing up, white female objectification and privilege, romance and the lack thereof, Los Angeles—into a self-aware meditation on the author’s sentimental education in the context of literary counterparts like Rachel Kushner and Eileen Myles and, yes, Joan Didion.
Bolin seems to be asking whether there is, inherent in the act of writing the classic coming-of-age “Hello to All That” essay, as she puts it, a self-objectification that echoes the deadly cultural objectifications critiqued earlier in the collection. “How can I use the personal essay,” she asks, “instead of letting it use me?” Part Four anatomizes the entire Dead Girls project, simultaneously encapsulating the book and acting as a Moebius strip that returns the reader to the more stylized and essayistic distance of the opening chapters. To be clear, there are many standout and stand-alone individual essays in these sections. The aforementioned “Daughter as Detective,” which, in addition to its many virtues, contains the unforgettable description of Bolin’s father as a “manic pixie dream dad.” “This Place Makes Everyone a Gambler,” a deft personal history of reading and rereading Play It as It Lays, which weaves together L.A. noir, Britney Spears, and Dateline NBC. “Just Us Girls,” a touching cultural study of adolescent female friendship. But the book’s biggest triumph, in my opinion, is of a larger, formal nature, as Bolin marshals her themes and interests into a book-length reflection of and on the persistent figure of the Dead Girl. Alice was kind enough to field a few of the questions that occurred to me in writing this review, mainly regarding how this book’s singular form came to be. The Millions: Can you provide a little general background about how the book got written? I'm curious which essays were written first. Also, if there were any pieces that it became apparent needed to be written in the interest of book-length cohesion. I'm especially interested in "Accomplices," which serves so well as an embodiment and critique of the project. Alice Bolin: This is a little hard to answer because most of the previously published essays in the book are drastically changed from their earlier forms. 
I would say the book really started with “The Dead Girl Show” and the essays in the second section about California, which I started writing, hilariously, the second I moved there. I started most of those pieces in 2013 and 2014 in Los Angeles, and that was when I started to see the ideas I'd been working with coming together in some vaguely book-like shape. Most of the essays in the third section, “Weird Sisters,” existed earlier, though, in different versions—I realized late in the game that my preoccupations with witchiness and teen girl pathology pretty obviously dovetailed with the Dead Girl thing. "Accomplices" was the last piece I wrote for the book, and I knew that it was my opportunity to pull up some of the narrative paths I'd laid down earlier, both about Dead Girls and about my own life. The book as a whole is about questioning received narratives, so I had ambitions for it to work as sort of a (sorry) palimpsest, putting forth suppositions and then writing over or revising them. I want there to be some dissonance for the reader. TM: At what point did the theme of The Dead Girl emerge? Was it obvious from the start? The collection approaches this subject from so many angles; I’m interested in if there was a certain amount of retrofitting in the revision—that is, were there already completed or published essays that you went back to and revised with the dead girl subject/theme in mind? Or did it all kind of hang together as it does from the start? AB: I think once I wrote “The Dead Girl Show,” I saw that Dead Girls were a theme that I had been interested in for a very long time. I had already been writing about thrillers, true crime, detective fiction, and horror movies, genres where Dead Girls were everywhere. After that I was thinking about other ways I could write about Dead Girl genres—like in the Nordic Noir essay—and about subjects from other pulp genres that could throw those essays into relief, like pop music or reality TV. 
I didn't really do much retrofitting that I can remember, except maybe lines here and there. I have my MFA in poetry, so I have borrowed a lot of the ways I think about a collection from poetry books—that you allow your preoccupations to dictate the shape of the book, instead of the other way around. TM: The book’s critical mode seems to move somewhat from objective to subjective, and then, in Part Four, comment on that move. That is (and I realize I might be oversimplifying here, since all these elements exist in all the essays), Part One is predominately cultural critique, and then parts Two and Three become increasingly personal. To what extent was this movement something that organically emerged in revision, and to what extent was it conscious? AB: It's interesting, because in my original draft I had the California essays first, and the Dead Girl essays second—they seemed most important to me, but then my editor was like "Uh, shouldn't Dead Girls be first since that is the title and the whole point of the book?" She was so right. Someone else has pointed out that the book works like a Dead Girl show, with the Dead Girl as bait at the beginning of the book, but the rest of the narrative arc being about something totally different. I love this, but it didn't really occur to me, except maybe intuitively. I definitely wanted the fourth section to critique the strategies of earlier essays, but beyond that, the organization was more by subject than method. I actually wanted to cut the third section late into the drafting process, if that tells you anything about how uncomfortable I am with writing about my own life! TM: To me, because of the thematic unity and movement of the book, Dead Girls has a somewhat novelistic quality or instinct. Is this something you’re interested in doing? More generally, what’s next? AB: This is such a nice compliment! I am absolutely interested in experimenting with fiction. 
I had a sort of epiphany in the past few months about how my own attitude toward myself in the book is a lot like the detachment novelists have toward their characters—it's the only way I can break through (or maybe... use?) my self-loathing. Anyway, yes I am interested in writing an autobiographical novel sometime in the future, with more details TBA, in maybe like 10 years. I'm also thinking about another very girly essay collection about magazines, social media influencers, and the vintage internet, and more generally, the way women have mediated and monetized their personalities.
1. Kurt Vonnegut’s caution against the use of semicolons is one of the most famous and canonical pieces of writing advice, an admonition that has become, so to speak, one of The Rules. More on these rules later, but first the infamous quote in question: “Here is a lesson in creative writing. First rule: Do not use semicolons. They are transvestite hermaphrodites representing absolutely nothing. All they do is show you've been to college.” To begin with the lowest-hanging fruit here—fruit that is actually scattered rotting on the ground—the “transvestite hermaphrodite” bit has not aged well. The quote also, it seems, may have been taken out of context, as it is followed by several more sentences of puzzlingly offensive facetiousness, discussed here. That said, I also have no idea what it means. My best guess is that he means semicolons perform no function that could not be performed by other punctuation, namely commas and periods. This obviously isn’t true—semicolons, like most punctuation, increase the range of tone and inflection at a writer’s disposal. Inasmuch as it’s strictly true that you can make do with commas, the same argument could be made of commas themselves in favor of the even unfussier ur-mark, the period. But that is a bleak thought experiment unless you are such a fan of Ray Carver that you would like everyone to write like him. Finally, regarding the college part, two things: First, semicolon usage seems like an exceedingly low bar to set for pretentiousness. What else might have demonstrated elitism in Vonnegut’s mind? Wearing slacks? Eating fish? Second, in an era of illiterate racist YouTube comments, to worry about semicolons seeming overly sophisticated would be splitting a hair that no longer exists. But however serious Vonnegut was being, the idea that semicolons should be avoided has been fully absorbed into popular writing culture. 
It is an idea pervasive enough that I have had students in my writing classes ask about it: How do I feel about semicolons? They’d heard somewhere (as an aside, the paradoxical mark of any maxim’s influence and reach is anonymity, the loss of the original source) that they shouldn’t use them. To paraphrase Edwin Starr, semicolons—and rules about semicolons—what are they good for? As we know, semicolons connect two independent clauses without a conjunction. I personally tend to use em dashes in many of these spots, but only when there is some degree of causality, with the clause after the em typically elaborating in some way on the clause before it, idiosyncratic wonkery I discussed in this essay. Semicolons are useful when two thoughts are related, independent yet interdependent, and more or less equally weighted. They could exist as discrete sentences, and yet something would be lost if they were, an important cognitive rhythm. Consider this example by William James: I sit at the table after dinner and find myself from time to time taking nuts or raisins out of the dish and eating them. My dinner properly is over, and in the heat of the conversation I am hardly aware of what I do; but the perception of the fruit, and the fleeting notion that I may eat it, seem fatally to bring the act about. The semicolon is crucial here in getting the thought across. Prose of the highest order is mimetic, emulating the narrator or main character’s speech and thought patterns. The semicolon conveys James’s mild bewilderment at the interconnection of act (eating the raisins) and thought (awareness he may eat the raisins) with a delicacy that would be lost with a period, and even a comma—a comma would create a deceptively smooth cognitive flow, and we would lose the arresting pause in which we can imagine James realizing he is eating, and realizing that somehow an awareness of this undergirds the act. 
An em dash might be used—it would convey the right pause—but again, ems convey a bit of causality that would be almost antithetical to the sentence’s meaning. The perception follows temporally, but not logically. In fact, James is saying he doesn’t quite understand how these two modes of awareness coexist. Or consider Jane Austen’s lavish use of the semicolon in this, the magnificent opening sentence of Persuasion: Sir Walter Elliot, of Kellynch Hall, in Somersetshire, was a man who, for his own amusement, never took up any book but the Baronetage; there he found occupation for an idle hour, and consolation in a distressed one; there his faculties were roused into admiration and respect, by contemplating the limited remnant of the earliest patents; there any unwelcome sensations, arising from domestic affairs, changed naturally into pity and contempt as he turned over the almost endless creations of the last century; and there, if every other leaf were powerless, he could read his own history with an interest which never failed. Periods could be ably used here, but they would not quite capture the drone of Elliot’s stultifying vanity. Again, form follows function, and the function here is to characterize the arrogantly dull mental landscape of a man who finds comprehensive literary solace in the baronetage. More than that, the semicolons also suggest the comic agony of being trapped in a room with him—they model the experience of listening to a self-regarding monologue that never quite ends. We hardly need to hear him speak to imagine his pompous tone when he does. The high-water mark of semicolon usage, as shown here, ran from the mid-18th to the mid-to-late 19th century. This is hardly surprising, given the style of writing during this era: long, elaborately filigreed sentences in a stylistic tradition that runs from Jonathan Swift to the James brothers, a style that can feel needlessly ornate to modern readers. 
Among other virtues (or demerits, depending on your taste in prose), semicolons are useful for keeping a sentence going. Reflecting on the meaning of whiteness in Moby Dick, Melville keeps the balls in the air for 467 words; Proust manages 958 in Volume 4 of Remembrance of Things Past during an extended, controversial rumination on homosexuality and Judaism. There is a dual effect in these examples and others like them of obscuring meaning in the process of accreting it, simultaneously characterizing and satirizing the boundaries of human knowledge—a sensible formal tactic during an era when the boundaries of human knowledge were expanding like a child’s balloon. Stylistically, the latter half of the 20th century (and the 21st) has seen a general shift toward shorter sentences. This seems intelligible on two fronts. First—and this is total conjecture—MFA writing programs came to the cultural fore in the 1970s and over the last few decades have exerted an increasing influence on literary culture. I am far from an MFA hater, but the workshop method does often tend to privilege an economy of storytelling and prose, and whether the relationship is causal or merely correlational, over the last few decades a smooth, professionalized, and unextravagant style has been elevated to a kind of unconscious ideal. This style is reflexively praised by critics: “taut, spare prose” is practically a cliche unto itself. Additionally, personal communication through the 20th century to today has been marked by increasing brevity. Emails supplant letters, texts supplant emails, and emojis supplant texts. It stands to reason that literary writing style and the grammar it favors would, to a degree, reflect modes of popular, nonliterary writing. Beyond grammatical writing trends, though, semicolons are a tool often used, as exemplified in the Austen and James examples, to capture irony and very subtle shades of narrative meaning and intent. 
It might be argued that as our culture has become somewhat less interested in the deep excavations of personality found in psychological realism—and the delicate irony it requires—the semicolon has become less useful. Another interesting (though possibly meaningless) chart from Vox displays the trend via some famous authors. As fiction has moved from fine-grained realism into postmodern satire and memoir, has the need for this kind of fine-grained linguistic tool diminished in tandem? Maybe. In any case, I have an affection for the semi, in all its slightly outmoded glory. The orthographical literalism of having a period on top of a comma is, in itself, charming. It is the penny-farthing of punctuation—a goofy antique that still works, still conveys. 2. A larger question Vonnegut’s anti-semicolonism brings up might be: Do we need rules, or Rules, at all? We seem to need grammatical rules, although what seem to be elemental grammatical rules are likely Vonnegutian in provenance and more mutable than they appear. For instance, as gender norms have become more nuanced, people—myself included—have relaxed on the subject of the indeterminately sexed “they” as a singular pronoun. Likewise, the rule I learned in elementary school about not ending sentences with prepositions. Turns out there’s no special reason for this, and rigid adherence to the rule gives you a limited palette to work with (not a palette with which to work). We know, on some level, that writing rules are there to be broken at our pleasure, to be used in the service of writing effectively, and yet writing is such a difficult task that we instinctively hew to any advice that sounds authoritative, cling to it like shipwrecked sailors on pieces of rotten driftwood. Some other famous saws that come to mind: Henry James: “Tell a dream, lose a reader.” Elmore Leonard: “Never open a book with weather.” John Steinbeck: “If you are using dialogue—say it aloud as you write it. 
Only then will it have the sound of speech.” Annie Dillard: “Do not hoard what seems good for a later place.” Stephen King: “The road to hell is paved with adverbs.” And more Kurt Vonnegut: “Every character should want something, even if it is only a glass of water”; “Every sentence must do one of two things—reveal character or advance the action”; “Start as close to the end as possible.” In the end, of course, writing is a solitary pursuit, and for both good and ill no one is looking over your shoulder. As I tell my students, the only real writing rule is essentially Aleister Crowley’s Gödelian-paradoxical “Do what thou wilt, that shall be the whole of the law.” Or alternately, abide by the words of Eudora Welty: “One can no more say, ‘To write stay home,’ than one can say, ‘To write leave home.’ It is the writing that makes its own rules and conditions for each person.” Image: Flickr/DaveBleasdale
Has there, in American letters, at least, ever been a better noticer than John Updike? By “noticer,” I don’t mean “writer of detail,” exactly. There are many writers whose use of detail I find more narratively effective: Saul Bellow, Joan Didion, Toni Morrison, Denis Johnson, Stephanie Vaughn, etc. Updike’s use of detail, as I discussed in a recent essay, often impinges upon the realism, or “realism,” of his fictional worlds, overlarding them with sensory detail that too often alerts the reader to the writer’s presence. But in terms of pure, preternatural eye for the minute, the ephemeral, and the easily missed, it is difficult to think of another writer who can compare. Here are a couple of excerpts from “Incest,” one of the earlier stories (contained, in fact, in the wonderful The Early Stories), many of which endeavor to capture the mundane drama of young married life, as played out in a series of similar, cramped New York apartments. In the first excerpt, Lee, husband and father, is cleaning up the sugar his daughter has spilled: With two sheets of typing paper, using one as a brush and the other as a pan, he cleaned up what she had spilled on the counter, reaching around her, since she kept her position standing on the chair. Her breath floated randomly, like a butterfly, on his forearms as he swept. Later, he watches his wife drying diapers in the bathroom: Looking, Lee saw that, as Jane squinted, the white skin at the outside corner of her eye crinkled finely, as dry as her mother’s, and that his wife’s lids were touched with the lashless, grainy, desexed quality of the lids of the middle-aged woman he had met not a dozen times, mostly in Indianapolis, where she kept a huge brick house spotlessly clean and sipped vermouth from breakfast to bed. The random breath on the forearm like a butterfly, the fine-grained quality of dry eyelid skin—these are signature Updike details. 
What other writer would notice, let alone bother to describe, the texture of his wife’s eyelids? He is a master at this kind of description, keying in on the microscopic and near-invisible, not as world-building flourishes, but as primary detail. The child’s breath on Lee’s forearm hairs is a perfect metaphor for Updike’s sensory apparatus—I imagine this apparatus as fine cilia, authorial cat whiskers, delicately picking up the slightest shift in descriptive breeze. This keenness extends to non-sensory details, the internal mechanism of a character’s mundane thoughts. We have closely followed Lee from, in his words, breakfast to bed, and we lie with him as he deploys a favorite tack against the insomnia their tiny, hot apartment causes him, a mental exercise he refers to as the insomnia game. The insomnia game sees Lee working his way through the alphabet in the following manner: He let the new letter be G. Senator Albert Gore, Benny Goodman, Constance Garnett, David Garnett, Edvard Grieg, Goethe was Wolfgang and Gorki was Maxim. Farley Granger, Graham Greene… Detailing a character’s nighttime mental routine is unusually perceptive to begin with, but true to form, Updike finds a higher register of his protagonist’s attention to pay attention to, as Lee pauses his list to work through the less familiar foreign first names of Goethe and Gorki. These moments of deep attention are not there to embellish a larger narrative point—they are the point. In the space of several pages, an accumulation of these details creates a world in miniature and a feeling of rare intimacy with its inhabitants. The apartment stories—“Incest,” and others like it: “Snowing in Greenwich Village,” “Sunday Teasing,” “Should Wizard Hit Mommy?,” etc.—seem to me the perfect vehicle for Updike’s rare gifts. There is, after all, a claustrophobia to this kind of detail, and these small city apartments match setting and theme to technique. 
The apartments become a kind of panopticon, with Updike’s thousand eyes relentlessly monitoring the stifled desires and tense moods of their main characters—almost invariably a young married man. At the same time, the molecular focus that Updike brings to these stories manages to turn one-bedroom shotguns into universes of discovery. There is a sense in these stories of relentless searching for a truth or truths that can only be found by plumbing down into the granular, the microscopic; these tiny details both encode and reveal the larger hidden structures we move through unaware. A small New York apartment contains an entire life—the miracles (and curses) of marriage, childbirth, sex, and death. The stories themselves follow suit, finding the largest meaning in the smallest plot point. For the most part, these tales have little in common with others of their seeming kin—John Cheever’s earlier stories, for example, many of which also center on young married couples pressure-cooking in tiny apartments. But Cheever, despite being an eccentric fabulist, is conventional in his adherence to traditional plot devices, the need for inciting incidents and escalating tension. There is no Enormous Radio in Updike, just a very small one playing in the background, the crackling sound agitating the room’s ambient emotion, and the humbleness of the device itself obscurely bothering Lee (or Richard or Arthur) trying to read in the next room. We seem to be dropped into these characters’ lives almost at random, the barest wisp of event enough cause for a story to coalesce. In “Walter Briggs,” for example, a couple driving home from a party (cars—or trains, or airplanes—like apartments, are ideal Updike sensory dioramas) attempts to remember people they met on their first wedded vacation: “How could you forget Roy? And then there was Peg Grace.” “Peg, Grace. Those huge eyes.” “And that tiny long nose with the nostrils shaped like water wings,” Claire said. 
“Now: tell me the name of her pasty-faced boyfriend.” “With the waxy blond hair. Lord. I can’t conceivably hope to remember his name. He was only there a week.” This goes on far longer than it would in the work of other writers, several pages. On and on they drone, exactly the kind of inane, half-focused conversation that composes the atomic structure of married life, exactly the kind of desultory scene typically excised from most stories. And yet: something snags—her enviable memory, and his inability to summon the name of a comic figure at the camp. Later, at the door of sleep, his mild frustration blends with a litany of details from the camp, and the sudden return of the man’s name—Walter Briggs—is like a poignant echo of his old love for his wife. In both their obsession with remembering tiny details, and their ability to do so, these two resemble their creator. Who but Updike would find erotic charge in a nostalgic memory competition? By so heavily foregrounding textural detail in these stories, Updike calls into question what constitutes a story to begin with. There is an aesthetic claim being made, that anything can be a story if you look closely enough. And the domestic sphere, Updike’s natural habitat and milieu, is all stifling closeness—what are marriages if not an infinite series of minute, learned, hateful, and joyful gestures, performed in the tiny theaters of our living rooms and beds? Here, the aesthetic claim grades into something that approaches a moral claim. A reader, waiting in these stories for twist or conflict or denouement, will get to the end unsatisfied, having missed all the intervening action, action that occurs on a moment-to-moment perceptual level. Life is like that, too. 
Although modern readers may justly find Updike morally distasteful on many counts—mid-century white male privilege, literary sexism, and political conservatism to name a few—he seems exemplary, at least, in the sense of how much attention a person ought to bring to bear on the banal splendidness that composes their life. Taken as a whole, the attentiveness that Updike trains on these intermittently peaceful and unpeaceful homes becomes performative and self-justifying. As with a fantastically gifted magician, the show becomes less about the trick itself and more about the dexterity required to perform the trick. He is constantly finding the edge of his talent and reaching just beyond it for the detail so fine and fleeting that it is preposterous, even for him, to notice it. Yes, on a basic level, this is show-offy. But I sense that it also comes from a generous instinct, a desire to share something with the reader no one else has shared before. As he puts it in “Wife-Wooing”: “An expected gift is not worth giving.”
Punctuation, largely invisible and insignificant for normal people, as it should be, is a highly personal matter for writers. Periods, commas, colons, semicolons: their use or non-use, their order and placement, can represent elaboration, conjecture, doubt, finality. And in aggregate, over the course of a text, the rhythms of punctuation advance an author’s worldview and personality as surely as any plot or theme. Patterns of punctuation usage are the writerly equivalent of an athlete’s go-to moves, or a singer’s peculiar timbre and range—those little dots and squiggles, in a sense, encode your voice. Anthony Powell’s colon (pardon the inadvertent image) is as signature as Kyrie Irving’s crossover or Rihanna’s throaty cry. For me, there is no punctuation mark as versatile and appealing as the em dash. I love the em dash in a way that is difficult to explain, which is, probably, the motivation of this essay. And my love for it is heightened by the fact that many writers never, or rarely, use it—even disdain it. It is not, so to speak, an essential punctuation mark, the same way commas or periods are essential. You can get along without it and most people do. I don’t remember being taught to use it in elementary, middle, or high school English classes; I’m not even sure I was aware of it then, and I have no clear recollection of when or why I began to rely on it, yet it has become an indispensable component of my writing. It might be useful to include an official definition of the em. From The Punctuation Guide: “The em dash is perhaps the most versatile punctuation mark. Depending on the context, the em dash can take the place of commas, parentheses, or colons—in each case to slightly different effect.” The “slightly different” part is, to me, the em dash’s appeal summarized. It is the doppelgänger of the punctuation world, a talented mimic impersonating other punctuation, but not exactly, leaving space to shade meaning. 
This space allows different authors to use the em dash in different ways, and so the em dash can be especially revealing of an author’s style, even their character. The maestro of the em dash—as he was with many things (and apologies here, it is difficult not to annoyingly play, or seem to play, on a punctuation mark’s usage while writing about it)—was probably Vladimir Nabokov. The locus of Nabokov’s attention is usually at least half trained on the fictional document he’s producing, so em dashes often serve as a kind of in-text footnote. But in a more general sense, he simply employs them as part of his exemplary stylistic machinery, using them as counterweights against commas, as parenthetical ballast and rhetorical cog. In Lolita, Nabokov is engaged in creating a calibrated ironic voice that half-emulates speech while retaining its smooth literary surface, and em dashes enable a more precise pacing of words and thoughts from the sentence to paragraph level. A representative passage chosen completely at random: I launched upon an “Histoire abrégée de la poésie anglaise” for a prominent publishing firm, and then started to compile that manual of French literature for English-speaking students (with comparisons drawn from English writers) which was to occupy me throughout the forties—and the last volume of which was almost ready for press by the time of my arrest.

I found a job—teaching English to a group of adults in Auteuil. Notice how the em dashes here, not strictly prescribed by any pressing grammatical need (the first could be justly replaced with a comma, the second eliminated), are used to create an internal structure that bridges paragraphs. The long sentence at the end of the first paragraph closes with a short clause set off by an em dash, and the short sentence at the beginning of the next starts with a shorter clause also enclosed by the em. 
The chief effect of this kind of bracketing is, I think, intuitive and rhythmical, adding to Humbert’s pompous purr, but there is a secondary effect of conjoining the ideas of transgression (his arrest) and seeming normalcy (finding a job), a pas de deux central to Lolita’s thematic heart. A more contemporary user of the em dash is Donald Antrim. Antrim’s em dash helps to create a faltering narration that expresses the pervasive emotional mood of his work, an almost paralytic anxiety. Take this first sentence, from the story “Ever Since:” Ever since his wife had left him—but she wasn’t his wife, was she? he’d only thought of her that way, had begun to think of her that way, since her abrupt departure, the year before, with Richard Bishop—Jonathan had taken up a new side of his personality and become the sort of lurking man who, say, at work or at a party, mainly hovers on the outskirts of other people’s conversations, leaning close but not too close… The narrator has only just begun to have a thought about Jonathan’s wife before a new thought intrudes, needlessly clarifying who she is to him before we even know who he is. “Needlessly” in story terms, though the larger narrative need is to exemplify, through halting syntax, Jonathan’s excruciatingly circumspect mental process. 
This is, to a degree, Antrim’s own process, and we get doses of it even in more remote, comic narration, such as in the long beginning sentence—Antrim is a great lover of long beginning sentences—of “An Actor Prepares:” Lee Strasberg, a founder of the Group Theatre and the great teacher of the American Method, famously advised his students never to “use”—for generating tears, etc., in a dramatic scene—personal/historical material less than seven years in the personal/historical past; otherwise, the Emotion Memory (the death of a loved one or some like event in the actor’s life that can, when evoked through recall and substitution, hurl open the floodgates, as they say, right on cue, night after night, even during a long run)—this material, being too close, as it were, might overwhelm the artist and compromise the total control required to act the part or, more to the point, act it well; might, in fact, destabilize the play; if, for instance, at the moment in a scene when it becomes necessary for Nina or Gertrude or Macduff to wipe away tears and get on with life; if, at that moment, it becomes impossible for a wailing performer to pull it together; if, in other words, the performer remains trapped in affect long after the character has moved on to dinner or the battlefield—when this happens, then you can be sure that delirious theatrical mayhem will follow. Here, Antrim actually violates, as he sometimes does, a basic rule of parenthetical em dash usage, that you can only use one set per sentence. The violation of this stricture is unsettling and makes it difficult to keep up with meaning. Which, in a sentence and story about artistic chaos and loss of control, is, of course, the point. Emily Dickinson is probably the most well-known user of the dash, to such an extent that “em” might justly be taken as short for “Emily.” She habitually ended lines with em dashes, sometimes to an obvious effect, sometimes not. 
Here is her most famous stanza: Because I could not stop for Death— / He kindly stopped for me— / The Carriage held but just Ourselves— / And Immortality. What are the dashes doing here? On the one hand, since they don’t serve any obvious syntactical function, they can be read simply as a stylistic tic. But they do create a feeling of hesitation that serves the poetry. Without them, this stanza is a nicely crafted, clever piece of thinking about the inevitability and dignity of death. With them, we feel Dickinson’s hand hovering over the page, considering her subject. This lends the poem a poignancy, a sense of the artist thinking through the terms of her own death. Her use of the em dash obliquely posits writing as an elaborative act, and in many of her poems the em transforms what would otherwise be somewhat inert, though great, common meter into something alive to itself, process-oriented. My own favorite use of the em dash is for elaboration, similar to the way many writers use colons. As a personal rule, I only use colons in a specific context: that is, if what follows answers the question what? Em dashes I find useful for both narrowing and expanding a train of thought that might lose momentum in a new sentence—in this sense, they also stand in for the semicolon, but semicolons are best used (in my fuddled cosmology of punctuation) as dividing walls between two related but independent thoughts of approximately equal value (I wholly reject, by the way, that old bullshit about eliminating semicolons). In truth, I probably overuse the em, find too much pleasure in asides, in explanation. But I can’t do it, I cannot write terse little impregnable Tobias Wolffian sentences that stand on their own. 
Though I can admire a page of these sentences—the calm presiding rationality, like civilized people queueing to exit the building in a fire drill—I am drawn instinctively to the dithering em, some contingency always butting up, worrying the previous sentence before it’s had a chance to end. As Noreen Malone put it in a self-deprecating Slate article, “The problem with the dash—as you may have noticed!—is that it discourages truly efficient writing. It also—and this might be its worst sin—disrupts the flow of a sentence.” This is true. But is efficiency the point or purpose of writing? It seems to me that novels, especially, are almost anti-efficiency devices. Yes, we want to communicate clearly, but sometimes, just as crucially, we also want to clearly communicate the difficulty of communicating clearly.
“I think fiction writing which does not acknowledge the uncertainty of the narrator himself is a form of imposture which I find very, very difficult to take. Any form of authorial writing where the narrator sets himself up as stagehand and director and judge and executor in a text, I find somehow unacceptable. I cannot bear to read books of this kind.” —W. G. Sebald

That this is the age of first person seems undeniable. Essay and memoir are—have been for some time—culturally ascendant, with the lines between fiction and essay increasingly blurred (I’ve written about this elsewhere). In its less exalted form, first person dominates our national discourse in many guises: the tell-all, the blog post, the reality confessional booth, the carefully curated social media account, the reckless tweets of our demented president. We are surrounded by a multitude of first-person narratives, vying for our time and attention, and we respond to them, in our work, and increasingly in our art, in first person. My impression, as a writer and teacher, is that over the last 10 or 15 years there has been a paradigmatic move toward first person as the default mode of storytelling. In a workshop of 20 student pieces, I’m now surprised if more than a third are written in third person. When I flip open a story collection or literary magazine, my eye expects to settle on a paragraph liberally girded with that little pillar of self. Anecdotal evidence tends to support this suspicion. A completely random example: six of the last 10 National Book Award winners have been first-person narratives; of the 55 previous NBA winners stretching from 2005 to 1950 (Nelson Algren’s The Man with the Golden Arm), the tally is 40 to 15 in favor of third person. This is, of course, completely anecdotal and almost certainly statistical noise, to a degree. Still, it’s suggestive. 
As recently as 10 years ago, creative nonfiction specialist jobs barely existed at the university and graduate MFA level; last year, there were more creative nonfiction job openings than comparable tenure-track positions for poets. Essay and memoir classes have sprung up everywhere. Whether this trend is significant and whether it will continue are debatable; that it is a trend seems less so. It worries me that we may be slowly losing the cultural ability or inclination to tell stories in third person. Why does this matter? Because, I believe, third-person narration is the greatest artistic tool humans have devised to tell the story of what it means to be human. In How Fiction Works, James Wood cites Sebald decrying third person as obsolete following the horrors of World War II. Wood comments, “For Sebald, and for many writers like him, standard third-person omniscient narration is a kind of antique cheat.” The general argument, as advanced by Sebald and, more recently, by writers like David Shields and Will Self, seems to go: Flaubertian third-person omniscient narration is a jury-rigged, mechanistic anachronism blithely ignorant of the historical context that renders it obsolete; far from “realism,” it is almost wholly artificial, beginning in the first place with the artifice of a narrator and extending through the sleight of hand known as free indirect discourse (crudely put: the blending of narrator and character perceptions). First-person narration, the corollary would go, is more immediate and less contrived. It is authentic. Most people seem to agree. These critical interpretations both reinforce and describe a more popular apprehension of first-person narrative—that it is the most direct and natural form of storytelling. In creative writing classes, teachers will often advise students to employ first person with more overtly raw or emotional material, operating on the rationale that first person has an implicit honesty third does not. 
Sebald’s quote—as to the inherent (and therefore inherently truthful) uncertainty of the essayistic perspective—is simply a more sophisticated version of this position, what we might call the naturalistic view of first person. First person, however, contains a contrivance central to its character that third person does not: audience. In first person, someone is addressing someone else, but absent narrative framing to position these someones—à la Holden Caulfield directing his speech to a ghostly doctor—we find ourselves in an inherently ambiguous space: to whom, exactly, is this person talking, and why? The uncertainty of this space, I would argue, is largely filled, intentionally or not, by the voice of the narrator, its presence and authority. Even if this narrator declaims her own uncertainty, she declaims it with certainty, and she declaims it toward an imagined audience, in a speaker/listener relationship. There are no competing voices, no opportunity for the objective telescoping of third person, and so the reader essentially becomes a juror listening to a lawyer’s closing argument. In this sense, all first-person narration is unreliable, or placeable on a continuum of unreliability. It isn’t accidental that the greatest examples of the first-person novel—Lolita, The Good Soldier, Tristram Shandy—make ample use of unreliability and/or intricate frame narration. The best examples of the form lean as heavily as possible on first person’s audience-related pretenses. Third-person narration, in contrast, contains no similar inherent claim to authority, and therefore tends toward a version of the world that is more essentially descriptive in character. A third-person narrative, whether in the form of a short story or War and Peace, is a thing to be inspected by the reader. It is, in a sense, a closed system, a ship in a bottle, and the reader can hold it up to the light to see how closely it resembles a real ship. 
If it does, part of the reading experience is to imagine it as the real thing; but it can be assumed, in a kind of contract on the part of intelligent writers and readers, that the shipbuilder is not pretending his model is fit for actual seafaring. In other words, the existence of a third-person narrator—that artificial authority Sebald found intolerable—signals the act of storytelling, and in doing so, encodes a structural uncertainty that first person lacks. Third-person narrators no longer walk onstage and deliver monologues, à la Jane Austen, but we still understand them to be devices in service of telling a story—a contrivance that announces itself as such. They are the artifice that enables the art, and they are truthful as to their own untruthfulness, or perhaps better, their truthlessness. Compared to the explicit machinery of third-person narration, first person’s artifice seems covert, a clandestine operation. This is not necessarily an argument against first-person narration—in able hands, this concealment can be a means of exposing greater truths about the subject of the writing or its writer—but it is an argument against the proposition that first person is somehow more transparent or “honest” than third. The other common objection to third-person narration, and by proxy an argument for first person, also concerns the artificiality of the third-person narrator, not in artistic but rather experiential terms. This is the second prong of the naturalist argument: the third-person narrator isn’t a thing that exists. No one walks into a room and thinks of themselves, “he walked into a room.” Nor does anyone simply watch other people walk into a room without being aware of their own frame of reference. And this is true: close third person, via free indirect discourse, models human consciousness with an intimacy that strives toward first person’s access to a character’s thoughts and emotions. 
Why then, the argument goes, not dispense with this clumsy intermediary and go right to the source? Counterintuitively, third person achieves an effect, both in spite of and because of its narrator, that is more “realistic” than first. While no one walks into a room and thinks, “he walks into a room,” it can be asserted with even greater force that no one walks into a room and thinks, “I walk into a room.” No one, that is, who isn’t an imbecile or robot—not characters who figure heavily in the canon of great fictional protagonists. The experience of being a human is, in fact, an experience of dual consciousness. Human beings are social creatures, and human existence is an endless negotiation of the immediate, subjective perspective, and the greater objective context. We constantly divide our attention between the first- and third-person points of view, between desiring the shiny object in front of us and figuring out what it means for us to take it: who else wants it, what we have to do to get it, and whether it’s worth taking it from them. In this sense, close third person not only accurately models human cognition, but omniscient third does as well, since, while we cannot read other people’s minds, we are constantly inferring their consciousness—their motives and feelings. The human experience is a kind of constant jumping of these cognitive registers, from pure reptile-brain all the way up to a panoramic moral overview and back down, and human ingenuity has yet to invent a better means of representing this experience in art than the third-person narrator. The apparatus of third-person narration, while wholly artificial, ironically enables the most authentic depiction of the quagmire of personhood. Irony is key here, in both cause and effect. 
Third person’s scaffolding of multiple, competing levels of awareness is inherently, structurally ironic; the effect created by these slightly ill-fitting beams and joists, as the demands of narrative push and pull them against each other, is a large-scale, resonant irony. Writing about the ability of narrative to convey humanity’s huge profligacy of type, Adelle Waldman, in a New Yorker piece from 2014, quotes Leo Tolstoy’s depiction of Vronsky: He was particularly fortunate in that he had a code of rules which defined without question what should and should not be done. The code covered only a very small number of contingencies, but, on the other hand, the rules were never in doubt, and Vronsky, who never thought of infringing them, had never had a moment’s hesitation about what he ought to do. The rules laid it down most categorically that a cardsharper had to be paid, but a tailor had not; that one must not tell a lie to a man, but might to a woman; that one must not deceive anyone but one may a husband; that one must not forgive an insult but may insult others, etc. She says, “If someone like Vronsky were to give an account of his moral code, it would not, we can be sure, read in precisely these terms.” This is true but neglects an important aspect of this rendering of Vronsky’s moral code, for we see at once in this passage a social view of Vronsky’s hypocrisy that shades toward a self-awareness of his own hypocrisy. This shading—the ironic bounce of the repeated “never,” the pompous “most categorically”—both enacts Vronsky’s hypocrisy and suggests a shiver of cognitive dissonance, of unease, that seems to come from Vronsky himself. 
The point is debatable—maybe Tolstoy is just calling Count Vronsky an asshole—but in a general sense, the ironic space that third person carves out creates a productive ambiguity that deepens character the same way these little ironies of the self, the simultaneity of objective and subjective, deepen human existence the more a person is aware of them. In this case, they suggest a Count Vronsky who is not only an asshole, but also, perhaps, very slightly aware of his own assholishness, as most assholes are. It at least implies that possibility—a complex position unavailable to first person, in which a Vronsky POV would essentially either cop to his own hypocrisy or strategically introduce it through unwitting revelation in the usual reliably unreliable method. As a thought experiment, try to imagine Ulysses written in the first person, the dueling solitary consciousnesses of Stephen and Bloom. We are, of course, embedded deep in Bloom’s and Stephen’s minds, but we are embedded there via virtuoso free indirect discourse rather than first person. It is surprising, in a way, that Ulysses was not written in first—after all, here we have the summit of stream-of-consciousness narrative, with an emotional and associative immediacy that has informed 100 years of writing all the way to the essayists of the moment. Not only this, but the fracturing of consciousness and of Dublin’s social institutions as represented in the book are (as we understand, in a somewhat trite though probably accurate sense) a cultural response to the First World War; per Sebald, we would expect such a narrative to dispense with the puppetry of third-person narration. So why not in first? What would be lost? Among other things, it would more or less be simply a record of human confusion. It would be an exhaustive, exhausting trek through Dublin, unremitting in its assault on our senses. 
Ulysses is already exhausting enough in this regard, but many of the moments of relief are moments of perspectival shift: the wider view of Stephen in the classroom, for example, or the anti-Semitic Citizen throwing a biscuit tin at Bloom as he flees the pub, righteous and triumphant. These and similar moments, allowed by the omniscient narration, crucially let in other people, complicating the dominant note of mental claustrophobia. I say crucially, because the novel is not, ultimately, about mental claustrophobia, about being trapped in oneself; it is about the opposite, about the inevitability and value of social connection. A Ulysses in first would represent, in spite of its erudition and catholicity of reference, an essentially shriveling worldview, rather than the enlarging one it offers. HCE: Here Comes Everybody. All of which is to say that the current critical and cultural movement away from third-person narration should be taken seriously, and to some extent—as much as such a thing is possible—resisted. Matters of taste come and go, and it may seem silly to imagine third-person narration disappearing. After all, it has persisted in its current form for going on 300 years. But many pinnacles of high art recede and disappear in the face of changing norms. It was probably similarly hard for the 19th-century art lover to imagine classical portraiture and Renaissance brushwork disappearing. David Shields and similar critics may be dismissed as extreme, but they give voice to a larger cultural impulse, the enthronement of unmediated personal experience and feeling (as though such a thing were possible, even if desirable) as the height of written expression. Reviewing Meghan Daum’s essay collection, The Unspeakable, Roxane Gay writes, “When it comes to the personal essay, we want so much and there is something cannibalistic about our desire. We want essayists to splay themselves bare. 
We want to see how much they are willing to bleed for us.” The promotion of this kind of writing is, in turn, a collective response to larger cultural currents, among them the still shockingly recent advent of the Internet and reality television. In this context, it is not hard to imagine omniscient third person, with its many registers of complex irony and representation, becoming the truly outmoded art form that Sebald and others would like it to be, an ornate artifact of a slower and more explicable age. And it’s true that in a very real sense, third person is not the narrative mode of our time. A Henry James novel is essentially the anti-tweet. Its aesthetic roots are in a more contemplative era, an era with fewer distractions and, simultaneously, more incentive to consider one’s place in the larger social context of a world that was rapidly expanding. Now that the world has expanded to its seeming limits, we see an urge to put the blinders on and retreat into the relative safety of personal narrative. This impulse should be resisted. We need to engage with our world and one another, making use of the most sensitive instruments of understanding we have at our disposal.
There is a special joy in finding someone who hates a book as much as you do. But not just any hate, and not just any book. There are plenty of books you may not like: books that fail to deliver on their explicit premise or implicit promise, books with middles that sag like the floor of a teardown, books with underwritten or overworked prose, books that strain credulity, books that could stand to strain credulity a bit more, books that are too long, books with bad covers, books that just aren’t very good. The world is full of books you may not like, and it is not hard to be generous about them, especially as a writer—because, as well you should know, books are incredibly hard to write, and even harder to write well. A genuinely good book is something of a rarity; a genuinely great book is something of a miracle. So if someone, in your estimation, misses the mark, that is an easy thing to forgive, either by way of reading charitably or simply putting down the book in question. The type of book I describe here must be a book that you not only dislike, but that everyone else likes, a lot. By “everyone,” I mean friends and family, the general reading public, and publishing’s commercial/critical apparatus—everyone loves it, and your non-love is so unusual as to make you question your taste. In extreme cases, it may seem as though there’s a conspiracy afoot in the general culture to make you feel insane, a kind of cultural gaslighting. The hatred you develop for these books is different in character from mere non-enjoyment. It is rooted in a feeling of unfairness—not so much being left out, as being the clear-eyed pariah in a horror film, Steve McQueen in The Blob. It is a hatred that begins to feel personal, a resentment that extends from the text in question to the person responsible for the text. Several examples of books like this come to mind from recent years, but I will focus on one in particular, title and author redacted. 
This novel came highly recommended in print and online reviews, and by many of my friends and colleagues. I’ll credit my mother, an unusually wary reader, with some ambivalence, but by and large the response was a tidal wave of praise. It was variously hailed as one of the 10 best of the year, an utterly brilliant book, a landmark performance by [REDACTED]. I found it almost unreadable. The highly touted prose is a lush jumble of metaphors and registers without any controlling logic or taste. Each sentence is like a cordoned-off museum of its own aesthetic particulars, seemingly unrelated to the sentence coming before or after. A single paragraph might blithely mix lavish description, various unconnected similes, a flat and unfunny sex joke or two, and futurist Internet-speak, with no seeming concern as to how any of these elements work together. How, I wondered as I read, did this qualify for anyone as “sparkling prose”? For me, the reading experience was like being trapped in a mudslide, an avalanche of language, each additional word adding to the incoherence. The story is a series of lazy implausibilities cobbled together with all the deftness of a tailor wearing oven mitts. The characters are ludicrous, with ludicrous back stories and ludicrous arcs—they are ludicrous to an extent that feels intentional, satirical, except nothing is being discernibly satirized. Rather than a failed attempt at satire, the general silliness of the proceedings instead stems from what feels like authorial disengagement; the novel itself seems uninterested in why its characters do things and what their level of self-awareness is. They simply move where [REDACTED] needs them to and behave as the story dictates, however unconvincingly. The overall effect is one of contempt for these people and their fictional reality. 
If I’d happened randomly upon the book, I would have simply stopped reading, but in the spirit of someone not getting a joke and asking the teller to repeat it over and over, I kept on, thinking at some point it would click. When it hadn’t halfway through, I finally stopped, feeling deceived. What trick of promotion—what cocktail party legerdemain—had managed to foist this nonsense upon the reading public, had somehow turned this gobbling turkey into a golden goose? That the book had been published at all was vexing enough—though not, seemingly, to the scores of critics and fellow authors who padded the proceedings with an introductory chapter of praise. I moved on to other, less maddening novels, but I’d be lying if I said I didn’t seek out negative Amazon reviews for that paltriest comfort: the spectral two-star outrage of other bewildered readers. The reflexive cynical view—one that is easy to adopt in the thrall of book-hate—is that you have fallen victim to a corporate conspiracy. Publishing decides ahead of time that a certain writer’s moment has come, the engines of commerce and publicity fire up, and the quality of the book in question is a distant secondary concern. While this may contain a kernel of truth, a simpler, more general explanation probably suffices. Namely, there is a great inexorable power in expectation—as social beings, we want to like things we’re supposed to like, and we’re uncomfortable standing on the platform, watching the bullet train of popular opinion shriek by. Human nature, of course, tends toward a herd instinct, but the inclination of readers—and this includes agents, editors, publicists, and critics—to enjoy a book derives mostly from good qualities: kindness and generosity and the urge to like, rather than dislike, things. About a year later, I sat with friends on the second-floor porch of a rented vacation house in the mountains. 
We were enjoying a drink and the view of a late summer sunset over a canopy of trees just beginning to bruise with the colors of autumn. It was cool, birds flew to and fro, and from inside the house drifted music and the smell of roasting meat. The scene felt unimprovable, but then, from out of nowhere, this book came up in conversation and I discovered, to my surprise, two other readers who hated it. Oh, my friends! Oh, the joy! The sweet, delicious bliss of finding other readers who hate a book as much as you! Our individual aversions, long suppressed in unspoken self-doubt, were suddenly given full voice in an ecstasy of communal loathing. At last, we’d found each other. We leaned in with a conspiratorial air. “Godawful,” said my friend B. “I put it down after 50 pages, maybe 40,” said J. I said, “It’s completely ridiculous. I quit halfway; imagine reading the whole thing.” “I did,” said B. “I couldn’t believe I was reading the same book people gushed over all year. It was like I had to keep checking.” This went on too long, as this kind of thing typically does. Twenty, 30 minutes, maybe. We were like children gorging on an unattended cake, so desperate to get in every last piece of frosting and crumb that we misjudged our appetites. The initial greedy sugar rush of shared hatred went away and we crashed into a state of rueful circumspection. B said, “I mean, there’s no question [REDACTED] is talented.” “Yeah,” said J. “And I loved [REDACTED]’s first novel. I think it was part of my disappointment.” The conversation spiraled in once or twice more to how bad the novel was, but with ever-waning enthusiasm. As readers, I think, we share an understanding of what a niche audience we are, and what a niche enterprise writing a novel is. There’s a feeling—and perhaps this goes a way toward explaining why bad literary novels become fêted—that any nominally non-stupid book enjoying a moment of popularity is something to be celebrated. 
Disliking a popular literary novel—that blackest of black swans—is not like disliking a TV show everyone else thinks is great. Being the contrarian in the corner of the party who thinks Breaking Bad is overrated might not win you any friends, but being the reader who thinks a good book is overrated feels uncomfortably close to a betrayal. We settled down. I said, “Yeah, agreed. [REDACTED] really is an interesting writer.” “I’ll read the next one,” said B. “Also,” J said, “I guess we could be wrong about this.” But that was a possibility too terrible to contemplate, and the conversation moved on to other things as we went inside for dinner.
“I came to realize that far more important to me than any plot or conventional sense was the sheer directionality I felt while reading prose, the texture of time as it passed, life’s white machine.” —Ben Lerner, Leaving the Atocha Station

In the creative writing classes I teach, my students—most of them brand new at writing fiction—often go crazy writing plot. Their understanding of fiction, derived from a Stephenie Meyer/J.K. Rowling/Suzanne Collins-heavy reading background (not to mention 18 years of TV and movies), is that in fiction, stuff needs to happen. These early stories are breakneck affairs, full of marriages and divorces and car chases and gunplay and fistfights and murders and suicides and murder-suicides—sometimes spinning several into the same piece. They are B-movie scripts written as prose, mostly expository. Slow down, I advise, boringly. They listen, or pretend to listen, as I explain that literary fiction, the kind I am ostensibly being paid to instruct them how to write, is more character than plot. This is axiomatic: in literature, character is primary, always. Yes, things must happen, but they can be small things, incremental turns of the thousand gears that make up a narrative’s clockwork interior mechanism. And they should be things, ultimately, that derive from character. My students nod and proceed to write stories with gunfighting vampires—but introspective gunfighting vampires—and get an A-minus, probably. Lately, however, I’ve read a spate of critically lauded books published in the last few years that make me feel like one of my students. Where, I wonder, pausing halfway through, is the action? To be sure, there is still plenty of fiction being written with conventional plot, plenty of bestsellers with blood-soaked covers, but it has begun to seem to me that many of today’s best writers are writing fiction in which, almost quite literally, nothing happens. 
The prefatory Ben Lerner quote comes from his 2011 novel, the roman-à-clef-ish Leaving the Atocha Station. In it, Adam Gordon, a 20-something in Madrid on a Fulbright fellowship, wanders around the city smoking pot, thinking about poetry, writing very little of it himself, and freaking out when he isn’t sleeping. In other words, he is a 20-something poet abroad. What event there is, including the titular Madrid bombing, happens to him or near him, not because of him. Early on, for instance, he passively gloms onto a group of strangers in a bar, is befriended by a gregarious Madrileño and his sister, vomits from an excess of mojitos, is driven home but can’t remember the address, is driven to a party, smokes too much pot while listening to a guitar player, pretends to cry while telling the sister his mother has died, and is comforted by her. I suppose there’s a bit of action there, in terms of his weaselly dissembling, but that’s about as far as it goes. The page turns, and we are on to the next day of espressos in the shower and lonely flaneuring around the Prado and Puerta del Sol and Gran Vía. The point is repeatedly made that Adam cannot really speak Spanish, a narrative choice that, on its face, thematizes the problems Adam has with poetry and representation, but on a practical (read: plot) level also means that he cannot really do anything. His agency is limited by narrative design, clearing maximal novelistic space for his thought process, a process that includes lengthy ruminations on his lack of agency. Teju Cole’s Open City takes this mode a step further, almost entirely dispensing with conventional plot in favor of the narrator Julius’s peregrinations through New York. 
These “aimless wanderings,” as he describes them in Chapter One, also describe the book’s narrative strategy, one in which the real-time event of Julius moving through the city is simply a pretext for him to think about the city, his life, the world, politics, as he says, “noticing himself noticing.” Flaneuring is, of course, nothing new in literary novels, nor, more generally, is the phenomenon of the most important action occurring in thought. What, for example, is Madame Bovary besides the portrait of a woman noticing herself notice her real feelings about her life and her marriage? What seems different is the degree to which these books, and many similar others published in the last few years, feel little pressure to make anything happen or put their characters in any sort of plot-driven crucible. It would be negligent to discuss books lacking event without mentioning that 800-pound gorilla of strategic tedium, Karl Ove Knausgaard’s My Struggle. The Min Kamp series is an ode to inaction, a paean to plotlessness. The first section of Book I, approximately 200 pages, is committed to Karl Ove’s recollection of an adolescent New Year’s Eve, during which he and a friend finagle a twelver of beer, hide it in the woods, and attend a classmate’s boring party. That’s it. Interwoven are lots of digressions about life, death, philosophy, and fatherhood—many of them interesting—but also lots of other digressions into his daily existence, many of punishing mundanity (how many times, for example, are we treated to a scene of Knausgaard entering a gas station and buying a Coke?). Even the weightier second half of the book, given over to an account of the week between his father’s death and funeral, focuses mainly on all the stuff narrative is supposed to leave out: waiting on a flight, cleaning a bathroom, cleaning a kitchen, putting on a pair of pants—all flatly rendered in a po-faced Nordic mock-heroism. 
The static quality of these books is not accidental; it is, in fact, purposeful and, in certain senses, the point. You do not, in either reality or in Knausgaard, get the epiphany without the apophany. As a human body is mostly water, a human life is mostly waiting, mostly killing time between the main events. But fiction, as we have understood it for centuries, could be defined, in essence, as “main events.” Traditionally, fiction seeks to strip out the tangential and irrelevant, the moments between moments, to create an accelerated version of reality in which relatively important things lead to other relatively important things—it isn’t that important things don’t happen in real life, it’s that it takes far too long to get from one to the next, and along the way any narrative outlines become lost in banality, what Lerner, cadging from John Ashbery, describes as “life’s white machine.” Why, then, are some of the smartest and most celebrated writers in modern literature (I count among these, as well, Rachel Cusk, whose incomplete trilogy—Outline, Transit—is a study in passivity and seeming tangentiality; to some extent Elena Ferrante, whose Naples quartet accrues its violence and romance in endless reiterative, sedimentary layers; also Sheila Heti, and an entire generation of creative non-fiction writers, a separate but closely related phenomenon) telling stories that are largely banal, largely life’s white machine? One reason (as is, regrettably, so often the answer to this kind of rhetorical question): the Internet. In moving our consumption of news and media from a top-down to an à la carte model, the Internet has changed the way we are accustomed to receiving and disseminating information. Information of all varieties now spreads across a series of nodes on a relatively flat plane of authority, and this change, though most frequently and obviously framed in terms of journalism, has also inevitably impacted the way we read and write novels. 
Put one way, William Shakespeare—or for that matter, Leo Tolstoy or Marilynne Robinson—is to Edward R. Murrow as Teju Cole is to your Twitter feed. This is a facile comparison, but it has some truth, I think. In an era of vertical information—roughly prehistory until 2005—novelists predictably wrote vertically. In the 19th century, a still-religious era with consolidated organs of journalism and publishing, authors wrote with epochal authority, a confidence in the universality of their observations and judgments: it is a truth universally acknowledged, all happy families are alike, etc. Even through the upheavals of the 20th century, there remained an essential confidence in the role of the author as someone bringing the news. Postmodernists like Thomas Pynchon, William Gaddis, and Don DeLillo may have doubted the institutions of their country, may have created new landscapes of American paranoia, but they had little paranoia or doubt with regard to their essential role as shapers and interpreters of a shared reality. As has been noted with tiresome frequency in the wake of the election, we now occupy individuated worlds of curated art, opinion, reality. There may still be objective truth, but it isn’t clear that we can reach consensus on what it is, and we accept fewer and fewer people as authorities. Of course, the word “authority” derives from the word “author,” and the act of writing itself is inherently an act of authority, of assuming the right to create and order words and thereby create sense. But there are degrees of order and sense in narrative, and writers like Lerner, Cole, Cusk, Knausgaard, and many others, it seems to me, are responding to this mood of uncertainty—if I may be grand, this weakening of cultural epistemic authority—by mitigating the authority they exert in their narratives; in general, by moving from the objective to the subjective. This manifests in certain ways—for one thing, in a preponderance of first-person narrative. 
Third person assumes the right to speak for, to inhabit, characters other than oneself, and to manipulate these characters, imbuing the text with a unitary consciousness; first person, no matter the degree of artifice, implies a bounded consciousness, the disconnection between people. And disconnected first-person narratives—from blog posts to reality-show confessionals to the infantile tweets of our deranged president—are essentially the narratives of our time. It is not hard to mentally recast these writers as social media types: Knausgaard, the maximalist oversharer; Lerner, the pomo ironist; Cusk, the reticent philosopher; Heti, the more traditional diarist. (As an aside, it may to some extent subliminally flatter the reading public to imagine that a compilation of their status updates and thoughts on their life could—with a little editorial organization—comprise a publishable novel.) This mitigated fictional authority also manifests in a tendency toward plotlessness and mundanity. The act of making up a story is an act of control, an exertion of order over entropy. The more carefully narrative is created, the more meticulously event is arranged for effect, the greater the implied presence of authorial control in the traditional sense. William Makepeace Thackeray, for example, engages in Vanity Fair in a decades-long arrangement of characters’ lives, moving Becky Sharp and company around a diorama, into which he occasionally enters to comment: This is what happened to everyone, and why. In contrast, reproducing life “as it happens” in this type of quasi-memoiristic fiction implies a position of authorial neutrality, with no presumption of ordering event for narrative effect or explanatory power: This is what happened to me, and who can say why? It may also be the case that we are, to an extent, culturally fatigued by plot. 
Five minutes of cable news provides sufficient event, enough dramatic twists and lurid intrigue, to sate anyone’s appetite for the fictional stand-in. Any adult with a job and a smartphone is inundated with an unprecedented amount of media and advertising. Our attention is competed for nearly incessantly, the old limits of mental privacy seemingly encroached on further every day—you cannot take a flight, ride in a cab, or gas up your car without a chyron scrolling beneath your weary eyes. Eventless fiction can, through this lens, be seen as a form of cultural protest, a refusal to vie for a reader’s scanty attention via the bright, shiny artifacts of plot. In terms of aesthetic experience, it is also a respite—I suspect many readers find themselves pleasantly lulled by the snowdrifts of Knausgaard’s youth, the quiet calm of the novel’s glacial inaction. During these “interesting times” (as the old saw would have it), an intelligent response may be to write less interesting fiction.