Poets, editors, songwriters, teachers, journalists, novelists—some great writers and some under-sung ones left us this year. Here, in chronological order of their deaths, is a selective compendium of literary obituaries from 2017.
Bharati Mukherjee was born in Calcutta, educated in England, Switzerland, and India; she earned advanced writing degrees in the United States, and lived more than a decade in Canada—a peripatetic life she mined to write fiction about the aspirations and dislocations of immigrant life. Mukherjee, who died Jan. 28 at 76, grew up in a rich Hindu family, “bubble-wrapped in innocence,” as she would say later. Shortly after arriving at the Iowa Writers’ Workshop, where she studied under Philip Roth, Mukherjee informed her parents that she was not going through with the marriage they had arranged for her and that, in fact, she had recently married a white American writer, Clark Blaise. Her first-hand knowledge of the immigrant’s yearnings was captured in the title character of her breakthrough novel, Jasmine, a poor girl from Punjab who arrives in America “greedy with wants and reckless with hope.” Mukherjee’s collection The Middleman and Other Stories, winner of the National Book Critics Circle Award in 1988, explored the immigrant experience through the stories of new arrivals from the Caribbean, Sri Lanka, the Philippines, and the Middle East. As she was writing those stories, she was developing a credo: “Make the familiar exotic (Americans won’t recognize their country when I get finished with it) and make the exotic—the India of elephants and arranged marriages—familiar.” Given that we now live in a world with 60 million refugees, driven from their homes for reasons ranging from terror to desire, it’s hard to argue with Mukherjee’s claim that “the narrative of immigration is the epic narrative of this millennium.”
Some writers are lucky to have a singular place that forever nourishes their art. William Faulkner had Yoknapatawpha County. Elmore Leonard had Detroit. Patrick Modiano has Paris. And Derek Walcott, the Nobel Prize-winning poet, had his native Caribbean island of St. Lucia. It provided Walcott with ample raw materials for his vivid, musical poems—the sea, the pulsing sun, the land and its fecund vegetation, and the people who live there in the wake of slavery, colonialism, and forced exile, snagged in the mesh of commingled cultures.
Walcott, who died March 17 at 87, published his first poem when he was 14 while operating under the influence of Christopher Marlowe and John Milton. Over the next seven decades he became an accomplished poet, playwright, and watercolorist, fluent in English, French, and Spanish, producing a body of poems that ranged from compact to epic, always spun from the weather, the history, and the people of the Caribbean. Walcott was also a wanderer, and, like all exiles, he knew the twinned aches of leaving home and returning. These lines are from In a Green Night, the 1962 book that announced him as a major writer:
The hospital is quiet in the rain.
A naked boy drives pigs into the bush.
The coast shudders with every surge. The beach
Admits a beaten heron. Filth and foam.
There is a belt of emerald light, a sail
Plunges and lifts between the crests of reef,
The hills are smoking in the vaporous light,
The rain seeps slowly to the core of grief.
It could not change its sorrows and be home.
Though he’ll be remembered as a Pulitzer Prize-winning newspaper columnist of the New York City persuasion, Jimmy Breslin, who died on March 19 at 88, was also a gifted novelist, memoirist, biographer, and writer of nonfiction books about subjects both light and dark, from the ineptitude of the early New York Mets baseball teams to the sins of sexual predators in the Catholic priesthood. His biography of Damon Runyon reads like Damon Runyon on acid. Breslin produced more than 20,000 newspaper columns in his long and fluorescent career—a staggering number, I can attest, having produced about 600 of the things myself. Many of Breslin’s were written on behalf of the powerless, the ignored, the forgotten. When someone asked him why he kept going back to the well, he replied: “Rage is the only quality which has kept me, or anybody I have ever studied, writing columns for newspapers.”
Breslin’s was an only-in-New-York life. Born in Queens, he knew the streets and the saloons, the mobsters and the cops like nobody else, and he was among the vanguard of writers who birthed what has come to be known as the New Journalism, though he scoffed at the term. Too high-minded for this burly son of the outer boroughs. He ran (unsuccessfully) for New York City Council president the same year Norman Mailer ran (unsuccessfully) for mayor. His fame reached its peak in 1977, when the serial killer David Berkowitz, known as the Son of Sam, began sending letters to Breslin, which he published in the New York Daily News. For all the warmth he felt for the little people, Breslin could be as cold and hard as iron. His father abandoned the family when Jimmy was young, and when his father died, the son paid for the cremation. “Good,” he said afterward. “That’s over.”
Jean Stein died on April 30 at 83, an apparent suicide. She grew up amid Hollywood luxury—her father founded Music Corporation of America—and she returned to that milieu in her later work. But it was her 1982 book, Edie: An American Biography, that upended my understanding of what a book can be. It tells the story of Edie Sedgwick, who also grew up wealthy, became an Andy Warhol superstar, then spiraled into drug addiction and death by overdose at 28. Her story is told by dozens of people whose lives crossed hers (and her patrician family’s). Stein does not elicit conventional answers to conventional questions, in the manner of Studs Terkel or Oriana Fallaci; instead she acts like a camera, unflinching, mutely watching and listening as people talk. There is no authorial intervention, seemingly no point of view. In time, the lack of affect becomes the affect. The book is a flat yet sneakily rich portrait of squandered American privilege and the cult of celebrity. It’s an act of dissection. An X-ray. A masterpiece.
Stein was not a one-hit wonder. She worked at The Paris Review (where she interviewed William Faulkner), Esquire, and the literary quarterly Grand Street. She produced another oral history, American Journey: The Times of Robert F. Kennedy, and West of Eden, a study of the influences of Hollywood, oil exploration, and real estate on the city of Los Angeles. Stein was shy by nature but she threw glittering parties, including one at which Norman Mailer and Gore Vidal got into a fistfight. She was an unobtrusive but brilliant interviewer. Of the technique behind Edie, she once said, “Each person is speaking directly to you…Nobody is ever telling you, the reader, what to think.”
The news that Denis Johnson had died on May 24 at 67 sent me back to two pieces of writing. The first was Johnson’s masterly short story, “Car Crash While Hitchhiking,” from his 1992 collection about drug-addled drifters and losers, Jesus’ Son. Like all great fiction, “Car Crash” conjures a world that’s unlike any other and yet instantly, even shockingly, familiar. Words pop out of nowhere and ambush the reader. It’s the story of a lone hitchhiker stuck in a downpour who gets a lift from a young couple. As the hitchhiker dozes in the back seat with the couple’s baby, the car is involved in a ghastly crash on a rain-slicked bridge. Clutching the baby, the hitchhiker staggers from the wreckage and is taken to a hospital, where this unforgettable scene unfolds:
Down the hall came the wife. She was glorious, burning. She didn’t know yet that her husband was dead. We knew. That’s what gave her such power over us. The doctor took her into a room with a desk at the end of the hall, and from under the closed door a slab of brilliance radiated, as if by some stupendous process, diamonds were being incinerated in there. What a pair of lungs! She shrieked as I imagined an eagle would shriek. It felt wonderful to be alive to hear it! I’ve gone looking for that feeling everywhere.
The second piece of writing was Geoff Dyer’s review of Johnson’s National Book Award-winning novel, Tree of Smoke. Dyer makes the point that nothing in Johnson’s earlier output, not even Jesus’ Son, had prepared readers for this teeming, meandering mind-fuck of a novel about America’s misadventures in Southeast Asia. Dyer compares Johnson to Don DeLillo, Robert Stone, Joseph Conrad and, of course, Graham Greene. Far more astutely, he calls Johnson “a junkyard angel,” a writer who, “at some level, did not know how to write at all—and yet knew exactly what he was doing.” I can’t imagine more apt, or higher, praise.
Three days after Johnson’s death, Gregg Allman died at 69. If Bob Dylan is worthy of a Nobel Prize in literature, then Allman, the keyboardist and lead songwriter for The Allman Brothers Band, surely merits inclusion in a list of noteworthy literary obituaries. He wrote many of the band’s signature songs, including “Whipping Post,” “Midnight Rider,” and “Melissa.” Some of his song lyrics rise to the level of art, including these from “Ain’t Wastin’ Time No More,” written shortly after his beloved big brother, Duane, the band’s lead guitarist, died in a motorcycle crash in Macon, Georgia:
Last Sunday morning, the sunshine felt like rain.
Week before, they all seemed the same.
With the help of God and true friends, I come to realize
I still had two strong legs, and even wings to fly.
And oh, I ain’t wastin’ time no more
‘Cause time goes by like hurricanes, and faster things.
The news of Gregg Allman’s death, like the news of Johnson’s, sent me back to a piece of writing—in this case, “Hitting the Note with the Allman Brothers Band,” Grover Lewis’s Rolling Stone chronicle of being embedded on tour with the band in 1971, shortly before Duane’s death. It was a deep-pore examination of life on the road with a big-name rock band, a string of identical days and nights full of “pure listless boredom” and plane flights and concerts and groupies and TV and piles of comic books and cocaine.
Despite the grind of the road, Gregg Allman’s life did not lack for color. He avoided fighting in Vietnam by getting drunk and shooting himself in the foot. He had a long solo career. He married, recorded with, and divorced Cher. (She was the third of his six wives.) He contracted hepatitis and arthritis. He got a liver transplant. Late in life he wrote a memoir, My Cross to Bear, with Alan Light. As a writer, Allman may not be in a league with Patti Smith, but the book has its moments, including a line that would have made an unbeatable epitaph: “If I fell over dead right now, I have led some kind of life.”
If you favor writers who live long, colorful, implausible lives, Clancy Sigal, who died on July 16 at 90, is your man. Sigal’s resume reads like overcooked fiction: he plotted to assassinate Hermann Göring at the Nuremberg war crimes trials; he was Humphrey Bogart’s Hollywood agent; he was noteworthy enough to make the anti-Communist blacklist; he had to dodge FBI agents; he worked with the Student Nonviolent Coordinating Committee; he was Doris Lessing’s lover (and the model for Saul Green in her 1962 novel, The Golden Notebook); he underwent therapy and dropped acid with the anti-psychiatrist R.D. Laing; he organized Detroit autoworkers; he was a popular commentator on the BBC. Somehow, Sigal also found time to write, producing essays, novels, memoirs, and the screenplay for the 2002 Salma Hayek movie, Frida. His best-known book was 1961’s Going Away: A Report, A Memoir, an autobiographical account of a blacklisted Hollywood agent’s picaresque cross-country trip aboard a DeSoto convertible, during which the hero discovers a fractured nation and his own fractured self. It was seen as a rebuttal to Jack Kerouac’s effervescence, and it became a finalist for the National Book Award. The critic John Leonard offered this praise: “It was as if On the Road had been written by somebody with brains.” Sigal never stopped working. He was busy blogging a couple of days before he died.
Dick Gregory didn’t hector or lecture about America’s racial divide but went at it sideways, with a dagger instead of a sledgehammer. Classic early Dick Gregory has him going into a restaurant in the segregated South, where the waitress informs him: “I’m sorry, we don’t serve colored people here.” To which he replies: “That’s all right. I don’t eat colored people nowhere. Just bring me a whole fried chicken.”
Gregory, who died on Aug. 19 at 84, wrote a dozen books, and his 1964 autobiography, nigger, was built on this strategy for neutering an epithet through frank exposure and overuse: “I said, let’s pull it out of the closet, let’s lay it out there, let’s deal with it, let’s dissect it. It should never be called ‘the N-word.’ You see, how do you talk about a swastika by using another term?”
Gregory was soon on the front lines of the civil rights movement, which led to beatings, a dozen arrests, and a gunshot wound. Other issues that inspired his activism included the Vietnam War, police brutality, the Equal Rights Amendment, South African apartheid, and the rights of Native Americans. Sometimes he flirted with the bizarre, speculating that “whoever the people are who control the system” were behind the killings of President John F. Kennedy, Martin Luther King, Jr., and John Lennon, as well as the crack cocaine epidemic and the 9/11 terrorist attacks. Then again, there are more than a few people who don’t find anything bizarre about such suspicions. Gregory famously embraced various diet fads, and he ran (unsuccessfully) for mayor of Chicago and president of the United States. At the end, he was still able to laugh. “Here’s how you can tell when you’re getting old,” he said late in life. “When someone compliments you on those beautiful alligator shoes you’re wearing—and you’re barefoot.”
Kate Millett’s polemical bombshell, Sexual Politics, burst on the scene in 1970. A portrait of Millett by Alice Neel soon graced the cover of Time magazine, which was then the gold standard of a writer’s anointment as Truly Important. Sexual Politics began as a doctoral thesis, and it used literary criticism and historical analysis to dismantle such supposed avatars of sexual liberation as Henry Miller, D.H. Lawrence, Jean Genet, and Norman Mailer. Millett, who died on Sept. 6 at 82, portrayed such men as cogs in a masculine machine designed to establish and perpetuate the inferior status of women. Patriarchy, Sigmund Freud’s theory of penis envy, the nuclear family—all, in Millett’s view, led to the “interior colonization” of women.
The book, out of print for many years, was reissued in a new edition last year—just in time for the avalanche of revelations of sexual misconduct that have borne out Millett’s original premise. The machine, as we seem to learn anew every day, was indeed set up to ensure the inferior status of women. It ran—until now—on women’s enforced silence. Nearly half a century after the original publication of Sexual Politics, the silence is finally being broken.
Lillian Ross, who died on Sept. 20 at 99, was the fly who came off the wall—with disastrous consequences. In a celebrated six-decade career as a staff writer at The New Yorker, Ross followed this reporter’s dictum: “Do not call attention to yourself.” Her unobtrusive interviewing techniques resulted in a tall stack of superb journalism, on subjects ranging from Ernest Hemingway to a group of rural Indiana high schoolers’ first trip to New York City. Some believe that the best book ever written about Hollywood was Ross’s Picture, from her New Yorker articles about John Huston’s tortured effort to bring Stephen Crane’s Civil War novel, The Red Badge of Courage, to the screen.
But in 1998, the fly on the wall did something out of character: she called attention to herself by publishing a memoir, Here but Not Here, which revealed her 50-year love affair with the late William Shawn, the married editor of The New Yorker, whose widow and children were still alive. Many in the New York literary tribe were incensed. Charles McGrath, then editor of The New York Times Book Review, dissed the book as “a tactless example of the current avidity for tell-all confessions.” Jeremy Bernstein, a 31-year veteran of The New Yorker, called it “a deeply hurtful, self-indulgent, tasteless book that never should have been written at all.” Ross claimed to be mystified by the uproar. As she told the gossip columnist Liz Smith: “The controversy doesn’t make any sense to me.”
Jim Clark may not be a household name, but for more than four decades, as a student, teacher, editor, then director of the MFA program in creative writing at the University of North Carolina at Greensboro, Clark was an outsize influence on generations of writers. He carried a torch passed down by the school’s earlier writing teachers—Allen Tate and his wife Caroline Gordon, Randall Jarrell, Peter Taylor, Fred Chappell, Bob Watson and, now, Michael Parker and Terry Kennedy, among many others. The word “generous” keeps popping up when people remember Clark, who died on Oct. 30 at 72. I experienced that generosity firsthand when Clark, who was also an ordained minister, helped me put together an essay about Greensboro’s peculiar allure for writers. Clark pointed me to a quote by Jarrell, who called the town “Sleeping Beauty,” adding that “Greensboro leaves one alone just wonderfully.” I join hundreds of writers in saying, “Thank you, Jim. Rest in peace.”
William H. Gass, who died on Dec. 6 at 93, is regarded by many as a father of postmodern writing (unless you think the title belongs to Miguel de Cervantes for that house of mirrors called Don Quixote). Gass, after all, coined the word “metafiction” for his favored ploy of inserting a character known as William H. Gass into fiction written by William H. Gass. But I think Gass should be remembered for four very different reasons. First, he believed sentences were sacred objects and every one should be as perfect as the writer can possibly make it. Second, while he will be remembered for his novels, especially The Tunnel, and his short stories, I’m partial to his essays, on everything from suicide to Malcolm Lowry’s epic (and suicidal) drinking, which are the work of a brilliant mind that wears its erudition lightly. Third, Gass was a metaphor machine; he said the things came at him in “squadrons.” Of the insane he wrote that “their thoughts are open razors, their eyes go off like guns.” Metal threads, he wrote, were “glinting like those gay gold loops which close the coat of a grenadier.” And fourth, in our careerist, prize-drunk age, Gass had a refreshing disdain for literary awards, even as many were bestowed on him. “The Pulitzer Prize in fiction,” he wrote, “takes dead aim at mediocrity and almost never misses.”
My father was working as a reporter at The Washington Post in 1952 when the paper hired its first black reporter, a Baltimore native named Simeon Booker. But Booker lasted just two years at The Post, becoming frustrated by the limited assignments from his white editors in the nation’s rigidly segregated capital. He yearned to write about the black experience in America, and so he started contributing to the weekly Jet and the monthly Ebony, both aimed at black readers. Booker’s timing was superb. Over the next six decades, he covered many of the defining stories of the 20th century, including the brutal murder of the black teenager Emmett Till and the acquittal of his white killers, the Montgomery bus boycott, the Freedom Rides, the Bloody Sunday melee on the Pettus Bridge. He also wrote about politicians, celebrities, and ordinary people.
Booker, who died on Dec. 10 at 99, found time to produce books in his long and decorated life, including Black Man’s America (1964) and Shocking the Conscience: A Reporter’s Account of the Civil Rights Movement. While there were many courageous and talented reporters, black and white, covering the civil rights movement (see Gene Roberts and Hank Klibanoff’s fine book, The Race Beat, or the memoir Beware of Limbo Dancers by Roy Reed, a New York Times reporter who also died on Dec. 10, at 87), Booker seemed to get there first, and he had access, guts, and drive that few rivals could match. And his words carried major weight. One long-time reader said she and others eagerly awaited Booker’s dispatches in Jet and Ebony, which they regarded as nothing less than “the gospel according to Simeon.”
Other notables who left us this year, in alphabetical order:
John Ashbery, 90, was a giant of American letters, an inimitable poet who was often imitated but never equaled. He was also an insightful art critic, and in 1976 he became the only writer to win the Pulitzer Prize, the National Book Award, and the National Book Critics Circle Award in the same year, for his collection Self-Portrait in a Convex Mirror.
William Peter Blatty, 89, author of the 1971 horror novel, The Exorcist, which sold 13 million copies. Blatty won the Academy Award for adapted screenplay two years later for the movie version of the book, which shattered box office records thanks to its ingenious use of projectile pea-soup vomiting and a girl with a spinning head.
J.P. Donleavy, 91, whose bawdy 1955 novel The Ginger Man was banned and burned before it became a contemporary classic, with 45 million copies in print. Donleavy, who lived for many years in Ireland and was an accomplished painter, had this to say about old age: “It’s not nice, but take comfort that you won’t stay that way forever.”
Paula Fox, 93, was dubbed one of America’s “least appreciated” novelists by The Nation, but she received some overdue recognition in 1999, when Jonathan Franzen wrote an introduction to a popular reissue of Fox’s signature novel, Desperate Characters.
Nancy Friday, 84, author of the bestsellers My Secret Garden and Forbidden Flowers, built her writing career on the earth-shattering premise that women have sexual fantasies. To the dismay of many feminists, Friday argued that it was by ridding themselves of shame that women could achieve professional, political, and economic equality with men. Some of Friday’s ideas have held up better than others. In 1996, appearing on Politically Incorrect with Bill Maher, she dismissed the importance of on-the-job sexual harassment. “The workplace,” she said, “is the meeting and mating place.” Try telling that to Salma Hayek.
Sue Grafton, 77, didn’t quite make it to Z. Her so-called alphabet novels, featuring the private eye Kinsey Millhone, began with 1982’s A Is for Alibi and reached Y Is for Yesterday last summer. Grafton, whose influences ranged from Nancy Drew to Mickey Spillane, was at work on Z Is for Zero at the time of her death.
Clifford Irving, 87, who became a millionaire, briefly, but then went to prison when his early 1970s book, The Autobiography of Howard Hughes, was blocked from publication after it was proven to be one of the most sublime literary hoaxes of the 20th century.
Robert M. Pirsig, 88, who captured the schizoid zeitgeist of the 1970s with his novel Zen and the Art of Motorcycle Maintenance, which sold millions of copies and remained on bestseller lists for a decade.
Sam Shepard was that rarest thing: a Pulitzer Prize-winning playwright—and an accomplished memoirist, musician, screenwriter, and songwriter—who became an Oscar-nominated, heart-throb movie star. His posthumous final work, Spy of the First Person, is narrated by a man suffering from a degenerative disorder much like the Lou Gehrig’s disease that killed Shepard at age 73.
Robert Silvers, 87, was a founding editor of The New York Review of Books in 1963, and he spent the rest of his life shaping it into one of America’s most influential literary publications. The self-effacing Silvers had this to say about the editor’s role: “The one thing he should avoid is taking credit. It’s the writer that counts.”
Richard Wilbur, 96, was a poet, translator, and opera lyricist who won two Pulitzer Prizes and a National Book Award for his meticulous, unshowy poetry. In 1988 he succeeded Robert Penn Warren as the nation’s poet laureate.
Yevgeny Yevtushenko, 83, was the un-Richard Wilbur, a Russian whose showy, defiant poems and theatrical delivery turned him into poetry’s version of an international rock star. Stalinism and other forms of totalitarianism were early targets, though some grumbled that the Soviet government tolerated him while sending other dissidents to Siberia. Some went so far as to call Yevtushenko a sellout. The exiled poet Joseph Brodsky said of him, “He throws stones only in directions that are officially sanctioned and approved.” Millions of fans worldwide disagreed.
A literary controversy (or what passes for controversy in our fairly tame circle) erupted last month when the Pulitzer Prize Board elected not to award a Pulitzer Prize for a work of fiction, the first time it had done so since 1977. This can happen because of the way the Board’s selection process works: three initial readers — this year they were the novelist Michael Cunningham and the critics Susan Larson and Maureen Corrigan — pore over several hundred books published in the previous year and settle on three finalists. They then turn this list over to the twenty members of the Board, eighteen of whom have voting power (who knows why the board includes two members who can’t vote), to pick one. A majority vote of the Board is required to select a winner. This year, no book could command a majority.
The three books nominated were: Swamplandia!, the second book by my friend Karen Russell, a garrulous oddball romp that forays into satire and surrealism; Train Dreams, by Denis Johnson, a decorated luminary on his way to becoming an old guard figure as our village elders like Vonnegut and Updike are vacating their positions; and The Pale King, the unfinished last novel of David Foster Wallace, the most energizing, polarizing, and influential literary voice of our generation, his reputation as a genius now safely beatified by his suicide.
Apparently none of these three books was liked enough by ten members of the Board, and so none was awarded the most prestigious literary prize in America this year. “There’s always going to be dissatisfaction, frustration,” said Sig Gissler, the administrator of the Pulitzer Prizes, regarding the indecision. “But [this year] the board deliberated in good faith to reach a decision — just no book got the majority vote.”
When the unusual and disappointing decision was announced, the reaction among the literati—writers, I suppose, and critics, and a vast rearguard of booksellers, bloggers, and book geeks on Twitter who have greatly expanded and diversified the circle of conversation in recent years — was like the moment in the courtroom drama when the unassuming girl on the witness stand calmly says something that suddenly changes everything, and the room bursts all at once into a frenzy of barely contained whispers. What’s more, the Pulitzer Prize Board was pissing on a parade that already felt drenched. Just a few days before, the hobbits of the publishing industry had been dismayed when the Justice Department sued three major publishers over e-book pricing, siding with Amazon like Saruman sided with Sauron, whose ominous red eye sweeps across the land from his Dark Tower in that northwestern Mordor, Seattle.
Ann Patchett, a novelist who published a prize-eligible book last year (State of Wonder, a novel as magnificent as her other masterpiece, Bel Canto) and now also a bookseller, having recently opened an independent bookstore in Nashville (so she’s got two horses in this race), lambasted the Pulitzer Board’s non-decision in a widely read op-ed piece in The New York Times. “If I feel disappointment as a writer and indignation as a reader, I manage to get all the way to rage as a bookseller,” she writes. She argues that the bestowal of a Pulitzer Prize has the power to get people excited about a particular book and about books in general, and that under the shadow of our current zeitgeist, it’s a bad time to put down literature. “What I am sure of,” she writes, “is this: Most readers hearing the news will not assume it was a deadlock. They’ll just figure it was a bum year for fiction.”
Patchett’s piece is heartfelt and impassioned, and in some respects I agree with her — but what this controversy mostly did was remind me of how fundamentally I dislike the whole idea of literary prizes. I believe with all my soul that the concept of a board of twenty journalists (or people of any profession, for that matter; it doesn’t really make a difference who they are) awarding a prize to a work of art, putting an official stamp of approval on one book and thus by implication saying the other books published that year aren’t as good, is misguided, shortsighted, and dumb.
I’m not saying this in a sour-grapes way, as a novelist who also wrote an eligible book that was published last year. If I were awarded the Pulitzer, it’s not like I’d fling it in their faces. Obviously I would kiss their feet with gratitude. I have benefited greatly from a literary prize, the Bard Fiction Prize, for which I am hugely grateful, and was nominated for a couple of others, the Dylan Thomas Prize in the UK and the Young Lions Fiction Prize here (which Karen Russell did win, by the way). These prizes can help writers out tremendously, especially early in their careers, giving them prestige, publicity, and money, and for that, they’re a good thing. But this isn’t about me — I’m making this argument not as a writer, but from a more abstract standpoint, from a big-picture view.
Last year Slate reran a shrewdly observant n+1 piece by Chad Harbach (whose roaringly hyped novel, The Art of Fielding, also came out last year) titled “MFA vs. NYC,” with a subtitle that pretty much spells it out: “America now has two distinct literary cultures. Which one will last?” I found the piece spot-on in its observation that our literary culture is sharply bifurcated into two contingents, one concentrated in the publishing mecca of New York City, and the other scattered far and wide across the land at various colleges and universities. Harbach is sharply critical of MFA programs, essentially making all the usual arguments against them and coming down on the side of NYC. After I got an MFA at the ur-program, the Iowa Writers’ Workshop, I moved to New York City, because I figured that’s where writers go, and I’ve lived there for the last few years. So I feel I’m in a commodious place from which to observe these two literary cultures, and I must say, though both the insular little MFA world and the New York City world of literary culture come with their own and different forms of attendant bullshit, there is far, far — and I mean far — more bullshit in NYC.
The difference between the two cultures becomes most profoundly evident when you contrast the books that get talked about at the bar over after-class or after-work drinks, respectively. Many of the books I fell in love with, books that altered the course of my writing and changed what I thought could be done with literature, came as recommendations from friends in the MFA program. We would excitedly talk about what we had been reading lately, or great books we had read before — it was a conversation that was happening constantly and everywhere. A quick list of things I discovered in grad school from my friends’ recommendations that hugely affected me would include the philosophy of Antonin Artaud, the poetry of Paul Celan, Flann O’Brien’s At Swim-Two-Birds, J.P. Donleavy’s The Ginger Man, Joe Wenderoth’s Letters to Wendy’s, the stories of Mavis Gallant, Thomas Bernhard’s The Loser. And I dashed out that list in part to illustrate that we were not exactly shrieking and hyperventilating about the brand-new hot young rising stars of American fiction. (Well, some of us were, but I wasn’t one of them. And indeed in retrospect I notice how most of what I just listed were the recommendations of my poet friends, by necessity bound for academia, if they were lucky, and not for the networky New York literary scene.) Of course, we wanted lustily to be those hot young rising stars of American fiction soon. But when we talked about books, we would pull out the interesting and unusual jewels of our collections the way a music geek will pull out a rare LP in a plastic sleeve. We didn’t really give a shit about what book won what prize, or whether such-and-such really “deserved” to win the Pulitzer. Those are the kinds of gossipy, facile book conversations you have in New York, where everything is in some way tainted with commerce. Ours were the conversations of collectors, enthusiasts, purists, of people genuinely interested in the art itself, and I miss them.
All that is by way of suggesting that literary prizes are mainly manifestations and obsessions of that buzzy New York literati hive, which can become less of a hive and more of an echo chamber. It’s an observable phenomenon: a book comes out and, for whatever reason, gathers a tsunami of critical praise that perpetuates itself — for by the time the great wave makes landfall, some critics may be hesitant to disagree with their peers, timorously fearing that they’re missing something everyone else can see (Naked Emperor syndrome), or, what’s more probable, their perception has been primed by the power of suggestion, in the same way we are more likely to declare a fine wine magnifique if we know before tasting it that the bottle cost a hundred dollars than if it cost ten. This is why sometimes quite mediocre books wind up vaunted with widespread and lavish praise, and are sometimes even buoyed all the way up to the Pulitzer. But mediocre books getting overpraised does not seriously bother me, as I would rather let ten guilty men go free than hang one innocent — it irritates me far more when truly great books are ignored, which happens all the time.
A book has a vertical life and a horizontal one. The vertical life is what happens to it up to, during, and very soon after its publication; the horizontal life is what happens as the years and decades and even centuries slide by. As the Pulitzer is awarded to a work of fiction published in the previous year, all it can take stock of is a book’s vertical life, which can sometimes be deceiving. I’m sure this helps explain some of the more embarrassing retrospective head-slaps in the Pulitzer’s history, such as when, in 1930, it awarded the prize to Oliver La Farge’s Laughing Boy — a second-rate and now utterly forgotten book by an utterly forgotten writer — for the year in which both Hemingway’s A Farewell to Arms and Faulkner’s The Sound and the Fury were published. It’s perfectly natural that they would make that mistake; back then, Faulkner and Hemingway were not yet Faulkner and Hemingway; they were just a couple of young writers who happened to be named Faulkner and Hemingway. The Pulitzer Board would try to atone for its sin years later by awarding them both prizes (Faulkner twice) for far lesser works, after their reputations were already secure. The hype of the moment does not necessarily translate into lasting luminance. Just scroll down the list of all the past winners of the prize and count how many you’ve ever heard of. Start at the bottom and move upward chronologically, and you’ll find that the occurrence of familiar names increases as we move closer to the present. This is not because the Pulitzer Board has gradually been growing wiser — it’s because we’re living now, not a hundred years in the future. Then we’ll see. We can’t help it — we’re blinded by our own times; all prizes are like that, and that is why, as a measure of what is good and what is not in art, they are not exactly the trustworthiest oracles.
Also, a twenty-member prize board may be seducible by groupthink. I trust groupthink more when we’re talking about the long and justice-bending arc of history, not twenty journalists (eighteen of whom have voting power) talking about fiction, which is not even their forte. Come to think of it, why have we been letting a roomful of people who don’t necessarily know anything about literature tell us what the best book of fiction was last year, year after year? Why didn’t they just let the fiction jury (Michael Cunningham, Maureen Corrigan, and Susan Larson) pick it? I would be more interested to hear their opinions on the matter, anyway. (The 2012 board did include one — exactly one — fiction writer, past winner Junot Díaz. The only other person on the board I’d heard of was New York Times columnist Thomas L. Friedman, who I’m sure is a wonderful man, but the dude writes like a clown honks a bicycle horn.)
Let me tell you a story about the problem with a group of about that many people locked in a room trying to come to a decision about a work of art, fiction specifically. The stakes here are much smaller, but the phenomenon, I believe, is similar. For a short time I was a submissions reader for a fairly well-known, medium-cachet literary review. There were usually about ten to fifteen of us around the editorial meeting table. Each of us would read through the slush pile and select a few stories we liked; then the boss would Xerox the top stories for everyone, we’d all go home and read them, pick out our favorites among those, and at the next meeting discuss which stories to put in the issue. After all our arguing and deliberation, the pieces that wound up being selected for publication were usually not the most interesting, nor what I thought were the best of what we had to choose from. They were the pretty good pieces that we could all compromise on. Because a truly great and interesting work of art will have both its loving defenders and its outraged detractors, such a work is intrinsically less likely to be selected for honor by a large committee. That is the nature of good art: it provokes. I agree with Churchill that democracy is the worst form of government except all those others that have been tried from time to time, but not when it comes to lionizing certain novels over others. That I prefer to do on my own, thank you very much.
Historically, this obsession with prizes — and its grandchild, the micro-hysteria over those “best-of” lists that seasonally return to stipple the hills like dandelions — seems to be an impulse particularly characteristic of the twentieth century and beyond: the first Nobel Prize in Literature went in 1901 to the great Sully Prudhomme (what, you’ve never heard of him?), the first Pulitzer Prize for Fiction in 1918 to Ernest Poole for His Family, the first National Book Award in 1950 to Nelson Algren for The Man with the Golden Arm, the first National Book Critics Circle Award in 1975 to E.L. Doctorow for Ragtime, and the first PEN/Faulkner in 1981 to Walter Abish for his How German Is It. I’d say the only one of those that’s still well remembered today is E.L. Doctorow’s Ragtime (although I happen to have read Nelson Algren’s The Man with the Golden Arm — it’s pretty good).
However, there’s also an argument that this misguided impulse is not so much a modern one as an inherently human one (and we have plenty of those): at the ancient Greek festivals, prizes were given out to the best plays, just as they were for the more objectively measurable outcomes of athletic contests. And the phenomenon of critics failing to recognize what history would later find greatness in was in evidence even then: angered and confused by the way he broke the conventions of Greek drama, the judges snubbed Euripides.
The next-to-next-to-last time the Pulitzer Board chose not to award a prize at all was in 1974, when all three of the fiction jurors recommended Thomas Pynchon’s Gravity’s Rainbow and the Board categorically rejected it. Considering what a rambunctious, rebellious book it is, and considering the long life it has since enjoyed as both a cult classic and a classic, a necessary item on the bookshelf of every druggy collegiate pseudo-intellectual on his way (or not) to becoming an intellectual, fiercely hated by many and by many fiercely loved (and both parties have their points), it is so fitting that that book, of all books, would be bestowed this negative honor; if anything, it’s an enduring badge of coffee-shop cool, and it well deserves it. Of course Gravity’s Rainbow can’t win a Pulitzer. It would be like a punk band winning a Grammy.
Here’s a question. Imagine Satan were to appear in a sulfurous cloud as the host of some Faustian game show, on which the contestants, who are artists at inchoate and uncertain stages of their careers, are forced to confront interesting spiritual dilemmas. Old Scratch says to the Young Writer: I offer you a choice between two fates. In the first, he says — and this seductive vision appears in an orb of smoky light hovering above his outstretched claw — your books are met with blazing success. Every critic fawningly gushes over your work. You’re heralded as a genius. You’re interviewed on TV and on widely syndicated NPR programs; your phone won’t stop ringing with interview requests. Packed houses at every reading you give. The New York Times Best-Seller List. The money rolls in; you easily clear your outrageous advances. You win the National Book Award, you win the National Book Critics Circle Award, you win the PEN/Faulkner, you win the Orange Prize if you’re a woman, you win the Pulitzer. The movies based on your books hit the screens with famous actors and actresses playing your characters, and everyone says the books were so much better. This is your life. But! — and the vision vanishes — know this: after you die, after your life of literary celebrity, interest in your work will fade. None of the shadows you made will stick to the cave walls because, in the end, none of the cave-dwellers was moved to chalk its outline when it was there. Over time, the world will forget you. Or, behind door number two… The world, if it ever knew you, will forget you in your own lifetime, and you will die in obscurity, uncelebrated, unfulfilled, destitute, and bitter. But! — in the years following your death, your work will be rediscovered, and one of your books in particular will become a classic that lives on for many generations and forever changes the landscape of our collective imagination. In other words, you’ll be Herman Melville.
Now, both of these are rare and lucky fates. If the variables were at all uncertain — if in the first case there was a chance your work would be remembered, and in the second there was a chance you’d remain forgotten — it would be a much harder decision. But I’d like to think that any artist who is truly interested in art would choose the second option in a heartbeat. I know I would, and I’m not too humble to say so. It’s the first option, not the second, that’s the Faustian bargain: heaven on earth, hell for dessert.
The reason a real artist would choose the second option over the first has nothing to do with any inner nobility — far from it; in fact each fantasy springs from the same megalomaniacal, insatiable hunger. (It’s no coincidence that Hitler was a failed painter and Franco a failed poet. The heart of an artist beats wild and greedy in the chest of every despot. It’s the very same source of energy that produces both.) It is because, while worldly recognition may be an object of lust, immortality is an object of love. I once read these sentences in Plato’s Symposium and was so amazed by their truth that I’ve never forgotten them: “the soul has its offspring as well as the body. Laws, inventions and noble deeds, which spring from love of fame, have for their motive the same passion for immortality. The lover seeks a beautiful soul in order to generate therein offspring which shall live for ever.”
This is why, for any artist, dying in obscurity is among the worst nightmares. If I had a time machine, I would visit Herman Melville at his deathbed and tell him the good news from the future, so he might go into that good night with some sense of satisfaction. But on second thought, why wait until the very end? I’d go further back and tell him sooner, give him something to help him through those nineteen years he spent growing old as a customs inspector, his public literary career long dead in the water after the critics of his day shouted him out of town as a crackpot, though he was still returning home every night to quietly scribble out poetry and a novella that would be published many years posthumously as Billy Budd. On third thought, seeing as he was in fact working on Billy Budd, and wasn’t so frustrated he’d completely given up writing, maybe somebody already told him. On fourth thought, maybe he didn’t need anyone to tell him, because he knew he was a genius and held out hope the world might one day see it.
All in all, I would urge readers not to pay too much attention to big prestigious literary prizes. In a perfect world, I would wish for every writer a magical bag of money that is never empty (to level the financial question) and simply do away with the prizes altogether: no Pulitzer Prize for Fiction, no National Book Award, no PEN/Faulkner, no Man Booker, no Nobel Prize in Literature. Let writers write, let critics have their say, let readers read, let time decide.
It doesn’t really matter, though. Even without the magic moneybags, and even with the swells of cacophonous hype surrounding all the literary prizes and all the literary darlings of any given moment, history will plod on, and the Ozymandias of now will be the half-sunk and shattered visage of later. F. Scott Fitzgerald, who never won a Pulitzer, will remain F. Scott Fitzgerald, and two-time Pulitzer Prize winner Booth Tarkington will remain Booth Tarkington. And anyway, I am absolutely certain there have been many writers the equal of Fitzgerald who, through their own bad luck or other people’s bad taste, were never published and never read, let alone given prizes, and it’s especially to these unknown soldiers of literature that I raise my glass. John Kennedy Toole killed himself believing he was doomed to be one of them, and he most certainly would have been, had his mother not accosted Walker Percy years later with his manuscript of A Confederacy of Dunces, which went on to win a twelve-years-posthumous Pulitzer Prize. It was a nice gesture.
A former student interviewed me recently for an audio journalism project about the dangers of undergraduates being online, a perennial topic for faculty, students, and administrators alike, not to mention for newspapers and cultural hand-wringers at large. Her project arose in response to recent articles about social networking being harmful to student health and emotional well-being, and her questions to me concerned the impact of Facebook and Twitter on classwork and grades. It’s an important conversation, but one that too often stops at the level of panic, so I tried to go in another direction, toward an aspect of digital culture that seems as important as bad grades and lost sleep.
When I teach, I jump from YouTube to Delicious to Google Docs on the classroom screen. I allow—even encourage—my students to use laptops in class, and I don’t worry if they’ve got Facebook in one tab while the day’s assigned reading is in another. They’re the ones getting graded, so I let them own the direction and degree of their attention along with the consequences. I let them work the way I work, or find their own way of working, without pretending these tools don’t exist. So it isn’t their use of Facebook that concerns me so much as who they’re using it with, and I don’t mean the predators and creeps and assorted bogeyfolk 20/20 warns us about every sweeps week. On the contrary, I worry the web might be keeping them safe but stunted, because it’s so easy to continue the relationships and conversations they had in high school and carry along to college an entrenched identity built when they were younger. As I told my student, what concerns me about her peers and herself being online is the loss of isolation, of separation, of the ability of 18-year-olds to live in their heads for a while. Or at least to own their own minds.
By the end of high school I’d had enough of classrooms and sitting still, and I had no interest in college. Sure, the only thing you take with you wherever you go is yourself and all that, but I needed some distance and some time to myself. So I set out with my backpack and passport and hostel card, full of romantic if well-trod notions. I went to Ireland because I’d read The Ginger Man and thought if I hung around Dublin I’d be embraced by a community of songwriters and poets and novelists and become one of those things myself. Never mind that J.P. Donleavy’s novel was set nearly half a century before my arrival. What I found instead was loneliness, isolation, the strain of starting up conversations with strangers and of making friends without the institutionalized encouragement of being stuck in school together all day. It was sink or swim because there was no option to stay in bed: hostel wardens kicked you out from morning until late afternoon. If you’ve read David Foster Wallace’s essay about steeling himself in his cruise ship cabin before venturing out to gather data for an article he was meant to be writing, then running back to his room to recover… let’s just say I could relate when I read it a few years later.
That grinding isolation shocked me, then shock became fascination and I suppose I kept traveling to test it like a loose tooth or sprained limb: to challenge myself to be bolder. There was no Facebook then, and mobile phones were rare and as big as our backpacks. Even if I’d had an email address there would have been no one to write. I had no choice but to decide who I wanted to be, without the crutch or cross of who I’d been, of who I’d been told I was and of who I’d become in response to a small school and small town and, perhaps most of all, a small sense of myself. I spent hours alone by necessity, whether hitching on a roadside or riding a bus or wandering the streets of new cities as alien to me as I was to them. And as alien as the inside of my head, until I learned the lay of it.
In The Guardian recently, Jennifer Egan said of her own teenage backpacking experience,
There was a kind of intensity to the isolation of travel at that time that’s completely gone now. You had to wait in line at a phone place, and then there weren’t even answering machines. That feeling of waiting in line, paying for the phone and then not only having no one answer, but not being able to leave a message so that they would never know you called. It’s hard to fathom what that disconnection felt like. But I’m actually very grateful for it. Because it was extreme. And that kind of extreme isolation showed me that I wanted to be a writer.
She goes on to describe the panic attacks and fear that isolation brought on, and as I read I remembered a day spent hiking in Cornwall: hours in the forest, then emerging into a village in front of a pub. There were families heads-down over maps, a cricket team warming up on the green, two old men in identical gum boots and waxed canvas coats stopped to chat; ordinary lives going on in everyday ways, in other words, and there was me, ragged and woolly and stepping out of the woods where no one knew me from the Beast of Bodmin. I could have turned around and disappeared into the trees without being noticed, like another anonymous hiker whose body had recently turned up on the moors and was all over the news.
Another time, in the Australian desert, a few days after I’d been shot at by drunk hunters who mistook my companions and me for overgrown possums, it occurred to me while hiking alone in dry heat and red sand that if I died there, if something happened, it would be a very long time until my family knew. I’d read something by Paul Theroux about a recurring nightmare of his children receiving postcards long after his death, because the places he’d traveled were so far away, and it stuck with me as simultaneously bleak and liberating (though now that I’m a father myself, and past my daughter’s colicky stage, it seems mostly bleak).
Postcards went in one direction, and offered so little room to say what I’d been doing that they were merely an indication I was still alive. Hostels didn’t yet offer terminals for popping online and updating your friends; you couldn’t upload your photos as fast as you took them, nor could you see what folks were doing at home. Even after lining up as Egan describes, phone calls were expensive and brief, as compressed as postcards, and sometimes there weren’t any phone booths at all. I doubt I’m making any of this sound positive, but here’s the thing: all those experiences, the good and the bad, the everyday and the inexplicably strange, were mine and mine alone.
There was no one to tell about the midnight train ride to Belfast when a brick smashed through the window over my head, after a group of boys had already been hauled off by armored Gardaí for lighting strings of firecrackers at the end of the carriage. About the nun whose body tangled with mine when the sound of “gunshots” made us dive to the floor, or my arrival in Belfast between a cab driver’s shooting and a retaliatory bomb the next morning. No one knew I’d been fired on by those Australian hunters while camped by a billabong, or had carried a stranger’s dog off a mountain in Colorado only for it to die in my arms at the bottom, within sight of the ranger station. About a party with an Australian MP that, had phones been smarter and had there been a place to post photos, would have gotten us both into trouble, and camping by a desert hot spring the night of an Aboriginal corroboree, then observing (for as long as I was allowed to) an outdoor meeting with a government minister the following day. Only I knew about watching and watching and watching an enormous brown snail crawl across my ground sheet when I woke by the Tasman Sea at sunrise, and standing with a German football team on a Galway street corner, each of us holding a lead-plugged axe handle, because a group of thugs had attacked and robbed the hostel warden the night before and were massing again up the street.
If I’d been able to share those things quickly, if I’d been able to tweet them or make them my status or even speak them to someone I knew, I might not have hung onto them. If I’d been able to upload my photo of an apparently ancient stone wall on Inishmore that contained, near the bottom and bearing weight, a rock mysteriously stenciled with a bright yellow pedestrian-crossing icon, and had someone commented with a quick explanation of why it was there, I might not be thinking about it almost two decades later. Instead it became a defining question for me, a question I’ve been trying to put into words ever since, in poems and stories and my failed first attempt at a novel (and probably the second and third novels, too, in some way). I might not have felt that “extreme isolation” Egan refers to, that uncrossable gulf between “home” and “away,” between “me” and “you,” and between “me now” and “me then,” for that matter. I wouldn’t have been forced by circumstance and separation to redefine myself in my own awkward terms rather than rest comfortably uncomfortable in who I’d already been before setting off to “find myself,” however much that cliché rankles and rots.
Maybe there’s something healthy in telling our stories immediately, in making experience as ephemeral as a bird’s own tweeting, so we can move on to something else. Maybe my students are better adjusted than I ever was, more aware of themselves because they’re able to share their lives quickly and widely and constantly. Some of my own closest, most meaningful friendships have grown out of blogging and being online; my sense of myself as a writer has come from engaging that wider world the internet offers as much as from engaging the world through more embodied travel—something I haven’t often had the luxury of after taking on college debts and degrees. So perhaps I’m projecting, and, as much as I want to deny it, all this is one more way of complaining that things were better when we were young, blah blah blah. Besides, the whole notion of leaving home at 18 is so ethnocentric, a global rarity arrogantly normalized by a relatively small, relatively affluent part of the world. It’s like the internet, that way.
I know, though, for better or worse, I wouldn’t be the writer and person I am if I’d been online earlier. There’s a guy I run into on the subway sometimes, someone I played soccer with in junior high, and twenty-five years later he still thinks I’m not me but another kid we played with. When I moved to town, there were a few weeks at the start of sixth grade when the whole team had me confused with that other kid, a few weeks in which I was not only the new kid but a different new kid, a double disorientation. And now every few months I hear this guy shouting that other kid’s name down the train car and I’m turned back into someone I haven’t been in a very long time and never was, really. So when I worry about my students being online, it’s because I imagine their moments of discovery and reinvention and risk derailed by Facebook comments from people who remember them as they weren’t and won’t let them forget it, tying them down before they lift off. That worries me more, in a way, than the possibility of drunken college hijinks preserved online haunting them as they look for jobs—something I suspect we’ll grow more forgiving of, as a culture, the more inevitable it becomes. I worry there’s less room to try on and cast off new selves, as people and artists alike, but maybe that’s only an issue for someone who always finds himself writing about isolation one way or another, and for whom the most terrifying thing ever seen on TV is that eBay ad asking, “What if nothing was ever forgotten?”
Still, even if we move home after college or never go to college at all, even if we stay close to old friends our whole lives, I can’t help but think there’s a value in removing ourselves from all that for a while and in stockpiling experiences no one else knows about. Whether those are big, dramatic experiences or small, quiet ones I doubt makes much difference; the important thing is they’re ours, only ours, and we’ll have them to return to forever—not by Googling to remind ourselves what we wrote when they happened, or by looking at photos (I lost most of those when I mailed home a box of unprocessed film that never arrived). Not by having them pass into the collective memory of the people who know us and the stories that get swapped around campfires and dining room tables for so long it stops mattering who they happened to, but by tucking them away like a tweet left untweeted, in some private corner where 140 characters of pithy distillation might grow into something much more.
Image credit: Pexels/Miray Bostancı.