It was inevitable, really. Mary Roach’s previous books have ostensibly been about sex, cadavers, the afterlife, and space travel, but each one has spent a fair amount of time on digestion, excretion, and highly topical fart jokes. In her latest book, Gulp: Adventures on the Alimentary Canal, Mary Roach finally explores all that happens after you take that bite -- not as an entertaining sideshow, but as the focus of 17 entertaining and occasionally bizarre chapters. Megacolons, stimulated saliva, death by constipation, and of course the scientists who study this stuff with aplomb (and an occasional apology): it is, as it were, all in there. I interviewed her while she was in Seattle for her book tour.

The Millions: Your past work has focused on bringing topics that are creepy or lurid or awe-inspiring back down to earth. This one's primarily about overcoming disgust. Did you find it challenging to make digestion and excretion palatable for your audience?

Mary Roach: Well, I think digestion is another lurid, taboo subject -- particularly from the navel down. But even what goes on in the mouth is an unthinkable, revolting thing that no one wants to think about. There was a sense that this was right up my stinky little alley.

TM: Packing for Mars came out the same year that NASA retired the Shuttles and the Opportunity rover broke records for the longest Mars mission. Stiff came out around the same time that shows about forensic science became the hot new thing…

MR: And Six Feet Under.

TM: Right. Was there something that made you decide: yes, 2013 is the time to publish a book about the digestive tract?

MR: You know, I think the whole obsession with food had hit an unprecedented level, in terms of people photographing every meal and posting it on Instagram. Food has moved so far away from just a way to stay alive and take in sustenance to a point where it’s now almost high culture.
So it makes sense to look past the borders of the lips and take a peek at what goes on after the food leaves the plate. On the tour, my publicist set up events with people like Chris Kimball and Ted Allen, which makes for a really interesting conversation. We tend to elevate food to a sensual, cultural place, but someone like Chris Kimball is looking at food as chemistry. And eating is biology.

TM: Two of the most memorable people in your book were William Beaumont, the physician who made a career out of his patient's stomach fistula, and Andries van der Bilt, the chewing scientist at Utrecht who will probably be the last researcher of his kind at the university. In other words, in the book you go from the early days of science virtuosos to very specialized people doing narrow and admittedly gross stuff. Is that how the field of gastro-intestinal science looks right now?

MR: It’s really hard to get funding for pure science just for the sake of figuring out how things work. It’s a lot easier to get funded if you have a practical application for things. Also, a lot of it’s been figured out. I’m not a scientist, so I’m going out on a limb here, but I do get the sense that the more you know, the more you realize you don’t know very much, and there isn’t an end point. So there’s always more to be done -- there’s just not as much funding for it anymore. It’s more science in the service of the food industry. But even in that world, there are characters like Erika Silletti, who are so awed and amazed by what they’re discovering. She has this infectious enthusiasm for...spit!

TM: You’re in the middle of your book tour now. What do you like to present at your events?

MR: I don’t do a lot of readings when I do talks. Some of them are billed as readings, but I wouldn’t call them that, really. I find that if you read non-fiction to people for more than a paragraph, they will glaze over. So I’ll read footnotes.
One of my favorite footnotes in Gulp is where I’m talking about per anum, which is through the anus, and per annum, which is yearly. I did a Google search for instances where people meant per annum but wrote per anum. How many childbirths occur per anum, that sort of thing.

TM: Do you find that people are more squeamish to ask you questions about the digestive tract than they were on your earlier book tours, when the topic was the afterlife or space?

MR: Oh no, no. Once you open the door for them, people want to know all kinds of things. Sometimes people want to know, “I have a mucoid plaque, can you tell me what to do,” and I have to say, “I don’t have an M.D., I can’t dispense medical advice -- nor do I really want to.” But people have great questions. The other day I was doing an author lunch at Google; people had a lot of questions about rectal smuggling.

TM: Early on in the book, you find out that your palate isn't developed enough for you to be on a professional olive oil tasting panel. Any other potential avenues of research that never worked out?

MR: I wanted to go to Food Valley when someone was actually involved in an experiment, but I couldn’t seem to time it right. There were a couple subjects, but most were not particularly interesting. There’s someone there with a tube hooked up in their mouth, and it squirts in something that they’re tasting, and they make a note of it. It’s not particularly scintillating, and it didn’t make it into the book. There’s a woman who studies pica (people who ingest non-food), and she works in Zanzibar with women who eat a variety of clays. I wanted to do that, partly because I really wanted to go to Zanzibar -- that really lies at the bottom of a lot of what I do, I really want to go there, what could I find there? -- but it felt a little...off unto itself. When you go all the way to Zanzibar, you want to spend a lot of time on that subject, not just a few paragraphs.
TM: When you first started writing, I think one of your pieces was about the IRS. Then you moved on to the afterlife, cadavers, space travel. What sort of writers made you think, I want to write about things that are kind of out there, or that people might not think about all that often?

MR: The writer I glommed on to, whose style I loved, was Bill Bryson, when he wrote The Lost Continent. This was before A Walk in the Woods, before he started going huge. I remember reading his books, and I admired his ability to combine information and humor and character. It was inspirational. Every now and then I put in a little homage to passages from writers like him that really got in my head -- inside jokes to myself. When I was writing Gulp, I had just read Patti Smith's Just Kids, and when she did something outrageous or pathetic, Mapplethorpe would say to her, “Patti, no.” When I was with Sue Langston, who’s a sensory analyst -- the nose woman -- I asked her, “If you had to choose between a Budweiser and an IPA right now, what would you choose?” and she said, “I would take the Bud,” and I said, “Sue, no.” That was my little homage to Mapplethorpe and that book.

TM: In 2011 you edited the Best American Science and Nature Writing, so I'm hoping you might pontificate about science writing in general for a second. When you write your books, or when you evaluate science writing, what are you looking for -- what should a good science piece accomplish?

MR: I don’t know if the genre of “science writing” should really exist. We don’t classify “religion writing” or “political writing” as an entirely different register of writing. There’s such a wide range -- some material simply explains, other pieces are strongly narrative. In The Immortal Life of Henrietta Lacks, you learn a lot about cell culturing, but it’s still primarily a story. For that collection in particular, I tried to do a mix. The Atul Gawande essay, I thought, was beautiful.
For someone in that community to stand up -- or I should say, sit down and write -- about how far the discussion about end-of-life care needs to go in order to really help people. And then the Oliver Sacks piece about prosopagnosia -- he’s just a wonderful thinker/writer, more reporting than recommending. For me, there are a lot of valid ways to write about science, and when I was judging that book it was really just what pulled me in and kept my interest and taught me something. There are so many ways to do that.
1.

The book review is dead. At the very least, it’s very obviously dying. Anyway, we can all agree that it should be killed off, because it’s gotten to be irrelevant. If not downright parasitic. (Though it might be salvaged if the average review were a little meaner.)

I exaggerate only slightly here. This past August, a pair of meta-critical essays by Dwight Garner in The New York Times and Jacob Silverman in Slate sent everyone who fancied him- or herself a critic — whether institutionalized or not — into a collective fit. It was probably the biggest literary-cultural dustup since the Great MFA Debate of 2010-2011, when Elif Batuman’s London Review of Books article, “Get a Real Degree,” made everyone feel just a little bit bad about the existence of MFA programs.

I found it hard to get terribly worked up about literary criticism’s emotional register. For every Laura Miller or Lev Grossman who has forsworn negative reviews, I know that there will be just as many qualifiers for the Hatchet Job of the Year Award to fulfill the angry review quota. For every purchased five-star review, there will be that lady on Goodreads who says that the only good thing about the new Junot Diaz novel is that it taught her the Spanish word sucio.

But enough about the State of the Art! I enjoyed all of these essays, but the one thing that struck me was that they were all essentially negative, in the sense that they set out to describe how things were going wrong or why things ought not to be the way that they are. What they didn’t do a very good job of was describing what criticism or book reviewing is, or what it should be. Okay, there were some nice, bold mission statements thrown in there. Here’s Dwight Garner: “What we need more of…are excellent and authoritative and punishing critics.” Agreed.
Or Daniel Mendelsohn, in the New Yorker: “the critic is someone who, when his knowledge, operated on by his taste in the presence of some new example of the genre he’s interested in...hungers to make sense of that new thing, to analyze it, interpret it, make it mean something.” Sounds great. Or Richard Brody, again in the New Yorker: “Criticism is the turning of the secondary (the critic’s judgment) into the primary.” Sure, why not.

So I think we can all agree that A) the “book review” is a prestigious class of writing that people aspire to write, and B) there is a continuum of, shall we say, critical perceptiveness — what in the pre-everyone-gets-a-trophy age we might call “value” or “quality” — on which the multiple-thousand-word, tightly-argued essays of the New York/London/L.A. Review of Books reside at one end, and the rapid reactions of John Q. Tumblr reside at the other. (By the way, I don’t want to suggest that there is something philosophically corrupt or intrinsically wrong about the latter, or that just because something is edited and not self-published, it is automatically better than a blog post. Advanced degrees, journalistic credentials, and/or getting published in hard copy are no guarantee that a book review is any good. See, for example, Janet Maslin’s misreading of This Bright River.)

But what should this excellence and interpretation and maybe a little bit of hard-headedness actually look like, in practice? Why has it been absent? And why does any discussion about literary criticism turn into a giant game of dodging the question, as if the concept of a book review were like the concept of pornography, in that you might not know how to define it, but you’d know it when you saw it?
In the interest of getting everyone on the same page (book pun!), I thought it would be an interesting exercise to dissect a book review, to pick apart what makes it work or not work, what makes some book reviews great and others — most of them, really — bland and unhelpful and immediately forgettable. Because book reviewing is a genre with its own conventions, just as every murder mystery must start with a body, and every epic fantasy must feature elvish words with too many apostrophes. It’s worth figuring out what those conventions are.

2.

In the beginning, there is ego. As George Orwell put it in his essay “Writers and Leviathan”: “One’s real reaction to a book, when one has a reaction at all, is usually ‘I like this book’ or ‘I don’t like it,’ and what follows is a rationalization.” The decision to like or not like a book is where every book review begins. This is what gives the genre its underlying suspense — will Michiko Kakutani like this book or won’t she? — but also its frustrating sense of chaos, because no matter how technically sound or philosophically sophisticated or beautiful a book might be, something minor or tangential can turn off a reviewer so much that he or she decides the book is not good.

A lot of book reviews never get past this first stage, and this is where the whole free-for-all of online reviewing can get frustrating. For instance, the Goodreads lady on Junot Diaz, or the people who unironically give one-star reviews to classic literature: all of these reviews consist entirely of the initial response and a subsequent explanation, and no self-reflection about whether there might be more to the book — and to the reviewer’s response — than that initial, emotional decision. If the nauseating chumminess of the publishing world is the Scylla of book criticism, then this kind of reviewerly narcissism is surely its Charybdis.
But hopefully, no matter how much reviewers are in love with themselves, they will at least step aside and say a few things about the book. In the case of fiction: its plot, its characters, some of the backstory, and the setting. In the case of nonfiction: the overall narrative or argument of the book, the author’s source material and expertise in the subject matter. This is the next stage in the evolution of a book review, and it is plain nuts-and-bolts stuff. But it’s so important to do readers this simple courtesy because, unlike an oil filter or a frying pan, the world of literature is so expansive and dependent on authorial decisions and whims that two books within a genre, or a sub-genre, or even a sub-sub-genre, may vary wildly in so many ways. Is the protagonist of this hard science-fiction story an astrobiologist on a generation ship or a detective on an asteroid base? And so on.

This is where things start to get complicated. The average paid reviewer gets one scant paragraph in Publishers Weekly, maybe four or six in your average major metropolitan daily, to appraise a book. And more often than not, they splurge on summary — often to the exclusion of everything else. So their concluding paragraphs tend to be a little overstuffed, as these recent examples show:

But finally, of course, this kind of rigidity exacts its own price, and Natalie can’t avoid paying. Each of the novel’s sections ends with a scene of violence, something Ms. Smith presents as inescapable in northwest London. Some characters die from it, others survive, but none are unscathed. What Ms. Smith offers in this absorbing novel is a study in the limits of freedom, the way family and class constrain the adult selves we make. In England, the margin for self-invention is notoriously smaller than it is in the U.S. — which is one reason why Ms. Smith, with NW, seems more than ever a great English novelist.
(Adam Kirsch, review of NW in The Wall Street Journal)

There are moments here and there in Telegraph Avenue where Mr. Chabon himself sounds as if he’s trying very hard “to sound like he was from the ’hood,” but for the most part he does such a graceful job of ventriloquism with his characters that the reader forgets they are fictional creations. His people become so real to us, their problems so palpably netted in the author’s buoyant, expressionistic prose, that the novel gradually becomes a genuinely immersive experience — something increasingly rare in our ADD age.

(Michiko Kakutani, review of Telegraph Avenue in The New York Times)

The Revised Fundamentals of Caregiving deals with sorrow and disability and all the things that can go wrong in life. But mostly Evison has given us a salty-sweet story about absorbing those hits and taking a risk to reach beyond them. What a great ride.

(Ellen Emry Heitzel, review of The Revised Fundamentals of Caregiving in The Seattle Times)

In other words, you can see where these reviews are trying to do too much with too little space. Trying to sum up the quality of the prose with a few abstract descriptors. Making a final plea for the cultural relevance of the book. Ending on a gnomic, life-affirming mantra. And all this in fewer than 100 words. The fact that these reviewers are reaching for something beyond what they have space to cover is, to me, a tacit admission that there is more to be done here; that saying “Here is the plot of the book, and here is a pile of adjectives to show how much I (dis)liked it” just barely scratches the surface of what book criticism can cover. But if you’ve already done all that and you still feel that readers ought to take away one more big idea — what do you do?

3.

Matt Taibbi hated The World is Flat and Hot, Flat, and Crowded. He hated their titles. He hated their premise. Hated their predictability, their utter lack of real insights, and most memorably of all, hated their language.
In his reviews of Thomas Friedman’s two books, Taibbi tracked dozens of bizarre proclamations and just plain bad writing, from his first confusion of herd animals with hunting animals to his last, triumphant-until-you-think-about-it graph of freedom vs. oil prices, which used four data points selected basically at random to make a point about the march of democracy across the globe. (“What can’t you argue, if that’s how you’re going to make your point??” wrote Taibbi, two question marks included.)

This might make Taibbi sound like a prescriptivist grump, a Grammar Nazi who just happened to find the one guy famous enough and bad enough at writing to deserve this kind of thrashing. Except that the reviews do more than that. It turns out that Friedman’s “anti-ear” is actually the most obvious symptom of a larger case of intellectual and moral fraud. In Friedman’s world, the rules of basic logic and historical causation do not exist; he invents new realities out of a few cherry-picked events and the limited frame of reference of a privileged, jet-setting columnist based out of New York City.

On the one hand, this entire review stems from an act that we all can perform: trying to gauge the quality of Friedman’s writing and thought. But Taibbi manages to do more than wag his finger at Friedman for writing poorly — he discovers something important and true that we didn’t know before, and more importantly, couldn’t know just by taking Friedman at his word. So Taibbi passes Daniel Mendelsohn’s “meaning” test, because we now know something new about Friedman’s book that we didn’t before. He certainly passes Dwight Garner’s bar for being both excellent and punishing. This is not simple aesthetic snobbery: it’s formal criticism that actually matters.

Then there is the big picture.
It’s hard to get much bigger than James Wood’s famous 2000 proclamation: “A genre is hardening.” In that essay, he identifies the “perpetual-motion machine that appears to have been embarrassed into velocity” that characterized novelists like Don DeLillo, Salman Rushdie, and mainly, Zadie Smith, whose then-new book White Teeth Wood was reviewing. These practitioners of “hysterical realism,” Wood argued, were to the novel what the van Eyck brothers were to medieval painting — artists who thought that conceptual virtuosity and an inexhaustible supply of detail substituted for a plausible, profound exploration of the human experience.

Instead of treating the text as a mirror for the writer’s psyche, this kind of criticism assumes that the novel in question is a mirror of some kind of shared worldview, brought on not just by the writer’s personal choices (of character, setting, plot, and so on) but also by the context in which he or she is writing. In the case of the hysterical realists, they are all so in love with their grand, underlying, and basically untrue idea — everything and everyone is interconnected in ambition and subject to the same fate — that they have to make their characters essentially inhuman to make their plots work.

But not everyone has to be present at the birth of a genre to do this sort of criticism. Rosecrans Baldwin discovered a trope that’s almost as old as the modern novel — the “distant-dog impulse” — from Tolstoy to Picoult. Evgeny Morozov tracked not only the intellectual vacuousness, but also the stylistic commonalities, imposed by the new line of TED Books. What’s going on here? Elif Batuman explains that all of these reviewers are looking for context in the morass of personal and artistic choices that go into every piece of writing:

Literature viewed in this way becomes a gigantic multifarious dream produced by a historical moment.
The role of the critic is then less to exhaustively explain any single work than to identify, in a group of works, a reflection of some conditioned aspect of reality.

Maybe it doesn’t sound great when reduced to a mission statement like this — in fact, I think it sounds vaguely totalitarian, especially when you consider that this sort of criticism is called “Marxist criticism” in academic circles. But in practice, it definitely works.

4.

So. Reaction. Summary. Aesthetic and historical appraisal: these are the four classical elements of literary criticism. To that I might add that it helps to be negative — of the twelve reviews I quote here, eight are at least moderately negative, and about five are relentlessly so.

That people are even having this conversation about the supposed niceness of book reviews is great: it shows that book reviews are anything but irrelevant. And now that we’ve teased out the ground rules of what can and should go into a book review, it’s time to turn you loose. You now have the tools to cut through the morass of literary criticism and decide for yourself not only if a book review is worthwhile, but why. You can critique the critics. You can be a meta-Michiko. Use this knowledge wisely.

As for me, I eagerly await the next big, invented crisis to strike the world of literature. I hope it involves deckled edges.
H.L. Mencken wrote that rubbernecking – that voyeuristic impulse to gawk at someone else’s difficulties – was “almost a complete treatise on American psychology.” The term perfectly describes the recent outpouring of interest in the industrial heartland of the United States, known as the Rust Belt, which has been in decline since the 1970s and which has suffered even more during the recession.

First came news stories about places like Dayton, Ohio, where unemployment has more than doubled since the beginning of 2008 thanks to the closing of several manufacturing plants, or Gary, Indiana, the home of the largest integrated steel mill in the northern hemisphere, where an average of one person a week is murdered and over a quarter of its residents live below the poverty line. Then came the literary interest: 2009 saw the publication of books such as Nick Reding’s Methland: The Death and Life of a Small American Town, Patrick Carr and Maria Kefalas’s Hollowing Out the Middle: The Rural Brain Drain and What it Means for America, and Bonnie Jo Campbell’s American Salvage, all of which catalogued the various social and personal ills, and the universal sense of despair about the future, that plague Rust Belt cities and towns.

In the crowded field of “recession literature,” however, Philipp Meyer’s relentlessly pessimistic debut novel American Rust has attracted an outsized share of acclaim and attention, and deservedly so. The book follows Isaac English and Billy Poe, two friends whose families have anchored them to the steelworking town of Buell, Pennsylvania. Isaac is the smartest kid in the entire county, but is stuck tending to his disabled father and trying to understand his mother’s recent suicide. Billy, meanwhile, passed up an offer to play football at Colgate College just because he was too stubborn to leave. At the ripe old age of twenty, both can already see an unfulfilling future stretching out in front of them. So Isaac strikes out for California.
In his head he takes on the persona of “the kid,” a modern-day Huck Finn figure whose idea of freedom involves studying astrophysics at Lawrence Livermore. On the way to the Pittsburgh rail yards, he runs into Billy Poe, and the two take shelter from the rain in an abandoned factory. Unknowingly, they have trespassed on the territory of three vagrants who assault Billy and hold him at knife-point, and Isaac is forced to kill one of them in order to save his friend. Both boys panic and hastily try to cover up their crime, and in doing so reveal the self-destructive tendencies that consume them over the course of the novel.

The next day, the police arrest Billy, who feels that he has little choice but to take the blame for a crime he did not commit. He stubbornly refuses to implicate Isaac or even talk to a public defender, which lands him in prison; there, his hair-trigger temper makes him an outcast among outcasts. Meanwhile, Isaac treats his escape as an adventure at first, but his guilt at abandoning his father and sister slowly consumes him, and the picaresque tale of “the kid” takes on more and more false bravado with each humiliation that he endures, from washing himself in the bathroom of a diner to getting his money stolen by a tramp.

The murder begins to poison those who have a stake in Billy’s and Isaac’s future as well. Billy’s mother Grace despairs that her decision to stay in the Valley, and her refusal to throw her deadbeat husband Virgil out of her life altogether, has robbed her of a career and a son. Isaac’s sister Lee feels like she has to save both her brother and her former lover Billy, whom she abandoned for the Ivy League and an unfulfilling marriage to a wealthy classmate.
And local police chief Bud Harris, who once convinced the local prosecutor to dismiss an assault charge against Billy, wonders whether he should try to save the boy a second time, or whether such an effort will prove as effective as “trying to catch a body falling from a skyscraper.”

American Rust is an ambitious book, both in terms of its structure (it follows six narrators) and its subject (“the ugly reverse of the American Dream,” according to one character). As a result, it occasionally loses its focus. At times, the reader can get lost in a sea of introspection that is leavened only occasionally with action. Certain passages sag under the weight of the characters’ regret, indecision, and self-loathing, and the plot takes a long time to develop forward momentum; the murder takes place at the end of the first chapter, but it is not until about halfway through the book that Poe gets arrested and Isaac begins hopping trains for points west.

Meyer also cannot resist an ostentatious tribute to his literary forebears once in a while. For example, Isaac’s sister Lee broods over the relative merits of James Joyce, Henry James, and Jean-Paul Sartre for almost an entire page (there is no other literary criticism in the entire novel), and Meyer tells the reader several times that Isaac’s mother killed herself by filling her overcoat pockets with stones and drowning herself, a grisly tribute to Virginia Woolf. Finally, the ending, which sees Bud Harris transform from a beleaguered Good Samaritan into a self-serving vigilante, feels unearned; nothing in the first 300 pages of the book sets up such a drastic personality change.

Still, these authorial missteps do not really detract from the book’s ability to portray the Rust Belt in new, unsparing, and unsentimental ways. Ten years ago, fictional post-industrial towns served merely as stages on which to act out much larger melodramas.
In Richard Russo’s Pulitzer Prize-winning Empire Falls, for example, misery comes not from large, impersonal forces but from the choices that the characters themselves make. Russo is much more raconteur than social commentator, however, which gives him the freedom to write in a decidedly tragicomic mode, and to make his characters relatively ambivalent about their own hardships; at one point, the novel’s protagonist, Miles Roby, asks, “If I was so unhappy, wouldn’t I know?” The plot of Empire Falls, in other words, just happens to be set in a declining mill town in Maine, and although there are moments of genuine suffering and humiliation in Russo’s novel, they are the exception rather than the rule.

In American Rust, the setting is the story. Isaac and Billy’s hometown of Buell is a stand-in for any number of Rust Belt towns like Dayton and Gary: its factories have been shut and its good jobs have been gone for nearly two decades; its former steelworkers, who in the 1980s made twenty dollars an hour, now bag groceries for less than five; and neither its residents nor its municipal government can make ends meet.

Besides getting the economic indicators right, Meyer understands that socioeconomic malaise and personal malaise are two sides of the same coin. He shows, through the eyes of each of the main characters, the human consequences of a sick economy, which include desperation, psychic distress, moral confusion, and the real or imagined loss of one’s free will. He has the luxury of space and unmediated access to his characters’ thoughts, which allows him to explore a familiar topic – the effects of a prolonged economic downturn – in ways that writers of non-fiction cannot. As a result, American Rust provides a gentle corrective to the kind of fact-and-statistic-based reportage that focuses more on rubrics and measurements (punctuated, of course, by the occasional human interest story) than on the recession’s non-economic effect on individuals.
A newspaper article about a rise in shoplifting, for instance, provokes quite a different reaction from the reader than Isaac’s theft of an overcoat from a Wal-Mart:

The other customers stared intently at their merchandise until he passed. Embarrassed to look at you. Who wouldn’t be? Except the kid does not care. Possessed of a higher mission—self-improvement. Resource gathering. Like the original man—starts from scratch. A new society. Beginning in Men’s Outerwear. All those coats. Never know how much you value a coat. Took months to make in the old days. Now you just go to a store. Don’t be nervous, she’s looking at you.

The novel is also a much-needed challenge to the kind of myth-making that the political commentariat has forced down Americans’ throats over the past few years. On the one hand, there are the wedge-drivers like Sarah Palin and Glenn Beck, who pit a mythological “real America” (blue-collar, religious, small town, uncorrupted) against the so-called “coastal elites.” On the other, there are the tone-deaf and the contemptuous. At a San Francisco fundraiser in April of 2008, for example, then-candidate Barack Obama nearly derailed his primary campaign by commenting about small Pennsylvania towns where people “get bitter” and “cling to guns or religion or antipathy to people who aren’t like them.” And New York Times columnist Frank Rich has spent the better part of a year celebrating the slow and violent death of “a dwindling white nonurban America that is aflame with grievances and awash in self-pity as the country hurtles into the 21st century and leaves it behind.”

Clearly, it is much easier to misrepresent places like Buell for the sake of political gain, or else to dismiss them as irrelevant and insignificant, than it is to treat them without cynicism or contempt.

How, then, to categorize this book?
Like Upton Sinclair’s The Jungle and George Orwell’s Down and Out in Paris and London, American Rust documents the psychological and moral tangle that comes with poverty, something that people with savings accounts, secure jobs, and enough disposable income to spend on a hardcover book usually cannot intuit, or else choose to forget. In stylistic terms, Meyer’s clipped, stream-of-consciousness narration brings to mind not only the modernists (Hemingway, Woolf, Joyce) but also Cormac McCarthy, especially when Isaac begins to refer to himself as “the kid,” just like the narrator of Blood Meridian. The book’s dust jacket provides the most commercially shrewd answer to the question of literary descent, however. American Rust, it says, belongs with “Steinbeck’s novels of restless lives during the Great Depression.” On the surface, the comparison seems fair; both Meyer and Steinbeck wrote about times of extraordinary economic insecurity, both created characters who struggle for independence despite their circumstances, and, most of all, both resisted the easy sentimentality of many writers of “regional” fiction. But there are no Tom Joads in Buell, Pennsylvania, and Philipp Meyer is no romantic. Steinbeck’s fiction, though often stark, had brave heroes, clear moral lessons, and even the barest hints of redemption playing about their edges. In American Rust, poverty does not ennoble the dispossessed; instead, it leads them down the path of moral hazard, where they rationalize theft, murder, and other bad decisions in the name of survival. At best, the constant presence of the characters’ internal monologues allows the reader to understand, if not pardon, their worst choices. Meyer also does not share Steinbeck’s tendency to sermonize. 
There are a few grand pronouncements about The Way Things Are (“We’re trending backwards as a nation, probably for the first time in history, and it’s not the kids with the green hair and the bones through their noses.”), but Meyer always dilutes them by putting them in the mouths of secondary characters, or else by immediately exposing them in a character’s internal monologue as empty clichés: In the end it was rust. That was what defined this place. A brilliant observation. She was probably about the ten millionth person to think it. Ultimately, American Rust is not a hymn to the fraying brotherhood of man, and its characters do not survive for the sake of illustrating how despair fortifies the spirit or poverty strips away all pretenses or some other uplifting observation about the human condition. Instead, Meyer insists only that his readers pay attention, even (or perhaps especially) to those whose main accomplishment is the simple act of carrying on, of finding the desire to “keep setting one foot in front of the other.” In that sense his goal is at once humble and profound, and deeply sympathetic to those who can only seek imperfect improvements upon unacceptable circumstances.
“Tell me what you eat,” wrote Jean-Anthelme Brillat-Savarin, “and I will tell you what you are.” His magnum opus, The Physiology of Taste, was gastronomy’s answer to Diderot’s Encyclopedia or Kant’s Critique of Pure Reason: technique by technique, principle by principle, Brillat-Savarin relentlessly marches through the catalog of la gourmandise and lays out, in his words, nothing less than the “eternal foundation” for “a new science”—the science of gastronomy—that would “nourish, restore, preserve, persuade, and console us; a science which, not content with strewing flowers in the path of the individual, also contributes in no small measure to the strength and prosperity of empires.” In many ways, The Physiology of Taste applied the essence of Enlightenment thought to food, cooking, and eating. It is a massive and thorough discourse written by an author supremely confident in his ability to know himself and all of his faculties, from consumption to cognition, in perfect detail. Brillat-Savarin sought to train mankind’s collective palate and teach everyone the joys of cooking and dining through the science of gastronomy. But that science is strangely limited in scope as well. According to Brillat-Savarin, gourmandism is a science of pleasure-making, training in aesthetic judgment, the gifts of the muse Gasterea—and nothing else. Even though Brillat-Savarin used the language and structure of Enlightenment thought, he followed the dictum of the Greek philosopher Epictetus: “do not discourse how people ought to eat; but eat as you ought.” In fact, food writers have taken that advice to heart since Brillat-Savarin’s time. They have felt free to recall, meditate, and describe, from Marcel Proust’s tea-soaked madeleine to Julia Child’s sole meunière, but they never connected food with morality, only competing tastes. The topic even blunted the sharpest pens of the nineteenth century. H.L. 
Mencken reminisced about oyster fritters and soft-shell crabs in “The Baltimore of The Eighties,” but made only a passing mention of the pollution that would later render the Patapsco River one of the first identified marine dead zones in the world. Likewise, Mark Twain wrote enough about food to fill a book, though it’s probably too much of a stretch to call him a “locavore,” since it sounds like Twain’s preference for local food was more a logistical than a moral issue. It took socialists to first convince people that food issues extended well beyond the dinner plate. Upton Sinclair’s The Jungle famously exposed the near-complete lack of concern for food safety in the meatpacking industry, and George Orwell spent a surprising amount of his literary life defending roast beef, bread-and-drippings, and the English way of making tea from the encroachment of margarine and tin cans. “We may find in the long run that tinned food is a deadlier weapon than the machine-gun,” he wrote in The Road to Wigan Pier, without any irony whatsoever: he believed that industrialism meant the decline of man’s moral, intellectual, and physical health, and nowhere was this decline more apparent than in the English kitchen. Still, these books were intended to be more like John Steinbeck’s In Dubious Battle and less like Fast Food Nation—that is, arguments for the justice and moral rectitude of socialism, not merely calls for better food regulation. As Sinclair later complained, “I aimed for the nation’s heart and hit its stomach.” After World War II, as the world grew accustomed to frozen vegetables and Jell-O molds, the vast majority of food writing turned anodyne. Food essays tended to resemble extended appreciations, a sort of verbal channeling of the Platonic form of a particular dish or technique. It was, and still is, the most common kind of food writing, produced by critics and chefs alike, from James Beard to Nigella Lawson. 
Entertaining (and appetite-whetting) though these books may be, all of them are fairly low-risk and low-stakes. Most examples are panegyrics to one dish or ingredient or technique, and the rest are simply culinary relativism, an attempt to show that no one thing is better than another. It’s the same ground that Brillat-Savarin had covered a century prior. But a few writers have always aspired to more. In the 60s and 70s, travel writers showed that the “went there, ate that” travelogue could, in fact, have a point beyond mere description: Paul Theroux, for instance, found that the dismal dining cars in the Orient Express mirrored the famous route’s general decline. British food columnist Jane Grigson, meanwhile, wrote miniature biographies of vegetables in an attempt to sketch the outlines of what are now, a bit awkwardly, called “foodways”; her London Times counterpart, Michael Bateman, agitated for better school lunches and exposed food industry malpractices before launching the Campaign for Real Bread, which championed local bakers over the “technological bread” of industrial plants. Still, most modern American food writers see themselves not as the heirs of these gastronomical torch-bearers, but as the descendants of the ecological movement. It’s no surprise that the name Rachel Carson pops up again and again, from Paul Greenberg’s Four Fish to the pages of Vegetarian Times magazine; after all, many of these writers are trying to expose the environmental damage caused by agribusiness or commercial fishing in the same way that Carson showed what pesticides were doing to wildlife. Of course, food matters to most of us far more than water management, wildlife preservation, or even global warming: whether it’s three squares a day or the “efficient, joyless eating” of Dr. Oz, we are forced to see, smell, taste, and think about food every single day. 
And that’s why the best food writing has a unique capacity to tell us something about our social norms and attitudes and even, at a stretch, that nebulous idea called the human condition. Sometimes it’s good: the chefs/civic boosters/cultural ambassadors that Anthony Bourdain manages to find around the world, from Caracas to Dubai; Tony Judt’s observation that European multiculturalism extended to his own dining table, too. But food can also lead us to abandon reason in favor of pure hedonism. Nowhere is this more apparent than in Greenberg’s Four Fish, which manages to find culprits everywhere. There’s the tuna fisherman who says, “I love these fish…but I love to catch them. God, I love to catch them. And I know you need some kind of catch limits because I’d catch all of them if I could.” Or the trochus diver on Cook Island who, when caught harvesting out-of-season, began to cry and asked, “Why? Why did you close the season? There are still some left!” And then there’s us, the fish-eating public, for whom a decade of pressure to pay attention to which fish we eat has amounted to exactly nothing, despite the best efforts of environmentalists, journalists, and the Monterey Bay Aquarium. The moral of the story is the same whether you’re talking about fast food, factory farming, big agribusiness… Maybe this is why serious food writing has remained blissfully free of moral overtones for much of its existence. As much as we would like to think that we are all Aristotelians (in other words, that we do the right thing without being commanded), when it comes to food, we’ve shown ourselves to be equal parts insatiable and irrational; we’d really rather not think about anything that would threaten that visceral link between food and pleasure. (If anyone needs any further proof—last week, overwhelming consumer feedback forced Frito-Lay to replace its biodegradable Sun Chips bag with a non-biodegradable one simply because it was “too noisy.”) 
Suddenly, we find ourselves fighting a rearguard action: as Michael Pollan shows, we’re cooking much less, we’re eating much worse, and we’re curiously ambivalent about the whole thing. So even though food writing has come a long way from Brillat-Savarin’s little epigrams (“dessert without cheese is like a pretty woman with only one eye”), his most memorable claim—“tell me what you eat, and I will tell you what you are”—is still true. We might like to think about food only in terms of how much pleasure it gives us, whether it’s the collective experience of a good meal or the personal satisfaction of a well-executed dish. But increasingly, food writing prompts us to look beyond the tips of our tongues, and to realize that food can bring out both the best and the worst in all of us. (Image: Inspecting Tuna, Tokyo Fish Market, 1960s, from jaybergesen's photostream)
Ken Burns’s series The Civil War turns twenty years old this month. A plain old documentary it isn’t; in fact, by the standards of most “historical” documentaries, it lacks a certain testicular fortitude. It boasts neither flashy 3-D maps nor live-action re-enactments; what few live shots there are of battlefields were mostly taken after dusk, giving them a surreal, almost dreamlike quality. Its scoring is simple, its narration restrained. It is, well, rather bookish. For starters, it is expansive in subject and magisterial (some might say boring, but then, people say that about books in general) in pacing. It has a distinctive style, both in terms of the visuals and the narration. It is split into chapters and sub-sections, with little digressions from the main narrative in between. Like the back of a dust jacket, the film also parades out the literati. We hear quotations from Herman Melville, Nathaniel Hawthorne, Walt Whitman, William Faulkner, and John Stuart Mill; we see interviews with writers like Shelby Foote and William Safire; we even hear literary giants doing voice overs – Arthur Miller as William Tecumseh Sherman, Studs Terkel as Benjamin Butler, Garrison Keillor as almost everyone from New England, and Kurt Vonnegut, though his specific role isn’t listed. Superficialities aside: The Civil War has an argument to make. It does not glorify its subject. As historian Barbara Fields puts it, the Civil War was an “ugly, filthy war with no redeeming characteristics at all.” And the documentary hammers that point home by placing all those stories of brilliant generalship and courage and gallantry alongside accounts and pictures of the human cost of “honorable manhood.” So, at the same time that we hear Stonewall Jackson proclaim that “God has been good to us this day” after the battle of Antietam – the bloodiest single-day battle in American history – the camera pans over rows of Confederate dead in Bloody Lane. 
And we hear Joshua Lawrence Chamberlain’s account of nightfall after the battle of Fredericksburg, during which time he and his decimated regiment huddled against the frozen ground, using dead bodies as protection against continuing Confederate fire, for as long as we hear about the battle itself. But neither does it entirely vilify the war, Michael Moore-style. Instead, Burns casts the Civil War as an armed extension of a national conversation, one that touched on race, rights, justice, the organization of society, and much else besides. And Burns reminds the viewer that the center of the conversation – for all the talk of states’ rights, tradition, economics, electoral wrangling, and voter disillusionment – was always slavery. It’s surprising, then, to be reminded that slavery was in the 1860s not a clear-cut moral issue, but a debated political topic. Burns throws cold water on abolitionists like Horace Greeley and even The Great Emancipator, Abraham Lincoln, himself, as their political support for abolition wavers in the face of military defeats and an unpopular war. Conversely, he shows the conviction with which some Southerners clung to the wrong side of history, even those without a particular animus against slaves, such as Jefferson Davis, who called the Emancipation Proclamation “the most execrable measure recorded in the history of guilty man.” Only former slave Frederick Douglass holds his course throughout the entire war; everyone else takes years to admit, in the words of Ralph Waldo Emerson, that “emancipation is the demand of civilization; all else is intrigue.” And it doesn’t rely on words alone to make the point that slavery was the issue of the war. For much of the first few episodes, we’re treated mostly to traditional European and American folk instruments – piano, fiddle, guitar, and the occasional brass band. But after the Emancipation Proclamation is announced in episode 3, we start to hear something new: the human voice. 
The episode ends with the Abyssinian Baptist Church Sanctuary Choir singing the Battle Hymn of the Republic, and from then on the music of gospel choirs begins to leaven all of the instrumentals. One colonel called the singing of freed slaves right after the Emancipation Proclamation was issued “the choked voice of a race at last unloosed”; Burns seems to have taken that to heart. So The Civil War describes its subject as a story of national redemption that came close to failure many times. And Burns shows that all of the gentlemanly military stuff was a thin veneer of civilization over a four-year-long nightmare of butchery. It’s a brave argument to make, when so much of our collective memory of the Civil War has more to do with half-remembered textbook summaries and re-enactments – both live and on television – than with reality. And in an age when most historical documentaries are content to celebrate warfare, or wax nostalgic for a world in which moral issues were clear-cut, Ken Burns’s refusal to do so really does seem old-fashioned. But old-fashioned, in this case, is no bad thing. Which makes The Civil War pretty book-like, in the best sense of the word. As novelist and professor of law Stephen L. Carter wrote, “Books are essential to democracy… Long books. Hard books. Books with which we have to struggle. The hard work of serious reading mirrors the hard work of serious governing—and, in a democracy, governing is a responsibility all citizens share. And if we are willing to work our way through difficult texts, we are far more likely to be willing to work our way through our opponents’ difficult ideas.” In other words, The Civil War simply relies on a constant immersion in a world of challenging and complex ideas. And that makes it just like the best books out there.
1. With the release of George Romero’s 1968 movie Night of the Living Dead, zombies became the monster of choice for those wishing to blend in a little social commentary with their horror. Ever since then, people have found zombies for the “thinking man” everywhere, from the hit movie 28 Days Later to obscure horror novels like Dying to Live: A Novel of Life Among the Undead by Kim Paffenroth or Pariah by Bob Fingerman. Even Pride and Prejudice and Zombies has a bit of an edge to it, according to its author: “The people in Austen’s books are kind of like zombies. No matter what's going on around them in the world, they live in this bubble of privilege.” Of course, after forty years, the zombie genre has hit the creative doldrums. Instead of covering new thematic territory, zombies simply became more physically frightening – bloodier, gorier, faster. And the stories in which they starred became more mindless, content only to revisit the themes of Romero’s Living Dead series. Okay, people become selfish and shortsighted in a crisis; okay, zombies can stand in for mindless consumerism, ignorance, or ideological conformity. Is that really all the zombie genre has to offer? In 2006, Max Brooks turned the zombie story on its ear with a supreme act of genre-bending. Because most zombie stories had to be, well, stories, they necessarily focused on a single person or a small group of survivors. But Brooks wrote World War Z as an oral history, which allowed him to form a kind of pointillist view of the “zombie war” from almost two dozen points of view. Thus we see the war through the eyes of a soldier who fought in every major battle in North America; of the Chinese doctor who discovered Patient Zero; of two unlikely heroes who survived the war in Japan even after the islands had been evacuated. 
And instead of focusing on how crisis brings out the worst in individuals, though there is plenty of that, Brooks mostly concerns himself with the big picture: the failure of governments and societies. In that sense, World War Z sets out to do exactly what oral histories of the end of the world have always done. 2. According to Brooks, Studs Terkel’s 1984 book The Good War “influenced me more than anything… When I sat down to write World War Z, I wanted it to be in the vein of an oral history." In terms of narrative framing, Brooks follows Terkel almost exactly; The Good War is basically a series of interviews with people who fought in, or lived through, World War II. But The Good War is as much biting social criticism as it is a compilation of conversations. On the very first page of the book, Terkel lobs a rhetorical grenade right at the reader: “the disrememberance of World War II,” he writes, “is as disturbingly profound as the forgettery of the Great Depression.” It’s an odd sentiment to read, especially nowadays. After all, we live in a post-Saving Private Ryan, post-Band of Brothers world, where bookstores have separate World War II sections and the History Channel has at least two hours of World War II programming a day. Surely no one could “disremember” World War II. About a hundred pages and a dozen interviews into the book, however, the reader begins to see what Terkel means. For most, World War II was the adventure of their lives and an “epochal victory” over evil. But many also remember their feelings of ambivalence, helplessness, and confusion in the face of a world-spanning conflict. More than one person remarks that everything after the war “is anticlimactic;” others, like the Italian immigrant who said that the war “obliterated our culture and made us Americans,” are downright regretful. Although The Good War is a strongly antiwar book, Terkel shrewdly lets his interviewees make the point for him. 
There are, of course, the stereotypical antiwar voices: the disillusioned veteran, the vaguely contemptuous academic, the shallow celebrity. But Terkel manages to find people whose insistence that “people in America do not know what war is” seems much less rote. An orderly in a burn ward who describes how she “had to keep the skin wet with these moist saline packs. We would wind yards and yards of this wet pack around people. That’s what war is.” An admiral who insists that “the twisted memory of [World War II] encourages the men of my generation to be willing, almost eager, to use military force anywhere in the world.” An otherwise happy veteran who closes the book by saying, “I hope I can die of old age, before the world starts the war.” And scariest of all, the congressman, Hamilton Fish, who founded the precursor to the House Un-American Activities Committee and who insisted that the United States would never use the bomb solely because “we are a God-fearing country.” The Good War is a record of profound change, as “a country psychically as well as geographically isolated had become, with the suddenness of a blitzkrieg, engaged with distant troubles. And close-at-hand triumphs.” But it also shows the variety of opinions that people can hold about something that seems, at first glance, a simple struggle between good and evil. It is a necessary counterpoint to cloying, chest-thumping, action-packed narratives of war, as Terkel intended it to be. And, by coming out with a strong antiwar message during one of the tensest periods of the Cold War -- just after Soviet fighters shot down Korean Air Flight 007 and both sides deployed new nuclear missiles throughout Europe -- it showed that something as simple as a collection of interviews could say as much about its present day as it did about the past. 3. 
Besides The Good War, 1984 saw the publication of Whitley Strieber and James Kunetka’s Warday, a documentary-style oral history that takes place five years after a 36-minute nuclear exchange between the United States and the Soviet Union. (First things first: yes, this is the same Whitley Strieber who wrote the alien abduction book Communion and the environmental sci-fi novel The Coming Global Superstorm, which inspired the movie The Day After Tomorrow. But the Whitley Strieber of Warday still has a few years to go before all of this.) Warday matches World War Z even more closely in terms of tone, themes, and narrative techniques. Strieber and Kunetka imagine their way into a United States devastated by a “limited” nuclear exchange -- one that still managed to vaporize San Antonio and Washington, D.C., and render New York, New Jersey, and most of the Midwest uninhabitable. The bombs themselves are horrific enough: at one point, a superheated tidal wave from an offshore nuclear blast inundates the New York subway, and the authors can hear the screams of the drowning cut off by the “nasty bellow of water.” But even worse is the aftermath. As they travel around the country, Strieber and Kunetka document the dozens of ways in which a nation that once prided itself on individual liberties and a stubborn, can-do attitude has turned into a collection of petty fiefdoms whose laws “are an affront to the very memory of the Bill of Rights.” The government requires doctors to turn away patients who have been exposed to enough radiation to significantly shorten their life expectancy. The relatively untouched parts of the country now refuse to accept “illegals” from others -- a trainload of orphans from Philadelphia are turned back at the Georgia border, for example, and when the authors smuggle themselves into California, they are chased out at gunpoint by immigration police. 
And with perfect journalistic aim, the authors document the death of American self-confidence in a series of fictionalized polls that ask questions like “Do you think that the destiny of this country is presently in the hands of other nations?” and “Do you believe that the federal government should abandon the War Zones permanently?” As Strieber told People magazine in 1984, “We did not want to write a book about explosions. We wanted to take people into life beyond The Day After -- to wake them up in the New World of the years after.” And his and Kunetka’s decision to make Warday a cautionary tale about nuclear war without focusing on the warfare itself makes it a successful cri de coeur. “Modern nuclear war,” they write, “means life being replaced by black, empty space” -- both physically and spiritually. Nuclear weapons might destroy our homes and lives, they suggest, but only we can decide to abandon our principles in the face of fear, ignorance, and a permanent state of pessimism. 4. Chances are, however, that you’ve never heard of Warday. Although the book spent six months on the New York Times bestseller list and earned Strieber and Kunetka the equivalent of a million-dollar advance today, it has been out of print since 1985. It isn’t that the book lacked timeliness; 1984 brought plenty of post-apocalyptic pop culture, including Mad Max and the television shows The Day After in the United States and Threads in the UK. Yet after that brief burst of success, people put down Warday and never really picked it back up. Maybe the book bit a little too hard. It’s shocking, for example, to hear a Canadian traveler joke about the “Uncle Sam Jump” (the postwar American equivalent of Montezuma’s Revenge), or to hear about American nannies considered to be a status symbol by wealthy foreign businessmen -- in other words, to see the United States treated like a developing country. 
More importantly, Warday portrays the American Republic -- “the last great experiment for promoting human happiness,” according to George Washington -- as something extremely fragile, and not easily restored once lost. And all this at the height of the Cold War, when Ronald Reagan told Americans that no rational human being would prefer authoritarianism to democracy. Warday is also relentlessly grim. The fact that World War Z is about zombies means that it flirts with silliness and the adolescent flair for ultra-violence against things that aren't quite human beings -- see, for example, the helicopter pilot who uses his rotor blades as a giant zombie buzzsaw during the Battle of Yonkers. And both The Good War and World War Z end with American victories, which at least balances all of the loose ends, postwar traumas, and moral gray areas in both books. In the end, a happy ending and plenty of flag-waving patriotism make the bitter pill of social commentary go down much easier. With Warday, there are no such spoonfuls of sugar. We're left knowing only that the authors have succeeded in their journey, and arrive home simply to endure the “epidemic of shortened lives.” Strieber and Kunetka are only the historical equivalent of a bucket brigade, passing on knowledge of their post-apocalyptic world while knowing that in the end it helps no one. Still, if Warday sails too close to the Scylla of moralizing heavy-handedness, at least it avoids the Charybdis of slapdash social commentary that permeates World War Z. Granted, a zombie apocalypse can be a metaphor for many things, but Brooks never quite seems to know exactly what his stands for. 
Right off the bat, he tacks leftward, lamenting the fact that lax FDA regulation contributed to the panic and sneering along with the reader at the official who asks, “Can you ever ‘solve’ disease, unemployment, war, or any other societal herpes?” (Just in case anyone doubts Brooks’s political sympathies, the stand-in for the Bush administration ends the book literally shoveling shit). But then Brooks finds a savior in authority, tradition, and centralized planning: Israel becomes a police state and survives relatively unscathed, the Queen inspires a nation by refusing to leave Windsor Castle, Nelson Mandela (who goes unnamed) saves South Africa from being overrun, and, most of all, a charismatic American president announces his decision to take back the world aboard an aircraft carrier. So are we supposed to hate government, or embrace it as our last, best hope? Are individuals and individual liberties important, or do we need Great Men (and Women) -- aided, of course, by a competent bureaucracy -- to compel us toward safety and salvation? What is its message about violence, when it portrays the mass “killing” of zombies in painstaking, almost loving detail? And does the fact that World War Z is a monster story mean that we cannot take it seriously at all, even though it clearly invites us to do so? Obviously, World War Z references a variety of Bush-era woes. And Brooks’s reviewers draw attention to World War Z’s “parallels” and “metaphors” and “expressly political and socioeconomic material,” but they never identify what the book is supposed to mean. What purpose, except for the thrill of recognition, do all of these modern-day references accomplish? They don’t add up to an overarching moral point, except to get us even angrier about “incompetence in high places and lack of preparedness” -- which, incidentally, is exactly what George Romero tried to tell us in the 1960s and 1970s. This is not to say that World War Z is a shallow book by any means. 
It has scary moments and exhilarating ones, violence and poignancy, and quite a few colorful personalities (though Brooks resorts to stereotypes a bit too often when it comes to international characters). Still, it’s a bit disingenuous to claim, as the book’s dust jacket does, that Brooks does for zombies what Studs Terkel did for World War II. Yes, his choice of narrative frame refreshes a genre that had already entered its baroque phase. But World War Z never quite manages the same level of moral pique as The Good War and Warday; it is so constrained by its undead subject matter that it can only gesture at modern-day relevance before falling back on the same shopworn themes. Although it has more brains than the average zombie story, it still doesn’t have much of a heart.
George Orwell never thought that his work would outlive him by much. After all, he considered himself “a sort of pamphleteer” rather than a genuine novelist, and confidently predicted that readers would lose interest in his books “after a year or two.” Yet sixty years later, Orwell endures, and I am not sure that this is a good thing. I say this as someone who not only reads Animal Farm and Nineteen Eighty-Four once a year, but who also owns collections of essays, biographies, and even a copy of Orwell’s 1936 novel Keep The Aspidistra Flying, which according to one reviewer “at times mak[es] the reader feel he is sitting in a dentist’s chair.” But for people like me who are under 30, there will always be something remote and incomprehensible about Orwell. I was in preschool when the Berlin Wall fell, and I know perestroika and détente as answers to exam questions rather than lived experiences. I grew up fearing nuclear power plants more than ICBMs, and found LBJ’s infamous “Daisy Girl” ad far less terrifying than some of the spots from the 2008 presidential election. I think of politics in terms of individual issues and partisan planks rather than grand, historicizing political ideologies. In short, because my worldview is so different from that of Orwell and his Cold War-era readers, I have to “think” my way into their political struggles in a way that someone even twenty years ago probably did not. In ninth grade, I was required to read Animal Farm. My class read the book over a period of three weeks, which was not that hard of a task, since it is all of 30,000 words. Our teacher gave us the barest outline of historical context, enough at least to know that Napoleon represented Stalin, Snowball represented Trotsky, and that was about it (a whole unit on allegory would have to wait until sophomore year, and Billy Budd). 
But because the book is a “fairy story,” I learned its themes easily: power corrupts, principles are elastic, revolutions will be betrayed, and evil’s greatest allies are the unthinking masses. Two years later, I found myself following Winston Smith into the cabbage-smelling hallways of Victory Mansions on a bright cold day in April. This was the year of “relatable” protagonists, so after Ralph from Lord of the Flies and Holden Caulfield from The Catcher in the Rye, I was primed to look for affirmations of my own worldview. And Nineteen Eighty-Four was cynical, anti-authoritarian, and a paean to hopeless dissent in the face of inexorable conformity (its working title, after all, was “The Last Man in Europe”). To my teenage mind, Winston was both pathetic and sympathetic – a role model – even if Big Brother got him in the end. Surely, I thought, these were the only lessons worth keeping from the book, since nothing else was obvious. If there is such a thing as a “right way” and a “wrong way” to read books, then my high school approach to Animal Farm and Nineteen Eighty-Four would have been the latter. But that was because I did not know exactly how these books were shaped by their times, and how contemporary audiences would have reacted to them. We never heard about Orwell’s influences, such as Arthur Koestler, Yevgeny Zamyatin, or James Burnham, because they are not part of the literary canon. We never learned about the show trials in Moscow or the Spanish Civil War, either, because that was meant for history class, not English. And any textual analysis that smacked too much of politics was strictly out of bounds: I did not, for instance, understand until much later that the concept of “Ingsoc” was supposed to be a satire of Nazism, whereby fascism advanced under a socialist veneer. In short, I could not have known what Orwell intended his works to be, and so I understood them in the only way I knew how, as advice manuals for the American adolescent. 
I’m not the only one who never quite “got” Orwell the first time around. Because few people who read Orwell’s novels in classrooms also learn about their context, most misunderstand them, or at least half-remember them, in the same way. Sometimes, his name gets applied to topics that he never really thought about, such as the “Orwellian” investment philosophy of Goldman Sachs (at most, Orwell railed against the “sheer vulgar fatness of wealth” and the “worship of money” in general) and the “downright Orwellian” American Community Survey form for the 2010 Census (Orwell has nothing specific to say about government paperwork). Other times, this means that Orwell’s political enemies try to claim him for their own side. This is nothing new: in the 1950s and 60s, for example, Soviet publications like Kommunist and Izvestia argued that Nineteen Eighty-Four was actually a critique of American excesses and amorality, and in 1984, Norman Podhoretz famously tried to make Orwell into a pro-nuclear neoconservative hawk. But even though Hitler and Stalin belong to the dustbin of history, people still manage to find shades of totalitarianism and organized lying – Orwell’s favorite targets – in more places than ever. During the summer of 2009, for instance, opponents of health care reform wielded Orwell’s name indiscriminately. Steven Yates, a philosophy Ph.D. and member of the John Birch Society, told us that “‘Obama-care’ would make George Orwell spin in his grave.” Bill Fleckenstein, an MSN Money columnist and hedge fund manager, also decried such an obviously “socialist” project: “For those who aren’t clear on why socialism doesn’t work, I recommend reading George Orwell’s Animal Farm.” And Tea Party protesters have carried signs reading STOP. YOU’RE STARTING TO SCARE GEORGE ORWELL, ORWELL WARNED US, or ORWELL WAS A VISIONARY. 
Never mind that, in “How the Poor Die,” Orwell criticized how the indigent had inadequate access to health care; never mind that, in The Road to Wigan Pier, he blamed inadequate government intervention for poor nutrition and squalid living conditions in northern mining towns. Never mind that, for most of his life, Orwell advocated nothing short of a socialist revolution in England! As far as these people were concerned, Orwell’s works amount to nothing more than an anti-government, anti-change screed. Overuse on the one hand, distortion on the other: what perversely fitting tributes to a writer who underscored the dangers of reductionism, revisionism, and willful ignorance. Clearly, George Orwell is a victim of his own success, and in a peculiar way – no public fights over the legacy of Hemingway or Joyce, or even over other midcentury political writers like Hannah Arendt, rival the ones over Orwell’s posthumous stamp of approval. So Orwell was right to consider himself more pamphleteer than novelist. Many critics have dismissed this as a kind of false modesty, but in this case, Orwell was not merely managing expectations. Pamphlets are designed to make a specific point to a specific audience, and then to be thrown away because they can no longer serve the purpose for which they were intended. Orwell’s works are ephemeral too, in the sense that they cannot really be understood without some semblance of historical and intellectual context. Recovering that context takes a lot of patience, a lot of reading, and a lot of extracurricular effort. Obviously, many readers simply find it easier to shout down any opposite political position with Orwell’s own words – Big Brother, thoughtcrime, Some Animals Are More Equal Than Others – than to really understand what these words, in context, were supposed to represent. And Orwell was wrong to believe that good writing alone could promote honesty. 
He wrote that euphemistic, dishonest, and generally bad prose “is designed to make lies sound truthful and murder respectable, and to give an appearance of solidity to pure wind,” whereas “good prose is like a windowpane,” through which the author’s purpose can be seen clearly. All true. But good writing can still be perverted, as many of his readers have shown and continue to show. As Louis Menand observed in The New Yorker, “Orwell’s prose was so effective that it seduced many readers into imagining, mistakenly, that he was saying what they wanted him to say, and what they themselves thought.” His style, in other words, has overwhelmed his substance, and if he had not been such a good, clear, memorable writer, he would not be plagued by grave-robbers. Clearly, literary immortality has its downsides. And as the last sixty years have shown, Animal Farm and Nineteen Eighty-Four are not like other canonical works of literature such as To Kill a Mockingbird or The Great Gatsby, whose messages are straightforward in comparison. Instead, they are as much pamphlet as novel, which means that it is impossible to understand Orwell’s political purpose without knowing the intellectual and ideological environment in which he wrote. Until his readers bother to do so – which, as a rule, they don’t – we can look forward to another sixty years of use and abuse.