1.

It’s become quite common by now for me to tell people who ask why I am doing a doctorate in biology that I really have no idea. But in point of actual fact, I do very much know why I am here—or rather, I know at least one very good reason why I am here. When I was 17, a friend and I entered a bookstore together in Kuala Lumpur. We were far from home, for precisely the reason I should perhaps never have been persuaded to study biology in the first place: we were debaters, in a different country for a debate competition, and as debaters, we were used to trucking in rapid-fire exchanges of high-minded ideas on politics, economics, history. Debate in Pakistan, as in many countries, is both culture and cult. It is both a battle of wits and a blood sport. We would constantly litigate whether we—teenagers barely on the precipice of self-education—felt that the death penalty was justifiable, whether South Sudan should be allowed to secede, whether the U.N. should be responsible for paying reparations to Palestine, and so on. But, that day, in Kuala Lumpur, we were entering the bookstore at a new time in our lives. We were both in college and had both come under the influence of the dangerous idea that “science” was fashionable (“science” didn’t really mean science; it meant Isaac Asimov and Richard Dawkins, Carl Sagan and Rachel Carson—the celebrities, the “charismatic megafauna” of science). As luck would have it, a section hewing, as science sections always do, so closely to our idea of “science” was the closest to the entrance. We both left the bookstore that day with copies of Stephen Jay Gould’s Eight Little Piggies (the handsome black cover of the 2007 Vintage Books edition is genuinely hard to resist). I, unfortunately for myself, bought another of Gould’s books as well: Wonderful Life: The Burgess Shale and the Nature of History. I may as well leave my friend behind right now; she escaped science by the skin of her teeth.
She was a law and policy major who happened to be a science buff. I, on the other hand, was still uncommitted and was taking predominantly math and physics classes. Today, she’s at Oxford finishing a doctorate in international development and is still a science buff, in the best of ways. Eight Little Piggies is bad enough for someone who has yet to declare a major. If you add Wonderful Life into the mix, or Gould’s other masterpiece, The Mismeasure of Man, you’re in a good deal of trouble. In the past few years, I’ve returned to Wonderful Life—a bit begrudgingly, I might add—because for some time now I’ve been all in a tizzy about environmental history, what used to be called “natural history” by biologists like Gould and the many who came before him. But the first time I read it, I lost all my mental defenses against biology. I had to give up the idea that what mattered was the grand scheme of things: the charismatic megafauna. In Wonderful Life, before Gould launches full-force into the story of the Burgess Shale—a place in the Canadian Rockies where paleontologists found the best-preserved fossils of animals arising from the Cambrian Explosion—he executes the most expert feints I have ever read in nonfiction. He tells you that what you are about to read will be a litany of details. He then admits you don’t even need to know the details to understand the story. But immediately after, he pleads with the reader not to skip them, for those who do won’t see the beauty of the details, of all these fossils; and then he talks about animals we will never see except in drawings, and if you turn the page, you’ve already given in. When a writer as utterly convincing as Gould prostrates himself before you in such a manner, what else can you do? Incidentally, Gould also once wrote that “natural historians tend to avoid tendentious preaching” along philosophical lines, and admitted that he fell prey to the temptation far too often.
He wasn’t wrong, but I bet many of his readers read him precisely because they were so enthralled by the “tendentious preaching” as he practiced it. The day I finished Wonderful Life was also the day I had to give up on the idea that I could keep just one toe dipped in the world of science while the rest of me was soaking in all the things I loved so much more: mostly history, but also, at one point, law. I gave up on the idea of being a polymath, which was what I truly wanted to be. And, most painfully in retrospect, I had taken Gould to mean that to appreciate the details, I had to actually become a scientist, settle in, and prepare myself as if for a duel with gatekeepers who’d insist I hadn’t the mettle for the fight. When I read Wonderful Life now, I’m amazed at this mistake. The only correct reading of Wonderful Life is of Gould corralling both scientists and non-scientists to his side, asking them all to see our evolutionary and natural history as both beautiful and humanistic—I say this unequivocally because Gould doesn’t just allude to this; he says it explicitly. And throughout his work, Gould displays many of the polymathic tendencies I desperately wanted to develop: He uses iconography, sociology, cultural studies, anthropology, a great deal of history of science, and an excruciating amount of Major League Baseball. But this is all in retrospect, because like it or not, I had read Gould all wrong, and in doing so I had somehow begun what seems like the world’s longest Monopoly game, except that the money is real money and when you’re bankrupt, as in real life, you must stay in the game forever and ever, for that is the only rule that matters.
In reading Gould wrong, I had chosen my “path.” After that, there’s really only one thing to do: You major in biology; you procure a GPA high enough that top-tier American universities, loath to take international students in the first place, will be persuaded that you will be a star student; you apply to these universities with personal statements that pull passages from Gould’s work to indicate how much of your intellectual thought has been shaped exclusively by biology; and if you do it well enough, soon you’re packing off to a university abroad, leaving behind a family proud that you will be the first to get a doctorate. Once there, you will meet fellow students who seem even more ambitious than yourself; you will be startled to find, in the laboratory where you work, postdoctoral researchers who have been there for about a decade since completing their own doctorates; you will join the graduate student union after you realize that your advisor mostly reads and responds to email while you do all the actual labor, yet somehow you yourself cannot pay your bills on time; until one day you will realize that every single ambitious colleague with whom you entered graduate school wanting to be a tenured professor in biology actually swore off academia quite some time ago, and is now unsure what to do next, and embarrassed about it, because what do you do next when there is nothing you have been trained to do well enough except inspect a single protein or a single gene every day for six years? And then you will finally see the dark humor in your conundrum: that it wasn’t just this awful, exploitative system of apprenticeship, but also a false reading on your part, a sense of certainty you developed somewhere along the line, and that you do—as a matter of fact—bear some responsibility for this crisis. And what a shame! All this time you could have been trying to become an actor.

2.
I believe there is something special, and specially awful, about being an academic apprentice in our times. For the world of academia is the kind of ruse that breeds certitude in young people, despite the obvious fact that there is so little in this world to be certain about. The certainty we had when younger has led to so many of us feeling trapped in cages of our own making. We didn’t quite know then that the average length of an apprenticeship in academia—ostensibly meant merely to develop the ability to study something—has grown longer and longer, just as the prospects have become dimmer and dimmer. We only seem to find out once it’s too late. What makes these times even more special, I think, is the knowledge that we may not see many generations coming forward who can tell us where we went wrong. We feel trapped, we know we are trapped, and we know there is urgency in this crisis, but we appear paralyzed by it. This is merely a belief, but it certainly seems true. It is entirely possible that future generations, if they could, would implore us to rebel. The point I am trying to make is about the many lives that we could live but don’t. I, who made the hasty decision to be an apprentice in biology, am saddened by the fact that it will now take me longer to do other things, and I also know that there are some things people are too embarrassed to admit they have always wanted to do. Here are just a few things that I or my closest colleagues have admitted, with an air of scandalous secrecy, to having always wanted to do: Audition to be in a film. Write a play. Travel the world to rescue animals. Learn how to make buildings. Visit conflict zones as a medic, journalist, translator, or photographer. Become a lifeguard at a beach. I don’t think it quite escapes any of us that while we don’t have the financial luxury to rescue animals across the world, we also don’t really have the luxury of complaining about it as if it were a tremendous cruelty.
That is, after all, the complaint about us millennials, is it not? That we don’t have what it takes for hard work and grit; that we are too ungrateful. If only. Ingratitude would be a relatively easy problem to solve. There are enough of us to bring each other back down to earth, to ground each other. I assure you—I do know how to count my blessings. “I am lucky” is a mantra repeated over and over to me by everyone I meet who somehow still believes a Ph.D. in squandered dreams is a thing of beauty, and I do acknowledge it, though somehow I seem to be doing so more to convince the beholder that I am not an elitist ingrate, because I am embarrassed that I do not feel as lucky as I should, because that must mean I am, really, at the heart of it all, an elitist ingrate. And yet somehow, we’re also in a world where an elitist ingrate with a Ph.D. doesn’t really have all that much to look forward to. The world is both ablaze and deluged—quite literally! There are wildfires that rage and hurricanes that destroy homes and property, and all this time more of us could have been gathering knowledge about how the entire world has suddenly become so endangered. There has been simply no need—again, quite literally!—for this glut of Ph.D.s put out to market (or, more appropriately, put out to pasture) except for the metrics in U.S. News rankings and the profits of the presidents and top-tier administrators of universities. Knowing that, of course, solves nothing. In fact, it’s all quite depressing, and this is just one way in which what we now know to be true becomes so paralyzing. Sure, you can quit. You can always quit. But we’ve been taught quitting is for quitters, and besides, we hold out hope that the title of “Dr.,” judiciously employed, may one day mean something. I’m not sure any of us is truly convinced by this argument, but it’s the only excuse we have.
When we’re sitting in a graduate student union meeting, we all know these things, and there is camaraderie in the misery (though not with that student: the one who sweeps into the meeting 10 minutes late after months abroad and tells us just how inspiring her fieldwork was!). In Camille Bordas’s How to Behave in a Crowd, almost all of the six siblings are either finishing their doctorates or preparing to start one—a fairly accurate reflection of us, our overeducated millennial generation. At one point, the second-youngest, Simone, who is very ambitious, explains to the youngest, Isidore, who is constantly wondering why he is so unambitious, why their older sister Aurore has been so depressed since she defended her doctorate. She uses a “funnel theory.” “What does the funnel represent?” Isidore asks. Simone replies:

The funnel represents our lives. The possibilities, the choices. When you’re born, you virtually have an infinity of options, you get to swim at the top of the funnel and check them all out, you don’t think about the future, or not in terms of a tightening noose at least.

And when she mentions a tightening noose, she points to the bottom of the funnel. She goes on:

You think, if anything, that the future will be even more of that, get you more freedom, more choices, because you see your parents pushing your bedtime farther and farther and you think, Well that’s swell, you think it means being an adult will just be super, but then little by little, you get sucked to the bottom. You don’t realize it at first. It starts with the optional classes you elect in high school. More literature or more physics? Should you start learning a third foreign language or get serious about music?
And then choices you could’ve made for the future get ruled out without you knowing it, and you sink down to the bottom faster and faster, in a whirlwind of hasty decisions, until you write a PhD on something so specific you are one of twenty-five people who will ever understand or care about it.

At this point, Isidore, often wiser than his Ph.D.-hungry siblings, protests that “Ph.D.s are not the only option.” Simone, just 18 months older than Isidore but already a star student who has skipped a few grades, replies:

But they’re the slowest possible way down the drain. They buy you time, they allow you to believe for a while that the amount of specialization of your thesis verges on some kind of universality—and for the best academics, it does, or at least I want to think so—but then in the end it doesn’t matter how brilliant you are, or that you think you can apply that brilliance to other areas of research: academia has already confined you to the one field you picked years before. That’s why Aurore is all depressed. Aurore is reluctant to go there.

As Simone draws Aurore’s position at the neck of the funnel, Isidore asks if this isn’t true of everyone: Isn’t everyone afraid to go to the neck of the funnel? “Don’t be so sure, Dory,” Simone says. “Some people enjoy being trapped. Some people need it.” This—again—is the crisis, and it is inescapable, and it is unfortunate for our generation, unlike previous generations, for whom the ratio of Ph.D.s produced to tenured positions available was far more reasonable and well-matched. But even if eventually we want to go to the neck of the funnel, as Simone seems to think some people do—what then? Let’s say you remain steadfast.
Will your apprenticeship ever end, or will you spend—as so many, many, many people today do—years and years stuck in what academics, somewhat charitably, call “postdoc hell”? “Postdoc hell” is a funny phrase because, like many things that are very funny, it is very true. I could sit down this very moment and start writing a very long list of the names of people I know personally who graduated with their Ph.D.s and, because they wanted to continue on in their field—and truly believed they were special enough for that one tenure-track job—entered one lab for a postdoc and remained in that position for anywhere from five to 10 years, mechanically pushing journal articles into submission queues and conducting endless experiments, all the while being expected to teach all the extremely specific skills they have gained to silly undergraduates and graduate students with lofty ambitions. You could be pushing buttons on a computer to teach the 50th person how to use the fancy new microscope the university just bought, knowing full well that it’d probably be obsolete by the time you’re done, because soon you’ll get news of yet another, probably in a presentation by another postdoc—but still, somehow, you’re supposed to be excited about it. You can, however, do something else. You can spend those years in one lab, and then hop to another—one with more prestige, more postdocs, fewer graduate students. But do know this! Your postdoc advisor will tell you, when you email them, that you must come with your own funding. And so, along with all those applications for tenure-track positions, or adjunct positions, you must also annually fill out fellowship and grant applications for years and years until you finally cave, and where are you most likely to end up? In the growing administrative cabal of the corporate university. Maybe you’ll even be happier there, but you will only really be studying one thing: the unhappiness, or delusion, of academics.
“Postdoc hell” is no more an abstraction than “graduate school.” It is a place, a 10th circle of hell. I can take you there. I’ve been a writer for long enough now, and I’ve studied history along with biology for long enough now, for people to ask me if I regret taking on so much. I never say yes, because all of that keeps me sane. If they know this long complaint of mine, friends will ask if I regret starting my doctorate in biology, and I don’t quite know what to tell them, because I know I knew—almost as soon as I had to take my qualifying exam for Ph.D. candidacy—that were it not for the fact that I wasn’t a citizen, I would have cut my losses. What I don’t tell them is that instead of regret, I’m thinking about something that will most certainly seem ridiculous to them: going back to graduate school, this time for history. Yes, I know, it sounds preposterous. And yes, this time I will have no excuse for optimism. It is probably straight-up masochism that I seem to want to emerge through the neck of one funnel and into another. But just so you know—Isidore’s eldest sister, Berenice, begins another Ph.D. before the book is over. In Chicago, no less. I defend my doctoral thesis on November 7. I’ve told my mother to be prepared for whatever occasion may arise: I’m hovering at the neck of the funnel, and soon I’m going to have to make a move.
1.

For the burgeoning field of the environmental humanities, it has long since become a commonplace notion that there isn’t really any such thing as “nature” or “wilderness”: both words used to connote real places—pristine and untouched places—but with the increasing knowledge that such a state of being likely never existed, the words come up empty. There are, however, new narratives: Through a case study of the global matsutake mushroom trade, the anthropologist Anna Tsing shows compellingly in The Mushroom at the End of the World that the human-disrupted landscapes we find everywhere are worthy of study. How far do we have to look to find that in the stories we tell today? Not far at all. Lauren Groff’s collection of stories, Florida, seems to see every landscape it describes as contaminated—the wreckage of things wrought by both humans and non-humans. In “Dogs Go Wolf”—a survivalist tale of two sisters stranded on an island, abandoned and threatened by adults—more than monkeys, more than dogs, it is a menacing man from whom the sisters hide. “He moved toward the boat and kicked it once, twice, then the girls saw the rotten wood break apart, and a hundred frightened bugs ran out.” Groff rarely allows herself the common narrative—what is termed “declensionist” in academic works, i.e., the conventional trope that human beings cause progressive degradation, a trope that is, depending on your point of view, incorrect, selective, colonialist, racist, and/or anthropocentric. In one instance, she allows it smack-dab in “Snake Stories,” a story, arguably, about ambivalence itself:

In February, one day, I found myself sad to the bone. A man had been appointed to take care of the environment even though his only desire was to squash the environment like a cockroach. I was thinking about the world my children will inherit, the clouds of monarchs they won’t ever see, the underwater sound of the mouths of small fish chewing the living coral reefs that they will never hear.
But because this is an ambivalent story, this passage follows soon after the narrator asks her son, “Why, of all beautiful creatures on this planet of ours, do you keep writing about snakes?” He answers, “Becus I lik them and thy lik me.” Although I am uncertain about the extent to which we are aware of how literature is changing with regard to nature, when you begin to see the ugliness, the ambivalence—the “contamination”—of nature in one place, you begin to see it everywhere. Carmen Maria Machado’s justly lauded collection Her Body and Other Parties, for instance, seems to me just as much a realist rebuke of the triteness of “nature” as a work of science fiction or fantasy. The tentative resident at an artist’s colony, for instance, finds the horrors of nature everywhere: She tests the railings on the deck of a cabin “to see if anything was rotting or came off in my hand like a leprous limb”; looking up in the bathtub, she finds a showerhead “dark and ringed with calcified lime, like the parasitic mouth of a lamprey”; when a rabbit she had previously run over turns up outside her studio door, she observes that “its visible organs glistened like caramels, and it smelled like copper.” Kneeling by the rabbit’s carcass, she apologizes. “You deserve better than that,” she says. What does it deserve? Where did this vein of what I can best call un-nature writing begin? When did the environmental historians and anthropologists begin to confer with novelists and storytellers to arrange this complicated vocabulary? More precisely—when did we begin to recognize the banality of “nature writing,” a legacy largely assumed, correctly to some degree, to be that of the Romantics? The answer, in short, is: We didn’t.
The legacy of long, meandering, anthropocentric meditations on nature—be they Wordsworth’s “tranquil restoration” by nature through springs, sycamores, and sober pleasures in “Lines Written a Few Miles above Tintern Abbey, on Revisiting the Banks of the Wye during a Tour, July 13, 1798,” or Coleridge’s hymn to “green vales and icy cliffs” in “Hymn before Sun-Rise, in the Vale of Chamouni”—may actually be very much with us.

2.

When we think of “nature writing,” a common Romantic word that comes to mind is “sublime.” The sublime, too, is an unstable word. But unlike “wilderness,” which has switched from negative to positive connotations, the sublime is more capacious. When Edmund Burke wrote about the sublime, it was to refer to “the strongest emotion which the mind is capable of feeling.” One assumes that could apply equally to the experience of death as to experiencing vertigo while bungee jumping in the redwood forests of Humboldt County, California. Paradoxically, it is both a banality and a point of actual contestation to treat Romantic literature as the era of simply “nature writing.” The literary scholar Alan Bewell, who focuses particularly on British Romanticism, admits that one of the biggest problems he faces “in writing about or teaching British Romantic poetry to a mainly urban audience is to explain why most of these poets … spent so much time talking about landscapes and rural scenery, describing the seasons and the weather, and meditating on birds, flowers, mountains, rocks, and trees.” Bewell would do better to start off with why the works of the Romantics are so heavily contested in literary studies. Bewell himself represents a school that goes by various names—as, frustratingly, many academic schools do—but “ecocriticism” should suffice.
Broadly, the ecocritics argue that the Romantics’ preoccupation with nature was a response to modernity, one that foreshadowed the biological, materialist understanding of “nature” that formed the basis of modern environmentalism. The Romantics, in other words, were “proto-ecologists.” Collapsing a whole academic school of thought this way is an act of heresy, so allow me to pause and insist that the ecocritics are obviously not a monolith, nor do they agree entirely on particular works. Still, writ large, ecocritics argue for some degree of coherence in the Romantic tradition. This is, in and of itself, controversial. The British literary critic Marilyn Butler, for instance, who lived long enough to see the beginnings of these tensions in the meeting of environmental and literary studies, was scathing about attempts to slot things in neatly. Butler argued that the contemporary intellectual tradition saw “aesthetic discussions often [resting] upon the belief, also ultimately historical, that there is a single coherent Romantic movement. This belief is reflected in, say, the unquestioned coupling in a book or article of Coleridge and Shelley, or in the widely found inference that a work with Romantic traits has found something it ought to have found, that it is profounder and better than work characteristic of an earlier date.” Ecocriticism developed as a counterpoint to “new historicism,” the literary theory that emerged in the late 20th century and argued for examining the cultural contexts of literature as a way to chart intellectual history. New historicism ascended along with postmodernism; the two are historically connected. The ecocritics are a response to these new historicists, academics for whom, the ecocritics charge, “nature” was merely a smokescreen behind which ideology, history, and politics hid.
According to Bewell’s characterization, the new historicists saw “nature” “as an obstacle to both the history that human beings make and the histories that they write, and since it places limits on human freedom, the task of most historicist criticism of Romantic literature has been to penetrate or dissolve nature so that the human agency that stands behind it can be recognized.” It boggles the mind a bit that these two forms of literary theory have not found a common middle, but most often they haven’t. More than once, a new historicist has argued that there is no such thing as nature; in turn, ecocritics have objected strongly that this is a rebuke to materiality itself. But contemporary literature has certainly found a middle. In an essay entitled “Not Your Grandfather’s Nature Writing” in the Fiction Writers Review, Andrea Nolan points to a spate of literary journals like Flyway, Ecotone, and Orion, which focus on the environment and distance themselves from “nature writing.” Indeed, she quotes the mission of Ecotone as being distinguished from “the hushed tones and clichés of much of so-called nature writing.” As far as I can discern, however, the most radical change in register for un-nature writing lies in complicated human/nonhuman juxtapositions. In Lauren Groff’s most recent story for The New Yorker, “Under the Wave,” an arresting little passage appears mid-story in what reads as a wild nightmare with a fluid sense of time:

Images accumulated. A woman in filthy panties limping down a road with a bone knuckling out of her arm. A mass of faceless people huddled around a fire. The gray vinyl of a bus seat, scored like aged skin, and the strange flat brown landscape passing dreamily by the window.

Filthy panties. Bone. People. Fire. Gray vinyl. Aged skin. Flat brown landscape.
These juxtapositions of the excruciatingly human with classically descriptive words for nature, which seem so new, are made possible in a literary landscape that is realizing how incontestable it is that nature is inseparable from the human and the cultural. Thus far, literary theory has found this middle difficult to attain, especially for the work of the original “nature writers.” Because postmodernists tend to dismiss materiality entirely, the ecocritics bristle at dismissing it even slightly: Ceding any ground at all would be to dismiss the aesthetic and, crucially, ecological worth of the Romantics’ work. Take, for instance, Coleridge in “France: An Ode”:

O ye loud Waves! and O ye Forests high!
And O ye Clouds that far above me soared!
Thou rising Sun! thou blue rejoicing Sky!
Yea, everything that is and will be free!

The ecocritic Karl Kroeber notes that “Coleridge can imagine the sky as joyous because he feels that freedom of individual being is to participate fulfillingly in a dynamic unity of forces greater than himself but to which he can satisfyingly belong.” Granted, Coleridge’s invocations of forms of unity emphasize an interconnectedness with nature that can be termed “proto-ecological” because they emphasize both the aesthetic power and beauty of nature and the practical and social duties of man to the natural world. Further, it would be hard to argue that this view of nature does not represent some actual thing—the sky is, basically, blue; the forests, often, very high. But simultaneously, the ecocritics decry the commercialization of “nature” on the grounds that human beings only leave alone those natures that they do not value. Could it not be, then, that the Romantics’ views led us here directly by romanticizing the pastoral and the pristine and the wild—by representing the nature that deserved to be valued?
After all, for every complex representation of the environment by writers like Groff, Machado, and those who grace the pages of Flyway, Ecotone, and Orion, there are non-literary works that play right into problematic assumptions about nature. The Pulitzer Prize-winning work of science journalism The Sixth Extinction by Elizabeth Kolbert, for instance, has been heralded as a major work charting the loss of species. At the same time, however, it has been criticized by environmental scholars for its focus on some species and not others, for its unquestioning assumption of “species” as the unit of analysis, and for assuming that some Platonic form of “nature” existed before industrial humans began destroying it. And so even as the postmodernists have lost ground, problems remain. While ecocritics take their cue from environmental scholars about the need to examine environmental and natural themes in their work, the idea that “nature” itself might be a construct—many, many different constructs, in fact—remains largely unquestioned. It’s a reactionary impulse. As the literary critic Dana Phillips has argued, even as the ecocritics bring back the idea that there is something material, biological, and empirical about the world (i.e., “nature” is not entirely a cultural construct), what that “something” is remains to be settled—not in ecology or humanistic inquiry, and definitely not in Romantic literature.
For compare Coleridge to Percy Bysshe Shelley, who in the third stanza of “Mont Blanc” personifies the mountain itself in ugliness (“rude, bare, and high”) and bleak destruction (“Ghastly, and scarr’d, and riven”):

Far, far above, piercing the infinite sky,
Mont Blanc appears – still, snowy, and serene;
Its subject mountains their unearthly forms
Pile around it, ice and rock; broad vales between
Of frozen floods, unfathomable deeps,
Blue as the overhanging heaven, that spread
And wind among the accumulated steeps;
A desert peopled by the storms alone,
Save when the eagle brings some hunter’s bone,
And the wolf tracks her there – how hideously
Its shapes are heap’d around! rude, bare, and high,
Ghastly, and scarr’d, and riven. Is this the scene
Where the old Earthquake-daemon taught her young
Ruin? Were these their toys? or did a sea
Of fire envelop once this silent snow?
None can reply – all seems eternal now.

What Shelley did was strip away some of the sentimentality of nature writing. “Mont Blanc” is, after all, an expression of Shelley’s atheistic beliefs and his political reformist idea that without human imagination, all those silences would be vacuous (“Mont Blanc” is famously considered a rebuke to Wordsworth and Coleridge). Whatever “Mont Blanc” is for the Romantics, it’s clearly not just a well-described mountain.

3.

Of the Romantic works I’ve encountered, none poses as direct a challenge to the generalizability of the Romantic view of nature as Mary Shelley’s Frankenstein. The fable has been seen, variously, as anti-modern, as a cautionary tale about science and technology that echoes contemporary fears, as a nightmare about “nature” gone wild, and as a plea for stewardship: that humans must care for nature so it does not go awry.
To see how different Mary Shelley was from her contemporaries, consider Wordsworth’s “Lines Written in Early Spring,” which begins with Wordsworth glorifying Nature and decrying the state of Man:

To her fair works did Nature link
The human soul that through me ran;
And much it grieved my heart to think
What man has made of man.

Despite Wordsworth’s “faith that every flower / Enjoys the air it breathes,” there is also doubt:

The birds around me hopped and played,
Their thoughts I cannot measure:—
But the least motion which they made
It seemed a thrill of pleasure.

The doubt, of course, is less of a service to the representation of nature (“If such be Nature’s holy plan”) than to Wordsworth’s lament of “What man has made of man.” Even as Wordsworth trucks in pleasure and invokes doubt and uncertainty, his representation of nature is relatively benign. Autonomy is granted to “nature,” but it is a gentle and soothing sort of autonomy. It stands in contrast to Wordsworth’s helplessness about the state of man. Needless to say, this is fundamentally different from the autonomy of nature presented in Frankenstein. Consider the famous passage where Victor beholds his making:

For this I had deprived myself of rest and health. I had desired it with an ardour that far exceeded moderation; but now that I had finished, the beauty of the dream vanished, and breathless horror and disgust filled my heart.

This grants Victor a terrifying hyper-autonomy. Where Shelley’s Frankenstein departs from Wordsworth is in the hyper-autonomy both of man and of nature when man is hubristic enough to wish to dominate it, which is why Frankenstein is so often thought of as the anxious industrial precursor to living in the age of anthropogenic climate change. Indeed, more than one literary critic has seen the current geological epoch of the Anthropocene as the modern-day Monster from Frankenstein.
There are problems with reading even Frankenstein as the “proto-ecological consciousness” of a Romantic writer. The most serious is that it collapses “nature,” “science,” and “technology” as if they were all part of the same whole. There is considerable ambivalence in Frankenstein about this. It is, after all, the Monster who regards “nature” in a fashion similar to many of the Romantics:

Autumn passed thus. I saw, with surprise and grief, the leaves decay and fall, and nature again assuming the barren and bleak appearance it had worn when I first beheld the woods and the lovely moon. Yet I did not heed the bleakness of the weather; I was better fitted by my conformation for the endurance of cold than heat. But my chief delights were the sight of the flowers, the birds, and all the gay apparel of summer.

The other obvious problem with the “proto-ecological” Frankenstein is that it blurs too many lines. Not only does it transpose an eloquent man-beast who resents his birth, his maker, his countenance, and society—all qualities and emotions that many humans express and are known to have—onto the “nature” that faces us in the Anthropocene; it also casts Mary Shelley as the prescient seer of the Romantic movement, undercutting the prescience of other skeptics with less forceful work. If Mary Shelley is the Romantic double of Lauren Groff and Carmen Maria Machado, it goes without saying that William Wordsworth has his, too. In an essay in n+1, “Thinking Like a Mountain,” environmental historian Jedediah Purdy skewers the anthropocentric conceits of contemporary works of nature writing, works that bear an uncomfortable similarity to “Mont Blanc”:

For writers, this strange world — tamed to death, feral as a wild hog — has inspired a fascination with nonhuman action, agency, and consciousness.
This is true in high academic culture, where literary scholars wax lyrical on the agency of storms and trees, political economists propose that capitalism be seen as both an ecological and a social form, and social theorists outline ethnographies and alliances across species. But as usual the academic trends are just the owl pellets of Minerva. Stronger evidence of a mood is the ambitious, often excellent, sometimes ridiculous writing, from essays and memoirs to popular science, that asks obsessively: What is looking back at us through other species’ eyes? Could we ever escape our own heads and know the viewpoint of a hawk? Is there such a thing as thinking like a mountain?

Like me, Purdy also finds it ridiculous that this is all still called “nature writing” in an age when no one knows what “nature” is. But his broader point is key: whatever this genre is, it has made a comeback, just as more complicated works of un-nature sit beside it on the shelves. Tsing’s work has its doubles, and so does the ecocritic’s. It’s like the ecocritic sitting next to the new historicist: the battle lines are real but also bewildering. They probably tell us more about ourselves than about “nature,” but they may also be very captivating. Or, if you prefer: distracting. After all, as Purdy points out, it remains both “baffling and beautiful” that Thoreau once asked of his pond: “Walden, is it you?”
1. No novel has entranced me this year like the French author Mathias Énard’s Compass, short-listed for the 2017 Man Booker International Prize. Énard, a writer with tremendous empathy for his characters, both as individuals and as contextualized individuals embedded within contemporary geopolitical conflicts—the book is dedicated on the last page “to the Syrian people”—writes what ostensibly seems a didactic treatise on the world of orientalist academics. The protagonist, Franz Ritter, is a musicologist whose dreamscape and memories over the course of one sleepless night populate the entirety of the text while taking us through both Eastern and Western lands: from Vienna, where Franz lies on a sickbed in the present, to Aleppo, Tehran, Damascus, Paris, and Istanbul, to which Énard pays special attention as the historic "conduit" between Europe and Asia. As Franz dreams restlessly about the woman he loves—another orientalist scholar, Sarah, a historian whose polyglot prodigiousness on all things worldly and otherworldly pays homage to all forms of scholarship—Compass emerges as a technical and scholarly feat, a love letter to the "Orient," and a rebuke to the fiction of its otherness. In amusingly familiar academic segues we can see, through Franz, what Sarah might write about: a fanciful article entitled “On the Cosmopolitan Fates of Magical Objects,” which Franz imagines (probably accurately) as the title of an article Sarah would write to show “how these objects are the result of successive shared efforts…that Orient and Occident never appear separately, that they are always intermingled, present in each other, and that these words—Orient, Occident—have no more heuristic value than the unreachable destinations they designate.” Énard’s brilliance is as self-evident as it is comical: Where else but in the idiosyncratic exchanges of academics could we ruminate on such grand ideas through the study of genie lamps and flying carpets?
Through Franz’s one-night journey through memory, we meet quirky Egyptologists, composers, writers, archaeologists, philosophers, even charlatans, many of whose stories, whether they physically featured in Franz’s life or not, peter out in heartbreaking fits and starts. Franz and Sarah’s own story is, predictably, no less sad. I have been in awe of Énard’s gifts since Street of Thieves, where I marveled at the empathy with which he treated his Moroccan protagonist, Lakhdar, a young man who travels from Tangier to Tarifa and, finally, Barcelona, haunted by an Islamist bombing he had minor involvement in and by his excommunication from his family, but assuaged by his love for literature and art: Ibn Battuta and Naguib Mahfouz, the familiar beauties of Tangier and the exotic newness of Barcelona. In Compass, Énard ostensibly faces less of a challenge writing a protagonist with whom he shares at least some cultural sensibilities (although obsessed as Franz is with the appropriation of Oriental music by European composers from Franz Liszt to Hector Berlioz to Ludwig van Beethoven, all of whom get several fascinating pages of description, we shouldn’t minimize the author’s feat: to my knowledge Énard is not an ethnomusicologist), even as the ghost of Edward Said hangs insistently over the orientalist scholars’ cerebral quibbling. Books like these give me an unwavering hope in the human capacity to reach out to an unknown self and try, with meticulous research, observation, and erudition, but principally with empathy, to understand a self distinct from one’s own. When I first began to read Compass, I had just begun writing another short story of my own: the first that didn’t include subcontinental Muslim characters.
I struggled with the sweep and ambition of the story I wanted to write—one that would have to pass through many generations of an interracial family and plumb the effects of environmental disaster (the Dust Bowl, for instance) to demonstrate the ephemeral nature of intergenerational memory. I settled on a four-monologue, play-like structure for the story: one monologue for each generation. I spent months reading first-hand accounts, history texts, and longform stories about the impacts and memories of natural disasters. I used my historiographical research in environmental history to think about the people in books as people I could try to know. I read books that described catastrophes, starting with John Steinbeck’s The Grapes of Wrath, which I remembered as a ruthless story of tenant farmers trapped in economic hardship and poverty as the Dust Bowl reared its ugly head, as crops failed and harsh drought swept over the prairie. When I finally had a draft I could consider complete, I gave it to my first reader—my most generous reader. She returned it with the terse comment that I should “write what I know.” What had I done wrong? Had I failed in my research? The details were all correct; I was confident about that. Had I failed to do justice to the two white characters from whose perspective I wrote the first two monologues? Or to the two mixed-race black characters in the last two? Had I failed to empathize? I went back to the drawing board, trying to convince myself to jettison the story entirely. But the logic of writing solely what I knew was unconvincing. How can I reconcile myself to writing stories about people solely from my cultural background when the stories I want to write have a different sweep, a distinct subject matter that requires me to understand characters outside of my lived experience? That is what I have always seen as the point of literature: its capacity for universality.

2. As it turns out, this isn’t unfamiliar ground for writers today.
Rachel Cusk, recently profiled in The New Yorker by Judith Thurman, had her first book published at 26. She now deems her early work inferior. Thurman takes Cusk’s disillusionment as a reflexive turn away from the earliest iterations of herself, because she managed “to upend the plot of her own life—to break up her family, then to lose her house and her bearings.” Cusk is now married for the third time; about her book Aftermath, a painterly if perplexing memoir about the dissolution of Cusk’s marriage to Adrian Clarke, Thurman argues that Clarke “haunts the text like a ghost.” Thurman wonders: Why doesn’t Aftermath explain why the marriage dissolved? “This was partly for the children’s sake,” Cusk says. But Aftermath met with some cruel reviews, after which Cusk seemed to change course. She says of her trajectory: “There seems to be some problem about my identity. But no one can find it, because it’s not there—I have lost all interest in having a self. Being a person has always meant getting blamed for it.” Profiling writers of fiction, mining their lives for clues to explain the eccentricities, artfulness, or perhaps even artifice inside the works themselves—not just thematically but as a direct analog for a protagonist or an entire plot—has become a bit of a trope. Ever since Lena Dunham burst onto the scene, the writer’s use of autobiography as the principal quarry from which to mine stories (what is essentially primary research for the literary critic) has become increasingly ubiquitous. But of course, you don’t need a degree in literary criticism to know that the tradition is far older than Dunham. One could argue it is steeped in the pursuit of the Great American Novel itself: in the specificities of Philip Roth’s Newark Jewish oeuvre, in Norman Mailer’s racially charged machismo, or, as literary critics rigorously argue, in any work of fiction anywhere and at any time.
But a certain timbre of particularity, coincident with the rise of the personal essay, has most certainly become more central and self-aware in literature of late: specific questions about which characters represent the author and whether plots actually occurred in the author’s real life pop up in interviews where they were once considered gauche to ask of a novelist. A recent interview between writers Chelsea Martin and Juliet Escoria finds them talking about “self-serving writing,” work inspired by autobiography, as if it represented the pinnacle of truth-telling. Escoria talks intimately about her book Juliet the Maniac, contending that she doesn’t really “understand the difference between writing fiction and writing nonfiction.” There’s more than a whiff of writers being far too hard on themselves. The problem is that contemporary literary trends motivate young writers to believe that their own personal histories are the only histories they can plumb with any believable depth: a belief that visibly flails when confronted with the Enlightenment origins of humanistic “imaginative” capacities, which can be traced back at least as far as Denis Diderot. As Jean le Rond d’Alembert demonstrates in his Preliminary Discourse to the Encyclopedia of Diderot, for Diderot painting, sculpture, and architecture stood at the head of the knowledge known as “Imitation,” but it was poetry and music that demonstrated imagination: “by the warmth, the movement, and the life it is capable of giving, it seems rather to create than to portray them.” This creation was rarely conceived merely as reproduction, nor has it been for a very long time. After all, with writers like Leila Guerriero and Joan Didion, as Daniela Serrano so powerfully writes, the compulsion is reversed: it is not looking at yourself that is the most uncomfortable, but at other people.
There can be no doubt, however, that "identity"—with all the limitations and deliverances the word connotes—has become so powerful in popular culture that the imaginative arts, across different mediums, have found themselves in a bit of a bind. Dunham, when criticized about the whiteness of Girls, claimed that she wanted “to avoid rendering an experience I can't speak to accurately.” In her recent remake of Don Siegel’s Civil War-set movie The Beguiled, Sofia Coppola shifted the perspective from that of the male interloper to the women in the cloistered Miss Martha Farnsworth Seminary for Young Ladies, but crucially she also excised the role of a slave character—one present both in Thomas Cullinan’s original book, which served as the source material for both films, and in the first film, where she was played by Mae Mercer. Coppola received her share of outrage for "whitewashing," an accusation she deflected the way Dunham did: by essentially arguing that she didn’t wish to take an important subject lightly the way the original source material had, focusing instead on what she knew best. But if the dogged discoverers of Elena Ferrante’s true identity are to be believed, Ferrante didn’t know much about the poverty of Lila and Elena’s Neapolitan upbringing either. Has lived experience supplanted all other forms of knowledge as the sole true source of authenticity? As an avid Ferrante fan, I take umbrage at such a reading: I couldn’t care less about her true identity—and if she hasn’t truly lived it, then the Neapolitan novels merely display a capacity for virtuosic observation and insight.

3. But if the contemporary moment in fiction is truly an impasse, then it is a problem we must contend with. Arguably we are already contending with it, although perhaps with less success than one would hope.
Lionel Shriver told an audience at a writers’ festival last year: “Taken to their logical conclusion, ideologies recently come into vogue challenge our right to write fiction at all. Meanwhile, the kind of fiction we are ‘allowed’ to write is in danger of becoming so hedged, so circumscribed, so tippy-toe, that we’d indeed be better off not writing the anodyne drivel to begin with.” As Sarah Schulman reported, Viet Thanh Nguyen responded in the L.A. Times: “It is possible to write about others not like oneself, if one understands that this is not simply an act of culture and free speech, but one that is enmeshed in a complicated, painful history of ownership and division.” Nguyen makes a compelling point: we can use this schism to our advantage, but only if we understand the baggage that attends literary, cultural, and political history. Personally, I found Coppola’s version of The Beguiled captivating—with a particularly heartbreaking performance by Kirsten Dunst, one with a depth almost entirely missing from the earlier incarnation—just as I find much to admire in Dunham’s writing on Girls. But both come saddled with a crucial lack of ambition and not, as their creators had ostensibly hoped, racial sensitivity. Wouldn’t The Beguiled be all the more interesting if Coppola had extended her nuanced portrayals to a black female character? If it weren’t so illustrative of the loaded identitarian schism at the heart of leftist politics, it would make for the perfect right-wing conspiracy: not only have well-meaning liberals become too PC, they are now roundly dismissed as blinkered by the same folks whose ire they hoped to deflect in the first place. It goes without saying that the problem doesn’t operate solely at the level of the artist herself. Somehow the gambit has been working, arguably with a deep historical legacy, to widen gaps between artists and audiences, with publishers eager to pander to particular readers depending on the artist.
It is by now a cliché that many novels written by women are designed to look like romance novels. In a 2015 interview, speaking about the covers of her books being targeted specifically to female audiences, Margaret Atwood mentioned that “there were probably some quite disappointed readers.” Atwood’s interviewer, Jessica Stites, responded that she couldn’t get her friends to start reading the Neapolitan novels because the first book has a wedding dress on its cover. Meanwhile, author Nnedi Okorafor wrote a book with a female Muslim protagonist, only for her publisher to suggest a cover with a white female figure on it. One wonders: How could publishers be failing so badly to adapt? Surely this is not what Nguyen had in mind. Indeed, if writers are to be brave they must truly go there and, like any writer with any story, do meticulous research. But that may not be enough: one hopes writers also have the capacity to publish in a world less maladapted to receiving their work. How did we find ourselves here in the first place? Surely writers never decided in closed-door meetings that the social-scientific and humanistic academic emphasis on Culture with a capital C would bleed into fiction to such a degree that writers would begin to parse identities into little parcels, keeping only those they could hold ground on, seeing the act of storytelling itself as one circumscribed by belonging to an identitarian category. Far more likely is that for writers this is a passive process, one driven by our politics (and/or publishers), by reading the expectations of audiences, or by anticipating outrage, fears, and concerns that are exacerbated by the near-monopoly of white authors in fiction. Surely writers writ large know there is something reductive about using our own lives not only as the canvases for our art but as art itself. The argument, or perhaps merely a passive trend riding on a form of herd mentality, seems to dictate that the craft itself has become one’s calling card.
Which is to say: not only has the liminal space between identity and individuality been overcome, but storytelling has crashed right through its center, obviating the need for anything else. Why should a story need anything more than an identity? Why shouldn’t Kumail Nanjiani plumb the comedic depths of his own lived trajectory the same way Lena Dunham, Aziz Ansari, Louis C.K., and countless others do? There can be no prescriptive answer to this question that is not simultaneously political. But I suspect there comes a point when the regurgitated version of one person’s life, especially when that person belongs to a minority group, begins to feel tired: a genre as trope; Oriental fiction with veils on the covers. The ruse being played here is that there is no longer any sense of a story without an identity, preserved through the complex Venn diagrams one inhabits (or fails to); no universality, no totality: merely a small set of interlocking bricks that hold together the walls of our perception of the world. A place where Plato’s Cave is now color-coded, numbered, and charted—hierarchies everywhere, opportunities only to move up or down or sideways like chess pieces. And now that the Cave is so stratified, why feel the need to leave it and see the world as it is? How can one tell a story, any story, about any form of universal phenomenon if the response one instinctively pre-empts is: How could you know anything about that? This should have been written by a white gender-nonconforming person who grew up without money or the awareness of privilege but nonetheless took advantage of it and grew to believe in less humane economic precepts than she/he/they would have had they not been white. It betrays an inherent paralysis, not too different from the paralysis Amitav Ghosh describes in The Great Derangement: Climate Change and the Unthinkable: that of storytelling failing to grapple with climate change.
Climate is not identity or racial politics, however, regardless of how closely their consequences are intertwined. But that paralysis, I suspect, prevents talented artists like Sofia Coppola from stretching the bounds of their own ambitions; more dangerously than for the minority writer, it becomes a convenient alibi for the white artist’s conception of believability. But as with most things, it is a double-edged sword. How can we disregard the critique of the white writer who considers himself (often himself) objective enough to take on any character, a critique that has only become more prominent because marginalized writers have pushed it up in discourse after decades of unrewarded work? Today, at least, it is acknowledged in some circles not only that minority writers deserve a pulpit, but that storytelling in turn requires minority writers (although this is certainly not a standard held up nearly enough). Still, it requires a peculiar moment in contemporary culture when certain white male writers can comically (and of course also infuriatingly) decry that their jobs are harder as white men than if they were minorities. In that way, storytelling, as with most things, bears a truly striking institutional likeness—to the extent that the enterprise of writing and publishing is an institution—to our current politics. Regardless, the argument of constriction applies to minority writers too: identitarian thought has bled into the wholesome creed of “write what you know.” We have erected walls for ourselves that are both comforting, in the way that occupying a niche comforts a writer, and claustrophobic, in the sense of wells running dry, of new writers providing old stories that are tired reflections of the works of older writers.
Nowhere in my experience is this more true than in fiction from my native South Asia, where the timbre of even the most lauded works by Arundhati Roy, Mohsin Hamid, and Kamila Shamsie has acquired a quality of permanence most subcontinental writers cannot help but emulate in the sprint for awards success. Interestingly, the most incisive critics of Roy have recently pointed out the utter lack of tonal difference between her abundant nonfiction and The Ministry of Utmost Happiness: where nonfiction can afford to proselytize, fiction ceases to breathe when crafted in the same mold. Again, however, publishers erect roadblocks in the name of pandering to certain audiences. From a Pakistani perspective, it is all too easy for me to envision publishers who expect me to deal with Islamophobia or terrorism on some level in all my writing, even if apropos of nothing. And thus: in reifying the fictions of identity (the baseline fact most left-leaning writers can agree on), we have elevated almighty Culture, enforced monopolies of singular identities, and mashed them all up. No longer can storytelling be ambitious in the fashion of Doris Lessing (who admittedly dabbled in both very autobiographical and very non-autobiographical work, the height of the latter reached in her sublime Canopus in Argos space fiction). Instead, every story would serve itself best as another iteration of your own personal diagram, chipping away at your own identity slowly, painstakingly, even dully over the decades like Philip Roth, but surely not like Mathias Énard: there would be no imagination, only personal research. No external perception, only introspection.

4. With this conversation raging in my head as a writer of color, it’s fascinating sometimes to dissect my own responses to my work. Had my first reader got it right—was she letting me off the hook by telling me to write what I knew because the story didn’t hold up to the literary standards she knew I aspired to? Very possibly.
I didn’t let the story go, however. I doubled down and worked even harder at it. But even more intriguing to me than the cases where I double down are those where I have chosen to let go. When my first work of fiction was published, at The Rumpus, my editor told me that the website had commissioned an artist to illustrate my story. I couldn’t wait, both for the story and for the art it would sit alongside. When the story was published, I was astonished. The style of the art was sparse and completely appropriate to the story: three drawings in all. But curiously, the second illustration, inspired by a pivotal scene where my male Pakistani protagonist has a brief exchange with a friend’s grandmother, looked suspiciously Western. There was a reference to chai in the text, but scant other details. I remember instinctively thinking: there’s no way the grandmother would look like that. A Pakistani grandmother would be wearing a loose dupatta along with a shalwar kameez—a long tunic and loose trousers. I thought about it for a long time. Ultimately, I decided that there was something about that drawing that captured other specificities—the posture of the grandmother, her spirit—that moved me. I concluded that it was great as it was. The artist had read my story and decided to interpret it the best way she could, and despite the initial skepticism it aroused in me, I liked the idea of an artist reading my work as something transcendent, something neither here nor there but everywhere: maybe even something universal. The day after the story came out, I contacted the artist; one of her works now hangs on my bedroom wall, a reminder both of my resistance and release and of the artist’s intended or unintended attempt to universalize my work. I don’t wish to ask which it was. Why should I? No matter how much specificity we try to achieve, we will always fall short.
After all, as the (white, male) writer Mark Greif tells us, “your life has to be your own: no one else can live it for you, as you can’t enter anyone else’s life to know how it feels.”

Image Credit: Public Domain Pictures.