Cai Emmons on Women’s Rage


When my mother and I used to ride the New York City subway together, she would look at the men sitting across from us with their legs splayed and launch into a rant about the entitlement that allowed them to take up so much space. Couldn’t they squeeze their knees together as we women did to create room for another person to sit? She had resented male privilege all her life; I suspect she even resented my son as a baby, before he became a person she loved.

I’ve summoned these memories because of the imminent publication of my new novel Livid, which explores female anger—in particular the kind of low-key anger my mother harbored, which was always simmering under the surface of a friendly, extroverted, life-loving woman. Sybil, the protagonist of Livid, is assigned to sit on a jury with her ex-husband to decide the fate of a woman who is accused of killing her own husband. Sitting in the courtroom day after day, Sybil has plenty of time to ruminate about her rage, the defendant’s rage, and how this binds them as women.

Though on the surface I am as cheerful as my mother was, I have been riding a rollercoaster of anger all my life, beginning as a teenager when I attended a predominantly male college and watched every weekend as women were imported as sex toys for my schoolmates. Outrage propelled me to work for an abortion referral service pre-Roe, where we helped desperate women of all ages, from all over the country, get abortions in states where they were legal. And in my thirties, I felt indignation as a filmmaker, when cinching a job at the helm of a feature film was, for women, like threading a camel through the needle’s eye. Finally, rage dogged me into the worlds of academia and publishing, where in both arenas lip service is paid to gender parity, but men nevertheless receive higher pay, hold most of the top jobs, get more regular promotions, and receive more recognition in the form of publications and prizes.

My mission is to explore what it means for women to live on such a rollercoaster of anger. My mother and I are not alone. Most women I know have been angry in similar ways, suppressing the feeling for stretches of time until it explodes again with the right catalyst—when we lose out on a job to a less-qualified male candidate; when our husband neglects his share of childcare duties; when we are accosted on the street or groped on the train; when a sniveling liar is appointed to the Supreme Court. We may explode for a while, but then we usually suck it up and move on into another day, another week, month, year, seeing the same patterns repeat but wondering if we have the requisite energy to make a fuss. These periods of silence take a toll: we might turn our unexpressed rage inward, where it congeals into depression, health problems, addiction. Anger moves in our bodies in mysterious ways, and attempts to address it can feel like playing whack-a-mole; we are constantly knocking it down only to see it rise again.

Of course, none of this is new; women have resisted male predominance for centuries. In England, Mary Wollstonecraft wrote A Vindication of the Rights of Woman in 1792, and well before that, in early fifteenth-century France, Christine de Pisan wrote The Book of the City of Ladies. Earlier still was the barricading of the Roman Forum, when women protested a law that forbade them from using expensive goods. And one can’t forget the Aristophanes play Lysistrata, written around 411 BCE, in which women protested men waging war by refusing to have sex.

So what can be done with our anger now? Experts tell us to meditate, breathe deeply, or exercise as ways to manage our rage. When I was a kid, my mother bought me a weighted blow-up Bobo, a clown that I was supposed to punch during my temper tantrums. But Bobo only infuriated me more by popping back up. Such solutions might calm us or distract us from our anger, but they don’t address the source. Is there some way to really alter things so we aren’t doomed to push the proverbial rock up the hill again and again?

From this vantage point, I see a future in which women will have to continue to assert our equality and rights over and over again. Seeing Roe appear and disappear during my lifetime has shown me that. Rights have a tendency to disappear right at the moment we assume they are permanent. We need to keep our anger alive. To speak up, engage in conversations, write, protest. To give voice to anger, without apology. Most of all, I think we need to learn not to be afraid of our anger but to harness it as a force that binds us.

The Industrial Visions of Precisionist Artists


In the early years of the twentieth century, an eight-year-old girl named Elsie Driggs was traveling by train with her parents from Sharon, Pennsylvania, to New York. She had dozed off by the time the train reached Pittsburgh, but as the writer John Loughery would recount years later in Woman’s Art Journal, her sleep was interrupted: “She was awakened by her father to witness the drama of the black night-sky over Pittsburgh ablaze with soaring flames from the steel plants. It was a memorable sight.”

So memorable that 20 years later, with her artistic career beginning to flourish, Driggs returned to Pittsburgh hoping to recapture the scene in paint. But the fiery Bessemer steel-making process had been abandoned by then, so there were no longer any flames spurting into the night sky. Worse, the local mill’s managers insisted that a steel mill was no place for a young lady—and they were suspicious that she was a union organizer or industrial spy. But Driggs did not give up. “Walking up Squirrel Hill to my boarding house one night, I found my view,” she told Loughery. “It was such a steep hill. You looked right down on the Jones and Laughlin mills. You were right there. The forms were so close. And I stared at it and told myself, ‘This shouldn’t be beautiful. But it is.’ And it was all I had. So I drew it.”

And then she made a painting from her sketches. And then, nearly a century later, as I viewed an exhibition of the Whitney Museum’s permanent collection, my eye was drawn to a smallish black-and-white painting—just 34 by 40 inches—that from a distance appeared to be abstract. A miniature Franz Kline? As I got closer I realized it was a stark depiction of cylindrical industrial smokestacks bound by wires at the top left and a gush of smoke at the bottom right. The smokestacks are black and gray, the only color coming from a hint of sulfur in the pale sky. And that’s it: smokestacks, wires, smoke, sky. No flames, no human beings. How did this unremarkable image manage to be so otherworldly in its beauty?

Elsie Driggs, Pittsburgh, 1927. Oil on canvas, 34 1/4 × 40 1/4 in. (87 × 102.2 cm). Whitney Museum of American Art, New York; gift of Gertrude Vanderbilt Whitney 31.177

The card on the wall told me that the picture was called “Pittsburgh 1927” and that it was painted by Elsie Driggs. She soon followed it with an equally stark painting of silos and ducts and smokestacks called “Blast Furnace,” and then a monumental, faceted painting called “The Queensborough Bridge.” These paintings gained Driggs entry into a group dubbed the Precisionists, an informal movement of mostly young artists who in the 1920s were drawn to America’s emerging industrial landscape of factories and skyscrapers and bridges, which they rendered with both a geometric precision that echoed Cubism and a sparseness that sometimes bordered on abstraction, typified by “Pittsburgh 1927.”

A Visual Gold Mine

There were other Precisionists on the walls of the Whitney the day I discovered Elsie Driggs, most notably Charles Sheeler and Charles Demuth. While Driggs was in Pittsburgh, Sheeler was in Detroit on a commission to photograph Henry Ford’s sprawling River Rouge complex, the workplace of 75,000 people, then the largest factory in the world. Sheeler’s assignment was part of the corporation’s publicity campaign for the Model A, which was about to replace Ford’s obsolete Model T. Sheeler spent six weeks roaming the complex with his camera, producing 32 prints that the company used for publicity and that are now regarded as high art. Possibly his most memorable image is “Criss-Crossed Conveyors, River Rouge Plant, Ford Motor Company,” a picture of two conveyors making an X over a tangled netherworld of fences and buildings and ironwork, all of it topped by eight slender smokestacks that reach into the heavens. (A year later, Driggs did a pencil drawing of the same scene.)

Charles Sheeler, River Rouge Plant, 1932. Oil and pencil on canvas, 20 7/16 × 24 3/8 in. (51.9 × 61.9 cm). Whitney Museum of American Art, New York; purchase 32.43

Only one of Sheeler’s works from the Rouge was in the Whitney show—a painting made five years later called simply “River Rouge Plant,” an oddly serene depiction of the factory’s coal processing and storage facility. The exterior walls of the buildings are creamy or tan, the foreground waters of a boat slip are glassy and calm, the sky is blue. There are no workers, no smoke or sparks or grease or slag heaps. The only hint of the repetitive, soul-crushing work that was done there is the nose of a freighter visible on the right. Part of Henry Ford’s genius was to make everything he needed to produce cars—what’s known as vertical integration, the cutting out of all middlemen—and so to make steel he had coal brought up by train from Appalachia, while his fleet of freighters brought iron ore down from Duluth, Minnesota. You would not know this by looking at Sheeler’s stately photographs or serene paintings, nor would you know that Ford was a union-busting anti-Semite who ruthlessly policed the morals of his captive work force. Sheeler was not concerned with such unpleasant facts. For him, all that mattered was that the Rouge was a visual gold mine. “The subject matter,” he wrote to his friend Walter Arensberg, “is undeniably the most thrilling I have ever worked with.” American factories, he added, were “our substitute for the religious experience.”

Hanging on a wall near “River Rouge Plant” was Charles Demuth’s painting “My Egypt” from 1927, a depiction of a grain elevator in his hometown of Lancaster, Pennsylvania. The gray structure, topped by exhaust ducts and flanked by a smokestack, is seen from a low angle, giving it a monumental, nearly monstrous presence. Diagonal lines hint at stained glass—and at the notion, shared at the time by Sheeler and others, that industrial structures were the cathedrals of the machine age.

Charles Demuth, My Egypt, 1927. Oil, fabricated chalk, and graphite pencil on composition board, 35 15/16 × 30 in. (91.3 × 76.2 cm). Whitney Museum of American Art, New York; purchase with funds from Gertrude Vanderbilt Whitney 31.172

Only later did I learn that Demuth had painted a picture called “Incense of a New Church” in 1921, which made the link between industry and religion explicit. In the foreground, ropes of smoke—from incense or machines?—coil through a dark world of smokestacks and machinery. Though there are no people visible in either painting, the title “My Egypt” hints at the slave labor that built the pyramids in ancient Egypt and, by extension, at the dehumanizing pressures of the machine age.

But it’s a veiled hint at best. By depicting industrial architecture and machinery but not the humans who built and operated it, the Precisionists left themselves open to the charge that they were glorifying the machine while minimizing or simply ignoring its human costs.

An Ethereal Experience

While Driggs was in Pittsburgh, Sheeler in Detroit, and Demuth in Pennsylvania, Margaret Bourke-White was in Cleveland photographing the Otis Steel Works. Like Driggs, Bourke-White was drawn to the steel mill by a memory from her girlhood, in this case a visit to a New Jersey foundry with her father. Her recollection of the episode was quoted in the catalog for a 2018 exhibit called “Cult of the Machine” at the de Young Museum in San Francisco: “I remember climbing with him to a sooty balcony and looking down into the mysterious depths below. ‘Wait,’ father said, and then in a rush the blackness was broken by a sudden magic of flowing metal and flying sparks. I can hardly describe my joy…. Later when I became a photographer…this memory was so vivid and so alive that it shaped the whole course of my career.”

Unlike Driggs, Bourke-White gained access to the interior of the Cleveland mill, and after five months she had produced hundreds of prints but only a dozen that satisfied her. One of them, the 1928 print “Otis Steel Works,” was included in “Cult of the Machine.” The picture is an exterior, two railroad tracks running from the bottom left of the frame into the middle distance under an iron bridge. On the left are the brooding vertical forms of the mill, including two stacks that send smudges of smoke into the gray sky; in the middle of it all is a ghostly puff of steam. Only after close looking do you realize that there are two human forms in the distance, under the bridge, reduced to the scale of ants by all this monstrous machinery. In the show’s catalog, the curator Lauren Palmor writes:

Between the steel, iron, and smoke, it is easy to overlook the two minuscule workers in the distance, standing near the tracks like specters. Their diminutive portrayal is consistent with Bourke-White’s early tendency to focus on machines rather than their operators. Even when she includes workers in her images of the mill’s interior, as in this work, they are visually subsumed by machinery, as though they are merely more of its moving parts.

What was behind the Precisionists’ tendency to focus on machines rather than their operators, on the mechanical rather than the human? Was it a way of condemning the pulverizing power of the industrial age? Or was it a way of glorifying these monuments to human ingenuity and will? Or could it be that it was not one or the other, but a bit of both? Or neither? 

One possible answer comes from Elsie Driggs. The year she painted “Pittsburgh 1927,” Charles Lindbergh became the first person to fly solo across the Atlantic Ocean. A year later, Driggs experienced her first flight, aboard a Ford Tri-Motor that carried her from Cleveland to Detroit. It was an ethereal experience, judging by the painting she executed that year, “Aeroplane,” which was included in “Cult of the Machine.” It shows a curvaceous, silvery plane floating through the heavens. Black diagonal lines suggest the whirring of propellers—and identify it as the work of a Precisionist. It’s a lovely, loving homage, clearly a way of glorifying this airborne monument to human ingenuity and will.

“Anything Can Be Interesting and Beautiful.”

The Precisionists are long gone but their impulses live on. A contemporary artist who is also drawn to the pictorial possibilities of American industry is the British-born painter Rackstraw Downes, who has spent the past half century producing plein-air paintings of what he calls “disprized” places—lumber yards, sewage treatment plants, exhausted oil fields, the underbellies of highway bridges and elevated train tracks. He finds most of his scenes near his home in New York and in south Texas, but in 1976 he ventured to Pittsburgh as part of a project by the U.S. Department of the Interior to have 45 artists offer their versions of the American landscape. Downes was given the choice of painting Cape Cod or Pittsburgh. It was an easy call. “Cape Cod is a boring old subject that everybody has done,” Downes told me in a recent interview. “It’s like a warm glass of milk. I like a shot of whiskey myself.”

After driving around Pittsburgh for a couple of days, he found his spot: the view from the Clairton-Glassport Bridge over the Monongahela River, with the massive Clairton Coke Works on the right bank. “It was immediately apparent to me that that was the place,” Downes said, echoing Elsie Driggs’s epiphany half a century earlier. “It was the largest coke works in the world. It was sensational with all that smoke going up. It was terribly filthy, and I loved it.” He drew sketches and then completed the painting called “The Coke Works at Clairton, Pa.” that became part of a traveling bicentennial show, “America 1976.” In the foreground, the nose of a barge is visible passing under the bridge while the factory pumps its smoke and toxins into the sky. The barren land on the left bank was, according to Downes, “absolutely dead ground.”

“The Coke Works at Clairton, Pa.” by Rackstraw Downes (1976)

Downes studied literature at Cambridge before he turned to art, and what sets him apart is that he chooses his subjects not only for their visual and sociological properties but also for their potential to produce a narrative. His narratives typically reveal the pulverizing power of the industrial age and, sometimes, the natural world’s way of pushing back. This comes through in his rendering of the Clairton Coke Works in the act of scarring the surrounding river and air and rolling hills. It comes through even more vividly in a 1990 painting called “In the High Island Oil Field, February, After the Passage of a Cold Front.” At first glance, the painting, which is 10 feet long and just 16 inches high, does not seem particularly compelling. It’s a panorama of some wheezing oil rigs and a ditch full of reddish water on a platter of scrub somewhere in south Texas. But if you keep looking, the details begin to multiply—a reminder that Downes paints onsite, without benefit of a camera, sometimes revisiting a spot dozens of times over many months to produce a single picture. And now listen to Downes as he explains what you’re seeing in his 2004 book In Relation to the Whole:

Cows, horses and wading birds share this 1,200-acre field with the pumps, and when strong winds blow in from the north after the passage of a cold front, the sediments that are pumped up with the oil and natural gas and which collect in the bottoms of the ditches are stirred up so the ditch water looks red. The perspective down the center of this painting is the raised embankment of an old railroad bed. The cows like to congregate and lie down to rest on this long-infertile ground because it dries off quickly after a rain; and so they dung it up intensively too. So, it is gradually beginning to regain fertility and support a sparse cover of weeds.… Here the tenses of a landscape imagery which represents what is lost or threatened are reversed; we see decaying industrialization being replaced or reclaimed by the progress of nature. These weeds interest me more than ancient redwoods…

Downes sees beauty and narrative potential even in something as prosaic as razor wire. In 1999, he painted a four-panel series of the coils of razor wire on the fence around a subway maintenance yard in Brooklyn. In a recent phone conversation, Downes tried to explain the visual and sociological elements that drew him to this subject. “If you pay close attention, anything can be interesting and beautiful,” he told me. “I didn’t make the razor wire beautiful. I just painted what was there. Painting it was extremely difficult—the light was always glinting—and it’s vile in its sociological implications. It’s all about keeping people out or keeping people in. It’s nasty stuff.” And yet in his hands it becomes the stuff of high art. 

What Downes and these very different artists share is the ability to find beauty in unpromising places. “These industrial forms,” Bourke-White said of the Cleveland steel mill, “were all the more beautiful because they were not designed to be beautiful.” In other words, beauty is not confined to the picturesque vistas that attracted Albert Bierstadt and Ansel Adams and the members of the Hudson River School. Beauty can be accidental, unintended, counterintuitive—even anti-pastoral. And it takes an attuned sensibility to discern such beauty. 

All of this is not an attempt to trot out the old chestnut that beauty is in the eye of the beholder. Rather, it’s to suggest that beauty is everywhere if you know how to look for it. Which is what artists like Driggs, Sheeler, Demuth, Bourke-White and Downes know how to do. Beyond knowing where—and how—to look for beauty, they also know how to capture it, present it, and make it interesting. This is what elevates them from mere painters and photographers to true artists. I’m reminded of a remark Downes made to me several years ago at the opening of an exhibition of his paintings at the Weatherspoon Art Museum in Greensboro, North Carolina. I asked him about his propensity for choosing subjects that most people think of as anything but beautiful. He replied, “As for beauty, people say to me, ‘Why do you paint such banal subjects? There’s nothing beautiful there.’ It’s not my job to paint something beautiful; it’s my job to make a beautiful painting of something. And that something is something that intrigues me. Sometimes it intrigues me because it’s very modest, very ordinary or very neglected—but I have to have good feelings about it.”

Downes’s remark reminded me of something the renowned art critic Henry McBride wrote in the New York Sun back when “Pittsburgh 1927” was fresh in the world. “Elsie Driggs,” he observed, “is capable of interesting us in anything in which she herself is interested.”

That, to my mind, is the measure of greatness in any artist.

 

On Rejection, Abortion, and What’s Left Behind


“Do you like your job?”

From one working mom to another, it’s always a loaded question. This time it was particularly fraught. The woman asking me—I’ll call her Mailea—was my patient. She was on her back, knees splayed. I was about to perform her abortion. 

“I do like my job,” I said.

“What do you like about it?”

I thought about this for a moment. “I like putting people at ease,” I said, “treating them with respect and keeping them safe, at a time when they might otherwise feel vulnerable and afraid.” 

I touched the inside of her leg, placed two gloved fingers inside her vagina and, with my other hand, pressed on her abdomen. I often talk with my patients during their abortions. Many of them want to be distracted; we’ll talk about what food they’re craving, or what movie they’ll watch with their kids that night. Some, though, go straight for the deeper topics, the harder questions. They seem to be searching for something, some kind of meaning, in the space of our 15-minute visit. I pressed the speculum into her vagina. “You’ll feel some pressure now.” She was quiet. I felt the silence between us like a gulf, or maybe a bridge—I didn’t know which.

“What do you do for work?” I said after a while. I wasn’t sure what I was doing with this question—trying to reengage Mailea, or merely distract her? Or was I trying to satisfy my own curiosity? Something about her intrigued me. She had a calm, self-assured presence, the air of someone who went for what she wanted and got it.

“I’m the editor of a magazine,” she said.

My heart rattled in my chest: an instinct, an alarm. “Oh? What magazine?” I steadied my hand, placing an instrument, aiming a needle, injecting medication around her cervix. She told me about her work for a small, online publication that she and a friend had started, an “arts-as-activism collective” centered on climate change and environmental justice. 

“I love it,” she said. “I can work part time, and from home, which is perfect for me since I had my son two years ago.” 

My heart calmed; my hands steadied. “I’m going to dilate your cervix now,” I told her. “You’ll feel strong cramps that come in waves. Just breathe through each one.”

In between breaths, she kept talking. She explained the mission behind the magazine: how she and her colleagues believed in the power of the written word to change people’s minds and bring them together, to heal not only individuals, but communities, even the planet. She told me that before today’s appointment, she’d written a letter to some of her close friends about her decision to have an abortion, asking for their support and solidarity. “It really helped me to share it with them. It helped me not to feel so alone, you know? To let go of the shame.”

There was that alarm again, a rattling inside my ribcage. Shame. I shook it off. I was almost done with her procedure now. Through a small tube, a gentle vacuum, I pulled the tiny, early pregnancy out of her uterus and deposited it into a dish. I told her I’d be back in a minute, then stepped out of the dark room into the fluorescent hallway.

In the lab I strained the material from Mailea’s uterus through a sieve. A fluff of gestational tissue flashed white against red blood: the pregnancy. I swirled it in a clear dish, watching it drift for a moment, singular and weightless, before I emptied it into its legally mandated destination: a red biomedical waste bin. Later that day, it would be incinerated. She would leave this behind.

*

When I was young, I used to tell everyone I was going to be a writer. I wrote dozens of short stories and sent them off to magazines—The New Yorker, Harper’s, The Atlantic. I was twelve, thirteen years old. No editor ever wrote back.

Once my father gave me a strip from the Peanuts series, in which Snoopy sits atop his doghouse typing furiously: “Gentlemen, Enclosed please find the manuscript of my new novel.” In the next frame, Charlie Brown brings Snoopy a letter in response: “Dear Contributor, Already we hate it.” My dad watched as I read the comic. He laughed and laughed. I smiled up at him, pretending to get the joke.

My dad was an engineer with a deep creative streak. He built most of our furniture in his basement woodshop, sang Christmas carols with abandon, played the piano decently. He was an excellent and passionate cook. Although not an artist himself, he held a deep reverence for what he would call “professional” or “real” artists: those who were dedicated to their craft and who were, in turn, recognized in their field; artists and writers who were known, published, awarded, acclaimed. He held a corresponding contempt for what he called “wannabe” artists: people (particularly women) who talked about their art (the instrument they used to play in high school and might one day pick up again, the novel they planned to write), but who never followed through with their grand plans, never produced anything of real value. To believe in oneself, in one’s own worth and talent, without any external validation was, to him, the height of vanity and hubris.

I didn’t become a doctor in order to please my dad. But I did become a doctor, at least in part, so I could set my sights on something honorable, meaningful, and measurable, and achieve it. Also, I did it so I could stop telling him I was going to be a writer.

Throughout my medical training, however, I found that I couldn’t stop writing. In fact, the work I was doing and the world I inhabited with my patients were so raw, urgent, and moving that the compulsion only grew stronger. I used every precious minute of my free time to write—usually in my diary, occasionally publishing short narrative essays in medical journals. I pretended writing was just something I did on the side, a hobby. I didn’t tell anyone—including my dad, including myself—that I still wanted to be a writer: a published writer, what my dad would call a “real” writer.

Instead, I was a doctor. And despite hating almost every minute of my training, I found some parts of the work deeply rewarding. I discovered that I loved caring for pregnant women, particularly those whose pregnancies were unexpected or unwanted. I completed my residency and went into practice. I got married, had a baby. I kept on writing, always “on the side.”

A few years ago, shortly before my dad died, I wrote a book, mostly while my infant daughter slept. It was a memoir of my training and work as an abortion doctor, framed by the story of my pregnancy. I found an agent willing to represent it. She sent the book to more than thirty publishers. It was rejected by every single one. By the time I truly understood that the book I’d written was never going to be published, I was pregnant with my second child. I should have been glowing with pride and expectation, but I could not stop crying. 

I went to see a prenatal counselor and sobbed and sobbed. I have no idea what other women talk about at the prenatal counselor’s office. I don’t think I uttered one word about the pregnancy, or the baby, or any of the things I was supposed to be focused on. “A disappointment,” was all I would say, through my tears. “It’s sort of a professional disappointment. Something I really thought would happen, and then it didn’t.” The counselor didn’t push me for details.

I gave birth to our son. A few weeks later I returned to see her, still crying. Finally, after all those months, I forced myself to tell her the truth. I was so deeply ashamed I could barely get the words out. My hands trembled. I wasn’t just ashamed of the book’s failure, but of my own selfishness, that I could cry over this disappointment while nursing my beautiful, healthy baby. Look at what I had. Look at how fortunate I was. And I was dwelling on this: my stupid book, my stupid dream. That was three years ago. 

If the prenatal counselor felt out of her comfort zone, she didn’t show it. Now it occurs to me that my problem may not have been so different from those of her typical referrals, including the many women who must have come to her after a miscarriage. Something I really thought would happen, and then it didn’t. She never offered practical suggestions, never said I should find a new agent, never said, “What about self-publishing?” as so many well-meaning friends did. (“I could never self-publish it,” was my repeated response to those friends. “That would be worse than not publishing it at all.”)

Instead the counselor looked me in the eye. “The question now is,” she said, “what meaning will you make of this great disappointment?”

*

Shame is an enormous emotion. It is so diverse and pervasive, it can be hard to recognize, even for someone who encounters it every day. For me, shame is located in my secret life as a writer, in the pain of rejection, and the sting of my dad’s derision for “wannabe writers” like me. For my patients, shame is located in their wombs, in sex, in their choices and in their bodies. It is a different shame, and, of course, it is exactly the same.

Here is another way of saying what I said to Mailea: what I like about my job is meeting a woman who is carrying shame, and giving her a chance to let it go. By talking about her abortion as a choice with which I trust her completely, by treating her with respect, by keeping her safe and comfortable, I allow her to hand some of her shame over to me. I offer to hold it for her.

Not every woman takes me up on that offer. Some cannot let go of their shame. These are the ones who shake and cry the whole time, who won’t look me in the eye, who are angry with me. But most of them can let it go, at least part of it. They seem relieved to hand it over to someone else, even just for those 15 minutes. And some of them, I believe, actually let go of their shame forever. They walk out of that room a little bit lighter—not just because of what I’ve removed from their bodies, but because of the burden they’ve left behind.

When I insisted, for those three years, that I would never self-publish the book, I was thinking, of course, of my father: what he would say about a writer who refuses to take “no” for an answer, a “wannabe” writer who insists on promoting her own work after it has been deemed undeserving of publication. But I could not shake the counselor’s question. It seemed to me I had two options: to dwell in my own failure and shame, or to move past it—across it.

Does the book deserve to be published? I guess I think it does. Sometimes, of course, I falter in that belief. I still look at the floor whenever I use the phrase “self-published.” I believe if my dad were still alive, he would look at the floor, too. In my privileged life, that is perhaps the greatest disappointment, the deepest shame I’ve ever known. But what I am trying to do—here, on this page—is to make some meaning out of it. Like my patients (at least in some ways, not quite like them in others), I am looking for someone to whom I can hand my shame. I am asking of you, reader, what Mailea asked of me. 

Do you like your job?

Back in the procedure room with Mailea, I told her that her abortion was complete. She nodded and thanked me. Then she closed her eyes, and I saw tears gathering at the edges of her lids. I paused for a moment. Then I said, “I know you told me you’re an editor, but by any chance are you also a writer?”

She opened her damp eyes. She looked at me. “Yes,” she said. “I’m a writer, too.”

“I thought you might be.” I asked her for the name of her magazine and told her I’d look it up when I got home. “I’d love to read your writing,” I said. “You have such a calm, confident presence. I bet you’re a great writer. And a wonderful mom.”

Her face crumpled and she began to cry. “I try,” she said.

Shelve This in Memoir: Confessions of a Teenage Bookseller


1.
A few months after being hired as a bookseller at the second largest independent bookstore in Indiana, I received an unexpected promotion—to Easter Bunny.
I stared at the bunny suit and accompanying head crushed into the crate in the bookstore’s backroom. “So you want me to put that on?” I said, turning to my boss. My boss—a man in his late 20s—nodded.
“Why me?”
“I don’t know,” he said, grinning. “You look about the right size.”
It was the spring of 2001. I was 16, a sophomore, and, as it turned out, about the right size for the entire menagerie of Saturday morning story hour costumes. Following my stint as the Easter Bunny, I became Clifford the Big Red Dog, Angelina the Ballerina, Franklin the Turtle, Curious George, and Spot, amid a host of other children’s book characters. Since the clunky costumes didn’t allow much in the way of maneuverability, I relied on Sherry—the store’s story time specialist—to guide me by the furry hand from the backroom to the throne in the children’s section. No matter the costume, as soon as I turned the corner, the children greeted me with an enthusiasm generally reserved for Barney and Big Bird. While Sherry read the story, I summoned the miming skills of Marcel Marceau, dramatically reacting to every page. When the story ended, the children engaged in a little light rioting in their attempt to hop onto my lap for a photo.

“Smi-ile!” Sherry called, snapping the photos and allowing the images to unspool from the Polaroid camera. I sweated a river inside that suit until, at last, the line of children reached its end.
“Nice work,” my boss said, removing my head and peeling me out of the suit. “And thanks.”
“No problem,” I replied. I’d begged my way into this job in the hopes that it might lead me closer to a literary life. And it would. Eventually. One character costume at a time.
2.
Long before serving as the bookstore’s various mascots, I was its most loyal customer. I’d grown up in its overstuffed chairs, whittling away my elementary school years with stacks of chapter books piled high alongside me. Over time, the booksellers got to know me; I was the earnest young kid who always reshelved the books he didn’t buy. In some ways, I was a different store mascot then—no costume required.
To celebrate the end of my seventh-grade school year, my mother drove me to the bookstore and said, “You can pick out any book you want.” While my friends spent the first night of summer at the movies, I spent mine running my finger along the spines of books as if performing some divination.

Once, buried in the science fiction section, I noticed an unassuming book by Ray Bradbury, whose short story “The Fog Horn” we’d read in English class earlier that year. I pulled Dandelion Wine from the shelf and read the back copy. “The summer of ’28 was a vintage season for a growing boy,” it read. “A summer of green apple trees, mowed lawns, and new sneakers. Of half-burnt firecrackers, of gathering dandelions…”
I was sold, not only by the poetic flourishes of the subject but by the timing, too. The book began on the first day of summer 1928—precisely 70 years from the moment in which I then lived. Surely it was a sign, a bit of bibliomancy to direct me to whatever came next. I made my way to the register that I would one day operate.
“Great choice,” the bookseller said, ringing me up and returning the book. “When you’re done, come back and tell me what you think.”
“I will,” I promised.
3.
In 2001, Americans were reading three books: Who Moved My Cheese?, The Prayer of Jabez, and John Adams. I knew this because, when I wasn’t “in character,” it was my job to shelve them directly across from the registers. I’d face their covers out, giving customers a visual reminder of what they’d seen on the Today Show or in the pages of People magazine. Of course, readers got their recommendations from at least one other source, Oprah Winfrey, whose book selections catapulted more than a few titles—Icy Sparks, Cane River, Stolen Lives—to the top of the bestseller lists. I never read any of them, though many people did. Or at least many people bought them, paged through them, and lugged them to the book club and back. At which point Oprah would make her next selection and the cycle repeated.
I never begrudged this brand of book buyer. True, their literary preferences were decided by a talk show host, but at least that talk show host got them through the door. And without their willingness to plunk down $24.95 a dozen times a year, we’d never have survived the daily encroachment of the big box bookstores. I didn’t much care if the books were any good; I just wondered what it must feel like to hold in your hands a book you’d written yourself. I wasn’t thinking like a reader, I was thinking like a writer.
For the remainder of my high school career, I began acting like a writer, too—penning failed story after failed story when I wasn’t at school or at the store. And for nearly three years, it seemed I was always at the store. I worked weekends, in addition to one or two weeknights. If my work schedule wreaked havoc on my social life, I can’t recall. What I do remember were the perks: getting paid slightly more than minimum wage to borrow books, chat up fellow readers, and brush shoulders with every author who stepped through our doors. Hardly a week passed when some author didn’t. In my first month, I worked events featuring Dave Pelzer and Bill Bryson, and marveled at their standing-room-only crowds. I set up the chairs and tore down the chairs and peddled their books as best I could. The process repeated itself for James Patterson, Jim Davis, David Almond, and so many others.
Most nights, I maintained some semblance of professionalism, but when I met John Updike, I was too starstruck to manage much beyond the idiotic ramblings of a 17-year-old. “I…I saw your cameo on The Simpsons,” I told him. Updike pondered this and then released his tight New England laugh.
“That’s right,” Updike said. “I did do that, didn’t I?”
Yes, I confirmed. He had.
4.
One night in 2001, the author Peter Jenkins, who’d famously walked across America, stuck around after closing to sign a few stacks of books. I’d been sequestered behind the register for most of his talk, though now that the doors were locked and the customers had gone home, I lingered in his field of vision, hopeful for a conversation. I’d just finished cleaning the bathrooms, my body reeking of lemon-scented disinfectant and toilet bowl cleaner. If he noticed, he offered no indication. Instead, he signed the last of his books, then turned toward me and introduced himself. For 15 minutes, we small-talked near the gas fireplace in the center of the store, surrounded by hundreds of travel books.
“Do you write?” Jenkins asked me.
“I try,” I said.
“Well, what do you write about?”
Nobody had ever asked me that, and whatever answer I managed lacked conviction. My boss was in the backroom counting the drawers, so I took it upon myself to walk the author to the front of the store and send him off. He waved, wished me the best with my writing, and then walked toward his car. I thought: The man now walking before me once walked across America. But even more impressive, he’d written a book about it.
5.
One night, as the rain drummed down upon the store’s skylight, I hid in the fiction section and read a collection of stories by a writer named Jhumpa Lahiri. I was so deeply immersed in the collection’s opening story that I didn’t hear my coworker buzz me over the intercom. Before that night, I’d never heard Lahiri’s name; I knew nothing of her work. But the story of Shoba and Shukumar—and their confessions to one another under cover of darkness—so moved me that I became numb to the world around me. I sank to the floor alongside the other “L” authors—Tim LaHaye, Dennis Lehane, Laura Lippman—and tried to understand how a writer could make me care so deeply about people I’d never met.
“Where on earth were you?” my co-worker asked upon my eventual return to the register.
“I don’t know,” I said. “Somewhere else.”
6.
Four years after purchasing Ray Bradbury’s Dandelion Wine, I received a letter from the author in the mail. I’d written an essay on Bradbury which, much to my astonishment, had won first prize in a national contest. I was working a late shift at the bookstore when my mother hand-delivered it.
“This came for you,” she said, pointing to the return address.
Standing at the same register where I’d purchased my first Bradbury book, I opened the envelope. Ray Bradbury had read my essay, his letter informed me, and he thought it “one of the finest” essays he’d ever read. He promised to keep it in his desk drawer “as a permanent piece of literature for me to read from time to time.”
That night, I shelved the books twice as fast. Memoir, history, psychology, reference—the books found their place in record time. I directed customers to The Lovely Bones, The Da Vinci Code, The Corrections—whatever they wanted. It took every ounce of restraint I had to keep from pulling Bradbury’s letter from my pocket and sharing it with every customer. See this? I wanted to shout. Someone out there was listening! Next time somebody asked me what I was writing, I’d have an answer.
7.
When Lois Lowry was in town, the bookstore’s owner—who was aware of my literary ambitions—generously assigned me the task of assisting Lowry with her autographs. My job was simple: open each book to the proper page and let her do the work. At that point in my life, Lowry’s The Giver was one of the most consequential books I’d ever read. It was the first book that felt like it was taking me seriously as a reader. It refused merely to entertain.
Since the bookstore owner hadn’t explicitly prohibited me from doing anything dumb, I did something rather dumb. Upon learning that I was a writer, too, Lowry made the mistake of asking the same question Peter Jenkins had asked, only this time I was ready. What was I writing?
“Mostly this,” I said, pulling forth from my backpack my two-hundred-page manuscript. “You can have it if you want,” I said. “I’ve got other copies.”
A polite “thanks but no thanks” was in order. An impolite, “Leave me alone, kid,” would have been warranted, too. Instead, she summoned some wellspring of generosity, and said, “Thank you. I look forward to reading it on the plane.”
A few years back, after a young writer handed me a similarly sized unsolicited manuscript, I reached out to Lois Lowry to beg forgiveness for my faux pas. “I don’t know if you remember me,” I began in an email, “but back in 2002 or so, I handed you my novel….” She did not remember me. Thank God. “But I will tell you that I do treasure each individual encounter,” she added. “And I’m glad you’ve treasured the memory of it as well.”
8.
Later, after publishing a few books of my own, I returned to my hometown to give a reading at the local library. Following the presentation, I glanced up to see the bookstore owner who, 15 years prior, had the bad sense to trust me not to pass off my manuscript to Lois Lowry. I didn’t know what to say to the man. How to thank the person who created the space for some high school kid to get a taste of the literary life?
“I wouldn’t be here if it wasn’t for you,” I finally managed.
“Oh, it was nothing,” he replied.
“It was something,” I said. “Meeting all those authors and shelving all those books, it allowed me to be a part of things.”
Only a handful of people remember how the rain drummed atop the store’s skylight. And how when you looked up in the right light, you could see a hint of your reflection staring back. We shook hands, and I never saw him again. Moments later, I was approached by a teenaged student who informed me he was a writer, too. I took the risk; I asked the question.
“Oh yeah? And what do you write?”
9.
A few summers back, I founded an arts camp for 14-to-18-year-old artists in five disciplines. I did it not because I am a glutton for punishment, but because only now, in middle age, do I understand the impact seasoned artists have had on my life. Not just the aforementioned writers but also the local journalists, the English teachers, the librarians. Those people who helped me understand that there are plenty of paths to the literary life, and only sometimes does that path involve costumes.
When I watch young artists interact with their artist mentors at camp, it all seems far more casual. Either their artistic heroes are humbler, or today’s young artists are harder to impress. If they want to be starstruck, the young artists scroll TikTok. But if they want to learn about color, shape, and how to craft a line, they pocket their phones and turn to their mentors. They are direct, pragmatic, and unafraid. They are what I wish I’d been.
10.
On the night Harry Potter and the Order of the Phoenix was released, the bookstore stayed open late. Hundreds of people packed the store, many of them wearing cloaks and scarves, all of them wielding wands. Though I was fully aware of the Harry Potter phenomenon, I’d underestimated the commitment of the series’s superfans. They trembled with anticipation, sheer joy flooding their faces as they dreamed up spells to speed up the hands on their watches.
Legally, we weren’t allowed to sell the books until midnight. And so, when the hour struck, I joined a parade of booksellers to wheel out the book carts. The superfans—children and adults alike—nearly tore my clothes as they threw their bodies toward those carts, snatching up copies before I even made it to the front of the store. It was the closest I ever came to being a rockstar.
Today, it seems impossible: that there was a time when hundreds of people would gather, stay up late, and squeal at the prospect of getting their hands on a long-anticipated book. It was a communal event, an occasion, something you marked on your calendar. Book buyers have since forgone such occasions in favor of convenience: a click, a swipe, and the book’s delivered straight to your door. At which point the solitary act of buying the book can be accompanied by the solitary act of reading. The bookseller is lost in the process. As is the budding writer.
11.
“Smi-ile,” my mother called.
It was May 2003, days before I received my high school diploma. I was at the bookstore, posing before a giant display of graduation books (Oh, the Places You’ll Go!, The Healthy College Cookbook). Directly above me was a sign with my name on it, congratulating me on my impending graduation. By then, I’d been working at the bookstore for years, and the people there had become family. Not just my coworkers (one of whom would be a groomsman at my wedding) but also the customers (one of whom would attend that wedding).
By August, the bookstore would shutter its doors for good—the latest casualty of the big box bookstores. But even if our store hadn’t closed—even if we’d survived our brick-and-mortar competition—my time donning character costumes had ended. I’d move to Illinois in early September to study English and creative writing at a small liberal arts college. And while the next four years were vital to my writing development, they were not nearly as important as the three years that came before. Amid all those interactions with all those writers, I came to learn one essential truth: Writers were people just like me.
Too-cool-for-school senior that I was, I offered my mother and her camera no more than a two-second grin. Just long enough to snap a shot of me in my natural environment: an 18-year-old in cargo shorts and a Hollister shirt. Staring at the photo today, I notice something else: a pen dangling from my writing hand. It’s the only bookstore photo I’ve got when I’m not in costume. And it’s the photo that captures the most.

Just a Choice: LaToya Watkins on Motherhood


These days, I don’t like leaving the house. I prefer to work from home, shop from home, and visit with family and friends from home. I’m not an agoraphobe—I run every morning, visit my favorite cookie shop twice a week, and go to grocery stores and farmers’ markets during low-traffic times. Nonetheless, I’m beginning to recognize my social limitations. My heart becomes a violent storm inside of my chest, I fidget, and sometimes I can’t control my breathing when I think about being in peopled spaces. I don’t know exactly when it started, but I have grown into someone else. Some days I wonder what happened to me, why I am this way.

I think this is what happened: I became a mother. Not in the “I just want to stay at home with my kids” sense—more like the “I’ve given birth to these black babies and now I have to keep them alive” sense. I feared that my children wouldn’t quite understand that no matter what they knew themselves to be, “America” might see them as what she always has. Danger. To me, motherhood has always felt like a burden. I once thought of myself as a reluctant mother, no—a resistant mother. I’ve never worn motherhood like a second skin, always had to work hard at it and have help and relief from it. And though I love my children fiercely, I don’t think I ever really grew into being their mother.

My babies are all young adults now. My son, the oldest, is the sweetest of them. He’s beginning to understand that as a young black man, it is unlikely that he’ll experience the freedom of his white counterparts. My daughters don’t quite get it yet. They don’t understand that here, life means to break them. I used to think my fears for them were the same ones my mother had for me. When I was a little girl, she constantly checked in to make sure I was safe in this world. She didn’t want me to fall behind in school. She told me to say “no” to drugs, to eat my vegetables, to stay away from boys. That these were the things I needed to do to stay alive, to grow old. And as I did grow older, she grew more protective, suspicious even, of the choices I was making. When she learned I was pregnant at 17, she told me that one of her biggest fears had been realized. 

I didn’t tell my mother, then the wife of a deacon, that I’d already visited an abortion clinic. That I didn’t have enough money or time away from her strict gaze to go through with my choice. The day she realized that she hadn’t bought me tampons or pads in over three months, I stood there and watched her unravel. And I waited for her to tell me what to do. Eventually, she asked, “So I guess you want to have an abortion?” When I nodded yes, she told me flat-out that she wouldn’t allow it. And I didn’t argue or fight her on it. I was 17. She was God and politics and everything I knew about the world. 

This year, I published a novel set in an imagined town inspired by the place where my family is from. In the opening scene of Perish, 16-year-old Helen Jean, with the help of her cousin Ernestine, attempts to end an unwanted pregnancy by drinking turpentine. When the attempt fails, Ernestine advises Helen Jean to return later to try a different method. Believing that she has heard the voice of God, Helen Jean declines and resolves to marry a man she doesn’t love and give birth to a child that she knows she can’t love. I imagine that if the story had been set in the 1990s, when I was a pregnant teenager, instead of the 1950s, there would have been other options, healthier options, for women in Helen Jean’s predicament.

I also imagine that Helen Jean and other women from her community might’ve still ended up coming to her cousin for assistance. Just because abortion was legal did not necessarily mean that Black women had access to it. For one, Congress barred federal dollars from covering abortion costs and services for poor women. And beyond cost, someone in Helen Jean’s condition might have had limited access to healthcare, few options for birth control, inadequate sexual education, and strong concerns about the social implications of abortion—and premarital sex, and contraception—within her community.

When I became pregnant in the 1990s, the legal right to choose wasn’t the issue for me. Once my family found out, I didn’t have any other choice than to go through with that pregnancy. In my family, in the community we come from, the shame of abortion is more burdensome than the shame of giving birth to a child without adequate access to the resources needed to provide for it. What I knew about myself is that I wasn’t ready to be a mother. That I didn’t have the skills or the energy or the patience for such an undertaking. But none of that mattered—it was too late for me to make my own choice. 

Nonetheless, I spent hours at a time in the bathroom beating bruises onto my own stomach trying to expel what I thought marked the end of my life. I ingested things that could’ve killed me. I climbed to the roof of our single-story home and jumped off. I was desperate in ways that my mother didn’t understand, that my family wouldn’t understand. So I tried in secret to miscarry my ruin. But in the end, I took on the burden of loving someone else more than myself and became a young mother. My daughter was born strong and healthy. She grew fast and wore my face, but none of these things made our bond instant. It took me a few months to realize I loved her, took me that long to really look at her—to see her as something other than the end of me. The guilt of all those feelings before that moment—before I really saw her—is something I still carry. 

When my daughter was seven months old, my mother called me at the part-time job I’d started working after my high school graduation. “She stopped breathing,” she said. She went on to explain that she had managed to resuscitate her and that I should come to the hospital. Those next eight hours are a blur all these years later, but my daughter didn’t survive that night. We left that place without her. What I do remember from that night is that no one believed us. No one believed that she had stopped breathing, that something wasn’t right, that something was ending for one of us.

I’ll never be able to explain how I managed grief as a mother when I hadn’t wanted to be one and knew I shouldn’t have been one, or how I didn’t know what grief was because where I’m from we don’t talk about grief or how to grieve, or how I spent the next few years chasing gods and begging them to give her back to me, or how I decided to become a mother again and again to make up for the guilt and the unpacked grief and how the fact that I never did still feels like that first loss all over again. And these are the things that I feared for the daughters I birthed after that first one: that something bigger than them would steal their choices and that there would be too much pain because of that. I wanted them to be young and to be free and to be alive. I wanted them to have pleasure and to live there until they were old enough, strong enough to carry pain. And I wanted them to know that the choice would always be theirs. 

When my 19-year-old daughter, my baby girl, came home from college two springs ago, needing to talk to me, something inside me knew exactly what she had to say. As I listened to her trudge through her over-practiced speech, I closed my eyes and saw myself at 17. I was scared and alone and desperate. I needed to be asked what I wanted. I needed to be given a choice. To be supported in whatever it was I chose to do. If things had happened that way, I might be free from some of these things I carry. When I opened my eyes, my baby was looking at me, looking to me. 

“You hear me, Mom? I’m pregnant.”

I swallowed hard and thought about my own bruised belly, about my shame, about my dead baby, and about all the grief and guilt that has turned to anxiety and fear. I thought about how I’m dragging it all behind me and I knew I didn’t want that for my daughter.

I nodded my head and told my baby I heard her.

“Mama’s here, baby,” I told her. And then I put my arms around her and said something like, I’m here. We can talk through your options, but this is your choice. Whatever you decide, I’m here. The question is: What do you want to do?

And I held my breath and trusted that she knew.

How Many Errorrs Are in This Essay?

-

A man of genius makes no mistakes. His errors are volitional and are the portals to discovery.
James Joyce, Ulysses (1922) 

Technically known as “sorts,” the letters a print setter used were crafted from copper and stored like tiny inked seeds in a wooden case. Capitalized letters were kept in the top portion (hence “upper case”) and those that weren’t were stored in the bottom (thus “lower case”). Carefully fastening the sorts into an iron composing stick, the printer would spell out words and sentences, which would be locked into a wood-frame galley and then organized into paragraphs and pages. Arranging sorts was laborious, and for smaller fonts, such as those used in a Bible, the pieces could be just a millimeter across. Long hours and fatigue, repetitive motion and sprained wrists, dim light and strained eyes—mistakes were inevitable. The King James Version of the Bible has exactly 783,137 words, but unfortunately for the London print shop of Robert Barker and Martin Lucas, official purveyors to King Charles I, their 1631 edition left out three crucial letters, one crucial word—“not.” As such, their version of Exodus 20:14 read, “Thou shalt commit adultery.” Their royal patron was not amused. This edition was later deemed the “Wicked Bible.”
Literature’s history is a history of mistakes, errors, misapprehensions, simple typos. It’s the shadow narrative of expression—how we fail because of sloppiness, or ignorance, or simple tiredness. Blessed are the copyeditors, for theirs is a war of eternal attrition. Nothing done by humans is untouched by such fallenness, for to err is the universal lot of all of us. Authors make mistakes, as do editors, publishers, printers (and readers). If error were simply an issue of a wrong comma here or an incorrect word there it wouldn’t be nearly as interesting, but mistakes undergird our lives, even our universe. They can be detrimental, beneficial, neutral. When Lockheed Martin designed the Mars Climate Orbiter using American units and NASA assumed that they’d used the metric system instead, a discrepancy that resulted in that satellite crashing into the red dust of the fourth planet from the sun—that was a mistake. And when the physician Alexander Fleming left out a culture plate which got contaminated, and he noticed the flourishing of a blue mold that turned out to be penicillin—that was a mistake. Errors in how people hear phonemes are what lead to the development of new languages; mistakes in an animal’s DNA propel evolution; getting lost can render new discoveries. Sometimes the flaw is that which is most beautiful. 
Certainly, there is no shortage of them. Typographical errors such as those in the 1631 Bible are known as “errata,” and despite the incendiary mistake of Barker and Lucas, less salacious typos were common. A 1562 printing of the sternly doctrinaire translation the Geneva Bible prints Matthew 5:9 as “Blessed are the placemakers” rather than “peacemakers;” an 1823 version of the King James replaced “damsels” in Genesis 24:61 with “camels,” and as late as 1944 a printing of that same translation rendered the “holy women, who trusted God… being in subjection to their own husbands” in 1 Peter 3:5 as referring to those pious ladies listening to their “owl husbands.” But not all errors seemed so innocent—in 1637, an unlucky compositor deleted one letter “s” from the sixth word and the single comma which was to follow it from Luke 23:32, so that it read “And there were also two other malefactors [with Jesus],” those two accidental deletions blasphemously impugning the moral rectitude of the savior. A 1653 printing stated that it was the unrighteous who would inherit the earth, a 1716 edition records Jeremiah 31:34 as telling us to “sin on more,” and a 1763 volume replaces the penultimate word “no” with “a,” so that Psalm 14:1 reads “the fool hath said in his heart there is a God.” Those copies were pulped, the printers fined 3,000 pounds. One wonders if the Lord wasn’t sending a message in a 1612 errata where Psalm 119:161 had the first word of the line “Princes” replaced, so that it now read, “Printers have persecuted me without cause.”
Christ assured his apostles that he came not to destroy the law, and he might as well have added that he came not to amend, edit, revise, or misquote it either. Precision matters when it comes to scripture. Slipping up when transcribing God’s word can have perilous consequences concerning damnation or redemption, but a mistake in the law—and not just God’s law—can have pressing repercussions here on earth. Practicing law is, after all, nothing more than being able to interpret ambiguities, but sometimes mistakes are embedded in contracts and legislation themselves. Jarndyce v. Jarndyce, Charles Dickens’s fictional Court of Chancery probate case in Bleak House, propels that novel’s byzantine plot for 912 pages because of conflicting disputes about a will. “The parties to it understand it least,” writes Dickens, “but it has been observed that no two Chancery lawyers can talk about it for five minutes without coming to a total disagreement as to all the premises.” Much depends on being able to reconcile those mistakes of phrasing, as “innumerable children have been born into the case; innumerable young people have married into it; innumerable old people have died out of it.”
Depending on whether ambiguity is a mistake or a strength, the law often codifies uncertainty. The U.S. Constitution isn’t particularly long—only a few pages—and yet it’s filled with grammatical and spelling errors, as well as confusing syntax that bedevils contemporary citizens. The word “choose” is rendered “chuse” several times, capitalization is inconsistent, possessive apostrophes and those of contraction are often dropped, and even “Pensylvania” is misspelled. Much more significant are the idiosyncrasies in punctuation: commas are placed between nouns and verbs, errant commas in the Second Amendment make it unclear as to whether the right to bear arms is reserved for individuals or only “well regulated militias,” and a semicolon in Article VI seems to invalidate the Constitution itself. “The Constitution, and the Laws of the United States which shall be made in Pursuance thereof; and all Treaties made, or which shall be made, under the Authority of the United States, shall be the supreme Law of the Land.” Strictly speaking, the end stop of that semicolon implies that “all Treaties made,” and not the Constitution, shall be the supreme law of the land, and yet we’ve always just assumed it’s an obvious typo.

The Constitution has only a scant 4,543 words, but that a book as large as the Bible—783,137 words!—should see a few letters accidentally left out, even to potentially perilous soul-damning effect, isn’t so surprising. Adam Nicholson explains in God’s Secretaries: The Making of the King James Bible that when a nineteenth-century scholar “attempted to collate all the editions of the King James Bible then in circulation, he found more than 14,000 variations between them.” Nor were such blunders unique to print; anyone working with Medieval manuscripts is familiar with all of the human failings that render an incorrect Greek conjugation or a sloppy Latin declension, but there are even more blatant accidents. Dittography is common—that is, the doubling of a word word because a scribe happened to glance up and lose their place, thus repeating themselves. Homeoteleuton is another frequent peril, which is when a scribe takes a break and returns to where they were writing and, misled by a similar keyword, finishes with a later sentence from whatever they are copying. Homeoarchy is the accidental deletion of lines; metathesis the reversing of letters in a wrod. An entirely more delightful flaw can be found in a fifteenth-century Croatian manuscript, where splayed across the pages are the inky pawprints of the scribe’s cat.
A small blip could be scratched out, but more extensive defects required radical editing. A copyist of the Abbey Bible, transcribed between 1250 and 1262 in Bologna, happened to leave out several verses from 1 Samuel, while garbling syntax in the ones which he did include. Rather than abandon expensive vellum, decorated with lustrous hues of cobalt, gilt, and burgundy, the scribe crossed out the offending portion with a set of blue and red lines and began over on the following page. Sometimes scribes made apologies in the marginalia for a poor showing; one Medieval Irish copyist explained his errors by writing, “Let me now be blamed for the script, for the ink is bad, and the vellum defective and the day is dark.”
Another particularly creative fellow, an English scribe from the late thirteenth century, deleted a line by accident, and so on the page’s borders he drew a little peasant using a pulley to hoist the missing sentence back into its correct spot. Not even the most beautiful manuscripts were without faults; the ninth-century Irish Book of Kells, its intricate Celtic knotwork dyed in indigo and copper, ochre and lapis lazuli, flubs the genealogy of Jesus, while several passages are missing. Yet when the copyist rendered Christ’s declaration in Matthew 10:34, which is supposed to read, “I came not to send peace, but a sword,” as “I came not only to send peace, but joy,” one can’t help but wonder if it’s an error or an improvement.
Returning to the Wicked Bible, the Archbishop of Canterbury George Abbot blamed the perfidious errata on what he saw as a slipshod operation, claiming that their “paper is naught, the composers boys, and the correctors unlearned.” The king didn’t find it funny, with the ecclesiastical historian Peter Helyn writing in 1669 that “Order was given for calling the Printers into the High-Commission where upon the Evidence of the Fact, the whole impression was called in, and the Printers deeply fined, as they justly merited.” Charles I had as many copies destroyed as could be found, the result being that only 15 survive in scholarly libraries: seven in Great Britain, including at the British Library and Oxford’s Bodleian; seven in North America, including at the New York Public Library and Yale; and one in Australia. By comparison, the first edition of Shakespeare’s folio, printed a decade before in 1623, has a whopping 750 copies that have endured.
A 1632 copy of the King James also printed by Barker and Lucas—an edition which doesn’t say that it’s cool to cheat on your wife—is currently selling on AbeBooks for $3,250. By contrast, in 2015 a private buyer purchased a copy of the Wicked Bible from an Orlando book dealer for $99,500. That rarity commands a premium isn’t surprising, but that so often it’s the flaw that collectors’ eyes wander toward says something. It’s the slipup in the American Tobacco Company’s release of Pittsburgh Pirates shortstop Honus “The Flying Dutchman” Wagner’s 1909 baseball card—largely destroyed at the demand of the anti-smoking shortstop, since the company hadn’t first received his legal permission—which helped it sell at auction this last August for an astounding $6.6 million. It’s the same reason that an “Inverted Jenny” U.S. postage stamp featuring an upside-down blue biplane framed by a red border was bought at auction three years ago for $1,350,000 (the original would have cost 24 cents). Or consider the popularity of Elias Garcia Martinez’s 1930 workman-like fresco of Christ, which in 2012 was touched up by Cecelia Gimenez, an amateur art restorationist and parishioner at Borja, Spain’s Sanctuary of Mercy Church, who rendered the Messiah not as the suffering servant but as something which looked like a chipper capuchin, leading some wags to christen the new piece Ecce Mono—Behold the Monkey. “The restoration has put Borja on the world map,” Gimenez said, “meaning I’ve done something for my village that nobody else was able to do.” Perhaps a failure, and yet because of the 81-year-old’s accident the church was able to raise over $66,000 for charity. Enough to raise up praise, for as the Wicked Bible reads in another misprint—“Behold, the Lord our God hath shewed us his glory and his great-asse.”

There is a sublimity in inadvertency, a profundity in accident. Certainly an inevitability. Perhaps a lyric poet can polish the 14 lines of a sonnet to an immaculate sheen, but a more ample body of text is going to accrue some errors, just as no consumer can avoid a few spider legs in their peanut butter and the FDA sets limits on the number of rodent hairs acceptable in your oatmeal. A finished novel will go through rounds of revisions at the hands of the author, sometimes a whole coterie of copyeditors, and any number of eyes that will see it before ink is pressed onto paper, and still a carefully examined book will inevitably have typos. The fact is that these sorts of mistakes, while perhaps embarrassing, are also largely irrelevant, and few notice them other than anal-retentive punctilious prigs. Yet when they’re noticeable, they’re noticeable. Possibly expensive as well; just ask Jonathan Franzen, whose novel Freedom had a first printing accidentally based on a rough draft, with HarperCollins producing 80,000 error-laden copies and selling a tenth of them in the United Kingdom before noticing. Franzen writes of how a “wealth of experience with mistake-making would have been a comforting resource” in the (corrected) edition of Freedom, but then again, maybe not. There are less disastrous instances as well. Theodor Dreiser’s An American Tragedy describes a pair of lovers as being “like two small chips being tossed about on a rough but friendly sea,” while Daniel Defoe tells us that Robinson Crusoe stripped naked, swam out to his sinking ship and retrieved supplies, which he then stored in his pockets on the swim back. As Horace noted, even Homer nods.

When it was claimed that Shakespeare never blotted out a single word, his theatrical colleague and noted frenemy Ben Jonson quipped, “Would he had blotted out a thousand.” For all his genius, the Bard was often uninformed or lazy, author of a veritable comedy of errors. The Winter’s Tale references landlocked Bohemia as having a coast. In Julius Caesar, Cassius uses a clock some 14 centuries before such devices were invented, and in The Two Gentlemen of Verona the characters sail from that titular city to Milan, a geographic impossibility. Those are the mistakes that remain in modern editions, but as Marjorie Garber reminds us in Shakespeare After All, the version that “we have come to admire, revere, quote and cite” is rendered from that which has been “improved and altered over time by the conjectures of editors trying to make sense of what may appear to be gaps or errors.”

When all literary details are fodder for close reading, every choice of “that” over “which,” every single comma, every instance of plotting, then the blunder is a rock that we trip over. Not infrequently that which announces itself as a fault was intentional; when Richard Tottel printed Songes and Sonnets in 1557, an anthology that posthumously reproduced the poetry manuscripts of Thomas Wyatt and Henry Howard, he “corrected” their visionary prosody; Thomas Wentworth Higginson infamously “cleaned up” the radical punctuation of Emily Dickinson’s poems in his 1890 publication; and in 1677 John Dryden “fixed” the blank verse of John Milton’s Paradise Lost in his own libretto The State of Innocence, having decided that what it needed was a plodding rhyme scheme where “In liquid burnings, or on dry, to dwell,/Is all the sad variety of hell.”

Some errors are so obvious that there is no need to parse their significance. When William Faulkner makes clear in Sanctuary that a scene takes place at 5:30, and then a few sentences later he writes that it’s “almost seven oclock,” that doesn’t bespeak some deep metaphysical profundity—it’s a screwup. Because of his convoluted editing history, these sorts of gaffes aren’t uncommon in Faulkner, with none more infamous than the ever-shifting architecture and history of the McCaslin-Edmonds Plantation in Go Down, Moses. Sometimes the house was built in 1813; in other stories it predates 1807; occasionally it’s a modest “two-room mud-chinked log half domicile and half fort” and in other descriptions it’s a neo-classical, porticoed, columned mansion featuring “clapboard and Greek revival and steamboat scrollwork.” Faulkner didn’t intend for McCaslin-Edmonds to be so mercurial—the flub doesn’t mean anything. But in some ways, it means everything. Because a mistake reminds us that fiction is artifice, announcing the illusion to all of us, and reaffirming that it’s been crafted by a fallible human. For the writer, every perfect sentence comes from some place higher; every flawed example is irredeemably your own. Furthermore, a slip-up uncomfortably announces just how much of composition is unconscious, beyond the mere dictates of planning and speaking. The error is a window where that force which compels the writer takes a moment to peek out into the world.

Other accidents are more ambiguous than Faulkner’s clear continuity errors. In his 1816 sonnet “On First Looking into Chapman’s Homer,” John Keats famously compares the experience of first reading George Chapman’s Homeric translations to being “like stout Cortez when with eagle eyes/He stared at the Pacific,” except the first European to see the western coast of that ocean was actually Vasco Núñez de Balboa. Did Keats just not know this, or was this intentional? Does the purposefulness or not of the inaccuracy matter to how we read the lyric? Or take Franz Kafka’s incendiary beginning to his unfinished 1927 novel Amerika, wherein European émigré Karl Roßmann arrives in New York Harbor and from the bow of his ship he studies the Statue of Liberty, describing how the “arm with the sword now reached aloft.” Is this a mistake—did Kafka not know what the statue actually looked like? Is it a comment on the United States, perhaps? It’s uncertain, and yet as with all of Kafka’s oeuvre, such a strange detail seems pregnant with meaning, even if a slipup. Whether it’s the wise man or his opposite who claims, “There is no such thing as a mistake,” we can’t quite dispel the desire for purpose, and so we read even blunders as having import. Unconsciousness is useful here, for then some part of Faulkner’s mind hidden even to himself wanted McCaslin-Edmonds to have its variable description; if Kafka floundered into that martial Statue of Liberty, it must have reflected something deep within.

“A slip of the tongue can be amusing,” notes Sigmund Freud in The Psychopathology of Everyday Life, “and may be very welcome to the doctor practicing psychoanalytical methods.” This is an instance of Fehlleistung, which in German roughly translates into “faulty performance,” which early English editions rendered as “parapraxis,” and which everyone else calls a Freudian slip. These are the fumbled turns of phrase where the unconscious has commandeered the tongue, the mistakes which aren’t really mistakes. “For seven and a half years I’ve worked alongside President Reagan,” then–Vice President George H.W. Bush said in a speech during his second term in that office. “We’ve had triumphs. Made some mistakes. We’ve had some sex—setbacks.” The crowd giggled; Freud wouldn’t think this misspeaking so innocent. While campaigning for Mitt Romney in 2012, Senator John McCain said, “I am confident with the leadership… President Obama will turn this country around,” inadvertently endorsing the governor’s opponent. Fehlleistung encompasses tongue slips, events that are misremembered, words that are misheard, and the phrase that cannot be found. Freud argues in An Autobiographical Study that “these phenomena are not accidental… they have a meaning and can be interpreted, and that one is justified in inferring from them the presence of restrained or repressed impulses and intentions.” Bush didn’t commit a gaffe—he was stating repressed carnal desires (I’ll hold to that interpretation). Therapists today place less emphasis on tongue slips, yet it’s hard not to think that at least the sentiment behind Fehlleistung is something we intuit. We suspect that we’re witness to a confession rather than a trifling error. That’s why Charles I punished Barker for his errata—the sin outed itself while his mouth protested innocence.
Psychology remains the science of remedying our mistakes. From psychoanalysis to cognitive behavioral therapy, psychologists work to assist their patients in understanding how delusions, obsessions, compulsions, anxieties, and so on, are based in faulty understanding. Neurotics may fear that there are castles in the sky while psychotics live in them, but both are erroneous about there being anything above other than clouds. We are wrong about fears which are irrelevant and ignore those that are pressing; we return to past slights and embarrassments; we are angry at people who are undeserving and admire those who aren’t worthy. An argument could be made that fiction is the mode for exploring those misapprehensions, but the mistake can’t be reduced to conflict, problem, or mystery, though literature certainly explores the tragic implications of error. When error comes up against cold reality the result is often guilt. Think of Edgar Allan Poe’s “The Tell-Tale Heart,” whose narrator murders an old man because of his glassy eye and is driven into a frenzy by what he perceives to be the sound of the dead man’s beating heart, believing that “what you mistake for madness is but over-acuteness of the senses.” Here an error—a pathology, really—the belief that there is a dead heart loudly beating under the floorboards—reveals a hidden truth. Then there is the initial mistake, if a crime is a mistake (frequently it is), of the narrator’s heinous act, based on his own delusions about the nature of his victim.

Failure, which is just a mistake with legs, can be far less ghoulish than it is in Poe’s story, even if no less mad. Charles Jackson’s arresting novel about alcoholism The Lost Weekend dramatizes how a certain mistake made by those who can’t afford it—the fallacy that those of us who are unable to drink moderately can have just one—is repeated to terrible consequence. A counselor explains to Don Burnham, the drunken adman who is the central protagonist, that folks like him “can’t bring themselves to admit they’re alcoholics, or that liquor’s got them licked. They believe they can take it or leave it alone—so they take it… promising they’ll take one, or at the most two, and—well, then it becomes the same old story.” Sometimes the writing itself is the mistake. Journalist Geoff Dyer was tasked with a biography of D.H. Lawrence, but being stalled he wrote a meta-book about his inability to complete the project. “I do everything badly, sloppily, to get it over with so that I can get on to the next thing that I will do badly and sloppily,” Dyer writes. His Out of Sheer Rage: In the Shadow of D.H. Lawrence is by definition a blunder, but one where he turns failure into a triumph. “Like it or not you have to try to do something with your life, you have to keep plugging away,” Dyer writes, an echo of Samuel Beckett’s paean to fucking up in Worstward Ho: “Ever tried. Ever failed. No matter. Try again. Fail again. Fail better.” That bit is the portion that ends up in inspirational greeting cards sold at Whole Foods, but the untruncated passage is a bit more despairing. “Or better worse. Fail worse again. Still worse again. Till sick for good. Throw up for good. Go for good,” writes Beckett. Some might find this a downer, but I think it’s more inspiring.

If writers make mistakes, explore mistakes, celebrate mistakes, and embrace lives that are mistakes, they also can use mistakes. During the Victorian era there was a method of composition which left everything up to chance and codified fortuitous error. Automatism eliminated flaws from writing by making everything an error. First practiced in the gilded drawing rooms of upstate New York or the English Lake District, where wealthy freethinking women in tight-lace corsets and bustles and their husbands in starched collars sat about tables summoning the dead at seances, automatism involved turning the mind off while writing, so that a person could be a conduit through which words arrived. Whole books were written this way: spiritualists like Patience Worth claimed that automatic writing led to poetry and novels from authors living in the hereafter; her contemporary Emily Grant Hutchings asserted that Mark Twain dictated Jap Heron: A Novel of the Ouija Board to her from the astral realms in 1917 (the judge in the ensuing copyright hearing disagreed, concluding that the book was of too poor a quality to be by Clemons). William Butler Yeats was the most respected author to take automatism seriously, using Ouija boards, tarot cards, and other occult paraphernalia to generate verse which he claimed came from a long-dead spirit named Leo Africanus. As one of those spirits wrote, “If spirits seem to stand/Before the bodily eyes, speak into the bodily ears, /They are not present but their messengers.”

Dadaists and Surrealists practiced automatism so as to indulge Fehlleistung—to make everything a slip. In dismal rooms above shops on the great avenues of Paris and Zurich, the air thick with hash smoke, poets like André Breton and Philippe Soupault ceded authorial intentionality to the whims of their hidden thoughts, letting the fingers on the keys of their Remingtons and within their smudged Moleskins freely perambulate. A heaven with no blockage, entirely structured by what could be seen as error. “Write quickly, without preconceived subject, fast enough so that you will not remember what you’re writing,” explains Breton in The Surrealist Manifesto. “The first sentence will come spontaneously, so compelling is the truth that with every passing second there is a sentence unknown to our consciousness which is only crying out to be heard.” If the unconscious makes possible the mystery of any writing, then the Surrealists were at least honest on that score. “The rail stations were dead, flowing like bees stung from honeysuckle,” write Breton and Soupault in their 1919 experimental novel The Magnetic Fields. “The people hung back and watched the ocean, animals flew in and out of focus. The time had come.” What does any of that mean? Who knows—least of all its authors. Automatism isn’t writing with a few moments of carelessness; it’s carelessness turned into writing. Paradoxically, in their oracular exactitude, their dream-like logic, such writings seem anything but careless. What Breton promised in his study The Automatic Message was “thought’s dictation, free from any control by the reason, independent of any aesthetic or moral preoccupation.” Error elevated to art.

Surrealists were fascinated with how to cede control. Techniques were developed to force what any conventional artist would consider faults, and to make these their work’s basis. The cut-up technique, made famous by Beat writer William S. Burroughs in Naked Lunch, involved cutting out portions of your own writing and randomly rearranging it, for as “one judge said to the other, ‘Be just and if you can’t be just be arbitrary.’” Many Surrealist techniques have the feel of parlor games, which is not incidental. Some of the most experimental methods of the Avant-Garde migrated into the sorts of things you may have done at summer camp. Alastair Brotchie writes in A Book of Surrealist Games that through play, Surrealists “subverted academic modes of enquiry, and undermined the complacent certainties of the reasonable and respectable.” Exquisite Corpse is the most enduring of the Surrealists’ methods, a form of collaborative artistic endeavor where each author offers in turn an adjective, noun, adverb, verb, adjective and noun; the game derives its name from its inventors’ first attempt, which generated the sentence “The exquisite corpse shall drink the new wine.” Today the game is associated with illustration, where it is called Consequences, with one artist drawing the head of a figure, the next participant illustrating a torso, and the final player drawing the lower regions, resulting in a collectively generated chimera.

Monsters dwelled in our thoughts long before the Surrealists. A mammoth’s skull is unearthed, and somebody assumes that the nasal cavity is the eye socket of a cyclops; dinosaur bones evoke dragons. Albrecht Dürer, the great German painter and printmaker, never saw a rhinoceros with his own eyes, but that didn’t stop him from making a 1515 woodcut that remains the most enduring depiction of the animal. He rendered his drawing from eyewitness accounts of those who did see a rhinoceros brought from India to Portugal, the first since the Roman era, and from a distance Dürer’s image conjures the idea of the creature, if not the actual species. The printmaker captures the heft of the rhinoceros, the thickness of its hide and the bulk of its weight, the cumbersome awkwardness of its form. Obviously the horn is there. “This is an accurate representation,” the woodcut’s German inscription reads, and yet in the details virtually none of it is. Dürer shows the rhinoceros as covered in thick, segmented, armored plates; he has a small horn on his hindquarters, and his scaled legs are clearly those of a reptile. Ganda, the animal’s Gujarati name given to him when the rhinoceros embarked from Goa to Lisbon, would drown off the Ligurian coast while being transported to Pope Leo X. A final fatal mistake. Elena Passarello writes of Ganda in Animals Strike Curious Poses with the heartbreaking detail that on his voyage he was wrapped in “green velvet and decked with carnations and gilt rope, perhaps to mimic wedding garb,” dressed for an appointment that he’d miss.

To those crowds who gathered at Estaus Palace to watch Ganda fight one of King Manuel’s elephants (the latter was frightened and ran off), or to King Francis I of France, who waited in Marseille so that he could see the animal disembark briefly on its way to Italy, or to the Medici Pope who was to add the doomed rhinoceros to his own menagerie, such a beast must have seemed like a divine mistake. His cumbersome heft and pachyderm mouth, Ganda’s knobby thick skin and that priapic horn. For Renaissance Europeans, there was something fantastical about a rhinoceros; Dürer’s woodcut was the first widely disseminated picture of the species since the first-century quadrans minted during the reign of Domitian, when those creatures weren’t uncommon in the Colosseum. So distant was the memory of such megafauna that some assumed Pliny the Elder’s description from Natural History of this fierce monster was more myth than animal. Pliny lists two creatures as the rhinoceros’ natural enemies: the first is the elephant, and the second is the dragon. When Manuel paraded Ganda through Lisbon, there must have been a sense that God is capable of anything, for the line between a wonder and a freak is thin. So unusual are the long neck of the giraffe, the nose of the elephant, and of course the horn of the rhinoceros, that unless observed with your own eyes they can seem as fantastical as anything in a Medieval bestiary.

Women and men born with unfamiliar variations—the seventeenth-century Spanish courtier Petrus Gonsalvus, who along with his children had hypertrichosis and thus appeared to his contemporaries as if he were a wolf-man; John Merrick in the nineteenth century, whose Proteus syndrome caused huge swellings on his hand and face and led to him being exhibited as the Elephant Man; Madame Dimanche, a nineteenth-century Parisian widow who had a ten-inch cutaneous horn which grew from her forehead and looked like nothing so much as a rhino horn. Such human beings—human beings—were oft displayed at circuses and carnivals, freak shows and geek shows, slandered by physician and cleric alike as being “mistakes” within the order of things. “The term freak is actually of some interest,” writes Jan Bondeson in The Two-headed Boy, and Other Medical Marvels, “as it originates in the expression ‘freak of nature,’ implying that the malformed child was a unique, unclassifiable phenomenon, the result of some strange ‘maternal impression,’ a witch’s curse, or divine displeasure.” The result of such categorization is predictable—dehumanization, persecution, oppression. Those who didn’t match normative expectations were placed first outside the divine order, and later outside the biological order. The result was the same: the slandering of people as being inferior because of physiognomy. The basic, sacred, and inviolate rule is actually rather simple, however—no human being is a mistake. Merrick, a man of uncommon intelligence, grace, and empathy, expressed it well. He would end the letters he wrote with lines from a poem by the eighteenth-century hymnologist Isaac Watts: “‘Tis true my form is something odd, /But blaming me is blaming God.”

Biology doesn’t broach “mistakes” exactly, for a variation or mutation or adaptation is neutral in and of itself. It’s only in how well suited that variation is to survival that it proves advantageous or not, all of which depends on environment. Ganda’s horn may have seemed a fearsome ornament, but as Pliny wrote—albeit with some details a bit shaky—his protuberance was estimably practical; the rhinoceros prepares “for the combat by sharpening its horn against the rocks; and in fighting directs it chiefly against the belly of its adversary, which it knows to be the softest part.” Such fortunate oddities are a result of adaptation, of generations of rhinoceroses born with shorter and longer horns, with natural selection sorting out which is better suited. Charles Darwin defines natural selection in On the Origin of Species as simply the “preservation of favorable variations and the rejection of injurious variations,” so that a rhino’s horn may be well adapted to the rainforest but poorly to a Mediterranean boat. Darwin explains that “any variation, however slight and from whatever cause proceeding, if it be in any degree profitable to an individual… will tend to the preservation of that individual, and… its offspring.” A Galapagos finch’s beak which is a bit larger may seem flawed, though it is better at cracking a nut; a moth’s smudgy wings look marred, but they let it blend into a smoggy London park. The mechanism of mutation was a mystery to Darwin, awaiting a fuller understanding of genetics, but if anything this fuller understanding demonstrated how much we owe to mistakes. “Mutations arise from errors made by the machinery that copies or repairs DNA,” explains Armand Marie Leroi in Mutants: On Genetic Variety and the Human Body; the adaptations which emerge in a species are due to the sloppiness of replication at the cellular level. Life itself has propagated from the intransigence of error since the very beginning, when a spicy primordial stew of amino acids first coalesced.

Reality itself depends on broken symmetry. Physicists speak so much of elegance—beautiful theories, beautiful mathematics, beautiful equations—that one is tempted to see them as practicing aesthetics. “Physicists love symmetry because it has a mathematical and intuitive beauty,” explains Leon Lederman in The God Particle: If the Universe is the Answer, What is the Question? Yet there is a lack of elegance deep within our universe. For example, it could be expected that there should be equivalent amounts of matter and antimatter, the latter referring to the mirror-image version of the former, where their quantum numbers—including attributes like charge—are the opposite. Electrons, those elementary particles that circle the nuclei of atoms, have negative charge, but their antimatter counterpart, the positron, has (predictably) positive charge. It would be expected that in a harmonious universe there would be equal numbers of both, but we’re lucky that there aren’t, because when matter meets antimatter the result is a flash of pure energy. Such an asymmetry is the only reason why anything is here. Considering this inelegance, Lederman sighs that “our depression is tempered by the faith that when all is known, a deeper beauty will be revealed.” Quantum mechanics is historically content with garish potentiality, however; particles which are also waves smeared across space, with everything reduced to probabilities. Because Werner Heisenberg’s uncertainty principle says that the exact location of a particle or an amount of energy in infinitesimally small areas is indeterminate, there is a rich virtual mélange of particles forever popping in and out of existence along with their antimatter siblings. That these pairs of virtual particles always come twinned is a bit of cosmic bookkeeping due to laws of conservation, perhaps indicative of that deeper symmetry that Lederman pines for.

Something else is at work here, though, because these quantum fluctuations are their own kind of mistake, with uncertainty providing a loan from reality that’s paid off in nanoseconds. “They live on borrowed momentum, borrowed energy, and borrowed time,” explains Robert Oerter in The Theory of Almost Everything: The Standard Model, the Unsung Triumph of Modern Physics. Because you can never say with utmost certainty that there’s zero energy in a vacuum—nothing is ever certain—there is also nothing that’s ever really a vacuum. Here is an explanation of why there is something rather than nothing, that question which has bothered philosophy from the pre-Socratics until today. If reality began with the Big Bang—and all data indicates that it did—the problem still remains of why. Quantum fluctuations provide an explanation—it was all just a mistake from the get-go. That is to say that our universe began just like those virtual particles, in some other place that isn’t here. With infinite time, it would stand to reason that a virtual universe would emerge. As physicist Steven Weinberg wrote in The First Three Minutes: A Modern View of the Origin of the Universe, “The more the universe seems comprehensible, the more it also seems pointless.” This is far from nihilism. Anything pointless is also perfect, and nothing is more pointless than nothing. Come again?

In the sixteenth century, Rabbi Isaac ben Solomon Luria Ashkenazi gathered his followers at Safed to contemplate our fallen creation. The Ari, or “Lion,” as he was known, had his fellow kabbalists meet the Sabbath at dusk, facing towards Jerusalem and rubbing soil in their eyes, davening Hebrew prayers about the initial rupture in perfection. Wrapping tefillin underneath olive trees, the Ari promulgated a new understanding. Even in traditional Kabbalah, before the beginning, everything was the Ein-Sof, the perfected, incomprehensible, undifferentiated glory of God. The Zohar explains that God “was alone, without form and without resemblance to anything else.” Such a Lord was like that singularity which physicists speak of, that perfected state without time and space in which error couldn’t exist. But nothing else could either. And just as a random quantum fluctuation birthed our reality, the Ari taught that the Ein-Sof contracted within itself so space for the created universe could expand outward, a process called Tzimtzum. The first mistake which made all others possible. But oh, what a beautiful mistake it was! Every joy and every sorrow, every love and every hatred, every triumph and degradation first emerged from a crack in nothing. The imperfection is what is profound, the marring is what is beautiful. The Ari taught that the meaning of our lives was a process of rebuilding that which had been broken—a task of editing, if you will. Tikkun Olam refers to the erasing of those original typos. “And so it is/now and till twilight,” the Ari explained about this task of fixing. To delete a comma is perhaps to address an injustice; to add a word is possibly to perform a charity; and all of those little acts of the red pen, all of our collective choices, revise and edit toward that final draft in which all errors will be reconciled.

Cecily Wong on the Thrill of Nostalgia

-

In 2014, while traveling through the Indian state of Goa, my boyfriend Read and I received a chance invitation to a wedding. We had met the groom and his friends at our hotel, hit it off, and learned in quick succession that he was the prince of a small Goan town, that he was getting married at the end of the week, and that we were invited. Two days later, we found ourselves among the thousand guests gathered at a sprawling sixteenth-century palace compound. I remember this day with the familiarity of a favorite movie: the half-dozen archways in the courtyard, each with a German Shepherd poised below; the four-foot oil paintings of the royal family in gilded frames; the newlyweds on a raised platform festooned with marigolds, Read in a turban, me in a bindi, both of us quietly gobsmacked.

A childhood friend of the groom took us under her wing. She was a popular radio DJ who knew practically everyone in attendance. She was the kind of person who could tell you who among the guests had married for money, who for love, and who would be too drunk by the evening’s end. At one point, she pointed out a two-year-old boy sitting with a grey-haired couple in their sixties and told me they had lost a son. Her voice turned serious. A few years before, their only child had died in a terrible car accident, and the whole community had been gutted. The son was just 20 years old, magnetic and handsome and heir to his parents’ sizable fortune. The little boy was his replacement. She said it like that, and I could think of little else for the rest of the night.

The Indian wedding occurred at the start of a nine-month backpacking trip that bent the course of my life. I was 26 at the time, Read a year older, and we felt at the height of our youthful powers: old enough to pay for the trip, to not ask for permission, yet young enough to sleep in hostel bunk beds and sit on buses for entire days and feel only joy that we’d quit our jobs and were spending all our money. For the first time, we felt the throbbing pleasure of deep time—the hours and days extending into the distance, with the single, liberating purpose of seeing a bit of the world. And yet I could not shake my obsession with the Goan brothers. 

What fascinated me—at least, at first—was the idea of replacing a person. How did one fill a person-shaped hole? I thought about grief, about the delusion that accompanies loss, and the strange solutions people arrive at when their memories loom larger than their reality. I thought about the kind of memories that can entomb a person—with longing, with happiness. It’s incredible what the mind can do. Mine latched onto this family’s story, sparked an investigation into the mania of memory, without ever considering why the subject had captivated me in the first place. As our trip unfolded week by glorious week, across three continents, a nine-month summer that was pushing a crest into the terrain of my life, I kept returning, with blind intensity, to the idea of trying to replace something inherently irreplaceable.

Nostalgia was once considered a highly contagious disease, suffered by those far from home. Seventeenth-century Swiss mercenaries, fighting in France and Italy, pined for their native land, memories of which caused anxiety, depression, and a host of physical ailments. Nostalgia is often defined in terms of homesickness—nostos, the Greek word for homecoming, and algia, the word for longing—which works just fine if we enlarge the definition of home. Home is rarely a physical thing: people and places can become our home and just as easily, they can stop. As with most feelings, the difficulty with home is it becomes abstract—a cherished reminiscence that exists largely in the mind, that resists faithful recreation because we, or else the world, has fundamentally changed. But nostalgia is fed by desire. Like love, like ambition, it persists beyond logic, beyond our control. 

After our trip, we returned to New York and life took on a strange shape. We resumed the trappings of our lives, slid back into jobs and social circles, and yet privately, whenever Read and I were alone, without fail, we pulled out our phones and looked at pictures from our trip. We hung our memories on the walls. We asked each other—in bed, waiting for the train—if we could eat one meal from that year, lie on any of those beaches, which would we choose? It was the fish curry thali from Take a Break. It was the deserted coastline of Romblon Island. On weekends, we chased down experiences we hoped would suck us back into 2014, but they only made our longing stronger. Eating pho was ruined, every time, by our talk of what it would cost in Vietnam, about that legendary bowl that couldn’t be topped. People shoved past us on the train platform and we lamented the neat queues in Taiwan. We missed the convenience store cuisine of Japan, the milk bread sandwiches with soft egg salad. We couldn’t stop—and the more we talked about it, the more we reached for it, the greater it loomed in our collective yearning.

According to Harvard professor and nostalgia scholar Svetlana Boym, nostalgia can be divided into two types. The first, called restorative nostalgia, produces a literal desire to restore the past, to recreate that lovely snapshot in time for which you long. It’s the impulse that makes you move back to your hometown to raise your children as you remember being raised. The same impulse that sends bands on reunion tours, reboots old movies, and treats warm memories like keys to unlocking happiness in the present. Restorative nostalgia is anguished by time’s incessant forward march. Many traditions worthy of preservation are kept alive by restorative nostalgia—but pushed further, it is also the very mania that makes a person believe they can, and should, throw an anchor into the past and hang on against the current.

Two years later, in 2016, Read and I still hadn’t let the trip go. Instead, we’d hatched a cockamamie plan to return to Asia. Using Read’s college thesis—a case for growing bamboo for biofuel—we’d spent months applying for grants that would fund research in Asia. We’d received $10,000, a glorious sum that was not enough to cover our fantasy of living another year abroad. Still, we packed up our lives, said goodbye to New York, and set off for northern Thailand without a plan to return. 

We’d chosen Chiangmai because we’d loved it in 2014—it was teeming with nature and restaurants and friendly, creative people, all while being inexplicably cheap. And it still was! Chiangmai was almost exactly as we remembered; it delivered an experience nearly untouched by time. But dropping your life to see the world, for the second time in as many years, carries questions that do not arise with the first. There we were, eating a meal we’d imagined a hundred times, wondering what exactly we were doing. Were we lost? Were we hiding from adulthood? Read and I were entering our thirties, newly married, different people trying to make the old shoe fit. What had changed was not Thailand, or our carefully reconstructed fantasy, but us. Instead of feeling younger and looser, we felt unbearably old. Back in the land we’d memorialized, dragging around the heavy bag of our memories, we were forced to confront the passing of time. The single-mindedness that had brought us there was now two years old, and we’d since become more skeptical of our scrappy, restless plans, our new-apartment-every-year lifestyle. The wide-open days, our expat idleness: it all made us increasingly self-conscious.

Each morning, Read rode his motorbike out to the fields of Chiangmai University, where he was growing bamboo with Thai scientists, desperate for the plantlets to do something impressive, while I walked my laptop to the roof of our apartment and tried to write. But neither of us could focus on work. We became obsessed with unlocking the joy we had chased across the world. We wanted it so badly, we were certain we could manufacture it, if only we went more places, met more strangers, followed the blueprint of our bygone success. Instead, we spent a year messing with our memories. Every stubborn attempt to recreate the past built a strange edifice, which gradually cast a shadow over the very thing we were trying to preserve. 

Back in New York, humbled and skittish, we wanted jobs. We wanted steady paychecks and purpose and an apartment into which we could unpack and simply breathe. And as we slowly collected these things—a steady writing gig, a start-up job, a 10-piece set of pots and pans—the joy we’d chased across the world made its first appearance in years. It took us both by surprise—this feeling of release that accompanied the very thing we feared would strangle us: getting older, growing up. And what had changed? The answer, maddeningly, was the same as it had been in Thailand. We had changed. It was us who had been marked by time, the angle of our vision widened and warped.

Looking backward, everything had taken on new meaning. Our perfect memory of 2014 was altered by the pain of failing in Thailand, which was altered by the relief of succeeding in New York, which would undoubtedly be altered by whatever came next. It was foolish to behave as if experiences could exist in a sealed-off, immaculate state—that I could dig a memory from its soil, drop it in the wrong environment and expect it to thrive. And yet even as I marveled at this realization—that experience, that memory, are inherently unreplicable—I couldn’t kick the habit of looking over my shoulder. I began to ogle my memories like an old lover I missed but could never call. 

The second type of nostalgia, reflective nostalgia, is the kind that lets you steep in the past for the sake of sheer remembrance, in search of happy feelings. It’s a meditation on the passing of time, without the ambition of recreating it. Reflective nostalgia allows a person to draw comfort from memory—a beautiful and necessary resource. But in the act of recalling happiness, memory becomes distorted. I know this because on brutal New York winter days, Read and I reached for Thailand in this lopsided way, warming ourselves with the memory of endless days and long motorbike rides, erasing the distress of being broke, restless, and lost. Pain is diminished with time, while happiness tends to bloom. The thrill of nostalgia—restorative and reflective both—is the indulgent pleasure of contrasting something smoothed by time and perspective with the messy, complicated present.

Nearly six years had passed since 2014—the trip that ruled over all our memories—when I began itching for a homecoming to confront the nucleus of my nostalgia. Read and I both felt the desire to return to the place that loomed the largest: to Goa, where we’d attended the wedding that set the tone of our trip, mixed us up, and kept us chasing a feeling we could never recreate. In December of 2019, we boarded a flight to India and retraced our steps as a kind of confrontation with ourselves. 

We charted a pilgrimage into our past: we checked into the same hotels, ate at the same restaurants. At the hotel where we’d met the prince, of which he was a part-owner, we spotted him. Six years older, in jean shorts and a t-shirt, he remembered us. He seemed amused as he eyed us like relics and told us he had two kids now. The hotel had rows of newly constructed bungalows, a glass-walled shop selling expensive silks. It was a far cry from the barebones establishment we remembered. And it helped. It force-fed us perspective, showed us how everything changes, everything shifts, and how to lament the passing of time is to lament the act of living itself. We looked at Goa with eyes from the future, and what we saw helped us to shake the experience from its impossible pedestal.

It was a bewildering feeling, being on the far side of time, yet just beginning to understand how the dynamic between experience and memory would continue to shift, over and over, often without my knowing it. There I was, sitting on a Goan beach contending with my own recollections, wholly unaware that in two months, a global pandemic would halt the world, that I would move back to my home state, deliver my first child, that this trip to India would be the final act before everything turned upside down. I knew none of this, all of which would bleed ever outward, staining my perception of past and future. Memories are living things. They are ever-evolving, kaleidoscopic. And to remember this is a lesson I will likely relearn many times. 

Last year I became a mother. I often think about how I will remember this time—chasing a tiny tornado around our home, adrift in a new city, barely writing, habitually lonely, navigating an unending pandemic. With enough distance, the answer, likely, is fondly.  

Culture Shock: Reassessing the Workshop

-

1.

I had been living in Japan for 20 years when I moved back home to the United States in 2011. Upon my return, I decided that I would finally focus my energies on creative writing. But where to begin? The more I consulted friends and colleagues, the more I heard the same thing, over and over:

“Whatever you do, don’t get an MFA.”

I received this advice from published authors and, even more frequently, from MFA-holders themselves. My understanding of MFA programs was limited—I’d heard of the prestigious “Iowa Workshop” but was otherwise completely unaware that MFA programs in the U.S. had exploded in popularity over the past two decades.

In Japan, there is no such thing as an MFA. People who want to be writers study things like literature and journalism. My own undergraduate background was in philosophy from Berkeley, and I later got an MA in Japanese literature and linguistics from the University of Wisconsin at Madison. In grad school, we read voluminously, wrote translations, and did literary analysis. After graduating, I worked as a translator in the fields of business, government, and academia. It was around this time that I had what was probably the best creative-writing “craft” experience of my life: I was hired to translate a novel from Japanese to English.

More than once I’ve been told by successful writers that if I wanted to become a writer, I should copy out by hand my favorite novel. “You have to write out the entire thing,” one of them told me. “You can’t imagine how illuminating it can be.” I’ve never done this exercise myself, but I believe that I’ve experienced its intended effect doing literary translations. Translating a novel was a formative experience for me as a writer because I learned that writing is like any other art: while talent can’t be taught, technique can be learned. So, how exactly does one learn technique? I decided to take creative writing classes, earning my Certificate in Creative Writing from UCLA Extension and attending classes through Stanford Continuing Education’s program, as well as a handful of other writing centers. It was then that I experienced the creative writing workshop for myself.

So, what is a workshop?

It is, first and foremost, an American invention. The basic idea is that a group of writers, sometimes called a cohort, reads a set number of pages submitted by its members—usually students submit singly or in small groups. The cohort then provides structured feedback concerning strong and weak points in the writing with suggestions for development and revision. During this time the authors are under “a cone of silence,” a kind of gag order. Authors are not allowed to speak during their own workshops until the end, at which point they can thank their cohort and perhaps provide clarification or ask a question or two.

In his 2021 book Craft in the Real World: Rethinking Fiction Writing and Workshopping, Matthew Salesses argues that, at base, workshops should enable writers to articulate their artistic visions. I remember when I started working on my certificate at UCLA Extension, a friend in one of my first classes told me that the best outcome I could hope for was that “the classes will help you understand what you like.” I was skeptical. Is that all? But as I continued to study, I came to agree with my friend. It is valuable to be able to hammer out one’s own artistic vision through conversation with other people. For this reason, giving feedback is usually more valuable than receiving it, which is also something most writing teachers will tell you.

But Craft in the Real World also vigorously critiques the American MFA program and the workshop model in general. (Salesses was just appointed assistant professor of writing at Columbia University’s MFA program.) Salesses wonders: if only one type of writing is held up as “good” and the programs remain highly insular, how can an artist articulate a unique vision? He writes:
If you have been taught to write fiction in America, it is a good bet that you have been taught a style popularized by Ernest Hemingway and later by Raymond Carver, sometimes described as ‘invisible,’ that is committed to limiting the use of modifiers and metaphors, to the concrete over the abstract, to individual agency and action, and to avoiding overt politics (other than the politics of white masculinity). Instead of a political argument, a character might angrily eat a potato.
Ironically, the workshop’s roots are overtly political. It was by reading two books—Anis Shivani’s Against the Workshop: Provocations, Polemics, Controversies and Eric Bennett’s Workshops of Empire: Stegner, Engle, and American Creative Writing during the Cold War—that I learned how the U.S. government, via the CIA, interfered in the creation of the Iowa Workshop, and how this interference continues to inform the multitude of writing programs we have today. From 1941 to 1964, Iowa Writers’ Workshop director Paul Engle fundraised for the program using explicitly anticommunist rhetoric, promising to promote American values through the teaching of creative writing. Among the program’s donors were the Rockefeller Foundation, the State Department, the Fairfield Foundation, and the Asia Foundation, the latter two of which were CIA front groups.

And so the Iowa Workshop became a form of soft power to push back against Soviet collectivist ideas. As a result, writes Bennett, the Workshop was mandated to promote stories centered around the private life of the individual. Literature was to highlight “sensations, not doctrines; experiences, not dogmas; memories, not philosophies. Anything to ensure collectivist movements would not form.”

Elif Batuman, in a 2010 essay in the London Review of Books called “Get a Real Degree,” argues that these rules have become so embedded in creative writing workshops that today they go unquestioned. Sometimes called psychological realism, the basic canon encourages a writing style based on strong individualism, an encapsulated self out in the world. And in fiction, as well as a lot of nonfiction, the name of the game is something we call “conflict.”

You will often hear in workshops that conflict is the fuel that drives all story. We are taught to begin in scene—and then, teachers tell us, to “stay in scene”—and to begin as close to the central point of conflict as possible. From here the story moves toward its resolution. Along the way, there must be plenty of “character development.” One of my fiction teachers told me that this focus on character arcs is almost an obsession in American literary fiction.

Americans read far fewer books in translation than readers in places like Japan, Poland, France, or Spain, and the workshop is even more insular than the general reading population. My biggest complaint about my workshops, shared by many of my cohort-mates, is that we read the same books and stories over and over again. There is a case to be made that this insularity and practice of narrow reading has helped create a canon of bland, cookie-cutter books.

2.

Throughout Craft in the Real World, Salesses questions the extreme whiteness of creative writing programs. (Tongue in cheek, Batuman, in her London Review of Books essay, places “writing classes” at #14 on a list of “stuff white people like.”) Speaking anecdotally, my writing classes and workshops have overwhelmingly been taught by white female instructors. Taught by faculty so homogenous—racially, in gender, and linguistically (my teachers have been primarily monoglots with no experience reading in other languages)—the classes present a narrow and skewed system of aesthetic values.

In her 2017 book The Girl at the Baggage Claim: Explaining the East-West Culture Gap, Gish Jen unpacks the way differing underlying concepts of self inform the various storytelling traditions around the world. Throughout her career as a writer, Jen has made a case for fiction that combines both Eastern and Western craft. In the West, she says, the prevailing concept is what she calls the “big pit self,”
a self unlike any other in the world, assertive and full of self-esteem, and yet anxiously protective of its self-image and obsessed with self-definition. Why is it, exactly, that Americans must have fifty flavors of ice cream when other cultures are happy with ten? Why do we talk about ourselves so much? Why are we consumed with the memoir? Why is personal growth so important? Does self-esteem come at a price? And why do we see work the way we do, and how did we get this way?
In contrast to the “pit self,” Jen explores a notion of self that is far more prevalent outside the Western world, the interdependent “flexi-self” associated with collectivistic societies. In this case, the boundary between self and world is “nowhere near so absolute. It is, rather, porous and fluid—a dotted line.” It would only be natural for this latter sense of self to inform the writing traditions of those countries. And so the American workshop can be encouraging or stifling depending on one’s background, because, as Salesses argues, the workshop is all about societal expectation. Being so firmly founded in cultural norms and ideology, it will not promote artistic rule-breakers or genre-defiers.

3.

After 20 years in Japan, where for the last decade I thought, dreamt, and read mainly in Japanese, my thinking and writing now reflect Japanese storytelling styles. I prefer more meditative writing with constant pivots and turns. I love surprises, and I prefer the lyric over the concrete, the “nobility of failure” over the hero’s journey. And more than anything, I love books that refer to other books.

Salesses, who was born in Korea, reminds us that not all traditions favor conflict, or character-driven models, like the hero’s journey. He cites Chinese, Korean, and Japanese stories, which “developed from a four-act, rather than a three- or five-act structure: in Japanese it is called kishōtenketsu (ki: introduction; sho: development; ten: twist; ketsu: reconciliation).” The kishōtenketsu structure informs fiction, nonfiction, theater, and even the movements of the tea ceremony. It is a profoundly different aesthetic system from the Western model, with its primary focus on conflict. Perhaps the most common critique I hear from Western readers about Japanese fiction is that nothing ever seems to happen.

Last year, I reviewed Multispecies Cities: Solarpunk Urban Futures for the Asian Review of Books. Multispecies Cities is a collection of climate fiction, set primarily in the Asia-Pacific, that seeks to imagine what cities might look like in a future of multispecies coexistence and green justice. The stories are filled with a polyphony of voices—some non-human and a few non-alive—working together to bring about solutions that address global warming, the extinction of animal species, and the coming climate disaster. The stories in Multispecies Cities call on us to change not just what we write about, but how we write. The stories themselves question progress-based narratives and stories of the individual, or the lone hero. At first, Western readers might even feel disoriented by these stories of cooperation and immersion in the environment, where rivers speak and stars can be heard. You might need to re-read some of them to assess what is going on when no one wins or loses, overcomes or fails—because at first glance nothing seems to change.

Salesses laments that we have come to teach plot as a string of causation in which the protagonist’s desires move the action forward. He says:
Western fiction can often be boiled down to A wants B and C gets in the way of it. This kind of story shape is inherently conflict-based, perhaps also inherently male (as author Jane Alison puts it: “Something that swells and tautens until climax, then collapses? Bit masculo-sexual, no?”). In East Asian fiction, the twist (ten) is not confrontation but surprise, something that reconfigures what its audience thinks the story is ‘about.’
In workshop, “Nothing happens” is always meant as a criticism, an inherently bad thing. This can be stifling for a writer who doesn’t read for urgency or conflict in everything.

4.

I have spent roughly half my life in Japan and half in America. For so long, I had two languages switching back and forth in my mind, one for America, one for Japan. There were different clothes, ways of speaking, foods to eat, body languages, and styles of friendships. It has been an enormous adjustment to permanently return to America in mid-life. I have had to relearn and get reacclimated to a lot. But I never would have expected that reading and writing would be the most challenging adjustments so far. I had not realized how insular American publishing is, how few books that might broaden Western tastes are actually translated for the American market, and how translations are almost completely ignored in creative writing classes. And how students are made to read and re-read the same books over and over again.

And so, I find myself in a quandary. It is not that I think we should scrap existing syllabi, but rather that we must make room for other storytelling traditions in these programs. And this must start with reading. As Matthew Salesses repeats again and again in his book: what is being taught as universal rules of good writing in these programs is nothing more than a highly narrow understanding of literary taste.

I wonder how we can ever change the world if we don’t first change our dominant mode of storytelling, with its intense and politically motivated focus on individual experience and conflict. The future of creative writing programs should be radically inclusive, allowing not just for a multiplicity of voices but for a wide variety of traditions. There is room on the syllabus for wider reading that pushes back against the established rules of the workshop—literature in translation, for instance—that would infuse new ways of thinking, fresh forms, and more creativity into a cookie-cutter literary landscape.

As I look toward my own studies, I feel ambivalent. As a reader who tends toward more international styles of storytelling, I have never been a huge fan of writing that is scene-driven, nor have I ever been all that interested in the hero’s journey—much preferring Borges and Calvino to Carver or Hemingway. Before I do anything else, I think I need to make my own rules. Still, there is value in the workshop. My writing courses have challenged me to think about how and why people read, to engage with the purpose of the art. I am convinced that people read not just to be entertained, but to be enchanted. There are many roads to that place of enchantment, some less traveled. But in the end, that is a road I will have to discover for myself.

The Poetic Life of Samuel Menashe

-

Samuel Menashe, who died 11 years ago this month, lived most of his life in a three-room railroad flat on Greenwich Village’s Thompson Street. He was reluctant to call himself a poet, though if accused, he wouldn’t deny it. In 2003, at the age of 79 and after decades of toiling in relative obscurity, he was awarded the Poetry Foundation’s first-ever “Neglected Masters Award.” This all-too-fitting capstone cemented Menashe’s legacy as a poet of unambiguously astonishing power who, several honors and admirers aside, was famous most of all for not being famous.
Looking for a silver lining, it might be tempting to pigeonhole Menashe as a poet’s poet—he had many celebrated champions, among them Donald Davie, Stephen Spender, Danielle Chapman, Dana Gioia, and, perhaps most meaningful for Menashe, Austin Clarke, who praised Menashe’s work in the Irish Times and even recited some of it on the radio. Menashe’s most crucial ally was probably the poet Kathleen Raine, who helped him find a London publisher for his first collection and contributed its foreword. But to say that Menashe had any major impact on his peers or the generations that succeeded them would be a stretch. When, as an experiment, I googled “influenced by Samuel Menashe,” it took .57 seconds to return exactly zero results.

In the introduction to his 2005 self-titled collection, Menashe writes, “When the Beat poets ‘made the scene,’ I heard the pious platitude that it was good for poetry, but it was not good for my poetry. If confessional poetry was to the fore, I had nothing to offer its devotees.” Echoing these sentiments, Don Share, another notable admirer, wrote upon Menashe’s death, “In our time of poetry movements, schools, coteries, and communities, Samuel Menashe was singular and self-sufficient.”
Share’s observation has a more positive spin than Menashe’s own, though one can’t help but wonder if Menashe’s singularity didn’t perhaps necessitate his self-sufficiency. There is no doubt strength in numbers and, counterintuitive as it may seem, being part of a movement can actually help an artist to stand out. The creator who goes it alone might garner interest and admiration, but it can be harder to determine their overall relevance, which to a critic means it’s harder to justify writing about them. In fact, even some of the most visionary and fearlessly independent artists like Picasso and Dylan were at least partially associated with larger movements (in their cases, cubism and folk, respectively), even as they clearly dominated them. Davie perhaps put it best when he wrote, “One trouble is that [Menashe’s] poems are as far from being traditional as they are from being in the fashion, or in any of the several fashions that have come and gone, whether in British or American poetry, over the last twenty-five or for that matter one hundred years.” Being consistently out of fashion is hardly a recipe for recognition, let alone financial comfort.
Would Menashe then have been better off if he’d tried to attach himself to a more “fashionable” movement? Could he have ingratiated himself, for example, with the Beats? This seems unlikely. While the spirituality in Menashe’s work would surely have resonated with the likes of Ginsberg and Kerouac, those poets’ energy, exuberance, and expansiveness, not to mention their legendary subversiveness, could not be more different from Menashe’s dignified, finely chiseled poems.
How about the confessional poets, then? There is plenty of autobiography in Menashe’s poetry, but its minimalism and enigmatic nature put him at odds with that movement as well. As Danielle Chapman notes, Menashe “doesn’t dredge through memories or parade us through his bedroom, and, except as the archetypal mother, father, or friend, he rarely makes mention of specific people or places.” When he does dig into his roots, however, the results are marvelous:

My father drummed darkness
Through the underbrush
Until lightning struck
I take after him
Clouds crowd the sky
Around me as I run
Downhill on a high—
I am my mother’s son
Born long ago
In the storm’s eye

What about the New York School? “No, they won’t have me,” Menashe said of the city’s famed poetry scene of the 1950s and ’60s. It doesn’t seem a good fit anyhow: while poets like Frank O’Hara and John Ashbery wrote charmingly and inventively about their struggles to put words on the page, and about the form itself, their work is worlds apart from Menashe’s spare approach to the same subject matter:

The statue, that cast
Of my solitude
Has found its niche
In this kitchen
Where I do not eat
Where the bathtub stands
Upon cat feet—
I did not advance
I cannot retreat

None of this is to say that Menashe’s work exists in a vacuum. In a wonderful video interview from 2009, we are treated to a shot of some of the books that clutter his shelf, which include Shakespearean sonnets and collections from Blake, Frost, and Yeats. In interviews, Menashe would frequently cite the influence of Blake and of English translations of the Hebrew Bible. The latter might raise the question of whether Menashe could be thought of as a Jewish poet—a valid proposition. The son of Russian-Jewish immigrants, Menashe spoke Yiddish as his first language, and Jewish themes pervade his work. His first American collection was called No Jerusalem But This, whose title poem “The Shrine Whose Shape I Am” contains these lines:

There is no Jerusalem but this
Breathed in flesh by shameless love
Built high upon the tides of blood
I believe the Prophets and Blake
And like David I bless myself
With all my might

At times, he is unabashedly spiritual, and some of his poems could almost pass for psalms:

O Many Named Beloved
Listen to my praise
Various as the seasons
Different as the days
All my treasons cease
When I see your face

However, “Jewish poet” is an exceptionally broad category, and if some of his themes overlap with those of modern Hebrew poets like Yehuda Amichai and Hayyim Nachman Bialik, there is also something distinctly American about his cadence. And unlike modern Yiddish writers like I.B. Singer or the masterful Chaim Grade, Menashe experienced and wrote about World War II through the lens of having been a U.S. soldier rather than a Holocaust survivor. He took part in the famous Battle of the Bulge in the Ardennes, and his wartime experiences suffuse his work, even as they’re often veiled in mystery:

Do not scrutinize
A secret wound—
Avert your eyes—
Nothing’s to be done
Where darkness lies
No light can come

Menashe is similarly ill-suited to being grouped with the so-called “Deep Image” poets who gained prominence in the ’60s and ’70s. The rich, vivid language of poets like Galway Kinnell, James Wright, and Diane Wakoski is replaced in Menashe’s poetry by a far simpler, sparer vocabulary. His minimalism is in fact part of what makes his work so intimidating: he wields so much power with so few tools.
I wonder if Menashe ever felt tempted to adjust his poetry in order to better fit in—perhaps, but then he would have been a different poet. Whether intentionally or unconsciously, many creators have been all too willing to compromise their work by allowing forces beyond pure, unadulterated self-expression to impact its creation. Samuel Menashe, it seems, never made such a bargain.
In his later years, Menashe sometimes seemed rueful about his lack of recognition, even as he marveled over his long overdue accolades and the legacy they granted to his work. In a 2005 interview with Adam Travis of the Poetry Foundation, he said that his had been “the opposite of a life buttressed by grants and having a publisher and going to him every few years with new poems. Each time I’ve had to start from scratch.” When asked if there were any merits to obscurity, his reply could not have been more emphatic: “NO! No, no, no, no, no! You want your work to be read. Obscurity means you’re not read.”
Even if Menashe’s underappreciation has on some level come to define him, is that really so terrible? In my own brief but fateful encounters with Menashe over the years, I never once thought of him as unsuccessful. Watching him recite his poetry in his regal, Jimmy Stewart-like voice before a loving crowd at the Bowery Poetry Club was nothing short of magical. Success can take many forms and can mean different things depending on the artist or the medium. Samuel Menashe focused on the work rather than the scene, lived frugally and modestly, and achieved his much-deserved recognition just in time to get some satisfaction and no small measure of bemusement out of it. There’s a purity to the way he led his artistic life, a charm and grace. You might very well call it poetry.

A Debut Novelist Imagines Life After Her Own Death

I was 17 years old when I died.
In the years after my death, I liked to use this line as an icebreaker in conversations, always enjoying the disturbed reactions. Even alone, I found myself compulsively scribbling the same sentence again and again—I was 17 years old when I died. Over the following years, one sentence eventually became two, two became three. Seemingly outside of my own choice, my brain, desperate to process my death, forged the beginning of a story. The story evolved, as most stories do. It wasn’t about me. Not exclusively. It became a story about a woman in the throes of true love with a promising life ahead of her who dies of lymphoma, cryogenically preserves her body, and is resurrected 100 years later in a world where it is illegal to be a resurrected human. Ultimately, it became a story about the repercussions of cheating death—the way I had.
The truth of my death is much more common. I died in a car accident on the way to take my junior-year biology final. I don’t remember any of it, the moments before my death or the many minutes I was without a heartbeat. In fact, that period where I wasn’t alive distinctly stands in my mind as a stark nothingness, a vacuum of time where I didn’t exist. I can still feel it, the scar of not existing, as real as I feel the scars on my face that I try to hide behind thick bangs. It was mere chance that my life was returned to me. My unlikely guardian angel was a former EMT with a history of substance abuse who happened to be driving by when she saw my sky-blue VW Bug crumpled underneath a relatively unscathed Suburban SUV. She leapt to my rescue, scaling the smoking wreckage and heaving my lifeless body through the broken windshield. With her bare palms, she held my severed temporal lobe artery to try to buy time for the ambulance to get there and restart my heart. At least that’s the story as it has been told to me. Stories from others are all I have to rely on to know anything about my own death. I was told that they worked on my body the entire ride to Grady Hospital in Atlanta and were only finally able to resuscitate me as we arrived. I was rushed into surgery, where they repaired my bleeding brain. A few hours later I awoke, having been given a second chance. I had come back to life—or at least a version of me did.
As traumatic as that morning was, I wonder if I would be so marked by death had my accident been all I had to deal with. If my own death were all I had been forced to endure, then maybe that sentence never would have wormed its way into my pen, and I never would have written the story that became my debut novel The Awoken, which publishes on August 9. But unfortunately, mine wasn’t the only death in the spring of 2005. Two weeks before my own death, my childhood best friend died in a car accident on his way home from work.
Charlie was 17 years old when he died.
We’d been friends since we were seven. He was goofy and charming and kind. We made plans to live together in Los Angeles after college and go to the beach every day. A year before we died, I took him to my high school dance and hated it when my friends flirted with him. We were never romantic, but still I wanted him to myself. He somehow knew, walking away from the gaggle of girls surrounding him to sit with me and put his feet in the cold pool next to mine. We didn’t talk. We didn’t need to. I just rested my head on his shoulder while the world spun; only our future lay ahead of us.
The proximity of our life-ending events is mere coincidence. Sadly, there was no passing EMT with a troubled past looking for redemption. Charlie died and stayed dead while I had the privilege, and shouldered the guilt, of coming back to life. To this day, my death haunts me. Shards of glass remain trapped beneath my skin. Every few years one will push itself close enough to the surface that it’s uncomfortable, and I have to get it removed. When that happens, I’m face-to-face with what could’ve been. I walk through life shadowed by the corpse of myself from a parallel universe who was not as lucky. The universe where I, like Charlie, died and stayed dead. Or better yet, a universe where Charlie and I switch places.

These dual, soul-altering events led me to that indelible sentence, which then led me to write a novel. It wasn’t that simple, of course. I wasn’t a novelist. At that point, I was a screenwriter and a documentarian, but I’d always been an avid reader. I grew up on big, fantastical adventures like Dragonriders of Pern, The Chronicles of Narnia, and The Lord of the Rings. After my death, my reading shifted along with the rest of my life. I wore thin my copy of Frankenstein, drawn to stories that questioned the very definition of life. Until then, death had been abstract, something I read about in books, and so after my death, I turned to books to make sense of it.
More than a decade later, I still found myself pulled back to that sentence, back to my original Word document, and only then did I admit to myself that this was a novel. I began writing. Although it seems naïve to say now, the truth is that I wasn’t consciously aware that I was writing a story about my death. To anyone reading this, I imagine that seems absurd, considering The Awoken is about a woman, eerily like myself, who dies young and then is given a second chance at life. But I wasn’t aware it was also about my death until I read my first completed draft. The nothingness I had experienced was there on the page, anthropomorphized as the ultimate antagonist. My husband always knew, as did my mother, but they never pushed me on it. Meanwhile, my editor at Penguin Random House didn’t even know I had been in a car accident until we were in copyedits, more than a year into our relationship. But there it was on the page, plain as day. I still had a lot of work to do around processing my death, and even clearer was the fact that I was terrified of ever dying again.
Since my and Charlie’s deaths, I’ve developed an obsessive interest in life-extension science. When I crawled out of my hospital bed at 17, I wanted nothing more than immortality. I convinced myself I was immortal, and I even tried proving so by playing an ultimate frisbee game only two weeks after my death. The doctor had ordered me to strict bed rest. I wasn’t even supposed to go downstairs to the kitchen, but I couldn’t lie there with death surrounding me. Much to my mother’s horror, I ran across a football field with a pseudoaneurysm the size of a golf ball protruding from my temple, ready to explode if strained. My defiance stemmed from something deeper than mere teenage rebellion. It was a primal necessity. I had to be immortal, or else be forced to face my mortality. I never wanted to see that nothingness ever again. This overriding need to stay far away from death for as long as possible is what brought me to learn about cryogenics in the first place. So it was only natural that I used cryogenics as the method by which my protagonist, Alabine Rivers, gets her second chance. In fiction, we—perhaps me most of all—can viscerally consider the nuances, dilemmas, and ramifications of a world where death isn’t the end.
Fear of death is a part of life, I’ve been told. It’s what makes us human. But to be honest, after my research, I now harbor an equally powerful fear of humanity attaining immortality. Since the dawn of man, we’ve fantasized about escaping death—from Herodotus’s Fountain of Youth all the way to today’s Silicon Valley billionaires pouring their wealth into life-extension research. Unlike the ancient Greeks, Elon Musk actually has a chance of cheating, or at least delaying, death. Never before has humanity, through science, been so close. The emerging field of cryogenics is at the heart of this. Imagine a pause button. A person with organ failure able to wait until the perfect transplant becomes available. A cancer patient able to wait until a cure is developed. A victim of a car accident able to be frozen at the scene to prevent bleeding out, regardless of luck.
You don’t have to imagine it: people have been preserving themselves for decades, awaiting the day science figures out how to resuscitate a human. We already know how to bring rats back from preservation. Humans won’t be far behind. In Yuval Noah Harari’s book Sapiens, and even more so in his follow-up book Homo Deus, Harari details humanity’s past, present, and future attempts to defeat death. While my heart races at the idea, Harari clinically and academically considers what would happen to our world in a future where death is no longer inevitable. He writes that civilization has kept going on the singular knowledge that everyone, regardless of race, wealth, or privilege, will face death. What happens if that great equalizer is taken away? Some believe the wealthy will live forever while the marginalized remain unable to shed their mortality. Or maybe it would be the reverse. Those brought back from the dead might be seen as a blight against mankind, our new tool of resurrection seen as a direct attack on God. After all, in a world without death, God is no longer necessary. This is the world I forged in my novel.
The ethics of life extension are complex, and now I question my need for immortality. Don’t get me wrong, I am still terrified of the nothingness, but I was surprised to find the alternative similarly terrifying. Would I, like my character Alabine, actually choose to preserve my body? It was psychologically hard enough for me to come back from the dead once; would I really want to do it again? Would you? This is the question we should all be asking ourselves. We’re on the precipice of species-altering developments. Even this very day, any of you can call up Alcor, the preeminent cryopreservation company in the U.S., and pay real money to cryogenically preserve your body. You die with an agreement for Alcor to resurrect you when it’s possible, most likely into a world so foreign, so different from what you know today, without the loved ones you know today, that one must question whether it’s even worth it. That is my biggest hesitation in calling Alcor myself, along with the clear financial constraints. My before-death self is a different person than who I am now. I have her memories and her body (at least most of it), but there’s something in my core that has changed, an intangible quality that separates me from her. I recognize her, but am different. I don’t have her hope. I don’t have her carefree outlook on life. I don’t even have many of her friends left, as I systematically separated myself from them in the months that followed, believing that they would just never understand. The metamorphosis from her to me was hard. The idea of doing that again and again without end is crippling. What if the next time I didn’t recognize the girl they woke up at all? What if the time after that, I no longer even wanted to?
Perhaps fortunately, that world remains merely fodder for fiction. For now. There are many hurdles to jump before we can claim we’ve defeated death. It could be argued that inventing the technology to resurrect a human might be the easiest part; convincing the world to embrace such an immortal existence would likely prove a greater challenge. As we’ve seen acutely in recent years, science can be vilified, and not everyone trusts it. Although we learned to clone mammals a quarter of a century ago, we have yet to see a cloned person born (independently verified, that is). Change is hard, and humans are pack-minded creatures; we ostracize those who are different. Imagine the divide between those who have been dead and those who haven’t. Like Dr. Frankenstein, we would have to redefine what life is, and decide whether those who have touched death should be seen as the same as those who haven’t.
It was my own death that made me see the value and fragility of life, a perspective I find separates me, even now in my thirties, from my peers. I wonder how fundamentally our civilization would change if we all met the nothingness and gained that perspective. Then again, would life retain meaning once humans have done away with death? Would I hold my re-given life as precious if I didn’t have the comparison of Charlie’s life taken? Regardless of whether I would actually choose to preserve my body, I certainly am prepared to fight for a future world where second chances are doled out to anyone who asks, and not just to the billionaires. What a kinder and more loving world this would be if second chances weren’t “hard to come by,” left up to chance or privilege, but instead available to everyone.
In my early twenties, my Facebook Messenger randomly dinged with a message. It was from my guardian-angel EMT. We had never spoken before, despite my multiple attempts to contact her and thank her for saving my life. Her message showed up in the “other” inbox for people you’re not friends with, so it was only by chance that I even saw she’d sent it. She told me that in the years since she held my bleeding body, she’d been silently watching me on social media go to college, make friends, lose friends, direct films, and write stories. She told me that watching me live in turn saved her life. She got sober. She got her second chance too.
I write this essay weeks before The Awoken is due to be released into the world. I am excited and scared—feelings I assume I share with most debut authors—but I have an added, unsettling layer. The day The Awoken publishes becomes yet another start of a new chapter in my life. Like the day my first film came out. The day I got married. Or the day I had my son. These are all chapters that Charlie will never get to have. I try to push these dark thoughts out of my head, as terrifying as the nothingness itself. I try to focus on the fact that I am working hard to do something with my privilege of life. Even if I never choose to cryogenically preserve my body, perhaps in writing this novel I’ve already preserved a piece of myself to live forever—my words and thoughts inscribed for generations to come. Meanwhile, I’ll keep reading new articles about the developments in cryogenics, continually wondering what if.