The Greeks Aren’t Done with Us: Simon Critchley on Tragedy

We know that ghosts cannot speak until they have drunk blood; and the spirits which we evoke demand the blood of our hearts.—Ulrich von Wilamowitz-Moellendorff, Greek Historical Writing, and Apollo (1908)

Thirteen years ago, when I lived briefly in Glasgow, I made it a habit to attend the theater regularly. An unheralded cultural mecca in its own right, overshadowed by charming, medieval Edinburgh to the east, the post-industrial Scottish metropolis never left me lacking in good drama. Also, they let you drink beer during performances. Chief among those plays was a production of Sophocles’s Antigone, the conclusion of his three Theban plays and one of the most theorized and staged dramas from that Athenian golden age four centuries before the Common Era, now presented in the repurposed 16th-century Tron Church. Director David Levin took the Attic Greek of Sophocles and translated it into the guttural brogue of Lowland Scots, and in a strategy now deployed almost universally for any production of a play older than a century, the chitons of the ancient world were replaced with business suits, and the decrees of Creon were presented on television screens, as the action was reimagined not in 441 BCE but in 2007.

It was enough to remind me of that headline from The Onion which snarked: “Unconventional Director Sets Shakespeare Play in Time, Place that Shakespeare Intended.” The satirical newspaper implicitly mocks adaptations like Richard Loncraine’s Richard III, which imagined the titular character (devilishly performed by Ian McKellen) as a sort of Oswald Mosley-like fascist, and Derek Jarman’s masterful version of Christopher Marlowe’s Edward II, which makes a play about the Plantagenet line of succession into a parable about gay rights and the ACT UP movement. By contrast, The Onion quips that its imagined “unconventional” staging of The Merchant of Venice is one in which “Swords will replace guns, ducats will be used instead of the American dollar or Japanese yen, and costumes, such as…[the] customary pinstripe suit, general’s uniform, or nudity, will be replaced by garb of the kind worn” in the Renaissance. The dramaturgical perspective behind Levin’s Antigone was precisely what the article parodied; there was nary a contorted dramatic mask to be found, no Greek chorus chanting in dithyrambs, and, as I recall, lots of video projection. The Onion aside, British philosopher Simon Critchley would see no problem with Levin’s artistic decisions, writing in his new book Tragedy, the Greeks, and Us that “each generation has an obligation to reinvent the classics. The ancients need our blood to revise and live among us. By definition, such an act of donation constructs the ancients in our image.”

Antigone, coming from as foreign a culture as it does, still holds our attention. The story of the titular character—punished by her uncle Creon for daring to defy his command that her brother Polynices’s corpse be left to fester as carrion for the buzzards and worms in the field where he died because he had raised arms against Thebes—would seem to have little to do with Tony Blair’s United Kingdom. When a Glaswegian audience hears Sophocles’s words, however, that “I have nothing but contempt for the kind of governor who is afraid, for whatever reason, to follow the course that he knows is best for the State; and as for the man who sets private friendship above the public welfare—I have no use for him either,” a bit more resonance may be heard. Critchley argues that at the core of Greek tragedy is a sublime ambivalence, an engagement with contradiction that classical philosophy can’t abide; as distant as Antigone’s origins may be, its exploration of the conflict between the individual and the state, terrorism and liberation, surveillance and freedom seemed very much of the millennium’s first decade. Creon’s countenancing of the unthinkable punishment of his niece, to be bricked up behind a wall, was delivered in front of a camera, as if he were George W. Bush announcing the bombing of Iraq from the Oval Office on primetime television. “Evil sometimes seems good / To a man whose mind / A god leads to destruction,” Sophocles wrote. This was a staging for the era of the Iraq War and FOX News, of the Patriot Act and NSA surveillance, and of the coming financial collapse. Less than a year later, I’d be back in my apartment stateside watching Barack Obama deliver his Grant Park acceptance speech. It was enough to make one think of a line from Antigone: “Our ship of fate, which recent storms have threatened to destroy, has come to harbor at last.” I’m a bad student of the Greeks; I should have known better than to embrace that narcotic hope which pretends tragedy is not the omnipresent condition of humanity.

What could Sophocles, Euripides, and Aeschylus possibly have to say in our current, troubled moment? Tragedy, the Greeks, and Us is Critchley’s attempt to grapple with those disquieting 32 extant plays that whisper to us from an often-fantasized collective past. What survives of Greek tragedy is four fewer plays than all of those written by Shakespeare—an entire genre of performance for which we have titles referenced by philosophers like Plato and Aristotle, with only those three playwrights’ words enduring, and where often the most we can hope for is a few fragments preserved on some surviving papyri. Critchley emphasizes how little we know about plays like Antigone, or Aeschylus’s Oresteia, or Euripides’s Medea; classicists have often hypothesized that they were born from Dionysian rituals, or that they grew out of satyr psalms, the “song of the goats,” giving tragedy the whiff of the demonic, of the demon Azazel to whom sacrifices of the scapegoat must be made in the Levantine desert.

Beyond even tragedy’s origin, which ancient Greek writers themselves disagreed about, we’re unsure exactly how productions were staged or who attended. What we do have are those surviving 32 plays themselves and the horrific narratives they recount—Oedipus blinded in grief over the patricide and incest that he unknowingly committed but prophetically ensured because of his hubris; Medea slaughtering her children as revenge for the unfaithfulness of her husband; Pentheus ripped apart by the frenzied Maenads in ecstatic thrall to Dionysus because the Theban ruler couldn’t countenance the power of irrationality. “There are at least thirteen nouns in Attic Greek for words describing grief, lamentation, and mourning,” Critchley writes about the ancients; our “lack of vocabulary when it comes to the phenomenon of death speaks volumes about who we are.” Tragedy, the Greeks, and Us is Critchley’s attempt to give us a bit of their vocabulary of excessive lamentation so as to better approach our predicament.

Readers shouldn’t mistake Tragedy, the Greeks, and Us for a conservative defense of the canon; this is no paean to the superior understanding of the ancients, nor is it highfalutin self-help. Critchley’s book isn’t Better Living Through Euripides. It would be easy to misread the (admittedly not great) title as an advertisement for a book selling the snake oil of traditionalist cultural literacy, that exercise in habitus that mistakes familiarity with the “Great Books” for a type of wisdom. Rather, Critchley explores the Greek tragedies in all of their strange glory, as an exercise in aesthetic rupture, where the works of Sophocles, Aeschylus, and Euripides configure a different type of space that renders a potent critique against oppressive logic. His task is thus the “very opposite of any and all kinds of cultural conservatism.” Critchley sees the plays not as museum pieces, or as a simple means of demonstrating that you went to a college with diplomas written in Latin, but rather as a “subversive traditionalism” that helps us to critique “ever more egregious forms of cultural stupefaction that arise from being blinded by the myopia of the present.” This is all much larger than either celebrating or denouncing the syllabi of St. John’s College; Critchley has no concern for boring questions about “Western Civilization” or “Defending the Canon”; rather, he rightly sees the tragedies as an occasion to deconstruct those idols of our current age—of the market, of society, of law, of religion, of state. He convincingly argues that any honest radical can’t afford to ignore the past, and that something primal and chthonic calls to us from those 32 extant plays, for “We might think we are through with the past, but the past isn’t through with us.”

Critchley explains that the contemporary world, perhaps even more so than when I watched Antigone in Glasgow, is a “confusing, noisy place, defined by endless war, rage, grief, ever-growing inequality. We undergo a gnawing moral and political uncertainty in a world of ambiguity.” Our moment, the philosopher claims, is a “tragicomedy defined by war, corruption, vanity, and greed,” for if my Antigone was of its moment, then Tragedy, the Greeks, and Us could only have been written after 2016. That year, and the characters it ushered into our national consciousness, can seem a particular type of American tragedy, but Critchley’s view (even while haunted by a certain hubristic figure with a predilection for the misspelled tweet) is more expansive than that. In his capable analysis, Critchley argues that tragedy exists as a mode of representing this chaos, a type of thinking at home with inconsistency, ambiguity, contradiction, and complexity. It’s those qualities that have made the form suspect to philosophers.

Plato considered literature in several of his dialogues, concluding in Gorgias that the “effect of speech upon the structure of the soul / Is as the structure of drugs over the nature of bodies” (he wasn’t wrong), and famously having his puppet Socrates argue in The Republic that the just city-state would ban poets and poetry from its affairs for the aforementioned reason. Plato’s disgruntled student Aristotle was more generous to tragedy, content rather to categorize and explain its effects in Poetics, explaining that performance is the “imitation of an action that is serious, and also, as having magnitude, complete in itself…with incidents arousing pity and fear, wherewith to accomplish its catharsis of such emotions.” Aristotle’s view has historically been interpreted as a defense of literature in opposition to Plato, whereby that which the latter found so dangerous—the passions and emotions roiled by drama—was now justified as a sort of emotional pressure valve that helped audiences purge their otherwise potentially destructive emotions. By the 19th century a philosopher like Friedrich Nietzsche would anticipate Critchley (though the latter might chafe at that claim) when he exonerated tragedy as more than mere moral instruction, coming closer to Plato’s claim about literature’s dangers while ecstatically embracing that reality. According to Nietzsche, tragedy existed in the tension between “Apollonian” and “Dionysian” poles; the first implies rationality, order, beauty, logic, and truth; the second signifies the realm of chaos, irrationality, ecstasy, and intoxication. Nietzsche writes in The Birth of Tragedy that the form “sits in sublime rapture amidst this abundance of life, suffering and delight, listening to a far-off, melancholy song…whose names are Delusion, Will, Woe.” For the German philologist that’s a recommendation, to “join me in my faith in this Dionysiac life and the rebirth of tragedy.”

As a thinker, Critchley Agonistes is well equipped to join these predecessors in systematizing what he argues is unsystematizable. A faculty member at the New School for Social Research and coeditor of The New York Times philosophy column “The Stone” (to which I have contributed), Critchley has proven himself an apt scholar who engages the wider conversation. He is not a popularizer per se, for Critchley’s goal isn’t the composition of listicles enumerating wacky facts about Hegel, but a philosopher in the truest sense of being one who goes into the Agora and grapples with the circumstances of meaning as they manifest in the punk rock venue, at the soccer stadium, and in the movie theater. Unlike most of his countrymen in the discipline, Critchley embraces what’s called “continental philosophy,” rejecting the arid, logical formulations of analytical thought in favor of the Parisian profundities of thinkers like Jacques Derrida, Emmanuel Levinas, and Martin Heidegger. Critchley has written tomes with titles like The Ethics of Deconstruction: Derrida and Levinas and Ethics-Politics-Subjectivity: Essays on Derrida, Levinas, & Contemporary French Thought, but he’s also examined soccer in What We Think About When We Think About Football (he’s a Liverpool fan) and, in Bowie, he analyzed, well, Bowie. Add to that his provocative take on religion in Faith of the Faithless: Experiments in Political Theology and on death in The Book of Dead Philosophers (which consists of short entries enumerating the sometimes bizarre ways in which philosophers died, from jumping into a volcano to love potion poisoning), and Critchley has announced himself as one of the most psychedelically mind-expanding of people to earn their lucre by explaining Schopenhauer and Wittgenstein to undergraduates.

What makes Critchley such an engaging thinker about the subjects he examines is both his grounding in continental philosophy (which asks questions about being, love, death, and eternity, as opposed to its analytical cousin, content to enumerate all the definitions of the word “is”) and his unpretentious roots in working-class Hertfordshire, studying at the glass-and-concrete University of Essex as opposed to tony Oxbridge. Thus, when Critchley writes that “there is an ancient quarrel between philosophy and poetry,” it seems pretty clear that he’s a secret agent working for the latter against the former. He rejects syllogism for stanza and embraces poetics in all of its multitudinous and glorious contradictions. The central argument of Tragedy, the Greeks, and Us is that the form “invites its audience to look at such disjunctions between two or more claims to truth, justice, or whatever without immediately seeking a unifying ground or reconciling the phenomena into a higher unity.” What makes Antigone so devastating is that the title character’s familial obligation justifies the burial of her brother, but the interests of the state validate Creon’s prohibition of that same burial. The tragedy arises in the irreconcilable conflict of two right things, with Critchley explaining that Greek drama “presents a conflictually constituted world defined by ambiguity, duplicity, uncertainty, and unknowability, a world that cannot be rendered rationally fully intelligible through some metaphysical first principles or set of principles, axioms, tables of categories, or whatever.”

This is the central argument: that the “experience of tragedy poses a most serious objection to that invention we call philosophy.” More accurately, Critchley argues that tragedy’s comfort with discomfort, its consistent embrace of inconsistency, its ordered representation of disorder, positions the genre as a type of radical critique of philosophy, a genre that expresses the anarchic rhetoric of the sophists rather than that of their killjoy critic Socrates and his dour student Plato. As a refresher, the sophists were the itinerant and sometimes fantastically successful rhetoricians who taught Greek politicians a type of disorganized philosophy that, according to Socrates, had no concern with the truth, but only with what was convincing. Socrates supposedly placed “Truth” at the core of his dialectical method, and, ever since, the discipline has taken up the mantle of “a psychic and political existence at one with itself, which can be linked to ideas of self-mastery, self-legislation, autonomy, and autarchy, and which inform the modern jargon of authenticity.” Tragedy is defined by none of those things; where philosophy strives for order and harmony, tragedy dwells in chaos and division; where syllogism strives to eliminate all contradiction as irrational, poetry understands that it’s in the complexity of inconsistency, confusion, and even hypocrisy that we all dwell. Sophistry and tragedy, to the credit of both, are intimately connected, both being methods commensurate with the dark realities of what it means to be alive. Critchley claims that “tragedy articulates a philosophical view that challenges the authority of philosophy by giving voice to what is contradictory about us, what is constricted about us, what is precarious about us, and what is limited about us.”

Philosophy is all arid formulations, dry syllogisms, contrived Gedankenexperiments; tragedy is the knowledge that nothing of the enormity of what it means to be alive can be circumscribed by mere seminar argument. “Tragedy slows things down by confronting us with what we do not know about ourselves,” Critchley writes. If metaphysics is contained by the formulations of the classroom, then the bloody stage provides a more accurate intimation of death and life. By being in opposition to philosophy, tragedy is against systems. It becomes both opposite and antidote to the narcotic fantasy that everything will be alright. Perhaps coming to terms with his own discipline, Critchley argues that “it is necessary to try and think theatrically and not just philosophically.” Tragedy, he argues, provides an opportunity to transcend myths of progress and comforts of order, to rather ecstatically enter a different space, an often dark, brutal, and subterranean place, but one which demonstrates the artifice of our self-regard.

A word conspicuous in its absence from Tragedy, the Greeks, and Us is the “sacred.” If there is any critical drawback to Critchley’s argument, it is his hesitancy, or outright denial, that what he claims in his book has anything to do with something quite so woolly as the noumenal. Critchley gives ample space to arguing that “Tragedy is not some Dionysian celebration of the power of ritual and the triumph of myth over reason,” yet a full grappling with his argument seems to imply the opposite. The argument that tragedy stages contradiction is convincing, but those sublime contradictions are very much under the Empire of Irrationality’s jurisdiction. Critchley is critical of those who look at ancient tragedy and “imagine that the spectators…were in some sort of prerational, ritualistic stupor, some intoxicated, drunken dumbfounded state,” but I suppose much of our interpretation depends on how we understand ritual, religion, stupor, and intoxication.

His claims are invested in an understanding of the Greeks as not being fundamentally that different from us, writing that “there is a lamentable tendency to exoticize Attic tragedy,” but maybe what’s actually called for is a defamiliarization of our own culture, an embrace of the irrational weirdness at the core of what it means to be alive in 2019, where everything that is solid melts into air (to paraphrase Marx). Aeschylus knew the score well: “Hades, ruler of the nether sphere, / Exactest auditor of human kind, / Graved on the tablet of his mind,” as he describes the prince of this world in Eumenides. Critchley, I’d venture, is of Dionysus’s party but doesn’t know it. All that is argued in Tragedy, the Greeks, and Us points towards an awareness, however sublimated, of the dark beating heart within the undead cadaver’s chest. “To resist Dionysus is to repress the elemental in one’s own nature,” writes the classicist E.R. Dodds in his seminal The Greeks and the Irrational; “the punishment is the sudden complete collapse of the inward dykes when the elemental breaks through…and civilization vanishes.”

Critchley is absolutely correct that tragedy stands in opposition to philosophy; where the latter offers assurances that reason can see us through, the former knows that it’s never that simple. The abyss is patient and deep, and no amount of analysis, of interpretation, of calculation, of polling can totally account for the hateful tragic pulse of our fellow humans. Nietzsche writes, “what changes come upon the weary desert of our culture, so darkly described, when it is touched by…Dionysus! A storm seizes everything decrepit, rotten, broken, stunted; shrouds it in a whirling red cloud of dust and carries it into the air like a vulture.” If any play best exemplifies that experience, and this moment, it’s Euripides’s The Bacchae, to which Critchley devotes precious little attention. That play depicts the arrival of the ambiguous god Dionysus in Thebes, as his followers thrill to the divine and irrational ecstasies that he promises. It ends with a crowd of those followers, the Maenads, mistaking the ruler Pentheus for a sacrificial goat and pulling him apart, his bones from their sockets, his organs from their cavities. Until his murder, Pentheus simultaneously manifested a repressed thrill towards the Dionysian fervor and a failure to take the threat of such uncontained emotion seriously. “Cleverness is not wisdom,” Euripides writes, “And not to think mortal thoughts is to see few days.” If any didactic import comes from The Bacchae, it’s to give the devil as an adversary his due, for irrationality has more power than the clever among us might think.

Circling around the claims of Critchley’s book is our current political situation, alluded to but never engaged outright. In one sense, that’s for the best; those demons’ names are uttered endlessly all day anyhow, and it’s desirable to have at least one place where you need not read about them. But in another sense, fully intuiting the Dionysian import of tragedy becomes all the more crucial when we think about what that dark god portends in our season of rising authoritarianism. “Tragedy is democracy turning itself into a spectacle,” and anyone with Twitter will concur with that observation of Critchley’s. Even more important is Critchley’s argument about those mystic chords of memory connecting us to a past that we continually reinvent; the brilliance of his claim about why the Greeks matter to us now, removing the stuffiness of anything as prosaic as canonicity, is that tragedy encapsulates the way in which bloody trauma can vibrate through the millennia and control us as surely as the ancients believed fate controlled humans. Critchley writes that “Tragedy is full of ghosts, ancient and modern, and the line separating the living from the dead is continually blurred. This means that in tragedy the dead don’t stay dead and the living are not fully alive.” We can’t ignore the Greeks, because the Greeks aren’t done with us. If there is anything that hampers us as we attempt to exorcise the Dionysian revelers in our midst, it’s that many don’t acknowledge the base, chthonic power of such irrationality, and they refuse to see how violence, hate, and blood define our history in the most horrific of ways. To believe that progress, justice, and rationality are guaranteed, that they don’t require a fight commensurate with their worthiness, is to let a hubris fester in our souls and to court further tragedy among our citizens.

What Medea or The Persians do is allow us to safely access the Luciferian powers of irrationality. They present a more accurate portrayal of humanity, based as we are in bloodiness and barbarism, than the palliatives offered by Plato in The Republic with his philosopher kings. Critchley claims that the space of the theater, at its best, “somehow allows us to become ecstatically stretched out into another time and space, another way of experiencing things and the world.” Far from the anemic moralizing of Aristotelian catharsis—and Critchley emphasizes just how ambiguous that word actually is—which is too often interpreted as a regurgitative didacticism, tragedy actually makes a new world by demolishing and replacing our own, if only briefly. “If one allows oneself to be completely involved in what is happening onstage,” Critchley writes, “one enters a unique space that provides an unparalleled experience of sensory and cognitive intensity that is impossible to express purely in concepts.” I recall seeing a production of Shakespeare’s Othello at London’s National Theatre in 2013, directed by Nicholas Hytner and starring Adrian Lester as the cursed Moor and Rory Kinnear as a reptilian Iago. Dr. Johnson wrote that Othello’s murder of Desdemona was the single most horrifying scene in drama, and I concur; the play remains the equal of anything by Aeschylus or Euripides in its tragic import.

When I watched Lester play the role, lingering over the dying body of his faithful wife, whispering “What noise is this? Not dead—not yet quite dead?” I thought of many things. I thought about how Shakespeare’s play reflects the hideous things that men do to women, and the hideous things that the majority do to the marginalized. I thought about how jealousy noxiously fills every corner, no matter how small, like some sort of poison gas. And I thought about how unchecked malignancy can shatter our souls. But most of what I thought wasn’t in words at all; it was better expressed by Lester’s anguished cry as he confronted the evil he’d done. If tragedy allows an audience to occasionally leave our normal space and time, then certainly I felt joined with those thousand other spectators on that summer night at the South Bank’s Olivier Theatre. The audience’s silence after Othello’s keening subsided was as still as the space between atoms, as empty as the gap between people.

True Fake Fact: Donald Trump Is Andrew Jackson

Sometimes we open a book hoping to learn one thing and wind up getting bushwhacked by something completely unrelated and unexpected. I’m having that unnerving experience right now with Nancy Isenberg’s White Trash: The 400-Year Untold History of Class in America.

I started reading the book as research for a nonfiction book I’m writing about a man who was born into a slave-owning family in Virginia during the Civil War and died at the age of 92 at the peak of the Cold War. I was looking for insights into the origins and evolution of Virginia’s (and America’s) class system and, specifically, for evidence supporting my long-held belief that the United States never was and never will be a classless society.

Though Isenberg has solid credentials—she wrote the well-received Fallen Founder: The Life of Aaron Burr, coauthored Madison and Jefferson, and teaches American history at Louisiana State University—I’ll admit I approached White Trash with some trepidation. That “400-Year Untold History” claim in the subtitle smelled of overreach, and the early chapters failed to convince me that my nose was malfunctioning. Then I came to chapter five, “Andrew Jackson’s Cracker Country: The Squatter as Common Man.” After a meandering description of the landless, uncouth “crackers and squatters” who led the young republic’s expansion beyond the Appalachian Mountains, Isenberg comes to her central character: Andrew Jackson, Old Hickory, the raw-boned Tennessee scrapper and warrior who would become the seventh president of the United States. Isenberg’s sketch of Jackson opens hot and quickly catches fire: “Ferocious in his resentments, driven to wreak revenge against his enemies, he often acted without deliberation and justified his behavior as a law unto himself…Jackson’s personality was a crucial part of his democratic appeal as well as the animosity he provoked. He was not admired for statesmanlike qualities, which he lacked in abundance in comparison to his highly educated rivals…His supporters adored his rough edges…Using violent means if necessary, and acting without legal authority, Jackson was arguably the political heir of the cracker and squatter.”

That was when the gong went off. It was impossible to miss. Isenberg was not merely sketching Andrew Jackson; she was, chapter and verse, sketching the personal and political biography of…Donald Trump. As I continued reading, I found myself subconsciously substituting Trump’s name for Jackson’s, and other players in our contemporary political shitshow for the 19th-century actors in the Jacksonian soap opera. The parallels were so precise they were spooky. Here, with italics marking my mental edits, was what I read:

“Trump’s was a career built on sheer will and utter impulse…Controversy, large and small, seemed to follow the man. Because Trump had relatively little experience holding political offices, his run for the presidency drew even more than the normal amount of attention to his personal character. A biography written for campaign purposes…focused on his volatile emotions. He certainly lacked the education and polite breeding of his presidential predecessors.”

At this point, a suspicion sprang to life. Could it be that Isenberg was writing a cleverly coded takedown of Donald Trump? But I soon learned that this was nearly impossible because White Trash was published five months before the 2016 election, when just about no one, least of all Hillary Clinton and The New York Times, thought Donald Trump had a snowball’s chance of winning the presidency. So Isenberg was not writing in code. The uncanny parallels between our seventh and 45th presidents are the fruit of deep scholarly research. They are actual facts. Isenberg continues, again with my italics:

Prominent critics insisted on a congressional investigation. The powerful Speaker of the House, Nancy Pelosi, demanded the president’s censure. Trump damned the established legal authorities…Confirmed rumors circulated that Trump had threatened to cut off the ears of some senators because they had dared to investigate—and humiliate—him on the national stage.

Of course, both besieged presidents had their defenders:

Trump’s nomination provoked “sneers and derision from the myrmidons of power at Washington,” wrote one avid Trump man, who decried “the degeneracy of American feeling in that swamp.” Trump wasn’t a government minion or a pampered courtier, and thus his unpolished and un-statesmanlike ways were an advantage. In 2019, in a speech before Congress, Mitch McConnell of Kentucky used this kind of language to reproach members of the House for investigating Trump’s activities…The men and women censuring Trump, whom the Kentucky senator mocked as the “young sweet-smelling and powdered beau of the town,” were out of their league. With this clever turn of phrase, McConnell recast Trump’s foes as coastal elites, the classic enemies of flyover country.
Just when I started thinking it was time to get some sex into this parallel-universe narrative, Isenberg obliged: “The candidate’s private life came under equal scrutiny. His irregular marriage became scandalous fodder during the election of 2016…In the ever-expanding script detailing Trump’s misdeeds, adultery was just one more example of his uncontrolled passions. Having affairs with porn stars and then paying them hush money belonged to the standard profile of the backwoods aggressor who refused to believe the law applied to him…He simply took what he wanted, and was even willing to, by his own admission, ‘grab them by the pussy.’”

Even staggering ignorance of international affairs was seen as a virtue by these presidents’ supporters, as Isenberg notes: “If his lack of diplomatic experience made him ‘homebred,’ this meant he was less contaminated than former ambassador to Ukraine Marie Yovanovitch by foreign ideas or courtly pomp. The class comparison could not be ignored: Hillary Clinton had been a first lady and a secretary of state, while Trump was ‘sprung from a common family,’ and had written nothing to brag about. Instinctive action was privileged over unproductive thought.”

That “common family” claim required a little more massaging in Trump’s case than in Jackson’s, and Trump’s minions have been happy to oblige. “Partisans of Trump claimed that he was from backwoods stock,” Isenberg writes. “This was untrue. Trump was born into an elite New York real-estate family, and though he had briefly been a resident of Queens, that five-bedroom Tudor had been abandoned long ago in favor of Trump Tower.”

It’s likely that Trump, like Jackson before him, has brought lasting changes to the American scene. As Isenberg puts it: “Trump’s candidacy changed the nature of democratic politics. One political commentator noted that Trump’s reign ushered in the ‘game of brag.’ Another observer concluded that a new kind of ‘tweeting country politician’ had arisen, who could tweet for hours before having finally ‘exhausted the fountain of his panegyric on President Trump.’”

As I reached the end of chapter five in White Trash, I dimly remembered hearing that Donald Trump is a big fan of Old Hickory. A little digging reminded me that early in his presidency, in March 2017, Trump had visited Jackson’s estate, the Hermitage, near Nashville to commemorate the 250th anniversary of Jackson’s birth. In one of his keener readings of history, Trump declared, “I mean, had Andrew Jackson been a little later, you wouldn’t have had the Civil War.” Hard to fact-check that whopper because Jackson died 16 years before the first shots were fired at Fort Sumter. No matter. Trump added that he admires Jackson—and he has Jackson’s portrait on the wall in the Oval Office—because he was “a very tough person” with a “big heart.” Tell that to the 150 human beings Jackson owned at the time of his death, some of whom he hunted down personally when they tried to escape from bondage. Or tell that to the thousands of Native Americans and black slaves who perished during Jackson’s enforced relocation known as the Trail of Tears, an act of genocide by any other name.

But let’s not get bogged down with true facts when the world is bursting with so many fake facts. And let’s not lose sight of the completely unexpected lesson in Isenberg’s book. The republic survived Andrew Jackson—and Andrew Johnson, Warren Harding, Richard Nixon, and George W. Bush. Surely it will survive Donald Trump? We might get the answer to that question sooner than anyone expected, shortly before the swearing-in of President Pence.

Image: Wikimedia Commons

Kill Your Idols: On the Violence of Experimental Literature

In a recent lecture on innovative writing, Myung Mi Kim argued that any artistic experiment is inherently violent, as the artist is dismantling an inherited tradition in order to make way for the new. For many writers, innovation does indeed contain destruction in its very definition. After all, the experimental text cannot exist in the same space as the conventions that restrict its meaning, stifle its performativity, and deny its legitimacy.

Three recent books remind us that an experiment, though it challenges elements of a familiar literary heritage, does not have to sacrifice unity of voice and vision. Karla Kelsey’s forthcoming Blood Feather, Kenji Liu’s Monsters I Have Been, and Grace Talusan’s The Body Papers skillfully dismantle received forms to offer alternative ways of creating meaning and coherence from human experience. Though vastly different in style and scope, these three innovative texts share a commitment to a unity of concept, presenting us with larger questions about the politics of language that ultimately guide and focus the generative violence of the experiment. In their hands, innovation becomes an exercise in precision, as well as a legitimate danger. As Liu writes, “The under state / swarms our / documents. Our / lungs.”

Monsters I Have Been opens with an articulation of the artistic goals and the parameters of an invented poetic form called “frankenpo.” Liu writes in the form’s definition: “to create a new poetic text by collecting, disaggregating, randomizing, rearranging, recombining, erasing, and reanimating one or more chosen bodies of text, for the purpose of divining or revealing new meanings often at odds with the original texts.” As the book unfolds, the constraints and freedoms of “frankenpo” serve to unify the book’s wild flights of the imagination, as Monsters I Have Been reads as an extended exploration of the possibilities inherent in this specific literary form.

In many ways, it is the intense focus of Liu’s experiment that brings his discoveries into sharp relief. Culling text from a variety of sources, ranging from screenplays to New York Times articles to feminist theory and U.S. presidential executive orders, Liu shows us beauty and danger contained within the same turns of phrase, which can house both violence and redemption, light and unspeakable darkness. The poems in Monsters I Have Been call attention to the remarkable disconnect between language and the real world toward which it constantly gestures. At the same time, Liu frames this disconnect, the inherent arbitrariness of the signifier, as a source of agency for the creative practitioner.

Liu writes, for example, in “Thus I Have Heard,” “We are visas / in a national / drowning. / Each of us an executive / decision, pursuant to clay. / Each a subsection / of protocol / and yet.” Here Liu reconfigures language from unspecified source texts, reminding us that intent not only shapes outcome with respect to the words we use, but also that intent can bring to light the beauty that resides just beneath the surface of a seemingly unremarkable text. For Liu, the same language can carry revelation and violence, enlightenment and oppression.

What’s more, he shows us the myriad ways that language is illuminated by conversation, dialogue, and juxtaposition. In many ways, the personae contained within Monsters I Have Been are strengthened and refined by conversation, as proximity brings a single voice into clearer focus. He writes, for instance, in “As the light diminishes again,” “To fit the average, we come / as animals, with a pocket map / of the sky and nothing under. // How the ragged hairpiece gapes / open and declares teeth.” This poem utilizes found text from Judith Butler’s theoretical writings as well as the Heart Sutra. Approached with that in mind, the poem becomes a space for dialogue in which one texture of language complicates, and calls into question, the other. As Liu himself asks, “What masks / What power”?

Much like Liu’s book, Talusan’s recent memoir, The Body Papers, reveals (and renegotiates) the politics inherent in language. Yet Talusan takes this kind of experimentation in a new direction, pairing text with found images as she investigates the authority, reverence, and doubt that we invest in various types of cultural documents. The artifacts that inhabit The Body Papers range from canceled passports to immigration forms to family photographs. As the book unfolds, these politically charged and authoritative documents are positioned in service of personal narrative, a gesture that proves as innovative as it is subversive. The hierarchies that we impose upon types of language are provocatively reversed. Talusan summons the authority of official documents, journalistic photographs, and the various traces of governmental power to further a personal narrative of risk, family ties, and discovery.

Talusan’s daring reversal of these power structures comes through most visibly in her depiction of her emigration to the United States from Manila with her parents and siblings. Describing the obstacles her parents encountered as they applied for citizenship, she writes, “I was terrified. I had never thought about how meaningful U.S. citizenship was until I was told I didn’t have it. With a shuffle of papers, life as I knew it could be lost. I am still astounded by how meaningful these papers are, how they are pasted onto our bodies and determine where and how we can move through the world.” This powerful narrative, in which the narrator realizes the precarity of what she had remembered as a joyful childhood, is spliced with images of a canceled Philippine passport and a character reference in support of an application for United States citizenship.

In many ways, the images included in The Body Papers complicate and enrich the narrative proper. By pairing this section with these specific documents, for example, Talusan evokes the stateless and liminal status of her younger self. Yet at the same time, she provocatively claims the authority and power of these documents for her own narrative, a reversal of the ways in which we often shape and reshape personal narrative in the service of government procedure.

This investment in revealing and challenging the authority placed in government documents unifies a gorgeously capacious narrative. Talusan writes, for example: “Without physical proof, I started to question whether I had even written [the letters]—a psychological pattern that I think is intertwined with the immigrant experience.” As this powerful memoir unfolds, however, Talusan challenges the artificial divide culture has created between objective and subjective types of language, laying claim to both in prose as deeply felt as it is precise and sharply focused.

Kelsey’s Blood Feather, like the work of Liu and Talusan, utilizes experimental language in service of social justice. This book-length poem, inspired by a rich store of archival material associated with women’s history, manifests as three dramatic monologues spoken by different personae. The whole of the archive is subsumed into the voices of these richly imagined narrators, with Kelsey drawing from texts that include Aristotle, Pina Bausch, Julian Beck, Richard Brody, Cheiro, and many other writers, philosophers, cinematographers, and thinkers. By challenging the fiction of the single speaker in such a way, Kelsey gestures at voice as a social construct, calling into question the myriad ways culture presupposes that ownership over language is even possible.

It is the unity of voice, remarkable given the scope and range of archival material represented in this volume, that renders Kelsey’s text as sharply focused as Talusan’s narrative memoir and Liu’s extended exploration of a single form. As the book unfolds, this unity of voice and vision is revealed as integral to the poem’s deeply philosophical meaning. For Kelsey, the self, the single spoken voice, contains multitudes. She shows us, through her sharply focused experimentation, that the boundary between individual and community is porous and indistinct. She writes, for example, in Blood Feather:
the aesthetic problem of
form exists essentially and simultaneously as
a moral problem writes Deren in
An Anagram of Ideas on Art
and so how to perform an
ethical relation to the footage of

a flood mobile homes uprooted a
man in a canoe paddling after
his lowing cow the film then
cutting to the tremor of a
hand-held camera actress gagged and bound
to the bed how to punctuate

Here the speaker reflects on the ethical problems inherent in representation. If the boundary between self and other remains blurry, Kelsey asks us to consider where cultural appropriation begins, even when attempting to depict one’s own perceptions. In many ways, the philosophical quality of Kelsey’s poetry is in itself subversive, as she uses the artistic repertoire of poetry to claim agency over a predominantly masculine philosophical tradition. In doing so, she reminds us that despite the rigid binary distinctions that circulate within culture, alterity inevitably resides within the subject, who is a world unto herself.

If innovation is in itself a destructive gesture, can that generative violence be placed in service of activism and advocacy through language? Kelsey, Talusan, and Liu show us that the precision of the experiment constitutes its power. In each of these three books, the dismantling of convention is placed in service of a specific philosophical question, the work an inquiry into what is possible when particular rules associated with language are renegotiated. Here, language is wielded as veiled threat, as provocative reversal, as gloriously shattered syntactic convention. Yet it is this space between words that allows us to see the light.

Image credit: Annie Spratt

King, God, Smart-Ass Author: Reconsidering Metafiction

“Whoever is in charge here?” —Daffy Duck, Merrie Melodies (1953)

Like all of us, Daffy Duck was perennially put upon by his Creator. The sputtering, stuttering, rageful waterfowl’s life was a morass of indignity, embarrassment, anxiety, and existential horror. Despite all of the humiliation Daffy had to contend with, the aquatic bird was perfectly willing to shake his wings at the unfair universe. As expertly delivered by voice artist Mel Blanc, Daffy could honk “Who is responsible for this? I demand that you show yourself!” In his brilliant and classic 1953 Merrie Melodies short “Duck Amuck,” animator Chuck Jones presents Daffy as a veritable Everyduck, a sinner in the hands of a smart-assed illustrator. “Duck Amuck” has remained a canonical entry in the Warner Brothers cartoon catalog, its postmodern, metafictional experimentation heralded for its daring and cheekiness. Any account of what critics very loosely term “postmodern literature”—with its playfulness, its self-referentiality, and its breaking of the fourth wall—that considers Italo Calvino, Jorge Luis Borges, Vladimir Nabokov, and Paul Auster but not Jones is only telling part of the metafictional story. Not for nothing, but two decades ago, “Duck Amuck” was added to the National Film Registry by the Library of Congress as an enduring piece of American culture.

Throughout the short, Jones depicts increasingly absurd metafictional scenarios involving Daffy’s sublime suffering. Jones first imagines Daffy as a swordsman in a Three Musketeers parody, only to have him wander into a shining, white abyss as the French Renaissance background fades away. “Look Mac,” Daffy asks, never one to let ontological terror impinge on his sense of personal justice, “what’s going on here?” Jones wrenches the poor bird from the musketeer scenery to the blinding whiteness of the nothing-place, then to a bucolic pastoral, and finally to a paradisiacal Hawaiian beach. Daffy’s admirable sense of his own integrity remains intact, even throughout his torture. Pushed through multiple parallel universes, wrenched, torn, and jostled through several different realities, Daffy shouts “All right, wise guy, where am I?”

But eventually not even his own sense of identity is allowed to continue unaffected, as the God-animator turns him into a country-western singer who can only produce jarring sound effects from his guitar, or as a transcendent paintbrush recolors Daffy blue. At one point the animator’s pencil intrudes into Daffy’s world, erasing him, negating him, making him nothing. Daffy’s very being, his continued existence, depends on the whims of a cruel and capricious God; his is the world of Shakespeare’s King Lear, where the Duke of Gloucester utters his plaintive cry, “As flies to wanton boys are we to th’ gods; / They kill us for their sport.” Or at least they erase us. Finally, like Job before the whirlwind, Daffy implores, “Who is responsible for this? I demand that you show yourself!” As the view pans upward, into that transcendent realm of paper and ink where the animator-God dwells, he is revealed to be none other than the trickster par excellence, Bugs Bunny. “Ain’t I a stinker?” the Lord saith.

Creation, it should be said, is not accomplished without a certain amount of violence. According to one perspective, we can think of Daffy’s tussling with Bugs as a variation on that venerable old Aristotelian narrative conflict of “Man against God.” If older literature was focused on the agon (as the Greeks put it) between a human and a deity, and modernist literature concerned itself with the conflict that resulted as people had to confront the reality of no God, then the wisdom goes that our postmodern moment is fascinated with the idea of a fictional character searching out his or her creator. According to narratologists—practitioners of that branch of literary study that concerns itself with the structure and organization of story and plot (not synonyms, incidentally)—such metafictional affectations are technically called metalepsis. H. Porter Abbott in his invaluable The Cambridge Introduction to Narrative explains that such tales involve a “violation of narrative levels” in which a “storyworld is invaded by an entity or entities from another narrative level.”

Metalepsis can be radical in its execution, as when an “extradiegetic narrator” (that means somebody from outside the story entirely) enters into the narrative, as in those narratives where an “author appears and starts quarreling with one of the characters,” Abbott writes. We’ll see that there are precedents for that sort of thing, but whether interpreted as gimmick or deep reflection on the idea of literature, the conceit that has a narrator enter into the narrative as if in a theophany is most often associated with something called, not always helpfully, “postmodernism.” Whatever that much-maligned term might mean, in popular parlance it has an association with self-referentiality, recursiveness, and metafictional playfulness (even if readers might find cleverness such as that exhausting). The term might as well be thought of as referring to our historical preponderance of literature that knows that it is literature.

With just a bit of British disdain in his critique, The New Yorker literary critic James Wood writes in his pithy and helpful How Fiction Works that “postmodern novelists… like to remind us of the metafictionality of all things.” Think of the crop of experimental novelists and short story writers from the ’60s, such as John Barth in his Lost in the Funhouse, where one story is to be cut out and turned into an actual Moebius strip; Robert Coover in the classic and disturbing short story “The Babysitter,” in which a variety of potential realities and parallel histories exist simultaneously in the most mundane of suburban contexts; and John Fowles in The French Lieutenant’s Woman, in which the author also supplies multiple “forking paths” to the story and where the omniscient narrator occasionally appears as a character in the book. Added to this could be works where the actual first-person author themselves becomes a character, such as Auster’s New York Trilogy, or Philip Roth’s Operation Shylock (among other works where he appears as a character). Not always just as a character, but as the Creator, for if the French philosopher Roland Barthes killed off the idea of such a figure in his seminal 1967 essay “The Death of the Author,” then much of the period’s literature resurrected Him. Wood notes, perhaps in response to Barthes, that “A certain kind of postmodern novelist…is always lecturing us: ‘Remember, this character is just a character. I invented him.’” Metafiction is when fiction thinks about itself.

Confirming Wood’s observation, Fowles’s narrator writes in The French Lieutenant’s Woman, “This story I am telling is all imagination. These characters I create never existed outside my own mind…the novelist stands next to God. He may not know all, yet he tries to pretend that he does.” Metafictional literature like this is supposed to interrogate the idea of the author, the idea of the reader, the very idea of narrative. When the first line to Calvino’s If on a Winter’s Night a Traveler is “You are about to begin reading Italo Calvino’s new novel, If on a Winter’s Night a Traveler,” it has been signaled that the narrative you’re entering is supposed to be different from those weighty tomes of realism that supposedly dominated in previous centuries. If metalepsis is a favored gambit of our experimental novelists, then it’s certainly omnipresent in our pop culture as well, beyond just “Duck Amuck.”

A list of sitcoms that indulge the conceit includes 30 Rock, Community, Scrubs, and The Fresh Prince of Bel-Air. The final example, which after all was already an experimental narrative about a wholesome kid from West Philly named Will played by a wholesome rapper from West Philly named Will Smith, was a font of avant-garde fourth-wall breaking deserving of Luigi Pirandello or Bertolt Brecht. Prime instances would include the season five episodes “Will’s Misery,” which depicts Carlton running through the live studio audience, and “Same Game, Next Season,” in which Will asks “If we so rich, why we can’t afford no ceiling,” with the camera panning up to show the rafters and lights of the soundstage. Abbott writes that metafiction asks “to what extent do narrative conventions come between us and the world?”—which, in its playfulness, is exactly what The Fresh Prince of Bel-Air is doing, forcing its audience to consider how “they act as invisible constructors of what we think is true, shaping the world to match our assumptions.”

Sitcoms like these are doing what Barth, Fowles, and Coover are doing—they’re asking us to examine the strange artificiality of fiction, this illusion in which we’re asked by a hidden author to hallucinate and enter a reality that isn’t really there. Both audience and narrator are strange, abstracted constructs; their literal corollaries of reader and writer aren’t much more comprehensible. When we read a third-person omniscient narrator, it would be natural to ask “Who exactly is supposed to be recounting this story?” Metafiction is that which does ask that question. It’s the same question that the writers of The Office confront us with when we wonder, “Who exactly is collecting all of that documentary footage over those nine seasons?”

Far from being simply a postmodern trick, metalepsis as a conceit and the metafiction that results have centuries’ worth of examples. Interactions between creator and created, and certainly author and audience, have a far more extensive history than both a handful of tony novelists from the middle of the 20th century and the back catalog of Nick at Nite. For those whose definition of the novel doesn’t consider anything written before 1945, it might come as a shock that all of the tricks we associate with metafiction thread so deep into history that realist literature can seem the exception rather than the rule. This is obvious in drama; the aforementioned theater term “breaking the fourth wall” attests to the endurance of metalepsis in literature. As a phrase, it goes back to Molière in the 17th century, referring to when characters in a drama acknowledge their audience, when they “break” the invisible wall that separates the action of the stage from that of the observers in their seats. If Molière coined the term, it’s certainly older than even him. In all of those asides in Shakespeare—such as that opening monologue of Richard III when the title villain informs all of us who are joining him on his descent into perdition that “Now is the winter of our discontent”—we’re, in some sense, to understand ourselves as being characters in the action of the play itself.  

As unnatural as Shakespearean asides may seem, they don’t have the same sheer metaleptic import as the metafictional drama of the avant-garde theater of the 20th century. Pirandello’s classic experimental play Six Characters in Search of an Author is illustrative here, a high-concept work in which unfinished and unnamed characters arrive at a Pirandello production asking their creator to more fully flesh them out. As a character named the Father explains, the “author who created us alive no longer wished…materially to put us into a work of art. And this was a real crime.” A real crime because to be a fictional character means that you cannot die, even though “The man, the writer, the instrument of the creation will die, but his creation does not die.” An immaculate creation outliving its creator, more blessed than the world that is forever cursed to be ruled over by its God. But first Pirandello’s unfortunates must compel their God to grant them existence; they need a “fecundating matrix, a fantasy which could rise and nourish them: make them live forever!” If this seems abstract, you should know that such metaleptic tricks were staged long before Pirandello, and Shakespeare for that matter. Henry Medwall’s 1497 Fulgens and Lucrece, the first secular play in the entire English canon, has two characters initially named “A” and “B” who argue about a play only to have it revealed that the work in question is actually Medwall’s, which the audience is currently watching. More than a century later, metafictional poses were still being explored by dramatists, a prime and delightful example being Shakespeare’s younger contemporary and sometimes-collaborator Francis Beaumont’s The Knight of the Burning Pestle. In that Jacobean play of 1607, deploying a conceit worthy of Thomas Pynchon’s The Crying of Lot 49, Beaumont imagines the production of a play-within-a-play entitled The London Merchant. In the first act, two characters climb onto the stage from the audience, one simply called “Citizen” and the other “Wife,” and begin to heckle and critique The London Merchant and its perceived unfairness to the rapidly ascending commercial class. The Knight of the Burning Pestle allows the audience to strike back, the Citizen cheekily telling the actor reading the prologue, “Boy, let my wife and I have a couple of stools / and then begin; and let the grocer do rare / things.”

Historical metalepsis can also be seen in what are called “frame tales,” that is, stories-within-stories that nestle narratives together like Russian dolls. Think of the overarching narrative of Geoffrey Chaucer’s 14th-century The Canterbury Tales, with its pilgrims telling each other their stories as they make their way to the shrine of Thomas Becket, or of Scheherazade recounting her life-saving anthology to her murderous husband in One Thousand and One Nights, as compiled from folktales during the Islamic Golden Age from the eighth to 14th centuries. Abbott describes frame tales by explaining that “As you move to the outer edges of a narrative, you may find that it is embedded in another narrative.” Popular in medieval Europe, and taking their structure from Arabic and Indian sources that go back much further, frame tales are basically unified anthologies where an overarching narrative supplies its own meta-story. Think of Giovanni Boccaccio’s 14th-century Decameron, in which seven women and three men each tell 10 stories to pass the time while they’re holed up in a villa outside of Florence to await the passage of the Black Death through the city. The 100 resulting stories are ribald, earthy, and sexy, but present through all of their telling is an awareness of the tellers, this narrative about a group of young Florentines in claustrophobic, if elegant, quarantine. “The power of the pen,” one of Boccaccio’s characters says on their eighth day in exile, “is far greater than those people suppose who have not proved it by experience.” Great enough, it would seem, to create a massive, sprawling world with so many stories in it. “In my father’s book,” the character would seem to be saying of his creator Boccaccio, “there are many mansions.”

As metaleptic as frame tales might be, a reader will note that Chaucer doesn’t hitch up for that long slog into Canterbury himself, nor does Boccaccio find himself eating melon and prosciutto while quaffing chianti with his aristocrats in The Decameron. But it would be a mistake to assume that older literature lacks examples of the “harder” forms of metalepsis, that writing before the 20th century is devoid of the Author-God appearing to her characters as if God on Sinai. So-called “pre-modern” literature is replete with whimsical experimentation that would seem at home in Nabokov or Calvino: audiences directly addressed on stage, books speaking as themselves to their readers, authors appearing in narratives as creators, and fictions announcing their fictionality.

Miguel de Cervantes’s 17th-century Don Quixote plays with issues of representation and artificiality when the titular character and his trusty squire, Sancho Panza, visit a print shop that is producing copies of the very book you are reading, the errant knight and his sidekick then endeavoring to prove that it is an inferior plagiarism of the real thing. Cervantes’s narrator reflects at an earlier point in the novel about the novel itself, enthusing that “we now enjoy in this age of ours, so poor in light entertainment, not only the charm of his veracious history, but also of the tales and episodes contained in it which are, in a measure, no less pleasing, ingenious, and truthful, than the history itself.” Thus Cervantes, in what is often considered the first novel, can lay claim to being the progenitor of both realism and metafictionality.

Following Don Quixote’s example could be added other metafictional works that long precede “postmodernism,” including Laurence Sterne’s 18th-century The Life and Opinions of Tristram Shandy, Gentleman, where the physical book takes time to mourn the death of a central character (when an all-black page is printed); the Polish count Jan Potocki’s underread late-18th-century The Manuscript Found in Saragossa, with not just its fantastic cast of Iberian necromancers, kabbalists, and occultists, but its intricate frame structure and forking paths (not least of which include reference to the book that you’re reading); James Hogg’s Satanic masterpiece The Private Memoirs and Confessions of a Justified Sinner, in which the author himself makes an appearance; and Jane Austen’s Northanger Abbey, in which the characters remark on how it feels as if they’re in a gothic novel (or perhaps a parody of one). Long before Barthes killed the Author, writers were conflating themselves as creator with the Creator. As Sterne notes, “The thing is this. That of all the several ways of beginning a book which are now in practice throughout the known world, I am confident my own way of doing it is the best—I’m sure it is the most religious—for I begin with writing the first sentence—and trusting to Almighty God for the second.”

Sterne’s sentiment provides evidence as to why metafiction is so alluring and enduring, despite its minimization by some critics who dismiss it as mere trick while obscuring its long history. What makes metalepsis such an intellectually attractive conceit is not simply that it makes us question how literature and reality interact, but what it implies about the Author whom Sterne gestures toward—“Almighty God.” The author of Tristram Shandy understood, as all adept priests of metafiction do (whether explicitly or implicitly), that at its core, metalepsis is theological. In questioning and confusing issues of characters and writers, narrators and readers, actors and audience, metafiction experiments with the very idea of creation. Some metafiction privileges the author as the supreme God of the fiction, as in The French Lieutenant’s Woman, and some casts its lot with the characters, as in The Knight of the Burning Pestle. Some metafiction is “softer” in its deployment, allowing the characters within a narrative to give us stories-within-stories; other metafiction is “harder” in how emphatic it is about the artifice and illusion of fiction, as in Jones’s sublime cartoon. What all of them share, however, is an understanding that fiction is a strange thing, an illusion whereby, whether we’re gods or penitents, we’re all privy to a world spun from something as ephemeral as letters and breath. Wood asks, “Is there a way in which all of us are fictional characters, parented by life and written by ourselves?” And the metaphysicians of metafiction answer in the affirmative.

As a final axiom, to join my claims that metafiction is when literature thinks about itself and that metalepsis has a far longer history than is often surmised, I’d argue that because all fiction—all literature—is artifice, all of it is in some sense metafiction. What defines fiction, what makes it different from other forms of language, is that quality of metalepsis. Even if not explicitly stated, the differing realms of reality implied by the very existence of fiction imply something of the meta. Abbott writes that “World-making is so much a part of most narratives that some narrative scholars have begun to include it as a defining feature of narrative,” a point with which I heartily concur. Even our scripture is metafictional, for what else are we to call the Bible, in which Moses is both author and character, and where his death itself is depicted? In metafiction, perspective is confused: writer turns to reader, narrator to character, creator to creation. No more apt a description of metafiction, of fiction, of life than that which is offered by Prospero at the conclusion of The Tempest: “Our revels now are ended. These our actors, / As I foretold you, were all spirits and / Are melted into air, into thin air.” For Prospero, the “great globe itself…all which it inherit, shall dissolve / And, like this insubstantial pageant faded…We are such stuff / As dreams are made on, and our little life / Is rounded with a sleep.” Nothingness before and nothingness after, but with everything in between, just like the universe circumscribed by the cover of a book. Metafiction has always defined literature; we’ve always been characters in a novel that somebody else is writing.

On Body Horror and the Monstrosity of Women

In Sharlene Teo’s wise, tenderly grotesque novel Ponti, the introduction of teenaged protagonist Szu is effected in a cloud of body odor. “When I was eleven,” Szu grumbles, sitting in a classroom that smells of Impulse body spray and soiled sanitary towels, “I used to hope that puberty would morph me, that one day I’d uncurl from my chrysalis, bloom out beautiful. No luck! Acne instead. Disgusting hair. Blood.” Overflowing with monsters and matriarchs, Teo’s novel is at least partially a horror narrative and draws much of its impetus from the backstory of Szu’s mother, Amisa, a former horror actress who once starred in a movie named Ponti! The film, telling the story of a deformed girl who makes a deal with a bomoh—a shaman—to become beautiful, pins the theme of transformation at the novel’s heart. The girl’s wish is granted, but the transformation is a dual one. She does become beautiful, but she also becomes a bloodthirsty monster who feeds insatiably upon men. Teo’s novel stresses this duality, writing female adolescence as, in effect, synonymous with female monstrosity, with the becoming of something other. Szu is nicknamed “Sadako,” after another classic horror movie monster, and her adolescence is a lank, disquieting thing, at once disappointing and horrendous. She is turning into a woman, she is turning into a monster; the two things are one and the same.

+

“Body Horror” is a term that comes from horror cinema, but its literary origins can be traced back as far as Frankenstein. It is a trope that springs from primal fears—from the knowledge of oneself as a physical object and the consciousness of pain—and its roots wind through the Gothic, to the fin de siècle and the birth of science fiction. As a sub-genre, it broadly encompasses the concept of bodily violation, whether that be via mutilation, zombification, possession, or disease, but arguably one of its most pervasive themes is that of transformation. From Ovid to Cronenberg, transformation occupies an anxious corner in so much of film and literature that it more or less forms a tradition all its own. Folklore and myth are littered with metamorphosis—Daphne twisting into a bay tree, Alice in Wonderland with her Eat Me’s and Drink Me’s—and its impact is frequently an unsettling one. It is a fairy-tale punishment, a warning to naughty children, a reminder of the body’s unreliability.

In talking about Salt Slow, my short story collection, I have found myself returning frequently to the concept of the body, particularly the body as a locus of fear. The majority of stories in the collection are built on body horror and transformation, concerning women and the ways in which their bodies both contain and betray them: a girl whose teenage skin is hiding something terrible, a woman taking on the aspect of a jellyfish, a woman birthing something slippery. I think that writing about women goes hand in hand with horror writing. The female body is a nexus of pain almost by design (that by-now ubiquitous line from Fleabag: “women are born with pain built in, it’s our physical destiny”), but it is also potentially monstrous—an object traditionally subjugated, both for its presumed weakness and its perceived threat. The mutations and transformations of horror writing are uniquely qualified to evoke this: the difficulty and unreliability of the female body, its duality as an object both to be feared for and to fear.


When Daphne transforms into a bay tree, the moment is one of both horror and deliverance. She is no longer what she once was, but the metamorphosis frees her from the unwanted attention of Apollo. This duality of horror and emancipation sits, I think, at the core of female transformation. Within the horror genre (and arguably everywhere else), bodies read as female are always subject to pain, and to the threat of violation. Becoming something else—a tree, a freak, a monster—preempts this pain and reduces the risk of harm. It may even, if the transformation is the right one, allow you to cause harm in return.

+

As made evident in Teo’s Ponti, female adolescence is frequently a site of horror. It is, after all, the “transformation” aspect of the body horror trope made literal. Over and over, in books and in movies, girls are made subject to the bleedings and stretchings of adolescence: that werewolf period where the body becomes an othered version of itself (the horror movie Ginger Snaps comes to mind—the protagonist’s first period coinciding with her transformation into something hairy and ill-tempered). Often, the horror of this alteration is fed by misinformation—the teenaged girls I find myself writing about frequently haven’t been told. Adolescence is frightening enough on its own, but the strange, puritanical sketchiness that consistently surrounds sex and sex education only serves to heighten the panic. In Jeffrey Eugenides’s The Virgin Suicides, a teenage boy relays to his friends a jumbled impression of the informational “puberty” video he has stolen into a Girls Only class to watch: “When girls hit twelve or so,” he tells them breathlessly, “their tits bleed.” A monster, created by whispers.

In my short story “Mantis,” a Catholic schoolgirl’s troubled teenage skin extends far beyond acne. In between receiving woefully outdated sex education and attempting to navigate a newfound interest in boys, the protagonist loses teeth and hair, watching her skin come away in pieces to reveal something altogether buggier beneath. The transformation is a horror, of sorts, but it is also an emancipation—once freed from her physical strictures, the protagonist can act in a way that is natural to her (can do to a boy what a praying mantis would normally do to her chosen mate). This sense of transformation as a moment of freedom is not new. From certain representations of Lilith to Lucy Westenra, unloosed female power is often linked to a transformation into the monstrous, a shucking of one skin for another. Teenagers shed their younger selves for the relative power of womanhood, women shed their skins to take on fresh, more threatening aspects. In another story from my collection, a girl begins to take on attributes of her newly acquired stepsister, who happens to be a wolf. While it isn’t a transformation in a physical sense, her newly feral traits allow her to contend far more effectively with the teenage boy who has started following her home. In this case, as in many others, the “transformation” is in some sense an active choice: looking at the possibilities presented to your body—pliancy, weakness, suffering—and choosing another option. Changing into a vampire, a wolf, a praying mantis, and eating people instead.

+

There is another aspect to transformation—or rather, another level. In Katherine Dunn’s Geek Love, a mother experiments with cocktails of drugs and hallucinogens to achieve what she perceives as a perfect brood of “mutant” children to display at the family carnival. The extremity of transformation, here, conveys status and happiness, with “normality” considered a failing. “I get glimpses of the horror of normalcy,” notes the protagonist. “Each of these innocents on the street is engulfed by a terror of their own ordinariness. They would do anything to be unique.”

In my story “Salt Slow,” a woman gives birth to something that might not be quite human—a tentacled creature which her boyfriend tries, and ultimately fails, to destroy. The transformation here, and in Geek Love, manifests in the child rather than the mother—an inherited transformation, so to speak. Transformation, in this sense, can be not only an active choice but one to pass on—the act of becoming, or of siring, a monster as a means of reclaiming control over the female body. Within the scaffolding of the body horror trope, female bodies are arguably presented with unprecedented choice: They can be at once unstable, vulnerable, suffocating, difficult, frightening, monstrous, and changing right before your eyes.

This piece was produced in partnership with Publishers Weekly and also appeared on publishersweekly.com.
Image credit: Marten Newhall

Am I a Bad Feminist?

“How could you publish this novel?” That’s what I heard after I chose to write about a girl falsely accusing a man of sexual assault during the #MeToo era. When The Liar was first published in Israel, a male journalist told me I should have delayed the publication. “You are hurting the struggle,” he said. It’s always nice to have a man telling me what a woman should or shouldn’t write about. But in the following weeks, I wondered: Am I a bad feminist?

I consider myself a feminist. Like most women, I have also experienced sexual harassment, which I never reported. One of the things that silenced me was the fear that people would say I’m making it up, as they often do in these cases. And yet, I wrote a story about a girl who’s “making it up.”

When my baby girl was born, I knew for sure that, one day, she’d experience some sort of sexual harassment, like so many women do. It was terrible: There she was, lying in her crib, and though I didn’t yet know what music she’d like or what she’d want to do when she grew up, I already knew that she was going to be harassed. One day or another, there will come a man who will say something, or grab something—simply because he can.

I hope she won’t feel ashamed, as I did when it happened to me. I pray she won’t blame herself, as women in my generation so often do. I wish that she would never ask herself if it was because of what she wore or something she said. I hope she’ll be able to talk to me about it. I never talked with my mom about it.

Standing next to my baby girl’s crib in that rain of thoughts, I was certain of one thing: Somewhere along the way, this horror is likely to occur. I cannot protect her from this experience, which is written into the future of almost every woman.

But when #MeToo started, I began to hope that I was wrong. Is there a chance that my little girl won’t have to go through the sexual exploitation that my generation suffered? Is there a chance that my girl will be able to go to high school, to summer jobs, to university without any creepy suggestions or overtures made by bosses, professors, or other men in positions of power? If #MeToo manages to truly change the way women are treated, it will be a real revolution.

A few months after my daughter was born, I began writing a novel about a girl who gets caught in a story about a sexual assault that never happened. I decided to write it after a friend of mine, a public defense attorney, told me about an illegal migrant who was jailed because of what turned out to be a false accusation. My friend was so furious at the false accusation that she called the woman who filed the complaint “a psychopath, a monster.” But as I heard my friend calling the accuser names, I felt sorry for her. As a psychologist, I asked myself what could cause a woman to make up such a lie. It’s too easy to turn this woman into a monster. It’s much harder—and a whole lot more interesting—to try and understand her.

The work of literature—for both reader and author—is to dare to face the human condition, human complexity. Good people do bad things. False accusations of sexual assault are rare, but that doesn’t mean they never happen, or that we can’t talk about them. Sometimes we write fiction not about the common case, but about the uncommon one.

And yet, it was very important for me to respect and represent the real statistics of sexual assault: Apart from the protagonist, all the other female characters in the novel have a back story of sexual harassment or assault, and this echoes my observations in reality.

As I wrote the novel, my biggest fear was that it would be read through a chauvinist perspective, one that automatically assumes that all female accusers are liars or attention seekers. After all, that’s one of the things that kept me silent about my own sexual harassment. And we all know that’s one of the favorite defenses of predators: she’s making it up.

Yet, it was clear to me that I wasn’t going to censor a story just because of the fear that someone might twist it and use it for his own misogynistic purposes. Clearly, writing about a girl making up an assault doesn’t mean that most girls make up assaults. Male authors such as Nabokov and Dostoevsky can write about a pedophile in Lolita or a murderer in Crime and Punishment, yet no one would say that these novels portray most men as murderers and pedophiles.

And so, there I was, taking care of my baby girl, writing a story about a girl who makes up a sexual assault. Because I won’t let those men who falsely accuse women of lying limit the variety of stories that a woman can write. Telling a woman what a feminist is allowed to write about is in itself a sort of repression. A Jewish author can portray Jews as complex characters, some of them doing bad deeds, without fearing that anti-Semites might use this in their propaganda.

And this, too, is something that I wish for my daughter when she grows up: to be able to say and write whatever she feels like, so that no one, ever, shuts down her voice.

Image credit: Mario Azzi.

A Game Like Heroin: On Escapism, TwitchCon, and Kicking the ‘Fortnite’ Habit

1.

I couldn’t tell you what time it was when I got the call from The Bosco—the San Francisco-based photo service I freelanced for that mainly covered tech events scattered around the city—but I could tell you what I was doing. I was playing video games, specifically Fortnite on PlayStation 4, a competitive, cross-platform battle royale game created by Epic Games with more than 250 million players and counting.

This is not an honorable story, merely a common one.

I don’t even know how I saw The Bosco’s call. Out of self-preservation, I’d left my phone face down to prevent me from instinctively doing math on how much sleep I wouldn’t be getting. I didn’t want that information, that guilt. I couldn’t do anything with it. It would only make me stop gaming. And that—late at night and lacking any desire to better myself spiritually, physically, or intellectually—was out of the question. Fortnite was all I had. What was I going to do? Read a book? No, I needed to kill. I needed to win. I needed Victory Royale.

The lights in my living room were dimmed to replicate the conditions of a cave, which heightened the sharpness and overall saturation of the digital desert, forest, and farms on which I was battling. As I worked the paddles of my PS4 controller, dodging, shooting, ducking, and weaving, I reveled in the movements of the miniature me on the 65-inch 4K TV. The character (or “skin”) I’d chosen was your typical caricature of an ancient Viking: gut juice smeared across his face; hatchet dangling from his side; braided golden yellow beard; tattoos of pagan idols; leather helmet with two giant horns jutting into the sky. I’m half Norwegian, so I figured why not connect my Fortnite addiction to my lineage?

As the phone rang and rang, 100 different players were dropping into the arena at once. Fumbling to answer, I felt a weird sense of pride as I watched my Viking avatar’s honeysuckle hair flutter behind him like a cape. Luckily, I was able to follow my team to a safe area, where I immediately hid in a bush. I picked up my phone. My boss’s name read big and white on the screen.

Tyler, I thought. Interesting…he never calls.

“What do you want?” I said, never taking my eyes from the game. “I’m in the middle of something.”

“I got some work for you next week,” he told me. “You ever heard of TwitchCon?”

“Of course,” I said. “The videogame streaming app where hordes of nerds watch godly nerds play video games for hours on end.”

I am one of these nerds. Obviously, I’m unable to address this truth. Que sera sera.

“We’re doing a booth there in one of their tents,” he said. “There’s going to be a green screen, something called a glider they are installing, guys coming up from L.A., and all that. It’s in the Fortnite tent. You ever heard of Fortnite?”

2.

Last year Fortnite—a game that grossed $3 billion in 2018 and boasts 200 million registered players—celebrated its first birthday. The CEO of Epic Games is now worth $7 billion. The company came out of nowhere, fusing the old paradigm of third-person shooters like Star Wars Battlefront and Grand Theft Auto with a brand-new mode of competitive gaming: battle royale. So how did Fortnite suddenly become so popular?

Well, it’s addicting. I remember how sweaty and shaky my hands were after my first win, after hours and hours of grinding. I could feel the dopamine and adrenaline surging through me like a swarm of bees. I was literally buzzing. I wanted nothing more than to feel that physiological spike of victory over and over, again and again.

My story isn’t unique. The dominant demographic of players registered with Fortnite is 18- to 24-year-olds (62.7 percent), followed by 25- to 34-year-olds (22.5 percent). It’s no surprise that people ripe for existential crises of one sort or another choose a game that gives them a specific task: kill anyone that gets in your or your team’s way.

That kind of freedom, just given away for free, can’t be found out in the real world.

But in the world of Fortnite, it can.

With pop icons like Drake and sports stars like Richard Sherman selling copyrighted dance moves that Epic uses in the game, it’s no surprise that the masses follow suit. In March 2018, Ninja, the most popular streamer of Fortnite on Twitch—recently he moved to Mixer, a Seattle-based video game live-streaming platform owned by Microsoft; he makes roughly $3 million per year in support of the game—played alongside Drake and Travis Scott for a crowd of millions. Like the Avengers and Harry Potter films, Fortnite draws huge crowds not only because of its content but also because of its culture. And, as with any successful online venture, there are plenty of commercial opportunities: in-game microtransactions, battle passes, and targeted advertising pushing energy drinks, headphones, and specialty game controllers. All this (and much more) is ready to strike like some invisible eel from the coral.

So, what are the repercussions of all this innocent mayhem and electronic carnage? Lorrine Marer, a British behavioral specialist whose clients range from kids to adults, says, “This game is like heroin. Once you are hooked, it’s hard to get unhooked.” That may sound harsh, insensitive, or overly dramatic; nobody’s technically dying from playing Fortnite, but some kids are logging 10 to 12 hours per day. There have been reports of Fortnite-obsessed kids disregarding school and homework, butting heads with parents who threaten to disconnect the Internet. And for what? Another level? A new skin? Another victory?

What Fortnite does—along with a slew of other games—is tap into and exploit the universal desire for legacy, fame, and recognition. Fortnite is really a metaphor for and microcosm of Capitalist America.

3.

Arriving at TwitchCon at the giant McEnery Convention Center in San Jose on a blindingly hot Saturday afternoon, I couldn’t help but assess the crowd as we walked in: the usual superfans with their hair dyed various shades of puke green, yellow, bubble-gum pink, and purple, their giant cell phones recording anything and everything. Then there were the normal-looking 20-somethings, most likely there to, somewhat ironically, view new games while simultaneously mocking their own (and everyone else’s) nerdiness. But as I scoped out the food tents, I noticed another breed of gamer: more professional, almost athletic. Their backs were straighter, the muscles in their hands and forearms toned. Their eyes weren’t sluggish or glazed over; there was a kind of sharpness to them. Something told me something very serious was on the line for this group.

I would later learn that these gamers had been practicing for up to 16 hours a day for who knows how long with the hope of winning a cash prize of $1.5 million. That’s a lot of money for playing a video game. Some of these players had jerseys—some glittery aqua blue, others shiny obsidian—with team names like Disquiet, 100 Thieves, and Solary. As they sauntered into the Fortnite tent—many of them with an odd, awful royal bearing—they reminded me of the workers at the tech companies that litter the streets of San Francisco: dull, common, amoeba-like beings with oversized backpacks and Apple Watches, carrying themselves as if they owned the world. It was like Revenge of the Nerds, but with an extra dose of hubris.

“Hey,” the captain of our crew shouted at me. “You’re laggin’. Let’s go.”

The fruity stink of vape pens and savory smell of overpriced food filled the air. We made our way through security—pat-downs and metal detectors were common now, because of a shooting last year at another video game convention—and then were let loose in the Fortnite tent.

4.

I remember gasping. The ceiling of the tent was at least 100 feet high. It felt like being in the belly of some freakishly giant blue whale, its ribs, muscles, and tendons all visible and aglow. On the thick canvas above, yellow, green, and red strobe lights danced. There was a muted beat of some indecipherable EDM in the distance. The setting was hectic and random, a Technicolor playground of cheap merchandise and desperate developers trying to push their wares.

“Quiet past the players’ area,” one of the guards told me in an annoyed whisper.

“The what?” I asked.

He jerked his head in the direction of a huge wall of translucent orange glass. Behind it were hundreds of computers set up with gaming headsets, ergonomic chairs, and NASA-grade keyboards. This was where professionals played. I saw a few of these spectral, shadowy untouchables hunched over monitors practicing. Through the barrier, I could hear the furious tip-tap-tapping of their fingers.

The pro player closest to me must have been barely 16. He sat alone, save for a pile of twisted Red Bull cans and his gaming set, his face six inches from the screen. His eyes, mirroring the rainbow of lights of the game, were tight and unblinking. And there in the pearly glow of the screen, he was talking to himself. One eye would suddenly bulge; a burst of giggles would follow as he landed a shot or made a kill. And when something went wrong, he’d bark at the game. He looked as if he was trying to literally enter the world of Fortnite, tooth and nail, foot and fist.

Entering the main hall, I did a double take: there in the middle of the room was a real-life school bus. Just like the one at the beginning of the game. And just like the bus in Fortnite, this one had a giant balloon attached to the roof. All around the bus, children in oxygen masks were spray-painting its doors and body. No one seemed to mind. A few people clapped, smiling dead-eyed as the mist from the paint coated their cheeks. The energy of Fortnite—that feeling of gleeful chaos you get in the middle of a firefight—was ubiquitous. It was as if the digital world of my gaming addiction had suddenly—and terrifyingly—come to life.

Next to the bus were barrels oozing some sort of neon-green goo and plastic palm trees, their leaves brittle and lime-green. At the base of the bus were various bottles: ceramic healing and shield potions. Near that was a bunch of kids—decked out in Fortnite shirts—waving colorful flags. And smack-dab in the middle of this strange excuse for a putt-putt course was a giant Durr Burger. If you haven’t seen Fortnite’s Durr Burger, it’s a giant hamburger with googly eyes, a giant pink tongue, and an olive stabbed into the top of its head. I found myself staring directly into its dead eyes, transfixed by its wanton stupidity. All of it begged the simple question: Who the hell puts an olive on a hamburger?

The coup de grâce was a vast lawn of fake grass littered with rainbow-colored beanbags with a view of the giant IMAX screen. This was where the masses could view the digital Fortnite battles. And the area was already packed with hungry spectators lying about. For anyone unable to score a beanbag, there were rows of metal bleachers, all of them covered in peanuts and popcorn and surrounded by plastic fencing and security guards. I was trying to meditate on this terrible battleground when a short kid with electric blueberry hair tugged at my shirt.

“Hey,” he barked. “Are you excited for the competition?”

“The what?” I pulled his clammy hand off me.

“The Fortnite fight, stupid!” he said, spitting on my pants.

“Sure,” I lied. The game I’d been so addicted to, the game I’d lost countless hours to, was finally showing me its true colors. I was nothing but dejected. “Ecstatic.”

“I don’t know what that word means.” He stuck his tongue out at me, ruffled his blueberry hair, and sprinted into the crowd. I was about to shout something at him—something elderly sounding—but he was gone.

5.

Just as we were about to open the photo booth, a giant explosion echoed across the fake lawn.

“What the hell was that?” shrieked a man dressed as a Viking.

I was about to answer when a seven-foot, silver-headed Donkey DJ emerged from a cloud of smoke. The creature had jutting ears, cavernous nostrils, and a massive head no doubt forged from some alien material. I had no words of comfort or explanation for the Viking—let alone for myself. As EDM roared louder, Donkey DJ galloped to a bank of turntables, slapped its hooves down on the records, and started spinning. The crowd was clearly still in shock. Everyone slowly rose from their beanbag chairs and awkwardly bopped their shoulders and indifferently swiveled their hips.

“For today’s main event,” Donkey DJ shouted, “we’ve got a Fortnite competition with a prize of $1.5 million.”

Jesus, I thought. This goddamn donkey can talk.

My awful musings were interrupted as the Fortnite tournament kicked off in earnest. The announcers were clean-cut, business casual, and they narrated every single play, every last shot. When a player’s avatar died, the crowd roared with delight, like Romans watching a gladiator being torn to ribbons. In the face of all that atavistic shouting and madness, I saw clearly how my own pleasure, my need for victory, my addiction to Fortnite was both a product of, and fuel for, the horrors of late-stage capitalism.

Just as I was trying to make sense of this realization, the crowd began to turn. Their malaise, as the matches grew more intense, morphed into violence. Donkey DJ pumped out song after song of trance music, switching the track every 30 seconds. Everywhere was a mass of churning bodies. Everyone was hopped up on sugar and caffeine. And they were starting to lose control.

Three or four spectators climbed onto the stage and interrupted the play-by-play broadcast. Some kid crawled on top of Donkey DJ. People surged closer to the screen, almost as if hypnotized, almost as if they were trying to enter the game. Security was called. It wasn’t enough. A 12-year-old ripped one of the beanbags to shreds, sending tiny white balls flying in all directions. And then it dawned on me: This is what these people wanted, maybe even needed, all along—a true escape. They came to TwitchCon to get as close to Fortnite as they possibly could. To enter the game. But what would happen when they couldn’t? What then?

6.

Later, with Fortnite’s spell forever broken, I spotted the kid I’d seen before, the one with the luminescent blueberry hair. He was fussing at the exit, refusing to leave. His parents called to him. I watched him stomp his feet, grip the door, anger evolving into rage as his parents tried to drag him from the building. But he clung to the doorframe, and his parents eventually gave up and walked away in defeat. In that instant, the kid had to make a choice. His gaze shifted from the real world outside—one filled with family, friends, and actual sunlight—to the Fortnite tent.

I watched him as he shifted his little conflicted body around. He saw the beanbags piled up and packed away. He saw his favorite Fortnite players hanging their heads in defeat. He saw the chasm between the game world and the real world for perhaps the first time in his young life. And in the face of this harsh reality, his eyes grew wide with what I could only hope was some kind of epiphany. Then he seemed to settle. His tensed muscles relaxed. His manic vibe calmed. Was the dreamer finally waking from his dream?

I don’t know. And I never will know. But, in that moment, I felt a ripple of hope as the blue-haired kid turned around, faced the door, and chose to step back into the world.

TwitchCon 2019 takes place at the San Diego Convention Center from Friday Sept. 27 to Sunday Sept. 29. The Fortnite Open kicks off Friday at 11 a.m. in the Twitch Rivals Arena.

Why We Need to Read the Literature of Incarceration

1.
At my university, I once attended a dinner to help support first-generation students. This was a varied, singular group of students, undergraduates and grad students, who had overcome all sorts of challenges in order to land, and thrive, at Columbia.

The next day, I attended a ceremony celebrating a graduating senior in the Directly Impacted Group, a university-wide organization composed of students who have been incarcerated or who have been affected by incarceration via family members. Many of the bright, shiny, brilliant students I’d met the day before were in this group as well.

My own family has been affected by incarceration, and none of this should actually come as a surprise considering the fact that the United States incarcerates more people than any other country in the world, including China and Russia. According to the Vera Institute of Justice, the U.S. holds five percent of the world’s population yet nearly 25 percent of the world’s incarcerated people.

Put another way, if the population of people in prison and jail were a city, it would be a city somewhere in size between Phoenix and Houston. If you add people on probation, the number rises to 7.3 million—somewhere in size between Los Angeles and New York City.

In tandem with the rise of “Supermax” prisons, which are often for-profit and constructed solely of solitary confinement cells, has come the rise of using local jails and private prisons to confine migrants and asylum seekers. New documents have revealed the widespread use of solitary confinement, often for no reason, in Immigration and Customs Enforcement (ICE) detention centers. Given the numbers of people being incarcerated, the outward radiating effects in the community, and the fact that our tax dollars are paying for it (including private prisons, which often merely take the state budgets and rejigger them for profit), the prison problem should be a concern of every American.

2.
Incarceration has existed from the very beginning of America’s history. In 1675, just after the start of King Philip’s War, 500 Native Americans were imprisoned on a barren strip of land off of Boston Harbor called Deer Island; these were the same Native Americans who had welcomed the English to America’s shores in 1621. Half died over the winter. In the 1840s the site became a concentration camp for Irish fleeing the famine, then it became an actual prison, and it is now a sewage treatment plant.

The early 19th century saw the rise of more codified systems, specifically the penitentiary system, also known as the Pennsylvania system, which was rooted in an optimistic idea of social rehabilitation (the “penitence” in penitentiary), versus the “Auburn” system, which emphasized silent communal labor and physical punishment. The Philadelphia penitentiary system in particular relied almost exclusively on solitary confinement, which resulted in catastrophic mental damage to inmates, causing the system to be abandoned.

The U.S. leads the world in its use of prolonged incarceration and solitary confinement despite bleak statistics that show the ineffectiveness of such a system: a Bureau of Justice Statistics study showed that five of six state prisoners were rearrested within nine years, a rate of 83 percent.

The 2019 memoir Solitary, by Albert Woodfox, addresses the prison industrial complex and the dangerous overuse of solitary confinement—and has just been longlisted for the National Book Award. Woodfox was held in solitary confinement in a six-by-nine-foot cell for more than 40 years—longer than any other American. He’d entered the prison system as a teen accused of various crimes, culminating in his arrest for robbery. He escaped from jail and fled to New York, where he became acquainted with the Black Panther Party and their ideas of organizing and education. He was arrested and extradited to the Orleans Parish Prison, where he helped organize a strike that eventually forced the prison to improve its conditions. As punishment, he was sent to the notorious Louisiana State Penitentiary known as Angola, after the slave plantation that formerly occupied its grounds (the plantation itself was named for the African country that was the origin of many slaves brought to Louisiana).

In 1972, when a prison guard was found stabbed to death, Woodfox, despite no evidence linking him to the crime (the guard’s widow would eventually testify that she believed he was innocent), was framed for the murder. He was placed in solitary (also euphemistically rebranded as “closed cell restricted,” CCR) for 44 years and 10 months. In his book he describes it:
We were locked down 23 hours a day. There was no outside exercise yard for CCR prisoners. There were prisoners in CCR who hadn’t been outside in years. We couldn’t make or receive phone calls. We weren’t allowed books, magazines, newspapers, or radios. There were no fans on the tier; there was no access to ice, no hot water in the sinks in our cells. There was no hot plate to heat water on the tier. Needless to say, we were not allowed educational, social, vocational, or religious programs; we weren’t allowed to do hobby crafts (leatherwork, painting, woodwork). Rats came up the shower drain at the end of the hall and would run down the tier. We threw things at them to keep them from coming into our cells. Mice came out at night. When the red ants invaded they were everywhere all at once, in clothes, sheets, mail, toiletries, food.
His case, along with those of two other Black Panthers (the three collectively known as the Angola Three), attracted the attention of Amnesty International and other activists. Eventually the murder conviction was overturned, and Woodfox was released on his birthday in 2016.

I spoke with Mr. Woodfox, now 72, about how he constructed his powerful memoir.

The Millions: Can you tell me how you started writing the book, and also how you got the physical writing materials at all?

Albert Woodfox: I always knew I would tell the story of what happened to me. But when I was in, I didn’t actually write. People smuggled in writing materials to me, just like we got books. There were ways. In my mental space, I had to stay optimistic and not think about what might happen if I stay in here for the rest of my life. In my mind that would be me wondering if I was ever going to go free. So I didn’t think about things that deeply. I just took notes. I took notes for 27 years and managed to get those notes out to my brother. But then they were stolen out of his car!

But that’s why I am very open that I wrote this book with [journalist] Leslie [George]. She helped me go through my memory and put the things together.

TM: What was the most difficult part of being in solitary?

AW: I couldn’t go to my mother’s funeral. They don’t let people in solitary out even for that. First thing I did when I got out was have my brother drive me to the cemetery, and because of a delay in my processing, it was closed. The next day, we went to a store and bought out all the flowers and I brought them to my mother’s grave.

Many countries have banned solitary confinement as torture, and the work of psychiatrist and former Harvard Medical School faculty member Stuart Grassian suggests that humans are such social beings that, deprived of contact in solitary confinement, they begin to suffer irreversible mental and emotional damage almost immediately.

The Angola Three have filed a civil lawsuit on the grounds that being locked down 23 hours a day violates the Eighth Amendment’s protection against cruel and unusual punishment. Their case is still pending.

3.

The New Jim Crow by Michelle Alexander examines how the carceral system rose from the dregs of slavery to control and exploit labor from the black body: A person could be picked up not just for “loitering” but also for “suspected loitering,” and then taken into a prison and put to work via the convict lease system or “chain gang.” Angola prison is literally on a former plantation that named itself for the African country from which its slaves were stolen; one does not need a huge amount of imagination to draw the lines from slavery to the prison system. The “new” Jim Crow aspect of her book shows how the “War on Drugs” has concentrated on the black community, and how—in ways reminiscent of ICE—law enforcement has been able to operate outside the law, often trampling the Fourth Amendment that protects against unreasonable searches and seizures.

The myth of the missing black father is born out of this war. When politicians and cultural figures—not just right-wing pundits but also former President Barack Obama and comedian Bill Cosby—lament missing black fathers, none of them note (nor does the media) that many of these father-child separations are due to arrests for minor infractions, for example, marijuana possession or selling loose cigarettes, as was the case with Eric Garner.

City of Inmates: Conquest, Rebellion, and the Rise of Human Caging in Los Angeles, 1771–1965 by Kelly Lytle Hernández is a good complement to The New Jim Crow in its examination of the rise of incarceration in non-slave states. The book looks at how histories of native elimination, immigrant exclusion, and black disappearance are behind the rise of incarceration in Los Angeles, which built one of the largest systems of human caging in the world to remove marginalized groups ranging from itinerant white “tramps” and “hobos” to Chinese immigrants, African Americans, and Mexican immigrants.

In American Prison, journalist Shane Bauer, who himself was held in solitary confinement in Iran, went undercover as a prison guard at a private prison (also in Louisiana) and wrote a widely praised feature about it for Mother Jones. This book exposes not just the shocking conditions of the prison (for guards and inmates alike) but also charts the rise of the private prison system: At a Republican presidential fundraiser in 1983, an executive of the Magic Stove company daydreamed that privatizing prisons would be “a heck of a venture for a young man to solve the prison problem”—i.e., overcrowding from the flood of new, disproportionately nonwhite inmates via the war on drugs—“and make a lot of money at the same time.” Thus the Corrections Corporation of America was born; its first project: converting a motel into an immigrant detention center in Texas. CCA went public in 1986. Thomas Beasley, one of CCA’s founders, told Inc. “You just sell it like you were selling cars or real estate or hamburgers.”

American Prison takes an in-depth look at the roots of the idea of incarcerating people for profit: how in the 1990s CCA actually built prisons without state contracts, betting (rightly) on a massive increase in the prison population, and how private prisons make for a system deliberately opaque and shielded from public accountability—they are businesses, not government entities. It’s disturbing to think that early investors included Marriott-Sodexho and a venture capitalist who helped create the Hospital Corporation of America. The book’s historical view makes an important point: Using private prisons for immigrant detention is not something new to the Trump administration but dates back to the 1980s and Ronald Reagan.

4.
Another fundamental but surprising fact about incarceration in the U.S. is that 4 percent of the world’s female population lives in the U.S., but the U.S. accounts for more than 30 percent of the world’s incarcerated women. “Total” prison statistics have often obscured the fact that on a state level, women have become the fastest-growing segment of the prison population—even to the point where the growth of their populations is significant enough to counteract reductions in the men’s population. In other words, too many states have an incomplete commitment to prison reform because they ignore women’s incarceration.

Rachel Kushner’s The Mars Room—a novel for which the author went undercover with a group of criminology students—provides an immersive look at life in a women’s prison. The book’s fictional Stanville Prison is a composite of various women’s prisons, including Central California Women’s Facility, the largest women’s prison in the U.S., and the only one in California to house a death row. The novel follows several characters, including a white GED teacher and an incarcerated cop, but is mainly the story of Romy, who is serving a life sentence for murdering a john who was stalking her. She is very much representative of the female prison population, coming in with a history of trauma, abuse, and drug use, and, like the majority of the women in prison, the mother of a young child who will be deeply affected by her incarceration. Further, she spends significant time in and out of solitary confinement, here rebranded as “administrative segregation,” or “ad-seg.”

To be sure, many people are incarcerated because they have committed horrific crimes. But as a shocking video of a woman giving birth in her cell, scared and alone, shows, incarcerated persons are some of the most voiceless and forgotten people in our society. Incarcerated or not, they are still human, they have families, and some will return to society, and as our tax dollars pay for their care (one year in prison costs the same as a year in law school), it is our business to understand how they are being treated—and whether they should be incarcerated in the first place.

Image credit: Unsplash/Carles Rabada.

Judging God: Rereading Job for the First Time

“And can you then impute a sinful deed / To babes who on their mothers’ bosoms bleed?” —Voltaire, “Poem on the Lisbon Disaster” (1755)

“God is a concept / By which we measure our pain.” —John Lennon, “God” (1970)

The monks of the Convent of Our Lady of Mount Carmel would have shortly finished their Terce morning prayers of the canonical hours when an earthquake struck Lisbon, Portugal, on All Saints Day in 1755. A fantastic library was housed at the convent, more than 5,000 volumes of St. Augustine and St. Thomas Aquinas, Bonaventure and Albertus Magnus, all destroyed as the red-tiled roof of the building collapsed. From those authors there had been endless words produced addressing theodicy, the question of evil, and specifically how and why an omniscient and omnibenevolent God would allow such malevolent things to flourish. Both Augustine and Aquinas affirmed that evil had no positive existence in its own right, that it was merely the absence of good in the way that darkness is the absence of light. The ancient Church Father Irenaeus posited that evil is the result of human free will, and so even natural disaster was due to the affairs of women and men. By the 18th century, a philosopher like Gottfried Leibniz (too Lutheran and too secular for the monks of Carmo) would optimistically claim that evil is an illusion, for everything that happens is in furtherance of a divine plan whereby ours is the “best of all possible worlds,” even in Lisbon on November 1, 1755. On that autumn day in the Portuguese capital, the supposedly pious of the Carmo Convent were faced with visceral manifestations of that question of theodicy in a city destroyed by tremor, water, and flame.

No more an issue of scholastic abstraction, of syllogistic aridness, for in Lisbon perhaps 100,000 of the monks’ fellow subjects would die in what was one of the most violent earthquakes ever recorded. Death came in the initial collapse of the houses, palaces, basilicas, and cathedrals; in the tsunami that pushed in from the Atlantic and up the Tagus River; in the fires that ironically resulted from the preponderance of votive candles lit to honor the holy day; and in the pestilence that broke out among the debris and filth of the once proud capital. Lisbon was the seat of a mighty Catholic empire, which had spread the faith as far as Goa, India, and Rio de Janeiro, Brazil; its inhabitants were adherents to stern Iberian Catholicism, and the clergy brooked no heresy in the kingdom. Yet all of that faith and piety appeared to make no difference to the Lord; for the monks of the Carmo Convent who survived their home’s destruction, their plaintive cry might as well have been that of Christ’s final words upon the cross in the Book of Matthew: “My God, my God, why hast thou forsaken me?”

Christ may have been the Son of God, but with his dying words he was also a master of intertextual allusion, for his concluding remarks are a quotation of another man, the righteous gentile from the land of Uz named Job. If theodicy is the one insoluble problem of monotheism, the viscerally felt and empirically verifiable reality of pain and suffering in a universe supposedly divine, then Job remains the great brief for those of us who feel like God has some explaining to do. Along with other biblical wisdom literature like Ecclesiastes or Song of Songs, Job is one of those scriptural books that can sometimes appear as if some divine renegade snuck it into the canon. What Job takes as its central concern is the reality described by journalist Peter Trachtenberg in his devastating The Book of Calamities: Five Questions about Suffering, when he writes that “Everybody suffers: War, sickness, poverty, hunger, oppression, prison, exile, bigotry, loss, madness, rape, addiction, age, loneliness.” Job is what happens when we ask why those things are our lot with an honest but broken heart.

I’ve taught Job to undergraduates before, and I’ve sometimes been surprised by their lack of shock when it comes to what’s disturbing about the narrative. By way of synopsis, the book tells the story of a man who, the poem makes clear, is unassailably righteous, and how Satan, in his first biblical appearance (and counted as a son of God to boot), challenges the Creator, maintaining that Job’s piety is only a result of his material and familial well-being. The deity answers the devil’s charge by letting the latter physically and psychologically torture blameless Job, so as to demonstrate that the Lord’s human servant would never abjure Him. In Bar-Ilan University Bible professor Edward Greenstein’s masterful Job: A New Translation, the central character is soberly informed that “Your sons and your daughters were eating and drinking wine in / the house of their brother, the firstborn, / When here: a great wind came from across the desert; / it affected the four corners of the house; / it fell on the young people and they died”—and the final eight words of the last line convey in their simplicity the defeated and personal nature of the tragedy. Despite the decimation of Job’s livestock, the death of his children, the rejection of his wife, and finally the contraction of an unsightly and painful skin ailment (perhaps boils), “Job did not commit-a-sin – / he did not speak insult” against God.

Job didn’t hesitate to speak against his own life, however. He bemoans his own birth, wishing that the very circumstances of his life could be erased, declaring “Let the day disappear, the day I was born, / And the night that announced: A man’s been conceived!” Sublimely rendered in both their hypocrisy and idiocy are three “friends” (a later interpolation that is the basis of the canonical book adds a fourth) who come to console Job, but in the process they demonstrate the inadequacy of any traditional theodicy in confronting the nature of why awful things frequently happen to good people. Eliphaz informs Job that everything, even the seemingly evil, takes part in God’s greater and fully good plan, that the Lord “performs great things too deep to probe, / wondrous things, beyond number.” The sufferer’s interlocutor argues, as Job picks at his itchy boils with a bit of pottery, perhaps remembering the faces of his dead children when they were still infants, that God places “the lowly up high, / So the stooped attain relief.” Eliphaz, of whom we know nothing other than that he speaks like a man who has never experienced loss, is the friend who counsels us that everything works out in the end even when we’re at our child’s funeral.

Bildad, on the other hand, takes a different tack, arguing that if Job’s sons “committed a sin against him, /He has dispatched them for their offense”—the victim-blaming logic that from time immemorial has preferred to ask what the raped was wearing rather than why the rapist rapes. Zophar angrily supports the other two, and the latecomer Elihu emphasizes God’s mystery and Job’s impudence in questioning it. To all of these defenses of the Lord, Job responds that even “were I in the right, his mouth would condemn me. /(Even) were I innocent, he would wrong me… It is all the same. /And so, I declare:/The innocent and the guilty he brings to (the same) end.” In an ancient world where it’s taken as a matter of simple retributive ethics that God will punish the wicked and reward the just, Job’s realism is both more world-weary and more humane than the clichés offered by his fair-weather friends. “Why do the wicked live on and live well,/Grow old and gain in power and wealth?” asks Job, and from 500 BCE unto 2019 that remains the central question of ethical theology. As for any legitimate, helpful, or moving answer from his supposed comforters, Greenstein informs us that “They have nothing to say.”

Finally, in the most dramatic and figuratively adept portion of the book, God Himself arrives out of the whirlwind to answer the charges of Job’s “lawsuit” (as Greenstein renders the disputation). The Lord never answers Job’s demand to know why he has been forced to suffer, responding instead with non sequiturs about His own awesomeness, preferring to meet the pain of a father who has buried his children with rhetorical questions: “Where were you when I laid earth’s foundations?/Tell me – if you truly know wisdom!/Who set its dimensions? Do you know? /Who stretched the measuring line?… Have you ever reached the sources of Sea, /And walked on the bottom of Ocean?” God communicates in the rhetoric of sarcastic grandeur, mocking Job by saying “You must know… Your number of days is so many!” Of course, Job had never done any of those things, but an unempathetic God also can’t imagine what it would be like to have a Son die—at least not yet. That, however, gets us ahead of ourselves, and a reader can’t help but notice that for all of His poetry from the whirlwind, and all of His frustration at the intransigent questioning of Job, the Lord never actually answers why such misfortune has befallen this man. Rather, God continually emphasizes His greatness to Job’s insignificance, His power to Job’s feebleness, His eternity to Job’s transience.

The anonymous author’s brilliance is deployed in dramatic irony, for even if Job doesn’t know why he suffers, we know. Greenstein explains that readers “know from the prologue that Job’s afflictions derive from the deity’s pride, not from some moral calculus.” Eliphaz, Bildad, Zophar, and Elihu can pontificate about the unknowability of God’s reasons while Job is uncertain whether he’s done anything wrong that merits such treatment, but the two omniscient figures in the tale—God and the reader—know why the former did what He did. Because the Devil told him to. Finally, as if acknowledging His own culpability, God “added double to what Job had had,” which means double the livestock, and double the children. There is a cruelty in that: the grieving father is expected simply to replace his dead family with a newer, shinier, fresher, and above all living brood. And so, both Job and God remain silent for the rest of the book. In the ordering of the Hebrew scriptures, God remains silent for the rest of the Bible, so that when “Job died old and sated with days” we might wonder if it isn’t the deity Himself who has expired, perhaps from shame. The wisdom offered in the book of Job is the knowledge that justice is sacred, but not divine; justice must be ours, meaning that our responsibility to each other is all the more important.

This is admittedly an idiosyncratic take on the work, and one that I don’t wish to project onto Greenstein. But the scholar does argue that a careful philological engagement with the original Hebrew renders the story far more radical than it has normally been presented. Readers of Job have normally been on the side of his sanctimonious friends, and especially of the Defendant who arrives out of His gassy whirlwind, but the author of Job is firmly on the side of the plaintiff. If everyone from medieval monks to the authors of Midrash, from Sunday school teachers to televangelists, has interpreted Job as being about the inscrutability of God’s plans and the requirement that we passively take our undeserved pain as part of providence, then Greenstein writes that “there is a very weak foundation in biblical parlance for the common rendering.” He argues that “much of what has passed as translation of Job is facile and fudged,” having been built upon the accumulated exegetical detritus of centuries rather than returning to the Aleph, Bet, and Gimmel of Job itself.

Readers of a more independent bent have perhaps detected sarcasm in Job’s response, or a dark irony in God’s restitution of new and better replacement children for the ones that He let Satan murder. For my fellow jaundiced and cynical heretics who have long maintained that Job still has some fight in him even after God emerges from the whirlwind to chastise him for daring to question the divine plan, Greenstein has good news. Greenstein writes that the “work is not mainly about what you thought it was; it is more subversive than you imagined; and it ends in a manner that glorifies the best in human values.” He considers a modern translation of Job 42:6, where Job speaks as a penitent before the Lord, exclaiming “Therefore I despise myself/and repent in dust and ashes.” Such a message seems straightforward—because he questioned the divine plan, Job hates himself, and is appropriately humbled. Yet Greenstein contrasts the modern English with a more accurate version based on an Aramaic text found among the Dead Sea Scrolls, where the man of Uz more ambiguously says “Therefore I am poured out and boiled up, and I will become dust.” This isn’t a declaration of surrender; this is an acknowledgement of loss. Greenstein explains that while both the rabbis and the Church wished to see Job repentant and in surrender, that’s not what the original presupposes. Rather, “Job is parodying God, not showing him respect. If God is all about power and not morality and justice, Job will not condone it through acceptance.” Thus, when we say that the book’s subject is judging God, we should read “God” as the object of that judgment, not its subject.

The moral universe of Job is complex; compared with older ancient Near Eastern works, even exemplary ones like Gilgamesh, its ambiguities mark it as a poem of unusual sophistication. Traditionally dated to about a half-millennium before the Common Era, Job is identified with the literature of the Babylonian Exile, when the Jews had been conquered and forcibly removed to the land of the Euphrates. Such historical context is crucial to understanding Job’s significance, for the story of a righteous man who suffered for no understandable reason mirrored the experience of the Jews themselves, while Job’s status as a gentile underscored a dawning understanding that God was not merely a deity for the Israelites, but that His status was singular and universal. When the other gods are completely banished from heaven, however, the problem of evil rears its horned head, for when the Lord is One, who then is responsible for our unearned pain?

Either the most subversive or the most truthful of scriptural books (maybe both), Job has had the power to move atheist and faithful alike, evidence for those who hate God and anxious warning for those who love Him. Former Jesuit Jack Miles enthuses in God: A Biography that “exegetes have often seen the Book of Job as the self-refutation of the entire Judeo-Christian tradition.” Nothing in the canon of Western literature, be it Sophocles’s Oedipus Rex or William Shakespeare’s Hamlet, quite achieves the dramatic pathos of Job. Consider its terrors, its ambiguities, its sense of injustice, and its reminder that “our days on earth are a shadow.” Nothing written has ever achieved such a sense of universal tragedy. After all, the radicalism of the narrative announces itself, for Job concerns itself with the time that God proverbially sold his soul to Satan in the service of torturing not just an innocent man, but a righteous one. And when questioned on His justification for doing such a thing, the Lord was only able to respond by reiterating His own power in admittedly sublime but ultimately empty poetry. For God’s answer to Job is an explanation of how, but not an explanation of why—and when you’re left to scrape boils off yourself with a pottery shard after your 10 children have died in a collapsing house, that’s no explanation at all.

With Greenstein’s translation, we’re able to hear Job’s cry not in his native Babylonian, or the Hebrew of the anonymous poet who wrote his tale, or the Aramaic of Christ crucified on Golgotha, or even the stately if antiquated early modern English of the King James Version, but rather in a fresh, contemporary, immediate vernacular that makes the title character’s tribulations our own. Our Job is one who can declare “I am fed up,” and something about the immediacy of that declaration makes him our contemporary in ways that underscore the fact that billions are his proverbial sisters and brothers. Greenstein’s accomplishment makes clear a contention that is literary, philosophical, and religious: that the Book of Job is the most sublime masterpiece of monotheistic faith, because what its author says is so exquisitely rendered and so painfully true. Central to Greenstein’s mission is a sense of restoration, for Job is often taught and preached as simply being about humanity’s required humility before the divine, and the need to prostrate ourselves before a magnificent God whose reasons are inscrutable.

By restoring to Job its status as a subversive classic, Greenstein does service to the worshiper and not the worshiped, to humanity and not our oppressors. Any work of translation exists in an uneasy stasis between the original and the adaptation, a one-sided negotiation across millennia in which the original author has no say. My knowledge of biblical Hebrew is middling at best, so I’m not suited to speak to the transgressive power of whoever the anonymous poet of Job was, but regardless of what words she or he chose, I can speak to Greenstein’s exemplary poetic sense. At its core, part of what makes this version of Job so powerful is how it exists in contrast to those English versions we’ve encountered before, from the sublime plain style of King James to the bubblegum of the Good News Bible. Unlike in those traditional renderings, the words “God” and “Lord,” with their associations of goodness, appear nowhere in this translation. Rather, Greenstein keeps the original “Elohim” (which, I should point out, is plural), or the unpronounceable, vowelless tetragrammaton, rendered as “YHWH.” Job is made new through the deft use of the literary technique known as defamiliarization, making that which is familiar once again strange (and thus all the more radical and powerful).

Resurrecting the lightning bolt that is Job’s poetry does due service to the original. Job’s subject is not just theodicy, but the failures of poetry itself. When Job defends himself against the castigation of Eliphaz, Bildad, Zophar, and Elihu, it’s not just a theological issue, but a literary-critical one as well. The suffering man condemns their comforts and callousness, but also their clichés. That his supposed comforters draw so much from biblical books like Psalms and Proverbs testifies to Job’s transgressive knack for scriptural self-reflection. As a poet, the author is able to carefully display exactly what she or he wishes to depict, an interplay between the presented and the hidden that has an uncanny magic. When the poet writes that despite all of Job’s tribulations, “Job did not commit-a-sin with his lips,” it forces us to ask whether he committed sin in his heart and his head, whether his soul understandably screamed out at God’s injustice while his countenance remained pious. This conflict between inner and outer appearance is the logic of the novelist, a type of interiority that we associate more with Gustave Flaubert or Henry James than with the Bible.

When it comes to physical detail, the book is characteristically minimalist. Like most biblical writing, it doesn’t offer much description of either God or Satan, though that makes their presentation all the more eerie and disturbing. When God asks Satan where he has been, the arch-adversary responds “From roving the earth and going all about it.” This is Satan’s first introduction in all of Western literature—never before had that word been used for the singular character—and it brings with it those connotations of ranging, stalking, and creeping that have so often accrued about the reptilian creature who is always barely visible out of the corner of our eyes. Had we been given more serpentine exposition on the character, cloven hooves and forked tails, it would lack the unsettling nature of what’s actually presented. But when the author wants to be visceral, she or he certainly can be. Few images are as enduring in their immediacy as that of Job, who “took a potsherd with which to scratch himself as he sits in/the ash-heap.” His degradation, his tribulation, his shame still resonate 2,500 years later.

The trio (and later the quartet) of Job’s judgmental colleagues render theodicy as a poetry of triteness, while Job’s poetics of misery is commensurate with the enormity of his fate. “So why did you take me out of the womb?” he demands of God, “Would I had died, with no eye seeing me!/Would I had been as though I had been;/Would I had been carried from womb to tomb”—here Greenstein borrowing a favored rhyming congruence from the arsenal of English’s Renaissance metaphysical poets. Eliphaz and Elihu offer maudlin bromides, but Job can describe those final things with a proficiency that shows how superficial his friends are. Job fantasizes about death as a place “I go and do not return, /To a land of darkness and deep-shade;/A land whose brightness is like pitch-black, /Deep-shade and disorder;/That shines like pitch-black.” That contradictory image, of something shining in pitch-black, is an apt definition of God Himself, who, while He may be master of an ordered, fair, and just universe in most of the Bible, is in Job the creator of a “fundamentally amoral world,” as Greenstein writes.

If God from the whirlwind provides better poetry than His defenders could, His theology is just as empty and callous. Greenstein writes that “God barely touches on anything connected to justice or the providential care of humanity,” and it’s precisely the case that for all of His literary power, the Lord dodges the main question. God offers descriptions of the universe’s creation and of the maintenance of all the things that order reality; He conjures the enormities of size and time, and even provides strangely loving descriptions of His personal pets, the fearsome hippopotamus Behemoth and the terrifying whale/alligator Leviathan. Yet for all of that detail, exquisitely rendered, God never actually answers Job. Bildad or Elihu would say that God has no duty to explain Himself to a mere mortal like Job, that the man of Uz deserves no justification for his punishment in a life that none of us ever chose to begin. That, however, obscures the reality that even if Job doesn’t know the reasons behind what happened, we certainly do.

Greenstein’s greatest contribution is making clear that not only does God not answer Job’s pained query, but that the victim knows that fact. And he rejects it. Job answers God with “I have spoken once and I will not repeat…I will (speak) no more.” If God answers with hot air from the whirlwind, the soon-to-be-restored Job understands that silent witness is a more capable defense against injustice, that quiet is the answer to the whims of a capricious and cruel Creator. God justifies Himself by bragging about His powers, but Job tells Him “I have known you are able to do all;/That you cannot be blocked from any scheme.” Job has never doubted that God has it within His power to do the things that have been done; rather, he wishes to understand why they’ve been done, and all the beautiful poetry marshaled from that whirlwind can’t do that. The Creator spins lines and stanzas as effortlessly as He once separated the night from the day, but Job, covered in his boils, couldn’t care less. “As a hearing by ear I have heard you,” the angry penitent says, “And now my eye has seen you. That is why I am fed up.”

Traditional translations render the last bit so as to imply that a repentant Job has taken up dust and ashes in a pose of contrition, but the language isn’t true to the original. More correct, Greenstein writes, to see Job as saying “I take pity on ‘dust and ashes,’” that Job’s sympathy lies with humanity, that Job stands not with God but with us. Job hated not God, but rather His injustice; such a position is love for everybody else. Miles places great significance on the fact that the last time the deity speaks (at least according to the ordering of books in the Hebrew Scriptures) is from the whirlwind. According to Miles, when the reality of Job’s suffering is confronted by the empty beauty of the Lord’s poetry, a conclusion can be drawn: “Job has won. The Lord has lost.” That God never speaks to man again is a type of embarrassed silence, and for Miles the entirety of the Bible is the story of how humanity was able to overcome God, to overcome our need of God. Job is, in part, about the failure of beautiful poetry when confronted with loss, with pain, with horror, with terror, with evil. After Job there can be no poetry. But if Job implies that there can be no poetry, it paradoxically still allows for prayer. Job in his defiance of God teaches us something potent: that dissent is the highest form of prayer, for what could possibly take the deity more seriously? In challenging, castigating, and criticizing the Creator, Job fulfills the highest form of reverence that there possibly can be.

Image credit: Unsplash/Aaron Burden.

Writing the Present for the Future: ‘The Mezzanine’ vs. ‘White Noise’

As we careen toward the 2020s (!), and I personally careen toward my fifties (!), I have been increasingly experiencing what is probably a universal, and not entirely pleasant, shock of aging, i.e., the realization of how fucking long ago in history the decade of my childhood now lies. Specifically, the 1980s. I was born in 1975, but the ’80s marked the true memorable—in both senses—extent of those verdant years (not so verdant, actually, as most of them were spent in Saudi Arabia, but anyway). The 1980s long ago crossed that invisible cultural line into the realm of nostalgic camp: Pac-Man, early MTV, Arnold Schwarzenegger—even grainy TV footage of Ronald Reagan has long carried with it a kind of hideous sentimental aura. But enough time has passed, and enough sociopolitical change has occurred, that the decade now exists as wholly in its own time as the ’40s and WWII did when I was a child.

When this much time has passed, we can truly evaluate the literature of an era, both in terms of how well it captures its own time and how well it, however obliquely, anticipates or fails to anticipate ours. This seems a particularly pressing question amid our current political and cultural insanity: Which books and authors are identifying something true about our moment, and in doing so, perhaps predicting something true about the next? Assuming readers still exist 40 years from now, they will be able to judge our literature from more or less the vantage at which we can now judge that of the ’80s. Recently, I happened to reread two of the most-’80s of ’80s novels: Nicholson Baker’s The Mezzanine and Don DeLillo’s White Noise. It struck me what a contrast they provide—two ways of looking at what is now a startlingly previous age.

In The Mezzanine, Baker examines a 1980s office park under a scientist’s microscope. Nothing is so small as to escape his notice, nothing so trivial as to be beneath consideration: the superiority of paper towels to hand dryers; the overflowing multitudinousness of office supplies in cabinets; different vending machine mechanisms for dropping candy; the subtle kabuki of polite office conversations; the layout of a nearby CVS; an intricate, fantastic meditation on the similarities among office staplers, locomotives, and turntables. Applying a peeled eyeball to the overlooked mundanities of office life is, in fact, the aesthetic mission and basic point of the book. As it undertakes a seeming irrelevancy like tracing the evolution of stapler design from the early 20th century to the present, it invites a reader—unaccustomed to this level of granular detail applied to the banal—to ask what is relevant. Absent the large plot movements and rich character detail we’re accustomed to in fiction, what is left? Well, as it turns out, life, more or less. These tiny objects and customs constitute our lives—in the case of The Mezzanine, our lives as we lived them in the 1980s.

The narrator, Howie, is transfixed by the tiny ingenuities that populate the modern world, and by their evolutionary processes—both technical and cultural. Objects—and ways of using objects—have a lifespan as organic as the lifespans of the invisible humans who invent, market, and use them. The culture or character of any particular age is constituted by the stuff of that age and the way society agrees, by unconscious, collective fiat, to keep it or change it or discard it for something else, sometimes better and often worse. Throughout The Mezzanine, Howie’s little ecstasies about this type of staple remover, or that method of polishing an escalator railing, are tempered by a subtle awareness and anxiety about the loss of these inventions and learned behaviors to an ever-coarsening culture of pure productivity that doesn’t prize them (or anything much besides profit and cost-cutting). 

The book is unostentatiously prescient on this point. At one point Howie wonders how all of this makes money, how it can last. As he says:

We came into work every day and were treated like popes—a new manila folder for every task; expensive courier services; taxi vouchers; trips to three-day fifteen-hundred-dollar conferences to keep us up to date in our fields; even the dinkiest chart or memo typed, Xeroxed, distributed, and filed; overhead transparencies to elevate the most casual meeting into something important and official; every trash can in the whole corporation, over ten thousand trash cans, emptied and fitted with a fresh bag every night; restrooms with at least one more sink than ever conceivably would be in use at any one time, ornamented with slabs of marble that would have done credit to the restrooms of the Vatican! What were we participating in here?

His thoughts go in this large, abstract direction as a natural extension of his noticing the very small, concrete things around the office. The sheer fact of the material world around him, all of the things that have to be manufactured and bought and cleaned and serviced to maintain the surface of a functioning 1980s office building, is both a delight and a bit of an existential horror at certain points. It feels—and, in fact, will turn out to be—unsustainable. Howie’s apprehension of the coming changes in the economy, the layoffs and downsizing both financial and spiritual that will render this kind of lavish and stable workplace antique, is a kind of involuntary thesis that follows unavoidably from his close reading of his world’s text. Nicholson Baker, via Howie, goes humbly about his quiet work, gathering data and making reasonable inferences about the world, inferences that have largely been borne out by the intervening decades.

In White Noise, Don DeLillo—in almost perfect contrast to Baker—looks at his world with the telescopic eye of a priest or pop-cultural anthropologist, beginning with a couple of large-scale hypotheses about modern culture and gathering particulars from there. Anyone familiar with DeLillo could more or less guess what these general hypotheses are, as they run throughout his body of work in various guises: 1) modern consumer culture is similar to primitive culture, and, related, 2) people want to be in cults.

As is standard operating procedure for DeLillo, the book, via its narrator Jack Gladney, operates in the oracular intellectual mode. There is, for instance, lots of stuff like this (during one of the many scenes in which Gladney watches one of his many children sleep):

I sat there watching her. Moments later, she spoke again. Distinct syllables this time, not some dreamy murmur—but a language not quite of this world. I struggled to understand. I was convinced she was saying something, fitting together units of stable meaning. I watched her face, waited. Ten minutes passed. She uttered two clearly audible words, familiar and elusive at the same time, words that seemed to have a ritual meaning, part of a verbal spell or ecstatic chant.

Toyota Celica.

A long moment passed before I realized this was the name of an automobile. The truth only amazed me more. The utterance was beautiful and mysterious, gold-shot with looming wonder. It was like the name of an ancient power in the sky, tablet-carved in cuneiform…

Et cetera. If you’re not reading too closely, it sounds good, and the massed effect of paragraphs like this—of which there are many in White Noise—is to generate an impression of gnomic wisdom. But what is this actually saying? I suppose: brands infest our collective consciousness, more or less, though it sounds much more mystical than that. It’s never quite clear to me, reading DeLillo and especially White Noise, where the satire begins and ends. Is this supposed to be a parody of the pompous intellectual Jack Gladney, professor of Hitler Studies and wearer of a toga and sunglasses around campus? It would probably be more convincing as satire if it didn’t also sound exactly like Don DeLillo. And if there were much of anything else going on in the book besides these sorts of ruminations.

Over and again, we learn versions of the same thing: car names are like magic words, shopping malls are like temples, the Airborne Toxic Event is like an ancient Viking Death Ship (to be fair, this is actually one of the more striking images in the book). This may or may not be true, but it isn’t especially illuminating on any level beyond the claim itself. The book, one feels—despite an established critical reputation for its prescience and incisive cultural vision—is not looking very hard at the things it purports to look hard at.

The result is a novel that misses many present or future aspects of Late Capitalism—Trumpism, economic inequity and class struggle, the Internet—and superficially identifies other burgeoning issues—environmental disasters, anti-depressants—without saying anything very noteworthy about them. White Noise’s mode of intellectual engagement is perfectly metaphorized by the Airborne Toxic Event—a large, dark cloud that floats above the pages and across the events of the narrative without bearing down on the characters or reader in any appreciable way, other than conveying ominousness.

DeLillo’s best book, Libra, operates in a mode much closer to The Mezzanine. Though it invokes its share of non-specific quasi-mystical dread, it is a piece of work grounded in the mundane facts of Lee Harvey Oswald’s life: the abortive time in the military; his awkward marriage to a Russian woman, Marina; the little firings and failures that pushed him, and the country, toward catastrophe. Even the larger circles of intrigue—the CIA and KGB and Mafia and the Cubans—are laboriously researched and rendered, and although a plausible conspiracy is offered, it is a conspiracy of error and stupidity and inertia, and convincing for that reason. Libra notices: it builds its case from the ground up, rather than from a big top-down idea that must be proved—that perhaps can only be proved—by exhaustive and unilluminating iteration.

The difference between these novels says something important, I think, about the most fruitful way of looking at our present moment. There is, on Twitter and elsewhere, the constant search for the Big Idea, The Grand Unified Theory of Trump and Late Capitalism. In a media environment that almost exclusively rewards brevity and pithiness, memorable pronouncement is the coin of the realm. In this sense, DeLillo really was prescient—if nothing else, the style of White Noise fully anticipates our era, the superannuation of truth by the impression of truth, or just by sheer impression.

Still, the most important work will always be done on the ground level, with attentiveness to the little particularities. We are always too close to the big thing to see the big thing, and so writers are at best like the blind men surrounding the elephant of their particular era—here, a tail; there, a baffling trunk. The Jack Gladneys and their Big Ideas will not often provide a definitive record of their time, or a projection of the one to come. It will be constructed by the Howies, all the careful and conscientious noticers of the world.