Readers of James Joyce will be well used to flipping back and forth between the main text and endnotes, or perhaps keeping Google at hand, to find an explanation for some obscure reference to Irish politics or a piece of 100-year-old Dublin slang. It is this specificity of setting and precision of detail that Joyce held to be a cornerstone of his work, arguing that “In the particular is contained the universal.”
This point is perhaps most accessible in “The Dead,” the final short story in his collection Dubliners. Although, like all of his fiction, it is rooted firmly in the specifics of Irish culture, it is structured around the universally familiar time of Christmas, and its very meaning hinges upon this seasonal setting.
Even if one is not an active participant in annual yuletide celebrations, popular culture has ensured that we all recognize the greeting-card image of the traditional Christmas gathering: friends and family joined in a single warm home, with the snow falling outside, to enjoy food, drink, music, speeches, and general merrymaking. Joyce’s story presents this exact Christmas-time setting. However, his narrative does not move in the same stream of hokey sentimentality that so many Christmas stories do: he keeps the unfortunate exchanges, conversational faux pas, and awkward silences, which are such an inevitability, firmly intact.
Although Joyce entertains no idealization of the holiday season, he nonetheless suggests that this Christmas romanticism, which we all on some level buy into, can in fact be the catalyst for moments of profound realization. Indeed, the story ends with the protagonist, Gabriel Conroy, experiencing one of the most famous epiphanies in all of literature, directly inspired by the preceding Christmas celebrations: “He wondered at his riot of emotions of an hour before. From what had it proceeded? From his aunt’s supper, from his own foolish speech, from the wine and dancing, the merrymaking when saying good-night in the hall, the pleasure of the walk along the river in the snow.” The absence of a question mark in this last sentence suggests that these Christmas celebrations, even if Gabriel does recognize them as “foolish,” are a necessary component in bringing him to his epiphany, by forcing him into a more emotional state of mind.
This emotional resonance of the Christmas season may stem in part from its unique ability to force us to confront the past and the future simultaneously. Situated on the cusp of the New Year, Christmas is a natural time for reflection, whilst also providing the hope of some new beginning. Furthermore, the image of snow — the blank white surface — suggests a sense of optimistic forward thinking, while the nostalgia that comes from bringing family together is bound to force the mind backwards.
This duality implicit in the Christmas tradition is also embodied in the two most enduringly popular Christmas narratives of our time: Charles Dickens’s A Christmas Carol and Frank Capra’s It’s a Wonderful Life. Both of these stories take their protagonists on journeys into alternate realities based on decisions they have made in the past, and decisions they might make in the future, culminating in an epiphany of redemption.
While “The Dead” also forces its protagonist to examine the past and an imagined future, its ending is much more ambiguous and has none of the redemptive qualities of Dickens or Capra. This is reflected in the fact that, unlike those narratives, “The Dead” does not take place on Christmas Eve, which allows an opportunity for change before the day itself, but rather on the final day of the “Christmas-time” period — the day known, fittingly, as Epiphany.
By setting his story in the wake of Christmas, Joyce implies that it is too late for change, with that possibility lying dead in the past. Furthermore, the fact that he informs us of the story’s Epiphany setting by referring to a New Year’s resolution already broken (the “reformed” drunk Freddy Malins gets “screwed” at the party), reinforces the notion that a failure to self-improve is a theme at the story’s center.
The image Gabriel conjures of his grandfather on horseback, endlessly circling a statue in town on his confused horse, also suggests an inability to move forward, and is part of a larger tapestry within Dubliners that presents all of Ireland in a state of “paralysis.” Likewise, Gabriel’s assured vision of the future, in which he imagines his aunt’s funeral and thinks “that would happen very soon,” suggests a sense of impending death, with no chance of aversion. His prophetic name further implies the certainty of this future, and seems to end the story on a note of undiluted bleakness.
However, although “The Dead” does not offer hope in the straightforward way that A Christmas Carol and It’s a Wonderful Life do, there is surely some hope in the fact that Gabriel experiences an epiphany at all. If everything in the world were truly in “paralysis,” this moment of transcendence and absolute truth-seeing would not be possible, and although it brings no joy, it is a key moment of recognition and self-understanding; a moment of growth.
What Joyce presents us with in “The Dead” is a true, unidealized epiphany. In the real world epiphanies are hard — we do not have literal ghosts and angels to gently guide us towards them. And that is why Joyce’s story is the perfect coda for the holiday season.
As we leave Christmas, which is a time of fantasy — taking us away from the routine of our normal lives, and saturating us in narratives of schmaltz — “The Dead” represents a return to reality; a reality in which brief moments of self-realization are hard-won, and even when they do come they do not necessarily offer any possibility of tangible change. For Joyce, the true epiphanies do not come during Christmas, amidst wine and tinsel and distant relatives, but after the celebrations have died down, when we find ourselves alone, awake in the witching hour with only our own imagined ghosts of the past and the “shades” of the future to keep us company.
Image Credit: Flickr/formatc1
England, as you know if you’ve ever read A Christmas Carol, has a long tradition of telling ghost stories around Christmas. What else could you read besides the Dickens classic to partake? At The Paris Review Daily, Colin Fleming lists a number of candidates, including “Smee” by A.M. Burrage and “The Kit-Bag” by Algernon Blackwood. You could also check out our reading list for December.
Tom Nissley’s column A Reader’s Book of Days is adapted from his book of the same name.
Did Dickens invent Christmas? It’s sometimes said he did, recreating the holiday as we know it out of the neglect that had been imposed on it by Puritanism, Utilitarianism, and the Scrooge-like forces of the Industrial Revolution. But Dickens himself would hardly have said he invented the traditions he celebrated: the mission of his Ghost of Christmas Present, after all, is to show that the spirit and customs of the holiday are authentic and alive among the people, not just humbug. But A Christmas Carol did appear alongside the arrival in Victorian England of some of the modern traditions of the holiday. It was published in 1843, the same year the first commercial Christmas cards were printed in England, and two years after Prince Albert brought the German custom of the Christmas tree with him to England after his marriage to Queen Victoria.
Christmas was undoubtedly Dickens’s favorite holiday, and he made it a tradition of his own. A Christmas Carol was the first of his five almost-annual Christmas books (he regretted skipping a year in 1847 while working on Dombey and Son; he was “very loath to lose the money,” he said. “And still more so to leave any gap at Christmas firesides which I ought to fill”), and then for eighteen more years he published Christmas editions of his magazines Household Words and All the Year Round. And the popular and exhausting activity that nearly took over the last decades of his career, his public reading of his own works, began with his Christmas stories. For years they remained his favorite texts to perform, whether it was December or not.
One of the Christmas traditions Dickens most wanted to celebrate is one mostly forgotten now: storytelling. The early Christmas numbers of Household Words were imagined as stories told around the fireplace, often ghost stories like A Christmas Carol. It’s an easily forgotten detail that the classic American ghost tale, Henry James’s The Turn of the Screw, is also told around the Christmas hearth. James begins his tale with the mention of a story told among friends “round the fire,” about which we learn little except that it was “gruesome, as, on Christmas Eve in an old house, a strange tale should essentially be,” and that it involved a child. Three nights later that story inspires another, even stranger and more unsettling and involving not one child but two, a ratcheting of dread that gave James the title for his tale.
Telling ghost stories around the hearth might have declined since Dickens’s and James’s times, but it’s striking how important the voice of the storyteller remains in more recent Christmas traditions: Dylan Thomas, nostalgic for the winters of his childhood in “A Child’s Christmas in Wales”; Jean Shepherd, nostalgic for the Red Ryder air rifles of his own childhood in In God We Trust, All Others Pay Cash, later adapted, with Shepherd’s own narration, into the cable TV staple A Christmas Story; and David Sedaris, nostalgic for absolutely nothing from his years as an underpaid elf in the “SantaLand Diaries,” the NPR monologue that launched his storytelling career.
Gather round the fire with these December tales:
Rock Crystal by Adalbert Stifter (1845)
In a Christmas tale of sparkling simplicity, a small brother and sister, heading home from grandmother’s house on Christmas Eve across a mountain pass, find their familiar path made strange and spend a wakeful night in an ice cave on a glacier as the Northern Lights — which the girl takes as a visit from the Holy Child — flood the dark skies above them.
The Chemical History of a Candle by Michael Faraday (1861)
Dickens was not the only Victorian with a taste for public speaking: Faraday created the still-ongoing series of Christmastime scientific lectures for young people at the Royal Institution, the best known of which remains his own, a classic of scientific explanation for readers of any age.
Little Women by Louisa May Alcott (1868)
If you were one of the March girls, you’d read the copies of The Pilgrim’s Progress you found under your pillow on Christmas morning, but we’ll excuse you if you prefer to read about the Marches themselves instead.
Appointment in Samarra by John O’Hara (1934)
Julian English’s three-day spiral to a lonely end, burning every bridge he can in Gibbsville, Pennsylvania, from the day before Christmas to the day after, is inexplicable, inevitable, and compelling, the inexplicability of his self-destruction only adding to his isolation.
“The Birds” by Daphne du Maurier (1952)
Hitchcock transplanted the unsettling idea of mass avian malevolence in du Maurier’s story from the blustery December coast of England to the Technicolor brightness of California, but the original, told with the terse modesty of postwar austerity, still carries a greater horror.
The Catcher in the Rye by J. D. Salinger (1951)
Holden’s not supposed to be back from Pencey Prep for Christmas vacation until Wednesday, but since he’s been kicked out anyway, he figures he might as well head to the city early and take it easy in some inexpensive hotel before going home all rested up and feeling swell.
Instead of a Letter by Diana Athill (1963)
The “twenty years of unhappiness” recounted in Athill’s memoir, after her fiancé wrote to say he was marrying someone else just before being killed in the war, ended on her forty-first birthday with the news she had won the Observer’s Christmas story competition (the same prize that launched Muriel Spark’s career seven years before).
Tape for the Turn of the Year by A. R. Ammons (1965)
The long poem was a form made for Ammons, with its space to wander around, contradict himself, and turn equally to matters quotidian and cosmic, as he does in this lovely experiment that, in a sort of serious joke on Kerouac, he composed on a single piece of adding machine tape from December 1963 to early January 1964.
Chilly Scenes of Winter by Ann Beattie (1976)
Want to extend The Catcher in the Rye’s feeling of unrequited holiday ennui well into your twenties? Spend the days before New Year’s with Charles, impatient, blunt, and love-struck over a married woman whom he kept giving Salinger books until she couldn’t bear it anymore.
The Ghost Writer by Philip Roth (1979)
The brash and eventful fictional life of Nathan Zuckerman, which Roth extended in another eight books, starts quietly in this short novel (one of Roth’s best), with his abashed arrival on a December afternoon at the country retreat of his idol, the reclusive novelist E. I. Lonoff.
The Orchid Thief by Susan Orlean (1998)
Head south with the snowbirds to the humid swamps of Florida as Orlean investigates the December theft of over two hundred orchids from state swampland and becomes fascinated by its strangely charismatic primary perpetrator, John Laroche.
Stalingrad by Antony Beevor (1999)
Or perhaps your December isn’t cold enough. Beevor’s authoritative account of the siege of Stalingrad, the wintry graveyard of Hitler’s plans to conquer Russia, captures the nearly incomprehensible human drama that changed the course of the war at a cost of a million lives.
The Year of Magical Thinking by Joan Didion (2005)
Didion’s year of grief, recorded in this clear-eyed memoir, began with her husband’s sudden death on December 30, 2003, and ended on the last day of 2004, the first day, as she realized to her sorrow, that he hadn’t seen the year before.
Last Night at the Lobster by Stewart O’Nan (2007)
Manny DeLeon will be all right—he has a transfer to a nearby Olive Garden set up—but in his last shift as manager of a Connecticut Red Lobster, shutting down for good with a blizzard on the way, he becomes a sort of saint of the corporate service economy in O’Nan’s modest marvel of a novel.
December by Alexander Kluge and Gerhard Richter (2012)
Two German artists reinvent the calendar book, with Richter’s photographs of snowy, implacable winter and Kluge’s enigmatic anecdotes from Decembers past, drawing from 21,999 B.C. to 2009 A.D. but circling back obsessively to the two empires, Nazi and Soviet, that met at Stalingrad.
Fred Armisen opened the first season of the TV show Portlandia singing “The Dream of the 90s is Alive in Portland,” a dream of pierced, tattooed folks hanging out, hot girls wearing glasses and putting images of birds on everything, and grown-ups making a living making coffee. He asks Carrie Brownstein if she remembers the ’90s, when people were unambitious and “they had no occupations whatsoever.” “I thought that died out a long time ago,” she says, wonderingly, before she leaves L.A. to join Armisen’s ragged troupe of relaxed and minimally employed folks dedicated to the art of skateboarding. The context missing from this hilarious send-up is that Portland experienced a decade-long recession in the early years of the 2000s, and didn’t bounce back from it until the last couple of years.
The ’90s, like the ’80s before them, were a decade of company mergers and the birth of bigger, leaner, and meaner mega-corporations. These corporations got leaner by slashing the ranks of middle managers, which had burgeoned as the white-collar workforce expanded over the 20th century. So the dream of the ’90s was more like the last resort of the ’90s — making lemonade out of some very sour lemons. This disappearance of stable, salaried jobs as the dominant form of employment in the United States has been touted recently as an opportunity — now you can chase your dream! Now you can be an entrepreneur! Now you can wake up at 11:00 and lounge around before making coffee at a low hourly wage! It turns out, though, that those middle managers made up a large portion of the modern middle class, which was thoroughly shaken by the mass layoffs of the last two decades of the 20th century and has not recovered.
The Portlandia dream of escaping the stultifying culture of the office also drives Office Space, the 1999 cult film about cube-dwellers rebelling against their repressive, meaningless work. Nikil Saval uses this movie as one of the jumping-off points in his fascinating history of the workplace as a place — the office, in Saval’s book Cubed, is not only a location but a space that has evolved to maximize the company’s return on its investment in its worker bees, while also trying to hide that fact. The tight, three-walls-and-a-desk cubicle that defined the late 20th-century office, we learn in Saval’s book, is a warped version of a design that was supposed to balance the employee’s need for privacy with the company’s need to surveil its workers. In Office Space, that cubicle has become a prison, one you are technically free to leave, but outside of which gapes the yawning gulf of unemployment and instability.
The wonky-eyebrowed hero of Office Space ends up happily working as a manual laborer, after his company literally collapses. This embrace of working outdoors, with his hands, is supposed to counter the namby-pamby paper shuffling under fluorescent lights that defines office work. He has become truly masculine again, and has found authentic, meaningful labor.
However, he’s also embarked on a career marked by hourly wages and the uncertainty of future work. He has become contingent. His switch to contract-based work echoes the changes in the American workforce — for blue-, pink-, and white-collar workers alike — at the end of the last century. Based on reports from the Bureau of Labor Statistics, Saval estimates that the number of freelance or temporary workers in the American labor force today has reached as high as 30 percent. White-collar work initially had prestige, over a century ago, because it provided not only higher earnings but steady ones. One of the reasons the middle class grew and grew in the 20th century is that workers were attracted to the stability of office work — it was one of the first kinds of jobs to provide a salary rather than an hourly or piecework wage.
Back in the 1880s, when the very concept of going to an office in which your labor was mental, not menial, was being invented, only 5 percent of people were employed as clerks, the job that became the emblem of white-collar labor. Everyone else was an artisan or a small farmer or a professional, or, as was the case with most of our ancestors, they sold the labor of their backs and arms to whoever needed it. While nostalgia for the Gilded Age, with its extremes of inequality and instability, seems unlikely, the second season of Portlandia reprises the show’s original song, but tweaks it to “The Dream of the 1890s.” Bread-baking, beer-brewing, beard-wearing hipsters sonorously sing of the joy of DIY while still slapping a bird on everything, but this time in embroidery floss instead of neon paint. But Portlandia’s mockery of the penny farthing-riding youngsters in suspenders actually points out a cultural reflection of an economic fact. Saval writes in his conclusion, “The United States is returning to the preindustrial era…work appears to be moving not forward but back to an earlier era of insecurity.” Stable, predictable careers that end in a pension — jobs that, granted, could often be repetitive, meaningless, driven by others’ goals, and dominated by office politics — seem now to have been a bubble that is slowly deflating. Entrepreneurship no longer feels like a huge risk when we’ve seen friends get laid off during the recession from even the biggest companies (maybe especially the biggest), and seen some of the bastions of our financial system go completely belly up. Our current direction echoes the early days of the industrial age, when the middle class was made of shopkeepers, not bookkeepers. The dream of the 1890s is alive in Brooklyn, Oakland, and Detroit, too.
The spin, again, is that this return to self-employment actually frees both us and the companies that used to employ us. Companies that run on contract labor can be more flexible and responsive to economic conditions, hiring freelancers and consultants to get by. We can take time to hang with our families when we need to and can choose projects that appeal to our ethics and beliefs. We work for ourselves, while companies avoid paying for health insurance or unemployment insurance.
The contingent nature of this growing sector of our economy also means that workers take on more risk. By contrast, Saval identifies the main characteristics of the typical white-collar worker in the 20th century as patience, conformity, and a fear of risk. This makes sense when you think about what characteristics are needed when you are someone else’s employee. The goal of this typical worker was to move up through the ranks slowly and steadily, and plenty of business books throughout the 20th century purported to teach readers the secrets to getting ahead of your cohort at work. However, Saval also points out that most white-collar work was secretly pretty dead-end, especially for women, who did all the low-paid clerical work and had no pathway out of the secretarial pool. But middle management was the level at which many men’s careers leveled out, too. Most people were willing to exchange boredom for a steady, guaranteed paycheck, preferring apathy over uncertainty. By the 1970s, in the aftermath of the cultural revolt of the ’60s, over-educated workers were sick of running up against this wall, and business culture started to change. Now white-collar workers were renamed “knowledge workers,” and their creativity and individuality were emphasized in order to soothe the growing impatience of workers facing stagnating wages and repetitive, mind-numbing work.
While dedicated viewers of AMC’s 1960s-set Mad Men might think of the protagonist, Don Draper, as representative of mid-century manhood, in many ways he is, in fact, the precursor to the knowledge worker of today. He works in the office and out, takes naps when needed, and is driven by his urge to come up with the new language and image that will propel his ad company forward. He adds value through creativity. The real symbol of the mid-century company man in Mad Men is Pete Campbell, Accounts. Spindly and wispy, creating nothing but money for his employers, Pete ends one episode in the first season sitting in his darkened office with a shotgun, like a colonial governor posed for a portrait that will hang threateningly over his desk when supplicants come calling. The threat to manhood of becoming a glorified clerk is Pete’s constant battle, as he yearns to be like Don but is constantly rejected by the sleek yet brawny creative. It’s Don’s protégé, Peggy Olson, who will become the knowledge worker of the future. Deriving more pleasure from life within the office walls than without, she makes an ascension from typist pool to advertising creative that symbolizes the breakdown of hierarchies and the move away from the repetitive tasks of paper pushing to the more stimulating challenges of coming up with new ways to sell beans.
Saval is a graduate student at Stanford University, in the heart of Silicon Valley, where the knowledge worker is all-important to the creation and launch of the new new thing that defines Californian capitalism. The tech industry today amalgamates art and work; designers and engineers work together to make pretty, functional games, devices, and media. We learn, in Twitter co-founder Biz Stone’s new memoir-cum-advice book, Things a Little Bird Told Me: Confessions of a Creative Mind, that like many other tech billionaires, he too was a college dropout. The difference is, though, that Stone dropped out to design book jackets at Little, Brown, the literary publisher. For a multi-millionaire, Stone emphasizes strangely often in his book how unimportant money is to him. Over and over, he reiterates his key points, which are intended to resonate strongly with the young founders of start-ups popping up along Market Street, in the shadow of Twitter’s giant new building in San Francisco: do your work for the love of it, not the money; create products that will improve the world; don’t fear failure, and always take risks. All the qualities that once made for the ideal white-collar worker are turned on their head. Stone describes his lack of respect for imposed authority and tells the story of his high school “no homework” deal, in which he persuaded his teachers to exempt him from homework, as long as his grades stayed up. He encourages his readers to act like the rules don’t apply to them, and to “think different,” that iconic tag line that urged you to buy Macs instead of PCs. Think different: choose the slightly smaller technology company.
Stone’s advice, based on his own journey from book design to web design, brings into business the language of passion and fun. Similarly, the knowledge worker is the combination of artist and worker. The software engineer makes his own apps on weekends and the graphic designer makes cute videos for fun, and that fact offers companies the perfect opportunity to cut costs. When people are looking for jobs that don’t feel like work, companies can hire a guy who will take their lower salary and no benefits over the security of one of the quickly evaporating salaried jobs where he’ll be a middle manager counting up widget sales and thingamabob costs. Richard Florida, in his foundational book, The Rise of the Creative Class, says that the defining element of the creative class is placing flexibility and feeling challenged above base pay. The perks of not wearing a tie and telecommuting can feel like prizes that make up for a lower pay grade, too.
Stone’s description of his approach to work reads like a list of supposed Millennial characteristics — he’s easily bored, he’s impatient, and he wants to do work that’s satisfying, self-fulfilling, individualist, and creative. But he’s 40. These characteristics aren’t just those of recent college graduates, despite the many articles citing the terrible work ethic of young people today. They are the characteristics of the knowledge worker, and while managers might not like it, corporations love it.
The tone of Stone’s memoir/business advice/self-promotion book emphasizes this blend of off- and on-duty that’s the new norm of office culture. It is casual, conversational. This is how he sums up the tense negotiations between Mark Zuckerberg, Ev Williams, and himself when the Twitter co-founders visit the Facebook campus to discuss the potential purchase of Twitter for $500 million, a number Stone claims he just made up on the spot during their meeting:
Again, the takeaway here isn’t about my behavior, which I’m the first to admit was juvenile bordering on obnoxious. Making jokes about massive amounts of money and proposing them to serious potential investors is no way to build a career or a business. The point is to trust your instincts, even if you’re smaller and less powerful than the other guy.
Everything is light, with the human touch Stone prides himself on bringing to the table — he’s the vegan next door who will loan you a cup of organic sugar and help you change your flat tire. His humanization of Twitter is achieved through deploying language and design to connect with the audience. In some ways, his memoir is an argument for the role of the humanities in tech — a product isn’t finished until its soul and its face have been created. The engineers need the English majors.
The design of offices today reflects the same blend of life and work that Stone advocates. Surveying the history of the office hand in hand with the history of its design, as Nikil Saval does, brings architectural history into the story, which is perfect for helping us understand the evolution of the modern office into the postmodern. He begins in the 19th-century countinghouse, where, just like Bob Cratchit in the opening scene of Dickens’s A Christmas Carol, the poor and broken-down clerks hunch over their work while Scrooge watches. As offices evolved and expanded, many design elements changed, but the essential function of surveillance stayed the same. Whether it was the typing pool on the ground floor, encircled by executive offices, or the blank slate of an open office plan, managers wanted to be able to monitor work at all times. Even as workspaces purportedly become more democratic (allotting managers and their assistants the same-size cubicle, for instance), they also strip employees at all levels of ever more privacy. There’s nowhere to hide, except the bathroom. Even smoking breaks are disappearing.
Just as business culture has strived to bring together work and play in the postmodern era, so has architecture moved from the somber slabs of glass and steel that defined office buildings of the Mad Men era to the whimsical “campus” style of the Googleplex and 1 Infinite Loop. Along with their bucolic suburban grounds, there are now gourmet cafeterias and graffiti art. Companies have long offered amenities to entice workers, Saval’s research reveals, but they’ve reached new heights in recent years.
Now start-ups offer unlimited vacation, but with the implicit understanding that you’ll bring your laptop with you to Rome or Portland or your parents’ house for Thanksgiving. And there might be foosball in the office, but there’s also a fold-out couch so you don’t have to go home to sleep. Your CEO and you both wear the same company-branded t-shirt, but only one of you is going home to the multi-million-dollar house.
Reading Saval’s and Stone’s books about working in America, both released in April of this year, gives one a sense of what an awareness of history brings. Stone refuses to be tied down by the known, and looks toward a future hazy with optimism, where all failures will eventually lead down the road to great successes. He believes capitalism should be tweaked, or, in the words he imparted to all newly hired Twitter employees, “We can build a business, change the world, and have fun.” Yet paired with Saval’s book, we see that behind this innocuous soft touch lies a history of companies determined to rise by shaping the cultures and workspaces of their employees to maximize their ROI. In the 1920s, they achieved this through movement studies and efficiency training; in the later part of the century, through human resources departments and company cultures built to encourage worker engagement. Stone’s book shows the necessity of bringing creativity and art into business, and Saval’s book shows the need to remember that this has been done before. Open any management book today and you’ll find an exhortation to incorporate play into the workplace, to help workers forget that they are selling the labor of their minds and bodies so someone else can reap a greater reward.
According to Stone, people like me are the risk-taking entrepreneurs who will reshape American business by doing what we love and taking minimal material compensation for it. I graduated with a Ph.D. last year, but like the rest of American employers, universities have realized how much money they can save by cutting tenure track lines and replacing them with adjunct instructors, who work on a contract without benefits or any guarantee of being employed after the semester ends. The love of the work and our students drives many new Ph.D.s to continue toiling for decades as contingent labor — it’s gotten to the point that nearly 70 percent of academic employees at American colleges are adjuncts, a total flip from just 40 years ago. Professors are now mostly just freelancing teachers. Since adjuncts so rarely make it out of the scramble into the security of tenure, I have also taken on work as a freelance writer and an editor — but the publishing and media industries are not known for their security either, especially in our most blessed information age. Reading Saval showed that the challenges I face, even as a ridiculously over-educated individual, are the same ones faced by a growing number of American workers, whether high-level consultants or low-waged call center workers. With freedom comes risk, and without a strong safety net or a lot of luck, not all of us can recover so gracefully from failure as Stone does.
Whether a cubicle gives you hives or you can’t stop working hours after your bedtime, American work today is not a unique phenomenon. The workday and work culture have always been a taut truce between those doling out the money and those taking it. Despite changes on the surface, that fundamental relationship remains in place. As soft and cuddly as the workplace has become, or however easy it is to go to work without leaving your bedroom, these changes have occurred to improve profitability, not the life and sanity of the worker. And it’s working: even as employment remains stagnant, American productivity is growing by leaps and bounds. What we’ll have to see is whether this century will prove to be closer to the 19th century or the 20th. My bet, as I sew a bird on my canvas tote in order to carry my jars of local jam, is that the last we’ll see of the stability of white-collar work is the Eames chair, installed in home offices that double as living rooms.
I used to be the kind of reader who gives short shrift to long novels. I used to take a wan pleasure in telling friends who had returned from a tour of duty with War and Peace or The Man Without Qualities with that I’ve-seen-some-things look in their eyes—the thousand-page stare—that they had been wasting their time. In the months it had taken them to plough through one book by some logorrheic modernist or world-encircling Russian, I had read a good eight to ten volumes of svelter dimensions. While they were bench-pressing, say, Infinite Jest for four months solid, I had squared away most of the major Nouveau Romanciers, a fistful of Thomas Bernhards, every goddamned novel Albert Camus ever wrote, and still had time to read some stuff I actually enjoyed.
I was a big believer, in other words, in the Slim Prestige Volume. Nothing over 400 pages. Why commit yourself to one gigantic classic when you can read a whole lot of small classics in the same period of time, racking up at least as much intellectual cachet while you were at it? I took Hippocrates’ famous dictum about ars being longa and vita being brevis as a warning against starting a book in your twenties that might wind up lying still unfinished on the nightstand of your deathbed. Aside from the occasional long novel––one every twelve to eighteen months––I was a Slim Prestige Volume man, and that seemed to be that.
Even when I went back to college in my mid-twenties to do a PhD in English literature, I still relied on a kind of intellectual cost-benefit analysis that persuaded me that my time was better spent broadening than deepening—or, as it were, thickening—my reading. Had I read Dostoevsky? Sure I had: I’d spent a couple of rainy evenings with Notes From Underground, and found it highly agreeable. Much better than The Double, in fact, which I’d also read. So yeah, I knew my Dostoevsky. Next question, please. Ah yes, Tolstoy! Who could ever recover from reading The Death of Ivan Ilyich, that thrilling (and thrillingly brief) exploration of mortality and futility?
There’s a memorable moment in Roberto Bolaño’s 2666 where Amalfitano, the unhinged Catalan professor of literature, encounters a pharmacist working the night shift at his local drug store whom he discovers is reading his way diligently through the minor works of the major novelists. The young pharmacist, we are told, “chose The Metamorphosis over The Trial, he chose Bartleby over Moby-Dick, he chose A Simple Heart over Bouvard and Pécuchet, and A Christmas Carol over A Tale of Two Cities or The Pickwick Papers.” This causes Amalfitano to reflect on the “sad paradox” that “now even bookish pharmacists are afraid to take on the great, imperfect, torrential works, books that blaze paths into the unknown. They choose the perfect exercises of the great masters. Or what amounts to the same thing: they want to watch the great masters spar, but they have no interest in real combat, when the great masters struggle against that something, that something that terrifies us all, that something that cows us and spurs us on, amid blood and mortal wounds and stench.”
Apart from being a powerful vindication of Bolaño’s own staggering ambition, and of his novel’s vast and unyielding darkness, I found that this passage reflected something of my own slightly faint-hearted reading practices (practices from which, by the time I had got around to reading the 900-page 2666, I had obviously started to deviate). A bit of a bookish pharmacist myself, I was content with netting minnows like Bartleby, while leaving the great Moby-Dick-sized leviathans largely unharpooned. I was fond of Borges’ famous remark about its being “a laborious madness and an impoverishing one, the madness of composing vast books,” and tended to extrapolate from it a dismissal of reading them too—as though Borges, the great wanderer and mythologizer of labyrinths, would ever have approved of such readerly timidity.
And then, three or four years ago, something changed. For some reason I can’t recall (probably a longish lapse in productivity on my thesis) I set myself the task of reading a Great Big Important Novel. For another reason I can’t recall (probably the fact that it had been sitting on a shelf for years, its pages turning the sullen yellow of neglected great books), I settled on Gravity’s Rainbow. I can’t say that I enjoyed every minute of it, or even that I enjoyed all that much of it at all, but I can say that by the time I got to the end of it I was glad to have read it. Not just glad that I had finally finished it, but that I had started it and seen it through. I felt as though I had been through something major, as though I had not merely experienced something but done something, and that the doing and the experiencing were inseparable in the way that is peculiar to the act of reading. And I’ve had that same feeling, I realize, with almost every very long novel I’ve read before or since.
You finish the last page of a book like Gravity’s Rainbow and—even if you’ve spent much of it in a state of bewilderment or frustration or irritation—you think to yourself, “that was monumental.” But it strikes me that this sense of monumentality, this gratified speechlessness that we tend to feel at such moments of closure and valediction, has at least as much to do with our own sense of achievement in having read the thing as it does with a sense of the author’s achievement in having written it. When you read the kind of novel that promises to increase the strength of your upper body as much as the height of your brow—a Ulysses or a Brothers Karamazov or a Gravity’s Rainbow—there’s an awe about the scale of the work which, rightly, informs your response to it but which, more problematically, is often difficult to separate from an awe at the fact of your own surmounting of it.
The upshot of this, I think, is that the greatness of a novel in the mind of its readers is often alloyed with those readers’ sense of their own greatness (as readers) for having conquered it. I don’t think William Gaddis’s The Recognitions, for instance, is nearly as fantastic a novel as people often claim it is. But it is one of the most memorable and monumental experiences of my reading life. And these are the reasons why: because the thing was just so long; because I had such a hard time with it; and because I eventually finished it. (I read it as part of an academic reading group devoted to long and difficult American novels, and I’m not sure I would have got to the end of it otherwise). Reading a novel of punishing difficulty and length is a version of climbing Everest for people who prefer not to leave the house. And people who climb Everest don’t howl with exhilaration at the summit because the mountain was a good or a well made or an interesting mountain per se, but because they’re overawed at themselves for having done such a fantastically difficult thing. (I’m willing to concede that they may not howl with exhilaration at all, what with the tiredness, the lack of oxygen and very possibly the frostbite. I’ll admit to being on shaky ground here, as I’ve never met anyone who’s climbed Everest, nor am I likely to if I continue not going out of the house.)
And there is, connected with this phenomenon, what I think of as Long Novel Stockholm syndrome. My own first experience of it—or at least my first conscious experience of it—was, again, with The Recognitions. With any novel of that difficulty and length (976 pages in my prestigiously scuffed and battered Penguin edition), the reader’s aggregate experience is bound to be composed of a mixture of frustrations and pleasures. But what I found with Gaddis’s gigantic exploration of fraudulence and creativity was that, though they were greatly outnumbered by the frustrations, the pleasures seemed to register much more firmly. If I were fully honest with myself, I would have had to admit that I was finding the novel gruelingly, unsparingly tedious. But I wasn’t prepared to be fully honest with myself. Because every couple of hundred pages or so, Gaddis would take pity on me and throw me a bone in the form of an engaging, genuinely compelling set piece. Like the wonderful episode in which one of the characters, under the impression that he is being given a gift of $5,000 by his long-lost father whom he has arranged to meet at a hotel, is in fact mistakenly being given a suitcase full of counterfeit cash by a failed confidence man. And then Gaddis would roll up his sleeves again and get back to the real business of boring me insensible with endless pages of direct-dialogue bluster about art, theology and the shallowness of post-war American culture.
I kept at it, doughtily ploughing my way through this seemingly inexhaustible stuff, holding out for another interlude of clemency from an author I knew was capable of entertaining and provoking me. At some point towards the end of the book it occurred to me that what I was experiencing could be looked at as a kind of literary variant of the Stockholm syndrome phenomenon, whereby hostages experience a perverse devotion to their captors, interpreting any abstention from violence and cruelty, however brief or arbitrary, as acts of kindness and even love. Psychologically, this is understood as a defense mechanism in which the victim fabricates a “good” side of the aggressor in order to avoid confronting the overwhelming terror of his or her situation. Perhaps I’m stretching the bounds of credulity by implicitly comparing William Gaddis to a FARC guerilla commander, but I’m convinced there’s something that happens when we get into a captive situation with a long and difficult book that is roughly analogous to the Stockholm syndrome scenario. For a start, the book’s very length lays out (for a certain kind of reader, at least) its own special form of imperative—part challenge, part command. The thousand-pager is something you measure yourself against, something you psyche yourself up for and tell yourself you’re going to endure and/or conquer. And this does, I think, amount to a kind of captivity: once you’ve got to Everest base camp, you really don’t want to pack up your stuff and turn back. I think it’s this principle that explains, for example, the fact that I’ve read Gravity’s Rainbow but gave up halfway through The Crying of Lot 49, when the latter could be used as a handy little bookmark for the former. When you combine this (admittedly self-imposed) captivity with a novel’s formidable reputation for greatness, you’ve got a perfect set of conditions for the literary Stockholm syndrome to kick in.
In order for a very long novel to get away with long, cruel sessions of boredom-torture, it has to commit, every so often, an act of kindness such as the counterfeit cash set piece in The Recognitions. This is why Ulysses is so deeply loved by so many readers—as well it should be—while Finnegans Wake has been read almost exclusively by Joyce scholars (of whom I’m tempted to think as the Patty Hearsts of literature). After the grueling ordeal of the “Scylla and Charybdis” episode, in which Stephen stands around in the National Library for dozens of pages boring everyone to damn-near-literal tears with his theories about the provenance of Hamlet, we are given the unrestrained pleasure of the “Wandering Rocks” episode. Ulysses might treat us like crap for seemingly interminable stretches of time, but it extends just enough in the way of writerly benevolence to keep us onside. And this kindness is the key to Stockholm syndrome. You don’t know when it’s going to come, or what form it’s going to take, but you get enough of it to keep you from despising your captor, or mounting a brave escape attempt by flinging the wretched thing across the room. According to an article called “Understanding Stockholm Syndrome” published in the FBI Law Enforcement Bulletin:
Kindness serves as the cornerstone of Stockholm syndrome; the condition will not develop unless the captor exhibits it in some form toward the hostage. However, captives often mistake a lack of abuse as kindness and may develop feelings of appreciation for this perceived benevolence. If the captor is purely evil and abusive, the hostage will respond with hatred. But if perpetrators show some kindness, victims will submerge the anger they feel in response to the terror and concentrate on the captors’ “good side” to protect themselves.
If you’re the kind of reader who doesn’t intend to give up on a Great Big Important Novel no matter how inhumanely it treats you, then there’s a sense in which Joyce or Pynchon or Gaddis (or whoever your captor happens to be) owns you for the duration of that captivity. In order to maintain your sanity, you may end up being disproportionately grateful for the parts where they don’t threaten to bore you to death, where there seems to be some genuine empathic connection between reader and writer. Machiavelli understood this truth long before a Swedish bank robbery turned into a hostage crisis and gave the world the name for a psychological condition. “Men who receive good when they expect evil,” Machiavelli wrote, “commit themselves all the more to their benefactor.” When he wrote that line in the early sixteenth century, the novel, of course, did not yet exist as a genre. I’m inclined to imagine, though, that if he’d been born a century later, he might well have said the same thing about Don Quixote.
This winter, Millions contributors Emily Colette Wilkinson and Garth Risk Hallberg both happened to pick up M.T. Anderson’s The Astonishing Life of Octavian Nothing, Traitor to the Nation. Via email, we conducted a bicoastal conversation about Octavian Nothing, Volume I: The Pox Party, which we’re sharing with you this week in three installments. Part 1 focused on Form and Style. Part 2 focused on Historical and Geographic Setting. N.B.: Today’s installment contains plot spoilers.

Part 3: Audience, Character, and Conclusion

Garth: What makes a kids’ book a kids’ book, in the popular sense of the phrase? After having thought about this a lot, the answer I’ve come up with is: kids’ books often have a kind of “educational” component adult novels can get away with bypassing. This is a way of broaching the topic of audience. You mentioned earlier, Emily, that you wondered about the audience of this book, and I wanted to suggest that the question of audience may persist even if you ignore, for example, Anderson’s debt to Adorno (as one can easily do).

Emily: Yes, I think I agree with your idea that didacticism is what makes this YA, but I still find myself wondering what I would have made of this book in my early teens – I wonder if I would have liked it, or even understood it. In my own teaching at the university level, I have seen students struggle with 18th-century literature. Of course, there are a lot of children’s books that abstract their plots and characters from philosophy and history that is more adult (Narnia’s Christianity, Jane Langton’s use of American transcendentalism in The Hall Family Chronicles, Jenny Davidson’s use of an alternative history of Europe (if Napoleon had won at Waterloo) in The Explosionist, Matthew Skelton’s use of Faust legends and the history of Gutenberg and his press in Endymion Spring). But I feel like Octavian is harder – the style is harder, the form is harder – even if the history itself will probably be more familiar.
I like the idea of younger readers liking this book, but I am, nonetheless, a little surprised by it.

Garth: This may be one of those things you’re not supposed to ask for from books about American slavery, Sept. 11, Nazism, and so forth, but I thought that the subtlety of Anderson’s moral sense lagged at times behind his technical gifts. Octavian offers an essentially monochromatic vision of the institution of slavery. He does a great job revealing the way it was bound up in the culture, extending the responsibility to most participants of that culture, but he leaves little room for gradations of evil. The ironies are often negations, rather than complications or paradoxes. Juxtaposed with his extraordinary formal achievements, this made me wonder: for whom was this book written? Younger readers may find fusillades of prose flying by over their heads, while older readers may be disappointed by the lack of moral complexity. It may be argued that melodrama is one of the archaic conventions Anderson is playing with here, but Harriet Beecher Stowe got there in the 19th century, and did it more convincingly. I should issue another spoiler alert here, by the way. I think we might have to give away some of the book’s secrets to discuss our criticisms.

Emily: I wouldn’t say I have criticisms of Octavian so much as questions, because it is a difficult book.

Garth: Maybe this is a way of exposing myself as overly hungry for irony in the novels I read. But for me, the problem of moral certainty (and its potential solutions in Volume II of Octavian Nothing) is grounded in the characters themselves. After watching Anderson painstakingly reconstruct the cultural environment within which anyone found slavery sane, I was disappointed to see Mr. Gitney, the head of the Novanglian College of Lucidity, collapse into simple villainy. I was more interested in him when he seemed merely compromised and self-deluding.
Similarly, the virtuous Private Evidence Goring, who befriends Octavian, was a little too virtuous for me. He had this kind of Rousseauvian innocence – he seems genuinely colorblind, and naturally assumes his friend Octavian’s equality. He’s like a son of the soil. Could he really be uncontaminated by the pervasive ideology of slave-owning? I wanted at least to see him be really dismissive of a woman, or something. I guess I wanted him to be capable of change.

Emily: I share your disappointment in Mr. Gitney. The one aspect of this book that I found kind of clunky was the way Gitney pursued his experiment on Octavian. He aims to discover whether Africans have the same intellectual and moral capacities as Europeans, but that would necessitate having a European subject raised alongside Octavian in the exact same conditions. The betrayal of the rationalist empiricism that Gitney claims to defend is glaringly obvious – but not to him. Maybe a way around our dissatisfaction is to think of Gitney and Goring as allegorical figures? Goring as the bright, naive, fresh-faced idealism of a soon-to-be nation; Gitney as… well, maybe the inhumanity of which science and commerce are capable? Something like that? And Octavian – who I think will develop into a flesh-and-blood, three-dimensional character in future volumes – is trying to orient himself in the midst of all of these?

Garth: Ah. This might explain my lack of feeling for Gitney and Goring. I don’t have much of a taste for allegory.

Emily: Though, in truth, I was not so bothered by Goring as a character, allegory aside. Perhaps because I am more at home with the idioms of the eighteenth century, his character did not seem false to me – kind of Tom Jones-y, though a bit more religious. I found his naive idealism appealing and believable. A matter of taste, I think. I can appreciate irony but do not require it in my reading.
Indeed, I have been known – forgive me, Oscar Wilde – to cry uncontrollably when Dickens describes Tiny Tim’s empty stool and crutch leaning against the wall in A Christmas Carol. My occasional problems with sensibility aside, though, I think allegory might be the key here.

Garth: Dr. Trefusis, Octavian’s tutor, was a much more interesting character to me, because his complicated relationship to “the peculiar institution” recalled the Jeffersonian one I sketched in Part 2 of our conversation. Trefusis is hopelessly compromised and complicit, but is not beyond redemption. Indeed, his is the kind of character who necessitates redemption. Likewise Bono, the slave you mentioned earlier, who almost forms a dyad with Dr. Trefusis. His clear-sightedness comes at the cost of his optimism. I suppose I think this kind of muddled moral position has more to teach us, because it’s the one we’re more likely to find ourselves in – beneficiaries of institutions that would bother our consciences, if we allowed ourselves to see them for what they are. But here I’m starting to sound like I’m asking for more didacticism. Perhaps didacticism in literature is a paradox, for the bald didactic “moral” can only teach us so much. It precipitates a gestalt shift; we can only learn it once. Whereas putatively amoral irony and ambiguity constitute an ongoing lesson in what life is like. This is what’s so remarkable about Edward P. Jones’ The Known World, by the way. It has volumes to teach us about how one human being can tolerate owning another. Speaking of character, what did you think of Octavian himself?

Emily: Octavian is hard to grasp, elusive; there is a lack of emotion about him, a lack of self-knowledge that makes him seem almost autistic at times (added to his encyclopedic knowledge of natural and classical history, there’s a bit of a Rain Man effect).
This might have bothered me more if it didn’t remind me of some of Defoe’s best heroes and heroines, particularly Robinson Crusoe, and also of Coetzee’s equally emotionally opaque Foe. With the exceptions of Evidence Goring and Dr. Trefusis, all of the characters in Volume I strike me as emotionally broken and joyless, either by slavery or by a deformed and deforming commitment to a perverse version of rationalism.

Garth: With Octavian, Anderson clearly wants to do something with the idea of scientific observation (in which his protagonist is trained) versus engagement, but Octavian’s tendency to become a transparent eyeball at key dramatic moments made it increasingly difficult for me to get a read on his character. I longed for a dawning complexity befitting the maturity of the language, but Octavian became less plausible to me the older he got. That is, I think I saw what Anderson was up to, but had some trouble suspending disbelief. I would have liked to have seen more of a moral duality in Octavian himself: struggling with his own urge to dominate others, to lash out in violence at weaker characters, to achieve Oedipal one-ness with his mother… you get the picture. Though perhaps the point is that observation versus engagement is itself a moral quandary. I wanted, finally, to see Octavian as a particular human personality, rather than as an Everyman shaped by forces beyond his control. I’m hoping this is what Part II is for…

Emily: My question is whether Octavian can get beyond this broken, stunted, deadened quality in future volumes, and whether such an evolution can be convincing.

Garth: So maybe this is a good point to leave off the discussion. This has been fun, Emily. Maybe we should do it again.

Emily: Indeed!

Bonus Link: A 2008 profile of M.T. Anderson from The Washington Post
A friend who has long since gotten out of the literary scholarship racket was once, briefly, quite intent on writing a dissertation entitled “Parrots, Pirates, and Prostheses.” I have a vague recollection that the argument was to involve something about how pirates seem often to lose hands, legs, and eyes, and that along with their inanimate prosthetics (wooden legs, hooks, eye patches – if, indeed, eye patches count), they also have animate ones like parrots and monkeys. I am not quite sure where this argument was going. There was, however, an excellent plan for the defense of this unwritten dissertation: a parrot, perched on the shoulder of the writer, would declaim the defense.

Though this dissertation (sadly) remains unwritten, it did generate a list of parrot books. Everyone’s favorite genre! Behold:

Flaubert’s A Simple Heart
Kate Chopin’s The Awakening
Daniel Defoe’s Robinson Crusoe
Charles Dickens’s A Christmas Carol (Scrooge recalls Crusoe’s Poll in the first stave)
Julian Barnes’s Flaubert’s Parrot
Virginia Woolf’s The Widow and the Parrot (this fable-like tale has been published as an illustrated children’s book)
Robert Louis Stevenson’s Treasure Island (Cap’n Flint)
Jules Verne’s 20,000 Leagues Under the Sea (parrot hunting!)
Louisa May Alcott’s Little Women (Aunt March has a parrot who tells Laurie, “Go away. No boys allowed here.”)
Gertrude Stein’s “The Good Anna” in Three Lives (briefly features a parrot)
Saki’s story “The Remoulding of Groby Lington”
Jean Rhys’s Wide Sargasso Sea (which features a haunting scene of a parrot on fire)
Willa Cather’s beautiful Shadows on the Rock (Captain Pondaven’s African parrot Coco, who sings songs and drinks brandy in warm water)
Cather’s Death Comes for the Archbishop (at least, I remember vaguely)