When a new thing enters our lives there is a tendency to describe it in its most positive terms. As the internet has become a central part of daily life, we’ve gazed on it with optimism, deciphering its essence from its outward effects. Political fundraising has accelerated, search engine algorithms deliver us our daily facts, and our webs of friendship have mushroomed. In our grandest moments it seems like a step forward, making us smarter, better informed, and more socially connected. In reality, the great aggregation project has shown us a vision of our world that is increasingly nonsensical. If there is an enduring truth revealed by the internet, it is that the world only seems to make sense when you filter most of it out. Without that filter, our histories, assumptions, and beliefs drift apart in an unordered soup of digital equivalencies.

David Thorne’s The Internet is a Playground: Irreverent Correspondences of an Evil Online Genius is a celebration of the non sequitur, proof that all things can be simultaneously true when given the cover of the internet. Thorne is the author of the website 27bslash6.com, and Playground is an edited assembly of the site’s most noteworthy entries. The pieces are either email exchanges between Thorne and various unsuspecting correspondents (e.g. a bossy colleague, the police department, someone inquiring about his for-sale car) or absurdist essays about someone irritating in Thorne’s daily life. Thorne’s most famous piece concerns an exchange with Shannon, the receptionist at the branding firm where he works as a designer. She emails him for help making a “Missing” poster for her cat Missy. The subsequent exchange exploits all of the intangible elements of communication that are absent from online interaction. Shannon’s only instructions are to make a poster using an attached picture of Missy.
Though the subtext immediately suggests the model of a “Missing” poster we have all seen in numberless variations, Thorne takes advantage of the lack of more specific instructions to indulge himself. The resulting image is a dramatic poster styled after a movie one-sheet, the title “Missing Missy” posed in block letters against a white background while Missy’s image has been shrunken and set cryptically into the lower right corner. “It’s a design thing,” Thorne wrote when Shannon replied in confusion. “The cat is lost in the negative space.” The poster goes through several ludicrous iterations before finally arriving at a workable compromise, the formulaic contact information and wording in place while leaving Missy in a red top hat.

Later, Thorne imagines a week-long diary for Thomas, the firm’s creative director. “Have just ordered a new MacBook Pro because my current one is almost six months old and I cannot be expected to play Solitaire at these speeds.” In another passage Thorne imagines the romantic needs of an obese, flat-topped man called Barry. “I am available and looking for that special woman. She has to enjoy never leaving the house, cleaning me with a damp cloth, and experiencing the beauty of a baby’s smile,” Thorne writes. “I placed an ad in the singles’ columns that simply read: ‘Woman wanted.’ I felt it would be superficial to include that she must be athletic and named Candy. I will screen them when they call.” In these sections there is a cruelty to Thorne’s writing, which, taken in the context of his playground metaphor, is adolescent. Or rather, his merciless wit comes wrapped in a satirical padding that points to a resigned adult somewhere inside, piloting a ship of adolescent foolishness. Thorne is at his sharpest when addressing angry readers of his site who’ve written to him in outrage over an earlier post. “I have read your website and it is obviously[sic] that your a foggot[sic],” writes one George Lewis in a terse bit of reader feedback.
“Thank you for your email,” Thorne replies. “While I have no idea what a ‘foggot’ is I will assume it is a term of endearment and appreciate your taking time out from calculating launch trajectories or removing temporal lobe tumors to contact me with such. I have attached a signed photo as per your request.” Thorne uses politesse to mock his correspondent’s intelligence while drawing him deeper into the exchange by making assumptions about an issue the correspondent has been vague about. It’s a standard comic setup of two people operating under completely different understandings, but it also exploits how much of our meaning is subtextual. Moreover, we have adapted our behaviors to suit the text-only nature of internet communication and now have automated responses to figurative language.

“Foggot” is, of course, a misspelled slur, but its etymology reveals a murky history that makes its modern sense as a pejorative absurd. The root word meant a bundle of sticks for kindling. In pre-Enlightenment Europe, older women, who were often given the task of tending the fires on the hearth, were colloquially called “faggots” after the sticks they collected. This tradition survived well into the 20th century, most notably in Ulysses, when Joyce refers to Mrs. Riordan as “that old faggot.” The word was also sometimes used to describe soldiers as something disposable, and it was occasionally applied to women who were burned at the stake, equating their very bodies with kindling--and it’s worth noting this cruelest sense of the word probably isn’t applicable to homosexuals as, even in countries where their sexual orientation was a capital offense, the prescribed method of execution was hanging and not immolation. In the 20th century, it came to be used more broadly to describe menial jobs and, later, a sort of patronage system that existed in British schools wherein an upperclassman would appropriate an underclassman to run errands.
Yet somehow the word came to be among the most offensive things you might say to a stranger, both a slur against his identity and a derogation of the sexual preferences of a long-abused minority. This is not what the word means, but it’s what we take it to mean. In this way the most irrational of our suspicions can eventually be made true, conjured into existence by the masses. Thorne, of course, knows exactly what George Lewis intends by the word “foggot.” And yet he builds an elaborate interpretation that becomes a non sequitur because of the gulf between the word’s meaning and the cruel intention of the person using it. Because so many of the non-verbal indicators of intentionality are unavailable online, this bizarrely aggressive yet incoherent language has become the standard.

In another entry, Thorne imagines the experiences of an older man who thinks “the internet is rubbish.” After dismissing Google, eBay, and email as wastes of time, Thorne’s curmudgeon ends with a recounting of his experience on /b/, the random subject area of the internet message board 4chan. “I spent a good hour on this site and still have no idea what it is for. All I could work out is that I am apparently a ‘newfag’ and cannot ‘triforce’ but am unsure as to why I would need to triforce in the first place. I asked some of the people on there for their advice regarding triforcing, but the only answer I seemed to get was ‘nigger.’” Thorne’s responses when confronted with this sort of internet belligerence are no more coherent, but they do offer a counterbalance of optimism and goodwill. In every exchange, Thorne matches his correspondent’s worst-case assumptions with surprising generosity and openness. He responds to insults with thanks and by sharing personal anecdotes and curiosities. In so doing, Thorne exhausts the internet fury of his counterparts.
Whereas they have taken the initiative to contact him for the sole purpose of belittlement and scorn, his persistent replies leave the other person without the energy or ability to continue the exchange. Thus the refrain of Thorne’s correspondences involves the aggressor asking his presumed victim, “Please don’t email me again,” to which Thorne always answers “OK.”

These types of surreal exchanges are perfectly suited for consumption on the internet, each one requiring only a few minutes of investment to run its comically distracting course. In book form, the pieces have a permanence that amplifies the absurdity, making them feel like evidence of the internet’s effect on the human personality. For centuries our recorded documents have striven to present a serious face which, even in moments of satiric exaggeration, evoked a social or emotional condition of some poignancy. Either by self-discipline or editorial rigor, people cleansed their recorded thoughts of the irrationality that must surely have been a part of the human intellect then as it is now. The internet has eradicated self-discipline and editorial rigor, and the result is that the ordered expressions of generations past have become tiny rhetorical islands flooded by an ocean of human absurdity.

The last century has been one obsessed with productivity and social progress. Since the Industrial Revolution there has been strong evidence to argue that life is getting better. Infant mortality, life expectancy, relative poverty, and economic efficiency have all improved at an accelerating pace. This points to an irresistible narrative view of history: that the record of our existence on earth can be viewed in terms of linear progress made possible by new technologies. In its own way, Thorne’s Playground offers an absurd counter to this narrative, showing that for all of the improvements technology can bring in one area, there are many others in which it impoverishes us relative to what came before.
A more productive life is not necessarily a more meaningful life. Between the lines of Thorne’s jabs is a recurring interest in space, time travel, the Large Hadron Collider, and the limits of our understanding of the material universe. We live in an era of amazement, when science and cosmology have so tantalizingly opened the door onto a new set of questions about our perceptions of the world and the rules that govern it. This era is likewise an era of remarkable stupidity, with bureaucracies built to enforce late fees on DVDs that exceed the value of the original several times over and school boards that conspire with churches to show children religious theater during school hours. The benefits of progress come with the price of being repeatedly confronted with our own essential absurdity. For Thorne, the only appropriate response to these intrusions on whatever order we might imagine for ourselves in isolation is encapsulated in his book’s final words, ironically supplied by one of his unbidden correspondents. “Fuck off.”
No one knows why we have brains. We take for granted the brain’s associated functions—emotion, contemplation, spatial awareness, memory—and yet the reason some life on earth has a brain and other life doesn’t is an unanswerable question. Daniel Wolpert, a professor of neuroscience at the University of Cambridge, theorizes that the fundamental purpose of our brains is to govern movement, something necessary to humans but which trees and flowers can live without.

Joshua Foer’s Moonwalking with Einstein: The Art and Science of Remembering Everything is a brief and pithy recounting of Foer’s exploration of the fuzzy borders of his brain—a marveling at how and why it’s able to do something quite unexpected. Foer is a science writer and enthusiast of curiosities who’s worked for Discover, Slate, The New York Times, The Washington Post, and Esquire. Moonwalking with Einstein is a chronicle of his year training to compete in the U.S. Memory Championships—an arcane competition among adherents to the method of loci, an ancient memory technique that makes it possible to retain great volumes of random information. According to the theory, more commonly known as the Memory Palace, the human brain is capable of retaining huge amounts of information subconsciously. Details about color, texture, light, smell, and spatial arrangement are all absorbed in an instant, whether or not we’re aware of it. But we lose all the less immediate information, even when we want to remember: telephone numbers disappear, faces lose their names, and the year of the Mexican-American War is irretrievable. According to the method of the Memory Palace, first formulated by the Greek poet Simonides, hard-to-retain facts can be pinned in place by transforming them into visual icons in an imagined location. Each fact becomes a representative image--the more bizarre and lascivious the better. These images are then placed in a childhood home or a college dormitory, any intimately remembered location.
In this way memory becomes a process of traveling through a non-sequitur mental landscape instead of flailing for disappearing facts.

In mid-2005 Foer was a struggling young writer living in his parents’ house in suburban Washington, D.C., trying to make a living as a freelancer. After a chance visit to the Weightlifting Hall of Fame, Foer started wondering if there was a Hall of Fame for smart people. Some cursory searches led him to the U.S. Memory Championships, where a small and eccentric group of mental athletes compete at memorizing long strings of two-digit numbers, the order of cards in a deck, and matching 99 faces and names after five minutes of exposure. Ed Cooke, a confident young British competitor with a roving imagination, tells Foer that these seemingly impressive feats are within anyone’s grasp. Even Foer could become a competitor. Foer decides to test the theory and accepts Cooke’s tutelage.

As he begins his training routine, picking locations for his own memory palaces and building a network of imagery to associate with various playing cards and number combinations, Foer also intersperses a survey of the brain’s biology and some of its strangest outliers. He starts with the Greeks, who considered memorization an essential part of human learning. “The great oral works transmitted a shared cultural heritage held in common not on bookshelves, but in brains,” Foer writes. One literally internalized philosophers’ arguments, histories, and poems. Knowledge didn’t come through exposure but through rumination and the concise mastery born of recall. Today we know where to look for answers, but the Greeks carried the answers within, as instantly recallable memories. The extent to which we’ve delegated the workings of memory to Google prompts a scary question about our culture. “What we’ve gained is indisputable. But what have we traded away?
What does it mean that we’ve lost our memory?” It’s a sensational question, and Foer--wittingly or not--proves our memories are mostly constant: there isn’t actually a dramatic difference in mental capacity between memory champions and everyone else. Before beginning his memory training Foer visits K. Anders Ericsson, a professor and researcher at Florida State University’s Department of Psychology. Foer describes Ericsson as the “expert expert,” best known for his theory that it takes 10,000 hours of practice to become an expert in any field, an idea Malcolm Gladwell helped popularize in Outliers. Ericsson and his aides spend three days studying Foer before his training, and again a year later, after he has set the U.S. record by memorizing the order of a deck of cards in 1 minute and 42 seconds. While Foer’s ability to recall numbers has more than doubled, his functional memory—the everyday process that’s not given the luxury of palaces and non-sequitur burlesques—remains largely identical. In fact, after a dinner in downtown D.C. with friends to celebrate his achievement, Foer took the subway home without a second thought. Only upon getting back to his parents’ house did he remember that he’d driven to the dinner and left his car parked downtown.

One of the overarching questions in Moonwalking with Einstein, then, is not whether we’ve been impoverished by the fading of memory techniques, but rather why memory masters don’t seem to gain any tangible benefits from leading their field. Indeed, Foer describes a few people with brains predisposed to having powerful memories in dysfunctional terms. There is S., a Russian journalist in the late 1920s who never took notes in editorial meetings and still remembered addresses, names, and instructions perfectly.
He was the subject of a seminal neuropsychological study on the brain and memory, and yet he had trouble holding a job and exhibited many of the same traits that would later be ascribed to autistic savants. Then there is Kim Peek, the Utah man who memorized phonebooks and was the inspiration for the movie Rain Man. Peek’s memory didn’t need the rigorous training and discipline practiced by mental athletes. And yet he required a caretaker (his father) all his life and never held a job or moved beyond the thrill of memorizing town populations and mountain elevations.

Foer acknowledges the perversity required to take a normally functioning memory and force it to work more like Peek’s or S.’s. At one point he has to stop using the image of his grandmother in his card-memorizing routine because the vulgar actions he subjects her to are too disturbing. Cooke similarly excised his mother from his practice, preferring instead celebrities and sports figures who can be contorted, defiled, and penetrated without rippling any darker waters. In order to memorize faster and in greater volume, one has to push one’s brain to the outer limits of incoherence. To create a record of external order the memorizer must make a non-sequitur carnival of their inner orders, connecting a five of clubs to the image of Dom DeLuise karate-kicking Pope Benedict XVI, or a queen of spades to Rhea Perlman anally penetrating ex-NBA star Manute Bol. What rescues these discrepant fantasies is the tie to a rather dull system of real-world meanings, which might not have been worth remembering in any case. What’s most interesting about Foer’s book is not its value as an idea exploration—he well documents how the Memory Palace has already been exploited by salesmen and self-promoters—but the kernel of a confession about his own life.
Foer describes Moonwalking with Einstein as participatory journalism, but he never gets very far in describing who he is and what lies beneath the ordered surface of his account: a grown man living with his parents, trying to make a career out of writing stories about the country’s largest popped corn kernel, while privately carrying on a year-long project of memorizing random number strings aided by a pair of blackout goggles. Foer writes in a conversational but distant vernacular, like someone telling a curious story at a cocktail party while talking around the less entertaining truths below the surface. He describes Perlman’s and Bol’s encounter as a “highly explicit (and in this case, anatomically improbable) two-digit act of congress.” It’s belabored for comic effect, but the obfuscation deadens the image itself, scandalizing something that is a natural product of Foer’s creativity. In this regard, Moonwalking with Einstein fits handily in line with the recent tradition of “big idea” books that take a breezy survey of scientific inquiry and discover some general truisms. In place of George Plimpton’s lyrical self-awareness it has Gladwell’s impersonal concision and Steven Johnson’s sense of portent, without quite proving anything.

Given enough time, all science writing—no matter how casually or clinically it is presented—winds up being wrong. Likewise, any work of participatory journalism that finds the undertaking more interesting than the author is bound for obscurity. What endures is the record of the human experience, not the best scientific explanations our generation—and ones past—could come up with. Foer is moving all around some of the most personal ideas in human experience--the intersection of the erotic imagination, nostalgia, lust for new experiences, and the tiny electrical impulses that accompany them. When Foer wonders whether the loss of poetic immersion once common in antiquity is debilitating today, I immediately think of Lolita.
“I am thinking of aurochs and angels, the secret of durable pigments, prophetic sonnets, the refuge of art. And this is the only immortality you and I may share, my Lolita.” As with Wolpert’s theory that the brain exists to govern movement, Foer describes the idea that everything we understand in the world is built on the recorded images of the past. This is how we can all have such different experiences of fixed historical events—the election of a president or two hours in a movie theater. This is why both the places that form the locus of memory, and the ghostly signifiers that populate them, are unique to the holder of the memory--always a childhood home or school. And yet all we are ever doing is moving from one place to the other, creating muscle memory for a neuron to send out an electrical pilgrim from one place to another.

When I criticize Foer for being impersonal, it is a product of my own confessional instincts. My own writing is the kind of memoiristic turning of the embers that has become a cliché in an age of blogs and self-published novels by thrift shop dilettantes, who seek to prove themselves by bending the non-sequitur memory into something sensible: an image that will survive with or without its associated deck of cards. In the same way that science writing winds up being wrong one way or another, few of my own scraps of memory have been true. There is always some detail wrong. I once described an ex-girlfriend with black hair. “I have brown hair,” she wrote me after reading it. I’d moved across the country for her, so hair color was a painful thing to get wrong. Likewise, the details of my childhood, travels, career, who was there during big events in my life—these details are all less there than I think. So too with Foer’s mnemonic Greeks, who remembered The Odyssey in broad strokes but varied the details and line orders while still believing they had it syllable for syllable.
When I try to pull a specific image through the blurred depth of field that time sets in between, I find the need to invent something becomes instinctual, almost thoughtless. This is the spirit moving through Foer’s book, the Albert Einstein who moonwalks down an empty suburban hallway—a figure that never was, now a memory that can’t be erased.
Problems are matters of faith. They need to be believed in before they can be solved, and their solutions are always shaped by the ways the problems are defined. In this way, Jane McGonigal’s Reality is Broken: Why Games Make Us Better and How They Can Change the World founders as soon as it begins. McGonigal is a researcher at the Institute for the Future and one of the most accomplished alternate reality game designers in the world. Her book offers a solution to a problem that she doesn’t really define and that probably doesn’t exist.

McGonigal’s book is an evangelical pamphlet. It doesn’t discover a new problem so much as it insists that optimism is our destiny and that amusement, through the power of interactive systems, will accelerate it. “Today’s best games help us realistically believe in our chances for success,” McGonigal declares, and so by using their essential structure for more socially beneficial purposes than killing orcs and grenade-bombing aliens we might one day “change the world.” McGonigal enumerates reality’s broken parts in a laundry list of suffering: obesity, global warming, starvation, poverty, the loneliness of the elderly, attention deficit disorder, and the modern rise of clinical depression. In short, games will fix everything.

It’s often argued that video games are a new medium with exciting possibilities, an inheritor of the advancements of the written word and moving image that preceded them. Arguing that video games themselves constitute a medium is easy, but it’s an inaccurate belief, one that warps many of McGonigal’s arguments. To McGonigal, games offer us “intense, optimistic engagement with the world around us.” They are defined by four essential traits: a goal, rules, a feedback system, and voluntary participation. These are a decent reduction of the new medium, but they better describe interactive systems than they do video games.
There is indeed a new medium afoot, but it’s one that includes Google, World of Warcraft, Mario, and Microsoft Office. The medium is interactive system design, and video games are its non-productive, emotional face. In the way that movies occupy the same medium as training videos while serving a dramatically different purpose, video games are formed from the same essential elements as Excel. Both have a goal, respond to user input, place limits on recognizable input, and can be taken or left. This might seem like a small distinction but it’s an essential one that confounds almost every argument in McGonigal’s book. “Games aren’t leading to the downfall of human civilization,” McGonigal writes. “They’re leading us to its reinvention.” Replacing “games” with “interactive systems” makes this claim more tenable, allowing the possibility to include Wikipedia, Groupon, Facebook, LexisNexis, Google’s cloud of services, and the emotional vivacity of Tetris, Ico, and Wii Sports. “Games don’t distract us from our real lives,” McGonigal claims. “They fill our real lives: with positive emotions, positive activity, positive experiences, and positive strengths.” This might also be true if we accept games as the artistic subset of interactive systems, but it should then be expanded beyond the scope of the “positive.” All art and emotional expression is catastrophically hamstrung when limited only to the positive and optimistic.

To McGonigal’s credit, Reality is Broken is filled with specific examples of games modeled around the desire to change the world for the better. She describes SuperBetter, an alternate reality game she invented after suffering a concussion in 2009. McGonigal’s recovery period was long and drawn out, lasting several months, during which she was instructed to avoid exertion, refrain from work, and spend as much time as possible resting. For a woman with a proactive disposition, this state of living quickly became torturous.
Rather than wallow in self-pity, McGonigal decided to make a game of her recovery. She constructed a secret identity, and then identified friends, family members, and people in her neighborhood who could play a role in her recovery. She listed all of the negative behaviors that would slow her recovery (e.g. caffeine, vigorous exercise, email, work anxiety). Then she defined specific actions that could contribute to her recovery and assigned them points so that she could feel like she was making quantifiable progress (cuddling with her dog, listening to podcasts, excursions to the department store to smell different perfumes). Without this game-like structure, McGonigal’s recovery was slow and painful, but within the motivational bounds of points, objectives, and identifiable “enemies” she was able to accelerate her recovery dramatically.

There are many similar examples. McGonigal describes a game that encourages social exchange between older people in retirement homes and younger people, one that assigns and tracks points for completing household chores, and one that encourages players to conduct their lives as if there were an oil shortage. Examples like these--small in scale and benefiting from a sense of goodwill among their participants--do indeed seem to have a positive impact on their subjects. But if a game is built to solve a real-world problem, is it still really a game? The trouble with McGonigal’s work is that it creates a framework for simplifying human experience into quantifiable objectives and positive rewards that make sense in a system but less so in “reality.” This is not a new criticism. Jaron Lanier, the author and one-time Web 2.0 developer, has long warned against designing systems that exploit human richness for simplified systemic objectives. Lanier describes a process of “lock-in,” whereby an idea must be taken as a fact or hard rule when placed in a software system.
“Lock-in removes the ideas that do not fit into the winning digital representation scheme, but it also reduces or narrows the ideas it immortalizes, by cutting away the unfathomable penumbra of meaning that distinguishes a word in a natural language from a command in a computer program,” Lanier wrote in You Are Not a Gadget: A Manifesto. McGonigal veers into this territory when she talks about using games to make people happier by giving them discrete game-like objectives in their daily lives. This can create the impression of positive behavior in the short term, but it also leads to long-term indifference and a diminution of the innumerable elements that naturally motivate us. In Bright-Sided: How Positive Thinking Is Undermining America, Barbara Ehrenreich argues this kind of positive thinking “requires deliberate self-deception, including a constant effort to repress or block out unpleasant possibility and ‘negative’ thoughts.” Reading McGonigal’s description of how the most successful games define their fail-states with gratuitous exaggeration to protect players from the idea of failure (e.g. there’s still a humorous payoff when you lose), it’s hard not to see her optimism in these terms: a willful construct to force out naturally occurring forces to the contrary. This is less a reconciliation with reality and more a filtering of it into one that’s easiest for humans to process. How this makes us “better” is unclear.

McGonigal presses further in the closing third of her book with an argument that games can change the developing world for the better. Her most recent game is EVOKE, another alternate reality game, commissioned by the World Bank as a way of positively affecting development in Africa and “other parts of the world.” The game is a loose network for brainstorming, with a game-like allocation of points for contribution and a fictional story to add urgency.
EVOKE has led to some interesting work, including a pilot program for sustainable farming in a South African community, a project to convert glass boats into solar-powered boats in Jordan, and a communal library that requires users to contribute a piece of information for every book they check out. While these ideas all sound promising, the history of international development is littered with optimistic ideas that sound fine in the abstract. In practice, however, it’s entirely unclear why EVOKE is a game and not just a philanthropic network for people with Global Giving projects.

I lived in Madagascar for two years, from 2003 to 2005, working as a health educator with the Peace Corps. The United Nations Development Programme had a program to build wells in the most remote villages in the arid southern part of the country where I lived. The project was simple, had an easily identifiable goal, and was well funded by a willing community of international donors. The wells were installed and the locals enjoyed clean and easy access to drinking water for several years. Then the pumps began to break. The UNDP had no local presence; the closest office was 1,000 miles to the north. The locals lacked the parts to fix the pumps on their own because the pumps had been manufactured in Europe. And so they went back to their old way of life, fetching water from shallow rivers and the pools of rainwater that collected after the rainy season. It’s easy to imagine how projects like this might be accelerated by game-like structure, but it’s hard to imagine the feeling of purposeful happiness making them last any longer. After my two years in Madagascar, I found that the things I wanted to work on in my community—HIV education, birth control access, convincing my neighbors to start farming tomatoes instead of only cassava—were greeted with indifference. I struggled to explain these issues in dire terms.
But how do you convince someone to care about HIV when their language doesn’t have a word for blood cell? How do you convince someone to grow tomatoes when it would cost them four months’ income to build a wooden fence big enough to keep the pigs, goats, and chickens out of their garden? It’s true we have an imperfect experience of the world we live in. We struggle and fail. We tend towards dissolution over a lifetime—to understand less at the end than at the beginning. In this way, Reality is Broken is a product of the lingering adolescence of video games, a forceful assertion of general goodwill and ambition that will never seem more possible than in the salad years, when the medium is still unburdened by the scar tissue of failure.

Reality is Broken grew out of a short rant McGonigal was asked to deliver at the 2008 Game Developers Conference. The rant was inspired by a piece of graffiti McGonigal had seen in Berkeley, a sad phrase scrawled onto a sticker plastered on a wall: “I’m not good at life.” This is the emotional core of every point argued in McGonigal’s book. It’s a work of solidarity and overwhelming empathy with everyone struggling in the world and yet no one person in particular. One of the rants that followed McGonigal’s in 2008 was by Jon Mak, a Canadian game designer and musician. Mak chose not to talk at all. Instead he started playing ambient dance music over a boombox and hopped off the dais. He ran around the conference room throwing balloons into the audience. Each balloon had an irrational message written on it, a non-sequitur snippet that teased the human tendency to search for meaning even when there is none. Without instruction, the audience began hitting the balloons into the air, passing them back and forth like beach balls.
I suppose it’s possible that, in the collected time and energy spent passing those meaningless balloons back and forth, we might instead have pooled our energies to build something, maybe a system to feed the homeless people wandering the sidewalks below us. But we wouldn’t have been playing at that point, and I suspect most of the people in the room would have lost interest. Which is a good reminder that there remains a vague but real difference between play and work. I finished McGonigal’s book convinced it’s a distinction worth keeping.