
Squaring The Circle

“Any resemblance to actual persons, living or dead, events, or locales is entirely coincidental.”

This boilerplate disclaimer, inserted amidst various other notices on the copyright page of Dave Eggers’ latest novel, is superfluous: nothing in The Circle resembles reality in any way whatsoever.

This book takes a cudgel to the English language, among other ignominies, and, as with all such tragedies, the reader is left with only two options: remain a complicit bystander or stand firm against literary massacre.

I choose the latter. Some books are so terrible that only a review warning away potential readers can absolve one of the guilt and self-loathing that accompany the book’s completion.

The Circle is a disaster. It is, on its face, a cautionary tale of the consequences of over-sharing and voluntary self-surveillance in the digital era, but its concerns are so explicitly belabored, its storytelling so juvenile, its characters so obviously proxies for authorial obsession, that the fictional universe is inevitably compromised in favor of absurdist dystopia.

Here, numbers — and everything else — have no meaning. Eggers tosses them around like grains of sand, wholly detached from any sort of significance. (Case in point: employees of the Circle — a thinly disguised hybrid of Facebook, Google, and Twitter — actually count the grains of sand in the Sahara. It takes three weeks.) In one excruciatingly long paragraph, Eggers channels an Excel spreadsheet by quoting 40 separate numbers in mind-numbing fashion:

The total number of stats she was tracking was only 41. There was her aggregate customer service score, which was at 97. There was her last score, which was 99. There was the average of her pod, which was at 96. There was the number of queries handled that day thus far, 221, and the number of queries handled by that time yesterday, 219, and the number handled by her on average, 220, and by the pod’s other members: 198.

If there were even an inkling of a rationale for this numerical inundation, The Circle could have been at least minimally readable. But even the most uninterested reader cannot match Dave Eggers’ apathy toward his own figures. In an unsurprising oversight, Eggers describes Mae’s “six weeks she’d been transparent” on page 309, then “the three weeks Mae had been transparent” on the subsequent page.

The raison d’être of the Circle — to vacuum up every conceivable data point on its users in order to better serve advertising and personalized content — is clearly borrowed from contemporary social networks. But this is where the similarities end. Eggers’ heroine, Mae Holland, achieves the Herculean task of appearing more inanimate than the Circle’s villainous algorithms, whose alleged ascendance ostensibly prompted Eggers’ hellscape.

Mae is a human being in only the most technical sense: she has eyes, ears, and a mouth, but virtually everything else suggests a quasi-robotic response to all human interaction coupled with a stunning lack of self-awareness. Mae is essentially a drone, only more predictable and less vulnerable to human emotion.

As the Circle demands ever more of her devotion — in one of the book’s rare highlights, she slowly accumulates workstation computer screens, beginning with two and expanding eventually to nine — Mae rarely betrays any semblance of human resistance, choosing instead to drown her peers’ disapproval in a pool of self-loathing.

If that metaphor sounds overwrought, you’ll have a very difficult time completing The Circle. Which brings me to the eponymous company’s “completion,” the Eggersian concept of absolute omniscience that, unfortunately for him, is already comically outdated thanks to Edward Snowden. While Eggers struggles valiantly to elucidate the grave danger of the creeping news feed — a phantom menace that, much like creeping sharia law, dissolves upon closer scrutiny — the nation has moved on to PRISM and XKEYSCORE: apparent mundanities belying great danger, a precise inversion of Eggers’ formula.

That is not to say The Circle isn’t terrifying, although certainly not for the reasons intended by its author. I finished the book fearing less for a grim future of autonomous digital overlords and more for the disappearance of the subjunctive mood: “For a moment, the couple watched as Mae maneuvered her way to their barge…as if this was their living room and she their night’s entertainment.”

Elsewhere: “Mercer took a deep breath, and Mae knew he was about to give a speech. If there was a podium before him, he’d be stepping up to it, removing his papers from his sportcoat pocket.”

And again: “He smiled sympathetically at Mae, but with a raised eyebrow, as if there was something about Mae that was perplexing him, something he couldn’t put his finger on.”

It’s almost as if Eggers were not familiar with the English language. In this, at least, he has his creations for company. Remember that ubiquitous movie scene where the bad guy explains his diabolical plan to the horrified hostages before carrying it out? The Circle is a 491-page version of this, right down to the expositional format and preachy condescension.

In one scene, a Circler — novelistic parlance for an employee of the Circle — explains an on-campus sculpture (designed by a literary Ai Weiwei knockoff) to Mae:

I mean, how can the Circle find a way to make the connection between us and our users stronger? To me it’s incredible that this artist, so far away and from such a different world, expressed what was on the minds of all of us here at the Circle? How to do better, do more, reach further, you know? How do we throw our hands through the screen to get closer to the world and everyone in it?

This doesn’t sound like anyone I know, and I work in online advertising. (The dead giveaway: social networks with customer service departments.) The walking dead in Eggers’ universe are categorically immune to warnings of a totalitarian eradication of privacy — their idealistic naiveté thus constituting, to borrow John Oliver’s phrase, “a straw man so large you could burn it in the desert and hold an annoying festival around it.” (Not to mention the fact that Ai, his celebrity-infused dilettantism notwithstanding, became famous for protesting surveillance, not celebrating it.)

Indeed, events of the past week undermined Eggers’ preening concern. Facebook released a study revealing that it had conducted a one-week experiment more than two years earlier in which approximately 700,000 users were exposed to varying levels of positive and negative posts.

Upon the study’s release, the Internet hordes went wild with speculation and fury. “Facebook and the Ethics of User Manipulation” was one of the kinder headlines. A general consensus coalesced around the idea that involuntary subjection to such an experiment was highly unethical — despite the fact that Facebook’s News Feed is, and has for years been, algorithmically curated based upon criteria that are necessarily highly subjective. Everything on one’s Facebook feed is, to an extent, the result of an experiment.

In short, on many issues we are still closer to much ado about nothing than the other way around. Yet Eggers still inhabits a 1984 world, and his star, Mae Holland, meets an end as self-nullifying as Winston Smith’s: acquiescence to her masters via the betrayal of a lover.

But even in the wake of Snowden’s devastating disclosures, Aldous Huxley’s prophecies ring truer than George Orwell’s. As a social network, the Circle may dull our senses, but it is unlikely to kill us. In fact, Eggers is at his best when conjuring a near-future world in which a frenetic, almost-constant exchange of digital messages — zings, he calls them — drives their senders and receivers into paroxysms of emotional insecurity and self-regret.

This is a society I recognize (as a participant), from the quiet desperation of Like-seeking to the more overt emergence of Internet celebrity as a legitimate vocation. And so I find it truly bizarre that the debate on the vanishing art of the negative book review — recently inflamed by Buzzfeed books editor Isaac Fitzgerald’s categorical disavowal of them — was presaged by Dave Eggers all the way back in the year 2000:

Do not be critics, you people, I beg you. I was a critic and I wish I could take it all back because it came from a smelly and ignorant place in me, and spoke with a voice that was all rage and envy. Do not dismiss a book until you have written one, and do not dismiss a movie until you have made one, and do not dismiss a person until you have met them. It is a fuckload of work to be open-minded and generous and understanding and forgiving and accepting, but Christ, that is what matters. What matters is saying yes.

This is precisely the brand of overly sensitive claptrap Eggers now decries in his novel, many years later: honesty as a casualty of a status-obsessed generation. So do not listen to 2000 Dave Eggers. Go forth, be a critic. Social networks will not destroy you, nor will punishing book reviews.

The same cannot be said of The Circle.

Book Review – Brokers of Deceit: How the U.S. Has Undermined Peace in the Middle East

Rashid Khalidi, Brokers of Deceit: How the U.S. Has Undermined Peace in the Middle East (Beacon Press, 2013)


On Wednesday, Barack Obama will travel to Israel for his first official visit as President of the United States. The day after he arrives, he will deliver a speech to Israeli students at the International Convention Center that is expected to tread conventional ground regarding the peace process while gently reminding his audience that respecting Arab public sentiment on the occupation is a necessary condition for achieving a two-state solution.

Such modest objectives may seem anathema to true believers in Middle Eastern peace. But they are perfectly in keeping with the “peace process” industrial complex portrayed by Palestinian-American historian Rashid Khalidi in his new book, Brokers of Deceit: How the U.S. Has Undermined Peace in the Middle East.

“I want to examine here…the veil that conceals how the policy of the United States toward the Palestine question has actually functioned to exacerbate rather than resolve this problem,” writes Khalidi in his introduction. Central to this disguise is the use of deliberately misleading language that wraps the decades-long stalemate in the ennobling lexicon of progress, before smothering it in the bureaucratic technobabble of “road maps” and “facts on the ground.” (If this sounds familiar, the bloodied remains of innocent drone strike victims have now attained the similarly reverential status of “collateral damage.”) Indeed, the all-encompassing term “peace process,” which Khalidi deems an “Orwellian rubric” obscuring “decades of futile initiatives,” is itself a figment of erstwhile imaginations warped beyond recognition by enough conferences, talks, and accords to fashion world peace several times over.

A question naturally presents itself: why bother with this charade at all? For Khalidi, much of the answer can be found in the goals of the various parties. He defines a successful resolution of the Israeli-Palestinian conflict as one entailing complete Israeli withdrawal from the West Bank and East Jerusalem, a “just resolution” for Palestinian refugees, and national autonomy for the Palestinian people. That all of these outcomes have failed to materialize is a product of Israeli and Palestinian deficiencies, of course. But it is also an indictment of American foreign policy on the subject, which has unfailingly taken Israel’s side as the prospects for peace slide with increasing urgency into history.

The reasons for the American-Israeli two-step and the United States’ consequent inability to end the Israeli-Palestinian conflict are threefold, Khalidi argues. The oil-exporting Gulf states have exerted almost no pressure on the United States over the plight of the Palestinians, domestic politics (especially the overwhelmingly hawkish Israel lobby) has prevented a change in strategy, and American policymakers demonstrate virtually no sympathy for the political and psychological duress of the Palestinians. On this last point, Khalidi quotes Richard Nixon, who in 1973 confided to Henry Kissinger: “You’ve got to give [Arabs] the hope…You’ve got to make them think that there’s some motion; that something is going on; that we’re really doing our best with the Israelis.”

“Doing our best,” it is no surprise to learn, meant something quite different to the Americans than it did to their Palestinian interlocutors. Behind Nixon’s Machiavellian scheming lay a rather simple truth: the domestic constituency for Palestinians was nonexistent, while Israel’s supporters regularly raised an unholy clamor. Forty years later, the Oval Office has occasionally changed hands but the calculation remains maddeningly identical. If anything, the din of the hawks has grown even louder: Khalidi accurately notes that an “increasingly formidable constellation of obstructionist forces” confronted Obama’s every timid attempt at course correction.

A reflection

It feels somehow appropriate that it is here, in the Mission district of San Francisco, that my writer’s block has finally begun to recede. For several weeks now, ever since I typed the last sentence of my fiftieth book review of the year, words have eluded me, the year-long jackhammering of my fingertips replaced by anxious table-tapping. Muddy’s Coffee House, at 1304 Valencia, is proving to be my long-awaited antidote, much as countless cafes and bars within walking distance provided a safe haven for yesteryear’s beatniks and the poets of today.

I am neither beatnik nor poet. I am, however, an Excel whiz: I create sales plans for an online company in New York, and I’m in the Golden State merely on business. But after reading Gregory Dicum’s recent feature in The New York Times, “A Book Lover’s San Francisco,” and hearing a good friend’s boundless enthusiasm for my trip to the West Coast, I decided a sign was a sign. Immediately after completing work today, I pointed my rental car, a Chevy Aveo with all the horsepower of a kitchen blender, in the direction of I-280 and my first-ever foray into the City by the Bay.

Although it has come to an end in San Francisco, mine is a literary journey that began last New Year’s Eve in Hong Kong, as I stood with my girlfriend atop the IFC mall to await the celebratory fireworks. She asked me if I had a New Year’s resolution. I’d always managed to steer clear of such reckless abandon in the past and, in retrospect, I blame the bitterly whipping wind and cacophonic house music emanating from the rooftop bar for my anomalous response: “I want to read fifty books this year.”

What soon followed was a rapidly growing stack of books that started with SuperFreakonomics and ended with Animal Spirits, swallowing over ten months and forty-eight books in between. To keep myself committed, I started a blog and reviewed each book as I read it, praising some, excoriating others, and – when hungry, tired or bored – barely devoting four paragraphs each to the rest. If, as some claim, a year is best measured in books, it seems I’d learned that lesson at long last. Other lessons, however, proved harder to grasp. Among axioms of literature, “reading a book is a journey” springs immediately to mind, a trope as true as it is clichéd. Yet my always-looming year-end goal rendered me the journeying equivalent of the five-year-old in the backseat, wondering, “Are we there yet?”

And so it seemed to me, just as to that precocious (hypothetical) toddler, that I never was. As the year progressed and the initial fever pitch of my reading pace gradually ceded ground to work and procrastination, the practicalities of finding time just as subtly began to assert themselves. I decided, via executive fiat, to start reading shorter books. Cut out the dry non-fiction. Embrace short-story collections. These and other considerations crowded out my personal preferences, sacrificing the lengthy luxury of Jonathan Franzen’s 562-page Freedom and the satisfaction of Tolstoy’s War and Peace in favor of the immutable fifty-book bottom line.

Somewhere along the way, I became aware of the inevitable creeping sensation that my New Year’s resolution had shed its virgin luster. Where before was the refrain “only twenty-five left to go!” there now remained only a sulking “eight left until I’m finally done with this stupid thing.” The blog, too, had become a chore. The whole endeavor was feeling, quite uncomfortably, more and more like school.

This is not to say that the occasional book didn’t capture my imagination. Some certainly did, from Olga Grushin’s surrealist portrait of a declining Soviet Union in The Dream Life of Sukhanov to Michael Lewis’ hilarious recounting of Wall Street’s outsiders in The Big Short to Grégoire Bouillier’s self-psychoanalysis in his endlessly relatable memoir The Mystery Guest, and many more besides. But the act of institutionalizing my reading stripped the written word of one of its most potent weapons: the ability to fully immerse a reader in a world of the author’s creation. With a ticking clock as the omnipresent soundtrack, my suspension of disbelief was relegated to intermittent moments of reading, often lost amongst the more numerous minutes spent fretting over my remaining schedule.

While this may read like a cautionary tale against setting numeric goals for book reading, it’s actually something a little different: a suggestion to aim high but to learn to be satisfied with a less-than-100% success rate. Which is why, even as I celebrated the dissolution of my writer’s block in San Francisco, I suppose I’ll just have to accept that I didn’t finish this essay until now, back in New York.

#50: Animal Spirits

How did John Maynard Keynes know I’m not rational? Or at least, not always rational. According to authors George A. Akerlof and Robert J. Shiller, this is one key precept that vanished somewhere along the line from its initial expression by Keynes to the onset of the Great Recession seventy years later. The duo’s book, Animal Spirits: How Human Psychology Drives the Economy, and Why It Matters for Global Capitalism, is a concise attempt at its revival.

It is now nearly a foregone conclusion that humans act rationally when it comes to economic decisions. So in the aggregate, the macro-economy will reflect thousands and millions of minor judgment calls that, taken together, constitute the long-sought-after equilibrium. The problem with this theory (even if this never seemed to bother its creator, Milton Friedman) is in its idealism. Are human beings rational? To an extent, yes. At other times, “people really are human, that is, possessed of all-too-human animal spirits,” the authors write.

What are these animal spirits, and what do they do? The definition given here is “the thought patterns that animate people’s ideas and feelings.” This sounds suitably vague, which is precisely the point. In the rush to transform economics into a science, overweening economists threw the baby out with the bathwater, discarding the very real enigma of human behavior along with the failed economic theories of prior eras. Akerlof, the 2001 Nobel Prize-winner in economics, and Shiller want nothing more than to reintroduce these animal spirits to the field of economics and the public at large.

But first, a re-branding. What was then “animal spirits” is now studied as “behavioral economics.” The authors propose five psychological aspects of this discipline: confidence, fairness, corruption and bad faith, money illusion, and stories. Each of these plays a unique role within the macro-economy, but not always intuitively. Money illusion, for example, describes what takes place when wage cuts are instituted following a deflationary trend. Even when the decrease in pay is commensurate with the drop in prices, employees usually feel cheated. A perfectly rational decision by an employer thus becomes an object lesson in the existence of money illusion (and influences the employees’ perception of relative fairness as well).
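Money illusion lends itself to a quick arithmetic check. Below is a minimal sketch of the idea — my own illustration with invented numbers, not an example from the book — showing why a nominal pay cut that exactly matches deflation leaves purchasing power untouched, even though it tends to feel like a loss:

```python
# Illustrative only: a 2% nominal wage cut during 2% deflation
# leaves real purchasing power unchanged, yet workers tend to judge
# the cut by the nominal number alone -- money illusion.

old_wage, new_wage = 50_000, 49_000   # nominal annual pay, before and after the cut
old_prices, new_prices = 1.00, 0.98   # price level falls 2%

real_old = old_wage / old_prices      # real wage before: 50,000.0
real_new = new_wage / new_prices      # real wage after:  50,000.0

print(f"Nominal change: {new_wage - old_wage:+,}")      # -1,000
print(f"Real change:    {real_new - real_old:+,.2f}")   # +0.00
```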

This flies in the face of classical economics, in which humans are presumed to be supremely rational. (That such theories persist alongside the ongoing public fascination with the likes of Paris Hilton or, say, the British royal family is its own nifty testament to the inscrutability of the human mind.) So Akerlof and Shiller dutifully document the effects of each of their five factors before launching into eight key questions whose answers only make sense in light of the findings of behavioral economics.

This is an enlightening book, and one made all the more pleasant for its conspicuous lack of angry demagoguery. On a spectrum of bitterness from Joseph Stiglitz to Paul Krugman, the authors of Animal Spirits are clearly more aligned with the former. This is an unexpected reprieve, which understandably lends additional gravitas to their cause. Their case can be summarized thus: don’t buy too literally into the cult of the “invisible hand.” Markets do fail, which is precisely why government regulation (and occasional intervention) is necessary. Of course, with the benefit of hindsight since Animal Spirits was published, it appears their advice — like that of Stiglitz, Krugman, et al — has gone largely unheeded. What comes next is anyone’s guess.

Rescuing the Facebook generation

For the November 25th issue of The New York Review of Books, author Zadie Smith contributed an essay titled “Generation Why?” Ostensibly, the column was a review of Aaron Sorkin’s much-ballyhooed film, The Social Network, but Smith clearly had bigger fish to fry than nerdy billionaires (especially since Sorkin and director David Fincher had already undertaken this task so elegantly themselves).

No, the issue at stake was not Facebook but the “generation” for which it was created and for whom, perhaps, its existence circumscribes theirs. Smith, in attempting to extricate Facebook from its inevitable foundation myths, nevertheless concludes that she will someday “misremember my closeness to Zuckerberg [she, too, was on Harvard’s campus for Facebook’s birth in 2003], in the same spirit that everyone in ‘60s Liverpool met John Lennon.” And yet an acute sense of separation haunts her, as much for its seeming incongruity (Smith is only nine years Mark Zuckerberg’s senior) as for its depth.

“You want to be optimistic about your own generation,” Smith muses, with a touch of nostalgia. “You want to keep pace with them and not to fear what you don’t understand.” She would be wise to heed her own advice. For what she contends in “Generation Why?” – that for the unwashed masses who fancy Facebook, Twitter, et al among life’s requisites, their online reincarnations have themselves become unhinged from, or even superseded, reality – is as emblematic of the anachronisms of the old-guard cohort (whom she affectionately dubs “1.0 people”) as it is a functional indictment of their successors.

The New Yorker’s Susan Orlean stumbles into the same trap, albeit somewhat more amiably. On her blog, “Free Range,” she posits a new hierarchy of friendship: the Social Index, a ranking of relationships by the relative frequencies of online vs. offline contact. “Human relationships used to be easy,” she explains. But “now, thanks to social media, it’s all gone sideways.” Orlean then proceeds to delineate these subtle distinctions: between “the friend you know well” and “the friend you sort of know” and “the friend, or friend-like entity, whom you met initially via Facebook or Twitter or Goodreads or, heaven help us, MySpace,” and so on. Wisely, she keeps the column short and employs a jocular tone, one whose comic value is reaffirmed by her promotion of the Social Index on – where else? – Twitter, using the hashtag #socialindex.

But one can detect a beguiling undercurrent of cynicism beneath Orlean’s evident joviality. What Zadie Smith and Susan Orlean share – in addition to their niche of the “celebrity lifestyle” whose attainment, Smith assures us, is the raison d’être of the Facebook generation – is the creeping suspicion that, despite having reached their career zeniths, they remain excluded from the proverbial “Porcellian Club” of Zuckerberg’s collegiate fantasies. This, then, is a fate to which both they and those they pity are likewise consigned. The irony, of course, is their refusal, or inability, to identify these “People 2.0” as their kindred spirits.

Smith opts instead for the appeal to authority. In this case, that role falls to Jaron Lanier, a “master programmer and virtual reality pioneer.” (Smith, who is 35, quickly reminds us that Lanier, 50, is “not of my generation,” an assertion whose brashness once more belies her commonalities with that perpetually group-conscious underclass of Facebookers.) Quoting extensively from Lanier’s book, You Are Not a Gadget, Smith appropriates the tech-philosopher’s arm’s-length aspect toward technology as her own, spraying the reader with snippets of his wisdom. (In the book’s preface, Lanier cautioned against this Girl Talk-esque brand of mishmash, lamenting that his words would be “scanned, rehashed, and misrepresented by crowds of quick and sloppy readers into wikis and automatically aggregated wireless text message streams.”)

But Smith and Lanier have separately, and preemptively, doomed themselves to contemporary irrelevance by adhering to a retrograde narrative of the modern condition. Their shared nightmare is the narrowing of human existence into unintentionally confined spaces. This process takes place via “lock-in,” a series of inadvertently interacting steps which, taken together, preclude the possibility of reversal or alteration. Such was the case, Lanier argues (and Smith dutifully recounts), in the invention of the MIDI file type, a once cutting-edge format for storing and playing digital music, whose binary limitations preternaturally forced the beautiful infinity of analog melodies into a prepackaged sepulcher of bits and bytes. Once the standard had been formalized, the jig was up: there was no turning back. Music had forever changed, and not necessarily for the better. Lanier unwittingly reformulates – on behalf of the self-described “software idiot” Zadie Smith – these same fears in regard to social media.

These visions of doom are misplaced. One can feel almost viscerally the bored sighs emanating from countless millennials’ diaphragms as Zadie Smith ages before their very eyes: “When a human being becomes a set of data on a website like Facebook, he or she is reduced. Everything shrinks. Individual character. Friendships. Language. Sensibility. In a way it’s a transcendent experience: we lose our bodies, our messy feelings, our desires, our fears.” Such generous hyperbolizing obscures whatever consideration Smith’s fretting may warrant on the margins. If rescuing this Lost Generation is her utmost objective, then her plea for sanity, easily mistaken for groveling, will scatter Zuckerberg’s millions of disciples like so many cards in a two-bit parlor trick.

Notably, Zadie Smith gently ridicules the Facebook era’s emphasis on connectivity, remarking snidely that Zuckerberg “used the word ‘connect’ as believers use the word ‘Jesus,’ as if it were sacred in and of itself.” The quality of those interactions, she worries, is not worth the minimal effort exerted to vivify them. And yet she comes agonizingly close, on multiple occasions, to grasping the essence of this generation that remains simultaneously adjacent to, but seemingly unreachable from, her own. “Watching this movie, even though you know Sorkin wants your disapproval, you can’t help feel a little swell of pride in this 2.0 generation,” Smith concedes. “They’ve spent a decade being berated for not making the right sorts of paintings or novels or music or politics. Turns out the brightest 2.0 kids have been doing something else extraordinary. They’ve been making a world.”

Sound familiar? It should. The specter of John Lennon, the one “that everyone in ’60s Liverpool met,” haunts every word of “Generation Why?”  Even Zadie Smith, for whom Lennon (unlike Lanier) is clearly not a peer, cannot ignore the contemporary relevance of the former’s transformative impact on society. Culture may move more rapidly in the digital era than it did in the 1960s, but its disruptive rhythm has survived largely intact. Rebellion, experimentation, innovation: these are all hallmarks of the creative subculture, as each subsequent breakthrough quickly buries its predecessors. Mark Zuckerberg, then, is the spiritual descendant of John Lennon’s “Imagine.” We are, indeed, all connected (much to Smith’s everlasting surprise).

This is the epiphanic truth that the Facebook generation has uncovered, even if in so doing they remain blissfully unaware of the historical import of their actions. To be sure, their self-absorbed ignorance of a chronology of innovation is itself a product of the ever-shifting nature of modern culture. A generation once encompassed two or three decades; now, an absence of even five years from civilization would reduce the most precocious techie to the countenance of a Luddite. But, somewhat paradoxically (considering her alarm at Facebook’s social impact), Smith digests technology’s ephemeral nature with ease, as she states at the end of her essay: “I can’t imagine life without files but I can just about imagine a time when Facebook will seem as comically obsolete as LiveJournal.”

If this is the case, then what, precisely, is the cause for concern? Conceivably, Zadie Smith, who teaches literature, senses an intellectual fence over which the social media-savvy yet literarily deficient minds of her young charges are unable to vault. Perhaps, for a ponderous writer such as Susan Orlean, who once penned a 282-page paean to orchids, it is a fear of losing her audience to ever-decreasing attention spans. For Jaron Lanier, it may be the horror at a remix culture in which the devolution of works of art into haphazardly scissored segments (à la David Shields’ Reality Hunger) threatens the very nature of public expression. Perhaps Zadie Smith and Susan Orlean and Jaron Lanier and so many others of their age and temperament, finding themselves unable to “keep pace with [the younger generation],” succumb to the all-too-human instinct to “fear what [they] don’t understand.” In short, they face the same challenge that confronted the parents and teachers and writers of the ‘60s generation, fifty years later. They, like Mark Zuckerberg and the hordes of Facebook users who followed him in the quest for digital immortality, face the fear of oblivion.

#49: The Last Utopia

In just a few short weeks, the world will celebrate the sixty-second anniversary of the Universal Declaration of Human Rights. Adopted by the United Nations on December 10th, 1948, the document ushered in an unprecedented era of international rights norms that has since culminated in the prominence of human rights organizations such as Amnesty International and Human Rights Watch.

What Samuel Moyn argues in his book, The Last Utopia: Human Rights in History, is that the thematic line running from the UDHR’s adoption in 1948 through today is misrepresented in the nascent field of human rights studies. Although cemented now as the defining moment that gave human rights its beginning, the Universal Declaration’s appearance was, Moyn insists, “less the annunciation of a new age than a funereal wreath laid on the grave of wartime hopes.”

This is a decidedly irreverent perspective on a movement whose brief and explosive history has (especially in recent years) been lionized as proof of civilization’s continuing evolution. But Moyn is certain that these celebrants of human rights’ march to glory have it all wrong. In fact, he argues, the UDHR was, if anything, more detrimental than it was helpful in facilitating the cause of human rights as it is known today. The UDHR’s adoption “had come at the price of legal enforceability”: by its inability to transcend ancient notions of state sovereignty, the declaration in effect bequeathed to nation-states the power of adjudication over their own adherence to human rights standards. Moyn’s contention revolves around the fact that world leaders in the 1940s were understandably reluctant to cede any jurisdiction to the whims of a supranational institution, notwithstanding (or perhaps directly due to) its supposed impartiality.

I found the author’s thesis compelling at first, as he explicitly delineated the prevailing global consensus of political leaders in the post-World War II era: a strong desire for peace was complemented by a profound wariness of others’ intentions. In such an environment, the idea of subordinating a national legal framework to an international structure — especially one in which the state itself could be held blameworthy — was not an attractive proposition to any elites. And thus was born the Universal Declaration of Human Rights, a document whose noble goals disguised an impotent enforcement mechanism.

But Samuel Moyn’s continued pounding on the heads of his readers quickly grows old. I cannot count the number of times (or the plethora of ways) he tries to convince his readers that today’s edition of human rights bears little resemblance to, or is only a distant relative of, that of the 1940s. “As of 1945,” Moyn writes in one instance, “human rights were already on the way out for the few international lawyers who had made them central in wartime.” Elsewhere: “Instead of turning to history to monumentalize human rights by rooting them deep in the past, it is much better to acknowledge how recent and contingent they really are.” And, “what mattered most of all about the human rights moment of the 1940s, in truth, is not that it happened, but that — like the even deeper past — it had to be reinvented, not merely retrieved, after the fact.”

Virtually nothing is as consistently unsurprising as professorial loquacity. But even among academics, Moyn tests the limits of repetition. His mantra seems to have been: if something is worth writing, it’s worth writing one hundred times. In this regard, then, he has succeeded. Unfortunately, much like human rights themselves for a time, Moyn proves far more adept at defining their history negatively than positively. It is obvious that he considers the UDHR only nominally relevant in jump-starting the human rights movement; what is less transparent is his perspective on its true origins.

Human rights constitute the eponymous last utopia of his book’s title, but Samuel Moyn does little with this concept other than to restate it over and over (just as he does with his repudiations of the movement’s alleged foundation myth). “When the history of human rights acknowledges how recently they came to the world,” Moyn writes, “it focuses not simply on the crisis of the nation-state, but on the collapse of alternative internationalisms — global visions that were powerful for so long in spite of not featuring individual rights.” It was, in a sense, the worldwide disillusionment with grandiose visions of the past that gradually led to the introduction of human rights as a viable alternative. It offered a (facially) moral ideal where before had existed only political ones.

In short, “human rights were born as the last utopia — but one day another may appear.” Aside from brief mentions (and like so much else in The Last Utopia), Samuel Moyn leaves this final speculation largely unaddressed. As to the idea that modern human rights came about due to the Universal Declaration of Human Rights, however: well, that horse has already been beaten quite to death.

#48: Salvation City

Earlier this year I expressed the need to stop reading manifestos. This time it’s dystopias that have drawn my ire: I think I’ll take a break from these too. Salvation City, a novel by Sigrid Nunez, is no duller than some of the other post-apocalyptic books I’ve read in the past few years. It’s also not particularly memorable.

Cole Vining is a thirteen-year-old orphan whose atheist parents died in a flu epidemic. The atheist bit matters, in this case, since so much of the narrative is focused on the conflicting identities of the young protagonist, as the storytelling jumps back and forth in time to pull all the strings together. Following his parents’ death, and after spending time in an orphanage known as Here Be Hope, young Cole was then delivered to the rural Indiana home of Pastor Wyatt and his wife Tracy, in a place called Salvation City.

The kindly clergyman — who, Cole notes ambivalently, “always looks right into the face of the person he is talking to” — and his spouse are devout, fundamentalist Christians, and their peculiar lifestyle is frequently juxtaposed against Cole’s earlier years amid the emotionally fraught relationship of his irreligious parents. In Salvation City — and I refer here both to the book and to the town — the question arises of what exactly constitutes a rescue from tragedy, if not of the very nature of tragedy itself.

For Cole’s mother, Serena, even those neighbors who had opened their doors for assistance, as the flu swept through cities and towns, were deserving of the utmost suspicion: “But they were Jesus freaks, his mother said, and she didn’t want to get involved with them. ‘I mean, these people are actually happy about this catastrophe. They think any day now they’re going to be sucked up to heaven.'” Her twin sister, Addy, in an attempt to reclaim Cole from his new home following Serena’s death, expresses much the same sentiments: “‘These fanatics will use religion to justify anything — especially the ones who believe in the imminent rapture. You do understand, don’t you? That’s what these monsters were counting on? The Messiah was supposed to show up before I did.'”

Cole sees things somewhat differently. As he contemplates looming adulthood (from the wide-eyed vantage point experienced uniquely by young teens) and his adoptive father claims divine guidance in trying to persuade him to stay, Cole wonders: why “didn’t Jesus send a message to him and Addy, too? Wouldn’t that have helped them all?”

Sigrid Nunez leaves many questions such as this one open-ended, a seeming mockery of faith that becomes less flippant upon closer observation. Salvation City dwells on choices and asks, implicitly, the important question of what makes a home. But, as often befalls works of fiction whose circumstances require a great leap of imagination, the elusive answers never seem as important as the author intended them to be, and an apathetic reader is the disappointing consequence.

#47: The Mendacity of Hope

Roger D. Hodge is angry. The Mendacity of Hope: Barack Obama and the Betrayal of American Liberalism, a colorful expression of the author’s outrage at failed objectives and broken promises, begins with a lament that bespeaks profound disappointment in our current president. “Barack Obama came to us with such great promise,” Hodge writes. “He pledged to end the war in Iraq, end torture, close Guantánamo, restore the Constitution, heal our wounds, wash our feet. None of these things has come to pass.”

The Mendacity of Hope has been largely skewered by critics. In a Washington Post review, Alan Wolfe deemed Hodge’s polemic “a sloppily organized, badly argued and deeply reactionary book unlikely to have any influence at all on the way Americans think about their president.” In The New York Times, Jonathan Alter took issue with Hodge’s uncompromising position vis-à-vis the liberal purity of Obama’s policies: “Really?” Alter challenges. “Since when did the tenets of liberalism demand that politics no longer be viewed as the art of the possible?”

What we have seen to date, in the nearly two years since Obama’s inauguration, is a veritable influx of books, articles, essays, and magazine profiles critiquing his policies from the right. But while MSNBC, The Daily Show, and a smattering of other outlets have tweaked the president from the left, a substantive book-length rendering, by a liberal, of the inadequacies of the Obama administration’s policies has been largely nonexistent. This is owing at least as much to institutional inertia (Obama is already the president, and dissent is usually most effective when originating in the opposition) as it is to the fear that airing liberals’ disillusion could actually exacerbate the problem by causing miffed lefties to sit out the midterm elections.

Thus, after devoting much of his airtime, over the past year and a half, to unfavorable comparisons of the Barack Obama of today to the one who campaigned on such “high rhetoric” two years ago, The Daily Show‘s Jon Stewart was downright hospitable when the president appeared on his show on October 27, a mere six days before Election Day. Whether the abrupt change in the host’s demeanor was due to timidity or shrewd political strategy is unclear, but the consequence followed a general trend: outside of some niche circles, President Obama has not been held accountable — in a protracted, thorough manner — by his liberal base.

But there is, I think, another reason that the left has kept largely silent. And that is the admission that, notwithstanding the collectively disaffected state of American liberals, Obama has indeed pushed through some truly formidable legislation. Health care reform, however trimmed-down and neutered its final edition, is still reform, as are financial regulation and other measures. Yes, Obama’s embrace of gay rights has been tepid at best, and his African-American constituency is less than pleased with his reluctance to embrace its plight. There are other grievances as well. But the progressive successes, largely lost amidst a torrent of obstructionism and party-line politics, remain, even as their legacy is overshadowed by perpetual congressional impasse and decreasing approval ratings.

It is this understanding — captured by the axiom “do not allow the perfect to be the enemy of the good” — that has eluded Roger D. Hodge. In railing against “the mundane corruption of our capitalist democracy,” Hodge hammers away at “the obscene intimacy of big corporations and big government.” But his disillusionment is encased within a quixotic fantasy of liberal American governance. To Hodge, the conservative position is, for all intents and purposes, a politically impotent entity in the face of progressive ideology that is properly divorced from moneyed interests.

This is a somewhat absurd conclusion, given the populist (or demagogic, depending on perspective) stirrings that gave birth to the Tea Party and are expected to sweep the Republicans back into power in the House on Tuesday. Fortunately, Hodge’s animus is far more persuasive in his wholesale denunciation of corporate interests’ influence on American politics. Although at times a bit wonky, Hodge nevertheless portrays, with astounding clarity, fund-raising contributions whose origins and scale were strikingly at odds with the Obama brand’s stated philosophy. “The results were impressive,” the author writes. “Against a token candidate who raised a mere $2.8 million, Obama in his Senate race raised $14.9 million — in his first attempt at national office, in a relatively short time, with significant contributions from out-of-state donors such as Goldman Sachs, JPMorgan Chase, and George Soros. Indeed, 32 percent of his contributions came from out of state.”

Contrast this with a 2006 speech Obama made, in which he expressed empathy with Americans for their disgust with “a political process where the vote you cast isn’t as important as the favors you can do” and proclaimed that Americans were “tired of trusting us with their tax dollars when they see them spent on frivolous pet projects and corporate giveaways.” Indeed, Hodge would argue that the president stole from the playbook of former New York governor Mario Cuomo, who famously noted that political candidates “campaign in poetry but have to govern in prose.”

Interestingly, it is Roger D. Hodge’s prose that remains the highlight of The Mendacity of Hope. At times his phraseology perfectly straddles the line between comedy and outrage, as when he deems the doctrine of the “unitary executive” to be “a partial-birth abortion of the Constitution.” Later, decrying the lack of retributive justice for Ronald Reagan’s perceived crimes in relation to the Nicaraguan Sandinista government, Hodge sulkily concludes, “Impeachment would have to await Oval Office fellatio.” Yet however sincere his repulsion for Obama’s gradual backslide from his campaign’s lofty poetry, Roger D. Hodge is doomed to eternal disappointment if his vision for American leadership, as espoused in his book, remains so far removed from the reality of the possible.

#46: Blink

What does Malcolm Gladwell have in common with Glenn Beck, Adam Lambert, Ronald Reagan, Paul Krugman, John Grisham, Nicolas Sarkozy, and Jesus Christ? An uncanny ability to polarize, that’s what. (As for his tendency to invent categories of strange bedfellows, well, he’ll just have to share that dubious distinction with yours truly.) Gladwell and his book, Blink, have evoked praise from writers at The New York Times, The Boston Globe, The Wall Street Journal, Time, and the Associated Press. He has also attracted criticism, sometimes from unlikely corners. Highly regarded Seventh Circuit Court judge Richard Posner dismissed Blink as “a series of loosely connected anecdotes, rich in ‘human interest’ particulars but poor in analysis.” More bitingly, he notes that “one of Gladwell’s themes is that clear thinking can be overwhelmed by irrelevant information, but he revels in the irrelevant.”

Harsh words are these, but one must consider the source. Who appointed Posner the judge of right and wrong? (OK, so Ronald Reagan.) And when’s the last time a casual reader willfully plunged into the dark recesses of a judicial opinion? For all of Posner’s eminent reasonableness, his jurisprudence has the popular appeal of an electrocardiograph. Interestingly enough (or not), just such a transmission is one of the subjects of Malcolm Gladwell’s Blink. “The ECG is far from perfect,” Gladwell informs us, and so are his analogies. But at least in the latter’s case, a quick skimming is still a decently pleasant endeavor and one whose proximate cause is curiosity, not heartburn. Mr. Posner, know thy audience.

This isn’t to say mild discomfort won’t accompany the reading. Blink deals in just the sort of Ripley’s Believe It or Not-esque anecdotes that send us scurrying over to Wikipedia for furious fact-checking even as we wallow in vague notions of gullibility. Like the counterfeit kouros sculpture to which Gladwell’s gaze continually returns, Blink “had a problem. It didn’t look right.” Whether this instinctive skepticism regarding the book’s simplistic reasoning can be attributed to thin-slicing or careful analysis, I know not. I am armed only with an incredulity that the long-term success of a marriage can be diagnosed within fifteen minutes, or that commission-seeking car salesmen discriminate not intentionally but due to the unconscious “kind of biases that many of us carry around in the nether regions of our brains.” And while I can believe that information overload actually reduces our ability to formulate practical solutions, I’m not so certain the answer is to “put screens in the courtroom” to protect defendants — who would remain “in another room entirely, answering questions by e-mail or through the use of an intermediary” — from race-, sex-, and age-based discrimination.

This Gladwellian resort to the logical deus ex machina has rattled many a critical reviewer. It is one thing to remind readers that “a black man [in Illinois] is 57 times more likely to be sent to prison on drug charges than a white man.” It is quite another to mount a defense of this same criminal justice system in the very next paragraph, in which Gladwell elaborates, “I don’t think the car salesmen in the study meant to discriminate against black men…Put a black man inside the criminal justice system and the same thing happens. Justice is supposed to be blind. It isn’t.”

A more generous take on law enforcement may not exist. In fact, while we’re at it, we might as well remind aspiring historians that the Holocaust’s targeted killing of Jews was nothing more than a slight statistical anomaly, and that the Ku Klux Klan’s public disgrace was due entirely to a silly cultural misreading of the burning of crosses on minorities’ front lawns. One would think that, on the occasion of the black-over-white incarceration multiplier reaching double digits, there might be sufficient evidence to suspect systemic abuse. But then, Malcolm Gladwell is nothing if not unsuspecting. In Blink, he argues that what we process in the first two seconds of any given event is often more valuable than the subsequent (and more detailed) analysis. His editors and proofreaders, God bless ’em, appear to have taken his advice quite literally.

#45: Before You Suffocate Your Own Fool Self

Danielle Evans is the kind of author who gives one pause. And this is before one even reads a word she’s written. By the age of twenty-three, Evans had already seen her work reach the glorious light of publication in The Paris Review. Now, three years and a critically acclaimed short-story collection later, Evans teaches literature at American University in Washington, D.C. And, presumably, ends world hunger.

The above-mentioned short-story collection is Before You Suffocate Your Own Fool Self, a title borrowed from “The Bridge Poem,” by Donna Kate Rushin. Shortly after the phrase that gives Evans’ book its title, Rushin’s poem ends with a declaration: “I must be the bridge to nowhere / But my true self / And then / I will be useful.”

Having just finished reading Before You Suffocate Your Own Fool Self, I think “The Bridge Poem” is key to understanding the undercurrent of displacement among African-Americans that permeates Evans’ stories. In “Virgins,” the collection’s first story and the one that landed its young author in the vaunted pages of The Paris Review, a teenage girl vacillates between instinct and adolescent curiosity as she timorously embraces her budding sexuality. It should be noted that, refreshingly, this and the other short stories are remarkably unpretentious, no small feat in this genre. The main character in “Virgins” displays the fledgling snark that marks a phase suffered through by all urban youth — a phase so familiar to readers that it is hard not to grin when she consoles her friend, “The only difference between that girl and the subway…is that everybody in the world hasn’t ridden the subway.”

Underneath such faux-witticisms lies a deep-seated unease with social pressures that make simultaneous, contradictory demands. For Erica, the first-person narrator of “Virgins,” this conflict pits the assertiveness befitting her ascent into adulthood against family-bred perceptions of danger. Crystal struggles to reconcile her fraying ties to her high school best friend with a desire to escape the quiet desperation of a ghetto, in the ironically titled “Robert E. Lee Is Dead.” And in the poignant voice of a military veteran in “Someone Ought to Tell Her There’s Nowhere to Go,” a small lie takes on new shape when the soldier’s daughter becomes a pawn in his grasping plea for recognition and acceptance.

These, and all the stories, are framed delicately on the fringes of white America, as the characters are forced by circumstance into engagement with the Other and yet remain substantively disenfranchised from the majority’s perceived benefits. At one point, betraying a worldly cynicism that belies her youth, a high school student reminds her pal that “white kids do senior pranks. When we try it, they’re called felonies.”

This comment, joined by Evans’ other, far subtler nods to the plight of African-Americans, painfully casts even the banal aspects of Stateside dhimmitude into sharp relief. When, in “Harvest,” an inadvertent pregnancy spawns a tragic debt that cuts across racial lines, the burden of social exclusion is harshly exposed; elsewhere, implication is preferred. Regardless of methodology, however, the subtext of alienation — from country as from family — is a troubling constant. And I expect that its vivid rendering by Danielle Evans will take the author one step closer to something resembling inclusion.