
A brief digression

I’ll be honest: I love to rip on the media. My frustration is neither strictly ideological — although, avid New York Times reader that I am, my jabs tend to come from somewhere around center-right — nor completely random, but this current explosion is admittedly a bit out of left field.

I love Nomar Garciaparra. An All-Star shortstop for the Red Sox and a baseball icon for the youth of Boston from the moment he first stepped onto Fenway’s glistening diamond in 1996 until his contentious last days in the summer of 2004, Number 5 was the king. His obsessive-compulsive batting rituals, mysterious middle name (you mean you didn’t know his first name was Anthony?), and searing line drives were tailor-made for baseball-mad New England. Comparisons with Ted Williams became ever more frequent; in 2000, Nomar flirted with a .400 batting average. He graced the cover of Sports Illustrated, and soon suffered from its notorious curse; the ghost of Al Reyes (who joins Sox fans’ eternal blacklist, along with Grady Little) haunted him and eventually derailed his 2001 season. He was never the same afterward, but still we loved him.

Then came 2004. Or that’s what certain members of Boston’s sports-writing elite would have us believe. In reality, the rapid downward spiral of Nomar’s time in Boston began in the winter of 2003, when rumors were swirling about the possible acquisition of Alex Rodriguez, then the shortstop for the Texas Rangers. The persistence of the public speculation was a slap in the face to Garciaparra, who’d played his entire career with an intensity and vigor that stood in stark contrast to the lackadaisical approach of fellow Sox superstars Pedro Martinez and Manny Ramirez. Nomar ran hard on every play, whether at bat or in the field; his numerous throwing errors were usually the result of attempting spectacular plays that most shortstops would never even have tried.

It was understandable, then, that his attitude heading into the 2004 baseball season, immediately following his seventh full year with the Sox (he was an All-Star in five of those years), was less than amiable. If Nomar had a fault, it was not comprehending the nature of the beast that is the Boston sports media. And no one embodied this vindictive spirit more than Dan Shaughnessy of the Boston Globe. This was the same guy who once criticized Sox outfielder Carl Everett so vociferously that the player famously dubbed him the “curly-haired boyfriend” of Gordon Edes, a fellow (far more talented) baseball writer for the Globe. To this day, members of online Red Sox forums still refer to Shaughnessy derisively as “CHB.”

During the summer of 2004, Shaughnessy and several of his colleagues from Boston media outfits — notably including the then-novel bostondirtdogs.com, which at this moment has a sub-headline that reads “The Nomar Phonyfest Is Now Over, Everyone Go Take a Steaming Hot Shower” — went to work ruining the stellar reputation Garciaparra had nurtured over his long and illustrious career. The coverage launched a vicious cycle: Nomar became more disillusioned with perceptions of him as a lazy and uncommitted player — allegations that, up until that season, were unthinkable — and the media caught on to his frustrations, perpetuating his misery. When he was finally traded just before the deadline in July 2004, his departure was heralded as a welcome end to a burdensome era. Boston’s World Series triumph just three months later — its first in eighty-six years — appeared to lend credence to the view that Nomar had been expendable at best, a serious detriment at worst.

Fast forward six years. Nomar has just announced his retirement, and in a move that prompted a wave of hardball nostalgia for me and thousands of other like-minded fans, signed a one-day minor league contract with the Red Sox. “I’ve always had a recurring dream,” Nomar said, “…to be able to retire in a Red Sox uniform, and thanks to Mr. Henry, Mr. Werner, Mr. Lucchino, and Theo [Epstein] and the Red Sox organization, today I do get to retire, I get to fulfill that dream and retire as a Red Sox.”

Nomar, then, has achieved his dream of retiring with the team, and the city, that has always adored him. In response, the Boston media — and Dan Shaughnessy especially — have taken to excoriating him once again. His crime? Though they’d never admit it, nothing more than a disinclination to engage the media, the very trait that eventually led to the demise of his public image in Boston. Unlike Pedro, who embraced his larger-than-life role in Boston sports, or Manny, who was seemingly oblivious to it all, Nomar was actively uninterested in burnishing his reputation through exclusive interviews and media hobnobbing. This would cost him dearly.

On March 11, Dan Shaughnessy wrote a column which began, “Great player. Total fraud. Welcome home, Nomie.” His unfounded vitriol underscored his own prejudice and, even worse, highlighted his ignorance of that intangible factor that makes baseball so transcendent: the heartfelt connection between a player and his fans. Unlike members of rock bands, or politicians, or any number of other public figures, a hard-nosed and talented baseball player like Nomar Garciaparra has the potential to capture the hearts and minds of millions and remain in their memories for a lifetime. Dan Shaughnessy and his vindictive cohorts will be long gone before the echoes of Nomar Garciaparra’s legendary years in Boston ever fade from the city’s collective consciousness.

Welcome home indeed, Nomar.

#12: You Are Not a Gadget

“The words in this book are written for people, not computers.” So declares Jaron Lanier, in the preface to his self-described “manifesto” on the impending doom of Web 2.0 and its digital companions. In You Are Not a Gadget: A Manifesto, Lanier confronts the brooding technological nightmare with revolutionary fervor, decrying with gusto the horrifying destructive potential of…of…of Wikipedia. In what amounts to an elegy for the creative spirit, Lanier warns against the dangers inherent to the “hive mind” by lashing out against humanity’s self-imposed subjugation to technology.

Let’s be fair here. Lanier seems like a smart enough guy, even if his choice of hairstyle — he appears on the book’s flap in a thinker’s pose, with his dreadlocks running past chest level and on to the great beyond — is more suited to an aspiring grunge artist than an Internet visionary. Fittingly, then, he actually enjoys playing the oud and even frequents an online forum that serves as a virtual community for the instrument’s fan base. Of the forum, he says, “There’s a bit of a feeling of paradise about it. You can feel each participant’s passion for the instrument, and we help one another become more intense.”

Indeed, Lanier’s intensity — his passion for rescuing the individual voices from the clutches of impersonal cyberspace — is to be admired, even if the object of his rigor is perplexing. His thesis, that the digital era’s explosion has created ways of thinking about and interacting with technology that portend disaster down the road, is not particularly convincing. And while he could never be accused of boring his readers, one could easily charge him with alarmism.

The author ably explains the dangers of “lock-in,” the process by which an arbitrary digital convention — organizing computer data into virtual files and folders, using MIDI as the industry standard for digital music representation, and so on — becomes so ingrained in culture and thought that it is nearly impossible to reverse. What Lanier never quite explains, however, is just why certain accepted practices, most notably the open-source movement and crowd-sourcing, are so malignant. Technology’s purpose, he lectures, is to adapt to and serve human beings; he worries that the sudden and widespread advent of the Internet has produced the opposite: we have become willingly subservient to machines, adapting to their whimsies instead of demanding tools that do not require a degradation of human intelligence.

It is in this vein that he alludes to Wikipedia, a site he admits to using himself but whose implicit founding principle — the more contributors, the more closely we approach truth — he derides with vivacity. “The ‘wisdom of crowds’ effect should be thought of as a tool,” Lanier writes. “The value of a tool is in accomplishing a task. The point should never be the glorification of the tool…There’s an odd lack of curiosity about the limits of crowd wisdom.” He has a point, but not much of one. It is true, for example, that, as Lanier notes, most breakthroughs in modern technology have been delivered under the auspices of for-profit corporations (e.g., Microsoft Windows, the iPod, the digital camera). And that such innovations are sorely lacking in the domain of open-sourcers is cause for reflection, although not necessarily concern.

However, what the author consistently misses (or perhaps chooses to ignore) is the innate ingenuity of human beings, regardless of the tools they are given. In a section discussing the impact of the file-sharing era on musicians, Lanier writes, “If we choose to pry culture away from capitalism while the rest of life is still capitalistic, culture will become a slum.” Above all, he is concerned with our collective loss of free spirit, but he fails to notice, for example, the consistent ability of the young to bypass and defeat ever more stringent regulations by those in the business of enforcing digital rights management. First, there was Napster; after being brought low, it emerged as a legal, paid music service. File-sharing clients sprouted up one after the other, with new entrants following quickly on the heels of those brought to an end via litigation. Even Radiohead’s novel idea of giving away music for free, which, Lanier writes, does not “fill me with hope for the future,” is actually proof that people are continuing to exhibit an entrepreneurial spirit by devising new and inventive solutions to existing problems. These are not the products of unqualified and inexpert crowds, but the brainchildren of creative, ambitious individuals. Jaron Lanier may not be a Luddite, but his dire warnings of future doom are a bit anachronistic. I can only wonder what he’d think of the iPad.

Just for fun…

…I’m throwing in an essay I recently submitted to some newspapers and the National Enquirer. (Just kidding, but it is actually in contention for a Pulitzer this year, for which it must eternally thank John Edwards.) Disclaimer: I have no particularly headstrong feelings for or against Google Buzz or any other social networking sites. (For more information, please message me, poke me, and invite me to a ninja battle on Facebook.)

Anyway.

* * * *

I’m an amateur Google Buzz user. On occasion, in the few days since its inception, I’ve been known to post a pithy little message from work, or share a link to a humorous news story. In short, I’m fairly representative of the hundreds of millions of people who regularly use social networking tools to stay in touch. Thus, when it became clear that Google’s Buzz team had not given privacy concerns enough consideration, I was among the legions of the disaffected, and I joined countless others in denouncing the search giant for the flaws in its ill-conceived product. Most concerning to many of us was the fact that the service, by default, allowed others to view our contacts, which could only be prevented by manually opting out.

However, unlike much of the online community, I am reluctant to extend my criticism of Google beyond this specific faux pas into the gray area of online privacy as a principle. In fact, a healthy chunk of the vitriol directed toward Google seems disingenuous, if not hypocritical.

As a loosely knit collective, we Web users – and especially those of us active in the social networking sphere – have, in effect, subscribed to an alternate society. Ours is a community in which shared friends, values, and interests precipitate shared information. When a group of college buddies goes skiing over Christmas break, the vacation is immediately virtually enshrined as a Facebook photo album. A food critic alerts the world about a restaurant’s new dish via Twitter. An up-and-coming band builds a fan base by posting music on MySpace, long before it signs a record deal.

And that’s just counting self-submissions. Ever since increasing bandwidth and cheaper storage spawned the explosion of Picasa, YouTube, and a variety of similar services, even those of us who’ve eschewed online schmoozing have been subject to frequent invasions of privacy. These violations range from the trivial – an embarrassing picture of someone asleep on a train – to the truly significant, as when job-seekers are denied a position due to pictures others have posted of them after one (or two, or three) too many drinks.

That such practices persist is a testament to the ubiquity of resources designed for the purpose of self-expression. Much has been made of the so-called digital era, a period that has seen, among other things, the blink-or-you’ll-miss-it rise and fall of Internet celebrities, individuals whose entire narrative arcs have been circumscribed by the medium that created them. As an enormous online neighborhood, we have smiled along with adoring parents who film their crying babies for posterity, perused blogs for gossip on public figures, scorned those caught in disgraceful acts, and even learned how to surreptitiously keep up with the Joneses without the hassle of actual communication.

And in so doing, we have embraced an implicit standard in which opting out, rather than in, must be chosen manually if we wish to safeguard our real-life identities. Yes, Google overstepped a boundary when it revealed its users’ online relationships via Buzz, but by joining the service we had voluntarily sacrificed a shroud of protective anonymity anyway. We had stepped into the circle of shared experience and then recoiled in horror when the net effect was – shudder – to bring us closer together.

After all the negative press Buzz received, Google promised immediate improvements, none more welcome than the switch from auto-follow to auto-suggest, a change requiring users to explicitly select (opt in to) those whom they wish to follow, rather than having Google pick for them. I welcome this change. There are many legitimate reasons for restricting the compromise of personal identity on the Internet, from safety concerns to simple preference. But while we rightly concern ourselves with the proliferation of our lives’ digital fragments, it may be that we are fighting a losing battle to standardize an opt-in default in a community from which we refuse to opt out.

#11: The Curious Incident of the Dog in the Night-Time

I have just learned something: there does not seem to be a character limit on blog post titles. If for no other reason, British author Mark Haddon’s novel, The Curious Incident of the Dog in the Night-Time, has been helpful in this regard. (In the spirit of generosity, I will also attest to Haddon’s wisdom in neglecting to include a subtitle.)

Fortunately, the virtues of this work extend far beyond its function as a test of software capabilities, as Haddon lends a powerful voice to a mentally disabled boy living in England. At once a mystery novel and a compelling journey into the mind of a singular child, The Curious Incident is savvy enough to allow the reader wide room for interpretation while still relating an extraordinarily accessible tale (and in the first person, no less). Christopher John Francis Boone is an autistic child living with his father in Swindon. (In an interview, the author preferred to define the protagonist’s condition somewhat more ambiguously, stating that “there is a very true sense in which there is something more wrong with the people around Christopher than with him.” A bit trite, yes, but not without validity.)

Over the course of the story, it becomes obvious that — surprise, surprise — not all is as it seems in Christopher-land. And yes, it is as if he inhabits an alternate world, likely one of his own making or, at the very least, bounded by the despotic tics that frequently shut down both his mental and physical faculties. When composed, however, Christopher is something of a math prodigy, and his embrace of sheer logic (however illogical his idea of rationality may be) is as heartwarming as it is frightening. In a rather characteristic example, Christopher muses on the difference between his memory and the imagination of his peers: “Other people have pictures in their heads, too,” he writes. “But they are different because the pictures in my head are all pictures of things which really happened. But other people have pictures in their heads of things which aren’t real and didn’t happen.”

Like many children, Christopher is often funny without intending to be, but his limited mental agility renders his many evaluations of his experiences extremely affecting. It is not what he takes note of, but what he does not, that conveys the beauty and sadness of a life lived within the constricting walls of mental illness. And in a strange way, Mark Haddon’s seemingly clichéd remark about Christopher is spot-on: the childlike gravitation towards (il)logical deduction provides a far more elegant form of commentary than do the voices of those surrounding him. Christopher concludes his journey secure in the knowledge that “[he] can do anything.” Readers will have little trouble believing that, but Mark Haddon’s challenge with his next novel will be attempting to live up to Christopher’s high standards.

#10: Notes from the Cracked Ceiling

Anne E. Kornblut, a White House reporter for the Washington Post, is impatient to see a woman in the White House — and not another First Lady, either. Her book, Notes from the Cracked Ceiling: Hillary Clinton, Sarah Palin, and What It Will Take for a Woman to Win, is easy (yet purposeful) reading. But lest her novelistic tone deceive you, let it be clear that her views on the necessity of recruiting more female political candidates are never in question. Having personally followed both of the aforementioned candidates during their campaigns, Kornblut has seen firsthand the unique abuse heaped upon female candidates. In her introduction, she argues that Clinton and Palin “may not have lost because they were women…but their sex played an outsize role in the year’s events.” She then closes that section with the observation that “the glass ceiling may be cracked…but it is far from broken.”

What, then, is keeping women from breaking through that glass? History is an obvious culprit, but Kornblut is disinclined to let the present off the hook so easily. More specifically, she faults the candidates and their large teams of handlers, who often waged behind-the-scenes battles over their candidates’ public self-portrayal. Should Hillary exude toughness, or feminine restraint? How about a combination of the two? Would it help if her daughter, Chelsea, campaigned along with her? In one potent example of poor decision-making, Kornblut details the various Christmas commercials the presidential candidates aired in December 2007. While Obama focused on his home and family, Clinton devoted her airtime to wrapping Christmas presents with labels such as “universal health care” and “bring troops home.” “It was hard,” Kornblut wryly notes, “to quit being tough.”

Of course, Hillary Clinton eventually lost the Democratic nomination, but not without some help from the national media. Was their constant bombardment indicative of sexism, or simply a reaction to the Clinton camp’s preexisting ambivalence towards the press corps? Kornblut seems to think there was some of both, but the mass public’s embrace of some of the more vicious ad hominem attacks on Clinton lends credence to allegations that it was more the former than the latter.

Clinton’s demise was soon overshadowed by the meteoric rise of Sarah Palin, the governor of Alaska. Kornblut does an admirable job retracing Palin’s time on the campaign trail, especially in noting how quickly the high praise was overtaken by vitriolic condemnation. And while it is true that public commentary on Palin soon reflected sexist undertones, Kornblut at times seems unable to completely separate these attacks from the legitimate criticisms, most prominent among them Palin’s shaky grasp of even basic domestic and foreign policy issues and her disastrous performances in network interviews. That Palin became a favorite target of the Democratic base was undeniable, but that this was largely due to her gender is much less apparent.

Furthermore, Kornblut missed a golden opportunity to delve deeper into one of the more fascinating subplots of Palin’s candidacy — namely, that of her role within the historical feminist movement. Traditionally, feminists were assumed to adhere to more liberal ideology, which in its most common incarnation usually included a pro-choice stance and a general alignment with the Democratic Party. So when Palin, a mother of five with strong pro-life views, became the vice presidential nominee, it seemed almost as if the modern feminist movement had reached a fork in the road. Kornblut had noted earlier how many women in their twenties had voted for Obama over Clinton in the Democratic primaries, confident in their belief that voting based on competence and ideology over gender politics epitomized a more authentic form of gender equality. With Palin, older models of feminism were once again being strained: was Palin’s candidacy, given her conservative views (especially on abortion), a betrayal of feminist ideals, or was it reflective of a new wave of female ascendancy representing all points on the political spectrum?

Kornblut gives this tension a brief nod when she notes that “if Clinton had epitomized the feminist movement’s dream, Palin was in many ways its worst nightmare.” Entire volumes could be written on this subject, and given that Kornblut’s book was ostensibly intended to ask these and similar questions, it is disappointing that she devoted only a few pages to Palin’s role within feminism. Similarly glaring in its absence was any discussion of female minority voters who faced the difficult and historic choice between Barack Obama and Hillary Clinton in the Democratic presidential primaries. The question of which identity holds strongest — race or gender — was ignored in Kornblut’s analysis, a surprising omission in an election for which identity took center stage.

Towards the end of the book, Kornblut contrasts the American political experience for women with that of other countries. The comparison is not flattering to the United States. For Kornblut, however, the upside to the disappointment of two women narrowly losing out in the 2008 elections is that countless lessons can be taken from their failures — shortcomings that were as much the fault of their advisers, the media, and an unpredictable electorate as they were of the candidates themselves. With shrewd recruitment and well-planned campaigns, women will continue to challenge the gender status quo in politics. It remains to be seen when this will happen, but the shattering of the glass ceiling is long overdue.

#9: The Unnamed

About halfway through reading The Unnamed by Joshua Ferris, I found myself perusing its review online at the New York Times. Jay McInerney was less than glowing in his evaluation, deeming Ferris’ first novel, Then We Came to the End, a “masterly debut,” before lamenting that “it’s difficult to believe that ‘The Unnamed’ and ‘Then We Came to the End’ come from the same laptop.” The review concludes on a wistful note, with McInerney willing the author to “return to the kind of thing at which he excels.”

So then, perhaps he’d like a sequel? It is true that The Unnamed marks a sharp departure from Then We Came to the End, which was a highly comical yet ultimately shallow plunge into office hijinks and melodrama. (In fact, Ferris’ first book is probably a closer — and slightly older — cousin to Jonathan Tropper’s This Is Where I Leave You, which arrived on bookshelves late last summer, than it is to The Unnamed. Both books awkwardly mingle frivolity with heavier matters of the soul, with many passages leaving readers laughing yet unsure of whether that was an appropriate response to, say, the protagonist sleeping with his brother’s wife. I’ve seen Adam Sandler movies with more emotional verve.) But these differences are hardly a knock on Ferris’ progression as an author. In fact, while I was contemplating buying The Unnamed on Amazon.com, I noticed that the book’s page featured a video conversation between Ferris and David Sedaris. At the time, this meeting of the minds seemed apt, but the congruence faded once I had finished The Unnamed.

Unlike Jay McInerney, I do not find it unthinkable that Joshua Ferris’ two novels share the same author. In both books he displays his keenness for irony and wit, and in both books his characters seem ever so slightly unbelievable, even while their antics compel you — inevitably and without hesitation — to keep turning the pages. In the case of The Unnamed, the main character is Tim Farnsworth, a partner at a prestigious Manhattan law firm. Farnsworth has a mysterious condition: at times and without warning, he starts walking. And doesn’t stop. Or at least not for several hours, until his body gives way and the enigmatic force propelling him forward suddenly yields its mastery over his limbs. By the time he finally regains control over his forward motion, he is overtaken by an otherworldly slumber and often finds himself in unlikely places, such as crumpled in a heap by the East River, or even somewhere in New Jersey (which, I’ve learned, lies far more distant in the mind of a self-respecting Manhattanite than the actual geography would suggest).

Tim’s wife, Jane, has been his stalwart ally throughout his ordeals, which, as the story opens, have surfaced for the third time. While desperate for a cure, in his darker moments Tim knows he would almost be content just to find someone else with the same affliction, as vindication, proof that his is a purely physical aberration and not reflective of mental vulnerability. In despair, Tim tells his wife, “I’m the only one, Jane. No one else on record. That’s crazy.” However, the couple’s daughter, Becka, a maladjusted teenager with delicate weight issues, is skeptical of her father’s illness. In one exchange with her mother, she asks, “Have you ever Googled it? Google it and see what comes up.” “Google what?” Jane asks. “Exactly,” Becka replies, and it is immediately clear that Ferris has his finger on the pulse of filial dynamics.

Read simply, The Unnamed is a compelling love story — not in the traditional sense, but in an arguably purer form. There is nothing remotely sexy or alluring about Jane’s tireless efforts to rescue her husband (more from himself than from his illness), nor are Tim’s attempts to break free from his family to prevent their self-destruction at all representative of popular romantic themes. As a family, the Farnsworths are failures in many respects — Tim’s illness persists, Jane succumbs to alcoholism, and even Becka resigns herself to living with the body she has. Disappointment permeates every part of their lives, yet there is always the potential for a miracle, a reversal; and it is this paradox that characterizes their predicament. Joshua Ferris has combined his talent for lively dialogue and quirky characters and infused his narrative with a profound emotional depth and complexity that was simply not present in Then We Came to the End. That earlier novel claimed the hearts of legions of new fans, and The Unnamed has since broken them. Given the ease with which Ferris has already transported us through these two distinct worlds, it seems safe to expect more pleasant surprises down the road.

#8: Freefall

“As the United States entered the first Gulf War in 1990, General Colin Powell articulated what came to be called the Powell Doctrine, one element of which included attacking with decisive force. There should be something analogous in economics, perhaps the Krugman-Stiglitz doctrine.”

Yes, Joseph Stiglitz, the author of Freefall: America, Free Markets, and the Sinking of the World Economy, has a fan. This ardent devotee is not, as one might suspect, a fellow academic scrawling her mark of approval onto the book’s cover, nor a book reviewer writing for a newspaper or magazine. It’s not even Paul Krugman, although presumably he too has fallen victim to the spell of his fellow Nobel laureate.

No, the fan is Joseph Stiglitz himself, the author of both the book Freefall and the above quote, found in its second chapter. And as self-aggrandizing as he can be — he joins the long list of economists, politicians, and pundits who vociferously trumpet their early predictions of the current financial crisis — his words are bolstered by an undeniably credible résumé. As the former chairman of President Clinton’s Council of Economic Advisers, the former senior vice president and chief economist of the World Bank, and the 2001 Nobel Prize winner in economics, Stiglitz has combined his enviable pedigree as a top-notch economist with the political savvy gained through many years in the halls of power.

In the course of reading Freefall, it soon becomes abundantly clear that Stiglitz is not especially fond of deregulation. However, in a departure from the current American zeitgeist, he does not embrace populist rhetoric or condemn bankers unduly for their greed. (In writing this last sentence, I vacillated between enclosing greed in quotes or not; either choice seems equally prejudiced, so I arbitrarily chose not to.) “Bankers acted greedily because they had incentives and opportunities to do so, and that is what has to be changed,” Stiglitz writes. “Besides, the basis of capitalism is the pursuit of profit: should we blame the bankers for doing (perhaps a little better) what everyone in the market economy is supposed to be doing?”

This is an interesting question, and one that is not normally asked in today’s politically charged environment. And yet Stiglitz is just about the furthest thing from an apologist for the banking industry. Responding to central bankers’ claims that allowing inflation hurts those with low incomes, Stiglitz deadpans, “One should be suspicious when one hears bankers take up the cause of the poor.” Elsewhere, he states that “there is an obvious solution to the too-big-to-fail banks: break them up. If they are too big to fail, they are too big to exist.”

Obviously, large-scale problems in the financial sector led to the collapse of the markets and the economy at large, but Freefall is not content to stop at causes. The responses by both the Bush and Obama administrations come under heavy fire too: the former for failing to recognize the severity of the crisis or mount a coherent rescue, and the latter for choosing the politically safest responses (tellingly, the author dubs this the “muddling through” approach). A key problem, if Stiglitz is to be believed, is the misalignment of private and social benefits. When banking executives’ compensation is based upon short-term stock price gains instead of long-term profitability, when regulators and top government officials at the Federal Reserve and the Treasury turn a blind eye to the mounting risks in the housing bubble to avoid slowing perceived economic growth, when financial innovations that produce high fees and low efficiency are encouraged instead of fined or prohibited, eventually there will be hell to pay, and we as taxpayers will be the ones paying it.

Indeed, this is exactly what we’re doing right now. Regardless of one’s feelings about Stiglitz’s policy prescriptions — some of which, not unlike those of his earlier book, Making Globalization Work, appear more grounded in political idealism than in reality — the fact remains that it has fallen to the taxpaying public to bear the risk created by the masterminds of Big Finance’s increasingly complex securities and derivatives. To Stiglitz, this is ample reason to hit the reset button on the American financial industry — or perhaps more accurately, the reformat button. His vision is of a world of free markets, yes, but not completely unfettered and left to their own whimsies.

Instead, President Stiglitz would beef up the regulatory framework: ensuring that banks’ leverage ratios do not stray too high, that conflicts of interest (such as banks running their own real estate appraisal subdivisions) cannot occur, that predatory lending is prohibited (or at least heavily restricted), etc. Furthermore, Keynesian economics would experience a renaissance. (Stiglitz has little patience with the Chicago school, which he finds too theoretical and based on fallacious assumptions anyway. In one of the author’s weakest moments, he shamelessly deconstructs a straw man only vaguely resembling actual conservative ideology.) A global reserve currency would be created, similar (but not identical) to the International Monetary Fund’s Special Drawing Rights (SDR), to prevent the contagion of a worldwide crisis started by one currency’s downward spiral.

By the time one has finished this book, it seems that there is not much to look forward to in Joseph Stiglitz’s version of world events. He sees a financial market in disarray, being slowly rebuilt by the same hands that led to its destruction and heading inevitably toward another round of the same shortsightedness followed by more devastation. This is a hard pill to swallow, but it sheds light on why Joseph Stiglitz chose to write this book so soon after the financial earthquake. An undesirable future can still be prevented, and we are ideally positioned to start again from the rubble. By the time the economy begins showing serious signs of recovery, all resolve to change course will have evaporated. And so the gods of irony may be leaving us a silver lining after all in this prolonged economic massacre: the longer we suffer from the effects of past miscalculations and neglect, the more time we have to formulate a new, healthy, and safe framework to avoid a recurrence.

#7: Me Talk Pretty One Day

Without much in the way of proof, I submit that Me Talk Pretty One Day is best enjoyed under the influence of serious narcotics. This is an admittedly uncertain proposal and one I have failed to test firsthand, but really not so harebrained upon deeper reflection. David Sedaris, the “author” of this “book,” appeared to be in just such a state for the entirety of its writing. (I enclose “author” and “book” in quotes because I’m not convinced either moniker really describes its respective object.)

Where do I get this idea? Perhaps from his track record. “After a few months in my parents’ basement, I took an apartment near the state university, where I discovered both crystal methamphetamine and conceptual art,” Sedaris muses. “Either one of these things is dangerous, but in combination they have the potential to destroy entire civilizations.” Later, a chapter begins with the simple declaration, “I’m thinking of making a little jacket for my clock radio.” In the chapter entitled “I Almost Saw This Girl Get Killed” (situated toward the end of Part Deux, directly succeeding Part One), a bemused Sedaris living in France grapples with the idiocy of an event organizer coordinating a show in which young men taunt an enraged cow. “I’m willing to bet that he had some outstanding drug connections,” the author deadpans. “How else could a person come up with this stuff?” Twenty bucks says readers will speak similarly of David Sedaris.

In fact, it is hard to say with any certainty which parts of this book are true and which are figments of Sedaris’ hyperactive imagination. To this end, clues may be found in the chapter “The Late Show,” which consists of various autobiographical fantasies involving saving the world from cancer and bestowing youthful features upon everyone but the ruthless editors of fashion magazines. (“Here are people who have spent their lives promoting youthful beauty, making everyone over the age of thirty feel like an open sore. Now, too late, they’ll attempt to promote liver spots as the season’s most sophisticated accessory. ‘Old is the new young,’ they’ll say, but nobody will listen to them.”) But Me Talk Pretty One Day is as concerned with its own veracity as Animal Farm is with mutinous livestock. To debate its accuracy is meaningless; the point lies decidedly elsewhere.

This memoir, if the genre can stomach this latest addition to its ranks, embraces black humor with a strange ease, as Sedaris channels Robert Downey, Jr.’s Harry Lockhart in Kiss Kiss Bang Bang. In short, Me Talk Pretty One Day is clearly more style than substance. Or is it? The author’s sardonic send-ups of everything from Americans traveling abroad to the laughable pretension at art exhibitions are riddled with jolting allusions to a less comic reality. After concluding his lengthy digression into juvenile daydreams of worldly super-stardom while living in Paris, Sedaris quietly notes that all of his fantasies revolve around impressing only fellow Americans. “…It doesn’t interest me to manipulate the French. I’m not keyed into their value system. Because they are not my people, their imagined praise or condemnation means nothing to me. Paris, it seems, is where I’ve come to dream about America.” Such words arrive unexpectedly, sandwiched as they are between a longing for an affair with President Clinton and a story of the author’s father ingesting a hat.

It is in these same contrasting tones of irony and sobriety that Sedaris tackles his first spells with drugs and the displacement he felt as he coped with his sexual identity during a conventional childhood. Self-pity is never entertained, and the self-deprecation never lets up. His writing prompts sudden, inappropriate laughter as well as eyebrows scrunched together in perplexity. Both reactions feel natural, given the text. In the strange and beautiful world of David Sedaris, Me Talk Pretty One Day probably makes some sense. Fortunately for the rest of us to whom it does not, he doesn’t seem to mind much either way.

#6: Family Album

There are so many things one could say about Penelope Lively’s Family Album. (For one, it has nothing to do with the book of the same title by Danielle Steel.) Here, I will quote a few: “a haunting new novel” (Dominique Brown, New York Times); “another winning demonstration of [Lively’s] wit” (Ron Charles, Washington Post); “one of her most impressive works” (Joanna Briscoe, The Guardian).

To this could be added “thoroughly underwhelming,” or — perhaps less generously — “a meandering tale lacking a protagonist, an antagonist, a plot, a progression, character development, and, while we’re at it, a point.” Completing the journey that is reading a book generally elicits some degree of self-satisfaction at the literary accomplishment; at the conclusion of Family Album, the feeling was something closer to relief.

To be fair, the story isn’t awful, just repetitive and needlessly preoccupied with trifles. (Yes, trifles. If you’re neither familiar with nor amused by English idioms, you’ve one more reason to cross this novel off the reading list. On the other hand, Lively appears to have appropriated a decent portion of vocabulary words from GRE prep courses. This would seem rather jejune if not for her literary fecundity.)

In a genuine attempt to cut the author some slack, I frequently reminded myself that there is much — everything? — that I simply do not know about the intricacies of English middle-class existence. (The term “Edwardian” is bandied about with alarming frequency, for example.) If that is the extent of it, then I apologize to Lively’s loyal readers across the pond and respectfully retreat to lighter American fare. Perhaps Danielle Steel? The characters populating her Family Album are said to “face the greatest challenges and harshest test a family can endure, to emerge stronger, bound forever by loyalty and love.” But then, those words were written by her publisher; and besides, as guilty pleasures go, I remain unwaveringly yours, John Grisham.

But I find it unlikely that cultural ignorance alone can explain the yawning gap between Family Album‘s aspirations and its reality. Maybe familial experience, then? I have as many siblings as Alison Harper has children (six), and perhaps that’s just the problem: none of these dark, festering secrets and tensions strike me as extraordinary, or imbued with any larger meaning. Loud, rambunctious dinner conversations cut short by an ill-timed outburst? Self-imposed emotional detachment from the less pleasurable aspects of childhood? Par for the course, methinks. (Doesn’t everyone do that?)

And now I’m starting to sound like Gina, the second child who, in an email to her siblings, agrees with her older brother that “all families screwed up, more or less.” I just wish Penelope Lively’s editor had kindly informed her of the same. Even the looming family secret, revealed midway through the book, is a letdown, almost a cliché as these things go, and both central and irrelevant to the story at the same time. Making matters worse is the grating redundancy; each sibling marvels, in a never-ending revolving door of memories, at how the formative years stubbornly retain their familiarity while growing increasingly foreign. The children themselves, from infancy through adulthood, are too numerous to animate with believable personalities, and so become terribly one-dimensional. Sandra can do nothing other than shop for clothes and look elegant. Paul must always drink heavily and display utter disregard for social etiquette. Clare just dances, and that is all. Even the interweaving style with which Lively travels through time and space to indulge her characters’ collective nostalgia is arbitrary, with just enough proximity to Kazuo Ishiguro’s similar tendencies to bring him to mind while silently reprimanding her for trying on his shoes.

There are, disappointments notwithstanding, some highlights amidst the unimpressive remainder. Strewn among the unremarkable hiccups of nostalgia are poignant touches that strike a chord with anyone who has grown up, left home, and returned, astonished at the changes. “Goodness,” Katie exclaims in an email to her brother, Roger. “A married Gina, who’d have thought it.” Similarly, towards the end, as Alison recounts the glory days of her motherhood at Allersmead, it would require an inhuman imperviousness to pain for the tragedy of her existence not to weigh heavily on the spirit of the reader. (And once again, specters of Ishiguro’s The Remains of the Day haunt Alison’s pitifully denialist closing reflections.) It’s just that the characters themselves seem to cope more serenely — and authentically so — with their upbringing than their creator does, and that, generally speaking, should not be the case. Chalk it up to big-family cynicism, but this is one family album I won’t be flipping through again any time soon.

#5: The Disappeared

Read it and weep. Literally. The Disappeared is a quick, meaningful punch to the gut. In 228 short pages, author Kim Echlin wastes not a word or phrase in this despairing depiction of love and loss in war-torn Cambodia. Spanning decades and continents, from the dingy blues clubs of Montreal to the killing fields outside Phnom Penh, Anne Greves weaves a mournful path of despondency and courage as she follows her lover into the darkest recesses of human depravity.

Almost immediately upon opening this book, I knew I was going to enjoy it. Of course, “enjoy” is perhaps an inappropriate term given the subject. But a book’s value is measured not in tidy narratives so much as in its ability to immerse readers wholly in the world of its characters’ lives. This holds true even when the dialogue between characters is intentionally dreamlike, as if the protagonist’s memory has decayed and dissolved over time, leaving only mystical moments where reality once breathed.

Strangely, I couldn’t escape a familiar feeling for the first several chapters: the author’s literary style reminded me of something else I’d read. Then it suddenly occurred to me: The English Patient. “The light in Mau’s eyes was a pinprick through black paper,” Echlin writes of Anne’s first meeting with a new friend. “…I chose him because when he stepped forward, the others fell back…The light of his eyes twisted into mine.” One entire chapter reads: “I can still see a particle of dust hanging in a sunbeam near your cheek as you slept.” Soon enough, however, it becomes all too clear that The Disappeared resembles Michael Ondaatje’s masterpiece in little beyond its descriptive syntax. This is not dream-sequence-turned-real; it’s a living nightmare, stretched and tortured into over thirty years of searching and loving and waiting and finding and searching all over again.

It is impossible not to empathize with Anne. Her naivete, her persistent belief in a justice, or karma, that will transform wrong into right, is as admirable as it is devastating. When she asks of her captor, “How can people move on without knowing what happens to their families? How can they move on without truth?” we want to laugh at her simplicity even as we cry for her faith in humanity. It is her ever-burning fire that ignites this story and affords us all the rare opportunity, if only for a moment, to believe again alongside her.