About Jay Pinho

Jay is a data journalist and political junkie. He currently writes about domestic politics, foreign affairs, and journalism and continues to make painstakingly slow progress in amateur photography. He would very much like you to check out SCOTUSMap.com and SCOTUSSearch.com if you have the chance.

#23: The Living Constitution

I was first made aware of David A. Strauss’ The Living Constitution via a Stanley Fish column on the New York Times web site. Titled “Why Bother With the Constitution?,” Fish’s blog post for May 10 dovetailed Supreme Court nominee Elena Kagan’s impending confirmation process with the fundamental constitutional questions raised by Strauss in his 139-page book.

Professor Fish’s reaction to The Living Constitution is best described as righteous indignation. To some of Strauss’ statements he retorts, “This is simply wrong.” To others, with considerable consternation, “This is an amazing statement.” Towards the end he proclaims that “the incoherence of what Strauss is urging is spectacularly displayed in a single sentence. Given the importance of common ground, ‘it makes sense,’ he says, ‘to adhere to the text even while disregarding the framers’ intentions.'”

So what exactly is going on here? Clearly something that Strauss is advocating, or even simply implying, is rather disagreeable to Mr. Fish. The former’s thesis is described (on the book’s front flap) as a defense of “the living Constitution…a common law approach to the Constitution, rooted in the written document but also based on precedent. Each generation has contributed precedents that guide and confine judicial rulings, yet allow us to meet the demands of today, not force us to follow the commands of the long-dead Founders.” Or, as Stanley Fish would have it, “Why is Strauss trying to take the Constitution out of the constitutional interpretation loop? Because he wants to liberate us from it as a constraint.”

Not exactly. I don’t get the impression Strauss intended to relegate the written Constitution to window dressing. Nevertheless, Fish is correct in noting that The Living Constitution makes some bold claims as to the document’s role in contemporary jurisprudence. In large part, the book is a crusade against “originalism,” the judicial philosophy espoused most visibly by Supreme Court justices Antonin Scalia and Clarence Thomas. According to the originalist line of thinking as delineated by Strauss, “when we give meanings to the words of the Constitution, we should use the meanings that the people who adopted those constitutional provisions would have assigned…It is impermissible — it’s a kind of cheating, really — to take the words of the Constitution and give those words a meaning that differs from the understandings of the people who were responsible for including those words in the Constitution in the first place.”

The obvious counterpoint is, of course, the question of what to do in the majority of scenarios in which the Founding Fathers set forth no explicit guidelines (what exactly constitutes “cruel and unusual punishment?”), could not possibly have foreseen the issues (privacy on the Internet), or espoused views that are no longer acceptable in modern society (slavery). Acknowledging these obstacles, Strauss contends that they render originalism useless as a judicial philosophy. (In a section he headlined “The Originalists’ America,” Strauss remarks that “racial segregation of public schools would be constitutional,” “the government would be free to discriminate against women,” “the Bill of Rights would not apply to the states,” and so on.)

In titling his book The Living Constitution, he follows a long (albeit controversial, like nearly everything else related to jurisprudence) tradition of adhering to a more flexible view of the Constitution. But then he takes matters a step further. “It is the unusual case,” Strauss notes, “in which the original understandings get much attention.” In Strauss’ estimation, not only is the Constitution necessarily mutable to fit the needs of a dynamic society, in a way it is actually irrelevant to modern “constitutional” law itself. This is because of what he terms the “common law” approach: historically, “the law was a particular set of customs, and it emerged in the way that customs often emerge in a society…It can develop over time, not at a single moment; it can be the evolutionary product of many people, in many generations.”

In contemporary American law, this series of ever-shifting customs takes the form of precedent. Past judges’ rulings are considered the foundation upon which future verdicts are rendered; thus, Strauss claims, this methodology avoids both the impracticalities of originalism and the dangers of judicial overreach inherent to the dominant view of “living Constitutionalism,” in which activist judges are free to bend the law to their liking at will.

In reality, however, what the author deems an alternative approach to mainstream modes of thought is not entirely groundbreaking. At its simplest, Strauss’ thesis is simply a reassurance that living Constitutionalism works, that it does restrain judges from arbitrary decision-making. Where it differs, however, is in his attitude toward the actual text of the Constitution. To Strauss, the lip service that justices pay to the sacred text in their judicial opinions is just that: lip service. In actuality, he argues, current legal interpretation has so completely evolved and transformed over the years that the written Constitution itself has lost its germaneness to today’s legal wrangling. Quoting Thomas Jefferson, who wrote that “the earth belongs…to the living,” Strauss maintains that the Constitution, restrained as it is by the chains of centuries-old thinking, is an insufficient substitute for judicial precedent. And yet by forcing judges to formulate constitutional justifications for their every ruling — however tenuous the connection may be — living Constitutionalism, in most cases, prevents the worst variants of judicial activism. (Of course, depending on one’s particular ideological affiliations and the results of any given trial, this may or may not always be readily apparent.)

Stanley Fish, meanwhile, is having none of this. “You don’t interpret a text by looking for meanings people would find agreeable,” he writes. “You interpret a text by determining, or at least trying to determine, what meanings the creator(s) had in mind; and the possibility that the meanings you settle on are not ones most people would want to hear is beside the interpretive point.” He then angrily concludes: “If this is what the ‘living Constitution’ is — a Constitution produced and reproduced by serial acts of infidelity — I hereby cast a vote for the real one.” That Fish and Strauss cannot even agree on what the “real” Constitution is provides a worthy bellwether of the political whirlwind that is sure to accompany Elena Kagan into her much-anticipated confirmation hearings.

Bonus #2: The Bro Code

I’m quite sure Barney Stinson didn’t actually write The Bro Code. For one, any time a book’s authorship is attributed to someone “with” someone else — in this case, “Barney Stinson with Matt Kuhn” — generally the person named after the “with” did the actual writing for the ostensibly overly busy and self-absorbed person whose name comes first.

Also, Barney Stinson isn’t a real person. Or I suppose that depends on how one defines “real.” As a regular character on the CBS sitcom How I Met Your Mother, Barney (played to ironic perfection by Neil Patrick Harris) has one thing on his mind. OK, two. The second is laser tag. The first, though, is sex. With women. (I make this distinction only because Harris is openly gay. Sometimes I think this fact alone makes the process of watching him portray a womanizing bachelor whose sexual exploits number in the three digits at least twice as funny.)

Anyway, because the business of female-catching is a complex one, a set of rules was deemed necessary. By whom, you ask? Well, by the geniuses in the marketing department of CBS, obviously. Fortunately, The Bro Code — which the author modestly describes as a “compendium of knowledge” — lives up to its billing. The code itself resembles the Ten Commandments in form, but in fact contains one hundred and fifty articles and ten amendments. These amendments, despite recalling Founding Fathers and their ilk, bear no resemblance whatsoever to that Bill of Rights; the only quartering of soldiers allowed in The Bro Code would likely involve a booty call and an entirely ill-fitting Army uniform.

Fortunately, Barney allows his readers some breathing room before delving straight into the code itself. In the introduction, he lays the foundation for his profound ruminations: “It is my hope that, with a better understanding of the Bro Code, Bros the world over can put aside their differences and strengthen the bonds of brotherhood. It is then, and only then, that we might work together as one to accomplish perhaps the most important challenge society faces — getting laid.” Next is “What is a Bro?” This section consists of a brief Q&A (Q: “What is a Bro?” A: “A Bro is a person who would give you the shirt off his back when he doesn’t want to wear it anymore…”); apparently three questions was the limit. The other two were “Who is your Bro?” (who is my neighbor, anyone?) and “Can only dudes be Bros?” (The answers: many people, and no.) Finally, the “Brocabulary” and “Origin” (“lacking an agreed-upon set of social principles, Cain killed Abel and committed history’s first Broicide”) round out the opening salvo in a tour de force of masculinity.

The articles themselves are quite straightforward. From Article 1, “Bros before ho’s,” to Article 150, “No sex with your Bro’s ex,” Barney Stinson (with Matt Kuhn) means business. Article 47: “A Bro never wears pink. Not even in Europe.” Article 96: “Bros shall go camping once a year, or at least attempt to start a fire.” Article 118: “When a Bro is with his Bros, he is not a vegetarian.” These and many others make it clear that being a man is clearly not sufficient (nor even necessary) to being a Bro. Membership in the club of Bros requires a special kind of commitment (no, not that kind of commitment), a healthy dosage of machismo, and a general aversion to expressions of feeling that don’t involve a new HD TV or high-end sports car. But don’t let these traits fool you. Altruism is still at the core of The Bro Code. Case in point: Article 80, which states, “A Bro shall make every effort to aid another Bro in riding the tricycle, short of completing the tricycle himself.” Then again, I don’t think he’s referring to the childhood pedaling device.

#22: The Dream Life of Sukhanov

It had been a while since I’d read a Russian novel. In fact, I believe the last such book I’d read until now was Invitation to a Beheading by Vladimir Nabokov. Even after having read only a scant few of the major works of Dostoevsky and Tolstoy, I suppose I still should have realized that not all Russian books — or Russian authors, for that matter — are alike. And yet somehow I was persuaded by the name Olga Grushin and the intriguing title of her book, The Dream Life of Sukhanov, into presuming literary greatness.

As it turns out, not all stereotypes are inaccurate. Beautiful novels are as Russian as vodka consumption and chess. But where the last two are vulgar and elite, respectively, the Russian novel is a format accessible to all, at least to those for whom 700-page sagas are not too forbidding. Grushin’s is no different (except considerably shorter). As several reviewers have noted, her writing does contain a slight foreign twang, as when she uses overly lengthy strings of adjectives to describe mundane settings. But her English is considerably better than my Russian, so judge I shall not.

The Dream Life of Sukhanov opens with the protagonist and sometime antihero, Anatoly Pavlovich Sukhanov, arriving with his wife, Nina, at a birthday celebration for a renowned Soviet painter named Pyotr Alekseevich Malinin. (Do not fear an endless litany of names for each person; either Grushin has graciously spared her anglophone readers the consternation of rote name memorization, or I have subconsciously grown accustomed to the practice. And I’m quite confident it’s not the latter.) Malinin is a product of the Soviet machine, an “artist” whose works traffic in ideological and political cliche, stripped of their creative meaning even as they enjoy the public notoriety afforded by an official stamp of approval.

Malinin is also Nina’s father. Sukhanov, while privately musing that “the main quality uniting all [of Malinin’s] works…was the inherent ease with which they slid into oblivion the moment one’s back was turned,” was nevertheless duty-bound to pay the man his patriotic dues. Anyway, as editor in chief of Art of the World, the nation’s (and thus the state-approved) premier art magazine, Sukhanov was in no position to evaluate the integrity of others’ choices.

What he cannot stop himself from doing, however, is reassessing his own decisions, ad nauseam. As Sukhanov constantly travels in thought from the present to his past, the narrative voice switches from third to first. He is once again a small child, then a young man in love with both art and his future wife. Surrealism is his passion, but the Khrushchev Thaw all too soon evaporated and, with it, the sacred luxury of maintaining artistic creativity without forfeiting all professional (and certainly political) ambition. Sukhanov confronted a life-altering decision: to rebel, or choose the safety of the ideological mainstream.

Choosing the latter, Sukhanov eventually soared to career success. When the time came, however, he was unable or unwilling to comprehend the realities of glasnost and perestroika, even as they rendered his suppressive voice cartoonish and his fears of a crackdown anachronistic. When a student journalist accosts him at Malinin’s birthday event, demanding that he acknowledge the innate dishonesty in the great man’s paintings, Sukhanov condescendingly responds, “A piece of friendly advice…Those artistic ideas of yours, I wouldn’t advertise them so openly if I were you — you never know who might hear you.” To which she replies, presciently, “I don’t care who hears me…The times are changing.”

The Dream Life of Sukhanov, in chronicling a unique world event — the twilight of the Soviet era — evokes a surrealist universe of its own, neatly meshing with the artistic chaos of the genre that first captured Sukhanov’s heart as a child. Olga Grushin, Russian by birth and now American via naturalization, has experienced first-hand the decline of Russian communism, both from within and without the country; and this personal touch lends her already sterling writing an entirely believable hue. Sukhanov as a character is difficult to admire, and yet a decent helping of contextual pity is always present nonetheless. Upon hearing (but not heeding) the student’s retort about changing times, Sukhanov concludes the terse conversation: “The times are always changing, my dear Lida…But it would serve you well to remember that certain things always stay the same.” In the Union of Soviet Socialist Republics of 1985, those “certain things” were nonexistent. For Sukhanov, then, as for the rest of the country and the world at large, the only question was whether to accept the inevitable.

Music to my ears

This is a book blog. I know that. But, as with the Red Sox, Google Buzz, and, yes, the Red Sox again, from time to time my entries have reflected a certain distractible sensibility. (This is a nice way of saying it’s hard for me to stay on point – one of many reasons I am relegated to Internet Siberia and not your local bookstore’s display windows.)

It really cannot be helped this time. And, showing signs of improvement, this one’s not about the Sox. No, I am devoting this post to fawning adulation. But first, the back story. For years now, I have been a fan of Icelandic post-rock band Sigur Rós. From the moment the first lilting twangs of “Njosnavelin” reached my ears in the movie Vanilla Sky, I knew I’d been hooked. Lead singer Jonsí’s (Jon Thor Birgisson) voice had an ethereal quality that remains unmatched in music today. Soon I couldn’t get enough. As I began to discover more of their tracks, an entirely new universe unveiled itself before me. From the thumping rock anthem of “Glósóli” to the lighthearted giddiness of “Hoppipolla,” it was obvious the band had its finger on the pulse of a sound the rest of the world had yet to capture.

In a way, Sigur Rós changed my expectations of what good music should sound like. It most certainly raised the bar, but it wasn’t just that: Sigur Rós’ pieces – at times haunting, at times dreamy, but always unique – unfolded like a canvas, evoking an almost physical reaction, something previously unknown to my uncultured ears. (Keep in mind, my favorite song at the time was Nickelback’s “How You Remind Me.”) Although it took time, my love of Sigur Rós gradually led to appreciation of other bands who refused to be bound by the vagaries and expectations of pop culture. Most notable among these was Radiohead, whose leader Thom Yorke exhibited, albeit with more swaggering panache, the same spirit of musical rebellion embodied by Jonsí. (In what is perhaps the most compelling evidence against apocalyptic prophesies, the two bands once toured together without precipitating the universe’s explosion – somehow avoiding death by musical nirvana, so to speak.)

Fast forward to 2010. Jonsí had announced a world tour to promote his new album, Go. I purchased tickets for a New York show at Terminal 5 as soon as they went on sale. Last year, Jonsí had released an instrumental record, named Riceboy Sleeps, with his boyfriend; and embarrassingly, I didn’t realize until the concert this month that he’d compiled a new album since then. So imagine my surprise when Jonsí took to the stage and began performing “Hengilas,” “Animal Arithmetic,” “Tornado,” and the like.

I went home and immediately purchased the entire album on iTunes and have since listened to it incessantly. (My valiant college housemates can attest to my inability to diversify my playlist: when I discovered a new song, I would play it ad nauseam until, inevitably, even I grew sick of it.) But before I even get to that part, allow me to describe the concert itself. The set was designed to look like an old museum. Large panes of windows towered behind the band, on which projections of animals in the wild (the concert heavily utilized a nature motif) lent an epic quality to the accompanying music. I agree with the online commenter who stated that it felt like a film to which Jonsí was performing the accompanying soundtrack. The light show (and the entire production, for that matter) made it obvious that Jonsí intended to continue the creativity displayed at Sigur Rós’ live performances. (I had attended one of their concerts in Chicago in September 2008 and was equally impressed by the scale of the production.) In the most grandiose moment of the concert, house lighting bombarded the stage, then faded to a deer being chased by a wolf-like predator through dense forest, as Jonsí’s otherworldly falsetto rang out in “Kolnidur.”

Now, for the music itself: Go is a beautiful album. Because it is Jonsí’s voice on all the tracks, shades of Sigur Rós may initially sneak in, but the similarities are less real than imagined. Jonsí strikes out on his own path on this one, even if his musical decisions are clearly influenced by his prior works with the band. In arguably the best song on the album, “Hengilas,” Jonsí, singing in Icelandic and accompanied by an ominous ambient chord progression, evokes a deep melancholy. A repetitive piano theme in “Tornado” yields to booming percussion as Jonsí, soaring high above the melody, sings, “You flow through the inside/you kill everything through/you kill from the inside/you’ll, you’ll learn to know.”

The most well-known track on Go is “Boy Lilikoi.” Jonsí, in the chorus, urges his listeners to “use your life, the world goes and flutters by.” I’m trying, but his musical ingenuity has kept headphones glued to my ears. Flutter on, world.

#21: The Orchid Thief

There is an inherent danger in adapting any book into a feature-length film. This is doubly true when the book’s subject is flowers. So when Charlie Kaufman transformed Susan Orlean’s The Orchid Thief: A True Story of Beauty and Obsession into a screenplay (and, eventually, into a film directed by Spike Jonze), naysayers had plenty of reason to be skeptical. That movies rarely live up to the books on which they are based is one of the most pervasive truisms in both literature and film; but where tradition dictated one trajectory, Kaufman took the road less traveled.

2002’s Adaptation was either a narcissistic endeavor of self-absorption by a screenwriter who failed both to appreciate the book’s subject matter and to translate it into a compelling narrative, or a revelatory piece of meta-fiction, in which the film itself appears to be an unfolding work in progress even as the viewer watches it. Middle ground is scarce when it comes to Adaptation (and, more generally, Charlie Kaufman as a creative artist); it may be revered or loathed, but rarely dismissed.

Notwithstanding the ubiquitous axioms, I never believed the book would live up to this particular movie. And well before I had actually finished reading The Orchid Thief, it was quite clear that comparing the two is a near impossibility. Adaptation is as much concerned with orchids as The Orchid Thief is with, well, its adaptation, even though John Laroche, the thief of the book’s title, plays a major role in the movie as well. Nevertheless, it must be stated that, bucking historical trends, the movie beats the book.

It is unclear why someone thought it would be a good idea to turn a meandering reflection on Florida, Seminole Native Americans, eccentric white men with loose teeth, and a shared passion for orchids — hunting orchids, buying orchids, selling orchids, naming orchids, growing orchids, cloning orchids — into a 114-minute movie. Contemporary cinema tends to adhere to narrative arcs, comprehensible characters, and plausible events (at least within the context of the film’s universe, whether that be modern-day New York or a galaxy far, far away). What it generally shies away from are stories with no real ending and whose meatiest content is reserved for fastidious descriptions of orchid flowers and lengthy digressions into the history of their commercialization.

In fact, what Kaufman nobly managed to do, regardless of one’s feelings on his method of arriving at his destination, was extract the one essential aspect of Orlean’s book and turn it into the overriding theme of his screenplay. He prefers to think of it as evolution or adaptation; but more simply, The Orchid Thief is about passion. Orlean writes in the first person, noting early on that Laroche “is quite an unusual person. He is also the most moral amoral person I’ve ever known.” The author peppers her accounts with seemingly random tidbits, as when she notes that “there are more golf courses per person in Naples than anywhere else in the world,” a piece of trivia ostensibly thrown in precisely because she had mentioned the city. Little comments like this are scattered in bunches throughout the book, and many seem superfluous or, at the very least, unnecessarily detailed. Without them, The Orchid Thief would be a third of its actual length, but somehow one gets the feeling it would fail to retain its substance in their absence.

What, then, is its substance? Facially, The Orchid Thief is about a man accused of stealing orchids from federally protected land. More broadly, however, his is a parable of the shape that passions can take and the way in which virtually anything can serve as a muse for a perfectly suited person. Where it gets bogged down is in the essay-length forays into the history of orchids as commodities, needlessly expansive depictions of strands of conversation at orchid expositions, and other similarly elongated tales that seem gratuitous, although not necessarily incongruous in the context of the book as a whole. This is not to say that many parts of The Orchid Thief were not fascinating, because they were. However, while Charlie Kaufman quickly recognized that his screenplay would have to forfeit faithfulness to the content in favor of thematic fidelity, Susan Orlean appears to have missed a similar message in adapting the stories she lived and heard to the written page. Both works are flawed tributes to their predecessors: Kaufman plays God with facts, and Orlean declines to sift her numerous segments for relevance. But if perfection is unattainable, one may as well be entertained, and the prize in that category goes to Adaptation, not The Orchid Thief.

#20: Rise to Globalism

Last month, The New Yorker ran a full-page piece on Stephen E. Ambrose, American historian extraordinaire and author, most famously, of Band of Brothers. (Yes, that one.) Ambrose, whom the reporter describes (not inaccurately) as “America’s most famous and popular historian,” appears to have joined the long list of respected writers and academics whose zeal for sculpting a superior narrative was undermined by the dubious methods they used along the way. As Dwight D. Eisenhower’s biographer, Ambrose, who died in 2002, took pride in the “hundreds and hundreds of hours” he spent with the president over the course of five years.

As it turns out, these hours were just as phantasmal as, say, weapons of mass destruction in Iraq, or an honest politician. According to The New Yorker article, the deputy director of Eisenhower’s presidential library and museum recently discovered the president’s schedule, which revealed that “Eisenhower saw Ambrose only three times, for a total of less than five hours. The two men were never alone together.” In an understatement, the deputy director mused, “[Eisenhower] simply didn’t see that much of Stephen Ambrose.”

Notwithstanding the historian’s posthumous humiliation, it is quite clear just how he ascended to the zenith of American history-telling. In the 1997 eighth revised edition of Rise to Globalism: American Foreign Policy Since 1938, Ambrose joined with Douglas G. Brinkley (ironically, the Stephen E. Ambrose Professor of History at the University of New Orleans at the time) in recounting the events, decisions, and people that shaped the course of the twentieth century and set the standard for the one following.

Reading a pre-9/11 history book feels a bit fantastical at times, as when the introduction to the book presciently concludes, “America in the 1990s was richer and more powerful — and more vulnerable — than at any other time in her history.” In general, however, Rise to Globalism is every bit the thorough (although not detached) retrospective one would expect from a titan in the field such as Stephen Ambrose. Somewhat surprisingly, the most interesting nuggets are not the minutiae of political decision-making or military strategies but Brinkley and Ambrose’s perspectives on them.

As one clear example, it is quite obvious from the outset of chapter fifteen, “Reagan and the Evil Empire,” that the authors are not particularly fond of President #40. This is made abundantly clear in recounting the American involvement in the Israeli-Lebanese war in the early 1980s. Whether describing Reagan as “[blundering] in Lebanon as badly as Carter had blundered in Iran” or claiming that “no one, most of all Reagan himself, ever seemed to be clear on the purpose of that involvement,” Rise to Globalism is unapologetically ambivalent at best about the Reagan administration.

Of course, this is a perfectly valid evaluation, and one that helpfully reminds the reader that a history textbook this is not. (Of course, recent developments in the Texas public school curriculum raise doubts as to whether some history textbooks are even history textbooks. But I digress.) The authors’ perspectives are not partisan, it should be noted; of President Jimmy Carter, they praise his emphasis on human rights, which “struck a responsive chord among the oppressed everywhere,” but ultimately concede that “all the goals were wildly impractical and none were achieved.”

Especially fascinating is the nuanced tale of the Vietnam War, in which the reader is taken beyond soldiers, fighting, politicians, and election campaigns and into a deeper look at the underlying shifts in national societies themselves. Of course, these changes cannot be understood in isolation, and Ambrose and Brinkley deftly portray the interacting elements of politics and public sentiment. (The American fascination with the atomic bomb’s potential to establish permanent world hegemony is but one of several intriguing developments explored within this context.)

At times, Rise to Globalism is unforgiving in its assessments of American arrogance and impulsiveness, a tendency that lends credibility to its praise at other moments. (Commenting on the American commitment to Vietnam, despite elected officials’ many reassurances that the United States was not “fighting a white man’s war against Asians,” the authors ponder, “Why had the Americans not heeded their own warnings? Because they were cocky, overconfident, sure of themselves, certain that they could win at a bearable cost, and that in the process they would turn back the Communist tide in Asia.”) A similar attention to detail is displayed in coverage of the Cold War; perhaps most remarkable is the illuminating fact that, strenuous attempts at differentiation notwithstanding, most American presidents of the era closely mimicked their immediate predecessors’ foreign policies.

Despite having been written in the late 1990s, at a time when American influence could hardly have been more pervasive, Rise to Globalism is remarkably circumspect in its prognostications for the future. The book ends as President Bill Clinton’s second term begins, a period of unprecedented peace and prosperity for the West; and yet, in the last sentence, the authors caution that “no one would claim that he had wrought a global utopia of free-market democracies.” If Stephen Ambrose were alive today, a sequel, or at least an updated edition, would be in order. The times, they are a-changin’, but Rise to Globalism’s relative old age has yet to relegate it to the dusty side of the bookshelf.

#19: The Reluctant Fundamentalist

Mohsin Hamid’s The Reluctant Fundamentalist takes an interesting approach to storytelling. The narrator is a Pakistani man sitting in a Lahore cafe, and from the outset it becomes clear that the narrative will move and flow only at his bidding. Ostensibly, his audience is the “uneasy American stranger” seated across from him, who manages to jump or look alarmed at virtually everything, but his book-long monologue makes it abundantly clear that the reader is the host’s true target.

“Do not be frightened by my beard,” the Pakistani, whose name is Changez, reassures his guest. “I am a lover of America.” And with that, Changez launches into an account of his life and times in the United States, beginning with his undergraduate years at Princeton University and soon followed by his interview and eventual employment at a valuation firm, Underwood Samson. Inevitably, a romantic entanglement materializes, as Changez’s new employer competes for his attention against Erica, the type of girl who wore “a short T-shirt bearing the image of Chairman Mao” as if it had personal meaning.

Changez finds himself gaining headway into American corporate culture; meanwhile, his relationship with Erica runs a parallel course. The only hitch, it seems, is her inability to release herself from the memory of her old boyfriend, Chris, who had died of cancer the year before she met Changez. On some level Changez feels vulnerable, even threatened, by his seemingly disadvantageous position, though not without a sense of irony at finding himself jealous of a dead man.

Then, as so often happens in recent literature, September 11 arrives. Changez, on assignment in Manila for Underwood Samson, is in his hotel room preparing to return to the States when he sees the news on television. “I stared as one — and then the other — of the twin towers of New York’s World Trade Center collapsed. And then I smiled.” Changez’s instinctual pleasure is not, he assures his listener, a response to the violent act itself but an acknowledgment of what it symbolized, “the fact that someone had so visibly brought America to her knees.” Fearing what such a reaction might provoke in his colleagues, he wisely keeps the sentiment to himself.

As the story progresses, however, the disparity between Changez’s corporate breeding and his national and cultural identity grows ever larger. Erica, afflicted with anxiety and depression following the attacks in New York, withdraws for long periods; each time he sees her, she has shrunk to a fraction of the person he had seen the time before. His conflicted relationship with her mirrors his increasingly frayed connection to the United States, as perceived American self-righteousness and machismo soon supersede concerns for maintaining some semblance of the existing (and precarious) global order, a development that holds dubious implications for Changez’s native Pakistan. “It will perhaps be odd for you,” he tells his guest, “…to imagine residing within commuting distance of a million or so hostile troops who could, at any moment, attempt a full-scale invasion.”

Towards the end of his story, Changez describes an emotional farewell with Erica, which immediately follows her verbal longing for her deceased lover. As Changez prepares to leave her, “she gave me a hug and afterwards she stood there, looking at me. But he is dead, I wanted to shout!” To Changez, the statement applies just as aptly to American innocence. Nostalgia for a mythic past is hardly benign when one exerts such overwhelming control over the present. “I had always thought of America as a nation that looked forward,” Changez muses. Following September 11, “for the first time I was struck by its determination to look back.” His attempts to ingratiate himself with Erica meet resistance from the same impulse that denies him easy assimilation into American culture.

These sentiments, while hardly original, are easily comprehensible and are expressed in an enjoyable format to boot. Thus it is all the more disappointing when the narrator descends into the occasional, yet dispiriting, cliché. Upon returning to the States from an aborted assignment in Chile, for example, Changez suddenly sees his destination in a new light. “…I was struck by how traditional your empire appeared. Armed sentries manned the check post at which I sought entry…once admitted I hired a charioteer who belonged to a serf class lacking the requisite permissions to abide legally and forced therefore to accept work at lower pay; I myself was a form of indentured servant whose right to remain was dependent upon the continued benevolence of my employer.” Melodrama aside, it is difficult to ascertain exactly how these disjointed fragments coalesce with the rest of the narrative.

This is the most egregious of Hamid’s errors, an admittedly minor one in the spectrum of literary flaws, but one that, nonetheless, could just as easily have been uttered by Michael Moore as penned by a serious author. Ultimately, Mohsin Hamid is exactly that; now, if only the filmmaker could take a few notes from him.

#18: The Big Short

In describing the behavior of Wall Street bankers prior to the financial crisis, many adjectives have been bandied about. Greedy, say some; arrogant, claim others. What is only now beginning to gain ground on these populist declarations of discontent is a third, and far more horrifying, descriptor: stupid. This trait may at first seem less offensive to those of us who flaunt our self-ascribed moral superiority over these perceived miscreants. The reality, however, is anything but comforting. In The Big Short: Inside the Doomsday Machine, Michael Lewis, author of Moneyball and Liar’s Poker, dabbles in the thriller genre, often to hilarious effect, as he details the inner workings of a financial world that was truly ill-prepared for its inevitable Waterloo.

I’ll admit it: The Big Short is a very, very entertaining book. Mine is an admission whose sheepishness can only be understood once one has finished reading the book. It reads like a John Grisham novel, yet John Maynard Keynes is a far likelier neighbor on a library shelf. Lewis is profligate in his use of such terms as “big Wall Street firms” (32 occurrences, according to Google Books) and he is wont to transcribe entire conversations whose accuracy is often questionable but whose content leaves the reader in stitches.

Ultimately, it is funny, isn’t it? Here were our best and brightest, as David Halberstam might say, assuring us that our money was safe, that real estate prices would continue to rise, that subprime loans were the healthy product of a heightened ability to reduce risk, not a house of cards upon which much of the global economy now rested precariously. And they were wrong, not because they intentionally lied (though some did), but because they failed to recognize the bright red flags everywhere on (and sometimes off) their own balance sheets.

The Securities and Exchange Commission’s civil lawsuit against Goldman Sachs this week has resulted in even more vitriolic rhetoric against investment bankers and their ilk, a demographic Lewis takes no pains to please in The Big Short. He opens his book with this: “The willingness of a Wall Street investment bank to pay me hundreds of thousands of dollars to dispense investment advice to grown-ups remains a mystery to me to this day.” And he ends it with an account of his lunch with an investment banker, his old boss at Salomon Brothers, recounted with equal parts nostalgia and regret. In between, he rips apart much of the industry, railing against “the madness of the machine” and buttressing his anecdotes with footnotes that occasionally take up half the page.

It’s hard to say whom Lewis ridicules more, the bankers or the ratings agencies: while The Big Short is premised on the fact that high-powered bankers failed to research or even understand their own investments, Lewis makes it painfully clear that the foundation upon which all risk analysis rested was the highly coveted — and, it turns out, highly manipulable — ratings from industry leaders such as Moody’s and Standard & Poor’s. According to Lewis, employees of these firms, instead of conducting far-reaching investigations into the nature of subprime collateralized debt obligations (CDOs), simply took at face value much of what the banks told them. And since there were large fees to be had for each rating bestowed on these shadowy financial instruments, Moody’s and S&P had significant incentive to perpetuate the subprime industry.

In one particularly enlightening passage, Steve Eisman, one of the book’s central characters, whose disgust for Wall Street types figured prominently in his investing strategy, explains the lack of incentives for analysts at ratings agencies, a misalignment that helped to create and foster the crisis. “‘They’re underpaid,’ said Eisman. ‘The smartest ones leave for Wall Street firms so they can help manipulate the companies they used to work for. There should be no greater thing you can do as an analyst than to be the Moody’s analyst…So why does the guy at Moody’s want to work at Goldman Sachs? The guy who is the bank analyst at Goldman Sachs should want to go to Moody’s. It should be that elite.’”

The Big Short is filled with quotes such as this. And although not all of them are as penetrating or as keenly observant of the recession’s underlying fault lines, each is helpful in piecing together a panorama of the landscape that existed in and around these “big Wall Street firms.” Michael Lewis has not compiled a tell-all here; if he has revealed any industry secret, it is simply the astonishing truth that, in the subprime lending business, there were none. When the dust had settled around our financial ground zero, it soon became apparent that even Wall Street had failed to understand Wall Street. In this, if nothing else, it shares the same fate as Main Street.

#17: Crowdsourcing

“No matter who you are, most of the smartest people work for someone else,” quips Bill Joy, a Sun Microsystems co-founder. Joy’s declaration serves as a paean to the wisdom of crowds, the subject of Jeff Howe’s 2008 book, Crowdsourcing: Why the Power of the Crowd is Driving the Future of Business. Why limit yourself to a small, expensive subset of the available talent, the argument goes, when a global network of freelancers will gladly do the job better for little or no pay?

Howe’s enthusiasm is very nearly unequivocal. He predicts that today’s tech-savvy youth will “help accelerate the obsolescence of such standard corporate fixtures as the management hierarchy and nine-to-five workday,” concepts he deems to be “artifacts of an earlier age when information was scarce and all decisions…trickled down from on high.” And Howe’s praise of the community as exemplified in crowdsourcing is so complete that it borders on subservience: “Yes, communities need a decider,” he concedes in his concluding chapter, but while “…you can try to guide the community…ultimately you’ll wind up following them.”

The author’s unabashedly optimistic chronicle of the ascendancy of crowdsourcing (a label he created) brings to mind a phrase once made famous by former Federal Reserve chairman Alan Greenspan: “irrational exuberance.” Jeff Howe’s full-fledged advocacy for the crowd’s potential is every bit as overreaching as Jaron Lanier’s dire warnings on the same topic. In You Are Not a Gadget, Lanier writes ominously, “We [have]…entered a persistent somnolence, and I have come to believe that we will only escape it when we kill the hive.”

Both authors fail to account for some basic rules of human nature. Lanier laments that “when [digital developers] design an internet service that is edited by a vast anonymous crowd, they are suggesting that a random crowd of humans is an organism with a legitimate point of view.” To which Howe would undoubtedly respond, Damn right. In fact, he explicitly states that “a central principle animating crowdsourcing is that the groups contain more knowledge than individuals.”

Howe and Lanier are each right in their own ways. Crowdsourcing does indeed represent an entirely new model of work, one that transcends business and could upend a sizable chunk of existing corporate practices. Many of the outcomes Lanier fears, while understandable as worries, are unlikely to materialize now or on virtually any conceivable time horizon. And yet he is right that crowdsourcing will never replace the value of specialization. While Howe correctly lauds the democratization of decision-making — for example, aspiring filmmakers are no longer beholden to studio executives’ every whim — his populist celebration of online egalitarianism acknowledges few realistic limitations. “The crowd possesses a wide array of talents,” Howe writes, “and some have the kind of scientific talent and expertise that used to exist only in rarefied academic environments.”

The key word here is “some.” Howe notes Sturgeon’s Law (“90 percent of everything is crap”) and briefly admits that this may present an inaccurate portrayal of reality: “a number of the people I talked to for this book thought that was a lowball estimate.” Even for the ten or fewer percent that actually do provide reasonably intelligent contributions to the marketplace of ideas, much will be repetitive or non-cumulative. A thousand people with a hobbyist’s interest in chemistry may all eagerly contribute to a forum on noble gases, but it hardly follows that they will achieve any real breakthrough that eludes far more studied experts in the field.

Ultimately, it is not so much the anecdotes that undercut Howe’s thesis, nor is it his own repetition (which, in one particularly egregious case, consisted of several sentences copied wholesale from an earlier section of the book). Instead, it is his idealism that brings to mind countless earlier predictions of technology’s ability to transform human nature, prophecies that have more often than not been proved demonstrably untrue. It remains to be seen what will become of crowdsourcing; will it go the way of the flying cars that American prognosticators naively envisioned over half a century ago? This seems unlikely, and yet so does the author’s vision of a crowdsourcing revolution in business. The truth will likely lie somewhere in the middle, lodged comfortably between Jeff Howe’s crowd-fueled utopia and Jaron Lanier’s “hive mind” hell.