
Trouble in cyber-paradise

Yesterday Full Stop published my dual book review of Susan Crawford’s Captive Audience: The Telecom Industry and Monopoly Power in the New Gilded Age and Jaron Lanier’s Who Owns the Future? An excerpt from the essay here:

The consumer as loser is nearly a meme at this point, so thoroughly has the Great Recession imprinted its insignia on the wearied American mindset. Nevertheless, Crawford warns, the consolidation of the American Internet access bottleneck is particularly worthy of hand-wringing. As other advanced economies like South Korea and Japan rocket ahead in embracing fiber-optic connectivity – complete with 1 Gbps symmetric data speeds that remain incomprehensible to most Americans – the United States finds itself in the humiliating position of aiming for a minimum national broadband speed of 4 Mbps download (and a mere 1 Mbps upload) by 2020.

Underlying this technological angst is something deeper, more primal. It is the sense that some right, however virtualized, is being denied by the cartelization of the American telecom space. It is the realization, further still, that our international peers are enjoying the fruits of their justly obtained lightning-speed access while those of us holding American passports are condemned to the endless purgatory that is YouTube’s “loading” spin-wheel.


Rescuing the Facebook generation

For the November 25th issue of The New York Review of Books, author Zadie Smith contributed an essay titled “Generation Why?” Ostensibly, the column was a review of Aaron Sorkin’s much-ballyhooed film, The Social Network, but Smith clearly had bigger fish to fry than nerdy billionaires (especially since Sorkin and director David Fincher had already undertaken this task so elegantly themselves).

No, the issue at stake was not Facebook but the “generation” for which it was created and for whom, perhaps, its existence circumscribes theirs. Smith, in attempting to extricate Facebook from its inevitable foundation myths, nevertheless concludes that she will someday “misremember my closeness to Zuckerberg [she, too, was on Harvard’s campus for Facebook’s birth in 2004], in the same spirit that everyone in ’60s Liverpool met John Lennon.” And yet an acute sense of separation haunts her, as much for its seeming incongruity (Smith is only nine years Mark Zuckerberg’s senior) as for its depth.

“You want to be optimistic about your own generation,” Smith muses, with a touch of nostalgia. “You want to keep pace with them and not to fear what you don’t understand.” She would be wise to heed her own advice. For what she contends in “Generation Why?” – that for the unwashed masses who fancy Facebook, Twitter, et al. among life’s requisites, their online reincarnations have themselves become unhinged from, or even superseded, reality – is as emblematic of the anachronisms of the old-guard cohort (whom she affectionately dubs “1.0 people”) as it is a functional indictment of their successors.

The New Yorker’s Susan Orlean stumbles into the same trap, albeit somewhat more amiably. On her blog, “Free Range,” she posits a new hierarchy of friendship: the Social Index, a ranking of relationships by the relative frequencies of online vs. offline contact. “Human relationships used to be easy,” she explains. But “now, thanks to social media, it’s all gone sideways.” Orlean then proceeds to delineate these subtle distinctions: between “the friend you know well” and “the friend you sort of know” and “the friend, or friend-like entity, whom you met initially via Facebook or Twitter or Goodreads or, heaven help us, MySpace,” and so on. Wisely, she keeps the column short and employs a jocular tone, one whose comic value is reaffirmed by her promotion of the Social Index on – where else? – Twitter, using the hashtag #socialindex.

But one can detect an undercurrent of cynicism beneath Orlean’s evident joviality. What Zadie Smith and Susan Orlean share – in addition to their niche of the “celebrity lifestyle” whose attainment, Smith assures us, is the raison d’être of the Facebook generation – is the creeping suspicion, despite reaching a career zenith, of their continuing exclusion from the proverbial “Porcellian Club” of Zuckerberg’s collegiate fantasies. This, then, is a fate to which they and those they pity are alike consigned. The irony, of course, is their refusal, or inability, to identify these “People 2.0” as their kindred spirits.

Smith opts instead for the appeal to authority. In this case, that role falls to Jaron Lanier, a “master programmer and virtual reality pioneer.” (Smith, who is 35, quickly reminds us that Lanier, 50, is “not of my generation,” an assertion whose brashness once more belies her commonalities with that perpetually group-conscious underclass of Facebookers.) Quoting extensively from Lanier’s book, You Are Not a Gadget, Smith appropriates the tech-philosopher’s arm’s-length stance toward technology as her own, spraying the reader with snippets of his wisdom. (In the book’s preface, Lanier cautioned against this Girl Talk-esque brand of mishmash, lamenting that his words would be “scanned, rehashed, and misrepresented by crowds of quick and sloppy readers into wikis and automatically aggregated wireless text message streams.”)

But Smith and Lanier have separately, and preemptively, doomed themselves to contemporary irrelevance by adhering to a retrograde narrative of the modern condition. Together, their worst nightmare is the narrowing of human existence into unintentionally confined spaces. This process takes place via “lock-in,” a series of inadvertently interacting steps which, taken together, preclude the possibility of reversal or alteration. Such was the case, Lanier argues (and Smith dutifully recounts), with the invention of MIDI, a once-cutting-edge standard for storing and playing digital music, whose binary limitations forced the beautiful infinity of analog melodies into a prepackaged sepulcher of bits and bytes. Once the standard had been formalized, the jig was up: there was no turning back. Music had forever changed, and not necessarily for the better. Lanier reformulates these same fears with regard to social media, and Smith, a self-described “software idiot,” adopts them wholesale.

These visions of doom are misplaced. One can feel almost viscerally the bored sighs emanating from countless millennials’ diaphragms as Zadie Smith ages before their very eyes: “When a human being becomes a set of data on a website like Facebook, he or she is reduced. Everything shrinks. Individual character. Friendships. Language. Sensibility. In a way it’s a transcendent experience: we lose our bodies, our messy feelings, our desires, our fears.” Such generous hyperbolizing obscures whatever consideration Smith’s fretting may warrant on the margins. If rescuing this Lost Generation is her utmost objective, then her plea for sanity, easily mistaken for groveling, will scatter Zuckerberg’s millions of disciples like so many cards in a two-bit parlor trick.

Notably, Zadie Smith gently ridicules the Facebook era’s emphasis on connectivity, remarking snidely that Zuckerberg “used the word ‘connect’ as believers use the word ‘Jesus,’ as if it were sacred in and of itself.” The quality of those interactions, she worries, is not worth the minimal effort exerted to vivify them. And yet she comes agonizingly close, on multiple occasions, to grasping the essence of this generation that remains simultaneously adjacent to, but seemingly unreachable from, her own. “Watching this movie, even though you know Sorkin wants your disapproval, you can’t help feel a little swell of pride in this 2.0 generation,” Smith concedes. “They’ve spent a decade being berated for not making the right sorts of paintings or novels or music or politics. Turns out the brightest 2.0 kids have been doing something else extraordinary. They’ve been making a world.”

Sound familiar? It should. The specter of John Lennon, the one “that everyone in ’60s Liverpool met,” haunts every word of “Generation Why?” Even Zadie Smith, for whom Lennon (unlike Lanier) is clearly not a peer, cannot ignore the contemporary relevance of his transformative impact on society. Culture may move more rapidly in the digital era than it did in the 1960s, but its disruptive rhythm has survived largely intact. Rebellion, experimentation, innovation: these are all hallmarks of the creative subculture, as each subsequent breakthrough quickly buries its predecessors. Mark Zuckerberg, then, is the spiritual descendant of John Lennon’s “Imagine.” We are, indeed, all connected (much to Smith’s everlasting surprise).

This is the epiphanic truth that the Facebook generation has uncovered, even if in so doing they remain blissfully unaware of the historical import of their actions. To be sure, their self-absorbed ignorance of a chronology of innovation is itself a product of the ever-shifting nature of modern culture. A generation once encompassed two or three decades; now, an absence of even five years from civilization would reduce the most precocious techie to the status of a Luddite. But, somewhat paradoxically (considering her alarm at Facebook’s social impact), Smith digests technology’s ephemeral nature with ease, as she states at the end of her essay: “I can’t imagine life without files but I can just about imagine a time when Facebook will seem as comically obsolete as LiveJournal.”

If this is the case, then what, precisely, is the cause for concern? Conceivably, Zadie Smith, who teaches literature, senses an intellectual fence over which the social media-savvy yet literarily deficient minds of her young charges are unable to vault. Perhaps, for a ponderous writer such as Susan Orlean, who once penned a 282-page paean to orchids, it is a fear of losing her audience to ever-decreasing attention spans. For Jaron Lanier, it may be the horror at a remix culture in which the devolution of works of art into haphazardly scissored segments (à la David Shields’ Reality Hunger) threatens the very nature of public expression. Perhaps Zadie Smith and Susan Orlean and Jaron Lanier and so many others of their age and temperament, finding themselves unable to “keep pace with [the younger generation],” succumb to the all-too-human instinct to “fear what [they] don’t understand.” In short, they face the same challenge that confronted the parents and teachers and writers of the ’60s generation, fifty years later. They, like Mark Zuckerberg and the hordes of Facebook users who followed him in the quest for digital immortality, face the fear of oblivion.

#25: Reality Hunger

I really need to stop reading manifestos. First it was The Communist Manifesto. Then, earlier this year, it was You Are Not a Gadget: A Manifesto, by Jaron Lanier. “Workers of the world, unite!” yielded to much ado about the “hive mind.” Like any manifesto, both were distinctly aware of their characterization as such; hence, the grandiose language and sweeping world vision. (I suppose Karl Marx’s received a bit more attention than Jaron Lanier’s, however.)

Reality Hunger: A Manifesto, by virtue of its self-descriptive subtitle, belongs to this same (decreasingly exclusive) club. However, David Shields, whose other books have titles like The Thing About Life Is That One Day You’ll Be Dead, hardly looks like a kindred spirit of the communist revolutionary or the cautious Internet pioneer (based on his photograph on the book’s back flap, at least). The bald and bespectacled author appears better suited to writing a dignified study of poetry, or perhaps captions for a nature-themed daily calendar.

Neither of these subjects is what Shields is interested in writing about, however. In what the author describes as “twenty-six chapters; 618 mini-sections,” he makes the case for an emerging writing form. “An artistic movement, albeit an organic and as-yet-unstated one, is forming,” Shields writes. “What are its key components? A deliberate unartiness: ‘raw’ material, seemingly unprocessed, unfiltered, uncensored, and unprofessional.”

Or did Shields actually write this? Aha, he would reply, but that’s just the point: who cares? Each of these hundreds of fragmentary “mini-sections” operates as a unique thought, yet also as an integral part of the whole. What if section three were written by David Shields? Or what if it were David Mamet, or David Carr, or David Markson (all of whom are also quoted in Reality Hunger)? Does knowing the identity of the author — or, for that matter, evaluating the authenticity of the text itself — alter the experience of reading the work? And if so, is this for the better or the worse? In short, why the big fuss over intellectual property?

Well, for one, most writers struggle to scrape together the requisite means to make a living out of their dreams. When someone else comes along and irreverently plucks a quote here and a passage there wholesale, a drop in royalties is the result. This would appear to be an understandable reason for approximately 618 authors, speakers, and public figures to be very angry with David Shields. Thanks to the lawyers at Random House, however, the author was generously spared this fate. Lamenting the loss of a “freedom that writers from Montaigne to Burroughs took for granted,” Shields explains, in a brief note following the main text, that his publisher’s dutiful attorneys “determined that it was necessary for me to provide a complete list of citations,” but that readers may easily “restore this book to the form in which I intended it to be read” by cutting out these very citations with a pair of scissors.

“Reality cannot be copyrighted,” Shields declares. He is right, it cannot; but its various expressions can, and do, warrant legal protection. The author, in an online defense of his book, claims that “numerous bloggers appear to think I’m the anti-Christ because I don’t genuflect at the twin altars of the novel and intellectual property.” But I suspect theirs is merely a case of disenchantment, not stupefaction. The starving masses may be hungering for reality, but it is doubtful that a hardcover compendium of reprocessed ideas will provide the necessary protein. The last sentence of Reality Hunger reads, “Stop; don’t read any farther.” I’m assuming it was David Shields who wrote this line; regardless of the author, this advice would have been far more useful in the book’s earlier pages. Even reality hunger disappears when confronted with enough junk food.

#17: Crowdsourcing

“No matter who you are, most of the smartest people work for someone else,” quips Bill Joy, a Sun Microsystems co-founder. Jeff Howe cites the line as a paean to the wisdom of crowds, the subject of his 2008 book, Crowdsourcing: Why the Power of the Crowd Is Driving the Future of Business. Why limit yourself to a small, expensive subset of the available talent, the argument goes, when a global network of freelancers will gladly do the job better for little or no pay?

Howe’s enthusiasm is very nearly unequivocal. He predicts that today’s tech-savvy youth will “help accelerate the obsolescence of such standard corporate fixtures as the management hierarchy and nine-to-five workday,” concepts he deems to be “artifacts of an earlier age when information was scarce and all decisions…trickled down from on high.” And Howe’s praise of the community, as exemplified in crowdsourcing, is so complete that it borders on subservience: “Yes, communities need a decider,” he concedes in his concluding chapter, but even then, “…you can try to guide the community…ultimately you’ll wind up following them.”

The author’s unabashedly optimistic chronicle of the ascendancy of crowdsourcing (a term he coined) brings to mind a phrase once made famous by former Federal Reserve chairman Alan Greenspan: “irrational exuberance.” Jeff Howe’s full-fledged advocacy for the crowd’s potential is as overreaching as Jaron Lanier’s dire warnings on the same topic. In You Are Not a Gadget, Lanier writes ominously, “We [have]…entered a persistent somnolence, and I have come to believe that we will only escape it when we kill the hive.”

Both authors fail to account for some basic rules of human nature. Lanier laments that “when [digital developers] design an internet service that is edited by a vast anonymous crowd, they are suggesting that a random crowd of humans is an organism with a legitimate point of view.” To which Howe would undoubtedly respond, Damn right. In fact, he explicitly states that “a central principle animating crowdsourcing is that the groups contain more knowledge than individuals.”

Howe and Lanier are each right in their own ways. Crowdsourcing does indeed represent an entirely new model of work, one that transcends business and could upend a sizable chunk of existing corporate practices. And many of the outcomes Lanier fears, while understandable as anxieties, are not plausible now or on virtually any conceivable time horizon. Yet he is right that crowdsourcing will never replace the value of specialization. While Howe correctly lauds the democratization of decision-making — for example, aspiring filmmakers are no longer beholden to studio executives’ every whim — his populist celebration of online egalitarianism is never bounded by realistic limitations. “The crowd possesses a wide array of talents,” Howe writes, “and some have the kind of scientific talent and expertise that used to exist only in rarefied academic environments.”

The key word here is “some.” Howe notes Sturgeon’s Law (“90 percent of everything is crap”) and briefly admits that it may even understate the problem: “a number of the people I talked to for this book thought that was a lowball estimate.” Even among the ten percent (or fewer) who do provide reasonably intelligent contributions to the marketplace of ideas, much will be repetitive or non-cumulative. A thousand people with a hobbyist’s interest in chemistry may all eagerly contribute to a forum on noble gases, but it hardly follows that they will achieve any real breakthrough that eludes far more studied experts in the field.

Ultimately, it is not so much the anecdotes that undercut Howe’s thesis, nor is it his own repetition (which, in one particularly egregious case, consists of several sentences copied wholesale from an earlier section of the book). Instead, it is his idealism that brings to mind countless earlier predictions of technology’s ability to transform human nature, prophecies that have more often than not proved demonstrably untrue. It remains to be seen what will become of crowdsourcing; will it go the way of the flying cars that American prognosticators naively envisioned over half a century ago? That seems unlikely, and yet so does the author’s vision of a crowdsourcing revolution in business. The truth will likely lie somewhere in the middle, lodged comfortably between Jeff Howe’s crowd-fueled utopia and Jaron Lanier’s “hive mind” hell.

#12: You Are Not a Gadget

“The words in this book are written for people, not computers.” So declares Jaron Lanier, in the preface to his self-described “manifesto” on the impending doom of Web 2.0 and its digital companions. In You Are Not a Gadget: A Manifesto, Lanier confronts the brooding technological nightmare with revolutionary fervor, decrying with gusto the horrifying destructive potential of…of…of Wikipedia. In what amounts to an elegy for the creative spirit, Lanier warns against the dangers inherent to the “hive mind” by lashing out against humanity’s self-imposed subjugation to technology.

Let’s be fair here. Lanier seems like a smart enough guy, even if his choice of hairstyle — he appears on the book’s flap in a thinker’s pose, with his dreadlocks running past chest level and on to the great beyond — is more suited to an aspiring grunge artist than an Internet visionary. Fittingly, then, he actually enjoys playing the oud and even frequents an online forum that serves as a virtual community for the instrument’s fan base. Of the forum, he says, “There’s a bit of a feeling of paradise about it. You can feel each participant’s passion for the instrument, and we help one another become more intense.”

Indeed, Lanier’s intensity — his passion for rescuing individual voices from the clutches of impersonal cyberspace — is to be admired, even if the object of his zeal is perplexing. His thesis, that the digital era’s explosion has created ways of thinking about and interacting with technology that portend disaster down the road, is not particularly convincing. And while he could never be accused of boring his readers, one could easily charge him with alarmism.

The author ably explains the dangers of “lock-in,” the process by which an arbitrary digital convention — organizing computer data into virtual files and folders, using MIDI as the industry standard for digital music representation, etc. — becomes so ingrained in culture and thought that it is nearly impossible to reverse. What Lanier never quite establishes, however, is just why certain accepted practices, most notably the open-source movement and crowdsourcing, are so malignant. Technology’s purpose, he lectures, is to adapt to and serve human beings; he worries that the sudden and widespread advent of the Internet has produced the opposite, as we have become willingly subservient to machines, adapting to their whims instead of demanding tools that do not require a degradation of human intelligence.
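Lanier’s MIDI example is worth making concrete. Below is a minimal Python sketch (my own illustration, not anything drawn from the book) of the discreteness he objects to: a plain MIDI note keys every pitch to one of 128 integer note numbers, and nuances that fall between them are easily lost.

```python
# Illustrative only: how MIDI's note numbering flattens continuous pitch.
# The mapping 69 + 12*log2(f/440) is the standard equal-tempered
# conversion from a frequency in Hz to a MIDI note number (0-127).
import math

def frequency_to_midi_note(freq_hz: float) -> int:
    """Quantize a continuous frequency to the nearest MIDI note number."""
    note = round(69 + 12 * math.log2(freq_hz / 440.0))  # 69 = A4 = 440 Hz
    return max(0, min(127, note))                        # clamp to MIDI's range

# A concert A and a slightly sharp A collapse to the same note number.
print(frequency_to_midi_note(440.0))  # 69
print(frequency_to_midi_note(446.0))  # 69
```

This is lock-in at its smallest scale: the format itself decides what counts as a note, and everything built on top of it inherits that decision.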

It is in this vein that he alludes to Wikipedia, a site he admits to using himself but whose implicit founding principle — the more contributors, the more closely we approach truth — he derides with relish. “The ‘wisdom of crowds’ effect should be thought of as a tool,” Lanier writes. “The value of a tool is in accomplishing a task. The point should never be the glorification of the tool…There’s an odd lack of curiosity about the limits of crowd wisdom.” He has a point, but not much of one. It is true, for example, that, as Lanier notes, most breakthroughs in modern technology have been delivered under the auspices of for-profit corporations (e.g., Microsoft Windows, the iPod, the digital camera). And that such innovations are sorely lacking in the open-source domain is cause for reflection, although not necessarily concern.

However, what the author consistently misses (or perhaps chooses to ignore) is the innate ingenuity of human beings, whatever tools they are given. In a section discussing the impact of the file-sharing era on musicians, Lanier writes, “If we choose to pry culture away from capitalism while the rest of life is still capitalistic, culture will become a slum.” Above all, he is concerned with our collective loss of free spirit, but he fails to notice, for example, the consistent ability of the young to bypass and defeat ever more stringent restrictions imposed by those in the business of enforcing digital rights management. First there was Napster; after being brought low, it reemerged as a legal, paid music service. File-sharing clients sprouted up one after another, with new entrants following quickly on the heels of those litigated out of existence. Even Radiohead’s novel experiment in giving away music for free, which Lanier says does not “fill me with hope for the future,” is proof that people continue to exhibit an entrepreneurial spirit by devising new and inventive solutions to existing problems. These are not the products of unqualified and inexpert crowds, but the brainchildren of creative, ambitious individuals. Jaron Lanier may not be a Luddite, but his dire warnings of future doom are a bit anachronistic. I can only wonder what he’d think of the iPad.