Category Archives: Technology

For a moment, we were kids again

Grantland’s Brian Phillips just stole my thoughts on Felix Baumgartner’s truly incredible space jump:

A hundred years ago, before we took it for granted that we could all live on the moon if Congress would only raise taxes, a large public cared intensely about speed records, air races, parachutists, and feats of aerial daring. The morning newspaper brought the results of the latest sensational exploits. At the start of the 1911 Paris-to-Madrid air race, during which Louis Train crashed his monoplane into the prime minister of France, hundreds of thousands of people turned out just to watch the fliers take off. When Charles Lindbergh landed at Le Bourget Airport after flying from New York to Paris, navigating by the stars, the crowd pulled him out of the cockpit and carried him over their heads for half an hour. It was the era of zeppelins and astonishment. Flight, which had been a crazy dream for nearly all of human history, was suddenly something we could do. The fascination with stunts and records was partly scientific: In the same way that gaps on the map were filled in by intrepid individual explorers, our sense of what was possible in navigating the sky would be defined by solo daredevils, inventors, and balloonists. But it was partly born of wonder: Each new accomplishment was a fresh reminder that people could fly.

128,000 feet is: the stratosphere. I mean literally.

Maybe the most incredible thing about Baumgartner’s jump was not that he did it successfully but that, for a short while, he brought the world back to that old daredevil wonder. Yes, he was sponsored by Red Bull and broadcast live on YouTube, but that’s actually kind of the point: He pushed the limits of human flight so far that he made the whole Internet remember that flight is like magic. He took social media’s constant search for the next big distraction and funneled it into old-fashioned amazement. In that way, his jump resembled the landing of the Curiosity rover on Mars in August. But where the NASA mission recalled the popular-scientific inquisitiveness of an earlier era of flight, the response to Baumgartner echoed the other part of the equation: the sense of purposeless glee people felt at the sight of a brave deed splendidly done. Baumgartner’s jump might lead to some scientific progress, in the form of space suit advances and so forth. But that’s not why we were all memorizing the numbers and freaking out when his visor fogged up and tweeting about every second of the fall.

128,000 feet is: a record that exists for no reason, and therefore one of the best reasons of all.

Show me the money

Wired is on it:

Ask politicians whether campaign contributions influence their decisions, and they’ll tell you certainly not.

Ask any citizen, and they’ll likely give the opposite answer.

With that in mind, we’re re-introducing a web-based embeddable widget — for anybody to use — that lists the top 10 donors and their contributions to any member of the House and Senate, their opponents, and the presidential candidates. Wired updated the widget in conjunction with MapLight, the Berkeley, California-based nonprofit dedicated to following money in politics.

“Corporate influence in politics has gone off the charts, and it’s more important than ever for voters to understand who is financing candidates,” said Evan Hansen, editor in chief of Wired.com. “MapLight has done the hard work of compiling the data. At Wired, we’re happy to help get that information out to the wider public, and share it as broadly as possible with this web-based embeddable widget.”

The widget is free to steal and comes with a Creative Commons license. The widget displays a shadow outline of the politician adorned with NASCAR-style logos of some of the top donors giving that candidate money.

MapLight pulls down up-to-date campaign-financing figures from the Federal Election Commission, which are fed into a database so the widget stays current.

“In just a few weeks, voters will confront a ballot filled with candidates whose campaigns have been paid for by wealthy donors. People deserve to know the truth about whose interests their candidates are really representing,” said Daniel Newman, president and co-founder of MapLight. “We’re proud to work with Wired to give voters a tool they can use to draw back the curtain on the moneyed influence plaguing our political system.”
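The mechanics behind a widget like this are simple enough to sketch: total each donor’s contributions to a given candidate and keep the ten largest. Below is a rough, hypothetical illustration of that aggregation in Python; the file name and column names are invented, and MapLight’s real schema and pipeline are surely more involved.

```python
# Hypothetical sketch of the top-donor aggregation a widget like this relies on.
# Assumes a local CSV of itemized contributions with "candidate", "donor",
# and "amount" columns -- not MapLight's actual data format or pipeline.
import csv
from collections import defaultdict

def top_donors(path, candidate, n=10):
    """Return the n largest donors to `candidate` as (donor, total) pairs."""
    totals = defaultdict(float)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if row["candidate"] == candidate:
                totals[row["donor"]] += float(row["amount"])
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)[:n]

if __name__ == "__main__":
    for donor, total in top_donors("contributions.csv", "Jane Doe"):
        print(f"{donor}: ${total:,.0f}")
```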

Felix Baumgartner attempts to make history

[youtube http://www.youtube.com/watch?v=MrIxH6DToXQ]

The skydiver is mere minutes away from jumping from a world-record 23 miles up:

Skydiver Felix Baumgartner is making his ascent to the edge of space, where he plans to jump into the biggest free fall of all time.

In a capsule hanging from a helium balloon, Baumgartner is working his way to 120,000 feet (about 23 miles) — more than three times the cruising altitude of the average airliner.

With nothing but a space suit, helmet and parachute, Baumgartner hopes to be the first person to break the sound barrier without the protection of a vehicle.

The thin air at that height provides so little resistance that after just 40 seconds, he is expected to be free falling faster than 690 miles per hour.
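A quick back-of-the-envelope check on that number (my own arithmetic, not from the article): ignoring air resistance entirely, 40 seconds of free fall works out to roughly 390 m/s, or nearly 880 mph, so even the thin stratospheric air shaves a fair amount off the idealized figure.

```python
# Idealized free-fall speed after 40 seconds, ignoring drag entirely
# (an upper bound; the quoted 690 mph figure includes some air resistance).
G = 9.81               # gravitational acceleration, m/s^2
MPH_PER_MPS = 2.23694  # miles per hour per metre per second

t = 40         # seconds of free fall
v = G * t      # speed with no air resistance
print(f"{v:.0f} m/s ~ {v * MPH_PER_MPS:.0f} mph")  # ~392 m/s, ~878 mph
```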

Debates and video games

Robert Schoon watched last night’s vice presidential debate on his Xbox:

Xbox’s simple presentation was surprisingly liberating compared to watching on TV. I had been watching other channels that night, and all of them, even the broadcast networks, had some gimmick on the screen, whether it was split-screen “reaction” shots, Twitter feeds, a ubiquitous crawl at the bottom of the screen, or even just visually displaying the moderator’s question. Though it didn’t look revolutionary in the “digital age” sense, it was a quiet rebellion against the distracting visual packaging that all the news channels have seemingly decided upon.

Then the polls started popping up on screen. About half an hour before the end of the debate, a little blue band appeared with a question and three choices. After I chose one, a bar graph would appear, giving instant results, in percentages, for each option. I counted at least fifteen questions before I lost track, and though the questions tracked well with the debate – foreign policy questions while Vice President Biden and Paul Ryan talked Afghanistan, questions about candidates’ religion during the abortion portion – the polling became so rapid-fire as to become distracting. Plus, they were beside the point, unless that point was to relentlessly confirm that roughly two-thirds of Xbox Live watchers are liberal and about ten percent had no opinion about anything.

Still, it’s a start…

Personally, I found ABC News’ coverage last night nauseating. I turned on the live Internet feed about 15 minutes before the debate started, and all the hosts were talking about was the apparently horrifying news that Twitter usage would be banned in the debate hall. Then, during the debate itself, an absurdly large blue bar kept popping up showing which keywords were trending on Twitter. Enough, already.

Greed is good. Just don’t remind us too often.

[youtube http://www.youtube.com/watch?v=fsP6XTHBwRw]

Jesse Pinho (no relation; we just have the same parents) mutes his TV every time he sees the above Levi’s commercial, and wonders why:

Levi’s #GoForth commercial is an excellent example of the intersection of individual and corporation, in which the corporation (Levi’s) more or less blatantly proclaims its jeans to be the standard-bearer of creativity, leadership, and individuality. Wear Levi’s, and you’ve staked your claim to “it.” (Exactly what “it” is, I’m not sure–but it’s certainly something good.) Add Levi’s to your curation of relationships, and it will perfectly complement your other inevitably cool qualities–your artistry, your go-getter attitude, your so-perfect-it-could-only-be-in-a-commercial haircut…

So why, then, do I find myself reaching for the remote the instant I hear “This is a pair of Levi’s”? What triggers this reaction? Certainly, annoying or poorly-done ads (I’m looking at you, 5-hour Energy!) prompt the same response, but the Levi’s commercial was neither of those. So what is it? Is the call to action too transparent? That is, does Levi’s offend me by toeing the delicate line between subtlety and overtness vis-à-vis manipulation of its audience? Certainly, someone who responds positively to being told that he is “the next living leader of the world” won’t respond so well to realizing that the compliment was proffered simply to coax him into buying some company’s product.  But then, all advertising aims to do just that: it offers the viewer something she wants (a compliment, entertainment, humor, etc.) in exchange for 30 to 60 seconds of her attention. Or, at the very least, it beats the viewer over the head with some piece of information (“5-hour Energy… every day! Every day! Every day!”) so that its message–“Buy this!”–is inescapable.

Perhaps, then, it’s that Levi’s violates the contract between viewer and advertiser, in which the viewer suspends her cynicism every time a commercial is played. We viewers know that we are being manipulated into action when watching a commercial; and we’ve come to accept that, under one condition: that the advertiser does not insult our intelligence. Levi’s, however, fails to acknowledge this basic requirement, blatantly exploiting our perception of cool and forcing us to confront it in such literal terms that we’re made to feel uncomfortable. Advertisers should take note of this interaction, and learn from it one important lesson: that we’ll gladly consent to exploitation as long as you don’t remind us that that’s what we’re doing.

Look out for more stuff from Jesse coming up on this blog (as well as on his), primarily on the tech scene and related topics.

Amusing Ourselves to Death: Aaron Sorkin’s “The Newsroom” and the View from Nowhere

There was a moment in the second episode of The Newsroom where I really felt this series might pack a punch. Will McAvoy, the anchor of the evening news, is attending a brainstorming session led by his executive producer, MacKenzie, who rhetorically asks her assembled minions, “Are there really two sides to this story?” This wrinkles the fair brow of MacKenzie’s subordinate, Maggie, who asks, usefully, “What does that mean?” “The media’s biased towards fairness,” MacKenzie replies. To which Maggie rejoins, “How can you be biased toward fairness?”

You get the point: this is Aaron Sorkin’s world, after all. Clueless women exist so five-minute expositional monologues don’t have to. (Even if recitations of entire Wikipedia articles, delivered hostage-style directly into the camera, would arguably be more realistic and less condescending.) Unsurprisingly, Will – imagine a leaner, meaner Jed Bartlet with a penchant for swearing because he has a show on goddamn, motherfucking HBO – has something to say:

“Bias toward fairness means that if the entire Congressional Republican Caucus were to walk into the House and propose a resolution stating that the Earth was flat, the Times would lead with, ‘Democrats and Republicans Can’t Agree on Shape of Earth.’”

With that decisive and sardonic blurt, The Newsroom caught my full attention. Unfortunately, it lost me a couple seconds later, when Sorkin’s cutely clever dialogue once again devolved into petty pitter-patter and destroyed any chance at genuine social commentary. Nevertheless, Sorkin’s thinly disguised nod to what NYU professor and media critic Jay Rosen has dubbed “The View from Nowhere” is worth further analysis.

In that fleeting moment, Will McAvoy’s brief diversion away from his Keith Olbermann-like self-absorption and into something a little more like media criticism got me fired up. I felt similarly while watching the premiere episode when, during a characteristically grating shouting match, MacKenzie demands of Will, “Where does it say that a good news show can’t be popular?” and he replies, “Nielsen ratings.” (As banal as these ideas may sound to anyone not living under a rock for the past few years, hearing them said aloud on a mainstream TV series was a little akin to reading Anderson Cooper’s coming-out email the other day: everyone knew it already, but it just hadn’t been said yet.) Perhaps this really was the series I’d been hoping The Newsroom would turn out to be when I’d first heard about it a couple months ago: a full-throated evisceration of fluff and reportorial false modesty disguised as “objective” news.

I really should’ve known better. To anyone who’s watched at least an episode or two of The West Wing, it is immediately clear that Sorkin desperately wants to believe in something. Problematically, he often explores this desire vicariously via nattily-attired male characters who passionately exchange juvenile tropes and platitudes, usually while striding briskly down a hallway, dodging Xerox machines and the occasional stray secretary. You can tell Sorkin feels a little sheepish about this boyish optimism, because – at least in The Newsroom, where a fleeting moment of cynicism occasionally breaks through his otherwise cloudy self-assurance – the character on the receiving end of the inspirational mini-speech often responds with just the sort of sarcastic aside Sorkin guesses a cynic might use.

But even this hedging of bets can’t take the edge off his innate bullishness on life: inevitably, the cynic is won over in the end – of the scene or the episode, never mind the season. I distinctly remember the final minutes of one episode of The West Wing (early in season two, I believe) in which most of the major characters are drinking beers on a brownstone stoop late into the evening. Josh Lyman is telling a story whose moral ultimately boils down to “America, Fuck Yeah,” and each of his enraptured listeners, speaking in solemn, hushed tones, responds in turn, “God bless America.” (“God bless America.” “God bless America.”) Ladies and gentlemen, Aaron Sorkin. So yes, while The Newsroom’s two main characters verbally bludgeon each other in the age-old fight between integrity and popularity, Sorkin long ago waved the white flag. Nielsen ratings, you see.

I bring this up because, providentially or otherwise, around the same time I first watched the pilot episode of The Newsroom, I’d also begun reading, at a friend’s recommendation, Neil Postman’s classic, Amusing Ourselves to Death: Public Discourse in the Age of Show Business. Caveat: I’m still only a third of the way through the book, but that’s far enough along to help me start tracing a common thread through a mélange of seemingly disparate entities, from Sorkinist idealism to Jay Rosen’s “View from Nowhere” to Ricky Gervais’ TV show Extras to New York Times public editor Arthur Brisbane’s confusion to, yes, Anderson Cooper’s sexuality.

Let’s start with Postman. In Amusing Ourselves to Death, he distinguishes between what he dubs “television’s junk” on the one hand versus what self-serving journalists might call “serious television” on the other. “I raise no objection to television’s junk. The best things on television are its junk, and no one and nothing is seriously threatened by it,” he reassures us, but then warns, “Besides, we do not measure a culture by its output of undisguised trivialities but by what it claims as significant.”

The question, then, is which category can most accurately lay claim to The Newsroom. I think I could venture an uneducated guess as to Postman’s take: whichever category doesn’t include the Lincoln-Douglas debates, for starters; whichever one does include TV shows about TV shows about the news, as a follow-up. Clearly, Sorkin and Postman wouldn’t see eye to eye on this (nor on anything else, most likely). On the one hand, Sorkin can easily be dismissed. No creator prefers to think of his invention as “television’s junk.”

On the other hand, a TV series that launches an honest attempt to take on the absurdities of its own medium warrants respect if executed correctly. I don’t watch a lot of television, but in terms of creating a legitimate space for introspection and self-reflection, it’s hard for me to come up with a better example than British comedian Ricky Gervais’ hit show Extras.

The first season, while hilarious, isn’t particularly notable on a deeper level, but it’s the second (and final) season that really turns the corner into a full-frontal assault on television entertainment. There must be no sweeter irony than pillorying BBC TV executives as slavish devotees of the almighty bottom line on a show financed and aired by that very same company. This was form making sweet, sweet love to content.

If, as Postman (himself paraphrasing Marshall McLuhan) postulates, “the medium is the metaphor,” then Gervais seemed to grasp this lesson perfectly. Season two is a six-episode marathon portraying the slow, tortuous disintegration of an aspiring artiste into a self-loathing puppet spouting catchphrases in a desperate, cloying attempt to placate his overlords and stave off the fast-approaching death of his TV celebrity. It’s a remarkably pathetic descent, rendered all the more so by the oddly moving spectacle of Gervais’ character clumsily pirouetting through increasingly incoherent rationalizations so as to shield himself from the reality of his self-annihilation.

And then, just like that, after twelve episodes and one Christmas special, Ricky Gervais and his brainchild, Extras, bowed out, almost assuredly leaving money on the table. Nothing more needed to be said. To do otherwise would have been to jeopardize the credibility of his critique and, paradoxically, would have turned his real-life series into a self-parody, life imitating art. No, then. Leave the sequels to pirates and superheroes.

It is against this mental backdrop of mine that Aaron Sorkin was unlucky enough to submit his latest entry. Reciting trite clichés in steady vocal crescendos makes for entertaining television. It may even make for great television. But great television – even the best thing on TV, Postman reminds us – is the junk. TV Sorkin-land occupies a world just a few ladder rungs above the tundra of laugh tracks and catchphrases, ambitious enough to fancy itself serious but oblivious beyond measure to its startling irrelevance. I can envision, sometime in 2020, a season nine where a thoroughly sincere Will McAvoy rails against the frivolous pursuit of Nielsen ratings and advertising dollars, and I can envision myself, years before, having thrown my remote control through the wall.

Even a show like Extras is probably not what Postman had in mind when he discussed the things “[a culture] claims as significant.” Indeed, his keen eye was trained on the news desk, the anchor’s chair, the endlessly scrolling ticker. This was then, and still is now, the domain of “Very Serious People” (to borrow Paul Krugman’s phrase). And yet television news today is dominated by uber-partisan hatchet men on the one side and self-described “neutral” journalists on the other. The former star in shows like CNN’s ill-fated Crossfire, while the latter’s considerable terror of accidentally importing facts into fully contrived controversies leads them to abandon the task altogether and question, instead, whether the presidential candidates prefer iPhones to BlackBerries.

This is exactly what Postman had feared in his worst dystopian nightmare. Invoking the dichotomously grim futures envisioned in 1984 and Brave New World, Postman wrote: “Orwell feared that the truth would be concealed from us. Huxley feared the truth would be drowned in a sea of irrelevance. Orwell feared we would become a captive culture. Huxley feared we would become a trivial culture.”

By 1985, when Amusing Ourselves to Death was first published, Postman was convinced Huxley’s vision had carried the day. What he might not have anticipated at the time was the retrogressive effect TV news would exert even on its older counterparts. (Or maybe he did: again, I’m only one-third finished.) It’s no longer just CNN throwing out more election-night holograms while FOX and MSNBC exchange clumps of angry spittle. The disease has spread backwards, infecting the previously immune printed press.

Among its victims is none other than the Grey Lady herself, the New York Times. Its public editor, Arthur Brisbane, recently ignited an Internet firestorm with his sincerely titled column, “Should The Times Be a Truth Vigilante?” The online response was rapid, voluminous, and overwhelmingly of one mind: thankfully, virtually everyone was incredulous that the question even had to be asked. Brisbane’s query was a classic embodiment of Jay Rosen’s “View from Nowhere”: assuming, sans verification, that every story has two equally valid sides. As The Newsroom’s MacKenzie rightly noted, some stories have five sides. Some have one. But simply serving as the court stenographer, which was bad enough in the pre-Internet era, isn’t being fair anymore. It’s being lazy. Mostly, it’s being scared.

To the Times’ credit, Brisbane is its public editor, meaning he operates independently of all other staff. But a brief skimming of an average day’s news coverage makes it immediately obvious that the problem is widespread. To use one infamous example from relatively recent history, the paper’s longtime refusal to use the word “torture” to describe waterboarding spawned so much criticism that a satirical web app calling itself the “New York Times Torture Euphemism Generator” sprang up: one could refresh the page to yield various phrases like “enhanced physical audits” and “elevated nipple scrutiny.” Ironically, the official explanation from the Times’ then-public editor for the paper’s linguistic aversion to “torture” inadvertently reinforced its critics’ justified perception that the paper insists on perpetuating false equivalencies: “The Times is displeasing some who think ‘brutal’ is just a timid euphemism for torture and their opponents who think ‘brutal’ is too loaded.” (Because waterboarding isn’t brutal if it’s done fewer than 183 times per person. Look it up.)

It is perhaps more interesting to imagine Neil Postman’s take on the Internet as it exists today. As early as 1995, in an interview with Charlayne Hunter-Gault on PBS’ NewsHour, Postman expressed his alarm at the then-novel idea of an “information superhighway”: “I often wonder if this doesn’t signify the end of any meaningful community life.” (In a twist he could easily appreciate, this very interview stands today as a testament to a bygone era, one in which in-depth discussions of theoretical import could be shown on national TV and people would actually watch.) He conceded the interactive nature of the Internet, which distinguished it from the passivity of watching television, but feared – accurately – that it would nevertheless lead to a surge in tribalism (foreshadowing Cass Sunstein’s “information cocoons”) and actually divide the global community while claiming to unite it. De-contextualization – the commodification of information as a standalone product, utterly divorced from personal or even local significance – was a primary concern of Postman’s. And that’s where we jump to Anderson Cooper’s sexuality.


As a preemptive disclaimer, I happen to like Cooper more than just about anyone else doing news on TV today. (Not counting Jon Stewart and Stephen Colbert, who together represent a nearly perfect antidote to what disgusted Postman, trivialities masquerading as something culturally significant: Stewart and Colbert are cultural signifiers masquerading as triviality.) But this doesn’t alter the fact that the recent “news” of his homosexuality embodied everything wrong with the entertainment-conglomerate approach to TV news.

Cooper is, quite literally, a TV celebrity. He’s famous solely by virtue of his position as someone who appears regularly on TV. It’s notable to what extent that trajectory alone – from TV presence to fame, and not the other way around – so profoundly contrasts with the print press. How many people would recognize Bill Keller walking down the street? How about Jill Abramson? The medium of television made Anderson Cooper who he is, and so it is only fitting that his self-outing should light up the television and computer screens of people all over the world in return. That Anderson Cooper’s sexuality bears no personal significance for any of these people is completely missed in the rush to retell and re-tweet the “breaking news.”

This may look like a tempest in a teapot, except that human attention spans are finite containers. Spending time talking about Anderson Cooper’s sexuality necessarily detracts from the available time and mental effort required to understand something else that might have infinitely more personal relevance. Worse yet, it conditions us to start categorizing stories like these as “news.” Not only is information out of context now an acceptable subject of extended discussion, but the kind of wonky media criticism that Postman launched into with Hunter-Gault in 1995 now seems strangely quaint, a relic of a simpler, more boring time. The financial troubles of many of our historical newspapers signal the emergence of a culture that’s moved on from the world of facts and figures and swept straight into a sea of colors and noise and lights. And tickers. Endlessly scrolling tickers.

Will McAvoy wasn’t wrong to locate the media’s failure in its inability to favor facts over a dubious balancing act that ignores the central issues. But Sorkin was wrong to implicitly position The Newsroom as intellectually significant when, so far at least, it’s really nothing more than a very conventional sitcom. Nothing more than junk television. Which just might make it the best thing on TV.

Then there were Path, Pinterest, and Highlight: do we have too many social networks?

First it was Facebook and Twitter. (Or not even “first,” since those products were actually preceded by Friendster, MySpace, et al.) Now it’s Foursquare, Tumblr, Flickr, Pinterest, Path, and Highlight. Social networks are proliferating, but the addition of each new app/network is diluting the quality of the whole social experience.

In some (but not all) ways, social networking is a natural monopoly. The best customer experience is only possible when a large number of people are using the same service. The more fractured the space becomes, the less likely any of your friends are to be using the specific network you prefer. And this makes the entire social experience less valuable — both for prospective social media users getting involved for the first time and for existing users looking to expand their digital footprint.
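One way to make that concrete is to count possible connections: by a rough Metcalfe-style estimate, a single network with n users supports about n(n-1)/2 pairwise links, while splitting the same people across several networks supports far fewer. A toy illustration (the numbers are made up):

```python
# Toy illustration of fragmentation: how many friend-to-friend links are
# possible when the same group is on one network vs. split across several.
def possible_links(n):
    """Pairwise connections among n users (Metcalfe-style n*(n-1)/2)."""
    return n * (n - 1) // 2

friends = 120
print("one shared network:", possible_links(friends))           # 7140
print("split across four: ", 4 * possible_links(friends // 4))  # 1740
```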

I just downloaded Path on my Android smartphone the other day and, while I must admit that I haven’t spent much time with it, it’s a little unclear to me why anyone should bother using this over, say, Facebook, which already does the same thing (and more) and has the additional value of being used by nearly everyone I know. There is some merit in using multiple online tools — Facebook, for example, has yet to establish a blogging service capable of wooing customers away from Tumblr, WordPress, Blogspot, and the like — but much of today’s social sphere is simply redundant. Foursquare, or Facebook check-ins, or Google Latitude? They all do basically the same thing with varying degrees of success, but the existence of all three of them means that, at any given moment, relatively few of my friends are using any specific one of them.

Of course, the counterargument is that the presence of this competition is the very driver of innovation in the field. This is undoubtedly true, and perhaps more importantly, there is no good way (nor would it be a remotely good idea) to force everyone to use a specific service anyway. But it is starting to feel as if the hyperactivity in social media these days is reducing the quality for everyone. I suppose the best we can hope for is that the presence of all these startups will force the big names like Facebook and Google to incorporate more of the best ideas into their own products. That is hardly an ideal free-market scenario, but it may be the best option we have at the moment.

A brief thought about inverted scrolling on Mac OS X Lion

I downloaded and installed Mac OS X Lion yesterday, and probably the most immediately noticeable update is the way in which you use the trackpad to scroll. Previously, if you wanted to see text below the bottom of the visible page, you swiped with two fingers in a downward direction, and the page would scroll down in response. Now, the default has switched so that, to see more text below, you must swipe two fingers up, not down (and vice versa).

There is a certain logic to this. First of all, in direct opposition to the title of this post, it’s not really inverted scrolling: the way we’ve always scrolled is actually the inverted version, and this update “corrects” that. We are moving definitively in the direction of Apple Singularity: the convergence of user experience across all Apple products. Due to the massive rise of the iPhone and iPad, both of which are entirely touch-based, users have grown used to swiping down to scroll up and vice versa, because it feels natural to do so when you’re interacting directly with a screen.

The problem I see with trying to integrate the iPhone/iPad experience with that of a MacBook Pro, for example, is that there is a long history of user interaction with computer visuals, and that history is completely unaccustomed to the new “inverted scrolling” method. Most obviously, to move a cursor around a screen, you don’t touch the screen: you use the trackpad. But when using this trackpad, since you’re not physically (and I use the word “physically” here in a metaphorical sense, not literally) swiping a page up and down like you do on the iPhone or iPad, the natural expectation is for the cursor to move in the same direction as your finger movements.

This is still exactly what happens when you’re simply moving the cursor. But now, the moment you want to scroll, you’re forced to override your instincts and scroll in the opposite direction of what you want to see, despite the fact that the act of moving the cursor is handled exactly oppositely. So you basically have two sets of trackpad rules for the same device, a MacBook Pro. On the iPhone and iPad, you only have one set of rules, which is to scroll in the opposite direction of what you want to see, and it feels natural because a) those devices have always only worked that way, and b) due to the touchscreen, you actually feel as if you’re swiping a physical piece of paper up or down, which would correspond perfectly to the movements you’re making on the screen. On the computer, you’re using a trackpad; you’re not swiping on the screen, so there’s already a disconnect between your finger movements and what’s happening on the screen.
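Stripped of the hardware, the change boils down to a sign flip in how a trackpad gesture maps onto page movement. Here is a toy sketch of the two conventions (the function and variable names are my own, not Apple’s actual APIs):

```python
# Toy model of the two scroll conventions. `offset` is how far down the
# document the viewport currently sits; `finger_dy` is positive for an
# upward two-finger swipe. (Hypothetical names, not Apple's API.)
def scroll(offset, finger_dy, natural=True):
    if natural:
        # Lion default / iOS: content follows the finger, so swiping up
        # pushes the page content up and reveals what lies below.
        return offset + finger_dy
    # Traditional: swiping up moves the viewport up, revealing what lies above.
    return offset - finger_dy

print(scroll(1000, +50, natural=True))   # 1050: deeper into the page
print(scroll(1000, +50, natural=False))  # 950: back toward the top
```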

By throwing “inverted scrolling” (which can be changed, by the way; it’s only a default) into the mix, Apple is either betting that this will catch on long-term on laptops as well, or they’re not particularly concerned with placating their laptop users — a possibility that seems increasingly plausible, given the incredible revenue growth of their touchscreen devices. I actually haven’t even switched back to the old settings, because in a way I agree: we are inexorably marching towards a touchscreen future (even though, I hope, laptops won’t disappear completely), and on some level it does make sense for all the scrolling rules to work similarly, no matter the device. It just feels a little strange and unnatural on a laptop, where we’ve had years to get used to another system and where we continue to use that old system when it comes to moving the cursor but have switched to the new one for scrolling.