The internet promised to feed our minds with knowledge. What have we learned? That our minds need more than that ... My point is not that we should return to some romanticised preindustrial past: I mean only to draw attention to contradictions that still shape our post-industrial present. The physical violence of the 19th-century factory might be gone, at least in the countries where industrialisation began, but the alienation inherent in these ways of organising work remains. ... When the internet arrived, it seemed to promise a liberation from the boredom of industrial society, a psychedelic jet-spray of information into every otherwise tedious corner of our lives. In fact, at its best, it is something else: a remarkable helper in the search for meaningful connections. But if the deep roots of boredom are in a lack of meaning, rather than a shortage of stimuli, and if there is a subtle, multilayered process by which information can give rise to meaning, then the constant flow of information to which we are becoming habituated cannot deliver on such a promise. At best, it allows us to distract ourselves with the potentially endless deferral of clicking from one link to another. Yet sooner or later we wash up downstream in some far corner of the web, wondering where the time went. The experience of being carried on these currents is quite different to the patient, unpredictable process that leads towards meaning.
The rise of the internet and the widespread availability of digital technology have surrounded us with endless sources of distraction: texts, emails and Instagrams from friends, streaming music and videos, ever-changing stock quotes, news and more news. To get our work done, we could try to turn off the digital stream, but that’s difficult to do when we’re plagued by FOMO, the modern fear of missing out. Some people think that our willpower is so weak because our brains have been damaged by digital noise. But blaming technology for the rise in inattention is misplaced. History shows that the disquiet is fuelled not by the next new thing but by the threat this thing – whatever it might be – poses to the moral authority of the day. ... The first time inattention emerged as a social threat was in 18th-century Europe, during the Enlightenment, just as logic and science were pushing against religion and myth. The Oxford English Dictionary cites a 1710 entry from the Tatler as its first reference to this word, coupling inattention with indolence; both are represented as moral vices of serious public concern. ... the culture of the Enlightenment celebrated attention as the most important mental faculty for the exercise of reason. ... Countering the habit of inattention among children and young people became the central concern of pedagogy in the 18th century. ... Unlike in the 18th century, when it was perceived as abnormal, today inattention is often presented as the normal state. The current era is frequently characterised as the Age of Distraction, and inattention is no longer depicted as a condition that afflicts a few. Nowadays, the erosion of humanity’s capacity for attention is portrayed as an existential problem, linked with the allegedly corrosive effects of digitally driven streams of information relentlessly flowing our way. ... Throughout its history, inattention has served as a sublimated focus for apprehensions about moral authority.
For a decade and a half, I’d been a web obsessive, publishing blog posts multiple times a day, seven days a week, and ultimately corralling a team that curated the web every 20 minutes during peak hours. Each morning began with a full immersion in the stream of internet consciousness and news, jumping from site to site, tweet to tweet, breaking news story to hottest take, scanning countless images and videos, catching up with multiple memes. Throughout the day, I’d cough up an insight or an argument or a joke about what had just occurred or what was happening right now. And at times, as events took over, I’d spend weeks manically grabbing every tiny scrap of a developing story in order to fuse them into a narrative in real time. I was in an unending dialogue with readers who were caviling, praising, booing, correcting. My brain had never been so occupied so insistently by so many different subjects and in so public a way for so long. ... I was, in other words, a very early adopter of what we might now call living-in-the-web. And as the years went by, I realized I was no longer alone. Facebook soon gave everyone the equivalent of their own blog and their own audience. More and more people got a smartphone — connecting them instantly to a deluge of febrile content, forcing them to cull and absorb and assimilate the online torrent as relentlessly as I had once. ... Then the apps descended, like the rain, to inundate what was left of our free time. It was ubiquitous now, this virtual living, this never-stopping, this always-updating. ... the insanity was now banality ... We almost forget that ten years ago, there were no smartphones, and as recently as 2011, only a third of Americans owned one. Now nearly two-thirds do. That figure reaches 85 percent when you’re only counting young adults. And 46 percent of Americans told Pew surveyors last year a simple but remarkable thing: They could not live without one. ... By rapidly substituting virtual reality for reality, we are diminishing the scope of this interaction even as we multiply the number of people with whom we interact.
Harris is the closest thing Silicon Valley has to a conscience. As the co-founder of Time Well Spent, an advocacy group, he is trying to bring moral integrity to software design: essentially, to persuade the tech world to help us disengage more easily from its devices. ... While some blame our collective tech addiction on personal failings, like weak willpower, Harris points a finger at the software itself. That itch to glance at our phone is a natural reaction to apps and websites engineered to get us scrolling as frequently as possible. The attention economy, which showers profits on companies that seize our focus, has kicked off what Harris calls a “race to the bottom of the brain stem.” ... we’ve lost control of our relationship with technology because technology has become better at controlling us. ... He studied computer science at Stanford while interning at Apple, then embarked on a master’s degree at Stanford, where he joined the Persuasive Technology Lab. Run by the experimental psychologist B. J. Fogg, the lab has earned a cultlike following among entrepreneurs hoping to master Fogg’s principles of “behavior design” — a euphemism for what sometimes amounts to building software that nudges us toward the habits a company seeks to instill. ... Sites foster a sort of distracted lingering partly by lumping multiple services together.