Ernest Hemingway writes in the bedroom of his house in the Havana suburb of San Francisco de Paula. He has a special workroom prepared for him in a square tower at the southwest corner of the house, but prefers to work in his bedroom, climbing to the tower room only when “characters” drive him up there. ... A working habit he has had from the beginning, Hemingway stands when he writes. He stands in a pair of his oversized loafers on the worn skin of a lesser kudu—the typewriter and the reading board chest-high opposite him. ... He keeps track of his daily progress—“so as not to kid myself”—on a large chart made out of the side of a cardboard packing case and set up against the wall under the nose of a mounted gazelle head. The numbers on the chart showing the daily output of words differ from 450, 575, 462, 1250, back to 512, the higher figures on days Hemingway puts in extra work so he won’t feel guilty spending the following day fishing on the Gulf Stream. ... This dedication to his art may suggest a personality at odds with the rambunctious, carefree, world-wheeling Hemingway-at-play of popular conception. The fact is that Hemingway, while obviously enjoying life, brings an equivalent dedication to everything he does—an outlook that is essentially serious, with a horror of the inaccurate, the fraudulent, the deceptive, the half-baked.
The brain’s craving for novelty, constant stimulation and immediate gratification creates something called a “compulsion loop.” Like lab rats and drug addicts, we need more and more to get the same effect. ... Endless access to new information also easily overloads our working memory. When we reach cognitive overload, our ability to transfer learning to long-term memory significantly deteriorates. ... we humans have a very limited reservoir of will and discipline. We’re far more likely to succeed by trying to change one behavior at a time, ideally at the same time each day, so that it becomes a habit, requiring less and less energy to sustain.
As self-help workshops go, Applied Rationality’s is not especially accessible. The center’s three founders — Julia Galef, Anna Salamon and Smith — all have backgrounds in science or math or both, and their curriculum draws heavily from behavioral economics. Over the course of the weekend, I heard instructors invoke both hyperbolic discounting (a mathematical model of how people undervalue long-term rewards) and prospect theory (developed by the psychologists Daniel Kahneman and Amos Tversky to capture how people inaccurately weigh risky probabilities). But the premise of the workshop is simple: Our minds, cobbled together over millenniums by that lazy craftsman, evolution, are riddled with bad mental habits. ... Some of these problems are byproducts of our brain’s reward system. ... logical errors may be easy to spot in others, the group says, but they’re often harder to see in ourselves. The workshop promised to give participants the tools to address these flaws, which, it hinted, are almost certainly worse than we realize. ... Most self-help appeals to us because it promises real change without much real effort, a sort of fad diet for the psyche. ... CFAR’s focus on science and on tiresome levels of practice can seem almost radical. It has also generated a rare level of interest among data-driven tech people and entrepreneurs who see personal development as just another optimization problem, if a uniquely central one. Yet, while CFAR’s methods are unusual, its aspirational promise — that a better version of ourselves is within reach — is distinctly familiar. The center may emphasize the benefits that will come to those who master the techniques of rational thought, like improved motivation and a more organized inbox, but it also suggests that the real reward will be far greater, enabling users to be more intellectually dynamic and nimble. ... CFAR’s original mandate was to give researchers the mental tools to overcome their unconscious assumptions. ... What makes CFAR novel is its effort to use those same principles to fix personal problems: to break frustrating habits, recognize self-defeating cycles and relentlessly interrogate our own wishful inclinations and avoidant instincts.
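For readers who want the model behind the name, here is a minimal sketch of hyperbolic discounting in its standard form; the discount rate and dollar amounts below are illustrative assumptions, not figures from the article or from CFAR's curriculum.

```latex
% Hyperbolic discounting (standard form; shown for reference, not taken from
% the article): the present value V of a reward of magnitude A delivered after
% a delay D, where k is an individually fitted discount-rate parameter.
\[
  V = \frac{A}{1 + kD}
\]
% Illustrative arithmetic with an assumed k = 0.1 per day: a $100 reward due
% in 30 days is valued at 100 / (1 + 0.1 * 30) = $25 today, while the same
% reward due tomorrow is valued at 100 / 1.1, roughly $91. Steep near-term
% discounting of this kind is why small immediate payoffs so often win out
% over much larger delayed ones.
```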
The rise of the internet and the widespread availability of digital technology has surrounded us with endless sources of distraction: texts, emails and Instagrams from friends, streaming music and videos, ever-changing stock quotes, news and more news. To get our work done, we could try to turn off the digital stream, but that’s difficult to do when we’re plagued by FOMO, the modern fear of missing out. Some people think that our willpower is so weak because our brains have been damaged by digital noise. But blaming technology for the rise in inattention is misplaced. History shows that the disquiet is fuelled not by the next new thing but by the threat this thing – whatever it might be – poses to the moral authority of the day. ... The first time inattention emerged as a social threat was in 18th-century Europe, during the Enlightenment, just as logic and science were pushing against religion and myth. The Oxford English Dictionary cites a 1710 entry from Tatler as its first reference to this word, coupling inattention with indolence; both are represented as moral vices of serious public concern. ... the culture of the Enlightenment celebrated attention as the most important mental faculty for the exercise of reason. ... Countering the habit of inattention among children and young people became the central concern of pedagogy in the 18th century. ... Unlike in the 18th century when it was perceived as abnormal, today inattention is often presented as the normal state. The current era is frequently characterised as the Age of Distraction, and inattention is no longer depicted as a condition that afflicts a few. Nowadays, the erosion of humanity’s capacity for attention is portrayed as an existential problem, linked with the allegedly corrosive effects of digitally driven streams of information relentlessly flowing our way. ... Throughout its history, inattention has served as a sublimated focus for apprehensions about moral authority.
For a decade and a half, I’d been a web obsessive, publishing blog posts multiple times a day, seven days a week, and ultimately corralling a team that curated the web every 20 minutes during peak hours. Each morning began with a full immersion in the stream of internet consciousness and news, jumping from site to site, tweet to tweet, breaking news story to hottest take, scanning countless images and videos, catching up with multiple memes. Throughout the day, I’d cough up an insight or an argument or a joke about what had just occurred or what was happening right now. And at times, as events took over, I’d spend weeks manically grabbing every tiny scrap of a developing story in order to fuse them into a narrative in real time. I was in an unending dialogue with readers who were caviling, praising, booing, correcting. My brain had never been so occupied so insistently by so many different subjects and in so public a way for so long. ... I was, in other words, a very early adopter of what we might now call living-in-the-web. And as the years went by, I realized I was no longer alone. Facebook soon gave everyone the equivalent of their own blog and their own audience. More and more people got a smartphone — connecting them instantly to a deluge of febrile content, forcing them to cull and absorb and assimilate the online torrent as relentlessly as I had once. ... Then the apps descended, like the rain, to inundate what was left of our free time. It was ubiquitous now, this virtual living, this never-stopping, this always-updating. ... the insanity was now banality ... We almost forget that ten years ago, there were no smartphones, and as recently as 2011, only a third of Americans owned one. Now nearly two-thirds do. That figure reaches 85 percent when you’re only counting young adults. And 46 percent of Americans told Pew surveyors last year a simple but remarkable thing: They could not live without one. ... By rapidly substituting virtual reality for reality, we are diminishing the scope of this interaction even as we multiply the number of people with whom we interact.
When I returned to addiction, it was as a scientist studying the addicted brain. The data were indisputable: brains change with addiction. I wanted to understand how – and why. I wanted to understand addiction with fastidious objectivity, but I didn’t want to lose touch with its subjectivity – how it feels, how hard it is – in the process. ... One explanation is that addiction is a brain disease. The United States National Institute on Drug Abuse, the American Society of Addiction Medicine, and the American Medical Association ubiquitously define addiction as a ‘chronic disease of brain reward, motivation, memory and related circuitry’ ... If only the disease model worked. Yet, more and more, we find that it doesn’t. First of all, brain change alone isn’t evidence for brain disease. Brains are designed to change. ... we now know that drugs don’t cause addiction. ... One idea is that addicts voluntarily choose to remain addicted: if they don’t quit, it’s because they don’t want to. ... The view that addiction arises through learning, in the context of environmental forces, appears to be gathering momentum.
Thousands of subsequent experiments have confirmed (and elaborated on) this finding. As everyone who’s followed the research—or even occasionally picked up a copy of Psychology Today—knows, any graduate student with a clipboard can demonstrate that reasonable-seeming people are often totally irrational. Rarely has this insight seemed more relevant than it does right now. Still, an essential puzzle remains: How did we come to be this way? ... new book, “The Enigma of Reason” (Harvard), the cognitive scientists Hugo Mercier and Dan Sperber take a stab at answering this question. ... point out that reason is an evolved trait, like bipedalism or three-color vision. It emerged on the savannas of Africa, and has to be understood in that context. ... Stripped of a lot of what might be called cognitive-science-ese, Mercier and Sperber’s argument runs, more or less, as follows: Humans’ biggest advantage over other species is our ability to cooperate. Cooperation is difficult to establish and almost as difficult to sustain. For any individual, freeloading is always the best course of action. Reason developed not to enable us to solve abstract, logical problems or even to help us draw conclusions from unfamiliar data; rather, it developed to resolve the problems posed by living in collaborative groups. ... Presented with someone else’s argument, we’re quite adept at spotting the weaknesses. Almost invariably, the positions we’re blind about are our own.
Smoking rates were in decline among well-educated consumers in developed economies; to make up for slipping sales, the companies were raising prices, which they could do for only so long. Meanwhile, a growing number of customers were switching to e-cigarettes in the hope of escaping their addiction or preserving their health. The devices, which use battery-powered coils to vaporize nicotine-infused solutions, had leapt on the scene seemingly out of nowhere. One of the first commercially available e-cigarettes had been created circa 2003 as a smoking cessation device by a Chinese pharmacist whose father had died of lung cancer. By 2013 the e-cigarette market had $3.7 billion in annual sales, according to Euromonitor International, and was expanding rapidly. ... Philip Morris International scrambled to fashion newfangled nicotine-delivering devices that would catch the wandering eye of the restless tobacco consumer. ... Everywhere you look in the industry, companies are pouring money into product development while borrowing liberally from the style of Silicon Valley. ... Tobacco executives often sound like media owners talking about content. That is, they’re open to delivering their drug via whatever pipe the consumer chooses—be it e-cigarettes, heat-not-burn devices, gum, lozenges, dip, or some medium that hasn’t been invented yet.