How the massive diesel fraud incinerated VW’s reputation—and will hobble the company for years to come. ... “Hoax,” of course, is a layman’s word. But plenty of legal terms also arguably apply, including “consumer fraud” and “false advertising.” They are fueling an explosion of litigation. That and the horrific reputational damage are subjecting Volkswagen to one of the gravest challenges in its nearly 80-year history. ... The U.S. Department of Justice and the EPA have filed a civil suit that could theoretically subject VW to up to $45 billion in fines (though, in fairness, no one expects penalties quite that draconian). The DOJ and the EPA are also pursuing a criminal inquiry, as are prosecutors in Germany, France, Italy, Sweden, and South Korea. All 50 state attorneys general in the U.S. are also on the warpath, armed with state laws that, nominally at least, are every bit as crushing as the federal law. ... All of that comes on top of more than 500 class actions filed on behalf of owners and lessees of Volkswagen diesel cars ... VW’s misbehavior did not come out of nowhere. The company has a history of scandals and episodes in which it skirted the law. Each time—till now—it has escaped without dire consequences. ... VW is driven by a ruthless, overweening culture. Under Ferdinand Piëch and his successors, the company was run like an empire, with overwhelming control vested in a few hands, marked by a high-octane mix of ambition and arrogance—and micromanagement—all set against a volatile backdrop of epic family power plays, liaisons, and blood feuds. It’s a culture that mandated success at all costs.
An accelerating field of research suggests that most of the artificial intelligence we’ve created so far has learned enough to give a correct answer, but without truly understanding the information. And that means it’s easy to deceive. ... Machine learning algorithms have quickly become the all-seeing shepherds of the human flock. This software connects us on the internet, monitors our email for spam or malicious content, and will soon drive our cars. To deceive them would be to shift the tectonic underpinnings of the internet, and could pose even greater threats to our safety and security in the future. ... Small groups of researchers—from Pennsylvania State University to Google to the U.S. military—are devising, and defending against, potential attacks that could be carried out on artificially intelligent systems. In the attacks theorized in this research, an adversary could change what a driverless car sees. Or they could activate voice recognition on any phone and make it visit a website hosting malware, using audio that sounds like nothing but white noise to humans. Or they could let a virus travel through a firewall into a network. ... Instead of taking the controls of a driverless car, this method shows it a kind of hallucination—images that aren’t really there. ... “We show you a photo that’s clearly a photo of a school bus, and we make you think it’s an ostrich,” says Ian Goodfellow, a researcher at Google who has driven much of the work on adversarial examples.
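The school-bus-to-ostrich trick Goodfellow describes comes from his group's "fast gradient sign method": nudge every input feature a tiny step in the direction that most increases the model's loss, and a confident classification can collapse. The sketch below is a deliberately toy illustration of that idea, assuming a hand-set logistic regression in place of a real image model; the weights, inputs, and epsilon value are invented for the example and are not from any of the research described above.

```python
import math

# Toy illustration of the fast gradient sign method (FGSM).
# The "classifier" is a hand-set logistic regression over four
# made-up "pixels" -- purely illustrative, not a real image model.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def predict(x, w, b):
    """Model's probability that input x belongs to class 1."""
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)

def sign(v):
    return 1.0 if v > 0 else -1.0 if v < 0 else 0.0

def fgsm_perturb(x, y, w, b, epsilon):
    """Return an adversarial copy of x.

    For logistic regression with cross-entropy loss, the gradient
    of the loss with respect to the input is (p - y) * w, where p
    is the predicted probability. FGSM steps each feature by
    epsilon in the sign of that gradient.
    """
    p = predict(x, w, b)
    return [xi + epsilon * sign((p - y) * wi) for xi, wi in zip(x, w)]

# Invented weights and input: the model is confident x is class 1.
w = [2.0, -1.0, 1.5, -0.5]
b = 0.0
x = [0.9, 0.1, 0.8, 0.2]

clean_p = predict(x, w, b)          # roughly 0.94: confident
adv_x = fgsm_perturb(x, 1.0, w, b, epsilon=0.6)
adv_p = predict(adv_x, w, b)        # roughly 0.45: label flips

print(f"clean confidence: {clean_p:.3f}")
print(f"adversarial confidence: {adv_p:.3f}")
```

The point of the sketch is how small the per-feature change is relative to the damage done: each "pixel" moves by at most epsilon, yet the coordinated direction of those moves is enough to push the model across its decision boundary. On real high-dimensional images the same effect works with perturbations too small for a human to notice.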
One Thursday in January 2001, Maksym Igor Popov, a 20-year-old Ukrainian man, walked nervously through the doors of the United States embassy in London. While Popov could have been mistaken for an exchange student applying for a visa, in truth he was a hacker, part of an Eastern European gang that had been raiding US companies and carrying out extortion and fraud. A wave of such attacks portended a new kind of cold war between the US and organized criminals in the former Soviet bloc, and Popov, baby-faced and pudgy, with glasses and a crew cut, was about to become the conflict’s first defector. ... The once-friendly FBI agents threw Popov in an isolation room, then returned an hour later with a federal prosecutor, a defense attorney, and a take-it-or-leave-it offer: Popov was going to be their informant, working all day, every day, to lure his crime partners into an FBI trap. If he refused, he’d go to prison. ... Popov was shocked. He’d been played for a durak—a fool. He was placed under 24-hour guard at an FBI safe house in Fair Lakes, Virginia, and instructed to talk to his friends in Russian chat rooms while the bureau recorded everything. But Popov had some tricks of his own. He pretended to cooperate while using Russian colloquialisms to warn his associates that he’d been conscripted into a US government sting. ... There seemed no escape from a future of endless jail cells and anonymous American courtrooms. ... Except that in a backwater FBI office in Santa Ana, California, an up-and-coming agent named Ernest “E. J.” Hilbert saw that the government needed Popov more than anyone knew. ... They called the operation Ant City. Now that he was back online, Popov adopted a new identity and began hanging out in underground chat rooms and posting on CarderPlanet, portraying himself as a big-time Ukrainian scammer with an insatiable hunger for stolen credit cards. ...
One thing Popov had always known about Eastern European hackers: All they really wanted was a job.
Lying, it turns out, is something that most of us are very adept at. We lie with ease, in ways big and small, to strangers, co-workers, friends, and loved ones. Our capacity for dishonesty is as fundamental to us as our need to trust others, which ironically makes us terrible at detecting lies. Being deceitful is woven into our very fabric, so much so that it would be truthful to say that to lie is human. ... The ubiquity of lying was first documented systematically by Bella DePaulo, a social psychologist at the University of California, Santa Barbara. Two decades ago, DePaulo and her colleagues asked 147 adults to jot down for a week every instance in which they tried to mislead someone. The researchers found that the subjects lied on average one or two times a day. Most of these untruths were innocuous, intended to hide one’s inadequacies or to protect the feelings of others. Some lies were excuses—one subject blamed a failure to take out the garbage on not knowing where it needed to go. Still other lies—such as a claim of being a diplomat’s son—were aimed at presenting a false image. ... That human beings should universally possess a talent for deceiving one another shouldn’t surprise us. Researchers speculate that lying as a behavior arose not long after the emergence of language. The ability to manipulate others without using physical force likely conferred an advantage in the competition for resources and mates, akin to the evolution of deceptive strategies in the animal kingdom, such as camouflage.