The Verge - Translating Seinfeld 5-15min

More so than the average American sitcom, Seinfeld has had difficulty reaching global audiences. While it’s popular in Latin America, it hasn’t been widely accepted in Germany, France, Italy, or the Netherlands. Two decades after it went off the air, Seinfeld remains relevant to American audiences — thanks in part to omnipresent syndicated reruns — but in much of Europe it is considered a cult hit, commonly relegated to deep-late-night time slots. Its humor, it seems, is just too complicated, too cultural and word-based, to make for easy translation. ... Jokes are the hardest things to translate into another language, another culture, another world. A good script for dubbing an American sitcom for foreign consumption does more than literally translate. It manages to convey the same meaning, the same feeling, the same story — the same direct hit to the lower frontal lobes of the brain that produces a laugh, even though those frontal lobes are steeped in a completely different cultural brew. ... Lip-synch dubbing, despite its ultimate benefits, can get very complicated. It’s not just that the lines may not translate directly — they also have to take just as long to say in both languages and approximate, as closely as possible, the lip movements of the original actors. That can pose an added challenge when translating from laconic languages like English into verbose languages like German.

Fortune - Why Deep Learning Is Suddenly Changing Your Life 13min

The most remarkable thing about neural nets is that no human being has programmed a computer to perform any of the stunts described above. In fact, no human could. Programmers have, rather, fed the computer a learning algorithm, exposed it to terabytes of data—hundreds of thousands of images or years’ worth of speech samples—to train it, and have then allowed the computer to figure out for itself how to recognize the desired objects, words, or sentences. ... Neural nets aren’t new. The concept dates back to the 1950s, and many of the key algorithmic breakthroughs occurred in the 1980s and 1990s. What’s changed is that today computer scientists have finally harnessed both the vast computational power and the enormous storehouses of data—images, video, audio, and text files strewn across the Internet—that, it turns out, are essential to making neural nets work well. ... That dramatic progress has sparked a burst of activity. Equity funding of AI-focused startups reached an all-time high of more than $1 billion last quarter, according to the CB Insights research firm. There were 121 funding rounds for such startups in the second quarter of 2016, compared with 21 in the equivalent quarter of 2011, the firm says. More than $7.5 billion in total investments has been made during that stretch—with more than $6 billion of that coming since 2014. ... The hardware world is feeling the tremors. The increased computational power that is making all this possible derives not only from Moore’s law but also from the realization in the late 2000s that graphics processing units (GPUs) made by Nvidia—the powerful chips that were first designed to give gamers rich, 3D visual experiences—were 20 to 50 times more efficient than traditional central processing units (CPUs) for deep-learning computations. ... Think of deep learning as a subset of a subset. “Artificial intelligence” encompasses a vast range of technologies—like traditional logic and rules-based systems—that enable computers and robots to solve problems in ways that at least superficially resemble thinking. Within that realm is a smaller category called machine learning, which is the name for a whole toolbox of arcane but important mathematical techniques that enable computers to improve at performing tasks with experience. Finally, within machine learning is the smaller subcategory called deep learning.
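
The training process the excerpt describes (write the learning algorithm, supply labeled examples, and let the computer find the recognition rules itself) can be shown at toy scale. Below is a minimal sketch, not code from the article or from any of the systems it mentions: a two-layer neural net written in plain numpy learns the XOR function from four labeled examples by gradient descent. The network size, learning rate, and toy task are all illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

# Four labeled examples of the XOR function: the training "data" in miniature.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

# Randomly initialized weights: the programmer encodes no XOR rule here.
W1 = rng.normal(size=(2, 8))
b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1))
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0  # learning rate, an illustrative choice
for step in range(10000):
    # Forward pass: the net's current guesses for every example.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: gradients of the squared error through both layers.
    d_out = (out - y) * out * (1.0 - out)
    d_h = (d_out @ W2.T) * h * (1.0 - h)
    # Gradient-descent update: the "learning" the excerpt refers to.
    W2 -= lr * (h.T @ d_out)
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h)
    b1 -= lr * d_h.sum(axis=0)

# After training, the outputs approach [0, 1, 1, 0].
print(np.round(sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2), 2))

No line of this program states what XOR is; the rule emerges in the weights through repeated updates. Scaling the same recipe to millions of weights and terabytes of images or speech is, in essence, what the article calls deep learning.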

AI glossary