Science and Tech

Book review: Everybody Lies 5/27/17

Article (book review): Everybody Lies  

The book Everybody Lies by Seth Stephens-Davidowitz looks at search data as a way to surface previously invisible correlations and connections. Example: the prevalence of searches for the term “n*gger” was the single best predictor of whether a region's voters would go for Trump in the 2016 GOP primaries. Search data is a game-changer because it gets at what people actually believe, not what they are willing to admit to a stranger with a clipboard.
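To make the "best predictor" claim concrete, here is a minimal sketch of the kind of analysis involved. Everything below is hypothetical: the prevalence values and vote shares are invented for illustration, while the book's actual work used Google search volumes and real primary results by region.

```python
# Hypothetical sketch: correlating regional search-term prevalence with
# primary vote share. All numbers below are made up for illustration.
import numpy as np

# fraction of searches containing the term, per region (hypothetical)
search_prevalence = np.array([0.012, 0.034, 0.008, 0.041, 0.019])
# Trump's share of the 2016 GOP primary vote in those regions (hypothetical)
trump_vote_share = np.array([0.31, 0.52, 0.27, 0.58, 0.39])

# Pearson correlation: how strongly the two series move together
r = np.corrcoef(search_prevalence, trump_vote_share)[0, 1]
print(f"correlation: {r:.2f}")
```

A real study would compare this r against the same statistic for every other candidate variable (income, education, and so on) to establish which predictor is "best".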

From the review: ‘Modern microeconomics, sociology, political science and quantitative psychology all depend to a large extent on surveys of at most a few thousand respondents. In contrast, he says, there are “four unique powers of Big Data”: it provides new sources of information, such as pornographic searches; it captures what people actually do or think, rather than what they choose to tell pollsters; it enables researchers to home in on and compare demographic or geographic subsets; and it allows for speedy randomized controlled trials that demonstrate not just correlation but causality. As a result, he predicts, “the days of academics devoting months to recruiting a small number of undergraduates to perform a single test will come to an end.” In their place, “the social and behavioural sciences are most definitely going to scale,” and the conclusions researchers will be able to reach are “the stuff of science, not pseudoscience”.’

Will Robots Replace Humans? 4/1/17

Article: ""Remember the mane"

> The rise of robots and their impact on low-skilled workers may be analogous to the rise of the automobile and its impact on horses. A new working paper concludes that, between 1990 and 2007, each industrial robot added per thousand workers reduced employment in America by nearly six workers. The International Federation of Robotics defines industrial robots as machines that are automatically controlled and re-programmable; single-purpose equipment does not count. The worldwide population of such creatures is below 2m; America has slightly fewer than two robots per 1,000 workers. The paper’s authors, Daron Acemoglu of the Massachusetts Institute of Technology (MIT) and Pascual Restrepo of Boston University, are careful to exclude confounding causes as best they can. Since relatively few industrial robots are in use in the American economy, the total job loss from robotisation has been modest: between 360,000 and 670,000 (a back-of-the-envelope check appears after this excerpt).

> Automation should yield savings to firms or consumers which can be spent on other goods or services. Labour liberated by technology should gravitate toward tasks and jobs in which humans retain an advantage. Yet that should also have been true of horses. The difficulty facing horses was in reallocating the huge numbers displaced by technology to places where they could still be of use.
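A quick consistency check on those figures, as a minimal sketch. The inputs come from the summary above; the division is mine, not the paper's.

```python
# Back-of-the-envelope reading of the figures above. The paper reports total
# US job losses of 360,000-670,000 and an effect of "nearly six" workers lost
# per robot; dividing one by the other gives a rough implied count of robots
# added between 1990 and 2007. The division is mine, not the paper's.

total_loss_low, total_loss_high = 360_000, 670_000
workers_displaced_per_robot = 6.0          # "nearly six workers" per robot

implied_robots_low = total_loss_low / workers_displaced_per_robot
implied_robots_high = total_loss_high / workers_displaced_per_robot
print(f"implied robots added: {implied_robots_low:,.0f} "
      f"to {implied_robots_high:,.0f}")    # ~60,000 to ~112,000 machines
```

An implied tens-of-thousands of machines squares with the article's point that relatively few industrial robots are in use in the American economy, which is why the total job loss so far has been modest.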

 

Special Report: Automation 6/25/16

Article: "Automation - the return of the machinery question"

Automation has been a concern since the Industrial Revolution. David Ricardo in 1821: “The substitution of machinery for human labor renders the population redundant” - this worry became known as the “machinery question”. Now, after decades of overhype, AI is actually starting to improve dramatically. A 2013 study by Frey and Osborne estimates that 47% of all jobs in America are capable of being substituted by “computer capital”. Further developments such as self-driving cars are estimated to add $2 trillion per year in productivity savings to the world economy.

AI advances come through a process called “deep learning”, in which neural networks are built and the machine learns on its own rather than being explicitly programmed. In supervised deep learning, the machine is given labelled data and learns to make the identification correctly. An example is the ImageNet challenge - millions of labelled images. Humans match the labels correctly about 95% of the time; older AI managed about 75%. New deep-learning AI reaches 96% - better than humans. Unsupervised deep learning, of which Google Brain is an example, lets the machine look at millions of unsorted, unlabelled examples and find patterns and assign categories on its own. Supervised deep learning was used to create Enlitic - a program for medical diagnoses based on images (x-rays, radiology etc.). Enlitic has zero percent false negatives (missing a problem) compared to 7% for humans.
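A minimal sketch of what “supervised” means in practice. PyTorch is an assumed choice (the article names no framework), and the images and labels are random stand-ins for a real labelled set like ImageNet; the point is just the shape of the loop: show labelled examples, measure the error, adjust the weights.

```python
# Sketch of supervised deep learning: a small network is shown labelled
# images and repeatedly adjusts its weights to reduce labelling errors.
import torch
import torch.nn as nn

# Stand-in "labelled data": 64 tiny 3x32x32 images, each given 1 of 10 labels
images = torch.randn(64, 3, 32, 32)
labels = torch.randint(0, 10, (64,))

model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),  # learn local image features
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),                     # pool features to one vector
    nn.Flatten(),
    nn.Linear(16, 10),                           # score each of the 10 labels
)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

for step in range(100):              # repeated exposure to labelled examples
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)   # how wrong are the labels now?
    loss.backward()                  # compute how to nudge each weight
    optimizer.step()                 # nudge the weights to reduce the error
```

Unsupervised learning differs only in the inputs: no `labels` tensor exists, and the machine must find its own categories in the raw examples, as Google Brain did.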

Which jobs are vulnerable to automation? Old automation distinguished between manual and cognitive work; new AI-based automation distinguishes between routine and non-routine work. In the Enlitic example, AI might be able to replace the radiologist but probably not the secretary. Historically, automation has created more jobs than it has destroyed. For example, ATMs mean banks need fewer tellers, but fewer tellers means lower costs per branch, which allows more branches to be built and more tellers overall to be employed (a toy version of this arithmetic is sketched below). Work is fluid, not finite and fixed. Basic income may help get us through the rough patch of AI job destruction.
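A toy version of the ATM arithmetic. Every number here is hypothetical; the point is only that total employment can rise when branch growth outpaces the per-branch cut in tellers.

```python
# Toy arithmetic for the ATM story (all numbers hypothetical). ATMs cut the
# tellers needed per branch; cheaper branches mean more branches get opened;
# if branch growth outpaces the per-branch cut, total teller jobs can rise.

branches_before, tellers_per_branch_before = 100, 20
branches_after, tellers_per_branch_after = 160, 13   # assumed expansion

total_before = branches_before * tellers_per_branch_before   # 2,000 tellers
total_after = branches_after * tellers_per_branch_after      # 2,080 tellers
print(f"tellers before ATMs: {total_before}")
print(f"tellers after ATMs:  {total_after}")   # more jobs despite automation
```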