Book picks similar to Deep Learning by John D. Kelleher
ai, science, computer-science, non-fiction
Naked Statistics: Stripping the Dread from the Data
Charles Wheelan - 2012
How can we catch schools that cheat on standardized tests? How does Netflix know which movies you’ll like? What is causing the rising incidence of autism? As best-selling author Charles Wheelan shows us in Naked Statistics, the right data and a few well-chosen statistical tools can help us answer these questions and more. For those who slept through Stats 101, this book is a lifesaver. Wheelan strips away the arcane and technical details and focuses on the underlying intuition that drives statistical analysis. He clarifies key concepts such as inference, correlation, and regression analysis, reveals how biased or careless parties can manipulate or misrepresent data, and shows us how brilliant and creative researchers are exploiting the valuable data from natural experiments to tackle thorny questions. And in Wheelan’s trademark style, there’s not a dull page in sight. You’ll encounter clever Schlitz Beer marketers leveraging basic probability, an International Sausage Festival illuminating the tenets of the central limit theorem, and a head-scratching choice from the famous game show Let’s Make a Deal—and you’ll come away with insights each time. With the wit, accessibility, and sheer fun that turned Naked Economics into a bestseller, Wheelan defies the odds yet again by bringing another essential, formerly unglamorous discipline to life.
Computational Thinking
Peter J. Denning - 2019
"Computational thinking" has become part of the K-12 curriculum. But what is computational thinking? This volume in the MIT Press Essential Knowledge series offers an accessible overview, tracing a genealogy that begins centuries before digital computers and portraying computational thinking as pioneers of computing have described it. The authors explain that computational thinking (CT) is not a set of concepts for programming; it is a way of thinking that is honed through practice: the mental skills for designing computations to do jobs for us, and for explaining and interpreting the world as a complex of information processes. Mathematically trained experts (known as "computers") who performed complex calculations as teams engaged in CT long before electronic computers. The authors identify six dimensions of today's highly developed CT--methods, machines, computing education, software engineering, computational science, and design--and cover each in a chapter. Along the way, they debunk inflated claims for CT and computation while making clear the power of CT in all its complexity and multiplicity.
Computing: A Concise History
Paul E. Ceruzzi - 2012
In this concise and accessible account of the invention and development of digital technology, computer historian Paul Ceruzzi offers a broad and useful perspective. He identifies four major threads that run throughout all of computing's technological development: digitization--the coding of information, computation, and control in binary form, ones and zeros; the convergence of multiple streams of techniques, devices, and machines, yielding more than the sum of their parts; the steady advance of electronic technology, as characterized famously by "Moore's Law"; and the human-machine interface. Ceruzzi guides us through computing history, telling how a Bell Labs mathematician coined the word "digital" in 1942 (to describe a high-speed method of calculating used in anti-aircraft devices), and recounting the development of the punch card (for use in the 1890 U.S. Census). He describes the ENIAC, built for scientific and military applications; the UNIVAC, the first general purpose computer; and ARPANET, the Internet's precursor. Ceruzzi's account traces the world-changing evolution of the computer from a room-size ensemble of machinery to a "minicomputer" to a desktop computer to a pocket-sized smart phone. He describes the development of the silicon chip, which could store ever-increasing amounts of data and enabled ever-decreasing device size. He visits that hotbed of innovation, Silicon Valley, and brings the story up to the present with the Internet, the World Wide Web, and social networking.
The Model Thinker: What You Need to Know to Make Data Work for You
Scott E. Page - 2018
As anyone who has ever opened up a spreadsheet packed with seemingly infinite lines of data knows, numbers aren't enough: we need to know how to make those numbers talk. In The Model Thinker, social scientist Scott E. Page shows us the mathematical, statistical, and computational models—from linear regression to random walks and far beyond—that can turn anyone into a genius. At the core of the book is Page's "many-model paradigm," which shows the reader how to apply multiple models to organize the data, leading to wiser choices, more accurate predictions, and more robust designs. The Model Thinker provides a toolkit for business people, students, scientists, pollsters, and bloggers to make them better, clearer thinkers, able to leverage data and information to their advantage.
Artificial Intelligence: A Modern Approach
Stuart Russell - 1994
The long-anticipated revision of this best-selling text offers the most comprehensive, up-to-date introduction to the theory and practice of artificial intelligence.
* NEW: Nontechnical learning material accompanies each part of the book.
* NEW: The Internet as a sample application for intelligent systems, added in several places including logical agents, planning, and natural language.
* NEW: Increased coverage of material, including expanded coverage of default reasoning and truth maintenance systems, multi-agent/distributed AI and game theory, probabilistic approaches to learning including EM, and more detailed descriptions of probabilistic inference algorithms.
* NEW: Updated and expanded exercises; 75% of the exercises are revised, with 100 new exercises.
* NEW: On-line Java software makes it easy for students to do projects on the web using intelligent agents.
* A unified, agent-based approach to AI organizes the material around the task of building intelligent agents.
* Comprehensive, up-to-date coverage includes a unified view of the field organized around the rational decision-making paradigm.
The Book of Why: The New Science of Cause and Effect
Judea Pearl - 2018
The long-standing scientific taboo against causal talk is dead. The causal revolution, instigated by Judea Pearl and his colleagues, has cut through a century of confusion and established causality -- the study of cause and effect -- on a firm scientific basis. His work explains how we can know easy things, like whether it was rain or a sprinkler that made a sidewalk wet; and how to answer hard questions, like whether a drug cured an illness. Pearl's work enables us to know not just whether one thing causes another: it lets us explore the world that is and the worlds that could have been. It shows us the essence of human thought and the key to artificial intelligence. Anyone who wants to understand either needs The Book of Why.
Practical Statistics for Data Scientists: 50 Essential Concepts
Peter Bruce - 2017
Courses and books on basic statistics rarely cover the topic from a data science perspective. This practical guide explains how to apply various statistical methods to data science, tells you how to avoid their misuse, and gives you advice on what's important and what's not. Many data science resources incorporate statistical methods but lack a deeper statistical perspective. If you're familiar with the R programming language, and have some exposure to statistics, this quick reference bridges the gap in an accessible, readable format. With this book, you'll learn:
- Why exploratory data analysis is a key preliminary step in data science
- How random sampling can reduce bias and yield a higher quality dataset, even with big data
- How the principles of experimental design yield definitive answers to questions
- How to use regression to estimate outcomes and detect anomalies
- Key classification techniques for predicting which categories a record belongs to
- Statistical machine learning methods that "learn" from data
- Unsupervised learning methods for extracting meaning from unlabeled data
Our Final Invention: Artificial Intelligence and the End of the Human Era
James Barrat - 2013
Corporations and government agencies around the world are pouring billions into achieving AI’s Holy Grail—human-level intelligence. Once AI has attained it, scientists argue, it will have survival drives much like our own. We may be forced to compete with a rival more cunning, more powerful, and more alien than we can imagine. Through profiles of tech visionaries, industry watchdogs, and groundbreaking AI systems, James Barrat's Our Final Invention explores the perils of the heedless pursuit of advanced AI. Until now, human intelligence has had no rival. Can we coexist with beings whose intelligence dwarfs our own? Will they allow us to?
Metadata
Jeffrey Pomerantz - 2015
When "metadata" became breaking news, appearing in stories about surveillance by the National Security Agency, many members of the public encountered this once-obscure term from information science for the first time. Should people be reassured that the NSA was "only" collecting metadata about phone calls--information about the caller, the recipient, the time, the duration, the location--and not recordings of the conversations themselves? Or does phone call metadata reveal more than it seems? In this book, Jeffrey Pomerantz offers an accessible and concise introduction to metadata.In the era of ubiquitous computing, metadata has become infrastructural, like the electrical grid or the highway system. We interact with it or generate it every day. It is not, Pomerantz tell us, just "data about data." It is a means by which the complexity of an object is represented in a simpler form. For example, the title, the author, and the cover art are metadata about a book. When metadata does its job well, it fades into the background; everyone (except perhaps the NSA) takes it for granted.Pomerantz explains what metadata is, and why it exists. He distinguishes among different types of metadata--descriptive, administrative, structural, preservation, and use--and examines different users and uses of each type. He discusses the technologies that make modern metadata possible, and he speculates about metadata's future. By the end of the book, readers will see metadata everywhere. Because, Pomerantz warns us, it's metadata's world, and we are just living in it.
Hello World: Being Human in the Age of Algorithms
Hannah Fry - 2018
It’s time we stand face-to-digital-face with the true powers and limitations of the algorithms that already automate important decisions in healthcare, transportation, crime, and commerce. Hello World is indispensable preparation for the moral quandaries of a world run by code, and with the unfailingly entertaining Hannah Fry as our guide, we’ll be discussing these issues long after the last page is turned.
Deep Learning with Python
François Chollet - 2017
Deep learning is the technology behind photo tagging systems at Facebook and Google, self-driving cars, speech recognition systems on your smartphone, and much more. In particular, deep learning excels at solving machine perception problems: understanding the content of image data, video data, or sound data. Here's a simple example: say you have a large collection of images, and you want tags associated with each image, for example, "dog," "cat," etc. Deep learning can allow you to create a system that understands how to map such tags to images, learning only from examples. This system can then be applied to new images, automating the task of photo tagging. A deep learning model only has to be fed examples of a task to start generating useful results on new data.
Foundations of Statistical Natural Language Processing
Christopher D. Manning - 1999
This foundational text is the first comprehensive introduction to statistical natural language processing (NLP) to appear. The book contains all the theory and algorithms needed for building NLP tools. It provides broad but rigorous coverage of mathematical and linguistic foundations, as well as detailed discussion of statistical methods, allowing students and researchers to construct their own implementations. The book covers collocation finding, word sense disambiguation, probabilistic parsing, information retrieval, and other applications.
Possible Minds: 25 Ways of Looking at AI
John Brockman - 2019
"It is the Second Coming and the Apocalypse at the same time: Good AI versus evil AI." --John Brockman
More than sixty years ago, mathematician-philosopher Norbert Wiener published a book on the place of machines in society that ended with a warning: "we shall never receive the right answers to our questions unless we ask the right questions.... The hour is very late, and the choice of good and evil knocks at our door." In the wake of advances in unsupervised, self-improving machine learning, a small but influential community of thinkers is considering Wiener's words again. In Possible Minds, John Brockman gathers their disparate visions of where AI might be taking us. The fruit of the long history of Brockman's profound engagement with the most important scientific minds who have been thinking about AI--from Alison Gopnik and David Deutsch to Frank Wilczek and Stephen Wolfram--Possible Minds is an ideal introduction to the landscape of crucial issues AI presents. The collision between opposing perspectives is salutary and exhilarating; some of these figures, such as computer scientist Stuart Russell, Skype co-founder Jaan Tallinn, and physicist Max Tegmark, are deeply concerned with the threat of AI, including the existential one, while others, notably robotics entrepreneur Rodney Brooks, philosopher Daniel Dennett, and bestselling author Steven Pinker, have a very different view. Serious, searching and authoritative, Possible Minds lays out the intellectual landscape of one of the most important topics of our time.
Machine Learning
Tom M. Mitchell - 1997
Mitchell covers the field of machine learning: the study of algorithms that allow computer programs to improve automatically through experience and to infer general laws from specific data.
Architects of Intelligence: The truth about AI from the people building it
Martin Ford - 2018
Interviewees include Geoffrey Hinton (Univ. of Toronto and Google), Rodney Brooks (Rethink Robotics), Yann LeCun (Facebook), Fei-Fei Li (Stanford and Google), Yoshua Bengio (Univ. of Montreal), Andrew Ng (AI Fund), Daphne Koller (Stanford), Stuart Russell (UC Berkeley), Nick Bostrom (Univ. of Oxford), Barbara Grosz (Harvard), David Ferrucci (Elemental Cognition), James Manyika (McKinsey), Judea Pearl (UCLA), Josh Tenenbaum (MIT), Rana el Kaliouby (Affectiva), Daniela Rus (MIT), Jeff Dean (Google), Cynthia Breazeal (MIT), Oren Etzioni (Allen Institute for AI), Gary Marcus (NYU), and Bryan Johnson (Kernel).
Martin Ford is a prominent futurist and author of the Financial Times Business Book of the Year, Rise of the Robots. He speaks at conferences and companies around the world on what AI and automation might mean for the future.
Editorial reviews:
"In his newest book, Architects of Intelligence, Martin Ford provides us with an invaluable opportunity to learn from some of the most prominent thought leaders about the emerging fields of science that are shaping our future."
- Al Gore, former Vice President of the US
"AI is going to shape our future, and Architects of Intelligence offers a unique and fascinating collection of perspectives from the top researchers and entrepreneurs who are driving progress in the field."
- Eric Schmidt, former Chairman and CEO, Google
"The best way to understand the challenges and consequences of AGI is to see inside the minds of industry experts shaping the field. Architects of Intelligence gives you that power."
- Sam Altman, President of Y Combinator and co-chairman of OpenAI
"Architects of Intelligence gets you inside the minds of the people building the technology that is going to transform our world. This is a book that everyone should read."
- Reid Hoffman, co-founder of LinkedIn