Book picks similar to
Probably Approximately Correct: Nature's Algorithms for Learning and Prospering in a Complex World by Leslie Valiant
science
non-fiction
nonfiction
mathematics
The Sciences of the Artificial
Herbert A. Simon - 1969
There are updates throughout the book as well. These take into account important advances in cognitive psychology and the science of design while confirming and extending the book's basic thesis: that a physical symbol system has the necessary and sufficient means for intelligent action. The chapter "Economic Reality" has also been revised to reflect a change in emphasis in Simon's thinking about the respective roles of organizations and markets in economic systems. "People sometimes ask me what they should read to find out about artificial intelligence. Herbert Simon's book The Sciences of the Artificial is always on the list I give them. Every page issues a challenge to conventional thinking, and the layman who digests it well will certainly understand what the field of artificial intelligence hopes to accomplish. I recommend it in the same spirit that I recommend Freud to people who ask about psychoanalysis, or Piaget to those who ask about child psychology: If you want to learn about a subject, start by reading its founding fathers." -- George A. Miller
Data Analysis with Open Source Tools: A Hands-On Guide for Programmers and Data Scientists
Philipp K. Janert - 2010
With this insightful book, intermediate to experienced programmers interested in data analysis will learn techniques for working with data in a business environment. You'll learn how to look at data to discover what it contains, how to capture those ideas in conceptual models, and then feed your understanding back into the organization through business plans, metrics dashboards, and other applications. Along the way, you'll experiment with concepts through hands-on workshops at the end of each chapter. Above all, you'll learn how to think about the results you want to achieve -- rather than rely on tools to think for you.
Use graphics to describe data with one, two, or dozens of variables
Develop conceptual models using back-of-the-envelope calculations, as well as scaling and probability arguments
Mine data with computationally intensive methods such as simulation and clustering
Make your conclusions understandable through reports, dashboards, and other metrics programs
Understand financial calculations, including the time-value of money
Use dimensionality reduction techniques or predictive analytics to conquer challenging data analysis situations
Become familiar with different open source programming environments for data analysis
"Finally, a concise reference for understanding how to conquer piles of data." --Austin King, Senior Web Developer, Mozilla
"An indispensable text for aspiring data scientists." --Michael E. Driscoll, CEO/Founder, Dataspora
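One of the items above, understanding the time value of money, reduces to compounding and discounting. A minimal Python sketch of that arithmetic (an illustration of the concept only, not code from the book; the 5% rate and 10-year horizon are made-up example inputs):

```python
def future_value(pv, rate, periods):
    """Compound a present amount forward: FV = PV * (1 + r)^n."""
    return pv * (1 + rate) ** periods

def present_value(fv, rate, periods):
    """Discount a future amount back to today: PV = FV / (1 + r)^n."""
    return fv / (1 + rate) ** periods

# Example: $1,000 growing at 5% per year for 10 years.
fv = future_value(1000, 0.05, 10)
print(round(fv, 2))                           # 1628.89
print(round(present_value(fv, 0.05, 10), 2))  # 1000.0 (discounting undoes compounding)
```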
Code: The Hidden Language of Computer Hardware and Software
Charles Petzold - 1999
And through CODE, we see how this ingenuity and our very human compulsion to communicate have driven the technological innovations of the past two centuries. Using everyday objects and familiar language systems such as Braille and Morse code, author Charles Petzold weaves an illuminating narrative for anyone who’s ever wondered about the secret inner life of computers and other smart machines. It’s a cleverly illustrated and eminently comprehensible story—and along the way, you’ll discover you’ve gained a real context for understanding today’s world of PCs, digital media, and the Internet. No matter what your level of technical savvy, CODE will charm you—and perhaps even awaken the technophile within.
The Essential Turing: Seminal Writings in Computing, Logic, Philosophy, Artificial Intelligence, and Artificial Life Plus the Secrets of Enigma
Alan Turing - 2004
In 1935, aged 22, he developed the mathematical theory upon which all subsequent stored-program digital computers are modeled. At the outbreak of hostilities with Germany in September 1939, he joined the Government Codebreaking team at Bletchley Park, Buckinghamshire and played a crucial role in deciphering Enigma, the code used by the German armed forces to protect their radio communications. Turing's work on the version of Enigma used by the German navy was vital to the battle for supremacy in the North Atlantic. He also contributed to the attack on the cyphers known as Fish, which were used by the German High Command for the encryption of signals during the latter part of the war. His contribution helped to shorten the war in Europe by an estimated two years. After the war, his theoretical work led to the development of Britain's first computers at the National Physical Laboratory and the Royal Society Computing Machine Laboratory at Manchester University. Turing was also a founding father of modern cognitive science, theorizing that the cortex at birth is an unorganized machine which through training becomes organized into a universal machine or something like it. He went on to develop the use of computers to model biological growth, launching the discipline now referred to as Artificial Life. The papers in this book are the key works for understanding Turing's phenomenal contribution across all these fields. The collection includes Turing's declassified wartime Treatise on the Enigma; letters from Turing to Churchill and to codebreakers; lectures, papers, and broadcasts which opened up the concept of AI and its implications; and the paper which formed the genesis of the investigation of Artificial Life.
R for Data Science: Import, Tidy, Transform, Visualize, and Model Data
Hadley Wickham - 2016
This book introduces you to R, RStudio, and the tidyverse, a collection of R packages designed to work together to make data science fast, fluent, and fun. Suitable for readers with no previous programming experience, R for Data Science is designed to get you doing data science as quickly as possible.
Authors Hadley Wickham and Garrett Grolemund guide you through the steps of importing, wrangling, exploring, and modeling your data and communicating the results. You’ll get a complete, big-picture understanding of the data science cycle, along with basic tools you need to manage the details. Each section of the book is paired with exercises to help you practice what you’ve learned along the way.
You’ll learn how to:
Wrangle—transform your datasets into a form convenient for analysis
Program—learn powerful R tools for solving data problems with greater clarity and ease
Explore—examine your data, generate hypotheses, and quickly test them
Model—provide a low-dimensional summary that captures true "signals" in your dataset
Communicate—learn R Markdown for integrating prose, code, and results
The Art of Doing Science and Engineering: Learning to Learn
Richard Hamming - 1996
By presenting actual experiences and analyzing them as they are described, the author conveys the developmental thought processes employed and shows that a style of thinking which leads to successful results can be learned. Along with spectacular successes, the author also conveys how failures contributed to shaping those thought processes. The book provides the reader with a style of thinking that will enhance their ability to function as a problem-solver of complex technical issues, and consists of a collection of stories about the author's participation in significant discoveries, relating how those discoveries came about and, most importantly, analyzing the thought processes and reasoning that took place as the author and his associates progressed through engineering problems.
The Information: A History, a Theory, a Flood
James Gleick - 2011
The story of information begins in a time profoundly unlike our own, when every thought and utterance vanishes as soon as it is born. From the invention of scripts and alphabets to the long-misunderstood talking drums of Africa, Gleick tells the story of information technologies that changed the very nature of human consciousness. He provides portraits of the key figures contributing to the inexorable development of our modern understanding of information: Charles Babbage, the idiosyncratic inventor of the first great mechanical computer; Ada Byron, the brilliant and doomed daughter of the poet, who became the first true programmer; pivotal figures like Samuel Morse and Alan Turing; and Claude Shannon, the creator of information theory itself. And then the information age arrives. Citizens of this world become experts willy-nilly: aficionados of bits and bytes. And we sometimes feel we are drowning, swept by a deluge of signs and signals, news and images, blogs and tweets. The Information is the story of how we got here and where we are heading.
The Visual Display of Quantitative Information
Edward R. Tufte - 1983
Theory and practice in the design of data graphics, 250 illustrations of the best (and a few of the worst) statistical graphics, with detailed analysis of how to display data for precise, effective, quick analysis. Design of the high-resolution displays, small multiples. Editing and improving graphics. The data-ink ratio. Time-series, relational graphics, data maps, multivariate designs. Detection of graphical deception: design variation vs. data variation. Sources of deception. Aesthetics and data graphical displays. This is the second edition of The Visual Display of Quantitative Information. Recently published, this new edition provides excellent color reproductions of the many graphics of William Playfair, adds color to other images, and includes all the changes and corrections accumulated during 17 printings of the first edition.
The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies
Erik Brynjolfsson - 2014
Digital technologies—with hardware, software, and networks at their core—will in the near future diagnose diseases more accurately than doctors can, apply enormous data sets to transform retailing, and accomplish many tasks once considered uniquely human. In The Second Machine Age MIT’s Erik Brynjolfsson and Andrew McAfee—two thinkers at the forefront of their field—reveal the forces driving the reinvention of our lives and our economy. As the full impact of digital technologies is felt, we will realize immense bounty in the form of dazzling personal technology, advanced infrastructure, and near-boundless access to the cultural items that enrich our lives. Amid this bounty will also be wrenching change. Professions of all kinds—from lawyers to truck drivers—will be forever upended. Companies will be forced to transform or die. Recent economic indicators reflect this shift: fewer people are working, and wages are falling even as productivity and profits soar. Drawing on years of research and up-to-the-minute trends, Brynjolfsson and McAfee identify the best strategies for survival and offer a new path to prosperity. These include revamping education so that it prepares people for the next economy instead of the last one, designing new collaborations that pair brute processing power with human ingenuity, and embracing policies that make sense in a radically transformed landscape. A fundamentally optimistic book, The Second Machine Age alters how we think about issues of technological, societal, and economic progress.
Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy
Cathy O'Neil - 2016
Increasingly, the decisions that affect our lives--where we go to school, whether we can get a job or a loan, how much we pay for health insurance--are being made not by humans, but by machines. In theory, this should lead to greater fairness: Everyone is judged according to the same rules. But as mathematician and data scientist Cathy O'Neil reveals, the mathematical models being used today are unregulated and uncontestable, even when they're wrong. Most troubling, they reinforce discrimination--propping up the lucky, punishing the downtrodden, and undermining our democracy in the process.
The Sentient Machine: The Coming Age of Artificial Intelligence
Amir Husain - 2017
Acclaimed technologist and inventor Amir Husain explains how we can live amidst the coming age of sentient machines and artificial intelligence—and not only survive, but thrive. Artificial “machine” intelligence is playing an ever-greater role in our society. We are already using cruise control in our cars, automatic checkout at the drugstore, and are unable to live without our smartphones. The discussion around AI is polarized; people think either machines will solve all problems for everyone, or they will lead us down a dark, dystopian path into total human irrelevance. Regardless of what you believe, the idea that we might bring forth intelligent creation can be intrinsically frightening. But what if our greatest role as humans so far is that of creators? Amir Husain, a brilliant inventor and computer scientist, argues that we are on the cusp of writing our next, and greatest, creation myth. It is the dawn of a new form of intellectual diversity, one that we need to embrace in order to advance the state of the art in many critical fields, including security, resource management, finance, and energy. “In The Sentient Machine, Husain prepares us for a brighter future; not with hyperbole about right and wrong, but with serious arguments about risk and potential” (Dr. Greg Hyslop, Chief Technology Officer, The Boeing Company). He addresses broad existential questions surrounding the coming of AI: Why are we valuable? What can we create in this world? How are we intelligent? What constitutes progress for us? And how might we fail to progress? Husain boils down complex computer science and AI concepts into clear, plainspoken language and draws from a wide variety of cultural and historical references to illustrate his points. Ultimately, Husain challenges many of our societal norms and upends assumptions we hold about “the good life.”
Machine Learning for Hackers
Drew Conway - 2012
Authors Drew Conway and John Myles White help you understand machine learning and statistics tools through a series of hands-on case studies, instead of a traditional math-heavy presentation. Each chapter focuses on a specific problem in machine learning, such as classification, prediction, optimization, and recommendation. Using the R programming language, you'll learn how to analyze sample datasets and write simple machine learning algorithms. "Machine Learning for Hackers" is ideal for programmers from any background, including business, government, and academic research.
Develop a naive Bayesian classifier to determine if an email is spam, based only on its text
Use linear regression to predict the number of page views for the top 1,000 websites
Learn optimization techniques by attempting to break a simple letter cipher
Compare and contrast U.S. Senators statistically, based on their voting records
Build a "whom to follow" recommendation system from Twitter data
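The first case study in that list, the naive Bayesian spam classifier, comes down to comparing how likely a message's words are under a "spam" word distribution versus a "ham" one. The book works in R; the Python sketch below, with made-up toy messages, only illustrates the idea and is not the book's code.

```python
from collections import Counter
import math

# Made-up toy training data: tokenized messages per class.
spam_msgs = [["win", "cash", "now"], ["cheap", "cash", "offer"]]
ham_msgs = [["meeting", "at", "noon"], ["lunch", "offer", "declined"]]

spam_counts = Counter(w for m in spam_msgs for w in m)
ham_counts = Counter(w for m in ham_msgs for w in m)
vocab = set(spam_counts) | set(ham_counts)

def log_likelihood(words, counts):
    """Sum of log P(word | class) with add-one (Laplace) smoothing."""
    total = sum(counts.values())
    return sum(math.log((counts[w] + 1) / (total + len(vocab))) for w in words)

def classify(words):
    # Equal class priors assumed, so only the word likelihoods are compared.
    spam_score = log_likelihood(words, spam_counts)
    ham_score = log_likelihood(words, ham_counts)
    return "spam" if spam_score > ham_score else "ham"

print(classify(["cheap", "cash", "now"]))     # spam
print(classify(["meeting", "for", "lunch"]))  # ham
```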
Bayesian Statistics the Fun Way: Understanding Statistics and Probability with Star Wars, Lego, and Rubber Ducks
Will Kurt - 2019
But many people use data in ways they don't even understand, meaning they aren't getting the most from it. Bayesian Statistics the Fun Way will change that. This book will give you a complete understanding of Bayesian statistics through simple explanations and un-boring examples. Find out the probability of UFOs landing in your garden, how likely Han Solo is to survive a flight through an asteroid shower, how to win an argument about conspiracy theories, and whether a burglary really was a burglary, to name a few examples. By using these off-the-beaten-track examples, the author actually makes learning statistics fun. And you'll learn real skills, like how to:
- Measure your own level of uncertainty in a conclusion or belief
- Calculate Bayes' theorem and understand what it's useful for
- Find the posterior, likelihood, and prior to check the accuracy of your conclusions
- Calculate distributions to see the range of your data
- Compare hypotheses and draw reliable conclusions from them
Next time you find yourself with a sheaf of survey results and no idea what to do with them, turn to Bayesian Statistics the Fun Way to get the most value from your data.
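For reference, the theorem behind those bullet points relates a hypothesis H to observed data D; this is its standard statement (the notation is generic, not taken from the book):

```latex
\underbrace{P(H \mid D)}_{\text{posterior}}
  = \frac{\overbrace{P(D \mid H)}^{\text{likelihood}} \; \overbrace{P(H)}^{\text{prior}}}
         {\underbrace{P(D)}_{\text{evidence}}}
```

The posterior, likelihood, and prior named in the list are exactly the labeled terms in this identity.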
Scale: The Universal Laws of Growth, Innovation, Sustainability, and the Pace of Life in Organisms, Cities, Economies, and Companies
Geoffrey B. West - 2017
The term “complexity” can be misleading, however, because what makes West’s discoveries so beautiful is that he has found an underlying simplicity that unites the seemingly complex and diverse phenomena of living systems, including our bodies, our cities and our businesses. Fascinated by issues of aging and mortality, West applied the rigor of a physicist to the biological question of why we live as long as we do and no longer. The result was astonishing, and changed science, creating a new understanding of energy use and metabolism: West found that despite the riotous diversity in the sizes of mammals, they are all, to a large degree, scaled versions of each other. If you know the size of a mammal, you can use scaling laws to learn everything from how much food it eats per day, what its heart-rate is, how long it will take to mature, its lifespan, and so on. Furthermore, the efficiency of the mammal’s circulatory systems scales up precisely based on weight: if you compare a mouse, a human and an elephant on a logarithmic graph, you find with every doubling of average weight, a species gets 25% more efficient—and lives 25% longer. This speaks to everything from how long we can expect to live to how many hours of sleep we need. Fundamentally, he has proven, the issue has to do with the fractal geometry of the networks that supply energy and remove waste from the organism's body. West's work has been game-changing for biologists, but then he made the even bolder move of exploring his work's applicability to cities. Cities, too, are constellations of networks and laws of scalability relate with eerie precision to them. For every doubling in a city's size, the city needs 15% less road, electrical wire, and gas stations to support the same population. More amazingly, for every doubling in size, cities produce 15% more patents and more wealth, as well as 15% more crime and disease. This broad pattern lays the groundwork for a new science of cities. Recently, West has applied his revolutionary work on cities and biological life to the business world. This investigation has led to powerful insights into why some companies thrive while others fail. The implications of these discoveries are far-reaching, and are just beginning to be explored. Scale is a thrilling scientific adventure story about the elemental natural laws that bind us together in simple but profound ways. Through the brilliant mind of Geoffrey West, we can envision how cities, companies and biological life alike are dancing to the same simple, powerful tune, however diverse and unrelated they are to each other.
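The repeated "so-many-percent per doubling" figures are what a power law looks like when read off at a factor of two: if a quantity Y scales with size M as a power law, every doubling of M multiplies Y by the same constant factor. A generic sketch of that arithmetic (not West's notation):

```latex
Y = Y_0 \, M^{\beta}
\quad\Longrightarrow\quad
\frac{Y(2M)}{Y(M)} = 2^{\beta}
```

A sublinear exponent (the commonly quoted values are roughly 3/4 for metabolic rate versus body mass and roughly 0.85 for urban infrastructure versus population) yields the economies of scale described above, while a superlinear exponent (roughly 1.15 for socioeconomic outputs) yields the per-capita gains in wealth, patents, crime, and disease.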
The Elements of Statistical Learning: Data Mining, Inference, and Prediction
Trevor Hastie - 2001
With it have come vast amounts of data in a variety of fields such as medicine, biology, finance, and marketing. The challenge of understanding these data has led to the development of new tools in the field of statistics, and spawned new areas such as data mining, machine learning, and bioinformatics. Many of these tools have common underpinnings but are often expressed with different terminology. This book describes the important ideas in these areas in a common conceptual framework. While the approach is statistical, the emphasis is on concepts rather than mathematics. Many examples are given, with a liberal use of color graphics. It should be a valuable resource for statisticians and anyone interested in data mining in science or industry. The book's coverage is broad, from supervised learning (prediction) to unsupervised learning. The many topics include neural networks, support vector machines, classification trees and boosting—the first comprehensive treatment of this topic in any book. Trevor Hastie, Robert Tibshirani, and Jerome Friedman are professors of statistics at Stanford University. They are prominent researchers in this area: Hastie and Tibshirani developed generalized additive models and wrote a popular book of that title. Hastie wrote much of the statistical modeling software in S-PLUS and invented principal curves and surfaces. Tibshirani proposed the Lasso and is co-author of the very successful An Introduction to the Bootstrap. Friedman is the co-inventor of many data-mining tools including CART, MARS, and projection pursuit.