The Sciences of the Artificial


Herbert A. Simon - 1969
    There are updates throughout the book as well. These take into account important advances in cognitive psychology and the science of design while confirming and extending the book's basic thesis: that a physical symbol system has the necessary and sufficient means for intelligent action. The chapter "Economic Reality" has also been revised to reflect a change in emphasis in Simon's thinking about the respective roles of organizations and markets in economic systems.
    "People sometimes ask me what they should read to find out about artificial intelligence. Herbert Simon's book The Sciences of the Artificial is always on the list I give them. Every page issues a challenge to conventional thinking, and the layman who digests it well will certainly understand what the field of artificial intelligence hopes to accomplish. I recommend it in the same spirit that I recommend Freud to people who ask about psychoanalysis, or Piaget to those who ask about child psychology: If you want to learn about a subject, start by reading its founding fathers." -- George A. Miller

Applied Cryptography: Protocols, Algorithms, and Source Code in C


Bruce Schneier - 1993
    … The book the National Security Agency wanted never to be published." –Wired Magazine
    "…monumental… fascinating… comprehensive… the definitive work on cryptography for computer programmers…" –Dr. Dobb's Journal
    "…easily ranks as one of the most authoritative in its field." —PC Magazine
    "…the bible of code hackers." –The Millennium Whole Earth Catalog
    This new edition of the cryptography classic provides you with a comprehensive survey of modern cryptography. The book details how programmers and electronic communications professionals can use cryptography—the technique of enciphering and deciphering messages—to maintain the privacy of computer data. It describes dozens of cryptography algorithms, gives practical advice on how to implement them in cryptographic software, and shows how they can be used to solve security problems. Covering the latest developments in practical cryptographic techniques, this new edition shows programmers who design computer applications, networks, and storage systems how they can build security into their software and systems. What's new in the Second Edition?
    * New information on the Clipper Chip, including ways to defeat the key escrow mechanism
    * New encryption algorithms, including algorithms from the former Soviet Union and South Africa, and the RC4 stream cipher
    * The latest protocols for digital signatures, authentication, secure elections, digital cash, and more
    * More detailed information on key management and cryptographic implementations
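    As a taste of the kind of algorithm the book catalogs, here is a short, purely illustrative Python sketch of the RC4 stream cipher mentioned above. It is not code from the book (whose source listings are in C), RC4 has long since been broken, and the key and message are invented, so treat it as a study aid only:

        def rc4(key: bytes, data: bytes) -> bytes:
            # Key-scheduling algorithm (KSA): build a key-dependent permutation S.
            S = list(range(256))
            j = 0
            for i in range(256):
                j = (j + S[i] + key[i % len(key)]) % 256
                S[i], S[j] = S[j], S[i]
            # Pseudo-random generation algorithm (PRGA): XOR the keystream with the data.
            out = bytearray()
            i = j = 0
            for byte in data:
                i = (i + 1) % 256
                j = (j + S[i]) % 256
                S[i], S[j] = S[j], S[i]
                out.append(byte ^ S[(S[i] + S[j]) % 256])
            return bytes(out)

        ciphertext = rc4(b"Key", b"Plaintext")
        print(ciphertext.hex())
        print(rc4(b"Key", ciphertext))  # XOR is symmetric, so the same call decrypts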

The Man Who Knew Too Much: Alan Turing and the Invention of the Computer


David Leavitt - 2006
    Then, attempting to break a Nazi code during World War II, he successfully designed and built one, thus ensuring the Allied victory. Turing became a champion of artificial intelligence, but his work was cut short. As an openly gay man at a time when homosexuality was illegal in England, he was convicted and forced to undergo a humiliating "treatment" that may have led to his suicide. With a novelist's sensitivity, David Leavitt portrays Turing in all his humanity—his eccentricities, his brilliance, his fatal candor—and elegantly explains his work and its implications.

Circles: Fifty Round Trips Through History, Technology, Science, Culture


James Burke - 2000
    Whether exploring electromagnetic fields, the origin of hot chocolate, or DNA fingerprinting, these essays all illustrate the surprisingly circular nature of change. In "Room with (Half) a View," for instance, Burke muses about the partly obscured railway bridge outside his home on the Thames, a musing which sets off a chain of thought that leads from the bridge's engineer to Samuel Morse, to firearms inventor Sam Colt, and finally to a trombonist named Gustav Holst, who once lived in the very house that blocks Burke's view. So it goes with Burke's entertaining and informative essays as each one highlights the interconnectedness of seemingly unrelated events and innovations. Romantic poetry leads to brandy distillation; tonic water connects through Leibniz to the first explorers to reach the North Pole. This unique collection is sure to stimulate and delight history buffs, technophiles, and anyone else with a healthy intellectual curiosity.

Structure and Interpretation of Computer Programs


Harold Abelson - 1984
    This long-awaited revision contains changes throughout the text. There are new implementations of most of the major programming systems in the book, including the interpreters and compilers, and the authors have incorporated many small changes that reflect their experience teaching the course at MIT since the first edition was published. A new theme has been introduced that emphasizes the central role played by different approaches to dealing with time in computational models: objects with state, concurrent programming, functional programming and lazy evaluation, and nondeterministic programming. There are new example sections on higher-order procedures in graphics and on applications of stream processing in numerical programming, and many new exercises. In addition, all the programs have been reworked to run in any Scheme implementation that adheres to the IEEE standard.
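    To give a flavor of the "stream processing in numerical programming" theme, here is a rough Python analogue (the book itself works in Scheme, and this sketch and its names are not from the text): an infinite stream of Newton's-method approximations to a square root, consumed lazily until successive terms agree.

        def sqrt_stream(x, guess=1.0):
            """Yield successive Newton's-method approximations to sqrt(x)."""
            while True:
                yield guess
                guess = (guess + x / guess) / 2.0

        def within(tolerance, stream):
            """Pull values from the stream until consecutive terms agree to within tolerance."""
            previous = next(stream)
            for current in stream:
                if abs(current - previous) < tolerance:
                    return current
                previous = current

        print(within(1e-10, sqrt_stream(2.0)))  # approximately 1.4142135623731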

The Heart of Mathematics: An Invitation to Effective Thinking


Edward B. Burger - 1999
    In this new, innovative overview textbook, the authors put special emphasis on the deep ideas of mathematics, and present the subject through lively and entertaining examples, anecdotes, challenges and illustrations, all of which are designed to excite the student's interest. The underlying ideas include topics from number theory, infinity, geometry, topology, probability and chaos theory. Throughout the text, the authors stress that mathematics is an analytical way of thinking, one that can be brought to bear on problem solving and effective thinking in any field of study.

Bayesian Methods for Hackers: Probabilistic Programming and Bayesian Inference


Cameron Davidson-Pilon - 2014
    However, most discussions of Bayesian inference rely on intensely complex mathematical analyses and artificial examples, making the subject inaccessible to anyone without a strong mathematical background. Now, though, Cameron Davidson-Pilon introduces Bayesian inference from a computational perspective, bridging theory to practice and freeing you to get results using computing power. Bayesian Methods for Hackers illuminates Bayesian inference through probabilistic programming with the powerful PyMC library and the closely related Python tools NumPy, SciPy, and Matplotlib. Using this approach, you can reach effective solutions in small increments, without extensive mathematical intervention. Davidson-Pilon begins by introducing the concepts underlying Bayesian inference, comparing it with other techniques and guiding you through building and training your first Bayesian model. Next, he introduces PyMC through a series of detailed examples and intuitive explanations that have been refined after extensive user feedback. You'll learn how to use the Markov Chain Monte Carlo algorithm, choose appropriate sample sizes and priors, work with loss functions, and apply Bayesian inference in domains ranging from finance to marketing. Once you've mastered these techniques, you'll constantly turn to this guide for the working PyMC code you need to jumpstart future projects. Coverage includes:
    - Learning the Bayesian "state of mind" and its practical implications
    - Understanding how computers perform Bayesian inference
    - Using the PyMC Python library to program Bayesian analyses
    - Building and debugging models with PyMC
    - Testing your model's "goodness of fit"
    - Opening the "black box" of the Markov Chain Monte Carlo algorithm to see how and why it works
    - Leveraging the power of the "Law of Large Numbers"
    - Mastering key concepts, such as clustering, convergence, autocorrelation, and thinning
    - Using loss functions to measure an estimate's weaknesses based on your goals and desired outcomes
    - Selecting appropriate priors and understanding how their influence changes with dataset size
    - Overcoming the "exploration versus exploitation" dilemma: deciding when "pretty good" is good enough
    - Using Bayesian inference to improve A/B testing
    - Solving data science problems when only small amounts of data are available
    Cameron Davidson-Pilon has worked in many areas of applied mathematics, from the evolutionary dynamics of genes and diseases to stochastic modeling of financial prices. His contributions to the open source community include lifelines, an implementation of survival analysis in Python. Educated at the University of Waterloo and at the Independent University of Moscow, he currently works with the online commerce leader Shopify.
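    The workflow this description outlines can be sketched in a handful of lines. The example below uses invented data and the current-generation PyMC API (the book's own examples target earlier PyMC releases), estimating a single conversion rate in the spirit of its A/B-testing material:

        import numpy as np
        import pymc as pm

        # Hypothetical observations: 1 = conversion, 0 = no conversion.
        observations = np.array([0, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1])

        with pm.Model():
            # Prior: complete uncertainty about the true conversion rate p.
            p = pm.Uniform("p", lower=0.0, upper=1.0)
            # Likelihood: each observation is a Bernoulli trial with probability p.
            pm.Bernoulli("obs", p=p, observed=observations)
            # Inference: draw posterior samples with Markov Chain Monte Carlo.
            idata = pm.sample(2000, tune=1000, progressbar=False)

        # Posterior mean and a 95% credible interval for p, straight from the samples.
        samples = idata.posterior["p"].values.ravel()
        print(samples.mean(), np.percentile(samples, [2.5, 97.5]))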

Ecological Intelligence: How Knowing the Hidden Impacts of What We Buy Can Change Everything


Daniel Goleman - 2009
    We dive down to see coral reefs, not realizing that an ingredient in our sunscreen feeds a virus that kills the reef. We wear organic cotton t-shirts, but don’t know that their dyes may put factory workers at risk for leukemia. In Ecological Intelligence, Daniel Goleman reveals why so many of the products that are labeled green are a “mirage,” and illuminates our wild inconsistencies in response to the ecological crisis. Drawing on cutting-edge research, Goleman explains why we as shoppers are in the dark over the hidden impacts of the goods and services we make and consume, victims of a blackout of information about the detrimental effects of producing, shipping, packaging, distributing, and discarding the goods we buy. But the balance of power is about to shift from seller to buyer, as a new generation of technologies informs us of the ecological facts about products at the point of purchase. This “radical transparency” will enable consumers to make smarter purchasing decisions, and will drive companies to rethink and reform their businesses, ushering in, Goleman claims, a new age of competitive advantage.

The Elements of Statistical Learning: Data Mining, Inference, and Prediction


Trevor Hastie - 2001
    With it have come vast amounts of data in a variety of fields such as medicine, biology, finance, and marketing. The challenge of understanding these data has led to the development of new tools in the field of statistics, and spawned new areas such as data mining, machine learning, and bioinformatics. Many of these tools have common underpinnings but are often expressed with different terminology. This book describes the important ideas in these areas in a common conceptual framework. While the approach is statistical, the emphasis is on concepts rather than mathematics. Many examples are given, with a liberal use of color graphics. It should be a valuable resource for statisticians and anyone interested in data mining in science or industry. The book's coverage is broad, from supervised learning (prediction) to unsupervised learning. The many topics include neural networks, support vector machines, classification trees and boosting—the first comprehensive treatment of this topic in any book. Trevor Hastie, Robert Tibshirani, and Jerome Friedman are professors of statistics at Stanford University. They are prominent researchers in this area: Hastie and Tibshirani developed generalized additive models and wrote a popular book of that title. Hastie wrote much of the statistical modeling software in S-PLUS and invented principal curves and surfaces. Tibshirani proposed the Lasso and is co-author of the very successful An Introduction to the Bootstrap. Friedman is the co-inventor of many data-mining tools including CART, MARS, and projection pursuit.

They All Laughed...: From Light Bulbs to Lasers: The Fascinating Stories Behind the Great Inventions


Ira Flatow - 1992
    An enlightening and fun look at scientific discoveries and the often wacky and accidental ways in which they have led to some of the most important inventions--by award-winning journalist Ira Flatow.

Behind Every Good Decision: How Anyone Can Use Business Analytics to Turn Data into Profitable Insight


Piyanka Jain - 2014
    Nothing could be further from the truth. In Behind Every Good Decision, authors and analytics experts Piyanka Jain and Puneet Sharma demonstrate how professionals at any level can take the information at their disposal and leverage it to make better decisions. The authors’ streamlined framework demystifies the process of business analytics and helps anyone move from data to decisions in just five steps… using only Excel as a tool. Readers will learn how to:
    • Clarify the business question
    • Lay out a hypothesis-driven plan
    • Pull relevant data
    • Convert it to insights
    • Make decisions that make an impact
    Packed with examples and exercises, this refreshingly accessible book explains the four fundamental analytic techniques that can help solve a surprising 80% of all business problems. Business analytics isn’t rocket science—it’s a simple problem-solving tool that can help companies increase revenue, decrease costs, improve products, and delight customers. And who doesn’t want to do that?

Programming Collective Intelligence: Building Smart Web 2.0 Applications


Toby Segaran - 2007
    With the sophisticated algorithms in this book, you can write smart programs to access interesting datasets from other web sites, collect data from users of your own applications, and analyze and understand the data once you've found it. Programming Collective Intelligence takes you into the world of machine learning and statistics, and explains how to draw conclusions about user experience, marketing, personal tastes, and human behavior in general -- all from information that you and others collect every day. Each algorithm is described clearly and concisely with code that can immediately be used on your web site, blog, Wiki, or specialized application. This book explains:
    * Collaborative filtering techniques that enable online retailers to recommend products or media
    * Methods of clustering to detect groups of similar items in a large dataset
    * Search engine features -- crawlers, indexers, query engines, and the PageRank algorithm
    * Optimization algorithms that search millions of possible solutions to a problem and choose the best one
    * Bayesian filtering, used in spam filters for classifying documents based on word types and other features
    * Using decision trees not only to make predictions, but to model the way decisions are made
    * Predicting numerical values rather than classifications to build price models
    * Support vector machines to match people in online dating sites
    * Non-negative matrix factorization to find the independent features in a dataset
    * Evolving intelligence for problem solving -- how a computer develops its skill by improving its own code the more it plays a game
    Each chapter includes exercises for extending the algorithms to make them more powerful. Go beyond simple database-backed applications and put the wealth of Internet data to work for you.
    "Bravo! I cannot think of a better way for a developer to first learn these algorithms and methods, nor can I think of a better way for me (an old AI dog) to reinvigorate my knowledge of the details." -- Dan Russell, Google
    "Toby's book does a great job of breaking down the complex subject matter of machine-learning algorithms into practical, easy-to-understand examples that can be directly applied to analysis of social interaction across the Web today. If I had this book two years ago, it would have saved precious time going down some fruitless paths." -- Tim Wolters, CTO, Collective Intellect
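    To make one of the topics above concrete (collaborative filtering), here is a minimal user-based sketch; the ratings data and function names are invented for illustration and are not taken from the book:

        from math import sqrt

        ratings = {
            "Ana":  {"Dune": 5, "Solaris": 4, "Contact": 1},
            "Ben":  {"Dune": 4, "Solaris": 5, "Gateway": 4},
            "Cara": {"Contact": 5, "Gateway": 2, "Solaris": 1},
        }

        def similarity(a, b):
            """Cosine similarity over the items two users have both rated."""
            shared = set(ratings[a]) & set(ratings[b])
            if not shared:
                return 0.0
            dot = sum(ratings[a][i] * ratings[b][i] for i in shared)
            norm_a = sqrt(sum(ratings[a][i] ** 2 for i in shared))
            norm_b = sqrt(sum(ratings[b][i] ** 2 for i in shared))
            return dot / (norm_a * norm_b)

        def recommend(user):
            """Score unseen items by other users' ratings, weighted by similarity."""
            scores, weights = {}, {}
            for other in ratings:
                if other == user:
                    continue
                sim = similarity(user, other)
                for item, rating in ratings[other].items():
                    if item not in ratings[user]:
                        scores[item] = scores.get(item, 0.0) + sim * rating
                        weights[item] = weights.get(item, 0.0) + sim
            return sorted(((scores[i] / weights[i], i) for i in scores if weights[i] > 0),
                          reverse=True)

        print(recommend("Ana"))  # highest-scoring unseen items first, e.g. Gateway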

Six Degrees: The Science of a Connected Age


Duncan J. Watts - 2003
    Whether they bind computers, economies, or terrorist organizations, networks are everywhere in the real world, yet only recently have scientists attempted to explain their mysterious workings. From epidemics of disease to outbreaks of market madness, from people searching for information to firms surviving crisis and change, from the structure of personal relationships to the technological and social choices of entire societies, Watts weaves together a network of discoveries across an array of disciplines to tell the story of an explosive new field of knowledge, the people who are building it, and his own peculiar path in forging this new science.

Introduction to Algorithms


Thomas H. Cormen - 1989
    Each chapter is relatively self-contained and can be used as a unit of study. The algorithms are described in English and in a pseudocode designed to be readable by anyone who has done a little programming. The explanations have been kept elementary without sacrificing depth of coverage or mathematical rigor.

How Risky Is It, Really?: Why Our Fears Don't Always Match the Facts


David Ropeik - 2010
    HOW RISKY IS IT, REALLY? International risk expert David Ropeik takes an in-depth look at our perceptions of risk and explains the hidden factors that make us unnecessarily afraid of relatively small threats and not afraid enough of some really big ones. This read is a comprehensive, accessible, and entertaining mixture of what's been discovered about how and why we fear — too much or too little. It brings into focus the danger of The Perception Gap: when our fears don't match the facts, and we make choices that create additional risks. This book will not decide for you what is really risky and what isn't. That's up to you. HOW RISKY IS IT, REALLY? will tell you how you make those decisions. Understanding how we perceive risk is the first step toward making wiser and healthier choices for ourselves as individuals and for society as a whole. TEST YOUR OWN "RISK RESPONSE" IN DOZENS OF SELF-QUIZZES!