A Mind at Play: How Claude Shannon Invented the Information Age


Jimmy Soni - 2017
    He constructed a fleet of customized unicycles and a flamethrowing trumpet, outfoxed Vegas casinos, and built juggling robots. He also wrote the seminal text of the digital revolution, which has been called “the Magna Carta of the Information Age.” His discoveries would lead contemporaries to compare him to Albert Einstein and Isaac Newton. His work anticipated by decades the world we’d be living in today—and gave mathematicians and engineers the tools to bring that world to pass. In this elegantly written, exhaustively researched biography, Jimmy Soni and Rob Goodman reveal Claude Shannon’s full story for the first time. It’s the story of a small-town Michigan boy whose career stretched from the era of room-sized computers powered by gears and string to the age of Apple. It’s the story of the origins of our digital world in the tunnels of MIT and the “idea factory” of Bell Labs, in the “scientists’ war” with Nazi Germany, and in the work of Shannon’s collaborators and rivals, thinkers like Alan Turing, John von Neumann, Vannevar Bush, and Norbert Wiener. And it’s the story of Shannon’s life as an often reclusive, always playful genius. With access to Shannon’s family and friends, A Mind at Play brings this singular innovator and creative genius to life.

Learning From Data: A Short Course


Yaser S. Abu-Mostafa - 2012
    Its techniques are widely applied in engineering, science, finance, and commerce. This book is designed for a short course on machine learning. It is a short course, not a hurried course. From over a decade of teaching this material, we have distilled what we believe to be the core topics that every student of the subject should know. We chose the title 'learning from data' because it faithfully describes what the subject is about, and we made it a point to cover the topics in a story-like fashion. Our hope is that the reader can learn all the fundamentals of the subject by reading the book cover to cover. ---- Learning from data has distinct theoretical and practical tracks. In this book, we balance the theoretical and the practical, the mathematical and the heuristic. Our criterion for inclusion is relevance. Theory that establishes the conceptual framework for learning is included, and so are heuristics that impact the performance of real learning systems. ---- Learning from data is a very dynamic field. Some of the hot techniques and theories at times become just fads, and others gain traction and become part of the field. What we have emphasized in this book are the necessary fundamentals that give any student of learning from data a solid foundation, and enable them to venture out and explore further techniques and theories, or perhaps to contribute their own. ---- The authors are professors at California Institute of Technology (Caltech), Rensselaer Polytechnic Institute (RPI), and National Taiwan University (NTU), where this book is the main text for their popular courses on machine learning. The authors also consult extensively with financial and commercial companies on machine learning applications, and have led winning teams in machine learning competitions.

The C Programming Language


Brian W. Kernighan - 1978
    It is the definitive reference guide, now in a second edition. Although the first edition was written in 1978, it continues to be a worldwide best-seller. This second edition brings the classic original up to date to include the ANSI standard. From the Preface: We have tried to retain the brevity of the first edition. C is not a big language, and it is not well served by a big book. We have improved the exposition of critical features, such as pointers, that are central to C programming. We have refined the original examples, and have added new examples in several chapters. For instance, the treatment of complicated declarations is augmented by programs that convert declarations into words and vice versa. As before, all examples have been tested directly from the text, which is in machine-readable form. As we said in the preface to the first edition, C "wears well as one's experience with it grows." With a decade more experience, we still feel that way. We hope that this book will help you to learn C and use it well.

Discrete Mathematics and Its Applications


Kenneth H. Rosen - 2000
    These themes include mathematical reasoning, combinatorial analysis, discrete structures, algorithmic thinking, and enhanced problem-solving skills through modeling. Its intent is to demonstrate the relevance and practicality of discrete mathematics to all students. The Fifth Edition includes a more thorough and linear presentation of logic, proof types and proof writing, and mathematical reasoning. This enhanced coverage will provide students with a solid understanding of the material as it relates to their immediate field of study and other relevant subjects. Applications and examples have been added to key topics throughout to bring clarity to every subject. True to the Fourth Edition, the text-specific web site supplements the subject matter in meaningful ways, offering additional material for students and instructors. Discrete math is an active subject with new discoveries made every year. The continual growth and updates to the web site reflect the active nature of the topics being discussed. The book is appropriate for a one- or two-term introductory discrete mathematics course to be taken by students in a wide variety of majors, including computer science, mathematics, and engineering. College Algebra is the only explicit prerequisite.

Linked: How Everything Is Connected to Everything Else and What It Means for Business, Science, and Everyday Life


Albert-László Barabási - 2002
    Albert-László Barabási, the nation’s foremost expert in the new science of networks and author of Bursts, takes us on an intellectual adventure to prove that social networks, corporations, and living organisms are more similar than previously thought. Grasping a full understanding of network science will someday allow us to design blue-chip businesses, stop the outbreak of deadly diseases, and influence the exchange of ideas and information. Just as James Gleick brought the discovery of chaos theory to the general public, Linked tells the story of the true science of the future.

Fuzzy Logic: The Revolutionary Computer Technology That Is Changing Our World


Daniel McNeill - 1993
    Professor Lotfi Zadeh masterminded "fuzzy logic"--a way of programming computers to "make decisions" based on imprecise data and complex situations. In "Fuzzy Logic," Daniel McNeill and Paul Freiberger relate the compelling tale of this remarkable new technology, the genius who brought it to life, and how it will soon affect the lives of every one of us.

Automate This: How Algorithms Came to Rule Our World


Christopher Steiner - 2012
    It used to be that to diagnose an illness, interpret legal documents, analyze foreign policy, or write a newspaper article you needed a human being with specific skills—and maybe an advanced degree or two. These days, high-level tasks are increasingly being handled by algorithms that can do precise work not only with speed but also with nuance. These “bots” started with human programming and logic, but now their reach extends beyond what their creators ever expected. In this fascinating, frightening book, Christopher Steiner tells the story of how algorithms took over—and shows why the “bot revolution” is about to spill into every aspect of our lives, often silently, without our knowledge. The May 2010 “Flash Crash” exposed Wall Street’s reliance on trading bots to the tune of a 998-point market drop and $1 trillion in vanished market value. But that was just the beginning. In Automate This, we meet bots that are driving cars, penning haiku, and writing music mistaken for Bach’s. They listen in on our customer service calls and figure out what Iran would do in the event of a nuclear standoff. There are algorithms that can pick out the most cohesive crew of astronauts for a space mission or identify the next Jeremy Lin. Some can even ingest statistics from baseball games and spit out pitch-perfect sports journalism indistinguishable from that produced by humans. The interaction of man and machine can make our lives easier. But what will the world look like when algorithms control our hospitals, our roads, our culture, and our national security? What happens to businesses when we automate judgment and eliminate human instinct? And what role will be left for doctors, lawyers, writers, truck drivers, and many others? Who knows—maybe there’s a bot learning to do your job this minute.

Gödel's Proof


Ernest Nagel - 1958
    Gödel received public recognition of his work in 1951 when he was awarded the first Albert Einstein Award for achievement in the natural sciences--perhaps the highest award of its kind in the United States. The award committee described his work in mathematical logic as "one of the greatest contributions to the sciences in recent times." However, few mathematicians of the time were equipped to understand the young scholar's complex proof. Ernest Nagel and James Newman provide a readable and accessible explanation of the main ideas and broad implications of Gödel's discovery for both scholars and non-specialists. It offers every educated person with a taste for logic and philosophy the chance to understand a previously difficult and inaccessible subject. New York University Press is proud to publish this special edition of one of its bestselling books. With a new introduction by Douglas R. Hofstadter, this book will appeal to students, scholars, and professionals in the fields of mathematics, computer science, logic and philosophy, and science.

Prisoner's Dilemma: John von Neumann, Game Theory, and the Puzzle of the Bomb


William Poundstone - 1992
    Though the answers may seem simple, their profound implications make the prisoner's dilemma one of the great unifying concepts of science. Watching players bluff in a poker game inspired John von Neumann--father of the modern computer and one of the sharpest minds of the century--to construct game theory, a mathematical study of conflict and deception. Game theory was readily embraced at the RAND Corporation, the archetypical think tank charged with formulating military strategy for the atomic age, and in 1950 two RAND scientists made a momentous discovery. Called the prisoner's dilemma, it is a disturbing and mind-bending game where two or more people may betray the common good for individual gain. Introduced shortly after the Soviet Union acquired the atomic bomb, the prisoner's dilemma quickly became a popular allegory of the nuclear arms race. Intellectuals such as von Neumann and Bertrand Russell joined military and political leaders in rallying to the preventive war movement, which advocated a nuclear first strike against the Soviet Union. Though the Truman administration rejected preventive war, the United States entered into an arms race with the Soviets, and game theory developed into a controversial tool of public policy--alternately accused of justifying arms races and touted as the only hope of preventing them. A masterful work of science writing, Prisoner's Dilemma weaves together a biography of the brilliant and tragic von Neumann, a history of pivotal phases of the cold war, and an investigation of game theory's far-reaching influence on public policy today. Most important, Prisoner's Dilemma is the incisive story of a revolutionary idea that has been hailed as a landmark of twentieth-century thought.

On Intelligence


Jeff Hawkins - 2004
    Now he stands ready to revolutionize both neuroscience and computing in one stroke, with a new understanding of intelligence itself. Hawkins develops a powerful theory of how the human brain works, explaining why computers are not intelligent and how, based on this new theory, we can finally build intelligent machines. The brain is not a computer, but a memory system that stores experiences in a way that reflects the true structure of the world, remembering sequences of events and their nested relationships and making predictions based on those memories. It is this memory-prediction system that forms the basis of intelligence, perception, creativity, and even consciousness. In an engaging style that will captivate audiences from the merely curious to the professional scientist, Hawkins shows how a clear understanding of how the brain works will make it possible for us to build intelligent machines, in silicon, that will exceed our human ability in surprising ways. Written with acclaimed science writer Sandra Blakeslee, On Intelligence promises to completely transfigure the possibilities of the technology age. It is a landmark book in its scope and clarity.

The Theory That Would Not Die: How Bayes' Rule Cracked the Enigma Code, Hunted Down Russian Submarines, and Emerged Triumphant from Two Centuries of Controversy


Sharon Bertsch McGrayne - 2011
    To its adherents, it is an elegant statement about learning from experience. To its opponents, it is subjectivity run amok. In the first-ever account of Bayes' rule for general readers, Sharon Bertsch McGrayne explores this controversial theorem and the human obsessions surrounding it. She traces its discovery by an amateur mathematician in the 1740s through its development into roughly its modern form by French scientist Pierre Simon Laplace. She reveals why respected statisticians rendered it professionally taboo for 150 years—at the same time that practitioners relied on it to solve crises involving great uncertainty and scanty information (Alan Turing's role in breaking Germany's Enigma code during World War II), and explains how the advent of off-the-shelf computer technology in the 1980s proved to be a game-changer. Today, Bayes' rule is used everywhere from DNA decoding to Homeland Security. Drawing on primary source material and interviews with statisticians and other scientists, The Theory That Would Not Die is the riveting account of how a seemingly simple theorem ignited one of the greatest controversies of all time.

Big Data: A Revolution That Will Transform How We Live, Work, and Think


Viktor Mayer-Schönberger - 2013
    “Big data” refers to our burgeoning ability to crunch vast collections of information, analyze it instantly, and draw sometimes profoundly surprising conclusions from it. This emerging science can translate myriad phenomena—from the price of airline tickets to the text of millions of books—into searchable form, and uses our increasing computing power to unearth epiphanies that we never could have seen before. A revolution on par with the Internet or perhaps even the printing press, big data will change the way we think about business, health, politics, education, and innovation in the years to come. It also poses fresh threats, from the inevitable end of privacy as we know it to the prospect of being penalized for things we haven’t even done yet, based on big data’s ability to predict our future behavior. In this brilliantly clear, often surprising work, two leading experts explain what big data is, how it will change our lives, and what we can do to protect ourselves from its hazards. Big Data is the first big book about the next big thing. www.big-data-book.com

Crypto: How the Code Rebels Beat the Government—Saving Privacy in the Digital Age


Steven Levy - 2001
    From Steven Levy—the author who made "hackers" a household word—comes this account of a revolution that is already affecting every citizen in the twenty-first century. Crypto tells the inside story of how a group of "crypto rebels"—nerds and visionaries turned freedom fighters—teamed up with corporate interests to beat Big Brother and ensure our privacy on the Internet. Levy's history of one of the most controversial and important topics of the digital age reads like the best futuristic fiction.

The Beginning of Infinity: Explanations That Transform the World


David Deutsch - 2011
    Taking us on a journey through every fundamental field of science, as well as the history of civilization, art, moral values, and the theory of political institutions, Deutsch tracks how we form new explanations and drop bad ones, explaining the conditions under which progress—which he argues is potentially boundless—can and cannot happen. Hugely ambitious and highly original, The Beginning of Infinity explores and establishes deep connections between the laws of nature, the human condition, knowledge, and the possibility for progress.

God Created the Integers: The Mathematical Breakthroughs That Changed History


Stephen Hawking - 2005
    In this collection of landmark mathematical works, editor Stephen Hawking has assembled the greatest feats humans have ever accomplished using just numbers and their brains.