What Is Real?: The Unfinished Quest for the Meaning of Quantum Physics


Adam Becker - 2018
    But ask what it means, and the result will be a brawl. For a century, most physicists have followed Niels Bohr's Copenhagen interpretation and dismissed questions about the reality underlying quantum physics as meaningless. A mishmash of solipsism and poor reasoning, Copenhagen endured, as Bohr's students vigorously protected his legacy, and the physics community favored practical experiments over philosophical arguments. As a result, questioning the status quo long meant professional ruin. And yet, from the 1920s to today, physicists like John Bell, David Bohm, and Hugh Everett persisted in seeking the true meaning of quantum mechanics. What Is Real? is the gripping story of this battle of ideas and of the courageous scientists who dared to stand up for truth.

On Intelligence


Jeff Hawkins - 2004
    Now he stands ready to revolutionize both neuroscience and computing in one stroke, with a new understanding of intelligence itself. Hawkins develops a powerful theory of how the human brain works, explaining why computers are not intelligent and how, based on this new theory, we can finally build intelligent machines. The brain is not a computer, but a memory system that stores experiences in a way that reflects the true structure of the world, remembering sequences of events and their nested relationships and making predictions based on those memories. It is this memory-prediction system that forms the basis of intelligence, perception, creativity, and even consciousness. In an engaging style that will captivate audiences from the merely curious to the professional scientist, Hawkins shows how a clear understanding of how the brain works will make it possible for us to build intelligent machines, in silicon, that will exceed our human ability in surprising ways. Written with acclaimed science writer Sandra Blakeslee, On Intelligence promises to completely transfigure the possibilities of the technology age. It is a landmark book in its scope and clarity.
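
    The memory-prediction idea can be caricatured in a few lines of code. The sketch below is not Hawkins's cortical model (no hierarchy, no neurons, no invariant representations), and its names are invented for illustration; it only shows the core claim in miniature: store sequences as they are experienced, then "predict" by recalling what has followed the same context before.

from collections import defaultdict, Counter

class SequenceMemory:
    """Toy sketch of memory-based prediction: remember what followed each
    short context, then predict by recall rather than by computed rules."""

    def __init__(self, context_len=2):
        self.context_len = context_len
        self.memory = defaultdict(Counter)   # context tuple -> counts of next symbols

    def observe(self, symbols):
        # Store every (context, next-symbol) pair seen in the stream.
        for i in range(len(symbols) - self.context_len):
            context = tuple(symbols[i:i + self.context_len])
            self.memory[context][symbols[i + self.context_len]] += 1

    def predict(self, context):
        # Prediction is recall: the most frequently remembered continuation.
        seen = self.memory.get(tuple(context))
        return seen.most_common(1)[0][0] if seen else None

mem = SequenceMemory(context_len=2)
mem.observe("the cat sat on the mat the cat sat down the cat ran".split())
print(mem.predict(["the", "cat"]))   # -> 'sat', the more often remembered continuation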

Artificial Intelligence: A Modern Approach


Stuart Russell - 1994
    The long-anticipated revision of this best-selling text offers the most comprehensive, up-to-date introduction to the theory and practice of artificial intelligence.
    * NEW: Nontechnical learning material accompanies each part of the book.
    * NEW: The Internet as a sample application for intelligent systems, added in several places including logical agents, planning, and natural language.
    * NEW: Increased coverage of material, including expanded coverage of default reasoning and truth maintenance systems; multi-agent/distributed AI and game theory; probabilistic approaches to learning, including EM; and more detailed descriptions of probabilistic inference algorithms.
    * NEW: Updated and expanded exercises; 75% of the exercises are revised, with 100 new exercises.
    * NEW: On-line Java software makes it easy for students to do projects on the web using intelligent agents.
    * A unified, agent-based approach to AI organizes the material around the task of building intelligent agents.
    * Comprehensive, up-to-date coverage includes a unified view of the field organized around the rational decision-making paradigm.
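
    The unifying abstraction is simple enough to show directly: an agent maps what it has perceived to an action, and a rational agent chooses the action expected to do best by some performance measure. The sketch below is written for this note rather than taken from the book, using a two-square vacuum world as a toy environment; the class and action names are illustrative.

class ReflexVacuumAgent:
    """A simple reflex agent: chooses its action from the current percept alone."""

    def act(self, percept):
        location, status = percept            # e.g. ("A", "Dirty")
        if status == "Dirty":
            return "Suck"
        return "Right" if location == "A" else "Left"

agent = ReflexVacuumAgent()
for percept in [("A", "Dirty"), ("A", "Clean"), ("B", "Dirty"), ("B", "Clean")]:
    print(percept, "->", agent.act(percept))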

Journey through Genius: The Great Theorems of Mathematics


William Dunham - 1990
    Now William Dunham gives them the attention they deserve. Dunham places each theorem within its historical context and explores the very human and often turbulent life of the creator — from Archimedes, the absentminded theoretician whose absorption in his work often precluded eating or bathing, to Gerolamo Cardano, the sixteenth-century mathematician whose accomplishments flourished despite a bizarre array of misadventures, to the paranoid genius of modern times, Georg Cantor. He also provides step-by-step proofs for the theorems, each easily accessible to readers with no more than a knowledge of high school mathematics. A rare combination of the historical, biographical, and mathematical, Journey Through Genius is a fascinating introduction to a neglected field of human creativity.
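
    As a taste of what such a proof looks like, here is Euclid's argument that there are infinitely many primes, a classic of exactly this kind, written out as a short LaTeX fragment.

% Euclid's proof of the infinitude of primes: a complete "great theorem"
% needing nothing beyond school arithmetic.
\textbf{Theorem.} There are infinitely many prime numbers.

\textbf{Proof.} Suppose the only primes were $p_1, p_2, \ldots, p_n$ and let
$N = p_1 p_2 \cdots p_n + 1$. Dividing $N$ by any $p_i$ leaves remainder $1$,
so no $p_i$ divides $N$. But every integer greater than $1$ has a prime divisor,
so $N$ has a prime divisor missing from the list, a contradiction. $\blacksquare$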

The Mismeasure of Man


Stephen Jay Gould - 1982
    Gould's brilliant, funny, engaging prose dissects the motivations behind those who would judge intelligence, and hence worth, by cranial size, convolutions, or score on extremely narrow tests. How did scientists decide that intelligence was unipolar and quantifiable? Why did the standard keep changing over time? Gould's answer is clear and simple: power maintains itself. European men of the 19th century, even before Darwin, saw themselves as the pinnacle of creation and sought to prove this assertion through hard measurement. When one measure was found to place members of some "inferior" group such as women or Southeast Asians over the supposedly rightful champions, it would be discarded and replaced with a new, more comfortable measure. The 20th-century obsession with numbers led to the institutionalization of IQ testing and subsequent assignment to work (and rewards) commensurate with the score, shown by Gould to be not simply misguided--for surely intelligence is multifactorial--but also regressive, creating a feedback loop rewarding the rich and powerful. The revised edition includes a scathing critique of Herrnstein and Murray's The Bell Curve, taking them to task for rehashing old arguments to exploit a new political wave of uncaring belt tightening. It might not make you any smarter, but The Mismeasure of Man will certainly make you think. --Rob Lightner. This edition is revised and expanded, with a new introduction.

Time Reborn: From the Crisis in Physics to the Future of the Universe


Lee Smolin - 2013
    You experience it passing every day when you watch clocks tick, bread toast, and children grow. But most physicists see things differently, from Newton to Einstein to today’s quantum theorists. For them, time isn’t real. You may think you experience time passing, but they say it’s just an illusion. Lee Smolin, author of the controversial bestseller The Trouble with Physics, argues this limited notion of time is holding physics back. It’s time for a major revolution in scientific thought. The reality of time could be the key to the next big breakthrough in theoretical physics. What if the laws of physics themselves were not timeless? What if they could evolve? Time Reborn offers a radical new approach to cosmology that embraces the reality of time and opens up a whole new universe of possibilities. There are few ideas that, like our notion of time, shape our thinking about literally everything, with major implications for physics and beyond—from climate change to the economic crisis. Smolin explains in lively and lucid prose how the true nature of time impacts our world.

Sciencia: Mathematics, Physics, Chemistry, Biology, and Astronomy for All


Burkard Polster - 2011
    Lavishly illustrated with engravings, woodcuts, and original drawings and diagrams, Sciencia will inspire readers of all ages to take an interest in the interconnected knowledge of the modern sciences. Beautifully produced in thirteen different colors of ink, Sciencia is an essential reference and an elegant gift. Wooden Books was founded in 1999 by designer John Martineau near Hay-on-Wye. The aim was to produce a beautiful series of recycled books based on the classical philosophies, arts and sciences. Using the Beatrix Potter formula of text facing picture pages, and old-style fonts, along with hand-drawn illustrations and 19th century engravings, the books are designed not to date. Small but stuffed with information. Eco friendly and educational. Big ideas in a tiny space. There are over 1,000,000 Wooden Books now in print worldwide and growing.

The Principia: Mathematical Principles of Natural Philosophy


Isaac Newton - 1687
    Even after more than three centuries and the revolutions of Einsteinian relativity and quantum mechanics, Newtonian physics continues to account for many of the phenomena of the observed world, and Newtonian celestial dynamics is used to determine the orbits of our space vehicles. This completely new translation, the first in 270 years, is based on the third (1726) edition, the final revised version approved by Newton; it includes extracts from the earlier editions, corrects errors found in earlier versions, and replaces archaic English with contemporary prose and up-to-date mathematical forms. Newton's principles describe acceleration, deceleration, and inertial movement; fluid dynamics; and the motions of the earth, moon, planets, and comets. A great work in itself, the Principia also revolutionized the methods of scientific investigation. It set forth the three fundamental laws of motion and the law of universal gravity, the physical principles that account for the Copernican system of the world as emended by Kepler, thus effectively ending controversy concerning the Copernican planetary system. The illuminating Guide to the Principia by I. Bernard Cohen, along with his and Anne Whitman's translation, will make this preeminent work truly accessible for today's scientists, scholars, and students.
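
    In modern notation (the Principia itself states them geometrically and in words rather than algebraically), the laws referred to are the law of inertia, that a body remains at rest or in uniform straight-line motion unless acted on by a net force, together with:

% Newton's laws in modern notation; the Principia states them verbally and geometrically.
\begin{align*}
\text{Second law:} \quad & \mathbf{F} = \frac{d\mathbf{p}}{dt} = m\mathbf{a} \quad \text{(for constant mass)} \\
\text{Third law:} \quad & \mathbf{F}_{AB} = -\mathbf{F}_{BA} \\
\text{Universal gravitation:} \quad & F = G \, \frac{m_1 m_2}{r^2}
\end{align*}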

The Math Book: From Pythagoras to the 57th Dimension, 250 Milestones in the History of Mathematics


Clifford A. Pickover - 2009
    Beginning millions of years ago with ancient “ant odometers” and moving through time to our modern-day quest for new dimensions, it covers 250 milestones in mathematical history. Among the numerous delights readers will learn about as they dip into this inviting anthology: cicada-generated prime numbers, magic squares from centuries ago, the discovery of pi and calculus, and the butterfly effect. Each topic gets a lavishly illustrated spread with stunning color art, along with formulas and concepts, fascinating facts about scientists’ lives, and real-world applications of the theorems.

A Short History of Nearly Everything


Bill Bryson - 2003
    Taking as territory everything from the Big Bang to the rise of civilization, Bryson seeks to understand how we got from there being nothing at all to there being us. To that end, he has attached himself to a host of the world’s most advanced (and often obsessed) archaeologists, anthropologists, and mathematicians, travelling to their offices, laboratories, and field camps. He has read (or tried to read) their books, pestered them with questions, apprenticed himself to their powerful minds. A Short History of Nearly Everything is the record of this quest, and it is a sometimes profound, sometimes funny, and always supremely clear and entertaining adventure in the realms of human knowledge, as only Bill Bryson can render it. Science has never been more involving or entertaining.

Lost in Math: How Beauty Leads Physics Astray


Sabine Hossenfelder - 2018
    Whether pondering black holes or predicting discoveries at CERN, physicists believe the best theories are beautiful, natural, and elegant, and this standard separates popular theories from disposable ones. This is why, Sabine Hossenfelder argues, we have not seen a major breakthrough in the foundations of physics for more than four decades. The belief in beauty has become so dogmatic that it now conflicts with scientific objectivity: observation has been unable to confirm mind-boggling theories, like supersymmetry or grand unification, invented by physicists based on aesthetic criteria. Worse, these "too good to not be true" theories are actually untestable, and they have left the field in a cul-de-sac. To escape, physicists must rethink their methods. Only by embracing reality as it is can science discover the truth.

The Theory That Would Not Die: How Bayes' Rule Cracked the Enigma Code, Hunted Down Russian Submarines, and Emerged Triumphant from Two Centuries of Controversy


Sharon Bertsch McGrayne - 2011
    To its adherents, it is an elegant statement about learning from experience. To its opponents, it is subjectivity run amok. In the first-ever account of Bayes' rule for general readers, Sharon Bertsch McGrayne explores this controversial theorem and the human obsessions surrounding it. She traces its discovery by an amateur mathematician in the 1740s through its development into roughly its modern form by French scientist Pierre Simon Laplace. She reveals why respected statisticians rendered it professionally taboo for 150 years—at the same time that practitioners relied on it to solve crises involving great uncertainty and scanty information (Alan Turing's role in breaking Germany's Enigma code during World War II), and explains how the advent of off-the-shelf computer technology in the 1980s proved to be a game-changer. Today, Bayes' rule is used everywhere from DNA decoding to Homeland Security. Drawing on primary source material and interviews with statisticians and other scientists, The Theory That Would Not Die is the riveting account of how a seemingly simple theorem ignited one of the greatest controversies of all time.
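
    The rule at the heart of the story fits on one line: the probability of a hypothesis given new evidence is proportional to how strongly the hypothesis predicts that evidence, weighted by how plausible the hypothesis was beforehand. A minimal sketch of such "learning from experience", using invented numbers for a toy diagnostic test, looks like this:

# Bayes' rule: P(H | E) = P(E | H) * P(H) / P(E).
# Toy diagnostic-test example; the numbers are invented for illustration.

def posterior(prior, sensitivity, false_positive_rate):
    """Probability the condition is present given a positive test result."""
    p_positive = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / p_positive

# A 1% prior, a 90%-sensitive test, and a 5% false-positive rate
# still leave only about a 15% chance the condition is present.
print(round(posterior(prior=0.01, sensitivity=0.90, false_positive_rate=0.05), 3))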

Hands-On Machine Learning with Scikit-Learn and TensorFlow


Aurélien Géron - 2017
    Now that machine learning is thriving, even programmers who know close to nothing about this technology can use simple, efficient tools to implement programs capable of learning from data. This practical book shows you how. By using concrete examples, minimal theory, and two production-ready Python frameworks—Scikit-Learn and TensorFlow—author Aurélien Géron helps you gain an intuitive understanding of the concepts and tools for building intelligent systems. You’ll learn how to use a range of techniques, starting with simple Linear Regression and progressing to Deep Neural Networks. If you have some programming experience and you’re ready to code a machine learning project, this guide is for you. This hands-on book shows you how to use:
    * Scikit-Learn, an accessible framework that implements many algorithms efficiently and serves as a great machine learning entry point
    * TensorFlow, a more complex library for distributed numerical computation, ideal for training and running very large neural networks
    * Practical code examples that you can apply without learning excessive machine learning theory or algorithm details
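
    A minimal sketch of the kind of starting point described above: plain Scikit-Learn usage written for this note (not an example taken from the book), fitting a Linear Regression model to synthetic data and making a prediction.

# Fit a linear model to noisy synthetic data, then predict on a new point.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(100, 1))                # one input feature
y = 3.0 * X.ravel() + 4.0 + rng.normal(0, 1, 100)    # noisy linear target

model = LinearRegression().fit(X, y)
print(model.coef_, model.intercept_)                 # close to [3.0] and 4.0
print(model.predict([[5.0]]))                        # close to [19.0]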

Everything and More: A Compact History of Infinity


David Foster Wallace - 2003
    Now he brings his considerable talents to the history of one of math's most enduring puzzles: the seemingly paradoxical nature of infinity. Is infinity a valid mathematical property or a meaningless abstraction? The nineteenth-century mathematical genius Georg Cantor's answer to this question not only surprised him but also shook the very foundations upon which math had been built. Cantor's counterintuitive discovery of a progression of larger and larger infinities created controversy in his time and may have hastened his mental breakdown, but it also helped lead to the development of set theory, analytic philosophy, and even computer technology. Smart, challenging, and thoroughly rewarding, Wallace's tour de force brings immediate and high-profile recognition to the bizarre and fascinating world of higher mathematics.
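
    The "progression of larger and larger infinities" follows from Cantor's theorem that every set is strictly smaller than its own power set:

% Cantor's theorem and the tower of infinities it generates.
\[
  |S| < |\mathcal{P}(S)| \qquad \text{for every set } S,
\]
\[
  \aleph_0 = |\mathbb{N}| \;<\; 2^{\aleph_0} = |\mathbb{R}| \;<\; 2^{2^{\aleph_0}} \;<\; \cdots
\]
% Proof idea: for any map f from S to P(S), the set D = { x in S : x not in f(x) }
% differs from every f(x), so no f can be onto.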

Consilience: The Unity of Knowledge


Edward O. Wilson - 1998
    In Consilience (a word that originally meant "jumping together"), Edward O. Wilson renews the Enlightenment's search for a unified theory of knowledge in disciplines that range from physics to biology, the social sciences and the humanities. Using the natural sciences as his model, Wilson forges dramatic links between fields. He explores the chemistry of the mind and the genetic bases of culture. He postulates the biological principles underlying works of art from cave-drawings to Lolita. Presenting the latest findings in prose of wonderful clarity and oratorical eloquence, and synthesizing them into a dazzling whole, Consilience is science in the path-clearing traditions of Newton, Einstein, and Richard Feynman.