Innumeracy: Mathematical Illiteracy and Its Consequences


John Allen Paulos - 1988
    Dozens of examples in Innumeracy show us how it affects not only personal economics and travel plans, but also helps explain mis-chosen mates, inappropriate drug testing, and the allure of pseudoscience.

We Have No Idea: A Guide to the Unknown Universe


Jorge Cham - 2017
    While they're at it, they helpfully demystify many complicated things we do know about, from quarks and neutrinos to gravitational waves and exploding black holes. With equal doses of humor and delight, they invite us to see the universe as a vast expanse of mostly uncharted territory that's still ours to explore. This entertaining illustrated science primer is the perfect book for anyone who's curious about all the big questions physicists are still trying to answer.

Math with Bad Drawings


Ben Orlin - 2018
    In MATH WITH BAD DRAWINGS, Ben Orlin answers math's three big questions: Why do I need to learn this? When am I ever going to use it? Why is it so hard? The answers come in various forms: cartoons, drawings, jokes, and the stories and insights of an empathetic teacher who believes that math should belong to everyone. Eschewing the tired old curriculum that begins in the wading pool of addition and subtraction and progresses to the shark-infested waters of calculus (AKA the Great Weed-Out Course), Orlin instead shows us how to think like a mathematician by teaching us a new game of Tic-Tac-Toe, how to understand an economic crisis by rolling a pair of dice, and the mathematical reason why you should never buy a second lottery ticket. Every example in the book is illustrated with his trademark "bad drawings," which convey both his humor and his message with perfect pitch and clarity. Organized by unconventional but compelling topics such as "Statistics: The Fine Art of Honest Lying," "Design: The Geometry of Stuff That Works," and "Probability: The Mathematics of Maybe," MATH WITH BAD DRAWINGS is a perfect read for fans of illustrated popular science.

The Book of Nothing: Vacuums, Voids, and the Latest Ideas about the Origins of the Universe


John D. Barrow - 2000
    Why did Augustine equate nothingness with the Devil? What tortuous means did 17th-century scientists employ in their attempts to create a vacuum? And why do contemporary quantum physicists believe that the void is actually seething with subatomic activity? You’ll find the answers in this dizzyingly erudite and elegantly explained book by the English cosmologist John D. Barrow. Ranging through mathematics, theology, philosophy, literature, particle physics, and cosmology, The Book of Nothing explores the enduring hold that vacuity has exercised on the human imagination. Combining high-wire speculation with a wealth of reference that takes in Freddie Mercury and Shakespeare alongside Isaac Newton, Albert Einstein, and Stephen Hawking, the result is a fascinating excursion to the vanishing point of our knowledge.

The Ascent of Man


Jacob Bronowski - 1973
    Bronowski's exciting, illustrated investigation offers a perspective not just on science, but on civilization itself. Contents: Foreword; Lower than the Angels; The Harvest of the Seasons; The Grain in the Stone; The Hidden Structure; The Music of the Spheres; The Starry Messenger; The Majestic Clockwork; The Drive for Power; The Ladder of Creation; World within World; Knowledge or Certainty; Generation upon Generation; The Long Childhood; Bibliography; Index.

Physics and Philosophy: The Revolution in Modern Science


Werner Heisenberg - 1958
    The theme of Heisenberg's exposition is that words and concepts familiar in daily life can lose their meaning in the world of relativity and quantum physics. This in turn has profound philosophical implications for the nature of reality and for our total world view.

The Laws of Thermodynamics: A Very Short Introduction


Peter Atkins - 2010
    From the sudden expansion of a cloud of gas to the cooling of hot metal--everything is moved or restrained by four simple laws. Written by Peter Atkins, one of the world's leading authorities on thermodynamics, this powerful and compact introduction explains what these four laws are and how they work, using accessible language and virtually no mathematics. Guiding the reader a step at a time, Atkins begins with the Zeroth Law (so named because the first two laws were well established before scientists realized that another law, relating to temperature, should precede them--hence the jocular name), and proceeds through the First, Second, and Third Laws, offering a clear account of concepts such as the availability of work and the conservation of energy. Atkins ranges from the fascinating theory of entropy (revealing how its unstoppable rise constitutes the engine of the universe), through the concept of free energy, and to the brink, and then beyond the brink, of absolute zero. About the Series: Combining authority with wit, accessibility, and style, Very Short Introductions offer an introduction to some of life's most interesting topics. Written by experts for the newcomer, they demonstrate the finest contemporary thinking about the central problems and issues in hundreds of key topics, from philosophy to Freud, quantum theory to Islam.
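
    As a quick reference (these are the standard textbook statements of the laws, not Atkins's own wording, and they use more symbols than his virtually math-free text does), the four laws can be summarized as:

        \begin{align*}
        \text{Zeroth:}\ & A \sim C \ \text{and}\ B \sim C \implies A \sim B
            && (\sim \text{ means "in thermal equilibrium with"; this is what makes temperature well defined})\\
        \text{First:}\ & \Delta U = Q - W
            && (\text{energy is conserved; } W \text{ is the work done by the system})\\
        \text{Second:}\ & \Delta S_{\mathrm{tot}} \ge 0
            && (\text{the total entropy never decreases in a spontaneous process})\\
        \text{Third:}\ & S \to S_0 \ \text{as}\ T \to 0
            && (\text{entropy approaches a constant; absolute zero cannot be reached in finitely many steps})
        \end{align*}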

The Science Book: Big Ideas Simply Explained


Rob Scott Colson - 2014
    The Science Book covers every area of science--astronomy, biology, chemistry, geology, math, and physics--and brings the greatest scientific ideas to life with fascinating text, quirky graphics, and pithy quotes.

The Outer Limits of Reason: What Science, Mathematics, and Logic Cannot Tell Us


Noson S. Yanofsky - 2013
    This book investigates what cannot be known. Rather than exploring the amazing facts that science, mathematics, and reason have revealed to us, this work studies what science, mathematics, and reason tell us cannot be revealed. In The Outer Limits of Reason, Noson Yanofsky considers what cannot be predicted, described, or known, and what will never be understood. He discusses the limitations of computers, physics, logic, and our own thought processes. Yanofsky describes simple tasks that would take computers trillions of centuries to complete and other problems that computers can never solve; perfectly formed English sentences that make no sense; different levels of infinity; the bizarre world of the quantum; the relevance of relativity theory; the causes of chaos theory; math problems that cannot be solved by normal means; and statements that are true but cannot be proven. He explains the limitations of our intuitions about the world -- our ideas about space, time, and motion, and the complex relationship between the knower and the known. Moving from the concrete to the abstract, from problems of everyday language to straightforward philosophical questions to the formalities of physics and mathematics, Yanofsky demonstrates a myriad of unsolvable problems and paradoxes. Exploring the various limitations of our knowledge, he shows that many of these limitations have a similar pattern and that by investigating these patterns, we can better understand the structure and limitations of reason itself. Yanofsky even attempts to look beyond the borders of reason to see what, if anything, is out there.

How to Solve It: A New Aspect of Mathematical Method


George Pólya - 1944
    A perennial bestseller by eminent mathematician G. Pólya, How to Solve It will show anyone in any field how to think straight. In lucid and appealing prose, Pólya reveals how the mathematical method of demonstrating a proof or finding an unknown can be of help in attacking any problem that can be reasoned out--from building a bridge to winning a game of anagrams. Generations of readers have relished Pólya's deft--indeed, brilliant--instructions on stripping away irrelevancies and going straight to the heart of the problem.

Everything and More: A Compact History of Infinity


David Foster Wallace - 2003
    Now he brings his considerable talents to the history of one of math's most enduring puzzles: the seemingly paradoxical nature of infinity. Is infinity a valid mathematical property or a meaningless abstraction? The nineteenth-century mathematical genius Georg Cantor's answer to this question not only surprised him but also shook the very foundations upon which math had been built. Cantor's counterintuitive discovery of a progression of larger and larger infinities created controversy in his time and may have hastened his mental breakdown, but it also helped lead to the development of set theory, analytic philosophy, and even computer technology. Smart, challenging, and thoroughly rewarding, Wallace's tour de force brings immediate and high-profile recognition to the bizarre and fascinating world of higher mathematics.
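
    To make the "progression of larger and larger infinities" concrete (this is the standard statement of Cantor's results, not Wallace's wording): the naturals and the reals are both infinite, yet Cantor's diagonal argument shows the reals are strictly bigger, and his general theorem that any set is strictly smaller than its power set forces an endless ascent:

        \[
        |\mathbb{N}| = \aleph_0 < 2^{\aleph_0} = |\mathbb{R}|,
        \qquad |S| < |\mathcal{P}(S)| \ \text{for every set } S,
        \qquad \text{so } \aleph_0 < 2^{\aleph_0} < 2^{2^{\aleph_0}} < \cdots
        \]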

Mathematics: A Very Short Introduction


Timothy Gowers - 2002
    The most fundamental differences are philosophical, and readers of this book will emerge with a clearer understanding of paradoxical-sounding concepts such as infinity, curved space, and imaginary numbers. The first few chapters are about general aspects of mathematical thought. These are followed by discussions of more specific topics, and the book closes with a chapter answering common sociological questions about the mathematical community (such as "Is it true that mathematicians burn out at the age of 25?"). It is the ideal introduction for anyone who wishes to deepen their understanding of mathematics. About the Series: Combining authority with wit, accessibility, and style, Very Short Introductions offer an introduction to some of life's most interesting topics. Written by experts for the newcomer, they demonstrate the finest contemporary thinking about the central problems and issues in hundreds of key topics, from philosophy to Freud, quantum theory to Islam.

The Tao of Physics: An Exploration of the Parallels between Modern Physics and Eastern Mysticism


Fritjof Capra - 1975
    

The Mismeasure of Man


Stephen Jay Gould - 1982
    Gould's brilliant, funny, engaging prose dissects the motivations behind those who would judge intelligence, and hence worth, by cranial size, convolutions, or score on extremely narrow tests. How did scientists decide that intelligence was unipolar and quantifiable? Why did the standard keep changing over time? Gould's answer is clear and simple: power maintains itself. European men of the 19th century, even before Darwin, saw themselves as the pinnacle of creation and sought to prove this assertion through hard measurement. When one measure was found to place members of some "inferior" group such as women or Southeast Asians over the supposedly rightful champions, it would be discarded and replaced with a new, more comfortable measure. The 20th-century obsession with numbers led to the institutionalization of IQ testing and subsequent assignment to work (and rewards) commensurate with the score, shown by Gould to be not simply misguided--for surely intelligence is multifactorial--but also regressive, creating a feedback loop rewarding the rich and powerful. The revised edition includes a scathing critique of Herrnstein and Murray's The Bell Curve, taking them to task for rehashing old arguments to exploit a new political wave of uncaring belt tightening. It might not make you any smarter, but The Mismeasure of Man will certainly make you think. --Rob Lightner. This edition is revised and expanded, with a new introduction.

The Theory That Would Not Die: How Bayes' Rule Cracked the Enigma Code, Hunted Down Russian Submarines, and Emerged Triumphant from Two Centuries of Controversy


Sharon Bertsch McGrayne - 2011
    To its adherents, it is an elegant statement about learning from experience. To its opponents, it is subjectivity run amok. In the first-ever account of Bayes' rule for general readers, Sharon Bertsch McGrayne explores this controversial theorem and the human obsessions surrounding it. She traces its discovery by an amateur mathematician in the 1740s through its development into roughly its modern form by French scientist Pierre Simon Laplace. She reveals why respected statisticians rendered it professionally taboo for 150 years—at the same time that practitioners relied on it to solve crises involving great uncertainty and scanty information (Alan Turing's role in breaking Germany's Enigma code during World War II), and explains how the advent of off-the-shelf computer technology in the 1980s proved to be a game-changer. Today, Bayes' rule is used everywhere from DNA decoding to Homeland Security. Drawing on primary source material and interviews with statisticians and other scientists, The Theory That Would Not Die is the riveting account of how a seemingly simple theorem ignited one of the greatest controversies of all time.
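
    For readers who want the theorem itself (the standard modern statement, not a quotation from McGrayne): Bayes' rule says that the probability of a hypothesis H after seeing data D is the prior probability of H reweighted by how well H predicts that data:

        \[
        P(H \mid D) \;=\; \frac{P(D \mid H)\,P(H)}{P(D)}
        \]

    In the "learning from experience" reading, $P(H)$ is the prior belief, $P(D \mid H)$ is the likelihood of the observed data under the hypothesis, and $P(H \mid D)$ is the updated (posterior) belief.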