The Calculus Wars: Newton, Leibniz, and the Greatest Mathematical Clash of All Time


Jason Socrates Bardi - 2006
    A dispute over the discovery of calculus sowed the seeds of discontent between two of the greatest scientific giants of all time: Sir Isaac Newton and Gottfried Wilhelm Leibniz. Today Newton and Leibniz are generally considered the twin independent inventors of calculus, and both are credited with giving mathematics its greatest push forward since the time of the Greeks. Had they known each other under different circumstances, they might have been friends. But in their own lifetimes, the joint glory of calculus was not enough for either, and each declared war against the other, openly and in secret. This long and bitter dispute has been swept under the carpet by historians - perhaps because it reveals Newton and Leibniz in their worst light - but The Calculus Wars tells the full story in narrative form for the first time. This history ultimately exposes how these twin mathematical giants were brilliant, proud, at times mad, and in the end completely human.

Number: The Language of Science


Tobias Dantzig - 1930
    Tobias Dantzig shows that the development of math—from the invention of counting to the discovery of infinity—is a profoundly human story that progressed by “trying and erring, by groping and stumbling.” He shows how commerce, war, and religion led to advances in math, and he recounts the stories of individuals whose breakthroughs expanded the concept of number and created the mathematics that we know today.

The Information: A History, a Theory, a Flood


James Gleick - 2011
    The story of information begins in a time profoundly unlike our own, when every thought and utterance vanishes as soon as it is born. From the invention of scripts and alphabets to the long-misunderstood talking drums of Africa, Gleick tells the story of information technologies that changed the very nature of human consciousness. He provides portraits of the key figures contributing to the inexorable development of our modern understanding of information: Charles Babbage, the idiosyncratic inventor of the first great mechanical computer; Ada Byron, the brilliant and doomed daughter of the poet, who became the first true programmer; pivotal figures like Samuel Morse and Alan Turing; and Claude Shannon, the creator of information theory itself. And then the information age arrives. Citizens of this world become experts willy-nilly: aficionados of bits and bytes. And we sometimes feel we are drowning, swept by a deluge of signs and signals, news and images, blogs and tweets. The Information is the story of how we got here and where we are heading.

The Book of Numbers: The Secret of Numbers and How They Changed the World


Peter J. Bentley - 2008
    Numbers are part of every discipline in the sciences and the arts. With 350 illustrations, including diagrams, photographs, and computer imagery, the book chronicles the centuries-long search for the meaning of numbers by famous and lesser-known mathematicians, and explains the puzzling aspects of the mathematical world. Topics include: the earliest ideas of numbers and counting; patterns, logic, and calculating; natural, perfect, amicable, and prime numbers; numerology, the power of numbers, and superstition; the computer and the Enigma code; infinity, the speed of light, and relativity; complex numbers; the Big Bang and chaos theories; and the Philosopher's Stone. The Book of Numbers shows enthusiastically that numbers are neither boring nor dull but rather involve intriguing connections, rivalries, secret documents, and even mysterious deaths.
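
    To give one of those topics a concrete face, here is a minimal Python sketch (our illustration, not Bentley's) that finds perfect numbers and amicable pairs by summing proper divisors:

        def aliquot(n: int) -> int:
            """Sum of the proper divisors of n (divisors strictly less than n)."""
            return sum(d for d in range(1, n // 2 + 1) if n % d == 0)

        # Perfect numbers equal the sum of their own proper divisors.
        perfect = [n for n in range(2, 1000) if aliquot(n) == n]
        print(perfect)  # [6, 28, 496]

        # Amicable pairs: each number equals the sum of the other's proper divisors.
        pairs = []
        for a in range(2, 1500):
            b = aliquot(a)
            if b > a and aliquot(b) == a:
                pairs.append((a, b))
        print(pairs)  # [(220, 284), (1184, 1210)]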

An Introduction to Probability and Inductive Logic


Ian Hacking - 2001
    The book has been designed to offer maximal accessibility to the widest range of students (not only those majoring in philosophy) and assumes no formal training in elementary symbolic logic. It offers a comprehensive course covering all basic definitions of induction and probability, and considers such topics as decision theory, Bayesianism, frequency ideas, and the philosophical problem of induction. The key features of the book are: a lively and vigorous prose style; lucid and systematic organization and presentation of the ideas; many practical applications; a rich supply of exercises drawing on examples from such fields as psychology, ecology, economics, bioethics, engineering, and political science; numerous brief historical accounts of how fundamental ideas of probability and induction developed; and a full bibliography of further reading. Although designed primarily for courses in philosophy, the book could certainly be read and enjoyed by those in the social sciences (particularly psychology, economics, political science, and sociology) or medical sciences such as epidemiology seeking a reader-friendly account of the basic ideas of probability and induction. Ian Hacking is University Professor, University of Toronto. He is a Fellow of the Royal Society of Canada, a Fellow of the British Academy, and a Fellow of the American Academy of Arts and Sciences. He is the author of many books, including five previous books with Cambridge (The Logic of Statistical Inference, Why Does Language Matter to Philosophy?, The Emergence of Probability, Representing and Intervening, and The Taming of Chance).
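
    As a one-number taste of the Bayesian ideas the course covers, here is a minimal Python sketch (our example, not Hacking's; the test numbers are hypothetical) applying Bayes' rule to a diagnostic test:

        # Bayes' rule: P(condition | positive) = P(positive | condition) * P(condition) / P(positive)
        prior = 0.01           # assumed base rate of the condition
        sensitivity = 0.95     # assumed P(positive | condition)
        false_positive = 0.05  # assumed P(positive | no condition)

        # Total probability of a positive test, over both possibilities.
        p_pos = sensitivity * prior + false_positive * (1 - prior)
        posterior = sensitivity * prior / p_pos
        print(round(posterior, 3))  # 0.161: a positive test is far from a sure thing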

The Nothing That Is: A Natural History of Zero


Robert M. Kaplan - 1999
    As we enter the year 2000, zero is once again making its presence felt. Nothing itself, it makes possible a myriad of calculations. Indeed, without zero mathematics as we know it would not exist. And without mathematics our understanding of the universe would be vastly impoverished. But where did this nothing, this hollow circle, come from? Who created it? And what, exactly, does it mean? Robert Kaplan's The Nothing That Is: A Natural History of Zero begins as a mystery story, taking us back to Sumerian times, and then to Greece and India, piecing together the way the idea of a symbol for nothing evolved. Kaplan shows us just how handicapped our ancestors were in trying to figure large sums without the aid of the zero. (Try multiplying CLXIV by XXIV.) Remarkably, even the Greeks, mathematically brilliant as they were, didn't have a zero--or did they? We follow the trail to the East where, a millennium or two ago, Indian mathematicians took another crucial step. By treating zero for the first time like any other number, instead of a unique symbol, they allowed huge new leaps forward in computation, and also in our understanding of how mathematics itself works. In the Middle Ages, this mathematical knowledge swept across western Europe via Arab traders. At first it was called dangerous Saracen magic and considered the Devil's work, but it wasn't long before merchants and bankers saw how handy this magic was, and used it to develop tools like double-entry bookkeeping. Zero quickly became an essential part of increasingly sophisticated equations, and with the invention of calculus, one could say it was a linchpin of the scientific revolution. And now even deeper layers of this thing that is nothing are coming to light: our computers speak only in zeros and ones, and modern mathematics shows that zero alone can be made to generate everything. Robert Kaplan serves up all this history with immense zest and humor; his writing is full of anecdotes and asides, and quotations from Shakespeare to Wallace Stevens extend the book's context far beyond the scope of scientific specialists. For Kaplan, the history of zero is a lens for looking not only into the evolution of mathematics but into the very nature of human thought. He points out how the history of mathematics is a process of recursive abstraction: how once a symbol is created to represent an idea, that symbol itself gives rise to new operations that in turn lead to new ideas. The beauty of mathematics is that even though we invent it, we seem to be discovering something that already exists. The joy of that discovery shines from Kaplan's pages, as he ranges from Archimedes to Einstein, making fascinating connections between mathematical insights from every age and culture. A tour de force of science history, The Nothing That Is takes us through the hollow circle that leads to infinity.
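
    To feel the handicap Kaplan describes, here is a minimal Python sketch (ours, not from the book) that does the blurb's challenge, CLXIV times XXIV, by converting to positional notation and back, the very step Roman arithmetic lacked:

        # Symbols in descending order, with the subtractive pairs (CM, XC, ...).
        ROMAN = [(1000, "M"), (900, "CM"), (500, "D"), (400, "CD"),
                 (100, "C"), (90, "XC"), (50, "L"), (40, "XL"),
                 (10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I")]

        def from_roman(s: str) -> int:
            """Convert a Roman numeral to an integer, scanning left to right."""
            values = {"I": 1, "V": 5, "X": 10, "L": 50, "C": 100, "D": 500, "M": 1000}
            total = 0
            for ch, nxt in zip(s, s[1:] + " "):  # pair each symbol with its successor
                v = values[ch]
                # Subtract when a smaller symbol precedes a larger one (e.g. IV).
                total += -v if nxt != " " and values[nxt] > v else v
            return total

        def to_roman(n: int) -> str:
            """Convert an integer to a Roman numeral greedily."""
            out = []
            for value, symbol in ROMAN:
                while n >= value:
                    out.append(symbol)
                    n -= value
            return "".join(out)

        # CLXIV = 164, XXIV = 24, and 164 * 24 = 3936.
        print(to_roman(from_roman("CLXIV") * from_roman("XXIV")))  # MMMCMXXXVI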

Introduction to Linear Algebra


Gilbert Strang - 1993
    Topics covered include matrix multiplication, row reduction, matrix inverses, orthogonality, and computation. The self-teaching book is loaded with examples and graphics and provides a wide array of probing problems, accompanying solutions, and a glossary. Chapter 1: Introduction to Vectors; Chapter 2: Solving Linear Equations; Chapter 3: Vector Spaces and Subspaces; Chapter 4: Orthogonality; Chapter 5: Determinants; Chapter 6: Eigenvalues and Eigenvectors; Chapter 7: Linear Transformations; Chapter 8: Applications; Chapter 9: Numerical Linear Algebra; Chapter 10: Complex Vectors and Matrices. Back matter: Solutions to Selected Exercises; Final Exam; Matrix Factorizations; Conceptual Questions for Review; Glossary: A Dictionary for Linear Algebra; Index; Teaching Codes; Linear Algebra in a Nutshell.
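
    As a glimpse of the "Solving Linear Equations" material in action, here is a minimal NumPy sketch (our illustration, not Strang's Teaching Codes) that solves Ax = b and checks the answer by matrix multiplication:

        import numpy as np

        # Solve the 2x2 system  x + 2y = 5,  3x + 4y = 11.
        A = np.array([[1.0, 2.0],
                      [3.0, 4.0]])
        b = np.array([5.0, 11.0])

        x = np.linalg.solve(A, b)      # LU factorization, i.e. row reduction under the hood
        print(x)                       # [1. 2.]
        print(np.allclose(A @ x, b))   # True: multiplying back checks the solution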

Euclid's Elements


Euclid
    Heath's translation of the thirteen books of Euclid's Elements. In keeping with Green Lion's design commitment, diagrams have been placed on every spread for convenient reference while working through the proofs; running heads on every page indicate both Euclid's book number and proposition numbers for that page; and adequate space for notes is allowed between propositions and around diagrams. The all-new index has built into it a glossary of Euclid's Greek terms. Heath's translation has stood the test of time, and, as one done by a renowned scholar of ancient mathematics, it can be relied upon not to have inadvertently introduced modern concepts or nomenclature. We have excised the voluminous historical and scholarly commentary that swells the Dover edition to three volumes and impedes classroom use of the original text. The single volume is not only more convenient, but less expensive as well.

Everything and More: A Compact History of Infinity


David Foster Wallace - 2003
    Now he brings his considerable talents to the history of one of math's most enduring puzzles: the seemingly paradoxical nature of infinity. Is infinity a valid mathematical property or a meaningless abstraction? The nineteenth-century mathematical genius Georg Cantor's answer to this question not only surprised him but also shook the very foundations upon which math had been built. Cantor's counterintuitive discovery of a progression of larger and larger infinities created controversy in his time and may have hastened his mental breakdown, but it also helped lead to the development of set theory, analytic philosophy, and even computer technology. Smart, challenging, and thoroughly rewarding, Wallace's tour de force brings immediate and high-profile recognition to the bizarre and fascinating world of higher mathematics.
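
    For readers who want the heart of Cantor's discovery in a few lines, here is his diagonal argument in compact form (our summary, in LaTeX, not Wallace's text):

        Suppose $f\colon \mathbb{N} \to \{0,1\}^{\mathbb{N}}$ listed every infinite
        binary sequence. Define the diagonal sequence $d$ by
        \[ d(n) = 1 - f(n)(n). \]
        Then $d$ differs from $f(n)$ at position $n$ for every $n$, so $d$ appears
        nowhere in the list. No such $f$ exists, hence
        $\lvert \{0,1\}^{\mathbb{N}} \rvert > \lvert \mathbb{N} \rvert$:
        some infinities are strictly larger than others.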

A Mathematician's Lament


Paul Lockhart
    Lockhart argues that school mathematics teaching has reduced a beautiful, imaginative art form to rote procedure, and he proposes his solution.

How to read and do proofs


Daniel Solow - 1982
    Shows how any proof can be understood as a sequence of techniques. Covers the full range of techniques used in proofs, such as the contrapositive, induction, and proof by contradiction. Explains how to identify which techniques are used and how they are applied in the specific problem. Illustrates how to read written proofs with many step-by-step examples. Includes new, expanded appendices related to discrete mathematics, linear algebra, modern algebra and real analysis.
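
    As a worked instance of one technique the book catalogs, proof by contradiction, here is the classic argument that the square root of 2 is irrational (our example, in LaTeX, not Solow's):

        Suppose $\sqrt{2} = p/q$ with $p, q$ integers sharing no common factor.
        Then $p^2 = 2q^2$, so $p^2$ is even, hence $p$ is even; write $p = 2r$.
        Substituting gives $4r^2 = 2q^2$, so $q^2 = 2r^2$ and $q$ is even too,
        contradicting the assumption that $p$ and $q$ share no common factor.
        Therefore $\sqrt{2}$ is irrational. $\blacksquare$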

The Monty Hall Problem: The Remarkable Story of Math's Most Contentious Brain Teaser


Jason Rosenhouse - 2009
    Imagine that you face three doors, behind one of which is a prize. You choose one but do not open it. The host--call him Monty Hall--opens a different door, always choosing one he knows to be empty. Left with two doors, will you do better by sticking with your first choice, or by switching to the other remaining door? In this light-hearted yet ultimately serious book, Jason Rosenhouse explores the history of this fascinating puzzle. Using a minimum of mathematics (and none at all for much of the book), he shows how the problem has fascinated philosophers, psychologists, and many others, and examines the many variations that have appeared over the years. As Rosenhouse demonstrates, the Monty Hall Problem illuminates fundamental mathematical issues and has abiding philosophical implications. Perhaps most important, he writes, the problem opens a window on our cognitive difficulties in reasoning about uncertainty.
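
    The switching advantage at the heart of the book is easy to check empirically; below is a minimal Monte Carlo sketch (ours, not Rosenhouse's) that plays the game under the stated rules:

        import random

        def play(switch: bool) -> bool:
            """Play one round of Monty Hall; return True if the contestant wins."""
            doors = [0, 1, 2]
            prize = random.choice(doors)
            choice = random.choice(doors)
            # The host opens a door that is neither the contestant's pick nor the prize.
            opened = random.choice([d for d in doors if d != choice and d != prize])
            if switch:
                # Move to the one remaining closed door.
                choice = next(d for d in doors if d != choice and d != opened)
            return choice == prize

        trials = 100_000
        for strategy in (False, True):
            wins = sum(play(strategy) for _ in range(trials))
            label = "switch" if strategy else "stick"
            print(f"{label}: {wins / trials:.3f}")  # ~0.333 stick, ~0.667 switch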

Elementary Statistics: A Step by Step Approach


Allan G. Bluman - 1992
    The book is non-theoretical, explaining concepts intuitively and teaching problem solving through worked examples and step-by-step instructions. This edition places more emphasis on conceptual understanding and understanding results. It also features increased emphasis on Excel, MINITAB, and the TI-83 Plus and TI-84 Plus graphing calculators, computing technologies commonly used in such courses.

To Infinity and Beyond: A Cultural History of the Infinite


Eli Maor - 1986
    He evokes the profound intellectual impact the infinite has exercised on the human mind--from the horror infiniti of the Greeks to the works of M. C. Escher; from the ornamental designs of the Muslims to the sage Giordano Bruno, whose belief in an infinite universe led to his death at the hands of the Inquisition. But above all, the book describes the mathematician's fascination with infinity--a fascination mingled with puzzlement.
    "Maor explores the idea of infinity in mathematics and in art and argues that this is the point of contact between the two, best exemplified by the work of the Dutch artist M. C. Escher, six of whose works are shown here in beautiful color plates." --Los Angeles Times
    "[Eli Maor's] enthusiasm for the topic carries the reader through a rich panorama." --Choice
    "Fascinating and enjoyable.... places the ideas of infinity in a cultural context and shows how they have been espoused and molded by mathematics." --Science

Calculus for Dummies


Mark Ryan - 2003
    Others who have no intention of ever studying the subject have this notion that calculus is impossibly difficult unless you happen to be a direct descendant of Einstein. Well, the good news is that you can master calculus. It's not nearly as tough as its mystique would lead you to think. Much of calculus is really just very advanced algebra, geometry, and trig. It builds upon and is a logical extension of those subjects. If you can do algebra, geometry, and trig, you can do calculus. Calculus For Dummies is intended for three groups of readers:
    - Students taking their first calculus course: if you're enrolled in a calculus course and you find your textbook less than crystal clear, this is the book for you. It covers the most important topics in the first year of calculus: differentiation, integration, and infinite series.
    - Students who need to brush up on their calculus to prepare for other studies: if you've had elementary calculus, but it's been a couple of years and you want to review the concepts to prepare for, say, a graduate program, Calculus For Dummies will give you a thorough, no-nonsense refresher course.
    - Adults of all ages who'd like a good introduction to the subject: non-student readers will find the book's exposition clear and accessible.
    Calculus For Dummies takes calculus out of the ivory tower and brings it down to earth. This is a user-friendly math book. Whenever possible, the author explains the calculus concepts by showing you connections between the calculus ideas and easier ideas from algebra and geometry. Then you'll see how the calculus concepts work in concrete examples. All explanations are in plain English, not math-speak. Calculus For Dummies covers the following topics and more: real-world examples of calculus; the two big ideas of calculus, differentiation and integration; why calculus works; pre-algebra and algebra review; common functions and their graphs; limits and continuity; integration and approximating area; and sequences and series. Don't buy the misconception. Sure, calculus is difficult, but it's manageable and doable. You made it through algebra, geometry, and trigonometry. Well, calculus just picks up where they leave off; it's simply the next step in a logical progression.
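
    To make the first of those "two big ideas" concrete, here is the limit definition of the derivative applied to $f(x) = x^2$; as the blurb promises, it is just algebra (our worked example, in LaTeX):

        \[ f'(x) = \lim_{h \to 0} \frac{(x+h)^2 - x^2}{h}
                 = \lim_{h \to 0} \frac{2xh + h^2}{h}
                 = \lim_{h \to 0} (2x + h) = 2x. \]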