Secrets of Mental Math: The Mathemagician's Guide to Lightning Calculation and Amazing Math Tricks


Arthur T. Benjamin - 1993
    Get ready to amaze your friends—and yourself—with incredible calculations you never thought you could master, as renowned “mathemagician” Arthur Benjamin shares his techniques for lightning-quick calculations and amazing number tricks. This book will teach you to do math in your head faster than you ever thought possible, dramatically improve your memory for numbers, and—maybe for the first time—make mathematics fun. Yes, even you can learn to do seemingly complex equations in your head; all you need to learn are a few tricks. You’ll be able to quickly multiply and divide triple digits, compute with fractions, and determine squares, cubes, and roots without blinking an eye. No matter what your age or current math ability, Secrets of Mental Math will allow you to perform fantastic feats of the mind effortlessly. This is the math they never taught you in school.
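
    One trick in this spirit, as a minimal Python sketch rather than anything taken from the book: squaring a number by rounding it to a nearby round base, using the identity a^2 = (a + d) * (a - d) + d^2, where d is the distance to that base.

# Squaring via the identity a^2 = (a + d) * (a - d) + d^2,
# where d is the (signed) distance from a to a convenient round base.
def mental_square(a, base=None):
    if base is None:
        base = round(a, 1 - len(str(abs(a))))   # nearest 10, 100, 1000, ...
    d = a - base
    easy_product = (a + d) * (a - d)   # one factor is the round base, so this is easy
    return easy_product + d * d

print(mental_square(97), 97 ** 2)     # 9409 9409     (97^2 = 94 * 100 + 3^2)
print(mental_square(512), 512 ** 2)   # 262144 262144 (512^2 = 524 * 500 + 12^2)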

Fuzzy Logic: The Revolutionary Computer Technology That Is Changing Our World


Daniel McNeill - 1993
    Professor Lotfi Zadeh masterminded "fuzzy logic"--a way of programming computers to "make decisions" based on imprecise data and complex situations. In "Fuzzy Logic," Daniel McNeill and Paul Freiberger relate the compelling tale of this remarkable new technology, the genius who brought it to life, and how it will soon affect the lives of every one of us.
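
    A minimal sketch of the core idea, not drawn from the book or from any particular fuzzy-logic library: a statement such as "the temperature is warm" is given a degree of truth between 0 and 1 by a membership function, and rules combine those degrees rather than firing on hard thresholds. The membership range below is an illustrative assumption.

# Crisp logic: 24.9 C is "not warm" and 25.0 C is "warm".
# Fuzzy logic: a temperature belongs to "warm" to a degree between 0 and 1.
def triangular(x, left, peak, right):
    """Triangular membership function: 0 outside (left, right), 1 at peak."""
    if x <= left or x >= right:
        return 0.0
    if x <= peak:
        return (x - left) / (peak - left)
    return (right - x) / (right - peak)

def warm(temp_c):                  # assumed range: 15 C to 35 C, peaking at 25 C
    return triangular(temp_c, 15.0, 25.0, 35.0)

def fuzzy_and(a, b):               # one common choice: take the minimum degree
    return min(a, b)

for t in (14, 18, 25, 31):
    print(t, "C -> warm to degree", round(warm(t), 2))
# A rule like "IF warm AND humid THEN fan speed is medium" then combines degrees,
# e.g. fuzzy_and(warm(28), humidity_degree), instead of using a hard cutoff.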

Thinking Statistically


Uri Bram - 2011
    Along the way we’ll learn how selection bias can explain why your boss doesn’t know he sucks (even when everyone else does); how to use Bayes’ Theorem to decide if your partner is cheating on you; and why Mark Zuckerberg should never be used as an example for anything. See the world in a whole new light, and make better decisions and judgements without ever going near a t-test. Think. Think Statistically.
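
    A minimal sketch of the Bayes' Theorem calculation the description alludes to; the prior and likelihoods below are illustrative assumptions, not numbers from the book.

# Bayes' Theorem: P(H | E) = P(E | H) * P(H) / P(E)
# Illustrative numbers only (assumptions for the sketch):
p_cheating = 0.04                 # prior P(H), before seeing any evidence
p_evidence_if_cheating = 0.50     # P(E | H): odd behaviour given cheating
p_evidence_if_faithful = 0.10     # P(E | not H): odd behaviour anyway

# Total probability of the evidence, P(E)
p_evidence = (p_evidence_if_cheating * p_cheating
              + p_evidence_if_faithful * (1 - p_cheating))

posterior = p_evidence_if_cheating * p_cheating / p_evidence
print(f"P(cheating | evidence) = {posterior:.3f}")   # about 0.172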

Number Theory


George E. Andrews - 1994
    In studying number theory from such a perspective, mathematics majors are spared repetition and provided with new insights, while other students benefit from the consequent simplicity of the proofs for many theorems. Among the topics covered in this accessible, carefully designed introduction are multiplicativity-divisibility, including the fundamental theorem of arithmetic, combinatorial and computational number theory, congruences, arithmetic functions, primitive roots and prime numbers. Later chapters offer lucid treatments of quadratic congruences, additivity (including partition theory) and geometric number theory. Of particular importance in this text is the author's emphasis on the value of numerical examples in number theory and the role of computers in obtaining such examples. Exercises provide opportunities for constructing numerical tables with or without a computer. Students can then derive conjectures from such numerical tables, after which relevant theorems will seem natural and well-motivated.
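
    In the spirit of those exercises, a minimal sketch (not taken from the book) that tabulates a few standard arithmetic functions so patterns can be spotted by eye; d(n), sigma(n), and phi(n) denote the number of divisors, the sum of divisors, and Euler's totient.

# Tabulate a few arithmetic functions and look for patterns by eye.
from math import gcd

def divisors(n):
    return [k for k in range(1, n + 1) if n % k == 0]

def phi(n):
    """Euler's totient: how many 1 <= k <= n are coprime to n."""
    return sum(1 for k in range(1, n + 1) if gcd(k, n) == 1)

print(f"{'n':>3} {'d(n)':>5} {'sigma(n)':>9} {'phi(n)':>7}")
for n in range(1, 21):
    divs = divisors(n)
    print(f"{n:>3} {len(divs):>5} {sum(divs):>9} {phi(n):>7}")
# From such a table one can conjecture, say, that sigma(m * n) = sigma(m) * sigma(n)
# whenever gcd(m, n) = 1, and then go looking for the theorem that explains it.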

The Elements of Statistical Learning: Data Mining, Inference, and Prediction


Trevor Hastie - 2001
    The recent explosion in computation and information technology has brought vast amounts of data in a variety of fields such as medicine, biology, finance, and marketing. The challenge of understanding these data has led to the development of new tools in the field of statistics, and spawned new areas such as data mining, machine learning, and bioinformatics. Many of these tools have common underpinnings but are often expressed with different terminology. This book describes the important ideas in these areas in a common conceptual framework. While the approach is statistical, the emphasis is on concepts rather than mathematics. Many examples are given, with a liberal use of color graphics. It should be a valuable resource for statisticians and anyone interested in data mining in science or industry. The book's coverage is broad, from supervised learning (prediction) to unsupervised learning. The many topics include neural networks, support vector machines, classification trees and boosting—the first comprehensive treatment of this topic in any book. Trevor Hastie, Robert Tibshirani, and Jerome Friedman are professors of statistics at Stanford University. They are prominent researchers in this area: Hastie and Tibshirani developed generalized additive models and wrote a popular book of that title. Hastie wrote much of the statistical modeling software in S-PLUS and invented principal curves and surfaces. Tibshirani proposed the Lasso and is co-author of the very successful An Introduction to the Bootstrap. Friedman is the co-inventor of many data-mining tools including CART, MARS, and projection pursuit.

The Art and Craft of Problem Solving


Paul Zeitz - 1999
    Readers are encouraged to do math rather than just study it. The author draws upon his experience as a coach for the International Mathematical Olympiad to give students an enhanced sense of mathematics and the ability to investigate and solve problems.

Emergence: The Connected Lives of Ants, Brains, Cities, and Software


Steven Johnson - 2001
    Explaining why the whole is sometimes smarter than the sum of its parts, Johnson presents surprising examples of feedback, self-organization, and adaptive learning. How does a lively neighborhood evolve out of a disconnected group of shopkeepers, bartenders, and real estate developers? How does a media event take on a life of its own? How will new software programs create an intelligent World Wide Web? In the coming years, the power of self-organization -- coupled with the connective technology of the Internet -- will usher in a revolution every bit as significant as the introduction of electricity. Provocative and engaging, Emergence puts you on the front lines of this exciting upheaval in science and thought.

The Science of Information: From Language to Black Holes


Benjamin Schumacher - 2015
    Never before in history have we been able to acquire, record, communicate, and use information in so many different forms. Never before have we had access to such vast quantities of data of every kind. This revolution goes far beyond the limitless content that fills our lives, because information also underlies our understanding of ourselves, the natural world, and the universe. It is the key that unites fields as different as linguistics, cryptography, neuroscience, genetics, economics, and quantum mechanics. And the fact that information bears no necessary connection to meaning makes it a profound puzzle that people with a passion for philosophy have pondered for centuries.
    Table of Contents
    Lecture 1: The Transformability of Information
    Lecture 2: Computation and Logic Gates
    Lecture 3: Measuring Information
    Lecture 4: Entropy and the Average Surprise
    Lecture 5: Data Compression and Prefix-Free Codes
    Lecture 6: Encoding Images and Sounds
    Lecture 7: Noise and Channel Capacity
    Lecture 8: Error-Correcting Codes
    Lecture 9: Signals and Bandwidth
    Lecture 10: Cryptography and Key Entropy
    Lecture 11: Cryptanalysis and Unraveling the Enigma
    Lecture 12: Unbreakable Codes and Public Keys
    Lecture 13: What Genetic Information Can Do
    Lecture 14: Life’s Origins and DNA Computing
    Lecture 15: Neural Codes in the Brain
    Lecture 16: Entropy and Microstate Information
    Lecture 17: Erasure Cost and Reversible Computing
    Lecture 18: Horse Races and Stock Markets
    Lecture 19: Turing Machines and Algorithmic Information
    Lecture 20: Uncomputable Functions and Incompleteness
    Lecture 21: Qubits and Quantum Information
    Lecture 22: Quantum Cryptography via Entanglement
    Lecture 23: It from Bit: Physics from Information
    Lecture 24: The Meaning of Information
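
    As a taste of the lecture "Entropy and the Average Surprise," a minimal sketch of Shannon entropy, the average surprise -log2 p(x) of a source measured in bits; the example distributions are illustrative, not taken from the course.

# Shannon entropy: the average surprise -log2(p) of a source, measured in bits.
from math import log2

def entropy(probabilities):
    return -sum(p * log2(p) for p in probabilities if p > 0)

print(entropy([0.5, 0.5]))   # 1.0 bit per flip of a fair coin
print(entropy([0.9, 0.1]))   # about 0.469 bits: less surprising, more compressible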

Information Theory, Inference and Learning Algorithms


David J.C. MacKay - 2002
    These topics lie at the heart of many exciting areas of contemporary science and engineering - communication, signal processing, data mining, machine learning, pattern recognition, computational neuroscience, bioinformatics, and cryptography. This textbook introduces theory in tandem with applications. Information theory is taught alongside practical communication systems, such as arithmetic coding for data compression and sparse-graph codes for error-correction. A toolbox of inference techniques, including message-passing algorithms, Monte Carlo methods, and variational approximations, is developed alongside applications of these tools to clustering, convolutional codes, independent component analysis, and neural networks. The final part of the book describes the state of the art in error-correcting codes, including low-density parity-check codes, turbo codes, and digital fountain codes -- the twenty-first century standards for satellite communications, disk drives, and data broadcast. Richly illustrated, filled with worked examples and over 400 exercises, some with detailed solutions, David MacKay's groundbreaking book is ideal for self-learning and for undergraduate or graduate courses. Interludes on crosswords, evolution, and sex provide entertainment along the way. In sum, this is a textbook on information, communication, and coding for a new generation of students, and an unparalleled entry point into these subjects for professionals in areas as diverse as computational biology, financial engineering, and machine learning.
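
    A minimal illustration of the error-correction theme, using the simplest code of all: a threefold repetition code with majority-vote decoding over a bit-flipping channel. It is a sketch in that spirit, not MacKay's own code, and the flip probability is an illustrative assumption.

# Repetition code: repeat each bit three times, decode by majority vote.
import random

def encode(bits):
    return [b for b in bits for _ in range(3)]

def noisy_channel(bits, flip_prob):
    return [b ^ 1 if random.random() < flip_prob else b for b in bits]

def decode(bits):
    triples = [bits[i:i + 3] for i in range(0, len(bits), 3)]
    return [1 if sum(t) >= 2 else 0 for t in triples]

random.seed(0)
message  = [random.randint(0, 1) for _ in range(10000)]
received = noisy_channel(encode(message), flip_prob=0.1)
decoded  = decode(received)
errors   = sum(m != d for m, d in zip(message, decoded))
print(f"residual error rate: {errors / len(message):.4f}")  # roughly 0.028 vs 0.1 raw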

The Evolution of Cooperation


Robert Axelrod - 1984
    Widely praised and much-discussed, this classic book explores how cooperation can emerge in a world of self-seeking egoists—whether superpowers, businesses, or individuals—when there is no central authority to police their actions. The problem of cooperation is central to many different fields. Robert Axelrod recounts the famous computer tournaments in which the “cooperative” program Tit for Tat recorded its stunning victories, explains its application to a broad spectrum of subjects, and suggests how readers can both apply cooperative principles to their own lives and teach cooperative principles to others.
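
    A minimal sketch of Tit for Tat in an iterated prisoner's dilemma, using the per-round payoffs from Axelrod's tournaments (3 each for mutual cooperation, 1 each for mutual defection, 5 and 0 when one side defects); the opposing strategy and round count are illustrative choices, not a reconstruction of the tournaments themselves.

# Tit for Tat: cooperate first, then copy the opponent's previous move.
PAYOFF = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
          ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}

def tit_for_tat(my_history, their_history):
    return 'C' if not their_history else their_history[-1]

def always_defect(my_history, their_history):
    return 'D'

def play(strategy_a, strategy_b, rounds=200):
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        move_a = strategy_a(hist_a, hist_b)
        move_b = strategy_b(hist_b, hist_a)
        pay_a, pay_b = PAYOFF[(move_a, move_b)]
        hist_a.append(move_a)
        hist_b.append(move_b)
        score_a += pay_a
        score_b += pay_b
    return score_a, score_b

print(play(tit_for_tat, tit_for_tat))     # (600, 600): mutual cooperation every round
print(play(tit_for_tat, always_defect))   # (199, 204): exploited once, then retaliates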

Paradox: The Nine Greatest Enigmas in Physics


Jim Al-Khalili - 2012
    A fun and fascinating look at great scientific paradoxes. Throughout history, scientists have come up with theories and ideas that just don't seem to make sense. These we call paradoxes. The paradoxes Al-Khalili offers are drawn chiefly from physics and astronomy and represent those that have stumped some of the finest minds. For example, how can a cat be both dead and alive at the same time? Why will Achilles never beat a tortoise in a race, no matter how fast he runs? And how can a person be ten years older than his twin? With elegant explanations that bring the reader inside the minds of those who've developed them, Al-Khalili helps us to see that, in fact, paradoxes can be solved if seen from the right angle. Just as surely as Al-Khalili narrates the enduring fascination of these classic paradoxes, he reveals their underlying logic. In doing so, he brings to life a select group of the most exciting concepts in human knowledge. Paradox is mind-expanding fun.
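
    A minimal numerical sketch of the Achilles-and-tortoise question; the speeds and head start are illustrative assumptions, and the point is simply that the infinitely many catch-up stages add up to a finite time.

# Zeno's Achilles-and-tortoise paradox, resolved numerically: the infinitely
# many "catch-up" stages form a geometric series with a finite total time.
# Illustrative numbers (assumptions): Achilles runs 10 m/s, the tortoise 1 m/s,
# with a 100 m head start.
achilles_speed, tortoise_speed, head_start = 10.0, 1.0, 100.0

gap, total_time = head_start, 0.0
for stage in range(30):                   # each stage: run to where the tortoise was
    stage_time = gap / achilles_speed
    total_time += stage_time
    gap = tortoise_speed * stage_time     # the tortoise has crawled a bit further
print(total_time)                         # -> 11.111... seconds

# The direct calculation agrees: head_start / (achilles_speed - tortoise_speed)
print(head_start / (achilles_speed - tortoise_speed))   # 11.111... seconds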