A Beautiful Math: John Nash, Game Theory, and the Modern Quest for a Code of Nature


Tom Siegfried - 2006
    Today Nash's beautiful math has become a universal language for research in the social sciences and has infiltrated the realms of evolutionary biology, neuroscience, and even quantum physics. John Nash won the 1994 Nobel Prize in economics for pioneering research published in the 1950s on a new branch of mathematics known as game theory. At the time of Nash's early work, game theory was briefly popular among some mathematicians and Cold War analysts. But it remained obscure until the 1970s, when evolutionary biologists began applying it to their work. In the 1980s economists began to embrace game theory. Since then it has found an ever-expanding repertoire of applications among a wide range of scientific disciplines. Today neuroscientists peer into game players' brains, anthropologists play games with people from primitive cultures, biologists use games to explain the evolution of human language, and mathematicians exploit games to better understand social networks. A common thread connecting much of this research is its relevance to the ancient quest for a science of human social behavior, or a Code of Nature, in the spirit of the fictional science of psychohistory described in the famous Foundation novels by the late Isaac Asimov. In A Beautiful Math, acclaimed science writer Tom Siegfried describes how game theory links the life sciences, social sciences, and physical sciences in a way that may bring Asimov's dream closer to reality.

The Deep Learning Revolution


Terrence J. Sejnowski - 2018
    Deep learning networks can play poker better than professional poker players and defeat a world champion at Go. In this book, Terry Sejnowski explains how deep learning went from being an arcane academic field to a disruptive technology in the information economy. Sejnowski played an important role in the founding of deep learning, as one of a small group of researchers in the 1980s who challenged the prevailing logic-and-symbol-based version of AI. The new version of AI Sejnowski and others developed, which became deep learning, is fueled instead by data. Deep networks learn from data in the same way that babies experience the world, starting with fresh eyes and gradually acquiring the skills needed to navigate novel environments. Learning algorithms extract information from raw data; information can be used to create knowledge; knowledge underlies understanding; understanding leads to wisdom. Someday a driverless car will know the road better than you do and drive with more skill; a deep learning network will diagnose your illness; a personal cognitive assistant will augment your puny human brain. It took nature many millions of years to evolve human intelligence; AI is on a trajectory measured in decades. Sejnowski prepares us for a deep learning future.
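
    To make "learning from data" concrete, here is a minimal sketch, invented for illustration and not taken from the book: a single artificial neuron that learns the logical AND function from four examples by gradient descent, the elementary step that deep networks repeat across many layers.

        import math

        def sigmoid(z):
            return 1.0 / (1.0 + math.exp(-z))

        # Four labeled examples of the AND function.
        data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
        w1 = w2 = bias = 0.0
        rate = 0.5

        for _ in range(5000):
            for (x1, x2), target in data:
                out = sigmoid(w1 * x1 + w2 * x2 + bias)
                err = out - target  # gradient of the cross-entropy loss
                w1 -= rate * err * x1
                w2 -= rate * err * x2
                bias -= rate * err

        # After training, only the (1, 1) input produces an output near 1.
        for (x1, x2), _ in data:
            print((x1, x2), round(sigmoid(w1 * x1 + w2 * x2 + bias), 2))

    No rule for AND was ever written down; the weights were extracted from the raw examples, which is the pattern the book traces from single neurons to networks millions of units deep.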

The Great Philosophers (From Socrates to Foucault)


Jeremy Stangroom - 2005
    Each essay gives a biographical background for its subject and a description of the main strands of their thought, together with summaries of their major works. The thirty-four chronologically organized essays are a comprehensive introduction to Western philosophy's major figures. Dr Jeremy Stangroom is a founding editor of The Philosophers' Magazine, one of the world's most popular philosophy publications. He has written and/or edited numerous books, including New British Philosophy, What Philosophers Think, and Great Thinkers A-Z (all with Julian Baggini); The Dictionary of Fashionable Nonsense and Why Truth Matters (with Ophelia Benson); and What Scientists Think. He is a frequent contributor to New Humanist magazine, and he is also the editor of the Royal Institute of Philosophy web site. James Garvey teaches philosophy at the University of Nottingham and is Secretary of the Royal Institute of Philosophy.

Bayes' Rule: A Tutorial Introduction to Bayesian Analysis


James V. Stone - 2013
    Discovered by an 18th-century mathematician and preacher, Bayes' rule is a cornerstone of modern probability theory. In this richly illustrated book, intuitive visual representations of real-world examples are used to show how Bayes' rule is actually a form of commonsense reasoning. The tutorial style of writing, combined with a comprehensive glossary, makes this an ideal primer for novices who wish to gain an intuitive understanding of Bayesian analysis. As an aid to understanding, online computer code (in MatLab, Python and R) reproduces key numerical results and diagrams. Stone's book is renowned for its visually engaging style of presentation, which stems from teaching Bayes' rule to psychology students for over 10 years as a university lecturer.
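
    Since the book's own MatLab, Python, and R code lives online rather than here, the following Python fragment is a stand-in that shows the rule's commonsense character; the diagnostic-test numbers are invented for illustration, not taken from the book.

        def posterior(prior, likelihood, evidence):
            """Bayes' rule: P(H|D) = P(D|H) * P(H) / P(D)."""
            return likelihood * prior / evidence

        # Hypothetical test: 1% base rate, 95% sensitivity, 10% false positives.
        p_disease = 0.01
        p_pos_given_disease = 0.95
        p_pos_given_healthy = 0.10

        # Total probability of a positive result (law of total probability).
        p_pos = (p_pos_given_disease * p_disease
                 + p_pos_given_healthy * (1 - p_disease))

        print(posterior(p_disease, p_pos_given_disease, p_pos))  # about 0.088

    Even a positive result from a fairly accurate test leaves the disease unlikely, because the low prior dominates; this reweighting of belief by evidence is the commonsense reasoning the book builds intuition for.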

Quantum Computing for Everyone


Chris Bernhardt - 2019
    In this book, Chris Bernhardt offers an introduction to quantum computing that is accessible to anyone who is comfortable with high school mathematics. He explains qubits, entanglement, quantum teleportation, quantum algorithms, and other quantum-related topics as clearly as possible for the general reader. Bernhardt, a mathematician himself, simplifies the mathematics as much as he can and provides elementary examples that illustrate both how the math works and what it means. Bernhardt introduces the basic unit of quantum computing, the qubit, and explains how the qubit can be measured; discusses entanglement--which, he says, is easier to describe mathematically than verbally--and what it means when two qubits are entangled (citing Einstein's characterization of what happens when the measurement of one entangled qubit affects the second as "spooky action at a distance"); and introduces quantum cryptography. He recaps standard topics in classical computing--bits, gates, and logic--and describes Edward Fredkin's ingenious billiard ball computer. He defines quantum gates, considers the speed of quantum algorithms, and describes the building of quantum computers. By the end of the book, readers understand that quantum computing and classical computing are not two distinct disciplines, and that quantum computing is the fundamental form of computing. The basic unit of computation is the qubit, not the bit.
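
    As a concrete taste of qubit measurement (a sketch invented for illustration, not code from the book): a qubit's state can be written as two amplitudes alpha and beta with |alpha|^2 + |beta|^2 = 1, and measuring it yields 0 with probability |alpha|^2 and 1 with probability |beta|^2.

        import random

        def measure(alpha, beta):
            """Return 0 with probability |alpha|^2, else 1."""
            return 0 if random.random() < abs(alpha) ** 2 else 1

        # An equal superposition: alpha = beta = 1/sqrt(2), so each
        # outcome appears with probability 1/2.
        alpha = beta = 2 ** -0.5
        counts = [0, 0]
        for _ in range(10_000):
            counts[measure(alpha, beta)] += 1
        print(counts)  # roughly [5000, 5000]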

Elliptic Tales: Curves, Counting, and Number Theory


Avner Ash - 2012
    The Birch and Swinnerton-Dyer Conjecture is one of the great unsolved problems of contemporary mathematics, and the Clay Mathematics Institute is offering a prize of $1 million to anyone who can solve it. In this book, Avner Ash and Robert Gross guide readers through the mathematics they need to understand this captivating problem. The key to the conjecture lies in elliptic curves, which are cubic equations in two variables. These equations may appear simple, yet they arise from some very deep--and often very mystifying--mathematical ideas. Using only basic algebra and calculus while presenting numerous eye-opening examples, Ash and Gross make these ideas accessible to general readers, and in the process venture to the very frontiers of modern mathematics. Along the way, they give an informative and entertaining introduction to some of the most profound discoveries of the last three centuries in algebraic geometry, abstract algebra, and number theory. They demonstrate how mathematics grows more abstract to tackle ever more challenging problems, and how each new generation of mathematicians builds on the accomplishments of those who preceded them. Ash and Gross fully explain how the Birch and Swinnerton-Dyer Conjecture sheds light on the number theory of elliptic curves, and how it provides a beautiful and startling connection between two very different objects arising from an elliptic curve, one based on calculus, the other on algebra.
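
    To see how concrete these objects are, here is a small Python sketch (invented for illustration, not from the book) that counts the points on an elliptic curve y^2 = x^3 + ax + b over the integers modulo a prime p; such point counts are the raw material that the Birch and Swinnerton-Dyer Conjecture organizes.

        def count_points(a, b, p):
            """Count solutions of y^2 = x^3 + ax + b (mod p), plus infinity."""
            affine = sum(1 for x in range(p) for y in range(p)
                         if (y * y - (x ** 3 + a * x + b)) % p == 0)
            return affine + 1

        print(count_points(0, 7, 11))  # points on y^2 = x^3 + 7 modulo 11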

Fearful Symmetry: The Search for Beauty in Modern Physics


A. Zee - 1986
    A. Zee, a distinguished physicist and skillful expositor, tells the exciting story of how today's theoretical physicists are following Einstein in their search for the beauty and simplicity of Nature. Animated by a sense of reverence and whimsy, the book describes the majestic sweep and accomplishments of twentieth-century physics. In the end, we stand in awe before the grand vision of modern physics--one of the greatest chapters in the intellectual history of humankind.

The Principle of Relativity (Books on Physics)


Albert Einstein - 1952
    A collection of the original papers on the special and general theories of relativity by Albert Einstein, H. A. Lorentz, H. Minkowski, and H. Weyl.

Change is the Only Constant: The Wisdom of Calculus in a Madcap World


Ben Orlin - 2019
    By spinning 28 mathematical tales, Orlin shows us that calculus is simply another language to express the very things we humans grapple with every day -- love, risk, time, and most importantly, change. Divided into two parts, "Moments" and "Eternities," and drawing on everyone from Sherlock Holmes to Mark Twain to David Foster Wallace, Change is the Only Constant unearths connections between calculus, art, literature, and a beloved dog named Elvis. This is not just math for math's sake; it's math for the sake of becoming a wiser and more thoughtful human.

God's Mechanics: How Scientists and Engineers Make Sense of Religion


Guy Consolmagno - 2007
    A full-fledged techie himself, Brother Guy Consolmagno relates some classic philosophical reflections, his interviews with dozens of fellow techies, and his own personal take on his Catholic beliefs to provide, like a set of "worked out sample problems," the hard data on the challenges and joys of embracing a life of faith as a techie. He also gives a roadmap of the traps that can befall an unwary techie believer. With lively prose and wry humor, Brother Guy shows how he not only believes in God but gives religion an honored place alongside science in his life. This book offers an engaging look at how--and why--scientists and those with technological leanings can hold profound, "unprovable" religious beliefs while working in highly empirical fields. Through his own experience and interviews with other scientists and engineers who profess faith, Brother Guy explores how religious beliefs and practices make sense to those who are deeply rooted in the world of technology.

Hidden In Plain Sight 2: The Equation of the Universe


Andrew H. Thomas - 2013
    Enjoy a thrilling intergalactic tour as Andrew Thomas redefines the force of gravity and introduces a brave new view of the universe!

The Cambridge Quintet: A Work Of Scientific Speculation


John L. Casti - 1997
    Casti contemplates an imaginary evening of intellectual inquiry—a sort of “My Dinner with” not Andre, but five of the most brilliant thinkers of the twentieth century. Imagine, if you will, one stormy summer evening in 1949, as novelist and scientist C. P. Snow, Britain’s distinguished wartime science advisor and author of The Two Cultures, invites four singular guests to a sumptuous seven-course dinner at his alma mater, Christ’s College, Cambridge, to discuss one of the emerging scientific issues of the day: Can we build a machine that could duplicate human cognitive processes? The distinguished guest list for Snow’s dinner consists of physicist Erwin Schrödinger, inventor of wave mechanics; Ludwig Wittgenstein, the famous twentieth-century philosopher of language, who posited two completely contradictory theories of human thought in his lifetime; population geneticist/science popularizer J.B.S. Haldane; and Alan Turing, the mathematician/codebreaker who formulated the computing scheme that foreshadowed the logical structure of all modern computers. Capturing not only their unique personalities but also their particular stands on this fascinating issue, Casti dramatically shows what each of these great men might have argued about artificial intelligence, had they actually gathered for dinner that midsummer evening. With Snow acting as referee, a lively intellectual debate unfolds. Philosopher Wittgenstein argues that in order to become conscious, a machine would have to have life experiences similar to those of human beings—such as pain, joy, grief, or pleasure. Biologist Haldane offers the idea that mind is a separate entity from matter, so that regardless of how sophisticated the machine, only flesh can bond with that mysterious force called intelligence. Both physicist Schrödinger and, of course, computer pioneer Turing maintain that it is not the substance, but rather the organization of that substance, that makes a mind conscious. With great verve and skill, Casti recreates a unique and thrilling moment of time in the grand history of scientific ideas. Even readers who have already formed an opinion on artificial intelligence will be forced to reopen their minds on the subject upon reading this absorbing narrative. After almost five decades, the solutions to the epic scientific and philosophical problems posed over this meal in C. P. Snow’s old rooms at Christ’s College remain tantalizingly just out of reach, making this adventure into scientific speculation as valid today as it was in 1949.

Numerical Recipes: The Art of Scientific Computing


William H. Press - 2007
    Widely recognized as the most comprehensive, accessible, and practical basis for scientific computing, this new edition incorporates more than 400 Numerical Recipes routines, many of them new or upgraded. The executable C++ code, now printed in color for easy reading, adopts an object-oriented style particularly suited to scientific applications. The whole book is presented in the informal, easy-to-read style that made earlier editions so popular. Please visit www.nr.com or www.cambridge.org/us/numericalrecipes for more details. More information concerning licenses is available at www.nr.com/licenses. New key features:
    - Two new chapters and 25 new sections; 25% longer than the Second Edition
    - Thorough upgrades throughout the text
    - Over 100 completely new routines and upgrades of many more
    - New Classification and Inference chapter, including Gaussian mixture models, HMMs, hierarchical clustering, and Support Vector Machines
    - New Computational Geometry chapter covering KD trees, quad- and octrees, Delaunay triangulation, and algorithms for lines, polygons, triangles, and spheres
    - New sections on interior point methods for linear programming, Markov chain Monte Carlo, spectral and pseudospectral methods for PDEs, and many new statistical distributions
    - An expanded treatment of ODEs with completely new routines
    - Plus comprehensive coverage of linear algebra, interpolation, special functions, random numbers, nonlinear sets of equations, optimization, eigensystems, Fourier methods and wavelets, statistical tests, ODEs and PDEs, integral equations, and inverse theory
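
    For a flavor of what such routines do, here is a minimal Python sketch of polynomial interpolation by Neville's algorithm, one of the classic topics the book covers; it is written for illustration and is not the book's C++ routine.

        def neville(xs, ys, x):
            """Evaluate at x the unique polynomial through (xs[i], ys[i])."""
            p = list(ys)
            n = len(xs)
            for level in range(1, n):
                for i in range(n - level):
                    p[i] = ((x - xs[i + level]) * p[i]
                            + (xs[i] - x) * p[i + 1]) / (xs[i] - xs[i + level])
            return p[0]

        # The quadratic through (0, 0), (1, 1), (2, 4) is y = x^2.
        print(neville([0.0, 1.0, 2.0], [0.0, 1.0, 4.0], 1.5))  # 2.25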

Head First Data Analysis: A Learner's Guide to Big Numbers, Statistics, and Good Decisions


Michael G. Milton - 2009
    If your job requires you to manage and analyze all kinds of data, turn to Head First Data Analysis, where you'll quickly learn how to collect and organize data, sort the distractions from the truth, find meaningful patterns, draw conclusions, predict the future, and present your findings to others. Whether you're a product developer researching the market viability of a new product or service, a marketing manager gauging or predicting the effectiveness of a campaign, a salesperson who needs data to support product presentations, or a lone entrepreneur responsible for all of these data-intensive functions and more, the unique approach in Head First Data Analysis is by far the most efficient way to learn what you need to know to convert raw data into a vital business tool. You'll learn how to:
    - Determine which data sources to use for collecting information
    - Assess data quality and distinguish signal from noise
    - Build basic data models to illuminate patterns, and assimilate new information into the models
    - Cope with ambiguous information
    - Design experiments to test hypotheses and draw conclusions
    - Use segmentation to organize your data within discrete market groups
    - Visualize data distributions to reveal new relationships and persuade others
    - Predict the future with sampling and probability models
    - Clean your data to make it useful
    - Communicate the results of your analysis to your audience
    Using the latest research in cognitive science and learning theory to craft a multi-sensory learning experience, Head First Data Analysis uses a visually rich format designed for the way your brain works, not a text-heavy approach that puts you to sleep.

Computer: A History of the Information Machine


Martin Campbell-Kelly - 1996
    Old-fashioned entrepreneurship combined with scientific know-how inspired now famous computer engineers to create the technology that became IBM. Wartime needs drove the giant ENIAC, the first fully electronic computer. Later, the PC enabled modes of computing that liberated people from room-sized mainframe computers. This second edition now extends beyond the development of Microsoft Windows and the Internet to include open source operating systems like Linux, and the rise, fall, and potential rise again of the dot.com industries.