Concrete Mathematics: A Foundation for Computer Science


Ronald L. Graham - 1989
    "More concretely," the authors explain, "it is the controlled manipulation of mathematical formulas, using a collection of techniques for solving problems."

Machine Learning


Tom M. Mitchell - 1997
    Mitchell covers the field of machine learning: the study of algorithms that allow computer programs to improve automatically through experience, inferring general laws from specific data.
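
    As a toy illustration of that definition (our own sketch in Python, not an algorithm from Mitchell's text), a program can start knowing nothing and improve with experience by repeatedly adjusting a line to fit example data:

        # Gradient descent fits y = w*x + b to examples; the error shrinks
        # with each update step, i.e., the program improves with experience.
        def fit_line(points, steps=1000, lr=0.05):
            w, b = 0.0, 0.0                          # start with no knowledge
            n = len(points)
            for _ in range(steps):                   # each step: more experience
                dw = sum(2 * (w * x + b - y) * x for x, y in points) / n
                db = sum(2 * (w * x + b - y) for x, y in points) / n
                w, b = w - lr * dw, b - lr * db      # move toward lower error
            return w, b

        # Infer the "general law" y = 2x + 1 from specific data points:
        data = [(0, 1), (1, 3), (2, 5), (3, 7)]
        print(fit_line(data))                        # approximately (2.0, 1.0)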

Think Complexity: Complexity Science and Computational Modeling


Allen B. Downey - 2018
    Whether you’re an intermediate-level Python programmer or a student of computational modeling, you’ll delve into examples of complex systems through a series of exercises, case studies, and easy-to-understand explanations. You’ll work with graphs, algorithm analysis, scale-free networks, and cellular automata, using advanced features that make Python such a powerful language. Ideal as a text for courses on Python programming and algorithms, Think Complexity will also help self-learners gain valuable experience with topics and ideas they might not encounter otherwise. You will:
    - Work with NumPy arrays and SciPy methods, basic signal processing and Fast Fourier Transform, and hash tables
    - Study abstract models of complex physical systems, including power laws, fractals and pink noise, and Turing machines
    - Get starter code and solutions to help you re-implement and extend original experiments in complexity
    - Explore the philosophy of science, including the nature of scientific laws, theory choice, realism and instrumentalism, and other topics
    - Examine case studies of complex systems submitted by students and readers
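
    The book's FFT material can be previewed with a small sketch. This is our own example in the book's language, Python, not its actual starter code; the 50 Hz tone and noise level are arbitrary choices:

        # Recover the frequency of a noisy sinusoid with a Fast Fourier Transform.
        import numpy as np

        fs = 1000                                # sample rate, Hz (arbitrary)
        t = np.arange(0, 1, 1 / fs)              # one second of samples
        tone = np.sin(2 * np.pi * 50 * t)        # a 50 Hz sine wave
        noisy = tone + 0.5 * np.random.randn(t.size)

        spectrum = np.abs(np.fft.rfft(noisy))    # magnitude spectrum
        freqs = np.fft.rfftfreq(t.size, 1 / fs)  # frequency of each bin, Hz
        print(freqs[spectrum.argmax()])          # prints 50.0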

Social Statistics for a Diverse Society


Chava Frankfort-Nachmias - 1996
    The authors help students learn key sociological concepts through real research examples related to the dynamic interplay of race, class, gender, and other social variables.

Hello World: Being Human in the Age of Algorithms


Hannah Fry - 2018
    It’s time we stand face-to-digital-face with the true powers and limitations of the algorithms that already automate important decisions in healthcare, transportation, crime, and commerce. Hello World is indispensable preparation for the moral quandaries of a world run by code, and with the unfailingly entertaining Hannah Fry as our guide, we’ll be discussing these issues long after the last page is turned.

Bayesian Data Analysis


Andrew Gelman - 1995
    Its world-class authors provide guidance on all aspects of Bayesian data analysis and include examples of real statistical analyses, based on their own research, that demonstrate how to solve complicated problems. Changes in the new edition include:
    - Stronger focus on MCMC
    - Revision of the computational advice in Part III
    - New chapters on nonlinear models and decision analysis
    - Several additional applied examples from the authors' recent research
    - Additional chapters on current models for Bayesian data analysis such as nonlinear models, generalized linear mixed models, and more
    - Reorganization of chapters 6 and 7 on model checking and data collection
    Bayesian computation is currently at a stage where there are many reasonable ways to compute any given posterior distribution. However, the best approach is not always clear ahead of time. Reflecting this, the new edition offers a more pluralistic presentation, giving advice on performing computations from many perspectives while making clear the importance of being aware that there are different ways to implement any given iterative simulation computation. The new approach, additional examples, and updated information make Bayesian Data Analysis an excellent introductory text and a reference that working scientists will use throughout their professional life.
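
    One of the "many reasonable ways to compute any given posterior distribution" is a random-walk Metropolis sampler. The sketch below is ours, not code from the book, in Python with made-up data and a known noise level for brevity:

        # Sample the posterior of a normal mean under a flat prior via Metropolis.
        import math, random

        data = [4.9, 5.3, 4.7, 5.1, 5.0]          # made-up observations

        def log_post(mu, sigma=0.5):
            # Flat prior on mu: log posterior = log likelihood + constant.
            return -sum((x - mu) ** 2 for x in data) / (2 * sigma ** 2)

        mu, samples = 0.0, []
        for _ in range(20000):
            proposal = mu + random.gauss(0, 0.5)  # symmetric random-walk step
            if math.log(random.random()) < log_post(proposal) - log_post(mu):
                mu = proposal                     # accept; otherwise keep mu
            samples.append(mu)

        kept = samples[5000:]                     # discard burn-in
        print(sum(kept) / len(kept))              # close to the data mean, 5.0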

Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy


Cathy O'Neil - 2016
    Increasingly, the decisions that affect our lives--where we go to school, whether we can get a job or a loan, how much we pay for health insurance--are being made not by humans, but by machines. In theory, this should lead to greater fairness: Everyone is judged according to the same rules. But as mathematician and data scientist Cathy O'Neil reveals, the mathematical models being used today are unregulated and uncontestable, even when they're wrong. Most troubling, they reinforce discrimination--propping up the lucky, punishing the downtrodden, and undermining our democracy in the process.

Introduction to Probability


Joseph K. Blitzstein - 2014
    The book explores a wide variety of applications and examples, ranging from coincidences and paradoxes to Google PageRank and Markov chain Monte Carlo (MCMC). Additional application areas explored include genetics, medicine, computer science, and information theory. The print book version includes a code that provides free access to an eBook version. The authors present the material in an accessible style and motivate concepts using real-world examples. Throughout, they use stories to uncover connections between the fundamental distributions in statistics and conditioning to reduce complicated problems to manageable pieces. The book includes many intuitive explanations, diagrams, and practice problems. Each chapter ends with a section showing how to perform relevant simulations and calculations in R, a free statistical software environment.
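
    The PageRank example can be sketched in a few lines: it is the stationary distribution of a Markov chain over web pages. The book's simulations are in R; this analogous sketch is ours, in Python, over a hypothetical four-page web:

        # PageRank as the stationary distribution of a damped Markov chain,
        # computed by power iteration.
        import numpy as np

        links = {0: [1, 2], 1: [2], 2: [0], 3: [0, 2]}  # page -> outgoing links
        n, d = 4, 0.85                                  # pages, damping factor

        # Column-stochastic transition matrix, then damping (teleportation).
        P = np.zeros((n, n))
        for i, outs in links.items():
            for j in outs:
                P[j, i] = 1 / len(outs)
        G = d * P + (1 - d) / n                         # the "Google matrix"

        rank = np.full(n, 1 / n)                        # start uniform
        for _ in range(100):                            # power iteration
            rank = G @ rank
        print(rank)                                     # ranks sum to 1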

A New Kind of Science


Stephen Wolfram - 2002
    Wolfram lets the world see his work in A New Kind of Science, a gorgeous, 1,280-page tome more than a decade in the making. With patience, insight, and self-confidence to spare, Wolfram outlines a fundamental new way of modeling complex systems. On the frontier of complexity science since he was a boy, Wolfram is a champion of cellular automata--256 "programs" governed by simple nonmathematical rules. He points out that even the most complex equations fail to accurately model biological systems, but the simplest cellular automata can produce results straight out of nature--tree branches, stream eddies, and leopard spots, for instance. The graphics in A New Kind of Science show a striking resemblance to the patterns we see in nature every day. Wolfram wrote the book in a distinct style meant to make it easy to read, even for nontechies; a basic familiarity with logic is helpful but not essential. Readers will find themselves swept away by the elegant simplicity of Wolfram's ideas and the accidental artistry of the cellular automaton models. Whether or not Wolfram's revolution ultimately gives us the keys to the universe, his new science is absolutely awe-inspiring. --Therese Littleton
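
    Each of those 256 elementary cellular automata is just a lookup table on three-cell neighborhoods, indexed by its rule number. The sketch below (ours, in Python, not from the book) runs Rule 30, whose single seeded cell grows into the famously complex, nature-like texture Wolfram studies:

        # Elementary cellular automaton: rule number -> neighborhood lookup table.
        RULE = 30
        table = {(a, b, c): (RULE >> (a * 4 + b * 2 + c)) & 1
                 for a in (0, 1) for b in (0, 1) for c in (0, 1)}

        cells = [0] * 31 + [1] + [0] * 31        # one black cell in the middle
        for _ in range(16):                      # print 16 generations
            print("".join("#" if c else "." for c in cells))
            cells = [table[(cells[i - 1], cells[i], cells[(i + 1) % len(cells)])]
                     for i in range(len(cells))]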

Cybernetics: Or Control and Communication in the Animal and the Machine


Norbert Wiener - 1948
    "It is a ‘must’ book for those in every branch of science . . . in addition, economists, politicians, statesmen, and businessmen cannot afford to overlook cybernetics and its tremendous, even terrifying implications. It is a beautifully written book, lucid, direct, and despite its complexity, as readable by the layman as the trained scientist." -- John B. Thurston, The Saturday Review of Literature. Acclaimed as one of the "seminal books . . . comparable in ultimate importance to . . . Galileo or Malthus or Rousseau or Mill," Cybernetics was judged by twenty-seven historians, economists, educators, and philosophers to be one of those books published during the past four decades "which may have a substantial impact on public thought and action in the years ahead." -- Saturday Review

Turing's Cathedral: The Origins of the Digital Universe


George Dyson - 2012
    In Turing’s Cathedral, George Dyson focuses on a small group of men and women, led by John von Neumann at the Institute for Advanced Study in Princeton, New Jersey, who built one of the first computers to realize Alan Turing’s vision of a Universal Machine. Their work would break the distinction between numbers that mean things and numbers that do things—and our universe would never be the same. Using five kilobytes of memory (the amount allocated to displaying the cursor on a computer desktop of today), they achieved unprecedented success in both weather prediction and nuclear weapons design, while tackling, in their spare time, problems ranging from the evolution of viruses to the evolution of stars. Dyson’s account, both historic and prophetic, sheds important new light on how the digital universe exploded in the aftermath of World War II. The proliferation of both codes and machines was paralleled by two historic developments: the decoding of self-replicating sequences in biology and the invention of the hydrogen bomb. It’s no coincidence that the most destructive and the most constructive of human inventions appeared at exactly the same time.  How did code take over the world? In retracing how Alan Turing’s one-dimensional model became John von Neumann’s two-dimensional implementation, Turing’s Cathedral offers a series of provocative suggestions as to where the digital universe, now fully three-dimensional, may be heading next.

The Data Detective: Ten Easy Rules to Make Sense of Statistics


Tim Harford - 2020
    Many of us are suspicious of statistics. That’s a mistake, Tim Harford says in The Data Detective. We shouldn’t be suspicious of statistics—we need to understand what they mean and how they can improve our lives: they are, at heart, human behavior seen through the prism of numbers and are often “the only way of grasping much of what is going on around us.” If we can toss aside our fears and learn to approach them clearly—understanding how our own preconceptions lead us astray—statistics can point to ways we can live better and work smarter. As “perhaps the best popular economics writer in the world” (New Statesman), Tim Harford is an expert at taking complicated ideas and untangling them for millions of readers. In The Data Detective, he uses new research in science and psychology to set out ten strategies for using statistics to erase our biases and replace them with new ideas that use virtues like patience, curiosity, and good sense to better understand ourselves and the world. As a result, The Data Detective is a big-idea book about statistics and human behavior that is fresh, unexpected, and insightful.

Essential Calculus


James Stewart - 2006
    In writing the book James Stewart asked himself: What is essential for a three-semester calculus course for scientists and engineers? Stewart's ESSENTIAL CALCULUS offers a concise approach to teaching calculus that focuses on major concepts and supports those concepts with precise definitions, patient explanations, and carefully graded problems. Essential Calculus is only 850 pages, two-thirds the size of Stewart's other calculus texts (CALCULUS, Fifth Edition and CALCULUS: EARLY TRANSCENDENTALS, Fifth Edition), and yet it contains almost all of the same topics. The author achieved this relative brevity mainly by condensing the exposition and by putting some of the features on the website, www.StewartCalculus.com. Despite the reduced size of the book, there is still a modern flavor: conceptual understanding and technology are not neglected, though they are not as prominent as in Stewart's other books. ESSENTIAL CALCULUS has been written with the same attention to detail, eye for innovation, and meticulous accuracy that have made Stewart's textbooks the best-selling calculus texts in the world.

The Improbability Principle: Why Coincidences, Miracles, and Rare Events Happen Every Day


David J. Hand - 2014
    Hand argues that extraordinarily rare events are anything but. In fact, they’re commonplace. Not only that, we should all expect to experience a miracle roughly once every month.
    But Hand is no believer in superstitions, prophecies, or the paranormal. His definition of “miracle” is thoroughly rational. No mystical or supernatural explanation is necessary to understand why someone is lucky enough to win the lottery twice, or is destined to be hit by lightning three times and still survive. All we need, Hand argues, is a firm grounding in a powerful set of laws: the laws of inevitability, of truly large numbers, of selection, of the probability lever, and of near enough.
    Together, these constitute Hand’s groundbreaking Improbability Principle. And together, they explain why we should not be so surprised to bump into a friend in a foreign country, or to come across the same unfamiliar word four times in one day. Hand wrestles with seemingly less explicable questions as well: what the Bible and Shakespeare have in common, why financial crashes are par for the course, and why lightning does strike the same place (and the same person) twice. Along the way, he teaches us how to use the Improbability Principle in our own lives—including how to cash in at a casino and how to recognize when a medicine is truly effective.
    An irresistible adventure into the laws behind “chance” moments and a trusty guide for understanding the world and universe we live in, The Improbability Principle will transform how you think about serendipity and luck, whether it’s in the world of business and finance or you’re merely sitting in your backyard, tossing a ball into the air and wondering where it will land.
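
    Hand’s law of truly large numbers reduces to one line of arithmetic. The numbers below are our own round illustrations, not figures from the book:

        # A one-in-a-million daily "miracle" becomes near certain at scale.
        p = 1e-6                    # chance it hits one person on one given day
        trials = 1_000_000 * 365    # a million people watched for a year

        p_at_least_once = 1 - (1 - p) ** trials
        print(p_at_least_once)      # ~1.0: virtually certain to happen to someone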

Applied Predictive Modeling


Max Kuhn - 2013
    Non-mathematical readers will appreciate the intuitive explanations of the techniques, while an emphasis on problem-solving with real data across a wide variety of applications will aid practitioners who wish to extend their expertise. Readers should have knowledge of basic statistical ideas, such as correlation and linear regression analysis. While the text is biased against complex equations, a mathematical background is needed for advanced topics.
    Dr. Kuhn is a Director of Non-Clinical Statistics at Pfizer Global R&D in Groton, Connecticut. He has been applying predictive models in the pharmaceutical and diagnostic industries for over 15 years and is the author of a number of R packages. Dr. Johnson has more than a decade of statistical consulting and predictive modeling experience in pharmaceutical research and development. He is a co-founder of Arbor Analytics, a firm specializing in predictive modeling, and is a former Director of Statistics at Pfizer Global R&D. His scholarly work centers on the application and development of statistical methodology and learning algorithms.
    Applied Predictive Modeling covers the overall predictive modeling process, beginning with the crucial steps of data preprocessing, data splitting, and the foundations of model tuning. The text then provides intuitive explanations of numerous common and modern regression and classification techniques, always with an emphasis on illustrating and solving real data problems. Addressing practical concerns extends beyond model fitting to topics such as handling class imbalance, selecting predictors, and pinpointing causes of poor model performance--all of which are problems that occur frequently in practice. The text illustrates all parts of the modeling process through many hands-on, real-life examples, and every chapter contains extensive R code for each step of the process.
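
    The book’s examples are in R (Kuhn’s caret package); as an analogous sketch of the split-then-tune workflow it teaches, here is the same idea in Python with scikit-learn (our choice of library, not the book’s):

        # Hold out a test set, then tune a model by cross-validation.
        from sklearn.datasets import load_breast_cancer
        from sklearn.model_selection import train_test_split, GridSearchCV
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.neighbors import KNeighborsClassifier

        X, y = load_breast_cancer(return_X_y=True)

        # Step 1: data splitting -- set aside a test set before any tuning.
        X_train, X_test, y_train, y_test = train_test_split(
            X, y, test_size=0.25, random_state=0)

        # Step 2: preprocessing + model in one pipeline, so scaling is fit on
        # training folds only (avoiding leakage).
        pipe = make_pipeline(StandardScaler(), KNeighborsClassifier())

        # Step 3: model tuning by cross-validation over a small grid.
        search = GridSearchCV(
            pipe, {"kneighborsclassifier__n_neighbors": [3, 5, 9, 15]})
        search.fit(X_train, y_train)

        print(search.best_params_, search.score(X_test, y_test))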