Book picks similar to
Spurious Correlations by Tyler Vigen
non-fiction
science
nonfiction
humor
Standard Deviations: Flawed Assumptions, Tortured Data, and Other Ways to Lie with Statistics
Gary Smith - 2014
In Standard Deviations, economics professor Gary Smith walks us through the various tricks and traps that people use to back up their own crackpot theories. Sometimes, the unscrupulous deliberately try to mislead us. Other times, the well-intentioned are blissfully unaware of the mischief they are committing. Today, data is so plentiful that researchers spend precious little time distinguishing between good, meaningful indicators and total rubbish. Not only do others use data to fool us, we fool ourselves. With the breakout success of Nate Silver’s The Signal and the Noise, the once humdrum subject of statistics has never been hotter. Drawing on breakthrough research in behavioral economics by luminaries like Daniel Kahneman and Dan Ariely and taking to task some of the conclusions of Freakonomics author Steven D. Levitt, Standard Deviations demystifies the science behind statistics and makes it easy to spot the fraud all around.
Bayesian Data Analysis
Andrew Gelman - 1995
Its world-class authors provide guidance on all aspects of Bayesian data analysis and include examples of real statistical analyses, based on their own research, that demonstrate how to solve complicated problems. Changes in the new edition include:
- Stronger focus on MCMC
- Revision of the computational advice in Part III
- New chapters on nonlinear models and decision analysis
- Several additional applied examples from the authors' recent research
- Additional chapters on current models for Bayesian data analysis such as nonlinear models, generalized linear mixed models, and more
- Reorganization of chapters 6 and 7 on model checking and data collection
Bayesian computation is currently at a stage where there are many reasonable ways to compute any given posterior distribution. However, the best approach is not always clear ahead of time. Reflecting this, the new edition offers a more pluralistic presentation, giving advice on performing computations from many perspectives while making clear the importance of being aware that there are different ways to implement any given iterative simulation computation. The new approach, additional examples, and updated information make Bayesian Data Analysis an excellent introductory text and a reference that working scientists will use throughout their professional life.
Holy Sh*t: A Brief History of Swearing
Melissa Mohr - 2013
With humor and insight, Melissa Mohr takes readers on a journey to discover how "swearing" has come to include both testifying with your hand on the Bible and calling someone a *#$&!* when they cut you off on the highway. She explores obscenities in ancient Rome and unearths the history of religious oaths in the Middle Ages, when swearing (or not swearing) an oath was often a matter of life and death. Holy Sh*t also explains the advancement of civility and corresponding censorship of language in the 18th century, considers the rise of racial slurs after World War II, examines the physiological effects of swearing and answers a question that preoccupies the FCC, the US Senate, and anyone who has recently overheard little kids at a playground: are we swearing more now than people did in the past? A gem of lexicography and cultural history, Holy Sh*t is a serious exploration of obscenity.
R Graphics Cookbook: Practical Recipes for Visualizing Data
Winston Chang - 2012
Each recipe tackles a specific problem with a solution you can apply to your own project, and includes a discussion of how and why the recipe works. Most of the recipes use the ggplot2 package, a powerful and flexible way to make graphs in R. If you have a basic understanding of the R language, you're ready to get started.
- Use R's default graphics for quick exploration of data
- Create a variety of bar graphs, line graphs, and scatter plots
- Summarize data distributions with histograms, density curves, box plots, and other examples
- Provide annotations to help viewers interpret data
- Control the overall appearance of graphics
- Render data groups alongside each other for easy comparison
- Use colors in plots
- Create network graphs, heat maps, and 3D scatter plots
- Structure data for graphing
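As a rough illustration of the grammar-of-graphics recipe style the blurb describes (a minimal sketch, not taken from the book, written with plotnine, a Python port of ggplot2, rather than R itself; the data frame and column names are made up):

import pandas as pd
from plotnine import ggplot, aes, geom_point, labs

# Hypothetical data: vehicle weight versus fuel economy
cars = pd.DataFrame({
    "weight": [2.5, 3.0, 3.2, 3.4, 4.1, 5.2],   # tons
    "mpg":    [30, 25, 24, 22, 19, 15],
})

# ggplot2-style recipe: map columns to aesthetics, add a geom layer, label the axes
plot = (
    ggplot(cars, aes(x="weight", y="mpg"))
    + geom_point()
    + labs(x="Vehicle weight (tons)", y="Miles per gallon")
)
plot.save("weight_vs_mpg.png")  # writes the figure to disk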
The Complete Manual of Things That Might Kill You: A Guide to Self-Diagnosis for Hypochondriacs
Megan E. Bluhm Foldenauer - 2007
The world's worst maladies, conveniently organized by symptom (real or imagined), will ignite even the mildest hypochondriac's fantasy life. We're all going to die of something--why not choose an ailment that's rare and hard to pronounce?
Linear Algebra and Its Applications [with CD-ROM]
David C. Lay - 1993
The Outer Limits of Reason: What Science, Mathematics, and Logic Cannot Tell Us
Noson S. Yanofsky - 2013
This book investigates what cannot be known. Rather than exploring the amazing facts that science, mathematics, and reason have revealed to us, this work studies what science, mathematics, and reason tell us cannot be revealed. In The Outer Limits of Reason, Noson Yanofsky considers what cannot be predicted, described, or known, and what will never be understood. He discusses the limitations of computers, physics, logic, and our own thought processes. Yanofsky describes simple tasks that would take computers trillions of centuries to complete and other problems that computers can never solve; perfectly formed English sentences that make no sense; different levels of infinity; the bizarre world of the quantum; the relevance of relativity theory; the causes of chaos theory; math problems that cannot be solved by normal means; and statements that are true but cannot be proven. He explains the limitations of our intuitions about the world -- our ideas about space, time, and motion, and the complex relationship between the knower and the known. Moving from the concrete to the abstract, from problems of everyday language to straightforward philosophical questions to the formalities of physics and mathematics, Yanofsky demonstrates a myriad of unsolvable problems and paradoxes. Exploring the various limitations of our knowledge, he shows that many of these limitations have a similar pattern and that by investigating these patterns, we can better understand the structure and limitations of reason itself. Yanofsky even attempts to look beyond the borders of reason to see what, if anything, is out there.
Information: The New Language of Science
Hans Christian Von Baeyer - 2003
In this indispensable volume, a primer for the information age, Hans Christian von Baeyer presents a clear description of what information is, how concepts of its measurement, meaning, and transmission evolved, and what its ever-expanding presence portends for the future. Information is poised to replace matter as the primary stuff of the universe, von Baeyer suggests; it will provide a new basic framework for describing and predicting reality in the twenty-first century. Despite its revolutionary premise, von Baeyer's book is written in a simple, straightforward fashion, offering a wonderfully accessible introduction to classical and quantum information. Enlivened with anecdotes from the lives of philosophers, mathematicians, and scientists who have contributed significantly to the field, Information conducts readers from questions of subjectivity inherent in classical information to the blurring of distinctions between computers and what they measure or store in our quantum age. A great advance in our efforts to define and describe the nature of information, the book also marks an important step forward in our ability to exploit information--and, ultimately, to transform the nature of our relationship with the physical universe.
Probabilistic Graphical Models: Principles and Techniques
Daphne Koller - 2009
The framework of probabilistic graphical models, presented in this book, provides a general approach for this task. The approach is model-based, allowing interpretable models to be constructed and then manipulated by reasoning algorithms. These models can also be learned automatically from data, allowing the approach to be used in cases where manually constructing a model is difficult or even impossible. Because uncertainty is an inescapable aspect of most real-world applications, the book focuses on probabilistic models, which make the uncertainty explicit and provide models that are more faithful to reality. Probabilistic Graphical Models discusses a variety of models, spanning Bayesian networks, undirected Markov networks, discrete and continuous models, and extensions to deal with dynamical systems and relational data. For each class of models, the text describes the three fundamental cornerstones: representation, inference, and learning, presenting both basic concepts and advanced techniques. Finally, the book considers the use of the proposed framework for causal reasoning and decision making under uncertainty. The main text in each chapter provides the detailed technical development of the key ideas. Most chapters also include boxes with additional material: skill boxes, which describe techniques; case study boxes, which discuss empirical cases related to the approach described in the text, including applications in computer vision, robotics, natural language understanding, and computational biology; and concept boxes, which present significant concepts drawn from the material in the chapter. Instructors (and readers) can group chapters in various combinations, from core topics to more technically advanced material, to suit their particular needs.
Pattern Recognition and Machine Learning
Christopher M. Bishop - 2006
Pattern recognition has its origins in engineering, whereas machine learning grew out of computer science. However, these activities can be viewed as two facets of the same field, and together they have undergone substantial development over the past ten years. In particular, Bayesian methods have grown from a specialist niche to become mainstream, while graphical models have emerged as a general framework for describing and applying probabilistic models. Also, the practical applicability of Bayesian methods has been greatly enhanced through the development of a range of approximate inference algorithms such as variational Bayes and expectation propagation. Similarly, new models based on kernels have had a significant impact on both algorithms and applications. This new textbook reflects these recent developments while providing a comprehensive introduction to the fields of pattern recognition and machine learning. It is aimed at advanced undergraduates or first-year PhD students, as well as researchers and practitioners, and assumes no previous knowledge of pattern recognition or machine learning concepts. Knowledge of multivariate calculus and basic linear algebra is required, and some familiarity with probabilities would be helpful though not essential as the book includes a self-contained introduction to basic probability theory.
Data Jujitsu: The Art of Turning Data into Product
D.J. Patil - 2012
Acclaimed data scientist DJ Patil details a new approach to solving problems in Data Jujitsu. Learn how to use a problem's "weight" against itself to:
- Break down seemingly complex data problems into simplified parts
- Use alternative data analysis techniques to examine them
- Use human input, such as Mechanical Turk, and design tricks that enlist the help of your users to take shortcuts around tough problems
- Learn more about the problems before starting on the solutions, and use the findings to solve them, or determine whether the problems are worth solving at all
Scale: The Universal Laws of Growth, Innovation, Sustainability, and the Pace of Life in Organisms, Cities, Economies, and Companies
Geoffrey B. West - 2017
The term “complexity” can be misleading, however, because what makes West’s discoveries so beautiful is that he has found an underlying simplicity that unites the seemingly complex and diverse phenomena of living systems, including our bodies, our cities and our businesses. Fascinated by issues of aging and mortality, West applied the rigor of a physicist to the biological question of why we live as long as we do and no longer. The result was astonishing, and changed science, creating a new understanding of energy use and metabolism: West found that despite the riotous diversity in the sizes of mammals, they are all, to a large degree, scaled versions of each other. If you know the size of a mammal, you can use scaling laws to learn everything from how much food it eats per day to its heart rate, how long it will take to mature, its lifespan, and so on. Furthermore, the efficiency of the mammal’s circulatory system scales up precisely with weight: if you compare a mouse, a human and an elephant on a logarithmic graph, you find that with every doubling of average weight, a species gets 25% more efficient, and lives 25% longer. This speaks to everything from how long we can expect to live to how many hours of sleep we need. Fundamentally, he has proven, the issue has to do with the fractal geometry of the networks that supply energy and remove waste from the organism's body. West's work has been game-changing for biologists, but then he made the even bolder move of exploring his work's applicability to cities. Cities, too, are constellations of networks, and the laws of scalability apply to them with eerie precision. For every doubling in a city's size, the city needs 15% less road, electrical wire, and gas stations per capita. More amazingly, for every doubling in size, cities produce 15% more patents and more wealth, as well as 15% more crime and disease. This broad pattern lays the groundwork for a new science of cities. Recently, West has applied his revolutionary work on cities and biological life to the business world. This investigation has led to powerful insights into why some companies thrive while others fail. The implications of these discoveries are far-reaching, and are just beginning to be explored. Scale is a thrilling scientific adventure story about the elemental natural laws that bind us together in simple but profound ways. Through the brilliant mind of Geoffrey West, we can envision how cities, companies and biological life alike are dancing to the same simple, powerful tune, however diverse and unrelated they are to each other.
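As a rough numerical sketch of the power-law scaling described above (the exponents are the approximate values associated with Kleiber's law and with West's city data; the printed comparisons are illustrative, not taken from the book):

# Power-law scaling: a quantity Y grows as size**exponent, so doubling the size
# multiplies Y by 2**exponent. Sublinear exponents (< 1) mean economies of scale;
# superlinear exponents (> 1) mean increasing returns.

def per_doubling_factor(exponent):
    """Multiplier applied to Y each time size doubles, for Y ~ size**exponent."""
    return 2 ** exponent

# Biology: metabolic rate scales roughly as mass**(3/4) (Kleiber's law), so a
# mammal twice as heavy burns only about 68% more energy in total -- each gram
# of the larger animal needs less, which is the efficiency gain described above.
print(f"metabolism per doubling of body mass: x{per_doubling_factor(0.75):.2f}")

# Cities: infrastructure scales sublinearly (exponent ~0.85), while socioeconomic
# outputs such as patents and wages scale superlinearly (exponent ~1.15).
print(f"roads and wires per doubling of population: x{per_doubling_factor(0.85):.2f}")
print(f"patents and wealth per doubling of population: x{per_doubling_factor(1.15):.2f}")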
The Two Cultures
C.P. Snow - 1959
The notion of a rift between the two cultures of the sciences and the humanities has a long history, but it was C. P. Snow's Rede Lecture of 1959 that brought it to prominence and began a public debate that is still raging in the media today. This 50th anniversary printing of The Two Cultures and its successor piece, A Second Look (in which Snow responded to the controversy four years later), features an introduction by Stefan Collini, charting the history and context of the debate, its implications and its afterlife. The importance of science and technology in policy run largely by non-scientists, the future for education and research, and the problem of fragmentation threatening hopes for a common culture are just some of the subjects discussed.
The Principia: Mathematical Principles of Natural Philosophy
Isaac Newton - 1687
Even after more than three centuries and the revolutions of Einsteinian relativity and quantum mechanics, Newtonian physics continues to account for many of the phenomena of the observed world, and Newtonian celestial dynamics is used to determine the orbits of our space vehicles. This completely new translation, the first in 270 years, is based on the third (1726) edition, the final revised version approved by Newton; it includes extracts from the earlier editions, corrects errors found in earlier versions, and replaces archaic English with contemporary prose and up-to-date mathematical forms. Newton's principles describe acceleration, deceleration, and inertial movement; fluid dynamics; and the motions of the earth, moon, planets, and comets. A great work in itself, the Principia also revolutionized the methods of scientific investigation. It set forth the three fundamental laws of motion and the law of universal gravitation, the physical principles that account for the Copernican system of the world as emended by Kepler, thus effectively ending controversy concerning the Copernican planetary system. The illuminating Guide to the Principia by I. Bernard Cohen, along with his and Anne Whitman's translation, will make this preeminent work truly accessible for today's scientists, scholars, and students.
The Logic of Scientific Discovery
Karl Popper - 1934
It remains one of the most widely read books about science to come out of the twentieth century. (Note: the book was first published in 1934, in German, with the title Logik der Forschung. It was "reformulated" into English in 1959. See Wikipedia for details.)