Book picks similar to Probability Theory by S.R.S. Varadhan
The Cult of Statistical Significance: How the Standard Error Costs Us Jobs, Justice, and Lives
Stephen Thomas Ziliak - 2008
“If it takes a book to get it across, I hope this book will do it. It ought to.”—Thomas Schelling, Distinguished University Professor, School of Public Policy, University of Maryland, and 2005 Nobel Laureate in Economics

“With humor, insight, piercing logic and a nod to history, Ziliak and McCloskey show how economists—and other scientists—suffer from a mass delusion about statistical analysis. The quest for statistical significance that pervades science today is a deeply flawed substitute for thoughtful analysis. . . . Yet few participants in the scientific bureaucracy have been willing to admit what Ziliak and McCloskey make clear: the emperor has no clothes.”—Kenneth Rothman, Professor of Epidemiology, Boston University School of Public Health

The Cult of Statistical Significance shows, field by field, how “statistical significance,” a technique that dominates many sciences, has been a huge mistake. The authors find that researchers in a broad spectrum of fields, from agronomy to zoology, employ “testing” that doesn’t test and “estimating” that doesn’t estimate. The facts will startle the outside reader: how could a group of brilliant scientists wander so far from scientific magnitudes? This study will encourage scientists who want to know how to get the statistical sciences back on track and fulfill their quantitative promise. The book shows for the first time how wide the disaster is, and how bad for science, and it traces the problem to its historical, sociological, and philosophical roots.

Stephen T. Ziliak is the author or editor of many articles and two books. He currently lives in Chicago, where he is Professor of Economics at Roosevelt University. Deirdre N. McCloskey, Distinguished Professor of Economics, History, English, and Communication at the University of Illinois at Chicago, is the author of twenty books and three hundred scholarly articles. She has held Guggenheim and National Humanities Fellowships. She is best known for How to Be Human* Though an Economist (University of Michigan Press, 2000) and her most recent book, The Bourgeois Virtues: Ethics for an Age of Commerce (2006).
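To make the book's central distinction concrete, here is a minimal sketch (my illustration, not the authors'): with a large enough sample, an effect far too small to matter still clears the conventional significance bar, which is exactly the confusion Ziliak and McCloskey attack.

```python
# Minimal sketch (illustrative, not from the book): statistical significance
# without practical significance. With a million observations per group, a
# true mean difference of 0.1 on a scale with standard deviation 15
# (Cohen's d ~ 0.007, negligible) still produces a tiny p-value.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 1_000_000
control = rng.normal(loc=100.0, scale=15.0, size=n)
treated = rng.normal(loc=100.1, scale=15.0, size=n)  # trivially small true effect

t_stat, p_value = stats.ttest_ind(treated, control)
print(f"p-value: {p_value:.2e}")  # "significant" by the usual 0.05 cutoff
print(f"mean difference: {treated.mean() - control.mean():.3f}")  # ~0.1: tiny magnitude
```

The test "passes," yet the effect size answers the question the authors say actually matters: how big is the difference, and does anyone care?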
Social Network Analysis: Methods and Applications
Stanley Wasserman - 1994
Social Network Analysis: Methods and Applications reviews and discusses methods for the analysis of social networks with a focus on applications of these methods to many substantive examples. As the first book to provide comprehensive coverage of the methodology and applications of the field, this study is both a reference book and a textbook.
Algebra - The Very Basics
Metin Bektas - 2014
This book picks you up at the very beginning and guides you through the foundations of algebra using lots of examples and no-nonsense explanations. Each chapter contains well-chosen exercises as well as all the solutions. No prior knowledge is required. Topics include: Exponents, Brackets, Linear Equations and Quadratic Equations. For a more detailed table of contents, use the "Look Inside" feature. From the author of "Great Formulas Explained" and "Physics! In Quantities and Examples".
Digital Image Processing
Rafael C. Gonzalez - 1977
Completely self-contained, heavily illustrated, and mathematically accessible, it has a scope of application that is not limited to the solution of specialized problems. Digital Image Fundamentals. Image Enhancement in the Spatial Domain. Image Enhancement in the Frequency Domain. Image Restoration. Color Image Processing. Wavelets and Multiresolution Processing. Image Compression. Morphological Image Processing. Image Segmentation. Representation and Description. Object Recognition.
Quantum Computation and Quantum Information
Michael A. Nielsen - 2000
The authors describe what a quantum computer is, how it can be used to solve problems faster than familiar "classical" computers, and the real-world implementation of quantum computers. A wealth of accompanying figures and exercises illustrate and develop the material in more depth. The book concludes with an explanation of how quantum states can be used to perform remarkable feats of communication, and of how it is possible to protect quantum states against the effects of noise.
Survey Methodology
Robert M. Groves - 2004
Survey Methodology describes the basic principles of survey design discovered in methodological research over recent years and offers guidance for making successful decisions in the design and execution of high quality surveys. Written by six nationally recognized experts in the field, this book covers the major considerations in designing and conducting a sample survey. Topical, accessible, and succinct, this book represents the state of the science in survey methodology. Employing the "total survey error" paradigm as an organizing framework, it merges the science of surveys with state-of-the-art practices. End-of-chapter terms, references, and exercises enhance its value as a reference for practitioners and as a text for advanced students.
Data Science
John D. Kelleher - 2018
Today data science determines the ads we see online, the books and movies that are recommended to us online, which emails are filtered into our spam folders, and even how much we pay for health insurance. This volume in the MIT Press Essential Knowledge series offers a concise introduction to the emerging field of data science, explaining its evolution, current uses, data infrastructure issues, and ethical challenges.

It has never been easier for organizations to gather, store, and process data. Use of data science is driven by the rise of big data and social media, the development of high-performance computing, and the emergence of such powerful methods for data analysis and modeling as deep learning. Data science encompasses a set of principles, problem definitions, algorithms, and processes for extracting non-obvious and useful patterns from large datasets. It is closely related to the fields of data mining and machine learning, but broader in scope. This book offers a brief history of the field, introduces fundamental data concepts, and describes the stages in a data science project. It considers data infrastructure and the challenges posed by integrating data from multiple sources, introduces the basics of machine learning, and discusses how to link machine learning expertise with real-world problems. The book also reviews ethical and legal issues, developments in data regulation, and computational approaches to preserving privacy. Finally, it considers the future impact of data science and offers principles for success in data science projects.
Thinking in Bets: Making Smarter Decisions When You Don't Have All the Facts
Annie Duke - 2018
In Super Bowl XLIX, with the game on the line, Seahawks coach Pete Carroll called for a pass from the one-yard line instead of a handoff to his star running back. The pass was intercepted and the Seahawks lost. Critics called it the dumbest play in history. But was the call really that bad? Or did Carroll actually make a great move that was ruined by bad luck?

Even the best decision doesn't yield the best outcome every time. There's always an element of luck that you can't control, and there is always information that is hidden from view. So the key to long-term success (and avoiding worrying yourself to death) is to think in bets: How sure am I? What are the possible ways things could turn out? What decision has the highest odds of success? Did I land in the unlucky 10% on the strategy that works 90% of the time? Or is my success attributable to dumb luck rather than great decision making?

Annie Duke, a former World Series of Poker champion turned business consultant, draws on examples from business, sports, politics, and (of course) poker to share tools anyone can use to embrace uncertainty and make better decisions. For most people, it's difficult to say "I'm not sure" in a world that values, and even rewards, the appearance of certainty. But professional poker players are comfortable with the fact that great decisions don't always lead to great outcomes and bad decisions don't always lead to bad outcomes. By shifting your thinking from a need for certainty to a goal of accurately assessing what you know and what you don't, you'll be less vulnerable to reactive emotions, knee-jerk biases, and destructive habits in your decision making. You'll become more confident, calm, compassionate, and successful in the long run.
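Duke's 90/10 point is easy to check numerically; a toy simulation (hypothetical numbers, not from the book) shows why a single bad outcome says little about decision quality:

```python
# Toy simulation (hypothetical numbers): a decision that wins 90% of the
# time still loses roughly 1 trial in 10, so judging the decision by one
# unlucky outcome ("resulting," in poker terms) is a mistake.
import random

random.seed(42)
trials = 10_000
losses = sum(random.random() >= 0.9 for _ in range(trials))
print(f"Losses for the 'good' 90% decision: {losses}/{trials} "
      f"({losses / trials:.1%})")  # close to the expected 10%
```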
Design and Analysis of Experiments
Douglas C. Montgomery - 1976
Douglas Montgomery arms readers with the most effective approach for learning how to design, conduct, and analyze experiments that optimize performance in products and processes. He shows how to use statistically designed experiments to obtain information for characterization and optimization of systems, improve manufacturing processes, and design and develop new processes and products. You will also learn how to evaluate material alternatives in product design, improve the field performance, reliability, and manufacturing aspects of products, and conduct experiments effectively and efficiently. Discover how to improve the quality and efficiency of working systems with this highly acclaimed book. This 6th Edition:
- Places a strong focus on the use of the computer, providing output from two software products: Minitab and Design-Expert.
- Presents timely new examples as well as expanded coverage on adding runs to a fractional factorial to de-alias effects.
- Includes detailed discussions on how computers are currently used in the analysis and design of experiments.
- Offers new material on a number of important topics, including follow-up experimentation and split-plot design.
- Focuses even more sharply on factorial and fractional factorial design.
The Code Book: The Science of Secrecy from Ancient Egypt to Quantum Cryptography
Simon Singh - 1999
From Mary, Queen of Scots, trapped by her own code, to the Navajo Code Talkers who helped the Allies win World War II, to the incredible (and incredibly simple) logistical breakthrough that made Internet commerce secure, The Code Book tells the story of the most powerful intellectual weapon ever known: secrecy. Throughout the text are clear technical and mathematical explanations, and portraits of the remarkable personalities who wrote and broke the world’s most difficult codes. Accessible, compelling, and remarkably far-reaching, this book will forever alter your view of history and what drives it. It will also make you wonder how private that e-mail you just sent really is.
Proofiness: The Dark Arts of Mathematical Deception
Charles Seife - 2010
According to MSNBC, having a child makes you stupid. You actually lose IQ points. Good Morning America has announced that natural blondes will be extinct within two hundred years. Pundits estimated that there were more than a million demonstrators at a tea party rally in Washington, D.C., even though roughly sixty thousand were there. Numbers have peculiar powers: they can disarm skeptics, befuddle journalists, and hoodwink the public into believing almost anything. "Proofiness," as Charles Seife explains in this eye-opening book, is the art of using pure mathematics for impure ends, and he reminds readers that bad mathematics has a dark side. It is used to bring down beloved government officials and to appoint undeserving ones (both Democratic and Republican), to convict the innocent and acquit the guilty, to ruin our economy, and to fix the outcomes of future elections. This penetrating look at the intersection of math and society will appeal to readers of Freakonomics and the books of Malcolm Gladwell.
The Half-life of Facts: Why Everything We Know Has an Expiration Date
Samuel Arbesman - 2012
Smoking has gone from doctor recommended to deadly. We used to think the Earth was the center of the universe and that Pluto was a planet. For decades, we were convinced that the brontosaurus was a real dinosaur. In short, what we know about the world is constantly changing. But it turns out there’s an order to the state of knowledge, an explanation for how we know what we know.

Samuel Arbesman is an expert in the field of scientometrics—literally the science of science. Knowledge in most fields evolves systematically and predictably, and this evolution unfolds in a fascinating way that can have a powerful impact on our lives. Doctors with a rough idea of when their knowledge is likely to expire can be better equipped to keep up with the latest research. Companies and governments that understand how long new discoveries take to develop can improve decisions about allocating resources. And by tracing how and when language changes, each of us can better bridge generational gaps in slang and dialect.

Just as we know that a chunk of uranium can break down in a measurable amount of time—a radioactive half-life—so too any given field’s change in knowledge can be measured concretely. We can know when facts in aggregate are obsolete, the rate at which new facts are created, and even how facts spread.

Arbesman takes us through a wide variety of fields, including those that change quickly, over the course of a few years, or over the span of centuries. He shows that much of what we know consists of “mesofacts”—facts that change at a middle timescale, often over a single human lifetime. Throughout, he offers intriguing examples about the face of knowledge: what English majors can learn from a statistical analysis of The Canterbury Tales, why it’s so hard to measure a mountain, and why so many parents still tell kids to eat their spinach because it’s rich in iron.

The Half-life of Facts is a riveting journey into the counterintuitive fabric of knowledge. It can help us find new ways to measure the world while accepting the limits of how much we can know with certainty.
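As a rough gloss on the uranium analogy (my notation, not Arbesman's), the standard exponential-decay law gives the fraction of a field's facts still standing after time t:

```latex
% N_0 facts at time 0, t_{1/2} the field's empirical half-life:
N(t) = N_0 \left(\frac{1}{2}\right)^{t / t_{1/2}}
```

Measuring a field's t_{1/2} from citation and retraction data is, in essence, the scientometric program the blurb describes.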
Multiple View Geometry in Computer Vision
Richard Hartley - 2000
This book covers relevant geometric principles and how to represent objects algebraically so they can be computed and applied. Recent major developments in the theory and practice of scene reconstruction are described in detail in a unified framework. Richard Hartley and Andrew Zisserman provide comprehensive background material and explain how to apply the methods and implement the algorithms. First Edition HB (2000): 0-521-62304-9
Introduction to Machine Learning with Python: A Guide for Data Scientists
Andreas C. Müller - 2015
If you use Python, even as a beginner, this book will teach you practical ways to build your own machine learning solutions. With all the data available today, machine learning applications are limited only by your imagination. You'll learn the steps necessary to create a successful machine-learning application with Python and the scikit-learn library. Authors Andreas Müller and Sarah Guido focus on the practical aspects of using machine learning algorithms, rather than the math behind them. Familiarity with the NumPy and matplotlib libraries will help you get even more from this book. With this book, you'll learn:
- Fundamental concepts and applications of machine learning
- Advantages and shortcomings of widely used machine learning algorithms
- How to represent data processed by machine learning, including which data aspects to focus on
- Advanced methods for model evaluation and parameter tuning
- The concept of pipelines for chaining models and encapsulating your workflow
- Methods for working with text data, including text-specific processing techniques
- Suggestions for improving your machine learning and data science skills
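For a flavor of the workflow the blurb describes, here is a minimal scikit-learn sketch (my example, not taken from the book): load a dataset, hold out a test set, fit a model, and score it.

```python
# Minimal scikit-learn workflow (illustrative example, not from the book):
# load data, split off a test set, fit a simple classifier, and evaluate.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = KNeighborsClassifier(n_neighbors=3)
model.fit(X_train, y_train)
print(f"Test set accuracy: {model.score(X_test, y_test):.2f}")
```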
Book of Proof
Richard Hammack - 2009
It is a bridge from the computational courses (such as calculus or differential equations) that students typically encounter in their first year of college to a more abstract outlook. It lays a foundation for more theoretical courses such as topology, analysis and abstract algebra. Although it may be more meaningful to the student who has had some calculus, there is really no prerequisite other than a measure of mathematical maturity. Topics include sets, logic, counting, methods of conditional and non-conditional proof, disproof, induction, relations, functions and infinite cardinality.