Introduction to Probability


Dimitri P. Bertsekas - 2002
    This is the textbook currently used for "Probabilistic Systems Analysis," an introductory probability course at the Massachusetts Institute of Technology attended by a large number of undergraduate and graduate students. The book covers the fundamentals of probability theory (probabilistic models, discrete and continuous random variables, multiple random variables, and limit theorems), which are typically part of a first course on the subject. It also contains a number of more advanced topics, from which an instructor can choose to match the goals of a particular course. These topics include transforms, sums of random variables, least squares estimation, the bivariate normal distribution, and a fairly detailed introduction to Bernoulli, Poisson, and Markov processes. The book strikes a balance between simplicity in exposition and sophistication in analytical reasoning. Some of the more mathematically rigorous analysis is explained only intuitively in the text but is developed in detail (at the level of advanced calculus) in the numerous solved theoretical problems. The book has been widely adopted for classroom use in introductory probability courses in the USA and abroad.
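    As a small illustration of how the limit theorems and the Bernoulli/Poisson material fit together, the sketch below checks numerically that the number of successes in many low-probability Bernoulli trials is approximately Poisson distributed. The parameters are arbitrary choices for illustration, not an example from the book.

```python
# Minimal sketch (not from the book): the number of successes in many
# low-probability Bernoulli trials is approximately Poisson distributed.
# n, p, and the number of simulated repetitions are arbitrary choices.
import numpy as np
from math import exp, factorial

rng = np.random.default_rng(0)
n, p = 1000, 0.003            # many trials, small success probability
lam = n * p                   # matching Poisson rate

counts = rng.binomial(n, p, size=100_000)   # simulate the Bernoulli process many times

for k in range(6):
    empirical = float(np.mean(counts == k))
    poisson = exp(-lam) * lam ** k / factorial(k)
    print(k, round(empirical, 4), round(poisson, 4))   # the two columns nearly agree
```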

Mathematics In The Modern World: Readings From Scientific American


Morris Kline - 1968
    

Linear Algebra and Its Applications


Gilbert Strang - 1976
    While the mathematics is there, the effort is not all concentrated on proofs. Strang's emphasis is on understanding: he explains concepts rather than merely deducing them. The book is written in an informal and personal style and teaches real mathematics. The gears change in Chapter 2 as students reach the introduction of vector spaces. Throughout the book, the theory is motivated and reinforced by genuine applications, allowing pure mathematicians to teach applied mathematics.

Superforecasting: The Art and Science of Prediction


Philip E. Tetlock - 2015
    Unfortunately, people tend to be terrible forecasters. As Wharton professor Philip Tetlock showed in a landmark 2005 study, even experts’ predictions are only slightly better than chance. However, an important and underreported conclusion of that study was that some experts do have real foresight, and Tetlock has spent the past decade trying to figure out why. What makes some people so good? And can this talent be taught? In Superforecasting, Tetlock and coauthor Dan Gardner offer a masterwork on prediction, drawing on decades of research and the results of a massive, government-funded forecasting tournament. The Good Judgment Project involves tens of thousands of ordinary people—including a Brooklyn filmmaker, a retired pipe installer, and a former ballroom dancer—who set out to forecast global events. Some of the volunteers have turned out to be astonishingly good. They’ve beaten other benchmarks, competitors, and prediction markets. They’ve even beaten the collective judgment of intelligence analysts with access to classified information. They are "superforecasters." In this groundbreaking and accessible book, Tetlock and Gardner show us how we can learn from this elite group. Weaving together stories of forecasting successes (the raid on Osama bin Laden’s compound) and failures (the Bay of Pigs) and interviews with a range of high-level decision makers, from David Petraeus to Robert Rubin, they show that good forecasting doesn’t require powerful computers or arcane methods. It involves gathering evidence from a variety of sources, thinking probabilistically, working in teams, keeping score, and being willing to admit error and change course. Superforecasting offers the first demonstrably effective way to improve our ability to predict the future—whether in business, finance, politics, international affairs, or daily life—and is destined to become a modern classic.
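    "Keeping score" in tournaments like the Good Judgment Project is done with the Brier score, the mean squared difference between forecast probabilities and what actually happened. The sketch below shows how that scoring rule works; the forecasts and outcomes are made up for illustration.

```python
# Minimal sketch of the Brier score, the scoring rule used to "keep score" in
# forecasting tournaments such as the Good Judgment Project.
# The forecasts and outcomes below are hypothetical.

def brier_score(probabilities, outcomes):
    """Mean squared difference between forecast probabilities and realized
    outcomes (1 = event happened, 0 = it did not). Lower is better;
    a constant 50% forecast always scores 0.25."""
    return sum((p - o) ** 2 for p, o in zip(probabilities, outcomes)) / len(outcomes)

outcomes = [1, 0, 0, 1, 1]
sharp  = [0.9, 0.1, 0.2, 0.8, 0.7]   # confident and well calibrated
hedger = [0.5, 0.5, 0.5, 0.5, 0.5]   # never commits to anything

print(brier_score(sharp, outcomes))   # ~0.038: rewarded for confident, correct calls
print(brier_score(hedger, outcomes))  # 0.25: never wrong, but never informative
```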

The Flaw of Averages: Why We Underestimate Risk in the Face of Uncertainty


Sam L. Savage - 2009
    As the recent collapse on Wall Street shows, we are often ill-equipped to deal with uncertainty and risk. Yet every day we base our personal and business plans on uncertainties, whether they be next month's sales, next year's costs, or tomorrow's stock price. In The Flaw of Averages, Sam Savage, known for his creative exposition of difficult subjects, describes common avoidable mistakes in assessing risk in the face of uncertainty. Along the way, he shows why plans based on average assumptions are wrong, on average, in areas as diverse as healthcare, accounting, the War on Terror, and climate change. In his chapter on Sex and the Central Limit Theorem, he bravely grasps the literary third rail of gender differences. Instead of statistical jargon, Savage presents complex concepts in plain English. In addition, a tightly integrated web site contains numerous animations and simulations to further connect the seat of the reader's intellect to the seat of their pants. The Flaw of Averages typically results when someone plugs a single number into a spreadsheet to represent an uncertain future quantity. Savage finishes the book with a discussion of the emerging field of Probability Management, which cures this problem through a new technology that can pack thousands of numbers into a single spreadsheet cell. Praise for The Flaw of Averages: "Statistical uncertainties are pervasive in decisions we make every day in business, government, and our personal lives. Sam Savage's lively and engaging book gives any interested reader the insight and the tools to deal effectively with those uncertainties. I highly recommend The Flaw of Averages." --William J. Perry, Former U.S. Secretary of Defense. "Enterprise analysis under uncertainty has long been an academic ideal. . . . In this profound and entertaining book, Professor Savage shows how to make all this practical, practicable, and comprehensible." --Harry Markowitz, Nobel Laureate in Economics
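    The central mistake is easy to reproduce in a few lines: for any nonlinear model, the output at the average input generally differs from the average output over the actual range of scenarios. The sketch below uses a made-up capacity-planning model, purely for illustration, to show the gap.

```python
# Minimal sketch of the Flaw of Averages: evaluating a nonlinear model at the
# average input is not the same as averaging the model over the uncertain input.
# The demand/profit model and its numbers are hypothetical.
import random

random.seed(0)

def profit(demand, capacity=100, unit_margin=10, unit_penalty=4):
    """Profit on units sold, minus a penalty for unused capacity."""
    sold = min(demand, capacity)
    return unit_margin * sold - unit_penalty * (capacity - sold)

# Uncertain demand: uniform between 50 and 150 units, so the average is 100.
demands = [random.uniform(50, 150) for _ in range(100_000)]
avg_demand = sum(demands) / len(demands)

plan_on_average = profit(avg_demand)                              # plug in one number
true_average = sum(profit(d) for d in demands) / len(demands)     # average the scenarios

print(plan_on_average)  # ~1000: the plan assumes full capacity sold, no penalty
print(true_average)     # ~825: half the time demand falls short, so the plan is optimistic
```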

Professional Blackjack


Stanford Wong - 1980
    Professional Blackjack was written by Stanford Wong, one of the country's leading gambling authorities.

Numerical Methods for Scientists and Engineers


Richard Hamming - 1973
    The book is unique in its emphasis on the frequency approach and its use in the solution of problems. Contents include: Fundamentals and Algorithms; Polynomial Approximation — Classical Theory; Fourier Approximation — Modern Theory; and Exponential Approximation.
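    As a small taste of the polynomial and Fourier approximation themes listed in the contents, the sketch below approximates a smooth periodic function both ways; the target function and the orders used are arbitrary choices, not examples from the book.

```python
# Minimal sketch: approximate a smooth periodic function with a least-squares
# polynomial and with a truncated Fourier series. The target function, the
# polynomial degree, and the number of retained frequencies are arbitrary.
import numpy as np

x = np.linspace(0.0, 2.0 * np.pi, 200, endpoint=False)
f = np.exp(np.sin(x))                       # smooth periodic target sampled on a grid

# Polynomial approximation: degree-6 least-squares fit.
poly = np.polynomial.Polynomial.fit(x, f, deg=6)
poly_error = np.max(np.abs(poly(x) - f))

# Fourier approximation: keep only the 4 lowest frequencies.
coeffs = np.fft.rfft(f)
coeffs[4:] = 0.0
fourier_error = np.max(np.abs(np.fft.irfft(coeffs, n=len(x)) - f))

print(poly_error, fourier_error)            # both approximations track f closely
```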

Networks: A Very Short Introduction


Guido Caldarelli - 2012
    It is impossible to understand the spread of an epidemic, a computer virus, large-scale blackouts, or massive extinctions without taking into account the network structure that underlies all these phenomena. In this Very Short Introduction, Guido Caldarelli and Michele Catanzaro discuss the nature and variety of networks, using everyday examples from society, technology, nature, and history to explain and understand the science of network theory. They show the ubiquitous role of networks; how networks self-organize; why the rich get richer; and how networks can spontaneously collapse. They conclude by highlighting how the findings of complex network theory have very wide and important applications in genetics, ecology, communications, economics, and sociology.
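    The "rich get richer" effect they describe is commonly modeled by preferential attachment, in which each new node links to an existing node with probability proportional to that node's current degree. The sketch below is a minimal simulation of that mechanism (an illustration, not code from the book).

```python
# Minimal preferential-attachment sketch: each new node attaches to one existing
# node chosen with probability proportional to its current degree, so
# well-connected nodes accumulate links fastest ("the rich get richer").
import random

random.seed(1)

degrees = {0: 1, 1: 1}          # start with two connected nodes
attachment_pool = [0, 1]        # node i appears degrees[i] times in this list

for new_node in range(2, 10_000):
    target = random.choice(attachment_pool)   # degree-proportional choice
    degrees[new_node] = 1
    degrees[target] += 1
    attachment_pool.extend([new_node, target])

hubs = sorted(degrees, key=degrees.get, reverse=True)[:5]
print([(node, degrees[node]) for node in hubs])   # a few hubs hold a large share of the links
```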

Information Theory, Inference and Learning Algorithms


David J.C. MacKay - 2002
    These topics lie at the heart of many exciting areas of contemporary science and engineering - communication, signal processing, data mining, machine learning, pattern recognition, computational neuroscience, bioinformatics, and cryptography. This textbook introduces theory in tandem with applications. Information theory is taught alongside practical communication systems, such as arithmetic coding for data compression and sparse-graph codes for error-correction. A toolbox of inference techniques, including message-passing algorithms, Monte Carlo methods, and variational approximations, is developed alongside applications of these tools to clustering, convolutional codes, independent component analysis, and neural networks. The final part of the book describes the state of the art in error-correcting codes, including low-density parity-check codes, turbo codes, and digital fountain codes -- the twenty-first century standards for satellite communications, disk drives, and data broadcast. Richly illustrated, filled with worked examples and over 400 exercises, some with detailed solutions, David MacKay's groundbreaking book is ideal for self-learning and for undergraduate or graduate courses. Interludes on crosswords, evolution, and sex provide entertainment along the way. In sum, this is a textbook on information, communication, and coding for a new generation of students, and an unparalleled entry point into these subjects for professionals in areas as diverse as computational biology, financial engineering, and machine learning.
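    A minimal illustration of the quantity at the center of the compression story: for an i.i.d. source, the Shannon entropy in bits per symbol is a lower bound on the average code length of any lossless code, which is the benchmark a compressor such as the arithmetic coder mentioned above is judged against. The sample text below is an arbitrary choice.

```python
# Minimal sketch: Shannon entropy of a source, in bits per symbol, the lower
# bound on average code length for any lossless code on an i.i.d. source.
# The sample text is arbitrary.
from collections import Counter
from math import log2

def entropy(probabilities):
    """H = -sum(p * log2(p)), in bits per symbol."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

text = "abracadabra"
counts = Counter(text)                       # symbol frequencies: a x5, b x2, r x2, c x1, d x1
probs = [c / len(text) for c in counts.values()]

print(round(entropy(probs), 3))              # ~2.04 bits/symbol, versus 8 bits/symbol raw ASCII
```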

Chances Are . . .: Adventures in Probability


Michael Kaplan - 2003
    All things are possible, but only one thing actually happens; everything else is in the realm of probability. The twin disciplines of probability and statistics underpin every modern science and sketch the shape of all purposeful group activity (politics, economics, medicine, law, sports), giving humans a handle on the essential uncertainty of their existence. Yet while we are all aware of the hard facts, most of us still refuse to take account of probability, preferring to drive, not fly; buying into market blips; smoking cigarettes; denying we will ever age. There are some people, though (gamblers, risk buyers, forensic experts, doctors, strategists), who find probability's mass of incomplete uncertainties delightful and revelatory. "Chances Are" is their story. Combining philosophical and historical background with portraits of the men and women who command the forces of probability, this engaging, wide-ranging, and clearly written volume will be welcomed not only by the proven audiences for popular books like "E=MC2" and "The Golden Ratio" but by anyone interested in the workings of fate.

The Mathematics of Poker


Bill Chen - 2006
    By the mid-1990s the old school grizzled traders had been replaced by a new breed of quantitative analysts, applying mathematics to the "art" of trading and making of it a science. A similar phenomenon is happening in poker. The grizzled "road gamblers" are being replaced by a new generation of players who have challenged many of the assumptions that underlie traditional approaches to the game. One of the most important features of this new approach is a reliance on quantitative analysis and the application of mathematics to the game. This book provides an introduction to quantitative techniques as applied to poker and to game theory, a branch of mathematics particularly applicable to poker, in a manner that makes seemingly difficult topics accessible to players without a strong mathematical background.
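    A flavor of the game-theoretic reasoning involved: a bettor can choose a bluffing frequency that makes the opponent exactly indifferent between calling and folding, which is the equilibrium condition. The worked example below is illustrative and not taken from the book.

```python
# Minimal sketch of a game-theoretic indifference calculation of the kind used
# in poker analysis (illustrative; not an example from the book).
# A player bets `bet` into a pot of size `pot`. If x is the share of bluffs in
# the betting range, the opponent is indifferent between calling and folding when
#     x * (pot + bet) = (1 - x) * bet
# which gives x = bet / (pot + 2 * bet).

def equilibrium_bluff_share(pot, bet):
    return bet / (pot + 2 * bet)

for bet_size in (0.5, 1.0, 2.0):                     # half pot, full pot, overbet
    x = equilibrium_bluff_share(1.0, bet_size)
    print(f"bet {bet_size} x pot -> bluff with {x:.0%} of the betting range")
# A pot-sized bet gives 33%, i.e. the familiar 1:2 bluff-to-value ratio.
```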

Operations Research: An Introduction


Hamdy A. Taha - 1976
    The applications and computations in operations research are emphasized. Significantly revised, this text streamlines the coverage of the theory, applications, and computations of operations research. Numerical examples are effectively used to explain complex mathematical concepts. A separate chapter of fully analyzed applications aptly demonstrates the diverse use of OR. The popular commercial and tutorial software AMPL, Excel, Excel Solver, and Tora are used throughout the book to solve practical problems and to test theoretical concepts. New materials include Markov chains, TSP heuristics, new LP models, and a totally new simplex-based approach to LP sensitivity analysis.
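    As a small taste of the LP models the text works through, the sketch below solves a toy two-variable product-mix problem with SciPy rather than the AMPL/Excel Solver/TORA tools the book uses; the products, profits, and resource limits are made up.

```python
# Minimal linear-programming sketch in the spirit of the LP models in the text,
# solved with SciPy instead of AMPL/Excel Solver/TORA. All numbers are hypothetical.
from scipy.optimize import linprog

# maximize 3*x1 + 5*x2  (profit)  ==  minimize -3*x1 - 5*x2
c = [-3, -5]

# resource constraints: x1 <= 4,  2*x2 <= 12,  3*x1 + 2*x2 <= 18
A_ub = [[1, 0],
        [0, 2],
        [3, 2]]
b_ub = [4, 12, 18]

result = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(result.x)        # optimal plan: x1 = 2, x2 = 6
print(-result.fun)     # maximum profit: 36
```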

Introductory Functional Analysis with Applications


Erwin Kreyszig - 1978
    With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: Emil Artin, Geometric Algebra; R. W. Carter, Simple Groups of Lie Type; Richard Courant, Differential and Integral Calculus, Volume I; Richard Courant, Differential and Integral Calculus, Volume II; Richard Courant & D. Hilbert, Methods of Mathematical Physics, Volume I; Richard Courant & D. Hilbert, Methods of Mathematical Physics, Volume II; Harold M. S. Coxeter, Introduction to Modern Geometry, Second Edition; Charles W. Curtis & Irving Reiner, Representation Theory of Finite Groups and Associative Algebras; Nelson Dunford & Jacob T. Schwartz, Linear Operators, Part One: General Theory; Nelson Dunford & Jacob T. Schwartz, Linear Operators, Part Two: Spectral Theory, Self Adjoint Operators in Hilbert Space; Nelson Dunford & Jacob T. Schwartz, Linear Operators, Part Three: Spectral Operators; Peter Henrici, Applied and Computational Complex Analysis, Volume I: Power Series, Integration, Conformal Mapping, Location of Zeros; Peter Hilton & Yet-Chiang Wu, A Course in Modern Algebra; Harry Hochstadt, Integral Equations; Erwin Kreyszig, Introductory Functional Analysis with Applications; P. M. Prenter, Splines and Variational Methods; C. L. Siegel, Topics in Complex Function Theory, Volume I: Elliptic Functions and Uniformization Theory; C. L. Siegel, Topics in Complex Function Theory, Volume II: Automorphic Functions and Abelian Integrals; C. L. Siegel, Topics in Complex Function Theory, Volume III: Abelian Functions and Modular Functions of Several Variables; J. J. Stoker, Differential Geometry.

Introduction to Error Analysis: The Study of Uncertainties in Physical Measurements


John R. Taylor - 1982
    It is designed as a reference for students in the physical sciences and engineering.

Mostly Harmless Econometrics: An Empiricist's Companion


Joshua D. Angrist - 2008
    In the modern experimentalist paradigm, econometric techniques address clear causal questions such as: Do smaller classes increase learning? Should wife batterers be arrested? How much does education raise wages? Mostly Harmless Econometrics shows how the basic tools of applied econometrics allow the data to speak. In addition to econometric essentials, Mostly Harmless Econometrics covers important new extensions--regression-discontinuity designs and quantile regression--as well as how to get standard errors right. Joshua Angrist and Jorn-Steffen Pischke explain why fancier econometric techniques are typically unnecessary and even dangerous. The applied econometric methods emphasized in this book are easy to use and relevant for many areas of contemporary social science. The book offers an irreverent review of econometric essentials; a focus on the tools that applied researchers use most; chapters on regression-discontinuity designs, quantile regression, and standard errors; many empirical examples; and a clear and concise resource with wide applications.
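    Behind questions like "how much does education raise wages?" the workhorse tool is regression with a treatment variable plus controls. The sketch below estimates such an effect by ordinary least squares on simulated data; the data-generating process is made up purely for illustration, not an example from the book.

```python
# Minimal sketch: estimate a treatment effect by OLS (regression of an outcome
# on a treatment dummy plus a control) using simulated data with a known effect.
# The data-generating process here is hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n = 5_000

ability = rng.normal(size=n)                                    # control variable
treated = (rng.random(n) < 0.5).astype(float)                   # randomly assigned treatment
outcome = 2.0 * treated + 1.5 * ability + rng.normal(size=n)    # true effect = 2.0

X = np.column_stack([np.ones(n), treated, ability])             # intercept, treatment, control
beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)

print(beta[1])   # estimated treatment effect, close to the true value of 2.0
```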