Book picks similar to Stochastic Calculus for Finance I: The Binomial Asset Pricing Model by Steven E. Shreve
The Signal and the Noise: Why So Many Predictions Fail—But Some Don't
Nate Silver - 2012
Nate Silver solidified his standing as the nation's foremost political forecaster with his near-perfect prediction of the 2012 election. Silver is the founder and editor-in-chief of FiveThirtyEight.com. Drawing on his own groundbreaking work, Silver examines the world of prediction, investigating how we can distinguish a true signal from a universe of noisy data. Most predictions fail, often at great cost to society, because most of us have a poor understanding of probability and uncertainty. Both experts and laypeople mistake more confident predictions for more accurate ones. But overconfidence is often the reason for failure. If our appreciation of uncertainty improves, our predictions can get better too. This is the "prediction paradox": the more humility we have about our ability to make predictions, the more successful we can be in planning for the future. In keeping with his own aim to seek truth from data, Silver visits the most successful forecasters in a range of areas, from hurricanes to baseball, from the poker table to the stock market, from Capitol Hill to the NBA. He explains and evaluates how these forecasters think and what bonds they share. What lies behind their success? Are they good, or just lucky? What patterns have they unraveled? And are their forecasts really right? He explores unanticipated commonalities and exposes unexpected juxtapositions. And sometimes, it is not so much how good a prediction is in an absolute sense that matters but how good it is relative to the competition. In other cases, prediction is still a very rudimentary, and dangerous, science. Silver observes that the most accurate forecasters tend to have a superior command of probability, and they tend to be both humble and hardworking. They distinguish the predictable from the unpredictable, and they notice a thousand little details that lead them closer to the truth. Because of their appreciation of probability, they can distinguish the signal from the noise.
Mathematical Methods for Physicists
George B. Arfken - 1970
This work includes coverage of differential forms and the elegant differential-form versions of Maxwell's equations, as well as a chapter on probability and statistics. Mathematical relations are illustrated and proved throughout.
The Cult of Statistical Significance: How the Standard Error Costs Us Jobs, Justice, and Lives
Stephen Thomas Ziliak - 2008
“If it takes a book to get it across, I hope this book will do it. It ought to.” --Thomas Schelling, Distinguished University Professor, School of Public Policy, University of Maryland, and 2005 Nobel Prize Laureate in Economics

“With humor, insight, piercing logic and a nod to history, Ziliak and McCloskey show how economists--and other scientists--suffer from a mass delusion about statistical analysis. The quest for statistical significance that pervades science today is a deeply flawed substitute for thoughtful analysis. . . . Yet few participants in the scientific bureaucracy have been willing to admit what Ziliak and McCloskey make clear: the emperor has no clothes.” --Kenneth Rothman, Professor of Epidemiology, Boston University School of Public Health

The Cult of Statistical Significance shows, field by field, how “statistical significance,” a technique that dominates many sciences, has been a huge mistake. The authors find that researchers in a broad spectrum of fields, from agronomy to zoology, employ “testing” that doesn’t test and “estimating” that doesn’t estimate. The facts will startle the outside reader: how could a group of brilliant scientists wander so far from scientific magnitudes? This study will encourage scientists who want to know how to get the statistical sciences back on track and fulfill their quantitative promise. The book shows for the first time how wide the disaster is, and how bad for science, and it traces the problem to its historical, sociological, and philosophical roots.

Stephen T. Ziliak is the author or editor of many articles and two books. He currently lives in Chicago, where he is Professor of Economics at Roosevelt University. Deirdre N. McCloskey, Distinguished Professor of Economics, History, English, and Communication at the University of Illinois at Chicago, is the author of twenty books and three hundred scholarly articles. She has held Guggenheim and National Humanities Fellowships. She is best known for How to Be Human* Though an Economist (University of Michigan Press, 2000) and her most recent book, The Bourgeois Virtues: Ethics for an Age of Commerce (2006).
Inefficient Markets: An Introduction to Behavioral Finance
Andrei Shleifer - 2000
The efficient markets hypothesis states that securities prices in financial markets must equal fundamental values, either because all investors are rational or because arbitrage eliminates pricing anomalies. This book describes an alternative approach to the study of financial markets: behavioral finance. This approach starts with the observation that the assumptions of investor rationality and perfect arbitrage are overwhelmingly contradicted by both psychological and institutional evidence. In actual financial markets, less than fully rational investors trade against arbitrageurs whose resources are limited by risk aversion, short horizons, and agency problems. The book presents models of such markets. These models explain the available financial data more accurately than the efficient markets hypothesis, and generate new predictions about security prices. By summarizing and expanding the research in behavioral finance, the book builds a new theoretical and empirical foundation for the economic analysis of real-world markets.
Schaum's Outline of Calculus
Frank Ayres Jr. - 1990
Students working through this outline will also find the related analytic geometry much easier. The clear review of algebra and geometry in this edition will make calculus easier for students who wish to strengthen their knowledge in these areas. Updated to meet the emphasis in current courses, this new edition of a popular guide--more than 104,000 copies of the prior edition were sold--includes problems and examples using graphing calculators.
The Flaw of Averages: Why We Underestimate Risk in the Face of Uncertainty
Sam L. Savage - 2009
As the recent collapse on Wall Street shows, we are often ill-equipped to deal with uncertainty and risk. Yet every day we base our personal and business plans on uncertainties, whether they be next month's sales, next year's costs, or tomorrow's stock price. In The Flaw of Averages, Sam Savage, known for his creative exposition of difficult subjects, describes common avoidable mistakes in assessing risk in the face of uncertainty. Along the way, he shows why plans based on average assumptions are wrong, on average, in areas as diverse as healthcare, accounting, the War on Terror, and climate change. In his chapter on Sex and the Central Limit Theorem, he bravely grasps the literary third rail of gender differences. Instead of statistical jargon, Savage presents complex concepts in plain English. In addition, a tightly integrated web site contains numerous animations and simulations to further connect the seat of the reader's intellect to the seat of their pants. The Flaw of Averages typically results when someone plugs a single number into a spreadsheet to represent an uncertain future quantity. Savage finishes the book with a discussion of the emerging field of Probability Management, which cures this problem through a new technology that can pack thousands of numbers into a single spreadsheet cell.

Praise for The Flaw of Averages:

"Statistical uncertainties are pervasive in decisions we make every day in business, government, and our personal lives. Sam Savage's lively and engaging book gives any interested reader the insight and the tools to deal effectively with those uncertainties. I highly recommend The Flaw of Averages." --William J. Perry, Former U.S. Secretary of Defense

"Enterprise analysis under uncertainty has long been an academic ideal. . . . In this profound and entertaining book, Professor Savage shows how to make all this practical, practicable, and comprehensible." --Harry Markowitz, Nobel Laureate in Economics
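The core idea, plugging a single average number into a model of an uncertain quantity, can be made concrete with a small simulation. The sketch below is not from the book; the demand range, capacity, and penalty cost are invented purely for illustration.

```python
import random

random.seed(0)

# Hypothetical: monthly demand is uncertain, uniform between 500 and 1500 units.
demands = [random.uniform(500, 1500) for _ in range(100_000)]
average_demand = sum(demands) / len(demands)

def cost(demand, capacity=1000, unit_cost=1.0, overflow_penalty=5.0):
    """Toy cost model: demand beyond capacity incurs a steep penalty."""
    base = unit_cost * min(demand, capacity)
    overflow = max(demand - capacity, 0) * overflow_penalty
    return base + overflow

cost_at_average = cost(average_demand)                       # plan built on the average input
average_cost = sum(cost(d) for d in demands) / len(demands)  # average over the actual uncertainty

print(f"cost at average demand: {cost_at_average:,.0f}")
print(f"average cost:           {average_cost:,.0f}")
```

The two numbers differ because the cost function is nonlinear, so the cost of the average scenario understates the average cost across scenarios.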
Econometrics
Fumio Hayashi - 2000
Econometrics introduces first-year Ph.D. students to standard graduate econometrics material from a modern perspective. It covers all the standard material necessary for understanding the principal techniques of econometrics, from ordinary least squares through cointegration. The book is also distinctive in developing both time-series and cross-section analysis fully, giving the reader a unified framework for understanding and integrating results. Econometrics has many useful features and covers all the important topics in econometrics in a succinct manner. All the estimation techniques that could possibly be taught in a first-year graduate course, except maximum likelihood, are treated as special cases of GMM (generalized method of moments). Maximum likelihood estimators for a variety of models (such as probit and tobit) are collected in a separate chapter. This arrangement enables students to learn various estimation techniques in an efficient manner. Eight of the ten chapters include a serious empirical application drawn from labor economics, industrial organization, domestic and international finance, and macroeconomics. These empirical exercises at the end of each chapter give students hands-on experience in applying the techniques covered in the chapter. The exposition is rigorous yet accessible to students who have a working knowledge of very basic linear algebra and probability theory. All the results are stated as propositions, so that students can see the points of the discussion and also the conditions under which those results hold. Most propositions are proved in the text. For those who intend to write a thesis on applied topics, the empirical applications of the book are a good way to learn how to conduct empirical research. For the theoretically inclined, the no-compromise treatment of the basic techniques is good preparation for more advanced theory courses.
Probability And Statistics For Engineers And Scientists
Ronald E. Walpole - 1978
This edition offers extensively updated coverage, new problem sets, and chapter-ending material to enhance the book's relevance to today's engineers and scientists. It includes new problem sets demonstrating updated applications to engineering as well as biological, physical, and computer science, and it emphasizes key ideas along with the risks and hazards associated with practical application of the material. New material covers the difference between discrete and continuous measurements; binary data; quartiles; the importance of experimental design; “dummy” variables; rules for expectations and variances of linear functions; the Poisson distribution; Weibull and lognormal distributions; the central limit theorem; and data plotting. The book also introduces Bayesian statistics, including its applications to many fields. For those interested in learning more about probability and statistics.
Quantum Computation and Quantum Information
Michael A. Nielsen - 2000
The authors describe what a quantum computer is, how it can be used to solve problems faster than familiar "classical" computers, and how quantum computers can be implemented in the real world. A wealth of accompanying figures and exercises illustrate and develop the material in more depth. The book concludes with an explanation of how quantum states can be used to perform remarkable feats of communication, and of how it is possible to protect quantum states against the effects of noise.
The Lady Tasting Tea: How Statistics Revolutionized Science in the Twentieth Century
David Salsburg - 2001
At a summer tea party in Cambridge, England, a guest states that tea poured into milk tastes different from milk poured into tea. Her notion is shouted down by the scientific minds of the group. But one man, Ronald Fisher, proposes to scientifically test the hypothesis. There is no better person to conduct such an experiment, for Fisher is a pioneer in the field of statistics. The Lady Tasting Tea spotlights not only Fisher's theories but also the revolutionary ideas of dozens of men and women that affect our modern everyday lives. Writing with verve and wit, David Salsburg traces breakthroughs ranging from the rise and fall of Karl Pearson's theories to the methods of quality control that rebuilt postwar Japan's economy, including a pivotal early study on the capacity of a small beer cask at the Guinness brewing factory. Brimming with intriguing tidbits and colorful characters, The Lady Tasting Tea salutes the spirit of those who dared to look at the world in a new way.
Financial Markets and Institutions (Prentice Hall Series in Finance)
Frederic S. Mishkin - 1994
A unifying framework uses a few core principles to organize readers' thinking; the text then examines the models in real-world scenarios from a practitioner's perspective. By analyzing these applications, readers develop the critical-thinking and problem-solving skills necessary to respond to challenging situations in their future careers.

Contents:
Introduction: Why Study Financial Markets and Institutions?; Overview of the Financial System.
Fundamentals of Financial Markets: What Do Interest Rates Mean and What Is Their Role in Valuation?; Why Do Interest Rates Change?; How Do Risk and Term Structure Affect Interest Rates?; Are Financial Markets Efficient?
Central Banking and the Conduct of Monetary Policy: Structure of Central Banks and the Federal Reserve System; Conduct of Monetary Policy: Tools, Goals, Strategy, and Tactics.
Financial Markets: The Money Markets; The Bond Market; The Stock Market; The Mortgage Markets; The Foreign Exchange Market; The International Financial System.
Fundamentals of Financial Institutions: Why Do Financial Institutions Exist?; What Should Be Done About Conflicts of Interest? A Central Issue in Business Ethics.
The Financial Institutions Industry: Banking and the Management of Financial Institutions; Commercial Banking Industry: Structure and Competition; Savings Associations and Credit Unions; Banking Regulation; The Mutual Fund Industry; Insurance Companies and Pension Funds; Investment Banks, Security Brokers and Dealers, and Venture Capital Firms.
The Management of Financial Institutions: Risk Management in Financial Institutions; Hedging with Financial Derivatives.
On the Web: Finance Companies.

For all readers interested in financial markets and institutions.
The Little Book of Mathematical Principles, Theories, & Things
Robert Solomon - 2008
The Book of Why: The New Science of Cause and Effect
Judea Pearl - 2018
Today, the long-standing scientific taboo against speaking of cause and effect is dead. The causal revolution, instigated by Judea Pearl and his colleagues, has cut through a century of confusion and established causality -- the study of cause and effect -- on a firm scientific basis. His work explains how we can know easy things, like whether it was rain or a sprinkler that made a sidewalk wet; and how to answer hard questions, like whether a drug cured an illness. Pearl's work enables us to know not just whether one thing causes another: it lets us explore the world that is and the worlds that could have been. It shows us the essence of human thought and the key to artificial intelligence. Anyone who wants to understand either needs The Book of Why.
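The sidewalk example lends itself to a tiny simulation (a sketch for illustration only, not Pearl's own formalism; the probabilities below are invented): seeing a wet sidewalk makes a sprinkler more plausible, while learning that it rained partly "explains away" the sprinkler.

```python
import random

random.seed(1)

# Hypothetical prior probabilities, chosen only for illustration.
P_RAIN, P_SPRINKLER = 0.3, 0.4
N = 200_000

samples = []
for _ in range(N):
    rain = random.random() < P_RAIN
    sprinkler = random.random() < P_SPRINKLER  # independent of rain in this toy model
    wet = rain or sprinkler                    # either cause makes the sidewalk wet
    samples.append((rain, sprinkler, wet))

def prob(event, given):
    """Monte Carlo estimate of P(event | given)."""
    conditioned = [s for s in samples if given(s)]
    return sum(event(s) for s in conditioned) / len(conditioned)

# Seeing a wet sidewalk raises the probability of the sprinkler above its prior...
print(f"P(sprinkler | wet)       = {prob(lambda s: s[1], lambda s: s[2]):.2f}")
# ...but learning that it also rained 'explains away' the sprinkler.
print(f"P(sprinkler | wet, rain) = {prob(lambda s: s[1], lambda s: s[2] and s[0]):.2f}")
```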
The Art of Statistics: How to Learn from Data
David Spiegelhalter - 2019
Statistics are everywhere, as integral to science as they are to business, and in the popular media hundreds of times a day. In this age of big data, a basic grasp of statistical literacy is more important than ever if we want to separate fact from fiction, the ostentatious embellishments from the raw evidence -- and even more so if we hope to participate in the future, rather than being simple bystanders. In The Art of Statistics, world-renowned statistician David Spiegelhalter shows readers how to derive knowledge from raw data by focusing on the concepts and connections behind the math. Drawing on real-world examples to introduce complex issues, he shows us how statistics can help us determine the luckiest passenger on the Titanic, whether a notorious serial killer could have been caught earlier, and if screening for ovarian cancer is beneficial. The Art of Statistics not only shows us how mathematicians have used statistical science to solve these problems -- it teaches us how we too can think like statisticians. We learn how to clarify our questions, assumptions, and expectations when approaching a problem, and -- perhaps even more importantly -- we learn how to responsibly interpret the answers we receive. Combining the incomparable insight of an expert with the playful enthusiasm of an aficionado, The Art of Statistics is the definitive guide to stats that every modern person needs.
Statistical Rethinking: A Bayesian Course with Examples in R and Stan
Richard McElreath - 2015
Reflecting the need for even minor programming in today's model-based statistics, the book pushes readers to perform step-by-step calculations that are usually automated. This unique computational approach ensures that readers understand enough of the details to make reasonable choices and interpretations in their own modeling work. The text presents generalized linear multilevel models from a Bayesian perspective, relying on a simple logical interpretation of Bayesian probability and maximum entropy. It covers everything from the basics of regression to multilevel models. The author also discusses measurement error, missing data, and Gaussian process models for spatial and network autocorrelation. By using complete R code examples throughout, this book provides a practical foundation for performing statistical inference. Designed for both PhD students and seasoned professionals in the natural and social sciences, it prepares them for more advanced or specialized statistical modeling.

Web resource: The book is accompanied by an R package (rethinking) that is available on the author's website and GitHub. The two core functions (map and map2stan) of this package allow a variety of statistical models to be constructed from standard model formulas.
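As an illustration of the kind of "usually automated" calculation the book has readers perform by hand, here is a minimal grid approximation of a posterior for a binomial proportion. The book itself works in R with the rethinking package; this sketch is a generic stand-in with invented data (6 successes in 9 trials) and a flat prior.

```python
# Grid approximation of a posterior for a binomial proportion,
# done step by step rather than with a packaged sampler.
from math import comb

successes, trials = 6, 9               # toy data, chosen only for illustration
grid = [i / 200 for i in range(201)]   # candidate values of p on a grid

prior = [1.0 for _ in grid]            # flat prior over p
likelihood = [comb(trials, successes) * p**successes * (1 - p)**(trials - successes)
              for p in grid]
unnormalized = [pr * lk for pr, lk in zip(prior, likelihood)]
total = sum(unnormalized)
posterior = [u / total for u in unnormalized]   # normalize so the weights sum to 1

# Posterior mean of p, computed directly from the grid.
posterior_mean = sum(p * w for p, w in zip(grid, posterior))
print(f"posterior mean of p: {posterior_mean:.3f}")
```

Doing the normalization and summation explicitly, rather than calling a fitting function, is exactly the pedagogical point the blurb describes.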