Probability, Random Variables and Stochastic Processes with Errata Sheet


Athanasios Papoulis - 2001
    This edition was revised with S. Unnikrishna Pillai of Polytechnic University. The book is intended for a senior/graduate-level course in probability and is aimed at students in electrical engineering, math, and physics departments. The authors' approach is to develop the subject of probability theory and stochastic processes as a deductive discipline and to illustrate the theory with basic applications of engineering interest. Approximately one third of the text is new material that maintains the style and spirit of previous editions. To bridge the gap between concepts and applications, a number of additional examples have been added for further clarity, as well as several new topics.

Models.Behaving.Badly.: Why Confusing Illusion with Reality Can Lead to Disaster, on Wall Street and in Life


Emanuel Derman - 2011
    The reliance traders put on such quantitative analysis was catastrophic for the economy, setting off the series of financial crises that began to erupt in 2007 with the mortgage crisis and from which we're still recovering. Here Derman looks at why people--bankers in particular--still put so much faith in these models, and why it's a terrible mistake to do so. Though financial models imitate the style of physics by using the language of mathematics, ultimately they deal with human beings. This superficial similarity obscures the fundamental difference between the aims and possible achievements of the physics world and those of the financial world. When we make a model involving human beings, we are trying to force the ugly stepsister's foot into Cinderella's pretty glass slipper. It doesn't fit without cutting off some of the essential parts. Physicists and economists have been too enthusiastic to recognize the limits of their equations in the sphere of human behavior--which, of course, is what economics is all about. Models.Behaving.Badly. includes a personal account of Derman's childhood encounter with failed models--the utopia of the kibbutz--his experience as a physicist on Wall Street, and a look at the models quants generated: the benefits they brought and the problems they caused. Derman takes a close look at what a model is, and then highlights the differences between the success of modeling in physics and its relative failure in economics. Describing the collapse of the subprime mortgage CDO market in 2007, Derman urges us to stop relying on these models where possible, and offers suggestions for mending them where they might still do some good. This is a fascinating, lyrical, and very human look behind the curtain at the intersection of mathematics and human nature.

Why Stock Markets Crash: Critical Events in Complex Financial Systems


Didier Sornette - 2002
    In this book, Didier Sornette boldly applies his varied experience in these areas to propose a simple, powerful, and general theory of how, why, and when stock markets crash. Most attempts to explain market failures seek to pinpoint triggering mechanisms that occur hours, days, or weeks before the collapse. Sornette proposes a radically different view: the underlying cause can be sought months and even years before the abrupt, catastrophic event in the build-up of cooperative speculation, which often translates into an accelerating rise of the market price, otherwise known as a "bubble." Anchoring his sophisticated, step-by-step analysis in leading-edge physical and statistical modeling techniques, he unearths remarkable insights and some predictions--among them, that the "end of the growth era" will occur around 2050. Sornette probes major historical precedents, from the decades-long "tulip mania" in the Netherlands that wilted suddenly in 1637 to the South Sea Bubble that ended with the first huge market crash in England in 1720, to the Great Crash of October 1929 and Black Monday in 1987, to cite just a few. He concludes that most explanations other than cooperative self-organization fail to account for the subtle bubbles by which the markets lay the groundwork for catastrophe. Any investor or investment professional who seeks a genuine understanding of looming financial disasters should read this book. Physicists, geologists, biologists, economists, and others will welcome "Why Stock Markets Crash" as a highly original "scientific tale," as Sornette aptly puts it, of the exciting and sometimes fearsome--but no longer quite so unfathomable--world of stock markets.

Introductory Econometrics: A Modern Approach


Jeffrey M. Wooldridge - 1999
    This text bridges the gap between the mechanics of econometrics and modern applications of econometrics by employing a systematic approach motivated by the major problems facing applied researchers today. Throughout the text, the emphasis on examples gives a concrete reality to economic relationships and allows treatment of interesting policy questions in a realistic and accessible framework.

The New Financial Order: Risk in the 21st Century


Robert J. Shiller - 2003
    Less noted was Shiller's admonition that our infatuation with the stock market distracts us from more durable economic prospects. These lie in the hidden potential of real assets, such as income from our livelihoods and homes. But these "ordinary riches," so fundamental to our well-being, are increasingly exposed to the pervasive risks of a rapidly changing global economy. This compelling and important new book presents a fresh vision for hedging risk and securing our economic future. Shiller describes six fundamental ideas for using modern information technology and advanced financial theory to temper basic risks that have been ignored by risk management institutions--risks to the value of our jobs and our homes, to the vitality of our communities, and to the very stability of national economies. Informed by a comprehensive risk information database, this new financial order would include global markets for trading risks and exploiting myriad new financial opportunities, from inequality insurance to intergenerational social security. Just as developments in insuring risks to life, health, and catastrophe have given us a quality of life unimaginable a century ago, so Shiller's plan for securing crucial assets promises to substantially enrich our condition. Once again providing an enormous service, Shiller gives us a powerful means to convert our ordinary riches into a level of economic security, equity, and growth never before seen. And once again, what Robert Shiller says should be read and heeded by anyone with a stake in the economy.

Time Series Analysis


James Douglas Hamilton - 1994
    This book synthesizes these recent advances and makes them accessible to first-year graduate students. James Hamilton provides the first adequate textbook treatments of important innovations such as vector autoregressions, generalized method of moments, the economic and statistical consequences of unit roots, time-varying variances, and nonlinear time series models. In addition, he presents basic tools for analyzing dynamic systems (including linear representations, autocovariance generating functions, spectral analysis, and the Kalman filter) in a way that integrates economic theory with the practical difficulties of analyzing and interpreting real-world data. Time Series Analysis fills an important need for a textbook that integrates economic theory, econometrics, and new results. The book is intended to provide students and researchers with a self-contained survey of time series analysis. It starts from first principles and should be readily accessible to any beginning graduate student, while it is also intended to serve as a reference book for researchers. -- "Journal of Economics"

Econometric Analysis


William H. Greene - 1990
    This title is aimed at courses in applied econometrics, political methodology, and sociological methods or a one-year graduate course in econometrics for social scientists.

Mostly Harmless Econometrics: An Empiricist's Companion


Joshua D. Angrist - 2008
    In the modern experimentalist paradigm, these techniques address clear causal questions such as: Do smaller classes increase learning? Should wife batterers be arrested? How much does education raise wages? Mostly Harmless Econometrics shows how the basic tools of applied econometrics allow the data to speak. In addition to econometric essentials, Mostly Harmless Econometrics covers important new extensions--regression-discontinuity designs and quantile regression--as well as how to get standard errors right. Joshua Angrist and Jörn-Steffen Pischke explain why fancier econometric techniques are typically unnecessary and even dangerous. The applied econometric methods emphasized in this book are easy to use and relevant for many areas of contemporary social science.
    - An irreverent review of econometric essentials
    - A focus on the tools that applied researchers use most
    - Chapters on regression-discontinuity designs, quantile regression, and standard errors
    - Many empirical examples
    - A clear and concise resource with wide applications

The Elements of Statistical Learning: Data Mining, Inference, and Prediction


Trevor Hastie - 2001
    With the explosion in computation and information technology have come vast amounts of data in a variety of fields such as medicine, biology, finance, and marketing. The challenge of understanding these data has led to the development of new tools in the field of statistics, and spawned new areas such as data mining, machine learning, and bioinformatics. Many of these tools have common underpinnings but are often expressed with different terminology. This book describes the important ideas in these areas in a common conceptual framework. While the approach is statistical, the emphasis is on concepts rather than mathematics. Many examples are given, with a liberal use of color graphics. It should be a valuable resource for statisticians and anyone interested in data mining in science or industry. The book's coverage is broad, from supervised learning (prediction) to unsupervised learning. The many topics include neural networks, support vector machines, classification trees and boosting—the first comprehensive treatment of this topic in any book. Trevor Hastie, Robert Tibshirani, and Jerome Friedman are professors of statistics at Stanford University. They are prominent researchers in this area: Hastie and Tibshirani developed generalized additive models and wrote a popular book of that title. Hastie wrote much of the statistical modeling software in S-PLUS and invented principal curves and surfaces. Tibshirani proposed the Lasso and is co-author of the very successful An Introduction to the Bootstrap. Friedman is the co-inventor of many data-mining tools including CART, MARS, and projection pursuit.

Bull by the Horns: Fighting to Save Main Street from Wall Street and Wall Street from Itself


Sheila Bair - 2012

Statistics for People Who (Think They) Hate Statistics


Neil J. Salkind - 2000
    The book begins with an introduction to the language of statistics and then covers descriptive statistics and inferential statistics. Throughout, the author offers readers:
    - A Difficulty Rating Index for each chapter's material
    - Tips for doing and thinking about a statistical technique
    - Top tens for everything from the best ways to create a graph to the most effective techniques for data collection
    - Steps that break techniques down into a clear sequence of procedures
    - SPSS tips for executing each major statistical technique
    - Practice exercises at the end of each chapter, followed by worked-out solutions
    The book concludes with a statistical software sampler and a description of the best Internet sites for statistical information and data resources. Readers also have access to a website for downloading data that they can use to practice additional exercises from the book. Students and researchers will appreciate the book's unhurried pace and thorough, friendly presentation.

Naked Statistics: Stripping the Dread from the Data


Charles Wheelan - 2012
    How can we catch schools that cheat on standardized tests? How does Netflix know which movies you’ll like? What is causing the rising incidence of autism? As best-selling author Charles Wheelan shows us in Naked Statistics, the right data and a few well-chosen statistical tools can help us answer these questions and more. For those who slept through Stats 101, this book is a lifesaver. Wheelan strips away the arcane and technical details and focuses on the underlying intuition that drives statistical analysis. He clarifies key concepts such as inference, correlation, and regression analysis, reveals how biased or careless parties can manipulate or misrepresent data, and shows us how brilliant and creative researchers are exploiting the valuable data from natural experiments to tackle thorny questions. And in Wheelan’s trademark style, there’s not a dull page in sight. You’ll encounter clever Schlitz Beer marketers leveraging basic probability, an International Sausage Festival illuminating the tenets of the central limit theorem, and a head-scratching choice from the famous game show Let’s Make a Deal—and you’ll come away with insights each time. With the wit, accessibility, and sheer fun that turned Naked Economics into a bestseller, Wheelan defies the odds yet again by bringing another essential, formerly unglamorous discipline to life.

Statistical Consequences of Fat Tails: Real World Preasymptotics, Epistemology, and Applications


Nassim Nicholas Taleb - 2020
    Switching from thin-tailed to fat-tailed distributions requires more than "changing the color of the dress." Traditional asymptotics deal mainly with either n=1 or n=∞, and the real world is in between, under the "laws of the medium numbers"--which vary widely across specific distributions. Both the law of large numbers and the generalized central limit mechanisms operate in highly idiosyncratic ways outside the standard Gaussian or Lévy-stable basins of convergence. A few examples:
    - The sample mean is rarely in line with the population mean, with consequences for "naïve empiricism," but can sometimes be estimated via parametric methods.
    - The "empirical distribution" is rarely empirical.
    - Parameter uncertainty has compounding effects on statistical metrics.
    - Dimension reduction (principal components) fails.
    - Inequality estimators (Gini or quantile contributions) are not additive and produce wrong results.
    - Many "biases" found in psychology become entirely rational under more sophisticated probability distributions.
    - Most of the failures of financial economics, econometrics, and behavioral economics can be attributed to using the wrong distributions.
    This book, the first volume of the Technical Incerto, weaves a narrative around published journal articles.

Discovering Statistics Using SPSS (Introducing Statistical Methods)


Andy Field - 2000
    What's new in the Second Edition?
    1. Fully compliant with the latest version of SPSS (version 12).
    2. More coverage of advanced statistics, including completely new coverage of non-parametric statistics. The book is 50 per cent longer than the First Edition.
    3. Each section of each chapter now carries a notation--1, 2, or 3--referring to the intended level of study. This helps students navigate their way through the book and makes it user-friendly for students of all levels.
    4. A 'how to use this book' section at the start of the text.
    5. Characters in each chapter have defined roles, such as summarizing key points or posing questions.
    6. Each chapter now has several examples for students to work through. Answers are provided on the enclosed CD-ROM.

I Think, Therefore I Laugh: The Flip Side of Philosophy


John Allen Paulos - 1985
    Paulos uses jokes, stories, parables, and anecdotes to elucidate difficult concepts, in this case, some of the fundamental problems in modern philosophy.