Probability, Random Variables and Stochastic Processes with Errata Sheet


Athanasios Papoulis - 2001
    Co-authored with S. Unnikrishna Pillai of Polytechnic University, the book is intended for a senior/graduate-level course in probability and is aimed at students in electrical engineering, math, and physics departments. The authors' approach is to develop probability theory and stochastic processes as a deductive discipline and to illustrate the theory with basic applications of engineering interest. Approximately one third of the text is new material that maintains the style and spirit of previous editions. To bridge the gap between concepts and applications, a number of additional examples have been added for further clarity, along with several new topics.

A Guide To Econometrics


Peter E. Kennedy - 1979
    This overview helps students make sense of what instructors are doing when they produce proofs, theorems, and formulas.

Information: The New Language of Science


Hans Christian Von Baeyer - 2003
    In this indispensable volume, a primer for the information age, Hans Christian von Baeyer presents a clear description of what information is, how concepts of its measurement, meaning, and transmission evolved, and what its ever-expanding presence portends for the future. Information is poised to replace matter as the primary stuff of the universe, von Baeyer suggests; it will provide a new basic framework for describing and predicting reality in the twenty-first century. Despite its revolutionary premise, von Baeyer's book is written in a simple, straightforward fashion, offering a wonderfully accessible introduction to classical and quantum information. Enlivened with anecdotes from the lives of philosophers, mathematicians, and scientists who have contributed significantly to the field, Information conducts readers from questions of subjectivity inherent in classical information to the blurring of distinctions between computers and what they measure or store in our quantum age. A great advance in our efforts to define and describe the nature of information, the book also marks an important step forward in our ability to exploit information--and, ultimately, to transform the nature of our relationship with the physical universe.

I Think, Therefore I Laugh: The Flip Side of Philosophy


John Allen Paulos - 1985
    Paulos uses jokes, stories, parables, and anecdotes to elucidate difficult concepts, in this case, some of the fundamental problems in modern philosophy.

The Mind Doesn't Work That Way: The Scope and Limits of Computational Psychology


Jerry A. Fodor - 2000
    Although Fodor has praised the computational theory of mind as the best theory of cognition that we have got, he considers it to be only a fragment of the truth. In fact, he claims, cognitive scientists do not really know much yet about how the mind works (the book's title refers to Steven Pinker's How the Mind Works). Fodor's primary aim is to explore the relationship among computational and modular theories of mind, nativism, and evolutionary psychology. Along the way, he explains how Chomsky's version of nativism differs from that of the widely received New Synthesis approach. He concludes that although we have no grounds to suppose that most of the mind is modular, we have no idea how nonmodular cognition could work. Thus, according to Fodor, cognitive science has hardly gotten started.

Models.Behaving.Badly.: Why Confusing Illusion with Reality Can Lead to Disaster, on Wall Street and in Life


Emanuel Derman - 2011
    The reliance traders put on such quantitative analysis was catastrophic for the economy, setting off the series of financial crises that began to erupt in 2007 with the mortgage crisis and from which we're still recovering. Here Derman looks at why people--bankers in particular--still put so much faith in these models, and why it's a terrible mistake to do so. Though financial models imitate the style of physics by using the language of mathematics, ultimately they deal with human beings. This similarity obscures the fundamental difference between the aims and possible achievements of the physics world and those of the financial world. When we make a model involving human beings, we are trying to force the ugly stepsister's foot into Cinderella's pretty glass slipper. It doesn't fit without cutting off some of the essential parts. Physicists and economists have been too enthusiastic to recognize the limits of their equations in the sphere of human behavior--which of course is what economics is all about. Models.Behaving.Badly. includes a personal account of Derman's childhood encounter with failed models--the utopia of the kibbutz--along with his experience as a physicist on Wall Street and a look at the models quants generated: the benefits they brought and the problems they caused. Derman takes a close look at what a model is, and then he highlights the differences between the success of modeling in physics and its relative failure in economics. Describing the collapse of the subprime mortgage CDO market in 2007, Derman urges us to stop relying on these models where possible, and offers suggestions for mending them where they might still do some good. This is a fascinating, lyrical, and very human look behind the curtain at the intersection between mathematics and human nature.

Machine Learning: A Probabilistic Perspective


Kevin P. Murphy - 2012
    Machine learning develops methods that can automatically detect patterns in data and then use the uncovered patterns to predict future data. This textbook offers a comprehensive and self-contained introduction to the field of machine learning, based on a unified, probabilistic approach. The coverage combines breadth and depth, offering necessary background material on such topics as probability, optimization, and linear algebra as well as discussion of recent developments in the field, including conditional random fields, L1 regularization, and deep learning. The book is written in an informal, accessible style, complete with pseudo-code for the most important algorithms. All topics are copiously illustrated with color images and worked examples drawn from such application domains as biology, text processing, computer vision, and robotics. Rather than providing a cookbook of different heuristic methods, the book stresses a principled model-based approach, often using the language of graphical models to specify models in a concise and intuitive way. Almost all the models described have been implemented in a MATLAB software package—PMTK (probabilistic modeling toolkit)—that is freely available online. The book is suitable for upper-level undergraduates with an introductory-level college math background and beginning graduate students.

Probability Theory: The Logic of Science


E.T. Jaynes - 1999
    The book discusses new results, along with applications of probability theory to a variety of problems. It contains many exercises and is suitable for use as a textbook in graduate-level courses involving data analysis. Aimed at readers already familiar with applied mathematics at an advanced undergraduate level or higher, it will interest scientists concerned with inference from incomplete information.

A History of the Mind: Evolution and the Birth of Consciousness


Nicholas Humphrey - 1992
    From the "phantom pain" experienced by people who have lost their limbs to the uncanny faculty of "blindsight," Humphrey argues that raw sensations are central to all conscious states and that consciousness, like all other mental faculties, must have evolved over time from our ancestors' bodily responses to pain and pleasure.

The Elements of Statistical Learning: Data Mining, Inference, and Prediction


Trevor Hastie - 2001
    Recent years have brought vast amounts of data in a variety of fields such as medicine, biology, finance, and marketing. The challenge of understanding these data has led to the development of new tools in the field of statistics, and spawned new areas such as data mining, machine learning, and bioinformatics. Many of these tools have common underpinnings but are often expressed with different terminology. This book describes the important ideas in these areas in a common conceptual framework. While the approach is statistical, the emphasis is on concepts rather than mathematics. Many examples are given, with a liberal use of color graphics. It should be a valuable resource for statisticians and anyone interested in data mining in science or industry. The book's coverage is broad, from supervised learning (prediction) to unsupervised learning. The many topics include neural networks, support vector machines, classification trees, and boosting—the first comprehensive treatment of this topic in any book. Trevor Hastie, Robert Tibshirani, and Jerome Friedman are professors of statistics at Stanford University. They are prominent researchers in this area: Hastie and Tibshirani developed generalized additive models and wrote a popular book of that title. Hastie wrote much of the statistical modeling software in S-PLUS and invented principal curves and surfaces. Tibshirani proposed the Lasso and is co-author of the very successful An Introduction to the Bootstrap. Friedman is the co-inventor of many data-mining tools including CART, MARS, and projection pursuit.

Bayesian Data Analysis


Andrew Gelman - 1995
    The book's world-class authors provide guidance on all aspects of Bayesian data analysis and include examples of real statistical analyses, based on their own research, that demonstrate how to solve complicated problems. Changes in the new edition include: a stronger focus on MCMC; revision of the computational advice in Part III; new chapters on nonlinear models and decision analysis; several additional applied examples from the authors' recent research; additional chapters on current models for Bayesian data analysis, such as generalized linear mixed models; and a reorganization of chapters 6 and 7 on model checking and data collection. Bayesian computation is currently at a stage where there are many reasonable ways to compute any given posterior distribution. However, the best approach is not always clear ahead of time. Reflecting this, the new edition offers a more pluralistic presentation, giving advice on performing computations from many perspectives while making clear the importance of being aware that there are different ways to implement any given iterative simulation computation. The new approach, additional examples, and updated information make Bayesian Data Analysis an excellent introductory text and a reference that working scientists will use throughout their professional life.

The Secret of Fatima


Peter J. Tanous - 2017
    Father Kevin Thrall leads a quiet and rewarding life, but he is also troubled by his past in combat with an elite military unit. Even after taking his vows, he is as ready to clasp his hands around the grip of a Glock as in prayer. He sometimes wonders who he really is. To the Vatican, however, Father Thrall is uniquely suited for a dangerous mission—one directly tied to the mystery of a 100-year-old prophecy. One hundred years ago, the Blessed Virgin revealed a mysterious prophecy to three Portuguese shepherd children. The three Secrets of Fatima were closely held by the Vatican for decades, until the text of the third and last secret was finally released in 2000. But many believe that the Vatican withheld important parts of the Third Secret, perhaps because its contents were too dangerous to reveal . . . The Secret of Fatima's knife-edge plot unfolds as both a modern-day spy thriller and a spiritual quest, as Father Thrall faces implacable enemies both within and outside of the Church who will test both his unique abilities and his faith. If he fails, the very foundations of the world will be shaken.

An Introduction to Statistical Learning: With Applications in R


Gareth James - 2013
    This book presents some of the most important modeling and prediction techniques, along with relevant applications. Topics include linear regression, classification, resampling methods, shrinkage approaches, tree-based methods, support vector machines, clustering, and more. Color graphics and real-world examples are used to illustrate the methods presented. Since the goal of this textbook is to facilitate the use of these statistical learning techniques by practitioners in science, industry, and other fields, each chapter contains a tutorial on implementing the analyses and methods presented in R, an extremely popular open-source statistical software platform. Two of the authors co-wrote The Elements of Statistical Learning (Hastie, Tibshirani and Friedman, 2nd edition 2009), a popular reference book for statistics and machine learning researchers. An Introduction to Statistical Learning covers many of the same topics, but at a level accessible to a much broader audience. This book is targeted at statisticians and non-statisticians alike who wish to use cutting-edge statistical learning techniques to analyze their data. The text assumes only a previous course in linear regression and no knowledge of matrix algebra.

The New Financial Order: Risk in the 21st Century


Robert J. Shiller - 2003
    In Irrational Exuberance, Shiller cautioned against society's obsession with the stock market; less noted was his admonition that this infatuation distracts us from more durable economic prospects. These lie in the hidden potential of real assets, such as income from our livelihoods and homes. But these "ordinary riches," so fundamental to our well-being, are increasingly exposed to the pervasive risks of a rapidly changing global economy. This compelling and important new book presents a fresh vision for hedging risk and securing our economic future. Shiller describes six fundamental ideas for using modern information technology and advanced financial theory to temper basic risks that have been ignored by risk management institutions--risks to the value of our jobs and our homes, to the vitality of our communities, and to the very stability of national economies. Informed by a comprehensive risk information database, this new financial order would include global markets for trading risks and exploiting myriad new financial opportunities, from inequality insurance to intergenerational social security. Just as developments in insuring risks to life, health, and catastrophe have given us a quality of life unimaginable a century ago, so Shiller's plan for securing crucial assets promises to substantially enrich our condition. Once again providing an enormous service, Shiller gives us a powerful means to convert our ordinary riches into a level of economic security, equity, and growth never before seen. And once again, what Robert Shiller says should be read and heeded by anyone with a stake in the economy.

The Art of Statistics: How to Learn from Data


David Spiegelhalter - 2019
    Statistics are everywhere, as integral to science as they are to business, and in the popular media hundreds of times a day. In this age of big data, a basic grasp of statistical literacy is more important than ever if we want to separate fact from fiction, the ostentatious embellishments from the raw evidence--and even more so if we hope to participate in the future, rather than being simple bystanders. In The Art of Statistics, world-renowned statistician David Spiegelhalter shows readers how to derive knowledge from raw data by focusing on the concepts and connections behind the math. Drawing on real-world examples to introduce complex issues, he shows us how statistics can help us determine the luckiest passenger on the Titanic, whether a notorious serial killer could have been caught earlier, and if screening for ovarian cancer is beneficial. The Art of Statistics not only shows us how mathematicians have used statistical science to solve these problems--it teaches us how we too can think like statisticians. We learn how to clarify our questions, assumptions, and expectations when approaching a problem, and--perhaps even more importantly--we learn how to responsibly interpret the answers we receive. Combining the incomparable insight of an expert with the playful enthusiasm of an aficionado, The Art of Statistics is the definitive guide to stats that every modern person needs.