Causality: Models, Reasoning, and Inference


Judea Pearl - 2000
    This book shows how causality has grown from a nebulous concept into a mathematical theory with significant applications in the fields of statistics, artificial intelligence, philosophy, cognitive science, and the health and social sciences. Pearl presents a unified account of the probabilistic, manipulative, counterfactual and structural approaches to causation, and devises simple mathematical tools for analyzing the relationships between causal connections, statistical associations, actions and observations. The book will open the way for including causal analysis in the standard curriculum of statistics, artificial intelligence, business, epidemiology, social science and economics. Students in these areas will find natural models, simple identification procedures, and precise mathematical definitions of causal concepts that traditional texts have tended to evade or make unduly complicated. This book will be of interest to professionals and students in a wide variety of fields. Anyone who wishes to elucidate meaningful relationships from data, predict effects of actions and policies, assess explanations of reported events, or form theories of causal understanding and causal speech will find this book stimulating and invaluable. Judea Pearl, Professor of Computer Science at UCLA, is the winner of the 2008 Benjamin Franklin Award in Computers and Cognitive Science.

The Elements of Statistical Learning: Data Mining, Inference, and Prediction


Trevor Hastie - 2001
    The growth of computation and information technology has brought with it vast amounts of data in a variety of fields such as medicine, biology, finance, and marketing. The challenge of understanding these data has led to the development of new tools in the field of statistics, and spawned new areas such as data mining, machine learning, and bioinformatics. Many of these tools have common underpinnings but are often expressed with different terminology. This book describes the important ideas in these areas in a common conceptual framework. While the approach is statistical, the emphasis is on concepts rather than mathematics. Many examples are given, with a liberal use of color graphics. It should be a valuable resource for statisticians and anyone interested in data mining in science or industry. The book's coverage is broad, from supervised learning (prediction) to unsupervised learning. The many topics include neural networks, support vector machines, classification trees, and boosting (the first comprehensive treatment of this topic in any book). Trevor Hastie, Robert Tibshirani, and Jerome Friedman are professors of statistics at Stanford University. They are prominent researchers in this area: Hastie and Tibshirani developed generalized additive models and wrote a popular book of that title. Hastie wrote much of the statistical modeling software in S-PLUS and invented principal curves and surfaces. Tibshirani proposed the Lasso and is co-author of the very successful An Introduction to the Bootstrap. Friedman is the co-inventor of many data-mining tools including CART, MARS, and projection pursuit.

Introduction to Probability Models


Sheldon M. Ross - 1972
    This updated edition of Ross's classic bestseller provides an introduction to elementary probability theory and stochastic processes, and shows how probability theory can be applied to the study of phenomena in fields such as engineering, computer science, management science, the physical and social sciences, and operations research. With the addition of several new sections relating to actuaries, this text is highly recommended by the Society of Actuaries. This book now contains a new section on compound random variables that can be used to establish a recursive formula for computing probability mass functions for a variety of common compounding distributions; a new section on hidden Markov chains, including the forward and backward approaches for computing the joint probability mass function of the signals, as well as the Viterbi algorithm for determining the most likely sequence of states; and a simplified approach for analyzing nonhomogeneous Poisson processes. There are also additional results on queues relating to the conditional distribution of the number found by an M/M/1 arrival who spends a time t in the system; the inspection paradox for M/M/1 queues; and the M/G/1 queue with server breakdown. Furthermore, the book includes new examples and exercises, along with compulsory material for the new Exam 3 of the Society of Actuaries. This book is essential reading for professionals and students in actuarial science, engineering, operations research, and other fields in applied probability.
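
    For readers who want a concrete feel for the hidden-Markov-chain material, here is a minimal Python sketch of the Viterbi recursion for recovering the most likely sequence of hidden states from observed signals. The two-state "rain"/"dry" model, its transition matrix, and its emission probabilities are illustrative assumptions of mine, not numbers from the book.

        # Minimal Viterbi sketch for a two-state hidden Markov chain.
        # All probabilities below are illustrative assumptions, not from the text.

        def viterbi(obs, states, start_p, trans_p, emit_p):
            # best[t][s] = highest probability of any state path ending in state s at time t
            best = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
            back = [{}]
            for t in range(1, len(obs)):
                best.append({})
                back.append({})
                for s in states:
                    prob, prev = max(
                        (best[t - 1][r] * trans_p[r][s] * emit_p[s][obs[t]], r)
                        for r in states
                    )
                    best[t][s] = prob
                    back[t][s] = prev
            # Trace the most likely path backwards from the best final state.
            last = max(states, key=lambda s: best[-1][s])
            path = [last]
            for t in range(len(obs) - 1, 0, -1):
                path.insert(0, back[t][path[0]])
            return path

        states = ("rain", "dry")
        obs = ("umbrella", "umbrella", "no-umbrella")
        start_p = {"rain": 0.5, "dry": 0.5}
        trans_p = {"rain": {"rain": 0.7, "dry": 0.3}, "dry": {"rain": 0.3, "dry": 0.7}}
        emit_p = {"rain": {"umbrella": 0.9, "no-umbrella": 0.1},
                  "dry": {"umbrella": 0.2, "no-umbrella": 0.8}}
        print(viterbi(obs, states, start_p, trans_p, emit_p))  # ['rain', 'rain', 'dry']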

Information Theory: A Tutorial Introduction


James V. Stone - 2015
    In this richly illustrated book, accessible examples are used to show how information theory can be understood in terms of everyday games like '20 Questions', and the simple MATLAB programs provided give hands-on experience of information theory in action. Written in a tutorial style, with a comprehensive glossary, this text represents an ideal primer for novices who wish to become familiar with the basic principles of information theory. Download chapter 1 from http://jim-stone.staff.shef.ac.uk/Boo...
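
    The '20 Questions' framing is easy to try out even without the book's MATLAB programs; the short Python sketch below (my own illustration, not code from the text) computes Shannon entropy, the average number of ideal yes/no questions needed to identify one item from a set.

        import math

        def entropy(probabilities):
            # Shannon entropy in bits: the average number of ideal yes/no questions.
            return -sum(p * math.log2(p) for p in probabilities if p > 0)

        # Guessing one of 16 equally likely objects takes log2(16) = 4 questions on average.
        print(entropy([1 / 16] * 16))              # 4.0

        # A skewed distribution needs fewer questions on average.
        print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75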

Conceptual Mathematics: A First Introduction to Categories


F. William Lawvere - 1997
    Written by two of the best-known names in categorical logic, Conceptual Mathematics is the first book to apply categories to the most elementary mathematics. It thus serves two purposes: first, to provide a key to mathematics for the general reader or beginning student; and second, to furnish an easy introduction to categories for computer scientists, logicians, physicists, and linguists who want to gain some familiarity with the categorical method without initially committing themselves to extended study.

Who Is Fourier? a Mathematical Adventure


Transnational College of Lex - 1995
    This is done in a way that is not only easy to understand, but is actually fun! Professors and engineers, with high school and college students following closely, comprise the largest percentage of our readers. It is a must-have for anyone interested in music, mathematics, physics, engineering, or complex science. Dr. Yoichiro Nambu, 2008 Nobel Prize Winner in Physics, served as a senior adviser to the English version of Who is Fourier? A Mathematical Adventure.

The Math Book: From Pythagoras to the 57th Dimension, 250 Milestones in the History of Mathematics


Clifford A. Pickover - 2009
    Beginning millions of years ago with ancient “ant odometers” and moving through time to our modern-day quest for new dimensions, it covers 250 milestones in mathematical history. Among the numerous delights readers will learn about as they dip into this inviting anthology: cicada-generated prime numbers, magic squares from centuries ago, the discovery of pi and calculus, and the butterfly effect. Each topic gets a lavishly illustrated spread with stunning color art, along with formulas and concepts, fascinating facts about scientists’ lives, and real-world applications of the theorems.

A First Course in String Theory


Barton Zwiebach - 2004
    The first part deals with basic ideas, reviewing special relativity and electromagnetism while introducing the concept of extra dimensions. D-branes and the classical dynamics of relativistic strings are discussed next, followed by the quantization of open and closed bosonic strings in the light-cone gauge and a brief introduction to superstrings. The second part begins with a detailed study of D-branes followed by string thermodynamics. It discusses possible physical applications, and covers T-duality of open and closed strings, electromagnetic fields on D-branes, Born-Infeld electrodynamics, covariant string quantization and string interactions. Primarily aimed as a textbook for advanced undergraduate and beginning graduate courses, it will also be ideal for a wide range of scientists and mathematicians who are curious about string theory.

Mostly Harmless Econometrics: An Empiricist's Companion


Joshua D. Angrist - 2008
    In the modern experimentalist paradigm, these techniques address clear causal questions such as: Do smaller classes increase learning? Should wife batterers be arrested? How much does education raise wages? Mostly Harmless Econometrics shows how the basic tools of applied econometrics allow the data to speak. In addition to econometric essentials, Mostly Harmless Econometrics covers important new extensions (regression-discontinuity designs and quantile regression) as well as how to get standard errors right. Joshua Angrist and Jörn-Steffen Pischke explain why fancier econometric techniques are typically unnecessary and even dangerous. The applied econometric methods emphasized in this book are easy to use and relevant for many areas of contemporary social science. Features include: an irreverent review of econometric essentials; a focus on the tools that applied researchers use most; chapters on regression-discontinuity designs, quantile regression, and standard errors; many empirical examples; and a clear and concise presentation with wide applications.

The Recursive Universe: Cosmic Complexity and the Limits of Scientific Knowledge


William Poundstone - 1984
    Topics include the limits of knowledge, the paradox of complexity, Maxwell's demon, the Big Bang theory, and much more. 1985 edition.

Structure and Interpretation of Computer Programs


Harold Abelson - 1984
    This long-awaited revision contains changes throughout the text. There are new implementations of most of the major programming systems in the book, including the interpreters and compilers, and the authors have incorporated many small changes that reflect their experience teaching the course at MIT since the first edition was published. A new theme has been introduced that emphasizes the central role played by different approaches to dealing with time in computational models: objects with state, concurrent programming, functional programming and lazy evaluation, and nondeterministic programming. There are new example sections on higher-order procedures in graphics and on applications of stream processing in numerical programming, and many new exercises. In addition, all the programs have been reworked to run in any Scheme implementation that adheres to the IEEE standard.

Network Science


Albert-László Barabási
    

Schaum's Outline of Calculus


Frank Ayres Jr. - 1990
    They'll also find the related analytic geometry much easier. The clear review of algebra and geometry in this edition will make calculus easier for students who wish to strengthen their knowledge in these areas. Updated to meet the emphasis in current courses, this new edition of a popular guide (more than 104,000 copies of the prior edition were sold) includes problems and examples using graphing calculators.

Algebra - The Very Basics


Metin Bektas - 2014
    This book picks you up at the very beginning and guides you through the foundations of algebra using lots of examples and no-nonsense explanations. Each chapter contains well-chosen exercises as well as all the solutions. No prior knowledge is required. Topics include: Exponents, Brackets, Linear Equations and Quadratic Equations. For a more detailed table of contents, use the "Look Inside" feature. From the author of "Great Formulas Explained" and "Physics! In Quantities and Examples".

The Annotated Turing: A Guided Tour Through Alan Turing's Historic Paper on Computability and the Turing Machine


Charles Petzold - 2008
    Mathematician Alan Turing invented an imaginary computer known as the Turing Machine; in an age before computers, he explored the concept of what it meant to be "computable," creating the field of computability theory in the process, a foundation of present-day computer programming. The book expands Turing's original 36-page paper with additional background chapters and extensive annotations; the author elaborates on and clarifies many of Turing's statements, making the original difficult-to-read document accessible to present-day programmers, computer science majors, math geeks, and others. Interwoven into the narrative are the highlights of Turing's own life: his years at Cambridge and Princeton, his secret work in cryptanalysis during World War II, his involvement in seminal computer projects, his speculations about artificial intelligence, his arrest and prosecution for the crime of "gross indecency," and his early death by apparent suicide at the age of 41.
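
    For a concrete feel for the imaginary computer before opening Turing's paper, the Python sketch below simulates a one-tape machine. The rule table is a made-up toy (a unary incrementer), not one of the machine configurations from Turing's paper.

        # Minimal one-tape Turing machine simulator (illustrative only).

        def run(tape, rules, state="start", blank="_", max_steps=1000):
            cells = dict(enumerate(tape))   # sparse tape: position -> symbol
            head = 0
            for _ in range(max_steps):
                if state == "halt":
                    break
                symbol = cells.get(head, blank)
                write, move, state = rules[(state, symbol)]
                cells[head] = write
                head += 1 if move == "R" else -1
            return "".join(cells[i] for i in sorted(cells))

        # Toy rule table: scan right over a block of 1s, write one more 1, then halt.
        rules = {
            ("start", "1"): ("1", "R", "start"),
            ("start", "_"): ("1", "R", "halt"),
        }
        print(run("111", rules))  # 1111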