How to Measure Anything: Finding the Value of "Intangibles" in Business


Douglas W. Hubbard - 2007
    "Douglas Hubbard helps us create a path to know the answer to almost any question in business, in science, or in life . . . Hubbard helps us by showing us that when we seek metrics to solve problems, we are really trying to know something better than we know it now. How to Measure Anything provides just the tools most of us need to measure anything better, to gain that insight, to make progress, and to succeed." -Peter Tippett, PhD, M.D., Chief Technology Officer at CyberTrust and inventor of the first antivirus software. "Doug Hubbard has provided an easy-to-read, demystifying explanation of how managers can inform themselves to make less risky, more profitable business decisions. We encourage our clients to try his powerful, practical techniques." -Peter Schay, EVP and COO of The Advisory Council. "As a reader you soon realize that actually everything can be measured, while learning how to measure only what matters. This book cuts through conventional clichés and business rhetoric and offers practical steps to using measurements as a tool for better decision making. Hubbard bridges the gaps to make college statistics relevant and valuable for business decisions." -Ray Gilbert, EVP, Lucent. "This book is remarkable in its range of measurement applications and its clarity of style. A must-read for every professional who has ever exclaimed, 'Sure, that concept is important, but can we measure it?'" -Dr. Jack Stenner, Cofounder and CEO of MetaMetrics, Inc.

Algorithms to Live By: The Computer Science of Human Decisions


Brian Christian - 2016
    What should we do, or leave undone, in a day or a lifetime? How much messiness should we accept? What balance of new activities and familiar favorites is the most fulfilling? These may seem like uniquely human quandaries, but they are not: computers, too, face the same constraints, so computer scientists have been grappling with their version of such issues for decades. And the solutions they've found have much to teach us. In a dazzlingly interdisciplinary work, acclaimed author Brian Christian and cognitive scientist Tom Griffiths show how the algorithms used by computers can also untangle very human questions. They explain how to have better hunches and when to leave things to chance, how to deal with overwhelming choices and how best to connect with others. From finding a spouse to finding a parking spot, from organizing one's inbox to understanding the workings of memory, Algorithms to Live By transforms the wisdom of computer science into strategies for human living.
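
    One of the book's best-known examples is optimal stopping: to maximize the chance of choosing the best of n options seen one at a time, observe roughly the first 37 percent (n/e) without committing, then take the first option better than everything seen so far. A minimal Python simulation of that rule, as a sketch (the candidate count and trial count are illustrative assumptions, not from the book):

        import math
        import random

        def simulate_37_rule(n=100, trials=10_000):
            """Estimate how often the look-then-leap rule picks the single
            best candidate out of n, shown in random order."""
            cutoff = int(n / math.e)  # observe ~37% without choosing
            wins = 0
            for _ in range(trials):
                ranks = random.sample(range(n), n)  # n - 1 is the best candidate
                best_seen = max(ranks[:cutoff], default=-1)
                chosen = ranks[-1]  # forced to take the last if none qualifies
                for r in ranks[cutoff:]:
                    if r > best_seen:  # first candidate beating all observed
                        chosen = r
                        break
                wins += (chosen == n - 1)
            return wins / trials

        print(simulate_37_rule())  # roughly 0.37, the theoretical 1/e success rate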

Time Series Analysis


James Douglas Hamilton - 1994
    This book synthesizes recent advances in time series analysis and makes them accessible to first-year graduate students. James Hamilton provides the first adequate textbook treatments of important innovations such as vector autoregressions, generalized method of moments, the economic and statistical consequences of unit roots, time-varying variances, and nonlinear time series models. In addition, he presents basic tools for analyzing dynamic systems (including linear representations, autocovariance generating functions, spectral analysis, and the Kalman filter) in a way that integrates economic theory with the practical difficulties of analyzing and interpreting real-world data. Time Series Analysis fills an important need for a textbook that integrates economic theory, econometrics, and new results. The book is intended to provide students and researchers with a self-contained survey of time series analysis. It starts from first principles and should be readily accessible to any beginning graduate student, while it is also intended to serve as a reference book for researchers. -- "Journal of Economics"
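
    To give one concrete taste of the tools listed above, the standard first example of an autocovariance generating function (textbook material, not quoted from the book) is the stationary AR(1) process: with $y_t = \phi y_{t-1} + \varepsilon_t$, $|\phi| < 1$, and white-noise variance $\sigma^2$, the autocovariances are

        $$\gamma_j = \frac{\sigma^2 \phi^{|j|}}{1 - \phi^2},$$

    and the autocovariance generating function collects them into a single expression:

        $$g_Y(z) = \sum_{j=-\infty}^{\infty} \gamma_j z^j = \frac{\sigma^2}{(1 - \phi z)(1 - \phi z^{-1})}.$$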

The Man Who Solved the Market: How Jim Simons Launched the Quant Revolution


Gregory Zuckerman - 2019
    No other investor--Warren Buffett, Peter Lynch, Ray Dalio, Steve Cohen, or George Soros--can touch his record. Since 1988, Renaissance's signature Medallion fund has generated average annual returns of 66 percent. The firm has earned profits of more than $100 billion; Simons is worth twenty-three billion dollars. Drawing on unprecedented access to Simons and dozens of current and former employees, Zuckerman, a veteran Wall Street Journal investigative reporter, tells the gripping story of how a world-class mathematician and former code breaker mastered the market. Simons pioneered a data-driven, algorithmic approach that's sweeping the world. As Renaissance became a market force, its executives began influencing the world beyond finance. Simons became a major figure in scientific research, education, and liberal politics. Senior executive Robert Mercer is more responsible than anyone else for the Trump presidency, placing Steve Bannon in the campaign and funding Trump's victorious 2016 effort. Mercer also impacted the campaign behind Brexit. The Man Who Solved the Market is a portrait of a modern-day Midas who remade markets in his own image, but failed to anticipate how his success would impact his firm and his country. It's also a story of what Simons's revolution means for the rest of us.

The Great Mental Models: General Thinking Concepts


Shane Parrish - 2018
    The more tools you have at your disposal, the more likely you'll use the right tool for the job — and get it done right. The same is true when it comes to your thinking. The quality of your outcomes depends on the mental models in your head. And most people are going through life with little more than a hammer. Until now. The Great Mental Models: General Thinking Concepts is the first book in The Great Mental Models series designed to upgrade your thinking with the best, most useful and powerful tools so you always have the right one on hand. This volume details nine of the most versatile, all-purpose mental models you can use right away to improve your decision making, productivity, and how clearly you see the world. You will discover what forces govern the universe and how to focus your efforts so you can harness them to your advantage, rather than fight with them or, worse yet, ignore them. Upgrade your mental toolbox and get the first volume today!

Information Theory, Inference and Learning Algorithms


David J.C. MacKay - 2003
    These topics lie at the heart of many exciting areas of contemporary science and engineering - communication, signal processing, data mining, machine learning, pattern recognition, computational neuroscience, bioinformatics, and cryptography. This textbook introduces theory in tandem with applications. Information theory is taught alongside practical communication systems, such as arithmetic coding for data compression and sparse-graph codes for error-correction. A toolbox of inference techniques, including message-passing algorithms, Monte Carlo methods, and variational approximations, is developed alongside applications of these tools to clustering, convolutional codes, independent component analysis, and neural networks. The final part of the book describes the state of the art in error-correcting codes, including low-density parity-check codes, turbo codes, and digital fountain codes -- the twenty-first century standards for satellite communications, disk drives, and data broadcast. Richly illustrated, filled with worked examples and over 400 exercises, some with detailed solutions, David MacKay's groundbreaking book is ideal for self-learning and for undergraduate or graduate courses. Interludes on crosswords, evolution, and sex provide entertainment along the way. In sum, this is a textbook on information, communication, and coding for a new generation of students, and an unparalleled entry point into these subjects for professionals in areas as diverse as computational biology, financial engineering, and machine learning.
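
    As a small illustration of the information-theory side (standard material, with an example distribution assumed purely for the demo): Shannon entropy sets the limit, in bits per symbol, that lossless compressors such as arithmetic coding approach.

        from math import log2

        def entropy(p):
            """Shannon entropy H = -sum p_i log2 p_i, in bits per symbol;
            no lossless code can do better than this on average."""
            return -sum(pi * log2(pi) for pi in p if pi > 0)

        # A skewed four-symbol source: an optimal code spends fewer bits
        # on likelier symbols, averaging H bits rather than a flat 2 bits.
        p = [0.5, 0.25, 0.125, 0.125]
        print(entropy(p))  # 1.75 bits/symbol vs. 2 for a fixed-length code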

The Visual Display of Quantitative Information


Edward R. Tufte - 1983
    Theory and practice in the design of data graphics, 250 illustrations of the best (and a few of the worst) statistical graphics, with detailed analysis of how to display data for precise, effective, quick analysis. Design of the high-resolution displays, small multiples. Editing and improving graphics. The data-ink ratio. Time-series, relational graphics, data maps, multivariate designs. Detection of graphical deception: design variation vs. data variation. Sources of deception. Aesthetics and data graphical displays. This second edition of The Visual Display of Quantitative Information provides excellent color reproductions of the many graphics of William Playfair, adds color to other images, and includes all the changes and corrections accumulated during 17 printings of the first edition.
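
    One of the ideas named above is compact enough to state outright. Tufte defines the data-ink ratio as

        $$\text{data-ink ratio} = \frac{\text{data-ink}}{\text{total ink used to print the graphic}},$$

    with the accompanying advice to maximize it, within reason, by erasing ink that carries no data.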

Calculus


Michael Spivak - 1967
    His aim is to present calculus as the first real encounter with mathematics: it is the place to learn how logical reasoning combined with fundamental concepts can be developed into a rigorous mathematical theory rather than a bunch of tools and techniques learned by rote. Since analysis is a subject students traditionally find difficult to grasp, Spivak provides leisurely explanations, a profusion of examples, a wide range of exercises and plenty of illustrations in an easy-going approach that enlightens difficult concepts and rewards effort. Calculus will continue to be regarded as a modern classic, ideal for honours students and mathematics majors who seek an alternative to doorstop textbooks on calculus and the more formidable introductions to real analysis.

Computer Age Statistical Inference: Algorithms, Evidence, and Data Science


Bradley Efron - 2016
    'Big data', 'data science', and 'machine learning' have become familiar terms in the news, as statistical methods are brought to bear upon the enormous data sets of modern science and commerce. How did we get here? And where are we going? This book takes us on an exhilarating journey through the revolution in data analysis following the introduction of electronic computation in the 1950s. Beginning with classical inferential theories - Bayesian, frequentist, Fisherian - individual chapters take up a series of influential topics: survival analysis, logistic regression, empirical Bayes, the jackknife and bootstrap, random forests, neural networks, Markov chain Monte Carlo, inference after model selection, and dozens more. The distinctly modern approach integrates methodology and algorithms with statistical inference. The book ends with speculation on the future direction of statistics and data science.
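
    Of the topics in that list, the bootstrap is especially easy to demonstrate: resample the data with replacement, recompute the statistic each time, and read its variability off the replicates. A minimal Python sketch (the data values and resample count are illustrative assumptions):

        import random
        import statistics

        def bootstrap_se(data, stat=statistics.mean, n_boot=2000, seed=0):
            """Estimate the standard error of `stat` by recomputing it on
            n_boot resamples of the data, drawn with replacement."""
            rng = random.Random(seed)
            replicates = [
                stat(rng.choices(data, k=len(data)))  # one bootstrap resample
                for _ in range(n_boot)
            ]
            return statistics.stdev(replicates)

        data = [2.1, 3.4, 1.9, 5.0, 4.2, 3.3, 2.8]
        print(bootstrap_se(data))  # bootstrap standard error of the sample mean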

The Art of Doing Science and Engineering: Learning to Learn


Richard Hamming - 1996
    By presenting actual experiences and analyzing them as they are described, the author conveys the developmental thought processes employed and shows that a style of thinking that leads to successful results can be learned. Along with spectacular successes, the author also conveys how failures contributed to shaping the thought processes. The book equips the reader with a style of thinking that will enhance a person's ability to function as a problem-solver of complex technical issues. It consists of a collection of stories about the author's participation in significant discoveries, relating how those discoveries came about and, most importantly, providing analysis of the thought processes and reasoning that took place as the author and his associates progressed through engineering problems.

Thinking in Systems: A Primer


Donella H. Meadows - 2008
    Edited by the Sustainability Institute’s Diana Wright, this essential primer brings systems thinking out of the realm of computers and equations and into the tangible world, showing readers how to develop the systems-thinking skills that thought leaders across the globe consider critical for 21st-century life. Some of the biggest problems facing the world—war, hunger, poverty, and environmental degradation—are essentially system failures. They cannot be solved by fixing one piece in isolation from the others, because even seemingly minor details have enormous power to undermine the best efforts of too-narrow thinking. While readers will learn the conceptual tools and methods of systems thinking, the heart of the book is grander than methodology. Donella Meadows was known as much for nurturing positive outcomes as she was for delving into the science behind global dilemmas. She reminds readers to pay attention to what is important, not just what is quantifiable, to stay humble, and to stay a learner. In a world growing ever more complicated, crowded, and interdependent, Thinking in Systems helps readers avoid confusion and helplessness, the first step toward finding proactive and effective solutions.

Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy


Cathy O'Neil - 2016
    Increasingly, the decisions that affect our lives--where we go to school, whether we can get a job or a loan, how much we pay for health insurance--are being made not by humans, but by machines. In theory, this should lead to greater fairness: Everyone is judged according to the same rules. But as mathematician and data scientist Cathy O'Neil reveals, the mathematical models being used today are unregulated and uncontestable, even when they're wrong. Most troubling, they reinforce discrimination--propping up the lucky, punishing the downtrodden, and undermining our democracy in the process.

The Strategy of Conflict


Thomas C. Schelling - 1960
    The book proposes enlightening similarities between, for instance, maneuvering in limited war and in a traffic jam; deterring the Russians and one's own children; the modern strategy of terror and the ancient institution of hostages.

Noise: A Flaw in Human Judgment


Daniel Kahneman - 2021
    Suppose that different food inspectors give different ratings to indistinguishable restaurants — or that when a company is handling customer complaints, the resolution depends on who happens to be handling the particular complaint. Now imagine that the same doctor, the same judge, the same inspector, or the same company official makes different decisions, depending on whether it is morning or afternoon, or Monday rather than Wednesday. These are examples of noise: variability in judgments that should be identical. In Noise, Daniel Kahneman, Cass R. Sunstein, and Olivier Sibony show how noise contributes significantly to errors in all fields, including medicine, law, economic forecasting, police behavior, food safety, bail, security checks at airports, strategy, and personnel selection. And although noise can be found wherever people make judgments and decisions, individuals and organizations alike are commonly oblivious to the role of chance in their judgments and in their actions. Drawing on the latest findings in psychology and behavioral economics, and the same kind of diligent, insightful research that made Thinking, Fast and Slow and Nudge groundbreaking New York Times bestsellers, Noise explains how and why humans are so susceptible to noise in judgment — and what we can do about it.
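
    The book's core distinction can be written as a formula: for repeated judgments of the same true value, mean squared error splits into bias squared (the shared directional miss) plus noise (the variance across judgments). A minimal Python sketch with made-up judgment data, purely as an illustrative assumption:

        import statistics

        def error_decomposition(judgments, truth):
            """Verify MSE = bias^2 + noise for a set of judgments:
            bias is the average miss, noise is the variance of the judgments."""
            bias = statistics.fmean(judgments) - truth
            noise = statistics.pvariance(judgments)  # population variance
            mse = statistics.fmean([(j - truth) ** 2 for j in judgments])
            return mse, bias ** 2, noise

        # Five underwriters quote the same risk; suppose the fair value is 100.
        mse, bias_sq, noise = error_decomposition([92, 97, 104, 88, 99], truth=100)
        print(mse, bias_sq + noise)  # both ≈ 46.8: the identity holds (up to float rounding)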

The General Theory of Employment, Interest, and Money


John Maynard Keynes - 1936
    In his most important work, The General Theory of Employment, Interest, and Money (1936), Keynes critiqued the laissez-faire policies of his day, particularly the proposition that a normally functioning market economy would bring full employment. Keynes's forward-looking work transformed economics from merely a descriptive and analytic discipline into one that is policy oriented. For Keynes, enlightened government intervention in a nation's economic life was essential to curbing what he saw as the inherent inequalities and instabilities of unregulated capitalism.