Book picks similar to
Time Series Analysis by James Douglas Hamilton
economics
econometrics
textbooks
finance
Stochastic Calculus Models for Finance II: Continuous Time Models (Springer Finance)
Steven E. Shreve - 2004
The content of this book has been used successfully with students whose mathematics background consists of calculus and calculus-based probability. The text gives precise statements of results, plausibility arguments, and even some proofs, but more importantly it provides intuitive explanations developed and refined through classroom experience with this material. The book includes a self-contained treatment of the probability theory needed for stochastic calculus, including Brownian motion and its properties. Advanced topics include foreign exchange models, forward measures, and jump-diffusion processes. This book is being published in two volumes. This second volume develops stochastic calculus, martingales, risk-neutral pricing, exotic options, and term structure models, all in continuous time. Masters-level students and researchers in mathematical finance and financial engineering will find this book useful. Steven E. Shreve is Co-Founder of the Carnegie Mellon MS Program in Computational Finance and winner of the Carnegie Mellon Doherty Prize for sustained contributions to education.
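For a taste of the risk-neutral pricing the volume develops, here is a minimal Monte Carlo sketch (illustrative only, not from the text) that prices a European call by discounting the average payoff of simulated terminal prices under geometric Brownian motion; all parameter values are arbitrary.

```python
import numpy as np

def mc_call_price(s0, k, r, sigma, t, n_paths=100_000, seed=0):
    """Risk-neutral Monte Carlo price of a European call.

    Simulates S_T = S_0 * exp((r - sigma^2/2) * t + sigma * sqrt(t) * Z)
    under the risk-neutral measure, then discounts the mean payoff.
    """
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_paths)
    s_t = s0 * np.exp((r - 0.5 * sigma**2) * t + sigma * np.sqrt(t) * z)
    payoff = np.maximum(s_t - k, 0.0)      # call payoff at maturity
    return np.exp(-r * t) * payoff.mean()  # discount back to today

print(mc_call_price(s0=100, k=105, r=0.05, sigma=0.2, t=1.0))  # ~8.0
```

The estimate converges to the Black-Scholes value as n_paths grows, which is the continuous-time result the book derives rigorously.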
Programming Collective Intelligence: Building Smart Web 2.0 Applications
Toby Segaran - 2007
With the sophisticated algorithms in this book, you can write smart programs to access interesting datasets from other web sites, collect data from users of your own applications, and analyze and understand the data once you've found it. Programming Collective Intelligence takes you into the world of machine learning and statistics, and explains how to draw conclusions about user experience, marketing, personal tastes, and human behavior in general -- all from information that you and others collect every day. Each algorithm is described clearly and concisely with code that can immediately be used on your web site, blog, wiki, or specialized application. This book explains:
Collaborative filtering techniques that enable online retailers to recommend products or media
Methods of clustering to detect groups of similar items in a large dataset
Search engine features -- crawlers, indexers, query engines, and the PageRank algorithm
Optimization algorithms that search millions of possible solutions to a problem and choose the best one
Bayesian filtering, used in spam filters for classifying documents based on word types and other features
Using decision trees not only to make predictions, but to model the way decisions are made
Predicting numerical values rather than classifications to build price models
Support vector machines to match people in online dating sites
Non-negative matrix factorization to find the independent features in a dataset
Evolving intelligence for problem solving -- how a computer develops its skill by improving its own code the more it plays a game
Each chapter includes exercises for extending the algorithms to make them more powerful. Go beyond simple database-backed applications and put the wealth of Internet data to work for you.
"Bravo! I cannot think of a better way for a developer to first learn these algorithms and methods, nor can I think of a better way for me (an old AI dog) to reinvigorate my knowledge of the details." -- Dan Russell, Google
"Toby's book does a great job of breaking down the complex subject matter of machine-learning algorithms into practical, easy-to-understand examples that can be directly applied to analysis of social interaction across the Web today. If I had this book two years ago, it would have saved precious time going down some fruitless paths." -- Tim Wolters, CTO, Collective Intellect
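Since the book works in Python, a minimal user-based collaborative-filtering sketch in that spirit may help; the toy ratings, names, and helper functions below are invented for illustration, not taken from the book.

```python
from math import sqrt

# Toy ratings: user -> {item: score}. All names are made up.
ratings = {
    "alice": {"A": 5, "B": 3, "C": 4},
    "bob":   {"A": 4, "B": 2, "C": 5, "D": 4},
    "carol": {"A": 1, "B": 5, "D": 2},
}

def pearson(prefs, u1, u2):
    """Pearson correlation over the items both users have rated."""
    shared = [i for i in prefs[u1] if i in prefs[u2]]
    n = len(shared)
    if n < 2:
        return 0.0
    x = [prefs[u1][i] for i in shared]
    y = [prefs[u2][i] for i in shared]
    sx, sy = sum(x), sum(y)
    sxx = sum(v * v for v in x)
    syy = sum(v * v for v in y)
    sxy = sum(a * b for a, b in zip(x, y))
    num = sxy - sx * sy / n
    den = sqrt((sxx - sx ** 2 / n) * (syy - sy ** 2 / n))
    return num / den if den else 0.0

def recommend(prefs, user):
    """Rank unseen items by a similarity-weighted average of others' ratings."""
    totals, sim_sums = {}, {}
    for other in prefs:
        if other == user:
            continue
        sim = pearson(prefs, user, other)
        if sim <= 0:  # ignore dissimilar users
            continue
        for item, score in prefs[other].items():
            if item not in prefs[user]:
                totals[item] = totals.get(item, 0.0) + score * sim
                sim_sums[item] = sim_sums.get(item, 0.0) + sim
    return sorted(((totals[i] / sim_sums[i], i) for i in totals), reverse=True)

print(recommend(ratings, "alice"))  # [(4.0, 'D')]: bob, who rates like alice, liked D
```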
Against the Gods: The Remarkable Story of Risk
Peter L. Bernstein - 1996
Peter Bernstein has written a comprehensive history of man's efforts to understand risk and probability, beginning with early gamblers in ancient Greece, continuing through the 17th-century French mathematicians Pascal and Fermat and up to modern chaos theory. Along the way he demonstrates that understanding risk underlies everything from game theory to bridge-building to winemaking.
The Fractal Geometry of Nature
Benoît B. Mandelbrot - 1977
The complexity of nature's shapes differs in kind, not merely degree, from that of the shapes of ordinary geometry; describing them calls for a new geometry, the geometry of fractal shapes. Now that the field has expanded greatly with many active researchers, Mandelbrot presents the definitive overview of the origins of his ideas and their new applications. The Fractal Geometry of Nature is based on his highly acclaimed earlier work, but has much broader and deeper coverage and more extensive illustrations.
Information Theory, Inference and Learning Algorithms
David J.C. MacKay - 2002
These topics lie at the heart of many exciting areas of contemporary science and engineering - communication, signal processing, data mining, machine learning, pattern recognition, computational neuroscience, bioinformatics, and cryptography. This textbook introduces theory in tandem with applications. Information theory is taught alongside practical communication systems, such as arithmetic coding for data compression and sparse-graph codes for error correction. A toolbox of inference techniques, including message-passing algorithms, Monte Carlo methods, and variational approximations, is developed alongside applications of these tools to clustering, convolutional codes, independent component analysis, and neural networks. The final part of the book describes the state of the art in error-correcting codes, including low-density parity-check codes, turbo codes, and digital fountain codes -- the twenty-first-century standards for satellite communications, disk drives, and data broadcast. Richly illustrated, filled with worked examples and over 400 exercises, some with detailed solutions, David MacKay's groundbreaking book is ideal for self-learning and for undergraduate or graduate courses. Interludes on crosswords, evolution, and sex provide entertainment along the way. In sum, this is a textbook on information, communication, and coding for a new generation of students, and an unparalleled entry point into these subjects for professionals in areas as diverse as computational biology, financial engineering, and machine learning.
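The compression results the description alludes to all bottom out in one quantity, Shannon entropy, which is easy to compute directly; a minimal sketch with arbitrary example distributions:

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum p_i * log2(p_i), in bits per symbol.

    The source-coding theorem says no lossless code can beat this
    average length, and arithmetic coding approaches it closely.
    """
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

print(entropy([0.5, 0.5]))  # 1.0 bit: a fair coin is incompressible
print(entropy([0.9, 0.1]))  # ~0.47 bits: a biased coin compresses well
```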
The Cult of Statistical Significance: How the Standard Error Costs Us Jobs, Justice, and Lives
Stephen Thomas Ziliak - 2008
“If it takes a book to get it across, I hope this book will do it. It ought to.”—Thomas Schelling, Distinguished University Professor, School of Public Policy, University of Maryland, and 2005 Nobel Prize Laureate in Economics
“With humor, insight, piercing logic and a nod to history, Ziliak and McCloskey show how economists—and other scientists—suffer from a mass delusion about statistical analysis. The quest for statistical significance that pervades science today is a deeply flawed substitute for thoughtful analysis. . . . Yet few participants in the scientific bureaucracy have been willing to admit what Ziliak and McCloskey make clear: the emperor has no clothes.”—Kenneth Rothman, Professor of Epidemiology, Boston University School of Public Health
The Cult of Statistical Significance shows, field by field, how “statistical significance,” a technique that dominates many sciences, has been a huge mistake. The authors find that researchers in a broad spectrum of fields, from agronomy to zoology, employ “testing” that doesn’t test and “estimating” that doesn’t estimate. The facts will startle the outside reader: how could a group of brilliant scientists wander so far from scientific magnitudes? This study will encourage scientists who want to know how to get the statistical sciences back on track and fulfill their quantitative promise. The book shows for the first time how wide the disaster is, and how bad for science, and it traces the problem to its historical, sociological, and philosophical roots.
Stephen T. Ziliak is the author or editor of many articles and two books. He currently lives in Chicago, where he is Professor of Economics at Roosevelt University. Deirdre N. McCloskey, Distinguished Professor of Economics, History, English, and Communication at the University of Illinois at Chicago, is the author of twenty books and three hundred scholarly articles. She has held Guggenheim and National Humanities Fellowships. She is best known for How to Be Human* Though an Economist (University of Michigan Press, 2000) and her most recent book, The Bourgeois Virtues: Ethics for an Age of Commerce (2006).
Machine Learning
Tom M. Mitchell - 1997
Mitchell covers the field of machine learning, the study of algorithms that allow computer programs to automatically improve through experience and that automatically infer general laws from specific data.
The Man Who Solved the Market: How Jim Simons Launched the Quant Revolution
Gregory Zuckerman - 2019
No other investor--Warren Buffett, Peter Lynch, Ray Dalio, Steve Cohen, or George Soros--can touch his record. Since 1988, Renaissance's signature Medallion fund has generated average annual returns of 66 percent. The firm has earned profits of more than $100 billion; Simons is worth $23 billion. Drawing on unprecedented access to Simons and dozens of current and former employees, Zuckerman, a veteran Wall Street Journal investigative reporter, tells the gripping story of how a world-class mathematician and former code breaker mastered the market. Simons pioneered a data-driven, algorithmic approach that's sweeping the world. As Renaissance became a market force, its executives began influencing the world beyond finance. Simons became a major figure in scientific research, education, and liberal politics. Senior executive Robert Mercer is more responsible than anyone else for the Trump presidency, placing Steve Bannon in the campaign and funding Trump's victorious 2016 effort. Mercer also influenced the campaign behind Brexit. The Man Who Solved the Market is a portrait of a modern-day Midas who remade markets in his own image, but failed to anticipate how his success would impact his firm and his country. It's also a story of what Simons's revolution means for the rest of us.
A Man for All Markets
Edward O. Thorp - 2016
Thorp invented card counting, proving the seemingly impossible: that you could beat the dealer at the blackjack table. As a result he launched a gambling renaissance. His remarkable success--and mathematically unassailable method--caused such an uproar that casinos altered the rules of the game to thwart him and the legions he inspired. They barred him from their premises, even put his life in jeopardy. Nonetheless, gambling was forever changed. Thereafter, Thorp shifted his sights to "the biggest casino in the world": Wall Street. Devising and then deploying mathematical formulas to beat the market, Thorp ushered in the era of quantitative finance we live in today. Along the way, the so-called godfather of the quants played bridge with Warren Buffett, crossed swords with a young Rudy Giuliani, detected the Bernie Madoff scheme, and, to beat the game of roulette, invented, with Claude Shannon, the world's first wearable computer. Here, for the first time, Thorp tells the story of what he did, how he did it, his passions and motivations, and the curiosity that has always driven him to disregard conventional wisdom and devise game-changing solutions to seemingly insoluble problems. An intellectual thrill ride, replete with practical wisdom that can guide us all in uncertain financial waters, A Man for All Markets is an instant classic--a book that challenges its readers to think logically about a seemingly irrational world.
Praise for A Man for All Markets
"In A Man for All Markets, [Thorp] delightfully recounts his progress (if that is the word) from college teacher to gambler to hedge-fund manager. Along the way we learn important lessons about the functioning of markets and the logic of investment." --The Wall Street Journal
"[Thorp] gives a biographical summation (think Richard Feynman's Surely You're Joking, Mr. Feynman!) of his quest to prove the aphorism 'the house always wins' is flawed. . . . Illuminating for the mathematically inclined, and cautionary for would-be gamblers and day traders." --Library Journal
Applied Predictive Modeling
Max Kuhn - 2013
Non-mathematical readers will appreciate the intuitive explanations of the techniques, while an emphasis on problem-solving with real data across a wide variety of applications will aid practitioners who wish to extend their expertise. Readers should have knowledge of basic statistical ideas, such as correlation and linear regression analysis. While the text is biased against complex equations, a mathematical background is needed for advanced topics.
Dr. Kuhn is a Director of Non-Clinical Statistics at Pfizer Global R&D in Groton, Connecticut. He has been applying predictive models in the pharmaceutical and diagnostic industries for over 15 years and is the author of a number of R packages. Dr. Johnson has more than a decade of statistical consulting and predictive modeling experience in pharmaceutical research and development. He is a co-founder of Arbor Analytics, a firm specializing in predictive modeling, and is a former Director of Statistics at Pfizer Global R&D. His scholarly work centers on the application and development of statistical methodology and learning algorithms.
Applied Predictive Modeling covers the overall predictive modeling process, beginning with the crucial steps of data preprocessing, data splitting, and the foundations of model tuning. The text then provides intuitive explanations of numerous common and modern regression and classification techniques, always with an emphasis on illustrating and solving real data problems. Addressing practical concerns extends beyond model fitting to topics such as handling class imbalance, selecting predictors, and pinpointing causes of poor model performance, all of which are problems that occur frequently in practice. The text illustrates all parts of the modeling process through many hands-on, real-life examples. And every chapter contains extensive R code.
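The split-then-tune workflow the description outlines can be sketched in a few lines; the book's own code is in R, so the scikit-learn pipeline below (with an arbitrary dataset and parameter grid) is an illustration only.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)

# Data splitting: hold out a test set that plays no part in tuning.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)

# Preprocessing plus model tuning via cross-validation on the training set.
pipe = make_pipeline(StandardScaler(), LogisticRegression(max_iter=5000))
grid = GridSearchCV(pipe, {"logisticregression__C": [0.01, 0.1, 1, 10]}, cv=5)
grid.fit(X_train, y_train)

print(grid.best_params_)           # chosen by resampling, not by the test set
print(grid.score(X_test, y_test))  # one final, honest performance estimate
```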
Automate This: How Algorithms Came to Rule Our World
Christopher Steiner - 2012
It used to be that to diagnose an illness, interpret legal documents, analyze foreign policy, or write a newspaper article you needed a human being with specific skills—and maybe an advanced degree or two. These days, high-level tasks are increasingly being handled by algorithms that can do precise work not only with speed but also with nuance. These “bots” started with human programming and logic, but now their reach extends beyond what their creators ever expected. In this fascinating, frightening book, Christopher Steiner tells the story of how algorithms took over—and shows why the “bot revolution” is about to spill into every aspect of our lives, often silently, without our knowledge. The May 2010 “Flash Crash” exposed Wall Street’s reliance on trading bots to the tune of a 998-point market drop and $1 trillion in vanished market value. But that was just the beginning. In Automate This, we meet bots that are driving cars, penning haiku, and writing music mistaken for Bach’s. They listen in on our customer service calls and figure out what Iran would do in the event of a nuclear standoff. There are algorithms that can pick out the most cohesive crew of astronauts for a space mission or identify the next Jeremy Lin. Some can even ingest statistics from baseball games and spit out pitch-perfect sports journalism indistinguishable from that produced by humans. The interaction of man and machine can make our lives easier. But what will the world look like when algorithms control our hospitals, our roads, our culture, and our national security? What happens to businesses when we automate judgment and eliminate human instinct? And what role will be left for doctors, lawyers, writers, truck drivers, and many others? Who knows—maybe there’s a bot learning to do your job this minute.
Introduction to Information Retrieval
Christopher D. Manning - 2008
Written from a computer science perspective by three leading experts in the field, it gives an up-to-date treatment of all aspects of the design and implementation of systems for gathering, indexing, and searching documents; methods for evaluating systems; and an introduction to the use of machine learning methods on text collections. All the important ideas are explained using examples and figures, making it perfect for introductory courses in information retrieval for advanced undergraduates and graduate students in computer science. Based on feedback from extensive classroom experience, the book has been carefully structured in order to make teaching more natural and effective. Although originally designed as the primary text for a graduate or advanced undergraduate course in information retrieval, the book will also be valuable for researchers and professionals alike.
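The core data structure behind the gathering-indexing-searching pipeline the description mentions is the inverted index, which takes only a few lines to sketch; the toy documents below are made up and the code is illustrative, not from the book.

```python
from collections import defaultdict

docs = {
    1: "the quick brown fox",
    2: "the lazy dog",
    3: "quick brown dogs are rare",
}

# Inverted index: term -> set of ids of documents containing that term.
index = defaultdict(set)
for doc_id, text in docs.items():
    for term in text.lower().split():
        index[term].add(doc_id)

def boolean_and(*terms):
    """Answer a Boolean AND query by intersecting postings lists."""
    postings = [index.get(t, set()) for t in terms]
    return set.intersection(*postings) if postings else set()

print(boolean_and("quick", "brown"))  # {1, 3}
```

Real systems refine this with tokenization, postings-list compression, and ranked retrieval, which is roughly the arc the book follows.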
Convex Optimization
Stephen Boyd - 2004
A comprehensive introduction to convex optimization, this book shows in detail how such problems can be solved numerically with great efficiency. The focus is on recognizing convex optimization problems and then finding the most appropriate technique for solving them. The text contains many worked examples and homework exercises and will appeal to students, researchers, and practitioners in fields such as engineering, computer science, mathematics, statistics, finance, and economics.
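As a small illustration of what recognizing and solving a convex problem looks like in code (not an example from the book), here is a box-constrained least-squares problem expressed with the CVXPY modeling library; the problem data are random.

```python
import cvxpy as cp
import numpy as np

# Random instance: minimize ||Ax - b||^2 subject to 0 <= x <= 1.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

x = cp.Variable(5)
problem = cp.Problem(cp.Minimize(cp.sum_squares(A @ x - b)),
                     [x >= 0, x <= 1])
problem.solve()       # convexity guarantees a global optimum

print(problem.value)  # optimal objective value
print(x.value)        # optimal point
```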
Introduction to Mathematical Statistics
Robert V. Hogg - 1962
Designed for two-semester, beginning graduate courses in Mathematical Statistics, and for senior undergraduate Mathematics, Statistics, and Actuarial Science majors, this text retains the features of earlier editions and continues to provide students with the background material they need.
Superforecasting: The Art and Science of Prediction
Philip E. Tetlock - 2015
Unfortunately, people tend to be terrible forecasters. As Wharton professor Philip Tetlock showed in a landmark 2005 study, even experts’ predictions are only slightly better than chance. However, an important and underreported conclusion of that study was that some experts do have real foresight, and Tetlock has spent the past decade trying to figure out why. What makes some people so good? And can this talent be taught? In Superforecasting, Tetlock and coauthor Dan Gardner offer a masterwork on prediction, drawing on decades of research and the results of a massive, government-funded forecasting tournament. The Good Judgment Project involves tens of thousands of ordinary people—including a Brooklyn filmmaker, a retired pipe installer, and a former ballroom dancer—who set out to forecast global events. Some of the volunteers have turned out to be astonishingly good. They’ve beaten other benchmarks, competitors, and prediction markets. They’ve even beaten the collective judgment of intelligence analysts with access to classified information. They are "superforecasters." In this groundbreaking and accessible book, Tetlock and Gardner show us how we can learn from this elite group. Weaving together stories of forecasting successes (the raid on Osama bin Laden’s compound) and failures (the Bay of Pigs) and interviews with a range of high-level decision makers, from David Petraeus to Robert Rubin, they show that good forecasting doesn’t require powerful computers or arcane methods. It involves gathering evidence from a variety of sources, thinking probabilistically, working in teams, keeping score, and being willing to admit error and change course. Superforecasting offers the first demonstrably effective way to improve our ability to predict the future—whether in business, finance, politics, international affairs, or daily life—and is destined to become a modern classic.
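The "keeping score" the book emphasizes was done in the forecasting tournament with the Brier score, which is simple to compute; the forecasts below are made up for illustration.

```python
import numpy as np

def brier_score(forecasts, outcomes):
    """Mean squared error between probability forecasts and 0/1 outcomes.

    Lower is better; the score rewards calibration and punishes hedging.
    """
    f = np.asarray(forecasts, dtype=float)
    o = np.asarray(outcomes, dtype=float)
    return float(np.mean((f - o) ** 2))

print(brier_score([0.9, 0.8, 0.1], [1, 1, 0]))  # ~0.02: sharp and right
print(brier_score([0.5, 0.5, 0.5], [1, 1, 0]))  # 0.25: always hedging 50/50
```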