Book picks similar to
Time Series Analysis by James Douglas Hamilton
Game Theory
Drew Fudenberg - 1991
The theory of noncooperative games studies the behavior of agents in any situation where each agent's optimal choice may depend on a forecast of the opponents' choices. "Noncooperative" refers to choices that are based on each participant's perceived self-interest. Although game theory has been applied to many fields, Fudenberg and Tirole focus on the kinds of game theory that have been most useful in the study of economic problems; they also include some applications to political science. The analytic material is accompanied by many applications, examples, and exercises. The fourteen chapters are grouped in parts that cover static games of complete information, dynamic games of complete information, static games of incomplete information, dynamic games of incomplete information, and advanced topics. --mitpress.mit.edu
The General Theory of Employment, Interest, and Money
John Maynard Keynes - 1936
In his most important work, The General Theory of Employment, Interest, and Money (1936), Keynes critiqued the laissez-faire policies of his day, particularly the proposition that a normally functioning market economy would bring full employment. Keynes's forward-looking work transformed economics from a merely descriptive and analytic discipline into a policy-oriented one. For Keynes, enlightened government intervention in a nation's economic life was essential to curbing what he saw as the inherent inequalities and instabilities of unregulated capitalism.
Triumph of the City: How Our Greatest Invention Makes Us Richer, Smarter, Greener, Healthier and Happier
Edward L. Glaeser - 2011
America is an urban nation. More than two-thirds of us live on the 3 percent of land that contains our cities. Yet cities get a bad rap: they're dirty, poor, unhealthy, crime-ridden, expensive, environmentally unfriendly... Or are they? As Edward Glaeser proves in this myth-shattering book, cities are actually the healthiest, greenest, and richest (in cultural and economic terms) places to live. New Yorkers, for instance, live longer than other Americans; heart disease and cancer rates are lower in Gotham than in the nation as a whole. More than half of America's income is earned in twenty-two metropolitan areas. And city dwellers use, on average, 40 percent less energy than suburbanites. Glaeser travels through history and around the globe to reveal the hidden workings of cities and how they bring out the best in humankind. Even the worst cities, such as Kinshasa, Kolkata, and Lagos, confer surprising benefits on the people who flock to them, including better health and more jobs than in the rural areas that surround them. Glaeser visits Bangalore and Silicon Valley, whose strangely similar histories prove how essential education is to urban success and how new technology actually encourages people to gather together physically. He discovers why Detroit is dying while other old industrial cities (Chicago, Boston, New York) thrive. He investigates why a new house costs 350 percent more in Los Angeles than in Houston, even though building costs are only 25 percent higher in L.A. He pinpoints the single factor that most influences urban growth (January temperatures) and explains how certain chilly cities manage to defy that link. He explains how West Coast environmentalists have harmed the environment, and how struggling cities from Youngstown to New Orleans can "shrink to greatness." And he exposes the dangerous anti-urban political bias that is harming both cities and the entire country. Using intrepid reportage, keen analysis, and eloquent argument, Glaeser makes an impassioned case for the city's import and splendor. He reminds us forcefully why we should nurture our cities or suffer consequences that will hurt us all, no matter where we live.
Elementary Statistics: A Step by Step Approach
Allan G. Bluman - 1992
The book is non-theoretical, explaining concepts intuitively and teaching problem solving through worked examples and step-by-step instructions. This edition places more emphasis on conceptual understanding and on interpreting results. It also features increased emphasis on Excel, MINITAB, and the TI-83 Plus and TI-84 Plus graphing calculators, computing technologies commonly used in such courses.
13 Bankers: The Wall Street Takeover and the Next Financial Meltdown
Simon Johnson - 2010
Six megabanks—Bank of America, JPMorgan Chase, Citigroup, Wells Fargo, Goldman Sachs, and Morgan Stanley—together control assets amounting, astonishingly, to more than 60 percent of the country’s gross domestic product. Now more emphatically “too big to fail” than ever, these financial institutions continue to hold the global economy hostage, threatening yet another financial meltdown with their excessive risk-taking and toxic “business as usual” practices. How did this come to be—and what is to be done? These are the central concerns of 13 Bankers, a brilliant, historically informed account of our troubled political economy. In 13 Bankers, Simon Johnson—one of the most prominent and frequently cited economists in America (former chief economist of the International Monetary Fund, Professor of Entrepreneurship at MIT, and author of the controversial “The Quiet Coup” in The Atlantic)—and James Kwak give a wide-ranging, meticulous, and bracing account of recent U.S. financial history within the context of previous showdowns between American democracy and Big Finance: from Thomas Jefferson to Andrew Jackson, from Theodore Roosevelt to Franklin Delano Roosevelt. They convincingly show why our future is imperiled by the ideology of finance (finance is good, unregulated finance is better, unfettered finance run amok is best) and by Wall Street’s political control of government policy pertaining to it. As the authors insist, the choice that America faces is stark: whether Washington will accede to the vested interests of an unbridled financial sector that runs up profits in good years and dumps its losses on taxpayers in lean years, or whether it will reform the banking system through stringent regulation so that it serves first and foremost as an engine of economic growth. To restore health and balance to our economy, Johnson and Kwak make a radical yet feasible and focused proposal: reconfigure the megabanks to be “small enough to fail.” Lucid, authoritative, and crucial for its timeliness, 13 Bankers is certain to be one of the most discussed and debated books of 2010.
Economics in One Lesson: The Shortest & Surest Way to Understand Basic Economics
Henry Hazlitt - 1946
Henry Hazlitt’s Economics in One Lesson is a classic economics primer. But it is also much more, having become a fundamental influence on modern “libertarian” economics of the type espoused by Ron Paul and others. Considered among the leading economic thinkers of the “Austrian School,” which includes Carl Menger, Ludwig von Mises, Friedrich (F.A.) Hayek, and others, Henry Hazlitt (1894-1993) was a libertarian philosopher, an economist, and a journalist. He was the founding vice president of the Foundation for Economic Education and an early editor of The Freeman magazine, an influential libertarian publication. Hazlitt wrote Economics in One Lesson, his seminal work, in 1946. Concise and instructive, it is also deceptively prescient and far-reaching in its efforts to dismantle economic fallacies that are so prevalent they have almost become a new orthodoxy. Many current economic commentators across the political spectrum have credited Hazlitt with foreseeing the collapse of the global economy that occurred more than 50 years after the initial publication of Economics in One Lesson. Hazlitt’s focus on non-governmental solutions, his strong (and strongly reasoned) anti-deficit position, and his general emphasis on free markets, the economic liberty of individuals, and the dangers of government intervention make Economics in One Lesson every bit as relevant and valuable today as it has been since publication.
Feynman Lectures On Computation
Richard P. Feynman - 1996
When Feynman gave his famous course on computation at the California Institute of Technology in the mid-1980s, he asked Tony Hey to adapt his lecture notes into a book. Although led by Feynman, the course also featured, as occasional guest speakers, some of the most brilliant men in science at that time, including Marvin Minsky, Charles Bennett, and John Hopfield. Although the lectures are now thirteen years old, most of the material is timeless and presents a “Feynmanesque” overview of many standard and some not-so-standard topics in computer science, such as reversible logic gates and quantum computers.
Neural Networks: A Comprehensive Foundation
Simon Haykin - 1994
Introducing students to the many facets of neural networks, this text provides numerous case studies that illustrate their real-life, practical applications.
Principles of Statistics
M.G. Bulmer - 1979
There are many textbooks that describe current methods of statistical analysis while neglecting related theory. There are equally many advanced textbooks which delve into the far reaches of statistical theory, while bypassing practical applications. But between these two approaches is an unfilled gap, in which theory and practice merge at an intermediate level. Professor M. G. Bulmer's Principles of Statistics, originally published in 1965, was created to fill that need. The new, corrected Dover edition of Principles of Statistics makes this invaluable mid-level text available once again for the classroom or for self-study.

Principles of Statistics was created primarily for the student of natural sciences, the social scientist, the undergraduate mathematics student, or anyone familiar with the basics of mathematical language. It assumes no previous knowledge of statistics or probability; nor is extensive mathematical knowledge necessary beyond a familiarity with the fundamentals of differential and integral calculus. (The calculus is used primarily for ease of notation; skill in the techniques of integration is not necessary in order to understand the text.)

Professor Bulmer devotes the first chapters to a concise, admirably clear description of basic terminology and fundamental statistical theory: abstract concepts of probability and their applications in dice games, Mendelian heredity, etc.; definitions and examples of discrete and continuous random variables; multivariate distributions and the descriptive tools used to delineate them; expected values; etc. The book then moves quickly to more advanced levels, as Professor Bulmer describes important distributions (binomial, Poisson, exponential, normal, etc.), tests of significance, statistical inference, point estimation, regression, and correlation. Dozens of exercises and problems appear at the end of various chapters, with answers provided at the back of the book. Also included are a number of statistical tables and selected references.
Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence
Kate Crawford - 2020
“It draws our attention away from the bright shiny objects of the new colonialism through elucidating the social, material and political dimensions of Artificial Intelligence.”—Geoffrey C. Bowker, University of California, Irvine

What happens when artificial intelligence saturates political life and depletes the planet? How is AI shaping our understanding of ourselves and our societies? Drawing on more than a decade of research, award-winning scholar Kate Crawford reveals how AI is a technology of extraction: from the energy and minerals needed to build and sustain its infrastructure, to the exploited workers behind “automated” services, to the data AI collects from us. She shows how this planetary network is fueling a shift toward undemocratic governance and increased racial, gender, and economic inequality. Rather than taking a narrow focus on code and algorithms, Crawford offers us a political and a material perspective on what it takes to make artificial intelligence and where it goes wrong. While technical systems present a veneer of objectivity, they are always systems of power. This is an urgent account of what is at stake as technology companies use artificial intelligence to reshape the world.
Data Visualization: A Practical Introduction
Kieran Healy - 2018
It explains what makes some graphs succeed while others fail, how to make high-quality figures from data using powerful and reproducible methods, and how to think about data visualization in an honest and effective way.

Data Visualization builds the reader's expertise in ggplot2, a versatile visualization library for the R programming language. Through a series of worked examples, this accessible primer then demonstrates how to create plots piece by piece, beginning with summaries of single variables and moving on to more complex graphics. Topics include plotting continuous and categorical variables; layering information on graphics; producing effective "small multiple" plots; grouping, summarizing, and transforming data for plotting; creating maps; working with the output of statistical models; and refining plots to make them more comprehensible.

Effective graphics are essential to communicating ideas and a great way to better understand data. This book provides the practical skills students and practitioners need to visualize quantitative data and get the most out of their research findings. It provides hands-on instruction using R and ggplot2, shows how the "tidyverse" of data analysis tools makes working with R easier and more consistent, and includes a library of data sets, code, and functions.
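As a taste of the piece-by-piece approach the blurb describes, here is a minimal ggplot2 sketch; it is illustrative only, not code from the book, and it uses the mpg dataset that ships with ggplot2. It starts with a summary of a single variable, then layers colour grouping and a fitted line onto a scatterplot.

```r
library(ggplot2)

# A single variable first: a histogram of highway mileage
# from the mpg dataset bundled with ggplot2.
p1 <- ggplot(mpg, aes(x = hwy)) +
  geom_histogram(bins = 20)

# Then build a richer graphic piece by piece: map displacement
# and mileage to the axes, colour the points by vehicle class,
# and layer a linear fit over the raw points.
p2 <- ggplot(mpg, aes(x = displ, y = hwy)) +
  geom_point(aes(colour = class)) +
  geom_smooth(method = "lm", se = FALSE)

print(p1)
print(p2)
```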
Innumeracy: Mathematical Illiteracy and Its Consequences
John Allen Paulos - 1988
Dozens of examples show how innumeracy not only affects personal economics and travel plans but also explains mis-chosen mates, inappropriate drug testing, and the allure of pseudoscience.
R for Everyone: Advanced Analytics and Graphics
Jared P. Lander - 2013
R has traditionally been difficult for non-statisticians to learn, and most R books assume far too much knowledge to be of help. R for Everyone is the solution. Drawing on his unsurpassed experience teaching new users, professional data scientist Jared P. Lander has written the perfect tutorial for anyone new to statistical programming and modeling. Organized to make learning easy and intuitive, this guide focuses on the 20 percent of R functionality you'll need to accomplish 80 percent of modern data tasks. Lander's self-contained chapters start with the absolute basics, offering extensive hands-on practice and sample code. You'll download and install R; navigate and use the R environment; master basic program control, data import, and manipulation; and walk through several essential tests. Then, building on this foundation, you'll construct several complete models, both linear and nonlinear, and use some data mining techniques. By the time you're done, you won't just know how to write R programs, you'll be ready to tackle the statistical problems you care about most.

COVERAGE INCLUDES
- Exploring R, RStudio, and R packages
- Using R for math: variable types, vectors, calling functions, and more
- Exploiting data structures, including data.frames, matrices, and lists
- Creating attractive, intuitive statistical graphics
- Writing user-defined functions
- Controlling program flow with if, ifelse, and complex checks
- Improving program efficiency with group manipulations
- Combining and reshaping multiple datasets
- Manipulating strings using R's facilities and regular expressions
- Creating normal, binomial, and Poisson probability distributions
- Programming basic statistics: mean, standard deviation, and t-tests
- Building linear, generalized linear, and nonlinear models
- Assessing the quality of models and variable selection
- Preventing overfitting, using the Elastic Net and Bayesian methods
- Analyzing univariate and multivariate time series data
- Grouping data via K-means and hierarchical clustering
- Preparing reports, slideshows, and web pages with knitr
- Building reusable R packages with devtools and Rcpp
- Getting involved with the R global community
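To give a flavour of the basics those early chapters cover, here is a short base-R sketch; it is illustrative only, not code from the book, and it uses the built-in mtcars dataset.

```r
# Vectors and basic summary statistics in base R.
x <- c(4.2, 5.1, 6.3, 5.8, 4.9)
mean(x)            # sample mean
sd(x)              # sample standard deviation
t.test(x, mu = 5)  # one-sample t-test of the mean against 5

# A data.frame and a simple linear model: miles per gallon
# modelled on car weight, using the built-in mtcars dataset.
fit <- lm(mpg ~ wt, data = mtcars)
summary(fit)       # coefficients, R-squared, residual summary
```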
Computer Organization & Design: The Hardware/Software Interface
David A. Patterson - 1993
"More importantly, this book provides a framework for thinking about computer organization and design that will enable the reader to continue the lifetime of learning necessary for staying at the forefront of this competitive discipline." --John Crawford, Intel Fellow and Director of Microprocessor Architecture, Intel

The performance of software systems is dramatically affected by how well software designers understand the basic hardware technologies at work in a system. Similarly, hardware designers must understand the far-reaching effects their design decisions have on software applications. For readers in either category, this classic introduction to the field provides a deep look into the computer. It demonstrates the relationship between software and hardware and focuses on the foundational concepts that are the basis for current computer design. Using a distinctive "learning by evolution" approach, the authors present each idea from its first principles, guiding readers through a series of worked examples that incrementally add more complex instructions.