Book picks similar to Econometric Analysis by William H. Greene


Tags: economics, econometrics, statistics, data-science

Using Econometrics: A Practical Guide


A.H. Studenmund - 1987
    "Using Econometrics: A Practical Guide "provides readers with a practical introduction that combines single-equation linear regression analysis with real-world examples and exercises. This text also avoids complex matrix algebra and calculus, making it an ideal text for beginners. New problem sets and added support make "Using Econometrics" modern and easier to use.

The Elements of Statistical Learning: Data Mining, Inference, and Prediction


Trevor Hastie - 2001
    During the past decade there has been an explosion in computation and information technology. With it have come vast amounts of data in a variety of fields such as medicine, biology, finance, and marketing. The challenge of understanding these data has led to the development of new tools in the field of statistics, and spawned new areas such as data mining, machine learning, and bioinformatics. Many of these tools have common underpinnings but are often expressed with different terminology. This book describes the important ideas in these areas in a common conceptual framework. While the approach is statistical, the emphasis is on concepts rather than mathematics. Many examples are given, with a liberal use of color graphics. It should be a valuable resource for statisticians and anyone interested in data mining in science or industry. The book's coverage is broad, from supervised learning (prediction) to unsupervised learning. The many topics include neural networks, support vector machines, classification trees, and boosting (the first comprehensive treatment of this topic in any book).

    Trevor Hastie, Robert Tibshirani, and Jerome Friedman are professors of statistics at Stanford University. They are prominent researchers in this area: Hastie and Tibshirani developed generalized additive models and wrote a popular book of that title. Hastie wrote much of the statistical modeling software in S-PLUS and invented principal curves and surfaces. Tibshirani proposed the Lasso and is co-author of the very successful An Introduction to the Bootstrap. Friedman is the co-inventor of many data-mining tools including CART, MARS, and projection pursuit.

Statistics for Business and Economics [with CD-ROM and InfoTrac]


David R. Anderson - 1986
    Written by authors who are highly regarded in the field, the text provides sound methodological development. The discussion and development of each technique are presented in an application setting, with the statistical results providing insights into decisions and solutions to problems. Statistics for Business and Economics, 9e, offers proven accuracy that has led instructors to adopt it for its superior examples and exercises alone.

Frequently Asked Questions in Quantitative Finance


Paul Wilmott - 2007
    "Quantitative finance is fascinating because of the speed at which the subject develops, the new products and the new models which we have to understand. And it is rewarding because anyone can make a fundamental breakthrough.

    "Having worked in this field for many years, I have come to appreciate the importance of getting the right balance between mathematics and intuition. Too little maths and you won't be able to make much progress; too much maths and you'll be held back by technicalities. I imagine, but expect I will never know for certain, that getting the right level of maths is like having the right equipment to climb Mount Everest: too little and you won't make the first base camp; too much and you'll collapse in a heap before the top.

    "Whenever I write about or teach this subject I also aim to get the right mix of theory and practice. Finance is not a hard science like physics, so you have to accept the limitations of the models. But nor is it a very soft science, so without those models you would be at a disadvantage compared with those better equipped. I believe this adds to the fascination of the subject.

    "This FAQs book looks at some of the most important aspects of financial engineering, and considers them from both theoretical and practical points of view. I hope that you will see that finance is just as much fun in practice as in theory, and if you are reading this book to help you with your job interviews, good luck! Let me know how you get on!"

An Introduction to Statistical Learning: With Applications in R


Gareth James - 2013
    This book presents some of the most important modeling and prediction techniques, along with relevant applications. Topics include linear regression, classification, resampling methods, shrinkage approaches, tree-based methods, support vector machines, clustering, and more. Color graphics and real-world examples are used to illustrate the methods presented. Since the goal of this textbook is to facilitate the use of these statistical learning techniques by practitioners in science, industry, and other fields, each chapter contains a tutorial on implementing the analyses and methods presented in R, an extremely popular open source statistical software platform. Two of the authors co-wrote The Elements of Statistical Learning (Hastie, Tibshirani and Friedman, 2nd edition 2009), a popular reference book for statistics and machine learning researchers. An Introduction to Statistical Learning covers many of the same topics, but at a level accessible to a much broader audience. This book is targeted at statisticians and non-statisticians alike who wish to use cutting-edge statistical learning techniques to analyze their data. The text assumes only a previous course in linear regression and no knowledge of matrix algebra.

Naked Statistics: Stripping the Dread from the Data


Charles Wheelan - 2012
    How can we catch schools that cheat on standardized tests? How does Netflix know which movies you’ll like? What is causing the rising incidence of autism? As best-selling author Charles Wheelan shows us in Naked Statistics, the right data and a few well-chosen statistical tools can help us answer these questions and more.

    For those who slept through Stats 101, this book is a lifesaver. Wheelan strips away the arcane and technical details and focuses on the underlying intuition that drives statistical analysis. He clarifies key concepts such as inference, correlation, and regression analysis, reveals how biased or careless parties can manipulate or misrepresent data, and shows us how brilliant and creative researchers are exploiting the valuable data from natural experiments to tackle thorny questions.

    And in Wheelan’s trademark style, there’s not a dull page in sight. You’ll encounter clever Schlitz Beer marketers leveraging basic probability, an International Sausage Festival illuminating the tenets of the central limit theorem, and a head-scratching choice from the famous game show Let’s Make a Deal, and you’ll come away with insights each time. With the wit, accessibility, and sheer fun that turned Naked Economics into a bestseller, Wheelan defies the odds yet again by bringing another essential, formerly unglamorous discipline to life.

Statistical Inference


George Casella - 2001
    Starting from the basics of probability, the authors develop the theory of statistical inference using techniques, definitions, and concepts that are statistical and that arise as natural extensions and consequences of earlier ideas. The book suits readers with a solid mathematics background, but it can also be used in a way that stresses the more practical side of statistical theory: understanding basic statistical concepts and deriving reasonable statistical procedures for a variety of situations, with less concern for formal optimality investigations.

Microeconomics: Principles, Problems, and Policies


Campbell R. McConnell - 1989
    The 17th Edition builds upon the tradition of leadership by sticking to three main goals: help the beginning student master the principles essential for understanding the economizing problem, specific economic issues, and the policy alternatives; help the student understand and apply the economic perspective and reason accurately and objectively about economic matters; and promote a lasting student interest in economics and the economy.

Game Theory for Applied Economists


Robert Gibbons - 1992
    Robert Gibbons addresses scholars in applied fields within economics who want a serious and thorough discussion of game theory but who may have found other works overly abstract. Gibbons emphasizes the economic applications of the theory at least as much as the pure theory itself; formal arguments about abstract games play a minor role. The applications illustrate the process of model building: translating an informal description of a multi-person decision situation into a formal game-theoretic problem to be analyzed. Also, the variety of applications shows that similar issues arise in different areas of economics, and that the same game-theoretic tools can be applied in each setting. In order to emphasize the broad potential scope of the theory, conventional applications from industrial organization have been largely replaced by applications from labor, macro, and other applied fields in economics. The book covers four classes of games, and four corresponding notions of equilibrium: static games of complete information and Nash equilibrium, dynamic games of complete information and subgame-perfect Nash equilibrium, static games of incomplete information and Bayesian Nash equilibrium, and dynamic games of incomplete information and perfect Bayesian equilibrium.
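    To make the first of those four classes concrete, here is a small Python sketch that finds the pure-strategy Nash equilibrium of a static game of complete information; the game and payoffs are the standard textbook Prisoner's Dilemma, chosen for illustration rather than taken from Gibbons.

        # payoffs[(row_action, col_action)] = (row player's payoff, column player's payoff)
        actions = ["Cooperate", "Defect"]
        payoffs = {
            ("Cooperate", "Cooperate"): (-1, -1),
            ("Cooperate", "Defect"):    (-9,  0),
            ("Defect",    "Cooperate"): ( 0, -9),
            ("Defect",    "Defect"):    (-6, -6),
        }

        def is_nash(r, c):
            # Nash equilibrium: neither player gains by deviating unilaterally.
            u_r, u_c = payoffs[(r, c)]
            row_best = all(payoffs[(r2, c)][0] <= u_r for r2 in actions)
            col_best = all(payoffs[(r, c2)][1] <= u_c for c2 in actions)
            return row_best and col_best

        print([(r, c) for r in actions for c in actions if is_nash(r, c)])
        # -> [('Defect', 'Defect')]: mutual defection is the unique Nash equilibrium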

R for Data Science: Import, Tidy, Transform, Visualize, and Model Data


Hadley Wickham - 2016
    This book introduces you to R, RStudio, and the tidyverse, a collection of R packages designed to work together to make data science fast, fluent, and fun. Suitable for readers with no previous programming experience, R for Data Science is designed to get you doing data science as quickly as possible. Authors Hadley Wickham and Garrett Grolemund guide you through the steps of importing, wrangling, exploring, and modeling your data and communicating the results. You’ll get a complete, big-picture understanding of the data science cycle, along with basic tools you need to manage the details. Each section of the book is paired with exercises to help you practice what you’ve learned along the way. You’ll learn how to:
    Wrangle: transform your datasets into a form convenient for analysis
    Program: learn powerful R tools for solving data problems with greater clarity and ease
    Explore: examine your data, generate hypotheses, and quickly test them
    Model: provide a low-dimensional summary that captures true "signals" in your dataset
    Communicate: learn R Markdown for integrating prose, code, and results

Hands-On Machine Learning with Scikit-Learn and TensorFlow


Aurélien Géron - 2017
    Now that machine learning is thriving, even programmers who know close to nothing about this technology can use simple, efficient tools to implement programs capable of learning from data. This practical book shows you how. By using concrete examples, minimal theory, and two production-ready Python frameworks, Scikit-Learn and TensorFlow, author Aurélien Géron helps you gain an intuitive understanding of the concepts and tools for building intelligent systems. You’ll learn how to use a range of techniques, starting with simple Linear Regression and progressing to Deep Neural Networks. If you have some programming experience and you’re ready to code a machine learning project, this guide is for you. This hands-on book shows you how to use:
    Scikit-Learn, an accessible framework that implements many algorithms efficiently and serves as a great machine learning entry point
    TensorFlow, a more complex library for distributed numerical computation, ideal for training and running very large neural networks
    Practical code examples that you can apply without learning excessive machine learning theory or algorithm details
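    In the spirit of the Scikit-Learn "entry point" the blurb describes, here is a minimal sketch of fitting a linear regression; the synthetic data and parameter values are invented for illustration, not drawn from the book.

        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(42)
        X = rng.uniform(0, 10, size=(200, 1))              # one synthetic feature
        y = 3.0 * X.ravel() + 2.0 + rng.normal(0, 1, 200)  # linear signal plus noise

        X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25)
        model = LinearRegression().fit(X_train, y_train)
        print(model.coef_, model.intercept_)  # recovers roughly 3.0 and 2.0
        print(model.score(X_test, y_test))    # R^2 on held-out data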

Introduction to Mathematical Statistics


Robert V. Hogg - 1962
    Designed for two-semester, beginning graduate courses in Mathematical Statistics, and for senior undergraduate Mathematics, Statistics, and Actuarial Science majors, this text retains its outstanding features and continues to provide students with background material.

Probability Theory: The Logic of Science


E.T. Jaynes - 1999
    The book discusses new results, along with applications of probability theory to a variety of problems. It contains many exercises and is suitable for use as a textbook in graduate-level courses involving data analysis. Aimed at readers already familiar with applied mathematics at an advanced undergraduate level or higher, it is of interest to scientists concerned with inference from incomplete information.

Game Theory


Drew Fudenberg - 1991
    The theory of noncooperative games studies the behavior of agents in any situation where each agent's optimal choice may depend on a forecast of the opponents' choices. "Noncooperative" refers to choices that are based on the participant's perceived self-interest. The analytic material is accompanied by many applications, examples, and exercises. Although game theory has been applied to many fields, Fudenberg and Tirole focus on the kinds of game theory that have been most useful in the study of economic problems. They also include some applications to political science. The fourteen chapters are grouped in parts that cover static games of complete information, dynamic games of complete information, static games of incomplete information, dynamic games of incomplete information, and advanced topics. (mitpress.mit.edu)

The Cult of Statistical Significance: How the Standard Error Costs Us Jobs, Justice, and Lives


Stephen Thomas Ziliak - 2008
    “If it takes a book to get it across, I hope this book will do it. It ought to.”—Thomas Schelling, Distinguished University Professor, School of Public Policy, University of Maryland, and 2005 Nobel Prize Laureate in Economics

    “With humor, insight, piercing logic and a nod to history, Ziliak and McCloskey show how economists—and other scientists—suffer from a mass delusion about statistical analysis. The quest for statistical significance that pervades science today is a deeply flawed substitute for thoughtful analysis. . . . Yet few participants in the scientific bureaucracy have been willing to admit what Ziliak and McCloskey make clear: the emperor has no clothes.”—Kenneth Rothman, Professor of Epidemiology, Boston University School of Public Health

    The Cult of Statistical Significance shows, field by field, how “statistical significance,” a technique that dominates many sciences, has been a huge mistake. The authors find that researchers in a broad spectrum of fields, from agronomy to zoology, employ “testing” that doesn’t test and “estimating” that doesn’t estimate. The facts will startle the outside reader: how could a group of brilliant scientists wander so far from scientific magnitudes? This study will encourage scientists who want to know how to get the statistical sciences back on track and fulfill their quantitative promise. The book shows for the first time how wide the disaster is, and how bad for science, and it traces the problem to its historical, sociological, and philosophical roots.

    Stephen T. Ziliak is the author or editor of many articles and two books. He currently lives in Chicago, where he is Professor of Economics at Roosevelt University. Deirdre N. McCloskey, Distinguished Professor of Economics, History, English, and Communication at the University of Illinois at Chicago, is the author of twenty books and three hundred scholarly articles. She has held Guggenheim and National Humanities Fellowships. She is best known for How to Be Human* Though an Economist (University of Michigan Press, 2000) and her most recent book, The Bourgeois Virtues: Ethics for an Age of Commerce (2006).