Book picks similar to Econometric Analysis of Cross Section and Panel Data by Jeffrey M. Wooldridge
Tags: economics, econometrics, mathematics, reference
Approaches to Social Research
Royce A. Singleton Jr. - 1988
Covering all of the fundamentals in a straightforward, student-friendly manner, this book is ideal for undergraduate- and graduate-level courses across the social sciences and also serves as an indispensable guide for researchers. Striking a balance between specific techniques and the underlying logic of scientific inquiry, it provides a lucid treatment of the four major approaches to research: experimentation, survey research, field research, and the use of available data. Richly developed examples of empirical research and an emphasis on the research process enable students to better understand the real-world application of research methods. The authors also offer a unique chapter (13) advocating a multiple-methods strategy.
Data Science from Scratch: First Principles with Python
Joel Grus - 2015
In this book, you’ll learn how many of the most fundamental data science tools and algorithms work by implementing them from scratch.
If you have an aptitude for mathematics and some programming skills, author Joel Grus will help you get comfortable with the math and statistics at the core of data science, and with the hacking skills you need to get started as a data scientist. Today's messy glut of data holds answers to questions no one's even thought to ask. This book provides you with the know-how to dig those answers out.
Get a crash course in Python
Learn the basics of linear algebra, statistics, and probability—and understand how and when they're used in data science
Collect, explore, clean, munge, and manipulate data
Dive into the fundamentals of machine learning
Implement models such as k-nearest neighbors, Naive Bayes, linear and logistic regression, decision trees, neural networks, and clustering (a minimal k-nearest-neighbors sketch follows this list)
Explore recommender systems, natural language processing, network analysis, MapReduce, and databases
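In the spirit of the book's from-scratch approach, here is a minimal k-nearest-neighbors classifier in plain Python. It is an illustrative sketch, not code taken from the book, and the function and variable names are invented for the example.

```python
# Minimal k-nearest-neighbors classifier (illustrative sketch, not from the book).
from collections import Counter
import math

def euclidean_distance(a, b):
    """Straight-line distance between two equal-length numeric vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def knn_classify(k, labeled_points, new_point):
    """Predict a label by majority vote among the k nearest labeled points.

    labeled_points is a list of (vector, label) pairs.
    """
    by_distance = sorted(
        labeled_points,
        key=lambda pair: euclidean_distance(pair[0], new_point),
    )
    k_nearest_labels = [label for _, label in by_distance[:k]]
    return Counter(k_nearest_labels).most_common(1)[0][0]

# Toy usage: classify a new 2-D point from a handful of labeled examples.
training = [((1.0, 1.0), "red"), ((1.2, 0.8), "red"),
            ((4.0, 4.2), "blue"), ((3.8, 4.0), "blue")]
print(knn_classify(3, training, (1.1, 0.9)))  # -> "red"
```

Small, dependency-free functions like these are the flavor of implementation the topic list above describes.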
Microeconomic Theory: Basic Principles and Extensions
Walter Nicholson - 1972
Applauded for providing the clearest and most accurate presentation of advanced microeconomic concepts, this text offers an ideal level of mathematical rigor for upper-level undergraduate students and beginning graduate students. It gives students the opportunity to work directly with theoretical tools, real-world applications, and cutting-edge developments in the study of microeconomics. Solid, rigorous, comprehensive, and sensibly challenging, it best serves students with a mathematics background.
The Signal and the Noise: Why So Many Predictions Fail—But Some Don't
Nate Silver - 2012
Nate Silver solidified his standing as the nation's foremost political forecaster with his near-perfect prediction of the 2012 election. Silver is the founder and editor in chief of FiveThirtyEight.com. Drawing on his own groundbreaking work, Silver examines the world of prediction, investigating how we can distinguish a true signal from a universe of noisy data. Most predictions fail, often at great cost to society, because most of us have a poor understanding of probability and uncertainty. Both experts and laypeople mistake more confident predictions for more accurate ones. But overconfidence is often the reason for failure. If our appreciation of uncertainty improves, our predictions can get better too. This is the "prediction paradox": the more humility we have about our ability to make predictions, the more successful we can be in planning for the future.

In keeping with his own aim to seek truth from data, Silver visits the most successful forecasters in a range of areas, from hurricanes to baseball, from the poker table to the stock market, from Capitol Hill to the NBA. He explains and evaluates how these forecasters think and what bonds they share. What lies behind their success? Are they good, or just lucky? What patterns have they unraveled? And are their forecasts really right? He explores unanticipated commonalities and exposes unexpected juxtapositions. And sometimes, it is not so much how good a prediction is in an absolute sense that matters but how good it is relative to the competition. In other cases, prediction is still a very rudimentary, and dangerous, science.

Silver observes that the most accurate forecasters tend to have a superior command of probability, and they tend to be both humble and hardworking. They distinguish the predictable from the unpredictable, and they notice a thousand little details that lead them closer to the truth. Because of their appreciation of probability, they can distinguish the signal from the noise.
Essentials of Investments [with Standard & Poor's Bind-In Card & CD-ROM]
Zvi Bodie - 1992
The authors have eliminated unnecessary mathematical detail and concentrate on the intuition and insights that will be useful to practitioners throughout their careers as new ideas and challenges emerge from the financial marketplace. Essentials maintains the theme of asset allocation: the authors discuss asset pricing and trading, then apply these theories to portfolio planning in real-world securities markets governed by risk/return relationships.
A Primer of Ecological Statistics
Nicholas J. Gotelli - 2004
The book emphasizes a general introduction to probability theory and provides a detailed discussion of specific designs and analyses that are typically encountered in ecology and environmental science. Appropriate for use as either a stand-alone or supplementary text for upper-division undergraduate or graduate courses in ecological and environmental statistics, ecology, environmental science, environmental studies, or experimental design, the Primer also serves as a resource for environmental professionals who need to use and interpret statistics daily but have little or no formal training in the subject.
How Not to Be Wrong: The Power of Mathematical Thinking
Jordan Ellenberg - 2014
In How Not to Be Wrong, Jordan Ellenberg shows us how limiting it is to see math as a remote, abstract subject: math isn't confined to abstract incidents that never occur in real life, but rather touches everything we do; the whole world is shot through with it.

Math allows us to see the hidden structures underneath the messy and chaotic surface of our world. It's a science of not being wrong, hammered out by centuries of hard work and argument. Armed with the tools of mathematics, we can see through to the true meaning of information we take for granted: How early should you get to the airport? What does "public opinion" really represent? Why do tall parents have shorter children? Who really won Florida in 2000? And how likely are you, really, to develop cancer?

How Not to Be Wrong presents the surprising revelations behind all of these questions and many more, using the mathematician's method of analyzing life and exposing the hard-won insights of the academic community to the layman, minus the jargon. Ellenberg chases mathematical threads through a vast range of time and space, from the everyday to the cosmic, encountering, among other things, baseball, Reaganomics, daring lottery schemes, Voltaire, the replicability crisis in psychology, Italian Renaissance painting, artificial languages, the development of non-Euclidean geometry, the coming obesity apocalypse, Antonin Scalia's views on crime and punishment, the psychology of slime molds, what Facebook can and can't figure out about you, and the existence of God.

Ellenberg pulls from history as well as from the latest theoretical developments to provide those not trained in math with the knowledge they need. Math, as Ellenberg says, is "an atomic-powered prosthesis that you attach to your common sense, vastly multiplying its reach and strength." With the tools of mathematics in hand, you can understand the world in a deeper, more meaningful way. How Not to Be Wrong will show you how.
Machine Learning: A Probabilistic Perspective
Kevin P. Murphy - 2012
Machine learning provides automated methods of data analysis: methods that can automatically detect patterns in data and then use the uncovered patterns to predict future data. This textbook offers a comprehensive and self-contained introduction to the field of machine learning, based on a unified, probabilistic approach. The coverage combines breadth and depth, offering necessary background material on such topics as probability, optimization, and linear algebra as well as discussion of recent developments in the field, including conditional random fields, L1 regularization, and deep learning. The book is written in an informal, accessible style, complete with pseudo-code for the most important algorithms. All topics are copiously illustrated with color images and worked examples drawn from such application domains as biology, text processing, computer vision, and robotics. Rather than providing a cookbook of different heuristic methods, the book stresses a principled model-based approach, often using the language of graphical models to specify models in a concise and intuitive way. Almost all the models described have been implemented in a MATLAB software package, PMTK (probabilistic modeling toolkit), which is freely available online. The book is suitable for upper-level undergraduates with an introductory-level college math background and beginning graduate students.
Causality: Models, Reasoning, and Inference
Judea Pearl - 2000
This book shows how causality has grown from a nebulous concept into a mathematical theory with significant applications in the fields of statistics, artificial intelligence, philosophy, cognitive science, and the health and social sciences. Pearl presents a unified account of the probabilistic, manipulative, counterfactual, and structural approaches to causation, and devises simple mathematical tools for analyzing the relationships between causal connections, statistical associations, actions, and observations. The book will open the way for including causal analysis in the standard curriculum of statistics, artificial intelligence, business, epidemiology, social science, and economics. Students in these areas will find natural models, simple identification procedures, and precise mathematical definitions of causal concepts that traditional texts have tended to evade or make unduly complicated. This book will be of interest to professionals and students in a wide variety of fields. Anyone who wishes to elucidate meaningful relationships from data, predict effects of actions and policies, assess explanations of reported events, or form theories of causal understanding and causal speech will find this book stimulating and invaluable. A professor of computer science at UCLA, Judea Pearl is the winner of the 2008 Benjamin Franklin Medal in Computers and Cognitive Science.
Computer Age Statistical Inference: Algorithms, Evidence, and Data Science
Bradley Efron - 2016
'Big data', 'data science', and 'machine learning' have become familiar terms in the news, as statistical methods are brought to bear upon the enormous data sets of modern science and commerce. How did we get here? And where are we going? This book takes us on an exhilarating journey through the revolution in data analysis following the introduction of electronic computation in the 1950s. Beginning with classical inferential theories - Bayesian, frequentist, Fisherian - individual chapters take up a series of influential topics: survival analysis, logistic regression, empirical Bayes, the jackknife and bootstrap, random forests, neural networks, Markov chain Monte Carlo, inference after model selection, and dozens more. The distinctly modern approach integrates methodology and algorithms with statistical inference. The book ends with speculation on the future direction of statistics and data science.
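As a small taste of one resampling idea mentioned above, here is a minimal nonparametric bootstrap sketch in plain Python; it is not drawn from the book, and the function name and data are invented for the example.

```python
# Estimate the standard error of a sample mean by resampling with replacement
# (nonparametric bootstrap). Illustrative sketch only.
import random
import statistics

def bootstrap_se_of_mean(sample, n_resamples=2000, seed=0):
    """Approximate the standard error of the mean via bootstrap resampling."""
    rng = random.Random(seed)
    resample_means = []
    for _ in range(n_resamples):
        resample = [rng.choice(sample) for _ in sample]  # draw with replacement
        resample_means.append(statistics.mean(resample))
    return statistics.stdev(resample_means)

data = [2.1, 3.4, 2.9, 4.0, 3.3, 2.7, 3.8, 3.1]
print(round(bootstrap_se_of_mean(data), 3))
```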
Probabilistic Graphical Models: Principles and Techniques
Daphne Koller - 2009
The framework of probabilistic graphical models, presented in this book, provides a general approach for modeling and reasoning about complex systems under uncertainty. The approach is model-based, allowing interpretable models to be constructed and then manipulated by reasoning algorithms. These models can also be learned automatically from data, allowing the approach to be used in cases where manually constructing a model is difficult or even impossible. Because uncertainty is an inescapable aspect of most real-world applications, the book focuses on probabilistic models, which make the uncertainty explicit and provide models that are more faithful to reality. Probabilistic Graphical Models discusses a variety of models, spanning Bayesian networks, undirected Markov networks, discrete and continuous models, and extensions to deal with dynamical systems and relational data. For each class of models, the text describes the three fundamental cornerstones: representation, inference, and learning, presenting both basic concepts and advanced techniques. Finally, the book considers the use of the proposed framework for causal reasoning and decision making under uncertainty. The main text in each chapter provides the detailed technical development of the key ideas. Most chapters also include boxes with additional material: skill boxes, which describe techniques; case study boxes, which discuss empirical cases related to the approach described in the text, including applications in computer vision, robotics, natural language understanding, and computational biology; and concept boxes, which present significant concepts drawn from the material in the chapter. Instructors (and readers) can group chapters in various combinations, from core topics to more technically advanced material, to suit their particular needs.
Schaum's Outline of Calculus
Frank Ayres Jr. - 1990
Students will also find the related analytic geometry much easier. The clear review of algebra and geometry in this edition will make calculus easier for students who wish to strengthen their knowledge in these areas. Updated to meet the emphasis in current courses, this new edition of a popular guide (more than 104,000 copies of the prior edition were sold) includes problems and examples using graphing calculators.
Applied Linear Regression Models, 4th Edition with Student CD (McGraw-Hill/Irwin Series: Operations and Decision Sciences)
Michael H. Kutner - 2003
Cases, datasets, and examples allow for a more real-world perspective and explore relevant uses of regression techniques in business today.
Probability and Statistics for Engineering and the Sciences
Jay L. Devore - 1982
In this book, a wealth of exercises is provided throughout each section, designed to reinforce learning and the logical comprehension of topics. The use of real data is incorporated much more extensively than in any other book on the market, and coverage of computer-based methods is strong, especially in the treatment of analysis of variance and regression. This text stresses mastery of methods most often used in medical research, with specific reference to actual medical literature and actual medical research. The approach minimizes mathematical formulation, yet gives complete explanations of all important concepts. Every new concept is systematically developed through completely worked-out examples from current medical research problems. Computer output is used to illustrate concepts when appropriate.
The Fractal Geometry of Nature
Benoît B. Mandelbrot - 1977
The complexity of nature's shapes differs in kind, not merely degree, from that of the shapes of ordinary geometry; describing them requires a new geometry, the geometry of fractal shapes. Now that the field has expanded greatly with many active researchers, Mandelbrot presents the definitive overview of the origins of his ideas and their new applications. The Fractal Geometry of Nature is based on his highly acclaimed earlier work, but has much broader and deeper coverage and more extensive illustrations.