Econometric Analysis of Cross Section and Panel Data


Jeffrey M. Wooldridge - 2001
    The book makes clear that applied microeconometrics is about the estimation of marginal and treatment effects, and that parametric estimation is simply a means to this end. It also clarifies the distinction between causality and statistical association. The book focuses specifically on cross section and panel data methods. Population assumptions are stated separately from sampling assumptions, leading to simple statements as well as to important insights. The unified approach to linear and nonlinear models and to cross section and panel data enables straightforward coverage of more advanced methods. The numerous end-of-chapter problems are an important component of the book. Some problems contain important points not fully described in the text, and others cover new ideas that can be analyzed using tools presented in the current and previous chapters. Several problems require the use of the data sets located at the author's website.

Physics, Volume 1


Robert Resnick - 1966
    The Fourth Edition of volumes 1 and 2 is concerned with mechanics and E&M/Optics. New features include: expanded coverage of classic physics topics, substantial increases in the number of in-text examples which reinforce text exposition, the latest pedagogical and technical advances in the field, numerical analysis, computer-generated graphics, computer projects and much more.

Introduction to Electrodynamics


David J. Griffiths - 1981
    This work offers accessible coverage of the fundamentals of electrodynamics, enhanced with discussion points, examples and exercises.

All the Mathematics You Missed


Thomas A. Garrity - 2001
    This book will offer students a broad outline of essential mathematics and will help to fill in the gaps in their knowledge. The author explains the basic points and a few key results of all the most important undergraduate topics in mathematics, emphasizing the intuitions behind the subject. The topics include linear algebra, vector calculus, differential and analytical geometry, real analysis, point-set topology, probability, complex analysis, set theory, algorithms, and more. An annotated bibliography offers a guide to further reading and to more rigorous foundations.

Statistical Mechanics


R.K. Pathria - 1972
    'Highly recommended for graduate-level libraries.' (Choice) This highly successful text, which first appeared in 1972 and has continued to be popular ever since, has now been brought up to date by incorporating the remarkable developments in the field of 'phase transitions and critical phenomena' that took place over the intervening years. This has been done by adding three new chapters (comprising over 150 pages and containing over 60 homework problems), which should enhance the usefulness of the book for both students and instructors. We trust that this classic text, which has been widely acclaimed for its clean derivations and clear explanations, will continue to provide further generations of students with a sound training in the methods of statistical physics.

Linear Algebra With Applications


Steven J. Leon - 1980
    Each chapter contains integrated worked examples and chapter tests. This edition has the ancillary ATLAST computer exercise guide and new MATLAB and Maple guides.

Concrete Mathematics: A Foundation for Computer Science


Ronald L. Graham - 1988
    "More concretely," the authors explain, "it is the controlled manipulation of mathematical formulas, using a collection of techniques for solving problems."

Schaum's Outline of Calculus


Frank Ayres Jr. - 1990
    The clear review of algebra and geometry in this edition will make calculus easier for students who wish to strengthen their knowledge in these areas; they'll also find the related analytic geometry much easier. Updated to meet the emphasis in current courses, this new edition of a popular guide--more than 104,000 copies of the prior edition were sold--includes problems and examples using graphing calculators.

Introduction to Graph Theory


Douglas B. West - 1995
    Verification that algorithms work is emphasized more than their complexity. An effective use of examples and a huge number of interesting exercises demonstrate the topics of trees and distance, matchings and factors, connectivity and paths, graph coloring, edges and cycles, and planar graphs. The book suits those who need to learn to make coherent arguments in the fields of mathematics and computer science.

The Elements of Statistical Learning: Data Mining, Inference, and Prediction


Trevor Hastie - 2001
    The explosion in computation and information technology in recent decades has brought with it vast amounts of data in a variety of fields such as medicine, biology, finance, and marketing. The challenge of understanding these data has led to the development of new tools in the field of statistics and spawned new areas such as data mining, machine learning, and bioinformatics. Many of these tools have common underpinnings but are often expressed with different terminology. This book describes the important ideas in these areas in a common conceptual framework. While the approach is statistical, the emphasis is on concepts rather than mathematics. Many examples are given, with a liberal use of color graphics. It should be a valuable resource for statisticians and anyone interested in data mining in science or industry. The book's coverage is broad, from supervised learning (prediction) to unsupervised learning. The many topics include neural networks, support vector machines, classification trees, and boosting (the first comprehensive treatment of this topic in any book). Trevor Hastie, Robert Tibshirani, and Jerome Friedman are professors of statistics at Stanford University. They are prominent researchers in this area: Hastie and Tibshirani developed generalized additive models and wrote a popular book of that title. Hastie wrote much of the statistical modeling software in S-PLUS and invented principal curves and surfaces. Tibshirani proposed the Lasso and is co-author of the very successful An Introduction to the Bootstrap. Friedman is the co-inventor of many data-mining tools including CART, MARS, and projection pursuit.

Information Theory: A Tutorial Introduction


James V. Stone - 2015
    In this richly illustrated book, accessible examples are used to show how information theory can be understood in terms of everyday games like '20 Questions', and the simple MATLAB programs provided give hands-on experience of information theory in action. Written in a tutorial style, with a comprehensive glossary, this text represents an ideal primer for novices who wish to become familiar with the basic principles of information theory. Download chapter 1 from http://jim-stone.staff.shef.ac.uk/Boo...

Linear Algebra and Its Applications [with CD-ROM]


David C. Lay - 1993
    

Physics for Scientists and Engineers


Paul Allen Tipler - 1981
    Now in its fourth edition, the work has been extensively revised, with entirely new artwork, updated examples and new pedagogical features. An interactive CD-ROM with worked examples is included. Alternatively, the material from the CD-ROM can be downloaded from a website (see supplements section). Twentieth-century developments such as quantum mechanics are introduced early on, so that students can appreciate their importance and see how they fit into the bigger picture.

Calculus with Analytic Geometry


Earl W. Swokowski - 1979
    

Probability And Statistics For Engineering And The Sciences


Jay L. Devore - 1982
    In this book, a wealth of exercises is provided throughout each section, designed to reinforce learning and the logical comprehension of topics. The use of real data is incorporated much more extensively than in any other book on the market. The text offers strong coverage of computer-based methods, especially in the treatment of analysis of variance and regression. It stresses mastery of methods most often used in medical research, with specific reference to actual medical literature and actual medical research. The approach minimizes mathematical formulation, yet gives complete explanations of all important concepts. Every new concept is systematically developed through completely worked-out examples from current medical research problems. Computer output is used to illustrate concepts when appropriate.