Book picks similar to
Matrix Algebra: Theory, Computations, and Applications in Statistics by James E. Gentle
Statistics in Plain English
Timothy C. Urdan - 2001
Each self-contained chapter consists of three sections. The first describes the statistic, including how it is used and what information it provides. The second section reviews how it works, how to calculate the formula, the strengths and weaknesses of the technique, and the conditions needed for its use. The final section provides examples that use and interpret the statistic. A glossary of terms and symbols is also included. New features in the second edition include: an interactive CD with PowerPoint presentations and problems for each chapter, including an overview of each problem's solution; new chapters on basic research concepts (sampling, definitions of different types of variables, and basic research designs) and on nonparametric statistics; more graphs and more precise descriptions of each statistic; and a discussion of confidence intervals. This brief paperback is an ideal supplement for statistics and research methods courses, for courses that use statistics, or as a reference tool to refresh one's memory about key concepts. The actual research examples are from psychology, education, and other social and behavioral sciences. Materials formerly available with this book on CD-ROM are now available for download from www.psypress.com.
Elementary Number Theory and Its Applications
Kenneth H. Rosen - 1984
The Fourth Edition builds on this strength with new examples, additional applications, and increased cryptology coverage. Up-to-date information on the latest discoveries is included. Elementary Number Theory and Its Applications provides a diverse group of exercises, including basic exercises designed to help students develop skills, challenging exercises, and computer projects. In addition to years of use and professor feedback, the fourth edition of this text has been thoroughly accuracy-checked to ensure the quality of the mathematical content and the exercises.
Introductory Statistics
Prem S. Mann - 2006
The realistic content of its examples and exercises, the clarity and brevity of its presentation, and the soundness of its pedagogical approach have earned the highest praise from both students and instructors. Now this bestseller is available in a new 6th edition.
The Elements of Statistical Learning: Data Mining, Inference, and Prediction
Trevor Hastie - 2001
With it has come vast amounts of data in a variety of fields such as medicine, biology, finance, and marketing. The challenge of understanding these data has led to the development of new tools in the field of statistics, and spawned new areas such as data mining, machine learning, and bioinformatics. Many of these tools have common underpinnings but are often expressed with different terminology. This book describes the important ideas in these areas in a common conceptual framework. While the approach is statistical, the emphasis is on concepts rather than mathematics. Many examples are given, with a liberal use of color graphics. It should be a valuable resource for statisticians and anyone interested in data mining in science or industry. The book's coverage is broad, from supervised learning (prediction) to unsupervised learning. The many topics include neural networks, support vector machines, classification trees and boosting—the first comprehensive treatment of this topic in any book. Trevor Hastie, Robert Tibshirani, and Jerome Friedman are professors of statistics at Stanford University. They are prominent researchers in this area: Hastie and Tibshirani developed generalized additive models and wrote a popular book of that title. Hastie wrote much of the statistical modeling software in S-PLUS and invented principal curves and surfaces. Tibshirani proposed the Lasso and is co-author of the very successful An Introduction to the Bootstrap. Friedman is the co-inventor of many data-mining tools including CART, MARS, and projection pursuit.
Schaum's Outline of Complex Variables
Murray R. Spiegel - 1968
Contains 640 problems including solutions; additional practice problems with answers; explanations of complex variable theory; coverage of applications of complex variables in engineering, physics, and elsewhere, with accompanying sample problems and solutions.
Introduction to Probability Models
Sheldon M. Ross - 1972
This updated edition of Ross's classic bestseller provides an introduction to elementary probability theory and stochastic processes, and shows how probability theory can be applied to the study of phenomena in fields such as engineering, computer science, management science, the physical and social sciences, and operations research. With the addition of several new sections of interest to actuaries, this text is highly recommended by the Society of Actuaries. The book now contains a new section on compound random variables that can be used to establish a recursive formula for computing probability mass functions for a variety of common compounding distributions; a new section on hidden Markov chains, including the forward and backward approaches for computing the joint probability mass function of the signals, as well as the Viterbi algorithm for determining the most likely sequence of states; and a simplified approach for analyzing nonhomogeneous Poisson processes. There are also additional results on queues, relating to the conditional distribution of the number found by an M/M/1 arrival who spends a time t in the system; the inspection paradox for M/M/1 queues; and the M/G/1 queue with server breakdown. Furthermore, the book includes new examples and exercises, along with compulsory material for the new Exam 3 of the Society of Actuaries. This book is essential reading for professionals and students in actuarial science, engineering, operations research, and other fields in applied probability.
Information Theory, Inference and Learning Algorithms
David J.C. MacKay - 2002
These topics lie at the heart of many exciting areas of contemporary science and engineering: communication, signal processing, data mining, machine learning, pattern recognition, computational neuroscience, bioinformatics, and cryptography. This textbook introduces theory in tandem with applications. Information theory is taught alongside practical communication systems, such as arithmetic coding for data compression and sparse-graph codes for error correction. A toolbox of inference techniques, including message-passing algorithms, Monte Carlo methods, and variational approximations, is developed alongside applications of these tools to clustering, convolutional codes, independent component analysis, and neural networks. The final part of the book describes the state of the art in error-correcting codes, including low-density parity-check codes, turbo codes, and digital fountain codes, the twenty-first-century standards for satellite communications, disk drives, and data broadcast. Richly illustrated, filled with worked examples and over 400 exercises, some with detailed solutions, David MacKay's groundbreaking book is ideal for self-learning and for undergraduate or graduate courses. Interludes on crosswords, evolution, and sex provide entertainment along the way. In sum, this is a textbook on information, communication, and coding for a new generation of students, and an unparalleled entry point into these subjects for professionals in areas as diverse as computational biology, financial engineering, and machine learning.
Matrix Computations
Gene H. Golub - 1983
It includes rewritten and clarified proofs and derivations, as well as new topics such as the Arnoldi iteration and domain decomposition methods.
Abstract Algebra
I.N. Herstein - 1986
Providing a concise introduction to abstract algebra, this work unfolds some of the fundamental systems with the aim of reaching applicable, significant results.
Numerical Optimization
Jorge Nocedal - 2000
One can trace its roots to the Calculus of Variations and the work of Euler and Lagrange. This natural and reasonable approach to mathematical programming covers numerical methods for finite-dimensional optimization problems. It begins with very simple ideas and progresses through more complicated concepts, concentrating on methods for both unconstrained and constrained optimization.
An Introduction to Statistical Learning: With Applications in R
Gareth James - 2013
This book presents some of the most important modeling and prediction techniques, along with relevant applications. Topics include linear regression, classification, resampling methods, shrinkage approaches, tree-based methods, support vector machines, clustering, and more. Color graphics and real-world examples are used to illustrate the methods presented. Since the goal of this textbook is to facilitate the use of these statistical learning techniques by practitioners in science, industry, and other fields, each chapter contains a tutorial on implementing the analyses and methods presented in R, an extremely popular open source statistical software platform. Two of the authors co-wrote The Elements of Statistical Learning (Hastie, Tibshirani and Friedman, 2nd edition 2009), a popular reference book for statistics and machine learning researchers. An Introduction to Statistical Learning covers many of the same topics, but at a level accessible to a much broader audience. This book is targeted at statisticians and non-statisticians alike who wish to use cutting-edge statistical learning techniques to analyze their data. The text assumes only a previous course in linear regression and no knowledge of matrix algebra.
Head First Statistics
Dawn Griffiths - 2008
Whether you're a student, a professional, or just curious about statistical analysis, Head First's brain-friendly formula helps you get a firm grasp of statistics so you can understand key points and actually use them. Learn to present data visually with charts and plots; discover the difference between taking the average with mean, median, and mode, and why it's important; learn how to calculate probability and expectation; and much more. Head First Statistics is ideal for high school and college students taking statistics and satisfies the requirements for passing the College Board's Advanced Placement (AP) Statistics Exam. With this book, you'll: study the full range of topics covered in first-year statistics; tackle tough statistical concepts using Head First's dynamic, visually rich format proven to stimulate learning and help you retain knowledge; explore real-world scenarios, ranging from casino gambling to prescription drug testing, to bring statistical principles to life; discover how to measure spread, calculate odds through probability, and understand the normal, binomial, geometric, and Poisson distributions; and conduct sampling, use correlation and regression, do hypothesis testing, perform chi-square analysis, and more. Before you know it, you'll not only have mastered statistics, you'll also see how they work in the real world. Head First Statistics will help you pass your statistics course and give you a firm understanding of the subject so you can apply the knowledge throughout your life.
Statistics Done Wrong: The Woefully Complete Guide
Alex Reinhart - 2013
Politicians and marketers present shoddy evidence for dubious claims all the time. But smart people make mistakes too, and when it comes to statistics, plenty of otherwise great scientists (yes, even those published in peer-reviewed journals) are doing statistics wrong. "Statistics Done Wrong" comes to the rescue with cautionary tales of all-too-common statistical fallacies. It'll help you see where and why researchers often go wrong and teach you the best practices for avoiding their mistakes. In this book, you'll learn why "statistically significant" doesn't necessarily imply practical significance; the ideas behind hypothesis testing and regression analysis, and common misinterpretations of those ideas; how and how not to ask questions, design experiments, and work with data; why many studies have too little data to detect what they're looking for and, surprisingly, why this means published results are often overestimates; and why false positives are much more common than "significant at the 5% level" would suggest. By walking through colorful examples of statistics gone awry, the book offers approachable lessons on proper methodology, and each chapter ends with pro tips for practicing scientists and statisticians. No matter what your level of experience, "Statistics Done Wrong" will teach you how to be a better analyst, data scientist, or researcher.
An Introduction to Probability Theory and Its Applications, Volume 1
William Feller - 1968
Beginning with the background and very nature of probability theory, the book then proceeds through sample spaces, combinatorial analysis, fluctuations in coin tossing and random walks, the combination of events, types of distributions, Markov chains, stochastic processes, and more. The book's comprehensive approach provides a complete view of the theory, with enlightening examples along the way.
Introductory Statistics
Neil A. Weiss - 1987
This book emphasizes statistical thinking over rote drill and practice. Chapters cover: The Nature of Statistics; Organizing Data; Descriptive Measures; Probability Concepts; Discrete Random Variables; The Normal Distribution; The Sampling Distribution of the Sample Mean; Confidence Intervals for One Population Mean; Hypothesis Tests for One Population Mean; Inferences for Two Population Means; Inferences for Population Standard Deviations; Inferences for Population Proportions; Chi-Square Procedures; Descriptive Methods in Regression and Correlation; Inferential Methods in Regression and Correlation; and Analysis of Variance (ANOVA).
For all readers interested in Introductory Statistics.