Book picks similar to
Pattern Recognition and Classification: An Introduction by Geoff Dougherty
statistics
curiosity
computer-science
machine-learning
Doing Bayesian Data Analysis: A Tutorial Introduction with R and BUGS
John K. Kruschke - 2010
Included are step-by-step instructions on how to carry out Bayesian data analyses.
Statistical Rethinking: A Bayesian Course with Examples in R and Stan
Richard McElreath - 2015
Reflecting the need for at least some scripting in today's model-based statistics, the book pushes readers to perform step-by-step calculations that are usually automated. This unique computational approach ensures that readers understand enough of the details to make reasonable choices and interpretations in their own modeling work. The text presents generalized linear multilevel models from a Bayesian perspective, relying on a simple logical interpretation of Bayesian probability and maximum entropy. It covers everything from the basics of regression to multilevel models. The author also discusses measurement error, missing data, and Gaussian process models for spatial and network autocorrelation. By using complete R code examples throughout, this book provides a practical foundation for performing statistical inference. Designed for both PhD students and seasoned professionals in the natural and social sciences, it prepares them for more advanced or specialized statistical modeling. Web resource: the book is accompanied by an R package (rethinking), available on the author's website and GitHub. The package's two core functions, map and map2stan, allow a variety of statistical models to be constructed from standard model formulas.
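The step-by-step style the blurb describes shows up as early as the book's grid-approximation exercises. A minimal sketch of that kind of calculation, in Python rather than the book's R (the binomial data and grid size are illustrative assumptions):

```python
import numpy as np

# Grid approximation of the posterior for a binomial proportion p,
# the kind of "usually automated" calculation done step by step here.
k, n = 6, 9                        # illustrative data: k successes in n trials
grid = np.linspace(0, 1, 1000)     # candidate values of p
prior = np.ones_like(grid)         # flat prior over p

# Binomial likelihood at each grid point (the constant factor cancels).
likelihood = grid**k * (1 - grid)**(n - k)

posterior = likelihood * prior
posterior /= posterior.sum()       # normalize so the grid sums to 1

print("posterior mean of p:", (grid * posterior).sum())
```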
Coding the Matrix: Linear Algebra through Computer Science Applications
Philip N. Klein - 2013
Mathematical concepts and computational problems are motivated by applications in computer science. The reader learns by "doing," writing programs to implement the mathematical concepts and using them to carry out tasks and explore the applications. Examples include: error-correcting codes, transformations in graphics, face detection, encryption and secret-sharing, integer factoring, removing perspective from an image, PageRank (Google's ranking algorithm), and cancer detection from cell features. A companion web site, codingthematrix.com, provides data and support code. Most of the assignments can be auto-graded online. There are over two hundred illustrations, including a selection of relevant "xkcd" comics. Chapters: "The Function," "The Field," "The Vector," "The Vector Space," "The Matrix," "The Basis," "Dimension," "Gaussian Elimination," "The Inner Product," "Special Bases," "The Singular Value Decomposition," "The Eigenvector," "The Linear Program"
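PageRank, one of the applications listed, reduces to repeated matrix-vector multiplication. A minimal power-iteration sketch in Python with NumPy (the four-page link graph and damping factor are illustrative assumptions; the book has readers build their own vector and matrix classes instead):

```python
import numpy as np

# Power iteration for PageRank on a toy four-page web.
# links[i][j] = 1 means page i links to page j (illustrative data).
links = np.array([[0, 1, 1, 0],
                  [0, 0, 1, 0],
                  [1, 0, 0, 1],
                  [0, 0, 1, 0]], dtype=float)

# Column-stochastic transition matrix: column j spreads page j's
# rank evenly over the pages it links to.
M = (links / links.sum(axis=1, keepdims=True)).T

d = 0.85                           # damping factor
n = M.shape[0]
rank = np.full(n, 1.0 / n)         # start from a uniform distribution

for _ in range(100):               # iterate to (approximate) convergence
    rank = (1 - d) / n + d * (M @ rank)

print("PageRank scores:", rank)
```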
Probabilistic Robotics
Sebastian Thrun - 2005
Building on the field of mathematical statistics, probabilistic robotics endows robots with a new level of robustness in real-world situations. This book introduces the reader to a wealth of techniques and algorithms in the field. All algorithms are based on a single overarching mathematical foundation. Each chapter provides example implementations in pseudocode, detailed mathematical derivations, discussions from a practitioner's perspective, and extensive lists of exercises and class projects. The book's Web site, www.probabilistic-robotics.org, has additional material. The book is relevant for anyone involved in robotic software development and scientific research. It will also be of interest to applied statisticians and engineers dealing with real-world sensor data.
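That single overarching foundation is the Bayes filter. A minimal one-dimensional histogram-filter sketch in Python (the door map, motion model, and sensor probabilities are illustrative assumptions, not the book's pseudocode):

```python
import numpy as np

# 1-D histogram (discrete Bayes) filter: a robot on a ring of five cells
# alternates a noisy move-right step with a noisy door sensor reading.
doors = np.array([1, 0, 0, 1, 0])   # illustrative map: 1 = door at that cell
belief = np.full(5, 0.2)            # uniform prior over the cells

def predict(belief, p_move=0.8):
    """Motion update: move one cell right with prob p_move, else stay."""
    return p_move * np.roll(belief, 1) + (1 - p_move) * belief

def correct(belief, z, p_hit=0.9):
    """Measurement update: reweight cells that agree with observation z."""
    likelihood = np.where(doors == z, p_hit, 1 - p_hit)
    posterior = likelihood * belief
    return posterior / posterior.sum()

belief = predict(belief)
belief = correct(belief, z=1)       # the robot senses a door
print("posterior belief:", belief)
```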
Forecasting: Principles and Practice
Rob J. Hyndman - 2013
Deciding whether to build another power generation plant in the next five years requires forecasts of future demand. Scheduling staff in a call centre next week requires forecasts of call volumes. Stocking an inventory requires forecasts of stock requirements. Telecommunication routing requires traffic forecasts a few minutes ahead. Whatever the circumstances or time horizons involved, forecasting is an important aid in effective and efficient planning. This textbook provides a comprehensive introduction to forecasting methods and presents enough information about each method for readers to use them sensibly. Examples use R with many data sets taken from the authors' own consulting experience.
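Simple exponential smoothing, one of the basic methods the book covers, fits in a few lines. A minimal sketch in Python (the demand series and smoothing parameter are illustrative assumptions; the book's own examples use the authors' R packages):

```python
# Simple exponential smoothing: the one-step-ahead forecast is a weighted
# average of past observations, with weights decaying geometrically.
def ses_forecast(y, alpha=0.3):
    level = y[0]                   # initialize the level at the first value
    for obs in y[1:]:
        level = alpha * obs + (1 - alpha) * level
    return level                   # forecast for the next period

demand = [120, 132, 128, 141, 150, 147, 155]   # illustrative monthly demand
print("next-period forecast:", ses_forecast(demand))
```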
Elements of Information Theory
Thomas M. Cover - 1991
Readers are provided once again with an instructive mix of mathematics, physics, statistics, and information theory. All the essential topics in information theory are covered in detail, including entropy, data compression, channel capacity, rate distortion, network information theory, and hypothesis testing. The authors provide readers with a solid understanding of the underlying theory and applications. Problem sets and a telegraphic summary at the end of each chapter further assist readers. The historical notes that follow each chapter recap the main points. The Second Edition features:
* Chapters reorganized to improve teaching
* 200 new problems
* New material on source coding, portfolio theory, and feedback capacity
* Updated references
Now current and enhanced, the Second Edition of Elements of Information Theory remains the ideal textbook for upper-level undergraduate and graduate courses in electrical engineering, statistics, and telecommunications.
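Entropy, the first topic in that list, is compact enough to compute directly: H(X) = -Σ p(x) log₂ p(x), measured in bits. A minimal Python sketch (the coin distributions are illustrative):

```python
import math

def entropy(p):
    """Shannon entropy in bits: H(X) = -sum of p(x) * log2(p(x))."""
    return -sum(px * math.log2(px) for px in p if px > 0)

# A fair coin carries exactly one bit; a biased coin carries less.
print(entropy([0.5, 0.5]))   # 1.0
print(entropy([0.9, 0.1]))   # ~0.469
```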
Data Jujitsu: The Art of Turning Data into Product
D.J. Patil - 2012
Acclaimed data scientist DJ Patil details a new approach to solving problems in Data Jujitsu. Learn how to use a problem's "weight" against itself to:
* Break down seemingly complex data problems into simplified parts
* Use alternative data analysis techniques to examine them
* Use human input, such as Mechanical Turk, and design tricks that enlist the help of your users to take short cuts around tough problems
Learn more about the problems before starting on the solutions—and use the findings to solve them, or determine whether the problems are worth solving at all.
Feature Engineering for Machine Learning
Alice Zheng - 2018
With this practical book, you’ll learn techniques for extracting and transforming features—the numeric representations of raw data—into formats for machine-learning models. Each chapter guides you through a single data problem, such as how to represent text or image data. Together, these examples illustrate the main principles of feature engineering. Rather than simply teach these principles, authors Alice Zheng and Amanda Casari focus on practical application, with exercises throughout the book. The closing chapter brings everything together by tackling a real-world, structured dataset with several feature-engineering techniques. Python packages including NumPy, pandas, scikit-learn, and Matplotlib are used in the code examples.
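Representing text, one of the per-chapter problems mentioned, can be sketched with the same packages the book uses. A minimal bag-of-words example with scikit-learn's CountVectorizer (the toy corpus is an illustrative assumption, not an example from the book):

```python
# Bag-of-words: turn raw text into the numeric feature matrix a model expects.
from sklearn.feature_extraction.text import CountVectorizer

corpus = ["the cat sat on the mat",
          "the dog sat on the log"]        # illustrative toy corpus

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(corpus)       # rows = documents, columns = words

print(vectorizer.get_feature_names_out())  # learned vocabulary
print(X.toarray())                         # word counts per document
```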
Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor
Virginia Eubanks - 2018
In Pittsburgh, a child welfare agency uses a statistical model to try to predict which children might be future victims of abuse or neglect. Since the dawn of the digital age, decision-making in finance, employment, politics, health and human services has undergone revolutionary change. Today, automated systems—rather than humans—control which neighborhoods get policed, which families attain needed resources, and who is investigated for fraud. While we all live under this new regime of data, the most invasive and punitive systems are aimed at the poor.

In Automating Inequality, Virginia Eubanks systematically investigates the impacts of data mining, policy algorithms, and predictive risk models on poor and working-class people in America. The book is full of heart-wrenching and eye-opening stories, from a woman in Indiana whose benefits are literally cut off as she lies dying to a family in Pennsylvania in daily fear of losing their daughter because they fit a certain statistical profile.

The U.S. has always used its most cutting-edge science and technology to contain, investigate, discipline, and punish the destitute. Like the county poorhouse and scientific charity before them, digital tracking and automated decision-making hide poverty from the middle-class public and give the nation the ethical distance it needs to make inhumane choices: which families get food and which starve, who has housing and who remains homeless, and which families are broken up by the state. In the process, they weaken democracy and betray our most cherished national values. This deeply researched and passionate book could not be more timely.

Naomi Klein: "This book is downright scary." Ethan Zuckerman, MIT: "Should be required reading." Dorothy Roberts, author of Killing the Black Body: "A must-read for everyone concerned about modern tools of inequality in America." Astra Taylor, author of The People's Platform: "This is the single most important book about technology you will read this year."
Data Science
John D. Kelleher - 2018
Today data science determines the ads we see online, the books and movies that are recommended to us online, which emails are filtered into our spam folders, and even how much we pay for health insurance. This volume in the MIT Press Essential Knowledge series offers a concise introduction to the emerging field of data science, explaining its evolution, current uses, data infrastructure issues, and ethical challenges.

It has never been easier for organizations to gather, store, and process data. Use of data science is driven by the rise of big data and social media, the development of high-performance computing, and the emergence of such powerful methods for data analysis and modeling as deep learning. Data science encompasses a set of principles, problem definitions, algorithms, and processes for extracting non-obvious and useful patterns from large datasets. It is closely related to the fields of data mining and machine learning, but broader in scope. This book offers a brief history of the field, introduces fundamental data concepts, and describes the stages in a data science project. It considers data infrastructure and the challenges posed by integrating data from multiple sources, introduces the basics of machine learning, and discusses how to link machine learning expertise with real-world problems. The book also reviews ethical and legal issues, developments in data regulation, and computational approaches to preserving privacy. Finally, it considers the future impact of data science and offers principles for success in data science projects.
Introduction to Probability
Dimitri P. Bertsekas - 2002
This is the currently used textbook for "Probabilistic Systems Analysis," an introductory probability course at the Massachusetts Institute of Technology, attended by a large number of undergraduate and graduate students. The book covers the fundamentals of probability theory (probabilistic models, discrete and continuous random variables, multiple random variables, and limit theorems), which are typically part of a first course on the subject. It also contains a number of more advanced topics, from which an instructor can choose to match the goals of a particular course. These topics include transforms, sums of random variables, least squares estimation, the bivariate normal distribution, and a fairly detailed introduction to Bernoulli, Poisson, and Markov processes. The book strikes a balance between simplicity in exposition and sophistication in analytical reasoning. Some of the more mathematically rigorous analysis is explained only intuitively in the text and is developed in detail (at the level of advanced calculus) in the numerous solved theoretical problems. The book has been widely adopted for classroom use in introductory probability courses within the USA and abroad.
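The limit theorems mentioned above are easy to see by simulation. A minimal sketch of the weak law of large numbers for Bernoulli trials in Python (the success probability and sample sizes are illustrative assumptions; the book develops these results analytically):

```python
import numpy as np

# Law of large numbers: the sample mean of i.i.d. Bernoulli(p) trials
# converges to p as the number of trials grows.
rng = np.random.default_rng(0)
p = 0.3                                  # illustrative success probability

for n in (10, 1_000, 100_000):
    trials = rng.random(n) < p           # n Bernoulli(p) samples
    print(f"n = {n:>6}: sample mean = {trials.mean():.4f}")
```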
Computer Vision: Algorithms and Applications
Richard Szeliski - 2010
However, despite all of the recent advances in computer vision research, the dream of having a computer interpret an image at the same level as a two-year-old remains elusive. Why is computer vision such a challenging problem, and what is the current state of the art?

Computer Vision: Algorithms and Applications explores the variety of techniques commonly used to analyze and interpret images. It also describes challenging real-world applications where vision is being successfully used, both for specialized applications such as medical imaging, and for fun, consumer-level tasks such as image editing and stitching, which students can apply to their own personal photos and videos. More than just a source of "recipes," this exceptionally authoritative and comprehensive textbook/reference also takes a scientific approach to basic vision problems, formulating physical models of the imaging process before inverting them to produce descriptions of a scene. These problems are also analyzed using statistical models and solved using rigorous engineering techniques.

Topics and features:
* Structured to support active curricula and project-oriented courses, with tips in the Introduction for using the book in a variety of customized courses
* Presents exercises at the end of each chapter with a heavy emphasis on testing algorithms and containing numerous suggestions for small mid-term projects
* Provides additional material and more detailed mathematical topics in the Appendices, which cover linear algebra, numerical techniques, and Bayesian estimation theory
* Suggests additional reading at the end of each chapter, including the latest research in each sub-field, in addition to a full Bibliography at the end of the book
* Supplies supplementary course material for students at the associated website, http://szeliski.org/Book/

Suitable for an upper-level undergraduate or graduate-level course in computer science or engineering, this textbook focuses on basic techniques that work under real-world conditions and encourages students to push their creative boundaries. Its design and exposition also make it eminently suitable as a unique reference to the fundamental techniques and current research literature in computer vision.
Time Series Analysis
James Douglas Hamilton - 1994
This book synthesizes recent advances in time series analysis and makes them accessible to first-year graduate students. James Hamilton provides the first adequate textbook treatments of important innovations such as vector autoregressions, generalized method of moments, the economic and statistical consequences of unit roots, time-varying variances, and nonlinear time series models. In addition, he presents basic tools for analyzing dynamic systems (including linear representations, autocovariance generating functions, spectral analysis, and the Kalman filter) in a way that integrates economic theory with the practical difficulties of analyzing and interpreting real-world data. Time Series Analysis fills an important need for a textbook that integrates economic theory, econometrics, and new results. The book is intended to provide students and researchers with a self-contained survey of time series analysis. It starts from first principles and should be readily accessible to any beginning graduate student, while it is also intended to serve as a reference book for researchers. -- "Journal of Economics"
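Of the tools listed, the Kalman filter is the most compact to sketch. A minimal scalar version in Python (the random-walk state model and noise variances are illustrative assumptions, not Hamilton's notation):

```python
import numpy as np

# Scalar Kalman filter for a random-walk state observed with noise:
#   x_t = x_{t-1} + w_t,  w_t ~ N(0, q)      (state equation)
#   y_t = x_t + v_t,      v_t ~ N(0, r)      (observation equation)
def kalman_filter(y, q=0.01, r=1.0, x0=0.0, p0=1.0):
    x, p = x0, p0
    estimates = []
    for obs in y:
        p = p + q                 # predict: state variance grows by q
        k = p / (p + r)           # Kalman gain
        x = x + k * (obs - x)     # correct the estimate toward the observation
        p = (1 - k) * p           # updated estimate variance
        estimates.append(x)
    return np.array(estimates)

rng = np.random.default_rng(0)
truth = np.cumsum(rng.normal(0, 0.1, 50))   # hidden random walk
y = truth + rng.normal(0, 1.0, 50)          # noisy observations of it
print("final estimate vs truth:", kalman_filter(y)[-1], truth[-1])
```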
Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems
Peter Dayan - 2001
This text introduces the basic mathematical and computational methods of theoretical neuroscience and presents applications in a variety of areas including vision, sensory-motor integration, development, learning, and memory. The book is divided into three parts. Part I discusses the relationship between sensory stimuli and neural responses, focusing on the representation of information by the spiking activity of neurons. Part II discusses the modeling of neurons and neural circuits on the basis of cellular and synaptic biophysics. Part III analyzes the role of plasticity in development and learning. An appendix covers the mathematical methods used, and exercises are available on the book's Web site.
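A canonical starting point for the neuron models of Part II is the leaky integrate-and-fire neuron. A minimal Euler-integration sketch in Python (the membrane parameters are typical textbook values, used here as illustrative assumptions):

```python
# Leaky integrate-and-fire neuron, integrated with Euler's method:
#   tau_m * dV/dt = -(V - V_rest) + R * I
# The neuron spikes and resets whenever V crosses the threshold.
tau_m = 0.020       # membrane time constant (s)
V_rest = -0.065     # resting potential (V)
V_th = -0.050       # spike threshold (V)
V_reset = -0.065    # post-spike reset potential (V)
R = 1e7             # membrane resistance (ohms)
I = 2e-9            # injected current (A)
dt, T = 1e-4, 0.5   # time step and total duration (s)

V, spikes = V_rest, 0
for _ in range(int(T / dt)):
    V += dt * (-(V - V_rest) + R * I) / tau_m
    if V >= V_th:                 # threshold crossed: spike and reset
        spikes += 1
        V = V_reset

print("firing rate ~", spikes / T, "Hz")
```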
Numerical Linear Algebra
Lloyd N. Trefethen - 1997
The clarity and eloquence of the presentation make it popular with teachers and students alike. The text aims to expand the reader's view of the field and to present standard material in a novel way. All of the most important topics in the field are covered with a fresh perspective, including iterative methods for systems of equations and eigenvalue problems and the underlying principles of conditioning and stability. The presentation takes the form of 40 lectures, each of which focuses on one or two central ideas. The unity between topics is emphasized throughout, with no risk of getting lost in details and technicalities. The book breaks with tradition by beginning with the QR factorization, an important and fresh idea for students, and the thread that connects most of the algorithms of numerical linear algebra.
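Since the QR factorization is the thread the book starts with, here is a minimal reduced-QR sketch via modified Gram-Schmidt, the variant the book favors for numerical stability, in Python (the random test matrix is an illustrative assumption):

```python
import numpy as np

def mgs_qr(A):
    """Reduced QR factorization A = QR via modified Gram-Schmidt."""
    A = A.astype(float).copy()        # work on a copy; A is reduced in place
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        R[j, j] = np.linalg.norm(A[:, j])
        Q[:, j] = A[:, j] / R[j, j]
        # Immediately orthogonalize the remaining columns against q_j.
        for k in range(j + 1, n):
            R[j, k] = Q[:, j] @ A[:, k]
            A[:, k] -= R[j, k] * Q[:, j]
    return Q, R

A = np.random.default_rng(0).normal(size=(5, 3))   # illustrative test matrix
Q, R = mgs_qr(A)
print(np.allclose(Q @ R, A), np.allclose(Q.T @ Q, np.eye(3)))
```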