Book picks similar to Simulation by Sheldon M. Ross

Tags: statistics, data-science, textbooks, stem

Information Theory, Inference and Learning Algorithms


David J.C. MacKay - 2002
    These topics lie at the heart of many exciting areas of contemporary science and engineering - communication, signal processing, data mining, machine learning, pattern recognition, computational neuroscience, bioinformatics, and cryptography. This textbook introduces theory in tandem with applications. Information theory is taught alongside practical communication systems, such as arithmetic coding for data compression and sparse-graph codes for error-correction. A toolbox of inference techniques, including message-passing algorithms, Monte Carlo methods, and variational approximations, is developed alongside applications of these tools to clustering, convolutional codes, independent component analysis, and neural networks. The final part of the book describes the state of the art in error-correcting codes, including low-density parity-check codes, turbo codes, and digital fountain codes -- the twenty-first century standards for satellite communications, disk drives, and data broadcast. Richly illustrated, filled with worked examples and over 400 exercises, some with detailed solutions, David MacKay's groundbreaking book is ideal for self-learning and for undergraduate or graduate courses. Interludes on crosswords, evolution, and sex provide entertainment along the way. In sum, this is a textbook on information, communication, and coding for a new generation of students, and an unparalleled entry point into these subjects for professionals in areas as diverse as computational biology, financial engineering, and machine learning.
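    As a quick illustrative aside (not an excerpt from MacKay's book), the Shannon entropy that sets the compression limits discussed above can be computed in a few lines of Python; the four-symbol source below is invented for the example.

```python
import math

def shannon_entropy(probs):
    """Entropy H(X) = -sum p*log2(p), in bits per symbol, of a discrete source."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical four-symbol source, chosen only to illustrate the formula.
source = [0.5, 0.25, 0.125, 0.125]
print(shannon_entropy(source))  # 1.75 bits per symbol: the best lossless rate
```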

Computer Age Statistical Inference: Algorithms, Evidence, and Data Science


Bradley Efron - 2016
    'Big data', 'data science', and 'machine learning' have become familiar terms in the news, as statistical methods are brought to bear upon the enormous data sets of modern science and commerce. How did we get here? And where are we going? This book takes us on an exhilarating journey through the revolution in data analysis following the introduction of electronic computation in the 1950s. Beginning with classical inferential theories - Bayesian, frequentist, Fisherian - individual chapters take up a series of influential topics: survival analysis, logistic regression, empirical Bayes, the jackknife and bootstrap, random forests, neural networks, Markov chain Monte Carlo, inference after model selection, and dozens more. The distinctly modern approach integrates methodology and algorithms with statistical inference. The book ends with speculation on the future direction of statistics and data science.
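    For a concrete taste of one of the topics listed above (a sketch, not anything taken from Efron and Hastie's text), a percentile bootstrap confidence interval can be written in a few lines of Python; the sample values and resample count below are arbitrary placeholders.

```python
import random
import statistics

def bootstrap_ci(data, stat=statistics.mean, n_resamples=2000, alpha=0.05):
    """Percentile bootstrap interval for a statistic (illustrative only)."""
    estimates = []
    for _ in range(n_resamples):
        resample = [random.choice(data) for _ in data]  # sample with replacement
        estimates.append(stat(resample))
    estimates.sort()
    lo = estimates[int(alpha / 2 * n_resamples)]
    hi = estimates[int((1 - alpha / 2) * n_resamples) - 1]
    return lo, hi

# Made-up sample, purely for illustration.
sample = [4.1, 5.3, 2.8, 6.0, 4.7, 5.5, 3.9, 4.4]
print(bootstrap_ci(sample))
```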

Machine Learning


Tom M. Mitchell - 1997
    Mitchell covers the field of machine learning, the study of algorithms that allow computer programs to automatically improve through experience and that automatically infer general laws from specific data.

Calculus Made Easy


Silvanus Phillips Thompson - 1910
    With a new introduction, three new chapters, modernized language and methods throughout, and an appendix of challenging and enjoyable practice problems, Calculus Made Easy has been thoroughly updated for the modern reader.

Elementary Number Theory


David M. Burton - 1976
    This text reveals the attraction that has drawn leading mathematicians and amateurs alike to number theory over the course of history.

Linear Algebra With Applications


Steven J. Leon - 1980
    Each chapter contains integrated worked examples and chapter tests. This edition has the ancillary ATLAST computer exercise guide and new MATLAB and Maple guides.

Data Science at the Command Line: Facing the Future with Time-Tested Tools


Jeroen Janssens - 2014
    You'll learn how to combine small, yet powerful, command-line tools to quickly obtain, scrub, explore, and model your data. To get you started -- whether you're on Windows, OS X, or Linux -- author Jeroen Janssens introduces the Data Science Toolbox, an easy-to-install virtual environment packed with over 80 command-line tools. Discover why the command line is an agile, scalable, and extensible technology. Even if you're already comfortable processing data with, say, Python or R, you'll greatly improve your data science workflow by also leveraging the power of the command line.

    - Obtain data from websites, APIs, databases, and spreadsheets
    - Perform scrub operations on plain text, CSV, HTML/XML, and JSON
    - Explore data, compute descriptive statistics, and create visualizations
    - Manage your data science workflow using Drake
    - Create reusable tools from one-liners and existing Python or R code
    - Parallelize and distribute data-intensive pipelines using GNU Parallel
    - Model data with dimensionality reduction, clustering, regression, and classification algorithms

R Cookbook: Proven Recipes for Data Analysis, Statistics, and Graphics


Paul Teetor - 2011
    The R language provides everything you need to do statistical work, but its structure can be difficult to master. This collection of concise, task-oriented recipes makes you productive with R immediately, with solutions ranging from basic tasks to input and output, general statistics, graphics, and linear regression. Each recipe addresses a specific problem, with a discussion that explains the solution and offers insight into how it works. If you're a beginner, R Cookbook will help get you started. If you're an experienced data programmer, it will jog your memory and expand your horizons. You'll get the job done faster and learn more about R in the process.

    - Create vectors, handle variables, and perform other basic functions
    - Input and output data
    - Tackle data structures such as matrices, lists, factors, and data frames
    - Work with probability, probability distributions, and random variables
    - Calculate statistics and confidence intervals, and perform statistical tests
    - Create a variety of graphic displays
    - Build statistical models with linear regressions and analysis of variance (ANOVA)
    - Explore advanced statistical techniques, such as finding clusters in your data

    "Wonderfully readable, R Cookbook serves not only as a solutions manual of sorts, but as a truly enjoyable way to explore the R language -- one practical example at a time." -- Jeffrey Ryan, software consultant and R package author

Access 2007: The Missing Manual


Matthew MacDonald - 2006
    Access runs on PCs rather than servers and is ideal for small- to mid-sized businesses and households. But Access is still intimidating to learn. It doesn't help that each new version crammed in yet another set of features; so many, in fact, that even the pros don't know where to find them all. Access 2007 breaks this pattern with some of the most dramatic changes users have seen since Office 95. Most obvious is the thoroughly redesigned user interface, with its tabbed toolbar (or "Ribbon") that makes features easy to locate and use. The features list also includes several long-awaited changes. One thing that hasn't improved is Microsoft's documentation: to learn the ins and outs of all the features in Access 2007, Microsoft merely offers online help.

    Access 2007: The Missing Manual was written from the ground up for this redesigned application. You will learn how to design complete databases, maintain them, search for valuable nuggets of information, and build attractive forms for quick-and-easy data entry. You'll even delve into the black art of Access programming (including macros and Visual Basic), and pick up valuable tricks and techniques to automate common tasks -- even if you've never touched a line of code before. You will also learn all about the new prebuilt databases you can customize to fit your needs, and how the new complex data feature will simplify your life. With plenty of downloadable examples, this objective and witty book will turn an Access neophyte into a true master.

Feynman Lectures On Computation


Richard P. Feynman - 1996
    When Feynman gave his famous course on computation at the California Institute of Technology, he asked Tony Hey to adapt his lecture notes into a book. Although led by Feynman, the course also featured, as occasional guest speakers, some of the most brilliant men in science at that time, including Marvin Minsky, Charles Bennett, and John Hopfield. Although the lectures are now thirteen years old, most of the material is timeless and presents a "Feynmanesque" overview of many standard and some not-so-standard topics in computer science such as reversible logic gates and quantum computers.

Graph Databases


Ian Robinson - 2013
    With this practical book, you’ll learn how to design and implement a graph database that brings the power of graphs to bear on a broad range of problem domains. Whether you want to speed up your response to user queries or build a database that can adapt as your business evolves, this book shows you how to apply the schema-free graph model to real-world problems. Learn how different organizations are using graph databases to outperform their competitors. With this book’s data modeling, query, and code examples, you’ll quickly be able to implement your own solution.

    - Model data with the Cypher query language and property graph model
    - Learn best practices and common pitfalls when modeling with graphs
    - Plan and implement a graph database solution in test-driven fashion
    - Explore real-world examples to learn how and why organizations use a graph database
    - Understand common patterns and components of graph database architecture
    - Use analytical techniques and algorithms to mine graph database information

Bayes' Rule: A Tutorial Introduction to Bayesian Analysis


James V. Stone - 2013
    Discovered by an 18th century mathematician and preacher, Bayes' rule is a cornerstone of modern probability theory. In this richly illustrated book, intuitive visual representations of real-world examples are used to show how Bayes' rule is actually a form of commonsense reasoning. The tutorial style of writing, combined with a comprehensive glossary, makes this an ideal primer for novices who wish to gain an intuitive understanding of Bayesian analysis. As an aid to understanding, online computer code (in MATLAB, Python, and R) reproduces key numerical results and diagrams. Stone's book is renowned for its visually engaging style of presentation, which stems from teaching Bayes' rule to psychology students for over 10 years as a university lecturer.
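    As a hedged illustration of the rule the book is about (not code from Stone's accompanying MATLAB/Python/R materials), the two-hypothesis form of Bayes' rule fits in a few lines of Python; the diagnostic-test numbers below are invented.

```python
def posterior(prior, likelihood, likelihood_given_not):
    """Bayes' rule for a binary hypothesis:
    P(H|D) = P(D|H)P(H) / [P(D|H)P(H) + P(D|~H)P(~H)]."""
    evidence = likelihood * prior + likelihood_given_not * (1 - prior)
    return likelihood * prior / evidence

# Hypothetical numbers: a rare condition (1%) and a fairly accurate test.
print(posterior(prior=0.01, likelihood=0.95, likelihood_given_not=0.05))
# ~0.16: even after a positive result, the posterior probability stays modest.
```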

Bayesian Data Analysis


Andrew Gelman - 1995
    Its world-class authors provide guidance on all aspects of Bayesian data analysis and include examples of real statistical analyses, based on their own research, that demonstrate how to solve complicated problems. Changes in the new edition include:

    - Stronger focus on MCMC
    - Revision of the computational advice in Part III
    - New chapters on nonlinear models and decision analysis
    - Several additional applied examples from the authors' recent research
    - Additional chapters on current models for Bayesian data analysis such as nonlinear models, generalized linear mixed models, and more
    - Reorganization of chapters 6 and 7 on model checking and data collection

    Bayesian computation is currently at a stage where there are many reasonable ways to compute any given posterior distribution. However, the best approach is not always clear ahead of time. Reflecting this, the new edition offers a more pluralistic presentation, giving advice on performing computations from many perspectives while making clear the importance of being aware that there are different ways to implement any given iterative simulation computation. The new approach, additional examples, and updated information make Bayesian Data Analysis an excellent introductory text and a reference that working scientists will use throughout their professional life.
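    The "iterative simulation" referred to above is typically Markov chain Monte Carlo. A minimal random-walk Metropolis sketch in Python (an illustration under assumed simplifications, not the authors' code) looks like this; the toy target, step size, and chain length are arbitrary choices.

```python
import math
import random

def metropolis(log_post, x0, step=0.5, n=5000):
    """Random-walk Metropolis: simulate from a density known up to a constant."""
    x, samples = x0, []
    for _ in range(n):
        proposal = x + random.gauss(0.0, step)
        # Accept with probability min(1, post(proposal) / post(x)).
        accept_prob = math.exp(min(0.0, log_post(proposal) - log_post(x)))
        if random.random() < accept_prob:
            x = proposal
        samples.append(x)
    return samples

# Toy target: a standard normal log-density (up to an additive constant).
draws = metropolis(lambda x: -0.5 * x * x, x0=0.0)
print(sum(draws) / len(draws))  # sample mean should be near 0
```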

Introduction to Quantum Mechanics


David J. Griffiths - 1994
    The book's two-part coverage organizes topics under basic theory, and assembles an arsenal of approximation schemes with illustrative applications. For physicists and engineers.

Foundations of Statistical Natural Language Processing


Christopher D. Manning - 1999
    This foundational text is the first comprehensive introduction to statistical natural language processing (NLP) to appear. The book contains all the theory and algorithms needed for building NLP tools. It provides broad but rigorous coverage of mathematical and linguistic foundations, as well as detailed discussion of statistical methods, allowing students and researchers to construct their own implementations. The book covers collocation finding, word sense disambiguation, probabilistic parsing, information retrieval, and other applications.