Book picks similar to
Bayesian Data Analysis by Andrew Gelman


Tags: statistics, math, data-science, machine-learning

What is a P-Value Anyway? 34 Stories to Help You Actually Understand Statistics


Andrew J. Vickers - 2009
    Drawing on his experience as a medical researcher, Vickers blends insightful explanations and humor, with minimal math, to help readers understand and interpret the statistics they read every day. Topics include describing data; data distributions; variation of study results and confidence intervals; hypothesis testing; regression and decision making; and common statistical errors and what they teach us. For all readers interested in statistics.
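    As a companion to those topics, here is a minimal Python sketch, not taken from the book (which deliberately avoids code and heavy math): it runs a two-sample t-test on made-up measurements and reports the p-value alongside a rough normal-approximation 95% confidence interval for the difference in means. It assumes NumPy and SciPy are available.

        # Minimal sketch (not from the book): a two-sample t-test on made-up data,
        # with a p-value and a rough normal-approximation 95% confidence interval.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        treatment = rng.normal(loc=5.4, scale=1.2, size=40)  # hypothetical measurements
        control = rng.normal(loc=5.0, scale=1.2, size=40)

        t_stat, p_value = stats.ttest_ind(treatment, control)
        diff = treatment.mean() - control.mean()
        se = np.sqrt(treatment.var(ddof=1) / len(treatment) + control.var(ddof=1) / len(control))
        low, high = diff - 1.96 * se, diff + 1.96 * se

        print(f"difference in means = {diff:.2f}")
        print(f"p-value = {p_value:.3f}")
        print(f"approximate 95% CI = ({low:.2f}, {high:.2f})")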

Effective Python: 90 Specific Ways to Write Better Python (Effective Software Development Series)


Brett Slatkin - 2019
    Python’s unique strengths, charms, and expressiveness can be hard to grasp, and there are hidden pitfalls that can easily trip you up. This second edition of Effective Python will help you master a truly “Pythonic” approach to programming, harnessing Python’s full power to write exceptionally robust and well-performing code. Using the concise, scenario-driven style pioneered in Scott Meyers’ best-selling Effective C++, Brett Slatkin brings together 90 Python best practices, tips, and shortcuts, and explains them with realistic code examples so that you can embrace Python with confidence. Drawing on years of experience building Python infrastructure at Google, Slatkin uncovers little-known quirks and idioms that powerfully impact code behavior and performance. You’ll understand the best way to accomplish key tasks so you can write code that’s easier to understand, maintain, and improve. In addition to even more advice, this new edition substantially revises all items from the first edition to reflect how best practices have evolved. Key features include: 30 new actionable guidelines for all major areas of Python; detailed explanations and examples of statements, expressions, and built-in types; best practices for writing functions that clarify intention, promote reuse, and avoid bugs; better techniques and idioms for using comprehensions and generator functions; coverage of how to accurately express behaviors with classes and interfaces; guidance on how to avoid pitfalls with metaclasses and dynamic attributes; more efficient and clear approaches to concurrency and parallelism; solutions for optimizing and hardening to maximize performance and quality; techniques and built-in modules that aid in debugging and testing; and tools and best practices for collaborative development. Effective Python will prepare growing programmers to make a big impact using Python.
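    In the spirit of the book's guidance on comprehensions and generator functions, here is a small sketch that is not an excerpt from any of its items: a generator function plus a generator expression process values lazily instead of materializing an intermediate list. The file names are made up.

        # Sketch in the spirit of the book's advice on comprehensions and generators
        # (not an excerpt): yield values lazily instead of building an intermediate list.
        def visible_lengths(paths):
            # Generator function: produces one length at a time.
            for path in paths:
                if not path.startswith('.'):
                    yield len(path)

        paths = ['notes.txt', '.hidden', 'data.csv', 'README']
        total = sum(length for length in visible_lengths(paths))  # generator expression
        print(total)  # 23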

Neural Networks: A Comprehensive Foundation


Simon Haykin - 1994
    Introducing students to the many facets of neural networks, this text provides many case studies to illustrate their real-life, practical applications.
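    To make the subject concrete, here is a minimal NumPy sketch, unrelated to the book's own case studies, of a single forward pass through a tiny two-layer feedforward network with randomly initialized weights.

        # Minimal sketch (not from the book): one forward pass through a tiny
        # two-layer feedforward network with a sigmoid hidden layer.
        import numpy as np

        def sigmoid(z):
            return 1.0 / (1.0 + np.exp(-z))

        rng = np.random.default_rng(1)
        x = rng.normal(size=3)        # one input vector with 3 features
        W1 = rng.normal(size=(4, 3))  # hidden layer: 4 units
        b1 = np.zeros(4)
        W2 = rng.normal(size=(1, 4))  # output layer: 1 unit
        b2 = np.zeros(1)

        hidden = sigmoid(W1 @ x + b1)
        output = W2 @ hidden + b2
        print(output)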

Visualize This: The FlowingData Guide to Design, Visualization, and Statistics


Nathan Yau - 2011
    Wouldn't it be wonderful if we could actually visualize data in such a way that we could maximize its potential and tell a story in a clear, concise manner? Thanks to the creative genius of Nathan Yau, we can. With this full-color book, data visualization guru and author Nathan Yau uses step-by-step tutorials to show you how to visualize and tell stories with data. He explains how to gather, parse, and format data and then design high-quality graphics that help you explore and present patterns, outliers, and relationships. The book presents a unique approach to visualizing and telling stories with data, from a data visualization expert and the creator of flowingdata.com; offers step-by-step tutorials and practical design tips for creating statistical graphics, geographical maps, and information design to find meaning in the numbers; details tools that can be used to visualize data, from native graphics for the Web (such as ActionScript, Flash libraries, PHP, and JavaScript) to tools for designing graphics for print (such as R and Illustrator); and contains numerous examples and descriptions of patterns and outliers, explaining how to show them. Visualize This demonstrates how to explain data visually so that you can present your information in a way that is easy to understand and appealing.
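    The book's examples lean on R and web tools; as a rough illustration of the same gather-parse-plot workflow in Python, here is a small matplotlib sketch on made-up data with two planted outliers (matplotlib is an assumption, not a tool the book uses).

        # Rough illustration of the gather-parse-plot workflow (the book itself
        # uses R and web tools); the data here are made up, with two planted outliers.
        import numpy as np
        import matplotlib.pyplot as plt

        rng = np.random.default_rng(2)
        values = np.concatenate([rng.normal(50, 5, size=200), [95.0, 3.0]])

        fig, ax = plt.subplots()
        ax.hist(values, bins=30, color="steelblue", edgecolor="white")
        ax.set_xlabel("measurement")
        ax.set_ylabel("count")
        ax.set_title("Distribution with two outliers")
        fig.savefig("distribution.png")  # write the figure to a file instead of opening a window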

Introduction to the Theory of Computation


Michael Sipser - 1996
    Sipser's candid, crystal-clear style allows students at every level to understand and enjoy this field. His innovative "proof idea" sections explain profound concepts in plain English. The new edition incorporates many improvements students and professors have suggested over the years, and offers updated, classroom-tested problem sets at the end of each chapter.
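    To give a flavor of the subject, here is a short Python sketch, not from the text (which works with formal definitions and proofs rather than code), that simulates a deterministic finite automaton accepting binary strings containing an even number of 1s.

        # Sketch of a deterministic finite automaton (not from the text): accepts
        # binary strings containing an even number of 1s.
        TRANSITIONS = {
            ("even", "0"): "even", ("even", "1"): "odd",
            ("odd", "0"): "odd",   ("odd", "1"): "even",
        }

        def accepts(s):
            state = "even"              # start state
            for symbol in s:
                state = TRANSITIONS[(state, symbol)]
            return state == "even"      # the start state is also the accepting state

        print(accepts("1011"))  # False (three 1s)
        print(accepts("1001"))  # True  (two 1s)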

A New Kind of Science


Stephen Wolfram - 2002
    Wolfram lets the world see his work in A New Kind of Science, a gorgeous, 1,280-page tome more than a decade in the making. With patience, insight, and self-confidence to spare, Wolfram outlines a fundamental new way of modeling complex systems. On the frontier of complexity science since he was a boy, Wolfram is a champion of cellular automata--256 "programs" governed by simple nonmathematical rules. He points out that even the most complex equations fail to accurately model biological systems, but the simplest cellular automata can produce results straight out of nature--tree branches, stream eddies, and leopard spots, for instance. The graphics in A New Kind of Science show striking resemblance to the patterns we see in nature every day. Wolfram wrote the book in a distinct style meant to make it easy to read, even for nontechies; a basic familiarity with logic is helpful but not essential. Readers will find themselves swept away by the elegant simplicity of Wolfram's ideas and the accidental artistry of the cellular automaton models. Whether or not Wolfram's revolution ultimately gives us the keys to the universe, his new science is absolutely awe-inspiring. --Therese Littleton
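    The 256 "programs" are the elementary cellular automata: each rule maps a cell and its two neighbors to the cell's next value, and the rule number encodes that mapping in eight bits. Below is a minimal Python sketch, written for illustration rather than taken from the book, that runs Rule 30 from a single live cell.

        # Minimal sketch (not from the book) of an elementary cellular automaton:
        # the rule number's eight bits map each (left, center, right) pattern to
        # the cell's next value.
        RULE = 30  # one of the 256 elementary rules

        def step(cells):
            n = len(cells)
            nxt = []
            for i in range(n):
                left, center, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
                index = (left << 2) | (center << 1) | right
                nxt.append((RULE >> index) & 1)
            return nxt

        cells = [0] * 31
        cells[15] = 1  # a single live cell in the middle
        for _ in range(15):
            print("".join("#" if c else "." for c in cells))
            cells = step(cells)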

Spark: The Definitive Guide: Big Data Processing Made Simple


Bill Chambers - 2018
    With an emphasis on improvements and new features in Spark 2.0, authors Bill Chambers and Matei Zaharia break down Spark topics into distinct sections, each with unique goals. You’ll explore the basic operations and common functions of Spark’s structured APIs, as well as Structured Streaming, a new high-level API for building end-to-end streaming applications. Developers and system administrators will learn the fundamentals of monitoring, tuning, and debugging Spark, and explore machine learning techniques and scenarios for employing MLlib, Spark’s scalable machine-learning library. You’ll get a gentle overview of big data and Spark; learn about DataFrames, SQL, and Datasets (Spark’s core APIs) through worked examples; dive into Spark’s low-level APIs, RDDs, and the execution of SQL and DataFrames; understand how Spark runs on a cluster; debug, monitor, and tune Spark clusters and applications; learn the power of Structured Streaming, Spark’s stream-processing engine; and learn how to apply MLlib to a variety of problems, including classification and recommendation.
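    As a taste of the structured APIs the book walks through, here is a minimal PySpark sketch of my own, not an excerpt: it builds a DataFrame from a few in-memory rows and runs a simple aggregation. It assumes pyspark is installed and able to run in local mode; the column names and values are made up.

        # Minimal PySpark sketch (not from the book): a DataFrame built from
        # in-memory rows and a simple aggregation with the structured API.
        # Assumes pyspark is installed and can run in local mode.
        from pyspark.sql import SparkSession
        from pyspark.sql import functions as F

        spark = SparkSession.builder.appName("example").master("local[*]").getOrCreate()

        df = spark.createDataFrame(
            [("alice", "books", 12.0), ("bob", "books", 7.5), ("alice", "music", 3.0)],
            ["user", "category", "amount"],
        )

        totals = df.groupBy("category").agg(F.sum("amount").alias("total"))
        totals.show()

        spark.stop()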

Nine Algorithms That Changed the Future: The Ingenious Ideas That Drive Today's Computers


John MacCormick - 2012
    A simple web search picks out a handful of relevant needles from the world's biggest haystack: the billions of pages on the World Wide Web. Uploading a photo to Facebook transmits millions of pieces of information over numerous error-prone network links, yet somehow a perfect copy of the photo arrives intact. Without even knowing it, we use public-key cryptography to transmit secret information like credit card numbers; and we use digital signatures to verify the identity of the websites we visit. How do our computers perform these tasks with such ease? This is the first book to answer that question in language anyone can understand, revealing the extraordinary ideas that power our PCs, laptops, and smartphones. Using vivid examples, John MacCormick explains the fundamental "tricks" behind nine types of computer algorithms, including artificial intelligence (where we learn about the "nearest neighbor trick" and "twenty questions trick"), Google's famous PageRank algorithm (which uses the "random surfer trick"), data compression, error correction, and much more. These revolutionary algorithms have changed our world: this book unlocks their secrets, and lays bare the incredible ideas that our computers use every day.
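    The "random surfer trick" behind PageRank is easy to sketch: a surfer follows links at random, occasionally jumping to a random page, and a page's score is the long-run share of time spent on it. Here is a hedged Python illustration using power iteration on a tiny made-up link graph; it is not the book's code, and it ignores complications such as dangling pages.

        # Sketch of PageRank's "random surfer" idea (not the book's code): power
        # iteration on a tiny made-up link graph with a damping factor of 0.85.
        links = {
            "A": ["B", "C"],
            "B": ["C"],
            "C": ["A"],
            "D": ["C"],
        }
        pages = list(links)
        damping = 0.85
        rank = {p: 1.0 / len(pages) for p in pages}

        for _ in range(50):
            new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
            for page, outlinks in links.items():
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            rank = new_rank

        for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
            print(f"{page}: {score:.3f}")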

Elementary Statistics: Picturing the World


Ron Larson - 2002
    Taking a visual and graphical approach, this text offers numerous examples on the premise that students learn best by doing, and it emphasizes interpretation of results and critical thinking over calculations.

Introduction to Mathematical Statistics


Robert V. Hogg - 1962
    Designed for two-semester, beginning graduate courses in Mathematical Statistics, and for senior undergraduate Mathematics, Statistics, and Actuarial Science majors, this text retains the features of earlier editions and continues to provide students with the background material they need.

Data Mining: Concepts and Techniques (The Morgan Kaufmann Series in Data Management Systems)


Jiawei Han - 2000
    Not only are all of our business, scientific, and government transactions now computerized, but the widespread use of digital cameras, publication tools, and bar codes also generates data. On the collection side, scanned text and image platforms, satellite remote sensing systems, and the World Wide Web have flooded us with a tremendous amount of data. This explosive growth has generated an even more urgent need for new techniques and automated tools that can help us transform this data into useful information and knowledge. Like the first edition, voted the most popular data mining book by KDnuggets readers, this book explores concepts and techniques for the discovery of patterns hidden in large data sets, focusing on issues relating to their feasibility, usefulness, effectiveness, and scalability. However, since the publication of the first edition, great progress has been made in the development of new data mining methods, systems, and applications. This new edition substantially enhances the first edition, and new chapters have been added to address recent developments in mining complex types of data, including stream data, sequence data, graph-structured data, social network data, and multi-relational data. The book offers a comprehensive, practical look at the concepts and techniques you need to know to get the most out of real business data; updates that incorporate input from readers, changes in the field, and more material on statistics and machine learning; dozens of algorithms and implementation examples, all in easily understood pseudo-code and suitable for use in real-world, large-scale data mining projects; and complete classroom support for instructors at the www.mkp.com/datamining2e companion site.
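    To make "patterns hidden in large data sets" concrete, here is a tiny frequent-itemset sketch in plain Python over made-up transactions; it brute-forces item pairs against a support threshold rather than implementing the book's algorithms such as Apriori or FP-growth.

        # Tiny frequent-itemset sketch over made-up transactions (not the book's
        # pseudo-code): count item pairs and keep those meeting a support threshold.
        from collections import Counter
        from itertools import combinations

        transactions = [
            {"milk", "bread", "butter"},
            {"bread", "butter"},
            {"milk", "bread"},
            {"milk", "butter"},
        ]
        min_support = 2

        pair_counts = Counter()
        for basket in transactions:
            for pair in combinations(sorted(basket), 2):
                pair_counts[pair] += 1

        frequent = {pair: n for pair, n in pair_counts.items() if n >= min_support}
        print(frequent)  # {('bread', 'butter'): 2, ('bread', 'milk'): 2, ('butter', 'milk'): 2}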

Algorithms


Robert Sedgewick - 1983
    This book surveys the most important computer algorithms currently in use and provides a full treatment of data structures and algorithms for sorting, searching, graph processing, and string processing, including fifty algorithms every programmer should know. In this edition, new Java implementations are written in an accessible modular programming style, where all of the code is exposed to the reader and ready to use. The algorithms in this book represent a body of knowledge developed over the last 50 years that has become indispensable, not just for professional programmers and computer science students but for any student with interests in science, mathematics, and engineering, not to mention students who use computation in the liberal arts. The companion web site, algs4.cs.princeton.edu, contains an online synopsis, full Java implementations, test data, exercises and answers, dynamic visualizations, lecture slides, programming assignments with checklists, and links to related material. The MOOC related to this book is accessible via the "Online Course" link at algs4.cs.princeton.edu. The course offers more than 100 video lecture segments that are integrated with the text, extensive online assessments, and the large-scale discussion forums that have proven so valuable. Offered each fall and spring, this course regularly attracts tens of thousands of registrants. Robert Sedgewick and Kevin Wayne are developing a modern approach to disseminating knowledge that fully embraces technology, enabling people all around the world to discover new ways of learning and teaching. By integrating their textbook, online content, and MOOC, all at the state of the art, they have built a unique resource that greatly expands the breadth and depth of the educational experience.
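    The book's implementations are in Java; as a quick taste of the material, here is a Python sketch (not from the text) of one of the classic searching algorithms it covers, iterative binary search on a sorted list.

        # The book's code is Java; this is a Python sketch of one classic it covers:
        # iterative binary search on a sorted list.
        def binary_search(sorted_items, target):
            lo, hi = 0, len(sorted_items) - 1
            while lo <= hi:
                mid = (lo + hi) // 2
                if sorted_items[mid] == target:
                    return mid          # index of the match
                if sorted_items[mid] < target:
                    lo = mid + 1
                else:
                    hi = mid - 1
            return -1                   # not found

        print(binary_search([2, 3, 5, 8, 13, 21], 8))  # 3
        print(binary_search([2, 3, 5, 8, 13, 21], 4))  # -1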

Probability and Statistics for Engineers and Scientists


Ronald E. Walpole - 1978
    This edition offers extensively updated coverage, new problem sets, and chapter-ending material to enhance the book’s relevance to today’s engineers and scientists. It includes new problem sets demonstrating updated applications to engineering as well as to the biological, physical, and computer sciences; emphasizes key ideas as well as the risks and hazards associated with practical application of the material; and adds new material on topics including the difference between discrete and continuous measurements, binary data, quartiles, the importance of experimental design, “dummy” variables, rules for expectations and variances of linear functions, the Poisson distribution, the Weibull and lognormal distributions, the central limit theorem, and data plotting. It also introduces Bayesian statistics, including its applications to many fields. For those interested in learning more about probability and statistics.
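    Two of the topics listed, the Poisson distribution and the central limit theorem, are easy to explore numerically. The following short sketch is not from the book and assumes NumPy and SciPy; it checks a Poisson probability against simulation and shows sample means concentrating around the true mean as the sample size grows.

        # Short sketch (not from the book): a Poisson probability checked by
        # simulation, and sample means illustrating the central limit theorem.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)

        # P(X = 2) for a Poisson(1.5) count, exact versus simulated.
        lam = 1.5
        exact = stats.poisson.pmf(2, lam)
        simulated = np.mean(rng.poisson(lam, size=100_000) == 2)
        print(f"P(X=2): exact {exact:.4f}, simulated {simulated:.4f}")

        # Central limit theorem: means of n exponential draws (true mean 2.0)
        # concentrate around 2.0 and look increasingly normal as n grows.
        for n in (5, 50, 500):
            means = rng.exponential(scale=2.0, size=(10_000, n)).mean(axis=1)
            print(f"n={n}: mean of sample means {means.mean():.2f}, spread {means.std():.3f}")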

Prediction Machines: The Simple Economics of Artificial Intelligence


Ajay Agrawal - 2018
    Facing the sea change that AI will bring can be paralyzing. How should companies set strategies, governments design policies, and people plan their lives for a world so different from what we know? In the face of such uncertainty, many analysts either cower in fear or predict an impossibly sunny future. But in Prediction Machines, three eminent economists recast the rise of AI as a drop in the cost of prediction. With this single, masterful stroke, they lift the curtain on the AI-is-magic hype and show how basic tools from economics provide clarity about the AI revolution and a basis for action by CEOs, managers, policy makers, investors, and entrepreneurs. When AI is framed as cheap prediction, its extraordinary potential becomes clear: prediction is at the heart of making decisions under uncertainty, and our businesses and personal lives are riddled with such decisions. Prediction tools increase productivity, whether operating machines, handling documents, or communicating with customers. Uncertainty constrains strategy, so better prediction creates opportunities for new business structures and strategies to compete. Penetrating, fun, and always insightful and practical, Prediction Machines follows its inescapable logic to explain how to navigate the changes on the horizon. The impact of AI will be profound, but the economic framework for understanding it is surprisingly simple.

Introduction to Data Mining


Vipin Kumar - 2005
    Each major topic is organized into two chapters, beginning with basic concepts that provide necessary background for understanding each data mining technique, followed by more advanced concepts and algorithms.