Statistical Rethinking: A Bayesian Course with Examples in R and Stan


Richard McElreath - 2015
    Reflecting the need for even minor programming in today's model-based statistics, the book pushes readers to perform step-by-step calculations that are usually automated. This unique computational approach ensures that readers understand enough of the details to make reasonable choices and interpretations in their own modeling work. The text presents generalized linear multilevel models from a Bayesian perspective, relying on a simple logical interpretation of Bayesian probability and maximum entropy. It covers everything from the basics of regression to multilevel models, and the author also discusses measurement error, missing data, and Gaussian process models for spatial and network autocorrelation. By using complete R code examples throughout, this book provides a practical foundation for performing statistical inference. Designed for both PhD students and seasoned professionals in the natural and social sciences, it prepares them for more advanced or specialized statistical modeling. The book is accompanied by an R package (rethinking), available on the author's website and GitHub; its two core functions, map and map2stan, allow a variety of statistical models to be constructed from standard model formulas.
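    As a taste of the step-by-step calculations the book favors, here is a minimal sketch of a grid-approximation posterior for a simple binomial model. It is written in Python with made-up data, not the book's R code or the rethinking package:

        import numpy as np

        # Hypothetical data: 6 successes observed in 9 binomial trials.
        successes, trials = 6, 9

        # Grid approximation of the posterior for the success probability p --
        # the kind of calculation usually hidden inside a fitting function.
        p_grid = np.linspace(0, 1, 1000)            # candidate values of p
        prior = np.ones_like(p_grid)                # flat prior
        likelihood = p_grid**successes * (1 - p_grid)**(trials - successes)
        posterior = likelihood * prior
        posterior /= posterior.sum()                # normalize so it sums to 1

        print("posterior mean of p:", (p_grid * posterior).sum())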

Vehicles: Experiments in Synthetic Psychology


Valentino Braitenberg - 1984
    Braitenberg's vehicles are a series of hypothetical, self-operating machines that exhibit increasingly intricate, if not always successful or civilized, behavior. Each vehicle in the series incorporates the essential features of all the earlier models, and along the way the vehicles come to embody aggression, love, logic, manifestations of foresight, concept formation, creative thinking, personality, and free will. In a section of extensive biological notes, Braitenberg locates many elements of his fantasy in current brain research.
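    To get a concrete feel for the simplest vehicles, the sketch below (not from the book) simulates a hypothetical two-sensor, two-wheel machine in Python and shows how merely crossing the sensor-to-motor wiring flips its behavior from retreating from a light to charging at it:

        import math

        LIGHT = (2.0, 1.0)   # position of a light source in the plane (arbitrary)

        def brightness(px, py):
            """Sensor excitation falls off with squared distance to the light."""
            return 1.0 / (1.0 + (px - LIGHT[0])**2 + (py - LIGHT[1])**2)

        def simulate(crossed, steps=300, dt=0.1):
            x, y, heading = 0.0, 0.0, 0.0        # start at the origin, facing +x
            width, offset = 0.4, 0.5             # axle width and sensor placement
            for _ in range(steps):
                # left and right light sensors, mounted ahead of the body
                s_left = brightness(x + offset * math.cos(heading + 0.5),
                                    y + offset * math.sin(heading + 0.5))
                s_right = brightness(x + offset * math.cos(heading - 0.5),
                                     y + offset * math.sin(heading - 0.5))
                # Crossed wiring drives each wheel from the opposite sensor, so the
                # vehicle turns toward the light ("aggression"); uncrossed wiring
                # drives each wheel from its own side, turning it away ("fear").
                v_left, v_right = (s_right, s_left) if crossed else (s_left, s_right)
                heading += (v_right - v_left) / width * dt
                speed = (v_left + v_right) / 2.0
                x += speed * math.cos(heading) * dt
                y += speed * math.sin(heading) * dt
            return math.dist((x, y), LIGHT)      # final distance from the light

        print("crossed wiring, final distance to light:  ", round(simulate(True), 2))
        print("uncrossed wiring, final distance to light:", round(simulate(False), 2))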

The Feynman Lectures on Physics Vol 1


Richard P. Feynman - 1963
    This edition, which was prepared by Kip S. Thorne (Feynman Professor of Theoretical Physics at California Institute of Technology), fully incorporates all the errata and corrections gathered (but never used in a published edition) by Feynman.

The Algorithm Design Manual


Steven S. Skiena - 1997
    Drawing heavily on the author's own real-world experiences, the book stresses design and analysis. Coverage is divided into two parts, the first being a general guide to techniques for the design and analysis of computer algorithms. The second is a reference section, which includes a catalog of the 75 most important algorithmic problems. By browsing this catalog, readers can quickly identify what the problem they have encountered is called, what is known about it, and how they should proceed if they need to solve it. This book is ideal for the working professional who uses algorithms on a daily basis and needs a handy reference. This work can also readily be used in an upper-division course or as a student reference guide. THE ALGORITHM DESIGN MANUAL comes with a CD-ROM that contains: * a complete hypertext version of the full printed book. * the source code and URLs for all cited implementations. * over 30 hours of audio lectures on the design and analysis of algorithms, all keyed to on-line lecture notes.

Fundamentals of Modern Manufacturing: Materials, Processes, and Systems


Mikell P. Groover - 2000
    This text follows a more quantitative and design-oriented approach than other texts on the market, helping readers gain a better understanding of important concepts. They'll also discover how material properties relate to the process variables in a given process as well as how to perform manufacturing science and quantitative engineering analysis of manufacturing processes.

Numerical Recipes in C: The Art of Scientific Computing


William H. Press - 1988
    In a self-contained manner it proceeds from mathematical and theoretical considerations to actual practical computer routines. With over 100 new routines bringing the total to well over 300, plus upgraded versions of the original routines, the new edition remains the most practical, comprehensive handbook of scientific computing available today.

Data Mining: Practical Machine Learning Tools and Techniques


Ian H. Witten - 1999
    This highly anticipated fourth edition of the most ...

Chaos: Making a New Science


James Gleick - 1987
    From Edward Lorenz’s discovery of the Butterfly Effect, to Mitchell Feigenbaum’s calculation of a universal constant, to Benoit Mandelbrot’s concept of fractals, which created a new geometry of nature, Gleick’s engaging narrative focuses on the key figures whose genius converged to chart an innovative direction for science. In Chaos, Gleick makes the story of chaos theory not only fascinating but also accessible to beginners, and opens our eyes to a surprising new view of the universe.

Grokking Deep Learning


Andrew W. Trask - 2017
    Loosely based on neuron behavior inside human brains, these systems are rapidly catching up with the intelligence of their human creators, defeating the world champion Go player, achieving superhuman performance on video games, driving cars, translating languages, and sometimes even helping law enforcement fight crime. Deep Learning is a revolution that is changing every industry across the globe. Grokking Deep Learning is the perfect place to begin your deep learning journey. Rather than just learning the "black box" API of some library or framework, you will actually understand how to build these algorithms completely from scratch. You will understand how Deep Learning is able to learn at levels greater than humans. You will be able to understand the "brain" behind state-of-the-art Artificial Intelligence. Furthermore, unlike other courses that assume advanced knowledge of calculus and rely on complex mathematical notation, this one requires nothing more than high-school algebra: if you're a Python hacker who passed that, you're ready to go. And at the end, you'll even build an A.I. that will learn to defeat you in a classic Atari game.
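    In the spirit of that from-scratch approach, here is a minimal illustrative sketch (not code from the book): a single sigmoid neuron learning the OR function by plain gradient descent on a cross-entropy loss:

        import numpy as np

        rng = np.random.default_rng(0)
        X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
        y = np.array([0.0, 1.0, 1.0, 1.0])        # OR targets

        w, b, lr = rng.normal(size=2), 0.0, 1.0

        def sigmoid(z):
            return 1.0 / (1.0 + np.exp(-z))

        for step in range(1000):
            pred = sigmoid(X @ w + b)             # forward pass
            grad = pred - y                       # gradient of the loss w.r.t. the pre-activation
            w -= lr * X.T @ grad / len(y)         # backward pass: adjust the weights
            b -= lr * grad.mean()                 # ... and the bias

        print(np.round(sigmoid(X @ w + b), 2))    # approaches [0, 1, 1, 1]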

Tell Me The Odds: A 15 Page Introduction To Bayes Theorem


Scott Hartshorn - 2017
    Essentially, you make an initial guess, and then get more data to improve it. Bayes Theorem, or Bayes Rule, has a ton of real world applications, from estimating your risk of a heart attack to making recommendations on Netflix. But it isn't that complicated: this book is a short introduction to Bayes Theorem. It is only 15 pages long, and is intended to show you how Bayes Theorem works as quickly as possible. The examples are intentionally kept simple to focus solely on Bayes Theorem without requiring that the reader know complicated probability distributions. If you want to learn the basics of Bayes Theorem as quickly as possible, with some easy to duplicate examples, this is a good book for you.
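    For a flavor of how the updating works, here is a short worked example of Bayes Theorem in Python, using made-up illustrative numbers rather than figures from the book:

        # P(risk | positive) = P(positive | risk) * P(risk) / P(positive)
        prior = 0.01            # initial guess: 1% of people are high-risk (assumed)
        sensitivity = 0.90      # P(test positive | high-risk)       (assumed)
        false_positive = 0.10   # P(test positive | not high-risk)   (assumed)

        # Total probability of a positive test, by the law of total probability
        p_positive = sensitivity * prior + false_positive * (1 - prior)

        # Posterior: the initial guess improved by the new data
        posterior = sensitivity * prior / p_positive
        print(round(posterior, 3))   # about 0.083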

Elements of Information Theory


Thomas M. Cover - 1991
    Readers are provided once again with an instructive mix of mathematics, physics, statistics, and information theory. All the essential topics in information theory are covered in detail, including entropy, data compression, channel capacity, rate distortion, network information theory, and hypothesis testing. The authors provide readers with a solid understanding of the underlying theory and applications. Problem sets and a telegraphic summary at the end of each chapter further assist readers. The historical notes that follow each chapter recap the main points. The Second Edition features: * Chapters reorganized to improve teaching * 200 new problems * New material on source coding, portfolio theory, and feedback capacity * Updated references. Now current and enhanced, the Second Edition of Elements of Information Theory remains the ideal textbook for upper-level undergraduate and graduate courses in electrical engineering, statistics, and telecommunications.
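    As a small illustration of the first topic on that list, here is a sketch (not from the book) computing Shannon entropy for two simple distributions:

        import math

        def entropy(probabilities):
            """Shannon entropy in bits: H(X) = -sum of p * log2(p)."""
            return -sum(p * math.log2(p) for p in probabilities if p > 0)

        print(entropy([0.5, 0.5]))   # a fair coin: 1.0 bit per toss
        print(entropy([0.9, 0.1]))   # a biased coin: about 0.47 bits per toss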

Algorithms


Robert Sedgewick - 1983
    This book surveys the most important computer algorithms currently in use and provides a full treatment of data structures and algorithms for sorting, searching, graph processing, and string processing -- including fifty algorithms every programmer should know. In this edition, new Java implementations are written in an accessible modular programming style, where all of the code is exposed to the reader and ready to use. The algorithms in this book represent a body of knowledge developed over the last 50 years that has become indispensable, not just for professional programmers and computer science students but for any student with interests in science, mathematics, and engineering, not to mention students who use computation in the liberal arts. The companion web site, algs4.cs.princeton.edu, contains an online synopsis, full Java implementations, test data, exercises and answers, dynamic visualizations, lecture slides, programming assignments with checklists, and links to related material. The MOOC related to this book is accessible via the "Online Course" link at algs4.cs.princeton.edu. The course offers more than 100 video lecture segments that are integrated with the text, extensive online assessments, and the large-scale discussion forums that have proven so valuable. Offered each fall and spring, this course regularly attracts tens of thousands of registrants. Robert Sedgewick and Kevin Wayne are developing a modern approach to disseminating knowledge that fully embraces technology, enabling people all around the world to discover new ways of learning and teaching. By integrating their textbook, online content, and MOOC, all at the state of the art, they have built a unique resource that greatly expands the breadth and depth of the educational experience.

Advanced Engineering Mathematics


Erwin Kreyszig - 1968
    The new edition provides invitations - not requirements - to use technology, as well as new conceptual problems and new projects that focus on writing and working in teams.

R for Dummies


Joris Meys - 2012
    R is packed with powerful programming capabilities, but learning to use R in the real world can be overwhelming for even the most seasoned statisticians. This easy-to-follow guide explains how to use R for data processing and statistical analysis, and then shows you how to present your data using compelling and informative graphics. You'll gain practical experience using R in a variety of settings and delve deeper into R's feature-rich toolset. The book includes tips for the initial installation of R, demonstrates how to easily perform calculations on vectors, arrays, and lists of data, shows how to effectively visualize data using R's powerful graphics packages, gives pointers on how to find, install, and use add-on packages created by the R community, and provides tips on getting additional help from R mailing lists and websites. Whether you're just starting out with statistical analysis or are a procedural programming pro, "R For Dummies" is the book you need to get the most out of R.

Statistics Done Wrong: The Woefully Complete Guide


Alex Reinhart - 2013
    Politicians and marketers present shoddy evidence for dubious claims all the time. But smart people make mistakes too, and when it comes to statistics, plenty of otherwise great scientists--yes, even those published in peer-reviewed journals--are doing statistics wrong. "Statistics Done Wrong" comes to the rescue with cautionary tales of all-too-common statistical fallacies. It'll help you see where and why researchers often go wrong and teach you the best practices for avoiding their mistakes. In this book, you'll learn: - Why "statistically significant" doesn't necessarily imply practical significance - Ideas behind hypothesis testing and regression analysis, and common misinterpretations of those ideas - How and how not to ask questions, design experiments, and work with data - Why many studies have too little data to detect what they're looking for - and, surprisingly, why this means published results are often overestimates - Why false positives are much more common than "significant at the 5% level" would suggest. By walking through colorful examples of statistics gone awry, the book offers approachable lessons on proper methodology, and each chapter ends with pro tips for practicing scientists and statisticians. No matter what your level of experience, "Statistics Done Wrong" will teach you how to be a better analyst, data scientist, or researcher.
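    The last of those points can be illustrated with a little arithmetic. The sketch below uses assumed, purely illustrative numbers (not figures from the book) for statistical power and for the base rate of true hypotheses:

        alpha = 0.05        # significance threshold
        power = 0.35        # chance an underpowered study detects a real effect (assumed)
        base_rate = 0.10    # fraction of tested hypotheses that are actually true (assumed)

        true_positives = base_rate * power          # real effects that reach significance
        false_positives = (1 - base_rate) * alpha   # null effects that reach significance anyway

        share_false = false_positives / (true_positives + false_positives)
        print(round(share_false, 2))   # about 0.56: over half of the significant results are false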