Bayes Theorem Examples: An Intuitive Guide


Scott Hartshorn - 2016
    Bayes Theorem is essentially about estimating a probability and then updating that estimate based on other things that you know. This book is designed to give you an intuitive understanding of how to use Bayes Theorem. It starts with the definition of what Bayes Theorem is, but the focus of the book is on providing examples that you can follow and duplicate. Most of the examples are calculated in Excel, which is useful for updating a probability when you have dozens or hundreds of data points to roll in.
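
    The update the blurb describes is just Bayes' rule, P(H|E) = P(E|H)P(H) / P(E). As a rough illustration, and not an excerpt from the book (which works its examples in Excel), the sketch below applies one such update in Python; the sensitivity, false-positive rate, and prior are made-up numbers.

```python
# Minimal sketch of a single Bayesian update (illustrative only; the book
# works similar examples in Excel, and these numbers are invented).

def bayes_update(prior, likelihood, false_alarm_rate):
    """Return P(hypothesis | evidence) given
    prior            = P(hypothesis)
    likelihood       = P(evidence | hypothesis)
    false_alarm_rate = P(evidence | not hypothesis)."""
    evidence = likelihood * prior + false_alarm_rate * (1 - prior)
    return likelihood * prior / evidence

# Example: a test that is 90% sensitive with a 5% false-positive rate,
# applied to a hypothesis with a 10% prior.
posterior = bayes_update(prior=0.10, likelihood=0.90, false_alarm_rate=0.05)
print(round(posterior, 3))  # ~0.667: the 10% prior is updated to about 67%
```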

The Elements of Statistical Learning: Data Mining, Inference, and Prediction


Trevor Hastie - 2001
    The explosion in computation and information technology over recent decades has brought vast amounts of data in a variety of fields such as medicine, biology, finance, and marketing. The challenge of understanding these data has led to the development of new tools in the field of statistics and spawned new areas such as data mining, machine learning, and bioinformatics. Many of these tools have common underpinnings but are often expressed with different terminology. This book describes the important ideas in these areas in a common conceptual framework. While the approach is statistical, the emphasis is on concepts rather than mathematics. Many examples are given, with a liberal use of color graphics. It should be a valuable resource for statisticians and anyone interested in data mining in science or industry. The book's coverage is broad, from supervised learning (prediction) to unsupervised learning. The many topics include neural networks, support vector machines, and classification trees and boosting (the first comprehensive treatment of this topic in any book). Trevor Hastie, Robert Tibshirani, and Jerome Friedman are professors of statistics at Stanford University. They are prominent researchers in this area: Hastie and Tibshirani developed generalized additive models and wrote a popular book of that title. Hastie wrote much of the statistical modeling software in S-PLUS and invented principal curves and surfaces. Tibshirani proposed the Lasso and is co-author of the very successful An Introduction to the Bootstrap. Friedman is the co-inventor of many data-mining tools including CART, MARS, and projection pursuit.

Basic Category Theory for Computer Scientists


Benjamin C. Pierce - 1991
    Assuming a minimum of mathematical preparation, Basic Category Theory for Computer Scientists provides a straightforward presentation of the basic constructions and terminology of category theory, including limits, functors, natural transformations, adjoints, and cartesian closed categories. Four case studies illustrate applications of category theory to programming language design, semantics, and the solution of recursive domain equations. A brief literature survey offers suggestions for further study in more advanced texts.

An Introduction to Statistical Learning: With Applications in R


Gareth James - 2013
    This book presents some of the most important modeling and prediction techniques, along with relevant applications. Topics include linear regression, classification, resampling methods, shrinkage approaches, tree-based methods, support vector machines, clustering, and more. Color graphics and real-world examples are used to illustrate the methods presented. Since the goal of this textbook is to facilitate the use of these statistical learning techniques by practitioners in science, industry, and other fields, each chapter contains a tutorial on implementing the analyses and methods presented in R, an extremely popular open source statistical software platform. Two of the authors co-wrote The Elements of Statistical Learning (Hastie, Tibshirani and Friedman, 2nd edition 2009), a popular reference book for statistics and machine learning researchers. An Introduction to Statistical Learning covers many of the same topics, but at a level accessible to a much broader audience. This book is targeted at statisticians and non-statisticians alike who wish to use cutting-edge statistical learning techniques to analyze their data. The text assumes only a previous course in linear regression and no knowledge of matrix algebra.

Numerical Optimization


Jorge Nocedal - 2000
    Numerical optimization traces its roots to the calculus of variations and the work of Euler and Lagrange. This book takes a natural and reasonable approach to mathematical programming, covering numerical methods for finite-dimensional optimization problems. It begins with very simple ideas and progresses through more complicated concepts, concentrating on methods for both unconstrained and constrained optimization.
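
    For a concrete sense of the unconstrained case, here is a minimal steepest-descent sketch on a small quadratic. It illustrates the kind of iterative method the book analyzes, but it is not code from the book; the test function, the fixed step size, and the iteration limit are all assumptions made for this example.

```python
import numpy as np

# Illustrative steepest descent on f(x) = 0.5 * x^T A x - b^T x,
# a convex quadratic chosen only for this sketch; its gradient is A x - b.
A = np.array([[3.0, 0.5],
              [0.5, 1.0]])
b = np.array([1.0, 2.0])

x = np.zeros(2)
step = 0.1  # fixed step length, purely for illustration
for _ in range(200):
    grad = A @ x - b
    if np.linalg.norm(grad) < 1e-8:  # stop when the gradient is (nearly) zero
        break
    x = x - step * grad

print(x)                      # iterate after descent
print(np.linalg.solve(A, b))  # exact minimizer, for comparison
```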

Introduction to Automata Theory, Languages, and Computation


John E. Hopcroft - 1979
    With this long-awaited revision, the authors continue to present the theory in a concise and straightforward manner, now with an eye out for practical applications. They have revised this book to make it more accessible to today's students, including the addition of more material on writing proofs, more figures and pictures to convey ideas, side-boxes to highlight other interesting material, and a less formal writing style. Exercises at the end of each chapter, including some new, easier exercises, help readers confirm and enhance their understanding of the material. New to this edition:
    * Completely rewritten to be less formal, providing more accessibility to today's students.
    * Increased usage of figures and pictures to help convey ideas.
    * More detail and intuition provided for definitions and proofs.
    * Special side-boxes that present supplemental material of interest to readers.
    * More exercises, including many at a lower level.
    * Program-like notation for PDAs and Turing machines (a toy automaton in the same spirit is sketched below).
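
    As a toy illustration of the kind of machine the book studies (this is not the authors' notation, and the language accepted here is chosen only for the example), a deterministic finite automaton can be written as a transition table plus a loop:

```python
# Toy DFA accepting binary strings with an even number of 1s
# (an illustration of the automata the book studies, not its notation).
TRANSITIONS = {
    ("even", "0"): "even", ("even", "1"): "odd",
    ("odd",  "0"): "odd",  ("odd",  "1"): "even",
}
START, ACCEPTING = "even", {"even"}

def accepts(word: str) -> bool:
    state = START
    for symbol in word:
        state = TRANSITIONS[(state, symbol)]
    return state in ACCEPTING

print(accepts("1011"))  # False: three 1s
print(accepts("1001"))  # True: two 1s
```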

Coding the Matrix: Linear Algebra through Computer Science Applications


Philip N. Klein - 2013
    Mathematical concepts and computational problems are motivated by applications in computer science. The reader learns by "doing," writing programs to implement the mathematical concepts and using them to carry out tasks and explore the applications. Examples include: error-correcting codes, transformations in graphics, face detection, encryption and secret-sharing, integer factoring, removing perspective from an image, PageRank (Google's ranking algorithm), and cancer detection from cell features. A companion web site, codingthematrix.com, provides data and support code, and most of the assignments can be auto-graded online. There are over two hundred illustrations, including a selection of relevant "xkcd" comics. Chapters: "The Function," "The Field," "The Vector," "The Vector Space," "The Matrix," "The Basis," "Dimension," "Gaussian Elimination," "The Inner Product," "Special Bases," "The Singular Value Decomposition," "The Eigenvector," "The Linear Program"
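
    As one example of the "learning by doing" style, a bare-bones power-iteration PageRank might look like the sketch below. It is written in the spirit of the book's assignments rather than taken from them; the tiny link graph, the damping factor, and the iteration count are assumptions for this illustration.

```python
import numpy as np

# Tiny made-up link graph: page j links to the pages in links[j].
links = {0: [1, 2], 1: [2], 2: [0], 3: [2]}
n = 4
damping = 0.85  # conventional damping factor, assumed here

# Column-stochastic link matrix: column j spreads page j's rank to its targets.
M = np.zeros((n, n))
for j, targets in links.items():
    for i in targets:
        M[i, j] = 1.0 / len(targets)

rank = np.full(n, 1.0 / n)
for _ in range(100):  # power iteration with uniform teleportation
    rank = damping * M @ rank + (1 - damping) / n

print(rank.round(3))  # pages with more incoming weight end up ranked higher
```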

Computability and Logic


George S. Boolos - 1980
    The book includes a selection of exercises, adjusted for this edition, at the end of each chapter, and offers a new and simpler treatment of the representability of recursive functions, a traditional stumbling block for students on the way to the Gödel incompleteness theorems.

Quantum Computation and Quantum Information


Michael A. Nielsen - 2000
    The authors describe what a quantum computer is, how it can be used to solve problems faster than familiar "classical" computers, and the real-world implementation of quantum computers. A wealth of accompanying figures and exercises illustrates and develops the material in more depth. The book concludes with an explanation of how quantum states can be used to perform remarkable feats of communication, and of how it is possible to protect quantum states against the effects of noise.

Numerical Linear Algebra


Lloyd N. Trefethen - 1997
    The clarity and eloquence of the presentation make it popular with teachers and students alike. The text aims to expand the reader's view of the field and to present standard material in a novel way. All of the most important topics in the field are covered with a fresh perspective, including iterative methods for systems of equations and eigenvalue problems and the underlying principles of conditioning and stability. The presentation takes the form of 40 lectures, each of which focuses on one or two central ideas. The unity between topics is emphasized throughout, with no risk of getting lost in details and technicalities. The book breaks with tradition by beginning with the QR factorization, an important and fresh idea for students and the thread that connects most of the algorithms of numerical linear algebra.
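
    For readers who want to see the QR idea in action before opening the book, here is a small NumPy sketch (not from the text) that uses the factorization to solve a least-squares problem; the random matrix and right-hand side are stand-in data chosen for this example.

```python
import numpy as np

# Solve the least-squares problem min ||A x - b|| via a reduced QR factorization.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 3))   # tall, full-rank matrix (stand-in data)
b = rng.standard_normal(50)

Q, R = np.linalg.qr(A)             # reduced QR: A = Q R, Q has orthonormal columns
x = np.linalg.solve(R, Q.T @ b)    # back-substitute R x = Q^T b

# Agrees with NumPy's own least-squares solver.
print(np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0]))
```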

Elements Of Discrete Mathematics: Solutions Manual


Chung Laung Liu - 1999
    

How to Create a Mind: The Secret of Human Thought Revealed


Ray Kurzweil - 2012
    In How to Create a Mind, Kurzweil presents a provocative exploration of the most important project in human-machine civilization: reverse engineering the brain to understand precisely how it works and using that knowledge to create even more intelligent machines. Kurzweil discusses how the brain functions, how the mind emerges from the brain, and the implications of vastly increasing the powers of our intelligence in addressing the world’s problems. He thoughtfully examines emotional and moral intelligence and the origins of consciousness, and envisions the radical possibilities of our merging with the intelligent technology we are creating. Certain to be one of the most widely discussed and debated science books of the year, How to Create a Mind is sure to take its place alongside Kurzweil’s previous classics, which include Fantastic Voyage: Live Long Enough to Live Forever and The Age of Spiritual Machines.

The Book of Why: The New Science of Cause and Effect


Judea Pearl - 2018
    Today, the long-standing taboo against talking about causation is dead. The causal revolution, instigated by Judea Pearl and his colleagues, has cut through a century of confusion and established causality, the study of cause and effect, on a firm scientific basis. His work explains how we can know easy things, like whether it was rain or a sprinkler that made a sidewalk wet, and how to answer hard questions, like whether a drug cured an illness. Pearl's work enables us to know not just whether one thing causes another: it lets us explore the world that is and the worlds that could have been. It shows us the essence of human thought and the key to artificial intelligence. Anyone who wants to understand either needs The Book of Why.
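
    The sidewalk question can be made concrete with a toy joint distribution (the numbers below are invented for this sketch and are not Pearl's): conditioning on the wet sidewalk tells us which explanation is more probable, while the book's point is that genuinely causal questions need more than a table like this.

```python
# Toy "rain or sprinkler?" calculation over an invented joint distribution
# P(rain, sprinkler, wet); all probabilities are made up for illustration.
joint = {
    # (rain, sprinkler, wet): probability
    (1, 0, 1): 0.19, (1, 0, 0): 0.01,
    (0, 1, 1): 0.09, (0, 1, 0): 0.01,
    (1, 1, 1): 0.05, (1, 1, 0): 0.00,
    (0, 0, 1): 0.01, (0, 0, 0): 0.64,
}
assert abs(sum(joint.values()) - 1.0) < 1e-9

def prob(pred):
    return sum(p for outcome, p in joint.items() if pred(outcome))

p_wet = prob(lambda o: o[2] == 1)
p_rain_given_wet = prob(lambda o: o[0] == 1 and o[2] == 1) / p_wet
p_sprinkler_given_wet = prob(lambda o: o[1] == 1 and o[2] == 1) / p_wet
print(round(p_rain_given_wet, 2), round(p_sprinkler_given_wet, 2))  # 0.71 0.41
```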

Hands-On Machine Learning with Scikit-Learn and TensorFlow


Aurélien Géron - 2017
    Now that machine learning is thriving, even programmers who know close to nothing about this technology can use simple, efficient tools to implement programs capable of learning from data. This practical book shows you how. By using concrete examples, minimal theory, and two production-ready Python frameworks, Scikit-Learn and TensorFlow, author Aurélien Géron helps you gain an intuitive understanding of the concepts and tools for building intelligent systems. You’ll learn how to use a range of techniques, starting with simple Linear Regression and progressing to Deep Neural Networks. If you have some programming experience and you’re ready to code a machine learning project, this guide is for you. This hands-on book shows you how to use: Scikit-Learn, an accessible framework that implements many algorithms efficiently and serves as a great machine learning entry point; TensorFlow, a more complex library for distributed numerical computation, ideal for training and running very large neural networks; and practical code examples that you can apply without learning excessive machine learning theory or algorithm details.
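
    As a flavor of the Scikit-Learn entry point the blurb mentions, a minimal linear-regression fit might look like the sketch below. The synthetic data and parameter values are assumptions made for this example, not an excerpt from the book.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic data (assumed for this sketch): y is roughly 3*x + 4 plus noise.
rng = np.random.default_rng(42)
X = rng.uniform(0, 2, size=(100, 1))
y = 4 + 3 * X[:, 0] + rng.normal(scale=0.2, size=100)

model = LinearRegression()   # Scikit-Learn's ordinary least-squares estimator
model.fit(X, y)

print(model.intercept_, model.coef_)   # should be close to 4 and [3]
print(model.predict([[1.5]]))          # prediction for a new input
```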

Introduction to Artificial Intelligence and Expert Systems


Dan W. Patterson - 1990