Book picks similar to Probabilistic Models of Cognition by Noah D. Goodman
Hands-On Machine Learning with Scikit-Learn and TensorFlow
Aurélien Géron - 2017
Now that machine learning is thriving, even programmers who know close to nothing about this technology can use simple, efficient tools to implement programs capable of learning from data. This practical book shows you how. By using concrete examples, minimal theory, and two production-ready Python frameworks, Scikit-Learn and TensorFlow, author Aurélien Géron helps you gain an intuitive understanding of the concepts and tools for building intelligent systems. You'll learn how to use a range of techniques, starting with simple Linear Regression and progressing to Deep Neural Networks. If you have some programming experience and you're ready to code a machine learning project, this guide is for you. This hands-on book shows you how to use:
- Scikit-Learn, an accessible framework that implements many algorithms efficiently and serves as a great machine learning entry point
- TensorFlow, a more complex library for distributed numerical computation, ideal for training and running very large neural networks
- Practical code examples that you can apply without learning excessive machine learning theory or algorithm details
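To give a sense of the Scikit-Learn workflow the blurb describes, here is a minimal sketch (not code from the book) that fits a simple Linear Regression model on synthetic data and makes a prediction; the data and coefficients are illustrative assumptions.

```python
# Minimal Scikit-Learn sketch (not from the book): fit a Linear Regression
# model on synthetic data and predict for a new input.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(100, 1))              # one synthetic feature
y = 3.0 * X[:, 0] + 2.0 + rng.normal(0, 1, 100)    # noisy linear target

model = LinearRegression()
model.fit(X, y)                                    # estimate slope and intercept
print(model.coef_, model.intercept_)               # roughly [3.0] and 2.0
print(model.predict([[5.0]]))                      # prediction for a new input
```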
Kotlin for Android Developers: Learn Kotlin the easy way while developing an Android App
Antonio Leiva - 2016
The Conscious Mind: In Search of a Fundamental Theory
David J. Chalmers - 1996
In recent years the debate over consciousness has heated up, with philosophers and scientists such as Daniel Dennett, Gerald Edelman, and Roger Penrose all firing volleys in what has come to be called the consciousness wars. Now, in The Conscious Mind, philosopher David J. Chalmers offers a cogent analysis of this heated debate as he unveils a major new theory of consciousness, one that rejects the prevailing reductionist trend of science, while offering provocative insights into the relationship between mind and brain. Writing in a rigorous, thought-provoking style, the author takes us on a far-reaching tour through the philosophical ramifications of consciousness. Chalmers convincingly reveals how contemporary cognitive science and neurobiology have failed to explain how and why mental events emerge from physiological occurrences in the brain. He proposes instead that conscious experience must be understood in an entirely new light--as an irreducible entity (similar to such physical properties as time, mass, and space) that exists at a fundamental level and cannot be understood as the sum of its parts. And after suggesting some intriguing possibilities about the structure and laws of conscious experience, he details how his unique reinterpretation of the mind could be the focus of a new science. Throughout the book, Chalmers provides fascinating thought experiments that trenchantly illustrate his ideas. For example, in exploring the notion that consciousness could be experienced by machines as well as humans, Chalmers asks us to imagine a thinking brain in which neurons are slowly replaced by silicon chips that precisely duplicate their functions--as the neurons are replaced, will consciousness gradually fade away? The book also features thoughtful discussions of how the author's theories might be practically applied to subjects as diverse as artificial intelligence and the interpretation of quantum mechanics. All of us have pondered the nature and meaning of consciousness. Engaging and penetrating, The Conscious Mind adds a fresh new perspective to the subject that is sure to spark debate about our understanding of the mind for years to come.
Deep Learning
Ian Goodfellow - 2016
Deep learning is a form of machine learning that enables computers to learn from experience and understand the world in terms of a hierarchy of concepts. Because the computer gathers knowledge from experience, there is no need for a human computer operator to formally specify all the knowledge that the computer needs. The hierarchy of concepts allows the computer to learn complicated concepts by building them out of simpler ones; a graph of these hierarchies would be many layers deep. This book introduces a broad range of topics in deep learning. The text offers mathematical and conceptual background, covering relevant concepts in linear algebra, probability theory and information theory, numerical computation, and machine learning. It describes deep learning techniques used by practitioners in industry, including deep feedforward networks, regularization, optimization algorithms, convolutional networks, sequence modeling, and practical methodology; and it surveys such applications as natural language processing, speech recognition, computer vision, online recommendation systems, bioinformatics, and videogames. Finally, the book offers research perspectives, covering such theoretical topics as linear factor models, autoencoders, representation learning, structured probabilistic models, Monte Carlo methods, the partition function, approximate inference, and deep generative models. Deep Learning can be used by undergraduate or graduate students planning careers in either industry or research, and by software engineers who want to begin using deep learning in their products or platforms. A website offers supplementary material for both readers and instructors.
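A deep feedforward network of the kind the blurb mentions is, at its core, a stack of layers, each applying a linear map followed by a nonlinearity. The sketch below is plain NumPy, not code from the book; the layer sizes and random weights are illustrative assumptions.

```python
# Minimal forward pass of a deep feedforward network in plain NumPy.
# Layer sizes and random weights are illustrative, not from the book.
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

# Two hidden layers: 4 inputs -> 8 units -> 8 units -> 1 output.
sizes = [4, 8, 8, 1]
weights = [rng.normal(0, 0.5, (m, n)) for m, n in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]

def forward(x):
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):
        h = relu(h @ W + b)                    # simple features feed more complex ones
    return h @ weights[-1] + biases[-1]        # linear output layer

print(forward(rng.normal(size=4)))             # a single prediction from random weights
```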
Basic Category Theory for Computer Scientists
Benjamin C. Pierce - 1991
Assuming a minimum of mathematical preparation, Basic Category Theory for Computer Scientists provides a straightforward presentation of the basic constructions and terminology of category theory, including limits, functors, natural transformations, adjoints, and cartesian closed categories. Four case studies illustrate applications of category theory to programming language design, semantics, and the solution of recursive domain equations. A brief literature survey offers suggestions for further study in more advanced texts.
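As a taste of the terminology the book covers, here is the standard textbook definition of a functor written out in LaTeX; this is general category-theory material, not a quotation from Pierce.

```latex
% Standard definition of a functor between categories C and D
% (textbook material, not quoted from Pierce).
A functor $F \colon \mathcal{C} \to \mathcal{D}$ assigns to each object
$A \in \mathcal{C}$ an object $F(A) \in \mathcal{D}$, and to each morphism
$f \colon A \to B$ a morphism $F(f) \colon F(A) \to F(B)$, such that
\[
  F(\mathrm{id}_A) = \mathrm{id}_{F(A)}
  \qquad\text{and}\qquad
  F(g \circ f) = F(g) \circ F(f).
\]
```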
Machine Learning
Tom M. Mitchell - 1997
Mitchell covers the field of machine learning, the study of algorithms that allow computer programs to improve automatically through experience and to infer general laws from specific data.
Feynman Lectures On Computation
Richard P. Feynman - 1996
When Richard Feynman gave his famous course on computation at the California Institute of Technology, he asked Tony Hey to adapt his lecture notes into a book. Although led by Feynman, the course also featured, as occasional guest speakers, some of the most brilliant men in science at that time, including Marvin Minsky, Charles Bennett, and John Hopfield. Although the lectures are now thirteen years old, most of the material is timeless and presents a "Feynmanesque" overview of many standard and some not-so-standard topics in computer science such as reversible logic gates and quantum computers.
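As a small illustration (not from the lectures) of the reversible logic gates the blurb mentions: the controlled-NOT gate flips its target bit exactly when the control bit is 1, and applying it twice recovers the original input, so no information is destroyed.

```python
# Reversible logic sketch (not from the lectures): the CNOT gate is a bijection
# on bit pairs and is its own inverse.
def cnot(control, target):
    return control, target ^ control   # XOR the control into the target

for bits in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    once = cnot(*bits)
    twice = cnot(*once)
    print(bits, "->", once, "-> back to", twice)   # second application undoes the first
```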
Paradigms of Artificial Intelligence Programming: Case Studies in Common LISP
Peter Norvig - 1991
By reconstructing authentic, complex AI programs using state-of-the-art Common Lisp, the book teaches students and professionals how to build and debug robust practical programs, while demonstrating superior programming style and important AI concepts. The author strongly emphasizes the practical performance issues involved in writing real working programs of significant size. Chapters on troubleshooting and efficiency are included, along with a discussion of the fundamentals of object-oriented programming and a description of the main CLOS functions. This volume is an excellent text for a course on AI programming, a useful supplement for general AI courses and an indispensable reference for the professional programmer.
Bayesian Methods for Hackers: Probabilistic Programming and Bayesian Inference
Cameron Davidson-Pilon - 2014
Bayesian methods of inference are deeply natural and extremely powerful. However, most discussions of Bayesian inference rely on intensely complex mathematical analyses and artificial examples, making it inaccessible to anyone without a strong mathematical background. Now, though, Cameron Davidson-Pilon introduces Bayesian inference from a computational perspective, bridging theory to practice and freeing you to get results using computing power. Bayesian Methods for Hackers illuminates Bayesian inference through probabilistic programming with the powerful PyMC language and the closely related Python tools NumPy, SciPy, and Matplotlib. Using this approach, you can reach effective solutions in small increments, without extensive mathematical intervention. Davidson-Pilon begins by introducing the concepts underlying Bayesian inference, comparing it with other techniques and guiding you through building and training your first Bayesian model. Next, he introduces PyMC through a series of detailed examples and intuitive explanations that have been refined after extensive user feedback. You'll learn how to use the Markov Chain Monte Carlo algorithm, choose appropriate sample sizes and priors, work with loss functions, and apply Bayesian inference in domains ranging from finance to marketing. Once you've mastered these techniques, you'll constantly turn to this guide for the working PyMC code you need to jumpstart future projects. Coverage includes:
- Learning the Bayesian "state of mind" and its practical implications
- Understanding how computers perform Bayesian inference
- Using the PyMC Python library to program Bayesian analyses
- Building and debugging models with PyMC
- Testing your model's "goodness of fit"
- Opening the "black box" of the Markov Chain Monte Carlo algorithm to see how and why it works
- Leveraging the power of the "Law of Large Numbers"
- Mastering key concepts, such as clustering, convergence, autocorrelation, and thinning
- Using loss functions to measure an estimate's weaknesses based on your goals and desired outcomes
- Selecting appropriate priors and understanding how their influence changes with dataset size
- Overcoming the "exploration versus exploitation" dilemma: deciding when "pretty good" is good enough
- Using Bayesian inference to improve A/B testing
- Solving data science problems when only small amounts of data are available
Cameron Davidson-Pilon has worked in many areas of applied mathematics, from the evolutionary dynamics of genes and diseases to stochastic modeling of financial prices. His contributions to the open source community include lifelines, an implementation of survival analysis in Python. Educated at the University of Waterloo and at the Independent University of Moscow, he currently works with the online commerce leader Shopify.
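To make the "black box" of MCMC a little more concrete, here is a minimal Metropolis sampler written in plain NumPy rather than PyMC; the data, flat prior, and proposal width are illustrative assumptions, not an example from the book.

```python
# Minimal Metropolis sampler (plain NumPy, not PyMC) for the posterior of a
# conversion rate p given Bernoulli data, with a flat prior on [0, 1].
# The data and proposal width below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)
data = rng.binomial(1, 0.15, size=200)             # pretend A/B-test observations

def log_posterior(p):
    if not 0.0 < p < 1.0:
        return -np.inf                             # flat prior: zero density outside [0, 1]
    return np.sum(data * np.log(p) + (1 - data) * np.log(1 - p))

samples = []
p_current = 0.5
for _ in range(10_000):
    p_proposal = p_current + rng.normal(0, 0.05)   # symmetric random-walk proposal
    log_accept = log_posterior(p_proposal) - log_posterior(p_current)
    if np.log(rng.uniform()) < log_accept:         # Metropolis acceptance rule
        p_current = p_proposal
    samples.append(p_current)

posterior = np.array(samples[2000:])               # discard burn-in
print(posterior.mean(), np.percentile(posterior, [2.5, 97.5]))
```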
Computer Systems: A Programmer's Perspective
Randal E. Bryant - 2002
Often, computer science and computer engineering curricula don't provide students with a concentrated and consistent introduction to the fundamental concepts that underlie all computer systems. Traditional computer organization and logic design courses cover some of this material, but they focus largely on hardware design. They provide students with little or no understanding of how important software components operate, how application programs use systems, or how system attributes affect the performance and correctness of application programs.
- A more complete view of systems: takes a broader view of systems than traditional computer organization books, covering aspects of computer design, operating systems, compilers, and networking, and gives students an understanding of how programs run on real systems.
- Systems presented from a programmer's perspective: material is presented in a way that has clear benefit to application programmers; students learn how to use this knowledge to improve program performance and reliability, and they become more effective at debugging because they understand how their programs map onto the underlying system.
Mind: A Brief Introduction
John Rogers Searle - 2004
One of the world's most eminent thinkers, Searle dismantles the most famous and influential contemporary theories of mind as he presents a vividly written, comprehensive introduction to the subject. He begins with a look at the twelve problems of philosophy of mind--which he calls Descartes and Other Disasters--problems which he returns to throughout the volume, as he illuminates such topics as materialism, consciousness, the mind-body problem, intentionality, mental causation, free will, and the self. The book offers a refreshingly direct and engaging introduction to one of the most intriguing areas of philosophy.
Computer Age Statistical Inference: Algorithms, Evidence, and Data Science
Bradley Efron - 2016
'Big data', 'data science', and 'machine learning' have become familiar terms in the news, as statistical methods are brought to bear upon the enormous data sets of modern science and commerce. How did we get here? And where are we going? This book takes us on an exhilarating journey through the revolution in data analysis following the introduction of electronic computation in the 1950s. Beginning with classical inferential theories - Bayesian, frequentist, Fisherian - individual chapters take up a series of influential topics: survival analysis, logistic regression, empirical Bayes, the jackknife and bootstrap, random forests, neural networks, Markov chain Monte Carlo, inference after model selection, and dozens more. The distinctly modern approach integrates methodology and algorithms with statistical inference. The book ends with speculation on the future direction of statistics and data science.
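One of the computer-age tools the blurb names, the bootstrap, is easy to sketch: resample the data with replacement many times and look at the spread of the recomputed statistic. The NumPy example below is a generic illustration, not code or data from the book.

```python
# Generic bootstrap sketch (not from the book): estimate the standard error of
# a sample mean by resampling the data with replacement.
import numpy as np

rng = np.random.default_rng(1)
data = rng.exponential(scale=2.0, size=50)     # an illustrative skewed sample

n_boot = 5000
boot_means = np.empty(n_boot)
for i in range(n_boot):
    resample = rng.choice(data, size=data.size, replace=True)
    boot_means[i] = resample.mean()            # statistic recomputed on each resample

print("sample mean:", data.mean())
print("bootstrap standard error:", boot_means.std(ddof=1))
print("95% percentile interval:", np.percentile(boot_means, [2.5, 97.5]))
```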
Hands-On Programming with R: Write Your Own Functions and Simulations
Garrett Grolemund - 2014
With this book, you'll learn how to load data, assemble and disassemble data objects, navigate R's environment system, write your own functions, and use all of R's programming tools. RStudio Master Instructor Garrett Grolemund not only teaches you how to program, but also shows you how to get more from R than just visualizing and modeling data. You'll gain valuable programming skills and support your work as a data scientist at the same time.
- Work hands-on with three practical data analysis projects based on casino games
- Store, retrieve, and change data values in your computer's memory
- Write programs and simulations that outperform those written by typical R users
- Use R programming tools such as if else statements, for loops, and S3 classes
- Learn how to write lightning-fast vectorized R code
- Take advantage of R's package system and debugging tools
- Practice and apply R programming concepts as you learn them
Artificial Intelligence: Structures and Strategies for Complex Problem Solving
George F. Luger - 1997
The book is suitable for a one- or two-semester university course on AI, as well as for researchers in the field.
Machine Learning for Dummies
John Paul Mueller - 2016
Without machine learning, fraud detection, web search results, real-time ads on web pages, credit scoring, automation, and email spam filtering wouldn't be possible, and these are just a few of its capabilities. Written by two data science experts, Machine Learning For Dummies offers a much-needed entry point for anyone looking to use machine learning to accomplish practical tasks. Covering the entry-level topics needed to get you familiar with the basic concepts of machine learning, this guide quickly helps you make sense of the programming languages and tools you need to turn machine learning-based tasks into a reality. Whether you're maddened by the math behind machine learning, apprehensive about AI, perplexed by preprocessing data--or anything in between--this guide makes it easier to understand and implement machine learning seamlessly.
- Grasp how day-to-day activities are powered by machine learning
- Learn to 'speak' certain languages, such as Python and R, to teach machines to perform pattern-oriented tasks and data analysis
- Learn to code in R using R Studio
- Find out how to code in Python using Anaconda
- Dive into this complete beginner's guide so you are armed with all you need to know about machine learning!