Gödel, Escher, Bach: An Eternal Golden Braid


Douglas R. Hofstadter - 1979
    According to Hofstadter, the formal system that underlies all mental activity transcends the system that supports it. If life can grow out of the formal chemical substrate of the cell, if consciousness can emerge out of a formal system of firing neurons, then computers, too, will attain human intelligence. Gödel, Escher, Bach is a wonderful exploration of fascinating ideas at the heart of cognitive science: meaning, reduction, recursion, and much more.

Python for Data Analysis


Wes McKinney - 2011
    It is also a practical, modern introduction to scientific computing in Python, tailored for data-intensive applications. This is a book about the parts of the Python language and libraries you'll need to effectively solve a broad set of data analysis problems. This book is not an exposition on analytical methods using Python as the implementation language. Written by Wes McKinney, the main author of the pandas library, this hands-on book is packed with practical case studies. It's ideal for analysts new to Python and for Python programmers new to scientific computing. With it you will:
    - Use the IPython interactive shell as your primary development environment
    - Learn basic and advanced NumPy (Numerical Python) features
    - Get started with data analysis tools in the pandas library
    - Use high-performance tools to load, clean, transform, merge, and reshape data
    - Create scatter plots and static or interactive visualizations with matplotlib
    - Apply the pandas groupby facility to slice, dice, and summarize datasets
    - Measure data by points in time, whether it's specific instances, fixed periods, or intervals
    - Learn how to solve problems in web analytics, social sciences, finance, and economics, through detailed examples
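    As a rough, hypothetical sketch of the load-clean-group-summarize workflow the book teaches (the column names and values here are invented for illustration, not taken from the book):

        import pandas as pd

        # Hypothetical records; columns and values are invented for illustration.
        df = pd.DataFrame({
            "region": ["east", "west", "east", "west", "east"],
            "month": ["jan", "jan", "feb", "feb", "feb"],
            "revenue": [120.0, 95.5, 130.2, None, 88.0],
        })

        # Clean: drop rows with missing revenue.
        clean = df.dropna(subset=["revenue"])

        # Slice, dice, and summarize with the groupby facility.
        summary = clean.groupby("region")["revenue"].agg(["count", "mean", "sum"])
        print(summary)

    The printed table gives one row per region with the count, mean, and total revenue, which is the shape of result the groupby chapters build toward.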

Feynman Lectures On Computation


Richard P. Feynman - 1996
    When Feynman gave his famous course on computation at the California Institute of Technology, he asked Tony Hey to adapt his lecture notes into a book. Although led by Feynman, the course also featured, as occasional guest speakers, some of the most brilliant men in science at that time, including Marvin Minsky, Charles Bennett, and John Hopfield. Although the lectures are now thirteen years old, most of the material is timeless and presents a “Feynmanesque” overview of many standard and some not-so-standard topics in computer science, such as reversible logic gates and quantum computers.

An Introduction to the Event-Related Potential Technique


Steven J. Luck - 2005
    In " An Introduction to the Event-Related Potential Technique," Steve Luck offers the first comprehensive guide to the practicalities of conducting ERP experiments in cognitive neuroscience and related fields, including affective neuroscience and experimental psychopathology. The book can serve as a guide for the classroom or the laboratory and as a reference for researchers who do not conduct ERP studies themselves but need to understand and evaluate ERP experiments in the literature. It summarizes the accumulated body of ERP theory and practice, providing detailed, practical advice about how to design, conduct, and interpret ERP experiments, and presents the theoretical background needed to understand why an experiment is carried out in a particular way. Luck focuses on the most fundamental techniques, describing them as they are used in many of the world's leading ERP laboratories. These techniques reflect a long history of electrophysiological recordings and provide an excellent foundation for more advanced approaches.The book also provides advice on the key topic of how to design ERP experiments so that they will be useful in answering questions of broad scientific interest. This reflects the increasing proportion of ERP research that focuses on these broader questions rather than the "ERPology" of early studies, which concentrated primarily on ERP components and methods. Topics covered include the neural origins of ERPs, signal averaging, artifact rejection and correction, filtering, measurement and analysis, localization, and the practicalities of setting up the lab.

An Introduction to Statistical Learning: With Applications in R


Gareth James - 2013
    This book presents some of the most important modeling and prediction techniques, along with relevant applications. Topics include linear regression, classification, resampling methods, shrinkage approaches, tree-based methods, support vector machines, clustering, and more. Color graphics and real-world examples are used to illustrate the methods presented. Since the goal of this textbook is to facilitate the use of these statistical learning techniques by practitioners in science, industry, and other fields, each chapter contains a tutorial on implementing the analyses and methods presented in R, an extremely popular open source statistical software platform. Two of the authors co-wrote The Elements of Statistical Learning (Hastie, Tibshirani and Friedman, 2nd edition 2009), a popular reference book for statistics and machine learning researchers. An Introduction to Statistical Learning covers many of the same topics, but at a level accessible to a much broader audience. This book is targeted at statisticians and non-statisticians alike who wish to use cutting-edge statistical learning techniques to analyze their data. The text assumes only a previous course in linear regression and no knowledge of matrix algebra.

Artificial Intelligence: A Modern Approach


Stuart Russell - 1994
    The long-anticipated revision of this best-selling text offers the most comprehensive, up-to-date introduction to the theory and practice of artificial intelligence. New to this edition:
    * Nontechnical learning material accompanying each part of the book.
    * The Internet as a sample application for intelligent systems, added in several places including logical agents, planning, and natural language.
    * Increased coverage of material, including expanded coverage of default reasoning and truth maintenance systems (including multi-agent/distributed AI and game theory), probabilistic approaches to learning (including EM), and more detailed descriptions of probabilistic inference algorithms.
    * Updated and expanded exercises: 75% of the exercises are revised, with 100 new exercises.
    * On-line Java software that makes it easy for students to do projects on the web using intelligent agents.
    The book takes a unified, agent-based approach to AI, organizing the material around the task of building intelligent agents, and offers comprehensive, up-to-date coverage with a unified view of the field organized around the rational decision-making paradigm.

Networks: An Introduction


M.E.J. Newman - 2010
    The rise of the Internet and the wide availability of inexpensive computers have made it possible to gather and analyze network data on a large scale, and the development of a variety of new theoretical tools has allowed us to extract new knowledge from many different kinds of networks. The study of networks is broadly interdisciplinary, and important developments have occurred in many fields, including mathematics, physics, computer and information sciences, biology, and the social sciences. This book brings together for the first time the most important breakthroughs in each of these fields and presents them in a coherent fashion, highlighting the strong interconnections between work in different areas. Subjects covered include: the measurement and structure of networks in many branches of science; methods for analyzing network data, including methods developed in physics, statistics, and sociology; the fundamentals of graph theory, computer algorithms, and spectral methods; mathematical models of networks, including random graph models and generative models; and theories of dynamical processes taking place on networks.

Biological Psychology


James W. Kalat - 1981
    This Eighth Edition redefines the high standard set by previous editions. It offers the best balance of rigor and accessibility, the most current research, and the most thorough technology integration available for your course--all presented within a unique modular format that supports student mastery and provides instructors with maximum teaching flexibility. In every chapter, Kalat accurately portrays biopsychology as a dynamic and empirical field in which fascinating new discoveries are constantly being made. He captures readers' interest with the latest biological psychology findings, such as claims that ginkgo biloba aids memory, and with coverage of the hypothesis that humans' mate-choice patterns are influenced by natural selection. Throughout, the author's goal is not only to convey information, but also to convey his excitement about and dedication to the subject.

Research-Based Strategies to Ignite Student Learning: Insights from a Neurologist and Classroom Teacher: Insights from a Neurologist and Classroom Teacher


Judy Willis - 2006
    The result is a comprehensive and accessible guide for improving student learning based on the best the research world has to offer. Willis takes a reader-friendly approach to neuroscience, describing how the brain processes, stores, and retrieves material and which instructional strategies help students learn most effectively and joyfully. You will discover how to captivate and hold the attention of your students and how to enhance their memory and test-taking success. You will learn how to know when students are ready for learning and when their brains need a rest. You will also learn how stress and emotion affect learning and how to improve student engagement. And you will find innovative techniques for designing assessments and adjusting teaching practices to ensure that all students reach their potential. No matter what grade or subject you teach, Research-Based Strategies to Ignite Student Learning will enrich your repertoire of teaching strategies so you can help students reach their full academic potential.

Connectome: How the Brain's Wiring Makes Us Who We Are


Sebastian Seung - 2012
    Is it in our genes? The structure of our brains? Our genome may determine our eye color and even aspects of our personality. But our friendships, failures, and passions also shape who we are. The question is: how? Sebastian Seung, a dynamic professor at MIT, is on a quest to discover the biological basis of identity. He believes it lies in the pattern of connections between the brain’s neurons, which change slowly over time as we learn and grow. The connectome, as it’s called, is where our genetic inheritance intersects with our life experience. It’s where nature meets nurture. Seung introduces us to the dedicated researchers who are mapping the brain’s connections, neuron by neuron, synapse by synapse. It is a monumental undertaking—the scientific equivalent of climbing Mount Everest—but if they succeed, it could reveal the basis of personality, intelligence, memory, and perhaps even mental disorders. Many scientists speculate that people with anorexia, autism, and schizophrenia are "wired differently," but nobody knows for sure. The brain’s wiring has never been clearly seen. In sparklingly clear prose, Seung reveals the amazing technological advances that will soon help us map connectomes. He also examines the evidence that these maps will someday allow humans to "upload" their minds into computers, achieving a kind of immortality. Connectome is a mind-bending adventure story, told with great passion and authority. It presents a daring scientific and technological vision for at last understanding what makes us who we are. Welcome to the future of neuroscience.

Starting Out with Java: From Control Structures Through Objects


Tony Gaddis - 2009
    If you would like to purchase both the physical text and MyProgrammingLab, search for ISBN-10: 0132989999/ISBN-13: 9780132989992. That package includes ISBN-10: 0132855836/ISBN-13: 9780132855839 and ISBN-10: 0132891557/ISBN-13: 9780132891554. MyProgrammingLab should only be purchased when required by an instructor. In "Starting Out with Java: From Control Structures through Objects", Gaddis covers procedural programming control structures and methods before introducing object-oriented programming. As with all Gaddis texts, clear and easy-to-read code listings, concise and practical real-world examples, and an abundance of exercises appear in every chapter.

Probabilistic Graphical Models: Principles and Techniques


Daphne Koller - 2009
    The framework of probabilistic graphical models, presented in this book, provides a general approach for this task. The approach is model-based, allowing interpretable models to be constructed and then manipulated by reasoning algorithms. These models can also be learned automatically from data, allowing the approach to be used in cases where manually constructing a model is difficult or even impossible. Because uncertainty is an inescapable aspect of most real-world applications, the book focuses on probabilistic models, which make the uncertainty explicit and provide models that are more faithful to reality. Probabilistic Graphical Models discusses a variety of models, spanning Bayesian networks, undirected Markov networks, discrete and continuous models, and extensions to deal with dynamical systems and relational data. For each class of models, the text describes the three fundamental cornerstones: representation, inference, and learning, presenting both basic concepts and advanced techniques. Finally, the book considers the use of the proposed framework for causal reasoning and decision making under uncertainty. The main text in each chapter provides the detailed technical development of the key ideas. Most chapters also include boxes with additional material: skill boxes, which describe techniques; case study boxes, which discuss empirical cases related to the approach described in the text, including applications in computer vision, robotics, natural language understanding, and computational biology; and concept boxes, which present significant concepts drawn from the material in the chapter. Instructors (and readers) can group chapters in various combinations, from core topics to more technically advanced material, to suit their particular needs.
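    As a minimal sketch of the kind of structure such a model encodes (not an example from the book; the network and its probabilities are invented for illustration), consider a two-node Bayesian network Rain -> WetGrass with inference done by plain enumeration:

        # Two-node Bayesian network Rain -> WetGrass.
        # All probabilities are invented for illustration only.
        p_rain = {True: 0.2, False: 0.8}                      # P(Rain)
        p_wet_given_rain = {True: {True: 0.9, False: 0.1},    # P(WetGrass | Rain)
                            False: {True: 0.2, False: 0.8}}

        def joint(rain, wet):
            # The graph structure says the joint factorizes as P(Rain) * P(WetGrass | Rain).
            return p_rain[rain] * p_wet_given_rain[rain][wet]

        # Inference by enumeration: P(Rain=True | WetGrass=True).
        numerator = joint(True, True)
        evidence = sum(joint(r, True) for r in (True, False))
        print(numerator / evidence)  # ~0.53: posterior belief that it rained, given wet grass

    The point of the sketch is the factorization: the graph determines how the joint distribution decomposes, and inference algorithms (here, brute-force enumeration) operate on that decomposition.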

Artificial Intelligence: A Guide for Thinking Humans


Melanie Mitchell - 2019
    The award-winning author Melanie Mitchell, a leading computer scientist, now reveals AI’s turbulent history and the recent spate of apparent successes, grand hopes, and emerging fears surrounding it. In Artificial Intelligence, Mitchell turns to the most urgent questions concerning AI today: How intelligent—really—are the best AI programs? How do they work? What can they actually do, and when do they fail? How humanlike do we expect them to become, and how soon do we need to worry about them surpassing us? Along the way, she introduces the dominant models of modern AI and machine learning, describing cutting-edge AI programs, their human inventors, and the historical lines of thought underpinning recent achievements. She meets with fellow experts such as Douglas Hofstadter, the cognitive scientist and Pulitzer Prize–winning author of the modern classic Gödel, Escher, Bach, who explains why he is “terrified” about the future of AI. She explores the profound disconnect between the hype and the actual achievements in AI, providing a clear sense of what the field has accomplished and how much further it has to go. Interweaving stories about the science of AI and the people behind it, Artificial Intelligence brims with clear-sighted, captivating, and accessible accounts of the most interesting and provocative modern work in the field, flavored with Mitchell’s humor and personal observations. This frank, lively book is an indispensable guide to understanding today’s AI, its quest for “human-level” intelligence, and its impact on the future for us all.

On Being Certain: Believing You Are Right Even When You're Not


Robert A. Burton - 2008
    In On Being Certain, neurologist Robert Burton challenges the notions of how we think about what we know. He shows that the feeling of certainty we have when we know something comes from sources beyond our control and knowledge. In fact, certainty is a mental sensation, rather than evidence of fact. Because this feeling of knowing seems like confirmation of knowledge, we tend to think of it as a product of reason. But an increasing body of evidence suggests that feelings such as certainty stem from primitive areas of the brain, and are independent of active, conscious reflection and reasoning. The feeling of knowing happens to us; we cannot make it happen. Bringing together cutting-edge neuroscience, experimental data, and fascinating anecdotes, Robert Burton explores the inconsistent and sometimes paradoxical relationship between our thoughts and what we actually know. Provocative and groundbreaking, On Being Certain will challenge what you know (or think you know) about the mind, knowledge, and reason.

Natural Language Processing with Python


Steven Bird - 2009
    With it, you'll learn how to write Python programs that work with large collections of unstructured text. You'll access richly annotated datasets using a comprehensive range of linguistic data structures, and you'll understand the main algorithms for analyzing the content and structure of written communication. Packed with examples and exercises, Natural Language Processing with Python will help you:
    - Extract information from unstructured text, either to guess the topic or identify "named entities"
    - Analyze linguistic structure in text, including parsing and semantic analysis
    - Access popular linguistic databases, including WordNet and treebanks
    - Integrate techniques drawn from fields as diverse as linguistics and artificial intelligence
    This book will help you gain practical skills in natural language processing using the Python programming language and the Natural Language Toolkit (NLTK) open source library. If you're interested in developing web applications, analyzing multilingual news sources, or documenting endangered languages -- or if you're simply curious to have a programmer's perspective on how human language works -- you'll find Natural Language Processing with Python both fascinating and immensely useful.
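    As a small taste of the kind of pipeline the book builds with NLTK (a sketch only, assuming NLTK is installed and its tokenizer, tagger, and chunker models have been downloaded; the sentence is an invented example):

        import nltk

        # Assumes the needed NLTK data packages have been downloaded, e.g. via
        # nltk.download("punkt"), nltk.download("averaged_perceptron_tagger"),
        # nltk.download("maxent_ne_chunker"), nltk.download("words").

        text = "Wes McKinney created pandas while working in New York."
        tokens = nltk.word_tokenize(text)   # split raw text into word tokens
        tagged = nltk.pos_tag(tokens)       # assign part-of-speech tags
        tree = nltk.ne_chunk(tagged)        # chunk tagged tokens into named entities

        # Collect the labeled named-entity subtrees (e.g. PERSON, GPE).
        entities = [(subtree.label(), " ".join(word for word, tag in subtree.leaves()))
                    for subtree in tree.subtrees()
                    if subtree.label() != "S"]
        print(entities)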