Book picks similar to Computational Intelligence: An Introduction by Andries P. Engelbrecht
Tags: ai, computer-science, tier-3, cs
Behind Deep Blue: Building the Computer That Defeated the World Chess Champion
Feng-Hsiung Hsu - 2002
Written by the man who started the adventure, Behind Deep Blue reveals the inside story of what happened behind the scenes at the two historic Deep Blue vs. Kasparov matches. This is also the story behind the quest to create the mother of all chess machines. The book unveils how a modest student project eventually produced a multimillion-dollar supercomputer, from the development of the scientific ideas through technical setbacks, rivalry in the race to develop the ultimate chess machine, and wild controversies, to the final triumph over the world's greatest human player. In nontechnical, conversational prose, Feng-hsiung Hsu, the system architect of Deep Blue, tells us how he and a small team of fellow researchers forged ahead at IBM with a project they'd begun as students at Carnegie Mellon in the mid-1980s: the search for one of the oldest holy grails in artificial intelligence--a machine that could beat any human chess player in a bona fide match. Back in 1949 science had conceived the foundations of modern chess computers, but not until almost fifty years later--until Deep Blue--would the quest be realized. Hsu refutes Kasparov's controversial claim that only human intervention could have allowed Deep Blue to make its decisive, "uncomputerlike" moves. In riveting detail he describes the heightening tension in this war of brains and nerves, and the "smoldering fire" in Kasparov's eyes. Behind Deep Blue is not just another tale of man versus machine. This fascinating book tells us how man as genius was given an ultimate, unforgettable run for his mind--not by the genius of a computer, but by the genius of man as toolmaker.
Classroom Assessment: What Teachers Need to Know
W. James Popham - 1994
This well-written book is grounded in the reality of teaching today and gives teachers who want to use assessment in their classrooms the tools they need to teach more effectively. The fifth edition of Classroom Assessment addresses the range of assessments that teachers are likely to use in their classrooms. With expanded coverage of problems related to the measurement of special education children, a new student website with online activities, and an improved instructor's manual, this book continues to be a cutting-edge and indispensable resource not only for instructors, but also for pre- and in-service teachers.
New to This Edition:
* Chapter 12 contains new material dealing with formative assessment as well as assessment FOR learning.
* The text is committed to fostering readers' realizations regarding the critical link between testing and teaching; instructional implications are constantly stressed throughout.
* Expanded attention to early childhood assessment throughout the text.
* A brand-new website provides readers with Extra Electronic Exercises for each chapter, so readers, if they wish, can solidify their understanding of what each chapter addresses (go to www.ablongman.com/popham5e).
* A newly revised Instructor's Resource Manual contains instructor-to-instructor suggestions as well as a test for each chapter. It also includes a mid-term and final exam and an effective inventory measuring students' confidence in assessment.
Here's what your colleagues have to say about this book:
"Dr. Popham has done a tremendous job in researching and incorporating current trends throughout the entire text!" --Terry H. Stepka, Arkansas State University
"Overall, I am extremely satisfied with the text. It is well-written, and I love the author's sense of humor!" --Terry H. Stepka, Arkansas State University
"I LOVE the arrangement of the chapters and the high quality of the self-checks and discussion questions that are provided." --Karen E. Eifler, University of Portland
Python Data Science Handbook: Tools and Techniques for Developers
Jake Vanderplas - 2016
Several resources exist for individual pieces of this data science stack, but only with the Python Data Science Handbook do you get them all—IPython, NumPy, Pandas, Matplotlib, Scikit-Learn, and other related tools. Working scientists and data crunchers familiar with reading and writing Python code will find this comprehensive desk reference ideal for tackling day-to-day issues: manipulating, transforming, and cleaning data; visualizing different types of data; and using data to build statistical or machine learning models. Quite simply, this is the must-have reference for scientific computing in Python. With this handbook, you’ll learn how to use:
* IPython and Jupyter: provide computational environments for data scientists using Python
* NumPy: includes the ndarray for efficient storage and manipulation of dense data arrays in Python
* Pandas: features the DataFrame for efficient storage and manipulation of labeled/columnar data in Python
* Matplotlib: includes capabilities for a flexible range of data visualizations in Python
* Scikit-Learn: for efficient and clean Python implementations of the most important and established machine learning algorithms
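For a flavor of the stack the handbook covers, here is a minimal sketch that touches each tool: a NumPy array, a Pandas DataFrame, a Scikit-Learn estimator, and a Matplotlib plot. It is not taken from the book; the data and column names are made up for illustration.

```python
# A minimal sketch of the stack the handbook covers; the data and column
# names here are invented for illustration.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from sklearn.linear_model import LinearRegression

# NumPy: dense numeric arrays
x = np.linspace(0, 10, 50)
y = 2.5 * x + np.random.normal(scale=2.0, size=x.size)

# Pandas: labeled, columnar data
df = pd.DataFrame({"x": x, "y": y})

# Scikit-Learn: a clean estimator API (fit/predict)
model = LinearRegression().fit(df[["x"]], df["y"])

# Matplotlib: visualization
plt.scatter(df["x"], df["y"], s=10)
plt.plot(df["x"], model.predict(df[["x"]]), color="red")
plt.xlabel("x"); plt.ylabel("y")
plt.show()
```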
An Introduction to Statistical Learning: With Applications in R
Gareth James - 2013
This book presents some of the most important modeling and prediction techniques, along with relevant applications. Topics include linear regression, classification, resampling methods, shrinkage approaches, tree-based methods, support vector machines, clustering, and more. Color graphics and real-world examples are used to illustrate the methods presented. Since the goal of this textbook is to facilitate the use of these statistical learning techniques by practitioners in science, industry, and other fields, each chapter contains a tutorial on implementing the analyses and methods presented in R, an extremely popular open source statistical software platform. Two of the authors co-wrote The Elements of Statistical Learning (Hastie, Tibshirani and Friedman, 2nd edition 2009), a popular reference book for statistics and machine learning researchers. An Introduction to Statistical Learning covers many of the same topics, but at a level accessible to a much broader audience. This book is targeted at statisticians and non-statisticians alike who wish to use cutting-edge statistical learning techniques to analyze their data. The text assumes only a previous course in linear regression and no knowledge of matrix algebra.
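The book's own labs are written in R; as a rough Python analogue of one listed topic (shrinkage), the sketch below fits ridge regression on synthetic data with scikit-learn. The data, coefficients, and alpha values are invented for illustration, not drawn from the book.

```python
# Illustrative only: ridge regression ("shrinkage") on synthetic data.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))                  # 10 predictors
beta = np.array([3, -2, 0, 0, 1, 0, 0, 0, 0.5, 0])
y = X @ beta + rng.normal(scale=1.0, size=200)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
for alpha in (0.01, 1.0, 100.0):                # larger alpha = more shrinkage
    model = Ridge(alpha=alpha).fit(X_train, y_train)
    mse = mean_squared_error(y_test, model.predict(X_test))
    print(f"alpha={alpha:>6}: test MSE = {mse:.2f}")
```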
Learning From Data: A Short Course
Yaser S. Abu-Mostafa - 2012
The techniques of machine learning are widely applied in engineering, science, finance, and commerce. This book is designed for a short course on machine learning. It is a short course, not a hurried course. From over a decade of teaching this material, we have distilled what we believe to be the core topics that every student of the subject should know. We chose the title 'learning from data' because it faithfully describes what the subject is about, and made it a point to cover the topics in a story-like fashion. Our hope is that the reader can learn all the fundamentals of the subject by reading the book cover to cover.
Learning from data has distinct theoretical and practical tracks. In this book, we balance the theoretical and the practical, the mathematical and the heuristic. Our criterion for inclusion is relevance. Theory that establishes the conceptual framework for learning is included, and so are heuristics that impact the performance of real learning systems.
Learning from data is a very dynamic field. Some of the hot techniques and theories at times become just fads, and others gain traction and become part of the field. What we have emphasized in this book are the necessary fundamentals that give any student of learning from data a solid foundation, and enable them to venture out and explore further techniques and theories, or perhaps to contribute their own.
The authors are professors at the California Institute of Technology (Caltech), Rensselaer Polytechnic Institute (RPI), and National Taiwan University (NTU), where this book is the main text for their popular courses on machine learning. The authors also consult extensively with financial and commercial companies on machine learning applications, and have led winning teams in machine learning competitions.
Mining of Massive Datasets
Anand Rajaraman - 2011
This book focuses on practical algorithms that have been used to solve key problems in data mining and which can be used on even the largest datasets. It begins with a discussion of the map-reduce framework, an important tool for parallelizing algorithms automatically. The authors explain the tricks of locality-sensitive hashing and stream processing algorithms for mining data that arrives too fast for exhaustive processing. The PageRank idea and related tricks for organizing the Web are covered next. Other chapters cover the problems of finding frequent itemsets and clustering. The final chapters cover two applications: recommendation systems and Web advertising, each vital in e-commerce. Written by two authorities in database and Web technologies, this book is essential reading for students and practitioners alike.
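As a toy illustration of the PageRank idea mentioned above (not the book's code), the sketch below runs power iteration on a made-up four-page link graph; the graph and damping factor are assumptions chosen for the example.

```python
# Toy sketch of PageRank by power iteration on an invented four-page web graph.
import numpy as np

# links[i] = pages that page i links to
links = {0: [1, 2], 1: [2], 2: [0], 3: [0, 2]}
n = len(links)
beta = 0.85                      # damping / teleport factor

# Column-stochastic transition matrix M
M = np.zeros((n, n))
for i, outs in links.items():
    for j in outs:
        M[j, i] = 1.0 / len(outs)

r = np.full(n, 1.0 / n)          # start from a uniform rank vector
for _ in range(100):             # power iteration
    r = beta * M @ r + (1 - beta) / n
print(np.round(r, 3))            # approximate PageRank scores
```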
Cybernetics: or the Control and Communication in the Animal and the Machine
Norbert Wiener - 1948
"It is a 'must' book for those in every branch of science... in addition, economists, politicians, statesmen, and businessmen cannot afford to overlook cybernetics and its tremendous, even terrifying implications. It is a beautifully written book, lucid, direct, and despite its complexity, as readable by the layman as the trained scientist." -- John B. Thurston, The Saturday Review of Literature
Acclaimed as one of the "seminal books... comparable in ultimate importance to... Galileo or Malthus or Rousseau or Mill," Cybernetics was judged by twenty-seven historians, economists, educators, and philosophers to be one of those books published during the "past four decades" which may have a substantial impact on public thought and action in the years ahead. -- Saturday Review
Reinforcement Learning: An Introduction
Richard S. Sutton - 1998
Reinforcement learning, one of the most active research areas in artificial intelligence, is a computational approach to learning whereby an agent tries to maximize the total amount of reward it receives when interacting with a complex, uncertain environment. In Reinforcement Learning, Richard Sutton and Andrew Barto provide a clear and simple account of the key ideas and algorithms of reinforcement learning. Their discussion ranges from the history of the field's intellectual foundations to the most recent developments and applications. The only necessary mathematical background is familiarity with elementary concepts of probability. The book is divided into three parts. Part I defines the reinforcement learning problem in terms of Markov decision processes. Part II provides basic solution methods: dynamic programming, Monte Carlo methods, and temporal-difference learning. Part III presents a unified view of the solution methods and incorporates artificial neural networks, eligibility traces, and planning; the two final chapters present case studies and consider the future of reinforcement learning.
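For a concrete feel of the temporal-difference methods covered in Part II, here is a minimal sketch of tabular Q-learning on an invented five-state chain. The environment, rewards, and learning parameters are illustrative assumptions, not material from the book.

```python
# Minimal tabular Q-learning sketch on an invented 5-state chain:
# move left/right, reward 1 for reaching the rightmost state.
import random

n_states, actions = 5, (0, 1)            # action 0 = left, 1 = right
Q = {(s, a): 0.0 for s in range(n_states) for a in actions}
alpha, gamma, eps = 0.1, 0.95, 0.1       # step size, discount, exploration

for _ in range(2000):                    # episodes
    s = 0
    while s != n_states - 1:
        a = random.choice(actions) if random.random() < eps else \
            max(actions, key=lambda b: Q[(s, b)])
        s2 = min(s + 1, n_states - 1) if a == 1 else max(s - 1, 0)
        r = 1.0 if s2 == n_states - 1 else 0.0
        # TD update: move Q(s,a) toward r + gamma * max_a' Q(s',a')
        target = r + gamma * max(Q[(s2, b)] for b in actions)
        Q[(s, a)] += alpha * (target - Q[(s, a)])
        s = s2

print([round(max(Q[(s, a)] for a in actions), 2) for s in range(n_states)])
```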
What Computers Still Can't Do: A Critique of Artificial Reason
Hubert L. Dreyfus - 1972
The world has changed since this book first appeared in 1972. Today it is clear that "good old-fashioned AI," based on the idea of using symbolic representations to produce general intelligence, is in decline (although several believers still pursue its pot of gold), and the focus of the AI community has shifted to more complex models of the mind. It has also become more common for AI researchers to seek out and study philosophy. For this edition of his now classic book, Dreyfus has added a lengthy new introduction outlining these changes and assessing the paradigms of connectionism and neural networks that have transformed the field. At a time when researchers were proposing grand plans for general problem solvers and automatic translation machines, Dreyfus predicted that they would fail because their conception of mental functioning was naive, and he suggested that they would do well to acquaint themselves with modern philosophical approaches to human being. "What Computers Still Can't Do" was widely attacked but quietly studied. Dreyfus's arguments are still provocative and focus our attention once again on what it is that makes human beings unique.
Fundamentals of Computer Algorithms
Ellis Horowitz - 1978
The book comprises chapters on elementary data structures, dynamic programming, backtracking, algebraic problems, lower bound theory, PRAM algorithms, mesh algorithms, and hypercube algorithms. In addition, the book includes several real-world examples to help readers understand the concepts better. This book is indispensable for computer engineers preparing for competitive examinations like GATE and IES.
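As a small illustration of one listed technique (dynamic programming), the sketch below solves the classic 0/1 knapsack problem. The item values and weights are made up, and the code is not drawn from the book.

```python
# Illustrative only: 0/1 knapsack by dynamic programming.
def knapsack(values, weights, capacity):
    """Return the best total value achievable within the weight capacity."""
    # best[w] = best value using the items seen so far with total weight <= w
    best = [0] * (capacity + 1)
    for v, wt in zip(values, weights):
        for w in range(capacity, wt - 1, -1):   # iterate downward so each item is used once
            best[w] = max(best[w], best[w - wt] + v)
    return best[capacity]

print(knapsack(values=[60, 100, 120], weights=[1, 2, 3], capacity=5))  # -> 220
```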
Why Scientists Disagree About Global Warming: The NIPCC Report on Scientific Consensus
Craig D. Idso - 2015
Of the claim that there is a scientific consensus on global warming, the authors write: "This claim is not only false, but its presence in the debate is an insult to science." With these words, the authors begin a detailed analysis of one of the most controversial topics of the day. The authors make a compelling case against claims of a scientific consensus. The purported proof of such a consensus consists of sloppy research by nonscientists, college students, and a highly partisan Australian blogger. Surveys of climate scientists, even those heavily biased in favor of climate alarmism, find extensive disagreement on the underlying science and doubts about its reliability. The authors point to four reasons why scientists disagree about global warming: a conflict among scientists in different and often competing disciplines; fundamental scientific uncertainties concerning how the global climate responds to the human presence; failure of the United Nations Intergovernmental Panel on Climate Change (IPCC) to provide objective guidance on the complex science; and bias among researchers. The authors offer a succinct summary of the real science of climate change based on their previously published comprehensive review of climate science in a volume titled Climate Change Reconsidered II: Physical Science. They recommend that policymakers resist pressure from lobby groups to silence scientists who question the authority of the IPCC to claim to speak for climate science. More than 50,000 copies of the first edition were sold or given away in five months to elected officials, civic and business leaders, scientists, and other opinion leaders. The response from the science community and experts on climate change has been overwhelmingly positive. To meet demand for more copies, we have produced this second revised edition. Changes include a foreword by Marita Noon, at the time executive director of Energy Makes America Great, Inc. Some of the discussion in Chapter 1 has been revised and expanded thanks to feedback from readers of the first edition. Graphs in Chapters 4, 5, and 6 are now full color, and new graphs have been added.
A New Kind of Science
Stephen Wolfram - 2002
Wolfram lets the world see his work in A New Kind of Science, a gorgeous, 1,280-page tome more than a decade in the making. With patience, insight, and self-confidence to spare, Wolfram outlines a fundamental new way of modeling complex systems. On the frontier of complexity science since he was a boy, Wolfram is a champion of cellular automata--256 "programs" governed by simple nonmathematical rules. He points out that even the most complex equations fail to accurately model biological systems, but the simplest cellular automata can produce results straight out of nature--tree branches, stream eddies, and leopard spots, for instance. The graphics in A New Kind of Science show a striking resemblance to the patterns we see in nature every day. Wolfram wrote the book in a distinct style meant to make it easy to read, even for nontechies; a basic familiarity with logic is helpful but not essential. Readers will find themselves swept away by the elegant simplicity of Wolfram's ideas and the accidental artistry of the cellular automaton models. Whether or not Wolfram's revolution ultimately gives us the keys to the universe, his new science is absolutely awe-inspiring. --Therese Littleton
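The "256 programs" are the elementary cellular automata. As a rough sketch (not Wolfram's code), the snippet below runs one of them, Rule 30, from a single black cell and prints the familiar triangular pattern; the width and step count are arbitrary choices.

```python
# Minimal sketch of one of the 256 elementary cellular automata (Rule 30):
# each cell's next state depends only on itself and its two neighbors.
RULE = 30
rule_bits = [(RULE >> i) & 1 for i in range(8)]   # lookup table for the 8 neighborhoods

width, steps = 63, 31
row = [0] * width
row[width // 2] = 1                               # single black cell in the middle

for _ in range(steps):
    print("".join("#" if c else " " for c in row))
    row = [rule_bits[(row[(i - 1) % width] << 2) |
                     (row[i] << 1) |
                     row[(i + 1) % width]]
           for i in range(width)]
```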
The Essential Turing: Seminal Writings in Computing, Logic, Philosophy, Artificial Intelligence, and Artificial Life Plus the Secrets of Enigma
Alan Turing - 2004
In 1935, aged 22, Turing developed the mathematical theory upon which all subsequent stored-program digital computers are modeled. At the outbreak of hostilities with Germany in September 1939, he joined the Government Codebreaking team at Bletchley Park, Buckinghamshire, and played a crucial role in deciphering Enigma, the code used by the German armed forces to protect their radio communications. Turing's work on the version of Enigma used by the German navy was vital to the battle for supremacy in the North Atlantic. He also contributed to the attack on the cyphers known as Fish, which were used by the German High Command for the encryption of signals during the latter part of the war. His contribution helped to shorten the war in Europe by an estimated two years. After the war, his theoretical work led to the development of Britain's first computers at the National Physical Laboratory and the Royal Society Computing Machine Laboratory at Manchester University. Turing was also a founding father of modern cognitive science, theorizing that the cortex at birth is an unorganized machine which through training becomes organized into a universal machine or something like it. He went on to develop the use of computers to model biological growth, launching the discipline now referred to as Artificial Life. The papers in this book are the key works for understanding Turing's phenomenal contribution across all these fields. The collection includes Turing's declassified wartime Treatise on the Enigma; letters from Turing to Churchill and to codebreakers; lectures, papers, and broadcasts which opened up the concept of AI and its implications; and the paper which formed the genesis of the investigation of Artificial Life.
Probabilistic Robotics
Sebastian Thrun - 2005
Building on the field of mathematical statistics, probabilistic robotics endows robots with a new level of robustness in real-world situations. This book introduces the reader to a wealth of techniques and algorithms in the field. All algorithms are based on a single overarching mathematical foundation. Each chapter provides example implementations in pseudo code, detailed mathematical derivations, discussions from a practitioner's perspective, and extensive lists of exercises and class projects. The book's Web site, www.probabilistic-robotics.org, has additional material. The book is relevant for anyone involved in robotic software development and scientific research. It will also be of interest to applied statisticians and engineers dealing with real-world sensor data.
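The overarching mathematical foundation the blurb refers to is Bayes filtering. As a toy sketch (not the book's pseudocode), the snippet below runs a discrete Bayes filter that localizes a robot on an invented one-dimensional corridor of door and wall cells; the world map and sensor probabilities are made-up assumptions.

```python
# Toy discrete Bayes filter: localize a robot on an invented 1-D corridor.
world = ['door', 'wall', 'door', 'wall', 'wall']
belief = [1.0 / len(world)] * len(world)          # uniform prior over cells

def move(belief, step):
    """Prediction step: shift belief by `step` cells (cyclic, noise-free for brevity)."""
    n = len(belief)
    return [belief[(i - step) % n] for i in range(n)]

def sense(belief, measurement, p_hit=0.8, p_miss=0.2):
    """Correction step: weight each cell by the measurement likelihood, then normalize."""
    posterior = [b * (p_hit if world[i] == measurement else p_miss)
                 for i, b in enumerate(belief)]
    total = sum(posterior)
    return [p / total for p in posterior]

for z, u in [('door', 1), ('door', 1)]:           # sense a door, move one cell, twice
    belief = sense(belief, z)
    belief = move(belief, u)
print([round(b, 3) for b in belief])
```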