Machine Learning: A Probabilistic Perspective


Kevin P. Murphy - 2012
    Machine learning provides automated methods that can detect patterns in data and then use the uncovered patterns to predict future data. This textbook offers a comprehensive and self-contained introduction to the field of machine learning, based on a unified, probabilistic approach. The coverage combines breadth and depth, offering necessary background material on such topics as probability, optimization, and linear algebra, as well as discussion of recent developments in the field, including conditional random fields, L1 regularization, and deep learning. The book is written in an informal, accessible style, complete with pseudo-code for the most important algorithms. All topics are copiously illustrated with color images and worked examples drawn from such application domains as biology, text processing, computer vision, and robotics. Rather than providing a cookbook of different heuristic methods, the book stresses a principled model-based approach, often using the language of graphical models to specify models in a concise and intuitive way. Almost all the models described have been implemented in a MATLAB software package, PMTK (probabilistic modeling toolkit), that is freely available online. The book is suitable for upper-level undergraduates with an introductory-level college math background and for beginning graduate students.

Computational Complexity


Sanjeev Arora - 2007
    Requiring essentially no background apart from mathematical maturity, the book can be used as a reference for self-study for anyone interested in complexity, including physicists, mathematicians, and other scientists, as well as a textbook for a variety of courses and seminars. More than 300 exercises are included with a selected hint set.

Paradigms of Artificial Intelligence Programming: Case Studies in Common LISP


Peter Norvig - 1991
    By reconstructing authentic, complex AI programs using state-of-the-art Common Lisp, the book teaches students and professionals how to build and debug robust practical programs, while demonstrating superior programming style and important AI concepts. The author strongly emphasizes the practical performance issues involved in writing real working programs of significant size. Chapters on troubleshooting and efficiency are included, along with a discussion of the fundamentals of object-oriented programming and a description of the main CLOS functions. This volume is an excellent text for a course on AI programming, a useful supplement for general AI courses and an indispensable reference for the professional programmer.

Computer Science Illuminated


Nell B. Dale - 2002
    Written by two of today's most respected computer science educators, Nell Dale and John Lewis, the text provides a broad overview of the many aspects of the discipline from a generic viewpoint. Separate programming language chapters are available as bundle items for those instructors who would like to explore a particular programming language with their students. The many layers of computing are thoroughly explained, beginning with the information layer, working through the hardware, programming, operating systems, application, and communication layers, and ending with a discussion of the limitations of computing. Perfect for introductory computing and computer science courses, Computer Science Illuminated, Third Edition's thorough presentation of computing systems provides computer science majors with a solid foundation for further study and offers non-majors a comprehensive and complete introduction to computing.

Think Complexity: Complexity Science and Computational Modeling


Allen B. Downey - 2009
    Whether you're an intermediate-level Python programmer or a student of computational modeling, you'll delve into examples of complex systems through a series of exercises, case studies, and easy-to-understand explanations. You'll work with graphs, algorithm analysis, scale-free networks, and cellular automata, using advanced features that make Python such a powerful language. Ideal as a text for courses on Python programming and algorithms, Think Complexity will also help self-learners gain valuable experience with topics and ideas they might not encounter otherwise. You will:
    - Work with NumPy arrays and SciPy methods, basic signal processing and Fast Fourier Transform, and hash tables
    - Study abstract models of complex physical systems, including power laws, fractals and pink noise, and Turing machines
    - Get starter code and solutions to help you re-implement and extend original experiments in complexity
    - Explore the philosophy of science, including the nature of scientific laws, theory choice, realism and instrumentalism, and other topics
    - Examine case studies of complex systems submitted by students and readers

An Introduction to Statistical Learning: With Applications in R


Gareth James - 2013
    This book presents some of the most important modeling and prediction techniques, along with relevant applications. Topics include linear regression, classification, resampling methods, shrinkage approaches, tree-based methods, support vector machines, clustering, and more. Color graphics and real-world examples are used to illustrate the methods presented. Since the goal of this textbook is to facilitate the use of these statistical learning techniques by practitioners in science, industry, and other fields, each chapter contains a tutorial on implementing the analyses and methods presented in R, an extremely popular open source statistical software platform. Two of the authors co-wrote The Elements of Statistical Learning (Hastie, Tibshirani and Friedman, 2nd edition 2009), a popular reference book for statistics and machine learning researchers. An Introduction to Statistical Learning covers many of the same topics, but at a level accessible to a much broader audience. This book is targeted at statisticians and non-statisticians alike who wish to use cutting-edge statistical learning techniques to analyze their data. The text assumes only a previous course in linear regression and no knowledge of matrix algebra.

Data Structures and Algorithm Analysis in C++


Mark Allen Weiss - 1993
    Readers learn how to reduce time constraints and develop programs efficiently by analyzing the feasibility of an algorithm before it is coded. The C++ language is brought up-to-date and simplified, and the Standard Template Library is now fully incorporated throughout the text. This Third Edition also features significantly revised coverage of lists, stacks, queues, and trees and an entire chapter dedicated to amortized analysis and advanced data structures such as the Fibonacci heap. Known for its clear and friendly writing style, Data Structures and Algorithm Analysis in C++ is logically organized to cover advanced data structures topics from binary heaps to sorting to NP-completeness. Figures and examples illustrating successive stages of algorithms contribute to Weiss' careful, rigorous and in-depth analysis of each type of algorithm.

Thermodynamics and an Introduction to Thermostatistics


Herbert B. Callen - 1985
    Presents essential ideas on critical phenomena developed over the last decade in simple, qualitative terms. This new edition maintains the simple structure of the first and puts new emphasis on pedagogical considerations. Thermostatistics is incorporated into the text without eclipsing macroscopic thermodynamics, and is integrated into the conceptual framework of physical theory.

Structure and Interpretation of Computer Programs


Harold Abelson - 1984
    This long-awaited revision contains changes throughout the text. There are new implementations of most of the major programming systems in the book, including the interpreters and compilers, and the authors have incorporated many small changes that reflect their experience teaching the course at MIT since the first edition was published. A new theme has been introduced that emphasizes the central role played by different approaches to dealing with time in computational models: objects with state, concurrent programming, functional programming and lazy evaluation, and nondeterministic programming. There are new example sections on higher-order procedures in graphics and on applications of stream processing in numerical programming, and many new exercises. In addition, all the programs have been reworked to run in any Scheme implementation that adheres to the IEEE standard.

Introduction to Algorithms


Thomas H. Cormen - 1989
    Each chapter is relatively self-contained and can be used as a unit of study. The algorithms are described in English and in a pseudocode designed to be readable by anyone who has done a little programming. The explanations have been kept elementary without sacrificing depth of coverage or mathematical rigor.

Computer System Architecture


M. Morris Mano - 1976
    Written to aid electrical engineers, computer engineers, and computer scientists, the volume includes:
    - the computer architecture, organization, and design associated with computer hardware
    - the various digital components used in the organization and design of digital computers
    - detailed steps that a designer must go through in order to design an elementary basic computer
    - the organization and architecture of the central processing unit
    - the organization and architecture of input-output and memory
    - the concept of multiprocessing
    - two new chapters on pipeline and vector processing
    - two sections devoted completely to the reduced instruction set computer (RISC)
    - sample worked-out problems to clarify topics

Compilers: Principles, Techniques, and Tools


Alfred V. Aho - 1986
    The authors present updated coverage of compilers based on research and techniques that have been developed in the field over the past few years. The book provides a thorough introduction to compiler design and covers topics such as context-free grammars, finite state machines, and syntax-directed translation.

Multiple View Geometry in Computer Vision


Richard Hartley - 2000
    This book covers relevant geometric principles and how to represent objects algebraically so they can be computed and applied. Recent major developments in the theory and practice of scene reconstruction are described in detail in a unified framework. Richard Hartley and Andrew Zisserman provide comprehensive background material and explain how to apply the methods and implement the algorithms. First edition hardback (2000): ISBN 0-521-62304-9

Ethics And Technology: Ethical Issues In An Age Of Information And Communication Technology


Herman T. Tavani - 2003
    ". . . We need a good book in cyberethics to deal with the present and prepare us for an uncertain future. Tavani's Ethics and Technology is such a book." --from the foreword by James Moor, Dartmouth College
    Is there privacy in a world of camera phones and wireless networking? Does technology threaten your civil liberties? How will bioinformatics and nanotechnology affect us? Should you worry about equity and access in a globalized economy? From privacy and security to free speech and intellectual property to globalization and outsourcing, the issues and controversies of the information age are serious, complex, and pervasive. In this new edition of his groundbreaking book, Herman Tavani introduces computer professionals to the emerging field of cyberethics, the interdisciplinary field of study that addresses these new ethical issues from all perspectives: technical, social, and philosophical. Using fascinating real-world examples--including the latest court decisions in such cases as Verizon v. RIAA, MGM v. Grokster, Google v. the Bush Administration, and the Children's Internet Protection Act (CIPA)--as well as hypothetical scenarios, he shows you how to understand and analyze the practical, moral, and legal issues that impact your work and your life. Tavani discusses such cutting-edge areas as:
    * Globalization and outsourcing
    * Property rights and open source software
    * HIPAA (privacy laws) and surveillance
    * The Patriot Act and civil liberties
    * Bioinformatics and genomics research
    * Converging technologies--pervasive computing and nanocomputing
    * Children's online pornography laws
    Updating and expanding upon the previous edition, Ethics and Technology, Second Edition provides a much-needed ethical compass to help computer and non-computer professionals alike navigate the challenging waters of cyberspace.
    About the author: Herman T. Tavani is Professor of Philosophy at Rivier College and Co-Director of the International Society for Ethics and Information Technology (INSEIT). He is the author, editor, or co-editor of five books on ethical aspects of information technology. www.wiley.com/college/tavani

Gödel, Escher, Bach: An Eternal Golden Braid


Douglas R. Hofstadter - 1979
    According to Hofstadter, the formal system that underlies all mental activity transcends the system that supports it. If life can grow out of the formal chemical substrate of the cell, if consciousness can emerge out of a formal system of firing neurons, then so too will computers attain human intelligence. Gödel, Escher, Bach is a wonderful exploration of fascinating ideas at the heart of cognitive science: meaning, reduction, recursion, and much more.