The Lifebox, the Seashell, and the Soul: What Gnarly Computation Taught Me About Ultimate Reality, the Meaning of Life, and How to Be Happy


Rudy Rucker - 2005
    This concept is at the root of the computational worldview, which basically says that very complex systems — the world we live in — have their beginnings in simple mathematical equations. We've lately come to understand that such an algorithm is only the start of a never-ending story — the real action occurs in the unfolding consequences of the rules. The chip-in-a-box computers so popular in our time have acted as a kind of microscope, letting us see into the secret machinery of the world. In Lifebox, Rucker uses whimsical drawings, fables, and humor to demonstrate that everything is a computation — that thoughts, computations, and physical processes are all the same. Rucker discusses the linguistic and computational advances that make this kind of "digital philosophy" possible, and explains how, like every great new principle, the computational worldview contains the seeds of a next step.
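
    The "simple rule, endlessly unfolding consequences" idea the blurb gestures at can be made concrete with an elementary cellular automaton. The sketch below is my own illustration, not anything taken from Rucker's book: a one-line rule (Wolfram's Rule 30) applied over and over to a row of cells, printed as text so the intricate structure becomes visible.

        -- Illustrative sketch (not from the book): Wolfram's Rule 30 elementary
        -- cellular automaton -- a one-line rule whose repeated application
        -- produces surprisingly complex behavior.
        import Data.Bits (testBit)

        -- One update step: each cell looks at (left neighbor, itself, right
        -- neighbor); the 8 possible neighborhoods index the bits of the rule number.
        step :: Int -> [Bool] -> [Bool]
        step rule cells = zipWith3 update (False : cells) cells (drop 1 cells ++ [False])
          where
            update l c r = testBit rule (4 * fromEnum l + 2 * fromEnum c + fromEnum r)

        -- Start from a single live cell and print successive generations.
        main :: IO ()
        main = mapM_ (putStrLn . map (\b -> if b then '#' else '.')) rows
          where
            width = 61
            start = replicate (width `div` 2) False ++ [True] ++ replicate (width `div` 2) False
            rows  = take 30 (iterate (step 30) start)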

Purely Functional Data Structures


Chris Okasaki - 1996
    However, data structures designed for imperative languages do not always translate well to functional languages such as Standard ML, Haskell, or Scheme. This book describes data structures from the point of view of functional languages, with examples, and presents design techniques that allow programmers to develop their own functional data structures. The author includes both classical data structures, such as red-black trees and binomial queues, and a host of new data structures developed exclusively for functional languages. All source code is given in Standard ML and Haskell, and most of the programs are easily adaptable to other functional languages. This handy reference for professional programmers working with functional languages can also be used as a tutorial or for self-study.
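
    To give a flavor of what "data structures from the point of view of functional languages" means in practice, here is a minimal persistent FIFO queue in Haskell, in the spirit of the two-list batched queue Okasaki analyzes; the module and function names are my own, and this is a sketch rather than code from the book.

        -- A persistent FIFO queue built from a front list and a reversed back
        -- list, in the spirit of Okasaki's batched queue (names are my own).
        module BatchedQueue (Queue, empty, isEmpty, snoc, headQ, tailQ) where

        data Queue a = Queue [a] [a]   -- front list, reversed back list

        empty :: Queue a
        empty = Queue [] []

        isEmpty :: Queue a -> Bool
        isEmpty (Queue f _) = null f

        -- Invariant: the front list is empty only if the whole queue is empty.
        -- Restore it by reversing the back list whenever the front runs out.
        check :: [a] -> [a] -> Queue a
        check [] b = Queue (reverse b) []
        check f  b = Queue f b

        snoc :: Queue a -> a -> Queue a
        snoc (Queue f b) x = check f (x : b)

        headQ :: Queue a -> Maybe a
        headQ (Queue []      _) = Nothing
        headQ (Queue (x : _) _) = Just x

        tailQ :: Queue a -> Maybe (Queue a)
        tailQ (Queue []      _) = Nothing
        tailQ (Queue (_ : f) b) = Just (check f b)

    Because the lists are immutable, older versions of a queue remain valid after snoc and tailQ; that persistence is exactly the property such structures are designed around.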

Vision: A Computational Investigation into the Human Representation and Processing of Visual Information


David Marr - 1982
    A computational investigation into the human representation and processing of visual information.

Elements of the Theory of Computation


Harry R. Lewis - 1981
    The authors are well known for a clear presentation that makes the material accessible to a broad audience and requires no special mathematical background. In this new edition, the authors adopt a somewhat more informal, friendly writing style to present both classical and contemporary theories of computation. Algorithms, complexity analysis, and algorithmic ideas are introduced informally in Chapter 1 and are pursued throughout the book. Each section is followed by problems.
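
    As a taste of the objects the theory studies, here is a small Haskell sketch (my own illustration, not material from the book) of a deterministic finite automaton and the function that decides whether it accepts a string.

        -- Illustrative sketch: a deterministic finite automaton (DFA) and an
        -- acceptance function. The example machine recognizes binary strings
        -- containing an even number of '1' characters.
        import Data.List (foldl')

        data DFA state sym = DFA
          { start  :: state                  -- initial state
          , accept :: state -> Bool          -- accepting states, as a predicate
          , delta  :: state -> sym -> state  -- transition function
          }

        accepts :: DFA state sym -> [sym] -> Bool
        accepts dfa = accept dfa . foldl' (delta dfa) (start dfa)

        -- The state True means "seen an even number of 1s so far".
        evenOnes :: DFA Bool Char
        evenOnes = DFA
          { start  = True
          , accept = id
          , delta  = \s c -> if c == '1' then not s else s
          }

        main :: IO ()
        main = mapM_ (print . accepts evenOnes) ["", "1", "1010", "1001"]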

Probably Approximately Correct: Nature's Algorithms for Learning and Prospering in a Complex World


Leslie Valiant - 2013
    We nevertheless muddle through even in the absence of theories of how to act. But how do we do it? In Probably Approximately Correct, computer scientist Leslie Valiant presents a masterful synthesis of learning and evolution to show how both individually and collectively we not only survive, but prosper in a world as complex as our own. The key is “probably approximately correct” algorithms, a concept Valiant developed to explain how effective behavior can be learned. The model shows that pragmatically coping with a problem can provide a satisfactory solution in the absence of any theory of the problem. After all, finding a mate does not require a theory of mating. Valiant’s theory reveals the shared computational nature of evolution and learning, and sheds light on perennial questions such as nature versus nurture and the limits of artificial intelligence. Offering a powerful and elegant model that encompasses life’s complexity, Probably Approximately Correct has profound implications for how we think about behavior, cognition, biological evolution, and the possibilities and limits of human and machine intelligence.
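
    For readers who want the formal statement behind the title, the standard textbook formulation (paraphrased here, not quoted from Valiant) says a concept class is PAC-learnable when an algorithm, given enough independent examples drawn from an unknown distribution D, outputs a hypothesis h whose error against the target concept c is small with high probability:

        % Standard PAC criterion (paraphrased): for any accuracy \epsilon > 0 and
        % confidence \delta > 0, with probability at least 1 - \delta over the sample,
        \[
          \Pr_{x \sim D}\bigl[\, h(x) \neq c(x) \,\bigr] \;\le\; \epsilon ,
        \]
        % using a number of examples (and running time) polynomial in
        % 1/\epsilon, 1/\delta, and the size of the problem.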

Types and Programming Languages


Benjamin C. Pierce - 2002
    The study of type systems--and of programming languages from a type-theoretic perspective--has important applications in software engineering, language design, high-performance compilers, and security. This text provides a comprehensive introduction both to type systems in computer science and to the basic theory of programming languages. The approach is pragmatic and operational; each new concept is motivated by programming examples and the more theoretical sections are driven by the needs of implementations. Each chapter is accompanied by numerous exercises and solutions, as well as a running implementation, available via the Web. Dependencies between chapters are explicitly identified, allowing readers to choose a variety of paths through the material. The core topics include the untyped lambda-calculus, simple type systems, type reconstruction, universal and existential polymorphism, subtyping, bounded quantification, recursive types, kinds, and type operators. Extended case studies develop a variety of approaches to modeling the features of object-oriented languages.
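
    To make the "simple type systems" entry on that list concrete, here is a minimal type checker for the simply typed lambda-calculus with booleans, written as a sketch in Haskell; the datatype and function names are mine, not the book's running implementation.

        -- Illustrative sketch (not the book's implementation): a type checker for
        -- the simply typed lambda-calculus extended with booleans.
        data Ty   = TyBool | TyArr Ty Ty   deriving (Eq, Show)

        data Term = Var String              -- x
                  | Lam String Ty Term      -- \x:T. t
                  | App Term Term           -- t1 t2
                  | TmTrue | TmFalse
                  | If Term Term Term
                  deriving Show

        type Ctx = [(String, Ty)]           -- typing context, most recent binding first

        typeOf :: Ctx -> Term -> Either String Ty
        typeOf ctx (Var x)       = maybe (Left ("unbound variable " ++ x)) Right (lookup x ctx)
        typeOf ctx (Lam x tyX t) = TyArr tyX <$> typeOf ((x, tyX) : ctx) t
        typeOf ctx (App t1 t2)   = do
          ty1 <- typeOf ctx t1
          ty2 <- typeOf ctx t2
          case ty1 of
            TyArr a b | a == ty2 -> Right b
            _                    -> Left "argument type mismatch"
        typeOf _   TmTrue        = Right TyBool
        typeOf _   TmFalse       = Right TyBool
        typeOf ctx (If c t e)    = do
          tyC <- typeOf ctx c
          tyT <- typeOf ctx t
          tyE <- typeOf ctx e
          if tyC == TyBool && tyT == tyE then Right tyT else Left "ill-typed conditional"

        main :: IO ()
        main = print (typeOf [] (App (Lam "x" TyBool (If (Var "x") TmFalse TmTrue)) TmTrue))
        -- prints: Right TyBool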

The Art of Electronics


Paul Horowitz - 1980
    Widely accepted as the authoritative text and reference on electronic circuit design, both analog and digital, this book revolutionized the teaching of electronics by emphasizing the methods actually used by circuit designers -- a combination of some basic laws, rules of thumb, and a large bag of tricks. The result is a largely nonmathematical treatment that encourages circuit intuition, brainstorming, and simplified calculations of circuit values and performance. The new Art of Electronics retains the feeling of informality and easy access that helped make the first edition so successful and popular. It is an ideal first textbook on electronics for scientists and engineers and an indispensable reference for anyone, professional or amateur, who works with electronic circuits.
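
    As one example of the kind of simplified circuit calculation the book emphasizes (the formulas are standard, not quoted from the text), the unloaded resistive voltage divider and its Thévenin output resistance:

        % Two resistors in series across V_in, output taken across R_2.
        \[
          V_{\text{out}} = V_{\text{in}} \, \frac{R_2}{R_1 + R_2},
          \qquad
          R_{\text{out}} = R_1 \parallel R_2 = \frac{R_1 R_2}{R_1 + R_2}.
        \]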

Computational Thinking


Peter J. Denning - 2019
    More recently, "computational thinking" has become part of the K-12 curriculum. But what is computational thinking? This volume in the MIT Press Essential Knowledge series offers an accessible overview, tracing a genealogy that begins centuries before digital computers and portraying computational thinking as pioneers of computing have described it. The authors explain that computational thinking (CT) is not a set of concepts for programming; it is a way of thinking that is honed through practice: the mental skills for designing computations to do jobs for us, and for explaining and interpreting the world as a complex of information processes. Mathematically trained experts (known as "computers") who performed complex calculations as teams engaged in CT long before electronic computers. The authors identify six dimensions of today's highly developed CT--methods, machines, computing education, software engineering, computational science, and design--and cover each in a chapter. Along the way, they debunk inflated claims for CT and computation while making clear the power of CT in all its complexity and multiplicity.

Real World Haskell: Code You Can Believe In


Bryan O'Sullivan - 2008
    You'll learn how to use Haskell in a variety of practical ways, from short scripts to large and demanding applications. Real World Haskell takes you through the basics of functional programming at a brisk pace, and then helps you deepen your understanding of Haskell on real-world issues like I/O, performance, dealing with data, concurrency, and more as you move through each chapter. With this book, you will: understand the differences between procedural and functional programming; learn the features of Haskell, and how to use it to develop useful programs; interact with filesystems, databases, and network services; write solid code with automated tests, code coverage, and error handling; and harness the power of multicore systems via concurrent and parallel programming. You'll find plenty of hands-on exercises, along with examples of real Haskell programs that you can modify, compile, and run. Whether or not you've used a functional language before, if you want to understand why Haskell is coming into its own as a practical language in so many major organizations, Real World Haskell is the best place to start.
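
    As a small example of the "short scripts" end of that range (my own illustration, not code from the book), here is a complete Haskell program that mixes pure list processing with file I/O: it reads a file named on the command line and prints its ten most frequent words.

        -- Illustrative script (not from the book): print the ten most frequent
        -- words in a file given on the command line.
        import Data.Char (isAlpha, toLower)
        import Data.List (group, sort, sortBy)
        import Data.Ord (Down (..), comparing)
        import System.Environment (getArgs)

        -- Pure part: normalize, split into words, count, sort by frequency.
        wordCounts :: String -> [(String, Int)]
        wordCounts = sortBy (comparing (Down . snd))
                   . map (\ws -> (head ws, length ws))
                   . group . sort
                   . words . map (\c -> if isAlpha c then toLower c else ' ')

        -- Impure part: command-line arguments, file reading, printing.
        main :: IO ()
        main = do
          [path] <- getArgs
          text   <- readFile path
          mapM_ (\(w, n) -> putStrLn (show n ++ "\t" ++ w)) (take 10 (wordCounts text))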

But How Do It Know? - The Basic Principles of Computers for Everyone


J. Clark Scott - 2009
    Its humorous title begins with the punch line of a classic joke about someone who is baffled by technology. It was written by a 40-year computer veteran who wants to take the mystery out of computers and allow everyone to gain a true understanding of exactly what computers are, and also what they are not. Years of writing, diagramming, piloting and editing have culminated in one easy to read volume that contains all of the basic principles of computers written so that everyone can understand them. There used to be only two types of book that delved into the insides of computers. The simple ones point out the major parts and describe their functions in broad general terms. Computer Science textbooks eventually tell the whole story, but along the way, they include every detail that an engineer could conceivably ever need to know. Like Momma Bear's porridge, But How Do It Know? is just right, but it is much more than just a happy medium. For the first time, this book thoroughly demonstrates each of the basic principles that have been used in every computer ever built, while at the same time showing the integral role that codes play in everything that computers are able to do. It cuts through all of the electronics and mathematics, and gets right to practical matters. Here is a simple part, see what it does. Connect a few of these together and you get a new part that does another simple thing. After just a few iterations of connecting up simple parts - voilà! - it's a computer. And it is much simpler than anyone ever imagined. But How Do It Know? really explains how computers work. They are far simpler than anyone has ever permitted you to believe. It contains everything you need to know, and nothing you don't need to know. No technical background of any kind is required. The basic principles of computers have not changed one iota since they were invented in the mid 20th century. "Since the day I learned how computers work, it always felt like I knew a giant secret, but couldn't tell anyone," says the author. Now he's taken the time to explain it in such a manner that anyone can have that same moment of enlightenment and thereafter see computers in an entirely new light.
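
    The book's "connect a few simple parts and you get a new part" progression can be mimicked in a few lines of Haskell. The toy sketch below (mine, not the book's) builds the familiar logic gates out of nothing but NAND and then wires them into a one-bit full adder.

        -- Toy sketch: build gates from NAND alone, then combine them into a
        -- one-bit full adder. (Illustration only, not code from the book.)
        nand :: Bool -> Bool -> Bool
        nand a b = not (a && b)

        notG :: Bool -> Bool
        notG a = nand a a

        andG, orG, xorG :: Bool -> Bool -> Bool
        andG a b = notG (nand a b)
        orG  a b = nand (notG a) (notG b)
        xorG a b = andG (orG a b) (nand a b)

        -- Add two bits plus a carry-in; return (sum bit, carry-out).
        fullAdder :: Bool -> Bool -> Bool -> (Bool, Bool)
        fullAdder a b cin = (s, cout)
          where
            s    = xorG (xorG a b) cin
            cout = orG (andG a b) (andG cin (xorG a b))

        -- Print the full truth table.
        main :: IO ()
        main = mapM_ print [ (a, b, c, fullAdder a b c)
                           | a <- [False, True], b <- [False, True], c <- [False, True] ]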

Genetic Algorithms in Search, Optimization, and Machine Learning


David Edward Goldberg - 1989
    Major concepts are illustrated with running examples, and major algorithms are illustrated by Pascal computer programs. No prior knowledge of GAs or genetics is assumed, and only a minimum of computer programming and mathematics background is required.
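
    The basic loop of a simple genetic algorithm (evaluate fitness, select, cross over, mutate) fits on a page. The Haskell sketch below is my own toy version on the "OneMax" problem (maximize the count of 1 bits), with a hand-rolled random-number generator so it needs only the base libraries; it illustrates the idea and is not one of Goldberg's Pascal programs.

        -- Toy genetic algorithm on bit strings ("OneMax"). Illustration only, not
        -- Goldberg's code; randomness comes from a small linear congruential
        -- generator so that only base libraries are needed.
        import Data.List (mapAccumL, sortBy)
        import Data.Ord (Down (..), comparing)

        type Gene = [Bool]
        type Seed = Int

        -- Linear congruential step: return the next seed and a value in [0, n).
        rand :: Int -> Seed -> (Seed, Int)
        rand n s = (s', s' `mod` n)
          where s' = (6364136223846793005 * s + 1442695040888963407) `mod` (2 ^ 62)

        fitness :: Gene -> Int
        fitness = length . filter id

        -- Breed one child: single-point crossover of two surviving parents,
        -- then flip each bit with probability 1/20.
        child :: [Gene] -> Seed -> (Seed, Gene)
        child parents s0 = mapAccumL flipMaybe s3 crossed
          where
            (s1, i)   = rand (length parents) s0
            (s2, j)   = rand (length parents) s1
            (s3, cut) = rand (length (head parents)) s2
            crossed   = take cut (parents !! i) ++ drop cut (parents !! j)
            flipMaybe s bit = let (s', r) = rand 20 s
                              in (s', if r == 0 then not bit else bit)

        -- One generation: keep the fitter half, refill with freshly bred children.
        generation :: (Seed, [Gene]) -> (Seed, [Gene])
        generation (s0, pop) = (s1, parents ++ kids)
          where
            parents    = take (length pop `div` 2) (sortBy (comparing (Down . fitness)) pop)
            (s1, kids) = mapAccumL (\s _ -> child parents s) s0 [1 .. length pop - length parents]

        randomGene :: Int -> Seed -> (Seed, Gene)
        randomGene len s0 = mapAccumL (\s _ -> let (s', r) = rand 2 s in (s', r == 1)) s0 [1 .. len]

        -- Report the best fitness in each of the first 15 generations.
        main :: IO ()
        main = do
          let (seed, pop0) = mapAccumL (\s _ -> randomGene 32 s) 2024 [1 .. 20 :: Int]
          mapM_ (print . maximum . map fitness . snd) (take 15 (iterate generation (seed, pop0)))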

Building Java Programs: A Back to Basics Approach


Stuart Reges - 2007
    By using objects early to solve interesting problems and defining objects later in the course, Building Java Programs develops programming knowledge for a broad audience. Chapter topics include: Introduction to Java Programming, Primitive Data and Definite Loops, Introduction to Parameters and Objects, Conditional Execution, Program Logic and Indefinite Loops, File Processing, Arrays, Defining Classes, Inheritance and Interfaces, ArrayLists, Java Collections Framework, Recursion, Searching and Sorting, and Graphical User Interfaces. For all readers interested in introductory programming.

Computability and Logic


George S. Boolos - 1980
    Including a selection of exercises, adjusted for this edition, at the end of each chapter, it offers a new and simpler treatment of the representability of recursive functions, a traditional stumbling block for students on the way to the Gödel incompleteness theorems.
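
    For orientation, "representability" here means (in the standard formulation, paraphrased rather than quoted from the book) that a total function f on the natural numbers is representable in a theory T when some formula provably pins down its values:

        % Standard definition, paraphrased. A total function f : N^k -> N is
        % representable in T if there is a formula \varphi(x_1, \dots, x_k, y)
        % such that for all natural numbers n_1, \dots, n_k,
        \[
          T \vdash \forall y \, \bigl( \varphi(\overline{n_1}, \dots, \overline{n_k}, y)
              \leftrightarrow y = \overline{f(n_1, \dots, n_k)} \bigr),
        \]
        % where \overline{n} denotes the numeral for n. Showing that every
        % recursive function is representable in this sense is the step the
        % incompleteness proofs depend on.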

Introduction to Probability


Dimitri P. Bertsekas - 2002
    This is the currently used textbook for "Probabilistic Systems Analysis," an introductory probability course at the Massachusetts Institute of Technology, attended by a large number of undergraduate and graduate students. The book covers the fundamentals of probability theory (probabilistic models, discrete and continuous random variables, multiple random variables, and limit theorems), which are typically part of a first course on the subject. It also contains a number of more advanced topics, from which an instructor can choose to match the goals of a particular course. These topics include transforms, sums of random variables, least squares estimation, the bivariate normal distribution, and a fairly detailed introduction to Bernoulli, Poisson, and Markov processes. The book strikes a balance between simplicity in exposition and sophistication in analytical reasoning. Some of the more mathematically rigorous analysis is explained only intuitively in the text, but is developed in detail (at the level of advanced calculus) in the numerous solved theoretical problems. The book has been widely adopted for classroom use in introductory probability courses within the USA and abroad.
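
    As a sample of the models named above (standard definitions, not quoted from the text), the Bernoulli and Poisson distributions are:

        % Bernoulli(p): a single trial that succeeds with probability p.
        \[
          P(X = 1) = p, \qquad P(X = 0) = 1 - p, \qquad
          \mathbf{E}[X] = p, \qquad \mathrm{var}(X) = p(1 - p).
        \]
        % Poisson(\lambda): the number of arrivals in a unit of time at rate \lambda.
        \[
          P(N = k) = e^{-\lambda} \frac{\lambda^k}{k!}, \quad k = 0, 1, 2, \dots, \qquad
          \mathbf{E}[N] = \mathrm{var}(N) = \lambda.
        \]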

Discrete and Combinatorial Mathematics


Ralph P. Grimaldi - 1985
    The text offers a flexible organization, enabling instructors to adapt the book to their particular courses. The book is both complete and careful, and it continues to maintain its emphasis on algorithms and applications. Excellent exercise sets allow students to perfect skills as they practice. This new edition continues to feature numerous computer science applications, making this the ideal text for preparing students for advanced study.