Vision: A Computational Investigation into the Human Representation and Processing of Visual Information


David Marr - 1982
    A computational investigation into the human representation and processing of visual information.

The Meme Machine


Susan Blackmore - 1999
    The meme is also one of the most important--and controversial--concepts to emerge since 'The Origin of Species' appeared nearly 150 years ago. In 'The Meme Machine' Susan Blackmore boldly asserts: "Just as the design of our bodies can be understood only in terms of natural selection, so the design of our minds can be understood only in terms of memetic selection." Indeed, Blackmore shows that once our distant ancestors acquired the crucial ability to imitate, a second kind of natural selection began, a survival of the fittest amongst competing ideas and behaviors. Ideas and behaviors that proved most adaptive--making tools, for example, or using language--survived and flourished, replicating themselves in as many minds as possible. These memes then passed themselves on from generation to generation by helping to ensure that the genes of those who acquired them also survived and reproduced. Applying this theory to many aspects of human life, Blackmore offers brilliant explanations for why we live in cities, why we talk so much, why we can't stop thinking, why we behave altruistically, how we choose our mates, and much more. With controversial implications for our religious beliefs, our free will, and our very sense of "self," 'The Meme Machine' offers a provocative theory everyone will soon be talking about.

Artificial Intelligence


Patrick Henry Winston - 1977
    From the book, you learn why the field is important, both as a branch of engineering and as a science. If you are a computer scientist or an engineer, you will enjoy the book, because it provides a cornucopia of new ideas for representing knowledge, using knowledge, and building practical systems. If you are a psychologist, biologist, linguist, or philosopher, you will enjoy the book because it provides an exciting computational perspective on the mystery of intelligence. This completely rewritten and updated edition of Artificial Intelligence reflects the revolutionary progress made since the previous edition was published, giving you the knowledge you need. Part I is about representing knowledge and about reasoning methods that make use of knowledge. The material covered includes the semantic-net family of representations, describe and match, generate and test, means-ends analysis, problem reduction, basic search, optimal search, adversarial search, rule chaining, the Rete algorithm, frame inheritance, topological sorting, constraint propagation, logic, and truth maintenance.

Neural Networks and Deep Learning


Michael Nielsen - 2013
    The book will teach you about: neural networks, a beautiful biologically-inspired programming paradigm which enables a computer to learn from observational data; and deep learning, a powerful set of techniques for learning in neural networks. Neural networks and deep learning currently provide the best solutions to many problems in image recognition, speech recognition, and natural language processing. This book will teach you the core concepts behind neural networks and deep learning.

Pragmatic Thinking and Learning: Refactor Your Wetware


Andy Hunt - 2008
    Software development happens in your head. Not in an editor, IDE, or design tool. You're well educated on how to work with software and hardware, but what about wetware--our own brains? Learning new skills and new technology is critical to your career, and it's all in your head. In this book by Andy Hunt, you'll learn how our brains are wired, and how to take advantage of your brain's architecture. You'll learn new tricks and tips to learn more, faster, and retain more of what you learn. You need a pragmatic approach to thinking and learning. You need to Refactor Your Wetware. Programmers have to learn constantly: not just the stereotypical new technologies, but also the problem domain of the application, the whims of the user community, the quirks of your teammates, the shifting sands of the industry, and the evolving characteristics of the project itself as it is built. We'll journey together through bits of cognitive science and neuroscience, and learning and behavioral theory. You'll see some surprising aspects of how our brains work, and how you can take advantage of the system to improve your own learning and thinking skills. In this book you'll learn how to: use the Dreyfus Model of Skill Acquisition to become more expert; leverage the architecture of the brain to strengthen different thinking modes; avoid common "known bugs" in your mind; learn more deliberately and more effectively; and manage knowledge more efficiently. Printed in full color.

Philosophy in the Flesh: The Embodied Mind and its Challenge to Western Thought


George Lakoff - 1998
    In addressing the perennial questions of Western philosophy, philosophers have made certain fundamental assumptions: that we can know our own minds by introspection, that most of our thinking about the world is literal, and that reason is disembodied and universal. These assumptions are now called into question by well-established results of cognitive science. It has been shown empirically that: Most thought is unconscious. We have no direct conscious access to the mechanisms of thought and language. Our ideas go by too quickly and at too deep a level for us to observe them in any simple way. Abstract concepts are mostly metaphorical. Much of the subject matter of philosophy, such as the nature of time, morality, causation, the mind, and the self, relies heavily on basic metaphors derived from bodily experience. What is literal in our reasoning about such concepts is minimal and conceptually impoverished. All the richness comes from metaphor. For instance, we have two mutually incompatible metaphors for time, both of which represent it as movement through space: in one it is a flow past us, and in the other a spatial dimension we move along. Mind is embodied. Thought requires a body, not in the trivial sense that you need a physical brain to think with, but in the profound sense that the very structure of our thoughts comes from the nature of the body. Nearly all of our unconscious metaphors are based on common bodily experiences. Most of the central themes of the Western philosophical tradition are called into question by these findings. The Cartesian person, with a mind wholly separate from the body, does not exist. The Kantian person, capable of moral action according to the dictates of a universal reason, does not exist. The phenomenological person, capable of knowing his or her mind entirely through introspection alone, does not exist. The utilitarian person, the Chomskian person, the poststructuralist person, the computational person, and the person defined by analytic philosophy all do not exist. Then what does? Lakoff and Johnson show that a philosophy responsible to the science of mind offers radically new and detailed understandings of what a person is. After first describing the philosophical stance that must follow from taking cognitive science seriously, they re-examine the basic concepts of the mind, time, causation, morality, and the self; they then rethink a host of philosophical traditions, from the classical Greeks through Kantian morality through modern analytic philosophy. They reveal the metaphorical structure underlying each mode of thought and show how the metaphysics of each theory flows from its metaphors. Finally, they take on two major issues of twentieth-century philosophy: how we conceive rationality, and how we conceive language.

Information Theory, Inference and Learning Algorithms


David J.C. MacKay - 2002
    Information theory, inference, and learning algorithms lie at the heart of many exciting areas of contemporary science and engineering: communication, signal processing, data mining, machine learning, pattern recognition, computational neuroscience, bioinformatics, and cryptography. This textbook introduces theory in tandem with applications. Information theory is taught alongside practical communication systems, such as arithmetic coding for data compression and sparse-graph codes for error-correction. A toolbox of inference techniques, including message-passing algorithms, Monte Carlo methods, and variational approximations, is developed alongside applications of these tools to clustering, convolutional codes, independent component analysis, and neural networks. The final part of the book describes the state of the art in error-correcting codes, including low-density parity-check codes, turbo codes, and digital fountain codes, the twenty-first-century standards for satellite communications, disk drives, and data broadcast. Richly illustrated, filled with worked examples and over 400 exercises, some with detailed solutions, David MacKay's groundbreaking book is ideal for self-learning and for undergraduate or graduate courses. Interludes on crosswords, evolution, and sex provide entertainment along the way. In sum, this is a textbook on information, communication, and coding for a new generation of students, and an unparalleled entry point into these subjects for professionals in areas as diverse as computational biology, financial engineering, and machine learning.
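
    As a small illustration of the information-theory side (a sketch of the general idea, not an excerpt from the book), Shannon entropy gives the average number of bits per symbol that no lossless compressor, an ideal arithmetic coder included, can beat:

        import math
        from collections import Counter

        def entropy_bits_per_symbol(text: str) -> float:
            """Shannon entropy H = -sum(p * log2(p)) of the empirical symbol distribution."""
            counts = Counter(text)
            total = len(text)
            return -sum((c / total) * math.log2(c / total) for c in counts.values())

        msg = "abracadabra"
        h = entropy_bits_per_symbol(msg)
        # No lossless code can average fewer than h bits per symbol under this
        # distribution, so roughly h * len(msg) bits is the floor an ideal
        # arithmetic coder approaches for the message as a whole.
        print(f"{h:.3f} bits/symbol, at least {h * len(msg):.1f} bits in total")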

Computer Networks


Andrew S. Tanenbaum - 1981
    In this revision, the author takes a structured approach to explaining how networks function.

Algorithms to Live By: The Computer Science of Human Decisions


Brian Christian - 2016
    What should we do, or leave undone, in a day or a lifetime? How much messiness should we accept? What balance of new activities and familiar favorites is the most fulfilling? These may seem like uniquely human quandaries, but they are not: computers, too, face the same constraints, so computer scientists have been grappling with their version of such issues for decades. And the solutions they've found have much to teach us. In a dazzlingly interdisciplinary work, acclaimed author Brian Christian and cognitive scientist Tom Griffiths show how the algorithms used by computers can also untangle very human questions. They explain how to have better hunches and when to leave things to chance, how to deal with overwhelming choices, and how best to connect with others. From finding a spouse to finding a parking spot, from organizing one's inbox to understanding the workings of memory, Algorithms to Live By transforms the wisdom of computer science into strategies for human living.
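
    The spouse-and-parking-spot questions above are instances of what computer scientists call optimal stopping, one of the problems the book draws on. As a rough sketch only (not the authors' code; the candidate count, trial count, and random scoring are arbitrary assumptions), the classic look-then-leap "37% rule" can be simulated like this:

        import random

        def simulate_secretary(n=100, trials=20000, look_fraction=0.37):
            """Estimate how often the look-then-leap rule picks the single best candidate."""
            cutoff = int(n * look_fraction)
            wins = 0
            for _ in range(trials):
                scores = [random.random() for _ in range(n)]
                # Look phase: pass over the first candidates, remembering the best so far.
                benchmark = max(scores[:cutoff]) if cutoff else float("-inf")
                # Leap phase: commit to the first later candidate who beats the benchmark,
                # or settle for the last candidate if nobody does.
                pick = scores[-1]
                for s in scores[cutoff:]:
                    if s > benchmark:
                        pick = s
                        break
                wins += pick == max(scores)
            return wins / trials

        print(f"estimated success rate: {simulate_secretary():.3f}")

    Run it and the estimate hovers near the theoretical 1/e, roughly a 37% chance of ending up with the single best candidate.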

Artificial Intelligence


Elaine Rich - 1983
    A.I. is explored and explained in this best-selling text. Assuming no prior knowledge, it covers topics like neural networks and robotics. This text explores the range of problems which have been and remain to be solved using A.I. tools and techniques. The second half of this text is an excellent reference.

The Protocols (TCP/IP Illustrated, Volume 1)


W. Richard Stevens - 1993
    In eight chapters, it provides the most thorough coverage of TCP available. It also covers the newest TCP/IP features, including multicasting, path MTU discovery, and long fat pipes. The author describes various protocols, including ARP, ICMP, and UDP, and uses network diagnostic tools to show the protocols in action. Drawing on numerous helpful diagrams, he also explains how to avoid silly window syndrome (SWS). This book gives you a broader understanding of concepts like connection establishment, timeout, retransmission, and fragmentation. It is ideal for anyone wanting to gain a greater understanding of how the TCP/IP protocols work.

How the Mind Works


Steven Pinker - 1997
    Pinker explains what the mind is, how it evolved, and how it allows us to see, think, feel, laugh, interact, enjoy the arts, and ponder the mysteries of life. And he does it with the wit that prompted Mark Ridley to write in the New York Times Book Review, "No other science writer makes me laugh so much. . . . [Pinker] deserves the superlatives that are lavished on him." The arguments in the book are as bold as its title. Pinker rehabilitates some unfashionable ideas, such as that the mind is a computer and that human nature was shaped by natural selection, and challenges fashionable ones, such as that passionate emotions are irrational, that parents socialize their children, and that nature is good and modern society corrupting. Winner of the Los Angeles Times Book Prize. A New York Times Notable Book of the Year and a Publishers Weekly Best Book of 1997. Featured in Time magazine, the New York Times Magazine, The New Yorker, Nature, Science, Lingua Franca, and Science Times. Front-page reviews in the Washington Post Book World, the Boston Globe Book Section, and the San Diego Union Book Review.

Connectome: How the Brain's Wiring Makes Us Who We Are


Sebastian Seung - 2012
    Is it in our genes? The structure of our brains? Our genome may determine our eye color and even aspects of our personality. But our friendships, failures, and passions also shape who we are. The question is: how? Sebastian Seung, a dynamic professor at MIT, is on a quest to discover the biological basis of identity. He believes it lies in the pattern of connections between the brain’s neurons, which change slowly over time as we learn and grow. The connectome, as it’s called, is where our genetic inheritance intersects with our life experience. It’s where nature meets nurture. Seung introduces us to the dedicated researchers who are mapping the brain’s connections, neuron by neuron, synapse by synapse. It is a monumental undertaking—the scientific equivalent of climbing Mount Everest—but if they succeed, it could reveal the basis of personality, intelligence, memory, and perhaps even mental disorders. Many scientists speculate that people with anorexia, autism, and schizophrenia are "wired differently," but nobody knows for sure. The brain’s wiring has never been clearly seen. In sparklingly clear prose, Seung reveals the amazing technological advances that will soon help us map connectomes. He also examines the evidence that these maps will someday allow humans to "upload" their minds into computers, achieving a kind of immortality. Connectome is a mind-bending adventure story, told with great passion and authority. It presents a daring scientific and technological vision for at last understanding what makes us who we are. Welcome to the future of neuroscience.

Make Your Own Neural Network


Tariq Rashid - 2016
    Neural networks are a key element of deep learning and artificial intelligence, which today is capable of some truly impressive feats. Yet too few really understand how neural networks actually work. This guide will take you on a fun and unhurried journey, starting from very simple ideas, and gradually building up an understanding of how neural networks work. You won't need any mathematics beyond secondary school, and an accessible introduction to calculus is also included. The ambition of this guide is to make neural networks as accessible as possible to as many readers as possible - there are enough texts for advanced readers already! You'll learn to code in Python and make your own neural network, teaching it to recognise human handwritten numbers and performing as well as professionally developed networks. Part 1 is about ideas. We introduce the mathematical ideas underlying neural networks, gently, with lots of illustrations and examples. Part 2 is practical. We introduce the popular and easy-to-learn Python programming language, and gradually build up a neural network which can learn to recognise human handwritten numbers, easily getting it to perform as well as networks made by professionals. Part 3 extends these ideas further. We push the performance of our neural network to an industry-leading 98% using only simple ideas and code, test the network on your own handwriting, take a privileged peek inside the mysterious mind of a neural network, and even get it all working on a Raspberry Pi. All the code in this guide has been tested to work on a Raspberry Pi Zero.
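
    For a flavour of what "make your own neural network" means in practice, here is a generic sketch (not Rashid's code; the XOR toy task, layer sizes, learning rate, and epoch count are arbitrary assumptions) built from the same ingredients the book develops: a forward pass, an error signal, and weight updates by gradient descent.

        import numpy as np

        # A deliberately tiny 2-4-1 network trained on XOR with plain batch
        # gradient descent; every number here is an illustrative choice.
        rng = np.random.default_rng(42)
        X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
        y = np.array([[0], [1], [1], [0]], dtype=float)

        W1, b1 = rng.normal(scale=0.5, size=(2, 4)), np.zeros(4)   # input -> hidden
        W2, b2 = rng.normal(scale=0.5, size=(4, 1)), np.zeros(1)   # hidden -> output

        def sigmoid(z):
            return 1.0 / (1.0 + np.exp(-z))

        lr = 0.5
        for _ in range(20000):
            hidden = sigmoid(X @ W1 + b1)                            # forward pass
            output = sigmoid(hidden @ W2 + b2)

            out_delta = (output - y) * output * (1 - output)         # backpropagate the error
            hid_delta = (out_delta @ W2.T) * hidden * (1 - hidden)

            W2 -= lr * hidden.T @ out_delta                          # gradient-descent updates
            b2 -= lr * out_delta.sum(axis=0)
            W1 -= lr * X.T @ hid_delta
            b1 -= lr * hid_delta.sum(axis=0)

        # After training, the outputs are typically close to the XOR targets 0, 1, 1, 0.
        print(np.round(output.ravel(), 2))

    The book's own project applies these same steps to recognising handwritten digits rather than a toy problem, and then pushes the accuracy up with refinements.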

Mind: Introduction to Cognitive Science


Paul Thagard - 1996
    With Mind, Paul Thagard offers an introduction to the interdisciplinary field of cognitive science for readers who come to the subject with very different backgrounds. It is suitable for classroom use by students with interests ranging from computer science and engineering to psychology and philosophy. Thagard's systematic descriptions and evaluations of the main theories of mental representation advanced by cognitive scientists allow students to see that there are many complementary approaches to the investigation of mind. The fundamental theoretical perspectives he describes include logic, rules, concepts, analogies, images, and connections (artificial neural networks). The discussion of these theories provides an integrated view of the different achievements of the various fields of cognitive science. This second edition includes substantial revision and new material. Part I, which presents the different theoretical approaches, has been updated in light of recent work in the field. Part II, which treats extensions to cognitive science, has been thoroughly revised, with new chapters added on brains, emotions, and consciousness. Other additions include a list of relevant Web sites at the end of each chapter and a glossary at the end of the book. As in the first edition, each chapter concludes with a summary and suggestions for further reading.