Best of Artificial Intelligence 2002
Information Theory, Inference and Learning Algorithms
David J.C. MacKay - 2003
These topics lie at the heart of many exciting areas of contemporary science and engineering - communication, signal processing, data mining, machine learning, pattern recognition, computational neuroscience, bioinformatics, and cryptography. This textbook introduces theory in tandem with applications. Information theory is taught alongside practical communication systems, such as arithmetic coding for data compression and sparse-graph codes for error correction. A toolbox of inference techniques, including message-passing algorithms, Monte Carlo methods, and variational approximations, is developed alongside applications of these tools to clustering, convolutional codes, independent component analysis, and neural networks. The final part of the book describes the state of the art in error-correcting codes, including low-density parity-check codes, turbo codes, and digital fountain codes -- the twenty-first-century standards for satellite communications, disk drives, and data broadcast. Richly illustrated, filled with worked examples and over 400 exercises, some with detailed solutions, David MacKay's groundbreaking book is ideal for self-learning and for undergraduate or graduate courses. Interludes on crosswords, evolution, and sex provide entertainment along the way. In sum, this is a textbook on information, communication, and coding for a new generation of students, and an unparalleled entry point into these subjects for professionals in areas as diverse as computational biology, financial engineering, and machine learning.
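To give a flavor of the error-correction material, here is a minimal sketch (in Python, not taken from the book) of the (7,4) Hamming code that MacKay uses as a running example: four data bits are protected by three parity bits, and the syndrome of the received word locates any single flipped bit.

```python
def encode(d):
    """Append three parity bits to the 4 data bits d = [d1, d2, d3, d4]."""
    d1, d2, d3, d4 = d
    return [d1, d2, d3, d4,
            d1 ^ d2 ^ d3,   # p1 checks data bits 1, 2, 3
            d2 ^ d3 ^ d4,   # p2 checks data bits 2, 3, 4
            d1 ^ d3 ^ d4]   # p3 checks data bits 1, 3, 4

# Each single-bit error produces a unique syndrome, so the syndrome
# tells us exactly which bit (if any) to flip back.
SYNDROME_TO_BIT = {
    (1, 0, 1): 0, (1, 1, 0): 1, (1, 1, 1): 2, (0, 1, 1): 3,
    (1, 0, 0): 4, (0, 1, 0): 5, (0, 0, 1): 6,
}

def decode(r):
    """Correct at most one flipped bit, then return the 4 data bits."""
    r = list(r)
    syndrome = (r[0] ^ r[1] ^ r[2] ^ r[4],
                r[1] ^ r[2] ^ r[3] ^ r[5],
                r[0] ^ r[2] ^ r[3] ^ r[6])
    if syndrome in SYNDROME_TO_BIT:
        r[SYNDROME_TO_BIT[syndrome]] ^= 1
    return r[:4]

codeword = encode([1, 0, 1, 1])
codeword[2] ^= 1                  # flip one bit in the channel
assert decode(codeword) == [1, 0, 1, 1]
```

The sparse-graph codes the book builds up to generalize this idea: many such small parity checks on a long block, decoded by message passing rather than by a syndrome table.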
Programming Collective Intelligence: Building Smart Web 2.0 Applications
Toby Segaran - 2007
With the sophisticated algorithms in this book, you can write smart programs to access interesting datasets from other web sites, collect data from users of your own applications, and analyze and understand the data once you've found it. Programming Collective Intelligence takes you into the world of machine learning and statistics, and explains how to draw conclusions about user experience, marketing, personal tastes, and human behavior in general -- all from information that you and others collect every day. Each algorithm is described clearly and concisely with code that can immediately be used on your web site, blog, Wiki, or specialized application. This book explains:
* Collaborative filtering techniques that enable online retailers to recommend products or media
* Methods of clustering to detect groups of similar items in a large dataset
* Search engine features -- crawlers, indexers, query engines, and the PageRank algorithm
* Optimization algorithms that search millions of possible solutions to a problem and choose the best one
* Bayesian filtering, used in spam filters for classifying documents based on word types and other features
* Using decision trees not only to make predictions, but to model the way decisions are made
* Predicting numerical values rather than classifications to build price models
* Support vector machines to match people in online dating sites
* Non-negative matrix factorization to find the independent features in a dataset
* Evolving intelligence for problem solving -- how a computer develops its skill by improving its own code the more it plays a game
Each chapter includes exercises for extending the algorithms to make them more powerful. Go beyond simple database-backed applications and put the wealth of Internet data to work for you.
"Bravo! I cannot think of a better way for a developer to first learn these algorithms and methods, nor can I think of a better way for me (an old AI dog) to reinvigorate my knowledge of the details." -- Dan Russell, Google
"Toby's book does a great job of breaking down the complex subject matter of machine-learning algorithms into practical, easy-to-understand examples that can be directly applied to analysis of social interaction across the Web today. If I had this book two years ago, it would have saved precious time going down some fruitless paths." -- Tim Wolters, CTO, Collective Intellect
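As a taste of the book's subject matter, here is a minimal sketch of the user-based collaborative filtering its opening chapters cover: score unseen items for a user by weighting other users' ratings by rating-vector similarity. The tiny ratings dictionary and function names are illustrative, not the book's own code.

```python
from math import sqrt

ratings = {
    'Ann':  {'Dune': 5.0, 'Alien': 3.0, 'Heat': 4.0},
    'Bob':  {'Dune': 4.0, 'Alien': 2.5, 'Heat': 4.5, 'Tron': 4.0},
    'Cleo': {'Dune': 2.0, 'Alien': 5.0, 'Tron': 1.0},
}

def similarity(a, b):
    """Pearson correlation over the items both users rated."""
    shared = set(ratings[a]) & set(ratings[b])
    n = len(shared)
    if n == 0:
        return 0.0
    xs = [ratings[a][i] for i in shared]
    ys = [ratings[b][i] for i in shared]
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy) if sx and sy else 0.0

def recommend(user):
    """Rank items the user hasn't rated by similarity-weighted ratings."""
    scores, weights = {}, {}
    for other in ratings:
        if other == user:
            continue
        sim = similarity(user, other)
        if sim <= 0:          # ignore dissimilar users
            continue
        for item, r in ratings[other].items():
            if item not in ratings[user]:
                scores[item] = scores.get(item, 0.0) + sim * r
                weights[item] = weights.get(item, 0.0) + sim
    return sorted(((scores[i] / weights[i], i) for i in scores), reverse=True)

print(recommend('Ann'))   # e.g. [(4.0, 'Tron')], driven by Bob's ratings
```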
Fundamentals Of Computational Neuroscience
Thomas P. Trappenberg - 2002
Computational neuroscience is the theoretical study of the brain, aimed at uncovering the principles and mechanisms that guide the development, organization, information processing, and mental functions of the nervous system.
Neural Engineering: Computation, Representation, and Dynamics in Neurobiological Systems
Chris Eliasmith - 2002
In this text, Chris Eliasmith and Charles Anderson provide a synthesis of the disparate approaches in computational neuroscience, incorporating ideas from neural coding, neural computation, physiology, communications theory, control theory, dynamics, and probability theory. This synthesis, they argue, enables novel theoretical and practical insights into the functioning of neural systems. Such insights are pertinent to experimental and computational neuroscientists and to engineers, physicists, and computer scientists interested in how their quantitative tools relate to the brain.
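A core idea in the book's synthesis is that populations of neurons encode quantities nonlinearly while a weighted linear sum of their activities decodes them. The sketch below illustrates that principle on a scalar; the rectified-linear tuning curves and random parameters are illustrative assumptions, not the authors' models.

```python
import numpy as np

rng = np.random.default_rng(0)
n_neurons = 40
encoders = rng.choice([-1.0, 1.0], n_neurons)    # preferred directions
gains = rng.uniform(0.5, 2.0, n_neurons)
biases = rng.uniform(-1.0, 1.0, n_neurons)

def rates(x):
    """Rectified-linear firing rates for stimulus value(s) x."""
    drive = gains * encoders * np.asarray(x)[..., None] + biases
    return np.maximum(drive, 0.0)

# Solve for least-squares-optimal linear decoders over the represented range.
xs = np.linspace(-1, 1, 200)
A = rates(xs)                                    # shape (200, n_neurons)
decoders, *_ = np.linalg.lstsq(A, xs, rcond=None)

x_hat = rates(0.3) @ decoders                    # decode a novel stimulus
print(f"decoded {x_hat:.3f} for input 0.3")
```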
Foundations of Genetic Programming
William B. Langdon - 2002
Since its inception more than ten years ago, GP has been used to solve practical problems in a variety of application fields. Alongside these ad hoc engineering approaches, interest has grown in how and why GP works. This book provides a coherent consolidation of recent work on the theoretical foundations of GP. A concise introduction to GP and genetic algorithms (GAs) is followed by a discussion of fitness landscapes and other theoretical approaches to natural and artificial evolution. Having surveyed early approaches to GP theory, it presents a new exact schema analysis, showing that it applies to GP as well as to the simpler GAs. New results on the potentially infinite number of possible programs are followed by two chapters applying these new techniques.
Spiking Neuron Models: Single Neurons, Populations, Plasticity
Wulfram Gerstner - 2002
The book focuses on phenomenological approaches rather than detailed models in order to provide the reader with a conceptual framework. The authors formulate the theoretical concepts clearly without many mathematical details. While the book contains standard material for courses in computational neuroscience, neural modeling, or neural networks, it also provides an entry to current research. No prior knowledge beyond undergraduate mathematics is required.
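The simplest of the phenomenological models the book builds on is the leaky integrate-and-fire neuron: the membrane potential integrates input current, leaks back toward rest, and emits a spike whenever it crosses a threshold. A minimal simulation sketch follows; the parameter values are illustrative, not taken from the text.

```python
tau_m, v_rest, v_reset, v_thresh, r_m = 10.0, -70.0, -75.0, -54.0, 10.0
dt, t_end, i_ext = 0.1, 100.0, 1.8     # ms, ms, nA (constant input current)

v, spikes, t = v_rest, [], 0.0
while t < t_end:
    # Forward-Euler step of  tau_m * dV/dt = -(V - V_rest) + R_m * I
    v += dt * (-(v - v_rest) + r_m * i_ext) / tau_m
    if v >= v_thresh:                  # threshold crossing = a spike
        spikes.append(t)
        v = v_reset                    # instantaneous reset
    t += dt

print(f"{len(spikes)} spikes in {t_end:.0f} ms")
```

With these numbers the steady-state potential (-52 mV) sits above threshold, so the neuron fires periodically, roughly every 24 ms after the first spike.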
Cheap Complex Devices
John Compton Sundman - 2002
Cheap Complex Devices is a companion to Sundman's novels Acts of the Apostles and The Pains. While ostensibly telling the story of the inaugural Hofstadter Prize for Machine-Written Narrative, Cheap Complex Devices tells the story of an entity coming to awareness. What is that entity? Is it Todd Griffith, the chip designer with a bullet in his brain from the novel Acts of the Apostles? Is it a bee, or a swarm of bees, a Shaker village, or a very buggy floating point processor? There is ample evidence to support any of these hypotheses. Or is it, possibly, the mythical meta-character named "Sundman"? Read the book and form your own opinions.
Acts of the Apostles is a Bourne Identity-style thriller about nanomachines, neurobiology, Gulf War Syndrome, and a Silicon Valley messiah. It tells how Todd Griffith, a chip designer, gets a bullet in the head after successfully debugging a race condition in the Kali chip. In Cheap Complex Devices, Todd's situation is looked at from a different angle. Some people even think that Todd himself, or his consciousness transferred into a bug-riddled computer, is the real author of Cheap Complex Devices.
The Pains is a lavishly illustrated dystopian phantasmagoria set in a universe that is part George Orwell's 1984 and part Ronald Reagan's 1984. It tells the story of Mr. Norman Lux, a sincere young monk beset with bewildering maladies that seem somehow chaotically connected to the fate of the world. Some people have observed that Mr. Lux's condition is markedly similar to that of an electron in a race condition in a buggy chip -- perhaps the one Todd Griffith was designing when he was shot? Or the one in which his thoughts are now imprisoned?
Mining the Web
Soumen Chakrabarti - 2002
Building on an initial survey of infrastructural issues—including Web crawling and indexing—Chakrabarti examines low-level machine learning techniques as they relate specifically to the challenges of Web mining. He then devotes the final part of the book to applications that unite infrastructure and analysis to bring machine learning to bear on systematically acquired and stored data. Here the focus is on results: the strengths and weaknesses of these applications, along with their potential as foundations for further progress. From Chakrabarti's work—painstaking, critical, and forward-looking—readers will gain the theoretical and practical understanding they need to contribute to the Web mining effort.
* A comprehensive, critical exploration of statistics-based attempts to make sense of Web mining.
* Details the special challenges associated with analyzing unstructured and semi-structured data.
* Looks at how classical Information Retrieval techniques have been modified for use with Web data.
* Focuses on today's dominant learning methods: clustering and classification, hyperlink analysis, and supervised and semi-supervised learning.
* Analyzes current applications for resource discovery and social network analysis.
* An excellent way to introduce students to especially vital applications of data mining and machine learning technology.
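For readers curious what the hyperlink analysis surveyed in the book looks like in practice, here is a minimal sketch of PageRank computed by power iteration; the toy link graph and the damping factor 0.85 are illustrative, and the code is not from the book.

```python
links = {'a': ['b', 'c'], 'b': ['c'], 'c': ['a'], 'd': ['c']}

def pagerank(links, damping=0.85, iters=50):
    """Iteratively redistribute rank along out-links until it stabilizes."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        # Every page keeps a (1 - damping) share of baseline rank...
        new = {p: (1.0 - damping) / n for p in pages}
        # ...and passes a damped share of its rank to the pages it links to.
        for p, outs in links.items():
            share = damping * rank[p] / len(outs)
            for q in outs:
                new[q] += share
        rank = new
    return rank

for page, score in sorted(pagerank(links).items(), key=lambda kv: -kv[1]):
    print(page, round(score, 3))
```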