The Annotated Turing: A Guided Tour Through Alan Turing's Historic Paper on Computability and the Turing Machine


Charles Petzold - 2008
    Mathematician Alan Turing invented an imaginary computer known as the Turing Machine; in an age before computers, he explored the concept of what it meant to be "computable," creating the field of computability theory in the process, a foundation of present-day computer programming. The book expands Turing's original 36-page paper with additional background chapters and extensive annotations; the author elaborates on and clarifies many of Turing's statements, making the original difficult-to-read document accessible to present-day programmers, computer science majors, math geeks, and others. Interwoven into the narrative are the highlights of Turing's own life: his years at Cambridge and Princeton, his secret work in cryptanalysis during World War II, his involvement in seminal computer projects, his speculations about artificial intelligence, his arrest and prosecution for the crime of "gross indecency," and his early death by apparent suicide at the age of 41.
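
    The machine the book annotates is simple enough to simulate in a few lines. Below is a minimal Python sketch of the tape-state-rule-table model; the sparse-tape encoding and the tiny example rule set are assumptions for illustration, not Turing's or Petzold's notation:

        # A minimal Turing machine simulator: an illustrative sketch of the
        # tape/state/rule-table model, not Turing's own notation.
        def run(rules, tape, state="start", pos=0, max_steps=1000):
            """rules maps (state, symbol) -> (symbol_to_write, move, next_state)."""
            cells = dict(enumerate(tape))            # sparse tape; "_" is blank
            for _ in range(max_steps):
                symbol = cells.get(pos, "_")
                if (state, symbol) not in rules:     # no applicable rule: halt
                    break
                write, move, state = rules[(state, symbol)]
                cells[pos] = write
                pos += 1 if move == "R" else -1
            return "".join(cells[i] for i in sorted(cells))

        # Example: a two-rule machine that flips every bit, then halts at the blank.
        flip = {("start", "0"): ("1", "R", "start"),
                ("start", "1"): ("0", "R", "start")}
        print(run(flip, "1011"))                     # prints 0100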

Computer Science Illuminated


Nell B. Dale - 2002
    Written by two of today's most respected computer science educators, Nell Dale and John Lewis, the text provides a broad overview of the many aspects of the discipline from a generic viewpoint. Separate programming-language chapters are available as bundle items for instructors who would like to explore a particular programming language with their students. The many layers of computing are thoroughly explained, beginning with the information layer, working through the hardware, programming, operating systems, application, and communication layers, and ending with a discussion of the limitations of computing. Perfect for introductory computing and computer science courses, Computer Science Illuminated, Third Edition's thorough presentation of computing systems provides computer science majors with a solid foundation for further study and offers non-majors a comprehensive and complete introduction to computing.

A Mind at Play: How Claude Shannon Invented the Information Age


Jimmy Soni - 2017
    He constructed a fleet of customized unicycles and a flamethrowing trumpet, outfoxed Vegas casinos, and built juggling robots. He also wrote the seminal text of the digital revolution, which has been called “the Magna Carta of the Information Age.” His discoveries would lead contemporaries to compare him to Albert Einstein and Isaac Newton. His work anticipated by decades the world we’d be living in today—and gave mathematicians and engineers the tools to bring that world to pass. In this elegantly written, exhaustively researched biography, Jimmy Soni and Rob Goodman reveal Claude Shannon’s full story for the first time. It’s the story of a small-town Michigan boy whose career stretched from the era of room-sized computers powered by gears and string to the age of Apple. It’s the story of the origins of our digital world in the tunnels of MIT and the “idea factory” of Bell Labs, in the “scientists’ war” with Nazi Germany, and in the work of Shannon’s collaborators and rivals, thinkers like Alan Turing, John von Neumann, Vannevar Bush, and Norbert Wiener. And it’s the story of Shannon’s life as an often reclusive, always playful genius. With access to Shannon’s family and friends, A Mind at Play brings this singular innovator and creative genius to life.

Turing's Cathedral: The Origins of the Digital Universe


George Dyson - 2012
    In Turing’s Cathedral, George Dyson focuses on a small group of men and women, led by John von Neumann at the Institute for Advanced Study in Princeton, New Jersey, who built one of the first computers to realize Alan Turing’s vision of a Universal Machine. Their work would break the distinction between numbers that mean things and numbers that do things—and our universe would never be the same. Using five kilobytes of memory (the amount allocated to displaying the cursor on a computer desktop of today), they achieved unprecedented success in both weather prediction and nuclear weapons design, while tackling, in their spare time, problems ranging from the evolution of viruses to the evolution of stars. Dyson’s account, both historic and prophetic, sheds important new light on how the digital universe exploded in the aftermath of World War II. The proliferation of both codes and machines was paralleled by two historic developments: the decoding of self-replicating sequences in biology and the invention of the hydrogen bomb. It’s no coincidence that the most destructive and the most constructive of human inventions appeared at exactly the same time. How did code take over the world? In retracing how Alan Turing’s one-dimensional model became John von Neumann’s two-dimensional implementation, Turing’s Cathedral offers a series of provocative suggestions as to where the digital universe, now fully three-dimensional, may be heading next.

Feynman Lectures On Computation


Richard P. Feynman - 1996
    When Feynman gave his famous course on computation at the California Institute of Technology, he asked Tony Hey to adapt his lecture notes into a book. Although led by Feynman, the course also featured, as occasional guest speakers, some of the most brilliant men in science at that time, including Marvin Minsky, Charles Bennett, and John Hopfield. Although the lectures are now thirteen years old, most of the material is timeless and presents a “Feynmanesque” overview of many standard and some not-so-standard topics in computer science, such as reversible logic gates and quantum computers.
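
    One of those not-so-standard topics, reversible logic, can be sketched with the Toffoli (controlled-controlled-NOT) gate, a standard example of a gate that is both universal and reversible; the Python encoding below is illustrative, not Feynman's notation:

        # The Toffoli gate flips the target bit c only when both controls are 1.
        # Applying it twice recovers the input, which is what "reversible" means.
        def toffoli(a, b, c):
            return a, b, c ^ (a & b)

        state = (1, 1, 0)
        once = toffoli(*state)
        twice = toffoli(*once)
        print(state, "->", once, "->", twice)   # (1,1,0) -> (1,1,1) -> (1,1,0)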

Introduction to Algorithms


Thomas H. Cormen - 1989
    Each chapter is relatively self-contained and can be used as a unit of study. The algorithms are described in English and in a pseudocode designed to be readable by anyone who has done a little programming. The explanations have been kept elementary without sacrificing depth of coverage or mathematical rigor.
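
    As a taste of that style, here is insertion sort, which the book treats in an early chapter, rendered in Python. The book's pseudocode is 1-based, so this 0-based version is an approximate translation, not the book's own listing:

        # Insertion sort: grow a sorted prefix by inserting each element into place.
        def insertion_sort(a):
            for j in range(1, len(a)):
                key = a[j]
                i = j - 1
                # Shift larger elements of the sorted prefix a[0..j-1] to the right.
                while i >= 0 and a[i] > key:
                    a[i + 1] = a[i]
                    i -= 1
                a[i + 1] = key
            return a

        print(insertion_sort([5, 2, 4, 6, 1, 3]))   # prints [1, 2, 3, 4, 5, 6]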

Nine Algorithms That Changed the Future: The Ingenious Ideas That Drive Today's Computers


John MacCormick - 2012
    A simple web search picks out a handful of relevant needles from the world's biggest haystack: the billions of pages on the World Wide Web. Uploading a photo to Facebook transmits millions of pieces of information over numerous error-prone network links, yet somehow a perfect copy of the photo arrives intact. Without even knowing it, we use public-key cryptography to transmit secret information like credit card numbers; and we use digital signatures to verify the identity of the websites we visit. How do our computers perform these tasks with such ease? This is the first book to answer that question in language anyone can understand, revealing the extraordinary ideas that power our PCs, laptops, and smartphones. Using vivid examples, John MacCormick explains the fundamental "tricks" behind nine types of computer algorithms, including artificial intelligence (where we learn about the "nearest neighbor trick" and "twenty questions trick"), Google's famous PageRank algorithm (which uses the "random surfer trick"), data compression, error correction, and much more. These revolutionary algorithms have changed our world: this book unlocks their secrets, and lays bare the incredible ideas that our computers use every day.
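
    The "random surfer trick" is easy to sketch: a surfer keeps following random outgoing links, occasionally getting bored and jumping to a random page, and the fraction of visits each page receives approximates its PageRank. The toy link graph and damping value below are assumptions for illustration, not MacCormick's example:

        # Estimate PageRank by simulating a random surfer on a tiny web.
        import random

        links = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}   # hypothetical tiny web
        visits = {page: 0 for page in links}
        page, damping = "A", 0.85

        random.seed(0)
        for _ in range(100_000):
            visits[page] += 1
            if random.random() < damping and links[page]:
                page = random.choice(links[page])   # follow a random outgoing link
            else:
                page = random.choice(list(links))   # get bored, jump anywhere

        total = sum(visits.values())
        for page, count in sorted(visits.items()):
            print(page, round(count / total, 3))    # visit frequency ~ PageRank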

Quantum Computing Since Democritus


Scott Aaronson - 2013
    Full of insights, arguments and philosophical perspectives, the book covers an amazing array of topics. Beginning in antiquity with Democritus, it progresses through logic and set theory, computability and complexity theory, quantum computing, cryptography, the information content of quantum states and the interpretation of quantum mechanics. There are also extended discussions about time travel, Newcomb's Paradox, the anthropic principle and the views of Roger Penrose. Aaronson's informal style makes this fascinating book accessible to readers with scientific backgrounds, as well as students and researchers working in physics, computer science, mathematics and philosophy.

Introduction to the Theory of Computation


Michael Sipser - 1996
    Sipser's candid, crystal-clear style allows students at every level to understand and enjoy this field. His innovative "proof idea" sections explain profound concepts in plain English. The new edition incorporates many improvements students and professors have suggested over the years, and offers updated, classroom-tested problem sets at the end of each chapter.

Hackers: Heroes of the Computer Revolution


Steven Levy - 1984
    Steven Levy's pioneering work documented the underground computer revolution that was about to change our world forever. With groundbreaking profiles of Bill Gates, Steve Wozniak, MIT's Tech Model Railroad Club, and more, Hackers brilliantly captured a seminal moment when the risk-takers and explorers were poised to conquer twentieth-century America's last great frontier. And in the Internet age, the hacker ethic, first espoused here, is alive and well.

But How Do It Know? - The Basic Principles of Computers for Everyone


J. Clark Scott - 2009
    Its humorous title begins with the punch line of a classic joke about someone who is baffled by technology. It was written by a 40-year computer veteran who wants to take the mystery out of computers and allow everyone to gain a true understanding of exactly what computers are, and also what they are not. Years of writing, diagramming, piloting and editing have culminated in one easy-to-read volume that contains all of the basic principles of computers written so that everyone can understand them. Traditionally there have been only two types of book that delve into the insides of computers. The simple ones point out the major parts and describe their functions in broad, general terms. Computer science textbooks eventually tell the whole story, but along the way they include every detail that an engineer could conceivably ever need to know. Like Momma Bear's porridge, But How Do It Know? is just right, but it is much more than just a happy medium. For the first time, this book thoroughly demonstrates each of the basic principles that have been used in every computer ever built, while at the same time showing the integral role that codes play in everything that computers are able to do. It cuts through all of the electronics and mathematics and gets right to practical matters: here is a simple part, see what it does; connect a few of these together and you get a new part that does another simple thing. After just a few iterations of connecting up simple parts, voilà! It's a computer, and it is much simpler than anyone ever imagined. But How Do It Know? really explains how computers work; they are far simpler than anyone has ever permitted you to believe. It contains everything you need to know, and nothing you don't need to know. No technical background of any kind is required. The basic principles of computers have not changed one iota since they were invented in the mid-20th century. "Since the day I learned how computers work, it always felt like I knew a giant secret, but couldn't tell anyone," says the author. Now he's taken the time to explain it in such a manner that anyone can have that same moment of enlightenment and thereafter see computers in an entirely new light.
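
    That "connect simple parts to get new parts" idea can be sketched in code: starting from a single NAND gate, every other basic gate falls out by wiring NANDs together. The Python below is an illustrative sketch of the principle, not the book's diagrams:

        # Build the other basic logic gates out of NAND alone.
        def nand(a, b): return 0 if (a and b) else 1

        def not_(a):    return nand(a, a)
        def and_(a, b): return not_(nand(a, b))
        def or_(a, b):  return nand(not_(a), not_(b))
        def xor(a, b):  return and_(or_(a, b), nand(a, b))

        # Print the truth table for the composed gates.
        for a in (0, 1):
            for b in (0, 1):
                print(a, b, "->", and_(a, b), or_(a, b), xor(a, b))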

Out of Their Minds: The Lives and Discoveries of 15 Great Computer Scientists


Dennis E. Shasha - 1995
    The latter half of our century has seen its own Renaissance: information technology has irrevocably changed the way we live, work, and think about the world. We are fortunate, therefore, that the authors of Out of Their Minds have been able to talk so candidly with the founders of computer science. In Out of Their Minds, readers will hear the Newtons and Euclids of the computer age as they talk about their discoveries in information technology. Based on interviews by freelance writer Cathy Lazere and the expertise of computer scientist Dennis Shasha, Out of Their Minds introduces readers to fifteen of the planet's foremost computer scientists, including eight winners of the Turing Award, computing's Nobel Prize. The scientists reveal themselves in fascinating anecdotes about their early inspirations and influences, their contributions to computer science, and their thoughts on its explosive future. These are the programmers whose work...

Alan Turing: The Enigma


Andrew Hodges - 1983
    His breaking of the German U-boat Enigma cipher in World War II ensured Anglo-American control of the Atlantic. But Turing's vision went far beyond the desperate wartime struggle. Already in the 1930s he had defined the concept of the universal machine, which underpins the computer revolution. In 1945 he was a pioneer of electronic computer design. But Turing's true goal was the scientific understanding of the mind, brought out in the drama and wit of the famous "Turing test" for machine intelligence and in his prophecy for the twenty-first century. Drawn into the cockpit of world events and the forefront of technological innovation, Alan Turing was also an innocent and unpretentious gay man trying to live in a society that criminalized him. In 1952 he revealed his homosexuality and was forced to participate in a humiliating treatment program, and was ever after regarded as a security risk. His suicide in 1954 remains one of the many enigmas in an astonishing life story.

Deep Learning


Ian Goodfellow - 2016
    Because the computer gathers knowledge from experience, there is no need for a human computer operator to formally specify all the knowledge that the computer needs. The hierarchy of concepts allows the computer to learn complicated concepts by building them out of simpler ones; a graph of these hierarchies would be many layers deep. This book introduces a broad range of topics in deep learning. The text offers mathematical and conceptual background, covering relevant concepts in linear algebra, probability theory and information theory, numerical computation, and machine learning. It describes deep learning techniques used by practitioners in industry, including deep feedforward networks, regularization, optimization algorithms, convolutional networks, sequence modeling, and practical methodology; and it surveys such applications as natural language processing, speech recognition, computer vision, online recommendation systems, bioinformatics, and videogames. Finally, the book offers research perspectives, covering such theoretical topics as linear factor models, autoencoders, representation learning, structured probabilistic models, Monte Carlo methods, the partition function, approximate inference, and deep generative models. Deep Learning can be used by undergraduate or graduate students planning careers in either industry or research, and by software engineers who want to begin using deep learning in their products or platforms. A website offers supplementary material for both readers and instructors.
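
    The "hierarchy of simpler concepts" can be sketched as a miniature deep feedforward network, where each layer transforms the previous layer's output. The untrained toy network below (layer sizes and activations are assumptions for illustration) is a sketch of the idea, not code from the book:

        # A tiny deep feedforward network: alternating linear maps and ReLUs.
        import numpy as np

        rng = np.random.default_rng(0)
        layer_sizes = [4, 8, 8, 2]        # input -> two hidden layers -> output
        weights = [rng.standard_normal((m, n))
                   for m, n in zip(layer_sizes, layer_sizes[1:])]

        def forward(x):
            for w in weights[:-1]:
                x = np.maximum(0, x @ w)  # ReLU hidden layers
            return x @ weights[-1]        # linear output layer

        # Random weights, so the output is meaningless until trained.
        print(forward(rng.standard_normal(4)))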

Artificial Intelligence: A Modern Approach


Stuart Russell - 1994
    The long-anticipated revision of this best-selling text offers the most comprehensive, up-to-date introduction to the theory and practice of artificial intelligence. New to this edition:
    * Nontechnical learning material accompanies each part of the book.
    * The Internet as a sample application for intelligent systems, added in several places including logical agents, planning, and natural language.
    * Increased coverage of material: default reasoning and truth maintenance systems, including multi-agent/distributed AI and game theory; probabilistic approaches to learning, including EM; and more detailed descriptions of probabilistic inference algorithms.
    * Updated and expanded exercises: 75% of the exercises are revised, with 100 new exercises.
    * On-line Java software makes it easy for students to do projects on the web using intelligent agents.
    A unified, agent-based approach organizes the material around the task of building intelligent agents, offering comprehensive, up-to-date coverage and a unified view of the field built around the rational decision-making paradigm.
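
    The agent-based organization centers on a percept-to-action loop. Below is a minimal reflex agent for the two-square vacuum world used early in the book; the rule encoding and the driver loop are illustrative assumptions, not the book's pseudocode:

        # A simple reflex agent: map the current percept directly to an action.
        def reflex_vacuum_agent(location, status):
            if status == "Dirty":
                return "Suck"
            return "Right" if location == "A" else "Left"

        # Drive the agent through the two-square world until both squares are clean.
        world = {"A": "Dirty", "B": "Dirty"}
        location = "A"
        for _ in range(4):
            action = reflex_vacuum_agent(location, world[location])
            print(location, world[location], "->", action)
            if action == "Suck":
                world[location] = "Clean"
            else:
                location = "B" if action == "Right" else "A"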