Alan Turing: The Enigma


Andrew Hodges - 1983
    His breaking of the German U-boat Enigma cipher in World War II ensured Anglo-American control of the Atlantic. But Turing's vision went far beyond the desperate wartime struggle. Already in the 1930s he had defined the concept of the universal machine, which underpins the computer revolution. In 1945 he was a pioneer of electronic computer design. But Turing's true goal was the scientific understanding of the mind, brought out in the drama and wit of the famous "Turing test" for machine intelligence and in his prophecy for the twenty-first century. Drawn into the cockpit of world events and the forefront of technological innovation, Alan Turing was also an innocent and unpretentious gay man trying to live in a society that criminalized him. In 1952 he revealed his homosexuality and was forced to participate in a humiliating treatment program, and was ever after regarded as a security risk. His suicide in 1954 remains one of the many enigmas in an astonishing life story.

The Sciences of the Artificial


Herbert A. Simon - 1969
    There are updates throughout the book as well. These take into account important advances in cognitive psychology and the science of design while confirming and extending the book's basic thesis: that a physical symbol system has the necessary and sufficient means for intelligent action. The chapter "Economic Reality" has also been revised to reflect a change in emphasis in Simon's thinking about the respective roles of organizations and markets in economic systems. "People sometimes ask me what they should read to find out about artificial intelligence. Herbert Simon's book The Sciences of the Artificial is always on the list I give them. Every page issues a challenge to conventional thinking, and the layman who digests it well will certainly understand what the field of artificial intelligence hopes to accomplish. I recommend it in the same spirit that I recommend Freud to people who ask about psychoanalysis, or Piaget to those who ask about child psychology: If you want to learn about a subject, start by reading its founding fathers." -- George A. Miller

iWoz: Computer Geek to Cult Icon: How I Invented the Personal Computer, Co-Founded Apple, and Had Fun Doing It


Steve Wozniak - 2006
    "...individual whose contributions to the scientific, business and cultural realms are extensive." —Bookpage. Before slim laptops that fit into briefcases, computers looked like strange, alien vending machines. But in "the most staggering burst of technical invention by a single person in high-tech history" (BusinessWeek) Steve Wozniak invented the first true personal computer. Wozniak teamed up with Steve Jobs, and Apple Computer was born, igniting the computer revolution and transforming the world. Here, thirty years later, the mischievous genius with the low profile treats readers to a rollicking, no-holds-barred account of his life—for once, in the voice of the wizard himself.

How to Solve It: A New Aspect of Mathematical Method


George Pólya - 1944
    A perennial bestseller by eminent mathematician George Polya, How to Solve It will show anyone in any field how to think straight. In lucid and appealing prose, Polya reveals how the mathematical method of demonstrating a proof or finding an unknown can be of help in attacking any problem that can be reasoned out--from building a bridge to winning a game of anagrams. Generations of readers have relished Polya's deft--indeed, brilliant--instructions on stripping away irrelevancies and going straight to the heart of the problem.

Hacker's Delight


Henry S. Warren Jr. - 2002
    Aiming to tell the dark secrets of computer arithmetic, this title is suitable for library developers, compiler writers, and lovers of elegant hacks.
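
    A taste of the kind of trick the book collects, sketched here in Python rather than the book's C and assembler (the trick itself, clearing the lowest set bit with x & (x - 1), is a classic of the genre):

        def popcount(x: int) -> int:
            """Count set bits by repeatedly clearing the lowest one."""
            count = 0
            while x:
                x &= x - 1  # x & (x - 1) clears the lowest set bit of x
                count += 1
            return count  # loop runs once per set bit, not once per bit position

        assert popcount(0b101101) == 4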

Big Data: A Revolution That Will Transform How We Live, Work, and Think


Viktor Mayer-Schönberger - 2013
    “Big data” refers to our burgeoning ability to crunch vast collections of information, analyze it instantly, and draw sometimes profoundly surprising conclusions from it. This emerging science can translate myriad phenomena—from the price of airline tickets to the text of millions of books—into searchable form, and uses our increasing computing power to unearth epiphanies that we never could have seen before. A revolution on par with the Internet or perhaps even the printing press, big data will change the way we think about business, health, politics, education, and innovation in the years to come. It also poses fresh threats, from the inevitable end of privacy as we know it to the prospect of being penalized for things we haven’t even done yet, based on big data’s ability to predict our future behavior. In this brilliantly clear, often surprising work, two leading experts explain what big data is, how it will change our lives, and what we can do to protect ourselves from its hazards. Big Data is the first big book about the next big thing. www.big-data-book.com

Let Over Lambda


Doug Hoyte - 2008
    Starting with the fundamentals, it describes the most advanced features of the most advanced language: Common Lisp. Only the top percentile of programmers use lisp, and if you can understand this book you are in the top percentile of lisp programmers. If you are looking for a dry coding manual that re-hashes common-sense techniques in whatever langue du jour, this book is not for you. This book is about pushing the boundaries of what we know about programming. While this book teaches useful skills that can help solve your programming problems today and now, it has also been designed to be entertaining and inspiring. If you have ever wondered what lisp or even programming itself is really about, this is the book you have been looking for.
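
    The title itself names the book's central pattern: a lambda expression captured over a let binding, which is to say a closure. A rough Python analogue of the idea (the book itself works in Common Lisp):

        def make_counter(start=0):
            count = start  # plays the role of the LET binding

            def counter():  # plays the role of the LAMBDA closed over it
                nonlocal count
                count += 1
                return count

            return counter  # each call to make_counter gets its own private state

        c = make_counter()
        assert (c(), c(), c()) == (1, 2, 3)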

The Beginning of Infinity: Explanations That Transform the World


David Deutsch - 2011
    Taking us on a journey through every fundamental field of science, as well as the history of civilization, art, moral values, and the theory of political institutions, Deutsch tracks how we form new explanations and drop bad ones, explaining the conditions under which progress—which he argues is potentially boundless—can and cannot happen. Hugely ambitious and highly original, The Beginning of Infinity explores and establishes deep connections between the laws of nature, the human condition, knowledge, and the possibility for progress.

Where Wizards Stay Up Late: The Origins of the Internet


Katie Hafner - 1996
    Today, twenty million people worldwide are surfing the Net. Where Wizards Stay Up Late is the exciting story of the pioneers responsible for creating the most talked about, most influential, and most far-reaching communications breakthrough since the invention of the telephone. In the 1960s, when computers were regarded as mere giant calculators, J.C.R. Licklider at MIT saw them as the ultimate communications devices. With Defense Department funds, he and a band of visionary computer whizzes began work on a nationwide, interlocking network of computers. Taking readers behind the scenes, Where Wizards Stay Up Late captures the hard work, genius, and happy accidents of their daring, stunningly successful venture.

Quantum Computing for Everyone


Chris Bernhardt - 2019
    In this book, Chris Bernhardt offers an introduction to quantum computing that is accessible to anyone who is comfortable with high school mathematics. He explains qubits, entanglement, quantum teleportation, quantum algorithms, and other quantum-related topics as clearly as possible for the general reader. Bernhardt, a mathematician himself, simplifies the mathematics as much as he can and provides elementary examples that illustrate both how the math works and what it means. Bernhardt introduces the basic unit of quantum computing, the qubit, and explains how the qubit can be measured; discusses entanglement--which, he says, is easier to describe mathematically than verbally--and what it means when two qubits are entangled (citing Einstein's characterization of what happens when the measurement of one entangled qubit affects the second as "spooky action at a distance"); and introduces quantum cryptography. He recaps standard topics in classical computing--bits, gates, and logic--and describes Edward Fredkin's ingenious billiard ball computer. He defines quantum gates, considers the speed of quantum algorithms, and describes the building of quantum computers. By the end of the book, readers understand that quantum computing and classical computing are not two distinct disciplines, and that quantum computing is the fundamental form of computing. The basic unit of computation is the qubit, not the bit.
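
    To make the closing claim concrete, here is a minimal sketch (mine, not the book's) of a qubit as a two-component state vector in Python with NumPy: a Hadamard gate sends |0> to an equal superposition, and measurement probabilities are the squared amplitudes.

        import numpy as np

        ket0 = np.array([1.0, 0.0])                    # the state |0>
        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate

        state = H @ ket0              # equal superposition of |0> and |1>
        probs = np.abs(state) ** 2    # Born rule: probabilities are squared amplitudes
        print(probs)                  # [0.5 0.5]: measuring yields 0 or 1 equally often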

A History of Modern Computing


Paul E. Ceruzzi - 1998
    The author concentrates on five key moments of transition: the transformation of the computer in the late 1940s from a specialized scientific instrument to a commercial product; the emergence of small systems in the late 1960s; the beginning of personal computing in the 1970s; the spread of networking after 1985; and, in a chapter written for this edition, the period 1995-2001. The new material focuses on the Microsoft antitrust suit, the rise and fall of the dot-coms, and the advent of open source software, particularly Linux. Within the chronological narrative, the book traces several overlapping threads: the evolution of the computer's internal design; the effect of economic trends and the Cold War; the long-term role of IBM as a player and as a target for upstart entrepreneurs; the growth of software from a hidden element to a major character in the story of computing; and the recurring issue of the place of information and computing in a democratic society. The focus is on the United States (though Europe and Japan enter the story at crucial points), on computing per se rather than on applications such as artificial intelligence, and on systems that were sold commercially and installed in quantities.

Grokking Algorithms An Illustrated Guide For Programmers and Other Curious People


Aditya Y. Bhargava - 2015
    The algorithms you'll use most often as a programmer have already been discovered, tested, and proven. If you want to take a hard pass on Knuth's brilliant but impenetrable theories and the dense multi-page proofs you'll find in most textbooks, this is the book for you. This fully-illustrated and engaging guide makes it easy for you to learn how to use algorithms effectively in your own programs. Grokking Algorithms is a disarming take on a core computer science topic. In it, you'll learn how to apply common algorithms to the practical problems you face in day-to-day life as a programmer. You'll start with problems like sorting and searching. As you build up your skills in thinking algorithmically, you'll tackle more complex concerns such as data compression or artificial intelligence. Whether you're writing business software, video games, mobile apps, or system utilities, you'll learn algorithmic techniques for solving problems that you thought were out of your grasp. For example, you'll be able to:
    - Write a spell checker using graph algorithms
    - Understand how data compression works using Huffman coding
    - Identify problems that take too long to solve with naive algorithms, and attack them with algorithms that give you an approximate answer instead
    Each carefully-presented example includes helpful diagrams and fully-annotated code samples in Python. By the end of this book, you will know some of the most widely applicable algorithms as well as how and when to use them.
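
    In the spirit of the book's annotated Python samples, here is a sketch (illustrative, not quoted from the text) of binary search, the kind of searching problem it starts with:

        def binary_search(items, target):
            """Return the index of target in the sorted list items, or None."""
            low, high = 0, len(items) - 1
            while low <= high:
                mid = (low + high) // 2  # each comparison halves the range: O(log n)
                if items[mid] == target:
                    return mid
                if items[mid] < target:
                    low = mid + 1
                else:
                    high = mid - 1
            return None

        assert binary_search([1, 3, 5, 7, 9], 7) == 3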

Information Theory, Inference and Learning Algorithms


David J.C. MacKay - 2002
    These topics lie at the heart of many exciting areas of contemporary science and engineering - communication, signal processing, data mining, machine learning, pattern recognition, computational neuroscience, bioinformatics, and cryptography. This textbook introduces theory in tandem with applications. Information theory is taught alongside practical communication systems, such as arithmetic coding for data compression and sparse-graph codes for error-correction. A toolbox of inference techniques, including message-passing algorithms, Monte Carlo methods, and variational approximations, is developed alongside applications of these tools to clustering, convolutional codes, independent component analysis, and neural networks. The final part of the book describes the state of the art in error-correcting codes, including low-density parity-check codes, turbo codes, and digital fountain codes -- the twenty-first century standards for satellite communications, disk drives, and data broadcast. Richly illustrated, filled with worked examples and over 400 exercises, some with detailed solutions, David MacKay's groundbreaking book is ideal for self-learning and for undergraduate or graduate courses. Interludes on crosswords, evolution, and sex provide entertainment along the way. In sum, this is a textbook on information, communication, and coding for a new generation of students, and an unparalleled entry point into these subjects for professionals in areas as diverse as computational biology, financial engineering, and machine learning.
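
    As a minimal illustration of the book's starting point (my sketch, not MacKay's code), the Shannon entropy of a source sets the limit on how far its output can be compressed:

        import math

        def entropy(probs):
            """Shannon entropy in bits: H = -sum(p * log2(p))."""
            return -sum(p * math.log2(p) for p in probs if p > 0)

        print(entropy([0.5, 0.5]))  # 1.0 bit per symbol: a fair coin is incompressible
        print(entropy([0.9, 0.1]))  # ~0.47 bits: a biased coin's flips compress well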

Concrete Mathematics: A Foundation for Computer Science


Ronald L. Graham - 1988
    "More concretely," the authors explain, "it is the controlled manipulation of mathematical formulas, using a collection of techniques for solving problems."

Dreaming in Code: Two Dozen Programmers, Three Years, 4,732 Bugs, and One Quest for Transcendent Software


Scott Rosenberg - 2007
    Along the way, we encounter black holes, turtles, snakes, dragons, axe-sharpening, and yak-shaving—and take a guided tour through the theories and methods, both brilliant and misguided, that litter the history of software development, from the famous ‘mythical man-month’ to Extreme Programming. Not just for technophiles but for anyone captivated by the drama of invention, Dreaming in Code offers a window into both the information age and the workings of the human mind.