Gödel's Proof


Ernest Nagel - 1958
    Gödel received public recognition of his work in 1951 when he was awarded the first Albert Einstein Award for achievement in the natural sciences--perhaps the highest award of its kind in the United States. The award committee described his work in mathematical logic as "one of the greatest contributions to the sciences in recent times." However, few mathematicians of the time were equipped to understand the young scholar's complex proof. Ernest Nagel and James Newman provide both scholars and non-specialists with a readable and accessible explanation of the main ideas and broad implications of Gödel's discovery. It offers every educated person with a taste for logic and philosophy the chance to understand a previously difficult and inaccessible subject. New York University Press is proud to publish this special edition of one of its bestselling books. With a new introduction by Douglas R. Hofstadter, this book will appeal to students, scholars, and professionals in the fields of mathematics, computer science, logic and philosophy, and science.

Are You Smart Enough to Work at Google?


William Poundstone - 2012
    You are shrunk to the height of a nickel and thrown into a blender. The blades start moving in 60 seconds. What do you do? If you want to work at Google, or any of America's best companies, you need to have an answer to this and other puzzling questions. Are You Smart Enough to Work at Google? guides readers through the surprising solutions to dozens of the most challenging interview questions. The book covers the importance of creative thinking, ways to get a leg up on the competition, what your Facebook page says about you, and much more. Are You Smart Enough to Work at Google? is a must-read for anyone who wants to succeed in today's job market.

Google Hacking: An Ethical Hacking Guide To Google


Ankit Fadia - 2007
    Google Hacking teaches people how to get the most out of this revolutionary search engine. Not only will this book teach readers how Google works, but it will also empower them with the necessary skills to make their everyday searches easier, more efficient, and more productive. Google Hacking also demonstrates how Google can be used for malicious purposes. Its immense searching power means that everyone, including cyber criminals, can access confidential data, such as company presentations, budgets, blueprints, even credit card numbers, with just the click of a mouse. Using numerous examples, case studies, and screenshots, this book explains the art of ethical Google Hacking -- it not only teaches readers how Google works, but it provides them with the knowledge they need to protect their data and systems from getting Google Hacked. This is the only book you need to make the most of Google searches -- and to protect yourself from them!

Hands-On Machine Learning with Scikit-Learn and TensorFlow


Aurélien Géron - 2017
    Now that machine learning is thriving, even programmers who know close to nothing about this technology can use simple, efficient tools to implement programs capable of learning from data. This practical book shows you how. By using concrete examples, minimal theory, and two production-ready Python frameworks—Scikit-Learn and TensorFlow—author Aurélien Géron helps you gain an intuitive understanding of the concepts and tools for building intelligent systems. You’ll learn how to use a range of techniques, starting with simple Linear Regression and progressing to Deep Neural Networks. If you have some programming experience and you’re ready to code a machine learning project, this guide is for you. This hands-on book shows you how to use: Scikit-Learn, an accessible framework that implements many algorithms efficiently and serves as a great machine learning entry point; TensorFlow, a more complex library for distributed numerical computation, ideal for training and running very large neural networks; and practical code examples that you can apply without learning excessive machine learning theory or algorithm details.
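
    A minimal sketch of the kind of entry-level Scikit-Learn usage the blurb points to: fitting a plain Linear Regression model. This is not an example taken from the book; it assumes scikit-learn and NumPy are installed, and the toy data is made up purely for illustration.

# Hypothetical toy example: ordinary least squares with Scikit-Learn.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)
X = rng.uniform(0, 10, size=(50, 1))                   # one input feature
y = 3.0 * X[:, 0] + 2.0 + rng.normal(0, 0.5, size=50)  # roughly y = 3x + 2 plus noise

model = LinearRegression()
model.fit(X, y)                                        # learn slope and intercept from data

print(model.coef_, model.intercept_)                   # close to [3.0] and 2.0
print(model.predict([[4.0]]))                          # prediction for a new input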

Data Structures and Algorithm Analysis in C


Mark Allen Weiss - 1992
    The book's conceptual presentation focuses on ADTs and the analysis of algorithms for efficiency, with a particular concentration on performance and running time. The second edition contains a new chapter that examines advanced data structures such as red-black trees, top-down splay trees, treaps, k-d trees, and pairing heaps, among others. All code examples now conform to ANSI C, and coverage of the formal proofs underpinning several key data structures has been strengthened.

How to Solve It: A New Aspect of Mathematical Method


George Pólya - 1944
    Pólya's How to Solve It will show anyone in any field how to think straight. In lucid and appealing prose, Pólya reveals how the mathematical method of demonstrating a proof or finding an unknown can be of help in attacking any problem that can be reasoned out--from building a bridge to winning a game of anagrams. Generations of readers have relished Pólya's deft--indeed, brilliant--instructions on stripping away irrelevancies and going straight to the heart of the problem.

But How Do It Know? - The Basic Principles of Computers for Everyone


J. Clark Scott - 2009
    Its humorous title begins with the punch line of a classic joke about someone who is baffled by technology. It was written by a 40-year computer veteran who wants to take the mystery out of computers and allow everyone to gain a true understanding of exactly what computers are, and also what they are not. Years of writing, diagramming, piloting and editing have culminated in one easy-to-read volume that contains all of the basic principles of computers written so that everyone can understand them. There have traditionally been only two types of book that delve into the insides of computers. The simple ones point out the major parts and describe their functions in broad general terms. Computer science textbooks eventually tell the whole story, but along the way, they include every detail that an engineer could conceivably ever need to know. Like Momma Bear's porridge, But How Do It Know? is just right, but it is much more than just a happy medium. For the first time, this book thoroughly demonstrates each of the basic principles that have been used in every computer ever built, while at the same time showing the integral role that codes play in everything that computers are able to do. It cuts through all of the electronics and mathematics, and gets right to practical matters. Here is a simple part; see what it does. Connect a few of these together and you get a new part that does another simple thing. After just a few iterations of connecting up simple parts - voilà! - it's a computer. And it is much simpler than anyone ever imagined. But How Do It Know? really explains how computers work. They are far simpler than anyone has ever permitted you to believe. It contains everything you need to know, and nothing you don't need to know. No technical background of any kind is required. The basic principles of computers have not changed one iota since they were invented in the mid-20th century. "Since the day I learned how computers work, it always felt like I knew a giant secret, but couldn't tell anyone," says the author. Now he's taken the time to explain it in such a manner that anyone can have that same moment of enlightenment and thereafter see computers in an entirely new light.
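
    The blurb's "connect a few simple parts and you get a new part" idea can be sketched in a few lines. The snippet below is an illustration in Python rather than anything taken from the book: starting from a single NAND gate, it wires up NOT, AND, OR, XOR, and finally a half adder, the first step toward binary arithmetic.

# Illustrative sketch: building "new parts" out of one simple part, the NAND gate.
def NAND(a, b):
    return 0 if (a and b) else 1

def NOT(a):
    return NAND(a, a)

def AND(a, b):
    return NOT(NAND(a, b))

def OR(a, b):
    return NAND(NOT(a), NOT(b))

def XOR(a, b):
    return AND(OR(a, b), NAND(a, b))

def half_adder(a, b):
    # One more iteration of "connect the parts": a circuit that adds two bits.
    return XOR(a, b), AND(a, b)   # (sum bit, carry bit)

print(half_adder(1, 1))           # (0, 1): one plus one is binary 10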

The Elements of Statistical Learning: Data Mining, Inference, and Prediction


Trevor Hastie - 2001
    The recent explosion in computation and information technology has brought with it vast amounts of data in a variety of fields such as medicine, biology, finance, and marketing. The challenge of understanding these data has led to the development of new tools in the field of statistics, and spawned new areas such as data mining, machine learning, and bioinformatics. Many of these tools have common underpinnings but are often expressed with different terminology. This book describes the important ideas in these areas in a common conceptual framework. While the approach is statistical, the emphasis is on concepts rather than mathematics. Many examples are given, with a liberal use of color graphics. It should be a valuable resource for statisticians and anyone interested in data mining in science or industry. The book's coverage is broad, from supervised learning (prediction) to unsupervised learning. The many topics include neural networks, support vector machines, classification trees and boosting—the first comprehensive treatment of this topic in any book. Trevor Hastie, Robert Tibshirani, and Jerome Friedman are professors of statistics at Stanford University. They are prominent researchers in this area: Hastie and Tibshirani developed generalized additive models and wrote a popular book of that title. Hastie wrote much of the statistical modeling software in S-PLUS and invented principal curves and surfaces. Tibshirani proposed the Lasso and is co-author of the very successful An Introduction to the Bootstrap. Friedman is the co-inventor of many data-mining tools including CART, MARS, and projection pursuit.

The Haskell Road to Logic, Maths and Programming


Kees Doets - 2004
    Haskell emerged in the last decade as a standard for lazy functional programming, a programming style where arguments are evaluated only when the value is actually needed. Haskell is a marvellous demonstration tool for logic and maths because its functional character allows implementations to remain very close to the concepts that get implemented, while the laziness permits smooth handling of infinite data structures. This book does not assume that the reader has previous experience with either programming or the construction of formal proofs, but acquaintance with mathematical notation, at the level of secondary-school mathematics, is presumed. Everything one needs to know about mathematical reasoning or programming is explained as we go along. After proper digestion of the material in this book, the reader will be able to write interesting programs, reason about their correctness, and document them in a clear fashion. The reader will also have learned how to set up mathematical proofs in a structured way, and how to read and digest mathematical proofs written by others.
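
    Haskell's laziness has a rough analogue in Python generators, which also compute values only when they are demanded. The sketch below is a Python illustration of that idea, not Haskell code and not an exercise from the book: a conceptually infinite sequence of squares from which only the first five elements are ever computed.

# Python analogue of lazy evaluation: nothing here ever builds the infinite list.
from itertools import count, islice

def squares():
    for n in count(1):        # 1, 2, 3, ... without end
        yield n * n           # each square is computed only when asked for

first_five = list(islice(squares(), 5))
print(first_five)             # [1, 4, 9, 16, 25]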

Fuzzy Logic: The Revolutionary Computer Technology That Is Changing Our World


Daniel McNeill - 1993
    Professor Lotfi Zadeh masterminded "fuzzy logic"--a way of programming computers to "make decisions" based on imprecise data and complex situations. In "Fuzzy Logic," Daniel McNeill and Paul Freiberger relate the compelling tale of this remarkable new technology, the genius who brought it to life, and how it will soon affect the lives of every one of us.
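
    The core idea described above, decisions graded between 0 and 1 rather than crisp true/false, can be shown with a toy membership function. The sketch below is a made-up illustration, not material from the book: "warm" is a degree that rises from 0 to 1 as the temperature approaches 25 C and falls off again.

# Toy fuzzy membership function: how "warm" is a given temperature?
def warm(temp_c):
    # Triangular membership: 0 below 15 C or above 35 C, exactly 1 at 25 C.
    if temp_c <= 15 or temp_c >= 35:
        return 0.0
    if temp_c <= 25:
        return (temp_c - 15) / 10.0
    return (35 - temp_c) / 10.0

for t in (10, 18, 25, 30):
    print(t, warm(t))         # 0.0, 0.3, 1.0, 0.5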

The Lifebox, the Seashell, and the Soul: What Gnarly Computation Taught Me About Ultimate Reality, the Meaning of Life, and How to Be Happy


Rudy Rucker - 2005
    At the root of the computational worldview is the idea that very complex systems — the world we live in — have their beginnings in simple mathematical equations. We've lately come to understand that such rules are only the start of a never-ending story — the real action occurs in the unfolding consequences of the rules. The chip-in-a-box computers so popular in our time have acted as a kind of microscope, letting us see into the secret machinery of the world. In Lifebox, Rucker uses whimsical drawings, fables, and humor to demonstrate that everything is a computation — that thoughts, computations, and physical processes are all the same. Rucker discusses the linguistic and computational advances that make this kind of "digital philosophy" possible, and explains how, like every great new principle, the computational worldview contains the seeds of a next step.

The Man Who Knew Too Much: Alan Turing and the Invention of the Computer


David Leavitt - 2006
    To solve one of the great mathematical problems of his day, Alan Turing proposed an imaginary computer. Then, attempting to break a Nazi code during World War II, he successfully designed and built one, thus helping to ensure the Allied victory. Turing became a champion of artificial intelligence, but his work was cut short. As an openly gay man at a time when homosexuality was illegal in England, he was convicted and forced to undergo a humiliating "treatment" that may have led to his suicide. With a novelist's sensitivity, David Leavitt portrays Turing in all his humanity—his eccentricities, his brilliance, his fatal candor—and elegantly explains his work and its implications.

Quantum Computing for Everyone


Chris Bernhardt - 2019
    In this book, Chris Bernhardt offers an introduction to quantum computing that is accessible to anyone who is comfortable with high school mathematics. He explains qubits, entanglement, quantum teleportation, quantum algorithms, and other quantum-related topics as clearly as possible for the general reader. Bernhardt, a mathematician himself, simplifies the mathematics as much as he can and provides elementary examples that illustrate both how the math works and what it means. Bernhardt introduces the basic unit of quantum computing, the qubit, and explains how the qubit can be measured; discusses entanglement--which, he says, is easier to describe mathematically than verbally--and what it means when two qubits are entangled (citing Einstein's characterization of what happens when the measurement of one entangled qubit affects the second as "spooky action at a distance"); and introduces quantum cryptography. He recaps standard topics in classical computing--bits, gates, and logic--and describes Edward Fredkin's ingenious billiard ball computer. He defines quantum gates, considers the speed of quantum algorithms, and describes the building of quantum computers. By the end of the book, readers understand that quantum computing and classical computing are not two distinct disciplines, and that quantum computing is the fundamental form of computing. The basic unit of computation is the qubit, not the bit.
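
    The blurb's central point, that measuring a qubit gives 0 or 1 with probabilities set by the state's amplitudes, can be illustrated numerically. This is a minimal sketch assuming NumPy, not an example from the book: an equal superposition is sampled repeatedly, and the counts come out roughly half and half.

# Minimal sketch: a qubit as two amplitudes; measurement probabilities are |amplitude|^2.
import numpy as np

state = np.array([1.0, 1.0]) / np.sqrt(2)   # equal superposition of |0> and |1>
probs = np.abs(state) ** 2                  # [0.5, 0.5]

rng = np.random.default_rng(0)
samples = rng.choice([0, 1], size=1000, p=probs)
print(probs, np.bincount(samples))          # roughly 500 zeros and 500 ones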

Computer Science Illuminated


Nell B. Dale - 2002
    Written by two of today's most respected computer science educators, Nell Dale and John Lewis, the text provides a broad overview of the many aspects of the discipline from a generic viewpoint. Separate programming language chapters are available as bundle items for those instructors who would like to explore a particular programming language with their students. The many layers of computing are thoroughly explained, beginning with the information layer, working through the hardware, programming, operating systems, application, and communication layers, and ending with a discussion on the limitations of computing. Perfect for introductory computing and computer science courses, Computer Science Illuminated, Third Edition's thorough presentation of computing systems provides computer science majors with a solid foundation for further study, and offers non-majors a comprehensive and complete introduction to computing.

The Chip: How Two Americans Invented the Microchip and Launched a Revolution


T.R. Reid - 1984
    The world's brightest engineers were stymied in their quest to make computers small and affordable until the solution finally came from two ingenious young Americans. Jack Kilby and Robert Noyce hit upon the stunning discovery that would make possible the silicon microchip, a breakthrough that would ultimately earn Kilby the Nobel Prize in Physics in 2000. In this completely revised and updated edition of The Chip, T.R. Reid tells the gripping adventure story of their invention and of its growth into a global information industry. This is the story of how the digital age began.