Feynman Lectures On Computation


Richard P. Feynman - 1996
    When Feynman gave his famous course on computation at the California Institute of Technology, he asked Tony Hey to adapt his lecture notes into a book. Although led by Feynman, the course also featured, as occasional guest speakers, some of the most brilliant men in science at that time, including Marvin Minsky, Charles Bennett, and John Hopfield. Although the lectures are now thirteen years old, most of the material is timeless and presents a “Feynmanesque” overview of many standard and some not-so-standard topics in computer science, such as reversible logic gates and quantum computers.
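
    As a small taste of the reversible-logic material, here is a minimal sketch (not taken from the lectures) of the Toffoli, or controlled-controlled-NOT, gate in Python; applying it twice recovers the input, which is the defining property of a reversible gate.

```python
def toffoli(a: int, b: int, c: int) -> tuple[int, int, int]:
    """Toffoli (CCNOT) gate: flip the target bit c only when both controls a and b are 1."""
    return a, b, c ^ (a & b)

# Reversibility: applying the gate twice restores the original bits, so no information is lost.
for bits in [(0, 0, 1), (1, 1, 0), (1, 0, 1), (1, 1, 1)]:
    assert toffoli(*toffoli(*bits)) == bits
```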

Network Security: Private Communication in a Public World


Charlie Kaufman - 1995
    In the second edition of Network Security, this most distinguished of author teams draws on hard-won experience to explain every facet of information security, from the basics to advanced cryptography and authentication; secure Web and email services; and emerging security standards. Highlights of the book's extensive new coverage include Advanced Encryption Standard (AES), IPsec, SSL, PKI Standards, and Web security.
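
    As a toy illustration of the authentication ideas such a book covers (a hedged sketch, not an example from the text), the Python standard library can compute and verify a message-authentication code like this; the key and message below are made up for illustration.

```python
import hashlib
import hmac

key = b"shared-secret-key"            # hypothetical pre-shared key
message = b"transfer 100 to alice"    # hypothetical message to protect

# Sender attaches an HMAC-SHA256 tag to the message.
tag = hmac.new(key, message, hashlib.sha256).hexdigest()

def verify(key: bytes, message: bytes, tag: str) -> bool:
    """Receiver recomputes the tag and compares it in constant time."""
    expected = hmac.new(key, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

assert verify(key, message, tag)
assert not verify(key, b"transfer 9999 to mallory", tag)
```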

The Data Detective: Ten Easy Rules to Make Sense of Statistics


Tim Harford - 2020
    Distrusting statistics is a mistake, Tim Harford says in The Data Detective. We shouldn’t be suspicious of statistics—we need to understand what they mean and how they can improve our lives: they are, at heart, human behavior seen through the prism of numbers and are often “the only way of grasping much of what is going on around us.” If we can toss aside our fears and learn to approach them clearly—understanding how our own preconceptions lead us astray—statistics can point to ways we can live better and work smarter. As “perhaps the best popular economics writer in the world” (New Statesman), Tim Harford is an expert at taking complicated ideas and untangling them for millions of readers. In The Data Detective, he uses new research in science and psychology to set out ten strategies for using statistics to erase our biases and replace them with new ideas that use virtues like patience, curiosity, and good sense to better understand ourselves and the world. As a result, The Data Detective is a big-idea book about statistics and human behavior that is fresh, unexpected, and insightful.

Introduction to Automata Theory, Languages, and Computation


John E. Hopcroft - 1979
    With this long-awaited revision, the authors continue to present the theory in a concise and straightforward manner, now with an eye toward practical applications. They have revised this book to make it more accessible to today's students, including the addition of more material on writing proofs, more figures and pictures to convey ideas, side-boxes to highlight other interesting material, and a less formal writing style. Exercises at the end of each chapter, including some new, easier exercises, help readers confirm and enhance their understanding of the material. *NEW! Completely rewritten to be less formal, providing more accessibility to today's students. *NEW! Increased usage of figures and pictures to help convey ideas. *NEW! More detail and intuition provided for definitions and proofs. *NEW! Provides special side-boxes to present supplemental material that may be of interest to readers. *NEW! Includes more exercises, including many at a lower level. *NEW! Presents program-like notation for PDAs and Turing machines.
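
    To give a flavor of that program-like notation (an illustrative sketch, not the book's own), a deterministic finite automaton can be written as a small transition table in Python; this one accepts binary strings containing an even number of 1s.

```python
# DFA transition table: (current state, input symbol) -> next state.
transitions = {
    ("even", "0"): "even", ("even", "1"): "odd",
    ("odd", "0"): "odd",   ("odd", "1"): "even",
}

def accepts(word: str) -> bool:
    """Run the DFA from the start state 'even'; accept if we end in 'even'."""
    state = "even"
    for symbol in word:
        state = transitions[(state, symbol)]
    return state == "even"

assert accepts("1100")       # two 1s: accepted
assert not accepts("1101")   # three 1s: rejected
```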

Landing Eagle: Inside the Cockpit During the First Moon Landing


Michael Engle - 2019
    It was a sea in name only. It was actually a bone dry, ancient dusty basin pockmarked with craters and littered with rocks and boulders. Somewhere in that 500-mile-diameter basin, the astronauts would attempt to make Mankind’s first landing on the Moon. Neil Armstrong would pilot the Lunar Module “Eagle” during its twelve-minute descent from orbit down to a landing. Col. Edwin “Buzz” Aldrin would assist him. On the way down they would encounter a host of problems, any one of which could have potentially caused them to have to call off the landing, or, even worse, die making the attempt. The problems were all technical: communications problems, computer problems, guidance problems, sensor problems. Armstrong and Aldrin faced the very real risk of dying by the very same technical sword that they had to live by in order to accomplish the enormous task of landing on the Moon for the first time. Yet the human skills Armstrong and Aldrin employed would be more than equal to the task. Armstrong’s formidable skills as an aviator, honed from the time he was a young boy, would serve him well as he piloted Eagle down amidst a continuing series of systems problems that might have fatally distracted a lesser aviator. Armstrong’s brilliant piloting was complemented by Aldrin’s equally remarkable discipline and calmness as he stoically provided a running commentary on altitude and descent rate while handling systems problems that threatened the landing. Finally, after a harrowing twelve and a half minutes, Armstrong gently landed Eagle at “Tranquility Base”, a name he had personally chosen to denote the location of the first Moon landing. In “Landing Eagle: Inside the Cockpit During the First Moon Landing”, author Mike Engle gives a minute-by-minute account of the events that occurred throughout Eagle’s descent and landing on the Moon. Engle, a retired NASA engineer and Mission Control flight controller, uses NASA audio files of actual voice recordings made inside Eagle’s cockpit during landing to give the reader an “inside the cockpit” perspective on the first Moon landing. Engle’s transcripts of these recordings, along with background material on the history and technical details behind the enormous effort to accomplish the first Moon landing, give a new and fascinating insight into the events that occurred on that remarkable day fifty years ago.

Machine Learning


Tom M. Mitchell - 1997
    Mitchell covers the field of machine learning, the study of algorithms that allow computer programs to improve automatically through experience and to infer general laws from specific data.
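
    As a hedged, toy illustration of "improving through experience" (not an example from Mitchell's text), a nearest-neighbor learner makes better predictions as it accumulates labeled examples:

```python
def predict(experience: list[tuple[float, str]], x: float) -> str:
    """Label x with the label of the closest example seen so far (1-nearest neighbor)."""
    _, label = min(experience, key=lambda pair: abs(pair[0] - x))
    return label

experience = [(0.1, "small"), (0.9, "large")]
print(predict(experience, 0.4))   # -> "small"

# More experience allows finer distinctions: the same query is now answered differently.
experience.append((0.5, "medium"))
print(predict(experience, 0.4))   # -> "medium"
```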

How to Think Like a Mathematician


Kevin Houston - 2009
    Working through the book you will develop an arsenal of techniques to help you unlock the meaning of definitions, theorems and proofs, solve problems, and write mathematics effectively. All the major methods of proof - direct method, cases, induction, contradiction and contrapositive - are featured. Concrete examples are used throughout, and you'll get plenty of practice on topics common to many courses such as divisors, Euclidean algorithms, modular arithmetic, equivalence relations, and injectivity and surjectivity of functions. The material has been tested by real students over many years so all the essentials are covered. With over 300 exercises to help you test your progress, you'll soon learn how to think like a mathematician.
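
    For a taste of two of the listed topics, here is a short sketch (not from the book) of the Euclidean algorithm and a modular-arithmetic check in Python:

```python
def gcd(a: int, b: int) -> int:
    """Euclidean algorithm: repeatedly replace (a, b) with (b, a mod b)."""
    while b != 0:
        a, b = b, a % b
    return a

assert gcd(252, 105) == 21

# Modular arithmetic: 17 and 5 are congruent modulo 12 (they leave the same remainder).
assert 17 % 12 == 5 % 12
```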

Gödel: A Life of Logic, the Mind, and Mathematics


John L. Casti - 2000
    Kurt Gödel's Incompleteness Theorem turned not only mathematics but also the whole world of science and philosophy on its head. Equally legendary were Gödel's eccentricities, his close friendship with Albert Einstein, and his paranoid fear of germs that eventually led to his death from self-starvation. Now, in the first popular biography of this strange and brilliant thinker, John Casti and Werner DePauli bring the legend to life. After describing his childhood in the Moravian capital of Brno, the authors trace the arc of Gödel's remarkable career, from the famed Vienna Circle, where philosophers and scientists debated notions of truth, to the Institute for Advanced Study in Princeton, New Jersey, where he lived and worked until his death in 1978. In the process, they shed light on Gödel's contributions to mathematics, philosophy, computer science, artificial intelligence -- even cosmology -- in an entertaining and accessible way.

Introduction to Probability


Joseph K. Blitzstein - 2014
    The book explores a wide variety of applications and examples, ranging from coincidences and paradoxes to Google PageRank and Markov chain Monte Carlo (MCMC). Additional application areas explored include genetics, medicine, computer science, and information theory. The print book version includes a code that provides free access to an eBook version. The authors present the material in an accessible style and motivate concepts using real-world examples. Throughout, they use stories to uncover connections between the fundamental distributions in statistics and conditioning to reduce complicated problems to manageable pieces. The book includes many intuitive explanations, diagrams, and practice problems. Each chapter ends with a section showing how to perform relevant simulations and calculations in R, a free statistical software environment.
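
    The book's simulations are written in R; as a rough Python analogue (an illustrative sketch, not the authors' code), a Monte Carlo simulation estimates the classic birthday-coincidence probability:

```python
import random

def birthday_match_prob(n_people: int = 23, trials: int = 100_000) -> float:
    """Estimate P(at least two of n_people share a birthday) by simulation."""
    hits = 0
    for _ in range(trials):
        birthdays = [random.randrange(365) for _ in range(n_people)]
        if len(set(birthdays)) < n_people:   # a repeat means at least one shared birthday
            hits += 1
    return hits / trials

print(birthday_match_prob())   # close to the exact answer of about 0.507 for 23 people
```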

The Hundred-Page Machine Learning Book


Andriy Burkov - 2019
    This hundred-page book can be read in about a week, and during that week you will learn almost everything modern machine learning has to offer. The author and other practitioners have spent years learning these concepts. Companion wiki — the book has a continuously updated wiki that extends some book chapters with additional information: Q&A, code snippets, further reading, tools, and other relevant resources. Flexible price and formats — choose from a variety of formats and price options: Kindle, hardcover, paperback, EPUB, PDF. If you buy an EPUB or a PDF, you decide the price you pay! Read first, buy later — download book chapters for free, read them, and share them with your friends and colleagues. Buy the book only if you liked it or found it useful in your work, study, or business.

The Little Book of Mathematical Principles, Theories, & Things


Robert Solomon - 2008
    Rare Book

Learning the UNIX Operating System


Jerry Peek - 1989
    Why wade through a 600-page book when you can begin working productively in a matter of minutes? It's an ideal primer for Mac and PC users of the Internet who need to know a little bit about UNIX on the systems they visit. This book is the most effective introduction to UNIX in print. The fourth edition covers the highlights of the Linux operating system. It's a handy book for someone just starting with UNIX or Linux, as well as someone who encounters a UNIX system on the Internet. And it now includes a quick-reference card. Topics covered include: Linux operating system highlights; logging in and logging out; window systems (especially X/Motif); managing UNIX files and directories; sending and receiving mail; redirecting input/output; pipes and filters; background processing; and basic network commands.
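
    As a loose illustration of pipes and redirection (a Python sketch rather than the shell commands the book actually teaches; the filenames are made up), the classic pipeline `ls | grep txt > matches.txt` can be reproduced with the subprocess module:

```python
import subprocess

# Equivalent of the shell pipeline:  ls | grep txt > matches.txt
ls = subprocess.Popen(["ls"], stdout=subprocess.PIPE)
grep = subprocess.Popen(["grep", "txt"], stdin=ls.stdout, stdout=subprocess.PIPE)
ls.stdout.close()                      # allow ls to get SIGPIPE if grep exits early
output, _ = grep.communicate()

with open("matches.txt", "wb") as f:   # redirection: send the result to a file
    f.write(output)
```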

The Computational Beauty of Nature: Computer Explorations of Fractals, Chaos, Complex Systems, and Adaptation


Gary William Flake - 1998
    Distinguishing agents (e.g., molecules, cells, animals, and species) from their interactions (e.g., chemical reactions, immune system responses, sexual reproduction, and evolution), Flake argues that it is the computational properties of interactions that account for much of what we think of as beautiful and interesting. From this basic thesis, Flake explores what he considers to be today's four most interesting computational topics: fractals, chaos, complex systems, and adaptation. Each of the book's parts can be read independently, enabling even the casual reader to understand and work with the basic equations and programs. Yet the parts are bound together by the theme of the computer as a laboratory and a metaphor for understanding the universe. The inspired reader will experiment further with the ideas presented to create fractal landscapes, chaotic systems, artificial life forms, genetic algorithms, and artificial neural networks.
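
    As a small taste of the chaos material (an illustrative sketch, not code from the book), the logistic map x -> r*x*(1-x) shows how a one-line rule can send nearly identical starting points onto wildly different trajectories:

```python
def logistic_orbit(x0: float, r: float = 4.0, steps: int = 20) -> list[float]:
    """Iterate the logistic map x -> r * x * (1 - x) for a given number of steps."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# Two starting points differing by one part in a million diverge within ~20 iterations.
print(logistic_orbit(0.300000)[-1])
print(logistic_orbit(0.300001)[-1])
```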

Make Your Own Neural Network


Tariq Rashid - 2016
    Neural networks are a key element of deep learning and artificial intelligence, which today are capable of some truly impressive feats. Yet too few really understand how neural networks actually work. This guide will take you on a fun and unhurried journey, starting from very simple ideas and gradually building up an understanding of how neural networks work. You won't need any mathematics beyond secondary school, and an accessible introduction to calculus is also included. The ambition of this guide is to make neural networks as accessible as possible to as many readers as possible - there are enough texts for advanced readers already! You'll learn to code in Python and make your own neural network, teaching it to recognise human handwritten numbers and to perform as well as professionally developed networks. Part 1 is about ideas. We introduce the mathematical ideas underlying neural networks, gently and with lots of illustrations and examples. Part 2 is practical. We introduce the popular and easy-to-learn Python programming language and gradually build up a neural network which can learn to recognise human handwritten numbers, easily getting it to perform as well as networks made by professionals. Part 3 extends these ideas further. We push the performance of our neural network to an industry-leading 98% using only simple ideas and code, test the network on your own handwriting, take a privileged peek inside the mysterious mind of a neural network, and even get it all working on a Raspberry Pi. All the code in this guide has been tested to work on a Raspberry Pi Zero.
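
    In the same spirit (a generic sketch, not the book's actual code, which trains a larger network on handwritten digits), here is a tiny NumPy network learning XOR by gradient descent:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Four training examples of the XOR function.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(size=(2, 4)), np.zeros((1, 4))   # input -> hidden layer
W2, b2 = rng.normal(size=(4, 1)), np.zeros((1, 1))   # hidden -> output layer

lr = 1.0
for _ in range(10_000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: gradients of the squared error through the sigmoids.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Gradient-descent updates.
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

print(out.round(2).ravel())   # typically close to [0, 1, 1, 0]
```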

Artificial Intelligence: A Guide for Thinking Humans


Melanie Mitchell - 2019
    The award-winning author Melanie Mitchell, a leading computer scientist, now reveals AI’s turbulent history and the recent spate of apparent successes, grand hopes, and emerging fears surrounding it. In Artificial Intelligence, Mitchell turns to the most urgent questions concerning AI today: How intelligent—really—are the best AI programs? How do they work? What can they actually do, and when do they fail? How humanlike do we expect them to become, and how soon do we need to worry about them surpassing us? Along the way, she introduces the dominant models of modern AI and machine learning, describing cutting-edge AI programs, their human inventors, and the historical lines of thought underpinning recent achievements. She meets with fellow experts such as Douglas Hofstadter, the cognitive scientist and Pulitzer Prize–winning author of the modern classic Gödel, Escher, Bach, who explains why he is “terrified” about the future of AI. She explores the profound disconnect between the hype and the actual achievements in AI, providing a clear sense of what the field has accomplished and how much further it has to go. Interweaving stories about the science of AI and the people behind it, Artificial Intelligence brims with clear-sighted, captivating, and accessible accounts of the most interesting and provocative modern work in the field, flavored with Mitchell’s humor and personal observations. This frank, lively book is an indispensable guide to understanding today’s AI, its quest for “human-level” intelligence, and its impact on the future for us all.