Computer: A History of the Information Machine


Martin Campbell-Kelly - 1996
    Old-fashioned entrepreneurship combined with scientific know-how inspired now-famous computer engineers to create the technology that became IBM. Wartime needs drove the giant ENIAC, the first fully electronic computer. Later, the PC enabled modes of computing that liberated people from room-sized mainframe computers. This second edition extends beyond the development of Microsoft Windows and the Internet to include open source operating systems like Linux and the rise, fall, and potential rise again of the dot-com industries.

Computer Age Statistical Inference: Algorithms, Evidence, and Data Science


Bradley Efron - 2016
    'Big data', 'data science', and 'machine learning' have become familiar terms in the news, as statistical methods are brought to bear upon the enormous data sets of modern science and commerce. How did we get here? And where are we going? This book takes us on an exhilarating journey through the revolution in data analysis following the introduction of electronic computation in the 1950s. Beginning with classical inferential theories - Bayesian, frequentist, Fisherian - individual chapters take up a series of influential topics: survival analysis, logistic regression, empirical Bayes, the jackknife and bootstrap, random forests, neural networks, Markov chain Monte Carlo, inference after model selection, and dozens more. The distinctly modern approach integrates methodology and algorithms with statistical inference. The book ends with speculation on the future direction of statistics and data science.
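
    To make one of those topics concrete, here is a minimal sketch of the nonparametric bootstrap in Python; the sample data and resample count are illustrative assumptions, not taken from the book.

        import numpy as np

        rng = np.random.default_rng(0)
        x = rng.normal(loc=5.0, scale=2.0, size=50)  # hypothetical sample

        B = 2000  # number of bootstrap resamples
        boot_means = np.empty(B)
        for b in range(B):
            # Draw n points from the sample, with replacement.
            resample = rng.choice(x, size=x.size, replace=True)
            boot_means[b] = resample.mean()

        # The spread of the resampled means estimates the standard error
        # of the sample mean, with no formula derived by hand.
        print("sample mean:", x.mean())
        print("bootstrap SE of the mean:", boot_means.std(ddof=1))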

Intermediate Perl


Randal L. Schwartz - 2003
    One slogan of Perl is that it makes easy things easy and hard things possible. "Intermediate Perl" is about making the leap from the easy things to the hard ones. Originally released in 2003 as "Learning Perl Objects, References, and Modules" and revised and updated for Perl 5.8, this book offers a gentle but thorough introduction to intermediate programming in Perl. Written by the authors of the best-selling "Learning Perl," it picks up where that book left off. Topics include:
    - Packages and namespaces
    - References and scoping
    - Manipulating complex data structures
    - Object-oriented programming
    - Writing and using modules
    - Testing Perl code
    - Contributing to CPAN
    Following the successful format of "Learning Perl," we designed each chapter to be small enough to be read in just an hour or two, ending with a series of exercises to help you practice what you've learned. To use the book, you just need to be familiar with the material in "Learning Perl" and have the ambition to go further. Perl is a different language to different people: a quick scripting tool for some, a fully featured object-oriented language for others. It is used for everything from performing quick global replacements on text files to crunching huge, complex sets of scientific data that take weeks to process. Perl is what you make of it. But regardless of what you use Perl for, this book helps you do it more effectively, efficiently, and elegantly. "Intermediate Perl" is about learning to use Perl as a programming language, not just a scripting language. This is the book that turns the Perl dabbler into the Perl programmer.

Concepts, Techniques, and Models of Computer Programming


Peter Van Roy - 2004
    The book focuses on techniques of lasting value and explains them precisely in terms of a simple abstract machine. It presents all major programming paradigms in a uniform framework that shows their deep relationships and how and where to use them together. After an introduction to programming concepts, the book presents both well-known and lesser-known computation models ("programming paradigms"). Each model has its own set of techniques, and each is included on the basis of its usefulness in practice. The general models include declarative programming, declarative concurrency, message-passing concurrency, explicit state, object-oriented programming, shared-state concurrency, and relational programming. Specialized models include graphical user interface programming, distributed programming, and constraint programming. Each model is based on its kernel language: a simple core language that consists of a small number of programmer-significant elements. The kernel languages are introduced progressively, adding concepts one by one, thus showing the deep relationships between different models, and are defined precisely in terms of a simple abstract machine. Because a wide variety of languages and programming paradigms can be modeled by a small set of closely related kernel languages, this approach allows programmer and student to grasp the underlying unity of programming. The book has many program fragments and exercises, all of which can be run on the Mozart Programming System, an open source software package that features an interactive incremental development environment.
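
    As a taste of how two of these models differ, here is a small sketch, in Python rather than the book's Oz/Mozart, contrasting declarative programming (a pure function, no mutable state) with explicit state (an object updated in place); the names are mine, not the book's.

        from functools import reduce

        # Declarative: the result depends only on the input; nothing is mutated.
        def total_declarative(xs):
            return reduce(lambda acc, x: acc + x, xs, 0)

        # Explicit state: an accumulator whose internal value changes over time.
        class Accumulator:
            def __init__(self):
                self.value = 0
            def add(self, x):
                self.value += x

        acc = Accumulator()
        for x in [1, 2, 3, 4]:
            acc.add(x)

        print(total_declarative([1, 2, 3, 4]))  # 10
        print(acc.value)                        # 10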

The Fractal Geometry of Nature


Benoît B. Mandelbrot - 1977
    The complexity of nature's shapes differs in kind, not merely degree, from that of the shapes of ordinary geometry; describing it calls for a new geometry, the geometry of fractal shapes. Now that the field has expanded greatly with many active researchers, Mandelbrot presents the definitive overview of the origins of his ideas and their new applications. The Fractal Geometry of Nature is based on his highly acclaimed earlier work, but has much broader and deeper coverage and more extensive illustrations.

Machine Learning


Tom M. Mitchell - 1997
    Mitchell covers the field of machine learning, the study of algorithms that allow computer programs to automatically improve through experience and that automatically infer general laws from specific data.

Python for Data Analysis


Wes McKinney - 2011
    Python for Data Analysis is a practical, modern introduction to scientific computing in Python, tailored for data-intensive applications. This is a book about the parts of the Python language and libraries you'll need to effectively solve a broad set of data analysis problems; it is not an exposition on analytical methods using Python as the implementation language. Written by Wes McKinney, the main author of the pandas library, this hands-on book is packed with practical case studies. It's ideal for analysts new to Python and for Python programmers new to scientific computing. The book shows you how to:
    - Use the IPython interactive shell as your primary development environment
    - Learn basic and advanced NumPy (Numerical Python) features
    - Get started with data analysis tools in the pandas library
    - Use high-performance tools to load, clean, transform, merge, and reshape data
    - Create scatter plots and static or interactive visualizations with matplotlib
    - Apply the pandas groupby facility to slice, dice, and summarize datasets (see the sketch after this list)
    - Measure data by points in time, whether it's specific instances, fixed periods, or intervals
    - Learn how to solve problems in web analytics, social sciences, finance, and economics, through detailed examples
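
    As a hedged illustration of the groupby facility mentioned above, here is a minimal pandas sketch; the DataFrame contents are invented for the example, not drawn from the book.

        import pandas as pd

        df = pd.DataFrame({
            "city": ["NYC", "NYC", "SF", "SF", "SF"],
            "month": [1, 2, 1, 1, 2],
            "sales": [120, 98, 75, 80, 110],
        })

        # Split-apply-combine: split rows by city, then aggregate each group.
        summary = df.groupby("city")["sales"].agg(["sum", "mean"])
        print(summary)

    The same pattern scales from toy tables like this one to the large datasets the book works through.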

Machine Learning: Fundamental Algorithms for Supervised and Unsupervised Learning With Real-World Applications


Joshua Chapmann - 2017
    Machine Learning is a branch of computer science that aims to stop programming computers with detailed lists of commands to follow blindly, and instead to implement high-level routines that teach computers how to approach new and unknown problems: algorithms. In practice, the goal is to give computers the ability to learn and to adapt. We can use these algorithms to obtain insights, recognize patterns, and make predictions from data, images, sounds, or videos we have never seen before, or even knew existed. Unfortunately, the true power and applications of today's machine learning algorithms remain deeply misunderstood by most people. Through this book I want to fix this confusion and shed light on the most relevant machine learning algorithms used in industry. I will show you exactly how each algorithm works, why it works, and when you should use it. Topics covered:
    - Supervised learning algorithms: K-Nearest Neighbour (a sketch follows this list), Naïve Bayes, regressions
    - Unsupervised learning algorithms: support vector machines, neural networks, decision trees
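
    For a concrete sense of the first algorithm on that list, here is a minimal k-nearest-neighbour classifier; this sketch and its toy data are mine, not the book's.

        import numpy as np

        def knn_predict(X_train, y_train, x, k=3):
            """Classify x by majority vote among its k nearest training points."""
            dists = np.linalg.norm(X_train - x, axis=1)  # Euclidean distance to each point
            nearest = np.argsort(dists)[:k]              # indices of the k closest points
            labels, counts = np.unique(y_train[nearest], return_counts=True)
            return labels[np.argmax(counts)]

        X = np.array([[1, 1], [1, 2], [6, 6], [7, 6]])
        y = np.array([0, 0, 1, 1])
        print(knn_predict(X, y, np.array([2, 1])))  # -> 0
        print(knn_predict(X, y, np.array([6, 7])))  # -> 1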

The Golden Ticket: P, NP, and the Search for the Impossible


Lance Fortnow - 2013
    Simply stated, the P-NP problem asks whether every problem whose solution can be quickly checked by computer can also be quickly solved by computer. The Golden Ticket provides a nontechnical introduction to P-NP, its rich history, and its algorithmic implications for everything we do with computers and beyond. Lance Fortnow traces the history and development of P-NP, giving examples from a variety of disciplines, including economics, physics, and biology. He explores problems that capture the full difficulty of the P-NP dilemma, from discovering the shortest route through all the rides at Disney World to finding large groups of friends on Facebook. The Golden Ticket explores what we truly can and cannot achieve computationally, describing the benefits and unexpected challenges of this compelling problem.
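
    The check/solve asymmetry at the heart of P-NP can be seen in a few lines of Python. This sketch uses subset sum, a standard NP-complete problem (the example is mine, not from the book): verifying a proposed subset is fast, while the only general solver shown tries exponentially many candidates.

        from itertools import combinations

        def verify(nums, subset, target):
            """Quickly checked: confirm the subset comes from nums and hits target."""
            return sum(subset) == target and all(s in nums for s in subset)

        def solve(nums, target):
            """Quickly solved? Brute force tries every subset: exponential in len(nums)."""
            for r in range(len(nums) + 1):
                for subset in combinations(nums, r):
                    if sum(subset) == target:
                        return subset
            return None

        nums = [3, 9, 8, 4, 5, 7]
        print(verify(nums, (8, 7), 15))  # fast to check: True
        print(solve(nums, 15))           # slow in general: (8, 7)

    P-NP asks whether the gap between these two functions is fundamental or merely an artifact of our ignorance.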

The Systems Bible: The Beginner's Guide to Systems Large and Small: Being the Third Edition of Systemantics


John Gall - 1977
    Hardcover published by Quadrangle/The New York Times Book Co., third printing, August 1977, copyright 1975.

Machine Learning: A Probabilistic Perspective


Kevin P. Murphy - 2012
    Machine learning provides automated methods that can detect patterns in data and then use the uncovered patterns to predict future data. This textbook offers a comprehensive and self-contained introduction to the field of machine learning, based on a unified, probabilistic approach. The coverage combines breadth and depth, offering necessary background material on such topics as probability, optimization, and linear algebra, as well as discussion of recent developments in the field, including conditional random fields, L1 regularization, and deep learning. The book is written in an informal, accessible style, complete with pseudo-code for the most important algorithms. All topics are copiously illustrated with color images and worked examples drawn from such application domains as biology, text processing, computer vision, and robotics. Rather than providing a cookbook of different heuristic methods, the book stresses a principled model-based approach, often using the language of graphical models to specify models in a concise and intuitive way. Almost all the models described have been implemented in a MATLAB software package, PMTK (probabilistic modeling toolkit), which is freely available online. The book is suitable for upper-level undergraduates with an introductory-level college math background and for beginning graduate students.
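
    As a small illustration of the probabilistic viewpoint (not code from the book or from PMTK), here is a Gaussian naive Bayes classifier in Python: fit a simple model of each class, then predict with Bayes' rule. The toy data are invented.

        import numpy as np

        X = np.array([[1.0, 2.0], [1.2, 1.8], [6.0, 6.5], [5.8, 6.1]])
        y = np.array([0, 0, 1, 1])

        classes = np.unique(y)
        priors = {c: np.mean(y == c) for c in classes}
        means = {c: X[y == c].mean(axis=0) for c in classes}
        stds = {c: X[y == c].std(axis=0) + 1e-6 for c in classes}  # guard against zero variance

        def log_gaussian(x, mu, sigma):
            # Log density of independent Gaussians, one per feature.
            return -0.5 * np.sum(np.log(2 * np.pi * sigma**2) + ((x - mu) / sigma) ** 2)

        def predict(x):
            # argmax over classes of log p(c) + log p(x | c).
            scores = {c: np.log(priors[c]) + log_gaussian(x, means[c], stds[c])
                      for c in classes}
            return max(scores, key=scores.get)

        print(predict(np.array([1.1, 2.1])))  # -> 0
        print(predict(np.array([6.2, 6.0])))  # -> 1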

Computer Science: An Overview


J. Glenn Brookshear - 1985
    This book presents an introductory survey of computer science. It explores the breadth of the subject while including enough depth to convey an honest appreciation for the topics involved. The new edition includes reorganization of some key material for enhanced clarity (the Software Engineering and Artificial Intelligence chapters), new and expanded material on Security and Data Abstractions, and more on ethics and different ethical theories in Chapter 0. For anyone interested in gaining a thorough introduction to Computer Science.

Show Stopper!: The Breakneck Race to Create Windows NT and the Next Generation at Microsoft


G. Pascal Zachary - 1994
    Describes the five-year, $150 million project Microsoft undertook to develop an advanced PC operating system.

Hadoop: The Definitive Guide


Tom White - 2009
    Ideal for processing large datasets, the Apache Hadoop framework is an open source implementation of the MapReduce algorithm on which Google built its empire. This comprehensive resource demonstrates how to use Hadoop to build reliable, scalable, distributed systems: programmers will find details for analyzing large datasets, and administrators will learn how to set up and run Hadoop clusters. Complete with case studies that illustrate how Hadoop solves specific problems, this book helps you:
    - Use the Hadoop Distributed File System (HDFS) for storing large datasets, and run distributed computations over those datasets using MapReduce (a minimal sketch of the model follows this list)
    - Become familiar with Hadoop's data and I/O building blocks for compression, data integrity, serialization, and persistence
    - Discover common pitfalls and advanced features for writing real-world MapReduce programs
    - Design, build, and administer a dedicated Hadoop cluster, or run Hadoop in the cloud
    - Use Pig, a high-level query language for large-scale data processing
    - Take advantage of HBase, Hadoop's database for structured and semi-structured data
    - Learn ZooKeeper, a toolkit of coordination primitives for building distributed systems
    If you have lots of data, whether it's gigabytes or petabytes, Hadoop is the perfect solution. Hadoop: The Definitive Guide is the most thorough book available on the subject. "Now you have the opportunity to learn about Hadoop from a master, not only of the technology, but also of common sense and plain talk." -- Doug Cutting, Hadoop Founder, Yahoo!
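
    To make the programming model concrete, here is a minimal single-process simulation of MapReduce in Python: map emits (key, value) pairs, a shuffle groups them by key, and reduce aggregates each group. It illustrates the dataflow only, not Hadoop's actual API; the input lines are invented.

        from collections import defaultdict

        def mapper(line):
            for word in line.split():
                yield word.lower(), 1      # emit (word, 1) per occurrence

        def reducer(word, counts):
            return word, sum(counts)       # sum all counts for one key

        lines = ["the quick brown fox", "the lazy dog", "the fox"]

        # Shuffle phase: group intermediate values by key.
        groups = defaultdict(list)
        for line in lines:
            for key, value in mapper(line):
                groups[key].append(value)

        results = sorted(reducer(w, c) for w, c in groups.items())
        print(results)  # [('brown', 1), ('dog', 1), ('fox', 2), ('lazy', 1), ('quick', 1), ('the', 3)]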

Tubes: A Journey to the Center of the Internet


Andrew Blum - 2012
    But what is the Internet, physically? And where is it really? Our mental map of the network is as blank as the map of the ocean that Columbus carried on his first Atlantic voyage. The Internet, its material nuts and bolts, is an unexplored territory. Until now. In Tubes, journalist Andrew Blum goes inside the Internet's physical infrastructure and flips on the lights, revealing an utterly fresh look at the online world we think we know. It is a shockingly tactile realm of unmarked compounds, populated by a special caste of engineer who pieces together our networks by hand; where glass fibers pulse with light and creaky telegraph buildings, tortuously rewired, become communication hubs once again. From the room in Los Angeles where the Internet first flickered to life to the caverns beneath Manhattan where new fiber-optic cable is buried; from the coast of Portugal, where a ten-thousand-mile undersea cable just two thumbs wide connects Europe and Africa, to the wilds of the Pacific Northwest, where Google, Microsoft, and Facebook have built monumental data centers, Blum chronicles the dramatic story of the Internet's development, explains how it all works, and takes the first-ever in-depth look inside its hidden monuments. This is a book about real places on the map: their sounds and smells, their storied pasts, their physical details, and the people who live there. For all the talk of the "placelessness" of our digital age, the Internet is as fixed in real, physical spaces as the railroad or telephone. You can map it and touch it, and you can visit it. Is the Internet in fact "a series of tubes," as Ted Stevens, the late senator from Alaska, once famously described it? How can we know the Internet's possibilities if we don't know its parts? Like Tracy Kidder's classic The Soul of a New Machine or Tom Vanderbilt's recent bestseller Traffic, Tubes combines on-the-ground reporting and lucid explanation into an engaging, mind-bending narrative to help us understand the physical world that underlies our digital lives.