The Master Switch: The Rise and Fall of Information Empires


Tim Wu - 2010
    With all our media now traveling a single network, an unprecedented potential is building for centralized control over what Americans see and hear. Could history repeat itself with the next industrial consolidation? Could the Internet—the entire flow of American information—come to be ruled by one corporate leviathan in possession of “the master switch”? That is the big question of Tim Wu’s pathbreaking book. As Wu’s sweeping history shows, each of the new media of the twentieth century—radio, telephone, television, and film—was born free and open. Each invited unrestricted use and enterprising experiment until some would-be mogul battled his way to total domination. Here are stories of an uncommon will to power, the power over information: Adolph Zukor, who took a technology once used as commonly as YouTube is today and made it the exclusive prerogative of a kingdom called Hollywood . . . NBC’s founder, David Sarnoff, who, to save his broadcast empire from disruptive visionaries, bullied one inventor (of electronic television) into alcoholic despair and another (this one of FM radio, and his boyhood friend) into suicide . . . And foremost, Theodore Vail, founder of the Bell System, the greatest information empire of all time, and a capitalist whose faith in Soviet-style central planning set the course of every information industry thereafter. Explaining how invention begets industry and industry begets empire—a progress often blessed by government, typically with stifling consequences for free expression and technical innovation alike—Wu identifies a time-honored pattern in the maneuvers of today’s great information powers: Apple, Google, and an eerily resurgent AT&T. A battle royal looms for the Internet’s future, and with almost every aspect of our lives now dependent on that network, this is one war we dare not tune out. Part industrial exposé, part meditation on what freedom requires in the information age, The Master Switch is a stirring illumination of a drama that has played out over decades in the shadows of our national life and now culminates with terrifying implications for our future.

Physics in Mind: A Quantum View of the Brain


Werner R. Loewenstein - 2013
    But what is the mind? What do we mean when we say we are "aware" of something? What is this peculiar state in our heads, at once utterly familiar and bewilderingly mysterious, that we call awareness or consciousness? In Physics in Mind, eminent biophysicist Werner R. Loewenstein argues that to answer these questions, we must first understand the physical mechanisms that underlie the workings of the mind. And so begins an exhilarating journey along the sensory data stream of the brain, which shows how our most complex organ processes the vast amounts of information coming in through our senses to create a coherent, meaningful picture of the world. Bringing information theory to bear on recent advances in the neurosciences, Loewenstein reveals a web of immense computational power inside the brain. He introduces the revolutionary idea that quantum mechanics could be fundamental to how our minds almost instantaneously deal with staggering amounts of information, as in the case of the information streaming through our eyes. Combining cutting-edge research in neuroscience and physics, Loewenstein presents an ambitious hypothesis about the parallel processing of sensory information that is the heart, hub, and pivot of the cognitive brain. Wide-ranging and brimming with insight, Physics in Mind breaks new ground in our understanding of how the mind works.

How to Prove It: A Structured Approach


Daniel J. Velleman - 1994
    The book begins with the basic concepts of logic and set theory, to familiarize students with the language of mathematics and how it is interpreted. These concepts are used as the basis for a step-by-step breakdown of the most important techniques used in constructing proofs. To help students construct their own proofs, this new edition contains over 200 new exercises, selected solutions, and an introduction to Proof Designer software. No background beyond standard high school mathematics is assumed.
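    As a taste of the kind of technique the book systematizes, here is a small, self-contained LaTeX example of the standard pattern for proving a conditional statement (assume the hypothesis, derive the conclusion). The particular theorem is an illustrative choice, not necessarily an example drawn from the book.

```latex
% A minimal, compilable example of a direct proof of a conditional statement:
% assume the hypothesis, then derive the conclusion. Illustrative only.
\documentclass{article}
\usepackage{amsmath,amsthm}
\newtheorem{theorem}{Theorem}
\begin{document}
\begin{theorem}
If $n$ is an odd integer, then $n^2$ is odd.
\end{theorem}
\begin{proof}
Suppose $n$ is odd, so $n = 2k + 1$ for some integer $k$. Then
\[ n^2 = (2k+1)^2 = 4k^2 + 4k + 1 = 2(2k^2 + 2k) + 1, \]
which has the form $2m + 1$ with $m = 2k^2 + 2k$ an integer. Hence $n^2$ is odd.
\end{proof}
\end{document}
```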

Natural Language Processing with Python


Steven Bird - 2009
    With this book, you'll learn how to write Python programs that work with large collections of unstructured text. You'll access richly annotated datasets using a comprehensive range of linguistic data structures, and you'll understand the main algorithms for analyzing the content and structure of written communication. Packed with examples and exercises, Natural Language Processing with Python will help you extract information from unstructured text, either to guess the topic or identify "named entities"; analyze linguistic structure in text, including parsing and semantic analysis; access popular linguistic databases, including WordNet and treebanks; and integrate techniques drawn from fields as diverse as linguistics and artificial intelligence. This book will help you gain practical skills in natural language processing using the Python programming language and the Natural Language Toolkit (NLTK) open source library. If you're interested in developing web applications, analyzing multilingual news sources, or documenting endangered languages -- or if you're simply curious to have a programmer's perspective on how human language works -- you'll find Natural Language Processing with Python both fascinating and immensely useful.
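    As a taste of the kind of task the book covers, here is a minimal sketch using NLTK's tokenizer, part-of-speech tagger, named-entity chunker, and WordNet interface. The example sentence is made up, and the names of the downloadable data packages may vary slightly across NLTK versions.

```python
# Minimal NLTK sketch: tokenize, tag, chunk named entities, and query WordNet.
# The sentence is an illustrative assumption, not an example from the book.
import nltk
from nltk.corpus import wordnet

# One-time downloads of the models and corpora this sketch relies on
# (package names may differ slightly between NLTK releases).
for pkg in ["punkt", "averaged_perceptron_tagger", "maxent_ne_chunker", "words", "wordnet"]:
    nltk.download(pkg, quiet=True)

sentence = "Mark visited the Natural History Museum in London last Tuesday."
tokens = nltk.word_tokenize(sentence)   # split raw text into word tokens
tagged = nltk.pos_tag(tokens)           # attach part-of-speech tags
tree = nltk.ne_chunk(tagged)            # group tagged tokens into named-entity chunks
print(tree)

# Look up senses of a word in WordNet.
for synset in wordnet.synsets("museum"):
    print(synset.name(), "-", synset.definition())
```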

Structures: Or Why Things Don't Fall Down


J.E. Gordon - 1978
    Gordon strips engineering of its confusing technical terms, communicating its founding principles in accessible, witty prose. For anyone who has ever wondered why suspension bridges don't collapse under eight lanes of traffic, how dams hold back--or give way under--thousands of gallons of water, or what principles guide the design of a skyscraper, a bias-cut dress, or a kangaroo, this book will ease your anxiety and answer your questions. Structures: Or Why Things Don't Fall Down is an informal explanation of the basic forces that hold together the ordinary and essential things of this world--from buildings and bodies to flying aircraft and eggshells. In a style that combines wit, a masterful command of his subject, and an encyclopedic range of reference, Gordon includes such chapters as "How to Design a Worm" and "The Advantage of Being a Beam," offering humorous insights into human and natural creation. Architects and engineers will appreciate the clear and cogent explanations of the concepts of stress, shear, torsion, fracture, and compression. If you're building a house, a sailboat, or a catapult, here is a handy tool for understanding the mechanics of joinery, floors, ceilings, hulls, masts--or flying buttresses. Without jargon or oversimplification, Structures opens up the marvels of technology to anyone interested in the foundations of our everyday lives.

Introduction to Topology


Bert Mendelson - 1975
    It provides a simple, thorough survey of elementary topics, starting with set theory and advancing to metric and topological spaces, connectedness, and compactness. 1975 edition.

Fact or Fiction: Science Tackles 58 Popular Myths


Scientific American - 2013
    Drawing from Scientific American's "Fact or Fiction" and "Strange But True" columns, we've selected fifty-eight of the most surprising, fascinating, useful, and just plain wacky topics confronted by our writers over the years.

Probabilistic Graphical Models: Principles and Techniques


Daphne Koller - 2009
    The framework of probabilistic graphical models, presented in this book, provides a general approach for this task. The approach is model-based, allowing interpretable models to be constructed and then manipulated by reasoning algorithms. These models can also be learned automatically from data, allowing the approach to be used in cases where manually constructing a model is difficult or even impossible. Because uncertainty is an inescapable aspect of most real-world applications, the book focuses on probabilistic models, which make the uncertainty explicit and provide models that are more faithful to reality. Probabilistic Graphical Models discusses a variety of models, spanning Bayesian networks, undirected Markov networks, discrete and continuous models, and extensions to deal with dynamical systems and relational data. For each class of models, the text describes the three fundamental cornerstones: representation, inference, and learning, presenting both basic concepts and advanced techniques. Finally, the book considers the use of the proposed framework for causal reasoning and decision making under uncertainty. The main text in each chapter provides the detailed technical development of the key ideas. Most chapters also include boxes with additional material: skill boxes, which describe techniques; case study boxes, which discuss empirical cases related to the approach described in the text, including applications in computer vision, robotics, natural language understanding, and computational biology; and concept boxes, which present significant concepts drawn from the material in the chapter. Instructors (and readers) can group chapters in various combinations, from core topics to more technically advanced material, to suit their particular needs.
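    To make the representation/inference distinction concrete, here is a small self-contained sketch (not code from the book, which is not tied to any particular library): a three-node discrete Bayesian network whose posterior is computed by brute-force enumeration.

```python
# A toy discrete Bayesian network (Rain -> WetGrass <- Sprinkler) with exact
# inference by enumeration. Purely illustrative; the numbers are made up.

# Representation: conditional probability tables for each node.
P_rain = {True: 0.2, False: 0.8}
P_sprinkler = {True: 0.3, False: 0.7}
P_wet = {  # P(WetGrass=True | Rain, Sprinkler)
    (True, True): 0.99, (True, False): 0.9,
    (False, True): 0.8, (False, False): 0.05,
}

def joint(rain, sprinkler, wet):
    """Joint probability factorized according to the graph structure."""
    p_w = P_wet[(rain, sprinkler)] if wet else 1 - P_wet[(rain, sprinkler)]
    return P_rain[rain] * P_sprinkler[sprinkler] * p_w

# Inference: P(Rain | WetGrass=True) by summing out Sprinkler and normalizing.
unnormalized = {r: sum(joint(r, s, True) for s in (True, False)) for r in (True, False)}
z = sum(unnormalized.values())
posterior = {r: p / z for r, p in unnormalized.items()}
print("P(Rain | WetGrass=True) =", posterior)
```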

Data Science


John D. Kelleher - 2018
    Today data science determines the ads we see online, the books and movies that are recommended to us online, which emails are filtered into our spam folders, and even how much we pay for health insurance. This volume in the MIT Press Essential Knowledge series offers a concise introduction to the emerging field of data science, explaining its evolution, current uses, data infrastructure issues, and ethical challenges. It has never been easier for organizations to gather, store, and process data. Use of data science is driven by the rise of big data and social media, the development of high-performance computing, and the emergence of such powerful methods for data analysis and modeling as deep learning. Data science encompasses a set of principles, problem definitions, algorithms, and processes for extracting non-obvious and useful patterns from large datasets. It is closely related to the fields of data mining and machine learning, but broader in scope. This book offers a brief history of the field, introduces fundamental data concepts, and describes the stages in a data science project. It considers data infrastructure and the challenges posed by integrating data from multiple sources, introduces the basics of machine learning, and discusses how to link machine learning expertise with real-world problems. The book also reviews ethical and legal issues, developments in data regulation, and computational approaches to preserving privacy. Finally, it considers the future impact of data science and offers principles for success in data science projects.

Real World OCaml: Functional programming for the masses


Yaron Minsky - 2013
    Through the book’s many examples, you’ll quickly learn how OCaml stands out as a tool for writing fast, succinct, and readable systems code. Real World OCaml takes you through the concepts of the language at a brisk pace, and then helps you explore the tools and techniques that make OCaml an effective and practical tool. In the book’s third section, you’ll delve deep into the details of the compiler toolchain and OCaml’s simple and efficient runtime system. The book shows you how to: learn the foundations of the language, such as higher-order functions, algebraic data types, and modules; explore advanced features such as functors, first-class modules, and objects; leverage Core, a comprehensive general-purpose standard library for OCaml; design effective and reusable libraries, making the most of OCaml’s approach to abstraction and modularity; tackle practical programming problems from command-line parsing to asynchronous network programming; and examine profiling and interactive debugging techniques with tools such as GNU gdb.

Making Games with Python & Pygame


Al Sweigart - 2012
    Each chapter gives you the complete source code for a new game and teaches the programming concepts from these examples. The book is available under a Creative Commons license and can be downloaded in full for free from http://inventwithpython.com/pygame. This book was written to be understandable by kids as young as 10 to 12 years old, although it is great for anyone of any age who has some familiarity with Python.
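    For readers who have never seen Pygame, a minimal event loop (window, clock, drawing, quit handling) looks roughly like the sketch below. It is a generic skeleton, not one of the book's games.

```python
# Minimal Pygame skeleton: open a window and run the event/draw loop until
# the user closes it. Generic illustration, not a game from the book.
import pygame

pygame.init()
screen = pygame.display.set_mode((400, 300))
pygame.display.set_caption("Hello Pygame")
clock = pygame.time.Clock()

running = True
while running:
    for event in pygame.event.get():        # handle window events
        if event.type == pygame.QUIT:       # user clicked the close button
            running = False
    screen.fill((0, 0, 0))                  # clear the frame
    pygame.draw.circle(screen, (255, 255, 255), (200, 150), 40)
    pygame.display.flip()                   # show the finished frame
    clock.tick(30)                          # cap at 30 frames per second

pygame.quit()
```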

Pattern Classification


David G. Stork - 1973
    Now with the second edition, readers will find information on key new topics such as neural networks and statistical pattern recognition, the theory of machine learning, and the theory of invariances. Also included are worked examples, comparisons between different methods, extensive graphics, expanded exercises and computer project topics. An Instructor's Manual presenting detailed solutions to all the problems in the book is available from the Wiley editorial department.

Pattern Recognition and Machine Learning


Christopher M. Bishop - 2006
    Pattern recognition and machine learning can be viewed as two facets of the same field, and together they have undergone substantial development over the past ten years. In particular, Bayesian methods have grown from a specialist niche to become mainstream, while graphical models have emerged as a general framework for describing and applying probabilistic models. Also, the practical applicability of Bayesian methods has been greatly enhanced through the development of a range of approximate inference algorithms such as variational Bayes and expectation propagation. Similarly, new models based on kernels have had a significant impact on both algorithms and applications. This new textbook reflects these recent developments while providing a comprehensive introduction to the fields of pattern recognition and machine learning. It is aimed at advanced undergraduates or first-year PhD students, as well as researchers and practitioners, and assumes no previous knowledge of pattern recognition or machine learning concepts. Knowledge of multivariate calculus and basic linear algebra is required, and some familiarity with probabilities would be helpful though not essential as the book includes a self-contained introduction to basic probability theory.

The Wizard and the Prophet: Two Remarkable Scientists and Their Dueling Visions to Shape Tomorrow's World


Charles C. Mann - 2018
    Can our world support a human population headed toward ten billion? What kind of world will it be? Those answering these questions generally fall into two deeply divided groups--Wizards and Prophets, as Charles Mann calls them in this balanced, authoritative, nonpolemical new book. The Prophets, he explains, follow William Vogt, a founding environmentalist who believed that in using more than our planet has to give, our prosperity will lead us to ruin. Cut back! was his mantra. Otherwise everyone will lose! The Wizards are the heirs of Norman Borlaug, whose research, in effect, wrangled the world in service to our species to produce modern high-yield crops that then saved millions from starvation. Innovate! was Borlaug's cry. Only in that way can everyone win! Mann delves into these diverging viewpoints to assess the four great challenges humanity faces--food, water, energy, climate change--grounding each in historical context and weighing the options for the future. With our civilization on the line, the author's insightful analysis is an essential addition to the urgent conversation about how our children will fare on an increasingly crowded Earth.

Reinforcement Learning: An Introduction


Richard S. Sutton - 1998
    Reinforcement learning, one of the most active research areas in artificial intelligence, is a computational approach to learning whereby an agent tries to maximize the total amount of reward it receives when interacting with a complex, uncertain environment. In Reinforcement Learning, Richard Sutton and Andrew Barto provide a clear and simple account of the key ideas and algorithms of reinforcement learning. Their discussion ranges from the history of the field's intellectual foundations to the most recent developments and applications. The only necessary mathematical background is familiarity with elementary concepts of probability. The book is divided into three parts. Part I defines the reinforcement learning problem in terms of Markov decision processes. Part II provides basic solution methods: dynamic programming, Monte Carlo methods, and temporal-difference learning. Part III presents a unified view of the solution methods and incorporates artificial neural networks, eligibility traces, and planning; the two final chapters present case studies and consider the future of reinforcement learning.
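    To give a flavor of the temporal-difference methods in Part II, here is a tiny tabular Q-learning sketch on a made-up corridor task; the environment, parameters, and code are illustrative assumptions, not examples taken from the book.

```python
# Tabular Q-learning on a tiny corridor MDP: states 0..4, actions left/right,
# reward +1 for reaching state 4. Illustrative only; not from the book.
import random

N_STATES, ACTIONS = 5, (-1, +1)          # -1 = step left, +1 = step right
alpha, gamma, epsilon = 0.1, 0.9, 0.1    # step size, discount, exploration rate
Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

def step(state, action):
    """One environment transition: returns (next_state, reward, done)."""
    nxt = min(max(state + action, 0), N_STATES - 1)
    if nxt == N_STATES - 1:
        return nxt, 1.0, True
    return nxt, 0.0, False

for episode in range(500):
    state, done = 0, False
    while not done:
        # epsilon-greedy action selection with random tie-breaking
        if random.random() < epsilon:
            action = random.choice(ACTIONS)
        else:
            best = max(Q[(state, a)] for a in ACTIONS)
            action = random.choice([a for a in ACTIONS if Q[(state, a)] == best])
        nxt, reward, done = step(state, action)
        # Q-learning (temporal-difference) update toward the bootstrapped target
        target = reward + (0.0 if done else gamma * max(Q[(nxt, a)] for a in ACTIONS))
        Q[(state, action)] += alpha * (target - Q[(state, action)])
        state = nxt

# Print the learned greedy action for each non-terminal state (should all be +1).
print({s: max(ACTIONS, key=lambda a: Q[(s, a)]) for s in range(N_STATES - 1)})
```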