Big Data: A Revolution That Will Transform How We Live, Work, and Think


Viktor Mayer-Schönberger and Kenneth Cukier - 2013
    “Big data” refers to our burgeoning ability to crunch vast collections of information, analyze it instantly, and draw sometimes profoundly surprising conclusions from it. This emerging science can translate myriad phenomena—from the price of airline tickets to the text of millions of books—into searchable form, and uses our increasing computing power to unearth epiphanies that we never could have seen before. A revolution on par with the Internet or perhaps even the printing press, big data will change the way we think about business, health, politics, education, and innovation in the years to come. It also poses fresh threats, from the inevitable end of privacy as we know it to the prospect of being penalized for things we haven’t even done yet, based on big data’s ability to predict our future behavior. In this brilliantly clear, often surprising work, two leading experts explain what big data is, how it will change our lives, and what we can do to protect ourselves from its hazards. Big Data is the first big book about the next big thing. www.big-data-book.com

Brilliant Blunders: From Darwin to Einstein - Colossal Mistakes by Great Scientists That Changed Our Understanding of Life and the Universe


Mario Livio - 2013
    Nobody is perfect. And that includes five of the greatest scientists in history—Charles Darwin, William Thomson (Lord Kelvin), Linus Pauling, Fred Hoyle, and Albert Einstein. But the mistakes that these great luminaries made helped advance science. Indeed, as Mario Livio explains, science thrives on error, advancing when erroneous ideas are disproven. As a young scientist, Einstein tried to conceive of a way to describe the evolution of the universe at large, based on General Relativity—his theory of space, time, and gravity. Unfortunately, he fell victim to a misguided notion of aesthetic simplicity. Fred Hoyle was an eminent astrophysicist who ridiculed an emerging theory about the origin of the universe that he dismissively called “The Big Bang.” The name stuck, but Hoyle was dead wrong in his opposition. They, along with Darwin (a blunder in his theory of Natural Selection), Kelvin (a blunder in his calculation of the age of the earth), and Pauling (a blunder in his model for the structure of the DNA molecule), were brilliant men and fascinating human beings. Their blunders were a necessary part of the scientific process. Collectively they helped to dramatically further our knowledge of the evolution of life, the Earth, and the universe.

Why Information Grows: The Evolution of Order, from Atoms to Economies


Cesar A. Hidalgo - 2015
    Hidalgo believes that we should investigate what makes some countries more capable than others. Complex products—from films to robots, apps to automobiles—are a physical distillation of an economy’s knowledge, a measurable embodiment of its education, infrastructure, and capability. Economic wealth accrues when applications of this knowledge turn ideas into tangible products; the more complex its products, the more economic growth a country will experience. A radical new interpretation of global economics, Why Information Grows overturns traditional assumptions about the development of economies and the origins of wealth and takes a crucial step toward making economics less the dismal science and more the insightful one.

You Are Not a Gadget


Jaron Lanier - 2010
    Now, in his first book, written more than two decades after the web was created, Lanier offers this provocative and cautionary look at the way it is transforming our lives for better and for worse. The current design and function of the web have become so familiar that it is easy to forget that they grew out of programming decisions made decades ago. The web’s first designers made crucial choices (such as making one’s presence anonymous) that have had enormous—and often unintended—consequences. What’s more, these designs quickly became “locked in,” a permanent part of the web’s very structure. Lanier discusses the technical and cultural problems that can grow out of poorly considered digital design and warns that our financial markets and sites like Wikipedia, Facebook, and Twitter are elevating the “wisdom” of mobs and computer algorithms over the intelligence and judgment of individuals. Lanier also shows how 1960s antigovernment paranoia influenced the design of the online world and enabled trolling and trivialization in online discourse; how file sharing is killing the artistic middle class; how a belief in a technological “rapture” motivates some of the most influential technologists; and why a new humanistic technology is necessary. Controversial and fascinating, You Are Not a Gadget is a deeply felt defense of the individual from an author uniquely qualified to comment on the way technology interacts with our culture.

Free as in Freedom: Richard Stallman's Crusade for Free Software


Sam Williams - 2002
    The book examines Stallman's unique personality and how that personality has been at turns a driving force and a drawback in terms of the movement's overall success. Free as in Freedom examines one man's 20-year attempt to codify and communicate the ethics of 1970s-era "hacking" culture in such a way that later generations might easily share and build upon the knowledge of their computing forebears. It documents Stallman's personal evolution from teenage misfit to prescient adult hacker to political leader and examines how that evolution has shaped the free software movement. Like Alan Greenspan in the financial sector, Richard Stallman has assumed the role of tribal elder within the hacking community, a community that bills itself as anarchic and averse to central leadership or authority. How did this paradox come about? Free as in Freedom provides an answer. It also looks at how the latest twists and turns in the software marketplace have diminished Stallman's leadership role in some areas while augmenting it in others. Finally, Free as in Freedom examines both Stallman and the free software movement from a historical viewpoint. Will future generations see Stallman as a genius or a crackpot? The answer to that question depends partly on which side of the free software debate the reader currently stands and partly upon the reader's own outlook for the future. A hundred years from now, when terms such as "computer," "operating system," and perhaps even "software" itself seem hopelessly quaint, will Richard Stallman's particular vision of freedom still resonate, or will it have taken its place alongside other utopian concepts on the 'ash-heap of history'?

Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy


Cathy O'Neil - 2016
    Increasingly, the decisions that affect our lives--where we go to school, whether we can get a job or a loan, how much we pay for health insurance--are being made not by humans, but by machines. In theory, this should lead to greater fairness: Everyone is judged according to the same rules. But as mathematician and data scientist Cathy O'Neil reveals, the mathematical models being used today are unregulated and uncontestable, even when they're wrong. Most troubling, they reinforce discrimination--propping up the lucky, punishing the downtrodden, and undermining our democracy in the process.

Alan Turing: Unlocking the Enigma


David Boyle - 2014
    Turing’s openness about his homosexuality at a time when it was an imprisonable offense ultimately led to his untimely death at the age of only forty-one. In Alan Turing: Unlocking the Enigma, David Boyle reveals the mysteries behind the man and his remarkable career. Aged just 22, Turing was elected a fellow at King's College, Cambridge on the strength of a dissertation in which he proved the central limit theorem. By the age of 33, he had been awarded the OBE by King George VI for his wartime services: Turing was instrumental in cracking the Nazi Enigma machines at the top secret code breaking establishment at Bletchley Park during the Second World War. But his achievements were to be tragically overshadowed by the paranoia of the post-War years. Hounded for his supposedly subversive views and for his sexuality, Turing was prosecuted in 1952 and forced to accept the humiliation of hormone treatment to avoid a prison sentence. Just two years later, at the age of 41, he was dead. The verdict: cyanide poisoning. Was Turing’s death accidental, as his mother always claimed? Or did persistent persecution drive him to take his own life? Alan Turing: Unlocking the Enigma seeks to find the man behind the science, illuminating the life of a person who is still a shadowy presence behind his brilliant achievements.

Grammatical Man: Information, Entropy, Language and Life


Jeremy Campbell - 1982
    The book describes how the laws and discoveries of information theory now support controversial revisions to Darwinian evolution, begin to unravel the mysteries of language, memory and dreams, and stimulate provocative ideas in psychology, philosophy, art, music, computers and even the structure of society. Perhaps its most fascinating suggestion is that order and complexity may be as natural as disorder and disorganization. Contrary to the entropy principle, which implies that order is the exception and confusion the rule, information theory asserts that order and sense can indeed prevail against disorder and nonsense. From the simplest forms of organic life to the words used to express our most complex ideas, from our genes to our dreams, from microcomputers to telecommunications, virtually everything around us follows simple rules of information. Life and the material world, like language, remain "grammatical." Grammatical man inhabits a grammatical universe.

Life After Google: The Fall of Big Data and the Rise of the Blockchain Economy


George Gilder - 2018
    “Nothing Mr. Gilder says or writes is ever delivered at anything less than the fullest philosophical decibel... Mr. Gilder sounds less like a tech guru than a poet, and his words tumble out in a romantic cascade.” “Google’s algorithms assume the world’s future is nothing more than the next moment in a random process. George Gilder shows how deep this assumption goes, what motivates people to make it, and why it’s wrong: the future depends on human action.” — Peter Thiel, founder of PayPal and Palantir Technologies and author of Zero to One: Notes on Startups, or How to Build the Future. The Age of Google, built on big data and machine intelligence, has been an awesome era. But it’s coming to an end. In Life after Google, George Gilder—the peerless visionary of technology and culture—explains why Silicon Valley is suffering a nervous breakdown and what to expect as the post-Google age dawns. Google’s astonishing ability to “search and sort” attracts the entire world to its search engine and countless other goodies—videos, maps, email, calendars… And everything it offers is free, or so it seems. Instead of paying directly, users submit to advertising. The system of “aggregate and advertise” works—for a while—if you control an empire of data centers, but a market without prices strangles entrepreneurship and turns the Internet into a wasteland of ads. The crisis is not just economic. Even as advances in artificial intelligence induce delusions of omnipotence and transcendence, Silicon Valley has pretty much given up on security. The Internet firewalls supposedly protecting all those passwords and personal information have proved hopelessly permeable. The crisis cannot be solved within the current computer and network architecture. The future lies with the “cryptocosm”—the new architecture of the blockchain and its derivatives. Enabling cryptocurrencies such as bitcoin and ether, NEO and Hashgraph, it will provide the Internet a secure global payments system, ending the aggregate-and-advertise Age of Google. Silicon Valley, long dominated by a few giants, faces a “great unbundling,” which will disperse computer power and commerce and transform the economy and the Internet. Life after Google is almost here. For fans of "Wealth and Poverty," "Knowledge and Power," and "The Scandal of Money."

Quantum Computing Since Democritus


Scott Aaronson - 2013
    Full of insights, arguments and philosophical perspectives, the book covers an amazing array of topics. Beginning in antiquity with Democritus, it progresses through logic and set theory, computability and complexity theory, quantum computing, cryptography, the information content of quantum states and the interpretation of quantum mechanics. There are also extended discussions about time travel, Newcomb's Paradox, the anthropic principle and the views of Roger Penrose. Aaronson's informal style makes this fascinating book accessible to readers with scientific backgrounds, as well as students and researchers working in physics, computer science, mathematics and philosophy.

What the Dormouse Said: How the Sixties Counterculture Shaped the Personal Computer Industry


John Markoff - 2005
    "Many accounts of the birth of personal computing have been written, but this is the first close look at the drug habits of the earliest pioneers." --New York Times. Most histories of the personal computer industry focus on technology or business. John Markoff's landmark book is about the culture and consciousness behind the first PCs--the culture being counter- and the consciousness expanded, sometimes chemically. It's a brilliant evocation of Stanford, California, in the 1960s and '70s, where a group of visionaries set out to turn computers into a means for freeing minds and information. In these pages one encounters Ken Kesey and the phone hacker Cap'n Crunch, est and LSD, The Whole Earth Catalog and the Homebrew Computer Club. What the Dormouse Said is a poignant, funny, and inspiring book by one of the smartest technology writers around.

The Mathematical Universe: An Alphabetical Journey Through the Great Proofs, Problems, and Personalities


William Dunham - 1994
    "...he believes these ideas to be accessible to the audience he wants to reach, and he writes so that they are." -- Nature. "If you want to encourage anyone's interest in math, get them The Mathematical Universe." -- New Scientist

A Mathematician's Apology


G.H. Hardy - 1940
    G. H. Hardy was one of this century's finest mathematical thinkers, renowned among his contemporaries as a 'real mathematician ... the purest of the pure'. He was also, as C. P. Snow recounts in his Foreword, 'unorthodox, eccentric, radical, ready to talk about anything'. This 'apology', written in 1940 as his mathematical powers were declining, offers a brilliant and engaging account of mathematics as very much more than a science; when it was first published, Graham Greene hailed it alongside Henry James's notebooks as 'the best account of what it was like to be a creative artist'. C. P. Snow's Foreword gives sympathetic and witty insights into Hardy's life, with its rich store of anecdotes concerning his collaboration with the brilliant Indian mathematician Ramanujan, his aphorisms and idiosyncrasies, and his passion for cricket. This is a unique account of the fascination of mathematics and of one of its most compelling exponents in modern times.

Being Digital


Nicholas Negroponte - 1995
    Negroponte's fans will want to get a copy of Being Digital, which is an edited version of the 18 articles he wrote for Wired about "being digital." Negroponte's text is mostly a history of media technology rather than a set of predictions for future technologies. In the beginning, he describes the evolution of CD-ROMs, multimedia, hypermedia, HDTV (high-definition television), and more. The section on interfaces is informative, offering an up-to-date history on visual interfaces, graphics, virtual reality (VR), holograms, teleconferencing hardware, the mouse and touch-sensitive interfaces, and speech recognition. In the last chapter and the epilogue, Negroponte offers visionary insight on what "being digital" means for our future. Negroponte praises computers for their educational value but recognizes certain dangers of technological advances, such as increased software and data piracy and huge shifts in our job market that will require workers to transfer their skills to the digital medium. Overall, Being Digital provides an informative history of the rise of technology and some interesting predictions for its future.

Cognitive Surplus: Creativity and Generosity in a Connected Age


Clay Shirky - 2010
    For decades, technology encouraged people to squander their time and intellect as passive consumers. Today, tech has finally caught up with human potential. In Cognitive Surplus, Internet guru Clay Shirky forecasts the thrilling changes we will all enjoy as new digital technology puts our untapped resources of talent and goodwill to use at last. Since we Americans were suburbanized and educated by the postwar boom, we've had a surfeit of intellect, energy, and time, what Shirky calls a cognitive surplus. But this abundance had little impact on the common good because television consumed the lion's share of it, and we consume TV passively, in isolation from one another. Now, for the first time, people are embracing new media that allow us to pool our efforts at vanishingly low cost. The results of this aggregated effort range from the mind-expanding, such as reference tools like Wikipedia, to the lifesaving, such as Ushahidi.com, which has allowed Kenyans to sidestep government censorship and report on acts of violence in real time. Shirky argues persuasively that this cognitive surplus, rather than being some strange new departure from normal behavior, actually returns our society to forms of collaboration that were natural to us up through the early twentieth century. He also charts the vast effects that our cognitive surplus, aided by new technologies, will have on twenty-first-century society, and how we can best exploit those effects. Shirky envisions an era of lower creative quality on average but greater innovation, an increase in transparency in all areas of society, and a dramatic rise in productivity that will transform our civilization. The potential impact of cognitive surplus is enormous. As Shirky points out, Wikipedia was built out of roughly 1 percent of the man-hours that Americans spend watching TV every year. Wikipedia and other current products of cognitive surplus are only the iceberg's tip. Shirky shows how society and our daily lives will be improved dramatically as we learn to exploit our goodwill and free time like never before.