Information: The New Language of Science


Hans Christian von Baeyer - 2003
    In this indispensable volume, a primer for the information age, Hans Christian von Baeyer presents a clear description of what information is, how concepts of its measurement, meaning, and transmission evolved, and what its ever-expanding presence portends for the future. Information is poised to replace matter as the primary stuff of the universe, von Baeyer suggests; it will provide a new basic framework for describing and predicting reality in the twenty-first century. Despite its revolutionary premise, the book is written in a simple, straightforward fashion, offering a wonderfully accessible introduction to classical and quantum information. Enlivened with anecdotes from the lives of philosophers, mathematicians, and scientists who have contributed significantly to the field, Information conducts readers from questions of subjectivity inherent in classical information to the blurring of distinctions between computers and what they measure or store in our quantum age. A great advance in our efforts to define and describe the nature of information, the book also marks an important step forward in our ability to exploit information--and, ultimately, to transform the nature of our relationship with the physical universe.

Algorithms to Live By: The Computer Science of Human Decisions


Brian Christian - 2016
    What should we do, or leave undone, in a day or a lifetime? How much messiness should we accept? What balance of new activities and familiar favorites is the most fulfilling? These may seem like uniquely human quandaries, but they are not: computers, too, face the same constraints, so computer scientists have been grappling with their version of such issues for decades. And the solutions they've found have much to teach us. In a dazzlingly interdisciplinary work, acclaimed author Brian Christian and cognitive scientist Tom Griffiths show how the algorithms used by computers can also untangle very human questions. They explain how to have better hunches and when to leave things to chance, how to deal with overwhelming choices and how best to connect with others. From finding a spouse to finding a parking spot, from organizing one's inbox to understanding the workings of memory, Algorithms to Live By transforms the wisdom of computer science into strategies for human living.
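
    The "finding a spouse" example points at the optimal-stopping (secretary) problem the book popularizes: look at roughly the first 37% (1/e) of options without committing, then take the first one that beats everything seen so far. A minimal simulation sketch of that rule (not code from the book; the candidate count and trial count here are illustrative):

```python
import math
import random

def secretary_trial(n: int) -> bool:
    """One trial: does the 37% rule pick the single best of n candidates?"""
    ranks = list(range(n))           # rank 0 is the best candidate
    random.shuffle(ranks)
    cutoff = int(n / math.e)         # observe the first ~37% without committing
    best_seen = min(ranks[:cutoff], default=n)
    for rank in ranks[cutoff:]:
        if rank < best_seen:         # first candidate better than all observed
            return rank == 0
    return ranks[-1] == 0            # rule never triggered: stuck with the last one

trials = 100_000
wins = sum(secretary_trial(100) for _ in range(trials))
print(f"Best candidate chosen in {wins / trials:.1%} of trials")  # theory: ~1/e, about 36.8%
```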

The Essential Turing: Seminal Writings in Computing, Logic, Philosophy, Artificial Intelligence, and Artificial Life Plus the Secrets of Enigma


Alan Turing - 2004
    In 1935, aged 22, Turing developed the mathematical theory upon which all subsequent stored-program digital computers are modeled. At the outbreak of hostilities with Germany in September 1939, he joined the Government Codebreaking team at Bletchley Park, Buckinghamshire, and played a crucial role in deciphering Enigma, the code used by the German armed forces to protect their radio communications. Turing's work on the version of Enigma used by the German navy was vital to the battle for supremacy in the North Atlantic. He also contributed to the attack on the cyphers known as Fish, which were used by the German High Command for the encryption of signals during the latter part of the war. His contribution helped to shorten the war in Europe by an estimated two years. After the war, his theoretical work led to the development of Britain's first computers at the National Physical Laboratory and the Royal Society Computing Machine Laboratory at Manchester University. Turing was also a founding father of modern cognitive science, theorizing that the cortex at birth is an unorganized machine which through training becomes organized into a universal machine or something like it. He went on to develop the use of computers to model biological growth, launching the discipline now referred to as Artificial Life. The papers in this book are the key works for understanding Turing's phenomenal contribution across all these fields. The collection includes Turing's declassified wartime Treatise on the Enigma; letters from Turing to Churchill and to codebreakers; lectures, papers, and broadcasts which opened up the concept of AI and its implications; and the paper which formed the genesis of the investigation of Artificial Life.

Secrets and Lies: Digital Security in a Networked World


Bruce Schneier - 2000
    Identity Theft. Corporate Espionage. National secrets compromised. Can anyone promise security in our digital world? The man who introduced cryptography to the boardroom says no. But in this fascinating read, he shows us how to come closer by developing security measures in terms of context, tools, and strategy. Security is a process, not a product – one that system administrators and corporate executives alike must understand to survive. This edition is updated with new information about post-9/11 security.

Reality is Not What it Seems: The Journey to Quantum Gravity


Carlo Rovelli - 2014
    Here he explains how our image of the world has changed throughout the centuries. From Aristotle to Albert Einstein, Michael Faraday to the Higgs boson, he takes us on a wondrous journey to show us that beyond our ever-changing idea of reality is a whole new world that has yet to be discovered.

It Must Be Beautiful: Great Equations of Modern Science


Graham Farmelo - 2002
    Contributors include Steven Weinberg, Peter Galison, John Maynard Smith, and Frank Wilczek.

Free Culture: The Nature and Future of Creativity


Lawrence Lessig - 2004
    Never before have the cultural powers-that-be been able to exert such control over what we can and can't do with the culture around us. Our society defends free markets and free speech; why then does it permit such top-down control? To lose our long tradition of free culture, Lawrence Lessig shows us, is to lose our freedom to create, our freedom to build, and, ultimately, our freedom to imagine.

On Intelligence


Jeff Hawkins - 2004
    Now he stands ready to revolutionize both neuroscience and computing in one stroke, with a new understanding of intelligence itself. Hawkins develops a powerful theory of how the human brain works, explaining why computers are not intelligent and how, based on this new theory, we can finally build intelligent machines. The brain is not a computer, but a memory system that stores experiences in a way that reflects the true structure of the world, remembering sequences of events and their nested relationships and making predictions based on those memories. It is this memory-prediction system that forms the basis of intelligence, perception, creativity, and even consciousness. In an engaging style that will captivate audiences from the merely curious to the professional scientist, Hawkins shows how a clear understanding of how the brain works will make it possible for us to build intelligent machines, in silicon, that will exceed our human ability in surprising ways. Written with acclaimed science writer Sandra Blakeslee, On Intelligence promises to completely transfigure the possibilities of the technology age. It is a landmark book in its scope and clarity.

Fuzzy Logic: The Revolutionary Computer Technology That Is Changing Our World


Daniel McNeill - 1993
    Professor Lotfi Zadeh masterminded "fuzzy logic"--a way of programming computers to "make decisions" based on imprecise data and complex situations. In "Fuzzy Logic," Daniel McNeill and Paul Freiberger relate the compelling tale of this remarkable new technology, the genius who brought it to life, and how it will soon affect the lives of every one of us.
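
    For readers wondering what "decisions based on imprecise data" looks like concretely, here is a minimal sketch of fuzzy membership and defuzzification. It is not an example from the book; the temperature ranges and rule weights are invented for illustration:

```python
def triangular(x: float, lo: float, peak: float, hi: float) -> float:
    """Degree of membership (0..1) in a triangle-shaped fuzzy set."""
    if x <= lo or x >= hi:
        return 0.0
    if x <= peak:
        return (x - lo) / (peak - lo)
    return (hi - x) / (hi - peak)

def fan_speed(temp_c: float) -> float:
    """Toy fuzzy controller: blend 'warm' and 'hot' rules into a fan speed."""
    warm = triangular(temp_c, 18, 25, 32)   # how "warm" is it?
    hot = triangular(temp_c, 28, 40, 52)    # how "hot" is it?
    total = warm + hot
    if total == 0:
        return 0.0
    # Rules: warm -> half speed, hot -> full speed; defuzzify by weighted average.
    return (warm * 0.5 + hot * 1.0) / total

for t in (20, 26, 30, 38):
    print(f"{t} C -> fan at {fan_speed(t):.0%}")
```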

Out of Control: The New Biology of Machines, Social Systems, and the Economic World


Kevin Kelly - 1992
    Out of Control chronicles the dawn of a new era in which the machines and systems that drive our economy are so complex and autonomous as to be indistinguishable from living things.

Big Data: A Revolution That Will Transform How We Live, Work, and Think


Viktor Mayer-Schönberger - 2013
    “Big data” refers to our burgeoning ability to crunch vast collections of information, analyze it instantly, and draw sometimes profoundly surprising conclusions from it. This emerging science can translate myriad phenomena—from the price of airline tickets to the text of millions of books—into searchable form, and uses our increasing computing power to unearth epiphanies that we never could have seen before. A revolution on par with the Internet or perhaps even the printing press, big data will change the way we think about business, health, politics, education, and innovation in the years to come. It also poses fresh threats, from the inevitable end of privacy as we know it to the prospect of being penalized for things we haven’t even done yet, based on big data’s ability to predict our future behavior. In this brilliantly clear, often surprising work, two leading experts explain what big data is, how it will change our lives, and what we can do to protect ourselves from its hazards. Big Data is the first big book about the next big thing. www.big-data-book.com

Alan Turing: The Enigma


Andrew Hodges - 1983
    His breaking of the German U-boat Enigma cipher in World War II ensured Anglo-American control of the Atlantic. But Turing's vision went far beyond the desperate wartime struggle. Already in the 1930s he had defined the concept of the universal machine, which underpins the computer revolution. In 1945 he was a pioneer of electronic computer design. But Turing's true goal was the scientific understanding of the mind, brought out in the drama and wit of the famous "Turing test" for machine intelligence and in his prophecy for the twenty-first century. Drawn into the cockpit of world events and the forefront of technological innovation, Alan Turing was also an innocent and unpretentious gay man trying to live in a society that criminalized him. In 1952 he revealed his homosexuality and was forced to participate in a humiliating treatment program, and was ever after regarded as a security risk. His suicide in 1954 remains one of the many enigmas in an astonishing life story.

Infinite Powers: How Calculus Reveals the Secrets of the Universe


Steven H. Strogatz - 2019
    Without calculus, we wouldn’t have unraveled DNA or discovered Neptune or figured out how to put 5,000 songs in your pocket. Though many of us were scared away from this essential, engrossing subject in high school and college, Steven Strogatz’s brilliantly creative, down‑to‑earth history shows that calculus is not about complexity; it’s about simplicity. It harnesses an unreal number—infinity—to tackle real‑world problems, breaking them down into easier ones and then reassembling the answers into solutions that feel miraculous. Infinite Powers recounts how calculus tantalized and thrilled its inventors, starting with its first glimmers in ancient Greece and bringing us right up to the discovery of gravitational waves (a phenomenon predicted by calculus). Strogatz reveals how this form of math rose to the challenges of each age: how to determine the area of a circle with only sand and a stick; how to explain why Mars goes “backwards” sometimes; how to make electricity with magnets; how to ensure your rocket doesn’t miss the moon; how to turn the tide in the fight against AIDS. As Strogatz proves, calculus is truly the language of the universe. By unveiling the principles of that language, Infinite Powers makes us marvel at the world anew.
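
    The "sand and a stick" challenge (Archimedes measuring the circle) is easy to see in miniature: slice the circle into thin strips, add up their areas, and the total converges on pi r^2. A quick numerical sketch of that slicing idea, with an arbitrary strip count:

```python
import math

def circle_area(radius: float, strips: int = 100_000) -> float:
    """Approximate a circle's area by summing thin vertical strips."""
    dx = 2 * radius / strips
    area = 0.0
    for i in range(strips):
        x = -radius + (i + 0.5) * dx              # midpoint of this strip
        chord = 2 * math.sqrt(radius**2 - x**2)   # strip height at x
        area += chord * dx
    return area

print(f"Sliced estimate: {circle_area(1.0):.6f}")
print(f"Exact pi*r^2:    {math.pi:.6f}")
```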

Data Smart: Using Data Science to Transform Information into Insight


John W. Foreman - 2013
    Major retailers are predicting everything from when their customers are pregnant to when they want a new pair of Chuck Taylors. It's a brave new world where seemingly meaningless data can be transformed into valuable insight to drive smart business decisions. But how exactly does one do data science? Do you have to hire one of these priests of the dark arts, the "data scientist," to extract this gold from your data? Nope. Data science is little more than using straightforward steps to process raw data into actionable insight. And in Data Smart, author and data scientist John Foreman will show you how that's done within the familiar environment of a spreadsheet. Why a spreadsheet? It's comfortable! You get to look at the data every step of the way, building confidence as you learn the tricks of the trade. Plus, spreadsheets are a vendor-neutral place to learn data science without the hype. But don't let the Excel sheets fool you. This is a book for those serious about learning the analytic techniques, the math and the magic, behind big data. Each chapter will cover a different technique in a spreadsheet so you can follow along:
    - Mathematical optimization, including non-linear programming and genetic algorithms
    - Clustering via k-means, spherical k-means, and graph modularity
    - Data mining in graphs, such as outlier detection
    - Supervised AI through logistic regression, ensemble models, and bag-of-words models
    - Forecasting, seasonal adjustments, and prediction intervals through Monte Carlo simulation
    - Moving from spreadsheets into the R programming language
    You get your hands dirty as you work alongside John through each technique. But never fear, the topics are readily applicable and the author laces humor throughout. You'll even learn what a dead squirrel has to do with optimization modeling, which you no doubt are dying to know.
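
    As a taste of one technique from that chapter list, here is a bare-bones k-means sketch in plain Python (the book itself works in a spreadsheet and later moves to R; the sample points and k=2 below are illustrative):

```python
import random

def kmeans(points, k, iters=100):
    """Lloyd's algorithm: alternately assign points to the nearest centroid
    and move each centroid to the mean of its assigned points."""
    centroids = random.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for x, y in points:
            nearest = min(range(k),
                          key=lambda i: (x - centroids[i][0]) ** 2
                                      + (y - centroids[i][1]) ** 2)
            clusters[nearest].append((x, y))
        # Recompute centroids; keep the old one if a cluster comes up empty.
        centroids = [(sum(x for x, _ in c) / len(c), sum(y for _, y in c) / len(c))
                     if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters

# Two well-separated blobs; expect one centroid near each.
data = [(1, 1), (1.5, 2), (2, 1.2), (8, 8), (8.5, 9), (9, 8.2)]
centroids, clusters = kmeans(data, k=2)
print("Centroids:", centroids)
```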

The Art of Electronics


Paul Horowitz - 1980
    Widely accepted as the authoritative text and reference on electronic circuit design, both analog and digital, this book revolutionized the teaching of electronics by emphasizing the methods actually used by circuit designers -- a combination of some basic laws, rules of thumb, and a large bag of tricks. The result is a largely nonmathematical treatment that encourages circuit intuition, brainstorming, and simplified calculations of circuit values and performance. The new Art of Electronics retains the feeling of informality and easy access that helped make the first edition so successful and popular. It is an ideal first textbook on electronics for scientists and engineers and an indispensable reference for anyone, professional or amateur, who works with electronic circuits.