A Short History of Nearly Everything


Bill Bryson - 2003
    Taking as territory everything from the Big Bang to the rise of civilization, Bryson seeks to understand how we got from there being nothing at all to there being us. To that end, he has attached himself to a host of the world’s most advanced (and often obsessed) archaeologists, anthropologists, and mathematicians, travelling to their offices, laboratories, and field camps. He has read (or tried to read) their books, pestered them with questions, apprenticed himself to their powerful minds. A Short History of Nearly Everything is the record of this quest, and it is a sometimes profound, sometimes funny, and always supremely clear and entertaining adventure in the realms of human knowledge, as only Bill Bryson can render it. Science has never been more involving or entertaining.

Mindstorms: Children, Computers, And Powerful Ideas


Seymour Papert - 1980
    The now-familiar idea that children can learn with, and from, computers owes much to Mindstorms. In this book, pioneering computer scientist Seymour Papert uses the invention of LOGO, the first child-friendly programming language, to make the case for the value of teaching children with computers. Papert argues that children are more than capable of mastering computers, and that teaching computational processes like debugging in the classroom can change the way we learn everything else. He also shows that schools saturated with technology can actually improve socialization and interaction among students and between students and teachers.
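
    LOGO's turtle lives on in Python's standard turtle module, which is explicitly modeled on it. Below is a minimal sketch (ours, not Papert's) of the kind of program, and the kind of bug, he had in mind:

```python
# A minimal LOGO-style sketch using Python's standard turtle module,
# which is directly modeled on Papert's LOGO turtle. Illustrative only.
import turtle

def square(side):
    """Draw a square: repeat 'go forward, turn right 90 degrees' four times."""
    for _ in range(4):
        turtle.forward(side)
        turtle.right(90)  # a classic beginner's bug is a wrong angle here;
                          # seeing the shape go wrong, then reasoning about
                          # why, is exactly the debugging Papert prizes

square(100)
turtle.done()  # keep the drawing window open until it is closed
```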

Hackers & Painters: Big Ideas from the Computer Age


Paul Graham - 2004
    Who are hackers, what motivates them, and why should you care? Consider these facts: Everything around us is turning into computers. Your typewriter is gone, replaced by a computer. Your phone has turned into a computer. So has your camera. Soon your TV will. Your car was not only designed on computers, but has more processing power in it than a room-sized mainframe did in 1970. Letters, encyclopedias, newspapers, and even your local store are being replaced by the Internet. Hackers & Painters: Big Ideas from the Computer Age, by Paul Graham, explains this world and the motivations of the people who occupy it. In clear, thoughtful prose that draws on illuminating historical examples, Graham takes readers on an unflinching exploration into what he calls “an intellectual Wild West.” The ideas discussed in this book will have a powerful and lasting impact on how we think, how we work, how we develop technology, and how we live. Topics include the importance of beauty in software design, how to make wealth, heresy and free speech, the programming language renaissance, the open-source movement, digital design, internet startups, and more.

John von Neumann


Norman Macrae - 1992
    This book discusses von Neumann's work in areas such as game theory, mathematics, physics, and meteorology, work that formed the building blocks for the most important discoveries of the century: the modern computer, game theory, and the atom bomb.

Gödel's Proof


Ernest Nagel - 1958
    Gödel received public recognition of his work in 1951 when he was awarded the first Albert Einstein Award for achievement in the natural sciences--perhaps the highest award of its kind in the United States. The award committee described his work in mathematical logic as "one of the greatest contributions to the sciences in recent times." However, few mathematicians of the time were equipped to understand the young scholar's complex proof. Ernest Nagel and James Newman provide both scholars and non-specialists with a readable and accessible explanation of the main ideas and broad implications of Gödel's discovery. It offers every educated person with a taste for logic and philosophy the chance to understand a previously difficult and inaccessible subject. New York University Press is proud to publish this special edition of one of its bestselling books. With a new introduction by Douglas R. Hofstadter, this book will appeal to students, scholars, and professionals in the fields of mathematics, computer science, logic and philosophy, and science.
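
    For orientation, the theorem at the book's center can be stated compactly. This is a standard modern formulation, not Nagel and Newman's wording:

\[
\text{If } F \text{ is a consistent, effectively axiomatized theory extending basic arithmetic, then there is a sentence } G_F \text{ such that } F \nvdash G_F \text{ and } F \nvdash \lnot G_F.
\]

    In words: any such system leaves some arithmetical sentence undecided, neither provable nor refutable within the system itself.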

Hatching Twitter: A True Story of Money, Power, Friendship, and Betrayal


Nick Bilton - 2013
    In barely six years, a small group of young, ambitious programmers in Silicon Valley built an $11.5 billion business out of the ashes of a failed podcasting company. Today Twitter boasts more than 200 million active users and has affected business, politics, media, and other fields in innumerable ways. Now Nick Bilton of the New York Times takes readers behind the scenes with a narrative that shows what happened inside Twitter as it grew at exponential speeds. This is a tale of betrayed friendships and high-stakes power struggles as the four founders—Biz Stone, Evan Williams, Jack Dorsey, and Noah Glass—went from everyday engineers to wealthy celebrities, featured on magazine covers, Oprah, The Daily Show, and Time’s list of the world’s most influential people. Bilton’s exclusive access and exhaustive investigative reporting—drawing on hundreds of sources, documents, and internal e-mails—have enabled him to write an intimate portrait of fame, influence, and power. He also captures the zeitgeist and global influence of Twitter, which has been used to help overthrow governments in the Middle East and disrupt the very fabric of the way people communicate.

One, Two, Three...Infinity: Facts and Speculations of Science


George Gamow - 1947
    ". . . full of intellectual treats and tricks, of whimsy and deep scientific philosophy. It is highbrow entertainment at its best, a teasing challenge to all who aspire to think about the universe." — New York Herald Tribune. One of the world's foremost nuclear physicists (celebrated for his theory of radioactive decay, among other accomplishments), George Gamow possessed the unique ability of making the world of science accessible to the general reader. He brings that ability to bear in this delightful expedition through the problems, pleasures, and puzzles of modern science. Among the topics scrutinized with the author's celebrated good humor and pedagogical prowess are the macrocosm and the microcosm, theory of numbers, relativity of space and time, entropy, genes, atomic structure, nuclear fission, and the origin of the solar system. In the pages of this book readers grapple with such crucial matters as whether it is possible to bend space, why a rocket shrinks, the "end of the world problem," excursions into the fourth dimension, and a host of other tantalizing topics for the scientifically curious. Brimming with amusing anecdotes and provocative problems, One Two Three . . . Infinity also includes over 120 delightful pen-and-ink illustrations by the author, adding another dimension of good-natured charm to these wide-ranging explorations. Whatever your level of scientific expertise, chances are you'll derive a great deal of pleasure, stimulation, and information from this unusual and imaginative book. It belongs in the library of anyone curious about the wonders of the scientific universe. "In One Two Three . . . Infinity, as in his other books, George Gamow succeeds where others fail because of his remarkable ability to combine technical accuracy, choice of material, dignity of expression, and readability." — Saturday Review of Literature

Infinite Loop: How Apple, the World's Most Insanely Great Computer Company, Went Insane


Michael S. Malone - 1999
    How did Apple lose its way? Why did the world still care so deeply about a company that had lost its leadership position? Michael S. Malone, from the unique vantage point of having grown up with the company's founders, and having covered Apple and Silicon Valley for years, sets out to tell the gripping behind-the-scenes story - a story that is even zanier than the business world thought. In essence, Malone claims, with only a couple of incredible inventions (the Apple II and Macintosh), and backed by an arrogance matched only by its corporate ineptitude, Apple managed to create a multibillion-dollar house of cards. And, like a faulty program repeating itself in an infinite loop, Apple could never learn from its mistakes. The miracle was not that Apple went into free fall, but that it held up for so long. Within the pages of Infinite Loop, we discover a bruising portrait of the megalomaniacal Steve Jobs and an incompetent John Sculley, as well as the kind of political backstabbings, stupid mistakes, and overweening egos more typical of a soap opera than a corporate history.

Deep Learning with Python


François Chollet - 2017
    Deep learning is the technology behind photo tagging systems at Facebook and Google, self-driving cars, speech recognition systems on your smartphone, and much more. In particular, deep learning excels at solving machine perception problems: understanding the content of image data, video data, or sound data. Here's a simple example: say you have a large collection of images, and you want tags associated with each image, for example, "dog," "cat," etc. Deep learning allows you to create a system that learns to map such tags to images purely from examples. This system can then be applied to new images, automating the task of photo tagging. A deep learning model only has to be fed examples of a task to start generating useful results on new data.
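
    The photo-tagging example above fits in a few lines of code. Here is a minimal sketch using Keras, the library Chollet created and teaches with in the book. The images and tags below are random stand-ins, so the printed numbers are meaningless, but the workflow (define a model, fit it to tagged examples, predict on new images) is the one described:

```python
# A minimal sketch of the photo-tagging workflow described above, using Keras.
# The data is synthetic; a real system would train on labeled photographs.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Stand-in for a tagged photo collection: 100 random 64x64 RGB "images",
# each labeled 0 ("cat") or 1 ("dog").
images = np.random.rand(100, 64, 64, 3).astype("float32")
tags = np.random.randint(0, 2, size=(100,)).astype("float32")

# A small convolutional network: it learns the mapping from pixels to tags
# purely from examples, the machine-perception strength noted above.
model = keras.Sequential([
    layers.Input(shape=(64, 64, 3)),
    layers.Conv2D(16, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(1, activation="sigmoid"),  # probability the tag is "dog"
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
model.fit(images, tags, epochs=2, batch_size=16)

# Once trained on real data, the same model tags images it has never seen.
new_image = np.random.rand(1, 64, 64, 3).astype("float32")
print(model.predict(new_image))  # e.g. [[0.51]]: the model's tag probability
```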

The Two Cultures


C.P. Snow - 1959
    The rift between the sciences and the humanities had been remarked on before, but it was C. P. Snow's Rede Lecture of 1959 that brought it to prominence and began a public debate that is still raging in the media today. This 50th anniversary printing of The Two Cultures and its successor piece, A Second Look (in which Snow responded to the controversy four years later), features an introduction by Stefan Collini, charting the history and context of the debate, its implications, and its afterlife. The importance of science and technology in policy-making run largely by non-scientists, the future for education and research, and the problem of fragmentation threatening hopes for a common culture are just some of the subjects discussed.

The Perfectionists: How Precision Engineers Created the Modern World


Simon Winchester - 2018
    At the dawn of the Industrial Revolution in eighteenth-century England, standards of measurement were established, opening the way to the development of machine tools—machines that make machines. Eventually, the application of precision tools and methods resulted in the creation and mass production of items from guns and glass to mirrors, lenses, and cameras—and in time led to further breakthroughs, including gene splicing, microchips, and the Hadron Collider. Simon Winchester takes us back to the origins of the Industrial Age, to England, where he introduces the scientific minds that helped usher in modern production: John Wilkinson, Henry Maudslay, Joseph Bramah, Jesse Ramsden, and Joseph Whitworth. It was Thomas Jefferson who later exported their discoveries to the fledgling United States, setting the nation on its course to become a manufacturing titan. Winchester moves forward through time, to today’s cutting-edge developments occurring around the world, from America to Western Europe to Asia. As he introduces the minds and methods that have changed the modern world, Winchester explores fundamental questions. Why is precision important? What are the different tools we use to measure it? Who has invented and perfected it? Has the pursuit of the ultra-precise in so many facets of human life blinded us to other things of equal value, such as an appreciation for the age-old traditions of craftsmanship, art, and high culture? Are we missing something that reflects the world as it is, rather than the world as we think we would wish it to be? And can the precise and the natural co-exist in society?

A Question of Time: The Ultimate Paradox


Scientific American - 2012
    

Nonzero: The Logic of Human Destiny


Robert Wright - 1999
    In his bestselling The Moral Animal, Robert Wright applied the principles of evolutionary biology to the study of the human mind. Now Wright attempts something even more ambitious: explaining the direction of evolution and human history, and discerning where history will lead us next. In Nonzero: The Logic of Human Destiny, Wright asserts that, ever since the primordial ooze, life has followed a basic pattern. Organisms and human societies alike have grown more complex by mastering the challenges of internal cooperation. Wright's narrative ranges from fossilized bacteria to vampire bats, from stone-age villages to the World Trade Organization, uncovering such surprises as the benefits of barbarian hordes and the useful stability of feudalism. Here is history endowed with moral significance: a way of looking at our biological and cultural evolution that suggests, refreshingly, that human morality has improved over time, and that our instinct to discover meaning may itself serve a higher purpose. Insightful, witty, profound, Nonzero offers breathtaking implications for what we believe and how we adapt to technology's ongoing transformation of the world.

Computational Thinking


Peter J. Denning - 2019
    Computational thinking has recently become part of the K-12 curriculum. But what is it? This volume in the MIT Press Essential Knowledge series offers an accessible overview, tracing a genealogy that begins centuries before digital computers and portraying computational thinking as the pioneers of computing have described it. The authors explain that computational thinking (CT) is not a set of concepts for programming; it is a way of thinking that is honed through practice: the mental skills for designing computations to do jobs for us, and for explaining and interpreting the world as a complex of information processes. Mathematically trained experts (known as "computers") who performed complex calculations as teams engaged in CT long before electronic computers existed. The authors identify six dimensions of today's highly developed CT--methods, machines, computing education, software engineering, computational science, and design--and cover each in a chapter. Along the way, they debunk inflated claims for CT and computation while making clear the power of CT in all its complexity and multiplicity.
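
    A small illustration (ours, not the authors') of "designing computations to do jobs for us," echoing those human computing teams: decompose one long calculation into shares, assign each share to a worker, and recombine the results.

```python
# Illustrative sketch (not from the book): splitting one big calculation
# into pieces, the way teams of human "computers" once divided their work.
from math import isclose, pi

def partial_sum(start, stop):
    """One worker's share of the Leibniz series for pi/4."""
    return sum((-1) ** k / (2 * k + 1) for k in range(start, stop))

# Decompose the job into four chunks, "assign" each to a worker, recombine.
chunks = [(0, 250_000), (250_000, 500_000),
          (500_000, 750_000), (750_000, 1_000_000)]
pi_estimate = 4 * sum(partial_sum(a, b) for a, b in chunks)

print(pi_estimate)                     # ~3.1415916, six digits of pi
assert isclose(pi_estimate, pi, rel_tol=1e-5)
```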

Uncertainty: Einstein, Heisenberg, Bohr, and the Struggle for the Soul of Science


David Lindley - 2007
    Werner Heisenberg’s uncertainty principle implied that scientific quantities and concepts do not have absolute, independent meaning, but acquire meaning only in terms of the experiments used to measure them. This proposition, undermining the cherished belief that science could reveal the physical world with limitless detail and precision, placed Heisenberg in direct opposition to the revered Albert Einstein. The eminent scientist Niels Bohr, Heisenberg’s mentor and Einstein’s long-time friend, found himself caught between the two. Uncertainty chronicles the birth and evolution of one of the most significant findings in the history of science, and portrays the clash of ideas and personalities it provoked. Einstein was emotionally as well as intellectually determined to prove the uncertainty principle false. Heisenberg represented a new generation of physicists who believed that quantum theory overthrew the old certainties; confident of his reasoning, Heisenberg dismissed Einstein’s objections. Bohr understood that Heisenberg was correct, but he also recognized the vital necessity of gaining Einstein’s support as the world faced the shocking implications of Heisenberg’s principle.
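
    The principle itself fits in one line. In standard modern notation (not quoted from Lindley), for a particle's position \(x\) and momentum \(p\):

\[
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}
\]

    The more sharply an experiment pins down position, the less sharply momentum can be defined, and vice versa; the bound is set by the reduced Planck constant \(\hbar\), not by the quality of the instruments.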