The Physics of Immortality: Modern Cosmology, God and the Resurrection of the Dead


Frank J. Tipler - 1994
    Tipler claims to scientifically prove the existence of God and the physical resurrection of the dead.

Grammatical Man: Information, Entropy, Language and Life


Jeremy Campbell - 1982
    The book describes how the laws and discoveries of information theory now support controversial revisions to Darwinian evolution, begin to unravel the mysteries of language, memory and dreams, and stimulate provocative ideas in psychology, philosophy, art, music, computers and even the structure of society. Perhaps its most fascinating and unexpected suggestion is that order and complexity may be as natural as disorder and disorganization. Contrary to the entropy principle, which implies that order is the exception and confusion the rule, information theory asserts that order and sense can indeed prevail against disorder and nonsense. From the simplest forms of organic life to the words used to express our most complex ideas, from our genes to our dreams, from microcomputers to telecommunications, virtually everything around us follows simple rules of information. Life and the material world, like language, remain "grammatical." Grammatical man inhabits a grammatical universe.
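    The contrast Campbell draws between order and entropy can be made quantitative with Shannon's measure of information. As a rough illustration (mine, not an example from the book), the sketch below computes the entropy, in bits, of a discrete symbol distribution: a predictable, "ordered" source carries far less information per symbol than a disordered one.

        from math import log2

        def shannon_entropy(probs):
            """Entropy in bits of a discrete distribution; zero-probability terms contribute nothing."""
            return -sum(p * log2(p) for p in probs if p > 0)

        # A predictable ("ordered") source versus a uniformly random ("disordered") one.
        print(shannon_entropy([0.97, 0.01, 0.01, 0.01]))  # about 0.24 bits per symbol
        print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits per symbol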

A New Kind of Science


Stephen Wolfram - 2002
    Wolfram lets the world see his work in A New Kind of Science, a gorgeous, 1,280-page tome more than a decade in the making. With patience, insight, and self-confidence to spare, Wolfram outlines a fundamental new way of modeling complex systems. On the frontier of complexity science since he was a boy, Wolfram is a champion of cellular automata--256 "programs" governed by simple nonmathematical rules. He points out that even the most complex equations fail to accurately model biological systems, but the simplest cellular automata can produce results straight out of nature--tree branches, stream eddies, and leopard spots, for instance. The graphics in A New Kind of Science bear a striking resemblance to the patterns we see in nature every day. Wolfram wrote the book in a distinct style meant to make it easy to read, even for nontechies; a basic familiarity with logic is helpful but not essential. Readers will find themselves swept away by the elegant simplicity of Wolfram's ideas and the accidental artistry of the cellular automaton models. Whether or not Wolfram's revolution ultimately gives us the keys to the universe, his new science is absolutely awe-inspiring. --Therese Littleton
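    The "256 programs" are the elementary cellular automata: each of the 2^8 = 256 rules maps a cell's three-cell neighborhood to its next state. As a hedged illustration (the rule, grid size, and boundary handling are my choices, not Wolfram's code), this sketch runs Rule 30, one of the rules he uses to show a trivially simple program producing intricate, nature-like structure.

        def step(cells, rule=30):
            """One update of an elementary cellular automaton with wraparound edges."""
            n = len(cells)
            return [(rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
                    for i in range(n)]

        # Start from a single live cell and print a small space-time diagram.
        row = [0] * 31
        row[15] = 1
        for _ in range(16):
            print("".join("#" if c else "." for c in row))
            row = step(row)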

Cybernetics: Or Control and Communication in the Animal and the Machine


Norbert Wiener - 1948
    "It is a 'must' book for those in every branch of science . . . in addition, economists, politicians, statesmen, and businessmen cannot afford to overlook cybernetics and its tremendous, even terrifying implications. It is a beautifully written book, lucid, direct, and despite its complexity, as readable by the layman as the trained scientist." -- John B. Thurston, The Saturday Review of Literature. Acclaimed as one of the "seminal books . . . comparable in ultimate importance to . . . Galileo or Malthus or Rousseau or Mill," Cybernetics was judged by twenty-seven historians, economists, educators, and philosophers to be one of those books published during the "past four decades . . . which may have a substantial impact on public thought and action in the years ahead." -- Saturday Review

The Information: A History, a Theory, a Flood


James Gleick - 2011
    The story of information begins in a time profoundly unlike our own, when every thought and utterance vanishes as soon as it is born. From the invention of scripts and alphabets to the long-misunderstood talking drums of Africa, Gleick tells the story of information technologies that changed the very nature of human consciousness. He provides portraits of the key figures contributing to the inexorable development of our modern understanding of information: Charles Babbage, the idiosyncratic inventor of the first great mechanical computer; Ada Byron, the brilliant and doomed daughter of the poet, who became the first true programmer; pivotal figures like Samuel Morse and Alan Turing; and Claude Shannon, the creator of information theory itself. And then the information age arrives. Citizens of this world become experts willy-nilly: aficionados of bits and bytes. And we sometimes feel we are drowning, swept by a deluge of signs and signals, news and images, blogs and tweets. The Information is the story of how we got here and where we are heading.

Complexity: A Guided Tour


Melanie Mitchell - 2009
    Based on her work at the Santa Fe Institute and drawing on its interdisciplinary strategies, Mitchell brings clarity to the workings of complexity across a broad range of biological, technological, and social phenomena, seeking out the general principles or laws that apply to all of them. Richly illustrated, Complexity: A Guided Tour--winner of the 2010 Phi Beta Kappa Book Award in Science--offers a wide-ranging overview of the ideas underlying complex systems science, the current research at the forefront of this field, and the prospects for its contribution to solving some of the most important scientific questions of our time.

Probably Approximately Correct: Nature's Algorithms for Learning and Prospering in a Complex World


Leslie Valiant - 2013
    We muddle through even in the absence of theories of how to act. But how do we do it? In Probably Approximately Correct, computer scientist Leslie Valiant presents a masterful synthesis of learning and evolution to show how, both individually and collectively, we not only survive but prosper in a world as complex as our own. The key is “probably approximately correct” algorithms, a concept Valiant developed to explain how effective behavior can be learned. The model shows that pragmatically coping with a problem can provide a satisfactory solution in the absence of any theory of the problem. After all, finding a mate does not require a theory of mating. Valiant’s theory reveals the shared computational nature of evolution and learning, and sheds light on perennial questions such as nature versus nurture and the limits of artificial intelligence. Offering a powerful and elegant model that encompasses life’s complexity, Probably Approximately Correct has profound implications for how we think about behavior, cognition, biological evolution, and the possibilities and limits of human and machine intelligence.
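    In Valiant's framework, "probably approximately correct" means that with probability at least 1 - δ the learner outputs a hypothesis whose error is at most ε, after seeing only a reasonable (polynomial) number of examples. As a toy illustration (the threshold-learning task and the numbers are mine, not Valiant's), the sketch below learns a one-dimensional threshold from random labeled samples and estimates how often the result is approximately correct.

        import random

        def learn_threshold(samples):
            """Return the largest positively labeled point as the estimated threshold (a simple consistent learner)."""
            positives = [x for x, label in samples if label]
            return max(positives) if positives else 0.0

        true_threshold = 0.62          # unknown target concept: label = (x <= true_threshold)
        random.seed(1)
        errors = []
        for _ in range(200):           # repeat the experiment to estimate the "probably" part
            data = [(x, x <= true_threshold) for x in (random.random() for _ in range(100))]
            errors.append(true_threshold - learn_threshold(data))   # error mass under the uniform distribution
        print(sum(e <= 0.05 for e in errors) / len(errors))         # fraction of runs that were "approximately correct"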

The Recursive Universe: Cosmic Complexity and the Limits of Scientific Knowledge


William Poundstone - 1984
    Topics include the limits of knowledge, the paradox of complexity, Maxwell's demon, the Big Bang theory, and much more. 1985 edition.

Out of Control: The New Biology of Machines, Social Systems, and the Economic World


Kevin Kelly - 1994
    Out of Control chronicles the dawn of a new era in which the machines and systems that drive our economy are so complex and autonomous as to be indistinguishable from living things.

The Monty Hall Problem: The Remarkable Story of Math's Most Contentious Brain Teaser


Jason Rosenhouse - 2009
    Imagine that you face three doors, behind one of which is a prize. You choose one but do not open it. The host--call him Monty Hall--opens a different door, always choosing one he knows to be empty. Left with two doors, will you do better by sticking with your first choice, or by switching to the other remaining door? In this light-hearted yet ultimately serious book, Jason Rosenhouse explores the history of this fascinating puzzle. Using a minimum of mathematics (and none at all for much of the book), he shows how the problem has fascinated philosophers, psychologists, and many others, and examines the many variations that have appeared over the years. As Rosenhouse demonstrates, the Monty Hall Problem illuminates fundamental mathematical issues and has abiding philosophical implications. Perhaps most important, he writes, the problem opens a window on our cognitive difficulties in reasoning about uncertainty.
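    The standard resolution (switching wins with probability 2/3, sticking with the first choice only 1/3) is easy to check by simulation. The quick sketch below is not from the book; it simply plays the game many times under both strategies.

        import random

        def play(switch, trials=100_000):
            """Fraction of games won when always switching (or always sticking)."""
            wins = 0
            for _ in range(trials):
                prize = random.randrange(3)
                choice = random.randrange(3)
                # Monty opens a door that is neither the contestant's choice nor the prize.
                opened = next(d for d in range(3) if d != choice and d != prize)
                if switch:
                    choice = next(d for d in range(3) if d != choice and d != opened)
                wins += choice == prize
            return wins / trials

        print(play(switch=True))   # roughly 0.667
        print(play(switch=False))  # roughly 0.333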

Computational Complexity: A Modern Approach


Sanjeev Arora - 2009
    Requiring essentially no background apart from mathematical maturity, the book can be used as a reference for self-study for anyone interested in complexity, including physicists, mathematicians, and other scientists, as well as a textbook for a variety of courses and seminars. More than 300 exercises are included with a selected hint set.

Why Information Grows: The Evolution of Order, from Atoms to Economies


Cesar A. Hidalgo - 2015
    Hidalgo believes that we should investigate what makes some countries more capable than others. Complex products—from films to robots, apps to automobiles—are a physical distillation of an economy’s knowledge, a measurable embodiment of its education, infrastructure, and capability. Economic wealth accrues when applications of this knowledge turn ideas into tangible products; the more complex its products, the more economic growth a country will experience. A radical new interpretation of global economics, Why Information Grows overturns traditional assumptions about the development of economies and the origins of wealth and takes a crucial step toward making economics less the dismal science and more the insightful one.

Introduction to Automata Theory, Languages, and Computation


John E. Hopcroft - 1979
    With this long-awaited revision, the authors continue to present the theory in a concise and straightforward manner, now with an eye toward practical applications. They have revised the book to make it more accessible to today's students: there is more material on writing proofs, more detail and intuition in definitions and proofs, more figures and pictures to convey ideas, side-boxes highlighting supplemental material of interest, program-like notation for PDAs and Turing machines, and a less formal writing style. Exercises at the end of each chapter, including new, easier ones, help readers confirm and enhance their understanding of the material.

Think Complexity: Complexity Science and Computational Modeling


Allen B. Downey - 2012
    Whether you’re an intermediate-level Python programmer or a student of computational modeling, you’ll delve into examples of complex systems through a series of exercises, case studies, and easy-to-understand explanations. You’ll work with graphs, algorithm analysis, scale-free networks, and cellular automata, using advanced features that make Python such a powerful language. Ideal as a text for courses on Python programming and algorithms, Think Complexity will also help self-learners gain valuable experience with topics and ideas they might not encounter otherwise. You will:
    - Work with NumPy arrays and SciPy methods, basic signal processing and Fast Fourier Transform, and hash tables
    - Study abstract models of complex physical systems, including power laws, fractals and pink noise, and Turing machines
    - Get starter code and solutions to help you re-implement and extend original experiments in complexity
    - Explore the philosophy of science, including the nature of scientific laws, theory choice, realism and instrumentalism, and other topics
    - Examine case studies of complex systems submitted by students and readers
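    As a small taste of the kind of exercise listed above (the signal and frequencies here are my own invention, not the book's code), the sketch below uses NumPy's FFT to recover the dominant frequency of a noisy sine wave.

        import numpy as np

        fs = 1000                                   # sampling rate in Hz
        t = np.arange(0, 1, 1 / fs)
        signal = np.sin(2 * np.pi * 5 * t) + 0.5 * np.random.randn(t.size)   # 5 Hz tone plus noise
        spectrum = np.abs(np.fft.rfft(signal))      # magnitude spectrum of the real-valued signal
        freqs = np.fft.rfftfreq(t.size, 1 / fs)
        print(freqs[spectrum.argmax()])             # peak frequency, about 5.0 Hz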

Causal Inference in Statistics: A Primer


Judea Pearl - 2016
    Judea Pearl presents a book ideal for beginners in statistics, providing a comprehensive introduction to the field of causality. Examples from classical statistics are presented throughout to demonstrate the need for causality in resolving decision-making dilemmas posed by data. Causal methods are also compared to traditional statistical methods, whilst questions are provided at the end of each section to aid student learning.
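    A central tool the primer develops is adjustment for a confounder: when a variable Z satisfies the back-door criterion, P(Y | do(X = x)) = Σ_z P(Y | X = x, Z = z) P(Z = z), which in general differs from the ordinary conditional P(Y | X = x). The sketch below applies that formula to an invented contingency table (the counts are illustrative only; the adjustment formula itself is Pearl's).

        # Toy confounded data: counts of (z, x, y) outcomes, with Z a common cause of X and Y.
        # The numbers are made up for illustration; only the adjustment formula is Pearl's.
        counts = {
            (0, 0, 0): 40, (0, 0, 1): 10, (0, 1, 0): 5,  (0, 1, 1): 5,
            (1, 0, 0): 5,  (1, 0, 1): 5,  (1, 1, 0): 10, (1, 1, 1): 40,
        }
        total = sum(counts.values())

        def p(**fixed):
            """Probability that the named variables take the given values."""
            names = ("z", "x", "y")
            return sum(c for combo, c in counts.items()
                       if all(combo[names.index(k)] == v for k, v in fixed.items())) / total

        def p_y1_given(x, z=None):
            """P(Y=1 | X=x) or P(Y=1 | X=x, Z=z)."""
            if z is None:
                return p(x=x, y=1) / p(x=x)
            return p(x=x, y=1, z=z) / p(x=x, z=z)

        naive = p_y1_given(x=1)                                        # observational association: about 0.75
        adjusted = sum(p_y1_given(x=1, z=z) * p(z=z) for z in (0, 1))  # back-door adjustment: about 0.65
        print(naive, adjusted)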