Book picks similar to
Numerical Methods for Ordinary Differential Equations: Initial Value Problems by David F. Griffiths
Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy
Cathy O'Neil - 2016
Increasingly, the decisions that affect our lives--where we go to school, whether we can get a job or a loan, how much we pay for health insurance--are being made not by humans, but by machines. In theory, this should lead to greater fairness: Everyone is judged according to the same rules. But as mathematician and data scientist Cathy O'Neil reveals, the mathematical models being used today are unregulated and uncontestable, even when they're wrong. Most troubling, they reinforce discrimination--propping up the lucky, punishing the downtrodden, and undermining our democracy in the process.
Six Easy Pieces: Essentials of Physics By Its Most Brilliant Teacher
Richard P. Feynman - 1995
This set couples a book containing the six easiest chapters from Richard P. Feynman's landmark work, Lectures on Physics—specifically designed for the general, non-scientist reader—with the actual recordings of the late, great physicist delivering the lectures on which the chapters are based. Nobel Laureate Feynman gave these lectures just once, to a group of Caltech undergraduates in 1961 and 1962, and these newly released recordings allow you to experience one of the Twentieth Century's greatest minds—as if you were right there in the classroom.
The Improbability Principle: Why Coincidences, Miracles, and Rare Events Happen Every Day
David J. Hand - 2014
Hand argues that extraordinarily rare events are anything but. In fact, they’re commonplace. Not only that, we should all expect to experience a miracle roughly once every month. But Hand is no believer in superstitions, prophecies, or the paranormal. His definition of “miracle” is thoroughly rational. No mystical or supernatural explanation is necessary to understand why someone is lucky enough to win the lottery twice, or is destined to be hit by lightning three times and still survive. All we need, Hand argues, is a firm grounding in a powerful set of laws: the laws of inevitability, of truly large numbers, of selection, of the probability lever, and of near enough. Together, these constitute Hand’s groundbreaking Improbability Principle. And together, they explain why we should not be so surprised to bump into a friend in a foreign country, or to come across the same unfamiliar word four times in one day. Hand wrestles with seemingly less explicable questions as well: what the Bible and Shakespeare have in common, why financial crashes are par for the course, and why lightning does strike the same place (and the same person) twice. Along the way, he teaches us how to use the Improbability Principle in our own lives—including how to cash in at a casino and how to recognize when a medicine is truly effective. An irresistible adventure into the laws behind “chance” moments and a trusty guide for understanding the world and universe we live in, The Improbability Principle will transform how you think about serendipity and luck, whether it’s in the world of business and finance or you’re merely sitting in your backyard, tossing a ball into the air and wondering where it will land.
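The "miracle a month" claim in that blurb is easier to see with a little arithmetic. The Python sketch below is not taken from Hand's book; it simply illustrates the law of truly large numbers under some assumed, purely illustrative numbers (a one-in-a-million event, one noticed event per second, eight waking hours a day, about a month of days):

```python
# A rough illustration of the "law of truly large numbers": give a very rare
# event enough opportunities to occur and it becomes likely. The specific
# numbers below (a one-in-a-million event, one noticed event per second,
# eight waking hours a day) are illustrative assumptions, not figures from
# Hand's book.

p_single = 1e-6                # chance of the "miracle" on any one occasion
events_per_day = 8 * 60 * 60   # one noticed event per second, 8 hours a day
days = 35                      # roughly a month

n_opportunities = events_per_day * days
p_at_least_once = 1 - (1 - p_single) ** n_opportunities

print(f"Opportunities in ~a month: {n_opportunities:,}")
print(f"P(one-in-a-million event happens at least once): {p_at_least_once:.2f}")
# With these assumptions the probability comes out around 0.6, which is why
# a "miracle a month" is less surprising than it sounds.
```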
The Information: A History, a Theory, a Flood
James Gleick - 2011
The story of information begins in a time profoundly unlike our own, when every thought and utterance vanishes as soon as it is born. From the invention of scripts and alphabets to the long-misunderstood talking drums of Africa, Gleick tells the story of information technologies that changed the very nature of human consciousness. He provides portraits of the key figures contributing to the inexorable development of our modern understanding of information: Charles Babbage, the idiosyncratic inventor of the first great mechanical computer; Ada Byron, the brilliant and doomed daughter of the poet, who became the first true programmer; pivotal figures like Samuel Morse and Alan Turing; and Claude Shannon, the creator of information theory itself. And then the information age arrives. Citizens of this world become experts willy-nilly: aficionados of bits and bytes. And we sometimes feel we are drowning, swept by a deluge of signs and signals, news and images, blogs and tweets. The Information is the story of how we got here and where we are heading.
Elements of Information Theory
Thomas M. Cover - 1991
Readers are provided once again with an instructive mix of mathematics, physics, statistics, and information theory. All the essential topics in information theory are covered in detail, including entropy, data compression, channel capacity, rate distortion, network information theory, and hypothesis testing. The authors provide readers with a solid understanding of the underlying theory and applications. Problem sets and a telegraphic summary at the end of each chapter further assist readers. The historical notes that follow each chapter recap the main points.
The Second Edition features:
* Chapters reorganized to improve teaching
* 200 new problems
* New material on source coding, portfolio theory, and feedback capacity
* Updated references
Now current and enhanced, the Second Edition of Elements of Information Theory remains the ideal textbook for upper-level undergraduate and graduate courses in electrical engineering, statistics, and telecommunications.
An Introduction to Systems Biology: Design Principles of Biological Circuits
Uri Alon - 2006
It provides a simple mathematical framework which can be used to understand and even design biological circuits. The text avoids specialist terms, focusing instead on several well-studied biological systems that concisely demonstrate key principles. An Introduction to Systems Biology: Design Principles of Biological Circuits builds a solid foundation for the intuitive understanding of general principles. It encourages the reader to ask why a system is designed in a particular way and then proceeds to answer with simplified models.
The Outer Limits of Reason: What Science, Mathematics, and Logic Cannot Tell Us
Noson S. Yanofsky - 2013
This book investigates what cannot be known. Rather than exploring the amazing facts that science, mathematics, and reason have revealed to us, this work studies what science, mathematics, and reason tell us cannot be revealed. In The Outer Limits of Reason, Noson Yanofsky considers what cannot be predicted, described, or known, and what will never be understood. He discusses the limitations of computers, physics, logic, and our own thought processes. Yanofsky describes simple tasks that would take computers trillions of centuries to complete and other problems that computers can never solve; perfectly formed English sentences that make no sense; different levels of infinity; the bizarre world of the quantum; the relevance of relativity theory; the causes of chaos theory; math problems that cannot be solved by normal means; and statements that are true but cannot be proven. He explains the limitations of our intuitions about the world -- our ideas about space, time, and motion, and the complex relationship between the knower and the known. Moving from the concrete to the abstract, from problems of everyday language to straightforward philosophical questions to the formalities of physics and mathematics, Yanofsky demonstrates a myriad of unsolvable problems and paradoxes. Exploring the various limitations of our knowledge, he shows that many of these limitations have a similar pattern and that by investigating these patterns, we can better understand the structure and limitations of reason itself. Yanofsky even attempts to look beyond the borders of reason to see what, if anything, is out there.
Advanced Engineering Mathematics [with Accompanying Mathematics Manual]
Erwin Kreyszig - 1998
The Idea Factory: Bell Labs and the Great Age of American Innovation
Jon Gertner - 2012
From the transistor to the laser, it's hard to find an aspect of modern life that hasn't been touched by Bell Labs. Why did so many transformative ideas come from Bell Labs? In "The Idea Factory," Jon Gertner traces the origins of some of the twentieth century's most important inventions and delivers a riveting and heretofore untold chapter of American history. At its heart this is a story about the life and work of a small group of brilliant and eccentric men (Mervin Kelly, Bill Shockley, Claude Shannon, John Pierce, and Bill Baker) who spent their careers at Bell Labs. Their job was to research and develop the future of communications. Small-town boys, childhood hobbyists, oddballs: they give the lie to the idea that Bell Labs was a grim cathedral of top-down command and control. Gertner brings to life the powerful alchemy of the forces at work behind Bell Labs' inventions, teasing out the intersections between science, business, and society. He distills the lessons that abide: how to recruit and nurture young talent; how to organize and lead fractious employees; how to find solutions to the most stubbornly vexing problems; how to transform a scientific discovery into a marketable product, then make it even better, cheaper, or both. Today, when the drive to invent has become a mantra, Bell Labs offers us a way to enrich our understanding of the challenges and solutions to technological innovation. Here, after all, was where the foundational ideas on the management of innovation were born. "The Idea Factory" is the story of the origins of modern communications and the beginnings of the information age: a deeply human story of extraordinary men who were given extraordinary means (time, space, funds, and access to one another) and edged the world into a new dimension.
The R Book
Michael J. Crawley - 2007
The R language is recognised as one of the most powerful and flexible statistical software packages, and it enables the user to apply many statistical techniques that would be impossible to implement without such software, especially when working with large data sets.
The Emperor's New Mind: Concerning Computers, Minds and the Laws of Physics
Roger Penrose - 1989
Admittedly, computers now play chess at the grandmaster level, but do they understand the game as we do? Can a computer eventually do everything a human mind can do? In this absorbing and frequently contentious book, Roger Penrose--eminent physicist and winner, with Stephen Hawking, of the prestigious Wolf Prize--puts forward his view that there are some facets of human thinking that can never be emulated by a machine. Penrose examines what physics and mathematics can tell us about how the mind works, what they can't, and what we need to know to understand the physical processes of consciousness. He is among a growing number of physicists who think Einstein wasn't being stubborn when he said his little finger told him that quantum mechanics is incomplete, and he concludes that laws even deeper than quantum mechanics are essential for the operation of a mind. To support this contention, Penrose takes the reader on a dazzling tour that covers such topics as complex numbers, Turing machines, complexity theory, quantum mechanics, formal systems, Gödel undecidability, phase spaces, Hilbert spaces, black holes, white holes, Hawking radiation, entropy, quasicrystals, the structure of the brain, and scores of other subjects. The Emperor's New Mind will appeal to anyone with a serious interest in modern physics and its relation to philosophical issues, as well as to physicists, mathematicians, philosophers and those on either side of the AI debate.
Archimedes' Revenge: The Joys and Perils of Mathematics
Paul Hoffman - 1988
"An extremely clever account." --The New Yorker
Quantum Computing Since Democritus
Scott Aaronson - 2013
Full of insights, arguments and philosophical perspectives, the book covers an amazing array of topics. Beginning in antiquity with Democritus, it progresses through logic and set theory, computability and complexity theory, quantum computing, cryptography, the information content of quantum states and the interpretation of quantum mechanics. There are also extended discussions about time travel, Newcomb's Paradox, the anthropic principle and the views of Roger Penrose. Aaronson's informal style makes this fascinating book accessible to readers with scientific backgrounds, as well as students and researchers working in physics, computer science, mathematics and philosophy.