Computational Complexity


Sanjeev Arora - 2007
    Requiring essentially no background apart from mathematical maturity, the book can be used as a reference for self-study by anyone interested in complexity, including physicists, mathematicians, and other scientists, as well as a textbook for a variety of courses and seminars. More than 300 exercises are included, with hints provided for a selected subset.

Statistics for Dummies


Deborah J. Rumsey - 2003
    . ." and "The data bear this out. . . ." But the field of statistics is not just about data. Statistics is the entire process involved in gathering evidence to answer questions about the world, in cases where that evidence happens to be numerical data. Statistics For Dummies is for everyone who wants to sort through and evaluate the incredible amount of statistical information that comes to them on a daily basis. (You know the stuff: charts, graphs, tables, as well as headlines that talk about the results of the latest poll, survey, experiment, or other scientific study.) This book arms you with the ability to decipher and make important decisions about statistical results, being ever aware of the ways in which people can mislead you with statistics. Get the inside scoop on number-crunching nuances, plus insight into how you canDetermine the odds Calculate a standard score Find the margin of error Recognize the impact of polls Establish criteria for a good survey Make informed decisions about experiments This down-to-earth reference is chock-full of real examples from real sources that are relevant to your everyday life: from the latest medical breakthroughs, crime studies, and population trends to surveys on Internet dating, cell phone use, and the worst cars of the millennium. Statistics For Dummies departs from traditional statistics texts, references, supplement books, and study guides in the following ways:Practical and intuitive explanations of statistical concepts, ideas, techniques, formulas, and calculations. Clear and concise step-by-step procedures that intuitively explain how to work through statistics problems. Upfront and honest answers to your questions like, "What does this really mean?" and "When and how I will ever use this?" Chances are, Statistics For Dummies will be your No. 1 resource for discovering how numerical data figures into your corner of the universe.

Programming Windows 8 Apps with HTML, CSS, and JavaScript


Kraig Brockschmidt - 2012
    

Rage Inside the Machine: The Prejudice of Algorithms, and How to Stop the Internet Making Bigots of Us All


Robert Elliott Smith - 2019
    Frighteningly often, the influence of technology in and on our lives goes completely unchallenged by citizens and governments. We comfort ourselves with the soothing refrain that technology has no morals and can display no prejudice, and it's only the users of technology who distort certain aspects of it. But is this statement actually true? Dr Robert Smith thinks it is dangerously untrue in the modern era. Having worked in the field of artificial intelligence for over 30 years, Smith reveals the mounting evidence that the mechanical actors in our lives do indeed have, or at least express, morals: they're just not the morals of the progressive modern society that we imagined we were moving towards. Instead, as we are just beginning to see – in the US elections and Brexit, to name but a few – there are increasing incidences of machine bigotry, greed and the crass manipulation of our basest instincts. It is easy to assume that these are the result of programmer prejudices or the product of dark forces manipulating the masses through the network of the Internet. But what if there is something more fundamental and explicitly mechanical at play, something inherent within technology itself? This book demonstrates how non-scientific ideas have been encoded deep into our technological infrastructure. Offering a rigorous, fresh perspective on how technology has brought us to this place, Rage Inside the Machine challenges the long-held assumption that technology is an apolitical and amoral force. Shedding light on little-known historical stories and investigating the complex connections between scientific philosophy, institutional prejudice and new technology, this book offers a new, honest and more truly scientific vision of ourselves.

The Art of Doing Science and Engineering: Learning to Learn


Richard Hamming - 1996
    By presenting actual experiences and analyzing them as they are described, the author conveys the developmental thought processes employed and shows that a style of thinking that leads to successful results can be learned. Along with spectacular successes, the author also conveys how failures contributed to shaping the thought processes. The book provides the reader with a style of thinking that will enhance a person's ability to function as a problem-solver of complex technical issues, and it consists of a collection of stories about the author's participation in significant discoveries, relating how those discoveries came about and, most importantly, providing analysis of the thought processes and reasoning that took place as the author and his associates progressed through engineering problems.

Python Machine Learning


Sebastian Raschka - 2015
    We are living in an age where data comes in abundance, and thanks to the self-learning algorithms from the field of machine learning, we can turn this data into knowledge. Automated speech recognition on our smartphones, web search engines, e-mail spam filters, the recommendation systems of our favorite movie streaming services – machine learning makes it all possible. Thanks to the many powerful open-source libraries that have been developed in recent years, machine learning is now right at our fingertips. Python provides the perfect environment to build machine learning systems productively. This book will teach you the fundamentals of machine learning and how to utilize them in real-world applications using Python. Step by step, you will expand your skill set with the best practices for transforming raw data into useful information, developing learning algorithms efficiently, and evaluating results. You will discover the different problem categories that machine learning can solve and explore how to classify objects, predict continuous outcomes with regression analysis, and find hidden structures in data via clustering. You will build your own machine learning system for sentiment analysis and, finally, learn how to embed your model into a web app to share with the world.
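
A minimal sketch of the classify-and-evaluate workflow the book walks through, assuming scikit-learn is installed; the dataset and model choice here are illustrative, not the book's own examples.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

# Load a small built-in dataset and hold out 30% of it for evaluation.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# Fit a simple baseline classifier, then measure accuracy on the unseen data.
model = LogisticRegression(max_iter=200)
model.fit(X_train, y_train)
print(accuracy_score(y_test, model.predict(X_test)))
```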

OS X 10.10 Yosemite: The Ars Technica Review


John Siracusa - 2014
    Siracusa's overview, wrap-up, and critique of everything new in OS X 10.10 Yosemite.

Game Theory: A Nontechnical Introduction


Morton D. Davis - 1970
    ". . . a most valuable contribution." — Douglas R. Hofstadter, author of Gödel, Escher, Bach. The foundations of game theory were laid by John von Neumann, who in 1928 proved the basic minimax theorem, and with the 1944 publication of the Theory of Games and Economic Behavior, the field was established. Since then, game theory has become an enormously important discipline because of its novel mathematical properties and its many applications to social, economic, and political problems. Game theory has been used to make investment decisions, pick jurors, commit tanks to battle, allocate business expenses equitably — even to measure a senator's power, among many other uses. In this revised edition of his highly regarded work, Morton Davis begins with an overview of game theory, then discusses the two-person zero-sum game with equilibrium points; the general, two-person zero-sum game; utility theory; the two-person, non-zero-sum game; and the n-person game. A number of problems are posed at the start of each chapter and readers are given a chance to solve them before moving on. (Unlike most mathematical problems, many problems in game theory are easily understood by the lay reader.) At the end of the chapter, where solutions are discussed, readers can compare their "common sense" solutions with those of the author. Brimming with applications to an enormous variety of everyday situations, this book offers readers a fascinating, accessible introduction to one of the most fruitful and interesting intellectual systems of our time.
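
To make the two-person zero-sum game concrete, here is a Python sketch of the classical textbook formula for the mixed-strategy solution of a 2x2 zero-sum game; the formula is standard material, not quoted from Davis, and it assumes the game has no saddle point.

```python
def solve_2x2_zero_sum(a, b, c, d):
    """Mixed-strategy solution of a zero-sum game with row-player payoffs
    [[a, b], [c, d]], assuming no saddle point (no pure-strategy solution)."""
    denom = a - b - c + d
    p = (d - c) / denom              # probability the row player picks row 1
    value = (a * d - b * c) / denom  # expected payoff to the row player
    return p, value

# Matching pennies: both players should randomize 50/50, and the game is fair.
print(solve_2x2_zero_sum(1, -1, -1, 1))  # (0.5, 0.0)
```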

Structure and Interpretation of Computer Programs


Harold Abelson - 1984
    This long-awaited revision contains changes throughout the text. There are new implementations of most of the major programming systems in the book, including the interpreters and compilers, and the authors have incorporated many small changes that reflect their experience teaching the course at MIT since the first edition was published. A new theme has been introduced that emphasizes the central role played by different approaches to dealing with time in computational models: objects with state, concurrent programming, functional programming and lazy evaluation, and nondeterministic programming. There are new example sections on higher-order procedures in graphics and on applications of stream processing in numerical programming, and many new exercises. In addition, all the programs have been reworked to run in any Scheme implementation that adheres to the IEEE standard.
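
The stream-processing theme translates naturally into other languages; here is a sketch using Python generators rather than the book's Scheme, showing an infinite, lazily evaluated stream of Newton's-method approximations. The square-root example is in the spirit of the book's numerical streams, though the code itself is not from it.

```python
from itertools import islice

def sqrt_stream(x, guess=1.0):
    """Infinite stream of successively better Newton's-method estimates of sqrt(x)."""
    while True:
        yield guess
        guess = (guess + x / guess) / 2  # average the guess with x/guess

# Only the five approximations we ask for are ever computed.
for approx in islice(sqrt_stream(2), 5):
    print(approx)  # converges toward 1.41421356...
```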

Computer Age Statistical Inference: Algorithms, Evidence, and Data Science


Bradley Efron - 2016
    'Big data', 'data science', and 'machine learning' have become familiar terms in the news, as statistical methods are brought to bear upon the enormous data sets of modern science and commerce. How did we get here? And where are we going? This book takes us on an exhilarating journey through the revolution in data analysis following the introduction of electronic computation in the 1950s. Beginning with classical inferential theories - Bayesian, frequentist, Fisherian - individual chapters take up a series of influential topics: survival analysis, logistic regression, empirical Bayes, the jackknife and bootstrap, random forests, neural networks, Markov chain Monte Carlo, inference after model selection, and dozens more. The distinctly modern approach integrates methodology and algorithms with statistical inference. The book ends with speculation on the future direction of statistics and data science.
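
Among the topics listed, the bootstrap is simple enough to sketch in a few lines of Python; this is a generic illustration of the resampling idea, with invented data, not code from the book.

```python
import random
import statistics

def bootstrap_se(data, stat=statistics.mean, n_resamples=2000, seed=0):
    """Estimate the standard error of a statistic by resampling with replacement."""
    rng = random.Random(seed)
    estimates = [stat(rng.choices(data, k=len(data))) for _ in range(n_resamples)]
    return statistics.stdev(estimates)

sample = [2.1, 3.5, 4.0, 4.8, 5.5, 6.1, 7.3]  # invented data for illustration
print(bootstrap_se(sample))  # bootstrap estimate of the standard error of the mean
```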

Paradigms of Artificial Intelligence Programming: Case Studies in Common LISP


Peter Norvig - 1991
    By reconstructing authentic, complex AI programs using state-of-the-art Common Lisp, the book teaches students and professionals how to build and debug robust practical programs, while demonstrating superior programming style and important AI concepts. The author strongly emphasizes the practical performance issues involved in writing real working programs of significant size. Chapters on troubleshooting and efficiency are included, along with a discussion of the fundamentals of object-oriented programming and a description of the main CLOS functions. This volume is an excellent text for a course on AI programming, a useful supplement for general AI courses and an indispensable reference for the professional programmer.

Compilers: Principles, Techniques, and Tools


Alfred V. Aho - 1986
    The authors present updated coverage of compilers based on research and techniques that have been developed in the field over the past few years. The book provides a thorough introduction to compiler design and covers topics such as context-free grammars, finite state machines, and syntax-directed translation.
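
As a flavor of the finite state machines the book builds lexers from, here is a toy deterministic recognizer in Python; the states and the identifier language are illustrative choices, not an example from the book.

```python
def is_identifier(s):
    """DFA with states 0 (start) and 1 (in identifier); state 1 is accepting."""
    state = 0
    for ch in s:
        if state == 0 and (ch.isalpha() or ch == "_"):
            state = 1
        elif state == 1 and (ch.isalnum() or ch == "_"):
            state = 1
        else:
            return False  # no valid transition: reject
    return state == 1

print(is_identifier("x_1"))  # True
print(is_identifier("1x"))   # False
```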

The Universal Computer: The Road from Leibniz to Turing


Martin D. Davis - 2000
    How can today's computers perform such a bewildering variety of tasks if computing is just glorified arithmetic? The answer, as Martin Davis lucidly illustrates, lies in the fact that computers are essentially engines of logic. Their hardware and software embody concepts developed over centuries by logicians such as Leibniz, Boole, and Gödel, culminating in the amazing insights of Alan Turing. The Universal Computer traces the development of these concepts by exploring with captivating detail the lives and work of the geniuses who first formulated them. Readers will come away with a revelatory understanding of how and why computers work and how the algorithms within them came to be.

Elementary Statistics: Picturing the World


Ron Larson - 2002
    Taking an approach with a visual/graphical emphasis, this text offers a wealth of examples on the premise that students learn best by doing. It emphasizes interpretation of results and critical thinking over calculations.

Calculated Risks: How to Know When Numbers Deceive You


Gerd Gigerenzer - 2002
    H. G. Wells predicted that statistical thinking would be as necessary for citizenship in a technological world as the ability to read and write. But in the twenty-first century, we are often overwhelmed by a baffling array of percentages and probabilities as we try to navigate in a world dominated by statistics. Cognitive scientist Gerd Gigerenzer says that because we haven't learned statistical thinking, we don't understand risk and uncertainty. In order to assess risk -- everything from the risk of an automobile accident to the certainty or uncertainty of some common medical screening tests -- we need a basic understanding of statistics. Astonishingly, doctors and lawyers don't understand risk any better than anyone else. Gigerenzer reports a study in which doctors were told the results of breast cancer screenings and then were asked to explain to a woman who received a positive result the risk that she actually has breast cancer. The actual risk was small because the test gives many false positives. But nearly every physician in the study overstated the risk. Yet many people will have to make important health decisions based on such information and the interpretation of that information by their doctors. Gigerenzer explains that a major obstacle to our understanding of numbers is that we live with an illusion of certainty. Many of us believe that HIV tests, DNA fingerprinting, and the growing number of genetic tests are absolutely certain. But even DNA evidence can produce spurious matches. We cling to our illusion of certainty because the medical industry, insurance companies, investment advisers, and election campaigns have become purveyors of certainty, marketing it like a commodity. To avoid confusion, says Gigerenzer, we should rely on more understandable representations of risk, such as absolute risks. For example, it is said that a mammography screening reduces the risk of dying of breast cancer by 25 percent. But in absolute risks, that means that out of every 1,000 women who do not participate in screening, 4 will die, while out of 1,000 women who do, 3 will die. A 25 percent risk reduction sounds much more significant than a benefit that 1 out of 1,000 women will reap. This eye-opening book explains how we can overcome our ignorance of numbers and better understand the risks we may be taking with our money, our health, and our lives.
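
Gigerenzer's screening example comes down to Bayes' rule expressed in natural frequencies; here is a Python sketch with illustrative numbers (the prevalence, sensitivity, and false-positive rate are assumptions for the demonstration, not figures quoted from the book).

```python
def positive_predictive_value(prevalence, sensitivity, false_positive_rate):
    """P(disease | positive test) via Bayes' rule."""
    true_positives = prevalence * sensitivity
    false_positives = (1 - prevalence) * false_positive_rate
    return true_positives / (true_positives + false_positives)

# In natural frequencies: of 1,000 women, ~8 have cancer and ~7 of them test
# positive, while ~70 of the 992 healthy women also test positive.
print(positive_predictive_value(0.008, 0.90, 0.07))  # ~0.09: roughly 9%, not 90%
```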