How to Lie with Statistics
Darrell Huff - 1954
Darrell Huff runs the gamut of every popularly used type of statistic, probes such things as the sample study, the tabulation method, the interview technique, and the way results are derived from the figures, and points out the countless dodges that are used to fool rather than to inform.
The Fourth Paradigm: Data-Intensive Scientific Discovery
Tony Hey - 2009
Increasingly, scientific breakthroughs will be powered by advanced computing capabilities that help researchers manipulate and explore massive datasets. The speed at which any given scientific discipline advances will depend on how well its researchers collaborate with one another, and with technologists, in areas of eScience such as databases, workflow management, visualization, and cloud-computing technologies. This collection of essays expands on the vision of pioneering computer scientist Jim Gray for a new, fourth paradigm of discovery based on data-intensive science and offers insights into how it can be fully realized.
The Myth of Artificial Intelligence: Why Computers Can't Think the Way We Do
Erik J. Larson - 2021
What hope do we have against superintelligent machines? But we aren't really on the path to developing intelligent machines. In fact, we don't even know where that path might be. A tech entrepreneur and pioneering research scientist working at the forefront of natural language processing, Erik Larson takes us on a tour of the landscape of AI to show how far we are from superintelligence, and what it would take to get there. Ever since Alan Turing, AI enthusiasts have equated artificial intelligence with human intelligence. This is a profound mistake. AI works on inductive reasoning, crunching data sets to predict outcomes. But humans don't correlate data sets: we make conjectures informed by context and experience. Human intelligence is a web of best guesses, given what we know about the world. We haven't a clue how to program this kind of intuitive reasoning, known as abduction. Yet it is the heart of common sense. That's why Alexa can't understand what you are asking, and why AI can only take us so far. Larson argues that AI hype is both bad science and bad for science. A culture of invention thrives on exploring unknowns, not overselling existing methods. Inductive AI will continue to improve at narrow tasks, but if we want to make real progress, we will need to start by more fully appreciating the only true intelligence we know--our own.
Statistical Techniques in Business & Economics [With CDROM]
Douglas A. Lind - 1974
The text is non-threatening and presents concepts clearly and succinctly in a conversational writing style. All statistical concepts are illustrated with solved applied examples immediately upon introduction. Self-reviews and exercises for each section, and review sections for groups of chapters, further support student learning. Modern computing applications (Excel, Minitab, and MegaStat) are introduced, but the text maintains its focus on statistical concepts as applied in business rather than on technology or programming methods. The thirteenth edition continues as a student-oriented text with increased emphasis on the interpretation of data and results.
How to Solve It: A New Aspect of Mathematical Method
George Pólya - 1944
In How to Solve It, Polya shows anyone in any field how to think straight. In lucid and appealing prose, he reveals how the mathematical method of demonstrating a proof or finding an unknown can be of help in attacking any problem that can be reasoned out--from building a bridge to winning a game of anagrams. Generations of readers have relished Polya's deft--indeed, brilliant--instructions on stripping away irrelevancies and going straight to the heart of the problem.
Algorithms
Robert Sedgewick - 1983
This book surveys the most important computer algorithms currently in use and provides a full treatment of data structures and algorithms for sorting, searching, graph processing, and string processing -- including fifty algorithms every programmer should know. In this edition, new Java implementations are written in an accessible modular programming style, where all of the code is exposed to the reader and ready to use. The algorithms in this book represent a body of knowledge developed over the last 50 years that has become indispensable, not just for professional programmers and computer science students but for any student with interests in science, mathematics, and engineering, not to mention students who use computation in the liberal arts. The companion web site, algs4.cs.princeton.edu, contains an online synopsis, full Java implementations, test data, exercises and answers, dynamic visualizations, lecture slides, programming assignments with checklists, and links to related material. The MOOC related to this book is accessible via the "Online Course" link at algs4.cs.princeton.edu. The course offers more than 100 video lecture segments that are integrated with the text, extensive online assessments, and the large-scale discussion forums that have proven so valuable. Offered each fall and spring, this course regularly attracts tens of thousands of registrants. Robert Sedgewick and Kevin Wayne are developing a modern approach to disseminating knowledge that fully embraces technology, enabling people all around the world to discover new ways of learning and teaching. By integrating their textbook, online content, and MOOC, all at the state of the art, they have built a unique resource that greatly expands the breadth and depth of the educational experience.
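The book's reference implementations are in Java; purely as an illustration of the kind of elementary searching algorithm it covers, here is binary search sketched in Python (an editor's paraphrase of the standard technique, not code from the book):

    def binary_search(sorted_items, target):
        """Return the index of target in a sorted list, or -1 if it is absent."""
        lo, hi = 0, len(sorted_items) - 1
        while lo <= hi:
            mid = (lo + hi) // 2
            if sorted_items[mid] < target:
                lo = mid + 1   # target, if present, lies in the upper half
            elif sorted_items[mid] > target:
                hi = mid - 1   # target, if present, lies in the lower half
            else:
                return mid
        return -1

    print(binary_search([2, 3, 5, 7, 11, 13], 11))   # prints 4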
Big Data: A Very Short Introduction
Dawn E. Holmes - 2018
Once access to the Internet became a reality for large swathes of the world's population, the amount of data generated each day became huge, and continues to grow exponentially. It includes all our uploaded documents, video, and photos, all our social media traffic, our online shopping, even the GPS data from our cars. "Big Data" represents a qualitative change, not simply a quantitative one. The term refers both to the new technologies involved, and to the way it can be used by business and government. Dawn E. Holmes uses a variety of case studies to explain how data is stored, analyzed, and exploited by a variety of bodies from big companies to organizations concerned with disease control. Big data is transforming the way businesses operate, and the way medical research can be carried out. At the same time, it raises important ethical issues; Holmes discusses cases such as the Snowden affair, data security, and domestic smart devices which can be hijacked by hackers. ABOUT THE SERIES: The Very Short Introductions series from Oxford University Press contains hundreds of titles in almost every subject area. These pocket-sized books are the perfect way to get ahead in a new subject quickly. Our expert authors combine facts, analysis, perspective, new ideas, and enthusiasm to make interesting and challenging topics highly readable.
Python 3 Object Oriented Programming
Dusty Phillips - 2010
Many examples are taken from real-world projects. The book focuses on high-level design as well as the gritty details of Python syntax. The exercises provided inspire readers to think about their own code rather than handing them solved problems. If you're new to object-oriented programming techniques, or if you have basic Python skills and wish to learn in depth how and when to correctly apply object-oriented programming in Python, this is the book for you. If you are an object-oriented programmer in another language, you too will find this book a useful introduction to Python, as it uses terminology you are already familiar with. Python 2 programmers seeking a leg up in the new world of Python 3 will also find the book beneficial, though prior knowledge of Python 2 is not required.
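For readers who have never seen Python's class syntax, a tiny snippet (an illustrative sketch, not an example taken from the book) shows the flavor of object-oriented code the book builds on:

    class Point:
        """A small value object of the kind often used to introduce OOP."""

        def __init__(self, x, y):
            self.x = x
            self.y = y

        def moved(self, dx, dy):
            # Return a new Point instead of mutating this one.
            return Point(self.x + dx, self.y + dy)

    p = Point(1.0, 2.0).moved(0.5, -0.5)
    print(p.x, p.y)   # 1.5 1.5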
Concrete Mathematics: A Foundation for Computer Science
Ronald L. Graham - 1988
"More concretely," the authors explain, "it is the controlled manipulation of mathematical formulas, using a collection of techniques for solving problems."
Rebooting AI: Building Artificial Intelligence We Can Trust
Gary F. Marcus - 2019
Professors Gary Marcus and Ernest Davis have spent their careers at the forefront of AI research and have witnessed some of the greatest milestones in the field, but they argue that a computer winning at games like Jeopardy! and Go does not signal that we are on the doorstep of fully autonomous cars or superintelligent machines. The achievements in the field thus far have occurred in closed systems with fixed sets of rules. These approaches are too narrow to achieve genuine intelligence. The world we live in is wildly complex and open-ended. How can we bridge this gap? What will the consequences be when we do? Marcus and Davis show us what we first need to accomplish before we get there and argue that if we are wise along the way, we won't need to worry about a future of machine overlords. If we heed their advice, humanity can create an AI that we can trust in our homes, our cars, and our doctors' offices. Rebooting AI provides a lucid, clear-eyed assessment of the current science and offers an inspiring vision of what we can achieve and how AI can make our lives better.
HTML Black Book: The Programmer's Complete HTML Reference Book
Steven Holzner - 2000
An immediate and comprehensive answer source, rather than a diffuse tutorial, for serious programmers who want to see difficult material covered in depth without the fluff. Discusses XML, dynamic HTML, JavaScript, Java, and Perl CGI programming to create a full Web site programming package. Written by the author of several successful titles published by The Coriolis Group.
Algorithmic Puzzles
Anany V. Levitin - 2011
The logic of algorithmic thinking extends far beyond the realm of computer science and into the wide and entertaining world of puzzles. In Algorithmic Puzzles, Anany and Maria Levitin use many classic brainteasers as well as newer examples from job interviews with major corporations to show readers how to apply analytical thinking to solve puzzles requiring well-defined procedures. The book's unique collection of puzzles is supplemented with carefully developed tutorials on algorithm design strategies and analysis techniques intended to walk the reader step-by-step through the various approaches to algorithmic problem solving. Mastery of these strategies--exhaustive search, backtracking, and divide-and-conquer, among others--will aid the reader in solving not only the puzzles contained in this book, but also others encountered in interviews, puzzle collections, and throughout everyday life. Each of the 150 puzzles contains hints and solutions, along with commentary on the puzzle's origins and solution methods. The only book of its kind, Algorithmic Puzzles houses puzzles for all skill levels. Readers with only middle school mathematics will develop their algorithmic problem-solving skills through puzzles at the elementary level, while seasoned puzzle solvers will enjoy the challenge of thinking through more difficult puzzles.
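As a taste of one of the strategies named above, here is a short backtracking sketch in Python (an editor's illustration, not a solution reproduced from the book) that counts placements of n non-attacking queens, a classic puzzle of exactly this kind:

    def count_queens(n):
        """Count placements of n non-attacking queens on an n-by-n board."""
        def place(row, cols, diag1, diag2):
            if row == n:
                return 1                        # every row filled: one valid placement
            total = 0
            for col in range(n):
                if col in cols or (row - col) in diag1 or (row + col) in diag2:
                    continue                    # square is attacked: prune this branch
                total += place(row + 1, cols | {col},
                               diag1 | {row - col}, diag2 | {row + col})
            return total
        return place(0, set(), set(), set())

    print(count_queens(6))   # prints 4: there are four solutions on a 6x6 board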
SQL Performance Explained
Markus Winand - 2011
The focus is on SQL: it covers all major SQL databases without getting lost in the details of any one specific product. Starting with the basics of indexing and the WHERE clause, SQL Performance Explained guides developers through all parts of an SQL statement and explains the pitfalls of object-relational mapping (ORM) tools like Hibernate. Topics covered include: using multi-column indexes; correctly applying SQL functions; efficient use of LIKE queries; optimizing join operations; clustering data to improve performance; pipelined execution of ORDER BY and GROUP BY; getting the best performance for pagination queries; understanding the scalability of databases. Its systematic structure makes SQL Performance Explained both a textbook and a reference manual that should be on every developer's bookshelf.
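As a flavor of the indexing-and-WHERE-clause material the book opens with, here is a minimal sketch using Python's built-in sqlite3 module (the table, column, and index names are invented for illustration; this is not an example from the book):

    import sqlite3

    conn = sqlite3.connect(":memory:")   # throwaway in-memory database
    conn.execute("CREATE TABLE employees (id INTEGER PRIMARY KEY, last_name TEXT, dept TEXT)")
    conn.execute("CREATE INDEX idx_emp_last_name ON employees (last_name)")

    # With the index in place, an equality predicate in the WHERE clause can be
    # answered by an index search instead of a full table scan.
    plan = conn.execute(
        "EXPLAIN QUERY PLAN SELECT * FROM employees WHERE last_name = ?",
        ("Winand",),
    ).fetchall()
    print(plan)   # SQLite reports a SEARCH ... USING INDEX idx_emp_last_name step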
The Seven Pillars of Statistical Wisdom
Stephen M. Stigler - 2016
Stigler's first pillar, aggregation, allows one to gain information by discarding information, namely, the individuality of the observations. The second pillar, information measurement, challenges the importance of big data by noting that observations are not all equally important: the amount of information in a data set is often proportional to only the square root of the number of observations, not the absolute number. The third idea is likelihood, the calibration of inferences with the use of probability. Intercomparison is the principle that statistical comparisons do not need to be made with respect to an external standard. The fifth pillar is regression, both a paradox (tall parents on average produce shorter children; tall children on average have shorter parents) and the basis of inference, including Bayesian inference and causal reasoning. The sixth concept captures the importance of experimental design, for example by recognizing the gains to be had from a combinatorial approach with rigorous randomization. The seventh idea is the residual: the notion that a complicated phenomenon can be simplified by subtracting the effect of known causes, leaving a residual phenomenon that can be explained more easily. The Seven Pillars of Statistical Wisdom presents an original, unified account of statistical science that will fascinate the interested layperson and engage the professional statistician.
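The square-root remark above is the familiar behavior of the standard error of a mean; in standard notation (the usual formula, not a quotation from the book):

    \mathrm{SE}(\bar{x}) = \frac{\sigma}{\sqrt{n}}

Quadrupling the number of observations n only halves the uncertainty, which is the sense in which information grows like the square root of the sample size rather than the sample size itself.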
Prediction Machines: The Simple Economics of Artificial Intelligence
Ajay Agrawal - 2018
But facing the sea change that AI will bring can be paralyzing. How should companies set strategies, governments design policies, and people plan their lives for a world so different from what we know? In the face of such uncertainty, many analysts either cower in fear or predict an impossibly sunny future. But in Prediction Machines, three eminent economists recast the rise of AI as a drop in the cost of prediction. With this single, masterful stroke, they lift the curtain on the AI-is-magic hype and show how basic tools from economics provide clarity about the AI revolution and a basis for action by CEOs, managers, policy makers, investors, and entrepreneurs. When AI is framed as cheap prediction, its extraordinary potential becomes clear:
Prediction is at the heart of making decisions under uncertainty. Our businesses and personal lives are riddled with such decisions.
Prediction tools increase productivity--operating machines, handling documents, communicating with customers.
Uncertainty constrains strategy. Better prediction creates opportunities for new business structures and strategies to compete.
Penetrating, fun, and always insightful and practical, Prediction Machines follows its inescapable logic to explain how to navigate the changes on the horizon. The impact of AI will be profound, but the economic framework for understanding it is surprisingly simple.
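The "cheap prediction feeds a decision" framing can be made concrete with a toy expected-cost calculation in Python (invented numbers, an editor's illustration rather than an example from the book):

    def expected_cost(p_rain, action):
        """Toy decision under uncertainty: carry an umbrella or not, given a predicted chance of rain."""
        cost = {
            ("umbrella", True): 1.0,      # carrying is a small fixed nuisance
            ("umbrella", False): 1.0,
            ("no umbrella", True): 10.0,  # getting soaked is expensive
            ("no umbrella", False): 0.0,
        }
        return p_rain * cost[(action, True)] + (1 - p_rain) * cost[(action, False)]

    p_rain = 0.3   # the prediction machine supplies this number cheaply
    best = min(["umbrella", "no umbrella"], key=lambda a: expected_cost(p_rain, a))
    print(best)    # umbrella: expected cost 1.0 beats 3.0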