Book picks similar to Berkeley Problems in Mathematics by Paulo Ney de Souza
The Elements of Statistical Learning: Data Mining, Inference, and Prediction
Trevor Hastie - 2001
The recent explosion in computation and information technology has brought with it vast amounts of data in a variety of fields such as medicine, biology, finance, and marketing. The challenge of understanding these data has led to the development of new tools in the field of statistics, and spawned new areas such as data mining, machine learning, and bioinformatics. Many of these tools have common underpinnings but are often expressed with different terminology. This book describes the important ideas in these areas in a common conceptual framework. While the approach is statistical, the emphasis is on concepts rather than mathematics. Many examples are given, with a liberal use of color graphics. It should be a valuable resource for statisticians and anyone interested in data mining in science or industry. The book's coverage is broad, from supervised learning (prediction) to unsupervised learning. The many topics include neural networks, support vector machines, classification trees, and boosting, the first comprehensive treatment of this topic in any book. Trevor Hastie, Robert Tibshirani, and Jerome Friedman are professors of statistics at Stanford University. They are prominent researchers in this area: Hastie and Tibshirani developed generalized additive models and wrote a popular book of that title. Hastie wrote much of the statistical modeling software in S-PLUS and invented principal curves and surfaces. Tibshirani proposed the Lasso and is co-author of the very successful An Introduction to the Bootstrap. Friedman is the co-inventor of many data-mining tools including CART, MARS, and projection pursuit.
The Monty Hall Problem: The Remarkable Story of Math's Most Contentious Brain Teaser
Jason Rosenhouse - 2009
Imagine that you face three doors, behind one of which is a prize. You choose one but do not open it. The host--call him Monty Hall--opens a different door, always choosing one he knows to be empty. Left with two doors, will you do better by sticking with your first choice, or by switching to the other remaining door? In this light-hearted yet ultimately serious book, Jason Rosenhouse explores the history of this fascinating puzzle. Using a minimum of mathematics (and none at all for much of the book), he shows how the problem has fascinated philosophers, psychologists, and many others, and examines the many variations that have appeared over the years. As Rosenhouse demonstrates, the Monty Hall Problem illuminates fundamental mathematical issues and has abiding philosophical implications. Perhaps most important, he writes, the problem opens a window on our cognitive difficulties in reasoning about uncertainty.
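The switching advantage is easy to check empirically. Below is a minimal simulation sketch in Python, my own illustration rather than anything from Rosenhouse's book; the helper names play() and estimate() are hypothetical. Staying wins the prize about 1/3 of the time, switching about 2/3.

```python
# Monty Hall simulation: compare the "stay" and "switch" strategies.
# Illustrative sketch only, not code from the book.
import random

def play(switch: bool) -> bool:
    """Play one round; return True if the contestant wins the prize."""
    doors = [0, 1, 2]
    prize = random.choice(doors)
    first_pick = random.choice(doors)
    # Monty opens a door that is neither the contestant's pick nor the prize.
    monty_opens = random.choice([d for d in doors if d != first_pick and d != prize])
    if switch:
        # Switch to the only remaining unopened door.
        final_pick = next(d for d in doors if d not in (first_pick, monty_opens))
    else:
        final_pick = first_pick
    return final_pick == prize

def estimate(switch: bool, trials: int = 100_000) -> float:
    """Estimate the win probability of a strategy by repeated play."""
    return sum(play(switch) for _ in range(trials)) / trials

if __name__ == "__main__":
    print(f"stay   wins ~{estimate(switch=False):.3f}")  # close to 1/3
    print(f"switch wins ~{estimate(switch=True):.3f}")   # close to 2/3
```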
Grokking Algorithms: An Illustrated Guide for Programmers and Other Curious People
Aditya Y. Bhargava - 2015
The algorithms you'll use most often as a programmer have already been discovered, tested, and proven. If you want to take a hard pass on Knuth's brilliant but impenetrable theories and the dense multi-page proofs you'll find in most textbooks, this is the book for you. This fully illustrated and engaging guide makes it easy for you to learn how to use algorithms effectively in your own programs. Grokking Algorithms is a disarming take on a core computer science topic. In it, you'll learn how to apply common algorithms to the practical problems you face in day-to-day life as a programmer. You'll start with problems like sorting and searching. As you build up your skills in thinking algorithmically, you'll tackle more complex concerns such as data compression or artificial intelligence. Whether you're writing business software, video games, mobile apps, or system utilities, you'll learn algorithmic techniques for solving problems that you thought were out of your grasp. For example, you'll be able to:
- Write a spell checker using graph algorithms
- Understand how data compression works using Huffman coding
- Identify problems that take too long to solve with naive algorithms, and attack them with algorithms that give you an approximate answer instead
Each carefully presented example includes helpful diagrams and fully annotated code samples in Python. By the end of this book, you will know some of the most widely applicable algorithms as well as how and when to use them.
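As a small taste of one topic the blurb mentions, data compression with Huffman coding, here is a minimal sketch in Python. It is my own illustration, not code from the book, and huffman_codes() is a hypothetical helper name: frequent symbols end up with shorter bit strings, which is the whole trick.

```python
# Minimal Huffman-coding sketch (illustrative only, not code from the book).
import heapq
from collections import Counter

def huffman_codes(text: str) -> dict[str, str]:
    """Build a prefix-free code: frequent symbols get shorter bit strings."""
    freq = Counter(text)
    # Each heap entry: (weight, tie_breaker, {symbol: code_so_far}).
    heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:  # degenerate case: only one distinct symbol
        _, _, codes = heap[0]
        return {sym: "0" for sym in codes}
    tie = len(heap)
    while len(heap) > 1:
        w1, _, left = heapq.heappop(heap)
        w2, _, right = heapq.heappop(heap)
        # Merge the two lightest subtrees, prepending one bit to each side.
        merged = {s: "0" + c for s, c in left.items()}
        merged.update({s: "1" + c for s, c in right.items()})
        heapq.heappush(heap, (w1 + w2, tie, merged))
        tie += 1
    return heap[0][2]

if __name__ == "__main__":
    message = "abracadabra"
    codes = huffman_codes(message)
    encoded = "".join(codes[ch] for ch in message)
    print(codes)  # the most frequent symbol gets the shortest code
    print(len(encoded), "bits vs", 8 * len(message), "bits uncompressed")
```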
On Formally Undecidable Propositions of Principia Mathematica and Related Systems
Kurt Gödel - 1992
Kurt Gödel maintained, and offered detailed proof, that in any arithmetic system, even in elementary parts of arithmetic, there are propositions which cannot be proved or disproved within the system. It is thus impossible, within such a system, to prove that the basic axioms of arithmetic will not give rise to contradictions. The repercussions of this discovery are still being felt and debated in 20th-century mathematics. The present volume reprints the first English translation of Gödel's far-reaching work. Not only does it make the argument more intelligible, but the introduction contributed by Professor R. B. Braithwaite (Cambridge University), an excellent work of scholarship in its own right, illuminates it by paraphrasing the major part of the argument. This Dover edition thus makes widely available a superb edition of a classic work of original thought, one that will be of profound interest to mathematicians, logicians and anyone interested in the history of attempts to establish axioms that would provide a rigorous basis for all mathematics. Translated by B. Meltzer, University of Edinburgh. Preface. Introduction by R. B. Braithwaite.
Information Theory, Inference and Learning Algorithms
David J.C. MacKay - 2002
These topics lie at the heart of many exciting areas of contemporary science and engineering - communication, signal processing, data mining, machine learning, pattern recognition, computational neuroscience, bioinformatics, and cryptography. This textbook introduces theory in tandem with applications. Information theory is taught alongside practical communication systems, such as arithmetic coding for data compression and sparse-graph codes for error-correction. A toolbox of inference techniques, including message-passing algorithms, Monte Carlo methods, and variational approximations, is developed alongside applications of these tools to clustering, convolutional codes, independent component analysis, and neural networks. The final part of the book describes the state of the art in error-correcting codes, including low-density parity-check codes, turbo codes, and digital fountain codes -- the twenty-first-century standards for satellite communications, disk drives, and data broadcast. Richly illustrated, filled with worked examples and over 400 exercises, some with detailed solutions, David MacKay's groundbreaking book is ideal for self-learning and for undergraduate or graduate courses. Interludes on crosswords, evolution, and sex provide entertainment along the way. In sum, this is a textbook on information, communication, and coding for a new generation of students, and an unparalleled entry point into these subjects for professionals in areas as diverse as computational biology, financial engineering, and machine learning.