Book picks similar to
Numbers and Geometry by John Stillwell


geometry
machine-learning
topology
undergraduate-texts-in-mathematics

The Hundred-Page Machine Learning Book


Andriy Burkov - 2019
    In the space of a week, you will learn almost everything modern machine learning has to offer. The author and other practitioners have spent years learning these concepts.
    Companion wiki — the book has a continuously updated wiki that extends some book chapters with additional information: Q&A, code snippets, further reading, tools, and other relevant resources.
    Flexible price and formats — choose from a variety of formats and price options: Kindle, hardcover, paperback, EPUB, PDF. If you buy an EPUB or a PDF, you decide the price you pay!
    Read first, buy later — download book chapters for free, read them and share them with your friends and colleagues. Buy the book only if you liked it or found it useful in your work, study, or business.

Practical Statistics for Data Scientists: 50 Essential Concepts


Peter Bruce - 2017
    Courses and books on basic statistics rarely cover the topic from a data science perspective. This practical guide explains how to apply various statistical methods to data science, tells you how to avoid their misuse, and gives you advice on what's important and what's not. Many data science resources incorporate statistical methods but lack a deeper statistical perspective. If you're familiar with the R programming language and have some exposure to statistics, this quick reference bridges the gap in an accessible, readable format. With this book, you'll learn:
    - Why exploratory data analysis is a key preliminary step in data science
    - How random sampling can reduce bias and yield a higher-quality dataset, even with big data
    - How the principles of experimental design yield definitive answers to questions
    - How to use regression to estimate outcomes and detect anomalies (sketched in the example below)
    - Key classification techniques for predicting which categories a record belongs to
    - Statistical machine learning methods that "learn" from data
    - Unsupervised learning methods for extracting meaning from unlabeled data
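    A minimal Python sketch of the "regression to estimate outcomes and detect anomalies" idea from the list above. The book itself works in R; the synthetic data, variable names, and 3-sigma residual threshold below are illustrative assumptions, not material from the book.

        import numpy as np

        # Hypothetical data: advertising spend vs. revenue, with one corrupted record.
        rng = np.random.default_rng(0)
        spend = rng.uniform(1, 100, size=200)
        revenue = 3.0 * spend + 20 + rng.normal(0, 5, size=200)
        revenue[17] += 300  # inject an anomaly

        # Fit a simple linear regression by ordinary least squares.
        slope, intercept = np.polyfit(spend, revenue, deg=1)
        predicted = slope * spend + intercept

        # Flag records whose residual is more than 3 standard deviations out.
        residuals = revenue - predicted
        outliers = np.flatnonzero(np.abs(residuals) > 3 * residuals.std())
        print(f"estimated slope={slope:.2f}, intercept={intercept:.2f}")
        print("possible anomalies at indices:", outliers)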

Alan Turing: Unlocking the Enigma


David Boyle - 2014
    Turing's openness about his homosexuality at a time when it was an imprisonable offense ultimately led to his untimely death at the age of only forty-one. In Alan Turing: Unlocking the Enigma, David Boyle reveals the mysteries behind the man and his remarkable career. Aged just 22, Turing was elected a fellow at King's College, Cambridge, on the strength of a dissertation in which he proved the central limit theorem. By the age of 33, he had been awarded the OBE by King George VI for his wartime services: Turing was instrumental in cracking the Nazi Enigma machines at the top-secret code-breaking establishment at Bletchley Park during the Second World War. But his achievements were to be tragically overshadowed by the paranoia of the post-war years. Hounded for his supposedly subversive views and for his sexuality, Turing was prosecuted in 1952 and forced to accept the humiliation of hormone treatment to avoid a prison sentence. Just two years later, at the age of 41, he was dead. The verdict: cyanide poisoning. Was Turing's death accidental, as his mother always claimed? Or did persistent persecution drive him to take his own life? Alan Turing: Unlocking the Enigma seeks to find the man behind the science, illuminating the life of a person who is still a shadowy presence behind his brilliant achievements.

Make Your Own Neural Network


Tariq Rashid - 2016
    Neural networks are a key element of deep learning and artificial intelligence, which today is capable of some truly impressive feats. Yet too few really understand how neural networks actually work. This guide will take you on a fun and unhurried journey, starting from very simple ideas and gradually building up an understanding of how neural networks work. You won't need any mathematics beyond secondary school, and an accessible introduction to calculus is also included. The ambition of this guide is to make neural networks as accessible as possible to as many readers as possible - there are enough texts for advanced readers already! You'll learn to code in Python and make your own neural network, teaching it to recognise human handwritten numbers and performing as well as professionally developed networks. Part 1 is about ideas. We introduce the mathematical ideas underlying neural networks, gently and with lots of illustrations and examples. Part 2 is practical. We introduce the popular and easy-to-learn Python programming language, and gradually build up a neural network which can learn to recognise human handwritten numbers, easily getting it to perform as well as networks made by professionals. Part 3 extends these ideas further. We push the performance of our neural network to an industry-leading 98% using only simple ideas and code, test the network on your own handwriting, take a privileged peek inside the mysterious mind of a neural network, and even get it all working on a Raspberry Pi. All the code in this book has been tested to work on a Raspberry Pi Zero.
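    The blurb above describes building a single-hidden-layer network in Python that learns by backpropagation. As a rough, self-contained sketch of that kind of network (this is not the book's code; the layer sizes, learning rate, sigmoid activations, and XOR toy data are assumptions made here for brevity):

        import numpy as np

        class SimpleNeuralNetwork:
            """One hidden layer, sigmoid activations, trained by plain backpropagation."""

            def __init__(self, n_in, n_hidden, n_out, lr=0.5):
                rng = np.random.default_rng(42)
                # Small random weights, scaled by the fan-in of each layer.
                self.w_ih = rng.normal(0.0, n_in ** -0.5, (n_hidden, n_in))
                self.w_ho = rng.normal(0.0, n_hidden ** -0.5, (n_out, n_hidden))
                self.lr = lr

            @staticmethod
            def _sigmoid(x):
                return 1.0 / (1.0 + np.exp(-x))

            def query(self, inputs):
                inputs = np.array(inputs, ndmin=2).T
                hidden = self._sigmoid(self.w_ih @ inputs)
                return self._sigmoid(self.w_ho @ hidden)

            def train(self, inputs, targets):
                inputs = np.array(inputs, ndmin=2).T
                targets = np.array(targets, ndmin=2).T
                hidden = self._sigmoid(self.w_ih @ inputs)
                outputs = self._sigmoid(self.w_ho @ hidden)
                # Backpropagate the output error and nudge both weight matrices.
                out_err = targets - outputs
                hid_err = self.w_ho.T @ out_err
                self.w_ho += self.lr * (out_err * outputs * (1 - outputs)) @ hidden.T
                self.w_ih += self.lr * (hid_err * hidden * (1 - hidden)) @ inputs.T

        # Toy usage: learn XOR instead of full handwritten digits.
        net = SimpleNeuralNetwork(n_in=2, n_hidden=4, n_out=1)
        samples = [([0, 0], [0]), ([0, 1], [1]), ([1, 0], [1]), ([1, 1], [0])]
        for _ in range(5000):
            for x, y in samples:
                net.train(x, y)
        for x, _ in samples:
            print(x, net.query(x).item())

    For the handwritten digits the book targets, the same loop would use 784 inputs (28x28 pixels) and 10 outputs, one per digit.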

Neural Networks for Pattern Recognition


Christopher M. Bishop - 1996
    After introducing the basic concepts, the book examines techniques for modeling probability density functions and the properties and merits of the multi-layer perceptron and radial basis function network models. Also covered are various forms of error functions, principal algorithms for error function minimization, learning and generalization in neural networks, and Bayesian techniques and their applications. Designed as a text, with over 100 exercises, this fully up-to-date work will benefit anyone involved in the fields of neural computation and pattern recognition.

Numerical Optimization


Jorge Nocedal - 2000
    Numerical optimization traces its roots to the calculus of variations and the work of Euler and Lagrange. Taking this natural and reasonable approach to mathematical programming, the book covers numerical methods for finite-dimensional optimization problems, beginning with very simple ideas and progressing through more complicated concepts, concentrating on methods for both unconstrained and constrained optimization.

Discrete Mathematics


Richard Johnsonbaugh - 1984
    Focused on helping students understand and construct proofs and expanding their mathematical maturity, this best-selling text is an accessible introduction to discrete mathematics. Johnsonbaugh's algorithmic approach emphasizes problem-solving techniques. The Seventh Edition reflects user and reviewer feedback on both content and organization.

Operations Research: Applications and Algorithms (with CD-ROM and InfoTrac)


Wayne L. Winston - 1987
    It moves beyond a mere study of algorithms without sacrificing the rigor that faculty desire. As in every edition, Winston reinforces the book's successful features and coverage with the most recent developments in the field. The Student Suite CD-ROM, which now accompanies every new copy of the text, contains the latest versions of commercial software for optimization, simulation, and decision analysis.

Computational Geometry: Algorithms and Applications


Mark de Berg - 1997
    The focus is on algorithms and hence the book is well suited for students in computer science and engineering. Motivation is provided from the application areas: all solutions and techniques from computational geometry are related to particular applications in robotics, graphics, CAD/CAM, and geographic information systems. For students this motivation will be especially welcome. Modern insights in computational geometry are used to provide solutions that are both efficient and easy to understand and implement. All the basic techniques and topics from computational geometry, as well as several more advanced topics, are covered. The book is largely self-contained and can be used for self-study by anyone with a basic background in algorithms. In the second edition, besides revisions to the first edition, a number of new exercises have been added.

Make Your Own Neural Network: An In-depth Visual Introduction For Beginners


Michael Taylor - 2017
    A step-by-step visual journey through the mathematics of neural networks, with a guide to making your own using Python and TensorFlow.

Pattern Classification


David G. Stork - 1973
    Now with the second edition, readers will find information on key new topics such as neural networks and statistical pattern recognition, the theory of machine learning, and the theory of invariances. Also included are worked examples, comparisons between different methods, extensive graphics, expanded exercises and computer project topics. An Instructor's Manual presenting detailed solutions to all the problems in the book is available from the Wiley editorial department.

Topology


James R. Munkres - 1975
    Includes many examples and figures. GENERAL TOPOLOGY. Set Theory and Logic. Topological Spaces and Continuous Functions. Connectedness and Compactness. Countability and Separation Axioms. The Tychonoff Theorem. Metrization Theorems and Paracompactness. Complete Metric Spaces and Function Spaces. Baire Spaces and Dimension Theory. ALGEBRAIC TOPOLOGY. The Fundamental Group. Separation Theorems. The Seifert-van Kampen Theorem. Classification of Surfaces. Classification of Covering Spaces. Applications to Group Theory. For anyone needing a basic, thorough introduction to general and algebraic topology and its applications.

Machine Learning: A Probabilistic Perspective


Kevin P. Murphy - 2012
    Machine learning develops methods that can automatically detect patterns in data and then use the uncovered patterns to predict future data. This textbook offers a comprehensive and self-contained introduction to the field of machine learning, based on a unified, probabilistic approach. The coverage combines breadth and depth, offering necessary background material on such topics as probability, optimization, and linear algebra as well as discussion of recent developments in the field, including conditional random fields, L1 regularization, and deep learning. The book is written in an informal, accessible style, complete with pseudo-code for the most important algorithms. All topics are copiously illustrated with color images and worked examples drawn from such application domains as biology, text processing, computer vision, and robotics. Rather than providing a cookbook of different heuristic methods, the book stresses a principled model-based approach, often using the language of graphical models to specify models in a concise and intuitive way. Almost all the models described have been implemented in a MATLAB software package—PMTK (probabilistic modeling toolkit)—that is freely available online. The book is suitable for upper-level undergraduates with an introductory-level college math background and beginning graduate students.
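    As a small illustration of the L1 regularization mentioned above (not taken from the book or from PMTK; the synthetic data, feature count, and alpha value are assumptions, and scikit-learn stands in for the book's MATLAB toolkit):

        import numpy as np
        from sklearn.linear_model import Lasso

        # Hypothetical sparse problem: only 3 of 50 features actually matter.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 50))
        true_w = np.zeros(50)
        true_w[[3, 17, 41]] = [2.0, -1.5, 3.0]
        y = X @ true_w + rng.normal(0, 0.1, size=200)

        # The L1 penalty drives most coefficients exactly to zero,
        # recovering a sparse set of relevant features.
        model = Lasso(alpha=0.1).fit(X, y)
        print("nonzero coefficients:", np.flatnonzero(model.coef_))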

Machine Learning


Tom M. Mitchell - 1997
    Mitchell covers the field of machine learning, the study of algorithms that allow computer programs to automatically improve through experience and that automatically infer general laws from specific data.

Learning From Data: A Short Course


Yaser S. Abu-Mostafa - 2012
    The techniques of machine learning are widely applied in engineering, science, finance, and commerce. This book is designed for a short course on machine learning. It is a short course, not a hurried course. From over a decade of teaching this material, we have distilled what we believe to be the core topics that every student of the subject should know. We chose the title "learning from data" because it faithfully describes what the subject is about, and made it a point to cover the topics in a story-like fashion. Our hope is that the reader can learn all the fundamentals of the subject by reading the book cover to cover.
    Learning from data has distinct theoretical and practical tracks. In this book, we balance the theoretical and the practical, the mathematical and the heuristic. Our criterion for inclusion is relevance. Theory that establishes the conceptual framework for learning is included, and so are heuristics that impact the performance of real learning systems.
    Learning from data is a very dynamic field. Some of the hot techniques and theories at times become just fads, and others gain traction and become part of the field. What we have emphasized in this book are the necessary fundamentals that give any student of learning from data a solid foundation, and enable him or her to venture out and explore further techniques and theories, or perhaps to contribute their own.
    The authors are professors at California Institute of Technology (Caltech), Rensselaer Polytechnic Institute (RPI), and National Taiwan University (NTU), where this book is the main text for their popular courses on machine learning. The authors also consult extensively with financial and commercial companies on machine learning applications, and have led winning teams in machine learning competitions.