An Introduction to Statistical Learning: With Applications in R


Gareth James - 2013
    This book presents some of the most important modeling and prediction techniques, along with relevant applications. Topics include linear regression, classification, resampling methods, shrinkage approaches, tree-based methods, support vector machines, clustering, and more. Color graphics and real-world examples are used to illustrate the methods presented. Since the goal of this textbook is to facilitate the use of these statistical learning techniques by practitioners in science, industry, and other fields, each chapter contains a tutorial on implementing the analyses and methods presented in R, an extremely popular open source statistical software platform. Two of the authors co-wrote The Elements of Statistical Learning (Hastie, Tibshirani and Friedman, 2nd edition 2009), a popular reference book for statistics and machine learning researchers. An Introduction to Statistical Learning covers many of the same topics, but at a level accessible to a much broader audience. This book is targeted at statisticians and non-statisticians alike who wish to use cutting-edge statistical learning techniques to analyze their data. The text assumes only a previous course in linear regression and no knowledge of matrix algebra.

Pattern Recognition and Machine Learning


Christopher M. Bishop - 2006
    Pattern recognition and machine learning can be viewed as two facets of the same field, and together they have undergone substantial development over the past ten years. In particular, Bayesian methods have grown from a specialist niche to become mainstream, while graphical models have emerged as a general framework for describing and applying probabilistic models. Also, the practical applicability of Bayesian methods has been greatly enhanced through the development of a range of approximate inference algorithms such as variational Bayes and expectation propagation. Similarly, new models based on kernels have had a significant impact on both algorithms and applications. This new textbook reflects these recent developments while providing a comprehensive introduction to the fields of pattern recognition and machine learning. It is aimed at advanced undergraduates or first-year PhD students, as well as researchers and practitioners, and assumes no previous knowledge of pattern recognition or machine learning concepts. Knowledge of multivariate calculus and basic linear algebra is required, and some familiarity with probabilities would be helpful though not essential, as the book includes a self-contained introduction to basic probability theory.

Calculus


Michael Spivak - 1967
    Spivak's aim is to present calculus as the first real encounter with mathematics: it is the place to learn how logical reasoning combined with fundamental concepts can be developed into a rigorous mathematical theory rather than a bunch of tools and techniques learned by rote. Since analysis is a subject students traditionally find difficult to grasp, Spivak provides leisurely explanations, a profusion of examples, a wide range of exercises and plenty of illustrations in an easy-going approach that enlightens difficult concepts and rewards effort. Calculus will continue to be regarded as a modern classic, ideal for honours students and mathematics majors who seek an alternative to doorstop textbooks on calculus and to the more formidable introductions to real analysis.

Elements of Information Theory


Thomas M. Cover - 1991
    Readers are provided once again with an instructive mix of mathematics, physics, statistics, and information theory. All the essential topics in information theory are covered in detail, including entropy, data compression, channel capacity, rate distortion, network information theory, and hypothesis testing. The authors provide readers with a solid understanding of the underlying theory and applications. Problem sets and a telegraphic summary at the end of each chapter further assist readers. The historical notes that follow each chapter recap the main points. The Second Edition features chapters reorganized to improve teaching, 200 new problems, new material on source coding, portfolio theory, and feedback capacity, and updated references. Now current and enhanced, the Second Edition of Elements of Information Theory remains the ideal textbook for upper-level undergraduate and graduate courses in electrical engineering, statistics, and telecommunications.
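
    Entropy is the first of the core quantities listed above, and its definition is compact enough to illustrate directly. The short Python sketch below (a minimal illustration on made-up numbers, not an example taken from the book) computes the entropy in bits of a small discrete distribution.

        import math

        def entropy(probs):
            # Shannon entropy in bits of a discrete distribution,
            # given as a sequence of probabilities summing to 1.
            return -sum(p * math.log2(p) for p in probs if p > 0)

        # A fair coin carries 1 bit of uncertainty; a heavily biased coin carries much less.
        print(entropy([0.5, 0.5]))   # 1.0
        print(entropy([0.9, 0.1]))   # about 0.47

    The fair coin maximizes entropy over two outcomes, the kind of fact the book's early chapters make precise.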

Statistical Rethinking: A Bayesian Course with Examples in R and Stan


Richard McElreath - 2015
    Reflecting the need for even minor programming in today's model-based statistics, the book pushes readers to perform step-by-step calculations that are usually automated. This unique computational approach ensures that readers understand enough of the details to make reasonable choices and interpretations in their own modeling work. The text presents generalized linear multilevel models from a Bayesian perspective, relying on a simple logical interpretation of Bayesian probability and maximum entropy. It covers everything from the basics of regression to multilevel models. The author also discusses measurement error, missing data, and Gaussian process models for spatial and network autocorrelation. By using complete R code examples throughout, this book provides a practical foundation for performing statistical inference. Designed for both PhD students and seasoned professionals in the natural and social sciences, it prepares them for more advanced or specialized statistical modeling. The book is accompanied by a web resource: an R package (rethinking), available on the author's website and GitHub, whose two core functions (map and map2stan) allow a variety of statistical models to be constructed from standard model formulas.

Hands-On Machine Learning with Scikit-Learn and TensorFlow


Aurélien Géron - 2017
    Now that machine learning is thriving, even programmers who know close to nothing about this technology can use simple, efficient tools to implement programs capable of learning from data. This practical book shows you how. By using concrete examples, minimal theory, and two production-ready Python frameworks—Scikit-Learn and TensorFlow—author Aurélien Géron helps you gain an intuitive understanding of the concepts and tools for building intelligent systems. You’ll learn how to use a range of techniques, starting with simple Linear Regression and progressing to Deep Neural Networks. If you have some programming experience and you’re ready to code a machine learning project, this guide is for you. This hands-on book shows you how to use Scikit-Learn, an accessible framework that implements many algorithms efficiently and serves as a great machine learning entry point; TensorFlow, a more complex library for distributed numerical computation, ideal for training and running very large neural networks; and practical code examples that you can apply without learning excessive machine learning theory or algorithm details.
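
    As a taste of the Scikit-Learn entry point described above, the sketch below (a minimal illustration with made-up toy data, not code from the book) fits the simple Linear Regression the blurb mentions; the fit/predict pattern it shows is the estimator interface Scikit-Learn uses across its algorithms.

        import numpy as np
        from sklearn.linear_model import LinearRegression

        # Toy data: y is roughly 3*x + 2 with a little noise.
        rng = np.random.default_rng(0)
        X = rng.uniform(0, 10, size=(50, 1))
        y = 3 * X[:, 0] + 2 + rng.normal(0, 0.5, size=50)

        model = LinearRegression()
        model.fit(X, y)                       # learn slope and intercept from the data
        print(model.coef_, model.intercept_)  # close to [3.] and 2
        print(model.predict([[4.0]]))         # roughly 14

    The same fit/predict calls carry over to the classifiers and ensemble methods the book moves on to, which is part of what makes the library a gentle entry point.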

Introductory Graph Theory


Gary Chartrand - 1984
    Introductory Graph Theory presents a nontechnical introduction to this exciting field in a clear, lively, and informative style. Author Gary Chartrand covers the important elementary topics of graph theory and its applications. In addition, he presents a large variety of proofs designed to strengthen mathematical techniques and offers challenging opportunities to have fun with mathematics. Ten major topics — profusely illustrated — include: Mathematical Models, Elementary Concepts of Graph Theory, Transportation Problems, Connection Problems, Party Problems, Digraphs and Mathematical Models, Games and Puzzles, Graphs and Social Psychology, Planar Graphs and Coloring Problems, and Graphs and Other Mathematics. A useful Appendix covers Sets, Relations, Functions, and Proofs, and a section devoted to exercises — with answers, hints, and solutions — is especially valuable to anyone encountering graph theory for the first time. Undergraduate mathematics students at every level, puzzlists, and mathematical hobbyists will find well-organized coverage of the fundamentals of graph theory in this highly readable and thoroughly enjoyable book.

Visual Complex Analysis


Tristan Needham - 1997
    Aimed at undergraduate students in mathematics, physics, and engineering, the book's intuitive explanations, lack of advanced prerequisites, and consciously user-friendly prose style will help students to master the subject more readily than was previously possible. The key to this is the book's use of new geometric arguments in place of the standard calculational ones. These geometric arguments are communicated with the aid of hundreds of diagrams of a standard seldom encountered in mathematical works. A new approach to a classical topic, this work will be of interest to students in mathematics, physics, and engineering, as well as to professionals in these fields.

Engineering Mathematics


K.A. Stroud - 2001
    Fully revised to meet the needs of the wide range of students beginning engineering courses, this edition has an extended Foundation section, including new chapters on graphs, trigonometry, and binomial series and functions, and comes with a CD-ROM.

How to Prove It: A Structured Approach


Daniel J. Velleman - 1994
    The book begins with the basic concepts of logic and set theory, to familiarize students with the language of mathematics and how it is interpreted. These concepts are used as the basis for a step-by-step breakdown of the most important techniques used in constructing proofs. To help students construct their own proofs, this new edition contains over 200 new exercises, selected solutions, and an introduction to Proof Designer software. No background beyond standard high school mathematics is assumed.

Mathematics: Its Content, Methods and Meaning


A.D. Aleksandrov - 1963
    ". . . Nothing less than a major contribution to the scientific culture of this world." — The New York Times Book Review. This major survey of mathematics, featuring the work of 18 outstanding Russian mathematicians and including material on both elementary and advanced levels, encompasses 20 prime subject areas in mathematics in terms of their simple origins and their subsequent sophisticated development. As Professor Morris Kline of New York University noted, "This unique work presents the amazing panorama of mathematics proper. It is the best answer in print to what mathematics contains both on the elementary and advanced levels." Beginning with an overview and analysis of mathematics, the first of three major divisions of the book progresses to an exploration of analytic geometry, algebra, and ordinary differential equations. The second part introduces partial differential equations, along with theories of curves and surfaces, the calculus of variations, and functions of a complex variable. It further examines prime numbers, the theory of probability, approximations, and the role of computers in mathematics. The theory of functions of a real variable opens the final section, followed by discussions of linear algebra and non-Euclidean geometry, topology, functional analysis, and groups and other algebraic systems. Thorough, coherent explanations of each topic are further augmented by numerous illustrative figures, and every chapter concludes with a suggested reading list. Formerly issued as a three-volume set, this mathematical masterpiece is now available in a convenient and modestly priced one-volume edition, perfect for study or reference. "This is a masterful English translation of a stupendous and formidable mathematical masterpiece . . ." — Social Science

Deep Learning


Ian Goodfellow - 2016
    Because the computer gathers knowledge from experience, there is no need for a human computer operator to formally specify all the knowledge that the computer needs. The hierarchy of concepts allows the computer to learn complicated concepts by building them out of simpler ones; a graph of these hierarchies would be many layers deep. This book introduces a broad range of topics in deep learning. The text offers mathematical and conceptual background, covering relevant concepts in linear algebra, probability theory and information theory, numerical computation, and machine learning. It describes deep learning techniques used by practitioners in industry, including deep feedforward networks, regularization, optimization algorithms, convolutional networks, sequence modeling, and practical methodology; and it surveys such applications as natural language processing, speech recognition, computer vision, online recommendation systems, bioinformatics, and video games. Finally, the book offers research perspectives, covering such theoretical topics as linear factor models, autoencoders, representation learning, structured probabilistic models, Monte Carlo methods, the partition function, approximate inference, and deep generative models. Deep Learning can be used by undergraduate or graduate students planning careers in either industry or research, and by software engineers who want to begin using deep learning in their products or platforms. A website offers supplementary material for both readers and instructors.

Principles of Mathematical Analysis


Walter Rudin - 1964
    The text begins with a discussion of the real number system as a complete ordered field. (Dedekind's construction is now treated in an appendix to Chapter 1.) The topological background needed for the development of convergence, continuity, differentiation and integration is provided in Chapter 2. There is a new section on the gamma function, and many new and interesting exercises are included. This text is part of the Walter Rudin Student Series in Advanced Mathematics.

Fourier Series


Georgi P. Tolstov - 1976
    Over 100 problems at the ends of chapters, with answers at the back of the book. 1962 edition.

Combinatorial Optimization: Algorithms and Complexity


Christos H. Papadimitriou - 1998
    All chapters are supplemented by thought-provoking problems. A useful work for graduate-level students with backgrounds in computer science, operations research, and electrical engineering. "Mathematicians wishing a self-contained introduction need look no further." — American Mathematical Monthly.