Innumeracy: Mathematical Illiteracy and Its Consequences


John Allen Paulos - 1988
    Dozens of examples show how innumeracy affects not only personal economics and travel plans, but also explains mis-chosen mates, inappropriate drug testing, and the allure of pseudoscience.

Information Theory: A Tutorial Introduction


James V. Stone - 2015
    In this richly illustrated book, accessible examples are used to show how information theory can be understood in terms of everyday games like '20 Questions', and the simple MATLAB programs provided give hands-on experience of information theory in action. Written in a tutorial style, with a comprehensive glossary, this text represents an ideal primer for novices who wish to become familiar with the basic principles of information theory. Download chapter 1 from http://jim-stone.staff.shef.ac.uk/Boo...
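
    As a quick illustration of the book's central idea (a Python sketch, not the MATLAB code the text itself provides): the Shannon entropy of a distribution measures how many well-chosen yes/no questions are needed, on average, to pin down an outcome, which is exactly the '20 Questions' picture. The distributions below are assumptions of this example.

```python
import math

def entropy_bits(probs):
    """Shannon entropy H(X) = -sum p*log2(p), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair choice among 16 equally likely items carries log2(16) = 4 bits,
# i.e. about four well-chosen yes/no questions in '20 Questions'.
uniform_16 = [1 / 16] * 16
print(entropy_bits(uniform_16))   # 4.0

# A biased coin carries less than 1 bit per toss.
print(entropy_bits([0.9, 0.1]))   # ~0.469
```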

Statistical Mechanics


R.K. Pathria - 1972
    'Highly recommended for graduate-level libraries.' (Choice) This highly successful text, which first appeared in 1972 and has remained popular ever since, has now been brought up to date by incorporating the remarkable developments in the field of phase transitions and critical phenomena that took place over the intervening years. This has been done by adding three new chapters (comprising over 150 pages and containing over 60 homework problems), which should enhance the usefulness of the book for both students and instructors. We trust that this classic text, which has been widely acclaimed for its clean derivations and clear explanations, will continue to provide further generations of students with a sound training in the methods of statistical physics.

Linear Algebra


Stephen H. Friedberg - 1979
     This top-selling, theorem-proof text presents a careful treatment of the principal topics of linear algebra, and illustrates the power of the subject through a variety of applications. It emphasizes the symbiotic relationship between linear transformations and matrices, but states theorems in the more general infinite-dimensional case where appropriate.
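
    A minimal sketch of the transformation-matrix correspondence the blurb mentions (illustrative only, not drawn from the text; the rotation example and numpy usage are assumptions of this example):

```python
import numpy as np

# Rotation of the plane by 90 degrees is a linear transformation T.
def T(v):
    x, y = v
    return np.array([-y, x])

# Its matrix relative to the standard basis has the images of e1, e2 as columns.
A = np.column_stack([T(np.array([1.0, 0.0])), T(np.array([0.0, 1.0]))])

v = np.array([3.0, 2.0])
print(T(v))      # [-2.  3.]
print(A @ v)     # [-2.  3.]  -- applying the matrix agrees with applying T
```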

Data Smart: Using Data Science to Transform Information into Insight


John W. Foreman - 2013
    Major retailers are predicting everything from when their customers are pregnant to when they want a new pair of Chuck Taylors. It's a brave new world where seemingly meaningless data can be transformed into valuable insight to drive smart business decisions. But how exactly does one do data science? Do you have to hire one of these priests of the dark arts, the "data scientist," to extract this gold from your data? Nope. Data science is little more than using straightforward steps to process raw data into actionable insight. And in Data Smart, author and data scientist John Foreman will show you how that's done within the familiar environment of a spreadsheet. Why a spreadsheet? It's comfortable! You get to look at the data every step of the way, building confidence as you learn the tricks of the trade. Plus, spreadsheets are a vendor-neutral place to learn data science without the hype. But don't let the Excel sheets fool you. This is a book for those serious about learning the analytic techniques, the math and the magic, behind big data. Each chapter covers a different technique in a spreadsheet so you can follow along:
    - Mathematical optimization, including non-linear programming and genetic algorithms
    - Clustering via k-means, spherical k-means, and graph modularity
    - Data mining in graphs, such as outlier detection
    - Supervised AI through logistic regression, ensemble models, and bag-of-words models
    - Forecasting, seasonal adjustments, and prediction intervals through Monte Carlo simulation
    - Moving from spreadsheets into the R programming language
    You get your hands dirty as you work alongside John through each technique. But never fear, the topics are readily applicable and the author laces humor throughout. You'll even learn what a dead squirrel has to do with optimization modeling, which you no doubt are dying to know.
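
    The book builds these techniques in spreadsheets; as a hedged, non-spreadsheet sketch of one of them, here is a minimal k-means (Lloyd's algorithm) on synthetic data. The data, seed, and helper function are assumptions of this example, not material from the book.

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Minimal Lloyd's-algorithm k-means: alternate assignment and mean-update."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assign each point to its nearest center.
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Move each center to the mean of its assigned points (keep it if empty).
        centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return centers, labels

# Two well-separated synthetic blobs around (0, 0) and (5, 5).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.5, (20, 2)), rng.normal(5, 0.5, (20, 2))])
centers, labels = kmeans(X, k=2)
print(centers.round(1))
```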

Schaum's Outline of Calculus


Frank Ayres Jr. - 1990
    Students will also find the related analytic geometry much easier. The clear review of algebra and geometry in this edition will make calculus easier for students who wish to strengthen their knowledge in these areas. Updated to meet the emphasis in current courses, this new edition of a popular guide (more than 104,000 copies of the prior edition were sold) includes problems and examples using graphing calculators.

Computers and Intractability: A Guide to the Theory of NP-Completeness


Michael R. Garey - 1979
    Co-authored with David S. Johnson, this was the first book exclusively on the theory of NP-completeness and computational intractability. The book features an appendix providing a thorough compendium of NP-complete problems (which was updated in later printings). It is now outdated in some respects, as it does not cover more recent developments such as the PCP theorem, but it remains in print and is regarded as a classic: in a 2006 study, the CiteSeer search engine listed the book as the most cited reference in the computer science literature.
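
    The asymmetry the theory formalizes can be seen in a small illustrative sketch (not from the book): for Boolean satisfiability, checking a proposed assignment is quick, while the obvious search examines all 2^n assignments. The DIMACS-style clause encoding below is an assumption of this example.

```python
from itertools import product

# A CNF formula as a list of clauses; each literal is a signed variable index,
# e.g. (x1 OR NOT x2) AND (x2 OR x3):
formula = [[1, -2], [2, 3]]

def satisfies(assignment, formula):
    """Polynomial-time check of a proposed certificate (assignment)."""
    return all(any(assignment[abs(l)] == (l > 0) for l in clause)
               for clause in formula)

def brute_force_sat(formula, n_vars):
    """The obvious search: tries up to 2^n candidate assignments."""
    for bits in product([False, True], repeat=n_vars):
        assignment = {i + 1: b for i, b in enumerate(bits)}
        if satisfies(assignment, formula):
            return assignment
    return None

print(satisfies({1: True, 2: False, 3: True}, formula))  # True: easy to verify
print(brute_force_sat(formula, 3))                        # exponential in general
```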

Book of Proof


Richard Hammack - 2009
    It is a bridge from the computational courses (such as calculus or differential equations) that students typically encounter in their first year of college to a more abstract outlook. It lays a foundation for more theoretical courses such as topology, analysis and abstract algebra. Although it may be more meaningful to the student who has had some calculus, there is really no prerequisite other than a measure of mathematical maturity. Topics include sets, logic, counting, methods of conditional and non-conditional proof, disproof, induction, relations, functions and infinite cardinality.
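
    As a small sample of the proof style such a course teaches (a standard example, not excerpted from the book), here is an induction argument written in LaTeX:

```latex
\textbf{Claim.} For every integer $n \ge 1$, $\sum_{k=1}^{n} k = \frac{n(n+1)}{2}$.

\textbf{Base case.} For $n = 1$: $\sum_{k=1}^{1} k = 1 = \frac{1 \cdot 2}{2}$.

\textbf{Inductive step.} Assume the claim holds for some $n \ge 1$. Then
\[
  \sum_{k=1}^{n+1} k = \frac{n(n+1)}{2} + (n+1) = \frac{(n+1)(n+2)}{2},
\]
which is the claim for $n+1$. By induction, the formula holds for all $n \ge 1$. \qed
```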

Coding the Matrix: Linear Algebra through Computer Science Applications


Philip N. Klein - 2013
    Mathematical concepts and computational problems are motivated by applications in computer science. The reader learns by "doing," writing programs to implement the mathematical concepts and using them to carry out tasks and explore the applications. Examples include: error-correcting codes, transformations in graphics, face detection, encryption and secret-sharing, integer factoring, removing perspective from an image, PageRank (Google's ranking algorithm), and cancer detection from cell features. A companion web site, codingthematrix.com, provides data and support code, and most of the assignments can be auto-graded online. The book contains over two hundred illustrations, including a selection of relevant "xkcd" comics. Chapters: "The Function," "The Field," "The Vector," "The Vector Space," "The Matrix," "The Basis," "Dimension," "Gaussian Elimination," "The Inner Product," "Special Bases," "The Singular Value Decomposition," "The Eigenvector," "The Linear Program"
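
    For a taste of one listed application, here is a hedged sketch of PageRank as power iteration with damping on a toy four-page link graph; the graph, the damping factor of 0.85, and the iteration count are assumptions of this example, not code from the book or its companion site.

```python
import numpy as np

# A tiny web of 4 pages; links[j] lists the pages that page j links to.
links = {0: [1, 2], 1: [2], 2: [0], 3: [2]}
n = 4

# Column-stochastic transition matrix: following a random outgoing link.
M = np.zeros((n, n))
for j, outs in links.items():
    for i in outs:
        M[i, j] = 1 / len(outs)

# PageRank as power iteration with damping factor 0.85.
d = 0.85
rank = np.full(n, 1 / n)
for _ in range(100):
    rank = (1 - d) / n + d * (M @ rank)

print(rank.round(3))  # page 2, with the most inbound links, ranks highest
```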

The Elements of Statistical Learning: Data Mining, Inference, and Prediction


Trevor Hastie - 2001
    During the past decade there has been an explosion in computation and information technology. With it have come vast amounts of data in a variety of fields such as medicine, biology, finance, and marketing. The challenge of understanding these data has led to the development of new tools in the field of statistics, and spawned new areas such as data mining, machine learning, and bioinformatics. Many of these tools have common underpinnings but are often expressed with different terminology. This book describes the important ideas in these areas in a common conceptual framework. While the approach is statistical, the emphasis is on concepts rather than mathematics. Many examples are given, with a liberal use of color graphics. It should be a valuable resource for statisticians and anyone interested in data mining in science or industry. The book's coverage is broad, from supervised learning (prediction) to unsupervised learning. The many topics include neural networks, support vector machines, classification trees, and boosting: the first comprehensive treatment of this topic in any book. Trevor Hastie, Robert Tibshirani, and Jerome Friedman are professors of statistics at Stanford University. They are prominent researchers in this area: Hastie and Tibshirani developed generalized additive models and wrote a popular book of that title. Hastie wrote much of the statistical modeling software in S-PLUS and invented principal curves and surfaces. Tibshirani proposed the Lasso and is co-author of the very successful An Introduction to the Bootstrap. Friedman is the co-inventor of many data-mining tools including CART, MARS, and projection pursuit.

Engineering Mathematics


K.A. Stroud - 2001
    Fully revised to meet the needs of the wide range of students beginning engineering courses, this edition has an extended Foundation section, including new chapters on graphs, trigonometry, binomial series, and functions, and is accompanied by a CD-ROM.

Statistical Inference


George Casella - 2001
    Starting from the basics of probability, the authors develop the theory of statistical inference using techniques, definitions, and concepts that are statistical and that arise as natural extensions and consequences of earlier concepts. The book is suited to readers with a solid mathematics background. It can also be used in a way that stresses the more practical uses of statistical theory, being more concerned with understanding basic statistical concepts and deriving reasonable statistical procedures for a variety of situations, and less concerned with formal optimality investigations.
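
    A small example of deriving a reasonable statistical procedure in this spirit (a standard derivation, not quoted from the book): the maximum likelihood estimator for a Bernoulli parameter.

```latex
Given $X_1,\dots,X_n \overset{iid}{\sim} \mathrm{Bernoulli}(p)$ with $s = \sum_i x_i$ successes,
the likelihood and log-likelihood are
\[
  L(p) = p^{s}(1-p)^{n-s}, \qquad
  \ell(p) = s \log p + (n-s)\log(1-p).
\]
Setting $\ell'(p) = \frac{s}{p} - \frac{n-s}{1-p} = 0$ gives the maximum likelihood estimator
\[
  \hat{p} = \frac{s}{n} = \bar{x},
\]
the sample proportion of successes.
```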

Calculus


Ron Larson - 1999
    It has been widely praised by a generation of users for its solid and effective pedagogy that addresses the needs of a broad range of teaching and learning styles and environments. Each title is just one component in a comprehensive calculus course program that carefully integrates and coordinates print, media, and technology products for successful teaching and learning.

Real Analysis


H.L. Royden - 1963
    Dealing with measure theory and Lebesgue integration, this is an introductory graduate text.

An Introduction to Statistical Learning: With Applications in R


Gareth James - 2013
    This book presents some of the most important modeling and prediction techniques, along with relevant applications. Topics include linear regression, classification, resampling methods, shrinkage approaches, tree-based methods, support vector machines, clustering, and more. Color graphics and real-world examples are used to illustrate the methods presented. Since the goal of this textbook is to facilitate the use of these statistical learning techniques by practitioners in science, industry, and other fields, each chapter contains a tutorial on implementing the analyses and methods presented in R, an extremely popular open source statistical software platform. Two of the authors co-wrote The Elements of Statistical Learning (Hastie, Tibshirani and Friedman, 2nd edition 2009), a popular reference book for statistics and machine learning researchers. An Introduction to Statistical Learning covers many of the same topics, but at a level accessible to a much broader audience. This book is targeted at statisticians and non-statisticians alike who wish to use cutting-edge statistical learning techniques to analyze their data. The text assumes only a previous course in linear regression and no knowledge of matrix algebra.
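
    The book's labs are in R; as a language-neutral sketch of the first technique it covers, simple linear regression by least squares, here is a short numpy version (the synthetic data and true coefficients are assumptions of this example):

```python
import numpy as np

# Synthetic data: y = 2x + 1 plus noise.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2 * x + 1 + rng.normal(0, 0.5, size=x.size)

# Least-squares fit of y = b0 + b1*x.
X = np.column_stack([np.ones_like(x), x])        # design matrix with intercept
beta, *_ = np.linalg.lstsq(X, y, rcond=None)     # [b0, b1]

print(beta.round(2))                             # close to [1.0, 2.0]
```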