Book picks similar to Real Analysis: Theory of Measure and Integration by J. Yeh
Tags: mathematics, maths, graduate-level, math-journey
Computers and Intractability: A Guide to the Theory of NP-Completeness
Michael R. Garey - 1979
Co-authored with David S. Johnson, it was the first book devoted exclusively to the theory of NP-completeness and computational intractability. It features an appendix providing a thorough compendium of NP-complete problems (updated in later printings). The book is now dated in some respects, as it does not cover more recent developments such as the PCP theorem, but it remains in print and is regarded as a classic: in a 2006 study, the CiteSeer search engine listed it as the most cited reference in the computer science literature.
Numerical Optimization
Jorge Nocedal - 2000
Mathematical optimization can trace its roots to the calculus of variations and the work of Euler and Lagrange. This book covers numerical methods for finite-dimensional optimization problems, beginning with very simple ideas and progressing through more complicated concepts, and concentrating on methods for both unconstrained and constrained optimization.
Data Science for Business: What you need to know about data mining and data-analytic thinking
Foster Provost - 2013
This guide also helps you understand the many data-mining techniques in use today. Based on an MBA course Provost has taught at New York University over the past ten years, Data Science for Business provides examples of real-world business problems to illustrate these principles. You’ll not only learn how to improve communication between business stakeholders and data scientists, but also how to participate intelligently in your company’s data science projects. You’ll also discover how to think data-analytically, and fully appreciate how data science methods can support business decision-making.
- Understand how data science fits in your organization, and how you can use it for competitive advantage
- Treat data as a business asset that requires careful investment if you’re to gain real value
- Approach business problems data-analytically, using the data-mining process to gather good data in the most appropriate way
- Learn general concepts for actually extracting knowledge from data
- Apply data science principles when interviewing data science job candidates
Feynman Lectures On Computation
Richard P. Feynman - 1996
When Feynman gave his famous course on computation at the California Institute of Technology, he asked Tony Hey to adapt his lecture notes into a book. Although led by Feynman, the course also featured, as occasional guest speakers, some of the most brilliant men in science at that time, including Marvin Minsky, Charles Bennett, and John Hopfield. Although the lectures are now thirteen years old, most of the material is timeless and presents a “Feynmanesque” overview of many standard and some not-so-standard topics in computer science, such as reversible logic gates and quantum computers.
Methods of Logic
Willard Van Orman Quine - 1950
Incorporating updated notations, selective answers to exercises, expanded treatment of natural deduction, and new discussions of predicate-functor logic and the affinities between higher set theory and the elementary logic of terms, W. V. Quine's new edition will serve admirably for both classroom and independent use.
The Colossal Book of Short Puzzles and Problems
Martin Gardner - 2005
Gardner's yearly gatherings of short and inventive problems were easily his most anticipated math columns. Loyal readers would savor the wit and elegance of his explorations in physics, probability, topology, and chess, among others. Grouped by subject and arrayed from easiest to hardest, the puzzles gathered here, which complement the lengthier, more involved problems in The Colossal Book of Mathematics, have been selected by Gardner for their illuminating, and often bewildering, solutions. Filled with over 300 illustrations, this new volume even contains nine new mathematical gems that Gardner, now ninety, has been gathering for the last decade. No amateur or expert math lover should be without this indispensable volume, a capstone to Gardner's seventy-year career.
Spurious Correlations
Tyler Vigen - 2015
"Spurious Correlations is the most fun you'll ever have with graphs." -- Bustle
Military intelligence analyst and Harvard Law student Tyler Vigen illustrates the golden rule that "correlation does not equal causation" through hilarious graphs inspired by his viral website. Is there a correlation between Nic Cage films and swimming pool accidents? What about beef consumption and people getting struck by lightning? Absolutely not. But that hasn't stopped millions of people from going to tylervigen.com and asking, "Wait, what?" Vigen has designed software that scours enormous data sets to find unlikely statistical correlations. He began pulling the funniest ones for his website and has since gained millions of views, hundreds of thousands of likes, and tons of media coverage. Subversive and clever, Spurious Correlations is geek humor at its finest, nailing our obsession with data and conspiracy theory.
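As a rough, hypothetical illustration of the kind of comparison the book pokes fun at (this is not code from Vigen's site, and the numbers and labels are invented for this sketch), the snippet below computes a Pearson correlation between two made-up annual series that have nothing to do with each other.

```python
# Minimal sketch: two unrelated yearly series can still correlate strongly.
# All data below is invented for illustration only.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical yearly counts for two unrelated quantities (five years).
films_per_year = [2, 3, 1, 4, 2]            # e.g. films released by some actor
pool_accidents = [210, 305, 118, 398, 215]  # e.g. reported pool accidents

r = pearson(films_per_year, pool_accidents)
print(f"correlation: {r:.3f}")  # close to 1.0, yet the series are unrelated
```

A near-perfect correlation here says nothing about causation, which is exactly the point of the book.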
How to Think Like a Mathematician
Kevin Houston - 2009
Working through the book, you will develop an arsenal of techniques to help you unlock the meaning of definitions, theorems and proofs, solve problems, and write mathematics effectively. All the major methods of proof - direct method, cases, induction, contradiction and contrapositive - are featured. Concrete examples are used throughout, and you'll get plenty of practice on topics common to many courses such as divisors, Euclidean algorithms, modular arithmetic, equivalence relations, and injectivity and surjectivity of functions. The material has been tested by real students over many years so all the essentials are covered. With over 300 exercises to help you test your progress, you'll soon learn how to think like a mathematician.
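As a small taste of one topic the blurb lists, here is a minimal Python sketch of the Euclidean algorithm for greatest common divisors; it is an illustration added here, not material from the book, which presents the topic as mathematics rather than code.

```python
def gcd(a: int, b: int) -> int:
    """Euclidean algorithm: repeatedly replace (a, b) with (b, a mod b)."""
    while b:
        a, b = b, a % b
    return abs(a)

print(gcd(252, 198))  # 18, since 252 = 2^2 * 3^2 * 7 and 198 = 2 * 3^2 * 11
```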
Probability, Random Variables and Stochastic Processes with Errata Sheet
Athanasios Papoulis - 2001
The fourth edition adds co-author S. Unnikrishna Pillai of Polytechnic University. The book is intended for a senior/graduate-level course in probability and is aimed at students in electrical engineering, math, and physics departments. The authors' approach is to develop the subject of probability theory and stochastic processes as a deductive discipline and to illustrate the theory with basic applications of engineering interest. Approximately one third of the text is new material, which maintains the style and spirit of previous editions. To bridge the gap between concepts and applications, a number of additional examples have been added for further clarity, as well as several new topics.
Introduction to Machine Learning
Ethem Alpaydin - 2004
Many successful applications of machine learning exist already, including systems that analyze past sales data to predict customer behavior, recognize faces or speech, optimize robot behavior so that a task can be completed using minimum resources, and extract knowledge from bioinformatics data. "Introduction to Machine Learning" is a comprehensive textbook on the subject, covering a broad array of topics not usually included in introductory machine learning texts. It discusses many methods based in different fields, including statistics, pattern recognition, neural networks, artificial intelligence, signal processing, control, and data mining, in order to present a unified treatment of machine learning problems and solutions. All learning algorithms are explained so that the student can easily move from the equations in the book to a computer program. The book can be used by advanced undergraduates and graduate students who have completed courses in computer programming, probability, calculus, and linear algebra. It will also be of interest to engineers in the field who are concerned with the application of machine learning methods.

After an introduction that defines machine learning and gives examples of machine learning applications, the book covers supervised learning, Bayesian decision theory, parametric methods, multivariate methods, dimensionality reduction, clustering, nonparametric methods, decision trees, linear discrimination, multilayer perceptrons, local models, hidden Markov models, assessing and comparing classification algorithms, combining multiple learners, and reinforcement learning.
Networks: An Introduction
M.E.J. Newman - 2010
The rise of the Internet and the wide availability of inexpensive computers have made it possible to gather and analyze network data on a large scale, and the development of a variety of new theoretical tools has allowed us to extract new knowledge from many different kinds of networks.

The study of networks is broadly interdisciplinary, and important developments have occurred in many fields, including mathematics, physics, computer and information sciences, biology, and the social sciences. This book brings together for the first time the most important breakthroughs in each of these fields and presents them in a coherent fashion, highlighting the strong interconnections between work in different areas.

Subjects covered include the measurement and structure of networks in many branches of science; methods for analyzing network data, including methods developed in physics, statistics, and sociology; the fundamentals of graph theory, computer algorithms, and spectral methods; mathematical models of networks, including random graph models and generative models; and theories of dynamical processes taking place on networks.
Introduction to the Theory of Computation
Michael Sipser - 1996
Sipser's candid, crystal-clear style allows students at every level to understand and enjoy this field. His innovative "proof idea" sections explain profound concepts in plain English. The new edition incorporates many improvements students and professors have suggested over the years, and offers updated, classroom-tested problem sets at the end of each chapter.
Introduction to Probability
Dimitri P. Bertsekas - 2002
This is the currently used textbook for "Probabilistic Systems Analysis," an introductory probability course at the Massachusetts Institute of Technology, attended by a large number of undergraduate and graduate students. The book covers the fundamentals of probability theory (probabilistic models, discrete and continuous random variables, multiple random variables, and limit theorems), which are typically part of a first course on the subject. It also contains a number of more advanced topics, from which an instructor can choose to match the goals of a particular course. These topics include transforms, sums of random variables, least squares estimation, the bivariate normal distribution, and a fairly detailed introduction to Bernoulli, Poisson, and Markov processes. The book strikes a balance between simplicity in exposition and sophistication in analytical reasoning. Some of the more mathematically rigorous analysis is explained only intuitively in the text but is developed in detail (at the level of advanced calculus) in the numerous solved theoretical problems. The book has been widely adopted for classroom use in introductory probability courses within the USA and abroad.
An Introduction to Statistical Learning: With Applications in R
Gareth James - 2013
This book presents some of the most important modeling and prediction techniques, along with relevant applications. Topics include linear regression, classification, resampling methods, shrinkage approaches, tree-based methods, support vector machines, clustering, and more. Color graphics and real-world examples are used to illustrate the methods presented. Since the goal of this textbook is to facilitate the use of these statistical learning techniques by practitioners in science, industry, and other fields, each chapter contains a tutorial on implementing the analyses and methods presented in R, an extremely popular open source statistical software platform. Two of the authors co-wrote The Elements of Statistical Learning (Hastie, Tibshirani and Friedman, 2nd edition 2009), a popular reference book for statistics and machine learning researchers. An Introduction to Statistical Learning covers many of the same topics, but at a level accessible to a much broader audience. This book is targeted at statisticians and non-statisticians alike who wish to use cutting-edge statistical learning techniques to analyze their data. The text assumes only a previous course in linear regression and no knowledge of matrix algebra.