Advanced Differential Equations


M.D. Raisinghania - 1995
    

CRC Handbook of Chemistry and Physics


David R. Lide - 1984
    This edition contains NEW tables on Properties of Ionic Liquids, Solubilities of Hydrocarbons in Sea Water, Solubility of Organic Compounds in Superheated Water, and Nutritive Value of Foods. It also updates many tables including Critical Constants, Heats of Vaporization, Aqueous Solubility of Organic Compounds, Vapor Pressure of Mercury, Scientific Abbreviations and Symbols, and Bond Dissociation Energies. The 88th Edition also presents a new Foreword written by Dr. Harold Kroto, a 1996 Nobel Laureate in Chemistry.

Algorithms Unlocked


Thomas H. Cormen - 2013
    For anyone who has ever wondered how computers solve problems, this is an engagingly written guide for nonexperts to the basics of computer algorithms.

Engineering Mathematics


K.A. Stroud - 2001
    Fully revised to meet the needs of the wide range of students beginning engineering courses, this edition has an extended Foundation section including new chapters on graphs, trigonometry, binomial series, and functions, and it comes with a CD-ROM.

Introduction to Quantum Mechanics with Applications to Chemistry


Linus Pauling - 1985
    Numerous tables and figures.

Introduction to Probability


Dimitri P. Bertsekas - 2002
    This is the currently used textbook for "Probabilistic Systems Analysis," an introductory probability course at the Massachusetts Institute of Technology, attended by a large number of undergraduate and graduate students. The book covers the fundamentals of probability theory (probabilistic models, discrete and continuous random variables, multiple random variables, and limit theorems), which are typically part of a first course on the subject. It also contains a number of more advanced topics, from which an instructor can choose to match the goals of a particular course. These topics include transforms, sums of random variables, least squares estimation, the bivariate normal distribution, and a fairly detailed introduction to Bernoulli, Poisson, and Markov processes. The book strikes a balance between simplicity in exposition and sophistication in analytical reasoning. Some of the more mathematically rigorous analysis is explained only intuitively in the text but is developed in detail (at the level of advanced calculus) in the numerous solved theoretical problems. The book has been widely adopted for classroom use in introductory probability courses within the USA and abroad.

Tell Me The Odds: A 15 Page Introduction To Bayes Theorem


Scott Hartshorn - 2017
    Essentially, you make an initial guess, and then get more data to improve it. Bayes Theorem, or Bayes Rule, has a ton of real-world applications, from estimating your risk of a heart attack to making recommendations on Netflix. But it isn't that complicated. This book is a short introduction to Bayes Theorem. It is only 15 pages long and is intended to show you how Bayes Theorem works as quickly as possible. The examples are intentionally kept simple to focus solely on Bayes Theorem without requiring that the reader know complicated probability distributions. If you want to learn the basics of Bayes Theorem as quickly as possible, with some easy-to-duplicate examples, this is a good book for you.
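
    As a worked illustration of the "initial guess plus new data" idea the description sketches, here is a minimal Bayes' rule calculation, posterior = likelihood x prior / evidence, written in Python; the prevalence and test-accuracy numbers below are hypothetical values chosen only for illustration and are not taken from the book.

    # Hypothetical diagnostic-test example of Bayes' rule:
    # P(condition | positive) = P(positive | condition) * P(condition) / P(positive)
    prior = 0.01             # assumed prior: 1% of people have the condition
    sensitivity = 0.95       # assumed P(positive test | condition)
    false_positive = 0.05    # assumed P(positive test | no condition)

    # Total probability of a positive test (the evidence term)
    evidence = sensitivity * prior + false_positive * (1 - prior)

    # Updated belief after seeing one positive test
    posterior = sensitivity * prior / evidence
    print(f"P(condition | positive test) = {posterior:.3f}")  # about 0.161

    Running the same update again, with this posterior as the new prior, corresponds to the "get more data to improve it" step the description mentions.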

Pattern Recognition and Machine Learning


Christopher M. Bishop - 2006
    Pattern recognition and machine learning can be viewed as two facets of the same field, and together they have undergone substantial development over the past ten years. In particular, Bayesian methods have grown from a specialist niche to become mainstream, while graphical models have emerged as a general framework for describing and applying probabilistic models. Also, the practical applicability of Bayesian methods has been greatly enhanced through the development of a range of approximate inference algorithms such as variational Bayes and expectation propagation. Similarly, new models based on kernels have had a significant impact on both algorithms and applications. This new textbook reflects these recent developments while providing a comprehensive introduction to the fields of pattern recognition and machine learning. It is aimed at advanced undergraduates or first-year PhD students, as well as researchers and practitioners, and assumes no previous knowledge of pattern recognition or machine learning concepts. Knowledge of multivariate calculus and basic linear algebra is required, and some familiarity with probabilities would be helpful though not essential, as the book includes a self-contained introduction to basic probability theory.

How Math Explains the World: A Guide to the Power of Numbers, from Car Repair to Modern Physics


James D. Stein - 2008
    In the four main sections of the book, Stein tells the stories of the mathematical thinkers who discerned some of the most fundamental aspects of our universe. From their successes and failures, delusions, and even duels, the trajectories of their innovations—and their impact on society—are traced in this fascinating narrative. Quantum mechanics, space-time, chaos theory and the workings of complex systems, and the impossibility of a "perfect" democracy are all here. Stein's book is both mind-bending and practical, as he explains the best way for a salesman to plan a trip, examines why any thought you could have is embedded in the number π, and—perhaps most importantly—answers one of the modern world's toughest questions: why the garage can never get your car repaired on time. Friendly, entertaining, and fun, How Math Explains the World is the first book by one of California's most popular math teachers, a veteran of both "math for poets" and the Institute for Advanced Study in Princeton. And it's perfect for any reader wanting to know how math makes both science and the world tick.

Statistical Rethinking: A Bayesian Course with Examples in R and Stan


Richard McElreath - 2015
    Reflecting the need for even minor programming in today's model-based statistics, the book pushes readers to perform step-by-step calculations that are usually automated. This unique computational approach ensures that readers understand enough of the details to make reasonable choices and interpretations in their own modeling work. The text presents generalized linear multilevel models from a Bayesian perspective, relying on a simple logical interpretation of Bayesian probability and maximum entropy. It covers everything from the basics of regression to multilevel models. The author also discusses measurement error, missing data, and Gaussian process models for spatial and network autocorrelation. By using complete R code examples throughout, this book provides a practical foundation for performing statistical inference. Designed for both PhD students and seasoned professionals in the natural and social sciences, it prepares them for more advanced or specialized statistical modeling. Web resource: the book is accompanied by an R package (rethinking) that is available on the author's website and GitHub. The two core functions (map and map2stan) of this package allow a variety of statistical models to be constructed from standard model formulas.
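
    The step-by-step calculations mentioned above can be illustrated with a grid approximation of a binomial posterior; the sketch below is in Python rather than the book's R, uses a made-up data set (6 successes in 9 trials), and is only a rough analogue of the kind of calculation the description alludes to, not code from the book or its rethinking package.

    import numpy as np

    successes, trials = 6, 9                      # hypothetical data

    # Step 1: a grid of candidate values for the unknown proportion p
    p_grid = np.linspace(0, 1, 1000)

    # Step 2: a flat prior over the grid
    prior = np.ones_like(p_grid)

    # Step 3: the binomial likelihood of the data at each grid point
    likelihood = p_grid**successes * (1 - p_grid)**(trials - successes)

    # Step 4: multiply prior by likelihood and normalize to get the posterior
    unnormalized = prior * likelihood
    posterior = unnormalized / unnormalized.sum()

    print("posterior mean of p:", (p_grid * posterior).sum())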

Multiple View Geometry in Computer Vision


Richard Hartley - 2000
    This book covers relevant geometric principles and how to represent objects algebraically so they can be computed and applied. Recent major developments in the theory and practice of scene reconstruction are described in detail in a unified framework. Richard Hartley and Andrew Zisserman provide comprehensive background material and explain how to apply the methods and implement the algorithms. First edition hardback (2000): ISBN 0-521-62304-9.

The Cult of Statistical Significance: How the Standard Error Costs Us Jobs, Justice, and Lives


Stephen Thomas Ziliak - 2008
    If it takes a book to get it across, I hope this book will do it. It ought to.”—Thomas Schelling, Distinguished University Professor, School of Public Policy, University of Maryland, and 2005 Nobel Prize Laureate in Economics “With humor, insight, piercing logic and a nod to history, Ziliak and McCloskey show how economists—and other scientists—suffer from a mass delusion about statistical analysis. The quest for statistical significance that pervades science today is a deeply flawed substitute for thoughtful analysis. . . . Yet few participants in the scientific bureaucracy have been willing to admit what Ziliak and McCloskey make clear: the emperor has no clothes.”—Kenneth Rothman, Professor of Epidemiology, Boston University School of Public Health The Cult of Statistical Significance shows, field by field, how “statistical significance,” a technique that dominates many sciences, has been a huge mistake. The authors find that researchers in a broad spectrum of fields, from agronomy to zoology, employ “testing” that doesn’t test and “estimating” that doesn’t estimate. The facts will startle the outside reader: how could a group of brilliant scientists wander so far from scientific magnitudes? This study will encourage scientists who want to know how to get the statistical sciences back on track and fulfill their quantitative promise. The book shows for the first time how wide the disaster is, and how bad for science, and it traces the problem to its historical, sociological, and philosophical roots. Stephen T. Ziliak is the author or editor of many articles and two books. He currently lives in Chicago, where he is Professor of Economics at Roosevelt University. Deirdre N. McCloskey, Distinguished Professor of Economics, History, English, and Communication at the University of Illinois at Chicago, is the author of twenty books and three hundred scholarly articles. She has held Guggenheim and National Humanities Fellowships. She is best known for How to Be Human* Though an Economist (University of Michigan Press, 2000) and her most recent book, The Bourgeois Virtues: Ethics for an Age of Commerce (2006).

Understanding Thermodynamics


Hendrick C. Van Ness - 1983
    The language is informal, the examples are vivid and lively, and the perspective is fresh. Based on lectures delivered to engineering students, this work will also be valued by scientists, engineers, technicians, businessmen, and anyone facing the energy challenges of the future.

Thomas' Calculus, Early Transcendentals, Media Upgrade


George B. Thomas Jr. - 2002
    This book offers a full range of exercises, a precise and conceptual presentation, and a new media package designed specifically to meet the needs of today's readers. The exercises gradually increase in difficulty, helping readers learn to generalize and apply the concepts. The refined table of contents introduces the exponential, logarithmic, and trigonometric functions in Chapter 7 of the text. Key topics: Functions, Limits and Continuity, Differentiation, Applications of Derivatives, Integration, Applications of Definite Integrals, Integrals and Transcendental Functions, Techniques of Integration, Further Applications of Integration, Conic Sections and Polar Coordinates, Infinite Sequences and Series, Vectors and the Geometry of Space, Vector-Valued Functions and Motion in Space, Partial Derivatives, Multiple Integrals, and Integration in Vector Fields. Market: all readers interested in calculus.

Elements of Information Theory


Thomas M. Cover - 1991
    Readers are provided once again with an instructive mix of mathematics, physics, statistics, and information theory. All the essential topics in information theory are covered in detail, including entropy, data compression, channel capacity, rate distortion, network information theory, and hypothesis testing. The authors provide readers with a solid understanding of the underlying theory and applications. Problem sets and a telegraphic summary at the end of each chapter further assist readers. The historical notes that follow each chapter recap the main points. The Second Edition features chapters reorganized to improve teaching, 200 new problems, new material on source coding, portfolio theory, and feedback capacity, and updated references. Now current and enhanced, the Second Edition of Elements of Information Theory remains the ideal textbook for upper-level undergraduate and graduate courses in electrical engineering, statistics, and telecommunications.
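
    As a small illustration of the first topic in that list, the sketch below computes the Shannon entropy H(X) = -Σ p(x) log2 p(x) of a discrete distribution in Python; the four-symbol distribution is an arbitrary example, not data from the book.

    import math

    # Hypothetical distribution over four symbols
    p = [0.5, 0.25, 0.125, 0.125]

    # Shannon entropy in bits: H = -sum of p * log2(p) over outcomes with p > 0
    entropy = -sum(pi * math.log2(pi) for pi in p if pi > 0)
    print(f"H(X) = {entropy} bits")  # 1.75 bits for this distribution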