Numerical Linear Algebra


Lloyd N. Trefethen - 1997
    The clarity and eloquence of the presentation make it popular with teachers and students alike. The text aims to expand the reader's view of the field and to present standard material in a novel way. All of the most important topics in the field are covered with a fresh perspective, including iterative methods for systems of equations and eigenvalue problems, as well as the underlying principles of conditioning and stability. The presentation takes the form of 40 lectures, each of which focuses on one or two central ideas. The unity between topics is emphasized throughout, with no risk of getting lost in details and technicalities. The book breaks with tradition by beginning with the QR factorization - an important and fresh idea for students, and the thread that connects most of the algorithms of numerical linear algebra.
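
    As a quick illustration of the factorization the book starts from, here is a minimal sketch (not taken from the text) using NumPy's built-in routine; the matrix and right-hand side below are arbitrary examples:

        import numpy as np

        # An arbitrary tall matrix and right-hand side for a least-squares problem.
        A = np.array([[1.0, 2.0],
                      [3.0, 4.0],
                      [5.0, 6.0]])
        b = np.array([1.0, 0.0, 1.0])

        # Reduced QR factorization: A = QR, with Q having orthonormal columns
        # and R upper triangular.
        Q, R = np.linalg.qr(A)
        assert np.allclose(Q.T @ Q, np.eye(2))   # columns of Q are orthonormal
        assert np.allclose(Q @ R, A)             # the factorization reconstructs A

        # Least-squares solution of Ax ~ b via QR: solve the triangular system R x = Q^T b.
        x = np.linalg.solve(R, Q.T @ b)
        print(x)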

Innumeracy: Mathematical Illiteracy and Its Consequences


John Allen Paulos - 1988
    Dozens of examples in Innumeracy show us how innumeracy not only affects personal economics and travel plans but also explains mis-chosen mates, inappropriate drug-testing, and the allure of pseudo-science.

Linear Algebra and Its Applications


Gilbert Strang - 1976
    While the mathematics is there, the effort is not all concentrated on proofs. Strang's emphasis is on understanding. He explains concepts rather than simply deducing them. This book is written in an informal and personal style and teaches real mathematics. The gears change in Chapter 2 as students reach the introduction of vector spaces. Throughout the book, the theory is motivated and reinforced by genuine applications, allowing pure mathematicians to teach applied mathematics.

The Data Detective: Ten Easy Rules to Make Sense of Statistics


Tim Harford - 2020
    Distrusting statistics is a mistake, Tim Harford says in The Data Detective. We shouldn’t be suspicious of statistics—we need to understand what they mean and how they can improve our lives: they are, at heart, human behavior seen through the prism of numbers and are often “the only way of grasping much of what is going on around us.” If we can toss aside our fears and learn to approach them clearly—understanding how our own preconceptions lead us astray—statistics can point to ways we can live better and work smarter. As “perhaps the best popular economics writer in the world” (New Statesman), Tim Harford is an expert at taking complicated ideas and untangling them for millions of readers. In The Data Detective, he uses new research in science and psychology to set out ten strategies for using statistics to erase our biases and replace them with new ideas that use virtues like patience, curiosity, and good sense to better understand ourselves and the world. As a result, The Data Detective is a big-idea book about statistics and human behavior that is fresh, unexpected, and insightful.

The Master Algorithm: How the Quest for the Ultimate Learning Machine Will Remake Our World


Pedro Domingos - 2015
    In The Master Algorithm, Pedro Domingos lifts the veil to give us a peek inside the learning machines that power Google, Amazon, and your smartphone. He assembles a blueprint for the future universal learner--the Master Algorithm--and discusses what it will mean for business, science, and society. If data-ism is today's philosophy, this book is its bible.

Deep Learning for Coders with Fastai and Pytorch: AI Applications Without a PhD


Jeremy Howard - 2020
    As this hands-on guide demonstrates, programmers comfortable with Python can achieve impressive results in deep learning with little math background, small amounts of data, and minimal code. How? With fastai, the first library to provide a consistent interface to the most frequently used deep learning applications. Authors Jeremy Howard and Sylvain Gugger show you how to train a model on a wide range of tasks using fastai and PyTorch. You'll also dive progressively further into deep learning theory to gain a complete understanding of the algorithms behind the scenes. The book shows you how to:
    - Train models in computer vision, natural language processing, tabular data, and collaborative filtering
    - Learn the latest deep learning techniques that matter most in practice
    - Improve accuracy, speed, and reliability by understanding how deep learning models work
    - Discover how to turn your models into web applications
    - Implement deep learning algorithms from scratch
    - Consider the ethical implications of your work
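
    To make the workflow concrete, here is a minimal sketch in the spirit of the fastai quick-start examples (assumptions: a recent fastai version where the vision learner is named vision_learner, and internet access to download the sample pets dataset; not reproduced from the book itself):

        # Train an image classifier with fastai's high-level API.
        from fastai.vision.all import *

        path = untar_data(URLs.PETS) / "images"   # Oxford-IIIT Pets sample data

        def is_cat(fname):
            # In this dataset's naming scheme, cat image files start with an uppercase letter.
            return fname[0].isupper()

        dls = ImageDataLoaders.from_name_func(
            path, get_image_files(path), valid_pct=0.2, seed=42,
            label_func=is_cat, item_tfms=Resize(224))

        learn = vision_learner(dls, resnet34, metrics=error_rate)
        learn.fine_tune(1)  # one epoch of transfer learning from a pretrained ResNet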

Standard Deviations: Flawed Assumptions, Tortured Data, and Other Ways to Lie with Statistics


Gary Smith - 2014
    In Standard Deviations, economics professor Gary Smith walks us through the various tricks and traps that people use to back up their own crackpot theories. Sometimes, the unscrupulous deliberately try to mislead us. Other times, the well-intentioned are blissfully unaware of the mischief they are committing. Today, data is so plentiful that researchers spend precious little time distinguishing between good, meaningful indicators and total rubbish. Not only do others use data to fool us, we fool ourselves. With the breakout success of Nate Silver’s The Signal and the Noise, the once humdrum subject of statistics has never been hotter. Drawing on breakthrough research in behavioral economics by luminaries like Daniel Kahneman and Dan Ariely and taking to task some of the conclusions of Freakonomics author Steven D. Levitt, Standard Deviations demystifies the science behind statistics and makes it easy to spot the fraud all around.

The Lady Tasting Tea: How Statistics Revolutionized Science in the Twentieth Century


David Salsburg - 2001
    At a summer tea party in Cambridge, England, a guest states that tea poured into milk tastes different from milk poured into tea. Her notion is shouted down by the scientific minds of the group. But one man, Ronald Fisher, proposes to scientifically test the hypothesis. There is no better person to conduct such an experiment, for Fisher is a pioneer in the field of statistics. The Lady Tasting Tea spotlights not only Fisher's theories but also the revolutionary ideas of dozens of men and women that affect our modern everyday lives. Writing with verve and wit, David Salsburg traces breakthroughs ranging from the rise and fall of Karl Pearson's theories to the methods of quality control that rebuilt postwar Japan's economy, including a pivotal early study on the capacity of a small beer cask at the Guinness brewing factory. Brimming with intriguing tidbits and colorful characters, The Lady Tasting Tea salutes the spirit of those who dared to look at the world in a new way.
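
    Fisher's tea-tasting design can be analyzed today with his exact test. A small sketch (not from the book; it assumes the lady classified all eight cups correctly, the classic version of the story):

        # Fisher's exact test on the tea-tasting 2x2 table (illustration only).
        # Rows: cups that were actually milk-first / tea-first.
        # Columns: how the lady labeled them. Here she gets all eight right.
        from scipy.stats import fisher_exact

        table = [[4, 0],
                 [0, 4]]

        odds_ratio, p_value = fisher_exact(table, alternative="greater")
        print(p_value)  # 1/70, about 0.014: hard to achieve by guessing alone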

Statistical Techniques in Business & Economics [With CDROM]


Douglas A. Lind - 1974
    The text is non-threatening and presents concepts clearly and succinctly with a conversational writing style. All statistical concepts are illustrated with solved applied examples immediately upon introduction. Self-reviews and exercises for each section, along with review sections for groups of chapters, also support student learning. Modern computing applications (Excel, Minitab, and MegaStat) are introduced, but the text maintains a focus on presenting statistics concepts as applied in business as opposed to technology or programming methods. The thirteenth edition continues as a students' text with increased emphasis on interpretation of data and results.

Machine Learning With Random Forests And Decision Trees: A Mostly Intuitive Guide, But Also Some Python


Scott Hartshorn - 2016
    Random Forests are typically used to categorize something based on other data that you have. The purpose of this book is to help you understand how Random Forests work, as well as the different options that you have when using them to analyze a problem. Additionally, since Decision Trees are a fundamental part of Random Forests, this book explains how they work. The book is focused on understanding Random Forests at the conceptual level: knowing how they work, why they work the way that they do, and what options are available to improve results. It covers how Random Forests work in an intuitive way, and also explains the equations behind many of the functions, but it only has a small amount of actual code (in Python). The focus is on giving examples and providing analogies for the most fundamental aspects of how random forests and decision trees work, because those are easy to understand and they stick with you. There are also some really interesting aspects of random forests, such as information gain, feature importances, and out-of-bag error, that simply cannot be well covered without diving into the equations of how they work. For those, the focus is on providing the information in a straightforward and easy-to-understand way.
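
    The concepts named above, feature importances and out-of-bag error, map directly onto a few lines of scikit-learn. A minimal sketch, not taken from the book, using a standard sample dataset:

        # Train a random forest, then inspect feature importances and the
        # out-of-bag (OOB) accuracy estimate. Illustration only, not the book's code.
        from sklearn.datasets import load_iris
        from sklearn.ensemble import RandomForestClassifier

        X, y = load_iris(return_X_y=True)

        # oob_score=True scores each tree on the bootstrap samples it never saw,
        # giving a built-in validation estimate without a separate hold-out set.
        forest = RandomForestClassifier(n_estimators=200, oob_score=True, random_state=0)
        forest.fit(X, y)

        print("out-of-bag accuracy:", forest.oob_score_)
        print("feature importances:", forest.feature_importances_)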

Applied Linear Regression Models - 4th Edition with Student CD (McGraw Hill/Irwin Series: Operations and Decision Sciences)


Michael H. Kutner - 2003
    Cases, datasets, and examples allow for a more real-world perspective and explore relevant uses of regression techniques in business today.
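
    For readers who want to see what fitting such a model looks like in code, here is a minimal sketch (not tied to the book's cases or datasets; the data below are simulated) using statsmodels:

        # Fit and summarize a simple linear regression on simulated data.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        x = rng.uniform(0, 10, size=50)            # hypothetical predictor
        y = 2.0 + 0.5 * x + rng.normal(size=50)    # hypothetical response

        X = sm.add_constant(x)        # add an intercept column
        model = sm.OLS(y, X).fit()    # ordinary least squares
        print(model.params)           # estimated intercept and slope
        print(model.summary())        # coefficients, standard errors, R-squared, etc.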

Calling Bullshit: The Art of Skepticism in a Data-Driven World


Carl T. Bergstrom - 2020
    Now, two science professors give us the tools to dismantle misinformation and think clearly in a world of fake news and bad data. It's increasingly difficult to know what's true. Misinformation, disinformation, and fake news abound. Our media environment has become hyperpartisan. Science is conducted by press release. Startup culture elevates bullshit to high art. We are fairly well equipped to spot the sort of old-school bullshit that is based in fancy rhetoric and weasel words, but most of us don't feel qualified to challenge the avalanche of new-school bullshit presented in the language of math, science, or statistics. In Calling Bullshit, Professors Carl Bergstrom and Jevin West give us a set of powerful tools to cut through the most intimidating data. You don't need a lot of technical expertise to call out problems with data. Are the numbers or results too good or too dramatic to be true? Is the claim comparing like with like? Is it confirming your personal bias? Drawing on a deep well of expertise in statistics and computational biology, Bergstrom and West exuberantly unpack examples of selection bias and muddled data visualization, distinguish between correlation and causation, and examine the susceptibility of science to modern bullshit. We have always needed people who call bullshit when necessary, whether within a circle of friends, a community of scholars, or the citizenry of a nation. Now that bullshit has evolved, we need to relearn the art of skepticism.

Econometrics


Fumio Hayashi - 2000
    It introduces first-year Ph.D. students to standard graduate econometrics material from a modern perspective. It covers all the standard material necessary for understanding the principal techniques of econometrics, from ordinary least squares through cointegration. The book is also distinctive in developing both time-series and cross-section analysis fully, giving the reader a unified framework for understanding and integrating results. Econometrics has many useful features and covers all the important topics in econometrics in a succinct manner. All the estimation techniques that could possibly be taught in a first-year graduate course, except maximum likelihood, are treated as special cases of GMM (generalized method of moments). Maximum likelihood estimators for a variety of models (such as probit and tobit) are collected in a separate chapter. This arrangement enables students to learn various estimation techniques in an efficient manner. Eight of the ten chapters include a serious empirical application drawn from labor economics, industrial organization, domestic and international finance, and macroeconomics. These empirical exercises at the end of each chapter provide students a hands-on experience applying the techniques covered in the chapter. The exposition is rigorous yet accessible to students who have a working knowledge of very basic linear algebra and probability theory. All the results are stated as propositions, so that students can see the points of the discussion and also the conditions under which those results hold. Most propositions are proved in the text. For those who intend to write a thesis on applied topics, the empirical applications of the book are a good way to learn how to conduct empirical research. For the theoretically inclined, the no-compromise treatment of the basic techniques is a good preparation for more advanced theory courses.
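
    A small numerical sketch (not from the text) of the unifying idea mentioned above: ordinary least squares drops out of GMM by solving the sample analogue of the moment condition E[x(y - x'b)] = 0, which yields the familiar closed-form b = (X'X)^(-1) X'y.

        # OLS as a method-of-moments estimator on simulated data (illustration only).
        import numpy as np

        rng = np.random.default_rng(1)
        n = 500
        X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])  # intercept + 2 regressors
        beta_true = np.array([1.0, 2.0, -0.5])
        y = X @ beta_true + rng.normal(size=n)

        # Set the sample moment (1/n) X'(y - Xb) to zero and solve for b:
        # this is exactly the OLS normal-equations solution b = (X'X)^{-1} X'y.
        beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
        print(beta_hat)  # close to [1.0, 2.0, -0.5]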

How to Lie with Statistics


Darrell Huff - 1954
    Darrell Huff runs the gamut of every popularly used type of statistic, probes such things as the sample study, the tabulation method, the interview technique, or the way the results are derived from the figures, and points up the countless number of dodges which are used to fool rather than to inform.

OpenIntro Statistics


David M. Diez - 2012
    Our inaugural effort is OpenIntro Statistics. Probability is optional, inference is key, and we feature real data whenever possible. Files for the entire book are freely available at openintro.org, and anybody can purchase a paperback copy from amazon.com for under $10. The future for OpenIntro depends on the involvement and enthusiasm of our community. Visit our website, openintro.org. We provide free course management tools, including an online question bank, utilities for creating course quizzes, and many other helpful resources.