Linear Algebra and Its Applications [with CD-ROM]


David C. Lay - 1993
    

Introduction to Real Analysis


Robert G. Bartle - 1982
    This book provides the fundamental concepts and techniques of real analysis for readers in a wide range of fields. It helps readers develop the ability to think deductively, analyze mathematical situations, and extend ideas to new contexts. Like the first two editions, this edition maintains the same spirit and user-friendly approach, with some streamlined arguments, a few new examples, rearranged topics, and a new chapter on the Generalized Riemann Integral.

Human Compatible: Artificial Intelligence and the Problem of Control


Stuart Russell - 2019
    Conflict between humans and machines is seen as inevitable and its outcome all too predictable. In this groundbreaking book, distinguished AI researcher Stuart Russell argues that this scenario can be avoided, but only if we rethink AI from the ground up. Russell begins by exploring the idea of intelligence in humans and in machines. He describes the near-term benefits we can expect, from intelligent personal assistants to vastly accelerated scientific research, and outlines the AI breakthroughs that still have to happen before we reach superhuman AI. He also spells out the ways humans are already finding to misuse AI, from lethal autonomous weapons to viral sabotage. If the predicted breakthroughs occur and superhuman AI emerges, we will have created entities far more powerful than ourselves. How can we ensure they never, ever, have power over us? Russell suggests that we can rebuild AI on a new foundation, according to which machines are designed to be inherently uncertain about the human preferences they are required to satisfy. Such machines would be humble, altruistic, and committed to pursuing our objectives, not theirs. This new foundation would allow us to create machines that are provably deferential and provably beneficial. In a 2014 editorial co-authored with Stephen Hawking, Russell wrote, "Success in creating AI would be the biggest event in human history. Unfortunately, it might also be the last." Solving the problem of control over AI is not just possible; it is the key that unlocks a future of unlimited promise.

How to Bake Pi: An Edible Exploration of the Mathematics of Mathematics


Eugenia Cheng - 2015
    Of course, it’s not all cooking; we’ll also run the New York and Chicago marathons, pay visits to Cinderella and Lewis Carroll, and even get to the bottom of a tomato’s identity as a vegetable. This is not the math of our high school classes: mathematics, Cheng shows us, is less about numbers and formulas and more about how we know, believe, and understand anything, including whether our brother took too much cake. At the heart of How to Bake Pi is Cheng’s work on category theory—a cutting-edge “mathematics of mathematics.” Cheng combines her theory work with her enthusiasm for cooking both to shed new light on the fundamentals of mathematics and to give readers a tour of a vast territory no popular book on math has explored before. Lively, funny, and clear, How to Bake Pi will dazzle the initiated while amusing and enlightening even the most hardened math-phobe.

Mostly Harmless Econometrics: An Empiricist's Companion


Joshua D. Angrist - 2008
    In the modern experimentalist paradigm, these techniques address clear causal questions such as: Do smaller classes increase learning? Should wife batterers be arrested? How much does education raise wages? Mostly Harmless Econometrics shows how the basic tools of applied econometrics allow the data to speak. In addition to econometric essentials, Mostly Harmless Econometrics covers important new extensions (regression-discontinuity designs and quantile regression), as well as how to get standard errors right. Joshua Angrist and Jörn-Steffen Pischke explain why fancier econometric techniques are typically unnecessary and even dangerous. The applied econometric methods emphasized in this book are easy to use and relevant for many areas of contemporary social science. The book offers:
    - An irreverent review of econometric essentials
    - A focus on tools that applied researchers use most
    - Chapters on regression-discontinuity designs, quantile regression, and standard errors
    - Many empirical examples
    - A clear and concise resource with wide applications
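    For readers who want to see one of these designs in miniature, the sketch below simulates a sharp regression-discontinuity setup in Python and estimates the jump at the cutoff with robust standard errors; the statsmodels-based code and all variable names are illustrative assumptions, not material from the book.

        # Simulated sharp regression-discontinuity example (illustrative only; not from the book).
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        running = rng.uniform(-1, 1, 2000)            # running variable, cutoff at 0
        treated = (running >= 0).astype(float)        # sharp assignment rule
        outcome = 1.5 * treated + 0.8 * running + rng.normal(0, 1, 2000)

        # Regression with separate slopes on each side of the cutoff;
        # the coefficient on `treated` estimates the jump at the threshold.
        X = sm.add_constant(np.column_stack([treated, running, treated * running]))
        fit = sm.OLS(outcome, X).fit(cov_type="HC1")  # heteroskedasticity-robust standard errors
        print(fit.params[1])                          # discontinuity estimate, close to 1.5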

Principles of Statistics


M.G. Bulmer - 1979
    There are many textbooks which describe current methods of statistical analysis while neglecting related theory, and there are equally many advanced textbooks which delve into the far reaches of statistical theory while bypassing practical applications. But between these two approaches is an unfilled gap, in which theory and practice merge at an intermediate level. Professor M. G. Bulmer's Principles of Statistics, originally published in 1965, was created to fill that need. The new, corrected Dover edition of Principles of Statistics makes this invaluable mid-level text available once again for the classroom or for self-study. Principles of Statistics was created primarily for the student of the natural sciences, the social scientist, the undergraduate mathematics student, or anyone familiar with the basics of mathematical language. It assumes no previous knowledge of statistics or probability; nor is extensive mathematical knowledge necessary beyond a familiarity with the fundamentals of differential and integral calculus. (The calculus is used primarily for ease of notation; skill in the techniques of integration is not necessary in order to understand the text.) Professor Bulmer devotes the first chapters to a concise, admirably clear description of basic terminology and fundamental statistical theory: abstract concepts of probability and their applications in dice games, Mendelian heredity, etc.; definitions and examples of discrete and continuous random variables; multivariate distributions and the descriptive tools used to delineate them; expected values; etc. The book then moves quickly to more advanced levels, as Professor Bulmer describes important distributions (binomial, Poisson, exponential, normal, etc.), tests of significance, statistical inference, point estimation, regression, and correlation. Dozens of exercises and problems appear at the end of various chapters, with answers provided at the back of the book. Also included are a number of statistical tables and selected references.
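    In the spirit of the dice-game applications mentioned above, here is a small worked example of our own (not an exercise from the book): the binomial distribution gives the probability of rolling at least three sixes in ten throws of a fair die.

        # Probability of rolling at least three sixes in ten throws of a fair die,
        # computed from the binomial distribution (illustrative example, not from the book).
        from math import comb

        n, p = 10, 1 / 6
        prob = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(3, n + 1))
        print(f"P(at least 3 sixes in 10 rolls) = {prob:.4f}")   # about 0.2248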

Discovering Statistics Using SPSS (Introducing Statistical Methods)


Andy Field - 2000
    What's new in the Second Edition?
    1. Fully updated for the latest version of SPSS (version 12).
    2. More coverage of advanced statistics, including completely new coverage of non-parametric statistics; the book is 50 per cent longer than the First Edition.
    3. Each section of each chapter is now marked 1, 2 or 3 to indicate the intended level of study, helping students navigate their way through the book and making it user-friendly for students of all levels.
    4. A 'how to use this book' section at the start of the text.
    5. Characters in each chapter have defined roles, such as summarizing key points and posing questions.
    6. Several examples in each chapter for students to work through, with answers provided on the enclosed CD-ROM.

Solid State Physics: Structure and Properties of Materials


M.A. Wahab - 2005
    The first seven chapters deal with structure-related aspects such as lattice and crystal structures, bonding, packing, and diffusion of atoms, followed by imperfections and lattice vibrations. Chapter eight deals mainly with experimental methods of determining the structures of given materials. While the next nine chapters cover various physical properties of crystalline solids, the last chapter deals with the anisotropic properties of materials. This chapter has been added for the benefit of readers, to help them understand the (anisotropic) properties of crystals in terms of simple mathematical formulations such as tensors and matrices. New to the Second Edition: a chapter on the anisotropic properties of materials.

Bayesian Methods for Hackers: Probabilistic Programming and Bayesian Inference


Cameron Davidson-Pilon - 2014
    However, most discussions of Bayesian inference rely on intensely complex mathematical analyses and artificial examples, making the subject inaccessible to anyone without a strong mathematical background. Now, though, Cameron Davidson-Pilon introduces Bayesian inference from a computational perspective, bridging theory to practice and freeing you to get results using computing power. Bayesian Methods for Hackers illuminates Bayesian inference through probabilistic programming with the powerful PyMC library and the closely related Python tools NumPy, SciPy, and Matplotlib. Using this approach, you can reach effective solutions in small increments, without extensive mathematical intervention. Davidson-Pilon begins by introducing the concepts underlying Bayesian inference, comparing it with other techniques and guiding you through building and training your first Bayesian model. Next, he introduces PyMC through a series of detailed examples and intuitive explanations that have been refined after extensive user feedback. You'll learn how to use the Markov Chain Monte Carlo algorithm, choose appropriate sample sizes and priors, work with loss functions, and apply Bayesian inference in domains ranging from finance to marketing. Once you've mastered these techniques, you'll constantly turn to this guide for the working PyMC code you need to jumpstart future projects. Coverage includes:
    - Learning the Bayesian "state of mind" and its practical implications
    - Understanding how computers perform Bayesian inference
    - Using the PyMC Python library to program Bayesian analyses
    - Building and debugging models with PyMC
    - Testing your model's "goodness of fit"
    - Opening the "black box" of the Markov Chain Monte Carlo algorithm to see how and why it works
    - Leveraging the power of the "Law of Large Numbers"
    - Mastering key concepts, such as clustering, convergence, autocorrelation, and thinning
    - Using loss functions to measure an estimate's weaknesses based on your goals and desired outcomes
    - Selecting appropriate priors and understanding how their influence changes with dataset size
    - Overcoming the "exploration versus exploitation" dilemma: deciding when "pretty good" is good enough
    - Using Bayesian inference to improve A/B testing
    - Solving data science problems when only small amounts of data are available
    Cameron Davidson-Pilon has worked in many areas of applied mathematics, from the evolutionary dynamics of genes and diseases to stochastic modeling of financial prices. His contributions to the open source community include lifelines, an implementation of survival analysis in Python. Educated at the University of Waterloo and at the Independent University of Moscow, he currently works with the online commerce leader Shopify.
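    To give a flavor of the approach, the sketch below fits a minimal Bayesian model for a conversion rate, the kind of quantity an A/B test estimates; it is written against the modern PyMC API rather than the release used in the printed examples, and the simulated data and variable names are illustrative assumptions, not code from the book.

        # Minimal Bayesian model for a conversion rate, using the modern PyMC API;
        # the data are simulated and the names are illustrative, not from the book.
        import numpy as np
        import pymc as pm

        rng = np.random.default_rng(1)
        clicks = rng.binomial(1, 0.05, size=1500)         # simulated A/B-test conversions

        with pm.Model():
            rate = pm.Uniform("rate", 0, 1)               # prior on the unknown conversion rate
            pm.Bernoulli("obs", p=rate, observed=clicks)  # likelihood of the observed data
            trace = pm.sample(2000, tune=1000)            # MCMC draws from the posterior

        print(float(trace.posterior["rate"].mean()))      # posterior mean, close to 0.05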

The Giro Playboy


Michael Smith - 2006
    Along the way he falls in love, drinks a lot of beer, eats too many sweets, ponders the meaning of life on the dole, and gets admitted to hospital for a painful condition.

Nikon D3100 for Dummies


Julie Adair King - 2010
    Say you're already an experienced photographer? The helpful tips and tricks in this friendly book will get you quickly up to speed on the D3100's new 14-megapixel sensor, continuous video/live focus, full HD video, expanded autofocus, and more. As a seasoned instructor at the Palm Beach Photographic Center, Julie anticipates all questions, whether you're a beginner or digital camera pro, and offers pages of easy-to-follow advice. The book:
    - Helps you get every bit of functionality out of the new Nikon D3100 camera
    - Walks you through its exciting new features, including the 14-megapixel sensor, continuous video/live focus, full HD video, expanded autofocus, and the updated in-camera menu
    - Explores shooting in Auto mode, managing playback options, and basic troubleshooting
    - Explains how to adjust the camera's manual settings for your own preferred exposure, lighting, focus, and color style
    - Covers digital photo housekeeping tips: how to organize, edit, and share your files
    Tap all the tools in this hot new DSLR camera and start taking some great pix with Nikon D3100 For Dummies.

Data Smart: Using Data Science to Transform Information into Insight


John W. Foreman - 2013
    Major retailers are predicting everything from when their customers are pregnant to when they want a new pair of Chuck Taylors. It's a brave new world where seemingly meaningless data can be transformed into valuable insight to drive smart business decisions. But how does one exactly do data science? Do you have to hire one of these priests of the dark arts, the "data scientist," to extract this gold from your data? Nope. Data science is little more than using straightforward steps to process raw data into actionable insight. And in Data Smart, author and data scientist John Foreman will show you how that's done within the familiar environment of a spreadsheet. Why a spreadsheet? It's comfortable! You get to look at the data every step of the way, building confidence as you learn the tricks of the trade. Plus, spreadsheets are a vendor-neutral place to learn data science without the hype. But don't let the Excel sheets fool you. This is a book for those serious about learning the analytic techniques, the math and the magic, behind big data. Each chapter covers a different technique in a spreadsheet so you can follow along:
    - Mathematical optimization, including non-linear programming and genetic algorithms
    - Clustering via k-means, spherical k-means, and graph modularity
    - Data mining in graphs, such as outlier detection
    - Supervised AI through logistic regression, ensemble models, and bag-of-words models
    - Forecasting, seasonal adjustments, and prediction intervals through Monte Carlo simulation
    - Moving from spreadsheets into the R programming language
    You get your hands dirty as you work alongside John through each technique. But never fear, the topics are readily applicable and the author laces humor throughout. You'll even learn what a dead squirrel has to do with optimization modeling, which you no doubt are dying to know.
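    As a taste of one item on that list, here is a compact k-means clustering sketch; the book develops the algorithm in a spreadsheet (and later R), so this Python/NumPy version is only an illustrative reconstruction of the idea, not code from the book.

        # Tiny k-means clustering sketch in NumPy (illustrative only).
        import numpy as np

        def kmeans(points, k, iters=50, seed=0):
            rng = np.random.default_rng(seed)
            centers = points[rng.choice(len(points), size=k, replace=False)]
            for _ in range(iters):
                # assign every point to its nearest center
                dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
                labels = dists.argmin(axis=1)
                # move each center to the mean of the points assigned to it
                centers = np.array([points[labels == j].mean(axis=0) for j in range(k)])
            return labels, centers

        rng = np.random.default_rng(42)
        data = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])
        labels, centers = kmeans(data, k=2)
        print(centers)   # roughly the two cluster means, near (0, 0) and (5, 5)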

Invitation to Psychology


Carole Wade - 1998
    In clear, lively, warm prose, this edition continues the title's integration of gender, culture, and ethnicity. By the end, readers will learn how to interpret research and to address and resolve controversies. MyPsychLab is an integral part of the Wade/Tavris/Garry program. Engaging activities and assessments provide a teaching and learning system that helps students think like a psychologist. With MyPsychLab, students can watch videos on psychological research and applications, participate in virtual classic experiments, and develop critical thinking skills through writing. Invitation to Psychology, 5/e is available in a new DSM-5 Updated edition. This title is available in a variety of formats - digital and print. Pearson offers its titles on the devices students love through Pearson's MyLab products, CourseSmart, Amazon, and more.

The Language of Mathematics: Making the Invisible Visible


Keith Devlin - 1998
    "The great book of nature," said Galileo, "can be read only by those who know the language in which it was written. And this language is mathematics." In The Language of Mathematics, award-winning author Keith Devlin reveals the vital role mathematics plays in our eternal quest to understand who we are and the world we live in. More than just the study of numbers, mathematics provides us with the eyes to recognize and describe the hidden patterns of life—patterns that exist in the physical, biological, and social worlds without, and the realm of ideas and thoughts within. Taking the reader on a wondrous journey through the invisible universe that surrounds us—a universe made visible by mathematics—Devlin shows us what keeps a jumbo jet in the air, explains how we can see and hear a football game on TV, and reveals how mathematics lets us predict the weather, the behavior of the stock market, and the outcome of elections. Microwave ovens, telephone cables, children's toys, pacemakers, automobiles, and computers—all operate on mathematical principles. Far from a dry and esoteric subject, mathematics is a rich and living part of our culture. An exploration of an often woefully misunderstood subject, The Language of Mathematics celebrates the simplicity, the precision, the purity, and the elegance of mathematics.

Foundations of Statistical Natural Language Processing


Christopher D. Manning - 1999
    This foundational text is the first comprehensive introduction to statistical natural language processing (NLP) to appear. The book contains all the theory and algorithms needed for building NLP tools. It provides broad but rigorous coverage of mathematical and linguistic foundations, as well as detailed discussion of statistical methods, allowing students and researchers to construct their own implementations. The book covers collocation finding, word sense disambiguation, probabilistic parsing, information retrieval, and other applications.
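    As a hedged illustration of one application the book covers, collocation finding, the toy example below ranks bigrams by pointwise mutual information; the corpus and code are our own illustration, not material from the text.

        # Toy collocation-finding example (not from the book): score bigrams by
        # pointwise mutual information over a tiny corpus.
        from collections import Counter
        from math import log2

        tokens = "new york is big and the city is loud and new york is the city that never sleeps".split()
        unigrams = Counter(tokens)
        bigrams = Counter(zip(tokens, tokens[1:]))
        n = len(tokens)

        def pmi(w1, w2):
            # PMI(w1, w2) = log2( P(w1, w2) / (P(w1) * P(w2)) )
            p_joint = bigrams[(w1, w2)] / (n - 1)
            return log2(p_joint / ((unigrams[w1] / n) * (unigrams[w2] / n)))

        print(round(pmi("new", "york"), 2), round(pmi("is", "the"), 2))
        # the collocation "new york" (about 3.3) scores well above the chance pairing "is the" (about 1.7)

    Note that raw PMI tends to overweight rare word pairs on small samples, which is one reason a text like this one also covers hypothesis-testing approaches to collocation discovery.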