Prealgebra


Richard Rusczyk - 2011
    Topics covered in the book include the properties of arithmetic, exponents, primes and divisors, fractions, equations and inequalities, decimals, ratios and proportions, unit conversions and rates, percents, square roots, basic geometry (angles, perimeter, area, triangles, and quadrilaterals), statistics, counting and probability, and more! The text is structured to inspire the reader to explore and develop new ideas. Each section starts with problems, giving the student a chance to solve them without help before proceeding. The text then includes solutions to these problems, through which algebraic techniques are taught. Important facts and powerful problem solving approaches are highlighted throughout the text. In addition to the instructional material, the book contains well over 1000 problems. The solutions manual (sold separately) contains full solutions to all of the problems, not just answers. This book can serve as a complete Prealgebra course. This text is supplemented by free videos and a free learning system at the publisher's website.

Statistics for Business & Economics


James T. McClave - 1991
    Theoretical, yet applied. Statistics for Business and Economics, Eleventh Edition, gives you the best of both worlds. Using a rich array of applications from a variety of industries, McClave/Sincich/Benson clearly demonstrates how to use statistics effectively in a business environment. The book focuses on developing statistical thinking so the reader can better assess the credibility and value of inferences made from data. As consumers and future producers of statistical inferences, readers are introduced to a wide variety of data collection and analysis techniques to help them evaluate data and make informed business decisions. As with previous editions, this revision offers an abundance of applications with many new and updated exercises that draw on real business situations and recent economic events. The authors assume a background of basic algebra.

Introductory Linear Algebra: An Applied First Course


Bernard Kolman - 1988
    Calculus is not a prerequisite, although examples and exercises using very basic calculus are included (labeled Calculus Required). The most technology-friendly text on the market, Introductory Linear Algebra is also the most flexible. By omitting certain sections, instructors can cover the essentials of linear algebra (including eigenvalues and eigenvectors), show how the computer is used, and introduce applications of linear algebra in a one-semester course.

Statistics for People Who (Think They) Hate Statistics


Neil J. Salkind - 2000
    The book begins with an introduction to the language of statistics and then covers descriptive statistics and inferential statistics. Throughout, the author offers readers:
    - A Difficulty Rating Index for each chapter's material
    - Tips for doing and thinking about a statistical technique
    - Top tens for everything from the best ways to create a graph to the most effective techniques for data collection
    - Steps that break techniques down into a clear sequence of procedures
    - SPSS tips for executing each major statistical technique
    - Practice exercises at the end of each chapter, followed by worked-out solutions
    The book concludes with a statistical software sampler and a description of the best Internet sites for statistical information and data resources. Readers also have access to a website for downloading data that they can use to practice additional exercises from the book. Students and researchers will appreciate the book's unhurried pace and thorough, friendly presentation.

Introduction to Graph Theory


Douglas B. West - 1995
    Verification that algorithms work is emphasized more than their complexity. An effective use of examples and a huge number of interesting exercises demonstrate the topics of trees and distance, matchings and factors, connectivity and paths, graph coloring, edges and cycles, and planar graphs. For those who need to learn to make coherent arguments in the fields of mathematics and computer science.

Clinical Psychology [with InfoTrac]


Timothy J. Trull - 2000
    A highly respected clinician and researcher, Dr. Trull examines the rigorous research training that clinicians receive, along with the empirically supported assessment methods and interventions that clinical psychologists must understand to be successful in the field. This new edition of Trull's best-selling text covers cutting-edge trends, and offers enhanced coverage of culture, gender and diversity, and contemporary issues of health care. Written to inspire students thinking of pursuing careers in the field of clinical psychology, this text is a complete introduction.

Using Multivariate Statistics


Barbara G. Tabachnick - 1983
    It gives syntax and output for accomplishing many analyses through the most recent releases of SAS, SPSS, and SYSTAT, some not available in software manuals. The book maintains its practical approach, still focusing on the benefits and limitations of applications of a technique to a data set -- when, why, and how to do it. Overall, it provides advanced students with a timely and comprehensive introduction to today's most commonly encountered statistical and multivariate techniques, while assuming only a limited knowledge of higher-level mathematics.

Computer Vision: Algorithms and Applications


Richard Szeliski - 2010
    However, despite all of the recent advances in computer vision research, the dream of having a computer interpret an image at the same level as a two-year-old remains elusive. Why is computer vision such a challenging problem, and what is the current state of the art? Computer Vision: Algorithms and Applications explores the variety of techniques commonly used to analyze and interpret images. It also describes challenging real-world applications where vision is being successfully used, both for specialized applications such as medical imaging, and for fun, consumer-level tasks such as image editing and stitching, which students can apply to their own personal photos and videos. More than just a source of "recipes," this exceptionally authoritative and comprehensive textbook/reference also takes a scientific approach to basic vision problems, formulating physical models of the imaging process before inverting them to produce descriptions of a scene. These problems are also analyzed using statistical models and solved using rigorous engineering techniques. Topics and features:
    - Structured to support active curricula and project-oriented courses, with tips in the Introduction for using the book in a variety of customized courses
    - Presents exercises at the end of each chapter with a heavy emphasis on testing algorithms and containing numerous suggestions for small mid-term projects
    - Provides additional material and more detailed mathematical topics in the Appendices, which cover linear algebra, numerical techniques, and Bayesian estimation theory
    - Suggests additional reading at the end of each chapter, including the latest research in each sub-field, in addition to a full Bibliography at the end of the book
    - Supplies supplementary course material for students at the associated website, http://szeliski.org/Book/
    Suitable for an upper-level undergraduate or graduate-level course in computer science or engineering, this textbook focuses on basic techniques that work under real-world conditions and encourages students to push their creative boundaries. Its design and exposition also make it eminently suitable as a unique reference to the fundamental techniques and current research literature in computer vision.
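
    As a rough illustration of the "physical model of the imaging process" idea mentioned above, the sketch below projects 3D points through an ideal pinhole camera. The focal length, sample points, and function name are invented for illustration and are not taken from the book, whose treatment is far more general.
        import numpy as np

        def pinhole_project(points_3d, focal_length=1.0):
            """Project 3D camera-frame points onto the image plane of an ideal
            pinhole camera: (X, Y, Z) -> (f * X / Z, f * Y / Z)."""
            points_3d = np.asarray(points_3d, dtype=float)
            x = focal_length * points_3d[:, 0] / points_3d[:, 2]
            y = focal_length * points_3d[:, 1] / points_3d[:, 2]
            return np.stack([x, y], axis=1)

        # Hypothetical scene points in camera coordinates (Z > 0, in front of the camera).
        scene = [(0.5, 0.2, 2.0), (1.0, -0.3, 4.0)]
        print(pinhole_project(scene))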

Computer Age Statistical Inference: Algorithms, Evidence, and Data Science


Bradley Efron - 2016
    'Big data', 'data science', and 'machine learning' have become familiar terms in the news, as statistical methods are brought to bear upon the enormous data sets of modern science and commerce. How did we get here? And where are we going? This book takes us on an exhilarating journey through the revolution in data analysis following the introduction of electronic computation in the 1950s. Beginning with classical inferential theories - Bayesian, frequentist, Fisherian - individual chapters take up a series of influential topics: survival analysis, logistic regression, empirical Bayes, the jackknife and bootstrap, random forests, neural networks, Markov chain Monte Carlo, inference after model selection, and dozens more. The distinctly modern approach integrates methodology and algorithms with statistical inference. The book ends with speculation on the future direction of statistics and data science.
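
    As a minimal sketch of one topic on that list, the bootstrap, the following resamples a small data set to estimate the standard error of its mean; the data values are invented for illustration, and the book treats the method in far more depth.
        import numpy as np

        rng = np.random.default_rng(0)
        data = np.array([4.2, 5.1, 3.8, 6.0, 5.5, 4.9])  # hypothetical sample

        # Draw B bootstrap resamples (with replacement) and record each resample's mean.
        B = 2000
        boot_means = np.array([rng.choice(data, size=data.size, replace=True).mean()
                               for _ in range(B)])

        # The spread of the resampled means estimates the standard error of the sample mean.
        print("bootstrap standard error of the mean:", boot_means.std(ddof=1))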

Concrete Mathematics: A Foundation for Computer Science


Ronald L. Graham - 1988
    "More concretely," the authors explain, "it is the controlled manipulation of mathematical formulas, using a collection of techniques for solving problems."

Deep Learning


Ian Goodfellow - 2016
    Because the computer gathers knowledge from experience, there is no need for a human computer operator to formally specify all the knowledge that the computer needs. The hierarchy of concepts allows the computer to learn complicated concepts by building them out of simpler ones; a graph of these hierarchies would be many layers deep. This book introduces a broad range of topics in deep learning. The text offers mathematical and conceptual background, covering relevant concepts in linear algebra, probability theory and information theory, numerical computation, and machine learning. It describes deep learning techniques used by practitioners in industry, including deep feedforward networks, regularization, optimization algorithms, convolutional networks, sequence modeling, and practical methodology; and it surveys such applications as natural language processing, speech recognition, computer vision, online recommendation systems, bioinformatics, and videogames. Finally, the book offers research perspectives, covering such theoretical topics as linear factor models, autoencoders, representation learning, structured probabilistic models, Monte Carlo methods, the partition function, approximate inference, and deep generative models. Deep Learning can be used by undergraduate or graduate students planning careers in either industry or research, and by software engineers who want to begin using deep learning in their products or platforms. A website offers supplementary material for both readers and instructors.
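
    As a minimal sketch of that "hierarchy of simple concepts" idea, here is an untrained two-layer feedforward network in plain NumPy; the sizes and weights are arbitrary, so this only shows how layers compose simple functions into more complicated ones, not how the book trains such models.
        import numpy as np

        rng = np.random.default_rng(0)

        def relu(x):
            return np.maximum(0.0, x)

        # Each layer is an affine map followed by a simple nonlinearity; deeper
        # layers build their outputs out of the previous layer's simpler features.
        W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)  # layer 1: 3 inputs -> 4 hidden units
        W2, b2 = rng.normal(size=(2, 4)), np.zeros(2)  # layer 2: 4 hidden -> 2 outputs

        def forward(x):
            h = relu(W1 @ x + b1)  # simple features of the input
            return W2 @ h + b2     # a more complex function composed from them

        print(forward(np.array([0.5, -1.0, 2.0])))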

The Lady Tasting Tea: How Statistics Revolutionized Science in the Twentieth Century


David Salsburg - 2001
    At a summer tea party in Cambridge, England, a guest states that tea poured into milk tastes different from milk poured into tea. Her notion is shouted down by the scientific minds of the group. But one man, Ronald Fisher, proposes to scientifically test the hypothesis. There is no better person to conduct such an experiment, for Fisher is a pioneer in the field of statistics. The Lady Tasting Tea spotlights not only Fisher's theories but also the revolutionary ideas of dozens of men and women that affect our modern everyday lives. Writing with verve and wit, David Salsburg traces breakthroughs ranging from the rise and fall of Karl Pearson's theories to the methods of quality control that rebuilt postwar Japan's economy, including a pivotal early study on the capacity of a small beer cask at the Guinness brewing factory. Brimming with intriguing tidbits and colorful characters, The Lady Tasting Tea salutes the spirit of those who dared to look at the world in a new way.
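
    For the curious, the arithmetic behind Fisher's proposed test is small: in the standard account of the experiment (eight cups, four of each preparation), a taster guessing at random picks out all four milk-first cups with probability 1 in C(8,4) = 70. A quick check of that count; this follows the usual telling of the story, not necessarily the book's exact details.
        from math import comb

        # Eight cups, four with milk poured first; the taster must identify those four.
        arrangements = comb(8, 4)              # 70 equally likely ways to choose four cups
        print(arrangements, 1 / arrangements)  # chance of a perfect score by guessing ~ 0.014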

Introduction to Graph Theory


Richard J. Trudeau - 1994
    This book leads the reader from simple graphs through planar graphs, Euler's formula, Platonic graphs, coloring, the genus of a graph, Euler walks, Hamilton walks, more. Includes exercises. 1976 edition.
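
    The Euler's formula mentioned above relates the numbers of vertices v, edges e, and faces f of any connected graph drawn in the plane without edge crossings:
        v - e + f = 2
    For example, the graph of a cube has 8 vertices, 12 edges, and 6 faces, and 8 - 12 + 6 = 2.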

Introduction to Algorithms


Thomas H. Cormen - 1989
    Each chapter is relatively self-contained and can be used as a unit of study. The algorithms are described in English and in a pseudocode designed to be readable by anyone who has done a little programming. The explanations have been kept elementary without sacrificing depth of coverage or mathematical rigor.
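
    To give a feel for the kind of pseudocode the blurb describes, here is insertion sort, one of the book's early examples, transcribed into Python; the presentation in the text itself differs in detail.
        def insertion_sort(a):
            """Sort the list a in place, in the spirit of the book's INSERTION-SORT."""
            for j in range(1, len(a)):
                key = a[j]
                # Shift larger elements of the already-sorted prefix a[0..j-1] one slot right.
                i = j - 1
                while i >= 0 and a[i] > key:
                    a[i + 1] = a[i]
                    i -= 1
                a[i + 1] = key
            return a

        print(insertion_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]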

Foundations of Statistical Natural Language Processing


Christopher D. Manning - 1999
    This foundational text is the first comprehensive introduction to statistical natural language processing (NLP) to appear. The book contains all the theory and algorithms needed for building NLP tools. It provides broad but rigorous coverage of mathematical and linguistic foundations, as well as detailed discussion of statistical methods, allowing students and researchers to construct their own implementations. The book covers collocation finding, word sense disambiguation, probabilistic parsing, information retrieval, and other applications.
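
    As a toy illustration of the collocation-finding task mentioned above, the snippet below scores a bigram by pointwise mutual information over a tiny invented corpus; the corpus and numbers are made up, and the book develops such measures with real data and proper hypothesis tests.
        import math
        from collections import Counter

        corpus = "strong tea is strong , strong coffee is not weak tea".split()

        unigrams = Counter(corpus)
        bigrams = Counter(zip(corpus, corpus[1:]))
        N = len(corpus)

        def pmi(w1, w2):
            """Pointwise mutual information: log2( P(w1, w2) / (P(w1) * P(w2)) )."""
            p_joint = bigrams[(w1, w2)] / (N - 1)
            return math.log2(p_joint / ((unigrams[w1] / N) * (unigrams[w2] / N)))

        # Higher PMI means the pair co-occurs more often than chance would predict.
        print(pmi("strong", "tea"))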