What Is Data Science?


Mike Loukides - 2011
    Five years ago, in What is Web 2.0, Tim O'Reilly said that "data is the next Intel Inside." But what does that statement mean? Why do we suddenly care about statistics and about data? This report examines the many sides of data science -- the technologies, the companies, and the unique skill sets. The web is full of "data-driven apps." Almost any e-commerce application is a data-driven application. There's a database behind a web front end, and middleware that talks to a number of other databases and data services (credit card processing companies, banks, and so on). But merely using data isn't really what we mean by "data science." A data application acquires its value from the data itself, and creates more data as a result. It's not just an application with data; it's a data product. Data science enables the creation of data products.
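
    A minimal sketch of that loop in Python (the event store, schema, and function names are invented for illustration, not taken from the report): the application derives its value from accumulated data and writes the results of its own use back into that data, so a data product creates more data as it is used.

        import json
        from collections import Counter
        from datetime import datetime, timezone

        # Hypothetical in-memory "database": past purchase events, one dict per event.
        events = [
            {"user": "alice", "item": "espresso beans"},
            {"user": "bob", "item": "espresso beans"},
            {"user": "bob", "item": "burr grinder"},
        ]

        def recommend(user):
            """Derive value from the data: suggest the most popular item
            this user has not bought yet."""
            bought = {e["item"] for e in events if e["user"] == user}
            popularity = Counter(e["item"] for e in events if e["item"] not in bought)
            return popularity.most_common(1)[0][0] if popularity else None

        def record_click(user, item):
            """Create more data as a result: using the product feeds new
            events back into the same store."""
            events.append({"user": user, "item": item,
                           "clicked_at": datetime.now(timezone.utc).isoformat()})

        suggestion = recommend("alice")
        if suggestion:
            record_click("alice", suggestion)
        print(json.dumps(events[-1], indent=2))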

A Field Guide to Lies: Critical Thinking in the Information Age


Daniel J. Levitin - 2016
    We are bombarded with more information each day than our brains can process—especially in election season. It's raining bad data, half-truths, and even outright lies. New York Times bestselling author Daniel J. Levitin shows how to recognize misleading announcements, statistics, graphs, and written reports, revealing the ways lying weasels can use them. It's becoming harder to separate the wheat from the digital chaff. How do we distinguish misinformation, pseudo-facts, distortions, and outright lies from reliable information? Levitin groups his field guide into two categories—statistical information and faulty arguments—ultimately showing how science is the bedrock of critical thinking. Infoliteracy means understanding that there are hierarchies of source quality and bias that variously distort our information feeds via every media channel, including social media. We may expect newspapers, bloggers, the government, and Wikipedia to be factually and logically correct, but they so often aren't. We need to think critically about the words and numbers we encounter if we want to be successful at work, at play, and in making the most of our lives. This means checking the plausibility and reasoning—not passively accepting information, repeating it, and making decisions based on it. Readers learn to avoid the extremes of passive gullibility and cynical rejection. Levitin's charming, entertaining, accessible guide can help anyone wake up to a whole lot of things that aren't so. And catch some lying weasels in their tracks!

Quantum Computing Since Democritus


Scott Aaronson - 2013
    Full of insights, arguments and philosophical perspectives, the book covers an amazing array of topics. Beginning in antiquity with Democritus, it progresses through logic and set theory, computability and complexity theory, quantum computing, cryptography, the information content of quantum states and the interpretation of quantum mechanics. There are also extended discussions about time travel, Newcomb's Paradox, the anthropic principle and the views of Roger Penrose. Aaronson's informal style makes this fascinating book accessible to readers with scientific backgrounds, as well as students and researchers working in physics, computer science, mathematics and philosophy.

Learn You a Haskell for Great Good!


Miran Lipovača - 2011
    Learn You a Haskell for Great Good! introduces programmers familiar with imperative languages (such as C++, Java, or Python) to the unique aspects of functional programming. Packed with jokes, pop culture references, and the author's own hilarious artwork, Learn You a Haskell for Great Good! eases the learning curve of this complex language, and is a perfect starting point for any programmer looking to expand his or her horizons. The well-known web tutorial on which this book is based is widely regarded as the best way for beginners to learn Haskell, and receives over 30,000 unique visitors monthly.

Speech and Language Processing: An Introduction to Natural Language Processing, Computational Linguistics and Speech Recognition


Dan Jurafsky - 2000
    This comprehensive work covers both statistical and symbolic approaches to language processing; it shows how they can be applied to important tasks such as speech recognition, spelling and grammar correction, information extraction, search engines, machine translation, and the creation of spoken-language dialog agents. The following distinguishing features make the text both an introduction to the field and an advanced reference guide:
    - Unified and comprehensive coverage of the field: covers the fundamental algorithms of each area, whether proposed for spoken or written language, whether logical or statistical in origin.
    - Emphasis on web and other practical applications: gives readers an understanding of how language-related algorithms can be applied to important real-world problems.
    - Emphasis on scientific evaluation: offers a description of how systems are evaluated within each problem domain.
    - Empiricist/statistical/machine learning approaches to language processing: covers all the new statistical approaches, while still completely covering the earlier, more structured and rule-based methods.
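
    One of the fundamental algorithms the book covers, used in spelling correction among other tasks, is minimum edit distance. The Python sketch below assumes unit costs for insertions, deletions, and substitutions (weighted variants exist as well); it is an illustration, not the book's own code.

        def edit_distance(source, target):
            """Minimum number of insertions, deletions, and substitutions
            needed to turn `source` into `target` (Levenshtein distance)."""
            m, n = len(source), len(target)
            # d[i][j] = distance between source[:i] and target[:j]
            d = [[0] * (n + 1) for _ in range(m + 1)]
            for i in range(m + 1):
                d[i][0] = i              # delete all of source[:i]
            for j in range(n + 1):
                d[0][j] = j              # insert all of target[:j]
            for i in range(1, m + 1):
                for j in range(1, n + 1):
                    sub = 0 if source[i - 1] == target[j - 1] else 1
                    d[i][j] = min(d[i - 1][j] + 1,          # deletion
                                  d[i][j - 1] + 1,          # insertion
                                  d[i - 1][j - 1] + sub)    # substitution or match
            return d[m][n]

        print(edit_distance("intention", "execution"))  # -> 5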

Time Series Analysis


James Douglas Hamilton - 1994
    This book synthesizes recent advances in time series analysis and makes them accessible to first-year graduate students. James Hamilton provides the first adequate textbook treatments of important innovations such as vector autoregressions, generalized method of moments, the economic and statistical consequences of unit roots, time-varying variances, and nonlinear time series models. In addition, he presents basic tools for analyzing dynamic systems (including linear representations, autocovariance generating functions, spectral analysis, and the Kalman filter) in a way that integrates economic theory with the practical difficulties of analyzing and interpreting real-world data. Time Series Analysis fills an important need for a textbook that integrates economic theory, econometrics, and new results. The book is intended to provide students and researchers with a self-contained survey of time series analysis. It starts from first principles and should be readily accessible to any beginning graduate student, while it is also intended to serve as a reference book for researchers. -- "Journal of Economics"
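
    As a small illustration of the kind of model the book treats, the Python sketch below (not from the book, which predates these tools) simulates a first-order autoregression y_t = c + phi*y_{t-1} + eps_t and recovers its coefficients by ordinary least squares; the parameter values are arbitrary.

        import numpy as np

        rng = np.random.default_rng(0)

        # Simulate an AR(1) process: y_t = c + phi * y_{t-1} + eps_t
        c, phi, sigma, T = 1.0, 0.8, 1.0, 5000
        y = np.empty(T)
        y[0] = c / (1 - phi)                  # start at the unconditional mean
        for t in range(1, T):
            y[t] = c + phi * y[t - 1] + sigma * rng.standard_normal()

        # Estimate (c, phi) by OLS regression of y_t on a constant and y_{t-1}
        X = np.column_stack([np.ones(T - 1), y[:-1]])
        c_hat, phi_hat = np.linalg.lstsq(X, y[1:], rcond=None)[0]
        print(f"true phi = {phi}, estimated phi = {phi_hat:.3f}")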

Calculus


Dale E. Varberg - 1999
    Covering the various materials needed by students in engineering, science, and mathematics, this calculus text makes effective use of computing technology, graphics, and applications. It presents at least two technology projects in each chapter.

Introduction to Mathematical Statistics


Robert V. Hogg - 1962
    Designed for two-semester, beginning graduate courses in Mathematical Statistics, and for senior undergraduate Mathematics, Statistics, and Actuarial Science majors, this text retains its established features and continues to provide students with the background material they need.

Neural Networks and Deep Learning


Michael Nielsen - 2013
    The book will teach you about:
    * Neural networks, a beautiful biologically-inspired programming paradigm which enables a computer to learn from observational data
    * Deep learning, a powerful set of techniques for learning in neural networks
    Neural networks and deep learning currently provide the best solutions to many problems in image recognition, speech recognition, and natural language processing. This book will teach you the core concepts behind neural networks and deep learning.
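
    The sketch below is not Nielsen's code, just a minimal Python illustration of the underlying idea: a single sigmoid neuron adjusting its weights by gradient descent until it reproduces the OR function from observed examples (the learning rate and epoch count are arbitrary choices).

        import numpy as np

        rng = np.random.default_rng(0)

        def sigmoid(z):
            return 1.0 / (1.0 + np.exp(-z))

        # "Observational data": inputs and targets for the logical OR function.
        X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
        y = np.array([0.0, 1.0, 1.0, 1.0])

        # A single sigmoid neuron: output = sigmoid(X @ w + b)
        w = rng.standard_normal(2)
        b = 0.0
        lr = 1.0

        for _ in range(2000):
            out = sigmoid(X @ w + b)
            error = out - y                    # gradient of cross-entropy loss w.r.t. pre-activation
            w -= lr * X.T @ error / len(X)     # gradient descent step for the weights
            b -= lr * error.mean()             # ... and for the bias

        print(np.round(sigmoid(X @ w + b), 2))  # approaches [0, 1, 1, 1]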

A Book of Abstract Algebra


Charles C. Pinter - 1982
    The book's easy-to-read treatment offers an intuitive approach, featuring informal discussions followed by thematically arranged exercises. Intended for undergraduate courses in abstract algebra, it is suitable for junior- and senior-level math majors and future math teachers. This second edition features additional exercises to improve student familiarity with applications. An introductory chapter traces concepts of abstract algebra from their historical roots. Succeeding chapters avoid the conventional format of definition-theorem-proof-corollary-example; instead, they take the form of a discussion with students, focusing on explanations and offering motivation. Each chapter rests upon a central theme, usually a specific application or use. The author provides elementary background as needed and discusses standard topics in their usual order. He introduces many advanced and peripheral subjects in the plentiful exercises, which are accompanied by ample instruction and commentary and offer a wide range of experiences to students at different levels of ability.

Survey Methodology


Robert M. Groves - 2004
    Survey Methodology describes the basic principles of survey design discovered in methodological research over recent years and offers guidance for making successful decisions in the design and execution of high quality surveys. Written by six nationally recognized experts in the field, this book covers the major considerations in designing and conducting a sample survey. Topical, accessible, and succinct, this book represents the state of the science in survey methodology. Employing the "total survey error" paradigm as an organizing framework, it merges the science of surveys with state-of-the-art practices. End-of-chapter terms, references, and exercises enhance its value as a reference for practitioners and as a text for advanced students.

Structure and Interpretation of Computer Programs


Harold Abelson - 1984
    This long-awaited revision contains changes throughout the text. There are new implementations of most of the major programming systems in the book, including the interpreters and compilers, and the authors have incorporated many small changes that reflect their experience teaching the course at MIT since the first edition was published. A new theme has been introduced that emphasizes the central role played by different approaches to dealing with time in computational models: objects with state, concurrent programming, functional programming and lazy evaluation, and nondeterministic programming. There are new example sections on higher-order procedures in graphics and on applications of stream processing in numerical programming, and many new exercises. In addition, all the programs have been reworked to run in any Scheme implementation that adheres to the IEEE standard.

C Programming: Language: A Step by Step Beginner's Guide to Learn C Programming in 7 Days


Darrel L. Graham - 2016
    It is a great book, not just for beginning programmers, but also for computer users who want an idea of what is happening behind the scenes as they work with various computer programs. In this book, you are going to learn what the C programming language entails and how to write conditions, expressions, statements, and even commands so that the language performs its functions efficiently. You will also learn how to organize relevant expressions so that, after compilation and execution, the computer returns useful results and not error messages. Additionally, this book details the data types that you need for the C language and how to present them. Simply put, this is a book for programmers, learners taking other computer courses, and other computer users who would like to be versed in the workings of the most popular computer language, C. In this book you'll learn:
    - What Is The C Language?
    - Setting Up Your Local Environment
    - The C Structure and Data Type
    - C Constants and Literals
    - C Storage Classes
    - Making Decisions In C
    - The Role Of Loops In C Programming
    - Functions in C Programming
    - Structures and Union in C
    - Bit Fields and Typedef Within C
    - C Header Files and Type Casting
    - Benefits Of Using The C Language
    ...and much more!

Big Data: A Revolution That Will Transform How We Live, Work, and Think


Viktor Mayer-Schönberger - 2013
    “Big data” refers to our burgeoning ability to crunch vast collections of information, analyze it instantly, and draw sometimes profoundly surprising conclusions from it. This emerging science can translate myriad phenomena—from the price of airline tickets to the text of millions of books—into searchable form, and uses our increasing computing power to unearth epiphanies that we never could have seen before. A revolution on par with the Internet or perhaps even the printing press, big data will change the way we think about business, health, politics, education, and innovation in the years to come. It also poses fresh threats, from the inevitable end of privacy as we know it to the prospect of being penalized for things we haven’t even done yet, based on big data’s ability to predict our future behavior. In this brilliantly clear, often surprising work, two leading experts explain what big data is, how it will change our lives, and what we can do to protect ourselves from its hazards. Big Data is the first big book about the next big thing. www.big-data-book.com

Linked: How Everything Is Connected to Everything Else and What It Means for Business, Science, and Everyday Life


Albert-László Barabási - 2002
    Albert-László Barabási, the nation’s foremost expert in the new science of networks and author of Bursts, takes us on an intellectual adventure to prove that social networks, corporations, and living organisms are more similar than previously thought. A full understanding of network science will someday allow us to design blue-chip businesses, stop the outbreak of deadly diseases, and influence the exchange of ideas and information. Just as James Gleick brought the discovery of chaos theory to the general public, Linked tells the story of the true science of the future and of experiments in statistical mechanics on the internet, all vital parts of what would eventually be called the Barabási–Albert model.
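
    A minimal Python sketch of the preferential attachment mechanism behind the Barabási–Albert model mentioned above (the parameters are illustrative): each new node links to existing nodes with probability proportional to their current degree, which is how the model grows a few heavily connected hubs.

        import random
        from collections import Counter

        def barabasi_albert(n, m, seed=0):
            """Grow a network of n nodes; each new node attaches to m existing
            nodes chosen with probability proportional to their degree."""
            random.seed(seed)
            # Start from a small fully connected core of m + 1 nodes.
            edges = [(i, j) for i in range(m + 1) for j in range(i)]
            # One entry per edge endpoint: sampling uniformly from this list
            # picks nodes in proportion to their degree.
            endpoints = [node for edge in edges for node in edge]
            for new in range(m + 1, n):
                targets = set()
                while len(targets) < m:
                    targets.add(random.choice(endpoints))
                for t in targets:
                    edges.append((new, t))
                    endpoints.extend([new, t])
            return edges

        edges = barabasi_albert(n=1000, m=2)
        degree = Counter(node for edge in edges for node in edge)
        print("five most connected hubs:", degree.most_common(5))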