Fluent Python: Clear, Concise, and Effective Programming


Luciano Ramalho - 2015
    With this hands-on guide, you'll learn how to write effective, idiomatic Python code by leveraging its best and possibly most neglected features. Author Luciano Ramalho takes you through Python's core language features and libraries, and shows you how to make your code shorter, faster, and more readable at the same time. Many experienced programmers try to bend Python to fit patterns they learned from other languages, and never discover Python features outside of their experience. With this book, those Python programmers will thoroughly learn how to become proficient in Python 3. This book covers:
    - Python data model: understand how special methods are the key to the consistent behavior of objects (a short sketch follows this list)
    - Data structures: take full advantage of built-in types, and understand the text vs. bytes duality in the Unicode age
    - Functions as objects: view Python functions as first-class objects, and understand how this affects popular design patterns
    - Object-oriented idioms: build classes by learning about references, mutability, interfaces, operator overloading, and multiple inheritance
    - Control flow: leverage context managers, generators, coroutines, and concurrency with the concurrent.futures and asyncio packages
    - Metaprogramming: understand how properties, attribute descriptors, class decorators, and metaclasses work
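    As a flavor of the "Python data model" theme, here is a minimal sketch (illustrative only, not reproduced from the book) of how implementing the __len__ and __getitem__ special methods gives an ordinary class support for len(), indexing, and iteration:

        # Illustrative deck-of-cards class: defining __len__ and __getitem__
        # lets built-in operations (len, indexing, for-loops) work on it.
        import collections

        Card = collections.namedtuple('Card', ['rank', 'suit'])

        class Deck:
            ranks = [str(n) for n in range(2, 11)] + list('JQKA')
            suits = 'spades diamonds clubs hearts'.split()

            def __init__(self):
                self._cards = [Card(rank, suit)
                               for suit in self.suits
                               for rank in self.ranks]

            def __len__(self):
                return len(self._cards)

            def __getitem__(self, position):
                return self._cards[position]

        deck = Deck()
        print(len(deck))   # 52
        print(deck[0])     # Card(rank='2', suit='spades')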

Qualitative Data Analysis: A Methods Sourcebook


Matthew B. Miles - 2013
    Several of the data display strategies from previous editions are now presented in re-envisioned and reorganized formats to enhance reader accessibility and comprehension. The Third Edition's presentation of the fundamentals of research design and data management is followed by five distinct methods of analysis: exploring, describing, ordering, explaining, and predicting. Miles and Huberman's original research studies are profiled and accompanied by new examples from Saldaña's recent qualitative work. The book's most celebrated chapter, Drawing and Verifying Conclusions, is retained and revised, while the chapter on report writing has been greatly expanded and is now called Writing About Qualitative Research. Comprehensive and authoritative, Qualitative Data Analysis has been elegantly revised for a new generation of qualitative researchers.

Natural Language Processing with Python


Steven Bird - 2009
    With it, you'll learn how to write Python programs that work with large collections of unstructured text. You'll access richly annotated datasets using a comprehensive range of linguistic data structures, and you'll understand the main algorithms for analyzing the content and structure of written communication. Packed with examples and exercises, Natural Language Processing with Python will help you:
    - Extract information from unstructured text, either to guess the topic or identify "named entities" (a short sketch follows this description)
    - Analyze linguistic structure in text, including parsing and semantic analysis
    - Access popular linguistic databases, including WordNet and treebanks
    - Integrate techniques drawn from fields as diverse as linguistics and artificial intelligence
    This book will help you gain practical skills in natural language processing using the Python programming language and the Natural Language Toolkit (NLTK) open source library. If you're interested in developing web applications, analyzing multilingual news sources, or documenting endangered languages -- or if you're simply curious to have a programmer's perspective on how human language works -- you'll find Natural Language Processing with Python both fascinating and immensely useful.
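    As a rough sketch of the kind of pipeline the book teaches (my example, not the authors'), NLTK can tokenize a sentence, tag parts of speech, and chunk named entities; it assumes the usual NLTK data packages have already been fetched with nltk.download (e.g. 'punkt', 'averaged_perceptron_tagger', 'maxent_ne_chunker', 'words'):

        # Tokenize, POS-tag, and chunk named entities with NLTK.
        # Assumes the relevant NLTK data packages are already downloaded.
        import nltk

        sentence = "Steven Bird teaches natural language processing in Melbourne."
        tokens = nltk.word_tokenize(sentence)   # split into word tokens
        tagged = nltk.pos_tag(tokens)           # part-of-speech tags
        tree = nltk.ne_chunk(tagged)            # named-entity chunks
        print(tree)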

Social Research Methods


Alan Bryman - 2001
    Fully updated and now printed in two colours, the text is accessible and well structured, with numerous real-life examples and student learning aids. The text is also accompanied by a fully comprehensive companion web site.

Reinforcement Learning: An Introduction


Richard S. Sutton - 1998
    Reinforcement learning, one of the most active research areas in artificial intelligence, is a computational approach to learning whereby an agent tries to maximize the total amount of reward it receives when interacting with a complex, uncertain environment. In Reinforcement Learning, Richard Sutton and Andrew Barto provide a clear and simple account of the key ideas and algorithms of reinforcement learning. Their discussion ranges from the history of the field's intellectual foundations to the most recent developments and applications. The only necessary mathematical background is familiarity with elementary concepts of probability. The book is divided into three parts. Part I defines the reinforcement learning problem in terms of Markov decision processes. Part II provides basic solution methods: dynamic programming, Monte Carlo methods, and temporal-difference learning. Part III presents a unified view of the solution methods and incorporates artificial neural networks, eligibility traces, and planning; the two final chapters present case studies and consider the future of reinforcement learning.
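    To make the temporal-difference idea concrete, here is a rough Python sketch (not from the book; the environment and constants are made up) of the tabular TD(0) update V(s) <- V(s) + alpha * (r + gamma * V(s') - V(s)) on a small random-walk task:

        # Tabular TD(0) on a tiny 5-state random walk (illustrative only).
        import random

        NUM_STATES = 5          # non-terminal states 0..4; stepping off either end terminates
        ALPHA, GAMMA = 0.1, 1.0
        V = [0.0] * NUM_STATES  # value estimates for the non-terminal states

        def episode():
            state = NUM_STATES // 2                 # start in the middle
            while True:
                next_state = state + random.choice([-1, 1])
                if next_state < 0:                  # left terminal, reward 0
                    reward, next_value, done = 0.0, 0.0, True
                elif next_state >= NUM_STATES:      # right terminal, reward 1
                    reward, next_value, done = 1.0, 0.0, True
                else:
                    reward, next_value, done = 0.0, V[next_state], False
                # TD(0) update: V(s) <- V(s) + alpha * (r + gamma * V(s') - V(s))
                V[state] += ALPHA * (reward + GAMMA * next_value - V[state])
                if done:
                    break
                state = next_state

        for _ in range(5000):
            episode()
        print([round(v, 2) for v in V])   # roughly [0.17, 0.33, 0.5, 0.67, 0.83]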

Predictive Analytics: The Power to Predict Who Will Click, Buy, Lie, or Die


Eric Siegel - 2013
    Rather than a "how to" for hands-on techies, the book entices lay-readers and experts alike by covering new case studies and the latest state-of-the-art techniques. You have been predicted -- by companies, governments, law enforcement, hospitals, and universities. Their computers say, "I knew you were going to do that!" These institutions are seizing upon the power to predict whether you're going to click, buy, lie, or die. Why? For good reason: predicting human behavior combats financial risk, fortifies healthcare, conquers spam, toughens crime fighting, and boosts sales. How? Prediction is powered by the world's most potent, booming unnatural resource: data. Accumulated in large part as the by-product of routine tasks, data is the unsalted, flavorless residue deposited en masse as organizations churn away. Surprise! This heap of refuse is a gold mine. Big data embodies an extraordinary wealth of experience from which to learn. Predictive analytics unleashes the power of data. With this technology, the computer literally learns from data how to predict the future behavior of individuals. Perfect prediction is not possible, but putting odds on the future -- lifting a bit of the fog off our hazy view of tomorrow -- means pay dirt. In this rich, entertaining primer, former Columbia University professor and Predictive Analytics World founder Eric Siegel reveals the power and perils of prediction:
    - What type of mortgage risk Chase Bank predicted before the recession.
    - Predicting which people will drop out of school, cancel a subscription, or get divorced before they are even aware of it themselves.
    - Why early retirement decreases life expectancy and vegetarians miss fewer flights.
    - Five reasons why organizations predict death, including one health insurance company.
    - How U.S. Bank, European wireless carrier Telenor, and Obama's 2012 campaign calculated the way to most strongly influence each individual.
    - How IBM's Watson computer used predictive modeling to answer questions and beat the human champs on TV's Jeopardy!
    - How companies ascertain untold, private truths -- how Target figures out you're pregnant and Hewlett-Packard deduces you're about to quit your job.
    - How judges and parole boards rely on crime-predicting computers to decide who stays in prison and who goes free.
    - What's predicted by the BBC, Citibank, ConEd, Facebook, Ford, Google, IBM, the IRS, Match.com, MTV, Netflix, Pandora, PayPal, Pfizer, and Wikipedia.
    A truly omnipresent science, predictive analytics affects everyone, every day. Although largely unseen, it drives millions of decisions, determining whom to call, mail, investigate, incarcerate, set up on a date, or medicate. Predictive analytics transcends human perception. This book's final chapter answers the riddle: What often happens to you that cannot be witnessed, and that you can't even be sure has happened afterward -- but that can be predicted in advance? Whether you are a consumer of it -- or consumed by it -- get a handle on the power of Predictive Analytics.

Research and Evaluation in Education and Psychology: Integrating Diversity with Quantitative, Qualitative, and Mixed Methods


Donna M. Mertens - 1997
    "Donna is so sensitive in exploring those issues, a first in a text for that class and a welcome addition." -- Nick Eastmond, Utah State University
    Focused on discussing what is considered to be good research, this text explains quantitative, qualitative, and mixed methods in detail, incorporating the viewpoints of various research paradigms into the descriptions of these methods. Approximately 60% of the content in this Third Edition is new, with lots of fresh examples.
    Key Features
    - Postpositivist, constructivist, transformative, and pragmatic paradigms discussed
    - Conducting research in culturally complex communities emphasized throughout
    - A step-by-step overview of the entire research process provided
    New to this Edition
    - New coverage on how to write a literature review and plan a dissertation
    - New pedagogy including Extending Your Thinking throughout
    This is a core or supplemental text for research courses in departments of education, psychology, sociology, social work, and other human-services disciplines.

Speech and Language Processing: An Introduction to Natural Language Processing, Computational Linguistics and Speech Recognition


Dan Jurafsky - 2000
    This comprehensive work covers both statistical and symbolic approaches to language processing; it shows how they can be applied to important tasks such as speech recognition, spelling and grammar correction, information extraction, search engines, machine translation, and the creation of spoken-language dialog agents. The following distinguishing features make the text both an introduction to the field and an advanced reference guide.
    - UNIFIED AND COMPREHENSIVE COVERAGE OF THE FIELD: Covers the fundamental algorithms of each field, whether proposed for spoken or written language, whether logical or statistical in origin.
    - EMPHASIS ON WEB AND OTHER PRACTICAL APPLICATIONS: Gives readers an understanding of how language-related algorithms can be applied to important real-world problems.
    - EMPHASIS ON SCIENTIFIC EVALUATION: Offers a description of how systems are evaluated within each problem domain.
    - EMPIRICIST/STATISTICAL/MACHINE LEARNING APPROACHES TO LANGUAGE PROCESSING: Covers all the new statistical approaches, while still completely covering the earlier more structured and rule-based methods.
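    As a tiny, hedged illustration of the statistical side of the field (my example, not one from the book), a maximum-likelihood bigram model estimates P(w2 | w1) from simple counts:

        # Toy maximum-likelihood bigram model: P(w2 | w1) = count(w1, w2) / count(w1).
        # The corpus below is made up purely for illustration.
        from collections import Counter

        corpus = "the cat sat on the mat . the dog sat on the log .".split()

        unigrams = Counter(corpus)
        bigrams = Counter(zip(corpus, corpus[1:]))

        def bigram_prob(w1, w2):
            """Maximum-likelihood estimate of P(w2 | w1)."""
            return bigrams[(w1, w2)] / unigrams[w1]

        print(bigram_prob("the", "cat"))   # 0.25 (1 of 4 occurrences of "the")
        print(bigram_prob("sat", "on"))    # 1.0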

How to Write a Lot: A Practical Guide to Productive Academic Writing


Paul J. Silvia - 2007
    Writing is hard work and can be difficult to wedge into a frenetic academic schedule. This revised and updated edition of Paul Silvia's popular guide provides practical, light-hearted advice to help academics overcome common barriers and become productive writers. Silvia's expert tips have been updated to apply to a wide variety of disciplines, and this edition has a new chapter devoted to grant and fellowship writing.

Structure and Interpretation of Computer Programs


Harold Abelson - 1984
    This long-awaited revision contains changes throughout the text. There are new implementations of most of the major programming systems in the book, including the interpreters and compilers, and the authors have incorporated many small changes that reflect their experience teaching the course at MIT since the first edition was published. A new theme has been introduced that emphasizes the central role played by different approaches to dealing with time in computational models: objects with state, concurrent programming, functional programming and lazy evaluation, and nondeterministic programming. There are new example sections on higher-order procedures in graphics and on applications of stream processing in numerical programming, and many new exercises. In addition, all the programs have been reworked to run in any Scheme implementation that adheres to the IEEE standard.
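    The book's own examples are written in Scheme; purely as a rough Python analogue (my sketch, not the authors') of its lazy streams, a generator can model an infinite sequence whose elements are computed only when demanded:

        # Rough Python analogue of SICP-style lazy streams: generators
        # produce elements only on demand, so infinite streams are fine.
        from itertools import islice

        def integers_from(n):
            """Infinite stream of integers n, n+1, n+2, ..."""
            while True:
                yield n
                n += 1

        def scaled(stream, factor):
            """A new stream whose elements are the originals times factor."""
            for x in stream:
                yield x * factor

        first_five = list(islice(scaled(integers_from(1), 10), 5))
        print(first_five)   # [10, 20, 30, 40, 50]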

Statistics in Plain English


Timothy C. Urdan - 2001
    Each self-contained chapter consists of three sections. The first describes the statistic, including how it is used and what information it provides. The second section reviews how it works, how to calculate the formula, the strengths and weaknesses of the technique, and the conditions needed for its use. The final section provides examples that use and interpret the statistic. A glossary of terms and symbols is also included.
    New features in the second edition include:
    - an interactive CD with PowerPoint presentations and problems for each chapter, including an overview of the problem's solution;
    - new chapters on basic research concepts, including sampling, definitions of different types of variables, and basic research designs, and one on nonparametric statistics;
    - more graphs and more precise descriptions of each statistic; and
    - a discussion of confidence intervals.
    This brief paperback is an ideal supplement for statistics, research methods, courses that use statistics, or as a reference tool to refresh one's memory about key concepts. The actual research examples are from psychology, education, and other social and behavioral sciences. Materials formerly available with this book on CD-ROM are now available for download from our website www.psypress.com. Go to the book's page and look for the 'Download' link in the right-hand column.

Time Series Analysis


James Douglas Hamilton - 1994
    This book synthesizes these recent advances and makes them accessible to first-year graduate students. James Hamilton provides the first adequate textbook treatments of important innovations such as vector autoregressions, generalized method of moments, the economic and statistical consequences of unit roots, time-varying variances, and nonlinear time series models. In addition, he presents basic tools for analyzing dynamic systems (including linear representations, autocovariance generating functions, spectral analysis, and the Kalman filter) in a way that integrates economic theory with the practical difficulties of analyzing and interpreting real-world data. Time Series Analysis fills an important need for a textbook that integrates economic theory, econometrics, and new results. The book is intended to provide students and researchers with a self-contained survey of time series analysis. It starts from first principles and should be readily accessible to any beginning graduate student, while it is also intended to serve as a reference book for researchers. -- "Journal of Economics"
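    As a small, self-contained illustration of the kind of model the book analyzes (my sketch, not Hamilton's; all constants are made up), here is an AR(1) process y_t = c + phi * y_{t-1} + eps_t simulated and then estimated by ordinary least squares with NumPy:

        # Simulate an AR(1) process and recover (c, phi) by ordinary least squares.
        import numpy as np

        rng = np.random.default_rng(0)
        c, phi, sigma, T = 1.0, 0.8, 1.0, 5000

        y = np.zeros(T)
        for t in range(1, T):
            y[t] = c + phi * y[t - 1] + sigma * rng.standard_normal()

        # Regress y_t on a constant and y_{t-1}.
        X = np.column_stack([np.ones(T - 1), y[:-1]])
        c_hat, phi_hat = np.linalg.lstsq(X, y[1:], rcond=None)[0]
        print(round(c_hat, 2), round(phi_hat, 2))   # close to 1.0 and 0.8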

Statistics for People Who (Think They) Hate Statistics


Neil J. Salkind - 2000
    The book begins with an introduction to the language of statistics and then covers descriptive statistics and inferential statistics. Throughout, the author offers readers:
    - Difficulty Rating Index for each chapter's material
    - Tips for doing and thinking about a statistical technique
    - Top tens for everything from the best ways to create a graph to the most effective techniques for data collection
    - Steps that break techniques down into a clear sequence of procedures
    - SPSS tips for executing each major statistical technique
    - Practice exercises at the end of each chapter, followed by worked out solutions
    The book concludes with a statistical software sampler and a description of the best Internet sites for statistical information and data resources. Readers also have access to a website for downloading data that they can use to practice additional exercises from the book. Students and researchers will appreciate the book's unhurried pace and thorough, friendly presentation.

Probability Theory: The Logic of Science


E.T. Jaynes - 1999
    It discusses new results, along with applications of probability theory to a variety of problems. The book contains many exercises and is suitable for use as a textbook on graduate-level courses involving data analysis. Aimed at readers already familiar with applied mathematics at an advanced undergraduate level or higher, it is of interest to scientists concerned with inference from incomplete information.

Machine Learning


Tom M. Mitchell - 1986
    Mitchell covers the field of machine learning, the study of algorithms that allow computer programs to automatically improve through experience and to infer general laws from specific data.