Book picks similar to Macroanalysis: Digital Methods and Literary History by Matthew L. Jockers
non-fiction
digital-humanities
dh
literary-criticism
Naked Statistics: Stripping the Dread from the Data
Charles Wheelan - 2012
How can we catch schools that cheat on standardized tests? How does Netflix know which movies you’ll like? What is causing the rising incidence of autism? As best-selling author Charles Wheelan shows us in Naked Statistics, the right data and a few well-chosen statistical tools can help us answer these questions and more. For those who slept through Stats 101, this book is a lifesaver. Wheelan strips away the arcane and technical details and focuses on the underlying intuition that drives statistical analysis. He clarifies key concepts such as inference, correlation, and regression analysis, reveals how biased or careless parties can manipulate or misrepresent data, and shows us how brilliant and creative researchers are exploiting the valuable data from natural experiments to tackle thorny questions. And in Wheelan’s trademark style, there’s not a dull page in sight. You’ll encounter clever Schlitz Beer marketers leveraging basic probability, an International Sausage Festival illuminating the tenets of the central limit theorem, and a head-scratching choice from the famous game show Let’s Make a Deal—and you’ll come away with insights each time. With the wit, accessibility, and sheer fun that turned Naked Economics into a bestseller, Wheelan defies the odds yet again by bringing another essential, formerly unglamorous discipline to life.
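To make the central-limit-theorem idea mentioned in the blurb a little more concrete, here is a minimal simulation sketch. It is not from the book: it assumes Python with NumPy and uses an invented skewed population rather than Wheelan's sausage-festival example. The point is only that averages of many samples from a skewed distribution pile up in a roughly normal bell shape.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# A deliberately skewed "population" (purely illustrative): exponential waiting times.
population = rng.exponential(scale=10.0, size=100_000)

# Draw many samples of size 50 and record each sample's mean.
sample_means = np.array([rng.choice(population, size=50).mean()
                         for _ in range(5_000)])

# The population is skewed, but the sample means cluster symmetrically
# around the population mean, with spread close to sigma / sqrt(50).
print("population mean:      ", round(float(population.mean()), 2))
print("mean of sample means: ", round(float(sample_means.mean()), 2))
print("sd of sample means:   ", round(float(sample_means.std()), 2))
```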
Bayesian Data Analysis
Andrew Gelman - 1995
Its world-class authors provide guidance on all aspects of Bayesian data analysis and include examples of real statistical analyses, based on their own research, that demonstrate how to solve complicated problems. Changes in the new edition include:
- Stronger focus on MCMC
- Revision of the computational advice in Part III
- New chapters on nonlinear models and decision analysis
- Several additional applied examples from the authors' recent research
- Additional chapters on current models for Bayesian data analysis such as nonlinear models, generalized linear mixed models, and more
- Reorganization of chapters 6 and 7 on model checking and data collection
Bayesian computation is currently at a stage where there are many reasonable ways to compute any given posterior distribution. However, the best approach is not always clear ahead of time. Reflecting this, the new edition offers a more pluralistic presentation, giving advice on performing computations from many perspectives while making clear the importance of being aware that there are different ways to implement any given iterative simulation computation. The new approach, additional examples, and updated information make Bayesian Data Analysis an excellent introductory text and a reference that working scientists will use throughout their professional life.
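The remark above that there are many reasonable ways to compute any given posterior can be illustrated with a toy sampler. The sketch below is not drawn from the book; it is a hypothetical random-walk Metropolis sampler in Python (NumPy assumed) for the posterior of a normal mean under a normal prior, a case where the MCMC output can be checked against the known exact answer.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: unknown mean theta, known sd = 1; prior theta ~ Normal(0, 5^2).
data = rng.normal(loc=2.0, scale=1.0, size=30)

def log_posterior(theta):
    log_prior = -0.5 * (theta / 5.0) ** 2
    log_lik = -0.5 * np.sum((data - theta) ** 2)
    return log_prior + log_lik

# Random-walk Metropolis: propose a nearby value, accept with the usual ratio.
theta, samples = 0.0, []
for _ in range(20_000):
    proposal = theta + rng.normal(scale=0.5)
    if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(theta):
        theta = proposal
    samples.append(theta)

draws = np.array(samples[5_000:])  # discard burn-in
print("posterior mean ~", round(float(draws.mean()), 3))
print("posterior sd   ~", round(float(draws.std()), 3))
```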
How to Write a Lot: A Practical Guide to Productive Academic Writing
Paul J. Silvia - 2007
Writing is hard work and can be difficult to wedge into a frenetic academic schedule. This revised and updated edition of Paul Silvia's popular guide provides practical, light-hearted advice to help academics overcome common barriers and become productive writers. Silvia's expert tips have been updated to apply to a wide variety of disciplines, and this edition has a new chapter devoted to grant and fellowship writing.
The Art of Data Science: A Guide for Anyone Who Works with Data
Roger D. Peng - 2015
The authors have extensive experience both managing data analysts and conducting their own data analyses, and have carefully observed what produces coherent results and what fails to produce useful insights into data. This book is a distillation of their experience in a format that is applicable to both practitioners and managers in data science.
Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy
Cathy O'Neil - 2016
Increasingly, the decisions that affect our lives--where we go to school, whether we can get a job or a loan, how much we pay for health insurance--are being made not by humans, but by machines. In theory, this should lead to greater fairness: Everyone is judged according to the same rules. But as mathematician and data scientist Cathy O'Neil reveals, the mathematical models being used today are unregulated and uncontestable, even when they're wrong. Most troubling, they reinforce discrimination--propping up the lucky, punishing the downtrodden, and undermining our democracy in the process.
Cybertext: Perspectives on Ergodic Literature
Espen J. Aarseth - 1997
Instead of insisting on the uniqueness and newness of electronic writing and interactive fiction, however, Aarseth situates these literary forms within the tradition of "ergodic" literature—a term borrowed from physics to describe open, dynamic texts such as the I Ching or Apollinaire's calligrams, with which the reader must perform specific actions to generate a literary sequence. Constructing a theoretical model that describes how new electronic forms build on this tradition, Aarseth bridges the widely assumed divide between paper texts and electronic texts. He then uses the perspective of ergodic aesthetics to reexamine literary theories of narrative, semiotics, and rhetoric and to explore the implications of applying these theories to materials for which they were not intended.
The Rhetoric of Fiction
Wayne C. Booth - 1961
One of the most widely used texts in fiction courses, it is a standard reference point in advanced discussions of how fictional form works, how authors make novels accessible, and how readers recreate texts, and its concepts and terms—such as "the implied author," "the postulated reader," and "the unreliable narrator"—have become part of the standard critical lexicon. For this new edition, Wayne C. Booth has written an extensive Afterword in which he clarifies misunderstandings, corrects what he now views as errors, and sets forth his own recent thinking about the rhetoric of fiction. The other new feature is a Supplementary Bibliography, prepared by James Phelan in consultation with the author, which lists the important critical works of the past twenty years—two decades that Booth describes as "the richest in the history of the subject."
Fluent Python: Clear, Concise, and Effective Programming
Luciano Ramalho - 2015
With this hands-on guide, you'll learn how to write effective, idiomatic Python code by leveraging its best and possibly most neglected features. Author Luciano Ramalho takes you through Python's core language features and libraries, and shows you how to make your code shorter, faster, and more readable at the same time. Many experienced programmers try to bend Python to fit patterns they learned from other languages, and never discover Python features outside of their experience. With this book, those Python programmers will thoroughly learn how to become proficient in Python 3. This book covers:
- Python data model: understand how special methods are the key to the consistent behavior of objects
- Data structures: take full advantage of built-in types, and understand the text vs bytes duality in the Unicode age
- Functions as objects: view Python functions as first-class objects, and understand how this affects popular design patterns
- Object-oriented idioms: build classes by learning about references, mutability, interfaces, operator overloading, and multiple inheritance
- Control flow: leverage context managers, generators, coroutines, and concurrency with the concurrent.futures and asyncio packages
- Metaprogramming: understand how properties, attribute descriptors, class decorators, and metaclasses work
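As a small illustration of the first bullet above, the "Python data model" idea can be sketched in a few lines. The card-deck class below is written here for illustration, in the spirit of the book's data-model material rather than copied from it: implementing __len__ and __getitem__ is enough to make a user-defined class behave like a built-in sequence.

```python
import collections
import random

Card = collections.namedtuple("Card", ["rank", "suit"])

class FrenchDeck:
    ranks = [str(n) for n in range(2, 11)] + list("JQKA")
    suits = "spades diamonds clubs hearts".split()

    def __init__(self):
        self._cards = [Card(rank, suit)
                       for suit in self.suits
                       for rank in self.ranks]

    # These two special methods are enough for len(), indexing, slicing,
    # iteration, and random.choice() to work on the deck.
    def __len__(self):
        return len(self._cards)

    def __getitem__(self, position):
        return self._cards[position]

deck = FrenchDeck()
print(len(deck))            # 52
print(deck[0], deck[-1])    # first and last card
print(random.choice(deck))  # a random card
```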
Data Jujitsu: The Art of Turning Data into Product
D.J. Patil - 2012
Acclaimed data scientist DJ Patil details a new approach to solving problems in Data Jujitsu. Learn how to use a problem's "weight" against itself to:
- Break down seemingly complex data problems into simplified parts
- Use alternative data analysis techniques to examine them
- Use human input, such as Mechanical Turk, and design tricks that enlist the help of your users to take short cuts around tough problems
- Learn more about the problems before starting on the solutions—and use the findings to solve them, or determine whether the problems are worth solving at all.
Sorting Things Out: Classification and Its Consequences
Geoffrey C. Bowker - 1999
Bowker and Susan Leigh Star explore the role of categories and standards in shaping the modern world. In a clear and lively style, they investigate a variety of classification systems, including the International Classification of Diseases, the Nursing Interventions Classification, race classification under apartheid in South Africa, and the classification of viruses and of tuberculosis. The authors emphasize the role of invisibility in the process by which classification orders human interaction. They examine how categories are made and kept invisible, and how people can change this invisibility when necessary. They also explore systems of classification as part of the built information environment. Much as an urban historian would review highway permits and zoning decisions to tell a city's story, the authors review archives of classification design to understand how decisions have been made. Sorting Things Out has a moral agenda, for each standard and category valorizes some point of view and silences another. Standards and classifications produce advantage or suffering. Jobs are made and lost; some regions benefit at the expense of others. How these choices are made and how we think about that process are at the moral and political core of this work. The book is an important empirical source for understanding the building of information infrastructures.
Doing Bayesian Data Analysis: A Tutorial Introduction with R and BUGS
John K. Kruschke - 2010
Included are step-by-step instructions on how to carry out Bayesian data analyses.
Visualizing Data: Exploring and Explaining Data with the Processing Environment
Ben Fry - 2007
Using a downloadable programming environment developed by the author, Visualizing Data demonstrates methods for representing data accurately on the Web and elsewhere, complete with user interaction, animation, and more. How do the 3.1 billion A, C, G and T letters of the human genome compare to those of a chimp or a mouse? What do the paths that millions of visitors take through a web site look like? With Visualizing Data, you learn how to answer complex questions like these with thoroughly interactive displays. We're not talking about cookie-cutter charts and graphs. This book teaches you how to design entire interfaces around large, complex data sets with the help of a powerful new design and prototyping tool called "Processing". Used by many researchers and companies to convey specific data in a clear and understandable manner, the Processing beta is available free. With this tool and Visualizing Data as a guide, you'll learn basic visualization principles, how to choose the right kind of display for your purposes, and how to provide interactive features that will bring users to your site over and over. This book teaches you:
- The seven stages of visualizing data -- acquire, parse, filter, mine, represent, refine, and interact
- How all data problems begin with a question and end with a narrative construct that provides a clear answer without extraneous details
- Several example projects with the code to make them work
- Positive and negative points of each representation discussed; the focus is on customization so that each one best suits what you want to convey about your data set
The book does not provide ready-made "visualizations" that can be plugged into any data set. Instead, with chapters divided by types of data rather than types of display, you'll learn how each visualization conveys the unique properties of the data it represents -- why the data was collected, what's interesting about it, and what stories it can tell. Visualizing Data teaches you how to answer questions, not simply display information.
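Purely to illustrate the acquire / parse / filter / mine / represent stages listed above, here is a small sketch. It is not Processing code from the book but a hypothetical Python version of the same workflow on a made-up CSV, with the refine and interact stages left to a real visualization environment.

```python
import csv
import io

# Acquire: hypothetical raw data standing in for a downloaded file.
RAW = """year,visitors
2005,1200
2006,1850
2007,notrecorded
2008,2600
"""

# Parse: turn raw text into typed records; filter: skip unusable rows.
rows = []
for rec in csv.DictReader(io.StringIO(RAW)):
    try:
        rows.append((int(rec["year"]), int(rec["visitors"])))
    except ValueError:
        continue

# Mine: derive a simple summary from the cleaned data.
lo, hi = min(rows, key=lambda r: r[1]), max(rows, key=lambda r: r[1])

# Represent: a bare-bones text "chart" of visitors per year.
for year, visitors in rows:
    print(f"{year} {'#' * (visitors // 100)} {visitors}")
print(f"min: {lo}, max: {hi}")
```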
Information is Beautiful
David McCandless - 2009
We need a brand new way to take it all in. 'Information is Beautiful' transforms the ideas surrounding and swamping us into graphs and maps that anyone can follow at a single glance.
The Algorithm Design Manual
Steven S. Skiena - 1997
Drawing heavily on the author's own real-world experiences, the book stresses design and analysis. Coverage is divided into two parts, the first being a general guide to techniques for the design and analysis of computer algorithms. The second is a reference section, which includes a catalog of the 75 most important algorithmic problems. By browsing this catalog, readers can quickly identify what the problem they have encountered is called, what is known about it, and how they should proceed if they need to solve it. This book is ideal for the working professional who uses algorithms on a daily basis and needs a handy reference. This work can also readily be used in an upper-division course or as a student reference guide. The Algorithm Design Manual comes with a CD-ROM that contains:
- a complete hypertext version of the full printed book
- the source code and URLs for all cited implementations
- over 30 hours of audio lectures on the design and analysis of algorithms, all keyed to on-line lecture notes
Machine Learning with R
Brett Lantz - 2014
This practical guide covers all of the need-to-know topics in a systematic way. For each machine learning approach, each step in the process is detailed, from preparing the data for analysis to evaluating the results. These steps will build the knowledge you need to apply them to your own data science tasks. The book is intended for those who want to learn how to use R's machine learning capabilities and gain insight from their data. Perhaps you already know a bit about machine learning, but have never used R; or perhaps you know a little R but are new to machine learning. In either case, this book will get you up and running quickly. It would be helpful to have a bit of familiarity with basic programming concepts, but no prior experience is required.
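The prepare, train, and evaluate workflow described above is presented in R in the book. Purely as an illustration of those steps, and not the book's code, here is a comparable sketch in Python using scikit-learn and its bundled iris dataset, with k-nearest neighbours standing in for "each machine learning approach".

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

# Prepare the data: load a small example dataset and hold out a test set.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)

# Train a model (k-nearest neighbours, one of many possible approaches).
model = KNeighborsClassifier(n_neighbors=5)
model.fit(X_train, y_train)

# Evaluate the results on data the model has not seen.
print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))
```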