Design for Information: An Introduction to the Histories, Theories, and Best Practices Behind Effective Information Visualizations


Isabel Meirelles - 2013
    Design for Information critically examines design solutions, current and historic, to give you a larger understanding of how to solve specific problems and to help you build a repertoire of existing methods and concepts for overcoming design challenges. The book teaches the ins and outs of data visualization through a series of current case studies: each visualization is analyzed for its design principles and methods, giving you critical and analytical tools to further develop your own design process. The case study format is well suited to discussing the histories, theories, and best practices of the field through real-world, effective visualizations. The selection represents only a fraction of the effective visualizations in this burgeoning field, leaving you the opportunity to extend your study to other solutions in your specific field(s) of practice. The book is also helpful to students in other disciplines who visualize information, such as those in the digital humanities and most of the sciences.

All of Statistics: A Concise Course in Statistical Inference


Larry Wasserman - 2003
    This book is for people who want to learn probability and statistics quickly. It is suitable for graduate or advanced undergraduate students in computer science, mathematics, statistics, and related disciplines. Taken literally, the title is an overstatement, but in spirit it is apt: the book covers a much broader range of topics than a typical introductory text on mathematical statistics, including modern topics like nonparametric curve estimation, bootstrapping, and classification that are usually relegated to follow-up courses. The reader is presumed to know calculus and a little linear algebra; no previous knowledge of probability and statistics is required. Statistics, data mining, and machine learning are all concerned with collecting and analyzing data. For some time, statistics research was conducted in statistics departments while data mining and machine learning research was conducted in computer science departments. Statisticians thought that computer scientists were reinventing the wheel; computer scientists thought that statistical theory didn't apply to their problems. Things are changing: statisticians now recognize that computer scientists are making novel contributions, while computer scientists now recognize the generality of statistical theory and methodology. Clever data mining algorithms are more scalable than statisticians ever thought possible, and formal statistical theory is more pervasive than computer scientists had realized.
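    To give a concrete flavor of one of the topics listed above, here is a minimal bootstrap sketch in Python. It is not from the book; the data are simulated and numpy is an assumed dependency.

```python
# Illustrative sketch, not from the book: bootstrap estimate of the
# standard error of the sample median (simulated data, numpy assumed).
import numpy as np

rng = np.random.default_rng(0)
data = rng.exponential(scale=2.0, size=200)      # toy sample

B = 2000                                         # number of bootstrap resamples
medians = np.empty(B)
for b in range(B):
    resample = rng.choice(data, size=data.size, replace=True)
    medians[b] = np.median(resample)

se_hat = medians.std(ddof=1)                     # bootstrap standard error
print(f"sample median = {np.median(data):.3f}, bootstrap SE = {se_hat:.3f}")
```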

Applied Predictive Modeling


Max Kuhn - 2013
    Non-mathematical readers will appreciate the intuitive explanations of the techniques, while an emphasis on problem-solving with real data across a wide variety of applications will aid practitioners who wish to extend their expertise. Readers should have knowledge of basic statistical ideas, such as correlation and linear regression analysis. While the text avoids complex equations, a mathematical background is needed for the advanced topics. Dr. Kuhn is a Director of Non-Clinical Statistics at Pfizer Global R&D in Groton, Connecticut. He has been applying predictive models in the pharmaceutical and diagnostic industries for over 15 years and is the author of a number of R packages. Dr. Johnson has more than a decade of statistical consulting and predictive modeling experience in pharmaceutical research and development. He is a co-founder of Arbor Analytics, a firm specializing in predictive modeling, and is a former Director of Statistics at Pfizer Global R&D. His scholarly work centers on the application and development of statistical methodology and learning algorithms. Applied Predictive Modeling covers the overall predictive modeling process, beginning with the crucial steps of data preprocessing, data splitting, and the foundations of model tuning. The text then provides intuitive explanations of numerous common and modern regression and classification techniques, always with an emphasis on illustrating and solving real data problems. Its treatment of practical concerns extends beyond model fitting to topics such as handling class imbalance, selecting predictors, and pinpointing causes of poor model performance, all of which occur frequently in practice. The text illustrates all parts of the modeling process through many hands-on, real-life examples, and every chapter contains extensive R code.
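    The book's own examples are in R; as a rough, hypothetical Python analogue of the workflow described above (splitting, preprocessing, and resampling-based tuning), a scikit-learn pipeline might look like the sketch below. The dataset and parameter grid are arbitrary placeholders.

```python
# Hypothetical Python analogue of the modeling workflow (the book uses R):
# split the data, preprocess inside a pipeline, and tune by cross-validation.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=42)

pipe = Pipeline([("scale", StandardScaler()),    # preprocessing step
                 ("model", SVC())])              # classifier to be tuned

grid = GridSearchCV(pipe, cv=5,                  # resampling-based tuning
                    param_grid={"model__C": [0.1, 1, 10],
                                "model__gamma": ["scale", 0.01]})
grid.fit(X_train, y_train)
print(grid.best_params_, grid.score(X_test, y_test))
```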

Probabilistic Graphical Models: Principles and Techniques


Daphne Koller - 2009
    The framework of probabilistic graphical models, presented in this book, provides a general approach for reasoning from available information. The approach is model-based, allowing interpretable models to be constructed and then manipulated by reasoning algorithms. These models can also be learned automatically from data, allowing the approach to be used in cases where manually constructing a model is difficult or even impossible. Because uncertainty is an inescapable aspect of most real-world applications, the book focuses on probabilistic models, which make the uncertainty explicit and provide models that are more faithful to reality. Probabilistic Graphical Models discusses a variety of models, spanning Bayesian networks, undirected Markov networks, discrete and continuous models, and extensions to deal with dynamical systems and relational data. For each class of models, the text describes the three fundamental cornerstones: representation, inference, and learning, presenting both basic concepts and advanced techniques. Finally, the book considers the use of the proposed framework for causal reasoning and decision making under uncertainty. The main text in each chapter provides the detailed technical development of the key ideas. Most chapters also include boxes with additional material: skill boxes, which describe techniques; case study boxes, which discuss empirical cases related to the approach described in the text, including applications in computer vision, robotics, natural language understanding, and computational biology; and concept boxes, which present significant concepts drawn from the material in the chapter. Instructors (and readers) can group chapters in various combinations, from core topics to more technically advanced material, to suit their particular needs.
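    As a small, hedged illustration of what "representation" and "inference" mean here, the sketch below encodes the classic rain/sprinkler/wet-grass Bayesian network in plain Python and answers a query by enumerating the joint distribution. The network and its probabilities are the standard toy example, not taken from this book.

```python
# Illustrative sketch (standard toy example, not from the book): exact
# inference by enumeration in the rain/sprinkler/wet-grass Bayesian network.
from itertools import product

P_R = {1: 0.2, 0: 0.8}                                       # P(Rain)
P_S_given_R = {1: {1: 0.01, 0: 0.99},                        # P(Sprinkler | Rain)
               0: {1: 0.40, 0: 0.60}}
P_W1_given_SR = {(1, 1): 0.99, (1, 0): 0.90,                 # P(WetGrass=1 | S, R)
                 (0, 1): 0.80, (0, 0): 0.0}

def joint(r, s, w):
    """Joint probability P(R=r, S=s, W=w) from the factored representation."""
    p_w1 = P_W1_given_SR[(s, r)]
    return P_R[r] * P_S_given_R[r][s] * (p_w1 if w == 1 else 1 - p_w1)

# Query P(Rain=1 | WetGrass=1): sum out Sprinkler, then normalize.
num = sum(joint(1, s, 1) for s in (0, 1))
den = sum(joint(r, s, 1) for r, s in product((0, 1), repeat=2))
print(f"P(Rain=1 | WetGrass=1) = {num / den:.3f}")
```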

The Non-Designer's Design Book


Robin P. Williams - 2003
    Not to worry: This book is the one place you can turn to find quick, non-intimidating, excellent design help. In The Non-Designer's Design Book, 2nd Edition, best-selling author Robin Williams turns her attention to the basic principles of good design and typography. All you have to do is follow her clearly explained concepts, and you'll begin producing more sophisticated, professional, and interesting pages immediately. Humor-infused, jargon-free prose interspersed with design exercises, quizzes, illustrations, and dozens of examples makes learning a snap—which is just what audiences have come to expect from this best-selling author.

The Elements of User Experience: User-Centered Design for the Web


Jesse James Garrett - 2002
    This book aims to minimize the complexity of user-centered design for the Web with explanations and illustrations that focus on ideas rather than tools or techniques.

Code Complete


Steve McConnell - 1993
    Now this classic book has been fully updated and revised with leading-edge practices--and hundreds of new code samples--illustrating the art and science of software construction. Capturing the body of knowledge available from research, academia, and everyday commercial practice, McConnell synthesizes the most effective techniques and must-know principles into clear, pragmatic guidance. No matter what your experience level, development environment, or project size, this book will inform and stimulate your thinking--and help you build the highest quality code. Discover the timeless techniques and strategies that help you:
    - Design for minimum complexity and maximum creativity
    - Reap the benefits of collaborative development
    - Apply defensive programming techniques to reduce and flush out errors
    - Exploit opportunities to refactor--or evolve--code, and do it safely
    - Use construction practices that are right-weight for your project
    - Debug problems quickly and effectively
    - Resolve critical construction issues early and correctly
    - Build quality into the beginning, middle, and end of your project
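    Picking up the defensive-programming item in the list above, here is a small, hypothetical Python illustration of the idea: validate inputs at a function's boundary and fail fast with clear errors rather than propagating bad data. The function and its values are made up for the example.

```python
# Hypothetical illustration of defensive programming: validate inputs at the
# boundary and fail fast with a clear error instead of returning garbage.
def average_latency_ms(samples):
    """Return the mean of a non-empty sequence of latency samples (ms)."""
    if samples is None:
        raise ValueError("samples must not be None")
    values = list(samples)
    if not values:
        raise ValueError("samples must contain at least one measurement")
    for v in values:
        if not isinstance(v, (int, float)) or v < 0:
            raise ValueError(f"invalid latency sample: {v!r}")
    return sum(values) / len(values)

assert average_latency_ms([12, 15, 9]) == 12     # sanity check on known input
```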

Processing: A Programming Handbook for Visual Designers and Artists


Casey Reas - 2007
    This book is an introduction to the concepts of computer programming within the context of the visual arts. It offers a comprehensive reference and text for Processing (www.processing.org), an open-source programming language that can be used by students, artists, designers, architects, researchers, and anyone who wants to program images, animation, and interactivity. The ideas in Processing have been tested in classrooms, workshops, and arts institutions, including UCLA, Carnegie Mellon, New York University, and Harvard University. Tutorial units make up the bulk of the book and introduce the syntax and concepts of software (including variables, functions, and object-oriented programming), cover such topics as photography and drawing in relation to software, and feature many short, prototypical example programs with related images and explanations. More advanced professional projects from such domains as animation, performance, and typography are discussed in interviews with their creators. "Extensions" present concise introductions to further areas of investigation, including computer vision, sound, and electronics. Appendixes, references to additional material, and a glossary contain additional technical details. Processing can be used by reading each unit in order, or by following each category from the beginning of the book to the end. The Processing software and all of the code presented can be downloaded and run for future exploration. Includes essays by Alexander R. Galloway, Golan Levin, R. Luke DuBois, Simon Greenwold, Francis Li, and Hernando Barragan and interviews with Jared Tarbell, Martin Wattenberg, James Paterson, Erik van Blockland, Ed Burton, Josh On, Jurg Lehni, Auriea Harvey and Michael Samyn, Mathew Cullen and Grady Hall, Bob Sabiston, Jennifer Steinkamp, Ruth Jarman and Joseph Gerhardt, Sue Costabile, Chris Csikszentmihalyi, Golan Levin and Zachary Lieberman, and Mark Hansen. Casey Reas is Associate Professor in the Design Media Arts Department at the University of California, Los Angeles. Ben Fry is Nierenberg Chair of Design in the School of Design at Carnegie Mellon University, 2006-2007.

Thinking with Data


Max Shron - 2014
    In this practical guide, data strategy consultant Max Shron shows you how to put the why before the how, through an often-overlooked set of analytical skills. Thinking with Data helps you learn techniques for turning data into knowledge you can use. You’ll learn a framework for defining your project, including the data you want to collect, and how you intend to approach, organize, and analyze the results. You’ll also learn patterns of reasoning that will help you unveil the real problem that needs to be solved.
    - Learn a framework for scoping data projects
    - Understand how to pin down the details of an idea, receive feedback, and begin prototyping
    - Use the tools of arguments to ask good questions, build projects in stages, and communicate results
    - Explore data-specific patterns of reasoning and learn how to build more useful arguments
    - Delve into causal reasoning and learn how it permeates data work
    - Put everything together, using extended examples to see the method of full problem thinking in action

Spark: The Definitive Guide: Big Data Processing Made Simple


Bill Chambers - 2018
    With an emphasis on improvements and new features in Spark 2.0, authors Bill Chambers and Matei Zaharia break down Spark topics into distinct sections, each with unique goals. You’ll explore the basic operations and common functions of Spark’s structured APIs, as well as Structured Streaming, a new high-level API for building end-to-end streaming applications. Developers and system administrators will learn the fundamentals of monitoring, tuning, and debugging Spark, and explore machine learning techniques and scenarios for employing MLlib, Spark’s scalable machine-learning library.
    - Get a gentle overview of big data and Spark
    - Learn about DataFrames, SQL, and Datasets—Spark’s core APIs—through worked examples
    - Dive into Spark’s low-level APIs, RDDs, and execution of SQL and DataFrames
    - Understand how Spark runs on a cluster
    - Debug, monitor, and tune Spark clusters and applications
    - Learn the power of Structured Streaming, Spark’s stream-processing engine
    - Learn how you can apply MLlib to a variety of problems, including classification or recommendation
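    As a hedged taste of the structured APIs mentioned above, the PySpark snippet below reads a CSV file into a DataFrame, runs the same aggregation through the DataFrame API and through SQL, and shuts the session down. The file path and column names are placeholders, not examples from the book.

```python
# Minimal PySpark sketch of the structured APIs; the input path and column
# names ("events.csv", "country") are placeholders, not from the book.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example").getOrCreate()

df = spark.read.csv("events.csv", header=True, inferSchema=True)   # DataFrame API
df.groupBy("country").agg(F.count("*").alias("events")).show()

df.createOrReplaceTempView("events")                                # same query in SQL
spark.sql("SELECT country, COUNT(*) AS events FROM events GROUP BY country").show()

spark.stop()
```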

Natural Language Processing with Python


Steven Bird - 2009
    With it, you'll learn how to write Python programs that work with large collections of unstructured text. You'll access richly annotated datasets using a comprehensive range of linguistic data structures, and you'll understand the main algorithms for analyzing the content and structure of written communication. Packed with examples and exercises, Natural Language Processing with Python will help you:
    - Extract information from unstructured text, either to guess the topic or identify "named entities"
    - Analyze linguistic structure in text, including parsing and semantic analysis
    - Access popular linguistic databases, including WordNet and treebanks
    - Integrate techniques drawn from fields as diverse as linguistics and artificial intelligence
    This book will help you gain practical skills in natural language processing using the Python programming language and the Natural Language Toolkit (NLTK) open source library. If you're interested in developing web applications, analyzing multilingual news sources, or documenting endangered languages -- or if you're simply curious to have a programmer's perspective on how human language works -- you'll find Natural Language Processing with Python both fascinating and immensely useful.
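    To make the bulleted promises above a little more concrete, here is a short NLTK sketch of the kind of pipeline the book teaches: tokenize a sentence, tag parts of speech, and chunk named entities. The sentence is invented, and the exact resource names to download can vary across NLTK versions.

```python
# Short NLTK sketch (invented sentence; resource names may vary by version):
# tokenize, tag parts of speech, then chunk named entities.
import nltk

for resource in ("punkt", "averaged_perceptron_tagger",
                 "maxent_ne_chunker", "words"):
    nltk.download(resource, quiet=True)

sentence = "Ada Lovelace wrote the first program in London."
tokens = nltk.word_tokenize(sentence)        # ['Ada', 'Lovelace', 'wrote', ...]
tagged = nltk.pos_tag(tokens)                # [('Ada', 'NNP'), ...]
entities = nltk.ne_chunk(tagged)             # tree with PERSON / GPE chunks
print(entities)
```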

Information Architecture for the World Wide Web: Designing Large-Scale Web Sites


Peter Morville - 1998
    How do you present large volumes of information to people who need to find what they're looking for quickly? This classic primer shows information architects, designers, and web site developers how to build large-scale and maintainable web sites that are appealing and easy to navigate. The new edition is thoroughly updated to address emerging technologies -- with recent examples, new scenarios, and information on best practices -- while maintaining its focus on fundamentals. With topics that range from aesthetics to mechanics, Information Architecture for the World Wide Web explains how to create interfaces that users can understand right away. Inside, you'll find:
    * An overview of information architecture for both newcomers and experienced practitioners
    * The fundamental components of an architecture, illustrating the interconnected nature of these systems; updated to cover tagging, folksonomies, social classification, and guided navigation
    * Tools, techniques, and methods that take you from research to strategy and design to implementation; this edition discusses blueprints, wireframes, and the role of diagrams in the design phase
    * A series of short essays that provide practical tips and philosophical advice for those who work on information architecture
    * The business context of practicing and promoting information architecture, including recent lessons on how to handle enterprise architecture
    * Case studies on the evolution of two large and very different information architectures, illustrating best practices along the way
    How do you document the rich interfaces of web applications? How do you design for multiple platforms and mobile devices? With emphasis on goals and approaches over tactics or technologies, this enormously popular book gives you knowledge about information architecture with a framework that allows you to learn new approaches -- and unlearn outmoded ones.

Make Your Own Neural Network


Tariq Rashid - 2016
    Neural networks are a key element of deep learning and artificial intelligence, which today is capable of some truly impressive feats. Yet too few really understand how neural networks actually work. This guide will take you on a fun and unhurried journey, starting from very simple ideas and gradually building up an understanding of how neural networks work. You won't need any mathematics beyond secondary school, and an accessible introduction to calculus is also included. The ambition of this guide is to make neural networks as accessible as possible to as many readers as possible - there are enough texts for advanced readers already! You'll learn to code in Python and make your own neural network, teaching it to recognise human handwritten numbers and performing as well as professionally developed networks. Part 1 is about ideas. We introduce the mathematical ideas underlying neural networks, gently and with lots of illustrations and examples. Part 2 is practical. We introduce the popular and easy-to-learn Python programming language, and gradually build up a neural network which can learn to recognise human handwritten numbers, easily getting it to perform as well as networks made by professionals. Part 3 extends these ideas further. We push the performance of our neural network to an industry-leading 98% using only simple ideas and code, test the network on your own handwriting, take a privileged peek inside the mysterious mind of a neural network, and even get it all working on a Raspberry Pi. All the code in this guide has been tested to work on a Raspberry Pi Zero.
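    For a sense of what Part 2 builds, here is a minimal numpy sketch of the same ingredients: a small fully connected network with sigmoid activations trained by backpropagation. It is not the book's code, and it learns the toy XOR function rather than handwritten digits so that it stays self-contained.

```python
# Not the book's code: a tiny sigmoid network trained by backpropagation on
# XOR (instead of handwritten digits) to keep the example self-contained.
import numpy as np

rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros((1, 4))     # input -> hidden
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros((1, 1))     # hidden -> output
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
lr = 0.5

for _ in range(10000):
    h = sigmoid(X @ W1 + b1)                  # forward pass
    out = sigmoid(h @ W2 + b2)
    d_out = (out - y) * out * (1 - out)       # backpropagate the error
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h;    b1 -= lr * d_h.sum(axis=0, keepdims=True)

print(out.round(2).ravel())                   # should approach [0, 1, 1, 0]
```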

The Art of Game Design: A Book of Lenses


Jesse Schell - 2008
    The Art of Game Design: A Book of Lenses shows that the same basic principles of psychology that work for board games, card games and athletic games also are the keys to making top-quality video games. Good game design happens when you view your game from many different perspectives, or lenses. While touring through the unusual territory that is game design, this book gives the reader one hundred of these lenses—one hundred sets of insightful questions to ask yourself that will help make your game better. These lenses are gathered from fields as diverse as psychology, architecture, music, visual design, film, software engineering, theme park design, mathematics, writing, puzzle design, and anthropology. Anyone who reads this book will be inspired to become a better game designer—and will understand how to do it.

Reinforcement Learning: An Introduction


Richard S. Sutton - 1998
    Reinforcement learning, one of the most active research areas in artificial intelligence, is a computational approach to learning whereby an agent tries to maximize the total amount of reward it receives when interacting with a complex, uncertain environment. In Reinforcement Learning, Richard Sutton and Andrew Barto provide a clear and simple account of the key ideas and algorithms of reinforcement learning. Their discussion ranges from the history of the field's intellectual foundations to the most recent developments and applications. The only necessary mathematical background is familiarity with elementary concepts of probability. The book is divided into three parts. Part I defines the reinforcement learning problem in terms of Markov decision processes. Part II provides basic solution methods: dynamic programming, Monte Carlo methods, and temporal-difference learning. Part III presents a unified view of the solution methods and incorporates artificial neural networks, eligibility traces, and planning; the two final chapters present case studies and consider the future of reinforcement learning.
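    As a small, concrete companion to the description above, the sketch below implements tabular Q-learning, one of the temporal-difference methods covered in Part II, on a made-up five-state chain where stepping right from the last state earns a reward of 1 and ends the episode. The environment and constants are invented for illustration.

```python
# Illustrative sketch (environment invented, not from the book): tabular
# Q-learning on a 5-state chain; reward 1 for stepping right off the end.
import random

N_STATES, ACTIONS = 5, (0, 1)                 # action 0 = left, 1 = right
alpha, gamma, eps = 0.1, 0.9, 0.1
Q = [[0.0, 0.0] for _ in range(N_STATES)]

def step(s, a):
    if a == 1 and s == N_STATES - 1:
        return None, 1.0                      # terminal transition, reward 1
    return (min(s + 1, N_STATES - 1), 0.0) if a == 1 else (max(s - 1, 0), 0.0)

def choose(s):                                # epsilon-greedy with random tie-break
    if random.random() < eps or Q[s][0] == Q[s][1]:
        return random.choice(ACTIONS)
    return 0 if Q[s][0] > Q[s][1] else 1

for _ in range(500):                          # episodes
    s = 0
    for _ in range(200):                      # step cap keeps episodes finite
        a = choose(s)
        s2, r = step(s, a)
        target = r if s2 is None else r + gamma * max(Q[s2])
        Q[s][a] += alpha * (target - Q[s][a]) # temporal-difference update
        if s2 is None:
            break
        s = s2

print([round(max(q), 2) for q in Q])          # state values rise toward the goal
```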