Programming Perl


Tom Christiansen - 1991
    The first edition of this book, Programming Perl, hit the shelves in 1991, and was quickly adopted as the undisputed bible of the language. Since then, Perl has grown with the times, and so has this book.

    Programming Perl is not just a book about Perl. It is also a unique introduction to the language and its culture, as one might expect only from its authors. Larry Wall is the inventor of Perl, and provides a unique perspective on the evolution of Perl and its future direction. Tom Christiansen was one of the first champions of the language, and lives and breathes the complexities of Perl internals as few other mortals do. Jon Orwant is the editor of The Perl Journal, which has brought together the Perl community as a common forum for new developments in Perl.

    Any Perl book can show the syntax of Perl's functions, but only this one is a comprehensive guide to all the nooks and crannies of the language. Any Perl book can explain typeglobs, pseudohashes, and closures, but only this one shows how they really work. Any Perl book can say that my is faster than local, but only this one explains why. Any Perl book can have a title, but only this book is affectionately known by all Perl programmers as "The Camel."

    This third edition of Programming Perl has been expanded to cover version 5.6 of this maturing language. New topics include threading, the compiler, Unicode, and other new features that have been added since the previous edition.

OpenIntro Statistics


David M. Diez - 2012
    Our inaugural effort is OpenIntro Statistics. Probability is optional, inference is key, and we feature real data whenever possible. Files for the entire book are freely available at openintro.org, and anybody can purchase a paperback copy from amazon.com for under $10.

    The future for OpenIntro depends on the involvement and enthusiasm of our community. Visit our website, openintro.org. We provide free course management tools, including an online question bank, utilities for creating course quizzes, and many other helpful resources.

Super Crunchers: Why Thinking-By-Numbers Is the New Way to Be Smart


Ian Ayres - 2007
    In this lively and groundbreaking new book, economist Ian Ayres shows how today's best and brightest organizations are analyzing massive databases at lightning speed to provide greater insights into human behavior. They are the Super Crunchers. From internet sites like Google and Amazon that know your tastes better than you do, to a physician's diagnosis and your child's education, to boardrooms and government agencies, this new breed of decision makers is calling the shots. And they are delivering staggeringly accurate results. How can a football coach evaluate a player without ever seeing him play? Want to know whether the price of an airline ticket will go up or down before you buy? How can a formula outpredict wine experts in determining the best vintages? Super crunchers have the answers. In this brave new world of equation versus expertise, Ayres shows us the benefits and risks, who loses and who wins, and how super crunching can be used to help, not manipulate, us.

    Gone are the days of solely relying on intuition to make decisions. No businessperson, consumer, or student who wants to stay ahead of the curve should make another keystroke without reading Super Crunchers.

The Mythical Man-Month: Essays on Software Engineering


Frederick P. Brooks Jr. - 1975
    With a blend of software engineering facts and thought-provoking opinions, Fred Brooks offers insight for anyone managing complex projects. These essays draw from his experience as project manager for the IBM System/360 computer family and then for OS/360, its massive software system. Now, 20 years after the initial publication of his book, Brooks has revisited his original ideas and added new thoughts and advice, both for readers already familiar with his work and for readers discovering it for the first time.

    The added chapters contain (1) a crisp condensation of all the propositions asserted in the original book, including Brooks' central argument in The Mythical Man-Month: that large programming projects suffer management problems different from small ones due to the division of labor; that the conceptual integrity of the product is therefore critical; and that it is difficult but possible to achieve this unity; (2) Brooks' view of these propositions a generation later; (3) a reprint of his classic 1986 paper "No Silver Bullet"; and (4) today's thoughts on the 1986 assertion, "There will be no silver bullet within ten years."

How Linux Works: What Every Superuser Should Know


Brian Ward - 2004
    Some books try to give you copy-and-paste instructions for how to deal with every single system issue that may arise, but How Linux Works actually shows you how the Linux system functions so that you can come up with your own solutions. After a guided tour of filesystems, the boot sequence, system management basics, and networking, author Brian Ward delves into open-ended topics such as development tools, custom kernels, and buying hardware, all from an administrator's point of view. With a mixture of background theory and real-world examples, this book shows both "how" to administer Linux, and "why" each particular technique works, so that you will know how to make Linux work for you.

Site Reliability Engineering: How Google Runs Production Systems


Betsy Beyer - 2016
    So, why does conventional wisdom insist that software engineers focus primarily on the design and development of large-scale computing systems?

    In this collection of essays and articles, key members of Google's Site Reliability Team explain how and why their commitment to the entire lifecycle has enabled the company to successfully build, deploy, monitor, and maintain some of the largest software systems in the world. You'll learn the principles and practices that enable Google engineers to make systems more scalable, reliable, and efficient--lessons directly applicable to your organization.

    This book is divided into four sections:

    - Introduction: Learn what site reliability engineering is and why it differs from conventional IT industry practices
    - Principles: Examine the patterns, behaviors, and areas of concern that influence the work of a site reliability engineer (SRE)
    - Practices: Understand the theory and practice of an SRE's day-to-day work: building and operating large distributed computing systems
    - Management: Explore Google's best practices for training, communication, and meetings that your organization can use

Refactoring: Improving the Design of Existing Code


Martin Fowler - 1999
    Significant numbers of poorly designed programs have been created by less-experienced developers, resulting in applications that are inefficient and hard to maintain and extend. Increasingly, software system professionals are discovering just how difficult it is to work with these inherited, non-optimal applications. For several years, expert-level object programmers have employed a growing collection of techniques to improve the structural integrity and performance of such existing software programs. Referred to as refactoring, these practices have remained in the domain of experts because no attempt has been made to transcribe the lore into a form that all developers could use... until now. In Refactoring: Improving the Design of Existing Code, renowned object technology mentor Martin Fowler breaks new ground, demystifying these master practices and demonstrating how software practitioners can realize the significant benefits of this new process.
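
    To make the idea concrete, here is one classic refactoring from the book's catalog, Extract Method, sketched in Python rather than the book's Java; the function names and data shape are our own invention, not an example from the book.

        # Before: printing and calculation are tangled together,
        # so the total cannot be reused or tested on its own.
        def print_owing(orders):
            total = 0
            for order in orders:
                total += order["amount"]
            print("Amount owing:", total)

        # After applying Extract Method: the calculation moves into
        # its own well-named function and the original delegates to it.
        # Behavior is identical; the structure is easier to extend.
        def outstanding(orders):
            return sum(order["amount"] for order in orders)

        def print_owing_v2(orders):
            print("Amount owing:", outstanding(orders))

        print_owing_v2([{"amount": 120.0}, {"amount": 80.5}])  # Amount owing: 200.5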

Coders at Work: Reflections on the Craft of Programming


Peter Seibel - 2009
    As the words "at work" suggest, Peter Seibel focuses on how his interviewees tackle the day-to-day work of programming, while revealing much more, like how they became great programmers, how they recognize programming talent in others, and what kinds of problems they find most interesting. Hundreds of people have suggested names of programmers to interview on the Coders at Work web site: http://www.codersatwork.com. The complete list was 284 names. Having digested everyone’s feedback, we selected 15 folks who’ve been kind enough to agree to be interviewed:

    - Frances Allen: Pioneer in optimizing compilers, first woman to win the Turing Award (2006) and first female IBM fellow
    - Joe Armstrong: Inventor of Erlang
    - Joshua Bloch: Author of the Java collections framework, now at Google
    - Bernie Cosell: One of the main software guys behind the original ARPANET IMPs and a master debugger
    - Douglas Crockford: JSON founder, JavaScript architect at Yahoo!
    - L. Peter Deutsch: Author of Ghostscript, implementer of Smalltalk-80 at Xerox PARC and Lisp 1.5 on PDP-1
    - Brendan Eich: Inventor of JavaScript, CTO of the Mozilla Corporation
    - Brad Fitzpatrick: Writer of LiveJournal, OpenID, memcached, and Perlbal
    - Dan Ingalls: Smalltalk implementor and designer
    - Simon Peyton Jones: Coinventor of Haskell and lead designer of the Glasgow Haskell Compiler
    - Donald Knuth: Author of The Art of Computer Programming and creator of TeX
    - Peter Norvig: Director of Research at Google and author of the standard text on AI
    - Guy Steele: Coinventor of Scheme and part of the Common Lisp Gang of Five, currently working on Fortress
    - Ken Thompson: Inventor of UNIX
    - Jamie Zawinski: Author of XEmacs and early Netscape/Mozilla hacker

    What you’ll learn: how the best programmers in the world do their job.

    Who is this book for? Programmers interested in the point of view of leaders in the field, and programmers looking for approaches that work for some of these outstanding programmers.

Statistics in a Nutshell: A Desktop Quick Reference


Sarah Boslaugh - 2008
    This book gives you a solid understanding of statistics without being too simple, yet without the numbing complexity of most college texts. You get a firm grasp of the fundamentals and a hands-on understanding of how to apply them before moving on to the more advanced material that follows. Each chapter presents you with easy-to-follow descriptions illustrated by graphics, formulas, and plenty of solved examples. Before you know it, you'll learn to apply statistical reasoning and statistical techniques, from basic concepts of probability and hypothesis testing to multivariate analysis. Organized into four distinct sections, Statistics in a Nutshell offers you:

    - Introductory material: Different ways to think about statistics; basic concepts of measurement and probability theory; data management for statistical analysis; research design and experimental design; how to critique statistics presented by others
    - Basic inferential statistics: Basic concepts of inferential statistics; the concept of correlation, when it is and is not an appropriate measure of association; dichotomous and categorical data; the distinction between parametric and nonparametric statistics
    - Advanced inferential techniques: The General Linear Model; Analysis of Variance (ANOVA) and MANOVA; multiple linear regression
    - Specialized techniques: Business and quality improvement statistics; medical and public health statistics; educational and psychological statistics

    Unlike many introductory books on the subject, Statistics in a Nutshell doesn't omit important material in an effort to dumb it down. And this book is far more practical than most college texts, which tend to over-emphasize calculation without teaching you when and how to apply different statistical tests. With Statistics in a Nutshell, you learn how to perform most common statistical analyses, and understand statistical techniques presented in research articles. If you need to know how to use a wide range of statistical techniques without getting in over your head, this is the book you want.
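
    As a taste of the "basic inferential statistics" material listed above, here is a two-sample t-test sketched in Python with scipy; the data are invented for illustration and the library choice is ours, not the book's.

        # A minimal sketch of a two-sample t-test. The null hypothesis
        # is that the two groups share the same mean; the toy data
        # below are made up for illustration.
        from scipy import stats

        group_a = [5.1, 4.9, 6.2, 5.8, 5.5, 5.0]
        group_b = [6.0, 6.4, 5.9, 6.8, 6.1, 6.5]

        t_stat, p_value = stats.ttest_ind(group_a, group_b)
        print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
        # A small p-value (e.g. below 0.05) is evidence against the null.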

Econometric Analysis of Cross Section and Panel Data


Jeffrey M. Wooldridge - 2001
    The book makes clear that applied microeconometrics is about the estimation of marginal and treatment effects, and that parametric estimation is simply a means to this end. It also clarifies the distinction between causality and statistical association. The book focuses specifically on cross section and panel data methods. Population assumptions are stated separately from sampling assumptions, leading to simple statements as well as to important insights. The unified approach to linear and nonlinear models and to cross section and panel data enables straightforward coverage of more advanced methods. The numerous end-of-chapter problems are an important component of the book. Some problems contain important points not fully described in the text, and others cover new ideas that can be analyzed using tools presented in the current and previous chapters. Several problems require the use of the data sets located at the author's website.

Pearls of Functional Algorithm Design


Richard S. Bird - 2010
    These 30 short chapters each deal with a particular programming problem drawn from sources as diverse as games and puzzles, intriguing combinatorial tasks, and more familiar areas such as data compression and string matching. Each pearl starts with the statement of the problem expressed using the functional programming language Haskell, a powerful yet succinct language for capturing algorithmic ideas clearly and simply. The novel aspect of the book is that each solution is calculated from an initial formulation of the problem in Haskell by appealing to the laws of functional programming. Pearls of Functional Algorithm Design will appeal to the aspiring functional programmer, students and teachers interested in the principles of algorithm design, and anyone seeking to master the techniques of reasoning about programs in an equational style.
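
    The book's derivations are carried out in Haskell by appealing to equational laws, but the overall movement, from an obviously correct specification to an efficient program computing the same thing, can be suggested in Python. The maximum-segment-sum problem below is our choice of illustration, not necessarily one of the book's thirty pearls.

        # Start from a specification that is clearly correct but slow,
        # then replace it with a derived program that provably agrees.

        def mss_spec(xs):
            # Specification: maximum sum over all contiguous segments
            # (including the empty segment). O(n^3), obviously correct.
            n = len(xs)
            return max(sum(xs[i:j])
                       for i in range(n + 1)
                       for j in range(i, n + 1))

        def mss_fast(xs):
            # Derived one-pass version (Kadane's scheme): track the
            # best segment ending at each position. O(n).
            best = ending_here = 0
            for x in xs:
                ending_here = max(0, ending_here + x)
                best = max(best, ending_here)
            return best

        xs = [3, -4, 2, -1, 6, -3]
        assert mss_spec(xs) == mss_fast(xs) == 7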

Statistical Inference


George Casella - 2001
    Starting from the basics of probability, the authors develop the theory of statistical inference using techniques, definitions, and concepts that are statistical and are natural extensions and consequences of previous concepts. This book can be used by readers who have a solid mathematics background. It can also be used in a way that stresses the more practical uses of statistical theory, being more concerned with understanding basic statistical concepts and deriving reasonable statistical procedures for a variety of situations, and less concerned with formal optimality investigations.

A New Kind of Science


Stephen Wolfram - 2002
    Wolfram lets the world see his work in A New Kind of Science, a gorgeous, 1,280-page tome more than a decade in the making. With patience, insight, and self-confidence to spare, Wolfram outlines a fundamental new way of modeling complex systems. On the frontier of complexity science since he was a boy, Wolfram is a champion of cellular automata--256 "programs" governed by simple nonmathematical rules. He points out that even the most complex equations fail to accurately model biological systems, but the simplest cellular automata can produce results straight out of nature--tree branches, stream eddies, and leopard spots, for instance. The graphics in A New Kind of Science show a striking resemblance to the patterns we see in nature every day. Wolfram wrote the book in a distinct style meant to make it easy to read, even for nontechies; a basic familiarity with logic is helpful but not essential. Readers will find themselves swept away by the elegant simplicity of Wolfram's ideas and the accidental artistry of the cellular automaton models. Whether or not Wolfram's revolution ultimately gives us the keys to the universe, his new science is absolutely awe-inspiring. --Therese Littleton
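
    The "256 programs" are the elementary cellular automata: each cell's next state depends only on itself and its two neighbors, so a rule is just an 8-entry lookup table, numbered 0 to 255. Here is a minimal Python sketch of one rule Wolfram studies at length, Rule 30; the grid size, wrap-around boundary, and rendering are our own choices.

        # Elementary cellular automaton: the neighborhood (left, self,
        # right) forms a 3-bit number 0..7, and the rule number's bits
        # give the next state for each of the 8 neighborhoods.

        def step(cells, rule=30):
            n = len(cells)
            return [
                (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2
                          + cells[(i + 1) % n])) & 1
                for i in range(n)
            ]

        # Start from a single live cell and print a few generations;
        # Rule 30 quickly produces an irregular, nature-like pattern.
        cells = [0] * 31
        cells[15] = 1
        for _ in range(16):
            print("".join(".#"[c] for c in cells))
            cells = step(cells)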

Mindstorms: Children, Computers, And Powerful Ideas


Seymour Papert - 1980
    We have Mindstorms to thank for that. In this book, pioneering computer scientist Seymour Papert uses the invention of LOGO, the first child-friendly programming language, to make the case for the value of teaching children with computers. Papert argues that children are more than capable of mastering computers, and that teaching computational processes like debugging in the classroom can change the way we learn everything else. He also shows that schools saturated with technology can actually improve socialization and interaction among students and between students and teachers.

Grokking Deep Learning


Andrew W. Trask - 2017
    Loosely based on neuron behavior inside of human brains, these systems are rapidly catching up with the intelligence of their human creators, defeating the world champion Go player, achieving superhuman performance on video games, driving cars, translating languages, and sometimes even helping law enforcement fight crime. Deep Learning is a revolution that is changing every industry across the globe.

    Grokking Deep Learning is the perfect place to begin your deep learning journey. Rather than just learn the “black box” API of some library or framework, you will actually understand how to build these algorithms completely from scratch. You will understand how Deep Learning is able to learn at levels greater than humans. You will be able to understand the “brain” behind state-of-the-art Artificial Intelligence. Furthermore, unlike other courses that assume advanced knowledge of Calculus and leverage complex mathematical notation, if you’re a Python hacker who passed high-school algebra, you’re ready to go. And at the end, you’ll even build an A.I. that will learn to defeat you in a classic Atari game.
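
    In that spirit, here is a minimal from-scratch example of the kind the book builds up: a single weight learning y = 2x by gradient descent in plain Python. The toy data and learning rate are our own, not an example taken from the book.

        # One neuron, one weight, no libraries: repeatedly nudge the
        # weight against the gradient of the squared error.

        inputs  = [1.0, 2.0, 3.0, 4.0]
        targets = [2.0, 4.0, 6.0, 8.0]   # true relationship: y = 2x

        weight, lr = 0.0, 0.05
        for epoch in range(20):
            for x, y in zip(inputs, targets):
                pred = weight * x            # forward pass
                error = pred - y             # how far off we are
                weight -= lr * error * x     # gradient step on squared error

        print(round(weight, 3))              # converges toward 2.0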