Book picks similar to Networks, Crowds, and Markets by David Easley
economics
computer-science
networks
complexity
A New Kind of Science
Stephen Wolfram - 2002
Wolfram lets the world see his work in A New Kind of Science, a gorgeous, 1,280-page tome more than a decade in the making. With patience, insight, and self-confidence to spare, Wolfram outlines a fundamental new way of modeling complex systems. On the frontier of complexity science since he was a boy, Wolfram is a champion of cellular automata--256 "programs" governed by simple nonmathematical rules. He points out that even the most complex equations fail to accurately model biological systems, but the simplest cellular automata can produce results straight out of nature--tree branches, stream eddies, and leopard spots, for instance. The graphics in A New Kind of Science bear a striking resemblance to the patterns we see in nature every day. Wolfram wrote the book in a distinct style meant to make it easy to read, even for nontechies; a basic familiarity with logic is helpful but not essential. Readers will find themselves swept away by the elegant simplicity of Wolfram's ideas and the accidental artistry of the cellular automaton models. Whether or not Wolfram's revolution ultimately gives us the keys to the universe, his new science is absolutely awe-inspiring. --Therese Littleton
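To give a concrete feel for the "simple programs" Wolfram studies, here is a minimal Python sketch (my own illustration, not code from the book) of an elementary cellular automaton. Rule 30 is one of the 256 possible rules; each cell's next state is looked up from the current states of the cell and its two neighbors, yet the result is an intricate, seemingly random pattern.

```python
def step(cells, rule=30):
    """Apply one update of an elementary cellular automaton (wraparound edges)."""
    n = len(cells)
    out = []
    for i in range(n):
        left, center, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        index = (left << 2) | (center << 1) | right  # neighborhood as a 3-bit number
        out.append((rule >> index) & 1)              # look up the new state in the rule
    return out

# Start from a single live cell and print the evolving pattern.
width, steps = 31, 15
row = [0] * width
row[width // 2] = 1
for _ in range(steps):
    print("".join("#" if c else "." for c in row))
    row = step(row)
```

Changing the `rule` argument to any value from 0 to 255 selects a different one of the 256 elementary rules.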
Blockchain Revolution: How the Technology Behind Bitcoin Is Changing Money, Business, and the World
Don Tapscott - 2016
The blockchain is the technology behind bitcoin, but it is much more than that. It is a public ledger to which everyone has access, but which no single person controls. It allows for companies and individuals to collaborate with an unprecedented degree of trust and transparency. It is cryptographically secure, but fundamentally open. And soon it will be everywhere. In Blockchain Revolution, Don and Alex Tapscott reveal how this game-changing technology will shape the future of the world economy, dramatically improving everything from healthcare records to online voting, and from insurance claims to artist royalty payments. Brilliantly researched and highly accessible, this is the essential text on the next major paradigm shift. Read it, or be left behind.
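As a rough illustration of the ledger idea described above (my own sketch, not the Tapscotts'), the Python snippet below chains blocks together by hashing each one into its successor, so any later edit to the history is detectable.

```python
import hashlib
import json
import time

def block_hash(block):
    """Hash a block's full contents, including the hash of its predecessor."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain, transactions):
    """Append a block that commits to the previous block via its hash."""
    prev_hash = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"timestamp": time.time(), "transactions": transactions, "prev_hash": prev_hash})

def verify(chain):
    """Recompute every link; any edit to an earlier block breaks the chain."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1]) for i in range(1, len(chain)))

chain = []
add_block(chain, ["genesis"])
add_block(chain, ["alice pays bob 5"])
print(verify(chain))                      # True
chain[0]["transactions"] = ["tampered"]   # altering history...
print(verify(chain))                      # ...is detectable: False
```

A real blockchain adds consensus, signatures, and peer-to-peer replication on top of this basic hash-chaining idea.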
Algorithms of Oppression: How Search Engines Reinforce Racism
Safiya Umoja Noble - 2018
Run a Google search for "black girls" and then for "white girls": the results are radically different. The suggested porn sites and un-moderated discussions about "why black women are so sassy" or "why black women are so angry" present a disturbing portrait of black womanhood in modern society. In Algorithms of Oppression, Safiya Umoja Noble challenges the idea that search engines like Google offer an equal playing field for all forms of ideas, identities, and activities. Data discrimination is a real social problem; Noble argues that the combination of private interests in promoting certain sites, along with the monopoly status of a relatively small number of Internet search engines, leads to a biased set of search algorithms that privilege whiteness and discriminate against people of color, specifically women of color. Through an analysis of textual and media searches as well as extensive research on paid online advertising, Noble exposes a culture of racism and sexism in the way discoverability is created online. As search engines and their related companies grow in importance - operating as a source for email, a major vehicle for primary and secondary school learning, and beyond - understanding and reversing these disquieting trends and discriminatory practices is of utmost importance. An original, surprising and, at times, disturbing account of bias on the internet, Algorithms of Oppression contributes to our understanding of how racism is created, maintained, and disseminated in the 21st century.
Data Mining: Concepts and Techniques (The Morgan Kaufmann Series in Data Management Systems)
Jiawei Han - 2000
Not only are all of our business, scientific, and government transactions now computerized, but the widespread use of digital cameras, publication tools, and bar codes also generates data. On the collection side, scanned text and image platforms, satellite remote sensing systems, and the World Wide Web have flooded us with a tremendous amount of data. This explosive growth has generated an even more urgent need for new techniques and automated tools that can help us transform this data into useful information and knowledge. Like the first edition, voted the most popular data mining book by KD Nuggets readers, this book explores concepts and techniques for the discovery of patterns hidden in large data sets, focusing on issues relating to their feasibility, usefulness, effectiveness, and scalability. However, since the publication of the first edition, great progress has been made in the development of new data mining methods, systems, and applications. This new edition substantially enhances the first edition, and new chapters have been added to address recent developments in mining complex types of data--including stream data, sequence data, graph structured data, social network data, and multi-relational data.
- A comprehensive, practical look at the concepts and techniques you need to know to get the most out of real business data
- Updates that incorporate input from readers, changes in the field, and more material on statistics and machine learning
- Dozens of algorithms and implementation examples, all in easily understood pseudo-code and suitable for use in real-world, large-scale data mining projects
- Complete classroom support for instructors at the www.mkp.com/datamining2e companion site
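For a taste of the kind of pattern discovery the book covers, here is a small Python sketch (not from the book, whose examples are in pseudo-code) of frequent-itemset counting, the first step of association-rule mining; the transactions and the min_support threshold below are made up for illustration.

```python
from itertools import combinations
from collections import Counter

transactions = [
    {"bread", "milk"},
    {"bread", "diapers", "beer"},
    {"milk", "diapers", "beer"},
    {"bread", "milk", "diapers", "beer"},
    {"bread", "milk", "diapers"},
]

def frequent_itemsets(transactions, min_support=3, max_size=2):
    """Count itemsets of size 1..max_size and keep those meeting min_support."""
    counts = Counter()
    for basket in transactions:
        for size in range(1, max_size + 1):
            for itemset in combinations(sorted(basket), size):
                counts[itemset] += 1
    return {itemset: c for itemset, c in counts.items() if c >= min_support}

for itemset, support in sorted(frequent_itemsets(transactions).items()):
    print(itemset, support)
```

Real mining systems prune the candidate itemsets (as Apriori does) rather than enumerating all combinations, but the counting idea is the same.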
The Filter Bubble: What the Internet is Hiding From You
Eli Pariser - 2011
Instead of giving you the most broadly popular result, Google now tries to predict what you are most likely to click on. According to MoveOn.org board president Eli Pariser, Google's change in policy is symptomatic of the most significant shift to take place on the Web in recent years - the rise of personalization. In this groundbreaking investigation of the new hidden Web, Pariser uncovers how this growing trend threatens to control how we consume and share information as a society - and reveals what we can do about it. Though the phenomenon has gone largely undetected until now, personalized filters are sweeping the Web, creating individual universes of information for each of us. Facebook - the primary news source for an increasing number of Americans - prioritizes the links it believes will appeal to you so that if you are a liberal, you can expect to see only progressive links. Even an old-media bastion like "The Washington Post" devotes the top of its home page to a news feed with the links your Facebook friends are sharing. Behind the scenes a burgeoning industry of data companies is tracking your personal information to sell to advertisers, from your political leanings to the color you painted your living room to the hiking boots you just browsed on Zappos. In a personalized world, we will increasingly be typed and fed only news that is pleasant, familiar, and confirms our beliefs - and because these filters are invisible, we won't know what is being hidden from us. Our past interests will determine what we are exposed to in the future, leaving less room for the unexpected encounters that spark creativity, innovation, and the democratic exchange of ideas. While we all worry that the Internet is eroding privacy or shrinking our attention spans, Pariser uncovers a more pernicious and far-reaching trend on the Internet and shows how we can - and must - change course. With vivid detail and remarkable scope, The Filter Bubble reveals how personalization undermines the Internet's original purpose as an open platform for the spread of ideas and could leave us all in an isolated, echoing world.
The Nature of Code
Daniel Shiffman - 2012
Readers will progress from building a basic physics engine to creating intelligent moving objects and complex systems, setting the foundation for further experiments in generative design. Subjects covered include forces, trigonometry, fractals, cellular automata, self-organization, and genetic algorithms. The book's examples are written in Processing, an open-source language and development environment built on top of the Java programming language. On the book's website (http://www.natureofcode.com), the examples run in the browser via Processing's JavaScript mode.
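The book's own examples are written in Processing, but the core idea of its early chapters translates directly; the sketch below (my own, in Python) shows a "mover" that accumulates forces and updates its position with simple Euler integration, the starting point for the book's physics-engine projects.

```python
class Mover:
    def __init__(self, x, y, mass=1.0):
        self.pos = [x, y]
        self.vel = [0.0, 0.0]
        self.acc = [0.0, 0.0]
        self.mass = mass

    def apply_force(self, fx, fy):
        # Newton's second law: acceleration accumulates force / mass.
        self.acc[0] += fx / self.mass
        self.acc[1] += fy / self.mass

    def update(self, dt=1.0):
        # Euler integration: velocity from acceleration, position from velocity.
        self.vel = [v + a * dt for v, a in zip(self.vel, self.acc)]
        self.pos = [p + v * dt for p, v in zip(self.pos, self.vel)]
        self.acc = [0.0, 0.0]  # clear accumulated forces for the next frame

m = Mover(0, 0)
for frame in range(5):
    m.apply_force(0.1, 0.0)   # a steady "wind" pushing right
    m.apply_force(0.0, -0.2)  # a constant downward force standing in for gravity
    m.update()
    print(frame, m.pos)
```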
Algorithms Illuminated (Part 1): The Basics
Tim Roughgarden - 2017
Algorithms have applications ranging from network routing and computational genomics to public-key cryptography and database system implementation. Studying algorithms can make you a better programmer, a clearer thinker, and a master of technical interviews. Algorithms Illuminated is an accessible introduction to the subject--a transcript of what an expert algorithms tutor would say over a series of one-on-one lessons. The exposition is rigorous but emphasizes the big picture and conceptual understanding over low-level implementation and mathematical details. Part 1 of the book series covers asymptotic analysis and big-O notation, divide-and-conquer algorithms and the master method, randomized algorithms, and several famous algorithms for sorting and selection.
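As a flavor of the divide-and-conquer material in Part 1, here is a short merge sort sketch (mine, not the book's): split the input, sort each half recursively, and merge the two sorted halves in linear time, giving O(n log n) overall.

```python
def merge_sort(values):
    if len(values) <= 1:
        return list(values)
    mid = len(values) // 2
    left, right = merge_sort(values[:mid]), merge_sort(values[mid:])
    # Merge: repeatedly take the smaller front element of the two sorted halves.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 3, 8, 1, 9, 2]))  # [1, 2, 3, 5, 8, 9]
```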
Python Crash Course: A Hands-On, Project-Based Introduction to Programming
Eric Matthes - 2015
You'll also learn how to make your programs interactive and how to test your code safely before adding it to a project. In the second half of the book, you'll put your new knowledge into practice with three substantial projects: a Space Invaders-inspired arcade game, data visualizations with Python's super-handy libraries, and a simple web app you can deploy online. As you work through Python Crash Course, you'll learn how to:
- Use powerful Python libraries and tools, including matplotlib, NumPy, and Pygal
- Make 2D games that respond to keypresses and mouse clicks, and that grow more difficult as the game progresses
- Work with data to generate interactive visualizations
- Create and customize simple web apps and deploy them safely online
- Deal with mistakes and errors so you can solve your own programming problems
If you've been thinking seriously about digging into programming, Python Crash Course will get you up to speed and have you writing real programs fast. Why wait any longer? Start your engines and code!
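A tiny example in the spirit of the book's data-visualization project (my own, not taken from the book; it assumes NumPy and matplotlib are installed): simulate and plot a random walk.

```python
import numpy as np
import matplotlib.pyplot as plt

steps = np.random.choice([-1, 1], size=1000)  # 1,000 coin-flip steps
walk = steps.cumsum()                          # running total = position over time

plt.plot(walk)
plt.title("Random walk")
plt.xlabel("Step")
plt.ylabel("Position")
plt.show()
```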
Think Like a Programmer: An Introduction to Creative Problem Solving
V. Anton Spraul - 2012
In this one-of-a-kind text, author V. Anton Spraul breaks down the ways that programmers solve problems and teaches you what other introductory books often ignore: how to Think Like a Programmer. Each chapter tackles a single programming concept, like classes, pointers, and recursion, and open-ended exercises throughout challenge you to apply your knowledge. You'll also learn how to:
- Split problems into discrete components to make them easier to solve
- Make the most of code reuse with functions, classes, and libraries
- Pick the perfect data structure for a particular job
- Master more advanced programming tools like recursion and dynamic memory
- Organize your thoughts and develop strategies to tackle particular types of problems
Although the book's examples are written in C++, the creative problem-solving concepts they illustrate go beyond any particular language; in fact, they often reach outside the realm of computer science. As the most skillful programmers know, writing great code is a creative art—and the first step in creating your masterpiece is learning to Think Like a Programmer.
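The book's examples are in C++, but the decomposition habit it teaches is language-independent; the short Python sketch below (my own, not from the book) reduces "sum a nested list" to smaller instances of the same problem, the recursive style of thinking the book drills.

```python
def nested_sum(items):
    """Sum a possibly nested list of numbers by recursive decomposition."""
    if not items:                      # base case: nothing left to add
        return 0
    head, rest = items[0], items[1:]
    if isinstance(head, list):         # sub-problem: a nested list is summed the same way
        return nested_sum(head) + nested_sum(rest)
    return head + nested_sum(rest)     # otherwise add the element and recurse on the rest

print(nested_sum([1, [2, 3], [4, [5]]]))  # 15
```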
The Art of Doing Science and Engineering: Learning to Learn
Richard Hamming - 1996
By presenting actual experiences and analyzing them as they are described, the author conveys the developmental thought processes employed and shows that a style of thinking that leads to successful results can be learned. Along with spectacular successes, the author also conveys how failures contributed to shaping the thought processes. The book provides the reader with a style of thinking that will enhance a person's ability to function as a problem-solver of complex technical issues, and it consists of a collection of stories about the author's participation in significant discoveries, relating how those discoveries came about and, most importantly, providing analysis of the thought processes and reasoning that took place as the author and his associates progressed through engineering problems.
Mostly Harmless Econometrics: An Empiricist's Companion
Joshua D. Angrist - 2008
In the modern experimentalist paradigm, these techniques address clear causal questions such as: Do smaller classes increase learning? Should wife batterers be arrested? How much does education raise wages? Mostly Harmless Econometrics shows how the basic tools of applied econometrics allow the data to speak. In addition to econometric essentials, Mostly Harmless Econometrics covers important new extensions--regression-discontinuity designs and quantile regression--as well as how to get standard errors right. Joshua Angrist and Jörn-Steffen Pischke explain why fancier econometric techniques are typically unnecessary and even dangerous. The applied econometric methods emphasized in this book are easy to use and relevant for many areas of contemporary social science.
- An irreverent review of econometric essentials
- A focus on tools that applied researchers use most
- Chapters on regression-discontinuity designs, quantile regression, and standard errors
- Many empirical examples
- A clear and concise resource with wide applications
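To make the regression-discontinuity idea concrete, here is a toy simulation (my own sketch, not an example from the book; it assumes NumPy): treatment switches on when a running variable crosses a cutoff, and the estimated jump in the outcome at the cutoff recovers the treatment effect.

```python
import numpy as np

rng = np.random.default_rng(0)
n, cutoff, true_effect = 2000, 0.0, 2.0
running = rng.uniform(-1, 1, n)               # running variable (e.g. a test score)
treated = (running >= cutoff).astype(float)   # sharp rule: treated iff above the cutoff
outcome = 1.0 + 0.5 * running + true_effect * treated + rng.normal(0, 1, n)

# Regress the outcome on a constant, the running variable, and the treatment dummy.
X = np.column_stack([np.ones(n), running, treated])
beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)
print(f"estimated jump at the cutoff: {beta[2]:.2f}  (true effect = {true_effect})")
```

In applied work one would restrict attention to observations near the cutoff and compute appropriate standard errors, both topics the book treats at length.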
Foundations of Statistical Natural Language Processing
Christopher D. Manning - 1999
This foundational text is the first comprehensive introduction to statistical natural language processing (NLP) to appear. The book contains all the theory and algorithms needed for building NLP tools. It provides broad but rigorous coverage of mathematical and linguistic foundations, as well as detailed discussion of statistical methods, allowing students and researchers to construct their own implementations. The book covers collocation finding, word sense disambiguation, probabilistic parsing, information retrieval, and other applications.
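As a small illustration of one application the book covers, collocation finding, the sketch below (mine, not from the book) scores adjacent word pairs by pointwise mutual information, so pairs that co-occur far more often than chance, such as "new york" in this toy corpus, rank highest.

```python
import math
from collections import Counter

text = ("new york is a big city . i moved to new york last year . "
        "the city is big and new to me .").split()

unigrams = Counter(text)
bigrams = Counter(zip(text, text[1:]))
total_uni, total_bi = sum(unigrams.values()), sum(bigrams.values())

def pmi(pair):
    """log2( P(w1, w2) / (P(w1) * P(w2)) ) for an adjacent word pair."""
    w1, w2 = pair
    p_pair = bigrams[pair] / total_bi
    return math.log2(p_pair / ((unigrams[w1] / total_uni) * (unigrams[w2] / total_uni)))

for pair, count in bigrams.most_common():
    if count > 1:  # ignore pairs seen only once, for which PMI is unreliable
        print(pair, round(pmi(pair), 2))
```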
The Tyranny of Metrics
Jerry Z. Muller - 2017
But in our zeal to instill the evaluation process with scientific rigor, we've gone from measuring performance to fixating on measuring itself. The result is a tyranny of metrics that threatens the quality of our lives and most important institutions. In this timely and powerful book, Jerry Muller uncovers the damage our obsession with metrics is causing--and shows how we can begin to fix the problem. Filled with examples from education, medicine, business and finance, government, the police and military, and philanthropy and foreign aid, this brief and accessible book explains why the seemingly irresistible pressure to quantify performance distorts and distracts, whether by encouraging "gaming the stats" or "teaching to the test." That's because what can and does get measured is not always worth measuring, may not be what we really want to know, and may draw effort away from the things we care about. Along the way, we learn why paying for measured performance doesn't work, why surgical scorecards may increase deaths, and much more. But metrics can be good when used as a complement to--rather than a replacement for--judgment based on personal experience, and Muller also gives examples of when metrics have been beneficial. Complete with a checklist of when and how to use metrics, The Tyranny of Metrics is an essential corrective to a rarely questioned trend that increasingly affects us all.
Disruptive Possibilities: How Big Data Changes Everything
Jeffrey Needham - 2013
As author Jeffrey Needham points out in this eye-opening book, big data can provide unprecedented insight into user habits, giving enterprises a huge market advantage. It will also inspire organizations to change the way they function. "Disruptive Possibilities: How Big Data Changes Everything" takes you on a journey of discovery into the emerging world of big data, from its relatively simple technology to the ways it differs from cloud computing. But the big story of big data is the disruption of enterprise status quo, especially vendor-driven technology silos and budget-driven departmental silos. In the highly collaborative environment needed to make big data work, silos simply don't fit. Internet-scale computing offers incredible opportunity and a tremendous challenge--and it will soon become standard operating procedure in the enterprise. This book shows you what to expect.
Bayesian Data Analysis
Andrew Gelman - 1995
Its world-class authors provide guidance on all aspects of Bayesian data analysis and include examples of real statistical analyses, based on their own research, that demonstrate how to solve complicated problems. Changes in the new edition include:
- Stronger focus on MCMC
- Revision of the computational advice in Part III
- New chapters on nonlinear models and decision analysis
- Several additional applied examples from the authors' recent research
- Additional chapters on current models for Bayesian data analysis such as nonlinear models, generalized linear mixed models, and more
- Reorganization of chapters 6 and 7 on model checking and data collection
Bayesian computation is currently at a stage where there are many reasonable ways to compute any given posterior distribution. However, the best approach is not always clear ahead of time. Reflecting this, the new edition offers a more pluralistic presentation, giving advice on performing computations from many perspectives while making clear the importance of being aware that there are different ways to implement any given iterative simulation computation. The new approach, additional examples, and updated information make Bayesian Data Analysis an excellent introductory text and a reference that working scientists will use throughout their professional life.
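To illustrate the MCMC emphasis mentioned above, here is a toy random-walk Metropolis sampler (my own sketch, not code from the book; it assumes NumPy): it draws from the posterior of a normal mean under a flat prior with known unit variance, a posterior we only need to know up to a constant.

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(loc=3.0, scale=1.0, size=50)

def log_posterior(mu):
    # Flat prior, Gaussian likelihood with sigma = 1 (up to an additive constant).
    return -0.5 * np.sum((data - mu) ** 2)

samples, mu = [], 0.0
for _ in range(5000):
    proposal = mu + rng.normal(scale=0.5)          # random-walk proposal
    if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(mu):
        mu = proposal                               # accept with the Metropolis probability
    samples.append(mu)

burned = np.array(samples[1000:])                   # discard burn-in draws
print(f"posterior mean estimate: {burned.mean():.2f}, sample mean: {data.mean():.2f}")
```

With a flat prior the posterior mean should sit close to the sample mean, which gives a quick sanity check on the sampler.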