Book picks similar to Mechanizing Proof: Computing, Risk, and Trust by Donald Angus MacKenzie
computer-science
history
tech
cultural-theory
Programming Collective Intelligence: Building Smart Web 2.0 Applications
Toby Segaran - 2007
With the sophisticated algorithms in this book, you can write smart programs to access interesting datasets from other web sites, collect data from users of your own applications, and analyze and understand the data once you've found it. Programming Collective Intelligence takes you into the world of machine learning and statistics, and explains how to draw conclusions about user experience, marketing, personal tastes, and human behavior in general -- all from information that you and others collect every day. Each algorithm is described clearly and concisely with code that can immediately be used on your web site, blog, Wiki, or specialized application. This book explains:
Collaborative filtering techniques that enable online retailers to recommend products or media
Methods of clustering to detect groups of similar items in a large dataset
Search engine features -- crawlers, indexers, query engines, and the PageRank algorithm
Optimization algorithms that search millions of possible solutions to a problem and choose the best one
Bayesian filtering, used in spam filters for classifying documents based on word types and other features
Using decision trees not only to make predictions, but to model the way decisions are made
Predicting numerical values rather than classifications to build price models
Support vector machines to match people in online dating sites
Non-negative matrix factorization to find the independent features in a dataset
Evolving intelligence for problem solving -- how a computer develops its skill by improving its own code the more it plays a game
Each chapter includes exercises for extending the algorithms to make them more powerful. Go beyond simple database-backed applications and put the wealth of Internet data to work for you.
"Bravo! I cannot think of a better way for a developer to first learn these algorithms and methods, nor can I think of a better way for me (an old AI dog) to reinvigorate my knowledge of the details." -- Dan Russell, Google
"Toby's book does a great job of breaking down the complex subject matter of machine-learning algorithms into practical, easy-to-understand examples that can be directly applied to analysis of social interaction across the Web today. If I had this book two years ago, it would have saved precious time going down some fruitless paths." -- Tim Wolters, CTO, Collective Intellect
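To make the first of those topics concrete, here is a minimal sketch of user-based collaborative filtering in Python (the toy data and function names are our illustration, not the book's code):

```python
import math

# Toy ratings matrix: user -> {item: rating}.
ratings = {
    "alice": {"matrix": 5, "dune": 4},
    "bob":   {"matrix": 4, "dune": 5, "up": 2},
    "carol": {"matrix": 1, "dune": 2, "up": 5},
}

def similarity(a, b):
    """Cosine similarity over the items two users have both rated."""
    shared = set(ratings[a]) & set(ratings[b])
    if not shared:
        return 0.0
    dot = sum(ratings[a][i] * ratings[b][i] for i in shared)
    na = math.sqrt(sum(ratings[a][i] ** 2 for i in shared))
    nb = math.sqrt(sum(ratings[b][i] ** 2 for i in shared))
    return dot / (na * nb)

def recommend(user):
    """Score unseen items by similarity-weighted ratings from other users."""
    scores, weights = {}, {}
    for other in ratings:
        if other == user:
            continue
        sim = similarity(user, other)
        if sim <= 0:
            continue
        for item, rating in ratings[other].items():
            if item not in ratings[user]:
                scores[item] = scores.get(item, 0.0) + sim * rating
                weights[item] = weights.get(item, 0.0) + sim
    return sorted(((s / weights[i], i) for i, s in scores.items()), reverse=True)

print(recommend("alice"))  # predicts a rating for "up" from bob's and carol's votes
```

The pattern (score what a user hasn't seen by similarity-weighted votes from users who have) is the core of the collaborative-filtering idea the blurb lists first.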
Building Evolutionary Architectures: Support Constant Change
Neal Ford - 2017
Over the past few years, incremental developments in core engineering practices for software development have created the foundations for rethinking how architecture changes over time, along with ways to protect important architectural characteristics as it evolves. This practical guide ties those parts together with a new way to think about architecture and time.
The Man Who Knew Too Much: Alan Turing and the Invention of the Computer
David Leavitt - 2006
To solve one of the great mathematical problems of his day, Alan Turing proposed an imaginary computer. Then, attempting to break a Nazi code during World War II, he successfully designed and built one, thus ensuring the Allied victory. Turing became a champion of artificial intelligence, but his work was cut short. As an openly gay man at a time when homosexuality was illegal in England, he was convicted and forced to undergo a humiliating "treatment" that may have led to his suicide. With a novelist's sensitivity, David Leavitt portrays Turing in all his humanity—his eccentricities, his brilliance, his fatal candor—and elegantly explains his work and its implications.
Concrete Mathematics: A Foundation for Computer Science
Ronald L. Graham - 1988
"More concretely," the authors explain, "it is the controlled manipulation of mathematical formulas, using a collection of techniques for solving problems."
Future Histories: What Ada Lovelace, Tom Paine, and the Paris Commune Can Teach Us About Digital Technology
Lizzie O'Shea - 2019
In Future Histories, public interest lawyer and digital specialist Lizzie O'Shea argues that we need to stop looking forward and start looking backwards. Weaving together histories of computing and progressive social movements with modern theories of the mind, society, and self, O'Shea constructs a "usable past" that can help us determine our digital future.
What, she asks, can the Paris Commune tell us about earlier experiments in sharing resources--like the Internet--in common? How can Frantz Fanon's theories of anti-colonial self-determination help us build a digital world in which everyone can participate equally? Can debates over equal digital access be helped by American revolutionary Tom Paine's theories of democratic, economic redistribution? What can indigenous land struggles teach us about stewarding our digital climate? And how is Elon Musk not a future visionary but a steampunk throwback to Victorian-era technological utopians?
In engaging, sparkling prose, O'Shea shows us how very human our understanding of technology is, and how, when we draw on the resources of the past, we can see the potential for struggle, for liberation, for art and poetry in our technological present. Future Histories is for all of us--makers, coders, hacktivists, Facebook users, self-styled Luddites--who find ourselves in a brave new world.
The Lifebox, the Seashell, and the Soul: What Gnarly Computation Taught Me About Ultimate Reality, the Meaning of Life, and How to Be Happy
Rudy Rucker - 2005
Rucker begins from the idea that the world around us can be seen as the running of simple rules or algorithms. This concept is at the root of the computational worldview, which basically says that very complex systems — the world we live in — have their beginnings in simple mathematical equations. We've lately come to understand that such an algorithm is only the start of a never-ending story — the real action occurs in the unfolding consequences of the rules. The chip-in-a-box computers so popular in our time have acted as a kind of microscope, letting us see into the secret machinery of the world. In Lifebox, Rucker uses whimsical drawings, fables, and humor to demonstrate that everything is a computation — that thoughts, computations, and physical processes are all the same. Rucker discusses the linguistic and computational advances that make this kind of "digital philosophy" possible, and explains how, like every great new principle, the computational worldview contains the seeds of a next step.
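Rucker's point can be made concrete in a few lines of code: a one-byte rule, iterated, produces endlessly intricate structure. A minimal sketch (the choice of an elementary cellular automaton, Rule 30, is our illustration, not an example from the book):

```python
RULE = 30  # the whole "law of physics" for this toy world, packed into one byte

def step(cells):
    """Update every cell from its three-cell neighborhood (wrapping at the edges)."""
    n = len(cells)
    return [
        (RULE >> (4 * cells[(i - 1) % n] + 2 * cells[i] + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

cells = [0] * 31 + [1] + [0] * 31  # start from a single live cell
for _ in range(32):
    print("".join(".#"[c] for c in cells))
    cells = step(cells)
```

The rule fits on one line; the interesting part, as Rucker says, is the unfolding.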
Data Science from Scratch: First Principles with Python
Joel Grus - 2015
In this book, you’ll learn how many of the most fundamental data science tools and algorithms work by implementing them from scratch.
If you have an aptitude for mathematics and some programming skills, author Joel Grus will help you get comfortable with the math and statistics at the core of data science, and with the hacking skills you need to get started as a data scientist. Today’s messy glut of data holds answers to questions no one’s even thought to ask. This book provides you with the know-how to dig those answers out.
Get a crash course in Python
Learn the basics of linear algebra, statistics, and probability—and understand how and when they're used in data science
Collect, explore, clean, munge, and manipulate data
Dive into the fundamentals of machine learning
Implement models such as k-nearest neighbors, Naive Bayes, linear and logistic regression, decision trees, neural networks, and clustering (a from-scratch sketch of one of these follows this list)
Explore recommender systems, natural language processing, network analysis, MapReduce, and databases
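In that from-scratch spirit, here is a sketch of one of the listed models, k-nearest neighbors, in plain Python (our illustration; the book's own implementation differs in detail):

```python
import math
from collections import Counter

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def knn_classify(k, labeled_points, new_point):
    """Predict a label by majority vote among the k nearest neighbors
    (ties broken arbitrarily)."""
    nearest = sorted(labeled_points, key=lambda pl: distance(pl[0], new_point))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Toy usage: two clusters in the plane.
data = [((0, 0), "a"), ((1, 0), "a"), ((0, 1), "a"),
        ((5, 5), "b"), ((6, 5), "b"), ((5, 6), "b")]
print(knn_classify(3, data, (4.5, 5.0)))  # -> "b"
```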
Targeted: My Inside Story of Cambridge Analytica and How Trump and Facebook Broke Democracy
Brittany Kaiser - 2019
A veteran of Barack Obama's 2008 campaign, Kaiser aimed to utilize data for humanitarian purposes, most notably to prevent genocide and human rights abuses. But her experience inside Cambridge Analytica opened her eyes to the tremendous risks that this unregulated industry poses to privacy and democracy.
Targeted is Kaiser's eyewitness chronicle of the dramatic and disturbing story of the rise and fall of Cambridge Analytica. She reveals to the public how Facebook's lax policies and the lack of sufficient national laws allowed voters to be manipulated in both Britain and the United States, where personal data was weaponized to spread fake news and racist messaging during the Brexit vote and the 2016 election. But the damage isn't done, Kaiser warns; the 2020 election can be compromised as well if we continue to do nothing.
In the aftermath of the U.S. election, as she became aware of the horrifying reality of what Cambridge Analytica had done in support of Donald Trump, Kaiser made the difficult choice to expose the truth. Risking her career, relationships, and personal safety, she told authorities about the data industry's unethical business practices, eventually testifying before Parliament about the company's Brexit efforts and helping Special Counsel Robert Mueller's investigation into Russian interference in the 2016 election, alongside at least 10 other international investigations.
Packed with never-before-publicly-told stories and insights, Targeted goes inside the secretive meetings with Trump campaign personnel and details the promises Cambridge Analytica made to win. Throughout, Kaiser makes the case for regulation, arguing that legal oversight of the data industry is not only justifiable but essential to ensuring the long-term safety of our democracy.
The Visual Display of Quantitative Information
Edward R. Tufte - 1983
Theory and practice in the design of data graphics, 250 illustrations of the best (and a few of the worst) statistical graphics, with detailed analysis of how to display data for precise, effective, quick analysis. Design of the high-resolution displays, small multiples. Editing and improving graphics. The data-ink ratio. Time-series, relational graphics, data maps, multivariate designs. Detection of graphical deception: design variation vs. data variation. Sources of deception. Aesthetics and data graphical displays. This is the second edition of The Visual Display of Quantitative Information. Recently published, this new edition provides excellent color reproductions of the many graphics of William Playfair, adds color to other images, and includes all the changes and corrections accumulated during 17 printings of the first edition.
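The list above names the data-ink ratio; as a quick gloss (our paraphrase of Tufte's definition, not text from the blurb):

```latex
\[
  \text{data-ink ratio}
    = \frac{\text{data-ink}}{\text{total ink used to print the graphic}}
\]
```

One of the book's maxims is to maximize this ratio, within reason, by erasing ink that carries no data.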
Compilers: Principles, Techniques, and Tools
Alfred V. Aho - 1986
The authors present updated coverage of compilers based on research and techniques that have been developed in the field over the past few years. The book provides a thorough introduction to compiler design and covers topics such as context-free grammars, finite state machines, and syntax-directed translation.
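As a tiny illustration of two of those topics working together (a context-free grammar driving a syntax-directed translation), here is a toy recursive-descent evaluator; the grammar and sketch are ours, not the book's code:

```python
import re

# Context-free grammar for arithmetic:
#   expr   -> term (('+' | '-') term)*
#   term   -> factor (('*' | '/') factor)*
#   factor -> NUMBER | '(' expr ')'
# The parser does a syntax-directed translation: it evaluates each
# construct as soon as it recognizes it.

def tokenize(source):
    return re.findall(r"\d+|[()+\-*/]", source)

def parse(tokens):
    pos = 0

    def peek():
        return tokens[pos] if pos < len(tokens) else None

    def eat():
        nonlocal pos
        token = tokens[pos]
        pos += 1
        return token

    def factor():
        if peek() == "(":
            eat()          # consume '('
            value = expr()
            eat()          # consume ')'
            return value
        return int(eat())  # NUMBER

    def term():
        value = factor()
        while peek() in ("*", "/"):
            op, rhs = eat(), factor()
            value = value * rhs if op == "*" else value / rhs
        return value

    def expr():
        value = term()
        while peek() in ("+", "-"):
            op, rhs = eat(), term()
            value = value + rhs if op == "+" else value - rhs
        return value

    return expr()

print(parse(tokenize("2 * (3 + 4) - 5")))  # -> 9
```

A production compiler would add error handling, a real lexer, and code generation instead of immediate evaluation, but the grammar-to-procedure mapping is the same idea the book teaches.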
The Elements of Statistical Learning: Data Mining, Inference, and Prediction
Trevor Hastie - 2001
During the past decade there has been an explosion in computation and information technology. With it have come vast amounts of data in a variety of fields such as medicine, biology, finance, and marketing. The challenge of understanding these data has led to the development of new tools in the field of statistics, and spawned new areas such as data mining, machine learning, and bioinformatics. Many of these tools have common underpinnings but are often expressed with different terminology. This book describes the important ideas in these areas in a common conceptual framework. While the approach is statistical, the emphasis is on concepts rather than mathematics. Many examples are given, with a liberal use of color graphics. It should be a valuable resource for statisticians and anyone interested in data mining in science or industry. The book's coverage is broad, from supervised learning (prediction) to unsupervised learning. The many topics include neural networks, support vector machines, classification trees, and boosting—the first comprehensive treatment of this topic in any book.
Trevor Hastie, Robert Tibshirani, and Jerome Friedman are professors of statistics at Stanford University. They are prominent researchers in this area: Hastie and Tibshirani developed generalized additive models and wrote a popular book of that title. Hastie wrote much of the statistical modeling software in S-PLUS and invented principal curves and surfaces. Tibshirani proposed the Lasso and is co-author of the very successful An Introduction to the Bootstrap. Friedman is the co-inventor of many data-mining tools including CART, MARS, and projection pursuit.
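Since the blurb singles out the Lasso, a one-line gloss may help (our summary of the standard formulation, not text from the book): the Lasso is least squares with an l1 penalty that shrinks some coefficients exactly to zero.

```latex
\[
  \hat{\beta}^{\mathrm{lasso}}
    = \arg\min_{\beta}\;
      \sum_{i=1}^{N} \Big( y_i - \beta_0 - \sum_{j=1}^{p} x_{ij}\,\beta_j \Big)^{2}
      + \lambda \sum_{j=1}^{p} \lvert \beta_j \rvert
\]
```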
Why Greatness Cannot Be Planned: The Myth of the Objective
Kenneth O. Stanley - 2015
In Why Greatness Cannot Be Planned, Stanley and Lehman begin with a surprising scientific discovery in artificial intelligence that leads ultimately to the conclusion that the objective obsession has gone too far. They make the case that great achievement can't be bottled up into mechanical metrics; that innovation is not driven by narrowly focused heroic effort; and that we would be wiser (and the outcomes better) if instead we whole-heartedly embraced serendipitous discovery and playful creativity. Controversial at its heart, yet refreshingly provocative, this book challenges readers to consider life without a destination and discovery without a compass.
Weaving the Web: The Original Design and Ultimate Destiny of the World Wide Web
Tim Berners-Lee - 1999
Named one of the greatest minds of the 20th century by Time, Tim Berners-Lee is responsible for one of that century's most important advancements: the world wide web. Now, this low-profile genius - who never personally profited from his invention - offers a compelling portrait of his invention. He reveals the Web's origins and the creation of the now ubiquitous http and www acronyms and shares his views on such critical issues as censorship, privacy, the increasing power of software companies, and the need to find the ideal balance between commercial and social forces. He offers insights into the true nature of the Web, showing readers how to use it to its fullest advantage. And he presents his own plan for the Web's future, calling for the active support and participation of programmers, computer manufacturers, and social organizations to manage and maintain this valuable resource so that it can remain a powerful force for social change and an outlet for individual creativity.
Deep Learning
Ian Goodfellow - 2016
Deep learning is a form of machine learning that enables computers to learn from experience and understand the world in terms of a hierarchy of concepts. Because the computer gathers knowledge from experience, there is no need for a human computer operator to formally specify all the knowledge that the computer needs. The hierarchy of concepts allows the computer to learn complicated concepts by building them out of simpler ones; a graph of these hierarchies would be many layers deep. This book introduces a broad range of topics in deep learning.
The text offers mathematical and conceptual background, covering relevant concepts in linear algebra, probability theory and information theory, numerical computation, and machine learning. It describes deep learning techniques used by practitioners in industry, including deep feedforward networks, regularization, optimization algorithms, convolutional networks, sequence modeling, and practical methodology; and it surveys such applications as natural language processing, speech recognition, computer vision, online recommendation systems, bioinformatics, and videogames. Finally, the book offers research perspectives, covering such theoretical topics as linear factor models, autoencoders, representation learning, structured probabilistic models, Monte Carlo methods, the partition function, approximate inference, and deep generative models.
Deep Learning can be used by undergraduate or graduate students planning careers in either industry or research, and by software engineers who want to begin using deep learning in their products or platforms. A website offers supplementary material for both readers and instructors.
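A minimal sketch of that hierarchy-of-concepts idea, as a two-layer feedforward network (the layer sizes and names here are our illustrative assumptions, not from the book):

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    """Elementwise nonlinearity applied between layers."""
    return np.maximum(0.0, x)

# A tiny deep feedforward network: 4 inputs -> 8 hidden units -> 2 outputs.
W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 2)), np.zeros(2)

def forward(x):
    hidden = relu(x @ W1 + b1)  # layer 1: simple features of the raw input
    return hidden @ W2 + b2     # layer 2: concepts built from layer-1 features

print(forward(rng.normal(size=4)))
```

Stacking more such layers is what makes the concept graph "many layers deep"; training the weights from data is what removes the need to hand-specify knowledge.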
Mindf*ck: Cambridge Analytica and the Plot to Break America
Christopher Wylie - 2019
Steve Bannon had long sensed that deep within America's soul lurked an explosive tension. Cambridge Analytica had the data to prove it, and in 2016 Bannon had a presidential campaign to use as his proving ground.
Christopher Wylie might have seemed an unlikely figure to be at the center of such an operation. Canadian and liberal in his politics, he was only twenty-four when he got a job with a London firm that worked with the U.K. Ministry of Defense and was charged, putatively, with helping to build a team of data scientists to create new tools to identify and combat radical extremism online. In short order, those same military tools were turned to political purposes, and Cambridge Analytica was born.
Wylie's decision to become a whistleblower prompted the largest data-crime investigation in history. His story is both an exposé and a dire warning about a sudden problem born of very new and powerful capabilities. It has not only exposed the profound vulnerabilities and profound carelessness in the enormous companies that drive the attention economy; it has also exposed the profound vulnerabilities of democracy itself. What happened in 2016 was just a trial run. Ruthless actors are coming for your data, and they want to control what you think.