Book picks similar to
Networks: An Introduction by M.E.J. Newman
science, non-fiction, networks, math
Numerical Optimization
Jorge Nocedal - 2000
Numerical optimization traces its roots to the calculus of variations and the work of Euler and Lagrange. This book covers numerical methods for finite-dimensional optimization problems, taking a natural and reasonable approach to mathematical programming: it begins with very simple ideas and progresses through more complicated concepts, concentrating on methods for both unconstrained and constrained optimization.
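As an illustrative aside (not taken from the book), one of the simplest unconstrained methods treated in texts like this, steepest descent, can be sketched in a few lines of Python; the quadratic objective, step size, and tolerance below are arbitrary choices made up for the example:

import numpy as np

# Steepest-descent sketch on a made-up quadratic f(x) = 0.5 x^T A x - b^T x.
# A (symmetric positive definite), b, the fixed step size, and the tolerance
# are illustrative values only, not anything from the book.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])

def grad(x):
    # Gradient of the quadratic: A x - b
    return A @ x - b

x = np.zeros(2)
step = 0.1
for _ in range(1000):
    g = grad(x)
    if np.linalg.norm(g) < 1e-8:   # stop once the gradient is (nearly) zero
        break
    x = x - step * g               # move downhill along the negative gradient

print(x)                       # approximate minimizer
print(np.linalg.solve(A, b))   # exact solution, for comparison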
Complex Adaptive Systems: An Introduction to Computational Models of Social Life
John H. Miller - 2007
Complex adaptive social systems--whether political parties, stock markets, or ant colonies--present some of the most intriguing theoretical and practical challenges confronting the social sciences. Engagingly written, and balancing technical detail with intuitive explanations, Complex Adaptive Systems focuses on the key tools and ideas that have emerged in the field since the mid-1990s, as well as the techniques needed to investigate such systems. It provides a detailed introduction to concepts such as emergence, self-organized criticality, automata, networks, diversity, adaptation, and feedback. It also demonstrates how complex adaptive systems can be explored using methods ranging from mathematics to computational models of adaptive agents. John Miller and Scott Page show how to combine ideas from economics, political science, biology, physics, and computer science to illuminate topics in organization, adaptation, decentralization, and robustness. They also demonstrate how the usual extremes used in modeling can be fruitfully transcended.
An Introduction to Statistical Learning: With Applications in R
Gareth James - 2013
This book presents some of the most important modeling and prediction techniques, along with relevant applications. Topics include linear regression, classification, resampling methods, shrinkage approaches, tree-based methods, support vector machines, clustering, and more. Color graphics and real-world examples are used to illustrate the methods presented. Since the goal of this textbook is to facilitate the use of these statistical learning techniques by practitioners in science, industry, and other fields, each chapter contains a tutorial on implementing the analyses and methods presented in R, an extremely popular open source statistical software platform. Two of the authors co-wrote The Elements of Statistical Learning (Hastie, Tibshirani and Friedman, 2nd edition 2009), a popular reference book for statistics and machine learning researchers. An Introduction to Statistical Learning covers many of the same topics, but at a level accessible to a much broader audience. This book is targeted at statisticians and non-statisticians alike who wish to use cutting-edge statistical learning techniques to analyze their data. The text assumes only a previous course in linear regression and no knowledge of matrix algebra.
The Essential Turing: Seminal Writings in Computing, Logic, Philosophy, Artificial Intelligence, and Artificial Life Plus the Secrets of Enigma
Alan Turing - 2004
In 1935, aged 22, he developed the mathematical theory upon which all subsequent stored-program digital computers are modeled. At the outbreak of hostilities with Germany in September 1939, he joined the Government Codebreaking team at Bletchley Park, Buckinghamshire, and played a crucial role in deciphering Enigma, the code used by the German armed forces to protect their radio communications. Turing's work on the version of Enigma used by the German navy was vital to the battle for supremacy in the North Atlantic. He also contributed to the attack on the cyphers known as Fish, which were used by the German High Command for the encryption of signals during the latter part of the war. His contribution helped to shorten the war in Europe by an estimated two years. After the war, his theoretical work led to the development of Britain's first computers at the National Physical Laboratory and the Royal Society Computing Machine Laboratory at Manchester University. Turing was also a founding father of modern cognitive science, theorizing that the cortex at birth is an unorganized machine which through training becomes organized into a universal machine or something like it. He went on to develop the use of computers to model biological growth, launching the discipline now referred to as Artificial Life. The papers in this book are the key works for understanding Turing's phenomenal contribution across all these fields. The collection includes Turing's declassified wartime Treatise on the Enigma; letters from Turing to Churchill and to codebreakers; lectures, papers, and broadcasts which opened up the concept of AI and its implications; and the paper which formed the genesis of the investigation of Artificial Life.
Pattern Recognition and Machine Learning
Christopher M. Bishop - 2006
Pattern recognition and machine learning can be viewed as two facets of the same field, and together they have undergone substantial development over the past ten years. In particular, Bayesian methods have grown from a specialist niche to become mainstream, while graphical models have emerged as a general framework for describing and applying probabilistic models. Also, the practical applicability of Bayesian methods has been greatly enhanced through the development of a range of approximate inference algorithms such as variational Bayes and expectation propagation. Similarly, new models based on kernels have had a significant impact on both algorithms and applications. This new textbook reflects these recent developments while providing a comprehensive introduction to the fields of pattern recognition and machine learning. It is aimed at advanced undergraduates or first-year PhD students, as well as researchers and practitioners, and assumes no previous knowledge of pattern recognition or machine learning concepts. Knowledge of multivariate calculus and basic linear algebra is required, and some familiarity with probability would be helpful though not essential, as the book includes a self-contained introduction to basic probability theory.
How to Solve It: A New Aspect of Mathematical Method
George Pólya - 1944
Pólya's How to Solve It will show anyone in any field how to think straight. In lucid and appealing prose, Pólya reveals how the mathematical method of demonstrating a proof or finding an unknown can be of help in attacking any problem that can be reasoned out--from building a bridge to winning a game of anagrams. Generations of readers have relished Pólya's deft--indeed, brilliant--instructions on stripping away irrelevancies and going straight to the heart of the problem.
Natural Language Processing with Python
Steven Bird - 2009
With it, you'll learn how to write Python programs that work with large collections of unstructured text. You'll access richly annotated datasets using a comprehensive range of linguistic data structures, and you'll understand the main algorithms for analyzing the content and structure of written communication. Packed with examples and exercises, Natural Language Processing with Python will help you extract information from unstructured text, either to guess the topic or identify "named entities"; analyze linguistic structure in text, including parsing and semantic analysis; access popular linguistic databases, including WordNet and treebanks; and integrate techniques drawn from fields as diverse as linguistics and artificial intelligence. This book will help you gain practical skills in natural language processing using the Python programming language and the Natural Language Toolkit (NLTK) open source library. If you're interested in developing web applications, analyzing multilingual news sources, or documenting endangered languages -- or if you're simply curious to have a programmer's perspective on how human language works -- you'll find Natural Language Processing with Python both fascinating and immensely useful.
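As a quick taste of the kind of program the book teaches (a generic NLTK sketch, not an example taken from the book), tokenizing and part-of-speech tagging a sentence takes only a few lines; the sample sentence is made up, and the download calls fetch the models the functions need (resource names may differ slightly between NLTK releases):

import nltk

# One-time downloads of the tokenizer and tagger models
# (names used by recent NLTK releases; newer versions may use different resource names).
nltk.download('punkt')
nltk.download('averaged_perceptron_tagger')

text = "NLTK makes it easy to analyze unstructured text."  # made-up sample sentence
tokens = nltk.word_tokenize(text)   # split the sentence into word tokens
tagged = nltk.pos_tag(tokens)       # assign a part-of-speech tag to each token
print(tagged)                       # e.g. [('NLTK', 'NNP'), ('makes', 'VBZ'), ...]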
Social Network Analysis: Methods and Applications
Stanley Wasserman - 1994
Social Network Analysis: Methods and Applications reviews and discusses methods for the analysis of social networks with a focus on applications of these methods to many substantive examples. As the first book to provide a comprehensive coverage of the methodology and applications of the field, this study is both a reference book and a textbook.
Hands-On Machine Learning with Scikit-Learn and TensorFlow
Aurélien Géron - 2017
Now that machine learning is thriving, even programmers who know close to nothing about this technology can use simple, efficient tools to implement programs capable of learning from data. This practical book shows you how. By using concrete examples, minimal theory, and two production-ready Python frameworks—Scikit-Learn and TensorFlow—author Aurélien Géron helps you gain an intuitive understanding of the concepts and tools for building intelligent systems. You’ll learn how to use a range of techniques, starting with simple Linear Regression and progressing to Deep Neural Networks. If you have some programming experience and you’re ready to code a machine learning project, this guide is for you. This hands-on book shows you how to use Scikit-Learn, an accessible framework that implements many algorithms efficiently and serves as a great machine learning entry point; TensorFlow, a more complex library for distributed numerical computation, ideal for training and running very large neural networks; and practical code examples that you can apply without learning excessive machine learning theory or algorithm details.
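To give a flavor of the Scikit-Learn starting point the blurb mentions (a generic sketch, not code from the book), fitting the simple Linear Regression model it begins with takes only a few lines; the data here is synthetic and invented for the example:

import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic data: y ≈ 3x + 4 plus noise (slope, intercept, and noise level are made up).
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(100, 1))
y = 3 * X[:, 0] + 4 + rng.normal(0, 1, size=100)

model = LinearRegression()
model.fit(X, y)                        # learn slope and intercept from the data
print(model.coef_, model.intercept_)   # should come out close to 3 and 4
print(model.predict([[5.0]]))          # prediction for a new input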
Elements of Information Theory
Thomas M. Cover - 1991
Readers are provided once again with an instructive mix of mathematics, physics, statistics, and information theory. All the essential topics in information theory are covered in detail, including entropy, data compression, channel capacity, rate distortion, network information theory, and hypothesis testing. The authors provide readers with a solid understanding of the underlying theory and applications. Problem sets and a telegraphic summary at the end of each chapter further assist readers. The historical notes that follow each chapter recap the main points. The Second Edition features chapters reorganized to improve teaching, 200 new problems, new material on source coding, portfolio theory, and feedback capacity, and updated references. Now current and enhanced, the Second Edition of Elements of Information Theory remains the ideal textbook for upper-level undergraduate and graduate courses in electrical engineering, statistics, and telecommunications.
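As a tiny worked example of the entropy that heads the book's topic list (an illustration of the standard definition, not material from the book), the Shannon entropy H(X) = -Σ p(x) log2 p(x) of a distribution can be computed directly; the example distributions below are arbitrary:

import math

def entropy(probs):
    # Shannon entropy in bits: H = -sum p * log2(p), skipping zero-probability outcomes.
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))    # fair coin: 1.0 bit
print(entropy([0.9, 0.1]))    # biased coin: about 0.47 bits
print(entropy([0.25] * 4))    # uniform over four outcomes: 2.0 bits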
The Information: A History, a Theory, a Flood
James Gleick - 2011
The story of information begins in a time profoundly unlike our own, when every thought and utterance vanishes as soon as it is born. From the invention of scripts and alphabets to the long-misunderstood talking drums of Africa, Gleick tells the story of information technologies that changed the very nature of human consciousness. He provides portraits of the key figures contributing to the inexorable development of our modern understanding of information: Charles Babbage, the idiosyncratic inventor of the first great mechanical computer; Ada Byron, the brilliant and doomed daughter of the poet, who became the first true programmer; pivotal figures like Samuel Morse and Alan Turing; and Claude Shannon, the creator of information theory itself. And then the information age arrives. Citizens of this world become experts willy-nilly: aficionados of bits and bytes. And we sometimes feel we are drowning, swept by a deluge of signs and signals, news and images, blogs and tweets. The Information is the story of how we got here and where we are heading.
Information Theory: A Tutorial Introduction
James V. Stone - 2015
In this richly illustrated book, accessible examples are used to show how information theory can be understood in terms of everyday games like '20 Questions', and the simple MatLab programs provided give hands-on experience of information theory in action. Written in a tutorial style, with a comprehensive glossary, this text represents an ideal primer for novices who wish to become familiar with the basic principles of information theory. Download chapter 1 from http://jim-stone.staff.shef.ac.uk/Boo...
Quantum Computation and Quantum Information
Michael A. Nielsen - 2000
The authors describe what a quantum computer is, how it can be used to solve problems faster than familiar "classical" computers, and the real-world implementation of quantum computers. A wealth of accompanying figures and exercises illustrate and develop the material in more depth. Their book concludes with an explanation of how quantum states can be used to perform remarkable feats of communication, and of how it is possible to protect quantum states against the effects of noise.
Linear Algebra Done Right
Sheldon Axler - 1995
The novel approach taken here banishes determinants to the end of the book and focuses on the central goal of linear algebra: understanding the structure of linear operators on vector spaces. The author has taken unusual care to motivate concepts and to simplify proofs. For example, the book presents - without having defined determinants - a clean proof that every linear operator on a finite-dimensional complex vector space (or an odd-dimensional real vector space) has an eigenvalue. A variety of interesting exercises in each chapter helps students understand and manipulate the objects of linear algebra. This second edition includes a new section on orthogonal projections and minimization problems. The sections on self-adjoint operators, normal operators, and the spectral theorem have been rewritten. New examples and new exercises have been added, several proofs have been simplified, and hundreds of minor improvements have been made throughout the text.
Introduction to Algorithms
Thomas H. Cormen - 1989
Each chapter is relatively self-contained and can be used as a unit of study. The algorithms are described in English and in a pseudocode designed to be readable by anyone who has done a little programming. The explanations have been kept elementary without sacrificing depth of coverage or mathematical rigor.