Book picks similar to
Stochastic Processes and Filtering Theory by Andrew H. Jazwinski
The Man Who Knew Too Much: Alan Turing and the Invention of the Computer
David Leavitt - 2006
To solve one of the great mathematical problems of his day, Alan Turing proposed an imaginary computer. Then, attempting to break a Nazi code during World War II, he successfully designed and built one, thus ensuring the Allied victory. Turing became a champion of artificial intelligence, but his work was cut short. As an openly gay man at a time when homosexuality was illegal in England, he was convicted and forced to undergo a humiliating "treatment" that may have led to his suicide. With a novelist's sensitivity, David Leavitt portrays Turing in all his humanity—his eccentricities, his brilliance, his fatal candor—and elegantly explains his work and its implications.
D is for Digital: What a well-informed person should know about computers and communications
Brian W. Kernighan - 2011
Elements of the Theory of Computation
Harry R. Lewis - 1981
The authors are well known for a clear presentation that makes the material accessible to a broad audience and requires no special previous mathematical experience. In this new edition, the authors adopt a somewhat more informal, friendly writing style to present both classical and contemporary theories of computation. Algorithms, complexity analysis, and algorithmic ideas are introduced informally in Chapter 1 and are pursued throughout the book. Each section is followed by problems.
Conceptual Blockbusting: A Guide to Better Ideas
James L. Adams - 1969
Now, twenty-five years after its original publication, Conceptual Blockbusting has never been more relevant, powerful, or fresh. Integrating insights from the worlds of psychology, engineering, management, art, and philosophy, Adams identifies the key blocks (perceptual, emotional, cultural, environmental, intellectual, and expressive) that prevent us from realizing the full potential of our fertile minds. Employing unconventional exercises and other interactive elements, Adams shows individuals, teams, and organizations how to overcome these blocks, embrace alternative ways of thinking about complex problems, and celebrate the joy of creativity. With new examples and contemporary references, Conceptual Blockbusting is guaranteed to introduce a new generation of readers to a world of new possibilities.
Modern Technical Writing: An Introduction to Software Documentation
Andrew Etter - 2016
Written by the lead technical writer at one of Silicon Valley's most exciting companies, Modern Technical Writing is a set of guiding principles and thoughtful recommendations for new and experienced technical writers alike. Not a reference manual, and not comprehensive, it instead serves as an introduction to a sensible writing and publishing process, one that has eluded the profession for too long.
Worlds Hidden in Plain Sight: The Evolving Idea of Complexity at the Santa Fe Institute, 1984–2019
David C. Krakauer, Jennifer Dunne - 2019
Ignoring the boundaries of disciplines and schools and searching for novel fundamental ideas, theories, and practices, this international community integrates the full range of scientific inquiries that will help us to understand and survive on a complex planet. This volume collects essays from the past thirty years of research, in which contributors explain in clear and accessible language many of the deepest challenges and insights of complexity science. Explore the evolution of complex systems science with chapters from Nobel Laureates Murray Gell-Mann and Kenneth Arrow, as well as numerous pioneering complexity researchers, including John Holland, Brian Arthur, Robert May, Richard Lewontin, Jennifer Dunne, and Geoffrey West.
Types and Programming Languages
Benjamin C. Pierce - 2002
The study of type systems--and of programming languages from a type-theoretic perspective--has important applications in software engineering, language design, high-performance compilers, and security. This text provides a comprehensive introduction both to type systems in computer science and to the basic theory of programming languages. The approach is pragmatic and operational; each new concept is motivated by programming examples, and the more theoretical sections are driven by the needs of implementations. Each chapter is accompanied by numerous exercises and solutions, as well as a running implementation, available via the Web. Dependencies between chapters are explicitly identified, allowing readers to choose a variety of paths through the material. The core topics include the untyped lambda-calculus, simple type systems, type reconstruction, universal and existential polymorphism, subtyping, bounded quantification, recursive types, kinds, and type operators. Extended case studies develop a variety of approaches to modeling the features of object-oriented languages.
Data Science For Dummies
Lillian Pierson - 2014
Data Science For Dummies is the perfect starting point for IT professionals and students interested in making sense of their organization’s massive data sets and applying their findings to real-world business scenarios. From uncovering rich data sources to managing large amounts of data within hardware and software limitations, ensuring consistency in reporting, merging various data sources, and beyond, you’ll develop the know-how you need to effectively interpret data and tell a story that can be understood by anyone in your organization.
- Provides a background in data science fundamentals before moving on to working with relational databases and unstructured data and preparing your data for analysis
- Details different data visualization techniques that can be used to showcase and summarize your data
- Explains both supervised and unsupervised machine learning, including regression, model validation, and clustering techniques
- Includes coverage of big data processing tools like MapReduce, Hadoop, Dremel, Storm, and Spark
It’s a big, big data world out there – let Data Science For Dummies help you harness its power and gain a competitive edge for your organization.
A Treatise on Electricity and Magnetism, Vol. 1
James Clerk Maxwell - 1873
Topics include electrical work and energy in a system of conductors, mechanical action between two electrical systems, spherical harmonics, electric current, conduction and resistance, electrolysis, and other subjects. 1891 edition.
Operations Research: An Introduction
Hamdy A. Taha - 1976
The applications and computations in operations research are emphasized. Significantly revised, this text streamlines the coverage of the theory, applications, and computations of operations research. Numerical examples are effectively used to explain complex mathematical concepts. A separate chapter of fully analyzed applications aptly demonstrates the diverse use of OR. The popular commercial and tutorial software AMPL, Excel, Excel Solver, and TORA are used throughout the book to solve practical problems and to test theoretical concepts. New materials include Markov chains, TSP heuristics, new LP models, and a totally new simplex-based approach to LP sensitivity analysis.
Networks: A Very Short Introduction
Guido Caldarelli - 2012
It is impossible to understand the spread of an epidemic, a computer virus, large-scale blackouts, or massive extinctions without taking into account the network structure that underlies all these phenomena. In this Very Short Introduction, Guido Caldarelli and Michele Catanzaro discuss the nature and variety of networks, using everyday examples from society, technology, nature, and history to explain and understand the science of network theory. They show the ubiquitous role of networks; how networks self-organize; why the rich get richer; and how networks can spontaneously collapse. They conclude by highlighting how the findings of complex network theory have very wide and important applications in genetics, ecology, communications, economics, and sociology.
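The "rich get richer" dynamic mentioned above is commonly modeled as preferential attachment: each new node links to an existing node with probability proportional to that node's degree. The blurb does not prescribe any code, so the sketch below is only an illustration of that standard model in plain Python:

```python
import random

# Preferential attachment: each new node attaches to an existing node
# chosen with probability proportional to its degree, so well-connected
# nodes keep attracting links ("the rich get richer").
def grow_network(n_nodes):
    edges = [(0, 1)]                  # start from a single link
    targets = [0, 1]                  # each node appears once per incident edge
    for new in range(2, n_nodes):
        old = random.choice(targets)  # uniform over targets = degree-proportional
        edges.append((new, old))
        targets += [new, old]
    return edges

random.seed(1)
degrees = {}
for a, b in grow_network(1000):
    degrees[a] = degrees.get(a, 0) + 1
    degrees[b] = degrees.get(b, 0) + 1
print(sorted(degrees.values())[-5:])  # a handful of early nodes become hubs
```

Sampling uniformly from a list in which each node appears once per incident edge is exactly degree-proportional sampling, which is why a few early nodes end up dominating the degree distribution.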
Quantum Computing for Everyone
Chris Bernhardt - 2019
In this book, Chris Bernhardt offers an introduction to quantum computing that is accessible to anyone who is comfortable with high school mathematics. He explains qubits, entanglement, quantum teleportation, quantum algorithms, and other quantum-related topics as clearly as possible for the general reader. Bernhardt, a mathematician himself, simplifies the mathematics as much as he can and provides elementary examples that illustrate both how the math works and what it means. Bernhardt introduces the basic unit of quantum computing, the qubit, and explains how the qubit can be measured; discusses entanglement--which, he says, is easier to describe mathematically than verbally--and what it means when two qubits are entangled (citing Einstein's characterization of what happens when the measurement of one entangled qubit affects the second as "spooky action at a distance"); and introduces quantum cryptography. He recaps standard topics in classical computing--bits, gates, and logic--and describes Edward Fredkin's ingenious billiard ball computer. He defines quantum gates, considers the speed of quantum algorithms, and describes the building of quantum computers. By the end of the book, readers understand that quantum computing and classical computing are not two distinct disciplines, and that quantum computing is the fundamental form of computing. The basic unit of computation is the qubit, not the bit.
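To make the blurb's description of qubits and entanglement concrete: a qubit can be written as a unit vector of two complex amplitudes whose squared magnitudes give the measurement probabilities (the Born rule). The minimal numpy sketch below is an illustration of that idea, not code from the book:

```python
import numpy as np

# A qubit is a unit vector in C^2; measuring it yields 0 or 1 with
# probabilities equal to the squared amplitudes (the Born rule).
qubit = np.array([1, 1]) / np.sqrt(2)       # equal superposition of |0> and |1>
print(np.abs(qubit) ** 2)                   # -> [0.5 0.5]

# A Bell state entangles two qubits: only |00> and |11> carry amplitude,
# so measuring one qubit fixes the other's outcome -- the correlation
# Einstein called "spooky action at a distance".
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
print(np.abs(bell) ** 2)                    # -> [0.5 0.  0.  0.5]
```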
Deep Learning
Ian Goodfellow - 2016
Deep learning is a form of machine learning that enables computers to learn from experience and understand the world in terms of a hierarchy of concepts. Because the computer gathers knowledge from experience, there is no need for a human computer operator to formally specify all the knowledge that the computer needs. The hierarchy of concepts allows the computer to learn complicated concepts by building them out of simpler ones; a graph of these hierarchies would be many layers deep. This book introduces a broad range of topics in deep learning. The text offers mathematical and conceptual background, covering relevant concepts in linear algebra, probability theory and information theory, numerical computation, and machine learning. It describes deep learning techniques used by practitioners in industry, including deep feedforward networks, regularization, optimization algorithms, convolutional networks, sequence modeling, and practical methodology; and it surveys such applications as natural language processing, speech recognition, computer vision, online recommendation systems, bioinformatics, and videogames. Finally, the book offers research perspectives, covering such theoretical topics as linear factor models, autoencoders, representation learning, structured probabilistic models, Monte Carlo methods, the partition function, approximate inference, and deep generative models. Deep Learning can be used by undergraduate or graduate students planning careers in either industry or research, and by software engineers who want to begin using deep learning in their products or platforms. A website offers supplementary material for both readers and instructors.
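The "hierarchy of concepts" in the blurb corresponds to function composition: each layer builds its representation out of the previous layer's output. A minimal numpy sketch of a deep feedforward network, as an illustration of that idea rather than code from the book:

```python
import numpy as np

# A toy deep feedforward network: each layer transforms the previous
# layer's representation, so deeper layers express compositions of
# simpler features -- a graph of the computation is many layers deep.
rng = np.random.default_rng(0)
sizes = [4, 8, 8, 2]                        # input -> two hidden layers -> output
weights = [rng.standard_normal((m, n)) for m, n in zip(sizes, sizes[1:])]

def forward(x):
    for w in weights[:-1]:
        x = np.maximum(0.0, x @ w)          # ReLU nonlinearity between layers
    return x @ weights[-1]                  # linear output layer

print(forward(rng.standard_normal(4)))      # a 2-dimensional output vector
```

Each hidden layer here is just a linear map followed by a ReLU, but stacking several of them is what lets such networks represent complicated concepts built from simpler ones.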
The Elements of Statistical Learning: Data Mining, Inference, and Prediction
Trevor Hastie - 2001
During the past decade there has been an explosion in computation and information technology. With it have come vast amounts of data in a variety of fields such as medicine, biology, finance, and marketing. The challenge of understanding these data has led to the development of new tools in the field of statistics, and spawned new areas such as data mining, machine learning, and bioinformatics. Many of these tools have common underpinnings but are often expressed with different terminology. This book describes the important ideas in these areas in a common conceptual framework. While the approach is statistical, the emphasis is on concepts rather than mathematics. Many examples are given, with a liberal use of color graphics. It should be a valuable resource for statisticians and anyone interested in data mining in science or industry. The book's coverage is broad, from supervised learning (prediction) to unsupervised learning. The many topics include neural networks, support vector machines, classification trees and boosting—the first comprehensive treatment of this topic in any book. Trevor Hastie, Robert Tibshirani, and Jerome Friedman are professors of statistics at Stanford University. They are prominent researchers in this area: Hastie and Tibshirani developed generalized additive models and wrote a popular book of that title. Hastie wrote much of the statistical modeling software in S-PLUS and invented principal curves and surfaces. Tibshirani proposed the Lasso and is co-author of the very successful An Introduction to the Bootstrap. Friedman is the co-inventor of many data-mining tools including CART, MARS, and projection pursuit.
The Recursive Universe: Cosmic Complexity and the Limits of Scientific Knowledge
William Poundstone - 1984
Topics include the limits of knowledge, paradox of complexity, Maxwell's demon, Big Bang theory, much more. 1985 edition.