Best of
Information Science

2019

Generative Deep Learning: Teaching Machines to Paint, Write, Compose, and Play


David Foster - 2019
    Recent advances in the field have shown how it's possible to teach a machine to excel at human endeavors--such as drawing, composing music, and completing tasks--by generating an understanding of how its actions affect its environment. With this practical book, machine learning engineers and data scientists will learn how to recreate some of the most famous examples of generative deep learning models, such as variational autoencoders and generative adversarial networks (GANs). You'll also learn how to apply the techniques to your own datasets. David Foster, cofounder of Applied Data Science, demonstrates the inner workings of each technique, starting with the basics of deep learning before advancing to the most cutting-edge algorithms in the field. Through tips and tricks, you'll learn how to make your models learn more efficiently and become more creative.
    - Get a fundamental overview of deep learning
    - Learn about libraries such as Keras and TensorFlow
    - Discover how variational autoencoders work
    - Get practical examples of generative adversarial networks (GANs)
    - Understand how autoregressive generative models function
    - Apply generative models within a reinforcement learning setting to accomplish tasks

Deep Learning


John D. Kelleher - 2019
    When we use consumer products from Google, Microsoft, Facebook, Apple, or Baidu, we are often interacting with a deep learning system. In this volume in the MIT Press Essential Knowledge series, computer scientist John Kelleher offers an accessible and concise but comprehensive introduction to the fundamental technology at the heart of the artificial intelligence revolution. Kelleher explains that deep learning enables data-driven decisions by identifying and extracting patterns from large datasets; its ability to learn from complex data makes deep learning ideally suited to take advantage of the rapid growth in big data and computational power. Kelleher also explains some of the basic concepts in deep learning, presents a history of advances in the field, and discusses the current state of the art. He describes the most important deep learning architectures, including autoencoders, recurrent neural networks, and long short-term memory (LSTM) networks, as well as such recent developments as Generative Adversarial Networks and capsule networks. He also provides a comprehensive (and comprehensible) introduction to the two fundamental algorithms in deep learning: gradient descent and backpropagation. Finally, Kelleher considers the future of deep learning: major trends, possible developments, and significant challenges.
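The blurb singles out gradient descent as one of the two fundamental algorithms of deep learning. As a minimal illustration (not from the book), here is gradient descent minimizing a one-dimensional quadratic, where the function, starting point, and learning rate are all arbitrary choices for the sketch:

```python
# Gradient descent on f(x) = (x - 3)**2, whose gradient is 2*(x - 3).
# The minimum is at x = 3; repeated small steps against the gradient
# converge toward it.
def gradient_descent(grad, x0, lr=0.1, steps=100):
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)  # step in the direction of steepest descent
    return x

x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
```

Backpropagation, the other fundamental algorithm, is essentially the chain rule applied layer by layer to compute these gradients for a whole network.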

Algorithms for Optimization


Mykel J. Kochenderfer - 2019
    The book approaches optimization from an engineering perspective, where the objective is to design a system that optimizes a set of metrics subject to constraints. Readers will learn about computational approaches for a range of challenges, including searching high-dimensional spaces, handling problems where there are multiple competing objectives, and accommodating uncertainty in the metrics. Figures, examples, and exercises convey the intuition behind the mathematical approaches. The text provides concrete implementations in the Julia programming language. Topics covered include derivatives and their generalization to multiple dimensions; local descent and first- and second-order methods that inform local descent; stochastic methods, which introduce randomness into the optimization process; linear constrained optimization, when both the objective function and the constraints are linear; surrogate models, probabilistic surrogate models, and using probabilistic surrogate models to guide optimization; optimization under uncertainty; uncertainty propagation; expression optimization; and multidisciplinary design optimization. Appendixes offer an introduction to the Julia language, test functions for evaluating algorithm performance, and mathematical concepts used in the derivation and analysis of the optimization methods discussed in the text. The book can be used by advanced undergraduates and graduate students in mathematics, statistics, computer science, any engineering field (including electrical and aerospace engineering), and operations research, and as a reference for professionals.
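Among the topics listed are stochastic methods, which introduce randomness into the optimization process. A minimal sketch of that idea (in Python for convenience, although the book's own implementations are in Julia) is a hill-climbing search that perturbs the current point with Gaussian noise and keeps only improving moves; the objective, step size, and iteration count here are illustrative choices, not the book's:

```python
import random

# Stochastic local search: propose a Gaussian perturbation of the current
# point and accept it only if it lowers the objective (minimization).
def stochastic_search(f, x0, sigma=0.5, steps=2000, seed=0):
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    for _ in range(steps):
        cand = x + rng.gauss(0, sigma)  # random proposal
        fc = f(cand)
        if fc < fx:                     # greedy acceptance rule
            x, fx = cand, fc
    return x

# Minimize a simple quadratic with its minimum at x = 2.
best = stochastic_search(lambda x: (x - 2) ** 2, x0=10.0)
```

Randomness lets such methods escape the strict dependence on gradient information that first- and second-order descent methods require.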

The ABC of It: Why Children’s Books Matter


Leonard S. Marcus - 2019
    For fourteen months beginning in June 2013, more than half a million visitors to the New York Public Library viewed an exhibition about the role that children’s books play in world culture and in our lives. After the exhibition closed, attendees clamored for a catalog of The ABC of It as well as for children’s literature historian Leonard S. Marcus’s insightful, wry commentary about the objects on display. Now with this book, a collaboration between the University of Minnesota’s Kerlan Collection of Children’s Literature and Leonard Marcus, the nostalgia and vision of that exhibit can be experienced anywhere.
    The story of the origins of children’s literature is a tale with memorable characters and deeds, from Hans Christian Andersen and Lewis Carroll to E. B. White and Madeleine L’Engle, who safeguarded a place for wonder in a world increasingly dominated by mechanistic styles of thought, to artists like Beatrix Potter and Maurice Sendak who devoted their extraordinary talents to revealing to children not only the exhilarating beauty of life but also its bracing intensity. Philosophers like John Locke and Jean-Jacques Rousseau and educators such as Johann Comenius and John Dewey were path-finding interpreters of the phenomenon of childhood, inspiring major strands of bookmaking and storytelling for the young.
    Librarians devised rigorous standards for evaluating children’s books and effective ways of putting good books into children’s hands, and educators proposed radically different ideas about what those books should include. Eventually, publishers came to embrace juvenile publishing as a core activity, and pioneering collectors of children’s book art, manuscripts, correspondence, and ephemera appeared—the University of Minnesota’s Dr. Irvin Kerlan being a superb example. Without the foresight and persistence of these collectors, much of this story would have been lost forever.
Regarding children’s literature as both a rich repository of collective memory and a powerful engine of cultural change is more important today than ever.

Strengthening Deep Neural Networks: Making AI Less Susceptible to Adversarial Trickery


Katy Warr - 2019
    This practical book examines real-world scenarios where DNNs--the algorithms intrinsic to much of AI--are used daily to process image, audio, and video data. Author Katy Warr considers attack motivations, the risks posed by this adversarial input, and methods for increasing AI robustness to these attacks. If you're a data scientist developing DNN algorithms, a security architect interested in how to make AI systems more resilient to attack, or someone fascinated by the differences between artificial and biological perception, this book is for you.
    - Delve into DNNs and discover how they could be tricked by adversarial input
    - Investigate methods used to generate adversarial input capable of fooling DNNs
    - Explore real-world scenarios and model the adversarial threat
    - Evaluate neural network robustness; learn methods to increase resilience of AI systems to adversarial data
    - Examine some ways in which AI might become better at mimicking human perception in years to come
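To make the idea of adversarial input concrete, here is a toy sketch in the spirit of the fast gradient sign method (FGSM), applied to a fixed linear classifier rather than a real DNN; the weights, input, and epsilon are invented for the example, and the book itself covers far richer attacks:

```python
import numpy as np

# Toy FGSM-style attack on a linear classifier score(x) = w @ x + b,
# predicting class 1 when the score is positive. Because the gradient of
# the score with respect to x is just w, stepping each feature by a small
# eps against sign(w) maximally decreases the score per unit of change.
w = np.array([2.0, -1.0, 0.5])
b = 0.0

def predict(x):
    return int(w @ x + b > 0)

def fgsm_attack(x, eps):
    # Perturb every feature by at most eps, against the score gradient.
    return x - eps * np.sign(w)

x = np.array([0.3, 0.1, 0.2])      # score = 0.6 - 0.1 + 0.1 = 0.6 -> class 1
x_adv = fgsm_attack(x, eps=0.3)    # small per-feature change flips the class
```

The unsettling property the book explores is that the same effect appears in deep networks: a perturbation too small for a human to notice can flip the model's prediction.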

Introduction to Natural Language Processing


Jacob Eisenstein - 2019
    This textbook emphasizes contemporary data-driven approaches, focusing on techniques from supervised and unsupervised machine learning. The first section establishes a foundation in machine learning by building a set of tools that will be used throughout the book and applying them to word-based textual analysis. The second section introduces structured representations of language, including sequences, trees, and graphs. The third section explores different approaches to the representation and analysis of linguistic meaning, ranging from formal logic to neural word embeddings. The final section offers chapter-length treatments of three transformative applications of natural language processing: information extraction, machine translation, and text generation. End-of-chapter exercises include both paper-and-pencil analysis and software implementation. The text synthesizes and distills a broad and diverse research literature, linking contemporary machine learning techniques with the field's linguistic and computational foundations. It is suitable for use in advanced undergraduate and graduate-level courses and as a reference for software engineers and data scientists. Readers should have a background in computer programming and college-level mathematics. After mastering the material presented, students will have the technical skill to build and analyze novel natural language processing systems and to understand the latest research in the field.
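The "word-based textual analysis" of the first section starts from representations like the bag of words, which a few lines of Python can illustrate (the example documents are invented, and this is only the simplest form of the representation):

```python
from collections import Counter

# Bag-of-words representation: each document becomes a multiset of word
# counts, discarding word order. These counts are the typical features fed
# to supervised classifiers in word-based textual analysis.
def bag_of_words(text):
    return Counter(text.lower().split())

docs = ["the cat sat", "the cat and the dog"]
vectors = [bag_of_words(d) for d in docs]
```

The later sections of the book move beyond this order-free view to sequences, trees, graphs, and learned embeddings.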

Python Deep Learning: Exploring deep learning techniques and neural network architectures with PyTorch, Keras, and TensorFlow, 2nd Edition


Ivan Vasilev - 2019
    With this book, you’ll explore deep learning, and learn how to put machine learning to use in your projects. This second edition of Python Deep Learning will get you up to speed with deep learning, deep neural networks, and how to train them with high-performance algorithms and popular Python frameworks. You’ll uncover different neural network architectures, such as convolutional networks, recurrent neural networks, long short-term memory (LSTM) networks, and capsule networks. You’ll also learn how to solve problems in the fields of computer vision, natural language processing (NLP), and speech recognition. You'll study generative model approaches such as variational autoencoders and Generative Adversarial Networks (GANs) to generate images. As you delve into newly evolved areas of reinforcement learning, you’ll gain an understanding of state-of-the-art algorithms that are the main components behind systems that play popular games such as Go, Atari titles, and Dota. By the end of the book, you will be well-versed with the theory of deep learning along with its real-world applications.
    What you will learn:
    - Grasp the mathematical theory behind neural networks and deep learning processes
    - Investigate and resolve computer vision challenges using convolutional networks and capsule networks
    - Solve generative tasks using variational autoencoders and Generative Adversarial Networks
    - Implement complex NLP tasks using recurrent networks (LSTM and GRU) and attention models
    - Explore reinforcement learning and understand how agents behave in a complex environment
    - Get up to date with applications of deep learning in autonomous vehicles
    Who this book is for:
    This book is for data science practitioners, machine learning engineers, and those interested in deep learning who have a basic foundation in machine learning and some Python programming experience. A background in mathematics and conceptual understanding of calculus and statistics will help you gain maximum benefit from this book.
Table of Contents:
1. Machine Learning – An Introduction
2. Neural Networks
3. Deep Learning Fundamentals
4. Computer Vision with Convolutional Networks
5. Advanced Computer Vision
6. Generating Images with GANs and Variational Autoencoders
7. Recurrent Neural Networks and Language Models
8. Reinforcement Learning Theory
9. Deep Reinforcement Learning for Games
10. Deep Learning in Autonomous Vehicles