Book picks similar to Statistical Machine Translation by Philipp Koehn
linguistics
computer-science
nlp
textbooks
How to Bake Pi: An Edible Exploration of the Mathematics of Mathematics
Eugenia Cheng - 2015
Of course, it’s not all cooking; we’ll also run the New York and Chicago marathons, pay visits to Cinderella and Lewis Carroll, and even get to the bottom of a tomato’s identity as a vegetable. This is not the math of our high school classes: mathematics, Cheng shows us, is less about numbers and formulas and more about how we know, believe, and understand anything, including whether our brother took too much cake. At the heart of How to Bake Pi is Cheng’s work on category theory—a cutting-edge “mathematics of mathematics.” Cheng combines her theory work with her enthusiasm for cooking both to shed new light on the fundamentals of mathematics and to give readers a tour of a vast territory no popular book on math has explored before. Lively, funny, and clear, How to Bake Pi will dazzle the initiated while amusing and enlightening even the most hardened math-phobe.
How to Create a Mind: The Secret of Human Thought Revealed
Ray Kurzweil - 2012
In How to Create a Mind, Kurzweil presents a provocative exploration of the most important project in human-machine civilization—reverse engineering the brain to understand precisely how it works and using that knowledge to create even more intelligent machines. Kurzweil discusses how the brain functions, how the mind emerges from the brain, and the implications of vastly increasing the powers of our intelligence in addressing the world’s problems. He thoughtfully examines emotional and moral intelligence and the origins of consciousness and envisions the radical possibilities of our merging with the intelligent technology we are creating. Certain to be one of the most widely discussed and debated science books of the year, How to Create a Mind is sure to take its place alongside Kurzweil’s previous classics, which include Fantastic Voyage: Live Long Enough to Live Forever and The Age of Spiritual Machines.
All of Statistics: A Concise Course in Statistical Inference
Larry Wasserman - 2003
Taken literally, the title "All of Statistics" is an exaggeration; but in spirit, the title is apt, as the book does cover a much broader range of topics than a typical introductory book on mathematical statistics. This book is for people who want to learn probability and statistics quickly. It is suitable for graduate or advanced undergraduate students in computer science, mathematics, statistics, and related disciplines. The book includes modern topics like nonparametric curve estimation, bootstrapping, and classification, topics that are usually relegated to follow-up courses. The reader is presumed to know calculus and a little linear algebra. No previous knowledge of probability and statistics is required. Statistics, data mining, and machine learning are all concerned with collecting and analyzing data. For some time, statistics research was conducted in statistics departments while data mining and machine learning research was conducted in computer science departments. Statisticians thought that computer scientists were reinventing the wheel. Computer scientists thought that statistical theory didn't apply to their problems. Things are changing. Statisticians now recognize that computer scientists are making novel contributions while computer scientists now recognize the generality of statistical theory and methodology. Clever data mining algorithms are more scalable than statisticians ever thought possible. Formal statistical theory is more pervasive than computer scientists had realized.
Computer Organization & Design: The Hardware/Software Interface
David A. Patterson - 1993
More importantly, this book provides a framework for thinking about computer organization and design that will enable the reader to continue the lifetime of learning necessary for staying at the forefront of this competitive discipline. --John Crawford, Intel Fellow and Director of Microprocessor Architecture, Intel. The performance of software systems is dramatically affected by how well software designers understand the basic hardware technologies at work in a system. Similarly, hardware designers must understand the far-reaching effects their design decisions have on software applications. For readers in either category, this classic introduction to the field provides a deep look into the computer. It demonstrates the relationship between the software and hardware and focuses on the foundational concepts that are the basis for current computer design. Using a distinctive "learning by evolution" approach, the authors present each idea from its first principles, guiding readers through a series of worked examples that incrementally add more complex instructions.
Programming Pearls
Jon L. Bentley - 1986
Jon has done a wonderful job of updating the material. I am very impressed at how fresh the new examples seem." - Steve McConnell, author, Code Complete. When programmers list their favorite books, Jon Bentley's collection of programming pearls is commonly included among the classics. Just as natural pearls grow from grains of sand that irritate oysters, programming pearls have grown from real problems that have irritated real programmers. With origins beyond solid engineering, in the realm of insight and creativity, Bentley's pearls offer unique and clever solutions to those nagging problems. Illustrated by programs designed as much for fun as for instruction, the book is filled with lucid and witty descriptions of practical programming techniques and fundamental design principles. It is not at all surprising that Programming Pearls has been so highly valued by programmers at every level of experience. In this revision, the first in 14 years, Bentley has substantially updated his essays to reflect current programming methods and environments. In addition, there are three new essays on (1) testing, debugging, and timing; (2) set representations; and (3) string problems. All the original programs have been rewritten, and an equal amount of new code has been generated. Implementations of all the programs, in C or C++, are now available on the Web. What remains the same in this new edition is Bentley's focus on the hard core of programming problems and his delivery of workable solutions to those problems. Whether you are new to Bentley's classic or are revisiting his work for some fresh insight, this book is sure to make your own list of favorites.
The Computer and the Brain
John von Neumann - 1958
This work represents the views of a mathematician on the analogies between computing machines and the living human brain.
Abstract Algebra
I.N. Herstein - 1986
Providing a concise introduction to abstract algebra, this work unfolds some of the fundamental systems with the aim of reaching applicable, significant results.
Conceptual Mathematics: A First Introduction to Categories
F. William Lawvere - 1997
Written by two of the best-known names in categorical logic, Conceptual Mathematics is the first book to apply categories to the most elementary mathematics. It thus serves two purposes: first, to provide a key to mathematics for the general reader or beginning student; and second, to furnish an easy introduction to categories for computer scientists, logicians, physicists, and linguists who want to gain some familiarity with the categorical method without initially committing themselves to extended study.
Concepts, Techniques, and Models of Computer Programming
Peter Van Roy - 2004
The book focuses on techniques of lasting value and explains them precisely in terms of a simple abstract machine. The book presents all major programming paradigms in a uniform framework that shows their deep relationships and how and where to use them together. After an introduction to programming concepts, the book presents both well-known and lesser-known computation models ("programming paradigms"). Each model has its own set of techniques and each is included on the basis of its usefulness in practice. The general models include declarative programming, declarative concurrency, message-passing concurrency, explicit state, object-oriented programming, shared-state concurrency, and relational programming. Specialized models include graphical user interface programming, distributed programming, and constraint programming. Each model is based on its kernel language—a simple core language that consists of a small number of programmer-significant elements. The kernel languages are introduced progressively, adding concepts one by one, thus showing the deep relationships between different models. The kernel languages are defined precisely in terms of a simple abstract machine. Because a wide variety of languages and programming paradigms can be modeled by a small set of closely related kernel languages, this approach allows programmer and student to grasp the underlying unity of programming. The book has many program fragments and exercises, all of which can be run on the Mozart Programming System, an Open Source software package that features an interactive incremental development environment.
Machine Learning for Absolute Beginners
Oliver Theobald - 2017
The manner in which computers are now able to mimic human thinking is rapidly exceeding human capabilities in everything from chess to picking the winner of a song contest. In the age of machine learning, computers do not strictly need to receive an ‘input command’ to perform a task, but rather ‘input data’. From the input of data they are able to form their own decisions and take actions virtually as a human would. But as machines, they can consider many more scenarios and execute calculations to solve complex problems. This is the element that excites companies and budding machine learning engineers the most: the ability to solve complex problems never before attempted. This is also perhaps one reason why you are looking at purchasing this book: to gain a beginner's introduction to machine learning. This book provides a plain English introduction to the following topics:
- Artificial Intelligence
- Big Data
- Downloading Free Datasets
- Regression
- Support Vector Machine Algorithms
- Deep Learning/Neural Networks
- Data Reduction
- Clustering
- Association Analysis
- Decision Trees
- Recommenders
- Machine Learning Careers
This book has recently been updated following feedback from readers. Version II now includes:
- New Chapter: Decision Trees
- Cleanup of minor errors
Deep Learning for Coders with Fastai and Pytorch: AI Applications Without a PhD
Jeremy Howard - 2020
Deep learning is often viewed as the exclusive domain of math PhDs and big tech companies. But as this hands-on guide demonstrates, programmers comfortable with Python can achieve impressive results in deep learning with little math background, small amounts of data, and minimal code. How? With fastai, the first library to provide a consistent interface to the most frequently used deep learning applications. Authors Jeremy Howard and Sylvain Gugger show you how to train a model on a wide range of tasks using fastai and PyTorch. You'll also dive progressively further into deep learning theory to gain a complete understanding of the algorithms behind the scenes.
- Train models in computer vision, natural language processing, tabular data, and collaborative filtering
- Learn the latest deep learning techniques that matter most in practice
- Improve accuracy, speed, and reliability by understanding how deep learning models work
- Discover how to turn your models into web applications
- Implement deep learning algorithms from scratch
- Consider the ethical implications of your work
Head First Design Patterns
Eric Freeman - 2004
At any given moment, somewhere in the world someone struggles with the same software design problems you have. You know you don't want to reinvent the wheel (or worse, a flat tire), so you look to Design Patterns--the lessons learned by those who've faced the same problems. With Design Patterns, you get to take advantage of the best practices and experience of others, so that you can spend your time on...something else. Something more challenging. Something more complex. Something more fun. You want to learn about the patterns that matter--why to use them, when to use them, how to use them (and when NOT to use them). But you don't just want to see how patterns look in a book, you want to know how they look "in the wild". In their native environment. In other words, in real world applications. You also want to learn how patterns are used in the Java API, and how to exploit Java's built-in pattern support in your own code. You want to learn the real OO design principles and why everything your boss told you about inheritance might be wrong (and what to do instead). You want to learn how those principles will help the next time you're up a creek without a design pattern. Most importantly, you want to learn the "secret language" of Design Patterns so that you can hold your own with your co-worker (and impress cocktail party guests) when he casually mentions his stunningly clever use of Command, Facade, Proxy, and Factory in between sips of a martini. You'll easily counter with your deep understanding of why Singleton isn't as simple as it sounds, how the Factory is so often misunderstood, or the real relationship between Decorator, Facade and Adapter. With Head First Design Patterns, you'll avoid the embarrassment of thinking Decorator is something from the "Trading Spaces" show. Best of all, in a way that won't put you to sleep! We think your time is too important (and too short) to spend it struggling with academic texts. If you've read a Head First book, you know what to expect--a visually rich format designed for the way your brain works. Using the latest research in neurobiology, cognitive science, and learning theory, Head First Design Patterns will load patterns into your brain in a way that sticks. In a way that lets you put them to work immediately. In a way that makes you better at solving software design problems, and better at speaking the language of patterns with others on your team.
Artificial Intelligence: A Guide for Thinking Humans
Melanie Mitchell - 2019
The award-winning author Melanie Mitchell, a leading computer scientist, now reveals AI’s turbulent history and the recent spate of apparent successes, grand hopes, and emerging fears surrounding it. In Artificial Intelligence, Mitchell turns to the most urgent questions concerning AI today: How intelligent—really—are the best AI programs? How do they work? What can they actually do, and when do they fail? How humanlike do we expect them to become, and how soon do we need to worry about them surpassing us? Along the way, she introduces the dominant models of modern AI and machine learning, describing cutting-edge AI programs, their human inventors, and the historical lines of thought underpinning recent achievements. She meets with fellow experts such as Douglas Hofstadter, the cognitive scientist and Pulitzer Prize–winning author of the modern classic Gödel, Escher, Bach, who explains why he is “terrified” about the future of AI. She explores the profound disconnect between the hype and the actual achievements in AI, providing a clear sense of what the field has accomplished and how much further it has to go. Interweaving stories about the science of AI and the people behind it, Artificial Intelligence brims with clear-sighted, captivating, and accessible accounts of the most interesting and provocative modern work in the field, flavored with Mitchell’s humor and personal observations. This frank, lively book is an indispensable guide to understanding today’s AI, its quest for “human-level” intelligence, and its impact on the future for us all.
Reinforcement Learning: An Introduction
Richard S. Sutton - 1998
Reinforcement learning, one of the most active research areas in artificial intelligence, is a computational approach to learning whereby an agent tries to maximize the total amount of reward it receives when interacting with a complex, uncertain environment. In Reinforcement Learning, Richard Sutton and Andrew Barto provide a clear and simple account of the key ideas and algorithms of reinforcement learning. Their discussion ranges from the history of the field's intellectual foundations to the most recent developments and applications. The only necessary mathematical background is familiarity with elementary concepts of probability. The book is divided into three parts. Part I defines the reinforcement learning problem in terms of Markov decision processes. Part II provides basic solution methods: dynamic programming, Monte Carlo methods, and temporal-difference learning. Part III presents a unified view of the solution methods and incorporates artificial neural networks, eligibility traces, and planning; the two final chapters present case studies and consider the future of reinforcement learning.
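For a taste of the temporal-difference methods covered in Part II, here is a minimal sketch of tabular Q-learning on a toy chain environment; the environment, rewards, and hyperparameters below are illustrative assumptions made for this listing, not material taken from Sutton and Barto.

```python
# Minimal tabular Q-learning sketch on a toy 5-state chain (illustrative only;
# the environment and hyperparameters are assumptions, not from the book).
import random

N_STATES = 5          # states 0..4; reaching state 4 ends the episode
ACTIONS = (-1, +1)    # step left or right
ALPHA, GAMMA, EPS = 0.1, 0.95, 0.1

Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

def step(state, action):
    """Deterministic chain dynamics: reward +1 only on reaching the last state."""
    next_state = min(max(state + action, 0), N_STATES - 1)
    reward = 1.0 if next_state == N_STATES - 1 else 0.0
    return next_state, reward, next_state == N_STATES - 1

def greedy(state):
    """Greedy action with random tie-breaking."""
    best = max(Q[(state, a)] for a in ACTIONS)
    return random.choice([a for a in ACTIONS if Q[(state, a)] == best])

for episode in range(500):
    state, done = 0, False
    while not done:
        # epsilon-greedy exploration
        action = random.choice(ACTIONS) if random.random() < EPS else greedy(state)
        next_state, reward, done = step(state, action)
        # Q-learning update: nudge Q(s, a) toward the bootstrapped target
        best_next = max(Q[(next_state, a)] for a in ACTIONS)
        target = reward + (0.0 if done else GAMMA * best_next)
        Q[(state, action)] += ALPHA * (target - Q[(state, action)])
        state = next_state

# Learned greedy policy for the non-terminal states (should settle on "move right").
print({s: greedy(s) for s in range(N_STATES - 1)})
```

Running the sketch prints the greedy action per non-terminal state, which converges to always stepping toward the rewarding terminal state.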