Concepts of Programming Languages


Robert W. Sebesta - 1988
    The book presents the principles, paradigms, design, and implementation of modern programming languages, with increased coverage of the object-oriented programming paradigm. It also covers semantics and Java.

Introduction to Probability


Dimitri P. Bertsekas - 2002
    This is the currently used textbook for "Probabilistic Systems Analysis," an introductory probability course at the Massachusetts Institute of Technology, attended by a large number of undergraduate and graduate students. The book covers the fundamentals of probability theory (probabilistic models, discrete and continuous random variables, multiple random variables, and limit theorems), which are typically part of a first course on the subject. It also contains a number of more advanced topics, from which an instructor can choose to match the goals of a particular course. These topics include transforms, sums of random variables, least squares estimation, the bivariate normal distribution, and a fairly detailed introduction to Bernoulli, Poisson, and Markov processes. The book strikes a balance between simplicity in exposition and sophistication in analytical reasoning. Some of the more mathematically rigorous analysis is explained only intuitively in the text, but is developed in detail (at the level of advanced calculus) in the numerous solved theoretical problems. The book has been widely adopted for classroom use in introductory probability courses within the USA and abroad.

Digital Signal Processing: Principles, Algorithms, and Applications


John G. Proakis - 1992
    This book presents the fundamentals of discrete-time signals, systems, and modern digital signal processing, along with applications, for students in electrical engineering, computer engineering, and computer science. The book is suitable for either a one-semester or a two-semester undergraduate-level course in discrete systems and digital signal processing. It is also intended for use in a one-semester first-year graduate-level course in digital signal processing.

Algebra


Michael Artin - 1991
    Linear algebra is tightly integrated into the text.

Automate This: How Algorithms Came to Rule Our World


Christopher Steiner - 2012
    It used to be that to diagnose an illness, interpret legal documents, analyze foreign policy, or write a newspaper article you needed a human being with specific skills—and maybe an advanced degree or two. These days, high-level tasks are increasingly being handled by algorithms that can do precise work not only with speed but also with nuance. These “bots” started with human programming and logic, but now their reach extends beyond what their creators ever expected. In this fascinating, frightening book, Christopher Steiner tells the story of how algorithms took over—and shows why the “bot revolution” is about to spill into every aspect of our lives, often silently, without our knowledge. The May 2010 “Flash Crash” exposed Wall Street’s reliance on trading bots to the tune of a 998-point market drop and $1 trillion in vanished market value. But that was just the beginning. In Automate This, we meet bots that are driving cars, penning haiku, and writing music mistaken for Bach’s. They listen in on our customer service calls and figure out what Iran would do in the event of a nuclear standoff. There are algorithms that can pick out the most cohesive crew of astronauts for a space mission or identify the next Jeremy Lin. Some can even ingest statistics from baseball games and spit out pitch-perfect sports journalism indistinguishable from that produced by humans. The interaction of man and machine can make our lives easier. But what will the world look like when algorithms control our hospitals, our roads, our culture, and our national security? What happens to businesses when we automate judgment and eliminate human instinct? And what role will be left for doctors, lawyers, writers, truck drivers, and many others? Who knows—maybe there’s a bot learning to do your job this minute.

Statistics: An Introduction Using R


Michael J. Crawley - 2005
    R is one of the most powerful and flexible statistical software packages available, and enables the user to apply a wide variety of statistical methods ranging from simple regression to generalized linear modelling. Statistics: An Introduction using R is a clear and concise introductory textbook to statistical analysis using this powerful and free software, and follows on from the success of the author's previous best-selling title Statistical Computing.
    * Features step-by-step instructions that assume no mathematics, statistics or programming background, helping the non-statistician to fully understand the methodology.
    * Uses a series of realistic examples, developing step-wise from the simplest cases, with the emphasis on checking the assumptions (e.g. constancy of variance and normality of errors) and the adequacy of the model chosen to fit the data.
    * The emphasis throughout is on estimation of effect sizes and confidence intervals, rather than on hypothesis testing.
    * Covers the full range of statistical techniques likely to be needed to analyse the data from research projects, including elementary material like t-tests and chi-squared tests, intermediate methods like regression and analysis of variance, and more advanced techniques like generalized linear modelling.
    * Includes numerous worked examples and exercises within each chapter.
    * Accompanied by a website featuring worked examples, data sets, exercises and solutions: http://www.imperial.ac.uk/bio/research/crawl...
    Statistics: An Introduction using R is the first text to offer such a concise introduction to a broad array of statistical methods, at a level that is elementary enough to appeal to a broad range of disciplines. It is primarily aimed at undergraduate students in medicine, engineering, economics and biology, but will also appeal to postgraduates who have not previously covered this area, or wish to switch to using R.

Computer Vision: Algorithms and Applications


Richard Szeliski - 2010
    Despite all of the recent advances in computer vision research, the dream of having a computer interpret an image at the same level as a two-year-old remains elusive. Why is computer vision such a challenging problem, and what is the current state of the art? Computer Vision: Algorithms and Applications explores the variety of techniques commonly used to analyze and interpret images. It also describes challenging real-world applications where vision is being successfully used, both for specialized applications such as medical imaging, and for fun, consumer-level tasks such as image editing and stitching, which students can apply to their own personal photos and videos. More than just a source of "recipes," this exceptionally authoritative and comprehensive textbook/reference also takes a scientific approach to basic vision problems, formulating physical models of the imaging process before inverting them to produce descriptions of a scene. These problems are also analyzed using statistical models and solved using rigorous engineering techniques. Topics and features:
    * Structured to support active curricula and project-oriented courses, with tips in the Introduction for using the book in a variety of customized courses
    * Presents exercises at the end of each chapter with a heavy emphasis on testing algorithms, containing numerous suggestions for small mid-term projects
    * Provides additional material and more detailed mathematical topics in the Appendices, which cover linear algebra, numerical techniques, and Bayesian estimation theory
    * Suggests additional reading at the end of each chapter, including the latest research in each sub-field, in addition to a full Bibliography at the end of the book
    * Supplies supplementary course material for students at the associated website, http://szeliski.org/Book/
    Suitable for an upper-level undergraduate or graduate-level course in computer science or engineering, this textbook focuses on basic techniques that work under real-world conditions and encourages students to push their creative boundaries. Its design and exposition also make it eminently suitable as a unique reference to the fundamental techniques and current research literature in computer vision.

Introduction to Operations Research [with Revised CD-ROM]


Frederick S. Hillier - 1967
    This edition also features developments in Operations Research such as metaheuristics, simulation, and spreadsheet modeling.

The Mathematical Theory of Communication


Claude Shannon - 1949
    First published in 1948 as a paper in the Bell System Technical Journal and republished in book form shortly thereafter, it has since gone through four hardcover and sixteen paperback printings. It is a revolutionary work, astounding in its foresight and contemporaneity. The University of Illinois Press is pleased and honored to issue this commemorative reprinting of a classic.

Quantum Computing Since Democritus


Scott Aaronson - 2013
    Full of insights, arguments and philosophical perspectives, the book covers an amazing array of topics. Beginning in antiquity with Democritus, it progresses through logic and set theory, computability and complexity theory, quantum computing, cryptography, the information content of quantum states and the interpretation of quantum mechanics. There are also extended discussions about time travel, Newcomb's Paradox, the anthropic principle and the views of Roger Penrose. Aaronson's informal style makes this fascinating book accessible to readers with scientific backgrounds, as well as students and researchers working in physics, computer science, mathematics and philosophy.

Building Java Programs: A Back to Basics Approach


Stuart Reges - 2007
    By using objects early to solve interesting problems and defining objects later in the course, Building Java Programs develops programming knowledge for a broad audience. Chapter topics include: Introduction to Java Programming; Primitive Data and Definite Loops; Introduction to Parameters and Objects; Conditional Execution; Program Logic and Indefinite Loops; File Processing; Arrays; Defining Classes; Inheritance and Interfaces; ArrayLists; the Java Collections Framework; Recursion; Searching and Sorting; and Graphical User Interfaces. For all readers interested in introductory programming.

Purely Functional Data Structures


Chris Okasaki - 1996
    Data structures designed for imperative languages do not always translate well to functional languages such as Standard ML, Haskell, or Scheme. This book describes data structures from the point of view of functional languages, with examples, and presents design techniques that allow programmers to develop their own functional data structures. The author includes both classical data structures, such as red-black trees and binomial queues, and a host of new data structures developed exclusively for functional languages. All source code is given in Standard ML and Haskell, and most of the programs are easily adaptable to other functional languages. This handy reference for professional programmers working with functional languages can also be used as a tutorial or for self-study.
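
    As a concrete illustration of the kind of structure the book covers (a minimal sketch, not code taken from the book), here is a classic purely functional FIFO queue in Haskell: two immutable lists, with the rear list reversed onto the front whenever the front runs empty. The names Queue, snoc, and uncons are illustrative.

        -- A persistent FIFO queue: a front list plus a reversed rear list.
        module BatchedQueue where

        data Queue a = Queue [a] [a]

        empty :: Queue a
        empty = Queue [] []

        -- Invariant: the front list is empty only when the whole queue is empty.
        check :: [a] -> [a] -> Queue a
        check []    rear = Queue (reverse rear) []
        check front rear = Queue front rear

        -- Add an element at the rear; the old queue remains valid and unchanged.
        snoc :: Queue a -> a -> Queue a
        snoc (Queue front rear) x = check front (x : rear)

        -- Remove the front element, returning Nothing on an empty queue.
        uncons :: Queue a -> Maybe (a, Queue a)
        uncons (Queue []        _   ) = Nothing
        uncons (Queue (x:front) rear) = Just (x, check front rear)

    Because nothing is mutated, earlier versions of the queue remain usable after snoc and uncons; this persistence is the central property that purely functional data structure design works with.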

Statistics Done Wrong: The Woefully Complete Guide


Alex Reinhart - 2013
    Politicians and marketers present shoddy evidence for dubious claims all the time. But smart people make mistakes too, and when it comes to statistics, plenty of otherwise great scientists--yes, even those published in peer-reviewed journals--are doing statistics wrong. "Statistics Done Wrong" comes to the rescue with cautionary tales of all-too-common statistical fallacies. It'll help you see where and why researchers often go wrong and teach you the best practices for avoiding their mistakes. In this book, you'll learn:
    - Why "statistically significant" doesn't necessarily imply practical significance
    - Ideas behind hypothesis testing and regression analysis, and common misinterpretations of those ideas
    - How and how not to ask questions, design experiments, and work with data
    - Why many studies have too little data to detect what they're looking for, and, surprisingly, why this means published results are often overestimates
    - Why false positives are much more common than "significant at the 5% level" would suggest
    By walking through colorful examples of statistics gone awry, the book offers approachable lessons on proper methodology, and each chapter ends with pro tips for practicing scientists and statisticians. No matter what your level of experience, "Statistics Done Wrong" will teach you how to be a better analyst, data scientist, or researcher.

Partial Differential Equations


Lawrence C. Evans - 1998
    

Beautiful Code: Leading Programmers Explain How They Think


Andy Oram, Lincoln Stein - 2007
    You will be able to look over the shoulder of major coding and design experts to see problems through their eyes. This is not simply another design patterns book, or another software engineering treatise on the right and wrong way to do things. The authors think aloud as they work through their project's architecture, the tradeoffs made in its construction, and when it was important to break rules. Beautiful Code is an opportunity for master coders to tell their story. All author royalties will be donated to Amnesty International.