Best of
Computer Science

1990

Unix Network Programming, Volume 1: Networking APIs - Sockets and XTI


W. Richard Stevens - 1990
    You need UNIX Network Programming, Volume 1, Second Edition. In this book, leading UNIX networking expert W. Richard Stevens offers unprecedented, start-to-finish guidance on making the most of sockets, the de facto standard for UNIX network programming, as well as extensive coverage of the X/Open Transport Interface (XTI). Stevens begins by introducing virtually every basic capability of TCP and UDP sockets, including socket functions and options, I/O multiplexing, and name and address conversions. He presents detailed coverage of the Posix.1g standard for sockets and of Posix threads. He also introduces advanced techniques for: establishing IPv4/IPv6 interoperability; implementing non-blocking I/O; routing sockets; broadcasting and multicasting; IP options; multithreading; advanced name and address conversions; UNIX domain protocols; and raw sockets. Learn how to choose among today's leading client/server design approaches, including TCP …

Computer Graphics: Principles and Practice


James D. Foley - 1990
    It details programming with SRGP, a simple but powerful raster graphics package. Important algorithms in 2D and 3D graphics are detailed for easy implementation, and a thorough presentation of the mathematical principles of geometric transformations and viewing is included.

Fundamentals of Robotics: Analysis and Control


Robert J. Schilling - 1990
    Case study examples of educational, industrial and generic robots are discussed. Class demonstration software is provided with the laboratory manual. (vs. Craig, Fu, and Asada).

Parsing Techniques: A Practical Guide


Dick Grune - 1990
    Parsing, also referred to as syntax analysis, has been and continues to be an essential part of computer science and linguistics. Parsing techniques have grown considerably in importance, both in computer science, where advanced compilers often use general context-free (CF) parsers, and in computational linguistics, where such parsers are often the only option. They are used in a variety of software products, including Web browsers, interpreters in computing devices, and data compression programs; and they are used extensively in linguistics.

The Craft of PROLOG


Richard O'Keefe - 1990
    Prolog is different, but not that different. Elegance is not optional. These are the themes that unify Richard O'Keefe's very personal statement on how Prolog programs should be written. The emphasis in "The Craft of Prolog" is on using Prolog effectively. It presents a loose collection of topics that build on and elaborate concepts learned in a first course. These may be read in any order following the first chapter, "Basic Topics in Prolog," which provides a basis for the rest of the material in the book.

    Richard A. O'Keefe is Lecturer in the Department of Computer Science at the Royal Melbourne Institute of Technology. He is also a consultant to Quintus Computer Systems, Inc.

    Contents: Basic Topics in Prolog. Searching. Where Does the Space Go? Methods of Programming. Data Structure Design. Sequences. Writing Interpreters. Some Notes on Grammar Rules. Prolog Macros. Writing Tokenisers in Prolog. All Solutions.

Understanding SQL


Martin Gruber - 1990
    Exercises at the end of each chapter build reader fluency and confidence at each level before proceeding to the next.

Zen of Assembly Language: Vol. 1, Knowledge


Michael Abrash - 1990
    Also probes hardware aspects that affect code performance and compares programming techniques.

The REXX Language: A Practical Approach to Programming


Michael Cowlishaw - 1990
    This book is recognized as the standard reference manual for the REXX Programming Language, much as Kernighan and Ritchie is recognized as the standard reference book for the C Programming Language.

Graphics Gems


Andrew S. Glassner - 1990
    The best programmers have a large toolbox of general techniques, nuggets of algorithms, and clever insights that they use on a daily basis to make their code faster, more reliable, more accurate, easier to debug, and a pleasure to use. Such toolboxes are compiled through years of experience and trading with other professionals.

Programming Languages: An Interpreter-Based Approach


Samuel N. Kamin - 1990
    

First-Order Logic and Automated Theorem Proving


Melvin Fitting - 1990
    Books on formal logic differ widely: some have philosophers as their intended audience, some mathematicians, some computer scientists. Although there is a common core to all such books, they will be very different in emphasis, methods, and even appearance. This book is intended for computer scientists. But even this is not precise. Within computer science, formal logic turns up in a number of areas, from program verification to logic programming to artificial intelligence. This book is intended for computer scientists interested in automated theorem proving in classical logic. To be more precise yet, it is essentially a theoretical treatment, not a how-to book, although how-to issues are not neglected. This does not mean, of course, that the book will be of no interest to philosophers or mathematicians. It does contain a thorough presentation of formal logic and many proof techniques, and as such it contains all the material one would expect to find in a course in formal logic covering completeness but not incompleteness issues. The first item to be addressed is: What are we talking about and why are we interested in it? We are primarily talking about truth as used in mathematical discourse, and our interest in it is, or should be, self-evident. Truth is a semantic concept, so we begin with models and their properties. These are used to define our subject.

The Architecture of Symbolic Computers


Peter M. Kogge - 1990
    Focuses on the design and implementation of two classes of non-von Neumann computer architecture: those designed for functional-language computing and those designed for logic-language computing.

Operating Systems


Harvey Deitel - 1990
    To complement the discussion of operating system concepts, the book features two in-depth case studies on Linux and Windows XP. The case studies follow the outline of the book, so readers working through the chapter material can refer to each case study to see how a particular topic is handled in either Linux or Windows XP. Using Java code to illustrate key points, Operating Systems introduces processes, concurrent programming, deadlock and indefinite postponement, mutual exclusion, physical and virtual memory, file systems, disk performance, distributed systems, security and more. New to this edition are a chapter on multithreading and extensive treatments of distributed computing, multiprocessing, performance, and computer security. An ideal, up-to-date book for readers new to operating systems.

Digital And Microprocessor Fundamentals: Theory And Applications


William Kleitz - 1990
    It uses a simple, easy-to-understand writing style, an abundance of clearly explained examples, and nearly 1,000 illustrations to explain practical applications and problems using the industry-standard ICs, circuits, and schematics that the reader will encounter on the job. This text has many important features: it coordinates digital/microprocessor coverage throughout; it first provides all the theory required to understand a particular IC or circuit, then gives examples of its use; it includes examples and system design applications that give a complete explanation of circuit operation, with all required hardware and software, so they can be duplicated in the lab; it uses the 8085A microprocessor and 8051 microcontroller to explain the fundamentals of microprocessor architecture, programming, and hardware; it clearly explains the microprocessor program solutions for the 8085 and 8051; it includes a glossary for each chapter; and it contains a Supplementary Index of ICs and an Instruction Set Reference Encyclopedia. Complete with end-of-chapter summaries, numerous appendixes, and schematic interpretation problems (all of which are new to this second edition), this text is an essential source for the fundamentals of theory and applications.

Programming in Martin-Löf's Type Theory: An Introduction


Bengt Nordström - 1990
    One such formalism is the type theory developed by Per Martin-Löf. Well suited as a theory for program construction, it makes possible the expression of both specifications and programs within the same formalism. Furthermore, the proof rules can be used to derive a correct program from a specification as well as to verify that a given program has a certain property. This book contains a thorough introduction to type theory, with information on polymorphic sets, subsets, monomorphic sets, and a full set of helpful examples.
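The specifications-as-programs idea the blurb describes can be glimpsed in a modern proof assistant. A tiny illustration in Lean, not the book's own notation:

```lean
-- Specification: for every natural number n there exists an m with m = n + 1.
-- Under the propositions-as-types reading, an inhabitant of this type is
-- simultaneously a program computing m and a proof that m meets the spec.
def succSpec : (n : Nat) → { m : Nat // m = n + 1 } :=
  fun n => ⟨n + 1, rfl⟩
```

The subtype { m : Nat // m = n + 1 } packages the computed value together with its correctness proof, which is exactly the sense in which specification and program live in one formalism.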

Programming Linguistics


David Gelernter - 1990
    In studying the evolution of programming languages, the authors are also studying a series of answers to the central (and still unanswered) questions of what programs are and how they should be built.

    Programming Linguistics approaches language design as an attempt to define the nature of programming and the shape and structure of programs, rather than as an attempt to solve a series of narrow, disjoint technical problems. It emphasizes the structural-engineering rather than the mathematical approach to programming and the importance of aesthetics and elegance in the success of language design, and it provides an integrated treatment of concurrency and parallelism.

    Its readable and informal but rigorous coverage of the gamut of programming language designs is based on a simple and general programming model called the Ideal Software Machine. There are helpful exercises throughout.

Programming in the 1990s: An Introduction to the Calculation of Programs


Edward Cohen - 1990
    Unfortunately, it is rarely presented as such. Most often it is taught by "induction": features of some famous programming languages are given operational meaning (e.g. a loop "goes round and round"), a number of examples are shown, and by induction, we are asked to develop other programs, often radically different from the ones we've seen. Basically we are taught to guess our programs, and then to patch up our guesses. Our errors are given the cute name of "bugs". Fixing them becomes puzzle-solving, as does finding tricks that exploit or avoid poorly designed features of the programming language. The entire process is time-consuming and expensive. And even so, we are never quite sure if our programs really work in all cases. When approached in this way, programming is indeed a dull activity. There is, however, another approach to programming, an approach in which programs can be developed reliably, with attention to the real issues. It is a practical approach based on methodically developing programs from their specifications. Besides being practical, it is exciting. Many programs can be developed with relative ease. Problems which once were difficult can now be solved by beginners. Elegant solutions bring great satisfaction. This is our subject. We are interested in making programming an exciting topic!

Graphics Gems


Andrew S. Glassner - 1990
    The vision and purpose of the Series was - and still is - to provide tips, techniques, and algorithms for graphics programmers. All of the gems are written by programmers who work in the field and are motivated by a common desire to share interesting ideas and tools with their colleagues. Each volume provides a new set of innovative solutions to a variety of programming problems.

Distributed Systems


Sape Mullender - 1990
    Examples and case studies of commercial and experimental systems are provided by a distinguished author team, whose work reflects the cutting edge of modern developments in the field.

Adaptive Algorithms and Stochastic Approximations


Albert Benveniste - 1990
    These diverse areas echo the classes of models which conveniently describe each corresponding system. Thus although there can hardly be a "general theory of adaptive systems" encompassing both the modelling task and the design of the adaptation procedure, these diverse issues nevertheless have a major common component: namely the use of adaptive algorithms, also known as stochastic approximations in the mathematical statistics literature, that is to say, the adaptation procedure (once all modelling problems have been resolved). The juxtaposition of these two expressions in the title reflects the ambition of the authors to produce a reference work, both for engineers who use these adaptive algorithms and for probabilists or statisticians who would like to study stochastic approximations in terms of problems arising from real applications. Hence the book is organised in two parts, the first one user-oriented, and the second providing the mathematical foundations to support the practice described in the first part. The book covers the topics of convergence, convergence rate, permanent adaptation and tracking, and change detection, and is illustrated by various realistic applications originating from these areas of application.

Software Conflict: Essays on the Art and Science of Software Engineering


Robert L. Glass - 1990
    

Programming: The Derivation Of Algorithms


A. Kaldewaij - 1990
    There are two factors by which algorithms may be judged: their correctness and their performance. This text discusses the calculational style of programming, in which programs are derived from their specifications by means of formula manipulation.

The Mathematical Foundations of Learning Machines


Nils J. Nilsson - 1990
    By providing a clear exposition of the mathematical ideas that unify this field, Mathematical Foundations of Learning Machines offers the basis of a rigorous and integrated theory of Neural Networks. This seminal book is a recognized classic among Neural Network researchers due to Nilsson's presentation of intuitive geometric and statistical theories. Recent developments in Neural Networks and Artificial Intelligence underscore the importance of the strong theoretical basis for research in these areas. Many of the issues raised in this book still stand as challenges to current efforts, giving new relevance to the importance of proper formulation, analysis, experimentation, and then reformulation. Included in this volume are discussions of special relevance to learning rates, nonparametric training methods, nonlinear network models, and related issues from computation, control theory and statistics. The emphasis on deterministic methods for solving classification problems is an excellent starting point for readers interested in the probabilistic studies associated with Neural Networks research. Anyone interested in the foundations of Neural Networks and learning in parallel distributed processing systems will find this a valuable book.

Aaron's Code: Meta-Art, Artificial Intelligence, and the Work of Harold Cohen


Pamela McCorduck - 1990
    Here is the work of Harold Cohen - the renowned abstract painter who, at the height of a celebrated career in the late 1960s, abandoned the international scene of museums and galleries and sequestered himself with the most powerful computers he could get his hands on. What emerged from his long years of solitary struggle is an elaborate computer program that makes drawings autonomously, without human intervention - an electronic apprentice and alter ego called Aaron.

Definition of Standard ML, Revised Edition


Robin Milner - 1990
    This book provides a formal definition of Standard ML for the benefit of all concerned with the language, including users and implementers. Because computer programs are increasingly required to withstand rigorous analysis, it is all the more important that the language in which they are written be defined with full rigor. The authors have defined their semantic objects in mathematical notation that is completely independent of Standard ML.