Conceptual Physics


Paul G. Hewitt - 1971
    Hewitt's text is famous for engaging readers with analogies and imagery from real-world situations that build a strong conceptual understanding of physical principles ranging from classical mechanics to modern physics. With this strong foundation, readers are better equipped to understand the equations and formulas of physics, and motivated to explore the thought-provoking exercises and fun projects in each chapter. Included in the package is the workbook. Topics span Mechanics, Properties of Matter, Heat, Sound, Electricity and Magnetism, Light, Atomic and Nuclear Physics, and Relativity. For all readers interested in conceptual physics.

Elementary Number Theory and Its Applications


Kenneth H. Rosen - 1984
    The Fourth Edition builds on the strengths of earlier editions with new examples, additional applications, and increased cryptology coverage. Up-to-date information on the latest discoveries is included. Elementary Number Theory and Its Applications provides a diverse group of exercises, including basic exercises designed to help students develop skills, challenging exercises, and computer projects. In addition to years of use and professor feedback, the fourth edition of this text has been thoroughly accuracy-checked to ensure the quality of the mathematical content and the exercises.
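    Since the blurb highlights the increased cryptology coverage, here is one concrete illustration. The sketch below is my own, not an example from Rosen's text: square-and-multiply modular exponentiation, the basic number-theoretic operation behind RSA-style cryptosystems.

        # Illustrative sketch (not from the book): fast modular exponentiation.
        def mod_pow(base: int, exponent: int, modulus: int) -> int:
            result = 1
            base %= modulus
            while exponent > 0:
                if exponent & 1:                      # use this bit of the exponent
                    result = (result * base) % modulus
                base = (base * base) % modulus        # square for the next bit
                exponent >>= 1
            return result

        # Example: 5^117 mod 19; agrees with Python's built-in pow(5, 117, 19)
        print(mod_pow(5, 117, 19))                    # prints 1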

A First Course in Probability


Sheldon M. Ross - 1976
    A software diskette provides an easy-to-use tool for students to derive probabilities for the binomial distribution.
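    The diskette itself is long obsolete, so here is a minimal stand-in of my own (not material from the book): computing a binomial probability directly from the formula P(X = k) = C(n, k) p^k (1 - p)^(n - k).

        from math import comb

        # Illustrative sketch, not the book's software: binomial probabilities.
        def binomial_pmf(k: int, n: int, p: float) -> float:
            """P(X = k) for X ~ Binomial(n, p)."""
            return comb(n, k) * p**k * (1 - p)**(n - k)

        # Probability of exactly 3 heads in 10 fair coin flips
        print(binomial_pmf(3, 10, 0.5))               # 120/1024 ≈ 0.117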

Solar Energy: The physics and engineering of photovoltaic conversion, technologies and systems


Arno Smets - 2016
    The book is also ideal for university and third-level physics or engineering courses on solar photovoltaics, with exercises to check students' understanding and reinforce learning. It is the perfect companion to the Massive Open Online Course (MOOC) on Solar Energy (DelftX, ET.3034TU) presented by co-author Arno Smets. The course is available in English on the nonprofit open source edX.org platform, and in Arabic on edraak.org. Over 100,000 students have already registered for these MOOCs.

The Cartoon Guide to Statistics


Larry Gonick - 1993
    Never again will you order the Poisson Distribution in a French restaurant! This updated version features all new material.
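    For readers curious about what they would be ordering, a quick sketch of my own (not from the book) of the Poisson probability mass function P(X = k) = λ^k e^(-λ) / k!:

        from math import exp, factorial

        # Illustrative sketch: Poisson probabilities for a given rate lam.
        def poisson_pmf(k: int, lam: float) -> float:
            return lam**k * exp(-lam) / factorial(k)

        # Probability of exactly 2 events when the average rate is 3
        print(poisson_pmf(2, 3.0))                    # ≈ 0.224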

Applied Cryptography: Protocols, Algorithms, and Source Code in C


Bruce Schneier - 1993
    "…The book the National Security Agency wanted never to be published." –Wired Magazine
    "…monumental… fascinating… comprehensive… the definitive work on cryptography for computer programmers…" –Dr. Dobb's Journal
    "…easily ranks as one of the most authoritative in its field." –PC Magazine
    "…the bible of code hackers." –The Millennium Whole Earth Catalog
    This new edition of the cryptography classic provides you with a comprehensive survey of modern cryptography. The book details how programmers and electronic communications professionals can use cryptography, the technique of enciphering and deciphering messages, to maintain the privacy of computer data. It describes dozens of cryptography algorithms, gives practical advice on how to implement them in cryptographic software, and shows how they can be used to solve security problems. Covering the latest developments in practical cryptographic techniques, this new edition shows programmers who design computer applications, networks, and storage systems how they can build security into their software and systems. What's new in the Second Edition?
    * New information on the Clipper Chip, including ways to defeat the key escrow mechanism
    * New encryption algorithms, including algorithms from the former Soviet Union and South Africa, and the RC4 stream cipher
    * The latest protocols for digital signatures, authentication, secure elections, digital cash, and more
    * More detailed information on key management and cryptographic implementations
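    One of the Second Edition additions listed above is the RC4 stream cipher. As a rough illustration only (my own sketch, not the book's C source, and RC4 is long broken, so it should never be used for real security), here is the key-scheduling and keystream-generation logic in a few lines of Python:

        # Illustrative sketch of RC4 (insecure; do not use in practice).
        def rc4_keystream(key: bytes, length: int) -> bytes:
            # Key-scheduling algorithm (KSA)
            S = list(range(256))
            j = 0
            for i in range(256):
                j = (j + S[i] + key[i % len(key)]) % 256
                S[i], S[j] = S[j], S[i]
            # Pseudo-random generation algorithm (PRGA)
            out, i, j = bytearray(), 0, 0
            for _ in range(length):
                i = (i + 1) % 256
                j = (j + S[i]) % 256
                S[i], S[j] = S[j], S[i]
                out.append(S[(S[i] + S[j]) % 256])
            return bytes(out)

        # Encrypt (or decrypt: the same XOR does both) a short message
        message = b"attack at dawn"
        stream = rc4_keystream(b"Key", len(message))
        ciphertext = bytes(m ^ s for m, s in zip(message, stream))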

The Visual Display of Quantitative Information


Edward R. Tufte - 1983
    Theory and practice in the design of data graphics, 250 illustrations of the best (and a few of the worst) statistical graphics, with detailed analysis of how to display data for precise, effective, quick analysis. Design of the high-resolution displays, small multiples. Editing and improving graphics. The data-ink ratio. Time-series, relational graphics, data maps, multivariate designs. Detection of graphical deception: design variation vs. data variation. Sources of deception. Aesthetics and data graphical displays. This is the second edition of The Visual Display of Quantitative Information. Recently published, this new edition provides excellent color reproductions of the many graphics of William Playfair, adds color to other images, and includes all the changes and corrections accumulated during 17 printings of the first edition.

Modern Operating Systems


Andrew S. Tanenbaum - 1992
    What makes an operating system modern? According to author Andrew Tanenbaum, it is the awareness of high-demand computer applications, primarily in the areas of multimedia, parallel and distributed computing, and security. The development of faster and more advanced hardware has driven progress in software, including enhancements to the operating system. It is one thing to run an old operating system on current hardware, and another to effectively leverage current hardware to best serve modern software applications. If you don't believe it, install Windows 3.0 on a modern PC and try surfing the Internet or burning a CD.
    Readers familiar with Tanenbaum's previous text, Operating Systems: Design and Implementation, know the author is a great proponent of simple design and hands-on experimentation. His earlier book came bundled with the source code for an operating system called Minix, a simple variant of Unix and the platform used by Linus Torvalds to develop Linux. Although this book does not come with any source code, he illustrates many of his points with code fragments (C, usually with Unix system calls).
    The first half of Modern Operating Systems focuses on traditional operating systems concepts: processes, deadlocks, memory management, I/O, and file systems. There is nothing groundbreaking in these early chapters, but all topics are well covered, each including sections on current research and a set of student problems. It is enlightening to read Tanenbaum's explanations of the design decisions made by past operating systems gurus, including his view that additional research on the problem of deadlocks is impractical except for "keeping otherwise unemployed graph theorists off the streets."
    It is the second half of the book that differentiates itself from older operating systems texts. Here, each chapter describes an element of what constitutes a modern operating system: awareness of multimedia applications, multiple processors, computer networks, and a high level of security. The chapter on multimedia functionality focuses on such features as handling massive files and providing video-on-demand. Included in the discussion on multiprocessor platforms are clustered computers and distributed computing. Finally, the importance of security is discussed, with a lively enumeration of the scores of ways operating systems can be vulnerable to attack, from password security to computer viruses and Internet worms.
    Included at the end of the book are case studies of two popular operating systems: Unix/Linux and Windows 2000. There is a bias toward the Unix/Linux approach, not surprising given the author's experience and academic bent, but this bias does not detract from Tanenbaum's analysis. Both operating systems are dissected, showing how each implements processes, file systems, memory management, and other operating system fundamentals.
    Tanenbaum's mantra is simple, accessible operating system design. Given that modern operating systems have extensive features, he is forced to reconcile physical size with simplicity. Toward this end, he makes frequent references to Frederick Brooks's classic The Mythical Man-Month for wisdom on managing large, complex software development projects. He finds both Windows 2000 and Unix/Linux guilty of being too complicated, with a particular skewering of Windows 2000 and its "mammoth Win32 API." A primary culprit is the attempt to make operating systems more "user-friendly," which Tanenbaum views as an excuse for bloated code.
    The solution is to have smart people, the smallest possible team, and well-defined interactions between the various operating system components. Future operating system design will benefit if the advice in this book is taken to heart. --Pete Ostenson
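    The review notes that Tanenbaum illustrates his points with C fragments built on Unix system calls. As a rough Python analogue of that style (my own sketch, not an excerpt from the book), the classic fork-and-wait pattern from every process-management chapter looks like this:

        import os

        # Illustrative sketch: create a child process and wait for it (Unix only).
        pid = os.fork()
        if pid == 0:
            # Child: replace this process image with /bin/echo
            os.execv("/bin/echo", ["echo", "hello from the child process"])
        else:
            _, status = os.waitpid(pid, 0)            # parent blocks until the child exits
            print("child", pid, "exited with code", os.WEXITSTATUS(status))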

Introduction to Graph Theory


Douglas B. West - 1995
    Verification that algorithms work is emphasized more than their complexity. An effective use of examples and a huge number of interesting exercises demonstrate the topics of trees and distance, matchings and factors, connectivity and paths, graph coloring, edges and cycles, and planar graphs. For those who need to learn to make coherent arguments in the fields of mathematics and computer science.
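    As a small taste of one topic on that list (my own toy example, not one of West's exercises), a greedy vertex coloring assigns each vertex the smallest color not already used by a neighbor:

        # Illustrative sketch: greedy coloring of an undirected graph
        # given as an adjacency dictionary {vertex: [neighbors]}.
        def greedy_coloring(adj: dict) -> dict:
            colors = {}
            for v in adj:
                taken = {colors[u] for u in adj[v] if u in colors}
                colors[v] = next(c for c in range(len(adj)) if c not in taken)
            return colors

        # A 4-cycle needs only two colors
        cycle4 = {"a": ["b", "d"], "b": ["a", "c"], "c": ["b", "d"], "d": ["c", "a"]}
        print(greedy_coloring(cycle4))                # {'a': 0, 'b': 1, 'c': 0, 'd': 1}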

Humble Pi: A Comedy of Maths Errors


Matt Parker - 2019
    Most of the time this math works quietly behind the scenes . . . until it doesn't. All sorts of seemingly innocuous mathematical mistakes can have significant consequences. Math is easy to ignore until a misplaced decimal point upends the stock market, a unit conversion error causes a plane to crash, or someone divides by zero and stalls a battleship in the middle of the ocean. Exploring and explaining a litany of glitches, near misses, and mathematical mishaps involving the internet, big data, elections, street signs, lotteries, the Roman Empire, and an Olympic team, Matt Parker uncovers the bizarre ways math trips us up, and what this reveals about its essential place in our world. Getting it wrong has never been more fun.

Statistics for People Who (Think They) Hate Statistics


Neil J. Salkind - 2000
    The book begins with an introduction to the language of statistics and then covers descriptive statistics and inferential statistics. Throughout, the author offers readers:
    - Difficulty Rating Index for each chapter's material
    - Tips for doing and thinking about a statistical technique
    - Top tens for everything from the best ways to create a graph to the most effective techniques for data collection
    - Steps that break techniques down into a clear sequence of procedures
    - SPSS tips for executing each major statistical technique
    - Practice exercises at the end of each chapter, followed by worked-out solutions
    The book concludes with a statistical software sampler and a description of the best Internet sites for statistical information and data resources. Readers also have access to a website for downloading data that they can use to practice additional exercises from the book. Students and researchers will appreciate the book's unhurried pace and thorough, friendly presentation.

Data Science for Business: What you need to know about data mining and data-analytic thinking


Foster Provost - 2013
    This guide also helps you understand the many data-mining techniques in use today. Based on an MBA course Provost has taught at New York University over the past ten years, Data Science for Business provides examples of real-world business problems to illustrate these principles. You’ll not only learn how to improve communication between business stakeholders and data scientists, but also how to participate intelligently in your company’s data science projects. You’ll also discover how to think data-analytically, and fully appreciate how data science methods can support business decision-making.
    - Understand how data science fits in your organization, and how you can use it for competitive advantage
    - Treat data as a business asset that requires careful investment if you’re to gain real value
    - Approach business problems data-analytically, using the data-mining process to gather good data in the most appropriate way
    - Learn general concepts for actually extracting knowledge from data
    - Apply data science principles when interviewing data science job candidates

Information Theory, Inference and Learning Algorithms


David J.C. MacKay - 2002
    These topics lie at the heart of many exciting areas of contemporary science and engineering - communication, signal processing, data mining, machine learning, pattern recognition, computational neuroscience, bioinformatics, and cryptography. This textbook introduces theory in tandem with applications. Information theory is taught alongside practical communication systems, such as arithmetic coding for data compression and sparse-graph codes for error-correction. A toolbox of inference techniques, including message-passing algorithms, Monte Carlo methods, and variational approximations, is developed alongside applications of these tools to clustering, convolutional codes, independent component analysis, and neural networks. The final part of the book describes the state of the art in error-correcting codes, including low-density parity-check codes, turbo codes, and digital fountain codes -- the twenty-first century standards for satellite communications, disk drives, and data broadcast. Richly illustrated, filled with worked examples and over 400 exercises, some with detailed solutions, David MacKay's groundbreaking book is ideal for self-learning and for undergraduate or graduate courses. Interludes on crosswords, evolution, and sex provide entertainment along the way. In sum, this is a textbook on information, communication, and coding for a new generation of students, and an unparalleled entry point into these subjects for professionals in areas as diverse as computational biology, financial engineering, and machine learning.
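    As a tiny taste of the subject matter (my illustration, not MacKay's), the Shannon entropy H(X) = -Σ p(x) log2 p(x) that underlies the compression results developed in the book can be computed in a few lines:

        from math import log2

        # Illustrative sketch: Shannon entropy (in bits) of a discrete distribution.
        def entropy(probabilities) -> float:
            return -sum(p * log2(p) for p in probabilities if p > 0)

        print(entropy([0.5, 0.5]))                    # 1.0 bit: a fair coin
        print(entropy([0.9, 0.1]))                    # ≈ 0.469 bits: a biased coin compresses well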

Fundamentals of Astrodynamics


Roger R. Bate - 1971
    Developed at the U.S. Air Force Academy and designed as a first course, this text emphasizes the universal variable formulation. It develops the basic two-body and n-body equations of motion; orbit determination; classical orbital elements; coordinate transformations; differential correction; and more. Includes specialized applications to lunar and interplanetary flight, example problems, and exercises. 1971 edition.
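    In the spirit of the book's example problems (the numbers and helper below are my own, not taken from the text), Kepler's third law T = 2π sqrt(a^3 / μ) gives the period of a two-body orbit:

        from math import pi, sqrt

        MU_EARTH = 398_600.4418                       # km^3/s^2, Earth's gravitational parameter

        # Illustrative sketch: two-body orbital period from Kepler's third law.
        def orbital_period(semi_major_axis_km: float, mu: float = MU_EARTH) -> float:
            return 2 * pi * sqrt(semi_major_axis_km**3 / mu)

        # A roughly circular low Earth orbit at ~400 km altitude (a ≈ 6778 km)
        print(orbital_period(6778.0) / 60)            # ≈ 92.6 minutes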

Chemical Reaction Engineering


Octave Levenspiel - 1962
    Its goal is the successful design and operation of chemical reactors. This text emphasizes qualitative arguments, simple design methods, graphical procedures, and frequent comparison of the capabilities of the major reactor types. Simple ideas are treated first and are then extended to the more complex.