Physical Examination and Health Assessment [With CDROM]


Carolyn Jarvis - 1941
    The physical examination unit is organized by body system, pedagogically and clinically the most logical and efficient way to learn and perform health assessment. Each chapter has five major sections: (1) Structure and Function (A&P); (2) Subjective Data (history); (3) Objective Data (skills, expected findings, and common variations for healthy people and selected abnormal findings); (4) Abnormal Findings (illustrations of related disorders and conditions in atlas format); and (5) Application and Documentation (sample charting, clinical case studies, nursing diagnoses, and critical thinking questions tied to the Saunders video series).

Probability Theory: The Logic of Science


E.T. Jaynes - 1999
    The book discusses new results, along with applications of probability theory to a variety of problems. It contains many exercises and is suitable for use as a textbook in graduate-level courses involving data analysis. Aimed at readers already familiar with applied mathematics at an advanced undergraduate level or higher, it is of interest to scientists concerned with inference from incomplete information.

Sensation and Perception


E. Bruce Goldstein - 1980
    Bruce Goldstein's SENSATION AND PERCEPTION has helped more than 100,000 students make the connection between perception and physiology. Goldstein has crafted an easier-to-understand, more student-friendly book without sacrificing the text's comprehensive examination of sensation and perception. Goldstein takes readers on an intriguing journey through their senses and chronicles scientists' efforts to understand the fascinating behind-the-scenes activity that allows us to perceive. With balanced coverage of all senses, this book offers an integrated examination of how the senses work together. Goldstein shows readers how seemingly simple experiences rest on extremely complex mechanisms and examines both the psychophysical and physiological underpinnings of perception. All material is presented in a way students find interesting and easy to follow. The book's visually dynamic presentation includes numerous color plates presented as visual topic essays. In addition, more than 50 hands-on demonstrations illustrate perceptual experiences. All are simple enough for students to do and are seamlessly integrated into the flow of the text.

Spreadsheet Modeling & Decision Analysis: A Practical Introduction to Management Science


Cliff T. Ragsdale - 1997
    Everything you need to master the most widely used management science techniques using Microsoft Excel is right here. Learning to make decisions in today's business world takes training and experience. Cliff Ragsdale--the respected innovator in the field of management science--is an outstanding guide to help you learn the skills you need, use Microsoft Excel for Windows to implement those skills, and gain the confidence to apply what you learn to real business situations. SPREADSHEET MODELING AND DECISION ANALYSIS gives you step-by-step instructions and annotated screen shots to make examples easy to follow. Plus, interesting sections called The World of Management Science show you how each topic has been applied in a real company.

Ansel's Pharmaceutical Dosage Forms and Drug Delivery Systems


Loyd V. Allen Jr. - 2004
    Each chapter in this revised Eighth Edition includes two case studies—one clinical and one pharmaceutical. Content coincides with the CAPE, APhA, and NAPLEX competencies. This edition includes updated drug information and expanded sections on parenterals, excipients, liposomes, and biopharmaceutics. Coverage incorporates all new dosage forms in the current USP Pharmacopoeia-National Formulary. Capsules and tablets are now covered in separate chapters. The thoroughly revamped illustration program includes new product and manufacturing equipment photographs.

The Elements of Statistical Learning: Data Mining, Inference, and Prediction


Trevor Hastie - 2001
    The explosion in computation and information technology has brought with it vast amounts of data in a variety of fields such as medicine, biology, finance, and marketing. The challenge of understanding these data has led to the development of new tools in the field of statistics, and spawned new areas such as data mining, machine learning, and bioinformatics. Many of these tools have common underpinnings but are often expressed with different terminology. This book describes the important ideas in these areas in a common conceptual framework. While the approach is statistical, the emphasis is on concepts rather than mathematics. Many examples are given, with a liberal use of color graphics. It should be a valuable resource for statisticians and anyone interested in data mining in science or industry. The book's coverage is broad, from supervised learning (prediction) to unsupervised learning. The many topics include neural networks, support vector machines, classification trees and boosting—the first comprehensive treatment of this topic in any book. Trevor Hastie, Robert Tibshirani, and Jerome Friedman are professors of statistics at Stanford University. They are prominent researchers in this area: Hastie and Tibshirani developed generalized additive models and wrote a popular book of that title. Hastie wrote much of the statistical modeling software in S-PLUS and invented principal curves and surfaces. Tibshirani proposed the Lasso and is co-author of the very successful An Introduction to the Bootstrap. Friedman is the co-inventor of many data-mining tools including CART, MARS, and projection pursuit.

Design Patterns: Elements of Reusable Object-Oriented Software


Erich Gamma - 1994
    Previously undocumented, these 23 patterns allow designers to create more flexible, elegant, and ultimately reusable designs without having to rediscover the design solutions themselves. The authors begin by describing what patterns are and how they can help you design object-oriented software. They then go on to systematically name, explain, evaluate, and catalog recurring designs in object-oriented systems. With Design Patterns as your guide, you will learn how these important patterns fit into the software development process, and how you can leverage them to solve your own design problems most efficiently. Each pattern describes the circumstances in which it is applicable, when it can be applied in view of other design constraints, and the consequences and trade-offs of using the pattern within a larger design. All patterns are compiled from real systems and are based on real-world examples. Each pattern also includes code that demonstrates how it may be implemented in object-oriented programming languages like C++ or Smalltalk.
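One of the cataloged patterns, Observer, gives a feel for the structure each entry describes. The book's own examples are in C++ and Smalltalk; this is only a hedged Python analogue, and the class names here are illustrative rather than taken from the text:

```python
# Sketch of the Observer pattern: a subject keeps a list of observers
# and notifies each of them when its state changes.

class Subject:
    def __init__(self):
        self._observers = []

    def attach(self, observer):
        """Register an observer to be notified of future events."""
        self._observers.append(observer)

    def notify(self, event):
        """Push an event to every registered observer."""
        for observer in self._observers:
            observer.update(event)


class Logger:
    """A concrete observer that simply records events it receives."""
    def __init__(self):
        self.events = []

    def update(self, event):
        self.events.append(event)


subject = Subject()
logger = Logger()
subject.attach(logger)
subject.notify("state changed")
```

The point the pattern makes is decoupling: the subject knows only that observers expose an `update` method, not what they do with the event.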

Intern


Sandeep Jauhar - 2007
    Residency--and especially the first year, called internship--is legendary for its brutality. Working eighty hours or more per week, most new doctors spend their first year asking themselves why they wanted to be doctors in the first place. Jauhar's internship was even more harrowing than most: he switched from physics to medicine in order to follow a more humane calling--only to find that medicine put patients' concerns last. He struggled to find a place among squadrons of cocky residents and doctors. He challenged the practices of the internship in The New York Times, attracting the suspicions of the medical bureaucracy. Then, suddenly stricken, he became a patient himself--and came to see that today's high-tech, high-pressure medicine can be a humane science after all. Now a thriving cardiologist, Jauhar has all the qualities you'd want in your own doctor: expertise, insight, a feel for the human factor, a sense of humor, and a keen awareness of the worries that we all have in common. His beautifully written memoir explains the inner workings of modern medicine with rare candor and insight. "In Jauhar's wise memoir of his two-year ordeal of doubt and sleep deprivation at a New York hospital, he takes readers to the heart of every young physician's hardest test: to become a doctor yet remain a human being." ― Time

Modern Operating Systems


Andrew S. Tanenbaum - 1992
    What makes an operating system modern? According to author Andrew Tanenbaum, it is the awareness of high-demand computer applications--primarily in the areas of multimedia, parallel and distributed computing, and security. The development of faster and more advanced hardware has driven progress in software, including enhancements to the operating system. It is one thing to run an old operating system on current hardware, and another to effectively leverage current hardware to best serve modern software applications. If you don't believe it, install Windows 3.0 on a modern PC and try surfing the Internet or burning a CD. Readers familiar with Tanenbaum's previous text, Operating Systems, know the author is a great proponent of simple design and hands-on experimentation. His earlier book came bundled with the source code for an operating system called Minix, a simple variant of Unix and the platform used by Linus Torvalds to develop Linux. Although this book does not come with any source code, he illustrates many of his points with code fragments (C, usually with Unix system calls). The first half of Modern Operating Systems focuses on traditional operating systems concepts: processes, deadlocks, memory management, I/O, and file systems. There is nothing groundbreaking in these early chapters, but all topics are well covered, each including sections on current research and a set of student problems. It is enlightening to read Tanenbaum's explanations of the design decisions made by past operating systems gurus, including his view that additional research on the problem of deadlocks is impractical except for "keeping otherwise unemployed graph theorists off the streets." It is the second half of the book that differentiates itself from older operating systems texts. Here, each chapter describes an element of what constitutes a modern operating system--awareness of multimedia applications, multiple processors, computer networks, and a high level of security.
The chapter on multimedia functionality focuses on such features as handling massive files and providing video-on-demand. Included in the discussion on multiprocessor platforms are clustered computers and distributed computing. Finally, the importance of security is discussed--a lively enumeration of the scores of ways operating systems can be vulnerable to attack, from password security to computer viruses and Internet worms. Included at the end of the book are case studies of two popular operating systems: Unix/Linux and Windows 2000. There is a bias toward the Unix/Linux approach, not surprising given the author's experience and academic bent, but this bias does not detract from Tanenbaum's analysis. Both operating systems are dissected, describing how each implements processes, file systems, memory management, and other operating system fundamentals. Tanenbaum's mantra is simple, accessible operating system design. Given that modern operating systems have extensive features, he is forced to reconcile physical size with simplicity. Toward this end, he makes frequent references to the Frederick Brooks classic The Mythical Man-Month for wisdom on managing large, complex software development projects. He finds both Windows 2000 and Unix/Linux guilty of being too complicated--with a particular skewering of Windows 2000 and its "mammoth Win32 API." A primary culprit is the attempt to make operating systems more "user-friendly," which Tanenbaum views as an excuse for bloated code. The solution is to have smart people, the smallest possible team, and well-defined interactions between various operating systems components. Future operating system design will benefit if the advice in this book is taken to heart. --Pete Ostenson

Introduction to Algorithms


Thomas H. Cormen - 1989
    Each chapter is relatively self-contained and can be used as a unit of study. The algorithms are described in English and in a pseudocode designed to be readable by anyone who has done a little programming. The explanations have been kept elementary without sacrificing depth of coverage or mathematical rigor.
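The pseudocode style the blurb mentions translates almost line-for-line into real code. As an illustrative sketch (not a listing from the book), here is the insertion-sort procedure from its opening chapters rendered in Python, shifted from the book's 1-indexed arrays to Python's 0-indexed lists:

```python
def insertion_sort(a):
    """Sort list a in place by inserting each element into the
    already-sorted prefix to its left; returns a for convenience."""
    for j in range(1, len(a)):
        key = a[j]
        i = j - 1
        # Shift larger elements of the sorted prefix one slot right.
        while i >= 0 and a[i] > key:
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = key
    return a
```

Anyone who has done a little programming can map each Python line back to the corresponding pseudocode line, which is exactly the readability the authors aim for.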

Abstract Algebra


David S. Dummit - 1900
    This book is designed to give the reader insight into the power and beauty that accrues from a rich interplay between different areas of mathematics. The book carefully develops the theory of different algebraic structures, beginning from basic definitions to some in-depth results, using numerous examples and exercises to aid the reader's understanding. In this way, readers gain an appreciation for how mathematical structures and their interplay lead to powerful results and insights in a number of different settings. The emphasis throughout has been to motivate the introduction and development of important algebraic concepts using as many examples as possible.

Mechanics of Materials, SI Edition


James M. Gere - 2002
    Problems and examples are converted to metric units using realistic data to help students grasp what is feasible in engineering practice.

The Norton Anthology of English Literature, Volume 1: The Middle Ages through the Restoration & the Eighteenth Century


M.H. Abrams - 1962
    Under the direction of Stephen Greenblatt, General Editor, the editors have reconsidered all aspects of the anthology to make it an even better teaching tool.

Information Theory, Inference and Learning Algorithms


David J.C. MacKay - 2002
    These topics lie at the heart of many exciting areas of contemporary science and engineering - communication, signal processing, data mining, machine learning, pattern recognition, computational neuroscience, bioinformatics, and cryptography. This textbook introduces theory in tandem with applications. Information theory is taught alongside practical communication systems, such as arithmetic coding for data compression and sparse-graph codes for error-correction. A toolbox of inference techniques, including message-passing algorithms, Monte Carlo methods, and variational approximations, is developed alongside applications of these tools to clustering, convolutional codes, independent component analysis, and neural networks. The final part of the book describes the state of the art in error-correcting codes, including low-density parity-check codes, turbo codes, and digital fountain codes -- the twenty-first century standards for satellite communications, disk drives, and data broadcast. Richly illustrated, filled with worked examples and over 400 exercises, some with detailed solutions, David MacKay's groundbreaking book is ideal for self-learning and for undergraduate or graduate courses. Interludes on crosswords, evolution, and sex provide entertainment along the way. In sum, this is a textbook on information, communication, and coding for a new generation of students, and an unparalleled entry point into these subjects for professionals in areas as diverse as computational biology, financial engineering, and machine learning.
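A small illustration of the kind of quantity the book builds everything on (a sketch for orientation, not a listing from the text): the Shannon entropy of a source, in bits per symbol, bounds how far lossless coders such as the arithmetic coder mentioned above can compress it.

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum p * log2(p), in bits, of a discrete
    distribution given as a list of probabilities; zero-probability
    outcomes contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of information per toss:
entropy([0.5, 0.5])  # → 1.0
```

A biased coin has entropy below 1 bit, which is precisely why its toss sequences are compressible at all.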

The Second Brain: A Groundbreaking New Understanding of Nervous Disorders of the Stomach and Intestine


Michael D. Gershon - 1998
    “…hopeful news [for those] suffering from functional bowel disease.” — New York Times Book Review. Dr. Gershon’s groundbreaking book fills the gap between what you need to know—and what your doctor has time to tell you. Dr. Michael Gershon has devoted his career to understanding the human bowel (the stomach, esophagus, small intestine, and colon). His thirty years of research have led to an extraordinary rediscovery: nerve cells in the gut that act as a brain. This "second brain" can control our gut all by itself. Our two brains—the one in our head and the one in our bowel—must cooperate. If they do not, then there is chaos in the gut and misery in the head—everything from "butterflies" to cramps, from diarrhea to constipation. Dr. Gershon's work has led to radical new understandings about a wide range of gastrointestinal problems including gastroenteritis, nervous stomach, and irritable bowel syndrome. The Second Brain represents a quantum leap in medical knowledge and is already benefiting patients whose symptoms were previously dismissed as neurotic or "it's all in your head."