Book picks similar to The Electrical Engineering Handbook by Richard C. Dorf
Concrete Mathematics: A Foundation for Computer Science
Ronald L. Graham - 1988
"More concretely," the authors explain, "it is the controlled manipulation of mathematical formulas, using a collection of techniques for solving problems."
Building Wireless Sensor Networks
Robert Faludi - 2010
By the time you're halfway through this fast-paced, hands-on guide, you'll have built a series of useful projects, including a complete ZigBee wireless network that delivers remotely sensed data. Radio networking is creating revolutions in volcano monitoring, performance art, clean energy, and consumer electronics. As you follow the examples in each chapter, you'll learn how to tackle inspiring projects of your own. This practical guide is ideal for inventors, hackers, crafters, students, hobbyists, and scientists.
- Investigate an assortment of practical and intriguing project ideas
- Prep your ZigBee toolbox with an extensive shopping list of parts and programs
- Create a simple, working ZigBee network with XBee radios in less than two hours -- for under $100
- Use the Arduino open source electronics prototyping platform to build a series of increasingly complex projects
- Get familiar with XBee's API mode for creating sensor networks
- Build fully scalable sensing and actuation systems with inexpensive components
- Learn about power management, source routing, and other XBee technical nuances
- Make gateways that connect with neighboring networks, including the Internet
Artificial Intelligence: A Modern Approach
Stuart Russell - 1994
The long-anticipated revision of this best-selling text offers the most comprehensive, up-to-date introduction to the theory and practice of artificial intelligence.
- NEW: Nontechnical learning material accompanies each part of the book.
- NEW: The Internet as a sample application for intelligent systems, added in several places including logical agents, planning, and natural language.
- NEW: Increased coverage of material, including expanded coverage of default reasoning and truth maintenance systems (including multi-agent/distributed AI and game theory), probabilistic approaches to learning including EM, and more detailed descriptions of probabilistic inference algorithms.
- NEW: Updated and expanded exercises -- 75% of the exercises are revised, with 100 new exercises.
- NEW: On-line Java software makes it easy for students to do projects on the web using intelligent agents.
- A unified, agent-based approach to AI organizes the material around the task of building intelligent agents.
- Comprehensive, up-to-date coverage includes a unified view of the field organized around the rational decision making paradigm.
Digital Design and Computer Architecture
David Money Harris - 2007
Digital Design and Computer Architecture begins with a modern approach by rigorously covering the fundamentals of digital logic design and then introducing Hardware Description Languages (HDLs). Featuring examples of the two most widely used HDLs, VHDL and Verilog, the first half of the text prepares the reader for what follows in the second: the design of a MIPS processor. By the end of Digital Design and Computer Architecture, readers will be able to build their own microprocessor and will have a top-to-bottom understanding of how it works--even if they have no formal background in design or architecture beyond an introductory class. David Harris and Sarah Harris combine an engaging and humorous writing style with an updated and hands-on approach to digital design.
- Unique presentation of digital logic design from the perspective of computer architecture, using a real instruction set, MIPS.
- Side-by-side examples of the two most prominent Hardware Description Languages--VHDL and Verilog--illustrate and compare the ways each can be used in the design of digital systems.
- Worked examples conclude each section to enhance the reader's understanding and retention of the material.
Tune to Win
Carroll Smith - 1978
An exceptional book written by a true professional.
Feynman Lectures On Computation
Richard P. Feynman - 1996
When Feynman gave his famous course on computation at the California Institute of Technology, he asked Tony Hey to adapt his lecture notes into a book. Although led by Feynman, the course also featured, as occasional guest speakers, some of the most brilliant men in science at that time, including Marvin Minsky, Charles Bennett, and John Hopfield. Although the lectures are now thirteen years old, most of the material is timeless and presents a “Feynmanesque” overview of many standard and some not-so-standard topics in computer science, such as reversible logic gates and quantum computers.
Two Scoops of Django: Best Practices for Django 1.6
Daniel Roy Greenfeld - 2014
Computer Organization & Design: The Hardware/Software Interface
David A. Patterson - 1993
"More importantly, this book provides a framework for thinking about computer organization and design that will enable the reader to continue the lifetime of learning necessary for staying at the forefront of this competitive discipline." --John Crawford, Intel Fellow and Director of Microprocessor Architecture, Intel
The performance of software systems is dramatically affected by how well software designers understand the basic hardware technologies at work in a system. Similarly, hardware designers must understand the far-reaching effects their design decisions have on software applications. For readers in either category, this classic introduction to the field provides a deep look into the computer. It demonstrates the relationship between the software and hardware and focuses on the foundational concepts that are the basis for current computer design. Using a distinctive "learning by evolution" approach, the authors present each idea from its first principles, guiding readers through a series of worked examples that incrementally add more complex instructions until they ha
Think Complexity: Complexity Science and Computational Modeling
Allen B. Downey - 2009
Whether you’re an intermediate-level Python programmer or a student of computational modeling, you’ll delve into examples of complex systems through a series of exercises, case studies, and easy-to-understand explanations. You’ll work with graphs, algorithm analysis, scale-free networks, and cellular automata, using advanced features that make Python such a powerful language. Ideal as a text for courses on Python programming and algorithms, Think Complexity will also help self-learners gain valuable experience with topics and ideas they might not encounter otherwise.
- Work with NumPy arrays and SciPy methods, basic signal processing and Fast Fourier Transform, and hash tables
- Study abstract models of complex physical systems, including power laws, fractals and pink noise, and Turing machines
- Get starter code and solutions to help you re-implement and extend original experiments in complexity
- Explore the philosophy of science, including the nature of scientific laws, theory choice, realism and instrumentalism, and other topics
- Examine case studies of complex systems submitted by students and readers
Physics, Volume 1
Robert Resnick - 1966
The Fourth Edition of volumes 1 and 2 is concerned with mechanics and E&M/Optics. New features include: expanded coverage of classical physics topics, substantial increases in the number of in-text examples which reinforce text exposition, the latest pedagogical and technical advances in the field, numerical analysis, computer-generated graphics, computer projects and much more.
Turing's Cathedral: The Origins of the Digital Universe
George Dyson - 2012
In Turing’s Cathedral, George Dyson focuses on a small group of men and women, led by John von Neumann at the Institute for Advanced Study in Princeton, New Jersey, who built one of the first computers to realize Alan Turing’s vision of a Universal Machine. Their work would break the distinction between numbers that mean things and numbers that do things—and our universe would never be the same. Using five kilobytes of memory (the amount allocated to displaying the cursor on a computer desktop of today), they achieved unprecedented success in both weather prediction and nuclear weapons design, while tackling, in their spare time, problems ranging from the evolution of viruses to the evolution of stars. Dyson’s account, both historic and prophetic, sheds important new light on how the digital universe exploded in the aftermath of World War II. The proliferation of both codes and machines was paralleled by two historic developments: the decoding of self-replicating sequences in biology and the invention of the hydrogen bomb. It’s no coincidence that the most destructive and the most constructive of human inventions appeared at exactly the same time. How did code take over the world? In retracing how Alan Turing’s one-dimensional model became John von Neumann’s two-dimensional implementation, Turing’s Cathedral offers a series of provocative suggestions as to where the digital universe, now fully three-dimensional, may be heading next.
Digital Integrated Circuits
Jan M. Rabaey - 1995
Digital Integrated Circuits maintains a consistent, logical flow of subject matter throughout. It addresses today's most significant and compelling industry topics, including the impact of interconnect, design for low power, issues in timing and clocking, design methodologies, and the tremendous effect of design automation on the digital design perspective. Intended for readers interested in digital circuit design.
An Introduction to Statistical Learning: With Applications in R
Gareth James - 2013
This book presents some of the most important modeling and prediction techniques, along with relevant applications. Topics include linear regression, classification, resampling methods, shrinkage approaches, tree-based methods, support vector machines, clustering, and more. Color graphics and real-world examples are used to illustrate the methods presented. Since the goal of this textbook is to facilitate the use of these statistical learning techniques by practitioners in science, industry, and other fields, each chapter contains a tutorial on implementing the analyses and methods presented in R, an extremely popular open source statistical software platform. Two of the authors co-wrote The Elements of Statistical Learning (Hastie, Tibshirani and Friedman, 2nd edition 2009), a popular reference book for statistics and machine learning researchers. An Introduction to Statistical Learning covers many of the same topics, but at a level accessible to a much broader audience. This book is targeted at statisticians and non-statisticians alike who wish to use cutting-edge statistical learning techniques to analyze their data. The text assumes only a previous course in linear regression and no knowledge of matrix algebra.
Ray Tracing in One Weekend (Ray Tracing Minibooks Book 1)
Peter Shirley - 2016
Each mini-chapter adds one feature to the ray tracer, and by the end the reader can produce the image on the book cover. Details of basic ray tracing code architecture and C++ classes are given.