Spacetime and Geometry: An Introduction to General Relativity


Sean Carroll - 2003
    With an accessible and lively writing style, this textbook introduces modern techniques to what can often be a formal and intimidating subject. Readers are led from the physics of flat spacetime (special relativity), through the intricacies of differential geometry and Einstein's equations, and on to exciting applications such as black holes, gravitational radiation, and cosmology.

Big Java


Cay S. Horstmann - 2002
    Thoroughly updated to include Java 6, the Third Edition of Horstmann's bestselling text helps you absorb computing concepts and programming principles, develop strong problem-solving skills, and become a better programmer, all while exploring the elements of Java that are needed to write real-life programs. A top-notch introductory text for beginners, Big Java, Third Edition is also a thorough reference for students and professionals alike to Java technologies, Internet programming, database access, and many other areas of computer science. Features of the Third Edition: The 'Objects Gradual' approach leads you into object-oriented thinking step by step, from using classes and implementing simple methods all the way to designing your own object-oriented programs. A strong emphasis on test-driven development encourages you to consider outcomes as you write code, so you design better, more usable programs. A helpful "Testing Track" introduces techniques and tools step by step, ensuring that you master one before moving on to the next. New teaching and learning tools in WileyPLUS include a unique assignment checker that enables you to test your programming problems online before you submit them for a grade. Graphics topics are developed gradually throughout the text, conveniently highlighted in separate color-coded sections. Updated coverage is fully compatible with Java 5 and includes a discussion of the latest Java 6 features.

Learning From Data: A Short Course


Yaser S. Abu-Mostafa - 2012
    The techniques of machine learning are widely applied in engineering, science, finance, and commerce. This book is designed for a short course on machine learning. It is a short course, not a hurried course. From over a decade of teaching this material, we have distilled what we believe to be the core topics that every student of the subject should know. We chose the title 'learning from data' because it faithfully describes what the subject is about, and we made it a point to cover the topics in a story-like fashion. Our hope is that the reader can learn all the fundamentals of the subject by reading the book cover to cover.
    Learning from data has distinct theoretical and practical tracks. In this book, we balance the theoretical and the practical, the mathematical and the heuristic. Our criterion for inclusion is relevance. Theory that establishes the conceptual framework for learning is included, and so are heuristics that impact the performance of real learning systems.
    Learning from data is a very dynamic field. Some of the hot techniques and theories at times become just fads, and others gain traction and become part of the field. What we have emphasized in this book are the necessary fundamentals that give any student of learning from data a solid foundation, and enable them to venture out and explore further techniques and theories, or perhaps to contribute their own.
    The authors are professors at California Institute of Technology (Caltech), Rensselaer Polytechnic Institute (RPI), and National Taiwan University (NTU), where this book is the main text for their popular courses on machine learning. The authors also consult extensively with financial and commercial companies on machine learning applications, and have led winning teams in machine learning competitions.

Introduction to Quantum Mechanics


David J. Griffiths - 1994
    The book's two-part coverage organizes topics under basic theory, and assembles an arsenal of approximation schemes with illustrative applications. For physicists and engineers.

Problem-Solving Strategies


Arthur Engel - 1997
    The discussion of problem-solving strategies is extensive. The book is written for trainers and participants of contests of all levels up to the highest level: IMO, Tournament of the Towns, and the noncalculus parts of the Putnam Competition. It will appeal to high school teachers conducting a mathematics club who need a range of simple to complex problems, and to those instructors wishing to pose a "problem of the week", "problem of the month", or "research problem of the year" to their students, thus bringing a creative atmosphere into their classrooms with continuous discussions of mathematical problems. This volume is a must-have for instructors wishing to enrich their teaching with some interesting non-routine problems and for individuals who are just interested in solving difficult and challenging problems. Each chapter starts with typical examples illustrating the central concepts and is followed by a number of carefully selected problems and their solutions. Most of the solutions are complete, but some merely point to the road leading to the final solution. Very few problems have no solutions. Readers interested in increasing the effectiveness of the book can do so by working on the examples in addition to the problems, thereby increasing the number of problems to over 1300. In addition to being a valuable resource of mathematical problems and solution strategies, this volume is the most complete training book on the market.

Statistical Inference


George Casella - 2001
    Starting from the basics of probability, the authors develop the theory of statistical inference using techniques, definitions, and concepts that are statistical and that arise as natural extensions and consequences of previous concepts. The book is intended for readers with a solid mathematics background. It can also be used in a way that stresses the more practical uses of statistical theory, being more concerned with understanding basic statistical concepts and deriving reasonable statistical procedures for a variety of situations, and less concerned with formal optimality investigations.

Applied Predictive Modeling


Max Kuhn - 2013
    Non-mathematical readers will appreciate the intuitive explanations of the techniques, while an emphasis on problem-solving with real data across a wide variety of applications will aid practitioners who wish to extend their expertise. Readers should have knowledge of basic statistical ideas, such as correlation and linear regression analysis. While the text is biased against complex equations, a mathematical background is needed for advanced topics. Dr. Kuhn is a Director of Non-Clinical Statistics at Pfizer Global R&D in Groton, Connecticut. He has been applying predictive models in the pharmaceutical and diagnostic industries for over 15 years and is the author of a number of R packages. Dr. Johnson has more than a decade of statistical consulting and predictive modeling experience in pharmaceutical research and development. He is a co-founder of Arbor Analytics, a firm specializing in predictive modeling, and is a former Director of Statistics at Pfizer Global R&D. His scholarly work centers on the application and development of statistical methodology and learning algorithms. Applied Predictive Modeling covers the overall predictive modeling process, beginning with the crucial steps of data preprocessing, data splitting, and the foundations of model tuning. The text then provides intuitive explanations of numerous common and modern regression and classification techniques, always with an emphasis on illustrating and solving real data problems. Its treatment of practical concerns extends beyond model fitting to topics such as handling class imbalance, selecting predictors, and pinpointing causes of poor model performance, all of which are problems that occur frequently in practice. The text illustrates all parts of the modeling process through many hands-on, real-life examples, and every chapter contains extensive R code.
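
    The preprocessing, splitting, and tuning sequence described above is language-agnostic; the book itself works in R, but a minimal Python/scikit-learn sketch of the same generic workflow (stand-in dataset, arbitrary ridge model, hypothetical parameter grid) might look like this:

        # Illustrative sketch only; this is not the book's R code.
        from sklearn.datasets import load_diabetes
        from sklearn.model_selection import train_test_split, GridSearchCV
        from sklearn.pipeline import Pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.linear_model import Ridge

        X, y = load_diabetes(return_X_y=True)  # stand-in dataset

        # Data splitting: hold out a test set before any tuning.
        X_train, X_test, y_train, y_test = train_test_split(
            X, y, test_size=0.25, random_state=0)

        # Preprocessing lives inside the pipeline so the scaler is fit
        # only on the training folds during cross-validation.
        pipe = Pipeline([("scale", StandardScaler()), ("model", Ridge())])

        # Model tuning: cross-validated search over the penalty strength.
        search = GridSearchCV(pipe, {"model__alpha": [0.01, 0.1, 1.0, 10.0]}, cv=5)
        search.fit(X_train, y_train)
        print(search.best_params_, search.score(X_test, y_test))

    The held-out test set is touched only after tuning is finished, which mirrors the data-splitting discipline the description emphasizes.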

Introductory Circuit Analysis


Robert L. Boylestad - 1968
    Features exceptionally clear explanations and descriptions, step-by-step examples, more than 50 practical applications, over 2000 easy-to-challenging practice problems, and comprehensive coverage of essentials. PSpice, OrCAD version 9.2 Lite Edition, the Multisim 2001 version of Electronics Workbench, and MathCad software references and examples are used throughout. Computer programs (C++, BASIC, and PSpice) are printed in color, as they run, at the point in the book where they are discussed. Current and Voltage. Resistance. Ohm's Law, Power, and Energy. Series Circuits. Parallel Circuits. Series-Parallel Networks. Methods of Analysis and Selected Topics. Network Theorems. Capacitors. Magnetic Circuits. Inductors. Sinusoidal Alternating Waveforms. The Basic Elements and Phasors. Series and Parallel ac Circuits. Series-Parallel ac Networks. Methods of Analysis and Related Topics. Network Theorems (ac). Power (ac). Resonance. Transformers. Polyphase Systems. Decibels, Filters, and Bode Plots. Pulse Waveforms and the R-C Response. Nonsinusoidal Circuits. System Analysis: An Introduction. For those working in electronic technology.
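
    As a quick illustration of the "Ohm's Law, Power, and Energy" topic listed above (a generic worked example, not one taken from the book), a 12 V source across a 100 Ω resistor gives

        \[
        I = \frac{V}{R} = \frac{12\ \text{V}}{100\ \Omega} = 0.12\ \text{A},
        \qquad
        P = VI = 12\ \text{V} \times 0.12\ \text{A} = 1.44\ \text{W},
        \]

    so over one hour the dissipated energy is W = Pt = 1.44 W × 3600 s = 5184 J (1.44 Wh).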

Engineering a Compiler


Keith D. Cooper - 2003
    No longer is execution speed the sole criterion for judging compiled code. Today, code might be judged on how small it is, how much power it consumes, how well it compresses, or how many page faults it generates. In this evolving environment, the task of building a successful compiler relies upon the compiler writer's ability to balance and blend algorithms, engineering insights, and careful planning. Today's compiler writer must choose a path through a design space that is filled with diverse alternatives, each with distinct costs, advantages, and complexities. Engineering a Compiler explores this design space by presenting some of the ways these problems have been solved, and the constraints that made each of those solutions attractive. By understanding the parameters of the problem and their impact on compiler design, the authors hope to convey both the depth of the problems and the breadth of possible solutions. Their goal is to cover a broad enough selection of material to show readers that real tradeoffs exist, and that the impact of those choices can be both subtle and far-reaching. Authors Keith Cooper and Linda Torczon convey both the art and the science of compiler construction and show best practice algorithms for the major passes of a compiler. Their text re-balances the curriculum for an introductory course in compiler construction to reflect the issues that arise in current practice.

Hints & Solutions of Problems in Calculus for JEE Main & Advanced


Sameer Bansal
    Hints and solutions to the GRB title Problems in Calculus for JEE Main & Advanced by Sameer Bansal.

R for Data Science: Import, Tidy, Transform, Visualize, and Model Data


Hadley Wickham - 2016
    This book introduces you to R, RStudio, and the tidyverse, a collection of R packages designed to work together to make data science fast, fluent, and fun. Suitable for readers with no previous programming experience, R for Data Science is designed to get you doing data science as quickly as possible. Authors Hadley Wickham and Garrett Grolemund guide you through the steps of importing, wrangling, exploring, and modeling your data and communicating the results. You’ll get a complete, big-picture understanding of the data science cycle, along with basic tools you need to manage the details. Each section of the book is paired with exercises to help you practice what you’ve learned along the way. You’ll learn how to: Wrangle—transform your datasets into a form convenient for analysis; Program—learn powerful R tools for solving data problems with greater clarity and ease; Explore—examine your data, generate hypotheses, and quickly test them; Model—provide a low-dimensional summary that captures true "signals" in your dataset; Communicate—learn R Markdown for integrating prose, code, and results.

Reinforcement Learning: An Introduction


Richard S. Sutton - 1998
    Reinforcement learning, one of the most active research areas in artificial intelligence, is a computational approach to learning whereby an agent tries to maximize the total amount of reward it receives when interacting with a complex, uncertain environment. In Reinforcement Learning, Richard Sutton and Andrew Barto provide a clear and simple account of the key ideas and algorithms of reinforcement learning. Their discussion ranges from the history of the field's intellectual foundations to the most recent developments and applications. The only necessary mathematical background is familiarity with elementary concepts of probability. The book is divided into three parts. Part I defines the reinforcement learning problem in terms of Markov decision processes. Part II provides basic solution methods: dynamic programming, Monte Carlo methods, and temporal-difference learning. Part III presents a unified view of the solution methods and incorporates artificial neural networks, eligibility traces, and planning; the two final chapters present case studies and consider the future of reinforcement learning.
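
    To make the "maximize the total amount of reward" objective concrete (the standard formulation used in the field, not text quoted from the book), the agent seeks to maximize the expected discounted return

        \[
        G_t = R_{t+1} + \gamma R_{t+2} + \gamma^2 R_{t+3} + \cdots
            = \sum_{k=0}^{\infty} \gamma^{k} R_{t+k+1},
        \qquad 0 \le \gamma \le 1,
        \]

    and the temporal-difference learning named in Part II updates a state-value estimate toward a bootstrapped target after every step:

        \[
        V(S_t) \leftarrow V(S_t) + \alpha \bigl[ R_{t+1} + \gamma V(S_{t+1}) - V(S_t) \bigr].
        \]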