Statistical Mechanics


R.K. Pathria - 1972
    'Highly recommended for graduate-level libraries.' (Choice) This highly successful text, which first appeared in 1972 and has remained popular ever since, has now been brought up to date by incorporating the remarkable developments in the field of phase transitions and critical phenomena that took place over the intervening years. This has been done by adding three new chapters (comprising over 150 pages and containing over 60 homework problems), which should enhance the usefulness of the book for both students and instructors. We trust that this classic text, which has been widely acclaimed for its clean derivations and clear explanations, will continue to provide further generations of students with a sound training in the methods of statistical physics.

Elements of Electromagnetics


Matthew N.O. Sadiku - 1993
    The book also provides a balanced presentation of time-varying and static fields, preparing students for employment in today's industrial and manufacturing sectors. Streamlined to facilitate student understanding, this edition features worked examples in every chapter that explain how to use the theory presented in the text to solve different kinds of problems. Numerical methods, including MATLAB and vector analysis, are also included to help students analyze situations that they are likely to encounter in industry practice. Elements of Electromagnetics, Fifth Edition, is designed for introductory undergraduate courses in electromagnetics.

Principles of Mathematical Analysis


Walter Rudin - 1964
    The text begins with a discussion of the real number system as a complete ordered field. (Dedekind's construction is now treated in an appendix to Chapter 1.) The topological background needed for the development of convergence, continuity, differentiation and integration is provided in Chapter 2. There is a new section on the gamma function, and many new and interesting exercises are included. This text is part of the Walter Rudin Student Series in Advanced Mathematics.

The Art of Computer Programming, Volume 1: Fundamental Algorithms


Donald Ervin Knuth - 1973
    "I can't begin to tell you how many pleasurable hours of study and recreation they have afforded me! I have pored over them in cars, restaurants, at work, at home... and even at a Little League game when my son wasn't in the line-up." (Charles Long) "If you think you're a really good programmer... read [Knuth's] Art of Computer Programming... You should definitely send me a resume if you can read the whole thing." (Bill Gates) "It's always a pleasure when a problem is hard enough that you have to get the Knuths off the shelf. I find that merely opening one has a very useful terrorizing effect on computers." (Jonathan Laventhol) This first volume in the series begins with basic programming concepts and techniques, then focuses more particularly on information structures: the representation of information inside a computer, the structural relationships between data elements, and how to deal with them efficiently. Elementary applications are given to simulation, numerical methods, symbolic computing, software and system design. Dozens of simple and important algorithms and techniques have been added to those of the previous edition. The section on mathematical preliminaries has been extensively revised to match present trends in research. Ebook (PDF version) produced by Mathematical Sciences Publishers (MSP), http://msp.org

Problem-Solving Strategies


Arthur Engel - 1997
    The discussion of problem solving strategies is extensive. It is written for trainers and participants of contests of all levels up to the highest level: IMO, Tournament of the Towns, and the noncalculus parts of the Putnam Competition. It will appeal to high school teachers conducting a mathematics club who need a range of simple to complex problems and to those instructors wishing to pose a "problem of the week", "problem of the month", and "research problem of the year" to their students, thus bringing a creative atmosphere into their classrooms with continuous discussions of mathematical problems. This volume is a must-have for instructors wishing to enrich their teaching with some interesting non-routine problems and for individuals who are just interested in solving difficult and challenging problems. Each chapter starts with typical examples illustrating the central concepts and is followed by a number of carefully selected problems and their solutions. Most of the solutions are complete, but some merely point to the road leading to the final solution. Very few problems have no solutions. Readers interested in increasing the effectiveness of the book can do so by working on the examples in addition to the problems thereby increasing the number of problems to over 1300. In addition to being a valuable resource of mathematical problems and solution strategies, this volume is the most complete training book on the market.

Probability And Statistics For Engineering And The Sciences


Jay L. Devore - 1982
    In this book, a wealth of exercises is provided throughout each section, designed to reinforce learning and the logical comprehension of topics. The use of real data is incorporated much more extensively than in any other book on the market. It offers strong coverage of computer-based methods, especially in the treatment of analysis of variance and regression. This text stresses mastery of methods most often used in medical research, with specific reference to actual medical literature and actual medical research. The approach minimizes mathematical formulation, yet gives complete explanations of all important concepts. Every new concept is systematically developed through completely worked-out examples from current medical research problems. Computer output is used to illustrate concepts when appropriate.

Computational Complexity


Sanjeev Arora - 2007
    Requiring essentially no background apart from mathematical maturity, the book can be used as a reference for self-study by anyone interested in complexity, including physicists, mathematicians, and other scientists, as well as a textbook for a variety of courses and seminars. More than 300 exercises are included, with hints provided for a selected subset.

Field and Wave Electromagnetics


David K. Cheng - 1982
    This edition includes applications drawn from important new areas of technology such as optical fibers, radome design, satellite communication, and microstrip lines. There is also added coverage of several new topics, including the Hall effect, the radar equation and scattering cross section, transients in transmission lines, waveguides and circular cavity resonators, wave propagation in the ionosphere, and helical antennas. New exercises, new problems, and many worked-out examples make this complex material more accessible to students.

All of Statistics: A Concise Course in Statistical Inference


Larry Wasserman - 2003
    Taken literally, the title is an overstatement, but in spirit it is apt: the book covers a much broader range of topics than a typical introductory book on mathematical statistics. This book is for people who want to learn probability and statistics quickly. It is suitable for graduate or advanced undergraduate students in computer science, mathematics, statistics, and related disciplines. The book includes modern topics like nonparametric curve estimation, bootstrapping, and classification, topics that are usually relegated to follow-up courses. The reader is presumed to know calculus and a little linear algebra. No previous knowledge of probability and statistics is required. Statistics, data mining, and machine learning are all concerned with collecting and analyzing data. For some time, statistics research was conducted in statistics departments while data mining and machine learning research was conducted in computer science departments. Statisticians thought that computer scientists were reinventing the wheel. Computer scientists thought that statistical theory didn't apply to their problems. Things are changing. Statisticians now recognize that computer scientists are making novel contributions while computer scientists now recognize the generality of statistical theory and methodology. Clever data mining algorithms are more scalable than statisticians ever thought possible. Formal statistical theory is more pervasive than computer scientists had realized.
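
    As a rough illustration of one of the "modern topics" mentioned above, here is a minimal bootstrap sketch in Python; the synthetic sample, the choice of the median as the statistic, and the number of resamples are assumptions made purely for illustration, not material from the book.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sample; any 1-D array of observations would do.
data = rng.normal(loc=5.0, scale=2.0, size=100)

def bootstrap_se(sample, statistic, n_boot=2000, rng=rng):
    """Estimate the standard error of `statistic` by resampling with replacement."""
    estimates = np.empty(n_boot)
    for b in range(n_boot):
        resample = rng.choice(sample, size=sample.size, replace=True)
        estimates[b] = statistic(resample)
    return estimates.std(ddof=1)

print("median:", np.median(data))
print("bootstrap SE of median:", bootstrap_se(data, np.median))
```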

The Elements of Statistical Learning: Data Mining, Inference, and Prediction


Trevor Hastie - 2001
    The recent explosion in computation and information technology has brought with it vast amounts of data in a variety of fields such as medicine, biology, finance, and marketing. The challenge of understanding these data has led to the development of new tools in the field of statistics, and spawned new areas such as data mining, machine learning, and bioinformatics. Many of these tools have common underpinnings but are often expressed with different terminology. This book describes the important ideas in these areas in a common conceptual framework. While the approach is statistical, the emphasis is on concepts rather than mathematics. Many examples are given, with a liberal use of color graphics. It should be a valuable resource for statisticians and anyone interested in data mining in science or industry. The book's coverage is broad, from supervised learning (prediction) to unsupervised learning. The many topics include neural networks, support vector machines, classification trees, and boosting (the first comprehensive treatment of this topic in any book). Trevor Hastie, Robert Tibshirani, and Jerome Friedman are professors of statistics at Stanford University. They are prominent researchers in this area: Hastie and Tibshirani developed generalized additive models and wrote a popular book of that title. Hastie wrote much of the statistical modeling software in S-PLUS and invented principal curves and surfaces. Tibshirani proposed the Lasso and is co-author of the very successful An Introduction to the Bootstrap. Friedman is the co-inventor of many data-mining tools including CART, MARS, and projection pursuit.
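
    Since the blurb singles out the Lasso among the authors' contributions, here is a minimal hedged sketch of fitting a Lasso model with scikit-learn; the synthetic sparse-coefficient data and the regularization strength alpha=0.1 are assumptions for illustration only, not an example from the book.

```python
import numpy as np
from sklearn.linear_model import Lasso

# Synthetic data with a sparse true coefficient vector (illustrative assumption).
rng = np.random.default_rng(1)
X = rng.normal(size=(150, 10))
true_coef = np.zeros(10)
true_coef[:3] = [2.0, -1.5, 0.5]          # only three features matter
y = X @ true_coef + rng.normal(scale=0.1, size=150)

# L1 regularization drives most estimated coefficients exactly to zero.
model = Lasso(alpha=0.1)
model.fit(X, y)
print("estimated coefficients:", np.round(model.coef_, 2))
```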

Complex Variables and Applications


James Ward Brown - 1960
    It uses examples and exercise sets, with clear explanations of problem-solving techniques and material on the further theory of functions.

Hands-On Machine Learning with Scikit-Learn and TensorFlow


Aurélien Géron - 2017
    Now that machine learning is thriving, even programmers who know close to nothing about this technology can use simple, efficient tools to implement programs capable of learning from data. This practical book shows you how. By using concrete examples, minimal theory, and two production-ready Python frameworks, Scikit-Learn and TensorFlow, author Aurélien Géron helps you gain an intuitive understanding of the concepts and tools for building intelligent systems. You'll learn how to use a range of techniques, starting with simple Linear Regression and progressing to Deep Neural Networks. If you have some programming experience and you're ready to code a machine learning project, this guide is for you. This hands-on book shows you how to use: Scikit-Learn, an accessible framework that implements many algorithms efficiently and serves as a great machine learning entry point; TensorFlow, a more complex library for distributed numerical computation, ideal for training and running very large neural networks; and practical code examples that you can apply without learning excessive machine learning theory or algorithm details.
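
    To give a flavor of the "simple Linear Regression" entry point the blurb mentions, here is a minimal scikit-learn sketch; the synthetic data and the train/test split parameters are illustrative assumptions, not code from the book.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

# Synthetic data: y = 3x + 4 plus noise (illustrative assumption).
rng = np.random.default_rng(42)
X = rng.uniform(0, 10, size=(200, 1))
y = 3.0 * X[:, 0] + 4.0 + rng.normal(scale=1.0, size=200)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = LinearRegression()
model.fit(X_train, y_train)

print("coefficient:", model.coef_[0])
print("intercept:", model.intercept_)
print("R^2 on held-out data:", model.score(X_test, y_test))
```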

Using Econometrics: A Practical Guide


A.H. Studenmund - 1987
    "Using Econometrics: A Practical Guide "provides readers with a practical introduction that combines single-equation linear regression analysis with real-world examples and exercises. This text also avoids complex matrix algebra and calculus, making it an ideal text for beginners. New problem sets and added support make "Using Econometrics" modern and easier to use.

Introduction to Graph Theory


Douglas B. West - 1995
    Verification that algorithms work is emphasized more than their complexity. An effective use of examples and a huge number of interesting exercises demonstrate the topics of trees and distance, matchings and factors, connectivity and paths, graph coloring, edges and cycles, and planar graphs. The book is intended for those who need to learn to make coherent arguments in the fields of mathematics and computer science.

Information Theory, Inference and Learning Algorithms


David J.C. MacKay - 2002
    These topics lie at the heart of many exciting areas of contemporary science and engineering: communication, signal processing, data mining, machine learning, pattern recognition, computational neuroscience, bioinformatics, and cryptography. This textbook introduces theory in tandem with applications. Information theory is taught alongside practical communication systems, such as arithmetic coding for data compression and sparse-graph codes for error correction. A toolbox of inference techniques, including message-passing algorithms, Monte Carlo methods, and variational approximations, is developed alongside applications of these tools to clustering, convolutional codes, independent component analysis, and neural networks. The final part of the book describes the state of the art in error-correcting codes, including low-density parity-check codes, turbo codes, and digital fountain codes, the twenty-first-century standards for satellite communications, disk drives, and data broadcast. Richly illustrated, filled with worked examples and over 400 exercises, some with detailed solutions, David MacKay's groundbreaking book is ideal for self-learning and for undergraduate or graduate courses. Interludes on crosswords, evolution, and sex provide entertainment along the way. In sum, this is a textbook on information, communication, and coding for a new generation of students, and an unparalleled entry point into these subjects for professionals in areas as diverse as computational biology, financial engineering, and machine learning.
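
    As a small taste of the information-theoretic side described above, the sketch below computes the Shannon entropy of a discrete source, the quantity that sets the compression limit which schemes such as arithmetic coding approach; the four-symbol distribution is an arbitrary assumption chosen for illustration, not an example from the book.

```python
import numpy as np

def shannon_entropy(p):
    """Entropy in bits of a discrete distribution given as an array of probabilities."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                     # terms with p = 0 contribute nothing
    return float(-(p * np.log2(p)).sum())

# Hypothetical four-symbol source.
probs = [0.5, 0.25, 0.125, 0.125]
print("entropy:", shannon_entropy(probs), "bits/symbol")   # 1.75 bits/symbol
```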