Book picks similar to Mathematical Foundations of Neuroscience by G. Bard Ermentrout
neuroscience
mathematics
textbook
Information Theory, Inference, and Learning Algorithms
David J.C. MacKay - 2002
These topics lie at the heart of many exciting areas of contemporary science and engineering: communication, signal processing, data mining, machine learning, pattern recognition, computational neuroscience, bioinformatics, and cryptography. This textbook introduces theory in tandem with applications. Information theory is taught alongside practical communication systems, such as arithmetic coding for data compression and sparse-graph codes for error correction. A toolbox of inference techniques, including message-passing algorithms, Monte Carlo methods, and variational approximations, is developed alongside applications of these tools to clustering, convolutional codes, independent component analysis, and neural networks. The final part of the book describes the state of the art in error-correcting codes, including low-density parity-check codes, turbo codes, and digital fountain codes, the twenty-first-century standards for satellite communications, disk drives, and data broadcast. Richly illustrated, filled with worked examples and over 400 exercises, some with detailed solutions, David MacKay's groundbreaking book is ideal for self-learning and for undergraduate or graduate courses. Interludes on crosswords, evolution, and sex provide entertainment along the way. In sum, this is a textbook on information, communication, and coding for a new generation of students, and an unparalleled entry point into these subjects for professionals in areas as diverse as computational biology, financial engineering, and machine learning.
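As a small illustration of the territory the book covers, here is a minimal sketch, not drawn from the text itself, that computes the Shannon entropy of a toy four-symbol source and compares it with the average length of a matching prefix code; the distribution and code lengths are invented for the example.

```python
# Shannon entropy H(X) = -sum_x p(x) log2 p(x): the quantity that
# lower-bounds the average number of bits any lossless code needs per symbol.
from math import log2

def entropy(probs):
    """Shannon entropy (in bits) of a discrete distribution given as probabilities."""
    return -sum(p * log2(p) for p in probs if p > 0)

if __name__ == "__main__":
    # Toy source: four symbols with dyadic probabilities (chosen for illustration).
    p = [0.5, 0.25, 0.125, 0.125]
    print(f"H(X) = {entropy(p):.3f} bits")              # 1.750 bits
    # A prefix code with codeword lengths 1, 2, 3, 3 meets this bound exactly,
    # because each probability is a power of 1/2.
    lengths = [1, 2, 3, 3]
    avg_len = sum(pi * li for pi, li in zip(p, lengths))
    print(f"average code length = {avg_len:.3f} bits")  # 1.750 bits
```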
A Book of Abstract Algebra
Charles C. Pinter - 1982
Its easy-to-read treatment offers an intuitive approach, featuring informal discussions followed by thematically arranged exercises. Intended for undergraduate courses in abstract algebra, it is suitable for junior- and senior-level math majors and future math teachers. This second edition features additional exercises to improve student familiarity with applications. An introductory chapter traces concepts of abstract algebra from their historical roots. Succeeding chapters avoid the conventional format of definition-theorem-proof-corollary-example; instead, they take the form of a discussion with students, focusing on explanations and offering motivation. Each chapter rests upon a central theme, usually a specific application or use. The author provides elementary background as needed and discusses standard topics in their usual order. He introduces many advanced and peripheral subjects in the plentiful exercises, which are accompanied by ample instruction and commentary and offer a wide range of experiences to students at different levels of ability.
Networks of the Brain
Olaf Sporns - 2010
Increasingly, science is concerned with the structure, behavior, and evolution of complex systems ranging from cells to ecosystems. In Networks of the Brain, Olaf Sporns describes how the integrative nature of brain function can be illuminated from a complex network perspective. Highlighting the many emerging points of contact between neuroscience and network science, the book serves to introduce network theory to neuroscientists and neuroscience to those working on theoretical network models. Sporns emphasizes how networks connect levels of organization in the brain and how they link structure to function, offering an informal and nonmathematical treatment of the subject. Networks of the Brain provides a synthesis of the sciences of complex networks and the brain that will be an essential foundation for future research.
Therapeutic Exercise: Foundations and Techniques
Carolyn Kisner - 1990
It covers isokinetics, soft tissue injury repair, surgical procedures, exercise rehabilitation, post-operative management, and posture.
Human Compatible: Artificial Intelligence and the Problem of Control
Stuart Russell - 2019
Conflict between humans and machines is seen as inevitable and its outcome all too predictable. In this groundbreaking book, distinguished AI researcher Stuart Russell argues that this scenario can be avoided, but only if we rethink AI from the ground up. Russell begins by exploring the idea of intelligence in humans and in machines. He describes the near-term benefits we can expect, from intelligent personal assistants to vastly accelerated scientific research, and outlines the AI breakthroughs that still have to happen before we reach superhuman AI. He also spells out the ways humans are already finding to misuse AI, from lethal autonomous weapons to viral sabotage. If the predicted breakthroughs occur and superhuman AI emerges, we will have created entities far more powerful than ourselves. How can we ensure they never, ever have power over us? Russell suggests that we can rebuild AI on a new foundation, according to which machines are designed to be inherently uncertain about the human preferences they are required to satisfy. Such machines would be humble, altruistic, and committed to pursuing our objectives, not theirs. This new foundation would allow us to create machines that are provably deferential and provably beneficial. In a 2014 editorial co-authored with Stephen Hawking, Russell wrote, "Success in creating AI would be the biggest event in human history. Unfortunately, it might also be the last." Solving the problem of control over AI is not just possible; it is the key that unlocks a future of unlimited promise.
Linear Algebra Done Right
Sheldon Axler - 1995
The novel approach taken here banishes determinants to the end of the book and focuses on the central goal of linear algebra: understanding the structure of linear operators on vector spaces. The author has taken unusual care to motivate concepts and to simplify proofs. For example, the book presents - without having defined determinants - a clean proof that every linear operator on a finite-dimensional complex vector space (or an odd-dimensional real vector space) has an eigenvalue. A variety of interesting exercises in each chapter helps students understand and manipulate the objects of linear algebra. This second edition includes a new section on orthogonal projections and minimization problems. The sections on self-adjoint operators, normal operators, and the spectral theorem have been rewritten. New examples and new exercises have been added, several proofs have been simplified, and hundreds of minor improvements have been made throughout the text.
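The eigenvalue result mentioned in this blurb really can be reached without determinants; the sketch below gives the standard argument for the complex case as an illustration (it follows the usual textbook route and is not quoted from Axler's book).

```latex
% Determinant-free sketch: every operator on a nonzero finite-dimensional
% complex vector space has an eigenvalue.
Let $T$ be a linear operator on a complex vector space $V$ with
$\dim V = n \ge 1$, and fix $v \ne 0$. The $n+1$ vectors
$v, Tv, T^2v, \dots, T^nv$ are linearly dependent, so for some
$m$ with $1 \le m \le n$ and scalars $a_0, \dots, a_m$ with $a_m \ne 0$,
\[
  (a_0 I + a_1 T + \cdots + a_m T^m)\,v = 0
\]
(the top degree $m$ is at least $1$ because $v \ne 0$).
By the fundamental theorem of algebra,
$a_0 + a_1 z + \cdots + a_m z^m = a_m(z-\lambda_1)\cdots(z-\lambda_m)$
for some $\lambda_1, \dots, \lambda_m \in \mathbb{C}$, hence
\[
  a_m (T-\lambda_1 I)\cdots(T-\lambda_m I)\,v = 0 .
\]
Since $v \ne 0$, some factor $T - \lambda_j I$ fails to be injective,
and that $\lambda_j$ is an eigenvalue of $T$.
```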
Evolutionary Psychology: The New Science of the Mind
David M. Buss - 1998
Since the publication of the award-winning first edition of Evolutionary Psychology, there has been an explosion of research within the field. In this book, David M. Buss examines human behavior from an evolutionary perspective, providing students with the conceptual tools needed to study evolutionary psychology and apply them to empirical research on the human mind. This edition contains expanded coverage of cultural evolution, with a new section on culture–gene co-evolution, additional studies discussing interbreeding between modern humans and Neanderthals, expanded discussions of evolutionary hypotheses that have been empirically disconfirmed, and much more!
Course of Theoretical Physics: Vol. 1, Mechanics
L.D. Landau - 1969
The exposition is simple and leads to the most complete direct means of solving problems in mechanics. The final sections on adiabatic invariants have been revised and augmented. In addition, a short biography of L.D. Landau has been inserted.
Computers and Intractability: A Guide to the Theory of NP-Completeness
Michael R. Garey - 1979
Co-authored by Garey and David S. Johnson, it was the first book devoted exclusively to the theory of NP-completeness and computational intractability. The book features an appendix providing a thorough compendium of NP-complete problems (which was updated in later printings of the book). The book is now outdated in some respects, as it does not cover more recent developments such as the PCP theorem. It is nevertheless still in print and is regarded as a classic: in a 2006 study, the CiteSeer search engine listed the book as the most cited reference in computer science literature.
Calculus
Michael Spivak - 1967
His aim is to present calculus as the first real encounter with mathematics: it is the place to learn how logical reasoning combined with fundamental concepts can be developed into a rigorous mathematical theory rather than a bunch of tools and techniques learned by rote. Since analysis is a subject students traditionally find difficult to grasp, Spivak provides leisurely explanations, a profusion of examples, a wide range of exercises, and plenty of illustrations in an easy-going approach that illuminates difficult concepts and rewards effort. Calculus will continue to be regarded as a modern classic, ideal for honours students and mathematics majors who seek an alternative to doorstop textbooks on calculus and the more formidable introductions to real analysis.
Principles of Statistics
M.G. Bulmer - 1979
There are many textbooks which describe current methods of statistical analysis while neglecting the underlying theory, and equally many advanced textbooks which delve into the far reaches of statistical theory while bypassing practical applications. But between these two approaches is an unfilled gap, in which theory and practice merge at an intermediate level. Professor M. G. Bulmer's Principles of Statistics, originally published in 1965, was created to fill that need. The new, corrected Dover edition of Principles of Statistics makes this invaluable mid-level text available once again for the classroom or for self-study. Principles of Statistics was created primarily for the student of natural sciences, the social scientist, the undergraduate mathematics student, or anyone familiar with the basics of mathematical language. It assumes no previous knowledge of statistics or probability; nor is extensive mathematical knowledge necessary beyond a familiarity with the fundamentals of differential and integral calculus. (The calculus is used primarily for ease of notation; skill in the techniques of integration is not necessary in order to understand the text.) Professor Bulmer devotes the first chapters to a concise, admirably clear description of basic terminology and fundamental statistical theory: abstract concepts of probability and their applications in dice games, Mendelian heredity, etc.; definitions and examples of discrete and continuous random variables; multivariate distributions and the descriptive tools used to delineate them; expected values; etc. The book then moves quickly to more advanced levels, as Professor Bulmer describes important distributions (binomial, Poisson, exponential, normal, etc.), tests of significance, statistical inference, point estimation, regression, and correlation. Dozens of exercises and problems appear at the end of various chapters, with answers provided at the back of the book. Also included are a number of statistical tables and selected references.
An Introduction to Statistical Learning: With Applications in R
Gareth James - 2013
This book presents some of the most important modeling and prediction techniques, along with relevant applications. Topics include linear regression, classification, resampling methods, shrinkage approaches, tree-based methods, support vector machines, clustering, and more. Color graphics and real-world examples are used to illustrate the methods presented. Since the goal of this textbook is to facilitate the use of these statistical learning techniques by practitioners in science, industry, and other fields, each chapter contains a tutorial on implementing the analyses and methods presented in R, an extremely popular open source statistical software platform. Two of the authors co-wrote The Elements of Statistical Learning (Hastie, Tibshirani and Friedman, 2nd edition 2009), a popular reference book for statistics and machine learning researchers. An Introduction to Statistical Learning covers many of the same topics, but at a level accessible to a much broader audience. This book is targeted at statisticians and non-statisticians alike who wish to use cutting-edge statistical learning techniques to analyze their data. The text assumes only a previous course in linear regression and no knowledge of matrix algebra.
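For a flavor of the material, here is a minimal sketch of the first technique on that topic list, written in Python rather than the book's R labs: an ordinary least-squares fit on synthetic data with a simple held-out check standing in for the resampling ideas the book develops. The data, split, and numbers are invented for illustration.

```python
# Least-squares linear regression on synthetic data, evaluated on a held-out split.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y = 2*x1 - 1*x2 + 3 + Gaussian noise.
n = 200
X = rng.normal(size=(n, 2))
y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + 3.0 + rng.normal(scale=0.5, size=n)

# Simple train/test split.
X_train, X_test = X[:150], X[150:]
y_train, y_test = y[:150], y[150:]

# Fit by least squares, with a column of ones for the intercept.
A_train = np.column_stack([np.ones(len(X_train)), X_train])
coef, *_ = np.linalg.lstsq(A_train, y_train, rcond=None)

# Held-out mean squared error.
A_test = np.column_stack([np.ones(len(X_test)), X_test])
test_mse = np.mean((A_test @ coef - y_test) ** 2)
print("intercept and slopes:", np.round(coef, 2))   # close to [3.0, 2.0, -1.0]
print("held-out MSE:", round(float(test_mse), 3))   # close to the noise variance, 0.25
```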
Topology
James R. Munkres - 1975
Includes many examples and figures. GENERAL TOPOLOGY. Set Theory and Logic. Topological Spaces and Continuous Functions. Connectedness and Compactness. Countability and Separation Axioms. The Tychonoff Theorem. Metrization Theorems and Paracompactness. Complete Metric Spaces and Function Spaces. Baire Spaces and Dimension Theory. ALGEBRAIC TOPOLOGY. The Fundamental Group. Separation Theorems. The Seifert-van Kampen Theorem. Classification of Surfaces. Classification of Covering Spaces. Applications to Group Theory. For anyone needing a basic, thorough introduction to general and algebraic topology and its applications.