Communication Systems


Simon Haykin - 1978
    In addition to being the most up-to-date communications text available, this edition adds Simon Haykin's MATLAB computer experiments.

Data Science For Dummies


Lillian Pierson - 2014
    Data Science For Dummies is the perfect starting point for IT professionals and students interested in making sense of their organization’s massive data sets and applying their findings to real-world business scenarios. From uncovering rich data sources to managing large amounts of data within hardware and software limitations, ensuring consistency in reporting, merging various data sources, and beyond, you’ll develop the know-how you need to effectively interpret data and tell a story that can be understood by anyone in your organization.
    - Provides a background in data science fundamentals before moving on to working with relational databases and unstructured data and preparing your data for analysis
    - Details different data visualization techniques that can be used to showcase and summarize your data
    - Explains both supervised and unsupervised machine learning, including regression, model validation, and clustering techniques
    - Includes coverage of big data processing tools like MapReduce, Hadoop, Dremel, Storm, and Spark (a minimal MapReduce-style sketch follows below)
    It’s a big, big data world out there – let Data Science For Dummies help you harness its power and gain a competitive edge for your organization.
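    For a flavor of the MapReduce idea mentioned above, here is a minimal sketch in plain Python - a toy, single-machine stand-in for what Hadoop does across a cluster. The two-phase structure is the point; all names and data are illustrative.
```python
from collections import defaultdict

def map_phase(documents):
    # Map: emit a (word, 1) pair for every word in every document.
    for doc in documents:
        for word in doc.lower().split():
            yield word, 1

def reduce_phase(pairs):
    # Shuffle/reduce: group the pairs by key and sum the counts.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

docs = ["big data is big", "data science for everyone"]
print(reduce_phase(map_phase(docs)))
# {'big': 2, 'data': 2, 'is': 1, 'science': 1, 'for': 1, 'everyone': 1}
```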

Data Smart: Using Data Science to Transform Information into Insight


John W. Foreman - 2013
    Major retailers are predicting everything from when their customers are pregnant to when they want a new pair of Chuck Taylors. It's a brave new world where seemingly meaningless data can be transformed into valuable insight to drive smart business decisions. But how does one exactly do data science? Do you have to hire one of these priests of the dark arts, the "data scientist," to extract this gold from your data? Nope. Data science is little more than using straightforward steps to process raw data into actionable insight. And in Data Smart, author and data scientist John Foreman will show you how that's done within the familiar environment of a spreadsheet. Why a spreadsheet? It's comfortable! You get to look at the data every step of the way, building confidence as you learn the tricks of the trade. Plus, spreadsheets are a vendor-neutral place to learn data science without the hype. But don't let the Excel sheets fool you. This is a book for those serious about learning the analytic techniques, the math and the magic, behind big data. Each chapter will cover a different technique in a spreadsheet so you can follow along:
    - Mathematical optimization, including non-linear programming and genetic algorithms
    - Clustering via k-means, spherical k-means, and graph modularity (a Python sketch of k-means follows below)
    - Data mining in graphs, such as outlier detection
    - Supervised AI through logistic regression, ensemble models, and bag-of-words models
    - Forecasting, seasonal adjustments, and prediction intervals through Monte Carlo simulation
    - Moving from spreadsheets into the R programming language
    You get your hands dirty as you work alongside John through each technique. But never fear, the topics are readily applicable and the author laces humor throughout. You'll even learn what a dead squirrel has to do with optimization modeling, which you no doubt are dying to know.
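    The book itself works in spreadsheets and, at the end, R; purely as an illustration, here is the same k-means idea as a minimal, dependency-free Python sketch (assignment step, then update step; the points are invented):
```python
import random

def kmeans(points, k, iters=20, seed=0):
    # Naive k-means on 2-D points: assign each point to its nearest
    # centroid, then move each centroid to the mean of its cluster.
    random.seed(seed)
    centroids = random.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for x, y in points:
            d = [(x - cx) ** 2 + (y - cy) ** 2 for cx, cy in centroids]
            clusters[d.index(min(d))].append((x, y))
        centroids = [
            (sum(p[0] for p in c) / len(c), sum(p[1] for p in c) / len(c))
            if c else centroids[i]          # keep old centroid if cluster empties
            for i, c in enumerate(clusters)
        ]
    return centroids, clusters

pts = [(1, 1), (1.5, 2), (0.5, 1.2), (8, 8), (9, 9)]
centroids, clusters = kmeans(pts, k=2)
print(centroids)
```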

Making Numbers Count: The Art and Science of Communicating Numbers


Chip Heath - 2022
    In Making Numbers Count, Chip Heath argues that it's crucial for us all to be able to interpret and communicate numbers and stats more effectively so that data comes alive. By combining years of research into making ideas stick with a deep understanding of how the brain really works, Heath has discerned six critical principles that will give anyone the tools to communicate numbers with more transparency and meaning. These ideas - including simplicity, concreteness and familiarity - reveal what's compelling about a number and show how to transform it into its most understandable form. And if we can do this when we're using numbers, Heath tells us, then the idea of data won't drive people to panic. We're not hungry for numbers - there's an unfathomable amount of information being generated each year - but we are starved for meaning. The ability to communicate and understand numbers has never mattered more.
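    None of the book's principles require code, but a toy Python illustration of their spirit - reframing raw figures as familiar, human-scale ones - might look like this (the numbers are invented):
```python
def one_in_n(rate):
    # Reframe a small probability as the more familiar "about 1 in N".
    return f"about 1 in {round(1 / rate):,}"

def per_capita(total, population):
    # Translate a huge aggregate into a human-scale, per-person figure.
    return total / population

print(one_in_n(0.000142))                                # about 1 in 7,042
print(f"${per_capita(1.5e12, 330e6):,.0f} per person")   # $4,545 per person
```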

Models.Behaving.Badly.: Why Confusing Illusion with Reality Can Lead to Disaster, on Wall Street and in Life


Emanuel Derman - 2011
    The reliance traders put on such quantitative analysis was catastrophic for the economy, setting off the series of financial crises that began to erupt in 2007 with the mortgage crisis and from which we're still recovering. Here Derman looks at why people--bankers in particular--still put so much faith in these models, and why it's a terrible mistake to do so.Though financial models imitate the style of physics by using the language of mathematics, ultimately they deal with human beings. Their similarity confuses the fundamental difference between the aims and possible achievements of the phsyics world and that of the financial world. When we make a model involving human beings, we are trying to force the ugly stepsister's foot into Cinderella's pretty glass slipper.  It doesn't fit without cutting off some of the essential parts. Physicists and economists have been too enthusiastic to recognize the limits of their equations in the sphere of human behavior--which of course is what economics is all about.  Models.Behaving.Badly. includes a personal account Derman's childhood encounter with failed models--the utopia of the kibbutz, his experience as a physicist on Wall Street, and a look at the models quants generated: the benefits they brought and the problems they caused. Derman takes a close look at what a model is, and then he highlights the differences between the success of modeling in physics and its relative failure in economics.  Describing the collapse of the subprime mortgage CDO market in 2007, Derman urges us to stop relying on these models where possible, and offers suggestions for mending these models where they might still do some good.  This is a fascinating, lyrical, and very human look behind the curtain at the intersection between mathematics and human nature.

Abstract Algebra


I.N. Herstein - 1986
    Providing a concise introduction to abstract algebra, this work unfolds some of the fundamental systems with the aim of reaching applicable, significant results.

Digital Communications


John G. Proakis - 1983
    Includes expert coverage of new topics: Turbocodes, Turboequalization, Antenna Arrays, Digital Cellular Systems, and Iterative Detection. Convenient, sequential organization begins with a look at the history and classification of channel models and builds from there.
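    As a taste of the kind of simulation work that accompanies a text like this, here is a minimal Python sketch (not from the book) estimating the bit error rate of BPSK over an AWGN channel:
```python
import numpy as np

def bpsk_awgn_ber(ebn0_db, n_bits=200_000, seed=0):
    # Simulate BPSK over an AWGN channel and estimate the bit error rate.
    rng = np.random.default_rng(seed)
    bits = rng.integers(0, 2, n_bits)
    symbols = 1 - 2 * bits                      # map bit 0 -> +1, bit 1 -> -1
    ebn0 = 10 ** (ebn0_db / 10)
    noise = rng.normal(0, np.sqrt(1 / (2 * ebn0)), n_bits)  # Eb = 1, var = N0/2
    decisions = (symbols + noise) < 0           # decide bit 1 when sample < 0
    return np.mean(decisions != bits)

for snr_db in (0, 4, 8):
    print(snr_db, "dB ->", bpsk_awgn_ber(snr_db))
```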

The Elements of Statistical Learning: Data Mining, Inference, and Prediction


Trevor Hastie - 2001
    The explosion in computation and information technology has brought with it vast amounts of data in a variety of fields such as medicine, biology, finance, and marketing. The challenge of understanding these data has led to the development of new tools in the field of statistics, and spawned new areas such as data mining, machine learning, and bioinformatics. Many of these tools have common underpinnings but are often expressed with different terminology. This book describes the important ideas in these areas in a common conceptual framework. While the approach is statistical, the emphasis is on concepts rather than mathematics. Many examples are given, with a liberal use of color graphics. It should be a valuable resource for statisticians and anyone interested in data mining in science or industry. The book's coverage is broad, from supervised learning (prediction) to unsupervised learning. The many topics include neural networks, support vector machines, classification trees and boosting—the first comprehensive treatment of this topic in any book. Trevor Hastie, Robert Tibshirani, and Jerome Friedman are professors of statistics at Stanford University. They are prominent researchers in this area: Hastie and Tibshirani developed generalized additive models and wrote a popular book of that title. Hastie wrote much of the statistical modeling software in S-PLUS and invented principal curves and surfaces. Tibshirani proposed the Lasso and is co-author of the very successful An Introduction to the Bootstrap. Friedman is the co-inventor of many data-mining tools including CART, MARS, and projection pursuit.
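    Since the Lasso comes up above, here is a bare-bones coordinate-descent lasso in Python - a sketch of the soft-thresholding idea only, not a substitute for the book's treatment (the data and names are illustrative):
```python
import numpy as np

def lasso_cd(X, y, lam, iters=100):
    # Minimal coordinate descent for the lasso:
    # minimize 0.5 * ||y - X b||^2 + lam * ||b||_1 via soft-thresholding.
    n, p = X.shape
    b = np.zeros(p)
    for _ in range(iters):
        for j in range(p):
            r = y - X @ b + X[:, j] * b[j]          # partial residual
            rho = X[:, j] @ r
            z = X[:, j] @ X[:, j]
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0) / z
    return b

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = 3 * X[:, 0] - 2 * X[:, 2] + rng.normal(size=100)
print(lasso_cd(X, y, lam=10).round(2))  # sparse: near zero except coefficients 0 and 2
```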

The Model Thinker: What You Need to Know to Make Data Work for You


Scott E. Page - 2018
    But as anyone who has ever opened up a spreadsheet packed with seemingly infinite lines of data knows, numbers aren't enough: we need to know how to make those numbers talk. In The Model Thinker, social scientist Scott E. Page shows us the mathematical, statistical, and computational models—from linear regression to random walks and far beyond—that can turn anyone into a genius. At the core of the book is Page's "many-model paradigm," which shows the reader how to apply multiple models to organize the data, leading to wiser choices, more accurate predictions, and more robust designs. The Model Thinker provides a toolkit for business people, students, scientists, pollsters, and bloggers to make them better, clearer thinkers, able to leverage data and information to their advantage.
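    As a small taste of one of the models the book covers, here is a minimal random-walk simulation in Python (purely illustrative):
```python
import random

def random_walk(steps, seed=0):
    # Simple symmetric random walk on the integers: step +1 or -1 each time.
    random.seed(seed)
    position, path = 0, [0]
    for _ in range(steps):
        position += random.choice((-1, 1))
        path.append(position)
    return path

walk = random_walk(1000)
print("final:", walk[-1], "max:", max(walk), "min:", min(walk))
```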

Applied Linear Regression Models - 4th Edition with Student CD (McGraw Hill/Irwin Series: Operations and Decision Sciences)


Michael H. Kutner - 2003
    Cases, datasets, and examples allow for a more real-world perspective and explore relevant uses of regression techniques in business today.
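    A minimal ordinary-least-squares fit in Python, sketching the core technique the book teaches (the data here are invented):
```python
import numpy as np

# Fit y = b0 + b1 * x by ordinary least squares.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.3, 6.2, 7.9, 10.1])
X = np.column_stack([np.ones_like(x), x])      # design matrix with intercept
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

residuals = y - X @ beta
ss_res = residuals @ residuals
ss_tot = (y - y.mean()) @ (y - y.mean())
print(f"intercept={beta[0]:.3f}, slope={beta[1]:.3f}, R^2={1 - ss_res / ss_tot:.3f}")
```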

Social and Economic Networks


Matthew O. Jackson - 2008
    The many aspects of our lives that are governed by social networks make it critical to understand how they impact behavior, which network structures are likely to emerge in a society, and why we organize ourselves as we do. In Social and Economic Networks, Matthew Jackson offers a comprehensive introduction to social and economic networks, drawing on the latest findings in economics, sociology, computer science, physics, and mathematics. He provides empirical background on networks and the regularities that they exhibit, and discusses random graph-based models and strategic models of network formation. He helps readers to understand behavior in networked societies, with a detailed analysis of learning and diffusion in networks, decision making by individuals who are influenced by their social neighbors, game theory and markets on networks, and a host of related subjects. Jackson also describes the varied statistical and modeling techniques used to analyze social networks. Each chapter includes exercises to aid students in their analysis of how networks function. This book is an indispensable resource for students and researchers in economics, mathematics, physics, sociology, and business.
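    As an illustration of the random-graph models the book discusses, here is a minimal Erdős–Rényi G(n, p) generator in Python (a sketch; the parameters are arbitrary):
```python
import random

def erdos_renyi(n, p, seed=0):
    # G(n, p) random graph: include each of the n*(n-1)/2 possible edges
    # independently with probability p -- the baseline random-graph model.
    random.seed(seed)
    return [(i, j) for i in range(n) for j in range(i + 1, n)
            if random.random() < p]

n = 100
edges = erdos_renyi(n, p=0.05)
degree = [0] * n
for i, j in edges:
    degree[i] += 1
    degree[j] += 1
print(len(edges), "edges; mean degree", sum(degree) / n)  # expect about p*(n-1) = 4.95
```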

The Visual Display of Quantitative Information


Edward R. Tufte - 1983
    Theory and practice in the design of data graphics, 250 illustrations of the best (and a few of the worst) statistical graphics, with detailed analysis of how to display data for precise, effective, quick analysis. Design of the high-resolution displays, small multiples. Editing and improving graphics. The data-ink ratio. Time-series, relational graphics, data maps, multivariate designs. Detection of graphical deception: design variation vs. data variation. Sources of deception. Aesthetics and data graphical displays. This is the second edition of The Visual Display of Quantitative Information. Recently published, this new edition provides excellent color reproductions of the many graphics of William Playfair, adds color to other images, and includes all the changes and corrections accumulated during 17 printings of the first edition.
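    The data-ink ratio translates directly into plotting code; a minimal matplotlib sketch (with invented data) that erases non-data ink might look like this:
```python
import matplotlib.pyplot as plt

# In the spirit of the data-ink ratio: remove ink that carries no data
# (redundant frame lines), leaving the line and labels to do the work.
years = [2019, 2020, 2021, 2022, 2023]
values = [3.1, 2.4, 4.0, 4.6, 5.2]

fig, ax = plt.subplots()
ax.plot(years, values, marker="o", color="black")
for side in ("top", "right"):
    ax.spines[side].set_visible(False)   # drop the redundant box frame
ax.set_xticks(years)
ax.set_ylabel("value")
plt.show()
```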

Volatility Trading (Wiley Trading)


Euan Sinclair - 2008
    With an accessible, straightforward approach, Sinclair guides traders through the basics of option pricing, volatility measurement, hedging, money management, and trade evaluation. In addition, he explains the often-overlooked psychological aspects of trading, revealing both how behavioral psychology can create market conditions traders can take advantage of - and how it can lead them astray. Psychological biases, he asserts, are probably the drivers behind most sources of edge available to a volatility trader. Your goal, Sinclair explains, must be clearly defined and easily expressed - if you cannot explain it in one sentence, you probably aren't completely clear about what it is. The same applies to your statistical edge: if you do not know exactly what your edge is, you shouldn't trade. He shows how, in addition to the numerical evaluation of a potential trade, you should be able to identify and evaluate the reason why implied volatility is priced where it is - that is, why an edge exists. This means it is also necessary to be on top of recent news stories, sector trends, and behavioral psychology. Finally, Sinclair underscores why trades need to be sized correctly, which means that each trade is evaluated according to its projected return and risk in the overall context of your goals. As the author concludes, while we also need to pay attention to seemingly mundane things like having good execution software, a comfortable office, and getting enough sleep, it is knowledge that is the ultimate source of edge. So, all else being equal, the trader with the greater knowledge will be the more successful. This book, and its companion CD-ROM, will provide that knowledge. The CD-ROM includes spreadsheets designed to help you forecast volatility and evaluate trades, together with simulation engines.
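    As a flavor of the volatility-measurement material, here is a minimal close-to-close realized-volatility estimator in Python - a common textbook estimator, not necessarily Sinclair's exact formulation (the prices are invented):
```python
import numpy as np

def realized_vol(closes, periods_per_year=252):
    # Annualized close-to-close volatility: the sample standard deviation
    # of log returns, scaled by the square root of periods per year.
    returns = np.diff(np.log(closes))
    return returns.std(ddof=1) * np.sqrt(periods_per_year)

closes = np.array([100.0, 101.2, 99.8, 102.5, 101.9, 103.4])
print(f"{realized_vol(closes):.1%}")
```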

Introduction to Statistical Quality Control


Douglas C. Montgomery - 1985
    This book provides comprehensive coverage of the subject from basic principles to state-of-the-art concepts and applications. The objective is to give the reader a sound understanding of the principles and the basis for applying them in a variety of both product and nonproduct situations. While statistical techniques are emphasized throughout, the book has a strong engineering and management orientation. Guidelines are given throughout the book for selecting the proper type of statistical technique to use in a wide variety of product and nonproduct situations. By presenting theory and supporting it with clear and relevant examples, Montgomery helps the reader to understand the big picture of important concepts. The book has been updated to reflect contemporary practice and to provide more information on the management aspects of quality improvement.
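    As an illustration of the control-chart machinery the book covers, here is a simplified X-bar chart computation in Python. Note this is a toy: textbook charts usually estimate sigma from the average range or the subgroup standard deviations rather than, as here, from the spread of the subgroup means.
```python
import numpy as np

def xbar_limits(samples):
    # Shewhart X-bar chart: center line at the grand mean, with control
    # limits at +/- 3 standard errors of the subgroup means.
    means = samples.mean(axis=1)
    center = means.mean()
    se = means.std(ddof=1)
    return center - 3 * se, center, center + 3 * se

rng = np.random.default_rng(1)
samples = rng.normal(10, 0.5, size=(25, 5))   # 25 subgroups of 5 measurements
lcl, cl, ucl = xbar_limits(samples)
print(f"LCL={lcl:.2f}  CL={cl:.2f}  UCL={ucl:.2f}")
```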

The Art of Doing Science and Engineering: Learning to Learn


Richard Hamming - 1996
    By presenting actual experiences and analyzing them as they are described, the author conveys the developmental thought processes employed and shows that a style of thinking that leads to successful results can be learned. Along with spectacular successes, the author also conveys how failures contributed to shaping the thought processes. Provides the reader with a style of thinking that will enhance their ability to function as a problem-solver of complex technical issues. Consists of a collection of stories about the author's participation in significant discoveries, relating how those discoveries came about and, most importantly, analyzing the thought processes and reasoning that took place as the author and his associates progressed through engineering problems.