Regression fitting is in demand across diverse applications, and it involves fairly elementary ideas from probability and statistics. Two groups of users come to mind: medical professionals who must fit regression lines to make sense of and process medical data, and users in business, sociology, biology, finance, and so on.
I just found this newer, up-to-date book. It is more to my liking than the others I have come across so far, and it includes ready-to-use software.
Idea: You want to fit lines, or systems of lines, to data in the form (X, y), where X is a matrix: really a sequence of measurements of some finite selection of things you want to measure. Think of X as independent vector measurements, and y as a dependent vector variable. (Special case: if X and y are both vectors, you are looking at a set of plots in your planar coordinate system.)
You then want to test whether some other observed sequence of measured numbers y depends on X in a good way; i.e., whether these dependent variables fit hypothetical linear dependencies prescribed by parameters b, a set of unknown numbers to be adjusted in order to get a best fit, using minimal least-squares approximation. But to complicate life, there is a stochastic noise element, represented by a vector e of random variables (prescribed in turn by some probability distributions, and usually assumed independent).
So if you organize your vectors y, b, and e in column form, then you are looking at a relatively easy problem in linear algebra:
y = X b + e,
where y, b, and e are vectors of the same size, and where X is a matrix.
Oversimplification: If e = 0, there is an easy formula for a unique solution b, obtained by demanding a minimal least-squares fit. (Lagrange's method will do this.) And it is in all the books. It involves a certain known and simple matrix function f applied to X, namely f(X) = (X^T X)^{-1} X^T, so b = f(X) y. And you get it ready to use from all the software you can find on the web.
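For the special planar case, where X is built from a single column of x-values plus an intercept, the normal equations (X^T X) b = X^T y reduce to a 2x2 system that can be solved by hand. A minimal sketch in Python, with hypothetical data (the book's own examples use S-Plus):

```python
# Fit y = b0 + b1*x by solving the normal equations (X^T X) b = X^T y,
# where X has columns [1, x]. The 2x2 system is inverted by hand.

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [2.0 + 0.5 * x for x in xs]      # e = 0: y lies exactly on a line

n = len(xs)
sx = sum(xs)                           # entries of X^T X and X^T y
sxx = sum(x * x for x in xs)
sy = sum(ys)
sxy = sum(x * y for x, y in zip(xs, ys))

# Normal equations: [[n, sx], [sx, sxx]] @ [b0, b1] = [sy, sxy]
det = n * sxx - sx * sx
b0 = (sxx * sy - sx * sxy) / det
b1 = (n * sxy - sx * sy) / det

print(b0, b1)   # -> 2.0 0.5, recovered exactly since e = 0
```

Because the noise term is zero, the least-squares solution reproduces the true parameters exactly; with noise it would instead give the best-fitting line.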
If, on the other hand, there is noise, i.e., if e is not 0, then you must use a little probability; but there is a lot of software where you can just input your numbers X and y and your hypothesis about e. You still look for the best fit, i.e., the best numbers (parameters) b, but it is a bit more complicated than just b = f(X) y.
Naturally, other hypothetical dependencies are relevant, e.g., non-linear ones. They are still prescribed by parameters, but in more complicated ways. And the approach is still to demand a minimal least-squares fit to the data.
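As a sketch of such a dependency (hypothetical data, in Python): take the model y ≈ b·exp(x), which is non-linear in x yet still linear in the single parameter b, so minimizing the sum of squared residuals still has a closed form, b = Σ g·y / Σ g², with g = exp(x).

```python
import math

# Model y ~ b * g(x) with known g(x) = exp(x): non-linear in x,
# linear in the parameter b. Least squares gives
# b = sum(g*y) / sum(g*g).

xs = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [3.0 * math.exp(x) for x in xs]   # generated with b = 3, no noise

g = [math.exp(x) for x in xs]
b = sum(gi * yi for gi, yi in zip(g, ys)) / sum(gi * gi for gi in g)

print(b)   # -> 3.0, recovered since the data are noiseless
```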
The book goes beyond the classical parametric approach. Hence the word "nonparametric" in the title!
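To give the flavor of the nonparametric side (a minimal sketch in Python with hypothetical data and bandwidth; the book itself works in S-Plus): a kernel estimator such as Nadaraya-Watson replaces the finite parameter vector b with a locally weighted average of the observed y's.

```python
import math

# Nadaraya-Watson kernel regression: the fitted value at x0 is a
# weighted average of the ys, with Gaussian weights that decay with
# distance from x0. The bandwidth h controls the amount of smoothing.

def kernel_smooth(xs, ys, x0, h=0.5):
    """Gaussian-kernel weighted average of ys around x0."""
    w = [math.exp(-0.5 * ((x - x0) / h) ** 2) for x in xs]
    return sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [0.1, 0.9, 2.1, 2.9, 4.2]   # roughly y = x, with noise

print(kernel_smooth(xs, ys, 2.0))  # close to 2, averaged with neighbors
```

No line (or other parametric family) is assumed here; the estimate at each point is driven directly by nearby data, which is what "nonparametric" buys you.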
And it is well presented and organized in this book. Review by Palle Jorgensen, October 2006.
Introduction to Nonparametric Regression (Wiley Series in Probability and Statistics)
An easy-to-grasp introduction to nonparametric regression

This book's straightforward, step-by-step approach provides an excellent introduction to the field for novices of nonparametric regression. Introduction to Nonparametric Regression clearly explains the basic concepts underlying nonparametric regression and features:

* Thorough explanations of various techniques, which avoid complex mathematics and excessive abstract theory to help readers intuitively grasp the value of nonparametric regression methods
* Statistical techniques accompanied by clear numerical examples that further assist readers in developing and implementing their own solutions
* Mathematical equations that are accompanied by a clear explanation of how the equation was derived

The first chapter leads with a compelling argument for studying nonparametric regression and sets the stage for more advanced discussions. In addition to covering standard topics, such as kernel and spline methods, the book provides in-depth coverage of the smoothing of histograms, a topic generally not covered in comparable texts.

With a learning-by-doing approach, each topical chapter includes thorough S-Plus examples that allow readers to duplicate the same results described in the chapter. A separate appendix is devoted to the conversion of S-Plus objects to R objects. In addition, each chapter ends with a set of problems that test readers' grasp of key concepts and techniques and also prepares them for more advanced topics.

This book is recommended as a textbook for undergraduate and graduate courses in nonparametric regression. Only a basic knowledge of linear algebra and statistics is required. In addition, this is an excellent resource for researchers and engineers in such fields as pattern recognition, speech understanding, and data mining.
Practitioners who rely on nonparametric regression for analyzing data in the physical, biological, and social sciences, as well as in finance and economics, will find this an unparalleled resource.