
Foreword

This book introduces a toolbox of statistical techniques based on partial moments that are both old and
new. Partial moment analysis is over a century old, but most applications of partial moments have
not progressed beyond a substitute for simple variance analysis. Lower partial moments have
been used in finance, in portfolio investment theory, for over 60 years. However, just as the
normal distribution and the variance lead the statistician into linear correlation and regression
analysis, partial moments lead us toward nonlinear correlation and regression analysis. Using
partial moments as a variance measure is only the tip of the iceberg; the purpose of this book is
to explore the entire iceberg.
This partial moment toolbox is the "new" presented in this book. However, the new should always
have some advantage over the old. The advantage of using partial moments is that they are
nonparametric: they require neither knowledge of the underlying probability function nor a
goodness-of-fit analysis. Partial moments provide us with cumulative distribution
functions, probability density functions, linear correlation and regression analysis, nonlinear
correlation and regression analysis, ANOVA, and ARMA/ARCH models. This new toolbox is
completely nonparametric and provides a full set of hypothesis testing tools without
knowledge of the underlying probability distribution.
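
To make the building blocks concrete, here is a minimal Python sketch of lower and upper partial moments; the function names `lpm` and `upm`, the boundary convention at the target, and the degree-0 shortcut are assumptions of this illustration, not the book's implementation. The degree-0 lower partial moment is simply the empirical CDF evaluated at the target, and the degree-2 moments about the mean sum to the sample variance.

```python
import numpy as np

def lpm(degree, target, x):
    """Lower partial moment: average deviation of the sample below `target`, raised to `degree`."""
    x = np.asarray(x, dtype=float)
    if degree == 0:
        # Degree 0 reduces to counting: the fraction of observations at or below the target,
        # i.e. an empirical CDF value.
        return np.mean(x <= target)
    return np.mean(np.maximum(target - x, 0.0) ** degree)

def upm(degree, target, x):
    """Upper partial moment: average deviation of the sample above `target`, raised to `degree`."""
    x = np.asarray(x, dtype=float)
    if degree == 0:
        return np.mean(x > target)
    return np.mean(np.maximum(x - target, 0.0) ** degree)

rng = np.random.default_rng(0)
sample = rng.normal(size=10_000)
print(lpm(0, 0.0, sample))                          # ~0.5: empirical CDF at the target
print(lpm(0, 0.0, sample) + upm(0, 0.0, sample))    # exactly 1.0 with this boundary convention
print(lpm(2, sample.mean(), sample) + upm(2, sample.mean(), sample))  # ~ the sample variance
```
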
In this new advanced approach to nonparametric statistics, we merge the ideas of discrete and
continuous processes and present them in a unified framework predicated on partial moments.
Through the asymptotic property of partial moments, we show that the two schools of mathematical
thought do not converge as commonly envisioned: increased observations approximate the
continuous area of a function rather than stabilizing on a discrete counting metric. However, the
analysis remains strictly binary: discrete or continuous. The known properties generated from
this continuous-versus-discrete analysis afford an assumption-free analysis of variance (ANOVA)
across multiple distributions.
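
A small numerical sketch of that asymptotic behaviour (an illustration under an assumed uniform sample, not a result from the book): for X uniform on [0, 1], the degree-1 lower partial moment at target t approaches the continuous area under the deviation function, the integral of (t - u) from 0 to t, which equals t^2/2, as observations increase, while the degree-0 moment stabilizes on the discrete counting value t.

```python
import numpy as np

rng = np.random.default_rng(42)
target = 0.6
exact_area = target ** 2 / 2   # integral of (t - u) over [0, t] for a uniform density on [0, 1]

for n in (10, 100, 10_000, 1_000_000):
    x = rng.uniform(0.0, 1.0, size=n)
    lpm1 = np.mean(np.maximum(target - x, 0.0))   # degree-1 lower partial moment
    lpm0 = np.mean(x <= target)                   # degree-0: a discrete counting metric
    print(f"n={n:>9}  LPM1={lpm1:.4f} (area {exact_area:.4f})  LPM0={lpm0:.4f} (CDF {target})")
```
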
In our correlation and regression analysis, linear segments are aggregated to describe a nonlinear
system. The computational issue is the weighting of the segments. However, since partial
moments weight all observations, this consideration is alleviated, ultimately yielding a more
robust result with no butterfly effect, owing to our lack of parameters. By building off basic
relationships between variables, we are able to perform multivariate analysis with ease and
transform the complex into the merely tedious. One major advantage of our work is that the partial
moment methodology fully replicates linear conditions and known functions. This trust in the
methodology is important for the transition to chaotic unknowns and forecasting with autoregressive
models.
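
As a rough, hypothetical illustration of aggregating linear segments (a sketch only, not the regression estimator developed in the book), the snippet below partitions the x-axis into quantile segments, fits a local linear piece in each, and predicts from the segment containing the query point, so every observation contributes to exactly one local fit.

```python
import numpy as np

def segmented_fit(x, y, n_segments=8):
    """Fit a local linear regression in each quantile segment of x (illustrative only)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    edges = np.quantile(x, np.linspace(0, 1, n_segments + 1))
    pieces = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (x >= lo) & (x <= hi)
        if mask.sum() < 2:
            continue                                    # skip segments too small to fit a line
        slope, intercept = np.polyfit(x[mask], y[mask], 1)
        pieces.append((lo, hi, slope, intercept))
    return pieces

def predict(pieces, x_new):
    """Predict from the linear piece whose segment contains x_new."""
    for lo, hi, slope, intercept in pieces:
        if lo <= x_new <= hi:
            return slope * x_new + intercept
    # Fall back to the nearest segment for points outside the observed range.
    lo, hi, slope, intercept = min(pieces, key=lambda p: min(abs(x_new - p[0]), abs(x_new - p[1])))
    return slope * x_new + intercept

# A nonlinear relationship described by aggregated linear segments.
rng = np.random.default_rng(1)
x = rng.uniform(-3, 3, 500)
y = np.sin(x) + rng.normal(scale=0.1, size=x.size)
pieces = segmented_fit(x, y)
print(predict(pieces, 1.5), np.sin(1.5))   # the piecewise fit should track sin(x) closely
```
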
Normalization of data has the unintended consequence of transforming continuous variables into
discrete variables while eliminating prior relationships. We present a normalization method that
enables a truly apples-to-apples comparison and retains the finite moment properties of the
underlying distribution. In the ensuing analysis of the variables in question, we illustrate the
distinction between correlation and causation. Using this distinction, we offer a definition of
causation that integrates historical correlation with conditional probabilities.

Finally, linearity should be a pleasant surprise to encounter in data, not a prerequisite. By
eliminating all preconceptions and assumptions, we offer a powerful framework for statistical
analysis. The simple nonparametric architecture based on partial moments yields important
information for easily conducting multivariate analysis, generating descriptive and inferential
statistics for a nonlinear world.
