Section: New Results
Iterative isotone regression
Participant: Arnaud Guyader.
This is a collaboration with Nicolas Hengartner (Los Alamos National Laboratory), Nicolas Jégou (Université de Rennes 2) and Eric Matzner–Løber (Université de Rennes 2), and with Alexander B. Németh (Babeş-Bolyai University) and Sándor Z. Németh (University of Birmingham).
We explore some theoretical aspects of a recent nonparametric
method for estimating a univariate regression function of bounded variation.
The method exploits the Jordan decomposition, which states that a function of
bounded variation can be decomposed as the sum of a non-decreasing function
and a non-increasing function. This suggests combining the backfitting
algorithm for estimating additive functions with isotonic regression for
estimating monotone functions. The resulting iterative algorithm is
called IIR (iterative isotonic regression).
The main result in this work [22]
states that the estimator is consistent if the number of
iterations grows appropriately with the sample size.
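As a purely illustrative sketch, and not the authors' reference implementation, the backfitting loop between an isotonic and an antitonic fit can be written with scikit-learn's IsotonicRegression; the function name and the choice of a fixed number of passes are ours, the number of passes playing the role of the iteration count appearing in the consistency result.

# Minimal sketch of the IIR backfitting iteration (illustrative only).
import numpy as np
from sklearn.isotonic import IsotonicRegression

def iterative_isotonic_regression(x, y, n_iter=10):
    # Alternate between refitting a non-decreasing component on the residuals
    # of the non-increasing one, and vice versa; the estimate is their sum.
    iso_up = IsotonicRegression(increasing=True, out_of_bounds="clip")
    iso_down = IsotonicRegression(increasing=False, out_of_bounds="clip")
    up = np.zeros_like(y, dtype=float)    # current non-decreasing component
    down = np.zeros_like(y, dtype=float)  # current non-increasing component
    for _ in range(n_iter):
        up = iso_up.fit_transform(x, y - down)
        down = iso_down.fit_transform(x, y - up)
    return up + down

# Toy usage on a non-monotone regression function of bounded variation.
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 200)
y = np.sin(2 * np.pi * x) + 0.3 * rng.standard_normal(x.size)
fit = iterative_isotonic_regression(x, y, n_iter=5)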
Exploiting the geometrical interpretation linking this iterative method with the von Neumann algorithm, and making a connection with the general property of isotonicity of projections onto convex cones, we derive in [14] another equivalent algorithm and push the analysis further.
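The projection viewpoint can be made concrete: isotonic regression of a vector is its Euclidean projection onto the convex cone of non-decreasing vectors, which the pool-adjacent-violators algorithm computes. Below is a minimal numpy sketch of that projection (the function name is ours, and it is not the equivalent algorithm derived in [14]); the projection onto the cone of non-increasing vectors is obtained by applying it to the reversed vector.

import numpy as np

def project_onto_monotone_cone(y, weights=None):
    # Euclidean projection of y onto {v : v[0] <= v[1] <= ... <= v[n-1]},
    # i.e. isotonic regression, computed by pool-adjacent-violators (PAVA).
    y = np.asarray(y, dtype=float)
    w = np.ones_like(y) if weights is None else np.asarray(weights, dtype=float)
    means, wsum, sizes = [], [], []  # one entry per pooled block
    for yi, wi in zip(y, w):
        means.append(yi); wsum.append(wi); sizes.append(1)
        # Pool adjacent blocks while the monotonicity constraint is violated.
        while len(means) > 1 and means[-2] > means[-1]:
            m2, w2, s2 = means.pop(), wsum.pop(), sizes.pop()
            m1, w1, s1 = means.pop(), wsum.pop(), sizes.pop()
            means.append((w1 * m1 + w2 * m2) / (w1 + w2))
            wsum.append(w1 + w2)
            sizes.append(s1 + s2)
    return np.repeat(means, sizes)  # expand pooled blocks back to length n

# Projection onto the cone of non-increasing vectors, by symmetry:
# project_onto_monotone_cone(y[::-1])[::-1]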