Optimal Algorithms for Unimodal Regression

I don’t usually post abstracts of papers other than my own here, but this one is particularly significant to me, even though I don’t know the author.

This paper gives optimal algorithms for determining real-valued univariate unimodal regressions, that is, for determining the optimal regression which is increasing and then decreasing. Such regressions arise in a wide variety of applications. They are a form of shape-constrained nonparametric regression, closely related to isotonic regression. For the L2 metric our algorithm requires only O(n) time for regression on n points, while for the L1 metric it requires O(n log n) time. Previous algorithms only considered the L2 metric and required O(n^2) time. All previous algorithms used multiple calls to isotonic regression, and our major contribution is to organize these into a prefix isotonic regression, whereby one computes the regression on all initial segments. The prefix approach utilizes the solution for one initial segment to aid in the solution of the next, which considerably reduces the total time required. Our prefix isotonic regression algorithm for the L1 metric also supplies the first O(n log n) algorithm for L1 isotonic regression.
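The prefix-isotonic idea for the L2 case can be sketched in a few lines of Python. This is a minimal illustration under my own assumptions, not the paper's implementation: the helper names (`prefix_isotonic_errors`, `unimodal_l2_error`) are hypothetical. A pool-adjacent-violators pass runs left to right, recording after each point the squared error of the best increasing fit to that prefix; a second pass over the reversed data gives the error of the best decreasing fit to each suffix, and the best unimodal fit is the best split between the two.

```python
def prefix_isotonic_errors(y):
    """For each prefix y[:i+1], return the squared error of the best
    nondecreasing (isotonic) fit, via pool-adjacent-violators.
    Each point is pushed and popped at most once, so the total work
    is O(n) amortized (a sketch; hypothetical helper, not the paper's code)."""
    blocks = []   # pooled level sets, stored as (sum, count, sum_of_squares)
    err = 0.0     # total squared error of the current isotonic fit
    out = []
    for v in y:
        s, c, ss = float(v), 1, float(v) * v
        # Merge while the previous block's mean is >= the new block's mean.
        while blocks and blocks[-1][0] / blocks[-1][1] >= s / c:
            ps, pc, pss = blocks.pop()
            err -= pss - ps * ps / pc   # remove the popped block's error
            s += ps; c += pc; ss += pss
        blocks.append((s, c, ss))
        err += ss - s * s / c           # merged block's error: sum_sq - sum^2/count
        out.append(err)
    return out

def unimodal_l2_error(y):
    """Squared error of the best increasing-then-decreasing fit:
    try every split, pairing a prefix increasing fit with a suffix
    decreasing fit (the reversed prefix run)."""
    n = len(y)
    inc = prefix_isotonic_errors(y)
    dec = prefix_isotonic_errors(y[::-1])[::-1]   # dec[i]: best decreasing fit on y[i:]
    return min(inc[k] + (dec[k + 1] if k + 1 < n else 0.0) for k in range(n))
```

For example, `[1, 3, 2]` is already unimodal, so its error is zero, while a sequence like `[3, 1, 2]` must pay for at least one violated constraint. Note that this naive split scan recomputes nothing, because both error arrays come from a single prefix pass each, which is exactly the savings the abstract attributes to the prefix organization.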

Source.

Published by

Daniel Lemire

A computer science professor at the Université du Québec (TELUQ).
