Core Concepts
This paper presents the first in-depth study of H-consistency bounds for regression, establishing non-asymptotic guarantees for the squared loss with respect to various surrogate regression losses, such as the Huber loss, ℓp losses, and the squared ε-insensitive loss. The analysis leverages new generalized theorems for establishing H-consistency bounds.
Key Contributions
The paper makes the following key contributions:
- It presents new generalized theorems (Theorems 1 and 2) that extend previous tools for establishing H-consistency bounds to allow for non-constant functions α. This generalization is crucial for analyzing H-consistency bounds for regression losses such as the Huber loss and the squared ε-insensitive loss.
- It proves a series of novel H-consistency bounds for surrogate loss functions of the squared loss, under the assumptions of a symmetric distribution and a bounded hypothesis set:
- For the Huber loss, it shows that the bound holds under a specific condition on the Huber loss parameter δ and the distribution mass around the mean. It also proves this condition is necessary when the hypothesis set H is realizable.
- For ℓp losses with p ≥ 1, it provides H-consistency bounds, covering the ℓ1 loss as well as ℓp losses with p ∈ (1, 2).
- For the ε-insensitive loss used in SVR, it proves a negative result: this loss function does not admit H-consistency bounds with respect to the squared loss.
- For the squared ε-insensitive loss, it provides a positive H-consistency bound under a condition on the distribution, and shows a negative result when that condition fails.
- Leveraging the H-consistency analysis, it derives principled surrogate losses for adversarial regression and reports favorable experimental results for the resulting novel algorithms.
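To make the surrogate losses discussed above concrete, here is a minimal sketch of the four loss functions on the residual t = h(x) − y. This is illustrative code, not from the paper; conventions vary (e.g., some definitions of the Huber loss omit the 1/2 factor), and the function names are our own.

```python
import numpy as np

def huber(t, delta=1.0):
    """Huber loss: quadratic for |t| <= delta, linear in the tails."""
    a = np.abs(t)
    return np.where(a <= delta, 0.5 * t ** 2, delta * (a - 0.5 * delta))

def lp(t, p=1.5):
    """l_p loss |t|^p for p >= 1 (p = 2 recovers the squared loss)."""
    return np.abs(t) ** p

def eps_insensitive(t, eps=0.1):
    """epsilon-insensitive loss used in SVR: zero inside the eps-tube."""
    return np.maximum(np.abs(t) - eps, 0.0)

def sq_eps_insensitive(t, eps=0.1):
    """Squared epsilon-insensitive loss: smooth outside the eps-tube."""
    return np.maximum(np.abs(t) - eps, 0.0) ** 2
```

The flat region of the ε-insensitive loss is what drives the paper's negative result: any prediction inside the ε-tube is equally good for the surrogate, so surrogate optimality cannot control the squared-loss estimation error.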
Stats
The conditional distribution of the labels and the hypotheses in H are bounded by B > 0.
The distribution is symmetric.
p_min(δ) = inf_{x∈X} P(0 ≤ μ(x) − y ≤ δ | x) is positive for the Huber loss.
p_min(ε) = inf_{x∈X} P(μ(x) − y ≥ ε | x) is positive for the squared ε-insensitive loss.
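These infimum conditions can be checked empirically when samples of y given x are available. The sketch below is a Monte Carlo estimate of the Huber-loss quantity under an assumed Gaussian conditional distribution; the setup and function names are hypothetical, chosen only to illustrate the definition.

```python
import numpy as np

rng = np.random.default_rng(0)

def pmin_huber(samples_by_x, mu_by_x, delta):
    """Monte Carlo estimate of inf_x P(0 <= mu(x) - y <= delta | x),
    taking the minimum of per-x empirical probabilities."""
    probs = []
    for ys, mu in zip(samples_by_x, mu_by_x):
        d = mu - ys
        probs.append(np.mean((d >= 0.0) & (d <= delta)))
    return min(probs)

# Assumed setup: y | x ~ N(mu(x), 1) at three points x. By symmetry,
# the mass of [0, delta] below the mean is Phi(delta) - 1/2, which is
# about 0.3413 for delta = 1, independent of x.
mus = [0.0, 1.0, -2.0]
samples = [rng.normal(m, 1.0, 200_000) for m in mus]
est = pmin_huber(samples, mus, delta=1.0)
```

For a symmetric conditional distribution with mass near the mean, this quantity is bounded away from zero, which is exactly the regime where the paper's positive Huber-loss bound applies.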
Quotes
"We present a detailed study of H-consistency bounds for regression."
"This generalization proves essential for analyzing H-consistency bounds specific to regression."
"We further leverage our analysis of H-consistency for regression and derive principled surrogate losses for adversarial regression (Section 5)."