Key Concepts
Under ergodic-type conditions on the data distributions, universal rules achieve optimal error bounds for classification and regression, with applications in wireless networks.
Summary
The paper studies classification and regression error bounds for inhomogeneous data that are independent but not necessarily identically distributed.
For regression:
- It establishes ergodic-type sufficient conditions that guarantee the achievability of the Bayes error bound using universal rules.
- It performs a similar analysis for k-nearest neighbor regression and obtains optimal error bounds.
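The k-nearest neighbor regressor mentioned above can be sketched generically; the following is not the paper's construction, just a minimal 1-D k-NN sketch in which each sample is drawn from its own (drifting) distribution, so the data are independent but not identically distributed. The regression function `m`, the sampling intervals, and all parameter choices are illustrative assumptions.

```python
import numpy as np

def knn_regress(x_train, y_train, x_query, k):
    """Predict at x_query by averaging the responses of the k nearest
    training points (1-D Euclidean distance)."""
    nearest = np.argsort(np.abs(x_train - x_query))[:k]
    return y_train[nearest].mean()

rng = np.random.default_rng(0)
n = 2000
# Independent but NOT identically distributed inputs: sample i is drawn
# from its own uniform interval [lows[i], lows[i] + 0.5], drifting with i.
lows = np.linspace(0.0, 0.5, n)
x = rng.uniform(lows, lows + 0.5)
m = lambda t: np.sin(2.0 * np.pi * t)   # illustrative regression function
y = m(x) + rng.normal(0.0, 0.1, n)      # additive noise

pred = knn_regress(x, y, x_query=0.25, k=25)  # target value m(0.25) = 1
```

With enough samples near the query point, the local average concentrates around the regression function despite the inhomogeneous sampling.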
For classification:
- It derives bounds for the minimum classification error probability of inhomogeneous data when the noise statistics satisfy an ergodic-type condition.
- It shows that under these conditions, universal classifiers can achieve the Bayes error bound.
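As a concrete toy illustration of the Bayes error bound (my example, not taken from the paper): for binary classification with equal priors and Gaussian class-conditional densities N(-1, 1) and N(+1, 1), the Bayes rule thresholds at 0, and its error Phi(-1) can be verified by Monte Carlo.

```python
import numpy as np
from math import erf, sqrt

def std_normal_cdf(z):
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# Bayes error for equal priors, class-conditionals N(-1, 1) and N(+1, 1):
# the optimal rule predicts label 1 iff x > 0 and errs with prob. Phi(-1).
bayes_error = std_normal_cdf(-1.0)  # about 0.1587

# Monte Carlo check of the Bayes rule on simulated data.
rng = np.random.default_rng(1)
n = 200_000
labels = rng.integers(0, 2, n)
x = rng.normal(2.0 * labels - 1.0, 1.0)
empirical_error = np.mean((x > 0).astype(int) != labels)
```

No classifier can beat `bayes_error` on this distribution; the paper's point is that universal classifiers attain this bound asymptotically under the ergodic-type averaging condition.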
The results are then illustrated in the context of wireless network applications, such as estimating transmission power levels and primary user detection in cognitive radio networks.
Statistics
The paper does not contain any explicit numerical data or statistics. The key results are theoretical bounds and conditions derived for the regression and classification error probabilities.
Quotes
"If h is uniformly continuous and sup_x |1/n Σ_i^n f_i(x) - f(x)| → 0 as n → ∞, where f is a uniformly continuous density, then there is a universally consistent regressor that achieves the optimal average variance."
"If sup_x |1/n Σ_i^n |h_i(x) - h(x)| → 0, then the minimum possible average classification error probability is equal to the Bayes error bound."