Core Concepts
The authors propose an Approximate Message Passing (AMP) algorithm to efficiently estimate the signals and change point locations in high-dimensional linear regression. They provide an exact asymptotic characterization of the algorithm's performance and show how to quantify uncertainty in the estimates.
Abstract
The paper studies the problem of localizing change points in high-dimensional linear regression, where the regression coefficients change at unknown locations along the sample. The authors propose an Approximate Message Passing (AMP) algorithm to jointly estimate the signals and the change point locations.
Key highlights:
The AMP algorithm can exploit any prior information on the signal, noise, and change points.
The authors provide an exact asymptotic characterization of the algorithm's estimation performance in the high-dimensional limit.
The AMP algorithm enables uncertainty quantification in the form of an efficiently computable approximate posterior distribution, whose asymptotic form is also characterized.
The authors validate the theory through numerical experiments on synthetic data and images, demonstrating the favorable performance of their estimators.
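The "exact asymptotic characterization" in the highlights refers to a state evolution recursion. As a point of reference, the standard scalar state evolution for AMP in the linear model y = Ax + w (separable denoiser eta_t, sampling ratio delta = n/p, noise variance sigma^2) takes the form below; the paper's change-point setting generalizes this, so the scalar recursion here is illustrative rather than the authors' exact formulation:

```latex
\tau_{t+1}^2 \;=\; \sigma^2 \;+\; \frac{1}{\delta}\,
  \mathbb{E}\!\left[\left(\eta_t(X + \tau_t Z) - X\right)^2\right],
\qquad Z \sim \mathcal{N}(0,1)\ \text{independent of}\ X,
```

where X is drawn from the signal prior and tau_t tracks the effective noise level of the AMP iterate at iteration t.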
The content is structured as follows:
Introduction to the problem of change point detection in high-dimensional linear regression.
Preliminaries on the model assumptions and notation.
Description of the AMP algorithm and the main theoretical results:
State evolution characterization of the AMP iterates
Choosing the denoising functions to optimize performance
Change point estimation and uncertainty quantification
Experimental results on synthetic data and images, comparing AMP to other state-of-the-art algorithms.
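To make the algorithmic outline above concrete, here is a minimal sketch of a standard AMP iteration for the linear model y = Ax + w with a soft-thresholding denoiser. This is the textbook AMP recursion from the literature, not the paper's change-point-aware algorithm; the denoiser choice, the threshold rule theta_t = alpha * tau_t, and all parameter values are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, theta):
    # Separable soft-thresholding denoiser eta_t.
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def amp(A, y, n_iter=30, alpha=2.0):
    # Standard AMP for y = A x + w (illustrative sketch, not the
    # paper's change-point variant).
    n, p = A.shape
    x = np.zeros(p)        # current signal estimate
    z = y.copy()           # Onsager-corrected residual
    for _ in range(n_iter):
        tau = np.linalg.norm(z) / np.sqrt(n)   # effective noise level
        v = x + A.T @ z                        # pseudo-data: approx x + tau * noise
        x_new = soft_threshold(v, alpha * tau)
        # Onsager correction: average derivative of the denoiser,
        # i.e. the number of surviving entries divided by n.
        b = np.count_nonzero(x_new) / n
        z = y - A @ x_new + b * z
        x = x_new
    return x

# Small synthetic demo (assumed setup): sparse signal, Gaussian design.
rng = np.random.default_rng(0)
n, p, k = 200, 400, 10
A = rng.normal(size=(n, p)) / np.sqrt(n)   # columns of unit norm on average
x_true = np.zeros(p)
x_true[:k] = 1.0
y = A @ x_true + 0.01 * rng.normal(size=n)
x_hat = amp(A, y)
```

In the paper's setting, the denoiser would additionally exploit prior information on the signals and change points, and state evolution characterizes the effective noise level tau_t exactly in the high-dimensional limit.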
Stats
The content does not report specific numerical values or statistics; it focuses on the theoretical analysis and algorithmic development.