This paper introduces GradVI, a gradient-based optimization method for variational empirical Bayes (VEB) in multiple linear regression, demonstrating its advantages over coordinate ascent variational inference (CAVI) in scenarios with correlated predictors or design matrices that admit fast matrix-vector multiplication, particularly in trend filtering applications.
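To make the gradient-based view concrete, here is a minimal sketch (not the paper's implementation) of the penalized least-squares formulation solved with a quasi-Newton method, where each iteration touches the design matrix only through the products `X @ b` and `X.T @ r`; the smooth pseudo-Huber penalty below is a stand-in for the prior-derived VEB penalty, and all parameter values are illustrative.

```python
# Minimal sketch: VEB regression viewed as penalized least squares, minimized
# with L-BFGS so that each iteration needs only two matrix-vector products.
# The pseudo-Huber penalty is a placeholder for the paper's VEB-derived penalty.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, p, sigma2, lam, delta = 200, 50, 1.0, 1.0, 0.1

X = rng.standard_normal((n, p))
b_true = np.zeros(p)
b_true[:5] = 2.0
y = X @ b_true + rng.standard_normal(n)

def objective(b):
    r = y - X @ b                       # one matvec
    penalty = lam * np.sum(delta**2 * (np.sqrt(1 + (b / delta)**2) - 1))
    grad_pen = lam * b / np.sqrt(1 + (b / delta)**2)
    f = 0.5 * r @ r / sigma2 + penalty
    g = -X.T @ r / sigma2 + grad_pen    # one transposed matvec
    return f, g

res = minimize(objective, np.zeros(p), jac=True, method="L-BFGS-B")
print("recovered leading coefficients:", np.round(res.x[:6], 2))
```

Because the objective and gradient use `X` only through matvecs, the same code runs unchanged when `X` is replaced by a fast implicit operator, which is the regime where the paper reports gains over coordinate ascent.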
This paper introduces functional normalizing flow, a novel discretization-invariant variational inference method for efficiently solving inverse problems of partial differential equations in infinite-dimensional function spaces.
This paper investigates the consistency and asymptotic normality of variational estimators in the Poisson lognormal model using the sandwich estimator, and shows that the sandwich estimator is more effective than the variational Fisher information method for estimating the variance of variational estimators.
This paper proposes a variance estimation method based on the sandwich estimator for parameter estimation in the Poisson lognormal (PLN) model, widely used in count data analysis, enabling the construction of confidence intervals that conventional variational estimation has struggled to provide, and validates its effectiveness through simulations and real data analysis.
This research paper proposes and evaluates the sandwich estimator, derived from M-estimation theory, as a more accurate alternative to the variational Fisher information method for estimating the variance of parameters in the Poisson Lognormal (PLN) model, particularly for high-dimensional data.
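As a minimal illustration of the underlying idea (not the PLN-specific variational estimator from these papers), the sketch below applies the generic M-estimation sandwich formula to plain Poisson regression: the robust variance is A⁻¹BA⁻¹/n, where A is the average negative Hessian of the log-likelihood and B the average outer product of per-observation scores.

```python
# Minimal sketch of the sandwich (robust) variance estimator from
# M-estimation theory, illustrated on ordinary Poisson regression.
import numpy as np

rng = np.random.default_rng(1)
n, d = 2000, 3
X = rng.standard_normal((n, d))
beta_true = np.array([0.5, -0.3, 0.2])
y = rng.poisson(np.exp(X @ beta_true))

# Fit by Newton's method (the Poisson log-likelihood is concave).
beta = np.zeros(d)
for _ in range(25):
    mu = np.exp(X @ beta)
    score = X.T @ (y - mu)
    hess = X.T @ (mu[:, None] * X)      # negative Hessian
    beta += np.linalg.solve(hess, score)

mu = np.exp(X @ beta)
scores = (y - mu)[:, None] * X          # per-observation scores s_i
A = X.T @ (mu[:, None] * X) / n         # average negative Hessian
B = scores.T @ scores / n               # average score outer product
A_inv = np.linalg.inv(A)

sandwich = A_inv @ B @ A_inv / n        # robust variance A^{-1} B A^{-1} / n
model_based = A_inv / n                 # Fisher-information analogue

print("sandwich SEs:   ", np.sqrt(np.diag(sandwich)))
print("model-based SEs:", np.sqrt(np.diag(model_based)))
```

When the model is correctly specified, A ≈ B and the two estimates agree; the papers' point is that for variational estimators this equality fails, so the sandwich form remains valid while the Fisher-information form does not.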
This research paper introduces a novel approach to variational inference for pairwise Markov Random Fields on the Boolean hypercube, leveraging quantum relaxations of the Kullback-Leibler divergence to derive tractable upper bounds on the log-partition function.
EigenVI is a new algorithm for black-box variational inference that leverages orthogonal function expansions to construct flexible variational approximations and utilizes score matching to derive a computationally efficient optimization procedure based on solving a minimum eigenvalue problem.
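A hedged 1-D sketch of the reduction the summary describes: modeling q ∝ f² with f linear in a basis turns the score-matching objective into a quadratic form in the coefficients, which a normalization constraint converts into a minimum eigenvalue problem. The Gaussian-bump basis, proposal, and weighting below are simplifications of the paper's construction (EigenVI's orthonormal expansions make the Gram matrix the identity, giving an ordinary rather than generalized eigenproblem).

```python
# Minimal 1-D sketch of the eigenvalue reduction behind score-matching VI:
# q(x) = f(x)^2 / Z with f(x) = sum_k alpha_k * phi_k(x). Matching the score
# of q to the target score s(x), after multiplying through by f^2, yields the
# quadratic form alpha^T M alpha; the normalization constraint alpha^T S
# alpha = 1 makes this a generalized minimum eigenvalue problem.
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(2)
target_score = lambda x: -x                 # standard normal target

# Gaussian-bump basis (not orthonormal, hence the generalized problem).
centers = np.linspace(-3, 3, 9)
h = 0.8
phi = lambda x: np.exp(-(x[:, None] - centers) ** 2 / (2 * h**2))
dphi = lambda x: -(x[:, None] - centers) / h**2 * phi(x)

# Gram matrix S_kl = integral of phi_k * phi_l (closed form for Gaussians).
S = np.sqrt(np.pi) * h * np.exp(-(centers[:, None] - centers) ** 2 / (4 * h**2))

# Monte Carlo samples from a broad proposal.
x = rng.normal(0.0, 2.0, size=4000)
A = 2 * dphi(x) - phi(x) * target_score(x)[:, None]   # score-residual features
M = A.T @ A / len(x)

# Smallest generalized eigenvector of (M, S) gives the best-fitting q.
vals, vecs = eigh(M, S)
alpha = vecs[:, 0]                          # scipy normalizes alpha^T S alpha = 1

grid = np.array([0.0, 1.0, 2.0])
q = (phi(grid) @ alpha) ** 2                # fitted density (integrates to 1)
p = np.exp(-grid**2 / 2) / np.sqrt(2 * np.pi)
print("fitted q:", np.round(q, 3), " target p:", np.round(p, 3))
```

The key point matches the summary: no iterative optimization is needed, since the entire fit reduces to a single eigendecomposition once M is assembled from samples.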
The paper introduces a new algorithm, patched batch-and-match (pBaM), for scaling score-based variational inference to high-dimensional problems by efficiently approximating full covariance matrices with a combination of low-rank and diagonal structures.
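The sketch below shows only the structured-covariance representation such methods rely on, not the pBaM updates themselves: with Sigma = D + UUᵀ (D diagonal, U of rank r much smaller than d), sampling costs O(dr) and log-densities cost O(dr²) via the Woodbury identity and the matrix determinant lemma. All names and sizes are illustrative.

```python
# Minimal sketch of low-rank-plus-diagonal Gaussian covariance operations:
# Sigma = D + U U^T, with sampling and log-density computed without ever
# forming the dense d x d matrix.
import numpy as np

rng = np.random.default_rng(3)
d, r = 500, 5
mu = rng.standard_normal(d)
D = 0.5 + rng.random(d)                   # diagonal part (variances)
U = rng.standard_normal((d, r)) / np.sqrt(d)

def sample(n):
    # x = mu + D^{1/2} eps1 + U eps2 has covariance D + U U^T.
    eps1 = rng.standard_normal((n, d))
    eps2 = rng.standard_normal((n, r))
    return mu + eps1 * np.sqrt(D) + eps2 @ U.T

def log_density(x):
    # Woodbury: Sigma^{-1} = D^{-1} - D^{-1} U (I + U^T D^{-1} U)^{-1} U^T D^{-1}
    z = x - mu
    Dinv_z = z / D
    C = np.eye(r) + U.T @ (U / D[:, None])            # r x r capacitance matrix
    w = np.linalg.solve(C, U.T @ Dinv_z)
    quad = z @ Dinv_z - (U.T @ Dinv_z) @ w
    # Determinant lemma: log|Sigma| = log|C| + sum_i log D_i
    logdet = np.linalg.slogdet(C)[1] + np.sum(np.log(D))
    return -0.5 * (quad + logdet + d * np.log(2 * np.pi))

x = sample(1)[0]
# Check against the dense computation (feasible here because d is modest).
Sigma = np.diag(D) + U @ U.T
z = x - mu
dense = -0.5 * (z @ np.linalg.solve(Sigma, z)
                + np.linalg.slogdet(Sigma)[1] + d * np.log(2 * np.pi))
print(np.isclose(log_density(x), dense))
```

This is the data structure that lets a score-based method like pBaM track full-covariance information at a per-iteration cost linear in the dimension.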
This paper proposes a novel approach to variational inference (VI) that leverages Wasserstein gradient flows (WGFs) over the space of variational parameters, offering a unified perspective on existing methods and enabling efficient approximation of complex posterior distributions, particularly with mixture models.
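As a minimal, hedged illustration of a Wasserstein-gradient-flow update in the simplest variational family (a single Gaussian on a Gaussian target, not the paper's general flow over variational parameters or mixtures), the sketch below descends KL(q || p) using the closed-form expectations of the gradient and Hessian of V = -log p under q.

```python
# Minimal sketch of a Bures-Wasserstein-style gradient step for Gaussian VI:
# q = N(mu, Sigma) flows toward the Gaussian target p = N(a, B) under the KL
# objective. For a Gaussian target, E_q[grad V] = B^{-1}(mu - a) and
# E_q[hess V] = B^{-1} are available in closed form.
import numpy as np

a = np.array([1.0, -2.0])                     # target mean
B = np.array([[2.0, 0.8], [0.8, 1.0]])        # target covariance
B_inv = np.linalg.inv(B)

mu = np.zeros(2)
Sigma = np.eye(2)
h = 0.1                                       # step size

for _ in range(200):
    mu = mu - h * B_inv @ (mu - a)            # mean step: E_q[grad V]
    M = B_inv - np.linalg.inv(Sigma)          # E_q[hess V] - Sigma^{-1}
    T = np.eye(2) - h * M
    Sigma = T @ Sigma @ T                     # keeps Sigma symmetric PSD

print("mu ->", np.round(mu, 3))               # converges to a
print("Sigma ->\n", np.round(Sigma, 3))       # converges to B
```

The fixed point is exactly mu = a, Sigma = B; the paper's contribution is to run flows of this kind over richer variational parameterizations, where the same gradient-flow mechanics apply component by component.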
This paper introduces a novel variational inference algorithm based on Bregman proximal gradient descent that minimizes the regularized Rényi divergence between a target distribution and an approximating distribution from an exponential family, offering theoretical convergence guarantees and practical advantages over existing methods.
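A hedged sketch of the Bregman step itself: with the exponential-family log-partition A as the Bregman potential, the proximal-gradient update is an ordinary gradient step in the mean parameters mu = grad A(lambda). For brevity the objective below is a closed-form KL with finite-difference gradients, standing in for the paper's regularized Rényi divergence and its estimators.

```python
# Minimal sketch of a Bregman proximal gradient (mirror descent) step for the
# 1-D Gaussian exponential family. The step
#   lambda_{t+1} = argmin <grad f(lambda_t), lambda> + D_A(lambda, lambda_t)/gamma
# has first-order condition grad A(lambda_{t+1}) = grad A(lambda_t) -
# gamma * grad f(lambda_t), i.e., a plain gradient step in the mean parameters.
import numpy as np

a, b = 2.0, 0.5                 # target p = N(a, b)

def to_moments(lam):
    # Natural parameters (lam1, lam2) -> (mean m, variance v).
    v = -1.0 / (2.0 * lam[1])
    return lam[0] * v, v

def f(lam):
    # Stand-in objective: closed-form KL(q || p) between Gaussians.
    m, v = to_moments(lam)
    return 0.5 * ((v + (m - a) ** 2) / b - 1.0 - np.log(v / b))

def grad_f(lam, eps=1e-6):
    # Central finite differences, to keep the example self-contained.
    g = np.zeros(2)
    for i in range(2):
        e = np.zeros(2)
        e[i] = eps
        g[i] = (f(lam + e) - f(lam - e)) / (2 * eps)
    return g

lam = np.array([0.0, -0.5])     # start at q = N(0, 1)
gamma = 0.1
for _ in range(500):
    m, v = to_moments(lam)
    mu_param = np.array([m, v + m * m])        # mean parameters grad A(lambda)
    mu_param -= gamma * grad_f(lam)            # Bregman step in the dual space
    m_new = mu_param[0]
    v_new = mu_param[1] - m_new ** 2           # invert grad A
    lam = np.array([m_new / v_new, -1.0 / (2.0 * v_new)])

print("fitted (mean, var):", np.round(to_moments(lam), 3), " target:", (a, b))
```

The fitted parameters converge to the target (2.0, 0.5); the paper's analysis provides convergence guarantees for this family of updates under the Rényi objective.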