Control-DAG introduces constrained decoding algorithms for Directed Acyclic T5 (DA-T5), a non-autoregressive text generation model, to address key limitations such as frequent out-of-vocabulary errors and an inability to faithfully generate entity names. The proposed methods incorporate lexical, vocabulary, and length constraints, significantly improving DA-T5's performance on Task-Oriented Dialogue and Data-to-Text generation tasks.
This survey offers a consolidated view of the neural data-to-text (D2T) paradigm, examining the latest approaches, benchmark datasets, and evaluation protocols. It highlights promising avenues for D2T research that focus on linguistic capabilities as well as fairness and accountability.
Introduces a method that leverages synthetic data to detect hallucinations produced by state-of-the-art language generation models.
VOLTA is an innovative framework that combines Transformers with VAEs to improve generative diversity.
Reference-free metrics show higher correlation with human judgment and greater sensitivity to language quality deficiencies than reference-based metrics.
VOLTA introduces a novel framework that enhances generative diversity in natural language generation by combining Transformer models with VAE and InfoGAN, improving quality while maintaining diversity.