Core Concepts
Human attention enhances Transformer models for neural code summarization.
Summary
Neural code summarization benefits from integrating human attention with machine attention. EyeTrans introduces a method for combining eye-tracking data with the Transformer architecture, improving performance by up to 29.91% in Functional Summarization and up to 6.39% in General Code Summarization. The study analyzes programmers' gaze patterns and common attention switches, and examines the impact of incorporating human attention into Transformer models.
EyeTrans leverages eye-tracking data to enhance neural code summarization by integrating human and machine attention. The study demonstrates significant improvements in model performance and highlights structured code reading patterns observed in programmers.
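To illustrate the general idea of combining human and machine attention, the sketch below adds a bias derived from per-token fixation weights to standard scaled dot-product attention scores. This is a hypothetical, minimal formulation for illustration only; the function name, the additive log-bias design, and the `alpha` mixing parameter are assumptions, not the actual EyeTrans method.

```python
import numpy as np

def attention_with_human_bias(Q, K, V, human_weights, alpha=0.5):
    """Scaled dot-product attention with an additive bias from human
    fixation weights. Hypothetical illustration, not the exact
    EyeTrans formulation."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # machine attention logits
    # Bias each key position by its (log) human fixation weight,
    # so heavily-fixated tokens receive more attention mass.
    scores = scores + alpha * np.log(human_weights + 1e-9)[None, :]
    # Row-wise softmax over key positions
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

# Toy example: 4 code tokens with 8-dimensional embeddings
rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
# Normalized fixation durations over the 4 tokens (made-up values)
human = np.array([0.1, 0.5, 0.3, 0.1])
out, w = attention_with_human_bias(Q, K, V, human)
print(out.shape)  # (4, 8)
```

With `alpha=0`, this reduces to ordinary attention; larger `alpha` pulls the model's attention distribution toward the human gaze distribution.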
Statistics
Integrating human attention leads to an improvement of up to 29.91% in Functional Summarization and up to 6.39% in General Code Summarization performance.