FormHe is a novel tool that combines logic-based techniques and Large Language Models (LLMs) to automatically debug and repair Answer Set Programming (ASP) code, particularly targeting novice programmers in educational settings.
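FormHe's pipeline can be pictured as a solver-then-LLM loop: run the student's program through an ASP solver, compare its answer sets against a reference, and hand any discrepancy to an LLM as repair context. Below is a minimal sketch of that loop using the clingo Python API, where `llm_suggest_fix` is a hypothetical stand-in for the LLM call; this illustrates the idea, not FormHe's actual implementation.

```python
# Sketch of a solver-then-LLM repair loop in the spirit of FormHe
# (illustrative, not FormHe's actual code). Requires: pip install clingo
import clingo

def answer_sets(program: str) -> list[frozenset]:
    """Ground and solve an ASP program, collecting all answer sets."""
    ctl = clingo.Control(["0"])          # "0" = enumerate all models
    ctl.add("base", [], program)
    ctl.ground([("base", [])])
    models = []
    ctl.solve(on_model=lambda m: models.append(
        frozenset(str(atom) for atom in m.symbols(shown=True))))
    return models

def check_and_repair(student_prog: str, reference_prog: str) -> str:
    """Compare answer sets; on mismatch, ask an LLM for a repair."""
    got, expected = answer_sets(student_prog), answer_sets(reference_prog)
    if set(got) == set(expected):
        return student_prog                      # already correct
    prompt = (
        "This ASP program is incorrect.\n"
        f"Program:\n{student_prog}\n"
        f"Expected answer sets: {expected}\n"
        f"Got: {got}\n"
        "Suggest a minimal fix."
    )
    return llm_suggest_fix(prompt)               # hypothetical LLM call
```

In an educational setting, the reference program would come from the instructor, and the answer-set discrepancy doubles as feedback for the student.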
Large Language Models (LLMs) show promise for Automated Program Repair (APR), but the beam search typically used for patch generation is memory-hungry, limiting how many candidates can be explored. FLAMES, a novel approach combining LLM-based and search-based APR, replaces beam search with a best-first search algorithm guided by semantic feedback, improving both the efficiency and effectiveness of LLM-based program repair.
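The memory argument is that beam search keeps every beam alive at every decoding step, while best-first search maintains a single priority queue and expands only the most promising partial patch, letting feedback signals re-rank candidates cheaply. A generic best-first decoding sketch follows; the `top_k_next_tokens` interface is a placeholder for the LLM's next-token distribution, not FLAMES's API.

```python
# Generic best-first patch decoding, illustrating the idea behind
# FLAMES (not its actual implementation).
import heapq
import itertools

def best_first_decode(top_k_next_tokens, eos_token, k=5, max_len=64,
                      max_expansions=1000):
    """Expand only the highest-scoring partial patch at each step.

    top_k_next_tokens(prefix) -> [(token, logprob), ...] is a
    placeholder for the LLM's next-token distribution.
    """
    tie = itertools.count()               # tie-breaker for equal scores
    frontier = [(0.0, next(tie), [])]     # (negated logprob, tie, tokens)
    complete = []
    for _ in range(max_expansions):
        if not frontier:
            break
        neg_score, _, prefix = heapq.heappop(frontier)  # best candidate
        if (prefix and prefix[-1] == eos_token) or len(prefix) >= max_len:
            complete.append((-neg_score, prefix))
            continue
        for token, logprob in top_k_next_tokens(prefix)[:k]:
            heapq.heappush(frontier,
                           (neg_score - logprob, next(tie), prefix + [token]))
    return sorted(complete, key=lambda c: c[0], reverse=True)
```

Unlike beam search, which materializes a fixed number of hypotheses per step, the queue here grows only where the model is genuinely uncertain, and a feedback signal (e.g., test outcomes on completed patches) can prune or re-prioritize it.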
This paper proposes mPRED, a novel machine-learning approach for multi-task program error repair and explanatory diagnosis, aiming to improve the accuracy and efficiency of identifying and fixing program errors while providing clear explanations to programmers.
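Multi-task setups of this kind typically share one encoder between a repair head and a diagnosis head and optimize a weighted sum of the task losses. A generic PyTorch sketch of that pattern follows, with illustrative dimensions and heads that are not mPRED's published architecture.

```python
# Generic multi-task repair + diagnosis model (illustrative only;
# not mPRED's published architecture).
import torch.nn as nn

class MultiTaskRepair(nn.Module):
    def __init__(self, vocab_size, num_error_types, d_model=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead=8, batch_first=True),
            num_layers=4)
        self.repair_head = nn.Linear(d_model, vocab_size)        # per-token fix
        self.explain_head = nn.Linear(d_model, num_error_types)  # error class

    def forward(self, tokens):
        h = self.encoder(self.embed(tokens))             # (batch, len, d_model)
        repair_logits = self.repair_head(h)              # token-level repair
        diag_logits = self.explain_head(h.mean(dim=1))   # pooled -> diagnosis
        return repair_logits, diag_logits

def multitask_loss(repair_logits, diag_logits, repair_tgt, diag_tgt,
                   alpha=0.5):
    """Weighted sum of the repair loss and the diagnosis loss."""
    ce = nn.CrossEntropyLoss()
    return (alpha * ce(repair_logits.transpose(1, 2), repair_tgt)
            + (1 - alpha) * ce(diag_logits, diag_tgt))
```

The diagnosis head here predicts an error category; producing a free-form textual explanation, as mPRED aims to, would instead require a decoder on top of the shared encoder.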
ContrastRepair significantly improves program repair efficiency by prompting Large Language Models with contrastive test cases (pairing a failing test with a similar passing one) as informative feedback, outperforming existing repair methods.
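The core prompt-construction step is selecting, for each failing test, the passing test that resembles it most, so the pair isolates a small behavioral difference for the LLM to reason about. A minimal sketch of that step follows; the test-case fields and similarity measure are illustrative assumptions, not ContrastRepair's exact design.

```python
# Sketch of contrastive prompt construction in the spirit of
# ContrastRepair (field names and similarity metric are assumptions).
import difflib

def closest_passing_test(failing, passing_tests):
    """Pick the passing test whose input is most similar to the
    failing one, so the pair isolates a minimal behavioral difference."""
    return max(passing_tests,
               key=lambda t: difflib.SequenceMatcher(
                   None, failing["input"], t["input"]).ratio())

def build_contrastive_prompt(buggy_code, failing, passing_tests):
    """Assemble an LLM prompt from the buggy code and a test pair."""
    passing = closest_passing_test(failing, passing_tests)
    return (
        "The following program has a bug.\n"
        f"{buggy_code}\n"
        f"Passing test: input={passing['input']!r} "
        f"-> output={passing['output']!r}\n"
        f"Failing test: input={failing['input']!r} "
        f"-> expected={failing['expected']!r}, got={failing['actual']!r}\n"
        "The inputs are similar, so the difference pinpoints the bug. "
        "Explain the difference and fix the program."
    )
```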
T5APR is a novel neural program repair approach that leverages transformer models to offer a unified, multilingual solution for bug fixing, showing competitive performance against state-of-the-art techniques across multiple programming languages.
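A unified multilingual repairer means one seq2seq checkpoint proposes ranked candidate patches for buggy hunks regardless of source language, with each candidate then validated against the test suite. A minimal sketch with Hugging Face transformers follows; the checkpoint, the `<BUG>` input markup, and the decoding settings are placeholders, not T5APR's exact pipeline.

```python
# Sketch of multilingual seq2seq patch generation in the style of
# T5APR-like tools (checkpoint and <BUG> markup are placeholders).
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("Salesforce/codet5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("Salesforce/codet5-small")

def candidate_patches(buggy_hunk: str, context: str, n: int = 10):
    """Generate n ranked candidate fixes for a buggy hunk; the same
    model serves every supported programming language."""
    inputs = tokenizer(f"{context} <BUG> {buggy_hunk}",
                       return_tensors="pt", truncation=True, max_length=512)
    outputs = model.generate(**inputs, num_beams=n,
                             num_return_sequences=n, max_length=128)
    return [tokenizer.decode(o, skip_special_tokens=True) for o in outputs]

# Each candidate would then be compiled and run against the failing
# tests; the first test-passing (plausible) patch is reported.
```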
RepairLLaMA combines repair-tailored code representations with parameter-efficient fine-tuning of code language models, outperforming baseline fine-tuning strategies at fixing bugs.
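In practice, parameter-efficient fine-tuning usually means LoRA adapters: only small low-rank matrices are trained while the base model stays frozen, and training examples are formatted in a repair-specific representation such as a fault-marked function paired with its fix. A sketch of that setup with the peft library follows; the base model, fault markers, and example pair are illustrative, not RepairLLaMA's exact configuration.

```python
# Sketch of LoRA fine-tuning for repair, in the spirit of RepairLLaMA
# (model name, markers, and example are placeholders).
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

model = AutoModelForCausalLM.from_pretrained("codellama/CodeLlama-7b-hf")
tokenizer = AutoTokenizer.from_pretrained("codellama/CodeLlama-7b-hf")

# LoRA: train small low-rank adapters instead of all base parameters.
config = LoraConfig(r=16, lora_alpha=32, lora_dropout=0.05,
                    target_modules=["q_proj", "v_proj"],
                    task_type="CAUSAL_LM")
model = get_peft_model(model, config)
model.print_trainable_parameters()  # typically <1% of total parameters

# Tailored code representation: the buggy function with fault-location
# markers as input, the fixed code as the training target.
example_input = (
    "def mid(a, b, c):\n"
    "    <BUG_START> return b <BUG_END>\n"
)
example_target = "    return sorted([a, b, c])[1]\n"
```

Freezing the base model keeps the memory and compute cost of fine-tuning low while the adapters specialize the model to the repair input/output format.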