LLaMA-Excitor: A Lightweight Method for Enhancing Instruction-Following Capabilities of Large Language Models
LLaMA-Excitor is a lightweight method that stimulates the instruction-following potential of large language models such as LLaMA by gradually steering attention toward worthwhile information, without directly modifying the intermediate hidden states during the self-attention computation.
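The core idea — re-weighting attention rather than altering the hidden states themselves — can be illustrated with a conceptual sketch. This is not the paper's actual implementation; the function name, the additive `excitation` term, and the NumPy setting are all illustrative assumptions. It shows how an added term can shift where attention goes while the value vectors (the information being mixed) stay untouched:

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def excited_attention(q, k, v, excitation):
    """Scaled dot-product attention where an additive 'excitation' term
    (illustrative, not the paper's exact mechanism) reshapes the attention
    distribution. The value matrix v is never modified, so the hidden
    states flowing through the value path are left intact."""
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)            # standard attention scores
    weights = softmax(scores + excitation)   # excitation redistributes attention
    return weights @ v                       # same values, different mixing weights

rng = np.random.default_rng(0)
q = rng.normal(size=(4, 8))
k = rng.normal(size=(4, 8))
v = rng.normal(size=(4, 8))

# With zero excitation this reduces to plain self-attention;
# a nonzero excitation only changes how the unchanged v rows are mixed.
plain = excited_attention(q, k, v, np.zeros((4, 4)))
excited = excited_attention(q, k, v, rng.normal(size=(4, 4)))
```

In both calls the output is a convex combination of the same rows of `v`; only the mixing weights differ, which is the sense in which the hidden states are not directly changed.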