Core Concepts
LLMs can benefit from using non-natural-language (non-NL) formats for reasoning and communication, improving both efficiency and effectiveness.
Summary
The study explores alternative formats beyond natural language for LLM reasoning and communication. It shows that allowing LLMs to autonomously select a suitable format yields significant gains in efficiency and effectiveness. Different tasks may call for different formats, and a chosen format can generalize across tasks and transfer between different LLMs. The communication formats LLMs settle on resemble traditional Agent Communication Languages, emphasizing clarity, structure, brevity, and efficiency.
Natural language has long been the primary format for human cognition.
Large Language Models (LLMs) have seen various non-NL formats during pre-training.
Allowing LLMs to select suitable formats autonomously improves reasoning efficiency.
Different tasks may require different formats for optimal performance.
The chosen format can be generalized across tasks and transferred between different LLMs.
Communication formats decided by LLMs emphasize clarity, structure, brevity, and efficiency.
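The autonomous format selection described above can be sketched as a two-stage prompting scheme: first ask the model which non-NL format suits the task, then ask it to reason in that format. This is a minimal illustrative sketch; the prompt wording and function names are assumptions, not taken from the paper.

```python
# Hypothetical two-stage prompting sketch: stage 1 lets the LLM choose
# a reasoning format, stage 2 asks it to reason in that format.
# Prompt wording is illustrative, not quoted from the study.

def build_format_selection_prompt(task: str) -> str:
    """Stage 1: ask the LLM which format suits the task best."""
    return (
        "Before solving the task, choose the single most suitable format "
        "for your reasoning (e.g. an ordered list, a logical expression, "
        "a markdown table, or plain natural language).\n"
        f"Task: {task}\n"
        "Reply with the format name only."
    )

def build_reasoning_prompt(task: str, chosen_format: str) -> str:
    """Stage 2: ask the LLM to reason in the format it selected."""
    return (
        f"Solve the task, writing all intermediate reasoning as: "
        f"{chosen_format}.\n"
        f"Task: {task}"
    )
```

The two prompts would be sent sequentially to any chat-completion API; the sketch only builds the prompt strings, so it carries no API dependency.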
Statistics
Allowing LLMs to autonomously select the most suitable format before reasoning or communicating leads to a 3.3% to 5.7% improvement in reasoning efficiency.
Up to a 72.7% reduction in token usage in multi-agent communication is observed while maintaining communicative effectiveness.
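To make the token-reduction claim concrete, here is an illustrative comparison of a verbose natural-language agent message with a compact structured equivalent, counted with a naive whitespace tokenizer. Both messages and the resulting figure are invented for illustration; real savings depend on the model's actual tokenizer.

```python
# Illustrative only: a naive whitespace "tokenizer" comparing a chatty
# NL agent message with a compact structured equivalent. The messages
# are invented, not taken from the study.

def whitespace_tokens(text: str) -> int:
    """Crude token count: number of whitespace-separated pieces."""
    return len(text.split())

nl_message = (
    "Hello, I have finished analyzing the sales data you sent me. "
    "The total revenue for the first quarter was 1.2 million dollars, "
    "and the best performing region was the West, so I suggest we "
    "allocate more of the marketing budget there next quarter."
)

structured_message = (
    "RESULT q1_revenue=1.2M; top_region=West; "
    "ACTION increase_marketing_budget(region=West, quarter=next)"
)

reduction = 1 - whitespace_tokens(structured_message) / whitespace_tokens(nl_message)
print(f"token reduction: {reduction:.0%}")
```

The structured message conveys the same decision-relevant content in a fraction of the tokens, which is the mechanism behind the savings reported above.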
Quotes
"We challenge the default use of NL by exploring the utility of non-NL formats in these contexts."
"LLMs can leverage many non-NL formats such as ordered lists, logical expressions, and markdown tables to reason better."