Core Concept
Large Language Models can leverage the diversity and quality of solutions in a Quality-Diversity archive to efficiently generate novel, high-performing solutions.
Summary
This work introduces "In-context QD", a framework that utilizes the pattern-matching and generative capabilities of pre-trained Large Language Models (LLMs) to generate new solutions for Quality-Diversity (QD) optimization.
The key insights are:
QD archives provide a diverse set of high-quality examples that can be effectively leveraged by LLMs through in-context learning to generate novel and improved solutions.
The prompt template, context structure, and query strategy are critical design choices that enable LLMs to extract relevant patterns from the QD archive and generate solutions that improve both quality and diversity.
Experiments across a range of QD benchmarks, including black-box optimization (BBO) functions, redundant robotic arm control, and hexapod locomotion, demonstrate that In-context QD outperforms conventional QD baselines such as MAP-Elites, especially in finding regions of high fitness.
Ablation studies highlight the importance of including both fitness and feature information in the prompt template, as well as the benefits of structuring the context to provide helpful heuristics for the LLM.
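The prompt-design points above can be illustrated with a minimal sketch. This is a hypothetical reconstruction, not the paper's actual template: the function name `build_qd_prompt`, the elite dictionary layout, and the "sort ascending by fitness" heuristic are all assumptions, chosen only to show how fitness and feature information might both appear in the context alongside a feature-conditioned query.

```python
import json

def build_qd_prompt(elites, query_features, num_examples=8):
    """Assemble an in-context prompt from QD-archive elites (hypothetical format).

    Each elite is a dict with 'params', 'fitness', and 'features'. Examples
    are sorted by ascending fitness so that "later = better" serves as a
    simple heuristic pattern, and both fitness and feature values are shown,
    mirroring the ablation findings summarized above.
    """
    # Keep the top-`num_examples` elites, ordered worst-to-best.
    context = sorted(elites, key=lambda e: e["fitness"])[-num_examples:]
    lines = ["Below are solutions with their features and fitness:"]
    for e in context:
        lines.append(
            f"params={json.dumps(e['params'])} "
            f"features={json.dumps(e['features'])} fitness={e['fitness']:.3f}"
        )
    # Query: ask for a solution in a target feature cell with higher fitness.
    lines.append(
        f"Propose params for features={json.dumps(query_features)} with higher fitness:"
    )
    return "\n".join(lines)
```

The returned string would be sent to the LLM, whose completion is parsed back into a parameter vector and evaluated before insertion into the archive.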
Overall, this work showcases the potential of using LLMs as efficient in-context generators for QD optimization, opening up new avenues for leveraging large-scale generative models in open-ended search and discovery.
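For reference, the conventional baseline that In-context QD is compared against can be sketched in a few lines. This is a generic, minimal MAP-Elites loop over a 1-D feature grid, not the paper's experimental setup; the niche discretization and Gaussian-mutation scale are illustrative assumptions.

```python
import random

def map_elites(fitness_fn, feature_fn, dim, cells, iters, seed=0):
    """Minimal MAP-Elites sketch: a 1-D feature grid with `cells` niches.

    The archive maps a niche index to its best (params, fitness) pair;
    candidates come from Gaussian mutation of randomly chosen elites.
    `feature_fn` is assumed to return a value in [0, 1].
    """
    rng = random.Random(seed)
    archive = {}  # niche index -> (params, fitness)
    for _ in range(iters):
        if archive:
            parent, _ = rng.choice(list(archive.values()))
            x = [p + rng.gauss(0, 0.1) for p in parent]  # mutate an elite
        else:
            x = [rng.uniform(-1, 1) for _ in range(dim)]  # random init
        fit, feat = fitness_fn(x), feature_fn(x)
        niche = min(max(int(feat * cells), 0), cells - 1)
        # Keep the candidate only if it beats the current elite of its niche.
        if niche not in archive or fit > archive[niche][1]:
            archive[niche] = (x, fit)
    return archive
```

In-context QD replaces the random mutation step with LLM generation conditioned on archive elites, while keeping the same archive-update rule.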
Statistics
The parameter-space dimension D and the number of niches C in the archive are varied to study the performance of In-context QD across different problem settings.
Quotes
"Effectively combining and using a large number of past inventions is not trivial to achieve. Our work looks at replicating this open-ended process of invention and innovation observed in cultural and technical evolution by (i) using foundation models to effectively ingest a large diversity of solutions to generate solutions that are both better and more novel, and (ii) using Quality-Diversity to maintain and provide these models with many diverse and high-quality examples as context for generation."
"We show that by careful construction of template, context and queries of the prompt, In-context QD can effectively generate novel and high-quality solutions for QD search over a range of parameter search space dimensions, and archive sizes."