Extending the Context Limit of Large Language Models through Hierarchical Context Merging
HOMER (Hierarchical cOntext MERging) is a novel training-free technique that extends the context limit of pre-trained large language models. It takes a divide-and-conquer approach: the long input is split into chunks, whose embeddings are merged hierarchically, with token reduction applied at each merge step to keep the effective context length bounded.
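The divide-and-conquer structure can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names are invented, and token importance is scored here by embedding norm as a placeholder for whatever attention-informed criterion the actual method uses inside the transformer. Chunks are represented as plain token-embedding matrices.

```python
import numpy as np

def reduce_tokens(chunk: np.ndarray, keep: int) -> np.ndarray:
    """Keep the `keep` highest-scoring token embeddings.
    Norm-based scoring is a stand-in for the real importance measure."""
    if chunk.shape[0] <= keep:
        return chunk
    scores = np.linalg.norm(chunk, axis=1)
    idx = np.sort(np.argsort(scores)[-keep:])  # preserve token order
    return chunk[idx]

def homer_merge(chunks: list[np.ndarray], keep: int) -> np.ndarray:
    """Hierarchically merge chunk embeddings, pruning tokens at each level
    so no merge ever processes more than 2 * keep tokens at once."""
    level = [reduce_tokens(c, keep) for c in chunks]
    while len(level) > 1:
        nxt = []
        for i in range(0, len(level), 2):
            merged = np.concatenate(level[i:i + 2], axis=0)  # merge a pair
            nxt.append(reduce_tokens(merged, keep))
        level = nxt
    return level[0]

# Four chunks of 8 tokens each collapse to a single 8-token representation.
rng = np.random.default_rng(0)
chunks = [rng.standard_normal((8, 4)) for _ in range(4)]
out = homer_merge(chunks, keep=8)
print(out.shape)  # (8, 4)
```

The key property shown is that peak cost per merge stays constant regardless of total input length, which is what makes the hierarchical scheme scale to long contexts.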