In-Context Pretraining: Enhancing Language Models with Related Documents
IN-CONTEXT PRETRAINING introduces a new method for pretraining language models in which related documents are grouped into the same input context, rather than concatenated at random, improving the models' ability to read and reason across document boundaries and over longer contexts.
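A minimal sketch of the core idea, under simplifying assumptions: instead of the paper's approximate-nearest-neighbor retrieval and graph traversal, this toy version orders documents by a greedy bag-of-words cosine-similarity chain, then packs the ordered documents into fixed-size contexts. All function names and the token budget are illustrative, not from the paper.

```python
# Toy illustration of grouping related documents into shared pretraining
# contexts. Stand-in for the paper's pipeline: real implementations use
# learned embeddings and approximate nearest-neighbor search at scale.
from collections import Counter
import math

def cosine(a, b):
    # cosine similarity between two bag-of-words Counters
    num = sum(a[w] * b[w] for w in a)
    den = (math.sqrt(sum(v * v for v in a.values()))
           * math.sqrt(sum(v * v for v in b.values())))
    return num / den if den else 0.0

def order_by_similarity(docs):
    # greedy nearest-neighbor chain: start from the first document and
    # repeatedly append the most similar unvisited one
    vecs = [Counter(d.lower().split()) for d in docs]
    order, remaining = [0], set(range(1, len(docs)))
    while remaining:
        nxt = max(remaining, key=lambda i: cosine(vecs[order[-1]], vecs[i]))
        order.append(nxt)
        remaining.remove(nxt)
    return [docs[i] for i in order]

def pack_contexts(docs, max_tokens=16):
    # concatenate similarity-ordered documents into contexts holding at
    # most max_tokens whitespace tokens each
    contexts, current, count = [], [], 0
    for d in order_by_similarity(docs):
        n = len(d.split())
        if current and count + n > max_tokens:
            contexts.append(" ".join(current))
            current, count = [], 0
        current.append(d)
        count += n
    if current:
        contexts.append(" ".join(current))
    return contexts

docs = [
    "the cat sat on the mat",
    "stock prices fell sharply today",
    "a cat chased the mat around",
    "markets and stock prices rallied",
]
contexts = pack_contexts(docs, max_tokens=12)
# the two cat documents end up in one context, the two finance documents in another
```

Because related documents land in the same context, each pretraining example gives the model more topically connected material to attend over than random concatenation would.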