The paper presents a method called Patch-based Contrastive learning and Memory Consolidation (PCMC) for Online Unsupervised Continual Learning (O-UCL). O-UCL is a learning paradigm where an agent receives a non-stationary, unlabeled data stream and progressively learns to identify an increasing number of classes.
PCMC operates in a cycle of "wake" and "sleep" periods. During the wake period, it identifies and clusters incoming stream data using a novel patch-based contrastive learning encoder, along with online clustering and novelty detection techniques. It maintains a short-term memory (STM) and a long-term memory (LTM) to store the learned cluster centroids.
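The wake-period logic described above can be sketched as a simple online update: assign each incoming embedding to its nearest centroid, or flag it as novel and start a new short-term-memory cluster. This is a minimal illustration under assumed details (a fixed distance threshold for novelty, running-mean centroid updates); the paper's actual encoder, clustering, and novelty-detection rules are more involved.

```python
import numpy as np

def wake_step(embedding, centroids, counts, novelty_threshold):
    """One wake-period update (sketch): assign an embedding to its
    nearest centroid, or flag it as novel and spawn a new cluster.

    `novelty_threshold` is a hypothetical distance cutoff; PCMC's
    actual novelty-detection rule may differ."""
    if len(centroids) == 0:
        # First sample starts the first cluster.
        return [embedding.copy()], [1], True

    dists = np.linalg.norm(np.asarray(centroids) - embedding, axis=1)
    nearest = int(np.argmin(dists))
    if dists[nearest] > novelty_threshold:
        # Novelty detected: start a new short-term-memory cluster.
        centroids.append(embedding.copy())
        counts.append(1)
        return centroids, counts, True

    # Otherwise fold the sample into the nearest centroid (running mean).
    counts[nearest] += 1
    centroids[nearest] += (embedding - centroids[nearest]) / counts[nearest]
    return centroids, counts, False
```

In this sketch, clusters promoted from the STM to the LTM would simply be centroids that accumulate enough support, though the paper defines its own promotion criteria.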
During the sleep period, PCMC retrains the encoder and consolidates the learned data representations. It updates the centroids' positions and prunes redundant examples from the LTM, mitigating concept drift and improving efficiency.
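The pruning step can be illustrated with a greedy sketch that keeps only examples sufficiently far from those already retained, so the long-term memory stores non-redundant representatives. This is a simplified stand-in under an assumed distance criterion, not PCMC's exact consolidation rule.

```python
import numpy as np

def prune_redundant(examples, min_distance):
    """Sleep-period consolidation (sketch): greedily drop any stored
    example lying within `min_distance` of an example already kept.

    `min_distance` is a hypothetical redundancy threshold; the paper
    specifies its own pruning criterion."""
    kept = []
    for ex in examples:
        # Keep only examples that add new coverage to the memory.
        if all(np.linalg.norm(ex - k) >= min_distance for k in kept):
            kept.append(ex)
    return kept
```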
The paper evaluates PCMC's performance on streams created from the ImageNet and Places365 datasets, and compares it against several existing methods and simple baselines. PCMC outperforms the baselines in both classification and clustering tasks, while maintaining consistent performance throughout the stream.
The paper also presents ablation studies exploring the impact of various design choices, such as sleep cycle timing, patch size, and memory consolidation. The results demonstrate the benefits of PCMC's patch-based approach and its ability to efficiently learn and adapt to the changing data distribution.
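The patch size studied in the ablations controls how an image is tiled before the contrastive encoder sees it. A minimal sketch of non-overlapping patch extraction, where patches from the same image could serve as positive pairs, is shown below; this is an assumed simplification, not PCMC's exact pipeline.

```python
import numpy as np

def extract_patches(image, patch_size):
    """Split an image of shape (H, W, C) into non-overlapping square
    patches of side `patch_size`. In a patch-based contrastive setup,
    patches of the same image can act as positive pairs."""
    h, w, _ = image.shape
    patches = []
    for i in range(0, h - patch_size + 1, patch_size):
        for j in range(0, w - patch_size + 1, patch_size):
            patches.append(image[i:i + patch_size, j:j + patch_size])
    return np.stack(patches)
```

Smaller patches yield more positive pairs per image but less context per patch, which is the trade-off the patch-size ablation probes.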
Key insights extracted from arxiv.org, by Cameron Tayl..., 09-26-2024
https://arxiv.org/pdf/2409.16391.pdf