Key Concepts
Self-supervised learning with anatomy-oriented imaging planes improves transfer learning performance in medical image analysis.
Abstract
The paper makes the case for self-supervised learning as a way to pretrain deep networks on medical image data. It introduces two pretext tasks based on the spatial relationships among anatomy-oriented imaging planes and demonstrates their effectiveness through experiments on cardiac and knee MRI datasets. The proposed tasks significantly improve transfer learning performance on downstream tasks such as segmentation and classification.
Introduction to Medical Image Analysis and Transfer Learning.
Importance of Self-Supervised Learning in Medical Imaging.
Proposal of Two Complementary Pretext Tasks for Anatomy-Oriented Imaging Planes.
Detailed Experiments and Results on Cardiac and Knee MRI Datasets.
Evaluation Metrics and Comparison with Existing Methods.
Statistics
Various pretext tasks have been proposed to utilize properties of medical image data (e.g., three dimensionality).
Previous works rarely paid attention to data with anatomy-oriented imaging planes, e.g., standard cardiac magnetic resonance imaging views.
Two complementary pretext tasks are proposed based on the spatial relationship of the imaging planes.
Experiments demonstrate that the proposed pretext tasks are effective for pretraining deep networks, yielding improved performance on target tasks.
The relative orientation regression task trains the network to predict the intersection lines between anatomy-oriented imaging planes.
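The summary does not give implementation details for this task. As a minimal sketch of where such a regression target could come from, the helper below (the function name `plane_intersection` and its parameters are hypothetical, not from the paper) computes the 3D intersection line of two imaging planes, each given in the scanner coordinate frame as a normal vector and offset (n·x = d), the kind of geometry recoverable from imaging metadata:

```python
import numpy as np

def plane_intersection(n1, d1, n2, d2):
    """Intersection line of two non-parallel planes n·x = d.

    Returns (point, unit_direction): a point lying on both planes and
    the line's unit direction vector.
    """
    n1, n2 = np.asarray(n1, float), np.asarray(n2, float)
    direction = np.cross(n1, n2)  # perpendicular to both normals
    # Find one point on the line: satisfy both plane equations plus a
    # third constraint pinning the component along the line direction.
    A = np.stack([n1, n2, direction])
    b = np.array([d1, d2, 0.0])
    point = np.linalg.solve(A, b)
    return point, direction / np.linalg.norm(direction)
```

A network pretraining on this task would be asked to regress (a parameterization of) this line from the image content alone, so it must learn where the anatomy sits relative to the other standard views.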
The relative location regression task trains the network to predict a slice's relative location within a stack of parallel slices.
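As a hedged illustration of how training pairs for this task might be generated (the function name and the [-1, 1] normalization are assumptions, not taken from the paper), one can sample two slices from a stack and use their signed, normalized offset as the regression target:

```python
import random

def sample_location_pair(num_slices, rng=random):
    """Sample two distinct slice indices from a stack of parallel
    slices, plus the pretext regression target: their signed offset
    normalized by the stack extent to lie in [-1, 1]."""
    i, j = rng.sample(range(num_slices), 2)
    target = (j - i) / (num_slices - 1)
    return i, j, target
```

Solving this task forces the network to encode through-plane anatomical context, since the only cue to the offset between two slices is how the anatomy changes between them.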
Multi-task self-supervised learning that combines both pretext tasks further improves the learned representations.
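A common way to combine two pretext tasks is a weighted sum of per-task losses over a shared encoder. The sketch below assumes mean-squared-error losses and a balancing weight `lam`; both are illustrative assumptions, as the summary does not state the paper's exact loss formulation:

```python
def multitask_loss(orient_pred, orient_true, loc_pred, loc_true, lam=1.0):
    """Combined pretext objective: MSE on the orientation targets plus
    a lam-weighted MSE on the location targets.  `lam` is a hypothetical
    balancing hyperparameter, not a value from the paper."""
    def mse(pred, true):
        return sum((p - t) ** 2 for p, t in zip(pred, true)) / len(pred)
    return mse(orient_pred, orient_true) + lam * mse(loc_pred, loc_true)
```

With a shared encoder and one lightweight regression head per task, minimizing this combined objective lets the two complementary signals shape a single representation.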