Modern Learning Paradigms Beyond Traditional Supervision: A Comprehensive Survey
Abstract
The exponential growth of unlabeled data, coupled with prohibitive annotation costs, has driven the development of learning paradigms beyond traditional supervision. This survey provides a comprehensive review of four interconnected approaches: self-supervised learning, semi-supervised learning, few-shot learning, and meta-learning. Self-supervised learning extracts representations from unlabeled data through pretext tasks and has become the foundation of modern large-scale models. Semi-supervised learning combines limited labeled data with abundant unlabeled data via consistency regularization and pseudo-labeling. Few-shot learning, together with its zero-shot extension, enables generalization to new classes from few or no examples through metric learning and semantic embeddings. Meta-learning provides a general framework for rapid task adaptation through learned optimization strategies. We systematically examine the theoretical foundations, key algorithms, and practical applications of each paradigm. Our comparative analysis evaluates data efficiency, computational requirements, and performance on standard benchmarks. We discuss synergies between the paradigms, particularly as realized in foundation models, and identify critical research gaps in theoretical understanding, evaluation protocols, and domain adaptation. Finally, we propose future directions toward unified learning frameworks and sample-efficient models. This survey serves as a single point of reference for understanding and applying these modern learning paradigms.