NPMCL: A Theoretical Framework for Non-Parametric Continual Learning through Meta-Ability Cultivation
Abstract
Parametric update methods for Large Language Models (LLMs) in continual learning often suffer from catastrophic forgetting and the stability-plasticity dilemma. In this work, we characterize Non-Parametric Meta Continual Learning (NPMCL) as a structured approach that enables knowledge updates without additional training. The framework models adaptation as a Knowledge Compression-Decompression process, formalized through four core meta-abilities: (1) Query Generation for identifying information gaps; (2) Structural Matching for precise referential and temporal alignment; (3) Distillative Compression for extracting logical invariants from high-entropy data; and (4) Constrained Inference for memory-guided reasoning and prior suppression. We propose that these meta-abilities constitute a domain-agnostic cognitive pipeline, potentially allowing LLMs to adapt to counterfactual environments by leveraging dynamic external memory. This work aims to formalize the theoretical underpinnings of such meta-cognitive protocols; the proposed framework is informed by preliminary empirical observations from logic-aligned memory architectures (e.g., CoG-MeM). We systematize the NPMCL paradigm and discuss its implications for the future development of training-free, autonomous cognitive agents.
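To make the four-stage pipeline concrete, the following minimal Python sketch mocks a single NPMCL adaptation step over a dynamic external memory. All names here (ExternalMemory, npmcl_step) and the trivial matching and compression logic are illustrative assumptions rather than an interface defined by the paper; in a real system each stage would be delegated to an LLM, and no model parameters are updated at any point.

```python
# Hypothetical sketch of one NPMCL adaptation step. The memory store and
# all stage implementations are stubs standing in for LLM-driven components.
from dataclasses import dataclass, field


@dataclass
class ExternalMemory:
    """Dynamic external memory holding compressed logical invariants."""
    facts: dict[str, str] = field(default_factory=dict)

    def match(self, query: str) -> str | None:
        # Structural Matching: align the query against stored keys.
        # A real system would perform referential/temporal alignment,
        # not exact string lookup.
        return self.facts.get(query)

    def compress(self, key: str, observations: list[str]) -> None:
        # Distillative Compression: reduce high-entropy observations to a
        # single invariant (here, naively, the most recent observation).
        self.facts[key] = observations[-1]


def npmcl_step(memory: ExternalMemory, task: str, observations: list[str]) -> str:
    # 1. Query Generation: identify the information gap for this task.
    query = task  # stand-in for an LLM-generated, gap-identifying query

    # 2. Structural Matching against the external memory.
    evidence = memory.match(query)

    # 3. Distillative Compression of any new observations into memory.
    if observations:
        memory.compress(query, observations)
        evidence = memory.match(query)

    # 4. Constrained Inference: answer strictly from retrieved evidence,
    #    suppressing parametric priors when memory contradicts them.
    return evidence if evidence is not None else "insufficient evidence"


if __name__ == "__main__":
    mem = ExternalMemory()
    # A counterfactual update arrives as an observation; no weights change.
    print(npmcl_step(mem, "capital of X", ["capital of X is Y"]))
```

The design choice the sketch is meant to surface is that adaptation lives entirely in the read/write cycle over ExternalMemory: plasticity comes from compression into memory, while stability comes from the frozen model and the constrained-inference rule that defers to memory.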