A Representation-Consistent Gated Recurrent Framework for Robust Medical Time-Series Classification

Abstract

Medical time-series data are characterized by irregular sampling, high noise levels, missing values, and strong inter-feature dependencies. Recurrent neural networks (RNNs), particularly gated architectures such as Long Short-Term Memory (LSTM) and Gated Recurrent Units (GRU), are widely used for modeling such data due to their ability to capture temporal dependencies. However, standard gated recurrent models do not explicitly constrain the evolution of latent representations over time, leading to representation drift and instability under noisy or incomplete inputs. In this work, we propose a representation-consistent gated recurrent framework (RC-GRF) that introduces a principled regularization strategy to enforce temporal consistency in hidden-state representations. The proposed framework is model-agnostic and can be integrated into existing gated recurrent architectures without modifying their internal gating mechanisms. We provide a theoretical analysis demonstrating how the consistency constraint bounds hidden-state divergence and improves stability. Extensive experiments on medical time-series classification benchmarks show that the proposed approach improves robustness, reduces variance, and enhances generalization performance, particularly in noisy and low-sample settings.
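The abstract describes the consistency constraint only at a high level, but its core idea can be sketched concretely: keep the gating mechanism of a standard GRU unchanged, and add a penalty on the divergence between consecutive hidden states to the task loss. The sketch below is a hypothetical illustration, not the authors' implementation; the scalar GRU weights, the penalty form (sum of squared hidden-state differences), and the regularization weight `lam` are all assumptions.

```python
# Hypothetical sketch of the representation-consistency idea: a standard
# GRU update is left untouched (the framework is model-agnostic), and a
# penalty on successive hidden-state differences is added to the loss.
# All names and the exact penalty form are assumptions, not the paper's.
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def gru_step(x, h_prev, w):
    """One scalar GRU update; the internal gating is unmodified."""
    z = sigmoid(w["wz"] * x + w["uz"] * h_prev)            # update gate
    r = sigmoid(w["wr"] * x + w["ur"] * h_prev)            # reset gate
    h_tilde = math.tanh(w["wh"] * x + w["uh"] * (r * h_prev))
    return (1.0 - z) * h_prev + z * h_tilde

def consistency_penalty(states):
    """Sum of squared differences between consecutive hidden states."""
    return sum((h1 - h0) ** 2 for h0, h1 in zip(states, states[1:]))

def run_sequence(xs, w):
    h, states = 0.0, []
    for x in xs:
        h = gru_step(x, h, w)
        states.append(h)
    return states

random.seed(0)
w = {k: random.uniform(-0.5, 0.5) for k in ("wz", "uz", "wr", "ur", "wh", "uh")}
xs = [0.1, 0.2, 0.15, 0.9, 0.12]   # toy series with a noisy spike
states = run_sequence(xs, w)
lam = 0.1                          # assumed regularization weight
reg = lam * consistency_penalty(states)
print(f"consistency penalty term: {reg:.6f}")
```

In training, this penalty would be added to the classification loss so that gradient descent discourages abrupt hidden-state jumps, which is one way the drift described in the abstract could be suppressed without touching the gates themselves.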
