Tri-Module Deep DFT Architecture with Physical Regularization, Task Coupling, and Compression-Based Transferability

Abstract

We introduce a deep learning architecture designed to enhance Density Functional Theory (DFT) property prediction through a tri-module system combining task decoupling, physics-based regularization, and compression-based transferability. The proposed framework consists of a shared backbone followed by three dedicated heads for energy, force, and polarizability predictions. A physically inspired Jacobian regularizer enforces structural consistency across outputs, while a bottleneck-based latent space is used to compress task-relevant features and enable inter-task knowledge transfer. Our design allows for minimal interference between learning objectives while preserving physical alignment, especially under low-data or high-noise conditions. This architecture aims to improve generalization, interpretability, and adaptability in multi-task DFT-based modeling without compromising computational efficiency. Early results suggest robust performance across benchmarks and a scalable foundation for future cross-property extensions.
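The architecture described above can be sketched as follows. This is a minimal illustrative implementation, not the authors' code: all layer sizes, class names, and the toy single-atom input (3D coordinates) are assumptions, and the Jacobian-style penalty shown is one plausible reading of "structural consistency across outputs" (forces as the negative energy gradient).

```python
# Sketch of the tri-module design: shared backbone -> bottleneck latent ->
# three dedicated heads (energy, force, polarizability). All dimensions and
# names are illustrative assumptions, not from the article.
import torch
import torch.nn as nn

class TriModuleDFTNet(nn.Module):
    def __init__(self, in_dim=3, hidden_dim=128, bottleneck_dim=16):
        super().__init__()
        # Shared backbone: features common to all three tasks.
        self.backbone = nn.Sequential(
            nn.Linear(in_dim, hidden_dim), nn.SiLU(),
            nn.Linear(hidden_dim, hidden_dim), nn.SiLU(),
        )
        # Bottleneck: compresses task-relevant features for inter-task transfer.
        self.bottleneck = nn.Linear(hidden_dim, bottleneck_dim)
        # Dedicated heads, kept separate to limit cross-task interference.
        self.energy_head = nn.Linear(bottleneck_dim, 1)          # scalar energy
        self.force_head = nn.Linear(bottleneck_dim, 3)           # 3D force vector
        self.polarizability_head = nn.Linear(bottleneck_dim, 9)  # 3x3 tensor, flat

    def forward(self, x):
        z = self.bottleneck(self.backbone(x))
        return {
            "energy": self.energy_head(z),
            "force": self.force_head(z),
            "polarizability": self.polarizability_head(z).view(-1, 3, 3),
        }

model = TriModuleDFTNet()

# Toy batch of 8 single-atom 3D coordinates (assumption for illustration).
x = torch.randn(8, 3, requires_grad=True)
out = model(x)

# Jacobian-style consistency penalty (assumption): the predicted force should
# match the negative gradient of the predicted energy w.r.t. the coordinates.
(grad,) = torch.autograd.grad(out["energy"].sum(), x, create_graph=True)
jac_penalty = (out["force"] + grad).pow(2).mean()
```

In training, `jac_penalty` would be added to the per-head prediction losses with a weighting coefficient; the bottleneck width then controls the compression/transfer trade-off between tasks.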
