Transformer Graph Variational Autoencoder for Generative Molecular Design


Abstract

In the field of drug discovery, the generation of new molecules with desirable properties remains a critical challenge. Traditional methods often rely on SMILES (Simplified Molecular Input Line Entry System) representations for molecular input data, which can limit the diversity and novelty of generated molecules. To address this, we present the Transformer Graph Variational Autoencoder (TGVAE), an AI model that takes molecular graphs as input, thereby capturing the complex structural relationships within molecules more effectively than string-based models. To enhance molecular generation capabilities, TGVAE combines a Transformer, a Graph Neural Network (GNN), and a Variational Autoencoder (VAE). Additionally, we address common issues such as over-smoothing in training GNNs and posterior collapse in VAEs to ensure robust training and improve the generation of chemically valid and diverse molecular structures. Our results demonstrate that TGVAE outperforms existing approaches, generating a larger collection of diverse molecules and discovering structures that were previously unexplored. This advancement not only opens new possibilities for drug discovery but also sets a new standard for the use of AI in molecular generation.
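The encoding path the abstract describes — a GNN over the molecular graph feeding a VAE latent distribution — can be sketched as follows. This is a minimal illustrative forward pass only: all dimensions, the number of message-passing layers, the mean-pooling readout, and the linear heads are assumptions for illustration, not details taken from the paper (the Transformer component is omitted for brevity).

```python
import numpy as np

rng = np.random.default_rng(0)

def gnn_layer(H, A, W):
    """One message-passing step: average features over neighbors (including
    self), then apply a linear map with a tanh nonlinearity.
    H: (n_atoms, d) node features; A: (n_atoms, n_atoms) adjacency with self-loops."""
    deg = A.sum(axis=1, keepdims=True)
    return np.tanh((A @ H / deg) @ W)

def encode(H, A, layers, W_mu, W_logvar):
    """Stack a few GNN layers, pool to a graph vector, and emit the VAE
    posterior parameters (mu, log-variance). A shallow stack is used here;
    deep stacks are where over-smoothing becomes a concern."""
    for W in layers:
        H = gnn_layer(H, A, W)
    g = H.mean(axis=0)  # graph-level readout by mean pooling (an assumed choice)
    return g @ W_mu, g @ W_logvar

def reparameterize(mu, logvar):
    """VAE reparameterization trick: z = mu + sigma * eps, eps ~ N(0, I)."""
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * logvar) * eps

# Toy molecule: 5 atoms in a chain, 8-dim atom features, 4-dim latent space
# (all sizes are illustrative).
n, d, z_dim = 5, 8, 4
H = rng.standard_normal((n, d))
A = np.eye(n) + np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
layers = [rng.standard_normal((d, d)) * 0.1 for _ in range(2)]
W_mu = rng.standard_normal((d, z_dim)) * 0.1
W_logvar = rng.standard_normal((d, z_dim)) * 0.1

mu, logvar = encode(H, A, layers, W_mu, W_logvar)
z = reparameterize(mu, logvar)
print(z.shape)  # (4,)
```

In a full model, `z` would be decoded back into a molecule (e.g. by a Transformer decoder), and training would balance reconstruction loss against the KL divergence between the posterior and the prior — the term whose collapse the abstract refers to as posterior collapse.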
