Harnessing Generative Pre-trained Transformer for Antimicrobial Peptide Generation and MIC Prediction with Contrastive Learning

Abstract

Antimicrobial peptides (AMPs) have garnered considerable attention because they are less likely than traditional antibiotics to induce resistance in pathogens, spurring interest in the de novo design of AMPs. Despite the availability of various methods, accurately generating AMPs and predicting their inhibitory effects remain challenging. In this work, we introduce AMPCLGPT, a novel approach that leverages contrastive learning and generative pre-training for AMP design and minimum inhibitory concentration (MIC) prediction. First, AMPCLGPT is pre-trained on a large-scale unlabeled peptide dataset to learn peptide sequence patterns and enhance its ability to extract powerful representations. Second, the pre-trained AMPCLGPT is fine-tuned on AMP data with contrastive learning to increase the distance between antimicrobial and non-antimicrobial peptides in the latent space, improving its ability to accurately generate AMPs. Additionally, the pre-trained AMPCLGPT is fine-tuned to predict MIC values from the learned peptide features. Empirical results demonstrate that our model can effectively generate AMPs and accurately predict their MIC values. By integrating these two capabilities, AMPCLGPT enables fully automated design of AMPs with low MIC values. AMPCLGPT represents a significant advancement in AMP research, potentially accelerating the development of potent AMP-based therapeutics.
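The abstract does not specify the exact contrastive objective used during fine-tuning, so the following is only an illustrative sketch of the general idea: a margin-based pairwise contrastive loss that pulls same-class peptide embeddings together and pushes antimicrobial and non-antimicrobial embeddings apart in the latent space. The function names and the choice of cosine distance here are assumptions, not the paper's implementation.

```python
import math

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def contrastive_margin_loss(embeddings, labels, margin=1.0):
    """Illustrative pairwise contrastive loss (not the paper's exact
    objective): same-label pairs (e.g. AMP/AMP) are attracted, while
    different-label pairs (AMP vs. non-AMP) are repelled until their
    distance (1 - cosine similarity) exceeds `margin`."""
    total, count = 0.0, 0
    n = len(embeddings)
    for i in range(n):
        for j in range(i + 1, n):
            dist = 1.0 - cosine(embeddings[i], embeddings[j])
            if labels[i] == labels[j]:
                total += dist ** 2                      # pull positives together
            else:
                total += max(0.0, margin - dist) ** 2   # push negatives apart
            count += 1
    return total / count

# Toy usage: two identical "AMP" embeddings (label 1) and one orthogonal
# "non-AMP" embedding (label 0) already satisfy the margin, so loss is 0.
emb = [[1.0, 0.0], [1.0, 0.0], [0.0, 1.0]]
lab = [1, 1, 0]
print(contrastive_margin_loss(emb, lab))  # → 0.0
```

In practice the embeddings would come from the pre-trained transformer's peptide representations, and the loss would be minimized jointly with (or alongside) the generative fine-tuning objective.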
