Generative Deep Neural Networks for Estimating Hypervariability in Hepatitis B and C Virus Genomes

Abstract

Hepatitis B virus (HBV) and Hepatitis C virus (HCV) remain a major global health concern, causing approximately 1.3 million deaths each year. Their diverse genotypes and drug resistance create diagnostic and treatment challenges, and the success rate of antiviral therapies has declined over recent decades. A deep learning predictive model can anticipate patterns of evolution in the hypervariable regions of the HBV and HCV genomes. In HCV, the hypervariable region lies in the Envelope glycoprotein (E2) gene, while in HBV it includes the S1 and S2 genes. Generative deep learning models have been used in evolutionary studies, but their application in viral research for predicting evolving genotypes remains limited. The Long Short-Term Memory (LSTM) model produced satisfactory results in predicting hypervariable gene sequences of evolving HCV and HBV genotypes, which may aid diagnosis and vaccine design. We collected data from databases including NCBI and BV-BRC. Our proposed LSTM generative model was trained on 1500 hypervariable gene sequences spanning the 7 known genotypes of HCV and 10 genotypes of HBV. Unlike traditional generative models such as the simple Recurrent Neural Network (RNN), our model not only generates sequences but also learns relationships between different parts of the virus's genetic code. In this study, three generative models were compared: the simple RNN, a 1-Dimensional Convolutional Neural Network (Conv1D), and the LSTM. Among these, the LSTM demonstrated the lowest error rate with the highest efficiency and accuracy, whereas the simple RNN and Conv1D showed relatively higher error rates and lower accuracy.
Because the LSTM succeeds in capturing long-range dependencies, the proposed LSTM models handle sequential data efficiently while avoiding the loss of important information that frequently occurs in generative models such as the simple RNN and Conv1D.
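The paper does not include code, but the training setup it describes (a generative model predicting the next nucleotide from a preceding window of sequence) can be illustrated with a minimal data-preparation sketch. This is an assumption-laden illustration, not the authors' pipeline: the 4-letter alphabet, one-hot encoding, window size, and all function names here are hypothetical choices, and the example fragment is synthetic, not real E2 or S1/S2 data.

```python
# Hypothetical sketch: preparing nucleotide sequences for next-base
# prediction, the form of supervision an LSTM generative model would use.

NUCLEOTIDES = "ACGT"
BASE_TO_INDEX = {b: i for i, b in enumerate(NUCLEOTIDES)}

def one_hot(base):
    """Encode a single nucleotide as a 4-dimensional one-hot vector."""
    vec = [0.0] * len(NUCLEOTIDES)
    vec[BASE_TO_INDEX[base]] = 1.0
    return vec

def make_training_pairs(sequence, window=20):
    """Slide a fixed-length window over the sequence; each window of
    one-hot-encoded bases is paired with the index of the base that
    follows it, which the generative model learns to predict."""
    inputs, targets = [], []
    for start in range(len(sequence) - window):
        chunk = sequence[start:start + window]
        inputs.append([one_hot(b) for b in chunk])
        targets.append(BASE_TO_INDEX[sequence[start + window]])
    return inputs, targets

# Synthetic fragment for illustration only (not real viral sequence data):
X, y = make_training_pairs("ACGTACGTACGTACGTACGTACGT", window=20)
```

Each `(X[i], y[i])` pair is one training example: the LSTM consumes the window step by step, and its hidden state is what allows it to retain long-range dependencies that a simple RNN tends to lose.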
