LSTM-Driven CLIL: Cybersecurity Vocabulary Learning with AI

Abstract

This study presents the development of a custom dataset of L2 gap-fill exercises designed to train Long Short-Term Memory (LSTM) neural networks for CLIL (Content and Language Integrated Learning) settings in subject-specific courses. Targeting English for Special Purposes (ESP) vocabulary in cybersecurity, privacy, and data protection, the model addresses the dual challenge of domain-specific content mastery and language practice through structured neural network training. The custom dataset of gap-fill exercises enables the LSTM model to simultaneously predict missing words and perform semantic classification, offering learners the contextualized language practice that is a core requirement of CLIL methodology. Experimental results validate the model’s efficacy, demonstrating its potential as an adaptive support tool for CLIL-based education. This framework establishes a novel synergy between AI-enhanced language learning and subject-specific instruction, providing a scalable template for integrating neural networks into CLIL pedagogy.
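
The abstract describes a single LSTM that jointly predicts the missing word in a gap-fill item and classifies the sentence's semantic domain. As a rough illustration only, the following is a minimal Keras-style sketch of such a dual-output architecture; the vocabulary size, layer dimensions, category count, and all other details are assumptions made for illustration and are not the configuration reported in the article.

# Minimal sketch of a dual-output LSTM for gap-fill word prediction plus
# semantic classification. All hyperparameters below (vocabulary size,
# embedding/hidden dimensions, number of semantic categories) are
# illustrative assumptions, not the article's reported configuration.
import tensorflow as tf
from tensorflow.keras import layers, Model

VOCAB_SIZE = 5000      # assumed size of the ESP/cybersecurity vocabulary
MAX_LEN = 40           # assumed maximum sentence length in tokens
NUM_CATEGORIES = 3     # assumed classes, e.g. cybersecurity / privacy / data protection

# Input: a tokenized gap-fill sentence, with the gap marked by a reserved token id.
tokens = layers.Input(shape=(MAX_LEN,), dtype="int32", name="tokens")

# Shared encoder: embedding followed by an LSTM over the sentence.
x = layers.Embedding(VOCAB_SIZE, 128, mask_zero=True)(tokens)
x = layers.LSTM(128)(x)

# Head 1: predict the missing word over the full vocabulary.
gap_word = layers.Dense(VOCAB_SIZE, activation="softmax", name="gap_word")(x)

# Head 2: classify the semantic domain of the sentence.
category = layers.Dense(NUM_CATEGORIES, activation="softmax", name="category")(x)

model = Model(inputs=tokens, outputs=[gap_word, category])
model.compile(
    optimizer="adam",
    loss={
        "gap_word": "sparse_categorical_crossentropy",
        "category": "sparse_categorical_crossentropy",
    },
    metrics={"gap_word": "accuracy", "category": "accuracy"},
)
model.summary()

A shared encoder feeding two softmax heads lets one forward pass produce both outputs, which corresponds to the simultaneous word prediction and semantic classification described in the abstract; how the article actually structures its model and dataset is detailed in the full text.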
