Scopus Indexed Paper

Paper Details


Title
Sequence-to-sequence Bangla Sentence Generation with LSTM Recurrent Neural Networks
Abstract
Sequence-to-sequence text generation is an efficient approach for automatically converting text from a source sequence to a target sequence. Text generation is an application of natural language generation that is useful in sequence-modeling tasks such as machine translation, speech recognition, image captioning, language identification, and video captioning. In this paper we discuss Bangla text generation using a deep learning approach, the Long Short-Term Memory (LSTM) network, a special kind of Recurrent Neural Network (RNN). LSTM networks are well suited to analyzing sequences of text data and predicting the next word, and they are a strong choice when the goal is to predict the next point in a given sequence. In this article we propose an artificial Bangla text generator built with an LSTM, one of the earliest such models for this language, and we validate the model with a satisfactory accuracy rate.
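The abstract's core mechanism, an LSTM cell consuming a sequence one step at a time to predict the next token, can be illustrated with a minimal NumPy sketch. This is a generic single-step LSTM forward pass for illustration only; the gate stacking order, weight names (`W`, `U`, `b`), and dimensions are assumptions, not the authors' actual architecture or training setup.

```python
import numpy as np

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step (illustrative, not the paper's exact model).

    x      : input vector, shape (d,)
    h_prev : previous hidden state, shape (n,)
    c_prev : previous cell state, shape (n,)
    W      : input weights, shape (4n, d)
    U      : recurrent weights, shape (4n, n)
    b      : bias, shape (4n,)
    Gates are stacked in the order [input, forget, output, candidate].
    """
    n = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = 1.0 / (1.0 + np.exp(-z[:n]))        # input gate
    f = 1.0 / (1.0 + np.exp(-z[n:2*n]))     # forget gate
    o = 1.0 / (1.0 + np.exp(-z[2*n:3*n]))   # output gate
    g = np.tanh(z[3*n:])                    # candidate cell update
    c = f * c_prev + i * g                  # new cell state
    h = o * np.tanh(c)                      # new hidden state
    return h, c
```

In a next-word language model of the kind the paper describes, the hidden state `h` at each step would be projected through a softmax layer over the Bangla vocabulary to score candidate next words.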
Keywords
Language Modeling, Text Generation, NLP, Bangla Text, Sequence-to-sequence, RNN, LSTM, Deep Learning, Machine Learning
Authors
Md. Sanzidul Islam, Sadia Sultana, Sharmin Mousumi, Sheikh Abujar, Syed Akhter Hossain
Phone
Journal or Conference Name
Procedia Computer Science
Publish Year
2019
Indexing
Scopus