Scopus Indexed Publications
Paper Details
- Title
-
Transformer Based Bengali Chatbot Using General Knowledge Dataset
- Author
-
Abu Kaisar Mohammad Masum,
Nushrat Jahan Ria,
Sharmin Akter,
- Email
-
- Abstract
-
An AI chatbot provides impressive responses after learning from a training
dataset. Over the past decade, most research has demonstrated that deep neural
models outperform other models. RNN models are regularly used for
sequence-related problems such as question answering, an approach widely known
as seq2seq learning. A seq2seq model consists of an encoder and a decoder: the
encoder embeds the input sequence, and the decoder produces the output
sequence. To reinforce seq2seq performance, an attention mechanism is added to
the encoder and decoder. More recently, the transformer model was introduced as
a high-performance architecture with multiple attention mechanisms for solving
sequence-related problems; it reduces training time compared with RNN-based
models and achieves state-of-the-art performance on sequence transduction. In
this research, we apply the transformer model to a Bengali general knowledge
chatbot based on a Bengali general knowledge Question Answer (QA) dataset,
scoring 85.0 BLEU on the applied QA data. To compare against the transformer's
performance, we also trained a seq2seq model with attention on our dataset,
which scores 23.5 BLEU.
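
As a rough illustration of the architecture the abstract describes (an encoder-decoder transformer trained on question-answer pairs, compared against an attention-based seq2seq baseline), below is a minimal sketch using PyTorch's nn.Transformer. This is not the authors' implementation; the vocabulary size, model dimensions, and toy token ids are illustrative assumptions.

```python
import torch
import torch.nn as nn

class ChatbotTransformer(nn.Module):
    """Encoder-decoder transformer for question -> answer generation (sketch)."""
    def __init__(self, vocab_size=8000, d_model=256, nhead=8,
                 num_layers=2, dim_feedforward=512, max_len=64):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, d_model)
        self.pos_emb = nn.Embedding(max_len, d_model)
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=nhead,
            num_encoder_layers=num_layers, num_decoder_layers=num_layers,
            dim_feedforward=dim_feedforward, batch_first=True)
        self.out = nn.Linear(d_model, vocab_size)

    def embed(self, ids):
        # Token embedding plus a learned positional embedding.
        positions = torch.arange(ids.size(1), device=ids.device)
        return self.tok_emb(ids) + self.pos_emb(positions)

    def forward(self, question_ids, answer_ids):
        # Causal mask stops the decoder from attending to future answer tokens.
        tgt_mask = self.transformer.generate_square_subsequent_mask(
            answer_ids.size(1)).to(question_ids.device)
        hidden = self.transformer(self.embed(question_ids),
                                  self.embed(answer_ids), tgt_mask=tgt_mask)
        return self.out(hidden)  # (batch, answer_len, vocab_size) logits

# Toy usage: random ids stand in for a tokenized Bengali question/answer pair.
model = ChatbotTransformer()
question = torch.randint(0, 8000, (1, 10))    # encoder input
answer_in = torch.randint(0, 8000, (1, 12))   # decoder input (answer shifted right)
answer_out = torch.randint(0, 8000, (1, 12))  # gold next-token targets
logits = model(question, answer_in)
loss = nn.CrossEntropyLoss()(logits.reshape(-1, 8000), answer_out.reshape(-1))
loss.backward()  # one illustrative training step
```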
- Keywords
-
- Journal or Conference Name
-
20th IEEE International Conference on Machine Learning and Applications (ICMLA 2021)
- Publication Year
-
2021
- Indexing
-
Scopus