Neural Machine Translation (NMT) has increased the accuracy of machine translation and has recently become popular in the machine translation research community. Adding an attention mechanism to NMT has further improved translation quality: encoder-decoder models equipped with different attention mechanisms achieve notable gains by attending to the relevant parts of the source sentence during translation. Translating from English into other languages is a widely studied machine translation problem, and English-to-Bengali translation in particular remains an active research area. Bengali is one of the most widely spoken languages in the world, so translation between Bengali and other languages is essential for many people. However, existing translators do not always produce an accurate Bengali translation for a given English text, so generating correct English-to-Bengali translations requires further research. In this research, we build an attention-based NMT model for English-to-Bengali machine translation instead of a plain NMT model, and it achieves a BLEU score of 22.3. For comparison, we also build a seq2seq model without attention to assess machine translation performance; it achieves a BLEU score of 15.3 on the same dataset. The analysis shows that the attention-based seq2seq model performs better than plain seq2seq learning for English-to-Bengali translation.
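The abstract contrasts a plain seq2seq model with an attention-based one. As a minimal sketch of what the attention variant adds, the following PyTorch code implements a Bahdanau-style additive attention layer and one decoding step that conditions on its context vector; the class names, layer sizes, and the choice of a GRU are illustrative assumptions, not the paper's exact architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BahdanauAttention(nn.Module):
    """Additive attention: scores every encoder state against the decoder state."""
    def __init__(self, hidden_size):
        super().__init__()
        self.W_enc = nn.Linear(hidden_size, hidden_size)
        self.W_dec = nn.Linear(hidden_size, hidden_size)
        self.v = nn.Linear(hidden_size, 1)

    def forward(self, dec_hidden, enc_outputs):
        # dec_hidden: (batch, hidden); enc_outputs: (batch, src_len, hidden)
        scores = self.v(torch.tanh(
            self.W_enc(enc_outputs) + self.W_dec(dec_hidden).unsqueeze(1)
        ))                                         # (batch, src_len, 1)
        weights = F.softmax(scores, dim=1)         # distribution over source positions
        context = (weights * enc_outputs).sum(1)   # weighted sum: (batch, hidden)
        return context, weights

class AttentionDecoderStep(nn.Module):
    """One decoding step that feeds the attention context into the RNN input."""
    def __init__(self, vocab_size, embed_size, hidden_size):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_size)
        self.attn = BahdanauAttention(hidden_size)
        self.gru = nn.GRU(embed_size + hidden_size, hidden_size, batch_first=True)
        self.out = nn.Linear(hidden_size, vocab_size)

    def forward(self, prev_token, dec_hidden, enc_outputs):
        # prev_token: (batch,) token ids; dec_hidden: (1, batch, hidden)
        context, weights = self.attn(dec_hidden[-1], enc_outputs)
        rnn_in = torch.cat([self.embed(prev_token), context], dim=-1).unsqueeze(1)
        output, dec_hidden = self.gru(rnn_in, dec_hidden)
        logits = self.out(output.squeeze(1))       # next-token scores: (batch, vocab)
        return logits, dec_hidden, weights
```

At each decoding step the attention weights re-score all encoder states, which is what lets the model focus on the relevant source words; a plain seq2seq baseline instead compresses the entire source sentence into a single fixed vector, which helps explain the BLEU gap the abstract reports.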