The number of deaf and mute individuals worldwide is rising at an alarming rate. Bangladesh has about 2.6 million people who are unable to communicate with the community through spoken language. Hearing-impaired citizens of Bangladesh use Bangladeshi Sign Language (BSL) as their means of communication. In this article, we propose a new method for Bengali sign language recognition based on deep convolutional neural networks. Our framework employs a convolutional neural network (CNN) to learn from the images in our dataset and interpret hand signs from input images. We evaluated the system on its collection of ten indications (ten sets of images with 31 distinct signs), for a total of 310 images. The proposed system captures snapshots from a webcam video stream using a computer vision-based approach, compares those frames against a dataset previously trained with the CNN, and displays the corresponding Bengali numeral (০-৯). After evaluating the model on our dataset, we obtained an overall accuracy of 99.8%. We intend to strengthen this work as far as we can to make silent communication with the majority of society as simple as possible.
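The CNN-based recognition pipeline described above can be illustrated with a minimal sketch. This is not the architecture from the paper: the input resolution (64x64 grayscale), the layer sizes, and the class count (the ten Bengali digits) are all assumptions made for illustration, here written with PyTorch.

```python
import torch
import torch.nn as nn

class SignCNN(nn.Module):
    """Hypothetical small CNN for hand-sign digit classification."""

    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),              # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),              # 32x32 -> 16x16
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 128),
            nn.ReLU(),
            nn.Linear(128, num_classes),  # one logit per Bengali digit
        )

    def forward(self, x):
        return self.classifier(self.features(x))

model = SignCNN()
# Stand-in for one preprocessed webcam frame: batch of 1, 1 channel, 64x64.
frame = torch.randn(1, 1, 64, 64)
logits = model(frame)
predicted_digit = logits.argmax(dim=1).item()  # index 0-9 into ০-৯
```

In a full system, the webcam frame would be cropped to the hand region, resized, and normalized before being passed to the trained model; here the random tensor merely stands in for that preprocessing.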