Abstract— The decision tree is one of the most effective classification methods. However, a decision tree can produce erroneous results due to overfitting or excessively noisy data, which may cause the tree to grow too large, with unnecessary nodes and branches. To reduce this error rate, the tree is pruned: its size is cut down so that only the necessary nodes and branches are kept. Pruning can be performed in two ways: pre-pruning, where the tree is pruned while it is still being constructed, and post-pruning, where the fully constructed tree is reduced to the expected size. In this paper, several pre-pruning and post-pruning techniques are described to give a better overall understanding of which method to use based on the type of data. The performance of these pruning methods is measured on various datasets.