On completing the implementation using XGBoost, Naive Bayes, and Random Forest classifiers, we observed the following results. XGBoost demonstrates the highest accuracy, followed closely by Random Forest, while Naive Bayes exhibits notably lower accuracy. This discrepancy can be attributed to inherent differences in the modeling approaches of these algorithms. XGBoost and Random Forest are ensemble learning methods that combine multiple models to make predictions, allowing complex pattern recognition and higher accuracy. In contrast, Naive Bayes relies on a strong assumption of feature independence, which often does not hold in real-world datasets, leading to comparatively lower predictive accuracy. The variation in underlying methodologies and their adaptability to the complexity of the dataset explains the observed differences in accuracy among these algorithms.
Accuracy:
- XGBoost: 98.62 %
- Random Forest: 98.54 %
- Naive Bayes: 68.44 %
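The comparison above can be sketched as a minimal scikit-learn pipeline. The seminar dataset is not included here, so a synthetic dataset from `make_classification` is used purely for illustration, and `GradientBoostingClassifier` stands in for XGBoost so the sketch needs only scikit-learn; with the `xgboost` package installed, `xgboost.XGBClassifier` would be used the same way. The accuracies printed by this sketch will not match the reported figures.

```python
# Sketch: compare two ensemble methods against Naive Bayes.
# make_classification is an assumption (the actual seminar dataset is not shown),
# and GradientBoostingClassifier is a stand-in for xgboost.XGBClassifier.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

X, y = make_classification(n_samples=2000, n_features=20,
                           n_informative=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

models = {
    "Gradient Boosting": GradientBoostingClassifier(random_state=42),
    "Random Forest": RandomForestClassifier(random_state=42),
    "Naive Bayes": GaussianNB(),
}

results = {}
for name, model in models.items():
    model.fit(X_train, y_train)
    results[name] = accuracy_score(y_test, model.predict(X_test))
    print(f"{name}: {results[name]:.2%}")
```

Because the synthetic features are correlated, Gaussian Naive Bayes's independence assumption is violated, which typically reproduces the qualitative gap seen in the results table.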
This implementation was done for the PICT (SPPU) Seminar Course (Sem V).