XGBoost and Random Forest Algorithms: An In-Depth Analysis
DOI: https://doi.org/10.57041/pjosr.v3i1.946

Keywords: Classification, Regression, Machine Learning, Artificial Intelligence, Random Forest, XGBoost

Abstract
Machine learning plays an increasingly important role in many facets of our lives as technology develops, including forecasting weather, identifying social media trends, and predicting prices on global markets. This growing importance has created demand for efficient predictive models that can handle complex data and deliver highly accurate results. XGBoost and Random Forest are ensemble techniques for solving regression and classification problems that have evolved into dependable and reliable solutions to machine learning challenges. In this research paper, we undertake a comprehensive analysis and comparison of these two prominent machine learning algorithms. The first half of the paper provides an overview of both techniques, including the significance and evolution of each algorithm. The latter part presents a meticulous comparative analysis of Random Forest and XGBoost, scrutinizing facets such as time complexity, precision, and reliability. We examine their distinctive approaches to regression and classification problems and how each handles training and testing datasets. A thorough quantitative evaluation is conducted using a variety of performance metrics, such as the F1-score, Recall, Precision, and Mean Squared Error.
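As a rough illustration of the kind of head-to-head evaluation described above, the sketch below trains a Random Forest and an XGBoost classifier on the same train/test split and reports Precision, Recall, and F1-score. The dataset (scikit-learn's breast cancer data), the hyperparameters, and the scikit-learn/xgboost APIs shown here are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch: compare Random Forest and XGBoost on one classification task.
# Dataset and hyperparameters are placeholders, not the paper's experimental setup.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import precision_score, recall_score, f1_score
from xgboost import XGBClassifier  # requires the external xgboost package

# Single train/test split shared by both models for a like-for-like comparison
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

models = {
    "Random Forest": RandomForestClassifier(n_estimators=200, random_state=42),
    "XGBoost": XGBClassifier(n_estimators=200, learning_rate=0.1,
                             eval_metric="logloss"),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    pred = model.predict(X_test)
    print(f"{name}: precision={precision_score(y_test, pred):.3f} "
          f"recall={recall_score(y_test, pred):.3f} "
          f"f1={f1_score(y_test, pred):.3f}")
```

For a regression task, the same pattern applies with RandomForestRegressor, XGBRegressor, and mean_squared_error in place of the classification metrics.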
Copyright (c) 2023 https://pjosr.com/index.php/pjosr/cr

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.