CatBoost is a gradient boosting library built on binary decision trees. It avoids target leakage and prediction shift by encoding categorical features with ordered target statistics (TS), and it achieved lower log loss and zero-one loss than the traditional greedy TS approach.
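For illustration, a minimal sketch of how categorical features can be passed to CatBoost's Python API so that its target-statistics encoding is applied automatically; the toy data and hyperparameter values below are assumptions for demonstration only, not taken from the text.

```python
from catboost import CatBoostClassifier

# Toy data: one numeric feature and one categorical feature (hypothetical values).
X = [[1.0, "red"], [2.0, "blue"], [3.0, "red"], [4.0, "green"],
     [5.0, "blue"], [6.0, "green"], [7.0, "red"], [8.0, "blue"]]
y = [0, 1, 0, 1, 1, 0, 0, 1]

model = CatBoostClassifier(
    iterations=100,           # number of boosted trees (illustrative value)
    depth=4,                  # depth of each tree (illustrative value)
    loss_function="Logloss",  # binary log loss, as referenced in the text
    verbose=False,
)

# cat_features lists the indices of categorical columns; CatBoost applies
# its target-statistics encoding to them internally, so no manual one-hot
# or label encoding is required.
model.fit(X, y, cat_features=[1])

print(model.predict([[2.5, "red"]]))
```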