Some Approaches to Improving Prediction Accuracy using Ensemble Methods
Abstract
This study presents the results of an experimental analysis of the effectiveness of Extremely Randomized Trees (Extra Trees), both within gradient boosting models and in a newly proposed ensemble framework in which the forest is generated under conditions of enhanced internal divergence. The paper also examines the performance of Extra Trees applied to novel feature representations computed as distances to a selected set of reference examples. The experiments show that using Extra Trees in gradient boosting and in divergent forest models improves generalization ability, and that the expanded feature sets improve it further.
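The two ingredients described above can be illustrated with a minimal sketch. In the Python code below, scikit-learn's ExtraTreeRegressor serves as the base learner in a hand-rolled gradient boosting loop for squared loss, and the input features are expanded with distances to a set of reference examples. The class ExtraTreeBoost, the helper expand_with_distances, the random selection of reference examples, the Euclidean metric, and all hyperparameters are illustrative assumptions, not the authors' exact procedure.

import numpy as np
from sklearn.datasets import make_regression
from sklearn.metrics import pairwise_distances
from sklearn.model_selection import train_test_split
from sklearn.tree import ExtraTreeRegressor


def expand_with_distances(X, references):
    """Append distances to a fixed set of reference examples as new features."""
    return np.hstack([X, pairwise_distances(X, references)])


class ExtraTreeBoost:
    """Gradient boosting for squared loss with extremely randomized base trees."""

    def __init__(self, n_estimators=200, learning_rate=0.1, max_depth=3):
        self.n_estimators = n_estimators
        self.learning_rate = learning_rate
        self.max_depth = max_depth

    def fit(self, X, y):
        self.init_ = y.mean()                   # constant initial model
        pred = np.full(len(y), self.init_)
        self.trees_ = []
        for i in range(self.n_estimators):
            residual = y - pred                 # negative gradient of squared loss
            tree = ExtraTreeRegressor(max_depth=self.max_depth, random_state=i)
            tree.fit(X, residual)               # fit the next tree to the residuals
            pred += self.learning_rate * tree.predict(X)
            self.trees_.append(tree)
        return self

    def predict(self, X):
        pred = np.full(X.shape[0], self.init_)
        for tree in self.trees_:
            pred += self.learning_rate * tree.predict(X)
        return pred


X, y = make_regression(n_samples=600, n_features=10, noise=1.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Use a random subset of the training set as reference examples.
rng = np.random.default_rng(0)
refs = X_tr[rng.choice(len(X_tr), size=25, replace=False)]

model = ExtraTreeBoost().fit(expand_with_distances(X_tr, refs), y_tr)
rmse = np.sqrt(np.mean((model.predict(expand_with_distances(X_te, refs)) - y_te) ** 2))
print(f"test RMSE: {rmse:.3f}")

Swapping ExtraTreeRegressor for DecisionTreeRegressor would recover ordinary gradient boosting; the only difference is that Extra Trees draw split thresholds at random instead of optimizing them, which is the source of the additional randomization.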

This work is licensed under a Creative Commons Attribution 4.0 International License.
By submitting an article for publication in the Russian Digital Libraries Journal (RDLJ), the authors automatically consent to grant Kazan (Volga) Federal University (KFU) a limited license to use the materials, provided, of course, that the article is accepted for publication. This means that KFU has the right to publish the article in the next issue of the journal (on the website or in printed form), to reprint it in the RDLJ CD archives, and to include it in information systems or databases produced by KFU.
All copyrighted materials are placed in RDLJ with the consent of the authors. If any author objects to the publication of their materials on this site, the materials can be removed upon written notification to the Editor.
Documents published in RDLJ are protected by copyright, and all rights are reserved by the authors. Authors independently monitor compliance with their rights to reproduce or translate their papers published in the journal. If material published in RDLJ is reprinted with permission by another publisher or translated into another language, a reference to the original publication must be given.
By submitting an article for publication in RDLJ, authors should take into account that publication on the Internet, on the one hand, provides unique opportunities for access to its content, but, on the other hand, represents a new form of information exchange in the global information society, in which authors and publishers are not always protected against unauthorized copying or other use of copyrighted materials.
RDLJ is copyrighted. When using materials from the journal, the URL index.phtml?page=elbib/rus/journal must be indicated. No changes, additions, or edits to the author's text are allowed. Copying individual fragments of articles from the journal is permitted, as is distributing, remixing, adapting, and building upon an article, even commercially, as long as the original article is credited.
Requests for the right to reproduce or use any of the materials published in RDLJ should be addressed to the Editor-in-Chief, A.M. Elizarov, at amelizarov@gmail.com.
The publishers of RDLJ are not responsible for the views set out in published opinion articles.
Authors are asked to download the copyright agreement on the transfer of non-exclusive rights to use the work from this page, sign it, and send a scanned copy to the publisher's e-mail address.