BENCHMARKING OF MACHINE LEARNING ALGORITHMS FOR FERTILIZER RECOMMENDATION IN PRECISION AGRICULTURE
References
ABBAS M., MEMON K. A., JAMALI A. A., MEMON S., AHMED A., 2019, Multinomial Naive Bayes classification model for sentiment analysis. IJCSNS Int. J. Comput. Sci. Netw. Secur., 19(3), 62. https://doi.org/10.13140/RG.2.2.30021.40169
ALSHBOUL O., ALMASABHA G., SHEHADEH A., AL-SHBOUL K. , 2024, A comparative study of LightGBM, XGBoost, and GEP models in shear strength management of SFRC-SBWS. Structures, 61, 106009. https://doi.org/10.1016/j.istruc.2024.106009
ALTMAN N. S., 1992, An Introduction to Kernel and Nearest-Neighbor Nonparametric Regression. The American Statistician, 46(3), 175–185. https://doi.org/10.1080/00031305.1992.10475879
BENTÉJAC C., CSÖRGŐ A., MARTÍNEZ-MUÑOZ G., 2020, A comparative analysis of gradient boosting algorithms. Artificial Intelligence Review, 54(3), 1937–1967. https://doi.org/10.1007/s10462-020-09896-5
BISHOP C. M., 1995, Neural networks for pattern recognition. Oxford university press.
BREIMAN L., 2001, Random forests. Machine Learning, 45(1), 5–32. https://doi.org/10.1023/A:1010933404324
BREIMAN L., FRIEDMAN J. H., OLSHEN R. A., STONE C. J., 2017, Classification And Regression Trees. Routledge. https://doi.org/10.1201/9781315139470
CHEN T., GUESTRIN C., 2016, XGBoost: A Scalable Tree Boosting System. Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD '16), pp. 785–794, ACM. https://doi.org/10.1145/2939672.2939785
CHERNOV A., 2025, (GG) MoE vs. MLP on Tabular Data, Version 1, arXiv. https://doi.org/10.48550/ARXIV.2502.03608
CORTES C., VAPNIK V., 1995, Support-vector networks. Machine Learning, 20(3), 273–297. https://doi.org/10.1007/bf00994018
COVER T., HART P., 1967, Nearest neighbor pattern classification. IEEE Transactions on Information Theory, 13(1), 21–27. https://doi.org/10.1109/tit.1967.1053964
COX D.R., 1958, The Regression Analysis of Binary Sequences. Journal of the Royal Statistical Society: Series B, 20, 215-242.
DENG Y., LIU Y., ZHANG D., CAO Z., 2025, A Hybrid Gradient Boosting Model for Predicting Longitudinal Dispersion Coefficient in Natural Rivers. Water Resources Management, 39(5), 2111–2131. https://doi.org/10.1007/s11269-024-04058-6
ENNAJI O., BELGAID A., EL ALLALI A., 2023, Machine learning in nutrient management: A review. Artificial Intelligence in Agriculture. https://doi.org/10.1016/j.aiia.2023.06.001
FERNÁNDEZ-DELGADO M., CERNADAS E., BARRO S., AMORIM D., 2014, Do we need hundreds of classifiers to solve real world classification problems? Journal of Machine Learning Research, 15, 3133–3181. https://jmlr.org/papers/volume15/delgado14a/delgado14a.pdf
FISHER R. A., 1936, The use of multiple measurements in taxonomic problems. Annals of Eugenics, 7(2), 179–188. https://doi.org/10.1111/j.1469-1809.1936.tb02137.x
FLOREK P., ZAGDAŃSKI A., 2023, Benchmarking state-of-the-art gradient boosting algorithms for classification, Version 1, arXiv. https://doi.org/10.48550/ARXIV.2305.17094
FRIEDMAN J. H., 2001, Greedy function approximation: A gradient boosting machine. The Annals of Statistics, 29(5), 1189–1232. https://doi.org/10.1214/aos/1013203451
GEURTS P., ERNST D., WEHENKEL L., 2006, Extremely randomized trees. Machine Learning, 63(1), 3–42. https://doi.org/10.1007/s10994-006-6226-1
GUIDO R., FERRISI S., LOFARO D., CONFORTI D., 2024, An Overview on the Advancements of Support Vector Machine Models in Healthcare Applications: A Review. Information, 15(4), 235. https://doi.org/10.3390/info15040235
HALDER R. K., UDDIN M. N., UDDIN MD. A., ARYAL S., KHRAISAT A., 2024, Enhancing K-nearest neighbor algorithm: a comprehensive review and performance analysis of modifications. Journal of Big Data, 11(1). https://doi.org/10.1186/s40537-024-00973-y
HASTIE T., TIBSHIRANI R., FRIEDMAN J., 2009, The Elements of Statistical Learning. In Springer Series in Statistics. Springer New York. https://doi.org/10.1007/978-0-387-84858-7
HOLZMÜLLER D., GRINSZTAJN L., STEINWART I., 2024, Better by default: Strong pre-tuned MLPs and boosted trees on tabular data. Advances in Neural Information Processing Systems, 37, 26577–26658.
HOSMER D. W., LEMESHOW S., 2000, Applied Logistic Regression. Wiley. https://doi.org/10.1002/0471722146
ILERI K., 2025, Comparative analysis of CatBoost, LightGBM, XGBoost, RF, and DT methods optimised with PSO to estimate the number of k-barriers for intrusion detection in wireless sensor networks. International Journal of Machine Learning and Cybernetics, 16(9), 6937–6956. https://doi.org/10.1007/s13042-025-02654-5
KAMILARIS A., PRENAFETA-BOLDÚ F. X., 2018, Deep learning in agriculture: A survey. Computers and Electronics in Agriculture, 147, 70–90. https://doi.org/10.1016/j.compag.2018.02.016
KE G., MENG Q., FINLEY T., WANG T., CHEN W., MA W., YE Q., LIU T.-Y., 2017, LightGBM: A Highly Efficient Gradient Boosting Decision Tree. Proceedings of the 31st International Conference on Neural Information Processing Systems, Long Beach, CA, December 2017, 3149–3157.
LIAKOS K. G., BUSATO P., MOSHOU D., PEARSON S., BOCHTIS D., 2018, Machine learning in agriculture: A review. Sensors (Basel, Switzerland), 18(8), 2674. https://doi.org/10.3390/s18082674
LIU Z., LUONG P., BOLEY M., SCHMIDT D. F., 2025, Improving Random Forests by Smoothing, Version 1, arXiv. https://doi.org/10.48550/ARXIV.2505.06852
LUO W., LI H., BAI Z., LIU Z., 2025, Spectrally-Corrected and Regularized QDA Classifier for Spiked Covariance Model, Version 1, arXiv. https://doi.org/10.48550/ARXIV.2503.13582
MCLACHLAN G.J., 2004, Discriminant Analysis and Statistical Pattern Recognition. Wiley, New York.
MURPHY K. P., 2022, Probabilistic machine learning: an introduction. MIT press.
MUSANASE C., VODACEK A., HANYURWIMFURA D., UWITONZE A., KABANDANA I., 2023, Data-driven analysis and machine learning-based crop and fertilizer recommendation system for revolutionizing farming practices. Agriculture, 13(11), 2141. https://doi.org/10.3390/agriculture13112141
NOVIKOFF A., 1962, On convergence proofs on perceptrons. Proceedings of the Symposium on the Mathematical Theory of Automata, 12, 615–622.
PARMAR A., KATARIYA R., PATEL V., 2018, A Review on Random Forest: An Ensemble Classifier. In Lecture Notes on Data Engineering and Communications Technologies, pp. 758–763, Springer International Publishing. https://doi.org/10.1007/978-3-030-03146-6_86
PERETZ O., 2024, Naive Bayes classifier – An ensemble procedure for recall. Expert Systems with Applications, 239, 122559. https://doi.org/10.1016/j.engappai.2024.108972
PROKHORENKOVA L., GUSEV G., VOROBEV A., DOROGUSH A. V., GULIN A., 2018, CatBoost: Unbiased Boosting with Categorical Features. Proceedings of the 32nd International Conference on Neural Information Processing Systems, Montréal, 3-8 December 2018, 6639–6649.
PROVOST F., FAWCETT T., 2013, Data science for business: What you need to know about data mining and data-analytic thinking. O'Reilly Media.
QU L., PEI Y., 2024, A Comprehensive Review on Discriminant Analysis for Addressing Challenges of Class-Level Limitations, Small Sample Size, and Robustness. Processes, 12(7), 1382. https://doi.org/10.3390/pr12071382
QUINLAN J. R., 1986, Induction of decision trees. Machine Learning, 1(1), 81–106. https://doi.org/10.1007/bf00116251
RISH I., 2001, An empirical study of the naive Bayes classifier. IJCAI 2001 Workshop on Empirical Methods in AI, 3(22), 41–46.
ROSENBLATT F., 1958, The perceptron: A probabilistic model for information storage and organization in the brain. Psychological Review, 65(6), 386–408. https://doi.org/10.1037/h0042519
RUMELHART D. E., HINTON G. E., WILLIAMS R. J., 1986, Learning representations by back-propagating errors. Nature, 323(6088), 533–536. https://doi.org/10.1038/323533a0
SAFAVIAN S. R., LANDGREBE D., 1991, A survey of decision tree classifier methodology. IEEE Transactions on Systems, Man, and Cybernetics, 21(3), 660–674. https://doi.org/10.1109/21.97458
SANKARI C., VITHYAVIGASINI S. P., VISHVAJIT S., ANTO JEBA INFANT M., RETHANISHA J., 2025, AI-Driven IoT-Enabled Soil Nutrient and Moisture Monitoring with XGBoost and LightGBM-Based Predictive Irrigation and Fertilization Optimization for Sustainable Precision Agriculture. In 2025 11th International Conference on Communication and Signal Processing (ICCSP), pp. 795–800, IEEE. https://doi.org/10.1109/ICCSP64183.2025.11089358
SCHOLKOPF B., SMOLA A. J., 2002, Learning with kernels: Support vector machines, regularization, optimization, and beyond. MIT Press.
SHAMSHIRI R. R., JONES J. W., THORP K. R., AHMAD D., MAN H. C., TAHERI S., 2018, Review of optimum temperature, humidity, and vapour pressure deficit for microclimate evaluation and control in greenhouse cultivation of tomato: a review. International Agrophysics, 32(2), 287–302. https://doi.org/10.1515/intag-2017-0005
SHOKATI H., MASHAL M., NOROOZI A., ABKAR A. A., MIRZAEI S., MOHAMMADI-DOQOZLOO Z., TAGHIZADEH-MEHRJARDI R., KHOSRAVANI P., NABIOLLAHI K., SCHOLTEN T., 2024, Random Forest-Based Soil Moisture Estimation Using Sentinel-2, Landsat-8/9, and UAV-Based Hyperspectral Data. Remote Sensing, 16(11), 1962. https://doi.org/10.3390/rs16111962
TANAKA T. S., HEUVELINK G. B. M., MIENO T., 2024, Can machine learning models provide accurate fertilizer recommendations? Precision Agriculture, 25, 1839–1856. https://doi.org/10.1007/s11119-024-10136-x
WEIGARD A., SPENCER R. J., 2022, Benefits and challenges of using logistic regression to assess neuropsychological performance validity: Evidence from a simulation study. The Clinical Neuropsychologist, 37(1), 34–59. https://doi.org/10.1080/13854046.2021.2023650
Copyright (c) 2026 Florin Daniel Militaru, Ramona Ciolac, Sebastian Moisa, Adrian Firu-Negoescu, Gabriela Popescu

This work is licensed under a Creative Commons Attribution 4.0 International License.
LUCRĂRI ȘTIINȚIFICE MANAGEMENT AGRICOL
ISSN print 1453-1410
ISSN online 2069-2307
PUBLISHER: AGROPRINT Timisoara, Romania
PAPER ACCESS: Full text articles available for free
FREQUENCY: Annual
PUBLICATION LANGUAGE: English
______________________________________________________________________________________________
Banat's University of Agricultural Sciences and Veterinary Medicine "King Michael I of Romania" from Timisoara
Faculty of Management and Rural Tourism
300645, Timisoara, Calea Aradului 119, Romania
E-mail: tabitaadamov2003 [at] yahoo.com
Phone: +40-256-277439, Fax: +40-256-277031