UTB Publications
Repository of UTB Publication Activity

Slicing aided large scale tomato fruit detection and counting in 360-degree video data from a greenhouse

dc.title Slicing aided large scale tomato fruit detection and counting in 360-degree video data from a greenhouse en
dc.contributor.author Turečková, Alžběta
dc.contributor.author Tureček, Tomáš
dc.contributor.author Janků, Peter
dc.contributor.author Vařacha, Pavel
dc.contributor.author Šenkeřík, Roman
dc.contributor.author Jašek, Roman
dc.contributor.author Psota, Václav
dc.contributor.author Štěpánek, Vít
dc.contributor.author Komínková Oplatková, Zuzana
dc.relation.ispartof Measurement: Journal of the International Measurement Confederation
dc.identifier.issn 0263-2241
dc.identifier.issn 1873-412X
dc.date.issued 2022
utb.relation.volume 204
dc.type article
dc.language.iso en
dc.publisher Elsevier B.V.
dc.identifier.doi 10.1016/j.measurement.2022.111977
dc.relation.uri https://www.sciencedirect.com/science/article/pii/S0263224122011733
dc.relation.uri https://www.sciencedirect.com/science/article/pii/S0263224122011733/pdfft?md5=2eec9583c2a600fd1c55189d83e69ebc&pid=1-s2.0-S0263224122011733-main.pdf
dc.subject tomato fruit detection en
dc.subject tomato fruit counting en
dc.subject 360-degree video en
dc.subject image processing en
dc.subject computer vision en
dc.subject deep CNN en
dc.subject slicing aided inference en
dc.subject robotic farming en
dc.description.abstract This paper proposes an automated tomato fruit detection and counting process that requires no human intervention. First, wide images of whole tomato plant rows were extracted from 360-degree video taken in a greenhouse, and these images were used to create a new object detection dataset. The proposed tomato detection methodology uses a deep CNN model with slicing-aided inference, which proceeds in two stages: the images are cut into patches for object detection, and the resulting predictions are then stitched back together. The paper also presents an extensive study of the post-processing parameters needed to stitch object detections correctly, especially at patch borders. The final model reaches an F1 score of 83.09% on the test set, demonstrating the suitability of the proposed methodology for robotic farming. en
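The two-stage slicing-aided inference described in the abstract (patch-wise detection followed by stitching with border post-processing) can be illustrated with a short, self-contained sketch. This is an assumption-laden illustration, not the authors' implementation: detect_fn stands in for the paper's deep CNN detector, and the patch size, overlap ratio, and IoU threshold are placeholder values rather than the tuned post-processing parameters studied in the paper.

    # Minimal sketch of slicing-aided inference: cut a wide row image into
    # overlapping patches, detect per patch, shift boxes back to global
    # coordinates, and merge border duplicates with greedy NMS.
    import numpy as np

    def iou(a, b):
        # intersection-over-union of boxes given as (x1, y1, x2, y2)
        ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
        ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
        inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
        area_a = (a[2] - a[0]) * (a[3] - a[1])
        area_b = (b[2] - b[0]) * (b[3] - b[1])
        return inter / (area_a + area_b - inter + 1e-9)

    def starts(length, patch, step):
        # patch start offsets covering [0, length), including the far edge
        s = list(range(0, max(length - patch, 0) + 1, step))
        if s[-1] + patch < length:
            s.append(max(length - patch, 0))
        return s

    def sliced_detect(image, detect_fn, patch=640, overlap=0.2, iou_thr=0.5):
        # detect_fn(patch_image) -> [(x1, y1, x2, y2, score), ...] in patch
        # coordinates; it stands in for the paper's deep CNN detector
        h, w = image.shape[:2]
        step = int(patch * (1.0 - overlap))
        boxes = []
        for y0 in starts(h, patch, step):
            for x0 in starts(w, patch, step):
                for x1, y1, x2, y2, s in detect_fn(image[y0:y0 + patch, x0:x0 + patch]):
                    boxes.append((x1 + x0, y1 + y0, x2 + x0, y2 + y0, s))
        # greedy NMS across patches removes the duplicates that appear where
        # the same fruit is detected in two overlapping patches
        boxes.sort(key=lambda b: b[4], reverse=True)
        kept = []
        for b in boxes:
            if all(iou(b[:4], k[:4]) < iou_thr for k in kept):
                kept.append(b)
        return kept  # len(kept) is the per-image fruit count

    if __name__ == "__main__":
        # toy detector reporting one fixed box per patch, just to run the flow
        toy = lambda p: [(10.0, 10.0, 60.0, 60.0, 0.9)] if p.size else []
        img = np.zeros((1200, 4000, 3), dtype=np.uint8)  # a wide plant-row image
        print(len(sliced_detect(img, toy)))

The design point mirrored here is that duplicate boxes arise wherever a fruit straddles a patch border, so the choice of overlap between patches together with the global NMS pass determines whether border fruits are counted once or twice.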
utb.faculty Faculty of Applied Informatics
dc.identifier.uri http://hdl.handle.net/10563/1011191
utb.identifier.obdid 43884088
utb.identifier.scopus 2-s2.0-85140136384
utb.identifier.wok 000876254100002
utb.identifier.coden MSRMD
utb.source j-scopus
dc.date.accessioned 2022-11-29T07:49:17Z
dc.date.available 2022-11-29T07:49:17Z
dc.description.sponsorship IGA/CebiaTech/2022/001; Technology Agency of the Czech Republic, TACR: FW01010381
dc.description.sponsorship Technology Agency of the Czech Republic [FW01010381]; Internal Grant Agency of Tomas Bata University [IGA/CebiaTech/2022/001]; Faculty of Applied Informatics, Tomas Bata University in Zlin
utb.contributor.internalauthor Turečková, Alžběta
utb.contributor.internalauthor Tureček, Tomáš
utb.contributor.internalauthor Janků, Peter
utb.contributor.internalauthor Vařacha, Pavel
utb.contributor.internalauthor Šenkeřík, Roman
utb.contributor.internalauthor Jašek, Roman
utb.contributor.internalauthor Komínková Oplatková, Zuzana
utb.fulltext.affiliation Alžběta Turečková a, Tomáš Tureček a, Peter Janků a, Pavel Vařacha a, Roman Šenkeřík a, Roman Jašek a, Václav Psota c, Vít Štěpánek b, Zuzana Komínková Oplatková a,∗; a Tomas Bata University in Zlin, Faculty of Applied Informatics, Nam. T. G. Masaryka 5555, Zlin, 760 01, Czech Republic; b NWT a.s., Trida Tomase Bati 269, Zlin, 760 01, Czech Republic; c Farma Bezdinek, s.r.o., K Bezdinku 1515, Dolni Lutyne, 735 53, Czech Republic; ∗ Corresponding author. E-mail address: oplatkova@utb.cz (Z. Komínková Oplatková).
utb.fulltext.dates Received 29 January 2022; Received in revised form 22 August 2022; Accepted 17 September 2022; Available online 23 September 2022
utb.fulltext.sponsorship This work was supported by the Technology Agency of the Czech Republic, under the project No. FW01010381, by Internal Grant Agency of Tomas Bata University under the project no. IGA/CebiaTech/2022/001, and further by the resources of A.I.Lab at the Faculty of Applied Informatics, Tomas Bata University in Zlin.
utb.wos.affiliation [Tureckova, Alzbeta; Turecek, Tomas; Janku, Peter; Varacha, Pavel; Senkerik, Roman; Jasek, Roman; Oplatkova, Zuzana Kominkova] Tomas Bata Univ Zlin, Fac Appl Informat, Nam TG Masaryka 5555, Zlin 76001, Czech Republic; [Stepanek, Vit] NWT AS, Trida Tomase Bati 269, Zlin 76001, Czech Republic; [Psota, Vaclav] Farma Bezdinek Sro, K Bezdinku 1515, Dolni Lutyne 73553, Czech Republic
utb.scopus.affiliation Tomas Bata University in Zlin, Faculty of Applied Informatics, Nam. T. G. Masaryka 5555, Zlin, 760 01, Czech Republic; NWT a.s., Trida Tomase Bati 269, Zlin, 760 01, Czech Republic; Farma Bezdinek, s.r.o., K Bezdinku 1515, Dolni Lutyne, 735 53, Czech Republic
utb.fulltext.projects TAČR FW01010381
utb.fulltext.projects IGA/CebiaTech/2022/001
utb.fulltext.faculty Faculty of Applied Informatics
utb.fulltext.faculty Faculty of Applied Informatics
utb.fulltext.faculty Faculty of Applied Informatics
utb.fulltext.faculty Faculty of Applied Informatics
utb.fulltext.faculty Faculty of Applied Informatics
utb.fulltext.faculty Faculty of Applied Informatics
utb.fulltext.faculty Faculty of Applied Informatics
utb.fulltext.ou -
utb.fulltext.ou -
utb.fulltext.ou -
utb.fulltext.ou -
utb.fulltext.ou -
utb.fulltext.ou -
utb.fulltext.ou -