Repository of TBU Publications

Dog face detection using YOLO network



dc.title Dog face detection using YOLO network en
dc.contributor.author Turečková, Alžběta
dc.contributor.author Holík, Tomáš
dc.contributor.author Komínková Oplatková, Zuzana
dc.relation.ispartof Mendel
dc.identifier.issn 1803-3814
dc.date.issued 2020
utb.relation.volume 26
utb.relation.issue 2
dc.citation.spage 17
dc.citation.epage 22
dc.type article
dc.language.iso en
dc.publisher Brno University of Technology
dc.identifier.doi 10.13164/mendel.2020.2.017
dc.relation.uri https://mendel-journal.org/index.php/mendel/article/view/121
dc.subject deep convolution networks en
dc.subject deep learning en
dc.subject dog face detection en
dc.subject iOS mobile application en
dc.subject object detection en
dc.subject YOLO en
dc.description.abstract This work presents a real-world application of object detection, one of the current research lines in computer vision. Research commonly focuses on human face detection; in contrast, this paper addresses the challenging task of detecting dog faces, objects with extensive variability in appearance. The system utilises the YOLO network, a deep convolutional neural network, to predict bounding boxes and class confidences simultaneously. The paper documents the extensive dataset of dog faces gathered from two different sources and the training procedure of the detector. The proposed system was designed for deployment on mobile hardware. The resulting Doggie Smile application helps to snapshot dogs at the moment they face the camera. The proposed mobile application can simultaneously evaluate the gaze directions of three dogs in the scene more than 13 times per second, measured on an iPhone XR. The average precision of the dog face detection system is 0.92. © 2020, Brno University of Technology. All rights reserved. en
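The abstract notes that the YOLO network predicts bounding boxes and class confidences simultaneously, in a single pass over a grid of cells. As a rough illustration of that idea (not the authors' implementation; the grid layout, cell tuple format, and threshold below are illustrative assumptions), a minimal decoder for a YOLO-style output grid might look like:

```python
def decode_yolo(grid, grid_size, conf_threshold=0.5):
    """Decode a YOLO-style output grid into detections.

    Hypothetical layout: each cell holds (tx, ty, w, h, objectness,
    class_conf), where tx/ty are offsets inside the cell and w/h are
    box sizes relative to the whole image.
    """
    boxes = []
    for row in range(grid_size):
        for col in range(grid_size):
            tx, ty, w, h, obj, cls = grid[row][col]
            score = obj * cls  # combined box/class confidence
            if score < conf_threshold:
                continue
            # Cell-local offsets -> image-relative centre coordinates.
            cx = (col + tx) / grid_size
            cy = (row + ty) / grid_size
            boxes.append((cx, cy, w, h, score))
    return boxes
```

In a real detector the grid comes from the network's final convolutional layer and the surviving boxes would still pass through non-maximum suppression; this sketch only shows why box locations and confidences fall out of one forward pass together.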
utb.faculty Faculty of Applied Informatics
dc.identifier.uri http://hdl.handle.net/10563/1010152
utb.identifier.obdid 43881749
utb.identifier.scopus 2-s2.0-85098254118
utb.source j-scopus
dc.date.accessioned 2021-01-08T14:02:35Z
dc.date.available 2021-01-08T14:02:35Z
dc.rights Attribution-NonCommercial-ShareAlike 4.0 International
dc.rights.uri https://creativecommons.org/licenses/by-nc-sa/4.0/
dc.rights.access openAccess
utb.contributor.internalauthor Turečková, Alžběta
utb.contributor.internalauthor Holík, Tomáš
utb.contributor.internalauthor Komínková Oplatková, Zuzana
utb.fulltext.affiliation Alzbeta Tureckova, Tomas Holik, Zuzana Kominkova Oplatkova Tomas Bata University in Zlin, Faculty of Applied Informatics, Czech Republic tureckova@utb.cz, oplatkova@utb.cz
utb.fulltext.dates Received: 09 October 2020 Accepted: 11 November 2020 Published: 21 December 2020
utb.fulltext.sponsorship This work was supported by Internal Grant Agency of Tomas Bata University under the Project no. IGA/CebiaTech/2020/001 and by resources of A.I. Lab (ailab.fai.utb.cz).
utb.scopus.affiliation Tomas Bata University in Zlin, Faculty of Applied Informatics, Czech Republic
utb.fulltext.projects IGA/CebiaTech/2020/001
utb.fulltext.faculty Faculty of Applied Informatics