DLAFS CASCADE R-CNN: AN OBJECT DETECTOR BASED ON DYNAMIC LABEL ASSIGNMENT
Keywords: Object detection, Marine vehicle, Cascade R-CNN, Document detection.
Object detection methods based on Deep Learning have revolutionized the Computer Vision field in general and the object detection problem in particular. Among them are the methods of the R-CNN family: Faster R-CNN and Cascade R-CNN. Their characteristic component is the Region Proposal Network, which generates proposal regions that may or may not contain objects; the proposals are then labeled as positive or negative according to an IoU threshold. In this study, we apply dynamic training, which adjusts this IoU threshold depending on the statistics of the proposal regions, to Faster R-CNN and Cascade R-CNN, training on the SeaShips and DODV datasets. Cascade R-CNN with dynamic training achieves higher results than its standard version on both datasets (0.2% and 5.7% higher on SeaShips and DODV, respectively). On the DODV dataset, Faster R-CNN with dynamic training also outperforms its standard version, by 4.4%.
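The core idea of dynamic training, adjusting the positive/negative IoU threshold from the statistics of the current proposals rather than fixing it, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function `update_iou_threshold` and the hyperparameters `top_k` and `momentum` are assumptions chosen for the example, in the spirit of tracking a top-k IoU statistic with a moving average so the assignment criterion tightens as proposal quality improves during training.

```python
import numpy as np

def update_iou_threshold(proposal_ious, current_thr, top_k=75, momentum=0.9):
    """Hypothetical dynamic IoU-threshold update (illustrative sketch).

    proposal_ious: IoUs of the current batch's proposals with their
        best-matched ground-truth boxes.
    current_thr:   IoU threshold used so far for labeling proposals.

    The new threshold drifts toward the top_k-th largest IoU in the batch,
    so as the detector produces better proposals, the bar for a proposal
    to count as positive rises automatically.
    """
    ious = np.sort(np.asarray(proposal_ious, dtype=float))[::-1]
    k = min(top_k, len(ious)) - 1          # index of the top_k-th IoU
    batch_stat = ious[k]                   # batch-level quality statistic
    # Exponential moving average keeps the threshold stable across batches.
    return momentum * current_thr + (1.0 - momentum) * batch_stat
```

For example, starting from a threshold of 0.5 with `top_k=2` and `momentum=0.5`, a batch whose second-best proposal IoU is 0.8 would move the threshold to 0.65; repeated batches of high-quality proposals push it higher still.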