Improved YOLOv8-based Marine Life Detection Algorithm

Keywords

YOLOv8
Marine organisms
Target detection
Deep learning
Attention mechanisms

DOI

10.26689/jera.v9i1.9420

Submitted : 2025-01-20
Accepted : 2025-02-04
Published : 2025-02-19

Abstract

Aiming at the problems of insufficient feature extraction for small targets, complex image backgrounds, and low detection accuracy in marine life detection, this paper proposes SGW-YOLOv8, a marine life detection algorithm based on an improved YOLOv8. First, an Adaptive Fine-Grained Channel Attention (FCA) module is fused into the backbone of the YOLOv8 network to strengthen the model's feature extraction ability. Second, the C2f module in the Neck layer is replaced with a C2f module enhanced by Efficient Multi-Scale Attention (C2f_EMA) to improve the model's detection performance on small underwater targets. Finally, the original loss function is replaced with Weighted Intersection over Union (WIoU) so that the model better adapts to target detection against complex ocean backgrounds. The improved algorithm was evaluated on the Underwater Robot Picking Contest (URPC) dataset; the results show that it achieves a detection accuracy of 84.5%, 2.3% higher than before the improvement, while accurately detecting small-target marine organisms and adapting to marine life detection in a variety of complex environments.
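The paper's exact WIoU formulation is not reproduced on this page; the sketch below only illustrates the general idea behind such losses, namely an ordinary IoU loss reweighted by a focusing term that grows with the centre distance between the predicted and ground-truth boxes. The function names and the specific weighting term are illustrative assumptions, not the authors' implementation.

```python
import math

def iou(box_a, box_b):
    """Intersection over Union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def weighted_iou_loss(pred, target):
    """Hypothetical distance-weighted IoU loss: the plain IoU loss
    (1 - IoU) scaled by a penalty that grows with the squared centre
    distance, normalised by the size of the smallest enclosing box."""
    # Centre distance between predicted and target boxes
    cx_p, cy_p = (pred[0] + pred[2]) / 2, (pred[1] + pred[3]) / 2
    cx_t, cy_t = (target[0] + target[2]) / 2, (target[1] + target[3]) / 2
    d2 = (cx_p - cx_t) ** 2 + (cy_p - cy_t) ** 2
    # Dimensions of the smallest box enclosing both boxes
    ew = max(pred[2], target[2]) - min(pred[0], target[0])
    eh = max(pred[3], target[3]) - min(pred[1], target[1])
    r = math.exp(d2 / (ew ** 2 + eh ** 2))  # focusing weight, >= 1
    return r * (1.0 - iou(pred, target))
```

A perfectly aligned prediction yields a loss of zero, while a box with the same overlap but a larger centre offset is penalised more heavily than under plain IoU, which is the intuition behind using such a loss for small targets in cluttered scenes.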

References

Fu X, Wang C, Wang Y, et al., 2006, Research on Marine Biological Resources and Sustainable Utilization Countermeasures. China Biotechnology Engineering Journal, 2006(07): 105–111. https://doi.org/10.13523/j.cb.20060720

Zhai F, Gu Y, Li P, et al., 2020, Construction and Development of the Marine Ranch Observation Network in Shandong Province. Marine Science, 44(12): 93–106.

Qu S, Cui C, Duan J, et al., 2024, Underwater Small Target Detection Under YOLOv8-LA Model. Scientific Reports, 14(1): 16108.

Han X, Chang J, Wang K, 2021, You Only Look Once: Unified, Real-Time Object Detection. Procedia Computer Science, 183(1): 61–72.

Wang CY, Bochkovskiy A, Liao HYM, 2023, YOLOv7: Trainable Bag-Of-Freebies Sets New State-Of-The-Art for Real-Time Object Detectors. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2023: 7464–7475.

Wang CY, Liao HYM, Yeh IH, 2022, Designing Network Design Strategies through Gradient Path Analysis. Computer Vision and Pattern Recognition. https://doi.org/10.48550/arXiv.2211.04800

Li D, Li B, Wang Z, 2024, An Improved YOLOv8-Based Underwater Target Detection Algorithm. Computer Applications, 44(11): 3610–3616.

Sun H, Wen Y, Feng H, et al., 2024, Unsupervised Bidirectional Contrastive Reconstruction and Adaptive Fine-Grained Channel Attention Networks for Image Dehazing. Neural Networks, 176: 106314.

Ouyang D, He S, Zhang G, et al., 2023, Efficient Multi-Scale Attention Module with Cross-Spatial Learning. ICASSP 2023–2023 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2023: 1–5.

Girshick R, 2015, Fast R-CNN. Computer Vision and Pattern Recognition. https://doi.org/10.48550/arXiv.1504.08083

Fu J, Tian Y, 2024, Underwater Target Detection Using Multi-Information Residual Fusion and Multi-Scale Feature Expression. Computer Engineering and Applications, 2024: 1–13. http://kns.cnki.net/kcms/detail/11.2127.TP.20240815.1830.022.html

Zhu X, Lyu S, Wang X, et al., 2021, TPH-YOLOv5: Improved YOLOv5 Based on Transformer Prediction Head for Object Detection on Drone-Captured Scenarios. Proceedings of the IEEE/CVF International Conference on Computer Vision. 2021: 2778–2788.

Zhou X, Li Y, Wu M, et al., 2024, Improved YOLOv8-based Underwater Target Detection. Computer Systems Applications, 33(11): 177–185. https://doi.org/10.15888/j.cnki.csa.009680