A method based on STI-YOLO for detecting surface defects of strip steel under spangle background interference
1: College of Artificial Intelligence, North China University of Science and Technology
2: Hebei Provincial Key Laboratory of Industrial Intelligent Perception, North China University of Science and Technology
Abstract: To address the low defect recognition rate caused by spangle (zinc-flower) texture interference in surface defect detection of spangled coated steel plate, this paper proposes Spangles Texture Interference-YOLO (STI-YOLO), a model built on the object detection algorithm YOLOv5s and improved by introducing a channel attention mechanism and a pyramid convolution network. The anchor boxes are optimized by re-clustering them on the defect dataset; the channel attention module SENet is inserted before the feature fusion network PANet to suppress interference from the spangle background; and a pyramid convolution network is added before the prediction network to capture richer contextual features. Experimental results show that STI-YOLO improves the detection accuracy of strip steel surface defects: its mean average precision (mAP) reaches 95.79%, which is 13.13, 14.59 and 2.07 percentage points higher than YOLOv3, YOLOv4 and YOLOv5s, respectively. The detection speed is 54.14 frames/s, meeting real-time requirements, so STI-YOLO offers good detection performance.
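The anchor re-clustering step mentioned in the abstract can be sketched as follows. This is a minimal illustration of the standard YOLO-style approach (k-means over ground-truth box sizes with 1 − IoU as the distance); the `kmeans_anchors` helper, its deterministic initialization, and the toy box data are assumptions for illustration, not taken from the paper.

```python
# Minimal sketch of YOLO-style anchor re-clustering: k-means over
# ground-truth (width, height) pairs using 1 - IoU as the distance.
# All names and the toy data below are illustrative, not from the paper.

def iou_wh(a, b):
    """IoU of two (w, h) boxes assumed to share the same top-left corner."""
    inter = min(a[0], b[0]) * min(a[1], b[1])
    return inter / (a[0] * a[1] + b[0] * b[1] - inter)

def kmeans_anchors(boxes, k, iters=100):
    # Deterministic init: spread the k seeds across the boxes sorted by area.
    srt = sorted(boxes, key=lambda b: b[0] * b[1])
    anchors = [srt[i * (len(srt) - 1) // (k - 1)] for i in range(k)]
    for _ in range(iters):
        # Assign each box to the anchor it overlaps most (distance = 1 - IoU).
        clusters = [[] for _ in range(k)]
        for box in boxes:
            best = max(range(k), key=lambda i: iou_wh(box, anchors[i]))
            clusters[best].append(box)
        # Move each non-empty cluster's anchor to its mean (w, h).
        new = [(sum(w for w, _ in c) / len(c), sum(h for _, h in c) / len(c))
               if c else anchors[i] for i, c in enumerate(clusters)]
        if new == anchors:
            break  # converged
        anchors = new
    return sorted(anchors, key=lambda a: a[0] * a[1])

# Toy (w, h) pairs standing in for ground-truth defect boxes.
boxes = [(10, 12), (12, 10), (40, 38), (42, 44), (90, 100), (95, 92)]
print(kmeans_anchors(boxes, k=3))  # three anchors, smallest to largest
```

Using 1 − IoU rather than Euclidean distance keeps the clustering scale-aware, so small defects get proportionally tight anchors instead of being dominated by large boxes.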
Keywords: defect detection; feature fusion; attention mechanism; multi-scale feature; spangle (zinc flower) background interference
Foundation: Key Research and Development Project of the Ministry of Science and Technology (2017YFE0135700)
Authors: Wei Mingjun; Chen Zhao; Ji Zhanlin; Zhou Taiyu; Yan Xuwen; Liu Ming
College of Artificial Intelligence, North China University of Science and Technology; Hebei Provincial Key Laboratory of Industrial Intelligent Perception, North China University of Science and Technology
DOI:10.16366/j.cnki.1000-2367.2023.04.011