| Main Author: | |
|---|---|
| Other Authors: | |
| Published: | 2025 |
| Subjects: | |
| Abstract: | To improve the accuracy and efficiency of crack segmentation in ancient wooden structures, we propose a lightweight deep neural network architecture, termed SMG-Net. The core innovation of this model lies in its multi-cooperative perception mechanism. First, the proposed Structure-Aware Cross-directional Pooling (SACP) module establishes long-range feature dependencies along multiple orientations, addressing the challenge of coherently recognizing cracks with complex directions. Second, the Multi-path Robust Feature Extraction (MRFE) module improves the model's tolerance to noise and blurred edges, thereby strengthening the discriminative capability of shallow features. Third, the Guided Semantic–Spatial Fusion (GSSFusion) mechanism enables efficient alignment and integration of multi-scale features, preserving both fine crack details and global structural consistency in the segmentation. Extensive experiments were conducted on a self-constructed dataset of cracks in ancient wooden components and on the public Masonry crack dataset. SMG-Net achieved mean Intersection-over-Union (mIoU) scores of 81.12% and 87.91% and Pixel Accuracy (PA) of 98.91% and 98.99%, respectively, significantly outperforming mainstream approaches such as U-Net, SegFormer, and Swin-UNet, with the results confirmed by statistical significance testing. Moreover, SMG-Net demonstrates superior parameter efficiency and inference speed, making it particularly suitable for heritage-monitoring scenarios with limited computational resources. To promote reproducibility and future research, the source code and datasets are publicly available at [https://github.com/HuiZhenxing/HuiZhenxing.git](https://github.com/HuiZhenxing/HuiZhenxing.git). |
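
The abstract describes the SACP module only at a high level (directional pooling that captures long-range context along several orientations). The sketch below is a minimal, assumption-based PyTorch illustration of cross-directional strip pooling in that spirit; the class name `CrossDirectionalPooling`, the layer choices, and the sigmoid gating are illustrative and are not taken from the published SMG-Net code.

```python
# Minimal sketch (not the published SMG-Net/SACP implementation): horizontal and
# vertical strip pooling capture long-range context along two orientations and
# are fused back into the feature map as a gating signal.
import torch
import torch.nn as nn
import torch.nn.functional as F


class CrossDirectionalPooling(nn.Module):
    """Illustrative cross-directional strip pooling block (assumption-based)."""

    def __init__(self, channels: int):
        super().__init__()
        # 1-D convolutions that refine each pooled strip along its long axis.
        self.refine_col = nn.Conv2d(channels, channels, kernel_size=(3, 1), padding=(1, 0))
        self.refine_row = nn.Conv2d(channels, channels, kernel_size=(1, 3), padding=(0, 1))
        self.fuse = nn.Conv2d(channels, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        n, c, h, w = x.shape
        # Average over the width: one column per row (horizontal long-range context).
        col = F.adaptive_avg_pool2d(x, (h, 1))
        # Average over the height: one row per column (vertical long-range context).
        row = F.adaptive_avg_pool2d(x, (1, w))
        # Refine each strip, then broadcast it back to the full spatial resolution.
        col = self.refine_col(col).expand(n, c, h, w)
        row = self.refine_row(row).expand(n, c, h, w)
        # Combine both orientations into a gating map applied to the input features.
        gate = torch.sigmoid(self.fuse(col + row))
        return x * gate


if __name__ == "__main__":
    feats = torch.randn(2, 64, 128, 128)
    out = CrossDirectionalPooling(64)(feats)
    print(out.shape)  # torch.Size([2, 64, 128, 128])
```

The metrics reported in the abstract follow standard definitions: PA is the fraction of correctly classified pixels, and mIoU averages the per-class intersection-over-union across all classes.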