-
1961
KPD-YOLOv7-GD network structure diagram.
Published 2025: “…Secondly, a lightweight convolutional module is introduced to replace the standard convolutions in the Efficient Long-range Aggregation Network (ELAN-A) module, and channel pruning techniques are applied to further decrease the model’s complexity. Finally, the efficiency of feature extraction and the detection accuracy of the model are significantly enhanced through the integration of the Dynamic Head (DyHead) module, the Content-Aware Re-Assembly of Features (CARAFE) module, and knowledge distillation techniques. …”
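The excerpt above describes replacing the standard convolutions in ELAN-A with a lightweight convolutional module, and the listing below includes a "Structure of GSConv network" figure. As a minimal, non-authoritative sketch of that kind of block (the class name, kernel sizes, and SiLU activation are assumptions, not the paper's exact configuration), a GSConv-style layer can be written in PyTorch roughly as:

```python
import torch
import torch.nn as nn


class GSConvSketch(nn.Module):
    """Illustrative GSConv-style lightweight convolution (sketch only).

    A dense convolution produces half of the output channels, a cheap
    depthwise convolution produces the other half, and a channel shuffle
    mixes the two groups.
    """

    def __init__(self, in_ch: int, out_ch: int, k: int = 3, s: int = 1):
        super().__init__()
        half = out_ch // 2
        self.dense = nn.Sequential(
            nn.Conv2d(in_ch, half, k, s, k // 2, bias=False),
            nn.BatchNorm2d(half),
            nn.SiLU(),
        )
        # Depthwise conv: one filter per channel, so very few parameters.
        self.cheap = nn.Sequential(
            nn.Conv2d(half, half, 5, 1, 2, groups=half, bias=False),
            nn.BatchNorm2d(half),
            nn.SiLU(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        y1 = self.dense(x)                      # (N, out_ch/2, H, W)
        y2 = self.cheap(y1)                     # (N, out_ch/2, H, W)
        y = torch.cat([y1, y2], dim=1)          # (N, out_ch,   H, W)
        # Channel shuffle: interleave the dense and depthwise halves.
        n, c, h, w = y.shape
        return y.view(n, 2, c // 2, h, w).transpose(1, 2).reshape(n, c, h, w)


if __name__ == "__main__":
    x = torch.randn(1, 64, 80, 80)
    print(GSConvSketch(64, 128)(x).shape)       # torch.Size([1, 128, 80, 80])
```

The depthwise branch costs almost no parameters, so concatenating it with the dense half and shuffling the channels approximates a full convolution at a fraction of the parameter count, which is the usual motivation for this kind of lightweight replacement.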
-
1962
Comparison experiment of accuracy improvement.
-
1963
Comparison of different pruning rates.
-
1964
Comparison of ablation experiment results.
-
1965
Result of comparison of different lightweight models.
-
1966
DyHead Structure.
-
1967
The parameters of the training phase.
-
1968
Structure of GSConv network.
-
1969
Comparison experiment of accuracy improvement.
-
1970
Improved model distillation structure.