Showing 3,261 - 3,280 results of 11,397 for search 'significantly ((((((lower decrease) OR (we decrease))) OR (linear decrease))) OR (mean decrease))', query time: 0.54s

    Simulation datasets. by Xiao Mo (2430355)

    Published 2025
    “…These results offer significant theoretical guidance for the design and improvement of needle-free injection.…”
  7. 3267

    Parameters of screws. by Pilan Jaipanya (12861176)

    Published 2025
    “…Background: Lateral mass screw (LMS) is a more widely adopted method for posterior cervical spine fixation than the cervical pedicle screw (CPS). Despite its lower pullout strength, the insertions of LMS are more reproducible and have a lower risk. …”
  8. 3268

    The overall framework of CARAFE. by Zhongjian Xie (4633099)

    Published 2025
    “…Secondly, a lightweight convolutional module is introduced to replace the standard convolutions in the Efficient Long-range Aggregation Network (ELAN-A) module, and the channel pruning techniques are applied to further decrease the model’s complexity. Finally, the experiment significantly enhanced the efficiency of feature extraction and the detection accuracy of the model algorithm through the integration of the Dynamic Head (DyHead) module, the Content-Aware Re-Assembly of Features (CARAFE) module, and the incorporation of knowledge distillation techniques. …”
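
    The abstract above names its techniques only at a high level. As an illustration only — this is not the authors' code, and the toy filter weights are invented — the channel pruning used to "further decrease the model's complexity" is commonly done by ranking a layer's convolution filters by L1 norm and dropping the weakest fraction:

    ```python
    def l1_channel_pruning(filters, prune_ratio):
        """Rank conv filters by the L1 norm of their weights and keep the strongest.

        filters: one flat weight list per output channel (hypothetical toy data).
        prune_ratio: fraction of channels to remove, e.g. 0.5 for a 50% pruning rate.
        Returns the sorted indices of the channels to keep.
        """
        norms = [sum(abs(w) for w in f) for f in filters]        # L1 norm per channel
        n_keep = len(filters) - int(len(filters) * prune_ratio)  # surviving channels
        ranked = sorted(range(len(filters)), key=lambda i: norms[i], reverse=True)
        return sorted(ranked[:n_keep])

    # Toy example: four 2-weight filters; pruning half keeps the two largest-norm channels.
    kept = l1_channel_pruning([[0.1, -0.2], [1.0, 0.5], [0.05, 0.0], [-0.9, 0.8]], 0.5)
    print(kept)  # → [1, 3]
    ```

    Dropping a channel here shrinks both this layer's parameters and the input width of the next layer, which is why pruning rate trades accuracy against complexity in the comparison of different pruning rates.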
  9. 3269

    KPD-YOLOv7-GD network structure diagram. by Zhongjian Xie (4633099)

    Published 2025
  10. 3270

    Comparison experiment of accuracy improvement. by Zhongjian Xie (4633099)

    Published 2025
  11. 3271

    Comparison of different pruning rates. by Zhongjian Xie (4633099)

    Published 2025
  12. 3272

    Comparison of experimental results at ablation. by Zhongjian Xie (4633099)

    Published 2025
  13. 3273

    Result of comparison of different lightweight. by Zhongjian Xie (4633099)

    Published 2025
  14. 3274

    DyHead Structure. by Zhongjian Xie (4633099)

    Published 2025
  15. 3275

    The parameters of the training phase. by Zhongjian Xie (4633099)

    Published 2025
  16. 3276

    Structure of GSConv network. by Zhongjian Xie (4633099)

    Published 2025
  17. 3277

    Comparison experiment of accuracy improvement. by Zhongjian Xie (4633099)

    Published 2025
  18. 3278

    Improved model distillation structure. by Zhongjian Xie (4633099)

    Published 2025
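
    This entry's figure concerns the model's distillation structure. As a hedged sketch of the standard soft-target loss used in knowledge distillation generally — not this paper's specific loss — a teacher's temperature-softened output distribution supervises the student via KL divergence:

    ```python
    import math

    def softmax(logits, T=1.0):
        """Temperature-scaled softmax: higher T flattens the distribution."""
        exps = [math.exp(z / T) for z in logits]
        total = sum(exps)
        return [e / total for e in exps]

    def distillation_loss(student_logits, teacher_logits, T=2.0):
        """KL divergence from teacher soft targets to student predictions,
        scaled by T^2 as is conventional so gradients stay comparable across T."""
        p = softmax(teacher_logits, T)  # teacher soft targets
        q = softmax(student_logits, T)  # student predictions
        return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q)) * (T * T)
    ```

    The loss is zero when the student exactly matches the teacher and grows as their distributions diverge, letting a pruned, lightweight student recover accuracy from a larger teacher.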