Showing 1,801 - 1,820 results of 4,625 for search 'significantly ((((less decrease) OR (larger decrease))) OR (((nn decrease) OR (mean decrease))))', query time: 0.50s
  1. 1801

    The data of meta-analysis. by Da Huang (1306407)

    Published 2025
    “…Normally presenting with symptoms such as dyspnea, decreased exercise tolerance, decreased maximal heart rate, and decreased arterial oxygen saturation. …”
  2. 1802

    Risk of bias. by Da Huang (1306407)

    Published 2025
    “…Normally presenting with symptoms such as dyspnea, decreased exercise tolerance, decreased maximal heart rate, and decreased arterial oxygen saturation. …”
  3. 1803

    Overall risk of bias assessment. by Da Huang (1306407)

    Published 2025
    “…Normally presenting with symptoms such as dyspnea, decreased exercise tolerance, decreased maximal heart rate, and decreased arterial oxygen saturation. …”
  4. 1804

    Funnel plot of VO₂Peak inclusion studies. by Da Huang (1306407)

    Published 2025
    “…Normally presenting with symptoms such as dyspnea, decreased exercise tolerance, decreased maximal heart rate, and decreased arterial oxygen saturation. …”
  5. 1805

    Analysis of subgroups. by Da Huang (1306407)

    Published 2025
    “…Normally presenting with symptoms such as dyspnea, decreased exercise tolerance, decreased maximal heart rate, and decreased arterial oxygen saturation. …”
  6. 1806
  7. 1807

    The overall framework of CARAFE. by Zhongjian Xie (4633099)

    Published 2025
    “…Secondly, a lightweight convolutional module is introduced to replace the standard convolutions in the Efficient Long-range Aggregation Network (ELAN-A) module, and the channel pruning techniques are applied to further decrease the model’s complexity. Finally, the experiment significantly enhanced the efficiency of feature extraction and the detection accuracy of the model algorithm through the integration of the Dynamic Head (DyHead) module, the Content-Aware Re-Assembly of Features (CARAFE) module, and the incorporation of knowledge distillation techniques. …”
  8. 1808

    KPD-YOLOv7-GD network structure diagram. by Zhongjian Xie (4633099)

    Published 2025
    “…Secondly, a lightweight convolutional module is introduced to replace the standard convolutions in the Efficient Long-range Aggregation Network (ELAN-A) module, and the channel pruning techniques are applied to further decrease the model’s complexity. Finally, the experiment significantly enhanced the efficiency of feature extraction and the detection accuracy of the model algorithm through the integration of the Dynamic Head (DyHead) module, the Content-Aware Re-Assembly of Features (CARAFE) module, and the incorporation of knowledge distillation techniques. …”
  9. 1809

    Comparison experiment of accuracy improvement. by Zhongjian Xie (4633099)

    Published 2025
    “…Secondly, a lightweight convolutional module is introduced to replace the standard convolutions in the Efficient Long-range Aggregation Network (ELAN-A) module, and the channel pruning techniques are applied to further decrease the model’s complexity. Finally, the experiment significantly enhanced the efficiency of feature extraction and the detection accuracy of the model algorithm through the integration of the Dynamic Head (DyHead) module, the Content-Aware Re-Assembly of Features (CARAFE) module, and the incorporation of knowledge distillation techniques. …”
  10. 1810

    Comparison of different pruning rates. by Zhongjian Xie (4633099)

    Published 2025
    “…Secondly, a lightweight convolutional module is introduced to replace the standard convolutions in the Efficient Long-range Aggregation Network (ELAN-A) module, and the channel pruning techniques are applied to further decrease the model’s complexity. Finally, the experiment significantly enhanced the efficiency of feature extraction and the detection accuracy of the model algorithm through the integration of the Dynamic Head (DyHead) module, the Content-Aware Re-Assembly of Features (CARAFE) module, and the incorporation of knowledge distillation techniques. …”
  11. 1811

    Comparison of experimental results at ablation. by Zhongjian Xie (4633099)

    Published 2025
    “…Secondly, a lightweight convolutional module is introduced to replace the standard convolutions in the Efficient Long-range Aggregation Network (ELAN-A) module, and the channel pruning techniques are applied to further decrease the model’s complexity. Finally, the experiment significantly enhanced the efficiency of feature extraction and the detection accuracy of the model algorithm through the integration of the Dynamic Head (DyHead) module, the Content-Aware Re-Assembly of Features (CARAFE) module, and the incorporation of knowledge distillation techniques. …”
  12. 1812

    Result of comparison of different lightweight. by Zhongjian Xie (4633099)

    Published 2025
    “…Secondly, a lightweight convolutional module is introduced to replace the standard convolutions in the Efficient Long-range Aggregation Network (ELAN-A) module, and the channel pruning techniques are applied to further decrease the model’s complexity. Finally, the experiment significantly enhanced the efficiency of feature extraction and the detection accuracy of the model algorithm through the integration of the Dynamic Head (DyHead) module, the Content-Aware Re-Assembly of Features (CARAFE) module, and the incorporation of knowledge distillation techniques. …”
  13. 1813

    DyHead Structure. by Zhongjian Xie (4633099)

    Published 2025
    “…Secondly, a lightweight convolutional module is introduced to replace the standard convolutions in the Efficient Long-range Aggregation Network (ELAN-A) module, and the channel pruning techniques are applied to further decrease the model’s complexity. Finally, the experiment significantly enhanced the efficiency of feature extraction and the detection accuracy of the model algorithm through the integration of the Dynamic Head (DyHead) module, the Content-Aware Re-Assembly of Features (CARAFE) module, and the incorporation of knowledge distillation techniques. …”
  14. 1814

    The parameters of the training phase. by Zhongjian Xie (4633099)

    Published 2025
    “…Secondly, a lightweight convolutional module is introduced to replace the standard convolutions in the Efficient Long-range Aggregation Network (ELAN-A) module, and the channel pruning techniques are applied to further decrease the model’s complexity. Finally, the experiment significantly enhanced the efficiency of feature extraction and the detection accuracy of the model algorithm through the integration of the Dynamic Head (DyHead) module, the Content-Aware Re-Assembly of Features (CARAFE) module, and the incorporation of knowledge distillation techniques. …”
  15. 1815

    Structure of GSConv network. by Zhongjian Xie (4633099)

    Published 2025
    “…Secondly, a lightweight convolutional module is introduced to replace the standard convolutions in the Efficient Long-range Aggregation Network (ELAN-A) module, and the channel pruning techniques are applied to further decrease the model’s complexity. Finally, the experiment significantly enhanced the efficiency of feature extraction and the detection accuracy of the model algorithm through the integration of the Dynamic Head (DyHead) module, the Content-Aware Re-Assembly of Features (CARAFE) module, and the incorporation of knowledge distillation techniques. …”
  16. 1816

    Comparison experiment of accuracy improvement. by Zhongjian Xie (4633099)

    Published 2025
    “…Secondly, a lightweight convolutional module is introduced to replace the standard convolutions in the Efficient Long-range Aggregation Network (ELAN-A) module, and the channel pruning techniques are applied to further decrease the model’s complexity. Finally, the experiment significantly enhanced the efficiency of feature extraction and the detection accuracy of the model algorithm through the integration of the Dynamic Head (DyHead) module, the Content-Aware Re-Assembly of Features (CARAFE) module, and the incorporation of knowledge distillation techniques. …”
  17. 1817

    Improved model distillation structure. by Zhongjian Xie (4633099)

    Published 2025
    “…Secondly, a lightweight convolutional module is introduced to replace the standard convolutions in the Efficient Long-range Aggregation Network (ELAN-A) module, and the channel pruning techniques are applied to further decrease the model’s complexity. Finally, the experiment significantly enhanced the efficiency of feature extraction and the detection accuracy of the model algorithm through the integration of the Dynamic Head (DyHead) module, the Content-Aware Re-Assembly of Features (CARAFE) module, and the incorporation of knowledge distillation techniques. …”
  18. 1818
  19. 1819
  20. 1820
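
Note: the KPD-YOLOv7-GD result snippets above (entries 1807-1817) repeatedly mention applying channel pruning to decrease the model's complexity. The following is only a minimal, hedged sketch of the general L1-norm channel-pruning idea; the function name, keep_ratio, and layer sizes are illustrative assumptions and do not come from the indexed paper.

    # Illustrative sketch only, not the authors' code: L1-norm based pruning of
    # a single convolution's output channels, as a stand-in for the channel
    # pruning step mentioned in the snippets above.
    import torch
    import torch.nn as nn


    def prune_conv_channels(conv: nn.Conv2d, keep_ratio: float = 0.5) -> nn.Conv2d:
        """Keep the output channels of `conv` with the largest L1 filter norms."""
        n_keep = max(1, int(conv.out_channels * keep_ratio))
        # L1 norm of each output filter; shape: (out_channels,)
        importance = conv.weight.detach().abs().sum(dim=(1, 2, 3))
        keep_idx = torch.argsort(importance, descending=True)[:n_keep]

        pruned = nn.Conv2d(
            conv.in_channels, n_keep, conv.kernel_size,
            stride=conv.stride, padding=conv.padding,
            bias=conv.bias is not None,
        )
        pruned.weight.data = conv.weight.data[keep_idx].clone()
        if conv.bias is not None:
            pruned.bias.data = conv.bias.data[keep_idx].clone()
        return pruned


    # Example (hypothetical layer sizes): halve the output channels of a 3x3 conv.
    layer = nn.Conv2d(64, 128, kernel_size=3, padding=1)
    slim = prune_conv_channels(layer, keep_ratio=0.5)
    print(slim)  # Conv2d(64, 64, kernel_size=(3, 3), ...)

In a full pipeline, downstream layers would also need their input channels re-indexed to match the pruned output channels; that bookkeeping is omitted here for brevity.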