Showing 2,001 - 2,020 results of 5,376 for search 'significantly ((altered decrease) OR (((greatest decrease) OR (mean decrease))))', query time: 0.40s
  1. 2001

    The framework diagram of this study. by Mei Zhou (269746)

    Published 2025
    “…Results: After DRG implementation, the logarithmic mean of total hospitalization expenditures decreased significantly (3.914 ± 0.837 vs. 3.872 ± 1.004), while rates of unplanned readmissions, unplanned reoperations, postoperative complications, and patient complaints within 30 days increased significantly (3.784% vs. 4.214%, 0.083% vs. 0.166%, 0.207% vs. 0.258%, and 3.741% vs. 5.133%). …”
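
    For readers unfamiliar with log-scale cost reporting, the sketch below shows how a "logarithmic mean" of expenditures is typically computed: log-transform each admission's total cost and report mean ± SD on the log scale. This is a generic illustration, not the study's code; the log base (base 10 assumed here), the currency unit, and the back-transform to a geometric mean are assumptions the abstract does not state.

        import numpy as np

        def log10_mean_sd(costs):
            """Mean and sample SD of log10-transformed per-admission costs."""
            logs = np.log10(np.asarray(costs, dtype=float))
            return logs.mean(), logs.std(ddof=1)

        # Back-transforming the reported log-means gives geometric-mean costs:
        # 10 ** 3.914 is roughly 8.2e3 and 10 ** 3.872 roughly 7.4e3 per
        # admission (units unknown), i.e. the post-DRG geometric mean is lower.
        before, after = 10 ** 3.914, 10 ** 3.872
        print(f"geometric means: {before:.0f} vs. {after:.0f}")
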
  2. 2002
  3. 2003
  4. 2004
  5. 2005
  6. 2006
  7. 2007
  8. 2008
  9. 2009

    The overall framework of CARAFE. by Zhongjian Xie (4633099)

    Published 2025
    “…Secondly, a lightweight convolutional module is introduced to replace the standard convolutions in the Efficient Long-range Aggregation Network (ELAN-A) module, and channel pruning techniques are applied to further decrease the model’s complexity. Finally, integrating the Dynamic Head (DyHead) module, the Content-Aware Re-Assembly of Features (CARAFE) module, and knowledge distillation techniques significantly improved the efficiency of feature extraction and the detection accuracy of the model. …”
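
    The snippet above describes two generic techniques that can be sketched independently of the paper: replacing a standard convolution with a lighter depthwise-separable block, and distilling a student detector from a teacher via softened outputs. The sketch below is a minimal PyTorch illustration under those assumptions; the module name LightweightConv, the layer choices, and the temperature value are illustrative and are not taken from the authors' KPD-YOLOv7-GD implementation, whose GSConv, DyHead, and CARAFE components follow their own published designs.

        import torch
        import torch.nn as nn
        import torch.nn.functional as F

        class LightweightConv(nn.Module):
            """Depthwise 3x3 + pointwise 1x1 stand-in for a standard 3x3 conv.

            Parameters drop from roughly k*k*C_in*C_out to k*k*C_in + C_in*C_out,
            the kind of reduction a GSConv-style replacement targets
            (illustrative only, not the GSConv design itself).
            """
            def __init__(self, c_in: int, c_out: int, stride: int = 1):
                super().__init__()
                self.dw = nn.Conv2d(c_in, c_in, 3, stride, 1, groups=c_in, bias=False)
                self.pw = nn.Conv2d(c_in, c_out, 1, bias=False)
                self.bn = nn.BatchNorm2d(c_out)
                self.act = nn.SiLU()

            def forward(self, x: torch.Tensor) -> torch.Tensor:
                return self.act(self.bn(self.pw(self.dw(x))))

        def distillation_loss(student_logits, teacher_logits, temperature: float = 4.0):
            """Response-based KD: KL divergence between softened class distributions."""
            s = F.log_softmax(student_logits / temperature, dim=-1)
            t = F.softmax(teacher_logits / temperature, dim=-1)
            return F.kl_div(s, t, reduction="batchmean") * temperature ** 2

        if __name__ == "__main__":
            x = torch.randn(1, 64, 80, 80)
            print(LightweightConv(64, 128)(x).shape)  # torch.Size([1, 128, 80, 80])

    Channel pruning, the remaining step mentioned in the abstract, would typically rank channels (for example by batch-norm scale magnitude) and remove the lowest-scoring ones before fine-tuning; it is omitted from the sketch for brevity.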
  10. 2010

    KPD-YOLOv7-GD network structure diagram. by Zhongjian Xie (4633099)

    Published 2025
  11. 2011

    Comparison experiment of accuracy improvement. by Zhongjian Xie (4633099)

    Published 2025
  12. 2012

    Comparison of different pruning rates. by Zhongjian Xie (4633099)

    Published 2025
  13. 2013

    Comparison of experimental results at ablation. by Zhongjian Xie (4633099)

    Published 2025
  14. 2014

    Result of comparison of different lightweight. by Zhongjian Xie (4633099)

    Published 2025
  15. 2015

    DyHead Structure. by Zhongjian Xie (4633099)

    Published 2025
  16. 2016

    The parameters of the training phase. by Zhongjian Xie (4633099)

    Published 2025
  17. 2017

    Structure of GSConv network. by Zhongjian Xie (4633099)

    Published 2025
  18. 2018

    Comparison experiment of accuracy improvement. by Zhongjian Xie (4633099)

    Published 2025
  19. 2019

    Improved model distillation structure. by Zhongjian Xie (4633099)

    Published 2025
  20. 2020