Showing 1,561 - 1,580 results of 21,342 for search '(( significant decrease decrease ) OR ( significant ((inter decrease) OR (greater decrease)) ))', query time: 0.62s
  1. 1561

    PCA-CGAN model convergence curve. by Chao Tang (10925)

    Published 2025
    “…Experiments demonstrate that PCA-CGAN not only achieves stable convergence on a large-scale heterogeneous dataset comprising 43 patients for the first time but also resolves the “dilution effect” problem in data augmentation, avoiding the asymmetric phenomenon where Precision increases while Recall decreases. After data augmentation, the ResNet model’s average F1 score improved significantly, with particularly outstanding performance on rare categories such as atrial premature beats, far surpassing traditional methods like SigCWGAN and TD-GAN. …”
  2. 1562

    PCA-CGAN Structure Diagram. by Chao Tang (10925)

    Published 2025
    “…” (abstract snippet identical to result 1561)
  3. 1563

    Comparison of Model Five-classification Results. by Chao Tang (10925)

    Published 2025
    “…” (abstract snippet identical to result 1561)
  4. 1564

    PCAECG-GAN K-fold experiment table. by Chao Tang (10925)

    Published 2025
    “…” (abstract snippet identical to result 1561)
  5. 1565

    PCA-CGAN Pseudocode Table. by Chao Tang (10925)

    Published 2025
    “…” (abstract snippet identical to result 1561)
  6. 1566

    PCA-CGAN Ablation Experiment Results. by Chao Tang (10925)

    Published 2025
    “…” (abstract snippet identical to result 1561; see the note after this results list)
  7.–20. 1567–1580 (result entries listed without titles or snippets)
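
Note: the abstract snippet shown for result 1561 highlights an asymmetry in which Precision increases while Recall decreases. A minimal sketch with hypothetical numbers (not taken from the cited work) of the standard F1 definition shows why such a trade-off can erase an apparent gain:

```python
# Minimal sketch, hypothetical values only (not from Tang 2025):
# F1 is the harmonic mean of Precision and Recall, so a Precision gain
# paired with a Recall drop can still lower F1.

def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Before augmentation (illustrative values).
print(f1_score(0.80, 0.70))  # ~0.747

# Precision up, Recall down: F1 falls despite the Precision gain.
print(f1_score(0.90, 0.55))  # ~0.683
```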