Showing 1,981 - 2,000 results of 3,675 for search 'significantly ((((larger decrease) OR (((mean decrease) OR (nn decrease))))) OR (teer decrease))', query time: 0.52s
  1. 1981
  2. 1982
  3. 1983

    Algorithm training accuracy experiments. by Yingying Liu (360782)

    Published 2025
    “…The actual accuracy and mean latency time of the model were 92.43% and 260ms, respectively. …”
  4. 1984

    Repeat the detection experiment. by Yingying Liu (360782)

    Published 2025
    “…The actual accuracy and mean latency time of the model were 92.43% and 260ms, respectively. …”
  5. 1985

    Detection network structure with IRAU [34]. by Yingying Liu (360782)

    Published 2025
    “…The actual accuracy and mean latency time of the model were 92.43% and 260ms, respectively. …”
  6. 1986

    Ablation experiments of various blocks. by Yingying Liu (360782)

    Published 2025
    “…The actual accuracy and mean latency time of the model were 92.43% and 260ms, respectively. …”
  7. 1987

    Kappa coefficients for different algorithms. by Yingying Liu (360782)

    Published 2025
    “…The actual accuracy and mean latency time of the model were 92.43% and 260ms, respectively. …”
  8. 1988

    The structure of ASPP+ block. by Yingying Liu (360782)

    Published 2025
    “…The actual accuracy and mean latency time of the model were 92.43% and 260ms, respectively. …”
  9. 1989

    The structure of attention gate block [31]. by Yingying Liu (360782)

    Published 2025
    “…The actual accuracy and mean latency time of the model were 92.43% and 260ms, respectively. …”
  10. 1990

    DSC block and its application network structure. by Yingying Liu (360782)

    Published 2025
    “…The actual accuracy and mean latency time of the model were 92.43% and 260ms, respectively. …”
  11. 1991

    The structure of multi-scale residual block [30]. by Yingying Liu (360782)

    Published 2025
    “…The actual accuracy and mean latency time of the model were 92.43% and 260ms, respectively. …”
  12. 1992

    The structure of IRAU and Res2Net+ block [22]. by Yingying Liu (360782)

    Published 2025
    “…The actual accuracy and mean latency time of the model were 92.43% and 260ms, respectively. …”
  13. 1993
  14. 1994
  15. 1995
  16. 1996

    Prediction of transition readiness. by Sharon Barak (4803966)

    Published 2025
    “…In most transition domains, help needed did not decrease with age and was not affected by function. …”
  17. 1997

    Dataset visualization diagram. by Yaojun Zhang (389482)

    Published 2025
    “…Results on a specialized dataset reveal that YOLOv8n-BWG outperforms YOLOv8n by increasing the mean Average Precision (mAP) by 4.2%, boosting recognition speed by 21.3% per second, and decreasing both the number of floating-point operations (FLOPs) by 28.9% and model size by 26.3%. …”
  18. 1998

    Dataset sample images. by Yaojun Zhang (389482)

    Published 2025
    “…Results on a specialized dataset reveal that YOLOv8n-BWG outperforms YOLOv8n by increasing the mean Average Precision (mAP) by 4.2%, boosting recognition speed by 21.3% per second, and decreasing both the number of floating-point operations (FLOPs) by 28.9% and model size by 26.3%. …”
  19. 1999

    Performance comparison of different models. by Yaojun Zhang (389482)

    Published 2025
    “…Results on a specialized dataset reveal that YOLOv8n-BWG outperforms YOLOv8n by increasing the mean Average Precision (mAP) by 4.2%, boosting recognition speed by 21.3% per second, and decreasing both the number of floating-point operations (FLOPs) by 28.9% and model size by 26.3%. …”
  20. 2000

    C2f and BC2f module structure diagrams. by Yaojun Zhang (389482)

    Published 2025
    “…Results on a specialized dataset reveal that YOLOv8n-BWG outperforms YOLOv8n by increasing the mean Average Precision (mAP) by 4.2%, boosting recognition speed by 21.3% per second, and decreasing both the number of floating-point operations (FLOPs) by 28.9% and model size by 26.3%. …”