Showing 2,221 - 2,240 results of 4,361 for search 'significantly ((((teer decrease) OR (greater decrease))) OR (mean decrease))', query time: 0.21s
  1. 2221

    Repeat the detection experiment. by Yingying Liu (360782)

    Published 2025
    “…The actual accuracy and mean latency time of the model were 92.43% and 260ms, respectively. …”
  2. 2222

    Detection network structure with IRAU [34]. by Yingying Liu (360782)

    Published 2025
    “…The actual accuracy and mean latency time of the model were 92.43% and 260ms, respectively. …”
  3. 2223

    Ablation experiments of various block. by Yingying Liu (360782)

    Published 2025
    “…The actual accuracy and mean latency time of the model were 92.43% and 260ms, respectively. …”
  4. 2224

    Kappa coefficients for different algorithms. by Yingying Liu (360782)

    Published 2025
    “…The actual accuracy and mean latency time of the model were 92.43% and 260ms, respectively. …”
  5. 2225

    The structure of ASPP+ block. by Yingying Liu (360782)

    Published 2025
    “…The actual accuracy and mean latency time of the model were 92.43% and 260ms, respectively. …”
  6. 2226

    The structure of attention gate block [31]. by Yingying Liu (360782)

    Published 2025
    “…The actual accuracy and mean latency time of the model were 92.43% and 260ms, respectively. …”
  7. 2227

    DSC block and its application network structure. by Yingying Liu (360782)

    Published 2025
    “…The actual accuracy and mean latency time of the model were 92.43% and 260ms, respectively. …”
  8. 2228

    The structure of multi-scale residual block [30]. by Yingying Liu (360782)

    Published 2025
    “…The actual accuracy and mean latency time of the model were 92.43% and 260ms, respectively. …”
  9. 2229

    The structure of IRAU and Res2Net+ block [22]. by Yingying Liu (360782)

    Published 2025
    “…The actual accuracy and mean latency time of the model were 92.43% and 260ms, respectively. …”
  10. 2230
  11. 2231
  12. 2232
  13. 2233

    Prediction of transition readiness. by Sharon Barak (4803966)

    Published 2025
    “…In most transition domains, help needed did not decrease with age and was not affected by function. …”
  14. 2234

    Dataset visualization diagram. by Yaojun Zhang (389482)

    Published 2025
    “…Results on a specialized dataset reveal that YOLOv8n-BWG outperforms YOLOv8n by increasing the mean Average Precision (mAP) by 4.2%, boosting recognition speed by 21.3% per second, and decreasing both the number of floating-point operations (FLOPs) by 28.9% and model size by 26.3%. …”
  15. 2235

    Dataset sample images. by Yaojun Zhang (389482)

    Published 2025
    “…Results on a specialized dataset reveal that YOLOv8n-BWG outperforms YOLOv8n by increasing the mean Average Precision (mAP) by 4.2%, boosting recognition speed by 21.3% per second, and decreasing both the number of floating-point operations (FLOPs) by 28.9% and model size by 26.3%. …”
  16. 2236

    Performance comparison of different models. by Yaojun Zhang (389482)

    Published 2025
    “…Results on a specialized dataset reveal that YOLOv8n-BWG outperforms YOLOv8n by increasing the mean Average Precision (mAP) by 4.2%, boosting recognition speed by 21.3% per second, and decreasing both the number of floating-point operations (FLOPs) by 28.9% and model size by 26.3%. …”
  17. 2237

    C2f and BC2f module structure diagrams. by Yaojun Zhang (389482)

    Published 2025
    “…Results on a specialized dataset reveal that YOLOv8n-BWG outperforms YOLOv8n by increasing the mean Average Precision (mAP) by 4.2%, boosting recognition speed by 21.3% per second, and decreasing both the number of floating-point operations (FLOPs) by 28.9% and model size by 26.3%. …”
  18. 2238

    YOLOv8n detection results diagram. by Yaojun Zhang (389482)

    Published 2025
    “…Results on a specialized dataset reveal that YOLOv8n-BWG outperforms YOLOv8n by increasing the mean Average Precision (mAP) by 4.2%, boosting recognition speed by 21.3% per second, and decreasing both the number of floating-point operations (FLOPs) by 28.9% and model size by 26.3%. …”
  19. 2239

    YOLOv8n-BWG model structure diagram. by Yaojun Zhang (389482)

    Published 2025
    “…Results on a specialized dataset reveal that YOLOv8n-BWG outperforms YOLOv8n by increasing the mean Average Precision (mAP) by 4.2%, boosting recognition speed by 21.3% per second, and decreasing both the number of floating-point operations (FLOPs) by 28.9% and model size by 26.3%. …”
  20. 2240

    BiFormer structure diagram. by Yaojun Zhang (389482)

    Published 2025
    “…Results on a specialized dataset reveal that YOLOv8n-BWG outperforms YOLOv8n by increasing the mean Average Precision (mAP) by 4.2%, boosting recognition speed by 21.3% per second, and decreasing both the number of floating-point operations (FLOPs) by 28.9% and model size by 26.3%. …”