Showing results 121 - 140 of 15,048 for '(( significant ((force decrease) OR (fold decrease)) ) OR ( significant attention data ))'. Query time: 2.72 s.
  1. 121

    Encoder-decoder attention prediction. By Megha Roshan (18388293)

    Published in 2024
    "…We examined the text-based models and compared their results, which showed that the modified encoder-decoder model with an attention mechanism trained on textual data achieved an accuracy of 93.5%. …"
    (An illustrative sketch of this kind of model follows the results list.)
  6. 126

    Time attention features weighted. By Gengchen Xu (20477357)

    Published in 2024
    "…However, traditional LSTM-based SOH estimation methods do not account for the fact that the degradation sequence of battery SOH exhibits trend-like nonlinearity and significant dynamic variations between samples. Therefore, this paper proposes an LSTM-based lithium-ion SOH estimation method incorporating data characteristics and spatio-temporal attention. …"
  7. 127

    Spatial attention feature weighting. By Gengchen Xu (20477357)

    Published in 2024
    "…However, traditional LSTM-based SOH estimation methods do not account for the fact that the degradation sequence of battery SOH exhibits trend-like nonlinearity and significant dynamic variations between samples. Therefore, this paper proposes an LSTM-based lithium-ion SOH estimation method incorporating data characteristics and spatio-temporal attention. …"
    (An illustrative spatio-temporal attention sketch follows the results list.)
  13. 133

    Structural diagram of attention mechanism. By Libo Liu (552605)

    Published in 2024
    "…The value of COD reflects the effectiveness and trend of sewage treatment to a certain extent, but obtaining accurate data requires high cost and labor intensity. To solve this problem, this paper proposes an online soft measurement method for COD based on the Convolutional Neural Network-Bidirectional Long Short-Term Memory Network-Attention Mechanism (CNN-BiLSTM-Attention) algorithm. …"
    (An illustrative CNN-BiLSTM-Attention sketch follows the results list.)
  19. 139

    Evaluation indicators of network attention. By Feng Yuxin (18275639)

    Published in 2024
    "…Research shows that: (1) Overall, the network attention to case-based destinations is relatively low, and there are significant differences in network attention among different attractions. …"
  20. 140

    Heterogeneity analysis: external attention. By Fangjun Wang (1654270)

    Published in 2024
    "…Heterogeneity analysis further underscores that the effect of green funds is particularly potent in companies with high external attention. Furthermore, green funds also play significant roles in production capabilities and economic value. …"