Showing 1 - 20 results of 238 for search '(( significant attention model ) OR ( significantly ((lower decrease) OR (linear decrease)) ))', query time: 0.13s Refine Results
  1.

    Efficient self-attention with smart pruning for sustainable large language models by Samir Brahim Belhaouari (9427347)

    Published in 2025
    "…Second, the Weight Matrix Folding method is introduced to efficiently prune the self-attention layer matrices in a simple and efficient mathematical model. …"
  2.

    An End-to-End Concatenated CNN Attention Model for the Classification of Lung Cancer With XAI Techniques by Fariha Haque (21485518)

    Published in 2025
    "…To address these challenges, an end-to-end concatenated Convolutional Neural Network (CNN) attention model has been proposed for automatic lung cancer classification. …"
  3.

    Agent Productivity Modeling in a Call Center Domain Using Attentive Convolutional Neural Networks by Ahmed, Abdelrahman

    Published in 2020
    "…In this paper, we propose an objective framework for modeling agent productivity for real estate call centers based on speech signal processing. …"
    Get full text
  4.

    PredictPTB: an interpretable preterm birth prediction model using attention-based recurrent neural networks by Rawan AlSaad (14159019)

    Published in 2022
    "…Background: Early identification of pregnant women at risk for preterm birth (PTB), a major cause of infant mortality and morbidity, has a significant potential to improve prenatal care. However, we lack effective predictive models which can accurately forecast PTB and complement these predictions with appropriate interpretations for clinicians. …"
  5.

    Micro-Expression Recognition using Convolutional Variational Attention Transformer (ConVAT) with Multihead Attention Mechanism by Hafiz Khizer bin Talib (20571467)

    Published in 2025
    "…To address these limitations, we propose the Convolutional Variational Attention Transformer (ConVAT), a novel model that leverages a multi-head attention mechanism integrated with convolutional networks, optimized specifically for detailed micro-expression analysis. …"
  6.

    A hybrid 3D CNN-LSTM model with soft spatial attention mechanism for accurate hyperspectral image classification by Mohamed Sultan Mohamed Ali (17317003)

    Published in 2025
    "…This study introduces a hybrid deep learning model that combines 3D Convolutional Neural Networks (CNNs) with Long Short-Term Memory (LSTM) networks, incorporating residual connections and a soft spatial attention mechanism to overcome these limitations. …"
  7.

    Temporal self-attention for risk prediction from electronic health records using non-stationary kernel approximation by Rawan AlSaad (14159019)

    Published in 2024
    "…Yet, modeling the non-stationarity in EHR data has received less attention. …"
  8.

    Dual-attention Network for View-invariant Action Recognition by Gedamu Alemu Kumie (19273711)

    Published in 2023
    "…The DANet is composed of relation-aware spatiotemporal self-attention and spatiotemporal cross-attention modules. …"
  9.
  10.

    Robust and novel attention guided MultiResUnet model for 3D ground reaction force and moment prediction from foot kinematics by Md. Ahasan Atick Faisal (15302410)

    Published in 2023
    "…The proposed deep learning model is tested on two publicly available datasets containing data from 66 healthy subjects to validate the approach. …"
  11.

    Hierarchical multi-head attention LSTM for polyphonic symbolic melody generation by Ahmet Kasif (17787560)

    Published in 2024
    "…As such, in this study, a hierarchical multi-head attention LSTM model is proposed for creating polyphonic symbolic melodies. …"
  12.

    Social attention as a cross‐cultural transdiagnostic neurodevelopmental risk marker by Thomas W. Frazier (4229593)

    Published in 2021
    "…The best-fitting model included a general social attention factor and six specific factors. …"
  13.

    A slow but steady nanoLuc: R162A mutation results in a decreased, but stable, nanoLuc activity by Wesam S. Ahmed (10170053)

    Published in 2024
    "…Here, we combined molecular dynamics (MD) simulation and mutational analysis to show that the R162A mutation results in a decreased but stable bioluminescence activity of NLuc in living cells and in vitro. …"
  14.

    Does standalone phacoemulsification lower intraocular pressure in glaucomatous eyes? A systematic review and meta-analysis by Osama Hussein (21301202)

    Published in 2025
    "…In conclusion, standalone phacoemulsification significantly lowers IOP and reduces the need for glaucoma medications in patients with glaucoma.…"
  15.

    MACGAN: An All-in-One Image Restoration Under Adverse Conditions Using Multidomain Attention-Based Conditional GAN by Maria Siddiqua (17949149)

    Published in 2023
    "…Furthermore, an ablation study is conducted to analyze the contributions of the discriminator and attention blocks within the MACGAN architecture. The results confirm that both components play significant roles in the effectiveness of MACGAN, with the discriminator ensuring adversarial training and the attention blocks effectively capturing and enhancing important image features.…"
  16.
  17.

    Mixed precision iterative refinement with adaptive precision sparse approximate inverse preconditioning by Noaman Khan (19810050)

    Published in 2025
    "…Hardware trends have motivated the development of mixed precision algorithms in numerical linear algebra, which aim to decrease runtime while maintaining acceptable accuracy. …"
  18.
  19.
  20.

    Serum 25-hydroxyvitamin D concentrations are inversely associated with body adiposity measurements but the association with bone mass is non-linear in postmenopausal women by Vijay Ganji (8710491)

    Published in 2021
    "…Overall, body adiposity markers were the lowest in the 4th quartile serum 25(OH)D and significantly lower compared to the 1st quartile serum 25(OH)D. …"