Stacked Ensemble Deep Random Vector Functional Link Network With Residual Learning for Medium-Scale Time-Series Forecasting

Bibliographic Details
Main Author: Ruobin Gao
Other Authors: Minghui Hu, Ruilin Li, Xuewen Luo, Ponnuthurai Nagaratnam Suganthan, M. Tanveer
Published: 2025
Description

Summary: The deep random vector functional link (dRVFL) network and its ensemble variant (edRVFL) succeed in various tasks and achieve state-of-the-art performance compared with other randomized neural networks (NNs). However, existing edRVFL structures lack diversity and error-correction ability within each independent network. Our work fills this gap by combining stacked deep blocks and residual learning with the edRVFL. We first propose a novel dRVFL with residual learning, ResdRVFL, whose deep layers calibrate the erroneous estimations of shallow layers. We further incorporate a scaling parameter that controls the scaling of residuals passed up from shallow layers, mitigating the risk of overfitting. Finally, we present an ensemble deep stacking network, SResdRVFL, built from ResdRVFL blocks; it aggregates multiple blocks into a cohesive network, leveraging the benefits of both deep learning and ensemble learning. We evaluate the proposed model on 28 datasets and compare it with state-of-the-art methods. The comparative study shows that SResdRVFL is the best-performing approach in terms of average ranking and error across the 28 datasets.

Other Information

Published in: IEEE Transactions on Neural Networks and Learning Systems
License: https://creativecommons.org/licenses/by/4.0/
See article on publisher's website: https://dx.doi.org/10.1109/tnnls.2025.3529219
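The residual-learning idea the abstract describes can be illustrated with a minimal sketch: an RVFL layer uses random (untrained) hidden weights plus direct input links, solves its output weights in closed form by ridge regression, and each deeper layer then fits the residual left by the shallower layers, scaled by a parameter. This is a hypothetical illustration of the general technique, not the authors' implementation; the function names (`rvfl_layer`, `fit_resdrvfl`) and the parameters (`n_hidden`, `reg`, `alpha`) are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def rvfl_layer(X, y, n_hidden=64, reg=1e-2):
    """One RVFL layer: random hidden features plus direct input links;
    only the output weights beta are trained, via ridge regression."""
    W = rng.uniform(-1, 1, (X.shape[1], n_hidden))   # random, fixed
    b = rng.uniform(-1, 1, n_hidden)                 # random, fixed
    H = np.tanh(X @ W + b)                           # nonlinear random features
    D = np.hstack([H, X])                            # direct link: concat raw inputs
    beta = np.linalg.solve(D.T @ D + reg * np.eye(D.shape[1]), D.T @ y)
    return (W, b, beta), D @ beta

def fit_resdrvfl(X, y, n_layers=3, alpha=0.5):
    """Residual stacking in the spirit of ResdRVFL (sketch): each deeper
    layer fits the residual of the running prediction; `alpha` scales the
    contribution of each layer to temper overfitting."""
    layers, pred = [], np.zeros_like(y)
    target = y.copy()
    for _ in range(n_layers):
        layer, out = rvfl_layer(X, target)
        pred = pred + alpha * out    # scaled residual correction
        target = y - pred            # remaining error for the next layer
        layers.append(layer)
    return layers, pred
```

An ensemble in the style of SResdRVFL would then average the predictions of several such blocks, each built with independent random weights.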