Optimal Sparse Sliced Inverse Regression via Random Projection
| Main Author: | |
|---|---|
| Other Authors: | |
| Published: | 2025 |
| Subjects: | |
| Summary: | Given continuously emerging features, sufficient dimension reduction has been widely used as a supervised dimension-reduction approach. Most existing high-dimensional sufficient dimension reduction methods involve penalized schemes, which entail cumbersome tuning. To address this problem, we propose a novel sparse sliced inverse regression method for sufficient dimension reduction based on random projections in a large *p*, small *n* setting. Embedded in a generalized eigenvalue framework, the proposed approach reduces to the parallel execution of low-dimensional (generalized) eigenvalue decompositions, which yields high computational efficiency. Theoretically, we prove that this method achieves the minimax optimal rate of convergence under suitable assumptions. Furthermore, our algorithm involves a delicate reweighting scheme, which can significantly enhance the identifiability of the active set of covariates. Extensive numerical experiments demonstrate the superiority of the proposed algorithm over competing methods. Supplementary materials for this article are available online. |
|---|---|
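At a high level, sliced inverse regression casts dimension reduction as a generalized eigenvalue problem between the covariance of slice means and the covariance of the covariates, and the abstract indicates that random projections break this into many low-dimensional problems solved in parallel. The sketch below illustrates that general idea; it is not the authors' algorithm or weighting scheme, and the names `sir_kernel`, `rp_sir_scores`, `n_slices`, `n_projections`, and `subspace_dim` are hypothetical choices for this illustration.

```python
# A minimal sketch of sliced inverse regression (SIR) combined with random
# coordinate projections, assuming the method reduces to many low-dimensional
# generalized eigenvalue problems.  Names and the aggregation rule are
# illustrative, not the paper's implementation.
import numpy as np
from scipy.linalg import eigh

def sir_kernel(X, y, n_slices=10):
    """Slice y, average X within each slice, and return Cov(E[X | y])."""
    n, p = X.shape
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    x_bar = X.mean(axis=0)
    M = np.zeros((p, p))
    for idx in slices:
        m = X[idx].mean(axis=0) - x_bar
        M += (len(idx) / n) * np.outer(m, m)
    return M

def rp_sir_scores(X, y, n_projections=200, subspace_dim=5, n_slices=10, seed=0):
    """Score each covariate by its contribution to the leading generalized
    eigenvector across many random coordinate subspaces (a sketch only)."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    Sigma = np.cov(X, rowvar=False)
    M = sir_kernel(X, y, n_slices)
    scores = np.zeros(p)
    counts = np.zeros(p)
    for _ in range(n_projections):
        S = rng.choice(p, size=subspace_dim, replace=False)          # random coordinate projection
        M_s = M[np.ix_(S, S)]
        Sigma_s = Sigma[np.ix_(S, S)] + 1e-6 * np.eye(subspace_dim)  # ridge for stability
        vals, vecs = eigh(M_s, Sigma_s)                              # low-dimensional GEV problem
        v = vecs[:, -1]                                              # leading direction
        scores[S] += vals[-1] * np.abs(v)
        counts[S] += 1
    return scores / np.maximum(counts, 1)

# Toy usage: a single-index model with 3 active covariates in a p >> n setting.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n, p = 200, 500
    X = rng.standard_normal((n, p))
    beta = np.zeros(p)
    beta[:3] = [1.0, -1.0, 0.5]
    y = np.sin(X @ beta) + 0.1 * rng.standard_normal(n)
    s = rp_sir_scores(X, y)
    print("top covariates by score:", np.argsort(s)[::-1][:5])
```

In this sketch each random subspace only requires a small generalized eigendecomposition, which is what makes the loop easy to run in parallel; how the paper reweights and aggregates the subspace solutions, and how it recovers the final sparse directions, is described in the article itself.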