Optimal Sparse Sliced Inverse Regression via Random Projection

Given continuously emerging features, sufficient dimension reduction has been widely used as a supervised dimension reduction approach. Most existing high-dimensional sufficient dimension reduction methods involve penalized schemes, which result in cumbersome tuning. To address this problem, we propose a novel sparse sliced inverse regression method for sufficient dimension reduction based on random projections in a large-p, small-n setting. Embedded in a generalized eigenvalue framework, the proposed approach ultimately reduces to the parallel execution of low-dimensional (generalized) eigenvalue decompositions, which yields high computational efficiency. Theoretically, we prove that this method achieves the minimax optimal rate of convergence under suitable assumptions. Furthermore, our algorithm incorporates a delicate reweighting scheme that can significantly enhance the identifiability of the active set of covariates. Extensive numerical experiments demonstrate the clear superiority of the proposed algorithm over competing methods. Supplementary materials for this article are available online.
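The abstract above describes the method only at a high level: sliced inverse regression embedded in a generalized eigenvalue framework, with random projections reducing the problem to many parallel low-dimensional eigendecompositions. For orientation only, here is a minimal Python sketch of that general idea under stated assumptions; the function names (sir_slice_cov, sir_directions, rp_screening_scores), the coordinate-subset form of the random projection, and the max-eigenvalue scoring rule are illustrative choices, not the authors' algorithm, and the paper's reweighting scheme is not reproduced.

```python
# Illustrative sketch only: classical SIR as a generalized eigenvalue problem,
# plus a naive random-coordinate-projection screening loop. Not the authors' method.
import numpy as np
from scipy.linalg import eigh

def sir_slice_cov(X, y, n_slices=10):
    """Estimate Cov(E[X | Y]) by slicing the response and averaging centered X within slices."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    slices = np.array_split(np.argsort(y), n_slices)
    M = np.zeros((p, p))
    for idx in slices:
        m = Xc[idx].mean(axis=0)              # within-slice mean of centered covariates
        M += (len(idx) / n) * np.outer(m, m)  # weighted outer product of slice means
    return M

def sir_directions(X, y, d=1, n_slices=10):
    """Classical SIR: top-d generalized eigenvectors of M v = lambda * Sigma v."""
    M = sir_slice_cov(X, y, n_slices)
    Sigma = np.cov(X, rowvar=False) + 1e-6 * np.eye(X.shape[1])  # small ridge for stability
    _, vecs = eigh(M, Sigma)                  # eigenvalues returned in ascending order
    return vecs[:, ::-1][:, :d]               # directions for the d largest eigenvalues

def rp_screening_scores(X, y, n_proj=200, k=5, n_slices=10, seed=0):
    """Toy random-projection screening: solve many k-dimensional SIR sub-problems on
    random coordinate subsets; score each covariate by the best leading eigenvalue."""
    rng = np.random.default_rng(seed)
    p = X.shape[1]
    scores = np.zeros(p)
    for _ in range(n_proj):
        S = rng.choice(p, size=k, replace=False)          # random coordinate projection
        M = sir_slice_cov(X[:, S], y, n_slices)
        Sigma = np.cov(X[:, S], rowvar=False) + 1e-6 * np.eye(k)
        lam_max = eigh(M, Sigma, eigvals_only=True)[-1]   # leading generalized eigenvalue
        scores[S] = np.maximum(scores[S], lam_max)        # keep each covariate's best score
    return scores                                         # larger score -> more likely active
```

In this sketch, covariates with large screening scores would be kept and sir_directions rerun on the reduced set; the paper's actual projection, aggregation, and reweighting steps are more refined and come with the stated minimax guarantees.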


Bibliographic Details
Main Author: Jia Zhang (187802) (author)
Other Authors: Runxiong Wu (22405116) (author), Xin Chen (14149) (author)
Published: 2025
Subjects: Medicine; Cancer; Space Science; Mathematical Sciences not elsewhere classified; Information Systems not elsewhere classified; Minimax optimality; Random projection; Sparse sliced inverse regression; Sufficient dimension reduction
Item Type: Dataset (published version)
DOI: 10.6084/m9.figshare.30324843.v1
URL: https://figshare.com/articles/dataset/Optimal_Sparse_Sliced_Inverse_Regression_via_Random_Projection/30324843
Rights: CC BY 4.0 (open access)
Repository: ManaraRepo (Manara)
Record ID: Manara_5a03fd14e26343103c183f102e00a8a2
OAI Identifier: oai:figshare.com:article/30324843
Record Created: 2025-10-09T22:00:33Z