PhD Student Seminar 2024 [10]
Posted: 2024-11-21
  • Speaker: 叶舟夫
  • Time: November 26, 2024, 15:30
  • Venue: Lecture Hall 1417, Administration Building, Zijingang Campus, Zhejiang University

Paper presented: On non-redundant and linear operator-based nonlinear dimension reduction


Abstract: Kernel principal component analysis (KPCA), a popular nonlinear dimension reduction technique, aims at finding a basis of a presumed low-dimensional function space. This causes a redundancy issue: each kernel principal component can be a measurable function of the preceding components, which harms the effectiveness of dimension reduction and leaves the dimension of the reduced data as a heuristic choice. In this paper, we formulate the parameter of interest for nonlinear dimension reduction as a small function set that generates the σ-field of the original data. Using a novel characterization of near conditional mean independence, we propose two sequential unsupervised dimension reduction methods. Our methods tackle the redundancy issue, maintain the same level of computational complexity as KPCA, and rely on more plausible assumptions regarding the singularity of the original data. By constructing a measure of the exhaustiveness of the reduced data, we also provide consistent order determination for these methods. Some supportive numerical studies are presented at the end.
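
To make the redundancy issue concrete, the following short Python sketch (added here for context; it is not the method proposed in the paper) runs standard scikit-learn KPCA on a one-dimensional sample and checks how well the second kernel principal component can be predicted from the first. The data, kernel, bandwidth, and the nearest-neighbour smoother are all illustrative assumptions.

# A minimal sketch for context only -- plain scikit-learn KPCA, not the
# sequential methods proposed in the paper. With a one-dimensional input,
# every kernel principal component is a function of x, so if the first
# component is (nearly) injective in x, later components are (nearly)
# measurable functions of it and add little new information.
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=(500, 1))        # one-dimensional input data

# Extract the first two kernel principal components with an RBF kernel
# (kernel and gamma are arbitrary choices for illustration).
kpca = KernelPCA(n_components=2, kernel="rbf", gamma=1.0)
Z = kpca.fit_transform(x)

# Regress the 2nd component on the 1st with a nonparametric smoother.
# An R^2 close to 1 means the 2nd component is (almost) a measurable
# function of the 1st -- the redundancy issue the abstract refers to.
reg = KNeighborsRegressor(n_neighbors=15).fit(Z[:, [0]], Z[:, 1])
print("R^2 of KPC2 regressed on KPC1:", round(reg.score(Z[:, [0]], Z[:, 1]), 3))

If the reported R^2 is close to 1, the second component carries essentially no information beyond the first, which is exactly the redundancy the talk's proposed methods are designed to avoid.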