Preconditioned Riemannian Gradient Descent for Low-Rank Matrix Recovery Problems
Posted: 2024-03-01
  • Speaker: Jian-Feng Cai (Hong Kong University of Science and Technology)
  • Time: March 11, 2024, 10:00 a.m.
  • Venue: Room 1417, Administration Building, Zijingang Campus, Zhejiang University
  • Host: Center for Data Science, Zhejiang University

Abstract:


The problem of recovering a low-rank matrix from linear measurements arises in many fields, including machine learning, imaging, signal processing, and computer vision. Non-convex algorithms have proven to be highly effective and efficient for low-rank matrix recovery, and they admit theoretical guarantees despite the potential for local minima. This talk presents a unifying framework for non-convex low-rank matrix recovery algorithms based on Riemannian gradient descent. We show that many well-known non-convex low-rank matrix recovery algorithms can be viewed as special instances of Riemannian gradient descent under different Riemannian metrics and retraction operators. Consequently, we can identify the most suitable metrics and develop more efficient non-convex algorithms. To illustrate this, we introduce a new preconditioned Riemannian gradient descent algorithm, which accelerates matrix completion by more than a factor of ten compared to traditional methods.
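To make the idea concrete, the following is a minimal illustrative sketch (not the speaker's actual algorithm) of preconditioned gradient descent for matrix completion on the factorization X = L Rᵀ. The preconditioners (RᵀR)⁻¹ and (LᵀL)⁻¹, as used in scaled-gradient-type methods, rescale the factor gradients so that convergence is insensitive to the conditioning of the target matrix; all parameter names and defaults here are assumptions for illustration.

```python
import numpy as np

def scaled_gd_completion(M_obs, mask, rank, step=0.5, iters=200):
    """Illustrative preconditioned gradient descent for matrix completion.

    M_obs : observed matrix with unobserved entries set to zero.
    mask  : 0/1 array marking observed entries.
    rank  : target rank of the recovered matrix.
    """
    p = mask.mean()  # estimated sampling probability
    # Spectral initialization from the rescaled zero-filled observations.
    U, s, Vt = np.linalg.svd(M_obs / p, full_matrices=False)
    L = U[:, :rank] * np.sqrt(s[:rank])
    R = Vt[:rank].T * np.sqrt(s[:rank])
    for _ in range(iters):
        residual = mask * (L @ R.T - M_obs)  # error on observed entries
        grad_L = residual @ R / p
        grad_R = residual.T @ L / p
        # Preconditioning: right-multiply each gradient by the inverse
        # Gram matrix of the other factor (computed at the old iterate).
        pre_L = np.linalg.inv(R.T @ R)
        pre_R = np.linalg.inv(L.T @ L)
        L, R = L - step * grad_L @ pre_L, R - step * grad_R @ pre_R
    return L @ R.T
```

With the preconditioners replaced by identity matrices this reduces to plain factored gradient descent, whose step size and convergence rate degrade with the condition number of the target; the preconditioned iteration avoids that dependence at the cost of two small rank-by-rank inversions per step.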


Lecture poster