LEAPS-MPS: Hadamard Deep Autoencoders and Alternating Directional Methods of Multipliers for Manifold Learning Enabled Distance Preserving Matrix Completion

Information

  • Award Id
    2418826
  • Award Effective Date
    9/1/2024
  • Award Expiration Date
    8/31/2026
  • Award Amount
    $249,926.00
  • Award Instrument
    Standard Grant

Abstract

Data sampled from real-world applications such as image inpainting (e.g., reconstructing the region of a face covered by a criminal's mask) and recommender systems (e.g., suggesting products based on a user's Amazon purchase history) often contains partially observed or unobserved entries; thus, estimating those entries before any analysis is vital. The technique of recovering the missing entries of a data set, specifically a data matrix, is known as Matrix Completion (MC), in which the data matrix is decomposed into a low-rank component representing features of the data and a sparse component representing anomalies and noise. However, conventional MC frameworks have limited transferability and robustness when applied across diverse domains because they do not account for the natural correlation of the data. Thus, the principal investigator (PI) will develop a highly transferable and robust MC framework that harnesses this natural correlation. The optimization scheme of the MC framework is implemented numerically using both an efficient algebraic approach and a high-precision Deep Neural Network (DNN) approach. The performance of the framework is validated through theoretical analysis as well as on synthetic and real-world benchmark datasets.

Real-world data with natural correlation lies on low-rank nonlinear manifolds; thus, robust MC methods should guarantee the manifold's primary characteristic, distance preservation, within the low-rank component of the data matrix, which in turn yields careful retention of sparse information within the sparse component. The PI will develop the MC model on a new mathematical foundation that assures these characteristics in the decomposed low-rank and sparse components of the data matrix. In particular, the method takes training data as bounds in any distance of interest (e.g., geodesic, Hamming, hop), which allows observed, unobserved, and fully observed data instances to be incorporated, and produces the recovered matrix in the same type of distance. The PI adopts the truncated nuclear norm convex relaxation on the low-rank component of the data matrix as a surrogate for the non-convex and discontinuous truncated-rank minimization. The distance-preserving ability of the nonlinear manifold is attained by adding a constraint to the optimization scheme that enforces positive semi-definiteness of the Gramian matrix of the low-rank component. Anomalies in real-world data are structured; thus, the PI extracts the sparse component by minimizing both the square-integrable and integrable norms, in contrast to using only the integrable norm. This MC model is implemented numerically by the algebraic approach of Alternating Directional Methods of Multipliers and the DNN approach of Hadamard Deep Autoencoders. This project is jointly funded by the Launching Early-Career Academic Pathways in the Mathematical and Physical Sciences (LEAPS-MPS) program and the Established Program to Stimulate Competitive Research (EPSCoR).

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
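The algebraic solver named in the abstract builds on standard low-rank-plus-sparse decomposition machinery. As a rough illustration only, the sketch below implements a plain nuclear-norm plus l1 version of that decomposition for a partially observed matrix using an ADMM-style (inexact augmented Lagrangian) iteration; it omits the truncated nuclear norm, the distance-preserving positive semi-definite Gramian constraint, and the combined square-integrable/integrable sparse penalty that the award proposes. The function names (svt, soft_threshold, complete_low_rank_sparse) and all parameter choices are illustrative assumptions, not part of the award.

```python
import numpy as np

def svt(X, tau):
    """Singular value thresholding: proximal operator of tau * nuclear norm."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def soft_threshold(X, tau):
    """Elementwise soft thresholding: proximal operator of tau * l1 norm."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def complete_low_rank_sparse(M, mask, lam=None, mu=1.0, n_iter=300, tol=1e-7):
    """Split a partially observed matrix M into low-rank L and sparse S.

    Approximately solves  min ||L||_* + lam * ||P_mask(S)||_1
                          s.t. L + S = M on the observed entries,
    where mask is a boolean array of observed positions. Unobserved
    entries of S are left unpenalized so they absorb the fill values,
    and L restricted to those positions is the completion estimate.
    """
    if lam is None:
        lam = 1.0 / np.sqrt(max(M.shape))
    M_fill = np.where(mask, M, 0.0)          # zero-fill unobserved entries
    L = np.zeros_like(M_fill)
    S = np.zeros_like(M_fill)
    Y = np.zeros_like(M_fill)                # scaled dual variable
    for _ in range(n_iter):
        # L-update: singular value thresholding of the residual
        L = svt(M_fill - S + Y / mu, 1.0 / mu)
        # S-update: shrink on observed entries, pass through elsewhere
        G = M_fill - L + Y / mu
        S = np.where(mask, soft_threshold(G, lam / mu), G)
        # Dual ascent on the equality constraint
        R = M_fill - L - S
        Y = Y + mu * R
        if np.linalg.norm(R) <= tol * max(np.linalg.norm(M_fill), 1.0):
            break
    return L, S

# Tiny synthetic check: a rank-5 matrix with about 60% of entries observed.
rng = np.random.default_rng(0)
truth = rng.standard_normal((100, 5)) @ rng.standard_normal((5, 80))
mask = rng.random(truth.shape) < 0.6
L, S = complete_low_rank_sparse(truth, mask)
```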
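The DNN solver is named only as "Hadamard Deep Autoencoders"; its architecture is not specified in the abstract. The sketch below is written under the assumption that "Hadamard" refers to elementwise (Hadamard-product) masking of an autoencoder's reconstruction loss by the observation pattern, so the network is trained only on observed entries and the completed matrix is read off its reconstructions. The class and function names (MaskedAutoencoder, masked_mse, complete_with_autoencoder) and all hyperparameters are illustrative, not the PI's design.

```python
import torch
import torch.nn as nn

class MaskedAutoencoder(nn.Module):
    """A small dense autoencoder applied row-wise to a data matrix."""
    def __init__(self, n_features, latent_dim=16, hidden=64):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(n_features, hidden), nn.ReLU(),
            nn.Linear(hidden, latent_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, n_features),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

def masked_mse(x_hat, x, mask):
    """Mean squared reconstruction error over observed entries only.

    The Hadamard product with the mask zeroes out the contribution of
    unobserved entries, so gradients flow only from observed data.
    """
    diff = (x_hat - x) * mask
    return diff.pow(2).sum() / mask.sum().clamp(min=1.0)

def complete_with_autoencoder(M, mask, epochs=500, lr=1e-3):
    """Train on the observed entries of M and return the reconstruction."""
    X = torch.as_tensor(M, dtype=torch.float32)
    W = torch.as_tensor(mask, dtype=torch.float32)
    X = X * W                                # zero-fill unobserved entries
    model = MaskedAutoencoder(X.shape[1])
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        loss = masked_mse(model(X), X, W)
        loss.backward()
        opt.step()
    with torch.no_grad():
        return model(X).numpy()              # completed-matrix estimate
```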

  • Program Officer
    Dmitry Golovaty, dgolovat@nsf.gov, (703) 292-2117
  • Min Amd Letter Date
    8/23/2024
  • Max Amd Letter Date
    8/23/2024
  • ARRA Amount

Institutions

  • Name
    University of Rhode Island
  • City
    KINGSTON
  • State
    RI
  • Country
    United States
  • Address
    75 LOWER COLLEGE RD RM 103
  • Postal Code
    02881-1974
  • Phone Number
    (401) 874-2635

Investigators

  • First Name
    Kelum
  • Last Name
    Gajamannage
  • Email Address
    kelumdi@gmail.com
  • Start Date
    8/23/2024

Program Element

  • Text
    LEAPS-MPS
  • Text
    EPSCoR Co-Funding
  • Code
    915000

Program Reference

  • Text
    EXP PROG TO STIM COMP RES
  • Code
    9150