Collaborative Research: III: Small: Foundations for Trustworthy Decentralized Federated Learning

Information

  • Award Id
    2416607
  • Award Effective Date
    8/15/2024
  • Award Expiration Date
    7/31/2027
  • Award Amount
    $219,332.00
  • Award Instrument
    Standard Grant

Abstract

Decentralized Federated Learning (DFL) has emerged as a new learning paradigm in artificial intelligence, enabling the training of data-hungry learning models on local devices without sharing raw data. This paradigm avoids the single point of failure of a central server and the privacy perils posed by a dishonest server. However, the understanding of DFL is still in its infancy. It is unclear how the decentralized and periodic communication strategy affects the convergence of DFL algorithms, especially for emerging machine learning models whose corresponding optimization problems have complicated structures, such as bilevel optimization. Furthermore, peer-to-peer communication in DFL introduces unique security risks, stemming from the combination of malicious users and device-to-device communication patterns. This project aims to design and develop a secure and efficient DFL system that addresses these communication, computation, and security issues. The project will benefit a variety of high-impact applications where machine learning models are trained in a DFL setting without sharing raw data.

This project aims to develop computational theories, models, and prototype systems that form the foundations for trustworthy DFL, considering both high performance and security with privacy preservation. The first research focus is to develop structured communication topologies and patterns, underpinned by mathematical graph theory and empirical computer-network techniques, that favor efficient and robust communication. The second focus is to investigate bilevel optimization problems for emerging machine learning models in DFL: efficient stochastic bilevel optimization algorithms will be developed, and their theoretical convergence foundations in DFL will be established. To provide security guarantees, unique security threats to DFL will be thoroughly investigated and principled defense strategies will be developed accordingly. Beyond these foundational aspects, the project will apply the developed techniques to practical data mining applications in Internet-of-Things networks and Smart Transportation, addressing the unique challenges therein and providing practical solutions that benefit real-world applications. Moreover, the team will integrate the proposed research into several courses and provide abundant research activities for both undergraduate and graduate students with diverse backgrounds.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
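For readers unfamiliar with the training pattern the abstract describes, below is a minimal, illustrative sketch of decentralized learning with periodic peer-to-peer (gossip) averaging over a ring topology. It is not the project's algorithm; the least-squares objective, ring mixing matrix, hyperparameters, and all variable names are assumptions chosen only to make the sketch self-contained.

import numpy as np

rng = np.random.default_rng(0)
n_nodes, dim, lr, rounds, local_steps = 5, 10, 0.05, 50, 3

# Each node holds private data (A_i, b_i) that never leaves the device.
data = [(rng.normal(size=(20, dim)), rng.normal(size=20)) for _ in range(n_nodes)]
models = [np.zeros(dim) for _ in range(n_nodes)]

# Doubly stochastic mixing matrix for a ring: each node averages with its two neighbors.
W = np.zeros((n_nodes, n_nodes))
for i in range(n_nodes):
    W[i, i] = W[i, (i - 1) % n_nodes] = W[i, (i + 1) % n_nodes] = 1.0 / 3.0

for _ in range(rounds):
    # Local computation: a few gradient steps on each node's own least-squares loss.
    for i, (A, b) in enumerate(data):
        for _ in range(local_steps):
            grad = A.T @ (A @ models[i] - b) / len(b)
            models[i] = models[i] - lr * grad
    # Periodic peer-to-peer communication: each node mixes models with its ring neighbors only.
    models = [sum(W[i, j] * models[j] for j in range(n_nodes)) for i in range(n_nodes)]

avg = np.mean(models, axis=0)
print("consensus gap:", max(np.linalg.norm(m - avg) for m in models))

In this sketch each node alternates local gradient steps on its private data with a mixing step restricted to its neighbors, so no raw data is shared and no central server is involved; the printed consensus gap measures how far the nodes' models are from agreement after training.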

  • Program Officer
    Sylvia Spengler
    sspengle@nsf.gov
    (703) 292-7347
  • Min Amd Letter Date
    8/12/2024
  • Max Amd Letter Date
    8/12/2024
  • ARRA Amount

Institutions

  • Name
    Temple University
  • City
    PHILADELPHIA
  • State
    PA
  • Country
    United States
  • Address
    1805 N BROAD ST
  • Postal Code
    19122-6104
  • Phone Number
    (215) 707-7547

Investigators

  • First Name
    Hongchang
  • Last Name
    Gao
  • Email Address
    hongchang.gao@temple.edu
  • Start Date
    8/12/2024

Program Element

  • Text
    Info Integration & Informatics
  • Code
    736400

Program Reference

  • Text
    INFO INTEGRATION & INFORMATICS
  • Code
    7364
  • Text
    SMALL PROJECT
  • Code
    7923