DeltaMask: Minimizing Communication in Federated Fine-Tuning of Vision Foundation Models

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

2 Citations (Scopus)
85 Downloads (Pure)

Abstract

Foundation Models (FMs) have revolutionized machine learning with their adaptability and high performance across tasks; yet, their integration into Federated Learning (FL) is challenging due to the substantial communication overhead caused by their extensive parameterization. Current communication-efficient FL strategies, such as gradient compression, reduce bitrates to around 1 bit per parameter (bpp). However, these approaches fail to exploit the characteristics of FMs, whose large number of parameters still poses a challenge to communication efficiency even at these bitrate regimes. In this work, we present DeltaMask, a novel method that efficiently fine-tunes FMs in FL at an ultra-low bitrate, well below 1 bpp. Deviating from traditional weight-training approaches, DeltaMask employs stochastic masking to detect highly effective subnetworks within FMs and leverages the stochasticity and sparsity of client masks to compress updates into a compact grayscale image using probabilistic filters. Our comprehensive evaluation, covering 8 datasets and 5 pre-trained models of various network architectures, demonstrates that DeltaMask achieves bitrates as low as 0.09 bpp, enhancing communication efficiency while maintaining FM performance.
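The sub-1 bpp figure can be made concrete with a toy sketch of the two ingredients the abstract names: sampling a sparse binary mask from per-parameter scores, and encoding the set of active indices in a probabilistic filter (here a plain Bloom filter) so the client transmits far fewer bits than one per parameter. All names, sizes, and the filter choice below are illustrative assumptions; the paper's actual grayscale-image encoding is not reproduced.

```python
import hashlib
import numpy as np

def sample_mask(scores, rng):
    # Bernoulli-sample a binary mask from per-parameter keep-probabilities in [0, 1]
    return (rng.random(scores.shape) < scores).astype(np.uint8)

class BloomFilter:
    """Minimal Bloom filter: no false negatives, tunable false-positive rate."""

    def __init__(self, n_bits, n_hashes):
        self.n_bits = n_bits
        self.n_hashes = n_hashes
        self.bits = np.zeros(n_bits, dtype=np.uint8)

    def _positions(self, item):
        # Derive n_hashes bit positions from salted SHA-256 digests
        for i in range(self.n_hashes):
            h = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % self.n_bits

    def add(self, item):
        for p in self._positions(item):
            self.bits[p] = 1

    def __contains__(self, item):
        return all(self.bits[p] for p in self._positions(item))

# Toy demo (hypothetical sizes): 10,000 parameters, ~5% kept by the mask
rng = np.random.default_rng(0)
scores = np.where(rng.random(10_000) < 0.05, 0.9, 0.0)  # sparse score vector
mask = sample_mask(scores, rng)
active = np.flatnonzero(mask)

# Size the filter at ~10 bits per active index: far below 1 bit per parameter
bf = BloomFilter(n_bits=10 * max(len(active), 1), n_hashes=4)
for idx in active:
    bf.add(int(idx))

bpp = bf.n_bits / scores.size  # bits transmitted per model parameter
```

Because a Bloom filter admits no false negatives, the server recovers every active index (at the cost of a small, tunable false-positive rate), while the payload scales with the number of *active* parameters rather than the full model size.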
Original language: English
Title of host publication: 2024 IEEE International Conference on Big Data, IEEE BigData 2024
Editors: Wei Ding, Chang-Tien Lu, Fusheng Wang, Liping Di, Kesheng Wu, Jun Huan, Raghu Nambiar, Jundong Li, Filip Ilievski, Ricardo Baeza-Yates, Xiaohua Hu
Publisher: Institute of Electrical and Electronics Engineers
Pages: 8042-8051
Number of pages: 10
ISBN (Electronic): 979-8-3503-6248-0
DOIs
Publication status: Published - 16 Jan 2025
Event: 2024 IEEE International Conference on Big Data, IEEE BigData 2024 - Washington DC, United States
Duration: 15 Dec 2024 - 18 Dec 2024

Conference

Conference: 2024 IEEE International Conference on Big Data, IEEE BigData 2024
Abbreviated title: IEEE BigData 2024
Country/Territory: United States
City: Washington DC
Period: 15/12/24 - 18/12/24

Funding

V.T.'s research is funded by the DAIS project, which has received funding from the KDT JU under grant agreement No. 101007273.

Keywords

  • Federated Learning
  • foundation models
  • probabilistic masking
  • probabilistic filters
  • fine-tuning

Promotion: time and place

  • December 15-18, 2024, Washington DC, USA
