Abstract
Foundation Models (FMs) have revolutionized machine learning with their adaptability and high performance across tasks; yet, their integration into Federated Learning (FL) is challenging due to substantial communication overhead from their extensive parameterization. Current communication-efficient FL strategies, such as gradient compression, reduce bitrates to around 1 bit-per-parameter (bpp). However, these approaches fail to harness the characteristics of FMs, whose large number of parameters still poses a challenge to communication efficiency, even at these bitrate regimes. In this work, we present DeltaMask, a novel method that efficiently fine-tunes FMs in FL at an ultra-low bitrate, well below 1 bpp. DeltaMask employs stochastic masking to detect highly effective subnetworks within FMs and leverages the stochasticity and sparsity of client masks to compress updates into a compact grayscale image using probabilistic filters, deviating from traditional weight-training approaches. Our comprehensive evaluations across various datasets and architectures demonstrate that DeltaMask efficiently achieves bitrates as low as 0.09 bpp, enhancing communication efficiency while maintaining FM performance, as measured on 8 datasets and 5 pre-trained models of various network architectures.
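To give intuition for how a sub-1-bpp bitrate is possible, the sketch below illustrates the core idea of stochastic masking described in the abstract: clients train per-parameter keep probabilities, sample a sparse binary mask from them, and transmit only the mask. This is a minimal illustration, not the paper's implementation; the function names (`sample_mask`, `entropy_bpp`) are invented for this example, and DeltaMask's actual probabilistic-filter and grayscale-image encoding is omitted. For a sparse Bernoulli mask, the Shannon entropy already bounds the cost well below 1 bpp.

```python
import math
import random

def sample_mask(scores, seed=0):
    """Sample a binary mask m_i ~ Bernoulli(p_i) from per-parameter
    keep probabilities, as in stochastic (probabilistic) masking."""
    rng = random.Random(seed)
    return [1 if rng.random() < p else 0 for p in scores]

def entropy_bpp(density):
    """Shannon lower bound on bits-per-parameter for an i.i.d. binary
    mask of the given density: H(p) = -p*log2(p) - (1-p)*log2(1-p)."""
    p = density
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# A 1%-dense mask over 1,000,000 parameters: the entropy bound is
# roughly 0.08 bpp, an order of magnitude below dense 1-bpp coding.
scores = [0.01] * 1_000_000
mask = sample_mask(scores)
density = sum(mask) / len(mask)
print(f"mask density: {density:.4f}")
print(f"entropy bound: {entropy_bpp(density):.3f} bpp")
```

In practice an entropy-approaching code (e.g. the probabilistic filters used in the paper) is needed to realize this bound; the sketch only shows why sparse stochastic masks make ultra-low bitrates attainable in principle.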
| Original language | English |
|---|---|
| Title of host publication | 2024 IEEE International Conference on Big Data, IEEE BigData 2024 |
| Editors | Wei Ding, Chang-Tien Lu, Fusheng Wang, Liping Di, Kesheng Wu, Jun Huan, Raghu Nambiar, Jundong Li, Filip Ilievski, Ricardo Baeza-Yates, Xiaohua Hu |
| Publisher | Institute of Electrical and Electronics Engineers |
| Pages | 8042-8051 |
| Number of pages | 10 |
| ISBN (Electronic) | 979-8-3503-6248-0 |
| DOIs | |
| Publication status | Published - 16 Jan 2025 |
| Event | 2024 IEEE International Conference on Big Data, IEEE BigData 2024 - Washington DC, United States. Duration: 15 Dec 2024 → 18 Dec 2024 |
Conference
| Conference | 2024 IEEE International Conference on Big Data, IEEE BigData 2024 |
|---|---|
| Abbreviated title | IEEE BigData 2024 |
| Country/Territory | United States |
| City | Washington DC |
| Period | 15/12/24 → 18/12/24 |
Funding
V.T.'s research is funded by the DAIS project, which has received funding from the KDT JU under grant agreement No 101007273.
Keywords
- Federated Learning
- foundation models
- probabilistic masking
- probabilistic filters
- fine-tuning