Abstract
Federated Learning (FL) is a distributed machine learning paradigm that enables learning models from decentralized private datasets, where the labeling effort is entrusted to the clients. While most existing FL approaches assume that high-quality labels are readily available on users' devices, in reality label noise can naturally occur in FL and follows a non-i.i.d. distribution among clients. Owing to these "non-iid-ness" challenges, existing state-of-the-art centralized approaches exhibit unsatisfactory performance, while previous FL studies rely on data exchange or repeated server-side assistance to improve the model's performance. Here, we propose FedLN, a framework to deal with label noise across different FL training stages, namely FL initialization and server-side model aggregation. Extensive experiments on various publicly available vision and audio datasets demonstrate an improvement of 24% on average compared to state-of-the-art methods at a label noise level of 70%.
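To make the server-side aggregation stage concrete, below is a minimal illustrative sketch of noise-aware weighted averaging, in which each client's contribution is scaled down by an estimated per-client label-noise level. The weighting rule, the helper name `aggregate`, and the representation of client updates as dictionaries of NumPy arrays are assumptions made for illustration only, not FedLN's actual algorithm.

```python
# Illustrative sketch (assumption, not FedLN's method): federated averaging
# in which each client's share is scaled by its dataset size and by
# (1 - estimated label-noise level), so noisier clients contribute less.
import numpy as np

def aggregate(client_weights, client_sizes, noise_estimates):
    """Average client model weights as a convex combination whose
    coefficients shrink with the estimated fraction of noisy labels."""
    scores = np.array(
        [n * (1.0 - q) for n, q in zip(client_sizes, noise_estimates)],
        dtype=np.float64,
    )
    coeffs = scores / scores.sum()  # normalize to sum to one

    aggregated = {}
    for name in client_weights[0]:
        aggregated[name] = sum(c * w[name] for c, w in zip(coeffs, client_weights))
    return aggregated

# Toy usage: three clients, the last one with heavily corrupted labels.
clients = [{"layer.w": np.full((2, 2), v)} for v in (1.0, 2.0, 10.0)]
sizes = [100, 100, 100]
noise = [0.1, 0.2, 0.9]  # estimated fraction of noisy labels per client
print(aggregate(clients, sizes, noise)["layer.w"])
```

In this sketch the heavily corrupted third client receives only a small aggregation coefficient, which is the general intuition behind down-weighting noisy clients during server-side model aggregation.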
| Original language | English |
|---|---|
| Publication status | Published - 1 Dec 2022 |
| Event | 36th Conference on Neural Information Processing Systems, NeurIPS 2022 - Hybrid, New Orleans, United States<br>Duration: 28 Nov 2022 → 9 Dec 2022<br>Conference number: 36 |
Conference
| Conference | 36th Conference on Neural Information Processing Systems, NeurIPS 2022 |
|---|---|
| Abbreviated title | NeurIPS 2022 |
| Country/Territory | United States |
| City | New Orleans |
| Period | 28/11/22 → 9/12/22 |
Keywords
- federated learning
- noisy labels
- label correction
- deep learning