Authors: C. Chatzikonstantinou, A. Psaltis, C. Z. Patrikakis, P. Daras
Year: 2024
Venue: Washington, USA
In this work, a novel weighted aggregation method for federated distillation is proposed. Specifically, an algorithm designed for effective learning in distributed environments is introduced. The algorithm comprises a federated distillation scheme in which model outputs are aggregated under the coordination of a global server model. On the server side, feature mixing is employed to aggregate client representations before they are transmitted back to the clients for knowledge distillation. During feature mixing, a weight factor is assigned to each client's logits, penalizing low-quality clients and preserving the credibility of the system. Thorough experimentation has been conducted to study the problem comprehensively. Key findings reveal the significant potential of the proposed solution, which achieves robust performance in a federated setting while reducing communication costs.
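To make the server-side step concrete, the sketch below illustrates one plausible form of weighted logit mixing. The abstract does not specify the paper's exact weighting rule, so the scoring here is an assumption: each client is weighted by how closely its logits agree with the unweighted consensus, so outlying (low-quality) clients are down-weighted rather than excluded. The function name `aggregate_client_logits` and the softmax-over-distances weighting are hypothetical, not the authors' method.

```python
import numpy as np

def aggregate_client_logits(client_logits, temperature=1.0):
    """Mix per-client logits into global soft targets for distillation.

    client_logits: array of shape (n_clients, n_samples, n_classes),
    e.g. client predictions on a shared proxy dataset.
    """
    client_logits = np.asarray(client_logits, dtype=np.float64)
    mean_logits = client_logits.mean(axis=0)  # unweighted consensus estimate
    # Distance of each client from the consensus (lower = better agreement).
    dists = np.linalg.norm(
        (client_logits - mean_logits).reshape(len(client_logits), -1), axis=1
    )
    # Assumed weighting: softmax over negative distances, so low-quality
    # clients are penalized but still contribute a small amount.
    w = np.exp(-dists / (dists.std() + 1e-8))
    w /= w.sum()
    # Weighted mixture of client logits -> aggregated server logits.
    mixed = np.tensordot(w, client_logits, axes=1)
    # Soften with a distillation temperature before sending back to clients.
    z = mixed / temperature
    z -= z.max(axis=-1, keepdims=True)  # numerical stability
    probs = np.exp(z)
    return probs / probs.sum(axis=-1, keepdims=True)

# Example: three clients, one producing noisy (low-quality) logits.
rng = np.random.default_rng(0)
good = rng.normal(size=(4, 10))
clients = np.stack([
    good + rng.normal(scale=0.1, size=good.shape),
    good + rng.normal(scale=0.1, size=good.shape),
    rng.normal(scale=5.0, size=good.shape),  # outlier client
])
soft_targets = aggregate_client_logits(clients, temperature=2.0)
print(soft_targets.shape)  # (4, 10): per-sample soft targets for client-side distillation
```

Because only logits on a shared dataset are exchanged rather than full model weights, a scheme of this shape keeps communication costs low, consistent with the reduction reported above.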