Opportunistic Federated Learning: An Exploration of Egocentric Collaboration for Pervasive Computing Applications

Speaker: June Pyo Jung
Télécom - Paris
Date: 25/10/2023
Time: 10:30 am - 11:30 am
Location: Salle 4B01

Abstract

Federated learning (FL) is an emerging distributed learning technique in which models are trained on data collected by user devices in resource-constrained settings while preserving user privacy. However, FL has three main limitations. First, the parameter server (PS), which aggregates the local models trained on local user data, is typically far from users; this distance burdens the links between the PS and local nodes, increasing the consumption of network and computing resources. Second, user device resources are limited, yet this constraint is not taken into account when training the local model and transmitting the model parameters. Third, the PS-side links tend to become highly loaded as the number of participating clients grows, and they become congested owing to the large size of the model parameters.
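To make the PS-side aggregation step concrete, the following is a minimal sketch of the standard weighted averaging used in FL (in the style of FedAvg). The function name and variables are illustrative assumptions, not the speaker's actual method:

```python
# Hypothetical sketch of parameter-server aggregation (FedAvg-style).
# Each client sends its locally trained parameters plus its data count;
# the PS returns their data-size-weighted average.
import numpy as np

def aggregate(local_models, num_samples):
    """Parameter server: average client models, weighted by local data size."""
    total = sum(num_samples)
    return sum(n / total * w for w, n in zip(local_models, num_samples))

# Two clients, one holding 3x more data than the other.
clients = [np.array([1.0, 2.0]), np.array([3.0, 4.0])]
global_model = aggregate(clients, num_samples=[1, 3])
# → array([2.5, 3.5])
```

Note that every round requires each client to upload its full parameter vector, which is exactly why the PS-side links become the bottleneck described above.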
One variant, opportunistic federated learning, takes a different approach: individual devices belonging to different users seek to learn robust models that are personalized to their users' own experiences. Rather than learning in isolation, however, these models incorporate the learned experiences of other devices they encounter opportunistically.