Gossip learning with linear models on fully distributed data

When

22/11/2023    
10:30 am-11:30 am
Mohamed Legheraba
UPMC Sorbonne Université

Where

Room 4B01
19 place Marguerite Perey, Palaiseau

In the realm of distributed machine learning, a novel approach called Gossip Learning was introduced in 2013 in the paper "Gossip Learning with Linear Models on Fully Distributed Data" [1]. Gossip Learning uses a decentralized communication scheme among nodes, allowing them to collaboratively train a machine learning model without a central aggregator. This decentralized paradigm offers advantages in scalability, fault tolerance, and privacy preservation.

The presentation will delve into the key components of Gossip Learning, highlighting the communication protocol that enables nodes to exchange information in a gossip-like fashion, the convergence properties of Gossip Learning, and its resilience to node failures.
Additionally, the presentation will compare Gossip Learning with traditional centralized learning approaches and with federated learning approaches.
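To make the idea concrete, below is a minimal, self-contained simulation of the gossip pattern described above: each node holds private data and a linear model; in each round a node pushes its model to a random peer, which averages the received model with its own and then trains on its local data. The node count, learning rate, merge-then-update rule, and the least-squares objective are illustrative assumptions for this sketch, not the exact protocol of the paper.

```python
import random

random.seed(0)

TRUE_W, TRUE_B = 2.0, -1.0  # ground-truth line the nodes try to learn


class Node:
    def __init__(self, samples):
        self.samples = samples      # local, private data: list of (x, y) pairs
        self.w, self.b = 0.0, 0.0   # current linear model

    def update(self, lr=0.1, epochs=20):
        # a few SGD passes over the node's own data (illustrative schedule)
        for _ in range(epochs):
            for x, y in self.samples:
                err = self.w * x + self.b - y
                self.w -= lr * err * x
                self.b -= lr * err

    def receive(self, w, b):
        # merge step: average the incoming model with the local one,
        # then continue training locally -- no central aggregator involved
        self.w = (self.w + w) / 2
        self.b = (self.b + b) / 2
        self.update()


# each node holds only five private samples of the noiseless line y = 2x - 1
nodes = [
    Node([(x, TRUE_W * x + TRUE_B)
          for x in (random.uniform(-1, 1) for _ in range(5))])
    for _ in range(10)
]

for _ in range(400):  # gossip rounds
    sender = random.choice(nodes)
    peer = random.choice([n for n in nodes if n is not sender])
    peer.receive(sender.w, sender.b)  # push the model to a random peer

print(nodes[0].w, nodes[0].b)
```

Even though no node ever shares its raw data, the models circulating through the network converge toward the true parameters, which is the core appeal of the approach for privacy and fault tolerance: any node can fail or join without disrupting the others.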

References:

[1] Ormándi, R., Hegedűs, I., & Jelasity, M. (2013). Gossip learning with linear models on fully distributed data. Concurrency and Computation: Practice and Experience, 25(4), 556-571.
https://arxiv.org/abs/1109.1396