Gossip learning with linear models on fully distributed data

Speaker: Mohamed Legheraba
UPMC Sorbonne Université
Date: 22/11/2023
Time: 10:30 am - 11:30 am
Location: Room 4B01


In the realm of distributed machine learning, a novel approach called Gossip Learning was introduced in 2013 in the paper "Gossip Learning with Linear Models on Fully Distributed Data" [1]. Gossip Learning relies on a decentralized communication scheme among nodes, allowing them to collaboratively train a machine learning model without the need for a central aggregator. This decentralized paradigm offers advantages in scalability, fault tolerance, and privacy preservation.

The presentation will delve into the key components of Gossip Learning, highlighting the communication protocol that enables nodes to exchange information in a gossip-like fashion, the convergence properties of the approach, and its resilience to node failures.
Additionally, the presentation will compare Gossip Learning with traditional centralized learning approaches and with federated learning approaches.
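To make the gossip-like exchange concrete, here is a minimal sketch of one common variant of the idea: every node holds a single local sample (fully distributed data) and a linear model; in each round, every node sends its model to a randomly chosen peer, and the receiver averages the incoming model with its own before taking one local SGD step. All function names, the learning rate, and the toy dataset are illustrative assumptions, not the exact algorithm of the paper.

```python
import random

def sgd_step(w, b, x, y, lr=0.1):
    # One stochastic gradient step for linear regression on a single sample.
    # (lr=0.1 is an arbitrary choice for this toy example.)
    pred = sum(wi * xi for wi, xi in zip(w, x)) + b
    err = pred - y
    w = [wi - lr * err * xi for wi, xi in zip(w, x)]
    b = b - lr * err
    return w, b

def gossip_round(models, data, rng):
    # Each node sends its model to a random peer; the receiver merges the
    # incoming model with its own by averaging, then updates on its local
    # sample. There is no central aggregator anywhere in this loop.
    for sender in range(len(models)):
        receiver = rng.randrange(len(models))
        sw, sb = models[sender]
        rw, rb = models[receiver]
        merged_w = [(a + c) / 2 for a, c in zip(sw, rw)]
        merged_b = (sb + rb) / 2
        x, y = data[receiver]
        models[receiver] = sgd_step(merged_w, merged_b, x, y)

# Toy demo: four nodes, each holding one sample of y = 2x.
rng = random.Random(0)
data = [([xv], 2.0 * xv) for xv in [0.5, 1.0, 1.5, 2.0]]
models = [([0.0], 0.0) for _ in data]
for _ in range(200):
    gossip_round(models, data, rng)
```

After enough rounds, every node's local model approximates the global fit, even though no node ever saw more than its own sample plus gossiped models. Real deployments would add asynchronous messaging and failure handling, which are exactly the aspects the talk addresses.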


[1] Ormándi, R., Hegedűs, I., & Jelasity, M. (2013). Gossip learning with linear models on fully distributed data. Concurrency and Computation: Practice and Experience, 25(4), 556-571.