Decentralized learning enhances privacy, scalability, and fault tolerance by distributing data and computation across nodes. A popular approach is federated learning, which relies on a central aggregator and consequently faces challenges such as server vulnerabilities, scalability bottlenecks, privacy risks, and, most importantly, a single point of failure. Alternatively, gossip learning offers full decentralization through peer-to-peer exchanges of model updates, ensuring robustness and privacy at the price of slower model convergence. In our work, we introduce HEAL, a novel decentralized learning framework. By exploiting an innovative topology based on dynamic hubs, HEAL delivers performance similar to that of federated learning while retaining the decentralization and resilience of gossip learning.
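For intuition, the two baseline update patterns contrasted above can be sketched in a few lines of Python. This is a minimal illustration only, with models reduced to flat `numpy` parameter vectors and a synchronous round structure assumed for simplicity; HEAL's dynamic-hub protocol is not reproduced here.

```python
import random
import numpy as np

# Each node holds a local model, represented here as a flat parameter vector.
nodes = [np.random.randn(4) for _ in range(8)]

# Federated learning: a central aggregator averages all local models each
# round and broadcasts the result back -- a single point of failure.
def federated_round(models):
    global_model = np.mean(models, axis=0)        # central aggregation step
    return [global_model.copy() for _ in models]  # broadcast to all nodes

# Gossip learning: each node averages with one randomly chosen peer -- fully
# decentralized, but information spreads (and converges) more slowly.
def gossip_round(models):
    models = [m.copy() for m in models]
    for i in range(len(models)):
        j = random.randrange(len(models))  # pick a random peer
        avg = (models[i] + models[j]) / 2  # pairwise model exchange/merge
        models[i], models[j] = avg, avg.copy()
    return models

nodes = federated_round(nodes)  # one centralized federated round
nodes = gossip_round(nodes)     # one peer-to-peer gossip round
```

In the federated sketch every node depends on the aggregation step, so losing the aggregator halts training; in the gossip sketch any pair of live peers can still make progress, which is the robustness the paragraph above trades against convergence speed.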