Distributed Function Computation over Networks

Speaker: Derya Malak
Eurecom
Date: 12/04/2023
Time: 3:00 pm - 4:00 pm
Location: Room 4B01

Abstract

Large-scale distributed computing systems, such as MapReduce, Spark, or distributed deep networks, are critical for parallelizing the execution of computational tasks. Nevertheless, a tension between computation and communication complexity lies at the heart of distributed computing. There has recently been a substantial effort to address this problem for a class of functions, such as distributed matrix multiplication, distributed gradient coding, and linearly separable functions. The optimal cost has been achieved under certain constraints, based mainly on ideas of linear separability of the tasks and linear space intersections. Motivated by the same challenge, we propose a novel distributed computing framework where a master seeks to compute an arbitrary function of distributed datasets in an asymptotically lossless manner. Our approach exploits the notion of characteristic graphs, which were utilized by Shannon, Körner, and Witsenhausen to derive rate lower bounds for computation, and later by Alon-Orlitsky, Orlitsky-Roche, Doshi-Shah-Médard, and Feizi-Médard to resolve several well-known distributed coding and communication problems. This machinery lowers communication complexity and accommodates (a) correlated data, (b) a broad class of functions, and (c) well-known topologies. The novelty of our approach lies in accurately capturing the communication-computation cost tradeoff by melding the notions of characteristic graphs and distributed placement, providing a natural generalization of distributed linear function computation, and thus elevating distributed gradient coding and distributed linear transforms to the realm of distributed computing of arbitrary functions. In toy scenarios, we demonstrate gains of up to 70% over fully distributed solutions and an approximation ratio of 2 with respect to the optimal centralized rate.
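
The abstract does not spell out the construction, but the characteristic-graph notion it builds on is standard: vertices are source symbols, and two symbols are connected whenever some jointly possible side information makes them map to different function values, so any lossless encoder must distinguish them. The following minimal Python sketch illustrates this for a toy setup; the alphabets, the full joint support, and the modular-sum function are illustrative assumptions, not the speaker's actual model.

    from itertools import combinations

    X = range(8)                    # source alphabet (toy choice)
    Y = range(8)                    # receiver side-information alphabet
    f = lambda x, y: (x + y) % 4    # function the receiver wants
    support = lambda x, y: True     # assume full joint support here

    # Characteristic graph: connect x1, x2 if some y that is jointly
    # possible with both gives f(x1, y) != f(x2, y).
    edges = {
        (x1, x2)
        for x1, x2 in combinations(X, 2)
        if any(support(x1, y) and support(x2, y) and f(x1, y) != f(x2, y)
               for y in Y)
    }

    # A valid coloring of the graph yields a lossless encoding whose
    # alphabet size is the number of colors rather than |X|.
    colors = {}
    for x in X:
        used = {colors[n] for n in colors if (min(x, n), max(x, n)) in edges}
        colors[x] = next(c for c in range(len(X)) if c not in used)

    print(colors)  # 4 colors suffice: x mod 4 already determines f(x, y)

In this toy case the coloring collapses eight source symbols to four codewords, since symbols congruent modulo 4 never need to be distinguished; the talk's framework extends this idea to distributed placement across multiple servers.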