Redundancy and Resilience in Distributed Optimization

Speaker

Shuo Liu, February 24, 2023.

Abstract

Distributed optimization has gained significant attention in recent years. In this setting, each agent in the system holds a local cost function, and the goal is to design an algorithm that allows the agents to collaboratively minimize the aggregate cost over all agents. Distributed machine learning is an important application of this problem.
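
For reference, with $n$ agents and local cost functions $f_1, \dots, f_n$, this is the standard aggregate-cost minimization problem:

$$
\min_{x \in \mathbb{R}^d} \; \sum_{i=1}^{n} f_i(x),
$$

where $x$ is the shared decision variable (e.g., the model parameters in distributed machine learning).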

We consider the case where the system may contain faulty agents and/or slow agents (stragglers). We discuss the impact of faulty or slow agents on distributed optimization, and how redundancy in the cost functions can be exploited to achieve resilient distributed optimization (RDO). We define a measure of the resilience of optimization algorithms, namely (f, r; ε)-resilience, and a corresponding notion of redundancy, namely (f, r; ε)-redundancy. We demonstrate the necessity of (f, r; ε)-redundancy, and present a practical algorithm for resilient distributed optimization.
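
The talk's specific algorithm is not reproduced here. As a rough illustration of the general idea, the sketch below shows server-based gradient descent with a robust (coordinate-wise trimmed-mean) aggregation step that tolerates up to `f` arbitrarily faulty agents. The function names and parameters (`trimmed_mean_aggregate`, `resilient_gradient_descent`, `f`, `step`) are illustrative assumptions, not the method presented in the talk.

```python
import numpy as np

def trimmed_mean_aggregate(gradients, f):
    """Coordinate-wise trimmed mean: in each coordinate, drop the f largest
    and f smallest reported values, then average the rest. A common robust
    aggregation rule; used here only as an example."""
    grads = np.asarray(gradients, dtype=float)   # shape: (n_agents, dim)
    n = grads.shape[0]
    assert n > 2 * f, "need more than 2f agents to trim f from each side"
    sorted_grads = np.sort(grads, axis=0)        # sort each coordinate independently
    return sorted_grads[f:n - f].mean(axis=0)

def resilient_gradient_descent(local_grad_fns, x0, f, step=0.1, iters=200):
    """Server-side loop: collect one gradient per agent (some possibly faulty),
    aggregate robustly, and take a gradient step."""
    x = np.array(x0, dtype=float)
    for _ in range(iters):
        grads = [g(x) for g in local_grad_fns]   # in practice, received over the network
        x = x - step * trimmed_mean_aggregate(grads, f)
    return x

# Toy usage: four honest agents with identical quadratic costs ||x - c||^2
# (full redundancy) and one faulty agent reporting an arbitrary gradient.
centers = [np.array([1.0, 1.0])] * 4
honest = [(lambda x, c=c: 2 * (x - c)) for c in centers]
faulty = [lambda x: np.array([1e6, -1e6])]       # adversarial gradient
x_hat = resilient_gradient_descent(honest + faulty, x0=[0.0, 0.0], f=1)
```

With `f = 1` and five agents, the trimmed mean discards the extreme coordinate values contributed by the faulty agent, so the iterates still converge near the honest agents' common minimizer.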



