
# Communication-efficient distributed optimization

As distributed computation becomes more common, the need for efficient distributed algorithms grows with it. The benefit of distributing a computation is clear: combining the power of many machines reduces computation time and makes it possible to process data with a very large number of features. However, network communication introduces significant overhead. New optimization methods and algorithms aim to reduce this communication cost.

## The goal

This work aims to implement and compare some of the most popular distributed convex optimization algorithms:

- ADMM (centralized),
- DANE (centralized),
- Network-DANE (decentralized),
- Network-SARAH (decentralized),
- etc.

in solving the problem of multi-label classification on Fashion-MNIST.
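
The repository's actual implementations live in the source tree. As an illustration only, here is a minimal NumPy sketch of the decentralized pattern that Network-DANE and Network-SARAH share: a gossip (mixing) step over a communication graph, followed by local gradient work, shown for softmax regression on synthetic data. The ring topology, step size, iteration count, and data shapes are illustrative assumptions, not the repository's configuration.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over class scores.
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def local_gradient(W, X, Y):
    # Cross-entropy gradient for softmax regression on one node's data shard.
    P = softmax(X @ W)
    return X.T @ (P - Y) / len(X)

rng = np.random.default_rng(0)
n_nodes, d, k = 4, 784, 10                                # nodes, 28x28 features, classes
X = [rng.normal(size=(100, d)) for _ in range(n_nodes)]   # synthetic stand-in for Fashion-MNIST shards
labels = [rng.integers(k, size=100) for _ in range(n_nodes)]
Y = [np.eye(k)[y] for y in labels]                        # one-hot targets

# Doubly stochastic mixing matrix for a ring topology (illustrative choice).
M = np.zeros((n_nodes, n_nodes))
for i in range(n_nodes):
    M[i, i] = 0.5
    M[i, (i - 1) % n_nodes] = 0.25
    M[i, (i + 1) % n_nodes] = 0.25

W = [np.zeros((d, k)) for _ in range(n_nodes)]
lr = 0.1
for _ in range(50):
    # Communication: each node averages parameters with its ring neighbours.
    W = [sum(M[i, j] * W[j] for j in range(n_nodes)) for i in range(n_nodes)]
    # Computation: one local gradient step on the node's own shard.
    W = [W[i] - lr * local_gradient(W[i], X[i], Y[i]) for i in range(n_nodes)]

acc = np.mean([(softmax(X[i] @ W[i]).argmax(axis=1) == labels[i]).mean()
               for i in range(n_nodes)])
print(f"mean local training accuracy: {acc:.2f}")
```

In the actual methods the local step is richer: Network-DANE has each node solve a DANE-style local subproblem after mixing, and Network-SARAH uses recursive variance-reduced gradient estimates. The mixing step is the ingredient that keeps communication cheap in both.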

## The results

You can read the detailed report in `docs/report.pdf`.

## Running the benchmark

To run the benchmark:

1. Create a virtual environment:

   ```sh
   virtualenv .venv
   source .venv/bin/activate
   ```

2. Install the requirements using pip:

   ```sh
   pip3 install -r requirements.txt
   ```

3. Run the main script:

   ```sh
   python3 main.py
   ```

## Contributors

## Sources
