Adaptive methods

DAdam: A Consensus-Based Distributed Adaptive Gradient Method for Online Optimization

Adaptive optimization methods, such as AdaGrad, RMSProp, and Adam, are widely used for solving large-scale machine learning problems. A number of schemes have been proposed in the literature that aim to parallelize them, based on communications …
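To make the baseline concrete, here is a minimal sketch of the standard (single-machine) Adam update that such distributed schemes build on; the function and hyperparameter names are illustrative, not taken from the paper:

```python
import math

def adam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a scalar parameter theta given gradient grad (step index t >= 1)."""
    m = beta1 * m + (1 - beta1) * grad          # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2     # second-moment (uncentered variance) estimate
    m_hat = m / (1 - beta1 ** t)                # bias-corrected moments
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (math.sqrt(v_hat) + eps)
    return theta, m, v

# Usage: minimize f(x) = x^2 starting from x = 1.0
theta, m, v = 1.0, 0.0, 0.0
for t in range(1, 1001):
    grad = 2 * theta                            # gradient of x^2
    theta, m, v = adam_step(theta, grad, m, v, t)
```

A consensus-based distributed variant would additionally average the iterates (or moment estimates) across networked nodes at each round; the update above is only the per-node adaptive step.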