Publishing House SB RAS:

Address of the Publishing House SB RAS:
Morskoy pr. 2, 630090 Novosibirsk, Russia


Numerical Analysis and Applications

2018, Number 1

Parallel algorithms and probability of large deviation for stochastic convex optimization problems

P.E. Dvurechensky1,2, A.V. Gasnikov2,3, A.A. Lagunovskaya3
1Weierstrass Institute for Applied Analysis and Stochastics, Mohrenstr. 39, 10117 Berlin, Germany
2Institute for Information Transmission Problems RAS, Bolshoy Karetny per. 19, build. 1, 127051 Moscow, Russia
3Moscow Institute of Physics and Technology, Institutskiy per. 9, 141700 Dolgoprudny, Moscow Region, Russia
Keywords: stochastic convex optimization, probability of large deviations, mirror descent, parallel algorithms

Abstract

In this paper, we consider convex stochastic optimization problems under different assumptions on the properties of the available stochastic subgradients. It is known that, if the value of the objective function is available, one can obtain in parallel several independent approximate solutions in terms of the expected objective residual. Then, choosing the solution with the minimum function value, one can control the probability of large deviations of the objective residual. In contrast, in this short paper we address the situation where the objective function value is unavailable or too expensive to calculate. Under a "light-tail" assumption on the stochastic subgradients, and in the general case with a moderate probability of large deviations, it is shown that parallelization combined with averaging gives bounds on the probability of large deviations similar to those of a serial method. Thus, in these cases one can benefit from parallel computations and reduce the computational time without loss of solution quality.
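The parallelization-with-averaging scheme discussed in the abstract can be illustrated with a minimal sketch. This is not the paper's method: it is a Euclidean special case with plain stochastic subgradient descent standing in for mirror descent, and all problem data, step sizes, and parameter values below are illustrative assumptions. Each worker runs independently on the same stochastic problem, and the final answer is the average of the workers' outputs, which requires no objective function values.

```python
import random


def sgd_run(seed, steps=2000, lr=0.05, sigma=1.0):
    """One serial run of stochastic subgradient descent on the toy
    problem f(x) = x^2, observing noisy subgradients g = 2x + noise.
    Returns the running average of the iterates (Polyak-Ruppert
    averaging), a standard choice for stochastic methods."""
    rng = random.Random(seed)
    x, avg = 1.0, 0.0
    for t in range(steps):
        g = 2.0 * x + rng.gauss(0.0, sigma)    # stochastic subgradient
        x -= lr / (t + 1) ** 0.5 * g           # decreasing step size
        avg += (x - avg) / (t + 1)             # running average of iterates
    return avg


# Parallelization with averaging: K independent runs (here sequential for
# simplicity; in practice each run would go to its own worker), then
# average their outputs instead of picking the one with the best
# function value -- no objective evaluations are needed.
K = 16
runs = [sgd_run(seed) for seed in range(K)]
averaged = sum(runs) / K
```

The point of the construction is that the averaged solution concentrates around the minimizer (here x* = 0) more tightly than a typical single run, so the deviation probability improves without ever computing f.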