Reference List
Type: | Preprint |
National /International: | International |
Title: | A weak tail-bound probabilistic condition for function estimation in stochastic derivative-free optimization |
Publication Date: | 2022-12-06 |
Authors: |
- Francesco Rinaldi
- Luís Nunes Vicente
- Damiano Zeffiro |
Abstract: | Using tail bounds, we introduce a new probabilistic condition for function estimation in stochastic derivative-free optimization which leads to a reduction in the number of samples and eases algorithmic analyses. Moreover, we develop simple stochastic direct-search and trust-region methods for the optimization of a potentially non-smooth function whose values can only be estimated via stochastic observations. For trial points to be accepted, these algorithms require the estimated function values to yield a sufficient decrease measured in terms of a power larger than 1 of the algorithmic stepsize. Our new tail-bound condition is precisely imposed on the reduction estimate used to achieve such a sufficient decrease. This condition allows us to select the stepsize power used for sufficient decrease in such a way as to reduce the number of samples needed per iteration. In previous works, the number of samples necessary for global convergence at every iteration k of this type of algorithms was O(Δ_k^{-4}), where Δ_k is the stepsize or trust-region radius. However, using the new tail-bound condition, and under mild assumptions on the noise, one can prove that such a number of samples is only O(Δ_k^{-2-ε}), where ε > 0 can be made arbitrarily small by selecting the power of the stepsize in the sufficient decrease test arbitrarily close to 1. The global convergence properties of the stochastic direct-search and trust-region algorithms are established under the new tail-bound condition. |
Institution: | DMUC 22-43 |
Online version: | http://www.mat.uc.pt...prints/eng_2022.html |
Download: | Not available |
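
The abstract above describes a sufficient-decrease test on estimated function values, with the decrease measured against a power larger than 1 of the stepsize and a per-iteration sample count of order Δ_k^{-2-ε}. The following is a minimal sketch of one such direct-search iteration, not the authors' algorithm: the objective, the noisy oracle, and the constants c, p, ε, and the expansion/contraction factors are all hypothetical illustrations.

```python
# Minimal sketch (not the authors' code) of a stochastic direct-search step
# with the sufficient-decrease test described in the abstract.
import numpy as np

rng = np.random.default_rng(0)

def noisy_f(x, n_samples):
    """Estimate f(x) = ||x||^2 by averaging noisy observations (hypothetical oracle)."""
    true_val = float(np.dot(x, x))
    return true_val + rng.normal(0.0, 1.0, size=n_samples).mean()

def direct_search_step(x, delta, c=1e-2, p=1.1, eps=0.1, dim=2):
    """Try the 2*dim coordinate directions; accept a trial point if the estimated
    decrease exceeds c * delta**p (sufficient decrease with stepsize power p > 1)."""
    # Sample size per iteration scales like delta**-(2 + eps),
    # mirroring the O(Δ_k^{-2-ε}) count quoted in the abstract.
    n_samples = max(1, int(np.ceil(delta ** -(2.0 + eps))))
    f_x = noisy_f(x, n_samples)
    for d in np.vstack([np.eye(dim), -np.eye(dim)]):
        trial = x + delta * d
        f_trial = noisy_f(trial, n_samples)
        if f_x - f_trial >= c * delta ** p:   # sufficient decrease test
            return trial, 2.0 * delta          # successful: expand stepsize
    return x, 0.5 * delta                      # unsuccessful: shrink stepsize

x, delta = np.array([1.0, -1.5]), 1.0
for k in range(50):
    x, delta = direct_search_step(x, delta)
print(x, delta)
```

The key point the sketch illustrates is the coupling in the abstract: choosing the power p close to 1 keeps the acceptance threshold c·Δ_k^p relatively large for small Δ_k, which is what permits the smaller per-iteration sample count of order Δ_k^{-2-ε} instead of Δ_k^{-4}.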