Comparison-Based Algorithms for One-Dimensional Stochastic Convex Optimization

  • Chen X
  • Lin Q
  • Wang Z
Citations: N/A
Readers: 5 (Mendeley users who have this article in their library)

Abstract

Stochastic optimization finds a wide range of applications in operations research and management science. However, existing stochastic optimization techniques usually require access to the random samples themselves (e.g., demands in the newsvendor problem) or to the objective values at the sampled points (e.g., the lost sales cost), which might not be available in practice. In this paper, we consider a new setup for stochastic optimization, in which, at each iteration, the decision maker has access only to comparative information between a random sample and two chosen decision points. We propose a comparison-based algorithm (CBA) to solve such problems in one dimension with convex objective functions. In particular, the CBA properly chooses the two points in each iteration and constructs an unbiased gradient estimate for the original problem. We show that the CBA achieves the same convergence rate as the optimal stochastic gradient methods (which observe the samples directly). We also consider extensions of our approach to multidimensional quadratic problems as well as problems with nonconvex objective functions. Numerical experiments show that the CBA performs well on test problems.
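To illustrate the kind of feedback the abstract describes, here is a minimal sketch (not the paper's actual CBA, which uses two comparison points per iteration) for the newsvendor example it mentions. With holding cost h and backorder cost b, the cost C(x) = E[h·max(x−D, 0) + b·max(D−x, 0)] has derivative C'(x) = (h+b)·P(D ≤ x) − b, so the single comparison outcome 1{D ≤ x} already yields an unbiased gradient estimate without observing the demand D itself. All costs, the demand distribution, and the step-size choice below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
h, b = 1.0, 3.0                  # holding / backorder costs (assumed)
lo, hi = 0.0, 100.0              # decision interval; D ~ Uniform(lo, hi)
T = 200_000
demands = rng.uniform(lo, hi, size=T)

x = 50.0
xs = np.empty(T)
for t, d in enumerate(demands):
    # Unbiased gradient estimate from one comparison outcome 1{D <= x}:
    #   E[(h+b)*1{D <= x} - b] = (h+b)*P(D <= x) - b = C'(x)
    g = (h + b) * (d <= x) - b
    x -= 25.0 / (t + 1) * g      # O(1/t) step size for this instance
    x = min(max(x, lo), hi)      # project back onto the decision interval
    xs[t] = x

x_avg = xs[T // 2:].mean()       # average the tail iterates
# The minimizer satisfies P(D <= x*) = b/(b+h) = 0.75,
# i.e., x* = 75 for Uniform(0, 100); x_avg should land close to it.
```

This toy version only needs one comparison per iteration because the newsvendor gradient happens to be an affine function of the CDF; the paper's two-point construction handles general one-dimensional convex objectives.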

Citation (APA)

Chen, X., Lin, Q., & Wang, Z. (2020). Comparison-Based Algorithms for One-Dimensional Stochastic Convex Optimization. INFORMS Journal on Optimization, 2(1), 34–56. https://doi.org/10.1287/ijoo.2019.0022
