Research Article

An Improvement of Stochastic Gradient Descent Approach for Mean-Variance Portfolio Optimization Problem

Algorithm 6

AdamSE algorithm.
Data: given the initial value, the number of samples, the step size, and the tolerance. Set the iteration counter to zero.
Step 1: evaluate the augmented objective function from (16).
Step 2: compute the stochastic gradient from (20).
Step 3: set the random index.
Step 4: compute the decaying averages of the past gradients and the past squared gradients from (22) and (23).
Step 5: calculate the bias-corrected first- and second-moment estimates from (24) and (25).
Step 6: calculate the standard error of the bias-corrected first-moment estimate from (33).
Step 7: update the decision vector from (34). If the change falls below the tolerance, stop the iteration. Otherwise, increment the iteration counter and repeat from Step 1.
Remark:
The default values for the decay rates are β1 = 0.9 and β2 = 0.999, the smoothing term is ε = 10⁻⁸, and the learning rate is α = 0.001, the same defaults as in the Adam algorithm, while the tolerance is user-specified.
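The steps above can be sketched in code. The sketch below is illustrative only: the augmented objective (16), the stochastic gradient (20), and in particular the standard-error formula (33) and update rule (34) are not reproduced in this listing, so the standard error here is an assumed stand-in — the sample standard error of the mean of the k gradients seen so far, estimated from the exponential moving averages as sqrt(max(v̂ − m̂², 0)/k) — added to the familiar Adam denominator. The quadratic test problem, its target point, and the noise level are likewise hypothetical, not the portfolio model from the article.

```python
import numpy as np

def adam_se(grad_fn, z0, alpha=0.001, beta1=0.9, beta2=0.999,
            eps=1e-8, tol=1e-6, max_iters=20000):
    """Sketch of the AdamSE iteration (Steps 1-7 above).

    The standard-error term is an assumption standing in for eq. (33);
    consult the article for the exact definition and update rule (34).
    """
    z = np.asarray(z0, dtype=float)
    m = np.zeros_like(z)  # decaying average of past gradients, eq. (22)
    v = np.zeros_like(z)  # decaying average of past squared gradients, eq. (23)
    for k in range(1, max_iters + 1):
        g = grad_fn(z)                         # stochastic gradient, cf. eq. (20)
        m = beta1 * m + (1.0 - beta1) * g      # Step 4, eq. (22)
        v = beta2 * v + (1.0 - beta2) * g**2   # Step 4, eq. (23)
        m_hat = m / (1.0 - beta1**k)           # Step 5, bias-corrected, eq. (24)
        v_hat = v / (1.0 - beta2**k)           # Step 5, bias-corrected, eq. (25)
        # Step 6 (assumed form): standard error of the mean of k gradient
        # samples, with the variance estimated from the moving averages.
        se = np.sqrt(np.maximum(v_hat - m_hat**2, 0.0) / k)
        # Step 7 (assumed form): Adam-style update with the standard-error
        # term added to the denominator; eq. (34) may differ.
        step = alpha * m_hat / (np.sqrt(v_hat) + se + eps)
        z = z - step
        if np.linalg.norm(step) < tol:         # stopping test from Step 7
            break
    return z, k

# Toy illustration (a noisy quadratic, not the portfolio objective of eq. (16)):
rng = np.random.default_rng(0)
target = np.array([1.0, 2.0])
z_star, iters = adam_se(lambda z: (z - target) + rng.normal(0.0, 0.1, size=2),
                        np.array([5.0, -3.0]))
```

With the default step size α = 0.001, the iterate creeps toward the minimizer at roughly α per component per iteration and then hovers in a small neighborhood set by the gradient noise, which is why the iteration budget is generous and the tolerance test may never fire on a noisy problem.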