Research Article

An Improved Adam Optimization Algorithm Combining Adaptive Coefficients and Composite Gradients Based on Randomized Block Coordinate Descent

Algorithm 1: Adam.
Input: step size α; exponential decay rates β_1, β_2 ∈ [0, 1); stochastic objective function f(θ); initial parameter vector θ_0
Output: updated parameter vector θ_t
(1)  Initialize parameters (m_0 = 0, v_0 = 0, t = 0)
(2)   For t = 1 to T do
(3)     Get the stochastic gradient of the objective at time step t: g_t = ∇_θ f_t(θ_{t-1})
(4)     Update the biased first-order moment estimate: m_t = β_1 m_{t-1} + (1 - β_1) g_t
(5)     Update the biased second-order moment estimate: v_t = β_2 v_{t-1} + (1 - β_2) g_t^2
(6)     Get the bias-corrected first-order moment estimate: m̂_t = m_t / (1 - β_1^t)
(7)     Get the bias-corrected second-order moment estimate: v̂_t = v_t / (1 - β_2^t)
(8)     Update the parameter: θ_t = θ_{t-1} - α m̂_t / (√v̂_t + ε)
(9)    End For
(10)  Return θ_t
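To make the update rule concrete, here is a minimal NumPy sketch of one Adam step following Algorithm 1. The function name adam_update, its default hyperparameters (α = 0.001, β_1 = 0.9, β_2 = 0.999, ε = 1e-8, the defaults suggested by Kingma and Ba), and the toy quadratic objective in the usage loop are illustrative assumptions, not part of this paper.

```python
import numpy as np

def adam_update(theta, grad, m, v, t,
                alpha=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam step (Algorithm 1); returns updated (theta, m, v).

    theta : parameter vector theta_{t-1}
    grad  : stochastic gradient g_t           -- line (3)
    m, v  : biased first/second moment estimates
    t     : 1-based step counter
    """
    # Lines (4)-(5): update the biased moment estimates.
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    # Lines (6)-(7): bias-correct the estimates.
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    # Line (8): parameter update.
    theta = theta - alpha * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Usage: minimize the toy objective f(theta) = ||theta||^2, so g_t = 2*theta.
theta = np.array([1.0, -2.0])
m = np.zeros_like(theta)          # line (1): m_0 = 0
v = np.zeros_like(theta)          # line (1): v_0 = 0
for t in range(1, 1001):          # lines (2)-(9): main loop
    grad = 2 * theta              # line (3): gradient of the toy objective
    theta, m, v = adam_update(theta, grad, m, v, t)
```

Note that the bias-correction terms in lines (6)-(7) matter most in the early iterations, when m and v are still close to their zero initialization; as t grows, β_1^t and β_2^t vanish and the corrected estimates approach the raw ones.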