
An Improved Adam Optimization Algorithm Combining Adaptive Coefficients and Composite Gradients Based on Randomized Block Coordinate Descent

Algorithm 3: ACGB-Adam.
Input: step size α; exponential decay rates β₁, β₂ ∈ [0, 1); stochastic objective f(θ); initial parameter vector θ₀
Output: resulting parameter vector θ_T
(1) Initialize the parameters (the adaptive coefficient and the predicted gradient; the remaining parameters are initialized in the same way as in Algorithm 1)
(2)  For t = 1 to T do
(3)    Generate a random diagonal matrix D_t  /* gradient calculation based on Algorithm 2 (RBC) */
(4)    Get a stochastic gradient at time step t: g_t = D_t ∇f_t(θ_{t−1})
(5)    Update the parameters according to the gradient descent method: θ̄_t = θ_{t−1} − α g_t  /* composite gradient optimization */
(6)    Get a predicted stochastic gradient at time step t: ḡ_t = D_t ∇f_t(θ̄_t)  /* optimization of the first-moment estimate */
(7)    Update the biased first-moment estimate: m_t = β₁ m_{t−1} + (1 − β₁) g̃_t, where the composite gradient g̃_t combines g_t and ḡ_t through the adaptive coefficient
(8)    Update the biased second-moment estimate: v_t = β₂ v_{t−1} + (1 − β₂) g̃_t²
(9)    Compute the bias-corrected first-moment estimate: m̂_t = m_t / (1 − β₁^t)
(10)   Compute the bias-corrected second-moment estimate: v̂_t = v_t / (1 − β₂^t)
(11)   Update the parameters: θ_t = θ_{t−1} − α m̂_t / (√v̂_t + ε)
(12)  End For
(13) Return θ_T
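Because the formulas in the listing above were lost in extraction, the following NumPy sketch shows one plausible reading of steps (3)–(11). It is a minimal sketch, not the authors' reference implementation: the Bernoulli mask standing in for D_t, the look-ahead point used for the predicted gradient, and the fixed weight lam (a placeholder for the paper's adaptive coefficient) are assumptions, as are the names grad_fn, block_prob, and lam; the moment updates and bias corrections follow standard Adam (Algorithm 1).

import numpy as np

def acgb_adam(grad_fn, theta0, steps=1000, alpha=0.001,
              beta1=0.9, beta2=0.999, eps=1e-8,
              block_prob=0.5, lam=0.5, rng=None):
    """Sketch of ACGB-Adam as described in Algorithm 3.

    Assumed details (not given in the extracted listing):
      - D_t is a Bernoulli(block_prob) diagonal 0/1 mask (RBC);
      - the predicted gradient is evaluated at a plain
        gradient-descent look-ahead point theta - alpha * g_t;
      - the composite gradient is lam * g + (1 - lam) * g_pred,
        with the fixed lam standing in for the adaptive coefficient.
    """
    rng = np.random.default_rng() if rng is None else rng
    theta = np.asarray(theta0, dtype=float).copy()
    m = np.zeros_like(theta)  # biased first-moment estimate
    v = np.zeros_like(theta)  # biased second-moment estimate

    for t in range(1, steps + 1):
        # (3) random diagonal matrix D_t, stored as a 0/1 mask
        d = rng.random(theta.shape) < block_prob
        # (4) stochastic gradient restricted to the sampled block
        g = d * grad_fn(theta)
        # (5) plain gradient-descent look-ahead step
        theta_bar = theta - alpha * g
        # (6) predicted stochastic gradient at the look-ahead point
        g_pred = d * grad_fn(theta_bar)
        # composite gradient (fixed weight as a placeholder for
        # the paper's adaptive coefficient)
        g_comp = lam * g + (1.0 - lam) * g_pred
        # (7)-(8) biased moment estimates, as in standard Adam
        m = beta1 * m + (1.0 - beta1) * g_comp
        v = beta2 * v + (1.0 - beta2) * g_comp**2
        # (9)-(10) bias corrections
        m_hat = m / (1.0 - beta1**t)
        v_hat = v / (1.0 - beta2**t)
        # (11) parameter update
        theta -= alpha * m_hat / (np.sqrt(v_hat) + eps)
    return theta

# usage: minimize the toy quadratic f(x) = ||x||^2, grad = 2x
if __name__ == "__main__":
    theta = acgb_adam(lambda x: 2.0 * x, np.ones(4),
                      steps=2000, alpha=0.01)
    print(theta)  # approaches the zero vector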