An Improved Adam Optimization Algorithm Combining Adaptive Coefficients and Composite Gradients Based on Randomized Block Coordinate Descent
Table 1
Description of the parameters of the Adam algorithm and its improvement.
| Parameters | Description |
| --- | --- |
| $\alpha$ | Learning rate |
| $\beta_1$, $\beta_2$ | Exponential decay rates of the first-order and second-order moment estimates, respectively |
| $T$, $t$ | The maximum number of iterations and the current time step, respectively |
| $\beta_1^t$, $\beta_2^t$ | Products of the exponential decay rates of the first-order and second-order moment estimates up to time step $t$, respectively |
| $m_t$ | First-order moment vector at time step $t$ |
| $v_t$ | Second-order moment vector at time step $t$ |
| $g_t$ | Gradient at the current time step $t$ |
| | Adaptive coefficient |
| | Prediction gradient |
| | Random diagonal matrix at time step $t$ |
| | The $i$th diagonal element of the random diagonal matrix, drawn from an independent and identical Bernoulli distribution |
| $\theta$ | The parameter to be optimized |
| $f_t(\theta)$ | The sequence of smooth convex loss functions |
| $\theta^*$ | Global optimal position |
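To make the roles of these parameters concrete, the sketch below shows one step of the standard Adam update (Kingma and Ba) in terms of $\alpha$, $\beta_1$, $\beta_2$, $m_t$, $v_t$, $g_t$, and $\theta$. It is a minimal illustration only: the paper's additions (the adaptive coefficient, the prediction gradient, and the randomized block coordinate descent step) are not included, and the function name and the numerical-stability term `eps` are assumptions, not the paper's notation.

```python
import numpy as np

def adam_step(theta, m, v, grad, t,
              alpha=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One step of the standard Adam update; a sketch of the
    parameters in Table 1, not the paper's improved algorithm."""
    m = beta1 * m + (1 - beta1) * grad         # first-order moment vector m_t
    v = beta2 * v + (1 - beta2) * grad ** 2    # second-order moment vector v_t
    m_hat = m / (1 - beta1 ** t)               # bias correction via beta1^t
    v_hat = v / (1 - beta2 ** t)               # bias correction via beta2^t
    theta = theta - alpha * m_hat / (np.sqrt(v_hat) + eps)  # update theta
    return theta, m, v

# Example: one step on the toy smooth convex loss f(theta) = theta^2,
# whose gradient is 2 * theta.
theta, m, v = 1.0, 0.0, 0.0
theta, m, v = adam_step(theta, m, v, grad=2.0 * theta, t=1)
```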