
Commit efeea8f

Add adabelief to the readme. (#210)
1 parent: 9c72aa0

File tree

4 files changed: +36 −0 lines changed

README.rst

Lines changed: 35 additions & 0 deletions
@@ -60,6 +60,9 @@ Supported Optimizers
 | `AccSGD`_   | https://arxiv.org/abs/1803.05591                                              |
 +-------------+-------------------------------------------------------------------------------+
 |             |                                                                               |
+| `AdaBelief`_| https://arxiv.org/abs/2010.07468                                              |
++-------------+-------------------------------------------------------------------------------+
+|             |                                                                               |
 | `AdaBound`_ | https://arxiv.org/abs/1902.09843                                              |
 +-------------+-------------------------------------------------------------------------------+
 |             |                                                                               |
@@ -261,6 +264,38 @@ AccSGD
 
 **Reference Code**: https://github.com/rahulkidambi/AccSGD
 
+
+AdaBelief
+---------
+
++------------------------------------------------------------------------------------------------------------+-------------------------------------------------------------------------------------------------------------+
+| .. image:: https://raw.githubusercontent.com/jettify/pytorch-optimizer/master/docs/rastrigin_AdaBelief.png | .. image:: https://raw.githubusercontent.com/jettify/pytorch-optimizer/master/docs/rosenbrock_AdaBelief.png |
++------------------------------------------------------------------------------------------------------------+-------------------------------------------------------------------------------------------------------------+
+
+.. code:: python
+
+  import torch_optimizer as optim
+
+  # model = ...
+  optimizer = optim.AdaBelief(
+      model.parameters(),
+      lr=1e-3,
+      betas=(0.9, 0.999),
+      eps=1e-3,
+      weight_decay=0,
+      amsgrad=False,
+      weight_decouple=False,
+      fixed_decay=False,
+      rectify=False,
+  )
+  optimizer.step()
+
+
+**Paper**: *AdaBelief Optimizer, adapting stepsizes by the belief in observed gradients* (2020) [https://arxiv.org/abs/2010.07468]
+
+**Reference Code**: https://github.com/juntang-zhuang/Adabelief-Optimizer
+
+
 AdaBound
 --------
 
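For readers curious about what the new README entry actually computes, here is a minimal, dependency-free sketch of one AdaBelief update for a scalar parameter, following the algorithm in the linked paper (https://arxiv.org/abs/2010.07468). The `adabelief_step` helper is hypothetical and illustrative, not the library's API; the defaults mirror the README example above.

```python
import math

def adabelief_step(param, grad, state, lr=1e-3, betas=(0.9, 0.999), eps=1e-3):
    """One AdaBelief update for a scalar parameter (illustrative sketch,
    not the torch_optimizer API)."""
    beta1, beta2 = betas
    state["t"] += 1
    t = state["t"]
    # First moment: EMA of gradients, exactly as in Adam.
    state["m"] = beta1 * state["m"] + (1 - beta1) * grad
    # Second moment: EMA of the squared *deviation* of the gradient from
    # its EMA -- the "belief" term that distinguishes AdaBelief from Adam,
    # which instead tracks the squared gradient itself.
    state["s"] = beta2 * state["s"] + (1 - beta2) * (grad - state["m"]) ** 2 + eps
    # Bias corrections, as in Adam.
    m_hat = state["m"] / (1 - beta1 ** t)
    s_hat = state["s"] / (1 - beta2 ** t)
    return param - lr * m_hat / (math.sqrt(s_hat) + eps)

# Minimize f(p) = p**2 for a few steps; the gradient is 2*p.
state = {"t": 0, "m": 0.0, "s": 0.0}
p = 1.0
for _ in range(3):
    p = adabelief_step(p, grad=2.0 * p, state=state)
print(p)  # slightly below 1.0: each step moves p toward the minimum at 0
```

When the gradient is predictable (small deviation from its EMA), `s` stays small and AdaBelief takes larger steps; when gradients are noisy, `s` grows and the step shrinks, which is the intuition behind the "belief" in observed gradients.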
docs/rastrigin_AdaBelief.png

726 KB

docs/rosenbrock_AdaBelief.png

453 KB

examples/viz_optimizers.py

Lines changed: 1 addition & 0 deletions
@@ -191,6 +191,7 @@ def LookaheadYogi(*a, **kw):
     (optim.A2GradUni, -8, 0.1),
     (optim.A2GradInc, -8, 0.1),
     (optim.A2GradExp, -8, 0.1),
+    (optim.AdaBelief, -8, 0.1),
 ]
 execute_experiments(
     optimizers,
