class mentpy.optimizers.AdamOpt(mentpy.optimizers.base.BaseOpt)

Class for the Adam optimizer.

Parameters:
step_size : float, optional

The step size of the optimizer, by default 0.1

b1 : float, optional

The first moment decay rate, by default 0.9

b2 : float, optional

The second moment decay rate, by default 0.999

eps : float, optional

A small number to avoid division by zero, by default 1e-08 (see the update sketch below)
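
For reference, these hyperparameters enter the textbook Adam update roughly as follows. This is an illustrative sketch of the standard algorithm, not MentPy's own source, so the class's internal implementation may differ in detail.

import numpy as np

def adam_update(x, grad, m, v, i, step_size=0.1, b1=0.9, b2=0.999, eps=1e-08):
    # Exponentially decayed first- and second-moment estimates of the gradient.
    m = b1 * m + (1 - b1) * grad
    v = b2 * v + (1 - b2) * grad ** 2
    # Bias-corrected estimates (i is the zero-based iteration index).
    m_hat = m / (1 - b1 ** (i + 1))
    v_hat = v / (1 - b2 ** (i + 1))
    # Parameter update; eps guards against division by zero.
    return x - step_size * m_hat / (np.sqrt(v_hat) + eps), m, v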

Examples

Create an Adam optimizer

In [1]: opt = mp.optimizers.AdamOpt()

In [2]: print(opt)
<mentpy.optimizers.adam.AdamOpt object at 0x7f61b4675de0>
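
A fuller usage sketch, assuming optimize accepts a scalar cost function and an initial parameter array, handles gradient estimation of f internally, and returns the optimized parameters (see the method signatures below for the authoritative interface):

In [3]: import numpy as np

In [4]: cost = lambda x: np.sum(x ** 2)  # simple quadratic, minimum at x = 0

In [5]: x_opt = opt.optimize(cost, x0=np.array([1.0, -2.0]), num_iters=200)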

See also

mp.optimizers.SGDOpt

Constructors

AdamOpt(step_size=0.1, b1=0.9, b2=0.999, eps=1e-08)

Initialize the Adam optimizer.

Methods

optimize(f, x0, num_iters=100, callback=None, verbose=False, **kwargs)

Optimize a function f using the Adam optimizer.

optimize_and_gradient_norm(f, x0, num_iters=100, callback=None, ...)

Optimize a function f using the Adam optimizer while also tracking the norm of the gradient.

reset()

Reset the optimizer.

step(f, x, i, **kwargs)

Take a step of the optimizer (see the manual-loop sketch after this list).

update_step_size(x, i, factor=0.99)

Update the step size of the optimizer.
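
Where per-iteration control is needed, step and reset suggest a manual loop along the following lines. This is a sketch under the assumption that step(f, x, i) returns the updated parameters and that reset clears the internal moment estimates; consult the individual method documentation for the authoritative behaviour.

In [6]: import numpy as np

In [7]: cost = lambda x: np.sum(x ** 2)  # simple quadratic, minimum at x = 0

In [8]: opt = mp.optimizers.AdamOpt()

In [9]: opt.reset()

In [10]: x = np.array([1.0, -2.0])

In [11]: for i in range(100):
    ...:     x = opt.step(cost, x, i)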