tidy3d.plugins.invdes.AdamOptimizer
- class AdamOptimizer
Bases: AbstractOptimizer
Specification for a gradient-descent optimization using the Adam method.
- Parameters:
attrs (dict = {}) – Dictionary storing arbitrary metadata for a Tidy3D object. This dictionary can be freely used by the user for storing data without affecting the operation of Tidy3D, as it is not used internally. Note that, unlike regular Tidy3D fields, `attrs` are mutable. For example, the following is allowed for setting an attr: `obj.attrs['foo'] = bar`. Also note that Tidy3D will raise a `TypeError` if `attrs` contain objects that cannot be serialized. One can check whether `attrs` are serializable by calling `obj.json()`.
design (Union[InverseDesign, InverseDesignMulti]) – Specification describing the inverse design problem we wish to optimize (a construction sketch follows this parameter list).
learning_rate (NonNegativeFloat) – Step size for the gradient descent optimizer.
maximize (bool = True) – If `True`, the optimizer will maximize the objective function. If `False`, the optimizer will minimize the objective function.
num_steps (PositiveInt) – Number of steps in the gradient descent optimizer.
results_cache_fname (Optional[str] = None) – If specified, will save the optimization state to a local `.pkl` file using `dill.dump()`. This file stores an `InverseDesignResult` corresponding to the latest state of the optimization. To continue this run from the file using the same optimizer instance, call `optimizer.complete_run_from_history()`. Alternatively, the latest results can be loaded with `td.InverseDesignResult.from_file(fname)` and then continued using `optimizer.continue_run(result)`, as sketched after this list.
store_full_results (bool = True) – If `True`, stores the full history of the vector fields, specifically the gradient, params, and optimizer state. For large design regions and many iterations, storing the full history of these fields can lead to large file sizes and memory usage. In some cases, we recommend setting this field to `False`, which will only store the last computed state of these variables.
beta1 (float = 0.9) – Beta 1 parameter in the Adam optimization method (the update rule is sketched at the end of this page).
beta2 (float = 0.999) – Beta 2 parameter in the Adam optimization method.
eps (float = 1e-08) – Epsilon parameter in the Adam optimization method.
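As a brief usage sketch (not verbatim from the tidy3d documentation): the snippet below constructs an AdamOptimizer from the parameters above and runs it. It assumes an existing `InverseDesign` named `design` and an initial parameter array `params0`, and that `run()` (inherited from `AbstractOptimizer`) accepts the initial parameters; the file path and values are illustrative.

```python
import tidy3d.plugins.invdes as tdi

# `design` is assumed to be an existing tdi.InverseDesign (or InverseDesignMulti)
optimizer = tdi.AdamOptimizer(
    design=design,
    learning_rate=0.1,   # step size for gradient descent
    num_steps=20,        # number of optimization iterations
    maximize=True,       # ascend rather than descend on the objective
    beta1=0.9,           # Adam first-moment decay rate
    beta2=0.999,         # Adam second-moment decay rate
    results_cache_fname="data/invdes_history.pkl",  # illustrative checkpoint path
)

# `params0` is an assumed initial parameter array for the design region
result = optimizer.run(params0)
```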
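And, as referenced in the `results_cache_fname` description, a sketch of the two ways to pick up a run from the checkpoint file. Both calls are named in the parameter documentation above; the file name is the same illustrative path.

```python
import tidy3d as td

# Option 1: continue with the same optimizer instance from its saved history.
result = optimizer.complete_run_from_history()

# Option 2: load the latest stored result explicitly, then continue from it.
loaded = td.InverseDesignResult.from_file("data/invdes_history.pkl")
result = optimizer.continue_run(loaded)
```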
Attributes
design
Methods
initial_state(parameters)
Initial state of the optimizer.
update(parameters, gradient[, state])
Inherited Common Usage
- beta1
- beta2
- eps
- __hash__()
Hash method.
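For reference, `beta1`, `beta2`, and `eps` enter the standard Adam update of Kingma and Ba. Below is a minimal, generic NumPy sketch of one Adam step, shown only to clarify the role of these parameters; it is not tidy3d's internal implementation.

```python
import numpy as np

def adam_step(params, gradient, m, v, step, learning_rate=0.1,
              beta1=0.9, beta2=0.999, eps=1e-8):
    """One generic Adam update step; `step` is 1-indexed."""
    m = beta1 * m + (1 - beta1) * gradient      # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * gradient**2   # second-moment (variance) estimate
    m_hat = m / (1 - beta1**step)               # bias-corrected first moment
    v_hat = v / (1 - beta2**step)               # bias-corrected second moment
    # descent step; with maximize=True the sign of the update flips
    params = params - learning_rate * m_hat / (np.sqrt(v_hat) + eps)
    return params, m, v
```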