tidy3d.plugins.invdes.AdamOptimizer#

class AdamOptimizer[source]#

Bases: AbstractOptimizer

Specification for an inverse-design optimization using the Adam gradient-descent method.

Parameters:
  • attrs (dict = {}) – Dictionary storing arbitrary metadata for a Tidy3D object. This dictionary can be freely used by the user for storing data without affecting the operation of Tidy3D as it is not used internally. Note that, unlike regular Tidy3D fields, attrs are mutable. For example, the following is allowed for setting an attr: obj.attrs['foo'] = bar. Also note that Tidy3D will raise a TypeError if attrs contain objects that cannot be serialized. One can check if attrs are serializable by calling obj.json().

  • design (Union[InverseDesign, InverseDesignMulti]) – Specification describing the inverse design problem we wish to optimize.

  • learning_rate (PositiveFloat) – Step size for the gradient descent optimizer.

  • maximize (bool = True) – If True, the optimizer maximizes the objective function; if False, it minimizes it.

  • num_steps (PositiveInt) – Number of steps in the gradient descent optimizer.

  • results_cache_fname (Optional[str] = None) – If specified, saves the optimization state to a local .pkl file using dill.dump(). This file stores an InverseDesignResult corresponding to the latest state of the optimization. To continue this run from the file using the same optimizer instance, call optimizer.complete_run_from_history(). Alternatively, the latest results can be loaded with td.InverseDesignResult.from_file(fname) and then continued using optimizer.continue_run(result).

  • store_full_results (bool = True) – If True, stores the full history of the vector fields, specifically the gradient, params, and optimizer state. For large design regions and many iterations, storing the full history of these fields can lead to large file size and memory usage. In such cases, we recommend setting this field to False, which stores only the last computed state of these variables.

  • beta1 (ConstrainedFloatValue = 0.9) – Beta 1 parameter in the Adam optimization method.

  • beta2 (ConstrainedFloatValue = 0.999) – Beta 2 parameter in the Adam optimization method.

  • eps (PositiveFloat = 1e-08) – Epsilon parameter in the Adam optimization method.
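
For orientation, below is a minimal usage sketch. It assumes an InverseDesign specification design and an initial parameter array params0 built elsewhere, and that the optimization is launched through the plugin's run() entry point; everything else follows the parameter docstrings above.

    import tidy3d.plugins.invdes as tdi

    optimizer = tdi.AdamOptimizer(
        design=design,                      # InverseDesign spec (assumed defined elsewhere)
        learning_rate=0.1,                  # step size for the gradient update
        num_steps=10,                       # number of optimization iterations
        results_cache_fname="history.pkl",  # checkpoint the latest state to disk
    )
    result = optimizer.run(params0)  # run() entry point assumed from typical plugin usage

    # Per the results_cache_fname docstring above, an interrupted run can be resumed with:
    # result = optimizer.complete_run_from_history()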

Attributes

design

attrs

Methods

initial_state(parameters)

Initial state of the optimizer.

update(parameters, gradient[, state])

Update the parameters and optimizer state from the given gradient.

Inherited Common Usage

beta1#
beta2#
eps#
initial_state(parameters)[source]#

Initial state of the optimizer.
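
The precise contents of the returned state are an implementation detail, but in a textbook Adam implementation the initial state holds zeroed first- and second-moment estimates plus a step counter. A hypothetical sketch, not tidy3d's internal representation:

    import numpy as np

    def textbook_initial_state(parameters: np.ndarray) -> dict:
        """Illustrative only; the actual state returned by initial_state() may differ."""
        return {
            "m": np.zeros_like(parameters),  # first-moment (mean gradient) estimate
            "v": np.zeros_like(parameters),  # second-moment (mean squared gradient) estimate
            "t": 0,                          # step counter used for bias correction
        }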

update(parameters, gradient, state=None)[source]#

Apply one Adam update step, returning the updated parameters and optimizer state.
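
For reference, the standard Adam update that the beta1, beta2, and eps parameters above control looks like the following. This is a textbook sketch (here via numpy), not tidy3d's internal implementation:

    import numpy as np

    def textbook_update(parameters, gradient, state, learning_rate=0.1,
                        beta1=0.9, beta2=0.999, eps=1e-8, maximize=True):
        """One textbook Adam step; illustrative only."""
        grad = gradient if maximize else -gradient       # ascend (maximize) or descend
        t = state["t"] + 1
        m = beta1 * state["m"] + (1 - beta1) * grad      # update first-moment estimate
        v = beta2 * state["v"] + (1 - beta2) * grad**2   # update second-moment estimate
        m_hat = m / (1 - beta1**t)                       # bias-corrected first moment
        v_hat = v / (1 - beta2**t)                       # bias-corrected second moment
        new_parameters = parameters + learning_rate * m_hat / (np.sqrt(v_hat) + eps)
        return new_parameters, {"m": m, "v": v, "t": t}
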
__hash__()#

Hash method.