tidy3d.plugins.invdes.AdamOptimizer#

class AdamOptimizer[source]#

Bases: AbstractOptimizer

Specification for a gradient-descent optimization using the Adam method. A minimal usage sketch follows the parameter list below.

Parameters:
  • attrs (dict = {}) – Dictionary storing arbitrary metadata for a Tidy3D object. This dictionary can be freely used by the user for storing data without affecting the operation of Tidy3D as it is not used internally. Note that, unlike regular Tidy3D fields, attrs are mutable. For example, the following is allowed for setting an attr: obj.attrs['foo'] = bar. Also note that Tidy3D will raise a TypeError if attrs contain objects that cannot be serialized. One can check whether attrs are serializable by calling obj.json().

  • design (Union[InverseDesign, InverseDesignMulti]) – Specification describing the inverse design problem we wish to optimize.

  • learning_rate (NonNegativeFloat) – Step size for the gradient descent optimizer.

  • maximize (bool = True) – If True, the optimizer maximizes the objective function; if False, it minimizes it.

  • num_steps (PositiveInt) – Number of steps in the gradient descent optimizer.

  • results_cache_fname (Optional[str] = None) – If specified, will save the optimization state to a local .pkl file using dill.dump(). This file stores an InverseDesignResult corresponding to the latest state of the optimization. To continue this run from the file using the same optimizer instance, call optimizer.complete_run_from_history(). Alternatively, the latest results can be loaded with td.InverseDesignResult.from_file(fname) and continued using optimizer.continue_run(result).

  • store_full_results (bool = True) – If True, stores the full history of the vector fields, specifically the gradient, params, and optimizer state. For large design regions and many iterations, storing the full history of these fields can lead to large file sizes and high memory usage. In such cases, we recommend setting this field to False, which will store only the last computed state of these variables.

  • beta1 (float = 0.9) – Beta 1 parameter in the Adam optimization method.

  • beta2 (float = 0.999) – Beta 2 parameter in the Adam optimization method.

  • eps (float = 1e-08) – Epsilon parameter in the Adam optimization method.
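
A minimal usage sketch, assuming a design (InverseDesign) and an initial parameter array params0 built elsewhere; the run() call is the typical entry point for optimizers in the invdes plugin, while continue_run() and complete_run_from_history() behave as described for results_cache_fname above (file name and hyperparameter values here are placeholders):

    import tidy3d as td
    import tidy3d.plugins.invdes as tdi

    # `design` is an InverseDesign built separately (placeholder)
    optimizer = tdi.AdamOptimizer(
        design=design,
        learning_rate=0.2,   # gradient-descent step size
        num_steps=10,        # number of iterations
        results_cache_fname="invdes_history.pkl",  # cache state after each step
    )

    # run the optimization from the initial parameters
    result = optimizer.run(params0)

    # resume later from the cached state, as described above
    result = td.InverseDesignResult.from_file("invdes_history.pkl")
    result = optimizer.continue_run(result)
    # or, with the same optimizer instance:
    # result = optimizer.complete_run_from_history()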

Attributes

design

attrs

Methods

initial_state(parameters)

Initial state of the optimizer.

update(parameters, gradient[, state])

Update the parameters and optimizer state from the current gradient.

Inherited Common Usage

beta1#
beta2#
eps#
initial_state(parameters)[source]#

Initial state of the optimizer.

update(parameters, gradient, state=None)[source]#
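
The beta1, beta2, and eps fields correspond to the β₁, β₂, and ε hyperparameters of the standard Adam method (Kingma & Ba, 2014). For reference, a self-contained sketch of that update rule; this illustrates the algorithm itself, not Tidy3D's internal implementation, and the function name adam_update is hypothetical:

    import numpy as np

    def adam_update(params, gradient, state, step, learning_rate=0.01,
                    beta1=0.9, beta2=0.999, eps=1e-8):
        # state holds the first- and second-moment estimates (zeros before step 1)
        m, v = state
        m = beta1 * m + (1 - beta1) * gradient
        v = beta2 * v + (1 - beta2) * gradient**2
        m_hat = m / (1 - beta1**step)   # bias-corrected first moment
        v_hat = v / (1 - beta2**step)   # bias-corrected second moment
        # gradient *ascent*, matching maximize=True
        new_params = params + learning_rate * m_hat / (np.sqrt(v_hat) + eps)
        return new_params, (m, v)

    # one illustrative step from zero-initialized state
    params = np.zeros(3)
    state = (np.zeros(3), np.zeros(3))
    grad = np.array([0.1, -0.2, 0.3])
    params, state = adam_update(params, grad, state, step=1)

With maximize=True (the default) the step is taken along the gradient; a minimizing optimizer would subtract it instead.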
__hash__()#

Hash method.