tidy3d.plugins.invdes.AdamOptimizer

- class AdamOptimizer[source]

Bases: AbstractOptimizer

Specification for an optimization.
- Parameters:
  - design (Union[InverseDesign, InverseDesignMulti]) – Specification describing the inverse design problem we wish to optimize.
  - learning_rate (PositiveFloat) – Step size for the gradient descent optimizer.
  - maximize (bool = True) – If True, the optimizer will maximize the objective function. If False, the optimizer will minimize the objective function.
  - num_steps (PositiveInt) – Number of steps in the gradient descent optimizer.
  - results_cache_fname (Optional[str] = None) – If specified, will save the optimization state to a local .pkl file using dill.dump(). This file stores an InverseDesignResult corresponding to the latest state of the optimization. To continue this run from the file using the same optimizer instance, call optimizer.complete_run_from_history(). Alternatively, the latest result can be loaded with td.InverseDesignResult.from_file(fname) and then continued using optimizer.continue_run(result).
  - store_full_results (bool = True) – If True, stores the full history of the vector fields, specifically the gradient, params, and optimizer state. For large design regions and many iterations, storing the full history of these fields can lead to large file sizes and memory usage. In such cases, we recommend setting this field to False, which will store only the last computed state of these variables.
  - beta1 (float = 0.9) – Beta 1 parameter in the Adam optimization method.
  - beta2 (float = 0.999) – Beta 2 parameter in the Adam optimization method.
  - eps (PositiveFloat = 1e-08) – Epsilon parameter in the Adam optimization method.
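The beta1, beta2, and eps parameters above follow the standard Adam update rule. A minimal, self-contained sketch of that rule in plain NumPy (illustrative only, independent of the tidy3d implementation; the function and variable names here are hypothetical):

```python
import numpy as np

def adam_update(params, gradient, state, learning_rate=0.01,
                beta1=0.9, beta2=0.999, eps=1e-8, maximize=True):
    """One Adam step; ascends the objective when maximize=True, else descends."""
    m, v, t = state
    t += 1
    m = beta1 * m + (1 - beta1) * gradient        # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * gradient ** 2   # second-moment (variance) estimate
    m_hat = m / (1 - beta1 ** t)                  # bias-corrected moments
    v_hat = v / (1 - beta2 ** t)
    step = learning_rate * m_hat / (np.sqrt(v_hat) + eps)
    params = params + step if maximize else params - step
    return params, (m, v, t)

# usage: maximize f(p) = -(p - 3)^2, whose gradient is -2 (p - 3)
p = np.array([0.0])
state = (np.zeros_like(p), np.zeros_like(p), 0)
for _ in range(2000):
    grad = -2.0 * (p - 3.0)
    p, state = adam_update(p, grad, state, learning_rate=0.1)
```

Note the role of eps: it guards against division by zero when the second-moment estimate is tiny, which matters early in an optimization when gradients are small.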
Attributes

- beta1
- beta2
- eps

Methods

- initial_state(parameters) – Initial state of the optimizer.
- update(parameters, gradient[, state])