Machine learning-based fabrication constraints for inverse design using PreFab#

This notebook demonstrates how to apply advanced fabrication constraint correction to photonic devices generated through inverse design. We’ll show how to integrate Tidy3D with PreFab, a Python-based tool that uses machine learning to correct problematic device features, yielding a design whose performance improvement is more robust once fabricated.

To install the jax module required for this feature, we recommend running pip install "tidy3d[jax]".

We build on the approach detailed in the previous notebook on the inverse design of a compact grating coupler. In that notebook, we included a feature size filter and penalty so that the optimized device respects minimum feature sizes. In this notebook, we take a device designed without feature size penalties and use PreFab’s machine learning (ML) capabilities to correct the resulting fabrication variations of its fine features post-optimization. The outcome is a design optimized for high on-chip performance, combining the advanced design capabilities of Tidy3D’s adjoint plugin with PreFab’s nanofabrication prediction Python package.

PreFab uses hundreds of design patterns, encompassing a wide array of feature types and distributions (similar to those found in inverse-designed devices), to develop a comprehensive model of the nanofabrication process specific to a given foundry. This model predicts the outcome of the fabrication process, enabling it to identify and correct deviations (e.g., corner rounding, erosion, dilation, feature loss) that might occur. Consequently, PreFab produces designs that are not only optimized for superior performance but are also resilient to the inconsistencies inherent in the fabrication process. The image below illustrates a sample randomized pattern, its predicted fabrication outcome, the actual fabrication result, and the subsequent corrections made. In this notebook, this methodology is applied to a pre-optimized, fine-featured grating coupler inverse design, showcasing the advantages of integrating PreFab into the design workflow.

Note that PreFab models are continuously evolving, with enhancements and updates anticipated regularly. To delve deeper into the details of ML-driven nanofabrication prediction and to remain informed on the latest developments, visit PreFab’s website and GitHub repository.

If you are new to the finite-difference time-domain (FDTD) method, we highly recommend going through our FDTD101 tutorials. FDTD simulations can diverge for various reasons. If you run into any simulation divergence issues, please follow the steps outlined in our troubleshooting guide to resolve them.

We start by importing our typical Python packages.

[1]:
# Standard python imports.
import numpy as np
import matplotlib.pylab as plt

# Import regular tidy3d.
import tidy3d as td
import tidy3d.web as web

Set up PreFab#

PreFab is a Python package that employs deep learning to predict and correct for fabrication-induced structural variations in integrated photonic devices. This virtual nanofabrication environment provides crucial insights into nanofabrication processes, thereby helping improve the precision of device designs.

This becomes particularly important for inverse-designed devices such as this grating coupler, which often feature many small, intricate features. These complex features can be significantly affected by the slightest variations in the fabrication process.

In this demonstration, we’ll use PreFab to predict and correct the fabrication-induced variations in the final grating coupler design. We’ll also use the stochastic uncertainty inherent in the prediction to evaluate the design’s robustness, both pre- and post-correction. This step ensures the design withstands the natural variability of the nanofabrication process, boosting its reliability and expected performance.

The following terms are used throughout the rest of the notebook:

  • Prediction: The process of predicting the structural variations in the design due to the fabrication process.

  • Correction: The process of correcting the design to compensate for the predicted structural variations.

  • Outcome: The predicted fabrication result of the corrected design.

  • (Un)Constrained: We analyze the PreFab corrections on previously optimized grating couplers. Whether a design is “constrained” or “unconstrained” refers to whether or not feature size penalties (constraints) were applied during its optimization.

Below is an example of a simple target design, its predicted structure after fabrication, the corrected design, and the predicted structure after fabrication of the correction (outcome). Without correction, the Intersection over Union (IoU) between the predicted and the nominal design is 0.65. After applying corrections, the IoU between the outcome and the nominal design rises to 0.97.

PreFab Target Example
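As a reference for the IoU values quoted above, here is a minimal sketch of the Intersection over Union metric for two binary device arrays of the same shape (the helper name is ours, not part of the PreFab API):

import numpy as np

def intersection_over_union(device_a: np.ndarray, device_b: np.ndarray) -> float:
    """IoU between two binarized (0/1) device images of equal shape."""
    a = device_a > 0.5
    b = device_b > 0.5
    intersection = np.logical_and(a, b).sum()
    union = np.logical_or(a, b).sum()
    return float(intersection / union)

# e.g. intersection_over_union(prediction_binarized, nominal) ~ 0.65 before correction
# and  intersection_over_union(outcome_binarized, nominal)    ~ 0.97 after correction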

Here is another example with a more complex geometry, including the fabricated results, showing good agreement with the corrected model.

PreFab Intro

We will apply these same benefits to our grating coupler design.

First, install the PreFab Python package.

[2]:
%pip install prefab
Requirement already satisfied: prefab in /opt/homebrew/lib/python3.11/site-packages (0.5.0)
Requirement already satisfied: gdspy in /opt/homebrew/lib/python3.11/site-packages (from prefab) (1.6.13)
Requirement already satisfied: matplotlib in /opt/homebrew/lib/python3.11/site-packages (from prefab) (3.7.0)
Requirement already satisfied: numpy in /opt/homebrew/lib/python3.11/site-packages (from prefab) (1.23.5)
Requirement already satisfied: opencv-python-headless in /opt/homebrew/lib/python3.11/site-packages (from prefab) (4.9.0.80)
Requirement already satisfied: requests in /opt/homebrew/lib/python3.11/site-packages (from prefab) (2.31.0)
Requirement already satisfied: contourpy>=1.0.1 in /opt/homebrew/lib/python3.11/site-packages (from matplotlib->prefab) (1.0.7)
Requirement already satisfied: cycler>=0.10 in /opt/homebrew/lib/python3.11/site-packages (from matplotlib->prefab) (0.11.0)
Requirement already satisfied: fonttools>=4.22.0 in /opt/homebrew/lib/python3.11/site-packages (from matplotlib->prefab) (4.38.0)
Requirement already satisfied: kiwisolver>=1.0.1 in /opt/homebrew/lib/python3.11/site-packages (from matplotlib->prefab) (1.4.4)
Requirement already satisfied: packaging>=20.0 in /opt/homebrew/lib/python3.11/site-packages (from matplotlib->prefab) (23.0)
Requirement already satisfied: pillow>=6.2.0 in /opt/homebrew/lib/python3.11/site-packages (from matplotlib->prefab) (9.5.0)
Requirement already satisfied: pyparsing>=2.3.1 in /opt/homebrew/lib/python3.11/site-packages (from matplotlib->prefab) (3.0.9)
Requirement already satisfied: python-dateutil>=2.7 in /opt/homebrew/lib/python3.11/site-packages (from matplotlib->prefab) (2.8.2)
Requirement already satisfied: charset-normalizer<4,>=2 in /opt/homebrew/lib/python3.11/site-packages (from requests->prefab) (3.0.1)
Requirement already satisfied: idna<4,>=2.5 in /opt/homebrew/lib/python3.11/site-packages (from requests->prefab) (3.4)
Requirement already satisfied: urllib3<3,>=1.21.1 in /opt/homebrew/lib/python3.11/site-packages (from requests->prefab) (1.26.14)
Requirement already satisfied: certifi>=2017.4.17 in /opt/homebrew/lib/python3.11/site-packages (from requests->prefab) (2022.12.7)
Requirement already satisfied: six>=1.5 in /opt/homebrew/lib/python3.11/site-packages (from python-dateutil>=2.7->matplotlib->prefab) (1.16.0)
Note: you may need to restart the kernel to use updated packages.

PreFab models operate on a serverless cloud platform. To initiate prediction requests, you must first create an account.

[3]:
import webbrowser

_ = webbrowser.open("https://www.prefabphotonics.com/signup")

To link your account, a token is required. Running the following command opens a browser window, allowing you to log in and validate your token.

[4]:
!python3 -m prefab setup
Started token authentication flow on the web browser...
Token verified!
Token successfully stored in /Users/twhughes/.prefab.toml

Load starting designs#

The pre-optimized device is loaded from a GDS file included in misc/; it contains numerous intricate features, in contrast to the constrained design of the previous notebook. Ideally, we would also include the waveguide at this stage due to potential variations at the interface, but for the sake of this demonstration we’ll keep things simple.

First, let’s set some global variables defining where the files will be stored.

[5]:
# gds file storing original design, and where we'll write the final design in a new cell
GDS_FILE = "misc/prefab_gc.gds"
GDS_CELL_START = "gc"
GDS_CELL_FINAL = "gc_tidy_prefab"

# base tidy3d.Simulation (without grating coupler)
SIM_BASE_FILE = "misc/prefab_base_sim.hdf5"

The hdf5 file stores a base td.Simulation with no grating coupler added. We’ll use this as the starting point for our analysis.

The grating coupler structure converts a vertically incident Gaussian-like mode from an optical fiber into a guided mode and funnels it into the \(Si\) waveguide. We are considering a fully etched grating structure, so a \(SiO_{2}\) BOX layer is included. To reduce backreflection, we adjusted the fiber tilt angle to \(10^{\circ}\) [1, 2].
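For reference, a tilted, fiber-like source such as the one stored in the base simulation can be set up with td.GaussianBeam; the values below are illustrative assumptions, not the exact settings saved in the HDF5 file.

# Illustrative sketch of a tilted fiber-like source (assumed values; the actual
# source is already defined inside SIM_BASE_FILE)
freq0 = td.C_0 / 1.55  # assume a 1550 nm center wavelength
fiber_source = td.GaussianBeam(
    center=(0, 0, 1.0),  # placed above the grating (illustrative height, in um)
    size=(15, 15, 0),  # planar injection surface
    source_time=td.GaussianPulse(freq0=freq0, fwidth=freq0 / 10),
    direction="-",  # inject downward toward the chip
    angle_theta=np.deg2rad(10),  # 10 degree fiber tilt to reduce backreflection
    pol_angle=np.pi / 2,
    waist_radius=5.2,  # ~10.4 um mode-field diameter, typical of SMF-28 (assumption)
)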

Let’s visualize it below.

[6]:
# load the base simulation (no grating coupler)
sim_base = td.Simulation.from_file(SIM_BASE_FILE)

sim_base.plot_3d()

The GDS file stores our starting device, which was obtained from the grating coupler inverse design notebook with no extra fabrication penalty included.

[7]:
import prefab as pf

device = pf.load_device_gds(path=GDS_FILE, cell_name=GDS_CELL_START)

plt.imshow(device)
plt.title("Nominal Device")
plt.ylabel("y (nm)")
plt.xlabel("x (nm)")
plt.show()
../_images/notebooks_AdjointPlugin14PreFab_16_0.png

We can combine the base simulation and the device design with the following function, which takes a device array, constructs a td.Structure and adds it to a copy of the base Simulation.

[8]:
def make_simulation(device: np.ndarray) -> td.Simulation:
    """Add a grating coupler from a given device array."""

    # grab some material and geometric parameters from the base simulation and waveguide
    waveguide = sim_base.structures[0]
    eps_min = sim_base.medium.permittivity
    eps_max = waveguide.medium.permittivity
    w_thick = waveguide.geometry.size[2]

    # construct the grating coupler out of the parameters
    eps_values = eps_min + (eps_max - eps_min) * device
    dev_width = device.shape[1] / 1000
    dev_height = device.shape[0] / 1000
    Nx, Ny = eps_values.shape
    X = np.linspace(-dev_width / 2, dev_width / 2, Nx)
    Y = np.linspace(-dev_height / 2, dev_height / 2, Ny)
    Z = np.array([0])
    eps_array = td.SpatialDataArray(
        np.expand_dims(eps_values, axis=-1), coords=dict(x=X, y=Y, z=Z)
    )
    gc = td.Structure(
        geometry=td.Box(center=(0, 0, 0), size=(td.inf, td.inf, w_thick)),
        medium=td.CustomMedium.from_eps_raw(eps_array),
    )

    # return a copy of the base simulation with the grating coupler added
    # (add it FIRST so that later structures, e.g. the waveguide, take precedence where they overlap)
    all_structures = [gc] + list(sim_base.structures)

    return sim_base.updated_copy(structures=all_structures)

Let’s test this function out and view our starting, un-corrected device.

[9]:
sim = make_simulation(device)
[10]:
ax = sim.plot_eps(z=0, monitor_alpha=0.0)
../_images/notebooks_AdjointPlugin14PreFab_21_0.png

Note: the orange box indicates a symmetry region.
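To check which symmetry the base simulation imposes, you can inspect the simulation’s symmetry tuple directly:

# (sx, sy, sz) symmetry flags inherited from the base simulation; nonzero entries
# correspond to the symmetry region shown in orange in the plot above
print(sim.symmetry)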

Apply PreFab models#

We’re now ready to predict, correct, and anticipate the final outcome of the device using a model based on Applied Nanotools’ silicon photonics process. The prediction will take a few seconds to complete. The correction process will take a minute or two.

[11]:
MODEL_NAME = "ANT_NanoSOI"
MODEL_TAGS = "v5-d4"

prediction = pf.predict(
    device=device, model_name=MODEL_NAME, model_tags=MODEL_TAGS, binarize=False
)
[12]:
correction = pf.correct(
    device=device,
    model_name=MODEL_NAME,
    model_tags=MODEL_TAGS,
    binarize=True,
    multi_correct=True,
)
outcome = pf.predict(
    device=correction, model_name=MODEL_NAME, model_tags=MODEL_TAGS, binarize=False
)

Now we plot the predictions and corrections. Upon a closer look at the device’s variations, we see several fuzzy areas around the edges of the prediction. These fuzzy spots represent areas of uncertainty in the design and the expected variance on the chip, especially in smaller, complex features. The prediction also shows many rounded corners, bridged gaps, and filled holes, indicating further changes during fabrication.

[13]:
xs, ys, zoom_size = 2000, 2000, 1000
fig, axs = plt.subplots(2, 4, figsize=(20, 10))
titles = ["Nominal", "Prediction", "Correction", "Outcome"]
data = [device, prediction, correction, outcome]

for i, (title, datum) in enumerate(zip(titles, data)):
    axs[0, i].imshow(datum)
    axs[0, i].set_title(title)
    axs[1, i].set_xlabel("x (nm)")
    axs[0, 0].set_ylabel("y (nm)")
    axs[1, 0].set_ylabel("y (nm)")

    axs[1, i].imshow(datum[ys : ys + zoom_size, xs : xs + zoom_size])
    axs[1, i].set_title(f"{title} Zoomed")

plt.show()
../_images/notebooks_AdjointPlugin14PreFab_28_0.png

Below, the images provide a visualization of prediction binarizations at different levels of uncertainty. Notably, binarization at a 50% threshold has the highest probability of occurrence, with the probability decreasing as the threshold moves towards 0% or 100%. By thresholding the raw prediction output, we can see the various potential variations in the design. The magenta contour overlaid on these images serves as a reference to the original design.

While we can mitigate this uncertainty somewhat by applying corrections to create larger features, some uncertainty will inevitably remain. In this case, the prediction of the correction (outcome) shows a near-complete restoration, which is quite promising.

[14]:
def plot_binarizations(data: np.ndarray, title: str) -> None:
    """Plot the device along with a binarization threshold contour."""
    _, axs = plt.subplots(1, 4, figsize=(20, 5))
    for i, eta in enumerate([None, 0.5, 0.3, 0.7]):
        if eta is None:
            img = data[ys : ys + zoom_size, xs : xs + zoom_size]
            axs[i].set_title("Raw " + title)
        else:
            img = pf.binarize_hard(
                data[ys : ys + zoom_size, xs : xs + zoom_size], eta=eta
            )
            axs[i].set_title(f"Binarized {title} ({int(eta*100)}% Threshold)")
        axs[i].imshow(img)
        axs[i].imshow(
            pf.generate_device_contour(
                device[ys : ys + zoom_size, xs : xs + zoom_size]
            ),
            cmap="spring",
        )
        axs[i].set_xlabel("x (nm)")
        if i == 0:
            axs[i].set_ylabel("y (nm)")
    plt.show()


plot_binarizations(prediction, "Prediction")
plot_binarizations(outcome, "Outcome")
../_images/notebooks_AdjointPlugin14PreFab_30_0.png
../_images/notebooks_AdjointPlugin14PreFab_30_1.png

Test PreFab predictions in simulation#

Next, we will prepare the device variations for re-simulation. To understand the stochastic, or random, variations from one device to another, we will simulate the predictions at different binarization thresholds. This is somewhat akin to uniform erosion and dilation tests, but it is data-driven and varies depending on the feature. Consequently, we will observe less variance for larger features and more variance for smaller ones.
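For comparison, a uniform erosion/dilation test (a common way to emulate a fixed over- or under-etch) might look like the sketch below, assuming scipy is available and that one pixel of the device array corresponds to one nanometer; unlike the PreFab prediction, it applies the same edge bias to every feature regardless of its size or surroundings.

from scipy import ndimage

bias_nm = 10  # uniform edge bias (assumed value); 1 pixel = 1 nm in this device array
footprint = np.ones((2 * bias_nm + 1, 2 * bias_nm + 1))

device_binary = device > 0.5
device_eroded = ndimage.binary_erosion(device_binary, structure=footprint)  # under-etch test
device_dilated = ndimage.binary_dilation(device_binary, structure=footprint)  # over-etch test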

We then write a function to simulate a set of devices in parallel using tidy3d.web.Batch, which we’ll use to analyze the performance over various threshold values.

[15]:
def run_simulations(devices: list[np.ndarray], task_names: list[str]) -> td.web.BatchData:
    """Construct and run a set of simulations in a batch."""
    sims = {task_name: make_simulation(device) for device, task_name in zip(devices, task_names)}
    batch = web.Batch(simulations=sims)
    return batch.run(path_dir="data")
[16]:
etas = list(np.arange(0.2, 0.9, 0.1))

task_names = []
devices = []

# dev simulation
task_names.append("inv_des_gc_dev")
devices.append(device)

# predictions simulations (vs eta)
for eta in etas:
    task_names.append(f"inv_des_gc_pred_bin{int(eta*100)}")
    device_prediction = pf.binarize_hard(device=prediction, eta=eta)
    devices.append(device_prediction)

# outcome simulations (vs eta)
for eta in etas:
    task_names.append(f"inv_des_gc_out_bin{int(eta*100)}")
    device_outcome = pf.binarize_hard(device=outcome, eta=eta)
    devices.append(device_outcome)
[17]:
batch_data = run_simulations(devices=devices, task_names=task_names)
10:30:46 EST Created task 'inv_des_gc_dev' with task_id
             'fdve-0820cb60-9885-4dd9-a5ea-70ca0c44f575' and task_type 'FDTD'.
/opt/homebrew/lib/python3.11/site-packages/rich/live.py:229: UserWarning:
install "ipywidgets" for Jupyter support
  warnings.warn('install "ipywidgets" for Jupyter support')
10:30:49 EST Created task 'inv_des_gc_pred_bin20' with task_id
             'fdve-319badaf-011e-497a-8577-a3d6d566fa3b' and task_type 'FDTD'.
10:30:52 EST Created task 'inv_des_gc_pred_bin30' with task_id
             'fdve-ba012d03-bae2-40f7-b7b6-07673b07e781' and task_type 'FDTD'.
10:30:55 EST Created task 'inv_des_gc_pred_bin40' with task_id
             'fdve-73156ea6-9eaa-4b95-8f99-9d8b39b649d1' and task_type 'FDTD'.
10:30:58 EST Created task 'inv_des_gc_pred_bin50' with task_id
             'fdve-fa2006aa-f94e-4369-ba90-d86358c76d0e' and task_type 'FDTD'.
10:31:01 EST Created task 'inv_des_gc_pred_bin60' with task_id
             'fdve-76a167f3-a5cb-429f-aca1-f8d7630176e0' and task_type 'FDTD'.
10:31:03 EST Created task 'inv_des_gc_pred_bin70' with task_id
             'fdve-30b2591c-9abb-4eae-99b2-6d79a069e8ad' and task_type 'FDTD'.
10:31:05 EST Created task 'inv_des_gc_pred_bin80' with task_id
             'fdve-de821ef4-1af9-4d1d-a29e-5f619d75b05e' and task_type 'FDTD'.
10:31:07 EST Created task 'inv_des_gc_out_bin20' with task_id
             'fdve-e9021243-ef77-458c-850c-23e199abac4c' and task_type 'FDTD'.
10:31:09 EST Created task 'inv_des_gc_out_bin30' with task_id
             'fdve-f85fe271-3f56-470d-9fc6-841da995b001' and task_type 'FDTD'.
10:31:11 EST Created task 'inv_des_gc_out_bin40' with task_id
             'fdve-adc6804d-846e-4dae-a26c-c8474fdabf36' and task_type 'FDTD'.
10:31:13 EST Created task 'inv_des_gc_out_bin50' with task_id
             'fdve-885a22ec-6236-481f-89ae-336eed4b213e' and task_type 'FDTD'.
10:31:15 EST Created task 'inv_des_gc_out_bin60' with task_id
             'fdve-93fde6d3-0aaa-491f-8521-0efb93d9a643' and task_type 'FDTD'.
10:31:17 EST Created task 'inv_des_gc_out_bin70' with task_id
             'fdve-4a46ab14-a016-4e0a-9b72-0cdb4618770c' and task_type 'FDTD'.
10:31:20 EST Created task 'inv_des_gc_out_bin80' with task_id
             'fdve-116475cc-046f-4395-8d36-611667a93381' and task_type 'FDTD'.
10:31:41 EST Started working on Batch.
10:34:03 EST Maximum FlexCredit cost: 2.893 for the whole batch.
             Use 'Batch.real_cost()' to get the billed FlexCredit cost after the
             Batch has completed.
10:34:08 EST Batch complete.
[18]:
# extract the various sim_data from the batch data output
sim_data_dev = batch_data["inv_des_gc_dev"]
sim_data_pred = {eta: batch_data[f"inv_des_gc_pred_bin{int(eta*100)}"] for eta in etas}
sim_data_out = {eta: batch_data[f"inv_des_gc_out_bin{int(eta*100)}"] for eta in etas}
10:34:21 EST loading simulation from
             data/fdve-0820cb60-9885-4dd9-a5ea-70ca0c44f575.hdf5
10:34:39 EST loading simulation from
             data/fdve-319badaf-011e-497a-8577-a3d6d566fa3b.hdf5
10:34:52 EST loading simulation from
             data/fdve-ba012d03-bae2-40f7-b7b6-07673b07e781.hdf5
10:35:08 EST loading simulation from
             data/fdve-73156ea6-9eaa-4b95-8f99-9d8b39b649d1.hdf5
10:35:14 EST loading simulation from
             data/fdve-fa2006aa-f94e-4369-ba90-d86358c76d0e.hdf5
10:35:21 EST loading simulation from
             data/fdve-76a167f3-a5cb-429f-aca1-f8d7630176e0.hdf5
10:35:49 EST loading simulation from
             data/fdve-30b2591c-9abb-4eae-99b2-6d79a069e8ad.hdf5
10:36:05 EST loading simulation from
             data/fdve-de821ef4-1af9-4d1d-a29e-5f619d75b05e.hdf5
10:36:24 EST loading simulation from
             data/fdve-e9021243-ef77-458c-850c-23e199abac4c.hdf5
10:36:43 EST loading simulation from
             data/fdve-f85fe271-3f56-470d-9fc6-841da995b001.hdf5
10:36:53 EST loading simulation from
             data/fdve-adc6804d-846e-4dae-a26c-c8474fdabf36.hdf5
10:37:05 EST loading simulation from
             data/fdve-885a22ec-6236-481f-89ae-336eed4b213e.hdf5
10:37:17 EST loading simulation from
             data/fdve-93fde6d3-0aaa-491f-8521-0efb93d9a643.hdf5
10:37:30 EST loading simulation from
             data/fdve-4a46ab14-a016-4e0a-9b72-0cdb4618770c.hdf5
10:37:40 EST loading simulation from
             data/fdve-116475cc-046f-4395-8d36-611667a93381.hdf5
[19]:
def calculate_loss(sim_data_dict: dict) -> dict:
    """Extract the loss (dB) from the simulation data results."""
    loss_db_dict = {}
    for eta, sim_data in sim_data_dict.items():
        mode_amps = sim_data["gc_efficiency"]
        coeffs_f = mode_amps.amps.sel(direction="-")
        power_0 = np.abs(coeffs_f.sel(mode_index=0)) ** 2
        power_0_db = 10 * np.log10(power_0)
        loss_db = max(power_0_db)
        loss_db_dict[eta] = loss_db
    return loss_db_dict


loss_db_dev = calculate_loss({0.5: sim_data_dev})
loss_db_pred = calculate_loss(sim_data_pred)
loss_db_out = calculate_loss(sim_data_out)

etas = list(loss_db_pred.keys())
etas_dev = [0.5]
losses_pred = [loss_db_pred[eta] for eta in etas]
losses_out = [loss_db_out[eta] for eta in etas]
losses_dev = [loss_db_dev[0.5]]
losses_orig = [-2.30]

plt.figure(figsize=(10, 6))
plt.plot(0.5, losses_orig[0], "r*", label="Nominal (Constrained)", markersize=10)
plt.plot(etas_dev, losses_dev, "*", label="Nominal (Unconstrained)", markersize=10)
plt.plot(etas, losses_pred, "s-", label="Prediction (Unconstrained) Without Correction")
plt.plot(etas, losses_out, "^-", label="Prediction (Unconstrained) With Correction")
plt.xlabel("Prediction Binarization Threshold (0.5 is most likely)")
plt.ylabel("Loss (dB)")
plt.title("Predicted Variance of Grating Coupler Loss")
plt.legend()
plt.grid(True)
plt.show()
../_images/notebooks_AdjointPlugin14PreFab_37_0.png

The optimization process without constraints significantly enhances performance, achieving a lower loss of -1.85 dB compared to the -2.30 dB observed in the previous notebook. However, when the predicted fabrication variations are taken into account, the performance of this new design degrades slightly to -2.34 dB. By applying the corrections, we show that the anticipated chip-level performance can be restored to -1.84 dB. Sweeping the binarization threshold over the uncertainty range of the predictions lets us assess the expected variance between fabricated devices. This not only underscores the substantial advantages of PreFab correction but also deepens our understanding of what the fabrication process can resolve.
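As a quick sanity check on this variance, the spread of the swept losses can be compared directly using the lists defined in the previous cell:

# spread of the predicted loss across the binarization sweep, as a rough proxy
# for device-to-device variability before and after correction
spread_pred = float(max(losses_pred) - min(losses_pred))
spread_out = float(max(losses_out) - min(losses_out))
print(f"Loss spread without correction: {spread_pred:.2f} dB")
print(f"Loss spread with correction:    {spread_out:.2f} dB")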

Use the following code block to export your predictions and corrections. This will write the refined design into a new cell in the original GDS file located in misc/.

[20]:
import gdstk

gds_library = gdstk.read_gds(infile=GDS_FILE)
device_cell = pf.device_to_cell(
    device=device,
    cell_name="gc_device",
    library=gds_library,
    layer=(1, 0),
)
prediction_cell = pf.device_to_cell(
    device=pf.binarize(prediction),
    cell_name="gc_prediction",
    library=gds_library,
    layer=(9, 0),
)
corrected_cell = pf.device_to_cell(
    device=correction,
    cell_name="gc_correction",
    library=gds_library,
    layer=(90, 0),
)
outcome_cell = pf.device_to_cell(
    device=pf.binarize(outcome),
    cell_name="gc_outcome",
    library=gds_library,
    layer=(800, 0),
)

gc_cell = gds_library.new_cell(GDS_CELL_FINAL)
origin = (-prediction.shape[1] / 2 / 1000, -prediction.shape[0] / 2 / 1000)
gds_library[GDS_CELL_FINAL].add(gdstk.Reference(cell=device_cell, origin=origin))
gds_library[GDS_CELL_FINAL].add(gdstk.Reference(cell=prediction_cell, origin=origin))
gds_library[GDS_CELL_FINAL].add(gdstk.Reference(cell=corrected_cell, origin=origin))
gds_library[GDS_CELL_FINAL].add(gdstk.Reference(cell=outcome_cell, origin=origin))
gds_library.write_gds(outfile=GDS_FILE, max_points=8190)

If you’re interested in learning more about PreFab, please visit the website and GitHub page. There, you’ll find more resources and examples to help you get the most out of the tools.