
Question About Unexpected Loss in Lumerical INTERCONNECT Simulation #240

@mahdijavid

Description

I designed a simple circuit consisting of two grating couplers connected by a waveguide. To analyze the time delay in the Lumerical INTERCONNECT simulation, I added oscilloscopes at the input and output of the circuit. While I can observe the expected delay, the signal experiences a loss that is far larger than anything I would expect, and I am concerned that this does not match the physical behavior of such a setup.
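For reference, this is roughly the behavior I would expect from such a link (a minimal Python sketch using assumed typical values for the coupler loss, waveguide loss, length, and group index, not the actual parameters of my design):

```python
# Rough sanity check for the expected insertion loss and time delay of a
# grating coupler -> waveguide -> grating coupler link.
# All values below are ASSUMED typical numbers, not the parameters in
# test002.zip -- substitute your own component settings.

gc_loss_dB = 3.5         # assumed per-grating-coupler insertion loss [dB]
wg_loss_dB_per_cm = 2.0  # assumed waveguide propagation loss [dB/cm]
wg_length_cm = 0.1       # assumed waveguide length: 1 mm
n_group = 4.2            # assumed group index of the waveguide mode
c = 2.998e8              # speed of light [m/s]

# Total loss: two coupler transitions plus propagation loss over the waveguide.
total_loss_dB = 2 * gc_loss_dB + wg_loss_dB_per_cm * wg_length_cm

# Group delay: physical length times group index over the speed of light.
delay_s = (wg_length_cm * 1e-2) * n_group / c

print(f"Expected insertion loss: {total_loss_dB:.2f} dB")
print(f"Expected group delay:    {delay_s * 1e12:.2f} ps")
```

Even with pessimistic assumptions the end-to-end loss should stay on the order of a few dB to roughly 10 dB, which is why the much larger loss I see in the simulation seems unrealistic to me.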
Could you please advise on what might be causing this excessive loss? I suspect it could be related to the component parameters, the simulation settings, or perhaps an oversight in how I have configured the oscilloscopes or other elements. Any guidance or suggestions you could offer would be very helpful, and I would be happy to provide more details about my setup if needed.
Thank you very much for your time and expertise.
Best regards,

[Two screenshots attached]

test002.zip
