layers

The layers submodule contains functionality for implementing a logical “layer” in the simulated optical neural network. The API for this module is based loosely on Keras.

class neuroptica.layers.Activation(nonlinearity: neuroptica.nonlinearities.Nonlinearity)[source]

Bases: neuroptica.layers.NetworkLayer

Represents a (nonlinear) activation layer. Note that in this layer the roles of X and Z are reversed: Z is the input field, and X is the output field fed to the next linear layer.

__init__(nonlinearity: neuroptica.nonlinearities.Nonlinearity)[source]

Initialize the activation layer
Parameters:
  • nonlinearity – a Nonlinearity instance

backward_pass(gamma: np.ndarray) → np.ndarray[source]

Compute the backward (adjoint) pass, given a backward-propagating field shined into the layer from the outputs
Parameters:
  • gamma – backward-propagating field shining into the NetworkLayer outputs
Returns: transformed “input” fields to feed to the previous layer of the ONN

forward_pass(Z: np.ndarray) → np.ndarray[source]

Compute the forward pass of input fields into the network layer
Parameters:
  • Z – input fields to the NetworkLayer
Returns: transformed output fields to feed into the next layer of the ONN
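The forward/backward logic of an activation layer can be sketched in plain NumPy. This is an illustrative, real-valued example only, not the library's internal implementation (the actual Nonlinearity classes operate on complex optical fields): the forward pass applies the nonlinearity elementwise, and the backward pass scales the backward-propagating field by the derivative evaluated at the cached input.

```python
import numpy as np

def activation_forward(Z, f):
    # X = f(Z): apply the nonlinearity elementwise to the input fields
    return f(Z)

def activation_backward(gamma, Z, df):
    # Chain rule: scale the backward-propagating field gamma by f'(Z),
    # evaluated at the input cached during the forward pass
    return gamma * df(Z)

f = np.tanh
df = lambda z: 1.0 - np.tanh(z) ** 2  # derivative of tanh

Z = np.array([0.0, 0.5, -1.0])
X = activation_forward(Z, f)
delta_prev = activation_backward(np.ones(3), Z, df)
```

Note how this mirrors the reversed naming convention above: Z enters the layer and X leaves it.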

class neuroptica.layers.ClementsLayer(N: int, M=None, include_phase_shifter_layer=True, initializer=None)[source]

Bases: neuroptica.layers.OpticalMeshNetworkLayer

Implements a unitary operator on N waveguides with MZIs arranged in a Clements decomposition of M columns. If M=N, the layer can implement an arbitrary N×N unitary operator.

__init__(N: int, M=None, include_phase_shifter_layer=True, initializer=None)[source]

Initialize the ClementsLayer
Parameters:
  • N – number of input and output waveguides
  • M – number of MZI columns; equal to N by default
  • include_phase_shifter_layer – if True, include a layer of single-mode phase shifters at the beginning of the mesh (required to implement an arbitrary unitary)
  • initializer – optional initializer method (WIP)

backward_pass(delta: np.ndarray, cache_fields=False, use_partial_vectors=False) → np.ndarray[source]

Compute the backward pass
Parameters:
  • delta – adjoint “output” electric fields backpropagated from the next ONN layer
  • cache_fields – if True, cache the intermediate fields
  • use_partial_vectors – if True, use the partial vector method to speed up transfer matrix computations
Returns: adjoint “input” fields for the previous ONN layer

forward_pass(X: np.ndarray, cache_fields=False, use_partial_vectors=False) → np.ndarray[source]

Compute the forward pass
Parameters:
  • X – input electric fields
  • cache_fields – if True, cache the intermediate fields
  • use_partial_vectors – if True, use the partial vector method to speed up transfer matrix computations
Returns: output fields for the next ONN layer
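The building block of a Clements (or Reck) mesh is the 2×2 Mach-Zehnder interferometer. A minimal NumPy sketch, under one common MZI convention (phase shifter, 50:50 beamsplitter, internal phase shift, second beamsplitter; the library's exact convention may differ), shows that the resulting transfer matrix is unitary, since it is a product of unitary factors:

```python
import numpy as np

def mzi(theta, phi):
    # 2x2 MZI transfer matrix: input phase shifter (phi), 50:50
    # beamsplitter, internal phase shift (theta), second beamsplitter.
    # One common convention; other sign/phase conventions exist.
    B = np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)  # 50:50 beamsplitter
    P_phi = np.diag([np.exp(1j * phi), 1.0])
    P_theta = np.diag([np.exp(1j * theta), 1.0])
    return B @ P_theta @ B @ P_phi

U = mzi(0.3, 1.1)
# A product of unitary matrices is unitary: U† U = I
```

A full Clements mesh composes N(N-1)/2 such blocks across M columns, so the whole layer transfer matrix stays unitary by the same argument.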

class neuroptica.layers.DropMask(N: int, keep_ports=None, drop_ports=None)[source]

Bases: neuroptica.layers.NetworkLayer

Drop specified ports entirely, reducing the size of the network for the next layer.

__init__(N: int, keep_ports=None, drop_ports=None)[source]
Parameters:
  • N – number of input ports to the DropMask layer
  • keep_ports – list or iterable of which ports to keep (drop_ports must be None if keep_ports is specified)
  • drop_ports – list or iterable of which ports to drop (keep_ports must be None if drop_ports is specified)
backward_pass(delta: np.ndarray) → np.ndarray[source]

Compute the backward (adjoint) pass, given a backward-propagating field shined into the layer from the outputs
Parameters:
  • delta – backward-propagating field shining into the NetworkLayer outputs
Returns: transformed “input” fields to feed to the previous layer of the ONN

forward_pass(X: np.ndarray)[source]

Compute the forward pass of input fields into the network layer
Parameters:
  • X – input fields to the NetworkLayer
Returns: transformed output fields to feed into the next layer of the ONN
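The effect of port dropping can be sketched in NumPy (an illustrative sketch of the idea, not the library's code): the forward pass selects the kept ports, shrinking the field vector, and the backward pass is the adjoint of that selection, scattering the backward field into the kept ports and zeros into the dropped ones.

```python
import numpy as np

def drop_mask_forward(X, keep_ports):
    # Keep only the selected ports; the next layer sees fewer waveguides
    return X[keep_ports]

def drop_mask_backward(delta, keep_ports, N):
    # Adjoint of selection: scatter the backward-propagating field into
    # the kept ports, with zeros in the dropped ones
    out = np.zeros(N, dtype=delta.dtype)
    out[keep_ports] = delta
    return out

X = np.array([1.0, 2.0, 3.0, 4.0])
keep = [0, 2]
Y = drop_mask_forward(X, keep)
delta_prev = drop_mask_backward(np.array([5.0, 6.0]), keep, N=4)
```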

class neuroptica.layers.NetworkLayer(input_size: int, output_size: int, initializer=None)[source]

Bases: object

Represents a logical layer in a simulated optical neural network. A NetworkLayer is different from a ComponentLayer, but it may contain a ComponentLayer or an OpticalMesh to compute the forward and backward logic.

__init__(input_size: int, output_size: int, initializer=None)[source]

Initialize the NetworkLayer
Parameters:
  • input_size – number of input ports
  • output_size – number of output ports (usually the same as input_size, unless DropMask is used)
  • initializer – optional initializer method (WIP)


backward_pass(delta: np.ndarray) → np.ndarray[source]

Compute the backward (adjoint) pass, given a backward-propagating field shined into the layer from the outputs
Parameters:
  • delta – backward-propagating field shining into the NetworkLayer outputs
Returns: transformed “input” fields to feed to the previous layer of the ONN

forward_pass(X: np.ndarray) → np.ndarray[source]

Compute the forward pass of input fields into the network layer
Parameters:
  • X – input fields to the NetworkLayer
Returns: transformed output fields to feed into the next layer of the ONN
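The contract a subclass must satisfy can be sketched with a hypothetical pass-through layer. Both the minimal stand-in base class and the IdentityLayer below are illustrative only (they are not part of neuroptica); they simply show that a layer implements forward_pass and backward_pass with matching port counts:

```python
import numpy as np

class NetworkLayer:
    # Minimal stand-in for neuroptica.layers.NetworkLayer, for illustration only
    def __init__(self, input_size, output_size, initializer=None):
        self.input_size = input_size
        self.output_size = output_size

class IdentityLayer(NetworkLayer):
    """Hypothetical layer that passes fields through unchanged."""

    def __init__(self, N):
        super().__init__(N, N)

    def forward_pass(self, X):
        self.X = X      # cache the input fields for the backward pass
        return X

    def backward_pass(self, delta):
        return delta    # the adjoint of the identity is the identity

layer = IdentityLayer(4)
out = layer.forward_pass(np.arange(4.0))
back = layer.backward_pass(np.ones(4))
```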

class neuroptica.layers.OpticalMeshNetworkLayer(input_size: int, output_size: int, initializer=None)[source]

Bases: neuroptica.layers.NetworkLayer

Base class for any network layer consisting of an optical mesh of phase shifters and MZIs

__init__(input_size: int, output_size: int, initializer=None)[source]

Initialize the OpticalMeshNetworkLayer
Parameters:
  • input_size – number of input waveguides
  • output_size – number of output waveguides
  • initializer – optional initializer method (WIP)

backward_pass(delta: np.ndarray, cache_fields=False, use_partial_vectors=False) → np.ndarray[source]

Compute the backward (adjoint) pass, given a backward-propagating field shined into the layer from the outputs
Parameters:
  • delta – backward-propagating field shining into the NetworkLayer outputs
  • cache_fields – if True, cache the intermediate fields
  • use_partial_vectors – if True, use the partial vector method to speed up transfer matrix computations
Returns: transformed “input” fields to feed to the previous layer of the ONN

forward_pass(X: np.ndarray, cache_fields=False, use_partial_vectors=False) → np.ndarray[source]

Compute the forward pass of input fields into the network layer
Parameters:
  • X – input fields to the NetworkLayer
  • cache_fields – if True, cache the intermediate fields
  • use_partial_vectors – if True, use the partial vector method to speed up transfer matrix computations
Returns: transformed output fields to feed into the next layer of the ONN

class neuroptica.layers.ReckLayer(N: int, include_phase_shifter_layer=True, initializer=None)[source]

Bases: neuroptica.layers.OpticalMeshNetworkLayer

Implements a unitary N×N operator with MZIs arranged in a Reck decomposition

__init__(N: int, include_phase_shifter_layer=True, initializer=None)[source]

Initialize the ReckLayer
Parameters:
  • N – number of input and output waveguides
  • include_phase_shifter_layer – if True, include a layer of single-mode phase shifters at the beginning of the mesh (required to implement an arbitrary unitary)
  • initializer – optional initializer method (WIP)

backward_pass(delta: np.ndarray) → np.ndarray[source]

Compute the backward (adjoint) pass, given a backward-propagating field shined into the layer from the outputs
Parameters:
  • delta – backward-propagating field shining into the NetworkLayer outputs
Returns: transformed “input” fields to feed to the previous layer of the ONN

forward_pass(X: np.ndarray) → np.ndarray[source]

Compute the forward pass of input fields into the network layer
Parameters:
  • X – input fields to the NetworkLayer
Returns: transformed output fields to feed into the next layer of the ONN
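Both the Reck (triangular) and Clements (rectangular) meshes need N(N-1)/2 MZIs to implement an arbitrary N×N unitary; the decompositions differ in circuit depth and layout, not in device count. A quick sanity check of that count (illustrative arithmetic only):

```python
def n_mzis(N):
    # Number of MZIs needed to implement an arbitrary N x N unitary,
    # in either a Reck (triangular) or Clements (rectangular) mesh
    return N * (N - 1) // 2

counts = {N: n_mzis(N) for N in (2, 4, 8)}
```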

class neuroptica.layers.StaticMatrix(matrix: np.ndarray)[source]

Bases: neuroptica.layers.NetworkLayer

Multiplies inputs by a static matrix (this is an aphysical layer)

__init__(matrix: np.ndarray)[source]
Parameters:
  • matrix – matrix to multiply inputs by
backward_pass(delta: np.ndarray)[source]

Compute the backward (adjoint) pass, given a backward-propagating field shined into the layer from the outputs
Parameters:
  • delta – backward-propagating field shining into the NetworkLayer outputs
Returns: transformed “input” fields to feed to the previous layer of the ONN

forward_pass(X: np.ndarray)[source]

Compute the forward pass of input fields into the network layer
Parameters:
  • X – input fields to the NetworkLayer
Returns: transformed output fields to feed into the next layer of the ONN
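For any linear layer, the backward pass is multiplication by the transpose of the forward matrix. A minimal real-valued NumPy sketch (illustrative only; for complex fields the correct adjoint may be M.T or M.conj().T depending on convention — this sketch uses M.T):

```python
import numpy as np

def static_forward(M, X):
    # Forward pass: multiply the input fields by the fixed matrix
    return M @ X

def static_backward(M, delta):
    # Backward pass of a linear map: multiply by the transpose
    return M.T @ delta

M = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [3.0, 0.0]])   # 3x2: maps 2 input ports to 3 output ports
X = np.array([1.0, 1.0])
Y = static_forward(M, X)
delta_prev = static_backward(M, np.array([1.0, 0.0, 0.0]))
```

Note that a non-square or non-unitary matrix here is exactly what makes the layer aphysical: a passive optical mesh alone cannot realize it.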