losses

The losses submodule contains classes for computing common loss functions.
class neuroptica.losses.CategoricalCrossEntropy

    Bases: neuroptica.losses.Loss

    Represents categorical cross-entropy with a softmax layer implicitly applied to the network outputs.
    static L(X: np.ndarray, T: np.ndarray) -> np.ndarray

        The scalar, real-valued loss function (vectorized over multiple X, T inputs).

        :param X: the output of the network
        :param T: the target output
        :return: the loss for each X
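A minimal sketch of what a categorical cross-entropy loss with an implicit softmax computes, assuming samples are stacked along the last axis of X and T and that T holds one-hot targets. The helper names `softmax` and `categorical_cross_entropy` are illustrative, not part of neuroptica's API:

```python
import numpy as np

def softmax(X):
    # Column-wise softmax; subtracting the max keeps exp() numerically stable
    e = np.exp(X - np.max(X, axis=0, keepdims=True))
    return e / np.sum(e, axis=0, keepdims=True)

def categorical_cross_entropy(X, T):
    # Loss per sample: -sum_i T_i * log(softmax(X)_i)
    P = softmax(X)
    return -np.sum(T * np.log(P + 1e-12), axis=0)

X = np.array([[2.0, 0.5],
              [1.0, 1.5],
              [0.1, 0.2]])   # network outputs, one column per sample
T = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])   # one-hot targets, one column per sample

loss = categorical_cross_entropy(X, T)
print(loss.shape)  # one scalar loss per sample
```

Vectorizing over columns means a whole batch can be scored with a single call, mirroring the "vectorized over multiple X, T inputs" note above.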
    static dL(X: np.ndarray, T: np.ndarray) -> np.ndarray

        The derivative of the loss function, dL/dX_L, used for backpropagation (vectorized over multiple X).

        :param X: the output of the network
        :param T: the target output
        :return: dL/dX_L for each X
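When softmax is applied implicitly inside the loss, the gradient with respect to the pre-softmax outputs simplifies to softmax(X) - T. A sketch verifying this against a finite-difference approximation (the helper names are illustrative, not neuroptica's API):

```python
import numpy as np

def softmax(X):
    e = np.exp(X - np.max(X, axis=0, keepdims=True))
    return e / np.sum(e, axis=0, keepdims=True)

def L(X, T):
    # Categorical cross-entropy with implicit softmax, single sample
    return -np.sum(T * np.log(softmax(X)))

def dL(X, T):
    # Analytic gradient: dL/dX = softmax(X) - T
    return softmax(X) - T

X = np.array([[2.0], [1.0], [0.1]])  # one sample's network outputs
T = np.array([[1.0], [0.0], [0.0]])  # one-hot target

# Central finite differences, one component at a time
eps = 1e-6
numeric = np.zeros_like(X)
for i in range(X.shape[0]):
    Xp = X.copy(); Xp[i] += eps
    Xm = X.copy(); Xm[i] -= eps
    numeric[i] = (L(Xp, T) - L(Xm, T)) / (2 * eps)

max_err = np.max(np.abs(numeric - dL(X, T)))
print(max_err)  # analytic and numeric gradients agree to high precision
```

This closed form is why frameworks fold softmax into the loss: backpropagation starts from a cheap, numerically well-behaved expression instead of differentiating through an explicit softmax layer.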