losses

The Losses submodule contains classes for computing common loss functions.
class neuroptica.losses.CategoricalCrossEntropy

    Bases: neuroptica.losses.Loss

    Represents categorical cross-entropy with a softmax layer implicitly applied to the network outputs.
    static L(X: np.ndarray, T: np.ndarray) → np.ndarray

        The scalar, real-valued loss function (vectorized over multiple X, T inputs).

        Parameters:
            X – the output of the network
            T – the target output

        Returns: the loss for each X
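As a rough illustration of what a softmax-plus-cross-entropy loss computes, here is a minimal real-valued sketch. The function name `categorical_cross_entropy` and the convention that samples are stacked along the last axis are assumptions for this example, not necessarily how neuroptica lays out its data internally:

```python
import numpy as np

def softmax(X):
    # Subtract the per-sample max before exponentiating for numerical stability
    e = np.exp(X - np.max(X, axis=0, keepdims=True))
    return e / np.sum(e, axis=0, keepdims=True)

def categorical_cross_entropy(X, T):
    # X: raw network outputs, shape (n_outputs, n_samples)
    # T: one-hot target outputs, same shape as X
    P = softmax(X)
    # Small epsilon guards against log(0); returns one scalar loss per sample
    return -np.sum(T * np.log(P + 1e-12), axis=0)
```

For two equal logits and a one-hot target, the softmax probabilities are each 0.5, so the loss per sample is −log(0.5) = log 2.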
    static dL(X: np.ndarray, T: np.ndarray) → np.ndarray

        The derivative dL/dX_L of the loss function, used for backpropagation (vectorized over multiple X).

        Parameters:
            X – the output of the network
            T – the target output

        Returns: dL/dX_L for each X
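When the softmax is folded into the cross-entropy loss, the derivative with respect to the pre-softmax outputs takes the well-known simplified form softmax(X) − T. The sketch below shows this standard real-valued identity; it is illustrative only and may not match neuroptica's internal implementation, which operates on optical fields:

```python
import numpy as np

def d_categorical_cross_entropy(X, T):
    # X: raw network outputs, shape (n_outputs, n_samples)
    # T: one-hot target outputs, same shape as X
    # Stabilized softmax
    e = np.exp(X - np.max(X, axis=0, keepdims=True))
    P = e / np.sum(e, axis=0, keepdims=True)
    # The softmax Jacobian contracts with -T/P to give simply P - T
    return P - T
```

A quick sanity check: for two equal logits, P = [0.5, 0.5], so with target [1, 0] the gradient is [−0.5, 0.5], pushing the correct logit up and the other down.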