Interface Loss


public interface Loss
Provides computation and (partial) differentiation of popular activation functions and cross-entropy loss functions.
Author:
Emmanouil Krasanakis
  • Method Details

    • sigmoid

      static double sigmoid(double x)
      The sigmoid function 1/(1+exp(-x)).
      Parameters:
      x - The activation of the sigmoid function.
      Returns:
      The sigmoid value.
    • tanh

      static double tanh(double x)
The tanh activation (exp(x)-exp(-x))/(exp(x)+exp(-x)).
      Parameters:
      x - The activation of the tanh function.
      Returns:
      The tanh value.
    • relu

      static double relu(double x)
The relu activation: x if x > 0, 0 otherwise.
      Parameters:
      x - The activation of the relu function.
      Returns:
      The relu value.
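      A minimal usage sketch for the three scalar activations above. The package of the Loss interface is not shown on this page, so the import is left as a comment and is an assumption about the surrounding library layout.

      // assumes the Loss interface is imported from its actual package
      public class ActivationExample {
          public static void main(String[] args) {
              double x = 0.5;
              System.out.println(Loss.sigmoid(x));  // 1/(1+exp(-0.5)) ~ 0.622
              System.out.println(Loss.tanh(x));     // ~ 0.462
              System.out.println(Loss.relu(-1.0));  // 0, negative inputs are zeroed
              System.out.println(Loss.relu(2.0));   // 2, positive inputs pass through
          }
      }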
    • sigmoidDerivative

      static double sigmoidDerivative(double x)
      The derivative of the sigmoid(double) function.
      Parameters:
      x - The activation of the sigmoid function.
      Returns:
      The sigmoid derivative's value.
    • tanhDerivative

      static double tanhDerivative(double x)
      The derivative of the tanh(double) function.
      Parameters:
      x - The activation of the tanh function.
      Returns:
      The tanh derivative's value.
    • reluDerivative

      static double reluDerivative(double x)
      The derivative of the relu(double) function.
      Parameters:
      x - The activation of the relu function.
      Returns:
      The relu derivative's value.
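      A sketch checking the three derivative methods above against their well-known closed forms: sigmoid'(x) = sigmoid(x)*(1-sigmoid(x)), tanh'(x) = 1-tanh(x)^2, and relu'(x) = 1 for x > 0 and 0 otherwise. The Loss import is again assumed.

      // assumes the Loss interface is imported from its actual package
      public class DerivativeExample {
          public static void main(String[] args) {
              double x = 0.3;
              // d/dx sigmoid(x) = sigmoid(x)*(1 - sigmoid(x))
              double s = Loss.sigmoid(x);
              System.out.println(Loss.sigmoidDerivative(x) + " vs " + s * (1 - s));
              // d/dx tanh(x) = 1 - tanh(x)^2
              double t = Loss.tanh(x);
              System.out.println(Loss.tanhDerivative(x) + " vs " + (1 - t * t));
              // d/dx relu(x) = 1 for x > 0, 0 otherwise
              System.out.println(Loss.reluDerivative(x));   // 1
              System.out.println(Loss.reluDerivative(-x));  // 0
          }
      }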
    • crossEntropy

      static double crossEntropy(double output, double label)
The cross entropy loss for one sample is computed as -label*log(output) - (1-label)*log(1-output). To avoid producing invalid values, an eps of 1.E-12 is used to constrain the cross entropy to the range [-12, 12].
      Parameters:
      output - The output of a prediction task. Should lie in the range [0,1]
      label - The desired label of the prediction task. Should assume binary values 0 or 1
      Returns:
      The cross entropy value.
      Throws:
IllegalArgumentException - If the output is out of the range [0,1] or the label is non-binary.
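      A worked numeric sketch of the formula above, assuming log denotes the natural logarithm: for label = 1 the loss reduces to -log(output), so confident correct predictions cost little and confident wrong ones cost a lot. The Loss import is again assumed.

      // assumes the Loss interface is imported from its actual package
      public class CrossEntropyExample {
          public static void main(String[] args) {
              // label = 1: the loss reduces to -log(output)
              System.out.println(Loss.crossEntropy(0.9, 1)); // ~ 0.105
              System.out.println(Loss.crossEntropy(0.1, 1)); // ~ 2.303
              // label = 0: the loss reduces to -log(1 - output)
              System.out.println(Loss.crossEntropy(0.1, 0)); // ~ 0.105
              // output exactly 1 with label 0 stays finite thanks to the eps clamping
              System.out.println(Loss.crossEntropy(1.0, 0));
          }
      }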
    • crossEntropyDerivative

      static double crossEntropyDerivative(double output, double label)
The derivative of the crossEntropy(double, double) loss. To avoid producing invalid values, an eps of 1.E-12 is used to constrain the cross entropy to the range [-12, 12], which results in this derivative being constrained to the range [-1.E12, 1.E12].
      Parameters:
      output - The output of a prediction task. Should lie in the range [0,1]
      label - The desired label of the prediction task. Should assume binary values 0 or 1
      Returns:
      The cross entropy derivative's value.
      Throws:
IllegalArgumentException - If the output is out of the range [0,1] or the label is non-binary.
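      From the definition of crossEntropy(double, double), the slope with respect to the output is -label/output + (1-label)/(1-output), up to the eps clamping. The sketch below compares the method against a finite-difference estimate of that slope; the Loss import is again assumed.

      // assumes the Loss interface is imported from its actual package
      public class CrossEntropyDerivativeExample {
          public static void main(String[] args) {
              double output = 0.7, label = 1;
              // analytic slope of the loss: -label/output + (1-label)/(1-output)
              System.out.println(Loss.crossEntropyDerivative(output, label)); // ~ -1.429
              // finite-difference check of the same slope
              double h = 1e-6;
              double numeric = (Loss.crossEntropy(output + h, label)
                              - Loss.crossEntropy(output - h, label)) / (2 * h);
              System.out.println(numeric); // ~ -1.429
          }
      }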
    • crossEntropyDerivativeCategorical

      static double crossEntropyDerivativeCategorical(double output, double label)
The derivative of the crossEntropyCategorical(double, double) loss. To avoid producing invalid values, an eps of 1.E-12 is used to constrain the cross entropy to the range [-12, 12], which results in this derivative being constrained to the range [-1.E12, 1.E12].
      Parameters:
      output - The output of a prediction task. Should lie in the range [0,1]
      label - The desired label of the prediction task. Should assume binary values 0 or 1
      Returns:
      The cross entropy derivative's value.
      Throws:
IllegalArgumentException - If the output is out of the range [0,1] or the label is non-binary.
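      The referenced crossEntropyCategorical(double, double) loss is not documented on this page; assuming the usual per-sample categorical form -label*log(output), its derivative with respect to the output would be -label/output, which is consistent with the [-1.E12, 1.E12] clamping described above. A hedged sketch under that assumption, with the Loss import again assumed:

      // assumes the Loss interface is imported from its actual package
      public class CategoricalDerivativeExample {
          public static void main(String[] args) {
              // Assuming a per-sample categorical loss of -label*log(output),
              // the derivative with respect to output would be -label/output.
              System.out.println(Loss.crossEntropyDerivativeCategorical(0.25, 1)); // expected ~ -4
              System.out.println(Loss.crossEntropyDerivativeCategorical(0.25, 0)); // expected 0
          }
      }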
    • crossEntropySigmoidDerivative

      static double crossEntropySigmoidDerivative(double x, double label)
The derivative of crossEntropy(sigmoid(x), label) with respect to x. This function avoids the need for an eps and is hence more precise than the expression crossEntropyDerivative(sigmoid(x), label)*sigmoidDerivative(x).
      Parameters:
      x - The activation of the sigmoid function.
      label - The desired label of the prediction task. Should assume binary values 0 or 1
      Returns:
      The cross entropy partial derivative with respect to the activation passed to an intermediate sigmoid transformation.
      Throws:
      IllegalArgumentException - If labels are non-binary.
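      Mathematically, the derivative of crossEntropy(sigmoid(x), label) with respect to x simplifies to sigmoid(x) - label, which is why this composed form needs no eps. The sketch below compares the method with that closed form and with the chained expression it replaces; the Loss import is again assumed.

      // assumes the Loss interface is imported from its actual package
      public class SigmoidCrossEntropyExample {
          public static void main(String[] args) {
              double x = 2.0, label = 1;
              // composed, eps-free derivative
              System.out.println(Loss.crossEntropySigmoidDerivative(x, label));
              // closed form the composition simplifies to: sigmoid(x) - label
              System.out.println(Loss.sigmoid(x) - label);
              // chained expression from the description (less precise near saturation)
              System.out.println(Loss.crossEntropyDerivative(Loss.sigmoid(x), label)
                                 * Loss.sigmoidDerivative(x));
          }
      }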
    • crossEntropyTanhDerivative

      static double crossEntropyTanhDerivative(double x, double label)
      The derivative of crossEntropy(tanh(x), label) with respect to x. This function calculates crossEntropyDerivative(tanh(x), label)*tanhDerivative(x).
      Parameters:
      x - The activation of the tanh function.
      label - The desired label of the prediction task. Should assume binary values 0 or 1
      Returns:
      The cross entropy partial derivative with respect to the activation passed to an intermediate tanh transformation.
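      A short sketch confirming the stated chain-rule composition; tanh(0.4) lies in [0,1], so the chained call does not throw. The Loss import is again assumed.

      // assumes the Loss interface is imported from its actual package
      public class TanhCrossEntropyExample {
          public static void main(String[] args) {
              double x = 0.4, label = 1;
              System.out.println(Loss.crossEntropyTanhDerivative(x, label));
              // equivalent chained expression from the description
              System.out.println(Loss.crossEntropyDerivative(Loss.tanh(x), label)
                                 * Loss.tanhDerivative(x));
          }
      }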
    • sigmoid

      static Tensor sigmoid(Tensor x)
      Applies sigmoid(double) element-by-element.
      Parameters:
      x - The activation tensor of the sigmoid function.
      Returns:
      The tensor of sigmoid values.
    • tanh

      static Tensor tanh(Tensor x)
      Applies tanh(double) element-by-element.
      Parameters:
      x - The activation tensor of the tanh function.
      Returns:
      The tensor of tanh values.
    • relu

      static Tensor relu(Tensor x)
      Applies relu(double) element-by-element.
      Parameters:
      x - The activation tensor of the relu function.
      Returns:
      The tensor of relu values.
    • sigmoidDerivative

      static Tensor sigmoidDerivative(Tensor x)
Applies sigmoidDerivative(double) element-by-element.
      Parameters:
      x - The activation tensor of the sigmoid function.
      Returns:
      The tensor of sigmoid derivative values.
    • tanhDerivative

      static Tensor tanhDerivative(Tensor x)
Applies tanhDerivative(double) element-by-element.
      Parameters:
      x - The activation tensor of the tanh function.
      Returns:
      The tensor of tanh derivative values.
    • reluDerivative

      static Tensor reluDerivative(Tensor x)
Applies reluDerivative(double) element-by-element.
      Parameters:
      x - The activation tensor of the relu function.
      Returns:
      The tensor of relu derivative values.
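      The Tensor overloads above apply their scalar counterparts element-by-element. A minimal sketch follows; the DenseTensor name, its size constructor, and the put accessor used here are assumptions about the surrounding Tensor API and may need adjusting, and the Loss and Tensor imports are again assumed.

      // assumes Loss, Tensor, and a dense Tensor implementation are imported from their packages
      public class TensorActivationExample {
          public static void main(String[] args) {
              // Hypothetical dense tensor of 3 elements; adjust to the library's actual Tensor API.
              Tensor x = new DenseTensor(3);
              x.put(0, -1.0);
              x.put(1, 0.0);
              x.put(2, 1.0);
              Tensor activated = Loss.relu(x);           // element-wise: [0, 0, 1]
              Tensor gradients = Loss.reluDerivative(x); // 1 where x > 0, 0 elsewhere
              System.out.println(activated);
              System.out.println(gradients);
          }
      }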