Enum Class ActivationFunction

java.lang.Object
    java.lang.Enum<ActivationFunction>
        de.edux.functions.activation.ActivationFunction

All Implemented Interfaces:
Serializable, Comparable<ActivationFunction>, Constable

public enum ActivationFunction extends Enum<ActivationFunction>
Enumerates common activation functions used in neural networks and similar machine learning architectures.

Each member of this enum represents a distinct type of activation function, a critical component in neural networks. Activation functions determine the output of a neural network layer for a given set of inputs, and they help normalize the output of each neuron to a specific range, usually between -1 and 1 or between 0 and 1.

This enum simplifies the process of selecting and utilizing an activation function. It provides an abstraction where the user can easily switch between different functions, making it easier to experiment with neural network design. Additionally, each function includes a method for calculating its derivative, which is essential for backpropagation in neural network training.
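
As a minimal sketch of the abstraction described above, the following example swaps activation functions in a single forward/backward step. Only calculateActivation(double) and calculateDerivative(double) from this enum are used; the surrounding class and helper method names are illustrative and not part of the edux API.

import de.edux.functions.activation.ActivationFunction;

public class ActivationSwitchDemo {

    // Forward pass for a single pre-activation value z (e.g. z = w * x + b).
    static double forward(double z, ActivationFunction activation) {
        return activation.calculateActivation(z);
    }

    // Local gradient of the activation at z, needed for backpropagation.
    static double localGradient(double z, ActivationFunction activation) {
        return activation.calculateDerivative(z);
    }

    public static void main(String[] args) {
        double z = 0.75;
        // Switching the activation function is a one-line change:
        for (ActivationFunction f : new ActivationFunction[] {
                ActivationFunction.SIGMOID, ActivationFunction.TANH, ActivationFunction.RELU}) {
            System.out.printf("%-8s activation=%.4f derivative=%.4f%n",
                    f, forward(z, f), localGradient(z, f));
        }
    }
}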

Available functions include:

  • SIGMOID: Normalizes inputs between 0 and 1, crucial for binary classification.
  • RELU: Addresses the vanishing gradient problem, allowing for faster and more effective training.
  • LEAKY_RELU: A variation of RELU that prevents "dying" neurons by allowing a small gradient when the unit is not active.
  • TANH: Normalizes inputs between -1 and 1, a scaled version of the sigmoid function.
  • SOFTMAX: Converts a vector of raw scores to a probability distribution, typically used in multi-class classification.
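
A quick sanity check of the output ranges listed above; the values noted in the comments follow from the standard mathematical definitions, while the exact LEAKY_RELU slope is an implementation detail of the library and is not asserted here.

import de.edux.functions.activation.ActivationFunction;

public class RangeCheck {
    public static void main(String[] args) {
        // SIGMOID maps 0 to 0.5 and stays within (0, 1).
        System.out.println(ActivationFunction.SIGMOID.calculateActivation(0.0));     // 0.5
        // TANH maps 0 to 0 and stays within (-1, 1).
        System.out.println(ActivationFunction.TANH.calculateActivation(0.0));        // 0.0
        // RELU clips negative inputs to 0.
        System.out.println(ActivationFunction.RELU.calculateActivation(-2.0));       // 0.0
        // LEAKY_RELU lets a small gradient through for negative inputs instead of clipping to 0.
        System.out.println(ActivationFunction.LEAKY_RELU.calculateActivation(-2.0)); // small negative value
    }
}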

Each function overrides the calculateActivation and calculateDerivative methods, providing the specific implementation for the activation and its derivative based on input. These are essential for the forward and backward passes through the network, respectively.

Note: The SOFTMAX function additionally overrides calculateActivation for an array input, facilitating its common use in output layers of neural networks for classification tasks.
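
A sketch of the array overload in an output layer; the raw scores are made up for illustration, and the only assumption is that SOFTMAX returns a probability distribution over them, as described above.

import de.edux.functions.activation.ActivationFunction;

public class SoftmaxDemo {
    public static void main(String[] args) {
        double[] rawScores = {2.0, 1.0, 0.1};   // logits of an output layer with three classes
        double[] probabilities = ActivationFunction.SOFTMAX.calculateActivation(rawScores);

        double sum = 0.0;
        for (double p : probabilities) {
            sum += p;                            // each entry lies in (0, 1)
        }
        System.out.println("sum of probabilities = " + sum);  // approximately 1.0
    }
}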

  • Enum Constant Details

  • Method Details

    • values

      public static ActivationFunction[] values()
      Returns an array containing the constants of this enum class, in the order they are declared.
      Returns:
      an array containing the constants of this enum class, in the order they are declared
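
      For example, values() can be used to iterate over all available activation functions, e.g. to compare them during experimentation (an illustrative sketch):

      import de.edux.functions.activation.ActivationFunction;

      public class ListActivations {
          public static void main(String[] args) {
              // Prints the constant names in declaration order.
              for (ActivationFunction f : ActivationFunction.values()) {
                  System.out.println(f.name());
              }
          }
      }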
    • valueOf

      public static ActivationFunction valueOf(String name)
      Returns the enum constant of this class with the specified name. The string must match exactly an identifier used to declare an enum constant in this class. (Extraneous whitespace characters are not permitted.)
      Parameters:
      name - the name of the enum constant to be returned
      Returns:
      the enum constant with the specified name
      Throws:
      IllegalArgumentException - if this enum class has no constant with the specified name
      NullPointerException - if the argument is null
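
      An illustrative sketch of looking up a constant by name, including the case-sensitive failure mode described above:

      import de.edux.functions.activation.ActivationFunction;

      public class LookupDemo {
          public static void main(String[] args) {
              ActivationFunction relu = ActivationFunction.valueOf("RELU");   // returns RELU
              System.out.println(relu);

              try {
                  ActivationFunction.valueOf("relu");   // names are case-sensitive: no such constant
              } catch (IllegalArgumentException e) {
                  System.out.println("Unknown activation function name: " + e.getMessage());
              }
          }
      }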
    • calculateActivation

      public abstract double calculateActivation(double x)
      Applies this activation function to the given input value and returns the activated output (used in the forward pass).
    • calculateDerivative

      public abstract double calculateDerivative(double x)
      Calculates the derivative of this activation function at the given input value (used during backpropagation).
    • calculateActivation

      public double[] calculateActivation(double[] x)
      Applies this activation function to an array of input values. Overridden by SOFTMAX to convert a vector of raw scores into a probability distribution (see the note above).