- Smoothstep function
- Specific algebraic functions
- .
The integral of any continuous, non-negative, "bump-shaped" function is sigmoidal; thus the cumulative distribution functions of many common probability distributions are sigmoidal. One familiar example is the error function, which is related to the cumulative distribution function (CDF) of a normal distribution.
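Concretely, the normal CDF can be written in terms of the error function as Φ(x) = ½(1 + erf((x − μ)/(σ√2))). A minimal Python sketch of this relationship, using only the standard library:

```python
import math

def normal_cdf(x, mu=0.0, sigma=1.0):
    """CDF of a normal distribution via the error function:
    Phi(x) = 0.5 * (1 + erf((x - mu) / (sigma * sqrt(2))))."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# Values rise monotonically from 0 toward 1 -- the characteristic "S" shape.
for x in (-3, -1, 0, 1, 3):
    print(x, round(normal_cdf(x), 4))
```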
Many natural processes, such as the learning curves of complex systems, exhibit a progression from small beginnings that accelerates and approaches a climax over time. When a specific mathematical model is lacking, a sigmoid function is often used.[2]
A wide variety of sigmoid functions have been used as the activation function of artificial neurons, including the logistic and hyperbolic tangent functions.
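As a brief sketch, these two common activations can be written directly in plain Python (the function names here are illustrative, not from any particular library):

```python
import math

def logistic(x):
    """Logistic sigmoid: 1 / (1 + e^(-x)); output lies in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def tanh_activation(x):
    """Hyperbolic tangent; output lies in (-1, 1).
    It is a rescaled logistic: tanh(x) = 2 * logistic(2x) - 1."""
    return math.tanh(x)

print(logistic(0.0))         # 0.5 -- the inflection point of the S-curve
print(tanh_activation(0.0))  # 0.0 -- tanh is centered at the origin
```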
Activation Functions
The activation ops provide different types of nonlinearities for use in neural networks. These include smooth nonlinearities (sigmoid, tanh, elu, softplus, and softsign), continuous but not everywhere differentiable functions (relu, relu6, crelu, and relu_x), and random regularization (dropout).
All activation ops apply componentwise, and produce a tensor of the same shape as the input tensor.
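A short sketch of this elementwise behavior, assuming a TensorFlow 2.x environment where these ops live under tf.nn:

```python
import tensorflow as tf

x = tf.constant([[-2.0, -0.5], [0.5, 2.0]])

# Each op is applied componentwise; the output shape matches the input shape.
print(tf.nn.relu(x))      # negative elements clamped to 0
print(tf.nn.sigmoid(x))   # each element squashed into (0, 1)
print(tf.nn.softplus(x))  # smooth approximation of relu: log(1 + e^x)
```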