# Custom Activation Function in Neural Network?

Hello! Is it possible to set a user-defined activation function for a layer in a neural network? If this feature is not yet part of WL, are there any plans for such an addition? It would be very helpful.
11 days ago
5 Replies
Yihe Dong (2 Votes)
ElementwiseLayer[f] allows you to define a layer with a custom activation function f applied to each element of the input.
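For instance (a minimal sketch based on the documented ElementwiseLayer[f] form; the specific functions chosen here are illustrative, and exact output formatting varies by WL version):

```wolfram
(* A built-in activation supplied by name *)
reluLayer = ElementwiseLayer[Ramp]

(* A custom unary scalar function supplied as a pure function *)
customLayer = ElementwiseLayer[Function[x, Tanh[x] + x]]

(* Layers can be applied directly to numeric input *)
customLayer[{-1., 0., 1.}]
```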
11 days ago
Sebastian Bodenstein (1 Vote)
@narendra: out of curiosity, is there a particular activation you are trying to implement yourself?
@Sébastien Guillet While I cannot disclose the exact details, as I have not yet finished writing the paper, ElementwiseLayer does not do well with many functions beyond the commonly used ones.

Example: the recently popular Swish activation function, which performs better than ReLU in many cases. It is defined as x*sigmoid(x), but

```wolfram
elem = ElementwiseLayer[x*LogisticSigmoid[x]]
```

gives an error:

```
ElementwiseLayer::invscf: x LogisticSigmoid[x] could not be symbolically evaluated as a unary scalar function.
```

This is the same error I receive whenever the function is any more complex than an exponential or tanh.
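One possible explanation worth noting (my own reading, not stated in the thread): the error message says the argument could not be evaluated as a unary scalar function, and in the failing call the bare symbol x is not a function of one argument. Supplying the same expression as a pure function may therefore avoid the error:

```wolfram
(* Sketch of a workaround: Swish written as a pure function of one argument *)
swish = ElementwiseLayer[#*LogisticSigmoid[#] &]

(* Equivalent form using an explicit named argument *)
swishAlt = ElementwiseLayer[Function[x, x*LogisticSigmoid[x]]]
```

The design point is that ElementwiseLayer[f] expects f itself to be a callable unary function (like Ramp or a pure function), not an expression containing a free variable.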