Activation Function

Abstract

Inspired by the way the natural world works, a new activation function is proposed: the absolute function, y = |x|. Tests on the MNIST dataset with a fully-connected neural network and a convolutional neural network lead to several conclusions. The accuracy curve of the absolute function fluctuates slightly, unlike the accuracy curves of ReLU and leaky ReLU. The absolute function keeps the negative part on an equal footing with the positive part, so individualization is more active than with the ReLU and leaky ReLU functions. In terms of generalization, this individualization is the cause of the fluctuation: accuracy may be better on some sets and worse on others. The absolute function is less likely to over-fit. When the batch size is small, the individualization is clear, and vice versa. Tests on MNIST with an autoencoder show that the leaky ReLU function performs well on classification tasks, while the absolute function performs well on generation tasks, because classification needs more universality and generation needs more individualization. Pleasurable and painful stimuli differ not only in magnitude but also in sign, so the negative part should be kept as a genuine part of the signal. Stimuli that occur frequently have low values and appear near zero in Figure 1; stimuli that occur only occasionally have high values and appear far from zero in Figure 1. The high values are therefore the strong stimuli, which is individualization.

1.1. Preface

There are many activation functions, such as sigmoid, tanh, ReLU, leaky ReLU, and ELU. The most frequently used is ReLU [1]. The ReLU function suffers from the dead-ReLU problem: some neurons are never activated, so their related parameters are never updated. The absolute function has no dead-ReLU problem, nor gradient vanishing or gradient explosion; it also has some interesting characteristics. The image of the absolute function is shown below. The formula is y = |x|.
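The contrast between the two activations can be sketched in a few lines of NumPy. This is an illustrative sketch, not code from the paper: it defines the absolute function y = |x| alongside ReLU and compares their gradients, showing that the absolute function's gradient is nonzero on both sides of the origin, whereas ReLU's gradient is zero for all negative inputs, which is the source of the dead-ReLU problem.

```python
import numpy as np

# Absolute activation from the paper: y = |x|.
def absolute(x):
    return np.abs(x)

def absolute_grad(x):
    # Gradient is -1 for x < 0 and +1 for x > 0:
    # never zero, so no unit is permanently inactive.
    return np.sign(x)

# ReLU for comparison.
def relu(x):
    return np.maximum(x, 0.0)

def relu_grad(x):
    # Gradient is 0 for every x < 0 -- a neuron stuck in this
    # region stops learning (the dead-ReLU problem).
    return (x > 0).astype(float)

x = np.array([-2.0, -0.5, 0.5, 2.0])
print(absolute(x))       # negative inputs are reflected, not discarded
print(absolute_grad(x))  # nonzero gradient everywhere except x = 0
print(relu_grad(x))      # zero gradient for the two negative inputs
```

Note that both activations are non-differentiable at x = 0; in practice frameworks pick a subgradient there, so this does not cause problems during training.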



Fig. 1: The Image of the Absolute Function

