Abstract:
Artificial neural networks are among the most popular machine learning models, and the activation function is one of their key components. In this paper, the sigmoid function is used as the activation function together with a fractional derivative approach, with the aim of reducing the convergence error of backpropagation and improving the generalization performance of neural networks. The proportional Caputo definition is adopted as the fractional derivative. We evaluate three neural network models that use the proportional Caputo derivative. The results show that the proportional Caputo derivative approach achieves higher classification accuracy in backpropagation than models using the traditional derivative, for neural networks both with and without L2 regularization.
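To make the idea concrete, the sketch below replaces the classical sigmoid derivative used in backpropagation with a numerically approximated proportional (constant proportional) Caputo fractional derivative. This is a minimal illustration under stated assumptions, not the paper's implementation: the weight functions k0(α) and k1(α), the left-Riemann-sum integration scheme, and all helper names are placeholders introduced here for demonstration.

```python
import numpy as np
from math import gamma

def sigmoid(x):
    """Standard logistic sigmoid."""
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_prime(x):
    """Classical (integer-order) derivative of the sigmoid."""
    s = sigmoid(x)
    return s * (1.0 - s)

def cpc_sigmoid_derivative(t, alpha, k0=lambda a: a, k1=lambda a: 1.0 - a, n_steps=400):
    """Crude numerical approximation of a constant proportional Caputo (CPC)
    fractional derivative of the sigmoid at t > 0, with order 0 < alpha < 1:

        D^alpha f(t) = 1/Gamma(1 - alpha) *
                       integral_0^t (t - s)^(-alpha) [k1(alpha) f(s) + k0(alpha) f'(s)] ds

    The weight functions k0, k1 are placeholder choices (assumptions, not the
    paper's), and the integral is approximated with a left Riemann sum, which
    keeps every node away from the weakly singular kernel point s = t.
    """
    h = t / n_steps
    s = np.arange(n_steps) * h                   # nodes s = 0, h, ..., t - h
    kernel = (t - s) ** (-alpha)                 # finite at every node since s < t
    integrand = kernel * (k1(alpha) * sigmoid(s) + k0(alpha) * sigmoid_prime(s))
    return integrand.sum() * h / gamma(1.0 - alpha)

# Illustration: compare the classical slope with its fractional-order replacement
z = 1.5                                          # a positive pre-activation value
print("classical sigmoid'(z):   ", sigmoid_prime(z))
print("CPC fractional (alpha=0.9):", cpc_sigmoid_derivative(z, alpha=0.9))
```

In a training loop of the kind the abstract describes, a fractional quantity of this form would stand in for the classical σ'(z) = σ(z)(1 − σ(z)) when computing the layer deltas of a sigmoid network; the exact operator parameters and update rule are those of the paper, not this sketch.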