
A Chebyshev Forward Neural Network Based on the Gradient Descent Method

  • Abstract: In a traditional artificial neural network, all neurons in the same hidden layer share the same activation function, which is inconsistent with the behavior of biological neurons. To address this, a forward neural network model in which each hidden-layer neuron has a distinct activation function is constructed: a family of Chebyshev orthogonal polynomials serves as the activation functions of the hidden-layer neurons (referred to as the Chebyshev forward neural network), and a training algorithm for the network parameters based on the gradient descent method is derived for this network. Simulation experiments show that the gradient-descent-based Chebyshev forward neural network algorithm can effectively adjust the network parameters so that the network approximates sample data sets with complex patterns to high accuracy.
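As a rough illustration of the idea in the abstract, the sketch below builds a single-input network whose hidden neuron j applies the Chebyshev polynomial T_j as its activation, and trains the output-layer weights by gradient descent on the mean squared error. This is a minimal assumed reconstruction, not the paper's actual algorithm; the hyperparameters (`n_hidden`, `lr`, `epochs`) and the sine target are illustrative choices.

```python
import numpy as np

def chebyshev_basis(x, n):
    """Evaluate T_0..T_{n-1} at points x in [-1, 1] using the recurrence
    T_k(x) = 2x*T_{k-1}(x) - T_{k-2}(x); returns an (n, m) array."""
    T = np.zeros((n, x.size))
    T[0] = 1.0
    if n > 1:
        T[1] = x
    for k in range(2, n):
        T[k] = 2.0 * x * T[k - 1] - T[k - 2]
    return T

def train(x, y, n_hidden=8, lr=0.05, epochs=5000):
    """Gradient descent on the output weights w of y_hat = sum_j w_j * T_j(x)."""
    T = chebyshev_basis(x, n_hidden)   # hidden-layer outputs, shape (n_hidden, m)
    w = np.zeros(n_hidden)             # output-layer weights
    m = x.size
    for _ in range(epochs):
        err = w @ T - y                # residual, shape (m,)
        w -= lr * (T @ err) / m        # gradient of 0.5 * mean(err**2)
    return w

# Approximate a non-polynomial target pattern on [-1, 1].
x = np.linspace(-1.0, 1.0, 200)
y = np.sin(np.pi * x)
w = train(x, y)
mse = np.mean((w @ chebyshev_basis(x, w.size) - y) ** 2)
```

Because the model is linear in the output weights once the Chebyshev hidden layer is fixed, gradient descent on the squared error converges to the least-squares fit in the polynomial basis.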


