Description: The BP neural network algorithm is one of the advanced algorithms for solving optimization problems. This paper discusses the feed-forward neural network, the most widely used type of neural network. The most influential method for learning its network weights is the error back-propagation algorithm (BP algorithm). The BP algorithm suffers from drawbacks such as local minima and slow convergence. The Levenberg-Marquardt algorithm, which is based on optimization theory, neglects the second-order term. This paper discusses an approximate computation of the Hessian matrix when the errors are not zero or the mapping is not linear, i.e. when the second-order term S(W) cannot be neglected, and uses it to train the network. Platform: |
Size: 19456 |
Author: 刘慧
Hits:
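The description contrasts standard Levenberg-Marquardt training, which approximates the Hessian as J^T J + mu*I and drops the second-order term S(W), with the paper's approach of also approximating S(W). The sketch below illustrates only the standard LM update that the description starts from; the network size, toy data, and finite-difference Jacobian are assumptions for illustration, not the author's code.

```python
# Minimal sketch of Levenberg-Marquardt training for a tiny feed-forward
# network. The classic LM step uses H ~= J^T J + mu*I, i.e. it neglects the
# second-order term S(W) that the described paper approximates when the
# residuals are not small.
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (illustrative assumption): y = sin(x) on [-pi, pi].
X = np.linspace(-np.pi, np.pi, 40).reshape(-1, 1)
Y = np.sin(X)

# One-hidden-layer network 1 -> 5 -> 1; weights packed into one flat vector.
n_hidden = 5
sizes = [(n_hidden, 1), (n_hidden, 1), (1, n_hidden), (1, 1)]  # W1, b1, W2, b2

def unpack(w):
    parts, i = [], 0
    for shape in sizes:
        n = shape[0] * shape[1]
        parts.append(w[i:i + n].reshape(shape))
        i += n
    return parts

def residuals(w):
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1.T + b1.T)   # hidden-layer activations
    out = h @ W2.T + b2.T          # linear output layer
    return (out - Y).ravel()       # error vector e(W)

def jacobian(w, eps=1e-6):
    # Finite-difference Jacobian de/dW (for brevity; a real implementation
    # would compute it analytically via back-propagation).
    e0 = residuals(w)
    J = np.empty((e0.size, w.size))
    for k in range(w.size):
        wk = w.copy()
        wk[k] += eps
        J[:, k] = (residuals(wk) - e0) / eps
    return J

w = rng.normal(scale=0.5, size=sum(a * b for a, b in sizes))
mu = 1e-2
for it in range(200):
    e = residuals(w)
    J = jacobian(w)
    # LM approximation of the Hessian: J^T J + mu*I, with S(W) neglected.
    step = np.linalg.solve(J.T @ J + mu * np.eye(w.size), -J.T @ e)
    if np.sum(residuals(w + step) ** 2) < np.sum(e ** 2):
        w, mu = w + step, max(mu / 10, 1e-12)  # accept step, reduce damping
    else:
        mu *= 10                               # reject step, increase damping

print("final sum of squared errors:", np.sum(residuals(w) ** 2))
```

The paper's refinement would add an approximation of S(W) = sum_i e_i * (second derivative of e_i) to the J^T J term inside the linear solve; that approximation is not reproduced here since its exact form is specific to the paper.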