Description: This is a Back-Propagation (BP) neural network program, written in MATLAB. Platform: |
Size: 1024 |
Author:胡玉霞 |
Hits:
Description: This is source code for the back-propagation algorithm for neural networks, written by a research group at CMU; it is a valuable reference. Platform: |
Size: 11264 |
Author:冯贵玉 |
Hits:
Description: The error back-propagation network (BP network) is the most actively used method among neural networks, and the vast majority of implementations adopt a three-layer structure (an input layer, one hidden layer, and an output layer). A BP network is a nonlinear-mapping artificial neural network (the mapping is written out after this entry). This program implements the BP algorithm in VB. Platform: |
Size: 28672 |
Author:fuyu |
Hits:
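For reference, the nonlinear mapping computed by the three-layer structure described above can be written as follows (conventional notation, not taken from the VB source; f_1 and f_2 are the hidden- and output-layer activation functions):

    y = f_2( W_2 f_1( W_1 x + b_1 ) + b_2 )

where x is the input vector, W_1 and b_1 are the input-to-hidden weights and biases, and W_2 and b_2 are the hidden-to-output weights and biases.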
Description: This program is a supervised learning algorithm for a three-layer feed-forward neural network trained by error back-propagation; users can set the corresponding parameters in the file. Platform: |
Size: 475136 |
Author:彭斌 |
Hits:
Description: OCR character recognition source code based on a neural network, written in C#. Platform: |
Size: 54272 |
Author:谭剑 |
Hits:
Description: Complete C++ source code for back-propagation neural network pattern recognition, written by an authoritative original author; well worth studying. Platform: |
Size: 215040 |
Author:John NI |
Hits:
Description: These are some newly released neural network example programs for character recognition, based on the Image Processing Toolbox and the Neural Network Toolbox in MATLAB; they are quite informative for beginners applying neural networks. Platform: |
Size: 12288 |
Author:宏姬 |
Hits:
Description: The BP neural network algorithm is one of the leading algorithms for solving optimization problems. This paper discusses the feed-forward neural network, the most widely used kind of neural network, whose most influential weight-learning method is the error back-propagation (BP) algorithm. The BP algorithm suffers from drawbacks such as local minima and slow convergence. The Levenberg-Marquardt algorithm, which is based on optimization theory, neglects the second-order term. This paper discusses an approximate computation of the Hessian matrix when the error is not zero or not a linear function, i.e., when the second-order term S(W) cannot be neglected, and uses it to train the network (the standard relations are written out after this entry). Platform: |
Size: 19456 |
Author:刘慧 |
Hits:
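For reference, the standard relations behind this description, in conventional Levenberg-Marquardt notation (not necessarily the exact notation of the paper): for a sum-of-squares error E(W) = \frac{1}{2} e(W)^T e(W), the gradient and Hessian are

    \nabla E(W) = J(W)^T e(W)
    \nabla^2 E(W) = J(W)^T J(W) + S(W), \quad S(W) = \sum_i e_i(W) \, \nabla^2 e_i(W)

where J is the Jacobian of the error vector e. The usual Levenberg-Marquardt step drops S(W) and damps the remaining term,

    \Delta W = -\left( J^T J + \mu I \right)^{-1} J^T e

whereas the paper described here keeps an approximation of S(W) for the case where the errors are not near zero.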
Description: Each training example is processed in two passes through the network. A forward pass starts at the input layer, propagates through and processes each layer in turn, produces an output, and yields an error vector equal to the difference between the actual output and the desired output. A backward pass then runs from the output layer back to the input layer, using the error vector to adjust the weights layer by layer (a minimal sketch of these two passes follows this entry). The BP algorithm has a strong mathematical foundation, has dramatically expanded the range of applications of neural networks, has produced many successful applications, and played a major role in the resurgence of neural network research. Platform: |
Size: 1024 |
Author:军军 |
Hits:
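A minimal sketch of the two passes described above for a single training example, written here in Python/NumPy for a three-layer network with sigmoid activations, a squared-error cost, and gradient-descent weight updates (all of these choices are illustrative assumptions, not taken from the uploaded code):

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_one_example(x, target, W1, W2, lr=0.5):
    # Forward pass: from the input layer through the hidden layer to the output layer.
    hidden = sigmoid(W1 @ x)
    output = sigmoid(W2 @ hidden)

    # Error vector: difference between the actual output and the desired output.
    error = target - output

    # Backward pass: propagate the error from the output layer back toward the
    # input layer, adjusting the weights layer by layer.
    delta_out = error * output * (1.0 - output)
    delta_hidden = (W2.T @ delta_out) * hidden * (1.0 - hidden)
    W2 += lr * np.outer(delta_out, hidden)   # hidden -> output weights
    W1 += lr * np.outer(delta_hidden, x)     # input -> hidden weights
    return error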
Description: This program simulates a 3- or 4-layer neural network and can be used to simulate an arbitrary, complex, or non-linear function that would be difficult to implement by traditional methods. The back-propagation method is used to "teach" the network the desired function. Adjacent layers of the net are fully interconnected; that is, every neuron in layer 1 is connected to every neuron in layer 2, and every neuron in layer 2 is connected to every neuron in layer 3 (1->2, 2->3). With a 4-layer net, there is further interconnection: 1->2, 1->3, 2->3, 2->4, 3->4 (a sketch of this connectivity follows the entry).
Platform: |
Size: 49152 |
Author:cso |
Hits:
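A minimal Python/NumPy sketch of the connectivity described above for the 4-layer case: the extra 1->3 and 2->4 links appear as additional weight matrices feeding layers 3 and 4 (the layer sizes and the tanh activation are illustrative assumptions, not taken from the program):

import numpy as np

rng = np.random.default_rng(0)
n1, n2, n3, n4 = 4, 6, 6, 2              # assumed layer sizes

W12 = rng.standard_normal((n2, n1))      # layer 1 -> layer 2
W23 = rng.standard_normal((n3, n2))      # layer 2 -> layer 3
W13 = rng.standard_normal((n3, n1))      # layer 1 -> layer 3 (extra 4-layer link)
W34 = rng.standard_normal((n4, n3))      # layer 3 -> layer 4
W24 = rng.standard_normal((n4, n2))      # layer 2 -> layer 4 (extra 4-layer link)

def forward(x1):
    # Forward pass through the fully interconnected 4-layer net.
    x2 = np.tanh(W12 @ x1)
    x3 = np.tanh(W23 @ x2 + W13 @ x1)
    x4 = np.tanh(W34 @ x3 + W24 @ x2)
    return x4

y = forward(rng.standard_normal(n1))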