Location: Search - backpropagation
Search list
Description: Neural network backpropagation.
Platform: |
Size: 641585 |
Author: TAE |
Hits:
Description: Neural Network backpropagation
Platform: |
Size: 68542 |
Author: joy |
Hits:
Description: Backpropagation Network Time-Series Forecasting source code; a classic example of a BPN artificial neural network.
Platform: |
Size: 33792 |
Author: |
Hits:
Description: BP neural network with an added momentum term (a minimal sketch of the momentum update rule follows this entry).
Platform: |
Size: 30720 |
Author: 高山 |
Hits:
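The momentum term mentioned in the entry above is a standard extension of gradient-descent weight updates: part of the previous weight change is carried into the current one. A minimal Python sketch of the idea (the function name, learning rate eta and momentum coefficient alpha are illustrative, not taken from this package):

import numpy as np

def momentum_update(w, grad, velocity, eta=0.1, alpha=0.9):
    # Standard gradient-descent step with a momentum term:
    # v <- alpha * v - eta * grad;  w <- w + v
    velocity = alpha * velocity - eta * grad
    return w + velocity, velocity

# Illustrative call: one update of a small weight matrix.
w = np.zeros((2, 3))
v = np.zeros_like(w)
grad = np.ones_like(w)        # stand-in for dE/dw from backpropagation
w, v = momentum_update(w, grad, v)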
Description: Adaptive neural network source code.
The adaptive Neural Network Library is a collection of blocks that implement several Adaptive Neural Networks featuring different adaptation algorithms.
There are 11 blocks that implement these 5 kinds of neural networks:
1) Adaptive Linear Network (ADALINE)
2) Multilayer Perceptron with Extended Backpropagation algorithm (EBPA)
3) Radial Basis Functions (RBF) Networks
4) RBF Networks with Extended Minimal Resource Allocating algorithm (EMRAN)
5) RBF and Piecewise Linear Networks with Dynamic Cell Structure (DCS) algorithm
A Simulink example regarding the approximation of a scalar nonlinear function of 4 variables is included (an ADALINE adaptation sketch follows this entry).
Platform: |
Size: 200704 |
Author: 周志连 |
Hits:
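For context on the first block type listed above, an ADALINE adapts its weights with the LMS (Widrow-Hoff) rule after each sample. The Python sketch below is a generic illustration of that rule, not code from the library; the step size mu and the bias convention are assumptions:

import numpy as np

def adaline_lms_step(w, x, target, mu=0.01):
    # One LMS (Widrow-Hoff) adaptation step for an ADALINE:
    # the linear output's error drives the weight change.
    y = np.dot(w, x)
    error = target - y
    return w + mu * error * x

# Illustrative call: adapt a 3-weight ADALINE on a single sample.
w = np.zeros(3)
x = np.array([1.0, 0.5, -0.2])   # the first entry could serve as a bias input
w = adaline_lms_step(w, x, target=0.3)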
Description: The adaptive Neural Network Library is a collection of blocks that implement several Adaptive Neural Networks featuring different adaptation algorithms.
There are 11 blocks that implement these 5 kinds of neural networks:
1) Adaptive Linear Network (ADALINE)
2) Multilayer Perceptron with Extended Backpropagation algorithm (EBPA)
3) Radial Basis Functions (RBF) Networks
4) RBF Networks with Extended Minimal Resource Allocating algorithm (EMRAN)
5) RBF and Piecewise Linear Networks with Dynamic Cell Structure (DCS) algorithm
A Simulink example regarding the approximation of a scalar nonlinear function of 4 variables is included.
Platform: |
Size: 198656 |
Author: 叶建槐 |
Hits:
Description: Backpropagation / ART1 / Kohonen / Radial Basis.
Platform: |
Size: 146432 |
Author: jack |
Hits:
Description: Single-layer neural networks can be trained using various learning algorithms. The best-known algorithms are the Adaline, Perceptron and Backpropagation algorithms for supervised learning. The first two are specific to single-layer neural networks, while the third can be generalized to multi-layer perceptrons (a perceptron learning-rule sketch follows this entry).
Platform: |
Size: 818176 |
Author: 陈伟 |
Hits:
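To illustrate the contrast drawn in the entry above: the perceptron rule changes the weights only when the thresholded output is wrong, whereas Adaline uses the raw linear error. A generic Python perceptron sketch, not taken from this package (the function name, learning rate and the {-1, +1} label convention are illustrative):

import numpy as np

def perceptron_train(X, y, epochs=20, eta=1.0):
    # Single-layer perceptron training with a bias weight.
    # X: (n_samples, n_features); y: labels in {-1, +1}.
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])   # append a bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        for xi, target in zip(Xb, y):
            pred = 1 if np.dot(w, xi) >= 0 else -1
            if pred != target:                       # update only on mistakes
                w += eta * target * xi
    return w

# Illustrative call: a linearly separable AND-like problem.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, -1, -1, 1])
w = perceptron_train(X, y)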
Description: Implements the BP neural network algorithm as a class library; new classes can be derived from it to build a wide range of applications.
Platform: |
Size: 5120 |
Author: 唐述 |
Hits:
Description: A simple backpropagation (BP) neural network development example for character recognition, including recognition errors.
Platform: |
Size: 79872 |
Author: ky lee |
Hits:
Description: Neural network backpropagation.
Platform: |
Size: 642048 |
Author: |
Hits:
Description: Neural Network backpropagation
Platform: |
Size: 2076672 |
Author: joy |
Hits:
Description: Backpropagation Artificial Neural Network in C
Platform: |
Size: 98304 |
Author: 杨诚 |
Hits:
Description: 《Neural Networks for Text-to-Speech Phoneme Recognition》. This paper presents two different artificial neural network approaches for phoneme recognition for text-to-speech applications: Staged Backpropagation Neural Networks and Self-Organizing Maps.
Platform: |
Size: 722944 |
Author: 付诗 |
Hits:
Description: * Lightweight backpropagation neural network.
* This is a lightweight library implementing a neural network for use
* in C and C++ programs. It is intended for use in applications that
* just happen to need a simple neural network and do not want to use
* needlessly complex neural network libraries. It features multilayer
* feedforward perceptron neural networks, a sigmoidal activation function
* with bias, backpropagation training with a settable learning rate and
* momentum, and backpropagation training in batches (a generic sketch of
* these ideas follows this entry).
Platform: |
Size: 29696 |
Author: 正熹 |
Hits:
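The features listed in the entry above (sigmoidal activation with bias, backpropagation with a settable learning rate and momentum) are standard; the Python sketch below shows the same ideas for a single hidden layer and is not the library's actual C API. All names and values are illustrative:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def backprop_step(x, t, W1, W2, vel, eta=0.5, momentum=0.9):
    # One online backpropagation step for a one-hidden-layer network
    # with sigmoid units; bias weights are folded into W1 and W2 as an
    # extra column, fed by a constant 1 input.
    xb = np.append(x, 1.0)                     # input plus bias
    h = sigmoid(W1 @ xb)
    hb = np.append(h, 1.0)                     # hidden plus bias
    o = sigmoid(W2 @ hb)

    delta_o = (o - t) * o * (1 - o)                    # output deltas (squared error)
    delta_h = (W2[:, :-1].T @ delta_o) * h * (1 - h)   # hidden deltas

    vel["W2"] = momentum * vel["W2"] - eta * np.outer(delta_o, hb)
    vel["W1"] = momentum * vel["W1"] - eta * np.outer(delta_h, xb)
    return W1 + vel["W1"], W2 + vel["W2"], vel

# Illustrative call: 2 inputs, 3 hidden units, 1 output, one pattern.
rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.5, size=(3, 3))        # 3 hidden x (2 inputs + bias)
W2 = rng.normal(scale=0.5, size=(1, 4))        # 1 output x (3 hidden + bias)
vel = {"W1": np.zeros_like(W1), "W2": np.zeros_like(W2)}
W1, W2, vel = backprop_step(np.array([0.0, 1.0]), np.array([1.0]), W1, W2, vel)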
Description: Batch version of the back-propagation algorithm.
% Given a set of corresponding input-output pairs and an initial network,
% [W1,W2,critvec,iter]=batbp(NetDef,W1,W2,PHI,Y,trparms) trains the
% network with backpropagation.
%
% The activation functions must be either linear or tanh. The network
% architecture is defined by the matrix NetDef consisting of two
% rows. The first row specifies the hidden layer while the second
% specifies the output layer.
% (A generic sketch of the batch update idea follows this entry.)
Platform: |
Size: 2048 |
Author: 张镇 |
Hits:
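The entry above gives the calling sequence [W1,W2,critvec,iter]=batbp(NetDef,W1,W2,PHI,Y,trparms); the Python sketch below only illustrates the batch idea it describes (accumulate the gradient over all input-output pairs before updating, tanh hidden units, linear output) and is not the MATLAB routine itself. All names, shapes and the learning rate are assumptions:

import numpy as np

def batch_backprop_epoch(PHI, Y, W1, W2, eta=0.01):
    # One batch update for a network with tanh hidden units and a linear
    # output layer: gradients are summed over all patterns before any
    # weight changes. PHI: (n_inputs, n_patterns), Y: (n_outputs, n_patterns);
    # bias weights are folded into W1/W2 as an extra column.
    n = PHI.shape[1]
    PHIb = np.vstack([PHI, np.ones((1, n))])    # inputs plus a bias row
    H = np.tanh(W1 @ PHIb)                      # tanh hidden layer
    Hb = np.vstack([H, np.ones((1, n))])        # hidden plus a bias row
    O = W2 @ Hb                                 # linear output layer

    E = O - Y                                   # output errors
    delta_h = (W2[:, :-1].T @ E) * (1 - H**2)   # backpropagated hidden deltas

    W2 = W2 - eta * (E @ Hb.T) / n
    W1 = W1 - eta * (delta_h @ PHIb.T) / n
    return W1, W2, 0.5 * np.sum(E**2) / n       # new weights and criterion value

# Illustrative shapes: 2 inputs, 4 hidden units, 1 output, 10 patterns.
rng = np.random.default_rng(1)
PHI, Y = rng.normal(size=(2, 10)), rng.normal(size=(1, 10))
W1, W2 = rng.normal(size=(4, 3)), rng.normal(size=(1, 5))
W1, W2, crit = batch_backprop_epoch(PHI, Y, W1, W2)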
Description: Backpropagation Neural network implementation in C# .NET
Platform: |
Size: 806912 |
Author: wanayngawai |
Hits:
Description: Backpropagation MLP neural network.
Platform: |
Size: 1024 |
Author: niksu |
Hits:
Description: A program that trains a backpropagation neural network to simulate the behavior of an XOR gate (a minimal end-to-end sketch follows this entry).
Platform: |
Size: 497664 |
Author: Juan Carlos |
Hits:
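As a point of reference for the task above: XOR is not linearly separable, so the network needs at least one hidden layer. A self-contained Python sketch of such a trainer; the architecture, learning rate and iteration count are illustrative choices, not drawn from this package:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# XOR training set: the target is 1 exactly when the two inputs differ.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0.0], [1.0], [1.0], [0.0]])
Xb = np.hstack([X, np.ones((4, 1))])          # inputs plus a bias column

rng = np.random.default_rng(42)
W1 = rng.normal(size=(3, 3))                  # 3 hidden units x (2 inputs + bias)
W2 = rng.normal(size=(1, 4))                  # 1 output x (3 hidden + bias)
eta = 0.5

for _ in range(20000):                        # plain batch gradient descent
    H = sigmoid(Xb @ W1.T)                    # hidden activations, shape (4, 3)
    Hb = np.hstack([H, np.ones((4, 1))])
    O = sigmoid(Hb @ W2.T)                    # network outputs, shape (4, 1)
    dO = (O - T) * O * (1 - O)                # output-layer deltas
    dH = (dO @ W2[:, :-1]) * H * (1 - H)      # hidden-layer deltas
    W2 -= eta * dO.T @ Hb
    W1 -= eta * dH.T @ Xb

print(np.round(O.ravel(), 2))                 # typically close to [0, 1, 1, 0]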
Description: An MLP implementation with a backpropagation training algorithm, designed for function-approximation problems.
Platform: |
Size: 1024 |
Author: Paulo |
Hits: