Search - backpropagation neural network - List
The Adaptive Neural Network Library is a collection of blocks that implement several adaptive neural networks featuring different adaptation algorithms. There are 11 blocks that implement five kinds of neural networks: 1) Adaptive Linear Network (ADALINE); 2) Multilayer Perceptron with the Extended Backpropagation Algorithm (EBPA); 3) Radial Basis Function (RBF) networks; 4) RBF networks with the Extended Minimal Resource Allocating Network (EMRAN) algorithm; 5) RBF and piecewise-linear networks with the Dynamic Cell Structure (DCS) algorithm. A Simulink example on approximating a scalar nonlinear function of four variables is included.
Update : 2025-04-04 Size : 194kb Publisher : 叶建槐
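The ADALINE block listed above adapts a linear unit with the LMS (least-mean-squares) delta rule. A minimal pure-Python sketch of that rule follows; the target function, learning rate, and epoch count are illustrative assumptions, not taken from the library:

```python
# Minimal ADALINE (Adaptive Linear Network) sketch with the LMS delta rule.
# Hypothetical task: learn the linear map y = 2*x1 - 3*x2 from samples.

def train_adaline(samples, lr=0.05, epochs=200):
    """samples: list of ((x1, x2), target). Returns learned weights and bias."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, t in samples:
            y = w[0] * x[0] + w[1] * x[1] + b   # linear output
            err = t - y                          # LMS error
            w[0] += lr * err * x[0]              # delta-rule weight update
            w[1] += lr * err * x[1]
            b += lr * err                        # bias update
    return w, b

# Noise-free samples of y = 2*x1 - 3*x2 on a small grid of inputs
data = [((x1, x2), 2 * x1 - 3 * x2) for x1 in (-1, 0, 1, 2) for x2 in (-1, 0, 1, 2)]
w, b = train_adaline(data)
```

Because the data are noise-free and the learning rate is well below the stability limit, the weights converge essentially to the true values.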

Implements the BP (backpropagation) neural network algorithm as a class library. New classes can be derived from the base class to build a wide range of applications.
Update : 2025-04-04 Size : 5kb Publisher : 唐述

A simple BP (backpropagation) neural network development example for character recognition, including its recognition errors.
Update : 2025-04-04 Size : 78kb Publisher : ky lee

Neural Network backpropagation
Update : 2025-04-04 Size : 1.98mb Publisher : joy

Backpropagation Neural network implementation in C# .NET
Update : 2025-04-04 Size : 788kb Publisher : wanayngawai

Backpropagation neural network in C.
Update : 2025-04-04 Size : 11kb Publisher : nour

Precise and fast control chart pattern (CCP) recognition is important for monitoring process environments to achieve appropriate control and to produce high-quality products. CCPs can exhibit six types of pattern: normal, cyclic, increasing trend, decreasing trend, upward shift and downward shift. Except for normal patterns, all other patterns indicate that the process being monitored is not functioning correctly and requires adjustment. This paper describes a new type of neural network designed to speed up the training process, and compares three training algorithms in terms of speed, performance and parameter complexity for CCP recognition. The networks are multilayer perceptrons trained with the resilient propagation, backpropagation (BP) and extended delta-bar-delta algorithms. The recognition results for CCPs show that the BP algorithm is accurate and provides better, faster results.
Update : 2025-04-04 Size : 366kb Publisher : singh

MATLAB code for training a backpropagation neural network.
Update : 2025-04-04 Size : 251kb Publisher : ssss

Neural Network Based Clustering using Self-Organizing Maps (SOM) in Excel. Here is a small tool in Excel that finds clusters in your data set. The tool uses Self-Organizing Maps (SOM), originally proposed by T. Kohonen, as the clustering method. The download (209 KB zipped, 947 KB unzipped) contains the Excel file with the application. Before running it, I suggest that you go through the ReadMe worksheet; it contains brief instructions on how to run the tool. If you are interested in building prediction and classification models in Excel using a feedforward-backpropagation neural network, two small Excel-based tools are also available; likewise, a tree-based classifier in Excel is available for tree-based classification models.
Update : 2025-04-04 Size : 209kb Publisher : Jessie
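The SOM training loop that the tool above is built on can be sketched in a few lines. This is a minimal 1-D Kohonen map in plain Python, not the Excel tool's code; the unit count, decay schedules, and sample points are illustrative assumptions:

```python
import random

def train_som(data, n_units=2, epochs=60, lr=0.5, seed=0):
    """Minimal 1-D Kohonen SOM: units sit on a line, neighborhood shrinks over time."""
    rng = random.Random(seed)
    # initialize each unit's weight vector to a randomly chosen data point
    units = [list(rng.choice(data)) for _ in range(n_units)]
    for epoch in range(epochs):
        rate = lr * (1 - epoch / epochs)            # learning rate decays to 0
        radius = 1 if epoch < epochs // 3 else 0    # neighborhood shrinks to winner-only
        for x in data:
            # best-matching unit: smallest squared distance to the input
            bmu = min(range(n_units),
                      key=lambda i: sum((units[i][d] - x[d]) ** 2 for d in range(len(x))))
            # pull the BMU and its 1-D grid neighbors toward the input
            for i in range(n_units):
                if abs(i - bmu) <= radius:
                    for d in range(len(x)):
                        units[i][d] += rate * (x[d] - units[i][d])
    return units

# two well-separated point clouds; the two units should settle near their centers
points = [(0.0, 0.0), (0.1, 0.1), (0.05, 0.0), (1.0, 1.0), (0.9, 1.1), (1.05, 0.95)]
units = train_som(points)
```

With the neighborhood radius above zero at the start, both units are first ordered together; once it shrinks to winner-only, each unit specializes to one cluster.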

Neural network source code with two learning processes using backpropagation algorithms.
Update : 2025-04-04 Size : 379kb Publisher : Kow

Backpropagation MLP (multilayer perceptron) neural network.
Update : 2025-04-04 Size : 1kb Publisher : niksu

A program that trains a backpropagation neural network to simulate the behavior of an XOR gate.
Update : 2025-04-04 Size : 486kb Publisher : Juan Carlos
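A minimal version of such a program can be sketched in plain Python. The hyperparameters (3 hidden sigmoid units, learning rate 0.5) are assumptions, not taken from the download; because backpropagation on XOR can stall in a poor local minimum, the sketch retries a few random initializations:

```python
import math, random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def train_xor(seed, hidden=3, lr=0.5, epochs=8000):
    """Train a 2-hidden-1 sigmoid network on XOR with plain backpropagation."""
    rng = random.Random(seed)
    w1 = [[rng.uniform(-1, 1) for _ in range(3)] for _ in range(hidden)]  # rows: [w_x1, w_x2, bias]
    w2 = [rng.uniform(-1, 1) for _ in range(hidden + 1)]                  # last entry is the bias

    def forward(x1, x2):
        h = [sigmoid(w[0] * x1 + w[1] * x2 + w[2]) for w in w1]
        y = sigmoid(sum(w2[i] * h[i] for i in range(hidden)) + w2[-1])
        return h, y

    for _ in range(epochs):
        for (x1, x2), t in XOR:
            h, y = forward(x1, x2)
            dy = (y - t) * y * (1 - y)                 # output delta (squared error, sigmoid)
            for i in range(hidden):
                dh = dy * w2[i] * h[i] * (1 - h[i])    # hidden delta, computed before w2 changes
                w2[i] -= lr * dy * h[i]
                w1[i][0] -= lr * dh * x1
                w1[i][1] -= lr * dh * x2
                w1[i][2] -= lr * dh
            w2[-1] -= lr * dy

    predict = lambda x1, x2: forward(x1, x2)[1]
    mse = sum((predict(a, b) - t) ** 2 for (a, b), t in XOR) / len(XOR)
    return predict, mse

# backprop on XOR can get stuck, so retry a few random seeds until the error is small
for seed in range(10):
    predict, mse = train_xor(seed)
    if mse < 0.01:
        break
```

After training, rounding the network output reproduces the XOR truth table.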

C code implementing a backpropagation neural network.
Update : 2025-04-04 Size : 3kb Publisher : 余洋

A backpropagation neural network, with training and testing, designed to recognize two types of Islamic calligraphy patterns; built as a school project.
Update : 2025-04-04 Size : 11kb Publisher : kim

High information redundancy and correlation in face images result in inefficiencies when such images are used directly for recognition. In this paper, discrete cosine transforms are used to reduce image information redundancy, because only a subset of the transform coefficients is necessary to preserve the most important facial features, such as hair outline, eyes and mouth. We demonstrate experimentally that when DCT coefficients are fed into a backpropagation neural network for classification, a high recognition rate can be achieved using a very small proportion of the transform coefficients. This makes DCT-based face recognition much faster than other approaches.
Update : 2025-04-04 Size : 25kb Publisher : mhm
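The key step in the approach above, keeping only a few low-frequency DCT coefficients, can be illustrated with a small sketch (plain Python, orthonormal 1-D DCT-II; the 16-sample signal and the cut-off of 4 coefficients are illustrative assumptions, not the paper's settings):

```python
import math

def dct(x):
    """Orthonormal DCT-II of a 1-D sequence."""
    N = len(x)
    return [(math.sqrt(1.0 / N) if k == 0 else math.sqrt(2.0 / N))
            * sum(x[n] * math.cos(math.pi * (n + 0.5) * k / N) for n in range(N))
            for k in range(N)]

def idct(c):
    """Inverse of the orthonormal DCT-II above."""
    N = len(c)
    return [sum((math.sqrt(1.0 / N) if k == 0 else math.sqrt(2.0 / N))
                * c[k] * math.cos(math.pi * (n + 0.5) * k / N) for k in range(N))
            for n in range(N)]

# A smooth 16-sample "image row": a constant level plus one low-frequency DCT basis wave,
# so all of its energy sits in the lowest-frequency coefficients.
row = [1.0 + 0.5 * math.cos(math.pi * (n + 0.5) * 2 / 16) for n in range(16)]

coeffs = dct(row)
truncated = coeffs[:4] + [0.0] * 12   # keep only the 4 lowest-frequency coefficients
approx = idct(truncated)
err = max(abs(a - b) for a, b in zip(row, approx))
```

For a smooth signal like this row, the reconstruction from 4 of 16 coefficients is essentially exact; in the paper's setting, such truncated coefficient vectors are what the backpropagation classifier receives as input.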

Backpropagation in a neural network.
Update : 2025-04-04 Size : 370kb Publisher : bashar

Pattern recognition coursework: a neural network that classifies the IRIS data set. The network has two layers, and the program is commented in detail. Program results are written to an Excel file, also in detail.
Update : 2025-04-04 Size : 2kb Publisher : yumingwei

This "classic" style of neural network is used for spontaneously generating an algorithm for the analysis of numeric patterns. The goal is to "teach" the network how to analyze a desired set of numeric inputs. For instance, the network might be taught to calculate the AND, OR, and XOR of two input numbers: given the inputs 0 and 1, the network should output 0, 1, and 1. The trick is that the network is not taught how to analyze the numbers: the network is given several sets of inputs and the correct output for each input set, and attempts to synthesize an algorithm that provides correct outputs. If this process is correctly performed, the network may be able to yield the correct analysis of input sets for which it has not been taught the correct output.
Update : 2025-04-04 Size : 55kb Publisher : diana

307 - Backpropagation Neural Network v1.0 - Namira
Update : 2025-04-04 Size : 57kb Publisher : Nima

test code for backpropagation neural network
Update : 2025-04-04 Size : 1kb Publisher : hussein
CodeBus is one of the largest source code repositories on the Internet!
1999-2046 CodeBus All Rights Reserved.