Description: The perceptron and other classification algorithms such as LMSE. Also includes more, for example the C-means method and various simple unsupervised clustering programs, convenient for direct use. Platform: |
Size: 2499 |
Author:王若 |
Hits:
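Since the listings above only name the algorithms, a minimal perceptron training sketch follows for orientation. It is a generic NumPy illustration, not the uploaded code; the function name, learning rate, and the +1/-1 label convention are assumptions made for the example.

import numpy as np

def perceptron_train(X, y, lr=1.0, max_epochs=100):
    """Train a two-class perceptron; y holds +1/-1 labels."""
    Xa = np.hstack([X, np.ones((X.shape[0], 1))])  # append a bias component
    w = np.zeros(Xa.shape[1])
    for _ in range(max_epochs):
        errors = 0
        for xi, yi in zip(Xa, y):
            if yi * np.dot(w, xi) <= 0:   # misclassified sample
                w += lr * yi * xi         # perceptron correction rule
                errors += 1
        if errors == 0:                   # all samples correctly separated
            break
    return w

# Example usage on a tiny linearly separable set:
# X = np.array([[2.0, 1.0], [1.0, 3.0], [-1.0, -2.0], [-2.0, -1.0]])
# y = np.array([1, 1, -1, -1])
# w = perceptron_train(X, y)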
Description: The perceptron and other classification algorithms such as LMSE. Also includes more, for example the C-means method and various simple unsupervised clustering programs, convenient for direct use. Platform: |
Size: 2048 |
Author:王若 |
Hits:
Description: Java implementations of pattern-recognition clustering algorithms such as K-means and ISODATA, together with the perceptron linear discriminant algorithm. The source includes a complete Eclipse project for easy reuse. Platform: |
Size: 351232 |
Author:YSH |
Hits:
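The K-means method named in this entry can be summarized with the short sketch below. It is a generic NumPy illustration under assumed inputs (a data matrix X and a cluster count k), not the Java code contained in the package.

import numpy as np

def kmeans(X, k, max_iters=100, seed=0):
    """Plain K-means: alternate nearest-centroid assignment and centroid update."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(max_iters):
        # assign each sample to its nearest centroid
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # recompute centroids; keep the old centroid if a cluster becomes empty
        new_centroids = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                                  else centroids[j] for j in range(k)])
        if np.allclose(new_centroids, centroids):
            break
        centroids = new_centroids
    return labels, centroids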
Description: The Statistical Pattern Recognition Toolbox (STPRtool) contains:
• Analysis of linear discriminant functions: the Perceptron algorithm and its multiclass modification, Kozinec's algorithm, the Fisher Linear Discriminant, and a collection of known algorithms solving the Generalized Anderson's Task.
• Feature extraction: Linear Discriminant Analysis, Principal Component Analysis (PCA), Kernel PCA, Greedy Kernel PCA, and Generalized Discriminant Analysis.
• Probability distribution estimation and clustering: Gaussian Mixture Models, the Expectation-Maximization algorithm, minimax probability estimation, and K-means clustering.
• Support Vector and other Kernel Machines: the Sequential Minimal Optimizer (SMO), algorithms based on the Matlab Optimization Toolbox, an interface to the SVMlight software, decomposition approaches to train multi-class SVM classifiers, and the multi-class BSVM formulation trained by Kozinec's algorithm and the Mitchell-Demyanov-Malozemov algorithm. Platform: |
Size: 4271104 |
Author:查日东 |
Hits:
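As one concrete example of the feature-extraction methods listed for STPRtool, here is a minimal PCA sketch via eigendecomposition of the sample covariance. It is a generic NumPy illustration, not STPRtool's own routine, and the function and variable names are assumptions.

import numpy as np

def pca(X, n_components):
    """Project X (n_samples x n_features) onto its top principal components."""
    X_centered = X - X.mean(axis=0)
    cov = np.cov(X_centered, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)             # eigh: covariance is symmetric
    order = np.argsort(eigvals)[::-1][:n_components]   # largest eigenvalues first
    components = eigvecs[:, order]
    return X_centered @ components, components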
Description: Cluster analysis of a set of samples with the C-means algorithm, followed by classification of the clustering result with the perceptron algorithm; an experiment for a pattern recognition course. Platform: |
Size: 2048 |
Author:韩梅 |
Hits:
Description: Pattern recognition clusters the samples; the perceptron algorithm iteratively corrects the weight vector until every sample satisfies the classification condition. This program modifies the perceptron algorithm: as soon as the weight vector fails on a sample, the current pass is abandoned and the next iteration begins, which reduces the number of computations. The code is commented in detail and automatically determines the number of samples and their dimensionality. Platform: |
Size: 2048 |
Author:Mali Ang |
Hits:
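The early-exit modification described above can be sketched as follows. This is an interpretation of the description (update on the first misclassified sample, then restart the pass) rather than the author's actual code, and the function name and defaults are assumptions.

import numpy as np

def perceptron_early_exit(X, y, lr=1.0, max_passes=1000):
    """Perceptron that abandons a pass at the first misclassified sample."""
    n, d = X.shape                         # sample count and dimension read from the data
    Xa = np.hstack([X, np.ones((n, 1))])   # augment with a bias component
    w = np.zeros(d + 1)
    for _ in range(max_passes):
        for xi, yi in zip(Xa, y):
            if yi * np.dot(w, xi) <= 0:
                w += lr * yi * xi          # correct the weight vector ...
                break                      # ... and immediately start the next pass
        else:
            return w                       # a full pass with no correction: converged
    return w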
Description: An important aspect of an ANN model is whether it needs
guidance in learning or not. Based on the way they learn, all
artificial neural networks can be divided into two learning
categories - supervised and unsupervised.
• In supervised learning, a desired output result for each input
vector is required when the network is trained. An ANN of the
supervised learning type, such as the multi-layer perceptron, uses
the target result to guide the formation of the neural parameters. It
is thus possible to make the neural network learn the behavior of
the process under study.
• In unsupervised learning, the training of the network is entirely
data-driven and no target results for the input data vectors are
provided. An ANN of the unsupervised learning type, such as the
self-organizing map, can be used for clustering the input data and
find features inherent to the problem. Platform: |
Size: 125952 |
Author:Vishal |
Hits:
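To make the supervised/unsupervised contrast above concrete, the sketch below trains a small multi-layer perceptron on labeled data and, separately, clusters the same inputs without labels. scikit-learn is used here for brevity, and K-means stands in for the self-organizing map as a simpler unsupervised example; the data and parameters are assumptions.

import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.cluster import KMeans

X = np.array([[0.0, 0.2], [0.1, 0.1], [0.9, 1.0], [1.0, 0.8]])
y = np.array([0, 0, 1, 1])   # target labels, required only for supervised learning

# Supervised: the multi-layer perceptron is guided by the target labels y.
mlp = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0).fit(X, y)
print(mlp.predict([[0.05, 0.15]]))

# Unsupervised: clustering is driven by the input data alone, no labels given.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)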
Description: EEG signal classification using K-means clustering and a multilayer perceptron neural network model. Platform: |
Size: 2897920 |
Author:ALi |
Hits:
Description: MATLAB pattern recognition and classification algorithms for the Iris and breast-cancer data sets, containing implementation code for genetic algorithm + SVM, ISODATA, the perceptron algorithm, LMSE, neural networks, and others. The clustering results are good, and the code serves as reference material for a pattern recognition course project. Platform: |
Size: 6144 |
Author:张小明12345 |
Hits:
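Several of the listings above (the first entry and this last one) mention LMSE; a minimal least-mean-square-error (pseudo-inverse) linear classifier sketch is given below. It is a generic illustration, not the code from either package, and the function names and the +1/-1 label convention are assumptions.

import numpy as np

def lmse_train(X, y):
    """Least-mean-square-error linear classifier via the pseudo-inverse.
    y holds +1/-1 labels; returns an augmented weight vector."""
    Xa = np.hstack([X, np.ones((X.shape[0], 1))])   # append a bias component
    return np.linalg.pinv(Xa) @ y                   # w = X^+ y minimizes ||Xw - y||^2

def lmse_predict(w, X):
    Xa = np.hstack([X, np.ones((X.shape[0], 1))])
    return np.sign(Xa @ w)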