Description: Boosting is a meta-learning approach that combines an ensemble of weak classifiers into a strong classifier. Adaptive Boosting (AdaBoost) implements this idea as a greedy search for a linear combination of classifiers, overweighting the examples that each classifier misclassifies. icsiboost implements AdaBoost over stumps (one-level decision trees) on discrete and continuous attributes (words and real values). See http://en.wikipedia.org/wiki/AdaBoost and the papers by Y. Freund and R. Schapire for more details [1]. This approach is one of the simplest and most effective ways to combine continuous and nominal attributes. Our implementation aims to allow training from millions of examples with hundreds of features in reasonable time and memory (a minimal stump-based sketch follows this entry).
Platform: |
Size: 116681 |
Author:njustyw |
Hits:
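The following is a minimal, illustrative sketch of AdaBoost over one-level decision stumps as described in the entry above; it is not icsiboost itself. Binary labels in {-1, +1}, numeric features, and the exhaustive threshold search are simplifying assumptions.

```python
# Illustrative AdaBoost over one-level decision stumps (labels in {-1, +1}).
import numpy as np

def fit_stump(X, y, w):
    """Return the (error, feature, threshold, polarity) of the best weighted stump."""
    best = None
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, j] - thr) > 0, 1, -1)
                err = np.sum(w[pred != y])
                if best is None or err < best[0]:
                    best = (err, j, thr, pol)
    return best

def adaboost_stumps(X, y, n_rounds=20):
    n = len(y)
    w = np.full(n, 1.0 / n)                      # example weights
    ensemble = []
    for _ in range(n_rounds):
        err, j, thr, pol = fit_stump(X, y, w)
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)    # weight of this weak classifier
        pred = np.where(pol * (X[:, j] - thr) > 0, 1, -1)
        w *= np.exp(-alpha * y * pred)           # overweight misclassified examples
        w /= w.sum()
        ensemble.append((alpha, j, thr, pol))
    return ensemble

def predict(ensemble, X):
    score = sum(a * np.where(p * (X[:, j] - t) > 0, 1, -1) for a, j, t, p in ensemble)
    return np.where(score >= 0, 1, -1)
```

Each round selects the stump with the lowest weighted error and then overweights the examples it misclassified, which is exactly the greedy search for a linear combination described above.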
Description: This work addresses gene expression profile analysis from the viewpoint of factor analysis. To overcome the instability of independent component analysis (ICA), a DNA microarray ensemble classifier based on selective independent component analysis is proposed. The reconstruction error of the gene expression levels is analyzed first, and the independent components with relatively small reconstruction errors are selected to reconstruct the samples; several support vector machine base classifiers are then trained on the reconstructed samples, and finally the base classifiers with the highest accuracy are combined by majority voting to obtain the final result (a rough sketch follows below). The effectiveness of the method is verified on three commonly used benchmark datasets. Platform: |
Size: 1427456 |
Author:cumtgyy |
Hits:
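A rough sketch of the selective-ICA idea from the entry above, with scikit-learn's FastICA and a linear SVC standing in for the original implementation. The single-component reconstruction-error score, the integer class labels starting at 0, and the omission of the final accuracy-based selection of base classifiers are all simplifying assumptions.

```python
# Selective ICA reconstruction + SVM base classifiers combined by majority vote.
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.svm import SVC

def build_base_classifier(X, y, n_components=10, n_keep=5, seed=0):
    """Fit ICA, keep the components with the smallest single-component
    reconstruction error, and train an SVM on the reconstructed samples."""
    ica = FastICA(n_components=n_components, random_state=seed)
    S = ica.fit_transform(X)          # sources, shape (n_samples, n_components)
    A = ica.mixing_                   # mixing matrix, shape (n_features, n_components)
    errors = [np.linalg.norm(X - (np.outer(S[:, k], A[:, k]) + ica.mean_))
              for k in range(n_components)]
    keep = np.argsort(errors)[:n_keep]

    def reconstruct(Z):
        return ica.transform(Z)[:, keep] @ A[:, keep].T + ica.mean_

    clf = SVC(kernel="linear").fit(reconstruct(X), y)
    return lambda Z: clf.predict(reconstruct(Z))

def ica_svm_vote(X, y, X_test, n_models=5):
    """Majority vote over base classifiers built with different ICA seeds
    (selecting only the most accurate bases, as in the paper, is omitted)."""
    preds = np.array([build_base_classifier(X, y, seed=s)(X_test)
                      for s in range(n_models)])
    return np.apply_along_axis(lambda c: np.bincount(c.astype(int)).argmax(), 0, preds)
```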
Description: Uses the AdaBoost algorithm to generate base support vector machine classifiers and combines the recognition results by simple majority voting (see the sketch below). A support vector machine toolbox and a description of the AdaBoost workflow are included. Platform: |
Size: 1642496 |
Author:李娜 |
Hits:
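A small sketch in the spirit of the entry above: SVM base classifiers produced by AdaBoost-style reweighting (approximated here by weighted resampling), combined by an unweighted majority vote. Binary labels in {-1, +1} and a linear kernel are assumptions; this is not the uploaded toolbox.

```python
# AdaBoost-style reweighting of SVM base classifiers, combined by simple voting.
import numpy as np
from sklearn.svm import SVC

def boosted_svm_vote(X, y, X_test, n_rounds=10, seed=0):
    rng = np.random.default_rng(seed)
    n = len(y)
    w = np.full(n, 1.0 / n)
    bases = []
    for _ in range(n_rounds):
        idx = rng.choice(n, size=n, p=w)              # resample according to weights
        clf = SVC(kernel="linear").fit(X[idx], y[idx])
        pred = clf.predict(X)
        err = np.clip(np.sum(w[pred != y]), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        w *= np.exp(-alpha * y * pred)                # boost the misclassified examples
        w /= w.sum()
        bases.append(clf)
    votes = sum(c.predict(X_test) for c in bases)     # simple, unweighted vote
    return np.where(votes >= 0, 1, -1)
```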
Description: Decomposing a multi-class problem into several binary problems is a common way to solve multi-class classification. The performance of the traditional one-against-all (OAA) decomposition depends mainly on the accuracy of the individual classifiers rather than on their diversity. This entry presents an ensemble-learning model for multi-class problems: a neural network ensemble whose base modules each combine an OAA binary classifier with a complementary multi-class classifier (the OAA step is sketched below). Experiments show that the model achieves higher accuracy on multi-class problems than other classical ensemble algorithms, while requiring less storage and computation time. Platform: |
Size: 8192 |
Author:刘茂 |
Hits:
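A minimal sketch of the one-against-all decomposition mentioned above, using logistic regression as an illustrative binary base learner; the complementary multi-class classifier and the neural-network ensemble of the paper are not reproduced here.

```python
# One-against-all (OAA) decomposition with an illustrative binary base learner.
import numpy as np
from sklearn.linear_model import LogisticRegression

def oaa_fit(X, y):
    classes = np.unique(y)
    # one binary classifier per class: "this class" vs. "all the others"
    models = [LogisticRegression(max_iter=1000).fit(X, (y == c).astype(int))
              for c in classes]
    return classes, models

def oaa_predict(classes, models, X):
    # pick the class whose binary model is most confident
    scores = np.column_stack([m.decision_function(X) for m in models])
    return classes[np.argmax(scores, axis=1)]
```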
Description: MATLAB code for ensemble classifiers, such as decision tree ensembles. Platform: |
Size: 41984 |
Author:parmalik kumar |
Hits:
Description: Phenotype recognition using combined features and a random subspace classifier ensemble, published in Bioinformatics 2011 (the random subspace technique is sketched below). Platform: |
Size: 21504 |
Author:bzhang |
Hits:
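A brief sketch of the generic random subspace ensemble technique named above, not the published phenotype-recognition pipeline; the decision-tree base learner, subspace size, and integer class labels are assumptions.

```python
# Random subspace ensemble: each base learner sees only a random subset of features.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def random_subspace_fit(X, y, n_models=15, subspace_size=0.5, seed=0):
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    k = max(1, int(subspace_size * d))
    models = []
    for _ in range(n_models):
        feats = rng.choice(d, size=k, replace=False)   # random feature subset
        models.append((feats, DecisionTreeClassifier().fit(X[:, feats], y)))
    return models

def random_subspace_predict(models, X):
    preds = np.array([clf.predict(X[:, feats]) for feats, clf in models])
    # plain majority vote over the subspace classifiers (integer labels assumed)
    return np.apply_along_axis(lambda c: np.bincount(c.astype(int)).argmax(), 0, preds)
```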
Description: Random Forest code for classification. Random Forest is an ensemble method for building classifiers. Platform: |
Size: 4096 |
Author:cooldc |
Hits:
Description: Training and testing code for the AdaBoost algorithm; it can perform multi-classifier fusion and achieve high classification accuracy. Platform: |
Size: 6144 |
Author:donghui |
Hits:
Description: Data stream mining is an emerging field within data mining. Current research on data streams focuses on classification, mainly of concept-drifting data. A single classifier is not sufficient for classifying drifting streams: because of the high dimensionality, the data cannot be processed within reasonable time and memory, the false alarm rate is high, and classification accuracy is low. This paper proposes a genetic-algorithm-based intuitionistic fuzzy version of k-means for grouping interdependent features. The proposed method improves classification accuracy and selects a small number of features, which simplifies the learning task (a simplified feature-grouping sketch follows below). Experiments show that the advocated method performs well compared with existing methods. Platform: |
Size: 107520 |
Author:Opencvresearcher |
Hits:
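A very rough sketch of the feature-grouping idea described above, with ordinary k-means standing in for the proposed genetic intuitionistic-fuzzy variant (which is not reproduced here): features are clustered and one representative per group is kept to reduce dimensionality.

```python
# Feature grouping with ordinary k-means (a stand-in for the proposed method):
# cluster the feature columns, then keep one representative feature per group.
import numpy as np
from sklearn.cluster import KMeans

def group_and_select_features(X, n_groups=10, seed=0):
    km = KMeans(n_clusters=n_groups, n_init=10, random_state=seed)
    labels = km.fit_predict(X.T)                 # cluster features, not samples
    selected = []
    for g in range(n_groups):
        members = np.where(labels == g)[0]
        # keep the feature closest to its group centre as the representative
        dists = np.linalg.norm(X.T[members] - km.cluster_centers_[g], axis=1)
        selected.append(members[np.argmin(dists)])
    return np.sort(selected)                     # indices of the reduced feature set
```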
Description: In distributed processing, where common labeled data may not be available for designing a classifier ensemble yet an ensemble solution is still needed, traditional fixed decision aggregation cannot account for class-prior mismatch or classifier dependencies. Previous transductive learning strategies have several drawbacks, e.g., feasibility of the constraints was not guaranteed and heuristic learning was applied. We overcome these problems by developing an improved iterative scaling (IIS) algorithm for the optimal solution (a simplified aggregation sketch follows below). This method is shown to achieve improved decision accuracy over the earlier approaches. Platform: |
Size: 157696 |
Author:yangs |
Hits:
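A sketch of a learned log-linear aggregation of base-classifier decisions. Plain gradient descent on a logistic loss over a small labeled set stands in for the improved iterative scaling (IIS) procedure of the entry above, and the transductive, unlabeled setting is not reproduced; binary labels in {-1, +1} are assumed.

```python
# Log-linear aggregation of base-classifier decisions, trained by gradient descent
# on a logistic loss (a stand-in for improved iterative scaling).
import numpy as np

def learn_aggregation_weights(votes, y, n_steps=200, lr=0.1):
    """votes: array (n_classifiers, n_samples) of +/-1 decisions; y: +/-1 labels."""
    w = np.zeros(votes.shape[0])
    for _ in range(n_steps):
        margin = y * (w @ votes)                  # per-sample ensemble margin
        p = 1.0 / (1.0 + np.exp(margin))          # derivative weight of the logistic loss
        w += lr * (votes * (y * p)).sum(axis=1) / votes.shape[1]
    return w

def aggregate(votes, w):
    return np.where(w @ votes >= 0, 1, -1)        # weighted ensemble decision
```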
Description: This PDF is an English-language article introducing a new gait recognition algorithm that performs recognition from the plantar pressure of a walking person; an ensemble of KNN classifiers is used to improve the recognition rate. It has some reference value. Platform: |
Size: 632832 |
Author:pudn |
Hits:
Description: An ensemble method for dealing with class-imbalance problems, based on random forest ensemble learning (a balanced-resampling sketch follows below). Platform: |
Size: 446464 |
Author:liuzi |
Hits:
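A small sketch of one common way to combine random forests with resampling for class imbalance (balanced bagging with majority-class undersampling); this is a generic illustration rather than the uploaded code. Binary labels 0/1 with 1 as the minority class, and a majority class at least as large as the minority class, are assumptions.

```python
# Balanced bagging of random forests: undersample the majority class per model.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def balanced_forest_ensemble(X, y, X_test, n_models=5, seed=0):
    rng = np.random.default_rng(seed)
    minority = np.where(y == 1)[0]
    majority = np.where(y == 0)[0]
    probs = []
    for i in range(n_models):
        # undersample the majority class down to the minority-class size
        maj_sample = rng.choice(majority, size=len(minority), replace=False)
        idx = np.concatenate([minority, maj_sample])
        rf = RandomForestClassifier(n_estimators=100, random_state=seed + i)
        rf.fit(X[idx], y[idx])
        probs.append(rf.predict_proba(X_test)[:, 1])
    return (np.mean(probs, axis=0) >= 0.5).astype(int)  # average the votes, then threshold
```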
Description: Ensemble learning combines the predictions of several base classifiers; this includes the Bagging and AdaBoost algorithms, as well as random forest, a classifier that trains multiple trees on the samples and aggregates their predictions (a bagging sketch follows below). Platform: |
Size: 2048 |
Author:董小鱼 |
Hits:
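A minimal bagging sketch to accompany the entry above: base classifiers are trained on bootstrap resamples and combined by majority vote. The decision-tree base learner and integer class labels are assumptions.

```python
# Bagging: train base classifiers on bootstrap resamples, combine by majority vote.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def bagging_fit(X, y, n_models=25, seed=0):
    rng = np.random.default_rng(seed)
    n = len(y)
    return [DecisionTreeClassifier().fit(X[idx], y[idx])
            for idx in (rng.choice(n, size=n, replace=True) for _ in range(n_models))]

def bagging_predict(models, X):
    preds = np.array([m.predict(X) for m in models])
    return np.apply_along_axis(lambda c: np.bincount(c.astype(int)).argmax(), 0, preds)
```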
Description: The weights of the misclassified samples are summed to obtain a weighted misclassification rate; here the classifier is evaluated with respect to the weight vector rather than other error metrics (illustrated below). Platform: |
Size: 1024 |
Author:刘雨晴 |
Hits:
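A tiny illustration of the weighted error described above: the weights of the misclassified samples are summed rather than simply counting mistakes.

```python
# Weighted misclassification rate: sum the weights of the misclassified samples.
import numpy as np

def weighted_error(y_true, y_pred, w):
    w = np.asarray(w, dtype=float)
    w = w / w.sum()                       # normalise the weight vector
    return float(np.sum(w[y_true != y_pred]))

# example: only the second sample is misclassified, and it carries weight 0.5
print(weighted_error(np.array([1, -1, 1]), np.array([1, 1, 1]), [0.25, 0.5, 0.25]))  # 0.5
```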