Search - Decision Trees - List
machine learning, accuracy estimation, cross-validation, bootstrap, ID3, decision trees, decision graphs, naive-bayes, decision tables, majority, induction algorithms, classifiers, categorizers, general logic diagrams, instance-based algorithms, discretization, lazy learning, bagging, MineSet.
Update : 2008-10-13 Size : 3.01mb Publisher : infinite8

Decision tree Lisp code.
Update : 2008-10-13 Size : 3.39kb Publisher : 潘煜

Boosting is a meta-learning approach that combines an ensemble of weak classifiers to form a strong classifier. Adaptive Boosting (AdaBoost) implements this idea as a greedy search for a linear combination of classifiers, overweighting the examples that each classifier misclassifies. icsiboost implements AdaBoost over stumps (one-level decision trees) on discrete and continuous attributes (words and real values). See http://en.wikipedia.org/wiki/AdaBoost and the papers by Y. Freund and R. Schapire for more details [1]. This approach is one of the most efficient and simplest ways to combine continuous and nominal values. Our implementation aims to allow training on millions of examples with hundreds of features in reasonable time and memory. (A minimal sketch of the stump-based reweighting loop follows this entry.)
Update : 2008-10-13 Size : 113.95kb Publisher : njustyw
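
The entry above describes AdaBoost over one-level decision trees (stumps). Purely as an illustration of that reweighting loop, and not icsiboost's actual C implementation, here is a minimal sketch of discrete AdaBoost with threshold stumps on continuous features; all names are hypothetical.

    import numpy as np

    def train_stump(X, y, w):
        """Pick the (feature, threshold, polarity) stump with the lowest weighted error."""
        best = (np.inf, 0, 0.0, 1)
        for j in range(X.shape[1]):
            for t in np.unique(X[:, j]):
                for pol in (1, -1):
                    pred = np.where(pol * (X[:, j] - t) > 0, 1, -1)
                    err = w[pred != y].sum()
                    if err < best[0]:
                        best = (err, j, t, pol)
        return best

    def adaboost(X, y, rounds=20):
        """Discrete AdaBoost; labels y must be in {-1, +1}."""
        w = np.full(len(y), 1.0 / len(y))        # start with uniform example weights
        ensemble = []
        for _ in range(rounds):
            err, j, t, pol = train_stump(X, y, w)
            alpha = 0.5 * np.log((1 - err) / max(err, 1e-12))
            pred = np.where(pol * (X[:, j] - t) > 0, 1, -1)
            w *= np.exp(-alpha * y * pred)       # overweight misclassified examples
            w /= w.sum()
            ensemble.append((alpha, j, t, pol))
        return ensemble

    def predict(ensemble, X):
        score = sum(a * np.where(p * (X[:, j] - t) > 0, 1, -1) for a, j, t, p in ensemble)
        return np.sign(score)

The weighted vote in predict is the "linear combination of classifiers" the description refers to; icsiboost additionally handles nominal (word) attributes and very large datasets, which this sketch does not.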

This archive contains classic foreign data mining papers, covering decision trees, Bayesian methods, and more.
Update : 2025-02-17 Size : 2.75mb Publisher : zhangli

NeC4.5 is a variant of the C4.5 decision tree that can generate more accurate trees than standard C4.5 by using a neural network ensemble as a pre-processing step for C4.5. (A rough sketch of this pre-processing idea follows this entry.)
Update : 2025-02-17 Size : 8kb Publisher :
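
As a loose sketch of the pre-processing idea described above, and not the authors' NeC4.5 code: train an ensemble of small neural networks, relabel the training set with the ensemble's predictions, and then grow a single decision tree on the relabeled data. The scikit-learn estimators and helper names below are assumptions for illustration (the estimator keyword assumes a recent scikit-learn), and CART stands in for C4.5, which scikit-learn does not provide.

    from sklearn.ensemble import BaggingClassifier
    from sklearn.neural_network import MLPClassifier
    from sklearn.tree import DecisionTreeClassifier

    def relabel_with_ensemble(X_train, y_train, n_nets=5):
        """NeC4.5-style pre-processing sketch: a bagged ensemble of small neural
        nets relabels the training data before the tree is grown."""
        ensemble = BaggingClassifier(
            estimator=MLPClassifier(hidden_layer_sizes=(16,), max_iter=500),
            n_estimators=n_nets,
        )
        ensemble.fit(X_train, y_train)
        return ensemble.predict(X_train)         # ensemble-smoothed labels

    def train_nec45_like(X_train, y_train):
        y_smoothed = relabel_with_ensemble(X_train, y_train)
        tree = DecisionTreeClassifier()          # CART as a stand-in for C4.5
        return tree.fit(X_train, y_smoothed)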

Induction of decision trees for multi-class classification problems in pattern recognition.
Update : 2025-02-17 Size : 875kb Publisher : john

Using decision trees in a network intrusion detection system (NIDS).
Update : 2025-02-17 Size : 296kb Publisher : ali

The MATLAB code implements the ensemble of decision tree classifiers proposed in: L. Nanni and A. Lumini, "Input Decimated Ensemble based on Neighborhood Preserving Embedding for spectrogram classification," Expert Systems With Applications, doi:10.1016/j.eswa.2009.02.072.
Update : 2025-02-17 Size : 1kb Publisher : loris nanni

The ID3 algorithm (Quinlan) uses the method of top-down induction of decision trees. Given a set of classified examples, a decision tree is induced, biased by the information-gain measure, which heuristically leads to small trees. The examples are given in attribute-value representation, and the set of possible classes is finite. Only tests that split the set of instances on the value of a single attribute are supported. (A compact sketch of this induction loop follows this entry.)
Update : 2025-02-17 Size : 4kb Publisher : Minh
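
As a compact, self-contained illustration of the top-down induction described above, and not this package's code, the sketch below grows a tree on attribute-value examples by repeatedly splitting on the attribute with the highest information gain; all names are hypothetical.

    import math
    from collections import Counter

    def entropy(labels):
        total = len(labels)
        return -sum((c / total) * math.log2(c / total) for c in Counter(labels).values())

    def information_gain(examples, labels, attr):
        """Entropy reduction from splitting on a single attribute's values."""
        remainder = 0.0
        for value in set(ex[attr] for ex in examples):
            subset = [lab for ex, lab in zip(examples, labels) if ex[attr] == value]
            remainder += len(subset) / len(labels) * entropy(subset)
        return entropy(labels) - remainder

    def id3(examples, labels, attributes):
        """Leaf = class label; internal node = (attribute, {value: subtree})."""
        if len(set(labels)) == 1:
            return labels[0]                                  # pure node
        if not attributes:
            return Counter(labels).most_common(1)[0][0]       # majority class
        best = max(attributes, key=lambda a: information_gain(examples, labels, a))
        children = {}
        for value in set(ex[best] for ex in examples):
            idx = [i for i, ex in enumerate(examples) if ex[best] == value]
            children[value] = id3([examples[i] for i in idx],
                                  [labels[i] for i in idx],
                                  [a for a in attributes if a != best])
        return (best, children)

Examples are dictionaries mapping attribute names to discrete values, matching the attribute-value representation and single-attribute tests mentioned in the description.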

Decision tree source code for Chapter 3 (Decision Trees) of Machine Learning, Tom Mitchell, McGraw Hill.
Update : 2025-02-17 Size : 3kb Publisher : jinjiabao

The program, written in C++ using the OpenCV library, demonstrates how to use different decision trees.
Update : 2025-02-17 Size : 1kb Publisher : zeus

A comparative study on heuristic algorithms for generating fuzzy decision trees
Update : 2025-02-17 Size : 223kb Publisher : Artz

Visual C# desktop application for decision trees.
Update : 2025-02-17 Size : 322kb Publisher : ileZ

A decision tree is an important method among data mining classification algorithms, and of the various classification algorithms it is the most intuitive. Given the known probability of each outcome, a decision tree is constructed to compute the probability that the expected net present value is greater than or equal to zero. (A small worked expected-value example follows this entry.)
Update : 2025-02-17 Size : 162kb Publisher : yanhaohui
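
As a small worked illustration of the expected-value computation described above, and not this package's code, the sketch below evaluates a tiny decision tree whose chance nodes carry outcome probabilities and whose leaves carry net present values; the structure and numbers are invented.

    # A chance node is a list of (probability, subtree) branches; a leaf is an NPV payoff.
    def expected_value(node):
        if isinstance(node, (int, float)):
            return node                                   # leaf: NPV of this outcome
        return sum(p * expected_value(child) for p, child in node)

    def prob_npv_nonnegative(node):
        """Probability that the realized NPV is greater than or equal to zero."""
        if isinstance(node, (int, float)):
            return 1.0 if node >= 0 else 0.0
        return sum(p * prob_npv_nonnegative(child) for p, child in node)

    # Hypothetical project: 60% success (itself split into strong/weak markets), 40% failure.
    project = [
        (0.6, [(0.5, 200.0), (0.5, 40.0)]),
        (0.4, -80.0),
    ]
    print(expected_value(project))        # 0.6*(0.5*200 + 0.5*40) + 0.4*(-80) = 40.0
    print(prob_npv_nonnegative(project))  # 0.6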

Builds a decision tree in MATLAB to analyze data, using MATLAB's built-in decision-tree functions. The tree is used to predict contact lens type; the package includes a training data file (lense.txt) and a description file (comments.txt). (An illustrative sketch in another language follows this entry.)
Update : 2025-02-17 Size : 2kb Publisher : lyr
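
The package itself is MATLAB; purely to illustrate the same workflow in another language, the sketch below fits a decision tree to a whitespace-delimited lens dataset with scikit-learn. The column layout assumed for lense.txt (categorical attributes followed by the lens-type label) is a guess based on the description above.

    import numpy as np
    from sklearn.preprocessing import OrdinalEncoder
    from sklearn.tree import DecisionTreeClassifier, export_text

    rows = [line.split() for line in open("lense.txt") if line.strip()]
    X_raw = [r[:-1] for r in rows]                   # attribute columns
    y = [r[-1] for r in rows]                        # lens-type label (last column)

    X = OrdinalEncoder().fit_transform(np.array(X_raw, dtype=object))
    tree = DecisionTreeClassifier(criterion="entropy")   # information-gain-style splits
    tree.fit(X, y)
    print(export_text(tree))                         # inspect the learned rules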

Implementation of a decision tree in data mining to solve multi-branch problems.
Update : 2025-02-17 Size : 51kb Publisher : reidsneo

Machine Learning Decision Trees
Update : 2025-02-17 Size : 75kb Publisher : 宗晓