Search list: Leave-One-Out
Description: New in this version:
Support for multi-class pattern recognition using maxwins, pairwise [4] and DAG-SVM [5] algorithms.
A model selection criterion (the xi-alpha bound [6,7] on the leave-one-out cross-validation error).
Platform: |
Size: 43182 |
Author: 吴成 |
Hits:
Description: Feature scaling for kernel Fisher discriminant analysis using leave-one-out cross validation.
FS-KFDA is a package for implementing feature scaling for kernel Fisher discriminant analysis.
Platform: |
Size: 511443 |
Author: ihatexlet |
Hits:
Description: A two-class SVM classifier based on NSVM; runs under MATLAB 7.1. The main script performs PCA feature extraction, leave-one-out cross-validation, and 5-fold cross-validation (averaged over 10 repetitions).
Platform: |
Size: 6862 |
Author: Conangg |
Hits:
Description: New in this version:
Support for multi-class pattern recognition using maxwins, pairwise [4] and DAG-SVM [5] algorithms.
A model selection criterion (the xi-alpha bound [6,7] on the leave-one-out cross-validation error).
Platform: |
Size: 43008 |
Author: 吴成 |
Hits:
Description: Feature scaling for kernel Fisher discriminant analysis using leave-one-out cross validation.
FS-KFDA is a package for implementing feature scaling for kernel Fisher discriminant analysis.
Platform: |
Size: 510976 |
Author: |
Hits:
Description: MATLAB source code for the leave-one-out method; a very useful implementation.
Platform: |
Size: 2048 |
Author: 潘勇 |
Hits:
Description: The leave-one-out cross-validation scheme is a method for estimating the average generalization error. When calling [Eloo,H] = loo(NetDef,W1,W2,PHI,Y,trparms) with trparms(1)>0, the network will be retrained for a maximum of trparms(1) iterations for each input-output pair in the data set, starting from the initial weights (W1,W2). If trparms(1)=0, an approximation to the LOO estimate based on "linear unlearning" is produced. This is in general less accurate, but much faster to calculate.
Platform: |
Size: 3072 |
Author: 张镇 |
Hits:
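The retrain-per-sample scheme the entry above describes can be sketched generically: for each sample, refit the model on the remaining n-1 samples and score it on the held-out sample, then average. A minimal Python/NumPy sketch, using ordinary least squares as a stand-in for the toolbox's neural network (the function name and data here are illustrative, not part of the package):

```python
import numpy as np

def leave_one_out_error(X, y):
    """Estimate the average generalization error by leave-one-out CV.

    For each sample i, fit an ordinary least-squares model on the
    remaining n-1 samples and record the squared error on sample i.
    """
    n = len(y)
    errors = np.empty(n)
    for i in range(n):
        mask = np.arange(n) != i
        # Fit y = X @ w by least squares on all samples except i.
        w, *_ = np.linalg.lstsq(X[mask], y[mask], rcond=None)
        errors[i] = (X[i] @ w - y[i]) ** 2
    return errors.mean()

# Noise-free linear data: the LOO error should be essentially zero.
X = np.c_[np.ones(10), np.arange(10.0)]
y = 2.0 * np.arange(10.0) + 1.0
print(leave_one_out_error(X, y))
```

Retraining from scratch n times is the expensive part; the "linear unlearning" approximation mentioned in the description trades accuracy for a single fit plus a correction per sample.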
Description: Leave-one-out model selection (looms), suitable for parameter selection in SVM classification and regression. looms uses a slightly modified BSVM to perform model selection on binary classification problems. Currently the RBF kernel is supported.
Platform: |
Size: 56320 |
Author: Mountain |
Hits:
Description: Multiple parameter selection for LS-SVM using the smooth leave-one-out error (SLOO-MPS), used to tune multiple parameters for LS-SVM and overcome the parameter-fixing problem. The paper and source code are included for study.
Platform: |
Size: 338944 |
Author: yjx |
Hits:
Description: The code implements a probabilistic neural network for classification problems, trained with a leave-one-out cross-validation scheme in MATLAB (version 7 or above). The following toolboxes are required: Statistics, Optimization, and Neural Networks.
Platform: |
Size: 33792 |
Author: Alfredo/Passos |
Hits:
Description: M-files for PLS and PLS-DA, with leave-one-out cross-validation and prediction.
Platform: |
Size: 5120 |
Author: robbie |
Hits:
Description: The leave-one-out method in pattern recognition; suitable for beginners. Includes source code and related test data.
Platform: |
Size: 2048 |
Author: ys |
Hits:
Description: Leave-one-out cross-validation, used to obtain the leave-one-out cross-validation coefficient.
Platform: |
Size: 1024 |
Author: qieqie |
Hits:
Description: Leave-one-out for R, with linear regression.
Platform: |
Size: 1024 |
Author: 魏瑶 |
Hits:
Description: LS-SVM Leave-One-Out Cross-Validation Demo
G. C. Cawley, "Leave-one-out cross-validation based model selection criteria for weighted LS-SVMs", Proceedings of the International Joint Conference on Neural Networks (IJCNN-2006), pages 1661-1668, Vancouver, BC, Canada, July 16-21, 2006. theoval.cmp.uea.ac.uk/~gcc/matlab/
Platform: |
Size: 5120 |
Author: leozajung |
Hits:
Description: Leave-one-out cross-validation for calculating PLS RMSECV.
Platform: |
Size: 1024 |
Author: Ahmed serag |
Hits:
Description: Leave-one-out method.
Platform: |
Size: 1024 |
Author: Jerfson Paiva |
Hits:
Description: Leave-one-out cross-validation; used to evaluate classifier performance when building a classifier.
Platform: |
Size: 1024 |
Author: 娃娃 |
Hits:
Description: MATLAB interface for the SVM Light toolbox, already compiled and ready to use. SVMlight, by Joachims, is one of the most widely used SVM classification and regression packages. It has a fast optimization algorithm, can be applied to very large datasets, and has a very efficient implementation of leave-one-out cross-validation. Distributed as C++ source and binaries for Linux, Windows, Cygwin, and Solaris. Kernels: polynomial, radial basis function, and neural (tanh).
Platform: |
Size: 66560 |
Author: ym89413
|
Hits:
Description: LWP is a MATLAB/Octave toolbox implementing Locally Weighted Polynomial regression (also known as local regression, locally weighted scatterplot smoothing, LOESS/LOWESS, and kernel smoothing). With this toolbox you can fit local polynomials of any degree, using one of nine kernels with metric or nearest-neighbor window widths, to data of any dimensionality. A function for optimizing the kernel bandwidth is also provided; the optimization can be performed using leave-one-out cross-validation, GCV, AICC, AIC, FPE, T, S, or separate validation data. Robust fitting is available as well.
Platform: |
Size: 5120 |
Author: baidudu |
Hits:
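Bandwidth optimization by leave-one-out cross-validation, as the LWP toolbox offers, can be illustrated with a small local-linear smoother: for each candidate bandwidth, predict every point from the other points and keep the bandwidth with the smallest held-out error. A simplified Python/NumPy sketch, not the toolbox's actual implementation (all names are illustrative):

```python
import numpy as np

def local_linear_predict(x0, x, y, h):
    """Locally weighted linear fit at x0 with a Gaussian kernel of width h."""
    sw = np.sqrt(np.exp(-0.5 * ((x - x0) / h) ** 2))  # sqrt of kernel weights
    A = np.c_[np.ones_like(x), x - x0]                # local linear design matrix
    coef, *_ = np.linalg.lstsq(A * sw[:, None], y * sw, rcond=None)
    return coef[0]                                    # intercept = estimate at x0

def loo_bandwidth(x, y, grid):
    """Choose the bandwidth from `grid` that minimizes the leave-one-out error."""
    def loo_error(h):
        idx = np.arange(len(x))
        return sum((local_linear_predict(x[i], x[idx != i], y[idx != i], h) - y[i]) ** 2
                   for i in idx)
    return min(grid, key=loo_error)

# Smooth sine curve with mild deterministic "noise".
x = np.linspace(0.0, 1.0, 40)
y = np.sin(2 * np.pi * x) + 0.05 * np.cos(37 * x)
best_h = loo_bandwidth(x, y, [0.02, 0.05, 0.1, 0.3])
print(best_h)
```

The same grid search extends to the other criteria the toolbox lists (GCV, AIC, etc.) by swapping the scoring function.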