Description: The Swendsen-Wang Cuts algorithm is used to label atomic regions (superpixels) based on their intensity patterns using generative models in a Bayesian framework. The prior is based on areas of connected components, which provides a clean segmentation result. A performance comparison of the Swendsen-Wang Cuts algorithm with the Gibbs sampler shows that our algorithm is 400 times faster.
A. Barbu, S.C. Zhu. Generalizing Swendsen-Wang for Image Analysis. J. Comp. Graph. Stat., 16(4), 2007 (pdf)
A. Barbu, S.C. Zhu. Generalizing Swendsen-Wang to Sampling Arbitrary Posterior Probabilities. IEEE TPAMI, 27, August 2005 (pdf)
A. Barbu, S.C. Zhu. Graph Partition by Swendsen-Wang Cuts. ICCV 2003 (pdf)
Platform: |
Size: 12946406 |
Author:bevin |
Hits:
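For reference, a minimal Python sketch (illustrative names only) of one classic Swendsen-Wang sweep on a plain Potts grid. The SW Cuts algorithm described above builds on this move but adds data-driven edge probabilities and a Metropolis-Hastings acceptance step, which are not shown here.

import numpy as np
from scipy.sparse import coo_matrix
from scipy.sparse.csgraph import connected_components

def sw_sweep(labels, beta, n_states, rng):
    # One classic Swendsen-Wang sweep on a 4-connected Potts grid
    # (prior only, no data term).
    h, w = labels.shape
    idx = np.arange(h * w).reshape(h, w)
    # Candidate edges: right and down neighbours.
    src = np.concatenate([idx[:, :-1].ravel(), idx[:-1, :].ravel()])
    dst = np.concatenate([idx[:, 1:].ravel(), idx[1:, :].ravel()])
    same = labels.ravel()[src] == labels.ravel()[dst]
    # Turn each same-label edge "on" with probability 1 - exp(-beta).
    on = same & (rng.random(src.size) < 1.0 - np.exp(-beta))
    graph = coo_matrix((np.ones(on.sum()), (src[on], dst[on])),
                       shape=(h * w, h * w))
    n_comp, comp = connected_components(graph, directed=False)
    # Relabel every connected component with a uniformly random state.
    new_state = rng.integers(0, n_states, size=n_comp)
    return new_state[comp].reshape(h, w)

rng = np.random.default_rng(0)
labels = rng.integers(0, 3, size=(64, 64))
for _ in range(20):
    labels = sw_sweep(labels, beta=0.9, n_states=3, rng=rng)
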
Description: Segmentation is one of the basic problems in MRI analysis. We consider the problem of simultaneously segmenting multiple MR images, which, for example, could be a series of (2D/3D) images of the same tissue scanned over time, different slices of a volume image, or images of symmetric parts. The multiple MR images to be segmented share common structural information and hence can assist each other in the segmentation procedure. We propose a Bayesian co-segmentation algorithm in which the shared information across images is utilized via a Markov random field prior, and a Gibbs sampler is employed for efficient posterior sampling. Because our co-segmentation algorithm takes all the image information into consideration simultaneously, it provides more accurate and robust results than individual segmentation, as supported by results from both simulated and real examples. Platform: |
Size: 812032 |
Author:命运 |
Hits:
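For the co-segmentation package above, a minimal sketch of the kind of single-site Gibbs sweep it relies on, simplified here to a single image with a Gaussian likelihood per class and a 4-neighbour Potts prior; the package itself couples several images through a shared MRF prior, and all names and parameters below are illustrative assumptions.

import numpy as np

def gibbs_sweep(labels, image, means, sigma, beta, rng):
    # One Gibbs sweep: resample every pixel label from its full conditional,
    # combining a Gaussian likelihood with a Potts smoothness prior.
    h, w = labels.shape
    n_classes = len(means)
    for i in range(h):
        for j in range(w):
            neigh = []
            if i > 0:     neigh.append(labels[i - 1, j])
            if i < h - 1: neigh.append(labels[i + 1, j])
            if j > 0:     neigh.append(labels[i, j - 1])
            if j < w - 1: neigh.append(labels[i, j + 1])
            neigh = np.array(neigh)
            # Number of agreeing neighbours for each candidate class.
            agree = (neigh[None, :] == np.arange(n_classes)[:, None]).sum(axis=1)
            loglik = -0.5 * ((image[i, j] - means) / sigma) ** 2
            logp = loglik + beta * agree
            p = np.exp(logp - logp.max())
            labels[i, j] = rng.choice(n_classes, p=p / p.sum())
    return labels
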
Description: A toolkit written by researchers working on Gibbs sampling and Bayesian inference; the latest version is V1.4.3. It is well suited for researchers studying machine learning and Bayesian inference. The BUGS (Bayesian inference Using Gibbs Sampling) project is concerned with flexible software for the Bayesian analysis of complex statistical models using Markov chain Monte Carlo (MCMC) methods. The project began in 1989 in the MRC Biostatistics Unit and led initially to the Classic BUGS program, and then to the WinBUGS software developed jointly with the Imperial College School of Medicine at St Mary's, London. Development now also includes the OpenBUGS project at the University of Helsinki, Finland. There are now a number of versions of BUGS, which can be confusing. Platform: |
Size: 11234304 |
Author:王磊 |
Hits:
Description: Auto Gaussian Surface Fit
---
Two routines to fit a 2D Gaussian to a surface:
zi = a*exp(-((xi-x0).^2/2/sigmax^2 + (yi-y0).^2/2/sigmay^2)) + b
The routines are automatic in the sense that they do not require the specification of starting guesses for the model parameters.
autoGaussianSurfML(xi,yi,zi) fits the model parameters through maximum likelihood (least squares). It first evaluates the quality of the model at many possible values of the parameters, then chooses the best set and refines it with lsqcurvefit.
autoGaussianSurfGS(xi,yi,zi) estimates the model parameters by specifying a Bayesian generative model for the data, then drawing samples from the posterior of this model through Gibbs sampling. This method is insensitive to local minima in the posterior and gives meaningful error bars (Bayesian confidence intervals). Platform: |
Size: 7168 |
Author:zzskzcau |
Hits:
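As a point of comparison with autoGaussianSurfML, the same 2D Gaussian model can be fit by least squares in Python with scipy.optimize.curve_fit; unlike the routines above, this sketch is not automatic and needs a starting guess supplied by hand.

import numpy as np
from scipy.optimize import curve_fit

def gauss2d(xy, a, x0, y0, sx, sy, b):
    # zi = a*exp(-((xi-x0)^2/(2*sx^2) + (yi-y0)^2/(2*sy^2))) + b
    xi, yi = xy
    return a * np.exp(-((xi - x0) ** 2 / (2 * sx ** 2)
                        + (yi - y0) ** 2 / (2 * sy ** 2))) + b

# Synthetic noisy surface.
xi, yi = np.meshgrid(np.linspace(-5, 5, 40), np.linspace(-5, 5, 40))
zi = gauss2d((xi.ravel(), yi.ravel()), 2.0, 1.0, -0.5, 1.5, 0.8, 0.3)
zi += 0.05 * np.random.default_rng(1).standard_normal(zi.size)

# Rough starting guess (the MATLAB routines automate this step).
p0 = (zi.max() - zi.min(), 0.0, 0.0, 1.0, 1.0, zi.min())
popt, pcov = curve_fit(gauss2d, (xi.ravel(), yi.ravel()), zi, p0=p0)
print(popt)  # fitted a, x0, y0, sigmax, sigmay, b
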
Description: In this paper, we propose a Bayesian methodology for receiver function analysis, a key tool in determining the deep structure of the Earth's crust. We exploit the assumption of sparsity for receiver functions to develop a Bayesian deconvolution method as an alternative to the widely used iterative deconvolution. We model samples of a sparse signal as i.i.d. Student-t random variables. Gibbs sampling and variational Bayes techniques are investigated for our specific posterior inference problem. We use these techniques within the expectation-maximization (EM) algorithm to estimate our unknown model parameters. The superiority of the Bayesian deconvolution is demonstrated by experiments on both simulated and real earthquake data. Platform: |
Size: 3350528 |
Author:张洋 |
Hits:
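The Student-t prior is what makes Gibbs sampling tractable here, through its scale-mixture-of-normals representation; the Gibbs and variational steps condition on the latent scales. A small Python sketch assuming the standard Gamma-mixture parameterisation (not necessarily the paper's exact notation):

import numpy as np

rng = np.random.default_rng(0)
nu, n = 4.0, 200_000

# Scale mixture of normals: lam_i ~ Gamma(nu/2, rate=nu/2),
# x_i | lam_i ~ N(0, 1/lam_i)  =>  marginally x_i ~ Student-t(nu).
lam = rng.gamma(shape=nu / 2.0, scale=2.0 / nu, size=n)
x_mix = rng.normal(0.0, 1.0 / np.sqrt(lam))

x_t = rng.standard_t(nu, size=n)
print(np.var(x_mix), np.var(x_t))  # both close to nu/(nu-2) = 2.0
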
Description: PURPOSE: MCMC estimates the Bayesian heteroscedastic AR(k) model, imposing stability restrictions, using Gibbs sampling:
y = b0 + y(t-1)*b1 + y(t-2)*b2 + ... + y(t-k)*bk + E,
E = N(0, sige*V), sige = gamma(nu,d0), b = N(c,T),
V = diag(v1,v2,...,vn), r/vi = ID chi^2(r)/r, r = Gamma(m,k) Platform: |
Size: 3072 |
Author:Jack |
Hits:
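A stripped-down Python sketch of the two core conditional draws (coefficients and error variance) for the AR(k) model above, omitting the heteroscedastic v_i terms and the stability restriction; the function name, default priors, and parameterisation are illustrative assumptions, not the package's actual interface.

import numpy as np

def gibbs_ar(y, k, n_draws, rng, c=None, T=None, nu=3.0, d0=1.0):
    # Homoscedastic Gibbs sampler for y_t = b0 + b1*y_{t-1} + ... + bk*y_{t-k} + e_t,
    # with priors b ~ N(c, T) and sige ~ inverse Gamma(nu/2, d0/2) (assumed form).
    N = len(y)
    n = N - k
    X = np.column_stack([np.ones(n)] + [y[k - j: N - j] for j in range(1, k + 1)])
    yk = y[k:]
    p = k + 1
    c = np.zeros(p) if c is None else c
    T = 100.0 * np.eye(p) if T is None else T
    Tinv = np.linalg.inv(T)
    sige = np.var(yk)
    draws = np.empty((n_draws, p + 1))
    for it in range(n_draws):
        # b | sige, y ~ Normal
        Vb = np.linalg.inv(X.T @ X / sige + Tinv)
        mu_b = Vb @ (X.T @ yk / sige + Tinv @ c)
        b = rng.multivariate_normal(mu_b, Vb)
        # sige | b, y ~ inverse Gamma (drawn as scale / Gamma)
        resid = yk - X @ b
        sige = (d0 + resid @ resid) / 2.0 / rng.gamma((nu + n) / 2.0)
        draws[it] = np.append(b, sige)
    return draws

# Simulate a stable AR(2) and recover its coefficients.
rng = np.random.default_rng(0)
y = np.zeros(500)
e = rng.standard_normal(500)
for t in range(2, 500):
    y[t] = 0.2 + 0.5 * y[t - 1] + 0.2 * y[t - 2] + e[t]
draws = gibbs_ar(y, k=2, n_draws=2000, rng=rng)
print(draws[500:, :3].mean(axis=0))  # posterior means of b0, b1, b2
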
Description: Image denoising - A Generative Perspective on MRFs in Low-Level Vision
Markov random fields (MRFs) are popular and generic
probabilistic models of prior knowledge in low-level vision.
Yet their generative properties are rarely examined, while
application-specific models and non-probabilistic learning
are gaining increased attention. In this paper we revisit
the generative aspects of MRFs, and analyze the quality of
common image priors in a fully application-neutral setting.
Enabled by a general class of MRFs with flexible potentials
and an efficient Gibbs sampler, we find that common models
do not capture the statistics of natural images well. We
show how to remedy this by exploiting the efficient sampler
for learning better generative MRFs based on flexible potentials.
We perform image restoration with these models
by computing the Bayesian minimum mean squared error
estimate (MMSE) using sampling. This addresses a number
of shortcomings that have limited generative MRFs so far,
and le Platform: |
Size: 1216512 |
Author:孙文义 |
Hits:
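The MMSE estimate used for restoration above is just the posterior mean, approximated by averaging posterior samples. A toy Python illustration with a conjugate Gaussian model where the posterior mean is known in closed form (in the paper the samples instead come from a Gibbs sampler over the MRF posterior):

import numpy as np

rng = np.random.default_rng(0)

# Scalar Gaussian prior and likelihood, one noisy observation y.
prior_mu, prior_var = 0.0, 1.0
noise_var = 0.25
y = 0.8

# Closed-form posterior for reference.
post_var = 1.0 / (1.0 / prior_var + 1.0 / noise_var)
post_mu = post_var * (prior_mu / prior_var + y / noise_var)

# MMSE estimate via sampling: average of posterior samples.
samples = rng.normal(post_mu, np.sqrt(post_var), size=100_000)
print(samples.mean(), post_mu)  # the two agree up to Monte Carlo error
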