Search - SHANNON ENTROPY

Search list

[Other] entropy

Description: Shannon, Tsallis, escort Tsallis, and Renyi entropies, together with their relative entropies.
Platform: | Size: 1962 | Author: guanwenye | Hits:

[Other] entropy

Description: Shannon, Tsallis, escort Tsallis, and Renyi entropies, together with their relative entropies.
Platform: | Size: 2048 | Author: guanwenye | Hits:

[Windows Develop] entropy

Description: The functions cover the extensive Shannon entropy and the nonextensive Tsallis, escort Tsallis, and Renyi entropies. Function names starting with K_q_ indicate relative entropies; usage is the same for all seven functions.
Platform: | Size: 2048 | Author: 冯小晶 | Hits:
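A minimal sketch of the quantities these entropy packages name, assuming the standard textbook definitions (the uploaded code itself may differ), for a probability vector p and entropic index q:

% Shannon, Renyi, and Tsallis entropy of a discrete distribution p (q ~= 1).
% The escort line is an assumption: the Tsallis formula applied to the
% escort distribution p.^q / sum(p.^q).
p = [0.5 0.25 0.125 0.125];               % example distribution
q = 2;                                    % entropic index
p = p(p > 0);                             % drop zero-probability symbols
H_shannon = -sum(p .* log(p));            % extensive Shannon entropy (nats)
H_renyi   = log(sum(p.^q)) / (1 - q);     % Renyi entropy of order q
H_tsallis = (1 - sum(p.^q)) / (q - 1);    % nonextensive Tsallis entropy
esc       = p.^q / sum(p.^q);             % escort distribution of p
H_escort  = (1 - sum(esc.^q)) / (q - 1);  % escort Tsallis entropy (assumed form)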

[Program doc] shannon-report

Description: This is an English-language paper titled "Claude Shannon and 'A Mathematical Theory of Communication'". It is a reading report on Shannon's "A Mathematical Theory of Communication", covering Shannon's life and background, messages, information, the entropy function, source coding, and channel coding.
Platform: | Size: 259072 | Author: yuan | Hits:

[Program doc] InformationTheoryAndCodingTechniques

Description: Information theory is an emerging science that grew out of the combination of communication technology, probability theory, stochastic processes, and mathematical statistics. Its founder is the American mathematician Claude Shannon, whose "A Mathematical Theory of Communication" laid the foundation of the field. These are teaching slides for information theory and coding techniques. The archive contains six PPT files, each forming its own chapter: 1 Introduction, 2 Sources and their entropy, 3 Channels and their capacity, 4 The rate-distortion function, 5 Source coding, 6 Channel coding. Excellent material for beginners in information theory.
Platform: | Size: 3452928 | Author: yuan | Hits:

[Special Effects] MaximumShannoninformationentropysegement

Description: Image segmentation by maximum Shannon information entropy; the segmentation combines probabilistic and statistical methods and can in principle recover more detail in the segmented image. Ready to use as is.
Platform: | Size: 31744 | Author: 南风吹 | Hits:
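A hedged sketch of the maximum-entropy thresholding idea behind this entry (Kapur's criterion on the gray-level histogram; the uploaded package may implement a different variant), assuming an 8-bit grayscale image and the Image Processing Toolbox:

img  = imread('cameraman.tif');           % any 8-bit grayscale image
p    = imhist(img) / numel(img);          % normalized 256-bin histogram
best = -inf;  thr = 0;
for t = 1:255
    P0 = sum(p(1:t));  P1 = 1 - P0;       % background / foreground mass
    if P0 == 0 || P1 == 0, continue; end
    q0 = p(1:t) / P0;      q0 = q0(q0 > 0);
    q1 = p(t+1:256) / P1;  q1 = q1(q1 > 0);
    H  = -sum(q0 .* log2(q0)) - sum(q1 .* log2(q1));
    if H > best, best = H; thr = t; end   % keep the threshold with maximum entropy
end
bw = img > (thr - 1);                     % binary segmentation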

[MacOS develop] Toolboxv1.0

Description: Joaquin Goñi <jgoni@unav.es> & Iñigo Martincorena <imartincore@alumni.unav.es>, University of Navarra - Dpt. of Physics and Applied Mathematics & Centre for Applied Medical Research, Pamplona (Spain), December 13th, 2007. Information Theory Toolbox v1.0. The Toolbox includes: entropy, conditional entropy, mutual information, redundancy, symmetric uncertainty, Kullback-Leibler divergence, and Jensen-Shannon divergence. Type help for each command to get a detailed description. Citation: if you use it for your academic research work, please kindly cite this toolbox as: Joaquin Goñi, Iñigo Martincorena. Information Theory Toolbox v1.0. University of Navarra - Dpt. of Physics and Applied Mathematics & Centre for Applied Medical Research, Pamplona (Spain).
Platform: | Size: 16384 | Author: le thanh tan | Hits:
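Not the toolbox source, but a minimal sketch of two of the listed quantities, Kullback-Leibler and Jensen-Shannon divergence, for discrete distributions p and q on the same alphabet:

p  = [0.4 0.3 0.2 0.1];                   % example distributions
q  = [0.25 0.25 0.25 0.25];
kl = @(a, b) sum(a(a > 0) .* log2(a(a > 0) ./ b(a > 0)));  % KL(a||b) in bits
m  = 0.5 * (p + q);                       % mixture distribution
KL = kl(p, q);                            % Kullback-Leibler divergence
JS = 0.5 * kl(p, m) + 0.5 * kl(q, m);     % Jensen-Shannon divergence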

[Software Engineering] entropy

Description: This function covers the extensive Shannon entropy and the nonextensive Tsallis, escort Tsallis, and Renyi entropies. Function names starting with K_q_ indicate relative entropies.
Platform: | Size: 3072 | Author: SHIJIA | Hits:

[Special Effects] entropy

Description: The Shannon entropy is a measure of the average information content one is missing when one does not know the value of the random variable. The concept was introduced by Claude E. Shannon in his 1948 paper "A Mathematical Theory of Communication". Shannon's entropy represents an absolute limit on the best possible lossless compression of any communication, under certain constraints: treating messages to be encoded as a sequence of independent and identically distributed random variables, Shannon's source coding theorem shows that, in the limit, the average length of the shortest possible representation to encode the messages in a given alphabet is their entropy divided by the logarithm of the number of symbols in the target alphabet.
Platform: | Size: 2048 | Author: jeremy | Hits:
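The definition described above, as a few lines of MATLAB (an illustration, not the uploaded file): the entropy of a distribution p in bits, which by the source coding theorem lower-bounds the average number of bits per symbol of any lossless code.

p = [0.5 0.25 0.125 0.125];               % example source distribution
p = p(p > 0);                             % ignore zero-probability symbols
H = -sum(p .* log2(p));                   % H = 1.75 bits/symbol for this example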

[matlab] filter1

Description: High-pass filter, band-pass filter, and digital FIR filter.
Platform: | Size: 1024 | Author: jeremy | Hits:
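Not the uploaded filter1 code: a brief sketch, assuming the Signal Processing Toolbox, of the filter types the entry lists (high-pass and band-pass FIR designs applied to a test signal):

fs = 1000;  t = (0:999) / fs;             % 1 kHz sampling
x  = sin(2*pi*50*t) + sin(2*pi*200*t);    % 50 Hz + 200 Hz test signal
b_hp = fir1(64, 100/(fs/2), 'high');      % 64th-order high-pass FIR, 100 Hz cutoff
b_bp = fir1(64, [80 250]/(fs/2));         % band-pass FIR, 80-250 Hz passband
y_hp = filter(b_hp, 1, x);                % filtered outputs
y_bp = filter(b_bp, 1, x);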

[matlab] MyEntropy

Description: Calculates the Shannon information entropy; convenient, fast MATLAB code that can be called directly.
Platform: | Size: 1024 | Author: wains | Hits:
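One plausible approach (an assumption, not necessarily what MyEntropy.m does) to estimating the Shannon entropy of a real-valued signal: bin its amplitudes into a histogram and apply the entropy formula to the bin probabilities.

x = randn(1, 1e4);                        % example signal
counts = histcounts(x, 64);               % 64 amplitude bins
p = counts / sum(counts);                 % bin probabilities
p = p(p > 0);
H = -sum(p .* log2(p));                   % entropy estimate in bits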

[matlab] mipbrink_thresh

Description: This function returns the optimal threshold value using the 2-D Shannon entropy method
Platform: | Size: 1024 | Author: ugur | Hits:

[Other] Gray-entropy-one-threshold-

Description: The difference between gray-level entropy and the traditional Shannon entropy is that gray-level entropy takes into account the uniformity of the gray levels within the object and background classes of the image, which gives better segmentation results.
Platform: | Size: 470016 | Author: 戚齐 | Hits:

[Special Effects] exponential-entropy

Description: Image segmentation by maximum exponential entropy thresholding; it performs better than maximum Shannon entropy.
Platform: | Size: 6144 | Author: 殷骏 | Hits:

[matlab] spectrum-Shannon-entropy

Description: MATLAB source code for the singular-spectrum Shannon entropy and singular-spectrum exponential entropy of a signal.
Platform: | Size: 1024 | Author: 江燕 | Hits:
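A hedged sketch of singular-spectrum Shannon entropy (the uploaded source may differ in window choice or normalization): embed the signal in a trajectory matrix, take its singular values, normalize them to a distribution, and compute the Shannon entropy.

x = sin(2*pi*0.05*(1:1000)) + 0.3*randn(1, 1000);
m = 50;  N = numel(x) - m + 1;            % embedding window length m
X = zeros(m, N);
for k = 1:N
    X(:, k) = x(k:k+m-1).';               % columns of the trajectory matrix
end
s = svd(X, 'econ');                       % singular values
q = s / sum(s);  q = q(q > 0);            % normalized singular spectrum
H = -sum(q .* log2(q));                   % singular-spectrum entropy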

[matlab] power-spectrum-Shannon-entropy

Description: MATLAB source code for the power-spectrum Shannon entropy and power-spectrum exponential entropy of a signal.
Platform: | Size: 1024 | Author: 江燕 | Hits:
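A similar hedged sketch for power-spectrum (spectral) Shannon entropy: normalize the one-sided power spectrum to a probability distribution over frequency bins and take its entropy.

x  = sin(2*pi*0.1*(1:1000)) + 0.3*randn(1, 1000);
Px = abs(fft(x)).^2;                      % raw periodogram
n  = numel(Px);
Px = Px(1:floor(n/2));                    % keep the one-sided spectrum
p  = Px / sum(Px);  p = p(p > 0);         % normalize to a distribution
H  = -sum(p .* log2(p));                  % spectral entropy in bits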

[Software Engineering] Entropy

Description: This is source code for an entropy function, written entirely by me. It implements the Shannon entropy used in physics and in various mathematical functions; it is also used in soft computing courses in computer science. It measures the amount of disorder in a variable that takes different values at different times. It is very useful and easy to understand for anyone familiar with the entropy formula.
Platform: | Size: 1024 | Author: Arpita | Hits:

[matlab] Shannon-entropy-calculation

Description: This program calculates the Shannon entropy quickly; just supply the original signal.
Platform: | Size: 2048 | Author: 高文宾 | Hits:

[matlab] entropy

Description: Calculates the Shannon entropy for signal processing, characterizing how disordered a signal is. Tested; it works correctly.
Platform: | Size: 1024 | Author: zack9017 | Hits:

[AI-NN-PR] entropy

Description: Computes the Shannon entropy and exponential entropy of a signal, from both the power-spectrum and singular-spectrum perspectives.
Platform: | Size: 1024 | Author: spidereason | Hits:

CodeBus www.codebus.net