Description: The greater the entropy, the greater the average information carried by each symbol. In noisy speech, the spectral entropy of speech segments and of noise segments differs markedly: noise energy is spread relatively evenly across the whole frequency band, so its entropy is large, whereas speech energy is concentrated in a few specific frequency bands, so its entropy is small. This difference can therefore be used to distinguish noise segments from speech segments.
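The following is a minimal Python sketch of this spectral-entropy idea, not the contents of ABSE.m: the frame length, FFT size, threshold value, and the function names `spectral_entropy` and `detect_speech` are illustrative assumptions. Each frame's power spectrum is normalized to a probability distribution over frequency bins; a flat (noise-like) spectrum gives a normalized entropy close to 1, while a concentrated (speech-like) spectrum gives a smaller value.

```python
import numpy as np

def spectral_entropy(frame, n_fft=256, eps=1e-12):
    """Normalized spectral entropy of one signal frame (0..1)."""
    spectrum = np.abs(np.fft.rfft(frame, n=n_fft)) ** 2
    p = spectrum / (np.sum(spectrum) + eps)      # probability mass per frequency bin
    h = -np.sum(p * np.log(p + eps))             # Shannon entropy of the spectrum
    return h / np.log(len(p))                    # normalize by the maximum entropy

def detect_speech(signal, frame_len=256, hop=128, threshold=0.85):
    """Label each frame: True = speech-like (low entropy), False = noise-like.

    `threshold` is an assumed fixed cutoff; practical detectors usually
    adapt it to an estimate of the noise-segment entropy.
    """
    labels = []
    for start in range(0, len(signal) - frame_len + 1, hop):
        h = spectral_entropy(signal[start:start + frame_len], n_fft=frame_len)
        labels.append(h < threshold)             # concentrated spectrum -> speech
    return np.array(labels)

if __name__ == "__main__":
    # Toy example: 1 s of white noise followed by 1 s of a noisy narrow-band tone.
    fs = 8000
    t = np.arange(fs) / fs
    noise = 0.1 * np.random.randn(fs)
    tone = 0.5 * np.sin(2 * np.pi * 440 * t) + 0.1 * np.random.randn(fs)
    signal = np.concatenate([noise, tone])
    print(detect_speech(signal).astype(int))     # mostly 0s then mostly 1s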
File list (check whether you need any of these files):
Filename | Size (bytes) | Date |
---|---|---|
ABSE.m | 3118 | 2018-01-31 |