Entropy is a concept related to uncertainty or disorder. The Spectral Entropy of a signal is based on Shannon's entropy [48] from information theory and measures the irregularity or complexity of a digital signal in the frequency domain. After a Fourier Transform, the signal is converted into a power spectrum, and the information entropy of that spectrum is the Power Spectral Entropy of the signal [49]. Let $X$ be a random variable taking values $x_i$ with respective probabilities $p(x_i)$; then the Shannon Entropy can be calculated as follows:

$$H(X) = -\sum_{i} p(x_i)\,\log_2 p(x_i)$$
The Spectral Entropy treats the signal's normalized power distribution in the frequency domain as a probability distribution and calculates its Shannon Entropy. Therefore, the Shannon Entropy in this context is the Spectral Entropy of the signal if we take the normalized power $P(f_i)$ at each frequency $f_i$ to be the probability distribution of the power spectrum:

$$SE = -\sum_{i} P(f_i)\,\log_2 P(f_i), \qquad P(f_i) = \frac{S(f_i)}{\sum_{j} S(f_j)}$$
$S(f_i)$ is the power spectral density, which is equal to the squared magnitude of the signal's Discrete Fourier Transform, $S(f_i) = |\hat{x}(f_i)|^2$.