Adjusted mutual information (AMI)64 is introduced to measure the discrepancy between the clustering results of neighboring configurations.
Assume that U_t is the clustering result at time step t and U_{t+1} that at time step t + 1. The entropy of a partition, which quantifies its uncertainty, is defined by

$$H(U) = -\sum_{i=1}^{|U|} P(i)\,\log P(i),$$

where P(i) is the probability that an object picked at random from U falls into class U_i. This probability is defined as

$$P(i) = \frac{|U_i|}{N},$$

where N is the total number of objects.
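As a concrete illustration (not part of the original protocol), the entropy of a partition can be computed directly from its cluster labels; the label vector and the use of NumPy below are assumptions made for the example.

```python
import numpy as np

def partition_entropy(labels):
    """H(U) = -sum_i P(i) * log P(i), with P(i) = |U_i| / N."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()              # P(i) for each cluster U_i
    return float(-(p * np.log(p)).sum())

# Hypothetical clustering of N = 8 objects into 3 clusters.
u_t = [0, 0, 1, 1, 2, 2, 2, 1]
print(partition_entropy(u_t))              # about 1.08 nats
```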
The mutual information (MI) between U_t and U_{t+1} is calculated using

$$\mathrm{MI}(U_t, U_{t+1}) = \sum_{i=1}^{|U_t|}\sum_{j=1}^{|U_{t+1}|} P(i, j)\,\log\frac{P(i, j)}{P(i)\,P'(j)},$$

where P(i, j) is the probability that an object falls into class U_i of U_t and class U_j of U_{t+1}, and P(i) and P'(j) are the corresponding marginal probabilities.
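A minimal sketch of the MI computation from the joint distribution of two label vectors follows; the toy labels are hypothetical, and the result can be cross-checked against scikit-learn's mutual_info_score, which computes the same quantity in nats.

```python
import numpy as np
from sklearn.metrics import mutual_info_score

def mutual_information(a, b):
    """MI(U_t, U_{t+1}) = sum_ij P(i,j) * log( P(i,j) / (P(i) * P'(j)) ), in nats."""
    a, b = np.asarray(a), np.asarray(b)
    n = a.size
    # Contingency table of the two partitions, converted to joint probabilities P(i, j).
    clusters_a, ia = np.unique(a, return_inverse=True)
    clusters_b, ib = np.unique(b, return_inverse=True)
    joint = np.zeros((clusters_a.size, clusters_b.size))
    np.add.at(joint, (ia, ib), 1)
    joint /= n
    pa = joint.sum(axis=1, keepdims=True)   # marginal P(i) for U_t
    pb = joint.sum(axis=0, keepdims=True)   # marginal P'(j) for U_{t+1}
    nz = joint > 0                          # skip empty cells (0 * log 0 is taken as 0)
    return float((joint[nz] * np.log(joint[nz] / (pa * pb)[nz])).sum())

u_t   = [0, 0, 1, 1, 2, 2, 2, 1]   # clustering at time step t (hypothetical)
u_tp1 = [1, 1, 0, 0, 2, 2, 0, 0]   # clustering at time step t + 1 (hypothetical)

print(mutual_information(u_t, u_tp1))   # manual value from the formula above
print(mutual_info_score(u_t, u_tp1))    # scikit-learn's MI, should agree
```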
Normalized against chance, AMI can then be calculated as

$$\mathrm{AMI}(U_t, U_{t+1}) = \frac{\mathrm{MI}(U_t, U_{t+1}) - \mathbb{E}\{\mathrm{MI}(U_t, U_{t+1})\}}{\max\{H(U_t), H(U_{t+1})\} - \mathbb{E}\{\mathrm{MI}(U_t, U_{t+1})\}},$$

where \mathbb{E}\{\mathrm{MI}(U_t, U_{t+1})\} is the expected mutual information between two random partitions with the same cluster sizes.
AMI ranges from 0 to 1 (it can dip slightly below 0 when two partitions agree less than expected by chance). A value close to 0 indicates that the two clustering results are largely independent, whereas a value of exactly 1 indicates that they are identical.
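In practice, the chance-corrected AMI, including the expected-MI term, is usually obtained from a library rather than derived by hand. The sketch below uses scikit-learn's adjusted_mutual_info_score with average_method="max", matching the max-entropy normalization written above; the label vectors are again hypothetical.

```python
from sklearn.metrics import adjusted_mutual_info_score

u_t   = [0, 0, 1, 1, 2, 2, 2, 1]   # clustering at time step t (hypothetical)
u_tp1 = [1, 1, 0, 0, 2, 2, 0, 0]   # clustering at time step t + 1 (hypothetical)

# Chance-adjusted MI; average_method="max" normalizes by max(H(U_t), H(U_{t+1})).
print(adjusted_mutual_info_score(u_t, u_tp1, average_method="max"))   # well below 1

# Identical partitions, even with permuted labels, give an AMI of exactly 1.
print(adjusted_mutual_info_score(u_t, u_t, average_method="max"))                       # 1.0
print(adjusted_mutual_info_score(u_t, [2, 2, 0, 0, 1, 1, 1, 0], average_method="max"))  # 1.0
```

Note that AMI is invariant to permutations of the cluster labels, so relabeled but otherwise identical partitions still score exactly 1.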