In information theory, the entropy of a random variable quantifies the average level of uncertainty or information associated with the variable’s potential states or possible outcomes.
It measures the expected amount of information needed to describe the state of the variable, given its probability distribution over all potential states.
Given a discrete random variable X, which takes values in the set 𝒳 and is distributed according to p : 𝒳 → [0, 1], the entropy is

H(X) = − Σ_{x ∈ 𝒳} p(x) log p(x)

where Σ denotes the sum over the variable's possible values. For example, a fair coin flip has entropy log 2, which equals one bit when the logarithm is taken in base 2.
This might seem unusual, but I have personally had occasions where I used entropy as a proxy to measure certain aspects of user behaviour.
The following DAX snippets are good examples of how to implement entropy and use it in Power BI.
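As a minimal sketch, here is one way to compute the overall entropy of a column as a measure. The table name Sales and the column Sales[Category] are placeholders; substitute your own model's names.

```dax
Entropy (bits) :=
// Total rows in the current filter context
VAR TotalRows = COUNTROWS ( Sales )
RETURN
    // Iterate over each distinct category and sum -p * log2(p)
    - SUMX (
        VALUES ( Sales[Category] ),
        VAR CategoryRows = CALCULATE ( COUNTROWS ( Sales ) )  // rows for this category (context transition)
        VAR p = DIVIDE ( CategoryRows, TotalRows )            // probability of this category
        RETURN p * LOG ( p, 2 )                               // log base 2 gives entropy in bits
    )
```

Because DIVIDE returns blank on division by zero and SUMX skips blank results, empty categories do not break the calculation. Dropping this measure onto a card visual gives the entropy of the category distribution under the current filters.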
The previous snippets compute the overall entropy of the values in the column. To calculate the individual contribution of each distinct value, you can use the following DAX formula:
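A sketch of the per-value contribution, again assuming the placeholder names Sales and Sales[Category]. Placed in a table visual with Sales[Category] on the rows, each row's filter context contains a single category, so this measure returns that value's −p·log p term:

```dax
Entropy Contribution :=
// Total rows ignoring the category filter coming from the visual
VAR TotalRows = CALCULATE ( COUNTROWS ( Sales ), ALL ( Sales[Category] ) )
// Rows for the current category only
VAR p = DIVIDE ( COUNTROWS ( Sales ), TotalRows )
RETURN
    - p * LOG ( p, 2 )
```

Summing this column of contributions reproduces the overall entropy measure, which is a handy sanity check.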