Calculates the Shannon entropy of a discrete probability distribution, measuring the average information content or uncertainty. Shannon entropy is fundamental in information theory and is used to quantify the uncertainty in portfolio returns, market regimes, or trading signals. Higher entropy indicates greater uncertainty or diversity in outcomes.
Use Cases:
- Quantifying uncertainty in portfolio return distributions
- Detecting shifts or instability in market regimes
- Measuring the diversity of trading signals
Formula: H(X) = -Σ p(x) log(p(x))
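For reference, a minimal Python sketch of the formula above (not the service's implementation; the function name and the base-2 default are illustrative choices):

```python
import math

def shannon_entropy(probs, base=2.0):
    """Return H(X) = -sum(p * log(p)) for a discrete distribution.

    `probs` must be non-negative and sum to ~1; zero-probability
    outcomes contribute nothing (the limit of p*log p as p -> 0 is 0).
    """
    if any(p < 0 for p in probs):
        raise ValueError("probabilities must be non-negative")
    if not math.isclose(sum(probs), 1.0, rel_tol=1e-9, abs_tol=1e-9):
        raise ValueError("probabilities must sum to 1")
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Four equally likely regimes give maximal entropy; a concentrated
# distribution gives lower entropy (less uncertainty).
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits
print(shannon_entropy([0.90, 0.05, 0.03, 0.02]))  # ~0.62 bits
```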
Credits: 5 credits per request (Pro Tier)
API key for authentication. Get your key at https://finceptbackend.share.zrok.io/auth/register
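For illustration only, a request sketch using the Python `requests` library. The endpoint path and the `X-API-Key` header name below are assumptions, not documented parts of this API; only the registration URL above is taken from the source, so check the actual API reference for the correct path, request body, and authentication scheme.

```python
import requests

API_KEY = "your-api-key"  # obtained via https://finceptbackend.share.zrok.io/auth/register
BASE_URL = "https://finceptbackend.share.zrok.io"  # host taken from the registration URL

payload = {"probabilities": [0.25, 0.25, 0.25, 0.25]}  # assumed request body shape

response = requests.post(
    f"{BASE_URL}/quant/shannon-entropy",  # hypothetical endpoint path
    json=payload,
    headers={"X-API-Key": API_KEY},       # assumed auth header name
    timeout=10,
)
response.raise_for_status()
print(response.json())
```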