Calculates the conditional entropy H(Y|X) from a joint probability distribution, measuring the remaining uncertainty in Y given knowledge of X. Conditional entropy quantifies how much uncertainty about Y remains once X is known: H(Y|X) = 0 means X fully determines Y, while H(Y|X) = H(Y) means X carries no information about Y, making it useful for assessing predictive relationships.
Use Cases:
Formula: H(Y|X) = H(X,Y) - H(X) = -ΣΣ p(x,y) log(p(y|x))
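The formula above can be sketched in a few lines of NumPy. This is an illustrative implementation of the H(Y|X) = H(X,Y) - H(X) identity, not the API's own code; the function name and the joint-matrix input convention are assumptions for the example.

```python
import numpy as np

def conditional_entropy(joint, base=2):
    """H(Y|X) = H(X,Y) - H(X) for a joint distribution p(x, y).

    joint: 2-D array where joint[i, j] = p(X = x_i, Y = y_j).
    base:  logarithm base (2 gives the result in bits).
    """
    p = np.asarray(joint, dtype=float)
    p = p / p.sum()                      # normalize to a valid distribution
    px = p.sum(axis=1)                   # marginal p(x)

    # H(X,Y) = -ΣΣ p(x,y) log p(x,y), skipping zero-probability cells
    nz = p > 0
    h_xy = -(p[nz] * np.log(p[nz])).sum() / np.log(base)

    # H(X) = -Σ p(x) log p(x)
    nzx = px > 0
    h_x = -(px[nzx] * np.log(px[nzx])).sum() / np.log(base)

    return h_xy - h_x

# Fair coin X; Y copies X exactly, so knowing X removes all uncertainty in Y.
joint = [[0.5, 0.0],
         [0.0, 0.5]]
print(conditional_entropy(joint))  # → 0.0
```

For an independent pair (e.g. a uniform 2×2 joint of 0.25 everywhere), H(Y|X) equals H(Y) = 1 bit, since X tells you nothing about Y.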
Credits: 5 credits per request (Pro Tier)
API key for authentication. Get your key at https://finceptbackend.share.zrok.io/auth/register