Calculates the cross entropy between two probability distributions P and Q, measuring the average number of bits needed to identify an event from P when using a coding scheme optimized for Q. Cross entropy is fundamental in machine learning loss functions and information theory.
Use Cases: evaluating classification loss functions in machine learning; measuring how well a model distribution Q encodes events drawn from a true distribution P in information theory and coding.
Formula: H(P,Q) = -Σ p(x) log(q(x))
Note: Cross entropy decomposes as H(P,Q) = H(P) + D_KL(P‖Q), i.e. the Shannon entropy of P plus the KL divergence from P to Q.
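The quantities above are easy to reproduce locally. The sketch below is an illustration, not the service's implementation: it assumes natural logarithms (nats; swap in np.log2 for bits), clips probabilities with a small epsilon to guard against log(0), and checks the entropy-plus-KL identity from the note.

```python
import numpy as np

def cross_entropy(p, q, eps=1e-12):
    # H(P,Q) = -sum_x p(x) * log(q(x))
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return -np.sum(p * np.log(np.clip(q, eps, None)))

def shannon_entropy(p, eps=1e-12):
    # H(P) = -sum_x p(x) * log(p(x))
    p = np.asarray(p, dtype=float)
    return -np.sum(p * np.log(np.clip(p, eps, None)))

def kl_divergence(p, q, eps=1e-12):
    # D_KL(P||Q) = sum_x p(x) * log(p(x) / q(x))
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return np.sum(p * np.log(np.clip(p, eps, None) / np.clip(q, eps, None)))

# Example distributions (illustrative values, not from the API).
P = [0.5, 0.3, 0.2]
Q = [0.4, 0.4, 0.2]

h_pq = cross_entropy(P, Q)
print(f"H(P,Q) = {h_pq:.6f} nats")
# Cross entropy = Shannon entropy + KL divergence (see the note above).
print(np.isclose(h_pq, shannon_entropy(P) + kl_divergence(P, Q)))  # True
```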
Credits: 5 credits per request (Pro Tier)
API key for authentication. Get your key at https://finceptbackend.share.zrok.io/auth/register