
Cross-Entropy Loss

H(p, q) = - \sum_{i} p(i) \log q(i)

Measures the difference between two probability distributions; central to language model training objectives.
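A minimal sketch of the formula above in plain Python (the distributions `p` and `q` and the function name are illustrative, not from the source). Terms where p(i) = 0 contribute nothing and are skipped to avoid log(0):

```python
import math

def cross_entropy(p, q):
    """H(p, q) = -sum_i p(i) * log(q(i)), using natural log."""
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]   # true distribution
q = [0.9, 0.1]   # model's predicted distribution

# H(p, q) is minimized (and equals the entropy of p) when q == p;
# any mismatch between q and p increases the loss.
loss = cross_entropy(p, q)
```

In language model training, `p` is typically a one-hot distribution over the vocabulary (the observed next token) and `q` is the model's softmax output, so the sum reduces to the negative log-probability of the correct token.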