Cross-Entropy Loss
H(p, q) = -\sum_{i} p(i) \log q(i)

Cross-entropy measures the difference between two probability distributions: a true distribution p and a predicted distribution q. It is central to language model training objectives, where p is typically a one-hot target over the vocabulary and q is the model's predicted next-token distribution.
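A minimal sketch of this computation in Python, assuming a NumPy-based setup; the function name cross_entropy and the epsilon clamp are illustrative choices, not part of the definition:

    import numpy as np

    def cross_entropy(p, q, eps=1e-12):
        # H(p, q) = -sum_i p(i) * log(q(i))
        q = np.clip(q, eps, 1.0)  # clamp to avoid log(0)
        return -np.sum(p * np.log(q))

    # Example: one-hot target over a 4-token vocabulary, as in LM training.
    p = np.array([0.0, 1.0, 0.0, 0.0])  # true distribution (one-hot target)
    q = np.array([0.1, 0.7, 0.1, 0.1])  # model's predicted distribution
    print(cross_entropy(p, q))          # -log(0.7) ≈ 0.357

When p is one-hot, the sum collapses to -log q(target), the familiar per-token loss minimized during language model training.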