Cross-Entropy
Cross-entropy is a central concept in both information theory and machine learning. It measures how different a predicted probability distribution is from a true one: intuitively, the average number of bits needed to encode outcomes drawn from the true distribution when using a code optimized for the predicted distribution. In machine learning, it serves as a loss function that quantifies the discrepancy between a model's predicted probabilities and the true labels, and minimizing it drives the model toward better predictions, especially in classification tasks.
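Formally, for a true distribution p and a predicted distribution q over the same set of outcomes, the cross-entropy is H(p, q) = -Σₓ p(x) log q(x). Below is a minimal NumPy sketch of this formula for a single classification example; the function name `cross_entropy` and the epsilon clipping are illustrative choices, not a fixed API:

```python
import numpy as np

def cross_entropy(p, q, eps=1e-12):
    """Cross-entropy H(p, q) = -sum_x p(x) * log(q(x)).

    p: true distribution (e.g. a one-hot label vector).
    q: predicted distribution (e.g. softmax outputs).
    eps clips q away from zero so the log is always defined.
    """
    q = np.clip(q, eps, 1.0)
    return -np.sum(p * np.log(q))

# Example: a 3-class problem where the true class is index 0.
p = np.array([1.0, 0.0, 0.0])   # one-hot true label
q = np.array([0.7, 0.2, 0.1])   # model's predicted probabilities
print(cross_entropy(p, q))      # -log(0.7) ≈ 0.357
```

Note that when p is one-hot, the sum collapses to -log q(true class), which is why cross-entropy loss for classification is often called the negative log-likelihood of the correct label.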