Interactive Loss Calculator

Choose what the model predicts (mat, chair, table, floor, or elephant) and set the model's confidence; here it is set to 80%.
Cross-Entropy Loss (natural log), where p is the probability the model assigned to the correct word:
L = -log(p) = -log(0.8) ≈ 0.223

Current Prediction Scenario

Prediction: "mat"
Actual correct word: "mat"
✅ Correct prediction!
Confidence: 80%
Loss: 0.223
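
A minimal sketch of this computation in Python (the helper name is illustrative, not part of the original calculator):

```python
import math

def cross_entropy(p_correct: float) -> float:
    """Loss for a single prediction: the negative natural log of the
    probability the model assigned to the correct word."""
    return -math.log(p_correct)

# The scenario above: the model predicts "mat" with 80% confidence,
# and "mat" is indeed the correct next word.
p = 0.80
print(f"loss = -log({p}) = {cross_entropy(p):.3f}")  # loss = -log(0.8) = 0.223
```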

Try Different Scenarios

Click different words above and adjust the confidence to see how the loss changes (a worked sketch follows this list):

  • High confidence + correct = Low loss (good!)
  • High confidence + wrong = High loss (bad!)
  • Low confidence + correct = Medium loss
  • Low confidence + wrong = Medium loss
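
To make the four cases concrete, here is a small sketch. The probabilities are assumed stand-ins for "high" and "low" confidence; only the probability given to the correct word enters the loss, so a confidently wrong model leaves very little mass on the right answer:

```python
import math

# Probability the model assigned to the CORRECT word in each case.
# (The exact numbers are assumptions chosen to illustrate the pattern.)
scenarios = {
    "high confidence + correct": 0.90,  # 90% on the right word
    "high confidence + wrong":   0.02,  # 90% went to a wrong word
    "low confidence + correct":  0.40,
    "low confidence + wrong":    0.25,
}

for name, p_correct in scenarios.items():
    print(f"{name:28s} loss = {-math.log(p_correct):.3f}")

# high confidence + correct    loss = 0.105
# high confidence + wrong      loss = 3.912
# low confidence + correct     loss = 0.916
# low confidence + wrong       loss = 1.386
```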

Key Insights

  • Lower loss means better predictions
  • Cross-entropy heavily penalizes confident wrong predictions
  • The model learns by adjusting its weights to minimize this loss (sketched below)
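
As a rough sketch of that last point, here is one way minimization plays out on a toy five-word vocabulary. With softmax followed by cross-entropy, the gradient of the loss with respect to the logits is probs − one_hot(target); all numbers below, including the exaggerated learning rate, are illustrative:

```python
import math

vocab = ["mat", "chair", "table", "floor", "elephant"]
target = 0                             # index of the correct word, "mat"
logits = [0.5, 0.2, 0.1, 0.1, -0.3]    # illustrative raw scores

def softmax(xs):
    m = max(xs)                        # subtract max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

lr = 1.0  # exaggerated learning rate so the effect is visible
for step in range(3):
    probs = softmax(logits)
    loss = -math.log(probs[target])
    print(f"step {step}: p({vocab[target]}) = {probs[target]:.3f}, loss = {loss:.3f}")
    # Gradient of cross-entropy w.r.t. the logits: probs - one_hot(target)
    grad = [p - (1.0 if i == target else 0.0) for i, p in enumerate(probs)]
    logits = [x - lr * g for x, g in zip(logits, grad)]
```

Each step shifts probability mass toward "mat", and the printed loss falls accordingly.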

Loss Behavior Visualization

(Chart: the loss −log(p) plotted against the probability p assigned to the correct word; near zero as p approaches 1, growing without bound as p approaches 0.)
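
The same curve in text form, as a quick sketch:

```python
import math

# Loss falls gently near p = 1 and blows up as p approaches 0.
for p in [0.99, 0.9, 0.8, 0.5, 0.2, 0.1, 0.01]:
    print(f"p(correct) = {p:5.2f} -> loss = {-math.log(p):6.3f}")

# p(correct) =  0.99 -> loss =  0.010
# p(correct) =  0.90 -> loss =  0.105
# p(correct) =  0.80 -> loss =  0.223
# p(correct) =  0.50 -> loss =  0.693
# p(correct) =  0.20 -> loss =  1.609
# p(correct) =  0.10 -> loss =  2.303
# p(correct) =  0.01 -> loss =  4.605
```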


Why Cross-Entropy?

  • Smooth gradient: Provides a clear direction for improvement (see the derivation after this list)
  • Penalizes overconfidence: Wrong predictions made with high confidence are punished hardest
  • Probabilistic: Works naturally with softmax probability outputs
  • Information theoretic: Measures how "surprised" the model is by the correct answer
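
Behind the first and last bullets, a short derivation (standard softmax/cross-entropy algebra, not specific to this page): with softmax probabilities over logits z and correct word t, the gradient of the loss with respect to each logit takes a strikingly simple form, and the loss itself is exactly the model's surprisal at the correct word.

```latex
p_i = \frac{e^{z_i}}{\sum_j e^{z_j}}, \qquad
L = -\log p_t
\;\Longrightarrow\;
\frac{\partial L}{\partial z_i} = p_i - \mathbf{1}[i = t]
```

The gradient p_i − 1[i = t] is bounded and smooth, and it pushes the predicted distribution directly toward the correct answer, which is what makes training with this loss stable.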