Test Text Input
[Interactive calculator: enter test text to see live readouts of Total Tokens, Avg Log Prob, and Cross-Entropy]
Key Insights
- Enter text to see perplexity calculations (a minimal sketch of the computation follows this list)
- Lower perplexity indicates better language modeling
- Compare different models on the same test text
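
As a rough sketch of what the calculator above reports, the following Python turns a list of per-token base-2 log probabilities into the same readouts plus perplexity. The function name and input format are illustrative assumptions; any language model that exposes per-token log probabilities could supply the numbers.

```python
from typing import Dict, List

def perplexity_report(token_log_probs: List[float]) -> Dict[str, float]:
    """Summarize the demo's readouts from per-token base-2 log probabilities.

    `token_log_probs` is assumed to hold log2 P(token | context) for each
    token in the test text, as produced by whatever model is being tested.
    """
    total_tokens = len(token_log_probs)
    avg_log_prob = sum(token_log_probs) / total_tokens
    cross_entropy = -avg_log_prob        # bits per token
    perplexity = 2 ** cross_entropy      # 2^(cross-entropy)
    return {
        "total_tokens": total_tokens,
        "avg_log_prob": avg_log_prob,
        "cross_entropy": cross_entropy,
        "perplexity": perplexity,
    }

# Example: four tokens, each assigned probability 1/8, so log2 p = -3.
print(perplexity_report([-3.0, -3.0, -3.0, -3.0]))
# {'total_tokens': 4, 'avg_log_prob': -3.0, 'cross_entropy': 3.0, 'perplexity': 8.0}
```

A perplexity of 8 here means the model is, on average, as uncertain as if it were choosing among 8 equally likely next tokens.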
Model Comparison
[Comparison panel: perplexity results for each model on the shared test text appear here]
Understanding Perplexity
- Definition: perplexity = 2^(cross-entropy), where cross-entropy is the average negative log2 probability the model assigns to each token; it measures model uncertainty
- Interpretation: the average number of equally likely next words the model is effectively choosing among
- Lower is better: the model is less confused by the text
- Typical ranges: n-gram models 50-200, neural LMs 20-50, modern LLMs below 10
- Comparison tool: use the same test set for a fair evaluation (see the comparison sketch after this list)
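
To make the comparison point concrete, here is a minimal sketch that scores two hypothetical models on the same test text and compares their perplexities. The model names and per-token log probabilities are invented for illustration; in practice they would come from actually scoring the shared test set with each model.

```python
from typing import Dict, List

def perplexity(token_log_probs: List[float]) -> float:
    """Perplexity = 2^(cross-entropy), from per-token base-2 log probabilities."""
    cross_entropy = -sum(token_log_probs) / len(token_log_probs)
    return 2 ** cross_entropy

# Hypothetical per-token log2 probabilities for two models scored on the
# SAME tokens of the SAME test text (values invented for illustration).
scores: Dict[str, List[float]] = {
    "bigram_lm": [-7.1, -6.4, -6.9, -7.3, -6.8],
    "neural_lm": [-4.2, -4.9, -4.5, -5.1, -4.6],
}

for name, log_probs in scores.items():
    print(f"{name}: perplexity = {perplexity(log_probs):.1f}")
# bigram_lm: perplexity = 119.4   (n-gram range)
# neural_lm: perplexity = 25.3    (neural range)
```

The lower number wins only because both models are evaluated on the same test set, which is exactly the fair-evaluation point above.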