Table (from: Enhancing recurrent neural network-based language models by word tokenization): perplexity and relative entropy reduction by model.
| Model          | Perplexity | Entropy reduction (%) |
|----------------|------------|-----------------------|
| GT5            | 113.473    | –                     |
| KN3            | 99.1785    | 2.85                  |
| KN5            | 98.9021    | 2.9                   |
| Basic RNN      | 70.58      | 10.04                 |
| Proposed model | 68.42      | 10.69                 |