From: Enhancing recurrent neural network-based language models by word tokenization
Model | Test set 1 (%) | Test set 2 (%) | Test set 3 (%) | Average (%)
---|---|---|---|---
3-gram + GT3 | 72.65 | 70.85 | 69.03 | 70.84
RNNLM | 74.5 | 73.85 | 71.07 | 73.14
Proposed model | 76.03 | 75.36 | 71.50 | 74.30
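As a quick consistency check, each model's reported average is the arithmetic mean of its three test-set accuracies, rounded to two decimal places. A minimal sketch (values taken directly from the table above):

```python
# Verify that the reported correction-accuracy averages match the mean of the
# three test-set scores for each model (rounded to two decimal places).
results = {
    "3-gram + GT3":   ([72.65, 70.85, 69.03], 70.84),
    "RNNLM":          ([74.5, 73.85, 71.07], 73.14),
    "Proposed model": ([76.03, 75.36, 71.50], 74.30),
}

for model, (scores, reported_avg) in results.items():
    computed = round(sum(scores) / len(scores), 2)
    print(f"{model}: computed {computed}, reported {reported_avg}")
```

All three computed means agree with the table's Average column.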