Different <s>, </s>, <unk> probabilities between kenlm and berkeleylm


I built an n-gram language model with both kenlm and berkeleylm, but they assign very different probabilities to the <s>, </s>, and <unk> tokens.

The kenlm gives:

\data\
ngram 1=164482
ngram 2=4355352
ngram 3=15629476

\1-grams:
-6.701107   <unk>   0
0   <s> -1.9270477
-1.8337007  </s>    0

while berkeleylm gives:

\data\
ngram 1=164481
ngram 2=4291478
ngram 3=15629476

\1-grams:
-99.000000  <s> -2.079426
-1.833699   </s>

and there is no probability entry for the <unk> token at all.

I want to know why the two toolkits handle these tokens differently, and how these differences would lead to different results.
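For reference, the two unigram sections can be compared side by side with a small script. This is a minimal sketch in plain Python: the values are copied from the listings above, and tab separators are assumed (as in the ARPA format); it only illustrates where the listings agree and differ, not how either toolkit queries its model.

```python
# Unigram sections copied from the two listings above.
# ARPA format per line: log10(prob) <tab> token [<tab> log10(backoff)].
KENLM = """\
-6.701107\t<unk>\t0
0\t<s>\t-1.9270477
-1.8337007\t</s>\t0"""

BERKELEYLM = """\
-99.000000\t<s>\t-2.079426
-1.833699\t</s>"""

def parse_unigrams(section):
    """Map token -> (log10 prob, log10 backoff weight)."""
    entries = {}
    for line in section.splitlines():
        fields = line.split("\t")
        logprob, token = float(fields[0]), fields[1]
        backoff = float(fields[2]) if len(fields) > 2 else 0.0
        entries[token] = (logprob, backoff)
    return entries

kenlm = parse_unigrams(KENLM)
berkeley = parse_unigrams(BERKELEYLM)

# kenlm reserves an explicit <unk> entry; the berkeleylm listing has none.
print("<unk>" in kenlm, "<unk>" in berkeley)   # True False

# Both assign </s> essentially the same probability mass:
print(round(10 ** kenlm["</s>"][0], 4))        # 0.0147
print(round(10 ** berkeley["</s>"][0], 4))     # 0.0147

# <s> differs only in its placeholder value: kenlm stores log10 p = 0,
# berkeleylm stores -99. In both cases <s> is only ever used as context
# (its backoff weight), never predicted as a token itself.
```

So the visible disagreement is confined to the placeholders for <s> and the presence of an explicit <unk> entry; the shared real probability (</s>) matches to five decimal places.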
