Perplexity measures how well a probability model predicts a sample. In NLP, perplexity is one method of evaluating language models (LMs). Perplexity is the exponential of the entropy.

What is entropy? Entropy is the average number of bits required to encode an outcome of the random variable. Exponentiating the entropy (base 2, since entropy is measured in bits) therefore gives the effective number of equally likely outcomes, i.e. how many options the distribution is effectively choosing among.
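As a sketch of that relationship in symbols (assuming a discrete random variable X with distribution p, and base-2 logarithms so entropy is in bits):

H(X) = -\sum_{x} p(x) \log_2 p(x), \qquad 2^{H(X)} = \text{effective number of equally likely outcomes}

For example, a fair eight-sided die has H = 3 bits, and 2^3 = 8 recovers the number of outcomes it chooses among.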

An LM is a probability distribution over sentences. The perplexity of an LM is the inverse probability of the test set, normalised by the number of words.
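In symbols, for a test set W = w_1 w_2 \ldots w_N:

PP(W) = P(w_1 w_2 \ldots w_N)^{-1/N} = \sqrt[N]{\frac{1}{P(w_1 w_2 \ldots w_N)}}

A minimal sketch in Python, assuming we already have the LM's per-word log probabilities for the test set (the function name perplexity and the toy numbers are illustrative, not from any particular library):

import math

def perplexity(log_probs):
    # log_probs: natural-log probability the LM assigns to each word in the test set
    n = len(log_probs)
    cross_entropy = -sum(log_probs) / n   # average negative log probability (in nats)
    return math.exp(cross_entropy)        # perplexity = exponential of the (cross-)entropy

# Toy test set of 4 words, each assigned probability 0.25 by the model:
log_probs = [math.log(0.25)] * 4
print(perplexity(log_probs))  # 4.0 -- the model is as uncertain as a fair 4-way choice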
