
Perplexity

Perplexity measures how surprising the next word in a text is, given the previous words. Lower means more predictable; higher means more unexpected.
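Concretely, perplexity is the exponential of the average negative log-probability a model assigned to each token. A minimal sketch (the probability values are illustrative assumptions, not real model outputs):

```python
import math

def perplexity(token_probs):
    """Perplexity = exp of the average negative log-probability
    the model assigned to each token in the sequence."""
    avg_neg_log = -sum(math.log(p) for p in token_probs) / len(token_probs)
    return math.exp(avg_neg_log)

# A model that is confident about every token (predictable text):
predictable = [0.9, 0.95, 0.85, 0.9]
# The same text with a couple of unexpected word choices:
surprising = [0.9, 0.2, 0.85, 0.05]

print(perplexity(predictable))  # close to 1: very predictable
print(perplexity(surprising))   # noticeably higher
```

A perfectly predicted sequence (every probability 1.0) has a perplexity of exactly 1; every surprising word pushes the score up.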

Why AI text has low perplexity

Language models pick the most likely next token most of the time. The output sounds fluent because each word is a confident choice, and detectors see that confidence as a fingerprint. Human writers occasionally pick a low-probability word, change direction mid-sentence, or leave something slightly awkward. That irregularity raises perplexity and is one of the primary signals detectors use to call text "human-written."
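The effect above can be sketched with a toy next-token distribution (the words and probabilities are illustrative assumptions): greedy decoding always takes the most likely token, so its per-token surprisal stays low, while a rarer human word choice incurs much higher surprisal.

```python
import math

# Hypothetical next-token distribution after some prefix:
dist = {"the": 0.6, "a": 0.25, "this": 0.1, "yonder": 0.05}

# A model decoding greedily picks the argmax token, so its
# surprisal (negative log2 probability, in bits) is small.
greedy = max(dist, key=dist.get)
greedy_surprisal = -math.log2(dist[greedy])

# A human reaching for the rare word incurs far more surprisal,
# which raises the perplexity of the whole passage.
rare_surprisal = -math.log2(dist["yonder"])

print(greedy, round(greedy_surprisal, 2))   # low surprisal
print("yonder", round(rare_surprisal, 2))   # several bits higher
```

Averaged over a document, those occasional high-surprisal choices are what separate a human perplexity profile from a model's.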

How to raise perplexity in your draft

Related terms