# Social Media Entropy
## Overview
Social media entropy (SME) is an application of information entropy (Wikistr, Wikifreedia) that measures the "surprise" factor in someone's social media content: that is, how easy it is to predict the next thing they will produce, given the things they have produced before.
Unlike standard perplexity measures, which are computed over large, heterogeneous data sets, SME is measured only within the content produced by a single author.
Posts with low entropy are highly predictable because they are repetitive. Longer posts generally have higher entropy, since longer text allows for more variation, but AI-generated text tends to reduce it.
Similarly, posters who emit consistently low-entropy content could more easily have their audience co-opted by automation (bots).
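The article does not fix an exact formula for SME, but one plausible way to instantiate it is a compression-ratio proxy: repetitive, predictable content compresses well and scores low, while varied content compresses poorly and scores high. The function name and the 0&ndash;100% normalisation below are illustrative assumptions, not a defined standard.

```python
import zlib

def sme_score(notes: list[str]) -> float:
    """Hypothetical SME sketch: approximate an author's predictability
    by how well their combined notes compress. This is a stand-in for
    a real per-author entropy or perplexity measure."""
    data = "\n".join(notes).encode("utf-8")
    if not data:
        return 0.0  # blank content -> 0% entropy
    compressed = len(zlib.compress(data, 9))
    # Ratio of compressed to original size, capped at 100%
    return min(100.0, 100.0 * compressed / len(data))
```

Compression-based estimators are a common cheap proxy for information content; a production system might instead score each new note's perplexity under a language model fitted to that author's posting history.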
## Example
A Nostr npub (Wikistr, Wikifreedia) whose notes mostly consisted of some variation of

> GM :-)

or

> Stack sats!

might have an SME of circa 5%.
0% entropy corresponds to blank content. 1-4% entropy might be exactly identical content across all of the notes. Long, complex, human-produced notes covering a wide and disjointed range of topics would generally have the highest SME.
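This scale can be demonstrated with the same hypothetical compression-ratio proxy (an assumption for illustration, since the article does not pin down a formula):

```python
import zlib

def sme(notes):
    # Hypothetical compression-ratio proxy for SME, on a 0-100% scale
    data = "\n".join(notes).encode("utf-8")
    if not data:
        return 0.0  # blank content -> 0% entropy
    return min(100.0, 100.0 * len(zlib.compress(data, 9)) / len(data))

identical_notes = ["GM :-)"] * 200
varied_notes = [f"Note {i}: musing on topic {(i * i) % 97}" for i in range(200)]

print(f"blank:     {sme(['']):.1f}%")           # -> 0.0%
print(f"identical: {sme(identical_notes):.1f}%")  # low single digits
print(f"varied:    {sme(varied_notes):.1f}%")     # substantially higher
```

The repeated-note poster lands near the bottom of the scale, while the varied notes score far higher, matching the ordering described above.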