2024-06-11 09:53:35

Laeserin on Nostr: Feeling cute. Might invent a new branch of mathematics called "mathstr". ...


https://wikistr.com/social-media-entropy*dd664d5e4016433a8cd69f005ae1480804351789b59de5af06276de65633d319
# Social Media Entropy

## Overview

**Social media entropy** (SME) is a form of [[information entropy]] that measures the "surprise" factor in someone's social media content. That is, it measures how easy it is to predict the next thing an author will produce, based on the things they have produced before.

Unlike conventional perplexity measures, which are computed over large, heterogeneous data sets, SME is measured only within the content produced by one particular author.

Posts with low entropy are highly predictable because they are repetitive. Longer posts would generally have higher entropy (as the longer text allows for more variation), but this would be reduced in anything AI-generated.

Similarly, posters who emit consistently low-entropy content could more easily have their audience co-opted by automation (bots).
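To make the "repetitive text is predictable" intuition concrete, here is a toy sketch (an illustration, not anything defined in the article) that computes the character-level Shannon entropy of a single post. Repetitive text yields fewer bits per character than long, varied text:

```python
import math
from collections import Counter

def bits_per_char(text: str) -> float:
    """Shannon entropy of the character distribution in `text`, in bits per character."""
    if not text:
        return 0.0  # blank content carries no information
    counts = Counter(text)
    n = len(text)
    # H = -sum(p * log2(p)) over the empirical character probabilities
    return -sum((c / n) * math.log2(c / n) for c in counts.values())
```

A run of a single repeated character scores 0 bits, while a long human-written sentence with many distinct characters scores several bits per character.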

## Example

A Nostr [[npub]] who posted notes that mostly consisted of some variation of

> GM :-)

or

> Stack sats!

might have an SME of circa 5%.

- 0% entropy would be blank content.
- 1–4% entropy might be exactly-identical content in all of the notes.
- Long, complex, human-produced notes, covering a wide and disjointed range of topics, would generally have the highest SME.
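The percentage scale above is not formally defined in the article, but one hypothetical way to score an author from 0% to 100% is the Shannon entropy of the distribution of their notes, normalized by its maximum possible value. The `sme_percent` function below is an illustrative assumption, not an official metric: an author who posts the same note every time scores 0%, while an author whose notes are all distinct scores 100%:

```python
import math
from collections import Counter

def sme_percent(notes: list[str]) -> float:
    """Hypothetical SME score: entropy of the note distribution,
    normalized to a 0-100% scale. Not an official definition."""
    n = len(notes)
    if n == 0:
        return 0.0  # blank content -> 0% entropy
    counts = Counter(notes)
    # Entropy of which note the author posts, in bits
    h = -sum((c / n) * math.log2(c / n) for c in counts.values())
    # Maximum entropy: every note distinct
    h_max = math.log2(n) if n > 1 else 1.0
    return 100.0 * h / h_max
```

Under this toy definition, an npub posting mostly "GM :-)" with the occasional "Stack sats!" lands near the bottom of the scale, matching the article's example of a low-single-digit SME.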
**Author Public Key:** `npub1m4ny6hjqzepn4rxknuq94c2gpqzr29ufkkw7ttcxyak7v43n6vvsajc2jl`