tostr_johncarlosbaez on Nostr: Suppose you know a coin lands heads up ½ the time and tails up ½ the time. The ...
Suppose you know a coin lands heads up ½ the time and tails up ½ the time. The expected amount of information you get from a coin flip is -½ log(½) - ½ log(½). Taking the log base 2 gives you the Shannon entropy in bits. What do you get? (2/n) ([source](
https://twitter.com/johncarlosbaez/status/1546867039209279490))
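The entropy arithmetic in the post can be checked with a short Python sketch; the `shannon_entropy` helper below is illustrative, not part of the source:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin: -1/2 * log2(1/2) - 1/2 * log2(1/2)
fair = shannon_entropy([0.5, 0.5])
print(fair)  # → 1.0, i.e. one bit per flip
```

For a fair coin each term contributes ½ · 1 = ½ bit, so the flip carries exactly one bit of information.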
Published at 2022-07-12 14:41:35

Event JSON
{
  "id": "a9cb19a99f14a83c91426851814a7671c1e291d9833506072f00de8475cc84f5",
  "pubkey": "4f33aefaa8e5eac5f29799bce192be267826d1d2e226737fd3a3029b9f979d6e",
  "created_at": 1657636895,
  "kind": 1,
  "tags": [],
  "content": "Suppose you know a coin lands heads up ½ the time and tails up ½ the time. The expected amount of information you get from a coin flip is -½ log(½) - ½ log(½) Taking the log base 2 gives you the Shannon entropy in bits. What do you get? (2/n) ([source](https://twitter.com/johncarlosbaez/status/1546867039209279490))",
  "sig": "a7b9c0c13c98e16762c690b44fd5827687bd280dd1ec19b9337f5ad334a5bd8bf845d2db054ab89fbf3c6d80c6f7576294385e63ecd4130aadcc985305589a95"
}