Event JSON
{
"id": "a7f52f6db6f00149a2bd22eb8ebc1fa1528cc6125e73e0af7595d94e5c03c7a3",
"pubkey": "8179879e743ecc0b539b67420e7dc29a1f097751a00fa1c74d3cea319465223b",
"created_at": 1721034988,
"kind": 1,
"tags": [
[
"p",
"1eb38365eec2cd3d769fe440fc65299612da1292aa0c29902a6d4207c3879e18"
],
[
"p",
"91d84ea8918e897526c606f21427cfafe3f336835837ff21e99b4dee6c7fc294"
],
[
"e",
"eb00f32652f70140419b25d8c924ea6e152b80537de9729e04487e18bb8afa6f",
"",
"root"
],
[
"e",
"402f5f7134c7afee045a24fb4315d5d2a9fd44bd1f2b141d7cb5255f57e0feb9",
"",
"reply"
],
[
"proxy",
"https://manganiello.social/objects/fda7d494-25a3-4cf7-af65-2610dba28806",
"activitypub"
],
[
"L",
"pink.momostr"
],
[
"l",
"pink.momostr.activitypub:https://manganiello.social/objects/fda7d494-25a3-4cf7-af65-2610dba28806",
"pink.momostr"
],
[
"-"
]
],
"content": "this is actually an interesting problem also from a technical implementation point of view.\n\nIf you want to share aggregated/fuzzied data with downstream pipelines, then at some point in your data flow (probably as close as possible to the generation of the raw data) you'll necessarily need a system that collects the raw records and aggregates them.\n\nDifferential privacy by definition destroys information. You may be clever and introduce noise in the original records in such a way that they are not uniquely identifiable, while preserving the value of the aggregated information.\n\nI'm curious if other implementations have found other clever ways around this problem. But it may as well be the case that distributed aggregated data networks just can never be 100% private - for the same reason why a Tor network can never be 100% private: in both systems you'll always need a node that can access the raw information in some format.",
"sig": "6267918c02b8c430f4304afe53754ce4c15e3d889d0701dd3cc0ac45ee14819579e88ca6b5928bd7c8a67262fe78a3819abf37b435a1940e836f2b98bf15f57d"
}
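
The note in the `content` field argues that you can perturb individual records so that no single record is identifiable while aggregates stay useful; that is essentially the idea behind local differential privacy. The event does not specify any implementation, so the following is only a minimal sketch of that idea: the function names, the epsilon value, and the assumption of numeric records bounded in [0, 1] are all illustrative.

```python
import random

def laplace_noise(scale: float) -> float:
    """Sample Laplace(0, scale) as the difference of two exponentials."""
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def randomize_record(value: float, epsilon: float,
                     lower: float = 0.0, upper: float = 1.0) -> float:
    """Clamp one record to [lower, upper] and add noise calibrated to that range.

    With sensitivity (upper - lower), each released record on its own
    satisfies epsilon-local differential privacy, so the aggregator never
    needs to see the raw value.
    """
    clamped = min(max(value, lower), upper)
    return clamped + laplace_noise((upper - lower) / epsilon)

if __name__ == "__main__":
    random.seed(1)
    raw = [random.random() for _ in range(10_000)]            # raw records stay on the client
    noisy = [randomize_record(v, epsilon=1.0) for v in raw]   # only noisy records are shared

    print(f"true mean      {sum(raw) / len(raw):.4f}")
    print(f"estimated mean {sum(noisy) / len(noisy):.4f}")    # per-record noise averages out
```

This makes the trade-off the note describes concrete: each shared record is heavily perturbed (noise scale 1/epsilon), so information about individuals is destroyed, yet the mean over many records remains a usable estimate. As the note points out, any design where some node must first collect the raw records cannot offer this per-record guarantee.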