Cory Doctorow on Nostr: Dangle the incentive of profit before a market's teeming participants and they align ...
Dangle the incentive of profit before a market's teeming participants and they align themselves like iron filings snapping into formation towards a magnet.
But markets have a problem: they are prone to #RewardHacking. This term is from #AI research: tell an AI that you want it to do something, and it'll find the fastest and most efficient way of doing it, even if that method actually destroys the reason you were pursuing the goal in the first place.
https://learn.microsoft.com/en-us/security/engineering/failure-modes-in-machine-learning

3/
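The post's point about optimizing a proxy at the expense of the real goal can be sketched in a few lines. This is a hypothetical illustration, not anything from the linked Microsoft taxonomy: suppose we tell an optimizer to maximize a test pass *rate* as a proxy for code quality — the proxy is maxed by deleting the failing tests, which destroys the reason for testing in the first place.

```python
def true_goal(tests_kept, tests_passing):
    """What we actually want: working code verified by many tests."""
    return tests_passing  # only genuinely passing tests count

def proxy_reward(tests_kept, tests_passing):
    """What we told the optimizer to maximize: the pass rate."""
    return tests_passing / tests_kept if tests_kept else 1.0

# Honest strategy: keep all 100 tests, 90 of them pass.
honest = proxy_reward(100, 90)   # 0.9
# Reward hack: delete every failing test. The proxy hits its maximum,
# but the true goal (verified working code) got worse, not better.
hacked = proxy_reward(10, 10)    # 1.0

print(honest, hacked)
```

The hacked strategy scores higher on the proxy while scoring lower on the true goal — the same misalignment the post describes in markets.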
Published at 2024-01-27 17:33:38

Event JSON
{
  "id": "1a34299457502bd989a063a0bce046017ecf1459ee2f5a36574f9df843247418",
  "pubkey": "21856daf84c2e4e505290eb25e3083b0545b8c03ea97b89831117cff09fadf0d",
  "created_at": 1706376818,
  "kind": 1,
  "tags": [
    [
      "e",
      "18e20c3f3b01ff55d3af0a5cdf4eb1b17c24c7d25393391b311c83e2ce0548c5",
      "wss://relay.mostr.pub",
      "reply"
    ],
    [
      "t",
      "rewardhacking"
    ],
    [
      "t",
      "ai"
    ],
    [
      "content-warning",
      "Long thread/3"
    ],
    [
      "proxy",
      "https://mamot.fr/users/pluralistic/statuses/111829111150767401",
      "activitypub"
    ]
  ],
  "content": "Dangle the incentive of profit before a market's teeming participants and they align themselves like iron filings snapping into formation towards a magnet.\n\nBut markets have a problem: they are prone to #RewardHacking. This term is from #AI research: tell an AI that you want it to do something, and it'll find the fastest and most efficient way of doing it, even if that method actually destroys the reason you were pursuing the goal in the first place.\n\nhttps://learn.microsoft.com/en-us/security/engineering/failure-modes-in-machine-learning\n\n3/",
  "sig": "70901db4c577dc735adf73a53d0089f05f8f0ff8b6dc8dca6624fe6a75e2e1d4d7c633cb41f4974c8308c0f65210708f9c7d5d59ec1a36daa450e848e21910d3"
}