someone on Nostr:
what did you say?
it is a weird model. sometimes it says it doesn't know, or that nobody knows. for example I asked "what does Nostr stand for?" and it said "nobody knows". somehow it estimates that it does not know the answer?
that kind of answer did not happen with llama 3; it always hallucinated before admitting "it does not know".
Published at 2025-05-03 16:36:40

Event JSON
{
  "id": "6e9a4a79507a540e5ce65fe21411a3ec6151d8779d58d2fa7a00a4633653ab37",
  "pubkey": "9fec72d579baaa772af9e71e638b529215721ace6e0f8320725ecbf9f77f85b1",
  "created_at": 1746290200,
  "kind": 1,
  "tags": [
    [
      "p",
      "e6a9a4f853e4b1d426eb44d0c5db09fdc415ce513e664118f46f5ffbea304cbc",
      "wss://eden.nostr.land/",
      "rossbates"
    ],
    [
      "e",
      "4bdf27e50712c0573e7be91a0d4a3f15553b808dc9d5452dde6acc5be971ccf8",
      "wss://nos.lol/",
      "root",
      "e6a9a4f853e4b1d426eb44d0c5db09fdc415ce513e664118f46f5ffbea304cbc"
    ]
  ],
  "content": "what did you say?\n\nit is a weird model. sometimes it says it doesn't know or nobody knows it. for example i asked \"what does Nostr stand for?\". It said \"nobody knows\". somehow it estimates that it does not know the answer? \n\nthat kind of answers did not happen with llama 3, it always hallucinated before accepting \"it does not know\".",
  "sig": "10d98ed298ee120a91699a9833dae81476000d69e39a4db5524389e3b5de376387a6f5df1a9703791921860686169ccf876742ee77fde7aff3aff63f48861590"
}
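For context on the fields above: per Nostr's NIP-01, the "id" is the SHA-256 hash of a canonical JSON serialization of the array [0, pubkey, created_at, kind, tags, content], with no extra whitespace. A minimal sketch of that computation is below; the event dict used here is a simplified hypothetical one, not the signed event above, since reproducing the exact id also requires matching NIP-01's escaping rules character for character.

```python
import hashlib
import json

def nostr_event_id(event: dict) -> str:
    """Compute a Nostr event id per NIP-01: the sha256 hex digest of the
    serialized array [0, pubkey, created_at, kind, tags, content]."""
    serialized = json.dumps(
        [0, event["pubkey"], event["created_at"], event["kind"],
         event["tags"], event["content"]],
        separators=(",", ":"),  # no whitespace between tokens
        ensure_ascii=False,     # keep UTF-8 characters unescaped
    )
    return hashlib.sha256(serialized.encode("utf-8")).hexdigest()

# Hypothetical minimal kind-1 note (not the event shown above):
event = {
    "pubkey": "9fec72d579baaa772af9e71e638b529215721ace6e0f8320725ecbf9f77f85b1",
    "created_at": 1746290200,
    "kind": 1,
    "tags": [],
    "content": "hello",
}
event_id = nostr_event_id(event)  # 64-char lowercase hex string
```

Relays and clients recompute this hash to check that a received event's "id" matches its contents before verifying the Schnorr "sig" over it.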