2024-09-07 11:02:24

Gert :debian: :gnu: :linux: on Nostr:

“…indiscriminate use of model-generated content in training causes irreversible defects in the resulting models, in which tails of the original content distribution disappear. We refer to this effect as ‘model collapse’ and show that it can occur in LLMs as well as in variational autoencoders (VAEs) and Gaussian mixture models (GMMs)”
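A toy sketch of the tail-loss effect (my own illustration, not the paper's experimental setup): repeatedly fit a Gaussian to samples drawn from the previous generation's fitted Gaussian. Because each fit is estimated from a finite sample, the fitted variance drifts from generation to generation, and the distribution's tails can shrink away over time.

```python
import random
import statistics

random.seed(0)

# Start from a "real" data distribution: a standard Gaussian.
mu, sigma = 0.0, 1.0
variances = [sigma ** 2]

# Each generation trains only on the previous generation's output:
# draw a finite sample from the current fit, then refit mu and sigma.
for generation in range(30):
    samples = [random.gauss(mu, sigma) for _ in range(100)]
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    variances.append(sigma ** 2)

# `variances` traces how the fitted spread evolves; with small sample
# sizes it tends to collapse, i.e. the original tails disappear.
print(variances[0], variances[-1])
```

The sample size of 100 and 30 generations are arbitrary assumptions for illustration; the Nature paper studies this effect in LLMs, VAEs, and GMMs rather than a single Gaussian.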

#ai
https://www.nature.com/articles/s41586-024-07566-y
Author Public Key
npub1c2c6lgjxax5pnqp4t8fempdgupzdhvvvchnyae2xnyrf5vpa07zqjdvehu