And that, my dear, would seem fairly evident, but it is now also fairly well proven:

“We find that use of model-generated content in training causes irreversible defects in the resulting models.” Looking specifically at the probability distributions learned by text-to-text and image-to-image generative AI models, the researchers concluded that “learning from data produced by other models causes model collapse — a degenerative process whereby, over time, models forget the true underlying data distribution … this process is inevitable, even for cases with almost ideal conditions for long-term learning.”

Researchers Warn of ‘Model Collapse’ As AI Trains On AI-Generated Content
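The degenerative loop the researchers describe can be sketched with a toy Gaussian example (my own construction, not the paper's experiment): fit a simple model to data, sample fresh "training data" from the fit, refit, and repeat. Because each refit is done on a finite sample, the estimated spread shrinks a little in expectation at every generation, and the distribution's tails — the rare events — are the first thing the chain forgets.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy illustration of recursive training on model output
# (a hypothetical sketch, not the paper's actual setup).
mu, sigma = 0.0, 1.0   # generation 0: the "true" data distribution
n = 50                 # finite training set drawn at each generation

for generation in range(2000):
    data = rng.normal(mu, sigma, n)      # "train" on the previous model's output
    mu, sigma = data.mean(), data.std()  # refit -> next generation's model

# The MLE standard deviation shrinks by ~(n-1)/n per generation in
# expectation, so the fitted distribution narrows toward a point mass.
print(f"std after 2000 generations: {sigma:.6f}")
```

After a couple of thousand generations the fitted standard deviation has collapsed to a small fraction of the original 1.0 — a crude stand-in for the "irreversible defects" the quote warns about.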

© Henning Bertram 2023