Model Collapse
Degradation in model quality that occurs when AI systems are trained on AI-generated content.
Definition
Model collapse occurs when AI systems trained on AI-generated content progressively degrade in quality. Each successive training generation amplifies errors and loses diversity, much as a copy of a copy gradually loses fidelity; rare patterns in the tails of the data distribution tend to disappear first.
This phenomenon threatens future AI development as synthetic content increasingly pollutes training data.
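As a rough illustration of this feedback loop, the toy simulation below (a sketch of my own, not drawn from any cited study) repeatedly fits a one-dimensional Gaussian to a dataset and then replaces the dataset with samples drawn from that fit. The sample size, generation count, and random seed are arbitrary assumptions; the point is only that the estimated spread, a crude stand-in for diversity, tends to drift downward when each generation trains solely on the previous generation's output.

```python
# Toy sketch of generational training on synthetic data (illustrative only).
import numpy as np

rng = np.random.default_rng(42)

n_samples = 50        # hypothetical dataset size per generation
n_generations = 30

# Generation 0: "human" data drawn from a standard normal distribution.
data = rng.normal(loc=0.0, scale=1.0, size=n_samples)

for gen in range(1, n_generations + 1):
    mu, sigma = data.mean(), data.std()      # "train" a simple model on the current data
    data = rng.normal(mu, sigma, n_samples)  # next generation sees only the model's output
    if gen % 5 == 0:
        print(f"generation {gen:2d}: estimated std = {sigma:.3f}")

# Typical runs show the estimated std drifting below 1.0: the tails are
# undersampled and estimation noise compounds, so diversity is gradually lost.
```

Real model collapse involves far more complex models and data, but the same mechanism applies: estimation error and undersampled tails accumulate once synthetic output feeds back into training.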
Why It Matters
Understanding model collapse explains why high-quality human-created content remains valuable for AI development.
Organizations that publish or rely on large volumes of AI-generated content should understand its long-term implications for the training-data ecosystem.
Examples in Practice
Researchers have demonstrated that image generators retrained on their own AI-generated images produce increasingly distorted outputs over successive generations.
A content platform implements verification to distinguish human-created from AI-generated content, preserving the quality of future training data.
AI developers invest in curating verified human-created datasets to avoid model collapse in future training.