Here’s why turning to AI to train future AIs may be a bad idea

ChatGPT, Gemini, Copilot and other AI tools whip up impressive sentences and paragraphs from as little as a single line of prompt text. To generate those words, the underlying large language models were trained on reams of text written by humans and scraped from the internet. But now, as generative AI tools flood the internet with synthetic content, that content is being used to train future generations of those AIs. If this continues unchecked, it could be disastrous, researchers say.

Training large language models on their own data could lead to model collapse, University of Oxford computer scientist Ilia Shumailov and colleagues argued recently in

→ Continue reading at Science News
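The collapse dynamic the researchers describe can be illustrated with a toy simulation — not the paper's actual method, just a minimal sketch under simplified assumptions: repeatedly fit a simple statistical model (here, a Gaussian) to data, sample fresh "synthetic" data from the fit, and train the next generation on those samples alone. Estimation error compounds across generations, and the distribution's diversity tends to wither away.

```python
import random
import statistics

def fit_and_resample(data, n):
    """Fit a normal distribution to the data, then draw n fresh samples from the fit.

    This stands in for 'train a model on the previous model's output'.
    """
    mu = statistics.mean(data)
    sigma = statistics.stdev(data)
    return [random.gauss(mu, sigma) for _ in range(n)]

random.seed(0)
data = [random.gauss(0, 1) for _ in range(10)]  # small "human-written" dataset

spreads = []
for generation in range(1000):
    spreads.append(statistics.stdev(data))
    data = fit_and_resample(data, 10)  # next generation trains only on synthetic data

print(f"generation 0 spread: {spreads[0]:.3f}")
print(f"generation 999 spread: {spreads[-1]:.3g}")
```

With such a small dataset, each generation's fit is a noisy estimate of the last, and the spread drifts toward zero: the later "models" produce ever more uniform output. Real language-model collapse is far more complex, but this captures the basic feedback loop the researchers warn about.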
