mesamune@lemmy.world to Technology@lemmy.world · English · 7 months ago
The AI feedback loop: Researchers warn of ‘model collapse’ as AI trains on AI-generated content (venturebeat.com)
87 comments · cross-posted to: futurology@lemmy.world, artificial_intel@lemmy.ml, technology@lemmy.ml, tech@pawb.social, technology@beehaw.org
danielbln@lemmy.world · English · 7 months ago
Microsoft’s Phi model was largely trained on synthetic data derived from GPT-4.
gapbetweenus@feddit.de · English · edited · 7 months ago
I’m too lazy to search for the paper, and I’m not sure it was Microsoft, but with my rather basic knowledge of modeling (I studied systems biology) it seemed rather crazy and impossible, so I remembered it.
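The “model collapse” the headline warns about can be illustrated with a toy simulation (this is my own sketch, not code from the article or the paper it covers): each generation fits a simple Gaussian “model” to samples drawn from the previous generation’s model, so estimation error compounds and the learned distribution’s spread drifts toward zero.

```python
import numpy as np

# Toy sketch of model collapse: a "model" (here just a fitted Gaussian)
# is repeatedly retrained on data generated by the previous model.
rng = np.random.default_rng(0)
mu, sigma = 0.0, 1.0            # generation 0: the real data distribution
n_samples, n_generations = 20, 200

stds = [sigma]
for _ in range(n_generations):
    data = rng.normal(mu, sigma, n_samples)  # "train" on the previous model's output
    mu, sigma = data.mean(), data.std()      # refit the model to synthetic data
    stds.append(sigma)

print(f"generation 0 std: {stds[0]:.3f}")
print(f"generation {n_generations} std: {stds[-1]:.3f}")
```

The fitted standard deviation shrinks over generations: the tails of the original distribution are sampled less and less often, so each refit narrows the model further, which is the qualitative failure mode the researchers describe for LLMs trained on LLM output.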