voidx@futurology.today to Futurology@futurology.today · English · 7 months ago
AI Companies Running Out of Training Data After Burning Through Entire Internet (futurism.com)
Cross-posted to: web3_xr@sh.itjust.works, anerdydystopia@sh.itjust.works, becomeme@sh.itjust.works
CanadaPlus@lemmy.sdf.org · English · edited 7 months ago
Well, it’s established wisdom that the dataset size needs to scale with the number of model parameters. Roughly linearly, IIRC: the Chinchilla scaling laws put the compute-optimal dataset at about 20 training tokens per parameter. If you don’t have that much data, the training basically won’t work; the model will overfit or just stop improving.
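The rule of thumb referenced above can be sketched with back-of-the-envelope arithmetic. This is a minimal illustration, assuming the commonly cited ~20-tokens-per-parameter figure from the Chinchilla scaling results; the function name and the example model size are hypothetical:

```python
# Rough sketch of the compute-optimal rule of thumb:
# training tokens ≈ 20 × model parameters (per the Chinchilla scaling laws).

def approx_optimal_tokens(n_params: int, tokens_per_param: int = 20) -> int:
    """Approximate compute-optimal training-set size for a given model size."""
    return n_params * tokens_per_param

# Example: a 70-billion-parameter model would want on the order of
# 1.4 trillion training tokens under this heuristic.
print(approx_optimal_tokens(70_000_000_000))  # -> 1400000000000
```

Under this linear rule, each jump in model size demands a proportionally larger corpus, which is why frontier-scale models chew through web-scale datasets so quickly.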