haxor@derp.foo to Hacker News@derp.foo · English · 9 months ago
NY Times is asking that ALL LLMs trained on Times data be destroyed (twitter.com)
10 comments
Lvxferre · English · 9 months ago

> 2 seconds later someone can train a new one

"Training" datasets:

- GPT-3: ~300-500 billion tokens
- Bard: ~1.5 trillion words
- GPT-4: ~13 trillion tokens

Does this look like the amount of content that you'd get in two seconds???

Maybe they should learn to code like those coal miners they pitied. And maybe you should go back to Reddit.
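As a rough sanity check on those corpus sizes, here is a back-of-envelope sketch in Python. The ingest rate of 10 million tokens per second is an assumption picked for illustration (deliberately generous), not a measured crawl or tokenization speed, and the corpus figures are the approximate ones quoted in the comment above.

```python
# Back-of-envelope: how long would it take to gather each training
# corpus at a hypothetical ingest rate? The rate is an assumption
# for illustration, not a measured figure.
corpora = {
    "GPT-3": 500e9,   # ~300-500 billion tokens (upper bound)
    "Bard": 1.5e12,   # ~1.5 trillion words, counted as tokens here
    "GPT-4": 13e12,   # ~13 trillion tokens
}

INGEST_RATE = 10_000_000  # tokens per second (assumed, generous)

for name, tokens in corpora.items():
    seconds = tokens / INGEST_RATE
    print(f"{name}: {tokens:.1e} tokens -> "
          f"{seconds:,.0f} s (~{seconds / 86400:.1f} days)")
```

Even under that generous assumption, GPT-3's corpus takes on the order of half a day to pass through, and GPT-4's roughly two weeks; orders of magnitude away from "two seconds", which is the comment's point.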