Fake4000@lemmy.world to Technology@lemmy.world · English · 9 months ago
Reddit started doing what they always wanted to do, sell user content to AI. (www.reuters.com)
cross-posted to: technology@lemmy.world, technology@beehaw.org
Appoxo@lemmy.dbzer0.com · 9 months ago
Afaik the OpenAI bot may choose to ignore it? At least that’s what another user claimed it does.
JohnEdwa@sopuli.xyz · 9 months ago
Robots.txt has always been ignored by some bots; it’s just a guideline, originally meant to prevent excessive bandwidth usage by search-indexing bots, and compliance is entirely voluntary. The Archive.org bot, for example, has completely ignored it since 2017.
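A minimal sketch of why this is voluntary: the robots.txt check happens entirely on the crawler's side, so a bot that never performs it (or ignores the result) hits no technical barrier. The example below uses Python's standard-library urllib.robotparser; "GPTBot" is OpenAI's published crawler user agent, and the Reddit URLs are just illustrative.

```python
# Sketch: a crawler only respects robots.txt if it chooses to consult it.
import urllib.robotparser

parser = urllib.robotparser.RobotFileParser()
parser.set_url("https://www.reddit.com/robots.txt")
parser.read()  # fetch and parse the site's robots.txt

# The decision is purely client-side: skipping this call, or ignoring
# its answer, is exactly what a non-compliant bot does.
url = "https://www.reddit.com/r/technology/"
if parser.can_fetch("GPTBot", url):
    print("robots.txt permits GPTBot to crawl", url)
else:
    print("robots.txt asks GPTBot not to crawl", url)
```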