• Kerfuffle@sh.itjust.works
    1 year ago

    If we’re talking about something like LLaMA (i.e. a model people can run locally), then it’s impossible to do that directly. A model by itself can’t collect data, gather metrics, phone home, or anything like that. The article sounds like it’s talking about that kind of thing, not about providing a service people access a model through (along the lines of ChatGPT).