This thought crossed my mind today as I came across a thread on Quora and noticed that they have added a feature where ChatGPT has a go at answering the question.

Today alone I have used a few different “AI” tools, including one that automatically paraphrases text for you, one that analyses your writing in SwiftKey as you type, and of course the big players like Bard and Bing Chat. It got me thinking about whether these features are actually valuable, and whether we will start to see them on this platform.

  • Cosmiiko · 1 year ago

    I don’t think so. Integrating with existing AI services raises direct privacy concerns for me.
    A solution would be a self-hosted model, but that could cost a lot in machine resources for instance owners, especially on bigger instances.

    Even then, I’m having a hard time thinking of a useful use case for an LLM integration on Lemmy (or even social media in general).