• wagesj45@kbin.run · 2 months ago

    It would be nice if the license changed for this one to something a little more FOSS-ish. I'm not a huge fan of usage restrictions.

    That being said, I wonder how it will compare to Mistral. I’ve been using the Mistral-7B-Instruct-v0.2 and have been absolutely blown away by its capabilities for its size. Makes me wonder how much more we can squeeze out of these smaller models.

    Actually, the article doesn't say anything about parameter count beyond a guess. I hope they keep making these smaller models, though.
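
    For anyone who wants to try that Mistral model themselves, here's a minimal sketch of loading it with Hugging Face transformers. The model ID is the real hub ID; the prompt, dtype, and generation settings are just illustrative assumptions (fp16 weights need roughly 14 GB of VRAM):

    ```python
    # Minimal sketch: run mistralai/Mistral-7B-Instruct-v0.2 locally via
    # Hugging Face transformers. Prompt and settings are illustrative only.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "mistralai/Mistral-7B-Instruct-v0.2"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.float16,  # ~14 GB of weights in half precision
        device_map="auto",          # needs `accelerate`; places layers for you
    )

    # The instruct checkpoint ships a chat template, so use it instead of
    # hand-writing the [INST] ... [/INST] markers.
    messages = [{"role": "user", "content": "Explain quantization in one paragraph."}]
    input_ids = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)

    output = model.generate(input_ids, max_new_tokens=200, do_sample=False)
    print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
    ```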

  • raldone01@lemmy.world · 2 months ago

    Maybe they'll release a 30-40B model; that would be a nice compromise between capability and performance on my machine.
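
    For a rough sense of whether that would fit, the weight memory alone works out as below. The 34B figure is hypothetical, just the midpoint of that range, and KV cache plus runtime overhead come on top:

    ```python
    # Back-of-the-envelope weight memory for a hypothetical 34B-parameter model.
    # Ignores KV cache, activations, and runtime overhead, which add more.
    params = 34e9

    for name, bytes_per_param in [("fp16", 2.0), ("8-bit", 1.0), ("4-bit", 0.5)]:
        gib = params * bytes_per_param / 2**30
        print(f"{name}: ~{gib:.0f} GiB of weights")

    # fp16: ~63 GiB, 8-bit: ~32 GiB, 4-bit: ~16 GiB -- so on a single 24 GB
    # card a 30-40B model only fits with roughly 4-bit quantization.
    ```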

  • AutoTL;DR@lemmings.world (bot) · 2 months ago

    This is the best summary I could come up with:


    Meta has been scrambling to catch up to OpenAI, which caught it and other big tech companies like Google by surprise when it launched ChatGPT over a year ago; the app went viral and turned generative AI question-and-answer into an everyday, mainstream experience.

    Meta has largely taken a very cautious approach with AI, but that hasn't gone over well with the public; previous versions of Llama were criticized as too limited.

    Most notably, Meta’s Llama families, built as open source products, represent a different philosophical approach to how AI should develop as a wider technology.

    In doing so, Meta hopes to win wider favor with developers than more proprietary models can.

    “Latency matters a lot along with safety along with ease of use, to generate images that you’re proud of and that represent whatever your creative context is,” Cox said.

    Ironically — or perhaps predictably (heh) — even as Meta works to launch Llama 3, it does have some significant generative AI skeptics in the house.


    The original article contains 603 words, the summary contains 168 words. Saved 72%. I’m a bot and I’m open source!