• kitnaht@lemmy.world · 15 hours ago

    I’ve found that 4o is substantially worse than the previous model at a ton of things, so I run all of my LLMs locally now through Ollama.
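
    For anyone curious what that looks like in practice, here's a minimal sketch of querying a local Ollama instance over its HTTP API. It assumes `ollama serve` is running on the default port and that you've already pulled a model; the model name `llama3` here is just an example, swap in whatever you use.

    ```python
    import json
    import urllib.request

    def ask(prompt: str, model: str = "llama3") -> str:
        # Ollama's generate endpoint; listens on port 11434 by default.
        payload = json.dumps({
            "model": model,
            "prompt": prompt,
            "stream": False,  # return a single JSON object instead of a stream
        }).encode("utf-8")
        req = urllib.request.Request(
            "http://localhost:11434/api/generate",
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())["response"]

    print(ask("Why run models locally?"))
    ```

    Nothing leaves your machine, and you're not at the mercy of a provider silently swapping models out from under you.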