- cross-posted to:
- fosai@lemmy.world
- localllama@sh.itjust.works
Original Mistral AI blog: https://mistral.ai/news/mixtral-of-experts/
“Eclipses” is a strong word for the relationship, and there’s a reason they compared it to GPT-3.5. Regardless, it’s impressive and exciting!
It’s amazing how quickly this space is moving, and it’s even more amazing that a lot of these models can run on consumer hardware, too.
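As a concrete example of the consumer-hardware point: llama.cpp’s Python bindings can run a quantized GGUF build of Mixtral, offloading however many layers fit in VRAM and keeping the rest in system RAM. A minimal sketch (the model filename and layer split are placeholders, not tested values):

```python
from llama_cpp import Llama  # pip install llama-cpp-python

# Hypothetical local path to a 4-bit GGUF quantization of Mixtral 8x7B.
llm = Llama(
    model_path="./mixtral-8x7b-instruct.Q4_K_M.gguf",
    n_gpu_layers=20,  # offload what fits in VRAM; remaining layers run on CPU
    n_ctx=4096,       # modest context to keep the KV cache small
)

out = llm("Explain mixture-of-experts in one paragraph.", max_tokens=256)
print(out["choices"][0]["text"])
```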
I’m kinda getting FOMO over the 8GB AMD graphics card I bought last year. Maybe I should have spent a bit more to get a 24GB Nvidia card and 64GB of RAM.
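For a rough sense of where those numbers come from: Mistral reports Mixtral 8x7B at about 46.7B total parameters, so a back-of-the-envelope estimate of the weight memory alone (ignoring the KV cache and runtime overhead) looks like this:

```python
# Approximate weight memory for Mixtral 8x7B (~46.7B total parameters,
# per Mistral's announcement) at common quantization levels.
TOTAL_PARAMS = 46.7e9

for name, bits in [("fp16", 16), ("8-bit", 8), ("4-bit", 4)]:
    gib = TOTAL_PARAMS * bits / 8 / 2**30
    print(f"{name:>5}: ~{gib:.1f} GiB of weights")

# fp16 : ~87.0 GiB -> far beyond any consumer card
# 8-bit: ~43.5 GiB -> still too big for a single 24GB GPU
# 4-bit: ~21.7 GiB -> close to a 24GB card, or split between
#                     the GPU and 64GB of system RAM
```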
A 32k-token context window seems to be approaching genuinely usable territory.
I tried MistralOrca and, while it was impressive, its small context window kept it from being immediately useful.
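If you want to actually exercise the 32k window, llama-cpp-python exposes it through n_ctx; the trade-off is that the KV cache grows with the context length. A sketch using the same hypothetical GGUF file as above:

```python
from llama_cpp import Llama

# Full 32k context; the KV cache scales with n_ctx, costing extra memory.
llm = Llama(
    model_path="./mixtral-8x7b-instruct.Q4_K_M.gguf",
    n_ctx=32768,
)

long_doc = open("report.txt").read()  # placeholder document
prompt = f"Summarize the following document:\n\n{long_doc}\n\nSummary:"

# Check that the prompt actually fits in the window before generating.
n_tokens = len(llm.tokenize(prompt.encode("utf-8")))
print(f"prompt uses {n_tokens} of 32768 tokens")

out = llm(prompt, max_tokens=512)
print(out["choices"][0]["text"])
```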