I’m not talking about the speed, I’m talking about the quality of output. I don’t think you understand how these models are transformed into “uncensored models”, but a lot of the time abliteration messes them up.
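For context on what “abliteration” means here: it estimates a “refusal direction” in the model’s activations and then edits the weights so layers can no longer write along that direction. A minimal sketch with toy NumPy data (the shapes, prompts, and direction estimate are illustrative assumptions, not any specific tool’s implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins: one weight matrix, plus hidden states collected from
# "refusal-triggering" vs "neutral" prompts (hypothetical data).
d = 64
W = rng.standard_normal((d, d))
refusal_acts = rng.standard_normal((100, d)) + 3.0 * np.eye(d)[0]  # shifted along one axis
neutral_acts = rng.standard_normal((100, d))

# Step 1: estimate the "refusal direction" as the normalized difference of means.
r = refusal_acts.mean(axis=0) - neutral_acts.mean(axis=0)
r /= np.linalg.norm(r)

# Step 2: orthogonalize the weights against r, i.e. W' = (I - r r^T) W,
# so the layer can no longer produce output along that direction.
W_abliterated = W - np.outer(r, r) @ W

# The edited layer is now silent along r...
print(np.allclose(r @ W_abliterated, 0))  # True

# ...but the weights have genuinely changed, which is one mechanism by
# which output quality can degrade after abliteration.
print(np.linalg.norm(W - W_abliterated) > 0)  # True
```

This is why “uncensored” is not a free lunch: the same edit that suppresses refusals perturbs every layer it touches.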
Buddy, I have been running and testing 7b and 14b compared to the cloud DeepSeek. Any sources, any evidence to back what you’re saying? Or just removed and complaining?
Kccp, Hugging Face: grab a model that fits your VRAM in GGUF format. I think it’s two clicks once it’s downloaded.
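That flow roughly looks like this (the repo and file names below are placeholders for whatever GGUF build fits your VRAM, not a specific recommendation):

```shell
# Download a GGUF file from Hugging Face (repo/file names are hypothetical --
# substitute a quant that fits your VRAM).
huggingface-cli download someuser/DeepSeek-R1-Distill-Qwen-14B-GGUF \
    DeepSeek-R1-Distill-Qwen-14B-Q4_K_M.gguf --local-dir .

# Point KoboldCpp (or any GGUF-compatible runner, e.g. llama.cpp) at the file.
python koboldcpp.py --model DeepSeek-R1-Distill-Qwen-14B-Q4_K_M.gguf
```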
I know how to download and run models. What I’m saying is that all the “uncensored” DeepSeek models are abliterated and perform worse.
It’s the same model, your pc just sucks lmfao
I’m not talking about the cloud version at all. I’m talking about the local 32b and 14b models vs the ones people have “uncensored”.
I was hoping someone knew of an “uncensored” version of deepseek that was good, that could run locally, because I haven’t seen one.
I don’t know what you mean by “removed”.