Hi, has anyone posted an abliterated DeepSeek R1 at the full 671B size? The largest model I can find with the training wheels removed is 70B, from huihui-ai.
Also, I have had problems with these models trying to check the internet for data newer than their training cutoff. Is there an easy way to fix this?
Thanks
There doesn't seem to be anything that has been processed at 671B. I guess it's because there aren't enough GPU resources…
Also, I have had problems with these models trying to check the internet for data newer than their training cutoff. Is there an easy way to fix this?
The model itself should not have this capability: it can emit a call to a search tool, but it does not include the search function itself. So it is probably a matter of software settings. The search is likely being performed by llama.cpp, Ollama, smolagents, or something similar. Let's review their settings.
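To make the point above concrete: the model only ever *requests* a tool call; the client software decides whether to expose and execute a search tool at all. Below is a minimal sketch (not a definitive fix) using Ollama's `/api/chat` endpoint, where simply omitting the `tools` field means the model has nothing to call and must answer from its training data alone. The model tag `deepseek-r1:70b` is an assumption; substitute whatever you actually run.

```python
import json

# Sketch: a chat request with NO "tools" key. With no tool schema supplied,
# the model cannot trigger a web search; any search behavior would have to
# come from the client software wiring one in.
payload = {
    "model": "deepseek-r1:70b",  # assumed tag; substitute your local model
    "messages": [
        {"role": "user", "content": "Summarize what you know about this topic."}
    ],
    "stream": False,
    # Deliberately no "tools" entry here.
}

# Confirm nothing search-related is exposed to the model.
assert "tools" not in payload

# To actually send it (requires a local Ollama server on the default port):
#   import urllib.request
#   req = urllib.request.Request(
#       "http://localhost:11434/api/chat",
#       data=json.dumps(payload).encode(),
#       headers={"Content-Type": "application/json"},
#   )
#   print(urllib.request.urlopen(req).read().decode())
```

If a frontend (an agent framework, a web UI, etc.) sits between you and Ollama, check its settings for a web-search or browsing toggle, since that is where the tool would be injected.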
I am running ollama/llama with the defaults.
Can anyone out there help with finding a 671B abliterated model?
Hi Huihui-ai. Can you explain why you have not released an abliterated 671B model or if you are working on one? Is there anything the community could help you with?
It seems like they're trying to do it…
There doesn't seem to be anything that has been processed at 671B.
Congrats!