Hi, has anyone posted a DeepSeek R1 that is abliterated at the 671B size?

Hi, has anyone posted a DeepSeek R1 that is abliterated at the 671B size? The largest model I can find with the training wheels removed is the 70B from huihui-ai.
Also, I have had problems getting those models to check the internet for data newer than their training data on a topic. Is there an easy way to fix this?
Thanks

2 Likes

There doesn’t seem to be an abliterated version of the 671B yet. I guess it’s because there aren’t enough GPU resources…

Also, I have had problems getting those models to check the internet for data newer than their training data on a topic. Is there an easy way to fix this?

The model itself should not have this capability (it can emit a search/tool call, but it does not include the search function itself), so it is probably a matter of software settings. The search is presumably being handled by llama.cpp, Ollama, smolagents, or something similar. Let’s review their settings.
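
As an illustration (not your exact setup), here is a minimal sketch of how the search part gets attached outside the model: an agent framework such as smolagents supplies the search tool, and the locally served model only decides when to call it. The model tag deepseek-r1:70b, the local Ollama endpoint, and the DuckDuckGo tool are assumptions for the example.

```python
# Minimal sketch: the search capability lives in the agent framework, not in the model.
# Assumptions: Ollama is serving a DeepSeek R1 distill locally on its default port,
# and the duckduckgo-search dependency used by smolagents is installed.
from smolagents import CodeAgent, DuckDuckGoSearchTool, LiteLLMModel

# Point smolagents at the local Ollama server through LiteLLM.
model = LiteLLMModel(
    model_id="ollama_chat/deepseek-r1:70b",  # example tag, not necessarily yours
    api_base="http://localhost:11434",       # Ollama's default endpoint
)

# DuckDuckGoSearchTool is what actually reaches the internet; the model
# only emits the call that triggers it.
agent = CodeAgent(tools=[DuckDuckGoSearchTool()], model=model)

print(agent.run("Summarize any DeepSeek R1 news from this month."))
```

If no such tool is wired in, a bare llama.cpp or Ollama setup is pure local inference, and the model has no way to see anything newer than its training data.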

1 Like

I am running Ollama / llama.cpp with the defaults.
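
So, as I understand it, nothing is registered as a tool in my setup; a request is just plain local inference against the downloaded weights. Roughly what my calls amount to, as a sketch (the deepseek-r1:70b tag here is only an example, not necessarily the exact model):

```python
import requests

# Plain chat request to a local Ollama server on its default port (11434).
# No tools are attached, so the model cannot fetch anything from the internet.
resp = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "deepseek-r1:70b",  # example tag
        "messages": [{"role": "user", "content": "What happened after your training cutoff?"}],
        "stream": False,
    },
    timeout=600,
)
print(resp.json()["message"]["content"])
```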

1 Like

Can anyone out there help with finding a 671B abliterated model?

1 Like

Hi Huihui-ai. Can you explain why you have not released an abliterated 671B model, or whether you are working on one? Is there anything the community could help you with?

1 Like

It seems like they’re trying to do it…

1 Like

There doesn’t seem to be an abliterated version of the 671B yet.

1 Like

Congrats!